
Intel Open-Sources Broadwell GPU Driver & Indicates Major Silicon Changes

An anonymous reader writes "Intel shipped open-source Broadwell graphics driver support for Linux this weekend. While building upon the existing Intel Linux GPU driver, the kernel driver changes are significant in size for Broadwell. Code comments from Intel indicate that these processors shipping in 2014 will have "some of the biggest changes we've seen on the execution and memory management side of the GPU" and "dwarf any other silicon iteration during my tenure, and certainly can compete with the likes of the gen3->gen4 changes." Come next year, Intel may be able to better take on AMD and NVIDIA discrete graphics solutions."
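For anyone curious whether their own machine is already running Intel's open-source kernel graphics driver (the i915 DRM driver that this Broadwell support extends), a minimal sketch along these lines reports which driver is bound to each GPU. It assumes a standard Linux sysfs layout; card numbering and available nodes vary by system.

```python
#!/usr/bin/env python3
# Minimal sketch: report which kernel DRM driver is bound to each GPU.
# Assumes a standard Linux sysfs layout under /sys/class/drm/card*.
import glob
import os

for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
    dev = os.path.join(card, "device")
    try:
        vendor = open(os.path.join(dev, "vendor")).read().strip()
        device_id = open(os.path.join(dev, "device")).read().strip()
        # The "driver" symlink points at the bound kernel driver (e.g. i915).
        driver = os.path.basename(os.path.realpath(os.path.join(dev, "driver")))
    except OSError:
        continue
    tag = " (Intel)" if vendor == "0x8086" else ""
    print(f"{os.path.basename(card)}: vendor={vendor}{tag} device={device_id} driver={driver}")
```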
This discussion has been archived. No new comments can be posted.

  • by Calibax ( 151875 ) * on Sunday November 03, 2013 @06:22PM (#45320759)

    It's not like AMD, nVidia, PowerVR, etc. are standing still. Every year brings better graphics, and Intel needs to keep pace.

    But since they came late to the game, they have a patent minefield in front of them.

    • by moozoo ( 1308855 )
      >they have a patent minefield in front of them.
      Nvidia and Intel have a patent agreement; Intel licensed their technology for $1.5 billion (over six years, I think). AMD and Intel have patent agreements with regard to CPU technologies, and some of those would apply to graphics (interconnects, memory, etc.). I doubt Intel and AMD would have trouble coming to an agreement on graphics patents.
    • Sure, but for now AMD and Nvidia seem to be happy rebadging previous-gen chips with new names and calling it a day. 2014 is almost here and still nobody knows anything about Maxwell, which was already supposed to be shipping by this point. With huge per generation improvements and a significant process advantage, Intel could really put the hurt on them in the lower end of the market, which is the majority of it.

      • No need to use the past tense. Even among the obviously gamer/enthusiast slanted systems represented in the Steam Hardware Survey [steampowered.com], they place surprisingly well. Among people who don't care, buying a discrete video card went away some time ago, and Intel gets a default win on anything non-AMD.

        Not a terribly thrilling market to dominate; but you make it up in volume, I imagine.
    • by Anonymous Coward

      My last 4 builds used only Intel CPUs, mostly because of the HD integrated GPUs.

      Yes, they are late, but they are the only ones not requiring me to have a binary blob in my system. Granted, my 3D perf is lousy...

    • by smash ( 1351 )
      Eventually though, GPUs become "good enough" until software catches up. I suspect we're getting close to that point now.
      • There's likely to be a long plateau. Past a certain resolution, there's no visible difference and so you have a maximum size for a frame buffer. Past another threshold, there's no benefit in increasing geometry complexity because every polygon is smaller than a pixel in the final image, so you don't see any difference. No one cares about turning up the detail settings to maximum if they can't see that it's better. Then there are some small improvements, such as stereoscopic displays, but they just doubl

        • by smash ( 1351 )
          Yeah, I'm not counting 3D holographic/volumetric type displays. Presumably that will be the next generation of device, once we're "done" with the current 2D display method of rendering and some new display tech comes out.
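The diminishing-returns argument a couple of comments up (a frame buffer capped by resolution, geometry capped once triangles shrink below a pixel) is easy to sanity-check with rough arithmetic. The resolutions and triangle counts below are illustrative assumptions, not benchmarks.

```python
# Back-of-the-envelope sketch of the "plateau" argument above.
# The resolutions and triangle counts are illustrative assumptions.

def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Memory for a single 32-bit colour buffer at a given resolution."""
    return width * height * bytes_per_pixel / 2**20

def pixels_per_triangle(width, height, triangles):
    """Average screen area each triangle covers if geometry is spread evenly."""
    return (width * height) / triangles

for w, h in [(1920, 1080), (3840, 2160)]:
    print(f"{w}x{h}: frame buffer ~{framebuffer_mib(w, h):.1f} MiB")
    for tris in (500_000, 5_000_000, 50_000_000):
        ppt = pixels_per_triangle(w, h, tris)
        note = "sub-pixel, extra detail is invisible" if ppt < 1 else ""
        print(f"  {tris:>10,} triangles -> {ppt:6.2f} px/triangle {note}")
```

At 1080p, anything much past a couple of million triangles per frame averages less than a pixel per triangle, which is the plateau the comment describes.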
    • Intel has never been competitive with discrete GPUs from nVidia, AMD.

      Oblig. http://www.dvhardware.net/news/nvidia_intel_insides_gpu_santa.jpg [dvhardware.net]

      • Intel has never been competitive with discrete GPUs from nVidia, AMD.

        In researching the new MacBook Pros, it sure looked like the performance of the Intel Iris Pro shipping in the 15" MacBook Pro is pretty competitive with the nVidia 650M.

        • Maybe, but that is comparing low power notebook chips. Try comparing it in desktops and the picture changes by quite a bit. Of course, it is also worth considering that a modern Haswell CPU uses 84W of power while a modern AMD GPU uses 300W of power. If you gave Intel 300W of power to work with, I'm sure they could come up with something impressive.
          • I don't know if you've noticed, but desktops are a niche market now. We're almost five years past the point where laptop sales passed desktop sales, and that trend hasn't changed. Laptop parts are where the high volumes are and that's where the big profits come from. Ask SGI sometime how focusing on the high end and ignoring the mass market works as a strategy in the GPU business.
            • You are of course correct that desktop sales have slowed, but the question becomes, "how many people are just upgrading their desktops rather than replacing them?"

              You can't really upgrade a notebook; my desktop case, however, is more than 6 years old and has seen everything replaced in that time frame, including the power supply (I needed more power and more PCI-E connectors). That doesn't count as a desktop sale, but I sure spent thousands of dollars in parts.

              In any case, for high end graphics, you end up

              • You are of course correct that desktop sales have slowed, but the question becomes, "how many people are just upgrading their desktops rather than replacing them?"

                No it doesn't. Desktop upgrades were always a tiny niche for hobbyists.

                • For a "tiny niche", it sure seems like there are a lot of companies selling hardware for it. NewEgg has become quite a large business largely based on selling hardware to such people.
                  • NewEgg also sells assembled machines. Want to take a guess at what proportion of their total sales each make up? If you don't believe me, go and look up the numbers. Last ones I read, well under 5% of all computers sold ever received an after-market upgrade, and most of those were just RAM upgrades.

                    When you're talking about a market with a billion or so sales every year, it's not surprising that there are companies that do well catering to a fraction of a percent of the total market, but that doesn't

          • Maybe, but that is comparing low power notebook chips.

            Since that's the real market now, I think that's where it should be compared. And the other comparison is: can it reasonably run a high-performance game? It meets that metric now also.

            For me that's really what I mean by "competitive": if I'm given a choice between an Intel Iris Pro and some discrete GPU, I won't necessarily pick the discrete part. That is true right now; you can get decent enough performance out of the Intel chipset that

        • by EdZ ( 755139 )
          When your flagship GPU is about level with a mid-range mobile part from a year and a half ago...
          • by smash ( 1351 )
            If you're referring to the Iris GT3e - the GT3e is a mobile part itself. And it's part of the CPU, so it is competitive with the GT650M at around HALF THE POWER. In other words, if Intel were to enable it for multi-socket SMP, they could have a dual quad-core system with two GT3e Iris Pro GPUs in a similar thermal/power envelope to the previous-model machines with Ivy Bridge + Nvidia GT650M.
          • I've been tracking GPU price & performance since 2000. Every year you can buy the same GPU power for ~$100 cheaper.

            My GTX Titan is NOT going to be matched by any shitty mobile GPU for at least 5 years.

  • For low and some mid-range stuff, sure. But Intel is never going to be able to get above that so long as nVidia and AMD keep cranking out new components year after year. All Intel should be striving for is decent 4K@60 support, making sure multi-monitor systems don't break, and making sure compositing works as intended.
    • For low and some mid-range stuff, sure. But Intel is never going to be able to get above that so long as nVidia and AMD keep cranking out new components year after year

      Personally I love the thought (and so does the market, and so do manufacturers) of getting a more powerful, fanless, cheap discreet GPU supported by reliable first-party open-source developers. That would give me a massive boost over my current APU performance. In reality it's only a few specialists (albeit more newsworthy ones) that really buy into the high end anyway.

      • Re: (Score:1, Insightful)

        And what Intel is offering right now with Haswell is a massive improvement over any APU that AMD has produced to date. Which shouldn't be much of a surprise considering AMD's constant spectacular disappointments. But you're never going to get a cheap fanless discrete GPU with the power to hold up against a GTX660 or better hardware.
        • by Kjella ( 173770 )

          But you're never going to get a cheap fanless discrete GPU with the power to hold up against a GTX660 or better hardware.

          Never? Like how cheap fanless GPUs from 2013 don't beat the crap out of any high-end graphics card from 1998 never? Don't go there. But yes, for any given level of technology you can always do more with a 250W+ dGPU power budget than with a <10W fanless thing. But do gamers need it? From the Steam hardware survey, 98.5% of the users play at 1920x1200 or below, and 16% of them are on Intel graphics. Not every game is a Crysis; many of them simply play well on any recent dGPU but suck just a little bit too much on in

      • Re: (Score:2, Informative)

        by Anonymous Coward

        discrete= separate

        discreet= quietly

    • by smash ( 1351 )
      Nah, if Intel can ramp up fast enough in the next couple of years they will reach "good enough" status and software won't demand better for a few years. The average user doesn't have a 4K display (even on Steam - which is skewed towards gamers - the most common resolution is either 1680x1050 or 1920x1080 at the moment).
    • I wouldn't be surprised if compositing at 4K@60FPS works just fine already, provided the machine has display outputs that support the resolution and refresh rate. 2560x1600 via DisplayPort, for instance, was already available on Core 2 Duo laptops with 4500MHD graphics (that's... 4 generations before Haswell)...

      Looks like even Ivy Bridge's HD4000 supports 4K: http://ultrabooknews.com/2012/10/31/new-intel-hd-3000-and-hd-4000-drivers-support-for-windows-8-4k-ultra-hd-opengl-4-0-and-more/ [ultrabooknews.com]
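Whether a given output can actually drive those modes mostly comes down to link bandwidth. Here is a rough sketch of the arithmetic; the link rates are ballpark effective figures (8b/10b overhead already removed) and the 24 bpp depth and ~20% timing margin are assumptions.

```python
# Rough sketch: raw pixel bandwidth a mode needs vs. approximate effective
# link rates. Link figures are ballpark assumptions; real modes also add
# blanking and protocol overhead.

def pixel_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

links_gbps = {
    "DisplayPort 1.1 (4 lanes, HBR)": 8.64,
    "DisplayPort 1.2 (4 lanes, HBR2)": 17.28,
    "HDMI 1.4": 8.16,
}

modes = {
    "2560x1600@60": (2560, 1600, 60),
    "3840x2160@30": (3840, 2160, 30),
    "3840x2160@60": (3840, 2160, 60),
}

for name, (w, h, hz) in modes.items():
    need = pixel_gbps(w, h, hz)
    # ~20% headroom stands in for blanking intervals and protocol overhead.
    fits = ", ".join(l for l, cap in links_gbps.items() if cap >= need * 1.2)
    print(f"{name}: ~{need:.1f} Gbit/s of pixel data; fits on: {fits or 'none of the above'}")
```

By this rough math, 4K@60 needs on the order of 12 Gbit/s of pixel data, which is why it wants DisplayPort 1.2 or HDMI 2.0, while 2560x1600@60 fit comfortably on the older links the 4500MHD-era machines shipped with.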

    • I'm sure Intel is deeply disappointed to only have 60% of the GPU market. The board and shareholders must be crying all the way to the bank.

      The problem with your line of reasoning is that it's exactly what SGI said in the mid '90s: that other companies were welcome to the low-end commodity GPU market; they'd keep the profitable high-end graphical workstation market. Unfortunately for them, the cheaper parts kept improving and gradually passed a lot of people's thresholds for 'good enough'. Intel sell

      • by smash ( 1351 )
        Seriously, I'd LOVE to see Intel enable multi-socket for Broadwell mobile CPUs. Can you imagine - 2x quad-core CPUs, 2x integrated Intel graphics (some variant of SLI or similar GPU load sharing). 8 cores. 16 threads. ~70-90W TDP. You could stick that shit in a laptop - when running on battery just turn off a socket. When running on AC, it would fly.
  • What about high-speed video RAM channels? Wait, no, we have to use the same slow pool of slower system RAM.

    • by Anonymous Coward

      That's exactly what Crystal Well (a.k.a. Iris Pro) is: 128 MiB of very fast RAM with latency about half that of DRAM.

      • That's exactly what Crystal Well (a.k.a. Iris Pro) is: 128 MiB of very fast RAM with latency about half that of DRAM.

        In typical 'Intel - because we can' product differentiation, they've unfortunately gone and made that bit tricky to get: apparently, only their R-series Haswells, the BGA ones, have the eDRAM on the desktop. On the mobile side, it's reserved for the highest-end i7s; I don't know if any of those are LGAs.

        I don't doubt that it makes economic sense; but Intel is continuing their annoying policy of deliberately having no ideal low-to-midrange part: If you go for a lower performance CPU, as a price-sensitive

  • by msobkow ( 48369 ) on Sunday November 03, 2013 @09:24PM (#45321765) Homepage Journal

    There is only one change I'd like to see made sooner rather than later:

    Stop using my main memory as a video buffer!!!

    The main reason I opt for discrete graphics solutions is not because of the performance of the graphics, but the lack of main memory throughput degradation. I build boxes to compute, not sling graphics.
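To put rough numbers on that throughput concern: even before the GPU renders anything, simply scanning the framebuffer out of system RAM eats a fixed slice of memory bandwidth every frame. A back-of-the-envelope sketch follows; the dual-channel DDR3-1600 peak figure is an assumption for illustration.

```python
# Rough sketch: share of main-memory bandwidth consumed just by scanning
# out a framebuffer, before any rendering. The DDR3 figure is an assumed
# theoretical peak for dual-channel DDR3-1600 (2 x 12.8 GB/s).

MAIN_MEMORY_GBPS = 25.6  # assumption: dual-channel DDR3-1600 peak

def scanout_gbps(width, height, refresh_hz, bytes_per_pixel=4):
    return width * height * refresh_hz * bytes_per_pixel / 1e9

for w, h, hz in [(1920, 1080, 60), (2560, 1600, 60), (3840, 2160, 60)]:
    gbps = scanout_gbps(w, h, hz)
    print(f"{w}x{h}@{hz}: ~{gbps:.2f} GB/s scanout "
          f"(~{100 * gbps / MAIN_MEMORY_GBPS:.1f}% of a {MAIN_MEMORY_GBPS} GB/s bus)")
```

Scanout alone is only a few percent of the bus; the larger hit comes from rendering traffic (texture fetches, blending, GPGPU work) contending with the CPU for the same channels, which is part of what the eDRAM on Crystal Well parts is meant to absorb.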

    • So you want to waste your video memory when you aren't using the GPU?
    • by smash ( 1351 )
      Better option: just increase main memory bandwidth, and then everything gets better.
      • by Anonymous Coward

        Yeah, let's get expensive dual-port GDDR5 for main memory. Good idea.

        Or, you know, we can make the hard part expensive, and the easy part 16 gigabytes...

    • by alexo ( 9335 )

      There is only one change I'd like to see made sooner rather than later:

      Stop using my main memory as a video buffer!!!

      The main reason I opt for discrete graphics solutions is not because of the performance of the graphics, but the lack of main memory throughput degradation. I build boxes to compute, not sling graphics.

      Once you start thinking of the GPU as a math coprocessor (that incidentally also slings graphics very well), your views on the subject may change.
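A tiny example of that coprocessor framing: the sketch below offloads a SAXPY (y = a*x + y) to whatever OpenCL device is available, which on recent Intel chips can be the integrated GPU. It assumes the pyopencl package and a working OpenCL runtime are installed; nothing here is specific to Intel's driver.

```python
# Minimal sketch of "GPU as math coprocessor": run a SAXPY on whatever
# OpenCL device pyopencl finds (assumes pyopencl + an OpenCL runtime).
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()      # picks an available device (GPU if present)
queue = cl.CommandQueue(ctx)

kernel_src = """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y) {
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
"""
program = cl.Program(ctx, kernel_src).build()

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
expected = 2.0 * x + y              # reference result computed on the CPU

mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

program.saxpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)
cl.enqueue_copy(queue, y, y_buf)    # read the result back into y
print("max error vs CPU:", np.max(np.abs(y - expected)))
```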

  • I am so sick and tired of crap graphics on Linux that it isn't funny.

    A fully open-source solution from Intel and perhaps AMD would absolutely destroy Nvidia in the Linux space.

    I like the efforts AMD has made so far, and I applaud them for it, but it took them way too long.

    If Intel can come out with a better GPU, Mesa would be able to achieve OpenGL 4+ compatibility much faster.

    Nvidia mostly, and to some extent AMD, have destroyed Linux's ability to get onto the desktop.

    Part of this I think is due to board collusion b

    • Since Linux is about 2% of the total installed OS market, and since the percentage of THAT market that plays games is limited, I doubt that Intel and AMD care very much about it. Linux on the desktop is a fantasy; I recall hearing "year of the Linux desktop" back in 1994, and it still hasn't happened...
      • You are right, Linux desktop is a dead end. Linux APPLIANCES are going to be huge.
        • Maybe, but once it moves from "Linux desktop computer that you can do anything with" to "Linux appliance that you can't make any unapproved changes to", the difference becomes academic.
            • I don't think you understand how fast the floor is rising. The ONLY people that can get away with locking hardware in the consumer space right now are companies that enjoy government monopoly protection (patent, copyright, telecom/cable).
            • I think you overestimate the number of people who care. Apple has one of the most closed ecosystems in the world, but clearly it doesn't matter. Android and Windows provide plenty of competition, but millions still line up to buy completely locked down devices from Apple.
              • It's not about caring. It's about providing a compelling product. Linux appliances will be compelling products, either by providing a cheaper entry point or more liberty, both of which increase value. It has nothing to do with 'caring'. Provide a superior product in all ways, and the people will come. Save your Betamax argument; it wasn't superior in all ways.
                • Will they? Are you expecting an open standard Linux box that you can modify? Are you expecting the likes of Comcast and Time Warner to ever allow that?

                  What use will the box have? What would you do with it?

                  In front of my main TV, I have a Roku 3, a PS3, a DirecTV box, and a Wii U. None of those are open and none of the companies behind them really want them to be. Maybe Roku might be the closest, but the services on it are as closed as closed gets.

                  Maybe you should let me know what these Linux applia
