Intel Open-Sources Broadwell GPU Driver & Indicates Major Silicon Changes
An anonymous reader writes "Intel shipped open-source Broadwell graphics driver support for Linux this weekend. While building upon the existing Intel Linux GPU driver, the kernel driver changes are significant in size for Broadwell. Code comments from Intel indicate that these processors shipping in 2014 will have "some of the biggest changes we've seen on the execution and memory management side of the GPU" and "dwarf any other silicon iteration during my tenure, and certainly can compete with the likes of the gen3->gen4 changes." Come next year, Intel may be better able to take on AMD and NVIDIA discrete graphics solutions."
Intel is keeping pace (Score:5, Informative)
It's not like AMD, nVidia, PowerVR, etc. are standing still. Every year brings better graphics, and Intel needs to keep pace.
But since they came late to the game, they have a patent minefield in front of them.
Re: (Score:3, Insightful)
An eye for an eye leaves the whole world blind.
Also, two wrongs don't make a right.
Re:Intel is keeping pace (Score:5, Funny)
An eye for an eye leaves the whole world blind.
Wouldn't that just turn everybody into pirates?
Also, two wrongs don't make a right.
3 lefts do!
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
Why 1080p @60fps? Both the PS4 and Xbone will only manage 30fps in the majority of games at 1080p. If Intel's onboard graphics can reach parity with the new consoles that are just coming out, they will have eaten into AMD's APU lead, since Intel currently crushes AMD when it comes to CPU performance.
Re:Intel is keeping pace (Score:4, Insightful)
Why 1080p @60fps? Both the PS4 and Xbone will only manage 30fps in the majority of games at 1080p.
not true
in fact both consoles will target 50-60Hz + vsync, but the Xbox One will do 720p (so last gen) while the PS4 will do 900-1080p (due to its better GPU)
Re: (Score:3)
Intel used to licence the PowerVR stuff for the pathetic onboard video before they rolled their own.
That is not true. Some (not all) Atom platforms use PowerVR stuff, that's all. Intel has rolled plenty of in-house GPUs before and after those.
Re: (Score:2)
Of course they do. Why do you think a good graphics card costs so damn much?!?!
Re: (Score:3)
Sure, but for now AMD and Nvidia seem to be happy rebadging previous-gen chips with new names and calling it a day. 2014 is almost here and still nobody knows anything about Maxwell, which was already supposed to be shipping by this point. With huge per generation improvements and a significant process advantage, Intel could really put the hurt on them in the lower end of the market, which is the majority of it.
Re: (Score:2)
Not a terribly thrilling market to dominate; but you make it up in volume, I imagine.
Re: (Score:1)
my last 4 builds used only intel CPUs, mostly because of the HD integrated GPUs.
yes, they are late, but they are the only ones not requiring me to have a binary blob in my system. granted, my 3D perf is lousy...
Re: (Score:2)
Re: (Score:3)
There's likely to be a long plateau. Past a certain resolution, there's no visible difference, so you have a maximum size for a frame buffer. Past another threshold, there's no benefit in increasing geometry complexity because every polygon is smaller than a pixel in the final image, so you don't see any difference (a 4K frame is only 3840x2160, about 8.3 million pixels, so beyond roughly that many visible polygons the average polygon is sub-pixel). No one cares about turning up the detail settings to maximum if they can't see that it's better. Then there are some small improvements, such as stereoscopic displays, but they just doubl
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
What's to say this isn't the plan?
Whether it is called a GPU or a CPU, if intel makes it and ships it, I don't think they particularly care.
Re: (Score:3)
Intel has never been competitive with discrete GPUs from nVidia, AMD.
Oblig. http://www.dvhardware.net/news/nvidia_intel_insides_gpu_santa.jpg [dvhardware.net]
It seems like they are competitive now (Score:2)
Intel has never been competitive with discrete GPUs from nVidia, AMD.
In doing research on the new MacBook Pros, it sure looked like the performance of the Intel Iris Pro shipping in the 15" MacBook Pro is pretty competitive with the nVidia 650M.
Re: (Score:2)
Re: (Score:3)
Re: (Score:1)
You can't really upgrade a notebook; my desktop case, however, is more than 6 years old and has seen everything replaced in that time frame, including the power supply (I needed more power and more PCI-E connectors). That doesn't count as a desktop sale, but I sure spent thousands of dollars on parts.
In any case, for high end graphics, you end up
Re: (Score:2)
You are of course correct that desktop sales have slowed, but the question becomes, "how many people are just upgrading their desktops rather than replacing them?"
No it doesn't. Desktop upgrades were always a tiny niche for hobbyists.
Re: (Score:1)
Re: (Score:3)
NewEgg also sells assembled machines. Want to take a guess at what proportion of their total sales each make up? If you don't believe me, go and look up the numbers. Last ones I read, well under 5% of all computers sold ever received an after-market upgrade, and most of those were just RAM upgrades.
When you're talking about a market with a billion or so sales every year, it's not surprising that there are companies that do well catering to a fraction of a percent of the total market, but that doesn't
Re: (Score:2)
Maybe, but that is comparing low power notebook chips.
Since that's the real market now, though, I think that's where it should be compared. And the other comparison is: can it reasonably run a high-performance game? It meets that metric now as well.
For me, that's really what I mean by "competitive": if I'm given a choice between an Intel Iris Pro and some discrete GPU, I won't necessarily pick the discrete part. That is true right now; you can get decent enough performance out of the Intel chipset that
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I've been tracking GPU $ & performance since 2000. Every year you can buy the same GPU power for ~$100 cheaper.
My Titan GTX is NOT going to be competitive with any shitty mobile GPU for at least 5 years.
Re: (Score:2, Insightful)
Re:Meh (Score:4, Insightful)
Re: (Score:3)
$80-$100: The highest benchmark score for Intel is 3781 with the G3430, and the highest benchmark score for AMD is 4353 with the AMD A8-5600K.
$100-$120: The highest benchmark score for Intel is 4399 with the i3-3225, and the highest benchmark score for AMD is 6401 with the FX-6300.
$120-$140: The highest benchmark score for Intel is 4928 with the i3-4130, and the highest benchmark score for AMD is 6609 with the FX-8120.
$140-$160: The highest benchm
Re: (Score:2)
Re: (Score:2)
Of course, you could then look at this benchmark, which is more indicative of what people actually run: http://www.cpubenchmark.net/singleThread.html [cpubenchmark.net]
$80-$100: The highest benchmark score for Intel is 1,988 with the G3430, and the highest benchmark score for AMD is 1,385 with the AMD A8-5600K.
$100-$120: The highest benchmark score for Intel is 1,797 with the i3-3240 (the G3430 is the better buy), and the highest benchmark score for AMD is 1,526 with the FX-4350.
$120-$140: The highest benchmark score for Intel is 1,859 with th
Re: (Score:2)
Of course, you could then look at this benchmark, which is more indicative of what people actually run
Word processors, web browsing, etc... that's what you were thinking about, right?
I prefer to only consider what people actually wait for. You are of course right that most things people do on a computer are single threaded, but nearly all of those very same things aren't waited for by users, because computers were more than fast enough for those tasks a decade ago. More performance has little to no benefit at all for those tasks.
and the highest benchmark score for AMD is so far down the list I got bored.
You got bored before looking at the first AMD part in the price range? Real
Re: (Score:2)
Re: (Score:2)
AMD not only continues to win the performance per dollar comparison
Until 12 months of a higher electric bill have passed. AMD is the new Pentium 4.
Re: (Score:1)
Wow. You're spectacularly bad at reading comprehension. ("Spectacularly" means "really, really", and "reading comprehension" means "understanding what you read". "Understanding" means, like, "getting it".)
Meh (Score:2)
Competition is more than performance (Score:3)
For low and some mid-range stuff, sure. But Intel is never going to be able to get above that so long as nVidia and AMD keep cranking out new components year after year.
Personally I love the thought (and so do the market and manufacturers) of getting a more powerful discreet GPU that is fanless, cheap, and supported by reliable first-party open source developers. That gives me a massive boost over what I am getting from my current APU. In reality it's only a few specialists (albeit more newsworthy ones) that really buy into the high end anyway.
Re: (Score:1, Insightful)
Re: (Score:3)
But you're never going to get a cheap fanless discrete GPU with the power to hold up against a GTX660 or better hardware.
Never? Like how cheap fanless GPUs from 2013 don't beat the crap out of any high-end graphics card from 1998 never? Don't go there. But yes, for any given level of technology you can always do more with a 250W+ dGPU power budget than with a <10W fanless thing. But do gamers need it? From the Steam hardware survey, 98.5% of users play at 1920x1200 or below, and 16% of them are on Intel graphics. Not every game is a Crysis; many of them simply play well on any recent dGPU but suck just a little bit too much on in
Re: (Score:2, Informative)
discrete= separate
discreet= quietly
Re: (Score:2)
I like discreet GPUs...
Re: Competition is more than performance (Score:2)
Re: (Score:2)
Re: (Score:2, Interesting)
Discrete math coprocessors are actually the interesting one, because they were integrated, and then un-integrated again. We just re-named them "GPUs" (that is, after all, all a GPU is: a very parallel vector maths processor with a tiny bit of rasterisation hardware tacked onto it). That said, yes, I fully expect that integration of GPUs is only going to continue.
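For anyone who hasn't written GPU-compute code, here is roughly what "a very parallel vector maths processor" means in practice. This is a minimal sketch in CUDA, chosen purely because it is compact; the same idea applies to any vendor's parts via OpenCL or similar, and the SAXPY kernel name and sizes here are just illustrative:

    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    // One thread per element: the "very parallel vector maths" described above.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                        // ~1M elements, all processed in parallel
        std::vector<float> hx(n, 1.0f), hy(n, 2.0f);  // host-side data

        float *dx, *dy;
        cudaMalloc(&dx, n * sizeof(float));
        cudaMalloc(&dy, n * sizeof(float));
        cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);  // y = 2*x + y, one thread per element
        cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);

        printf("y[0] = %f (expect 4.0)\n", hy[0]);
        cudaFree(dx);
        cudaFree(dy);
        return 0;
    }

Note there is no rasterisation anywhere in that: from a compute point of view the "graphics card" is just a few thousand of these threads running at once.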
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I wouldn't be surprised if compositing on 4K@60FPS works just fine already, provided the machine has display outputs that support the resolution and refresh rate. 2560x1600 via DisplayPort, for instance, was already available on Core2Duo laptops with 4500MHD graphics (that's... 4 generations before Haswell)...
Looks like even Ivy Bridge's HD4000 supports 4K: http://ultrabooknews.com/2012/10/31/new-intel-hd-3000-and-hd-4000-drivers-support-for-windows-8-4k-ultra-hd-opengl-4-0-and-more/ [ultrabooknews.com]
Re: (Score:3)
I'm sure Intel is deeply disappointed to only have 60% of the GPU market. The board and shareholders must be crying all the way to the bank.
The problem with your line of reasoning is that it's exactly what SGI said in the mid '90s. That other companies were welcome to the low-end commodity GPU market, they'd keep the profitable high end graphical workstation market. Unfortunately for them, the cheaper parts kept improving and gradually passed a lot of people's thresholds for 'good enough'. Intel sell
Re: (Score:2)
what about high speed video ram channels? (Score:1)
what about high speed video RAM channels? Wait, no, we have to use the same slow pool of slower system RAM.
Re: (Score:1)
That's exactly what Crystal Well (a.k.a. Iris Pro) is: 128 MiB of very fast RAM with latency about half that of DRAM.
Re: (Score:2)
That's exactly what Crystal Well (a.k.a. Iris Pro) is: 128 MiB of very fast RAM with latency about half that of DRAM.
In typical 'Intel - because we can.' product differentiation, they've unfortunately gone and made that bit tricky to get: Apparently, only their R-series Haswells, the BGA ones, have the eDRAM on the desktop. On the mobile side, it's reserved for the highest end i7s, I don't know if any of those are LGAs.
I don't doubt that it makes economic sense; but Intel is continuing their annoying policy of deliberately having no ideal low-to-midrange part: If you go for a lower performance CPU, as a price-sensitive
One change I want to see (Score:5, Insightful)
There is only one change I'd like to see made sooner rather than later:
Stop using my main memory as a video buffer!!!
The main reason I opt for discrete graphics solutions is not because of the performance of the graphics, but the lack of main memory throughput degradation. I build boxes to compute, not sling graphics.
Re: (Score:2)
Re: (Score:2)
YES!!!
Re: (Score:2)
Re: (Score:1)
yeah, let's get expensive dual-port gddr5 for main memory. good idea.
or, you know, we can make the hard part expensive, and the easy part 16 gigabytes...
Re: (Score:2)
There is only one change I'd like to see made sooner rather than later:
Stop using my main memory as a video buffer!!!
The main reason I opt for discrete graphics solutions is not because of the performance of the graphics, but the lack of main memory throughput degradation. I build boxes to compute, not sling graphics.
Once you start thinking of the GPU as a math coprocessor (that incidentally also slings graphics very well), your views on the subject may change.
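To make that point concrete: for compute work, an integrated GPU sharing main memory can actually be a feature, because the "copy it to video memory" step can disappear. Here is a minimal sketch using CUDA managed memory, only because it is the most compact way to show one allocation touched by both CPU and GPU; on an Intel integrated part the analogous route would be a zero-copy shared buffer via OpenCL or the like, and the kernel and sizes are purely illustrative:

    #include <cstdio>
    #include <cuda_runtime.h>

    // A simple "math coprocessor" job: square every element in place.
    __global__ void square(int n, float *v) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) v[i] = v[i] * v[i];
    }

    int main() {
        const int n = 1 << 20;
        float *v;
        cudaMallocManaged(&v, n * sizeof(float));  // one allocation, visible to CPU and GPU
        for (int i = 0; i < n; ++i) v[i] = 0.5f;   // CPU writes the data...

        square<<<(n + 255) / 256, 256>>>(n, v);    // ...GPU crunches it in place...
        cudaDeviceSynchronize();

        printf("v[0] = %f (expect 0.25)\n", v[0]); // ...CPU reads the result, no explicit copies
        cudaFree(v);
        return 0;
    }

Whether that sharing hurts you (graphics eating your memory bandwidth) or helps you (zero-copy compute) depends entirely on whether you treat the thing as a frame buffer or as a coprocessor.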
Re: (Score:3)
You really REALLY need to look at what you are testing.
We are heavy users of OpenGL, and care critically about its performance.
And from that point of view, you are very, VERY wrong.
All current Intel GMA implementations (even the super rare super-cache based implementations) are terribly, terribly slow compared even to an old 8800 GT.
We are talking significantly less than half the performance in many more advanced uses.
Yes, they can flat-shade a limited number of polys quite well, and even do a little multitex, but
Re: (Score:2)
Re: (Score:2)
Having owned both I'd rate the HD3000 below an nVidia 7600GT in practice. It'll really move on older Source stuff but it starts to struggle even on HL2E1 unless you turn off a lot of the shinies. That said, being able to get a playable framerate and reasonably authentic looks out of modern games is a huge leap for laptop performance. I'm really impressed with the new era of laptop components from AMD and Intel alike.
Re: (Score:2)
The HD 2000 series from 3 generations ago already beat your 8800 GT. The current Iris 5200 sits between a GeForce 9800 and a GeForce GTX 280 in terms of performance.
You sure? Got any benchmark comparisons? I'm honestly curious, because the comparisons I've seen don't jibe. For example:
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html [tomshardware.com]
That page shows the HD 2000 on par with cards like the Nvidia FX 5800 or ATI X1400.
The HD 3000 shows up around the Nvidia 6600 GT or ATI X1600 PRO.
The HD 4000 shows up around the Nvidia 6800 GT or 7600 GT, or the ATI X800 XT or HD 3650.
The HD 2500, HD 4200, HD 4400, HD 4600, HD 5000, Iris 5100, and Iris 5200 are not listed there.
I know t
Thank God for this news... (Score:2)
I am so sick and tired of crap graphics on LINUX it isn't funny.
A fully open source solution from Intel and perhaps AMD would absolutely destroy Nvidia in the LINUX space.
I like the efforts AMD has made so far, and I applaud them for it, but it took them way too long.
If Intel can come out with a better GPU, MESA would be able to achieve OpenGL 4+ compatibility much faster.
Nvidia mostly, and to some extent AMD, have destroyed LINUX's ability to get onto the desktop.
Part of this I think is due to board collusion b
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
What use will the box have? What would you do with it?
In front of my main TV, I have a Roku 3, a PS3, a DirecTV box, and a Wii U. None of those are open and none of the companies behind them really want them to be. Maybe Roku might be the closest, but the services on it are as closed as closed gets.
Maybe you should let me know what these Linux applia