AMD Releases Open Source Fusion Driver 126
An anonymous reader writes "Yesterday AMD released open source Linux driver support for their Fusion APUs, primarily for the first Ontario processor. As detailed on Phoronix, this includes support for kernel mode-setting, 2D acceleration, and 3D acceleration via both Mesa and Gallium3D graphics drivers."
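For anyone wanting to check whether the new open-source stack is actually in use, here is a rough sketch. The command names (`lsmod`, `glxinfo`) are standard, but the exact renderer strings are assumptions based on typical Mesa/Gallium3D output and will vary by Mesa version and GPU:

```shell
# Quick sanity checks for the open-source Radeon stack (a sketch):
#
#   lsmod | grep '^radeon'            # is the KMS kernel driver loaded?
#   glxinfo | grep 'OpenGL renderer'  # which 3D driver is OpenGL using?
#
# Gallium3D drivers usually identify themselves in the renderer string
# (e.g. "Gallium 0.4 on AMD PALM"), which a script can detect:
is_gallium() {
    case "$1" in
        *Gallium*) return 0 ;;   # Gallium3D driver in use
        *)         return 1 ;;   # classic Mesa or a proprietary driver
    esac
}
```

On an Ontario system the renderer string would be expected to mention the chip, though the exact wording here is an assumption.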
SALE ? (Score:1, Interesting)
Core 2 is dying; Intel's next on-board video is at 9400M level (Score:2, Interesting)
Core 2 is dying, and Intel's next generation of on-board video is at the level of Nvidia's 9400M, but the new platform also locks out better on-board video from other vendors.
Some Apple systems may not have room for a full x16 PCI-e video chip.
Apple may use an i3/i5 CPU with an added ATI or Nvidia chip, but they don't like to put add-in video in their low-end systems.
When AMD turns to 28nm production... (Score:3, Interesting)
Any chance Apple could use that for the next versions of Mac mini and MacBooks? Or is a Core 2 Duo with nVidia 320M still better than Fusion?
... according to Fudzilla.com
http://www.fudzilla.com/notebooks/item/20888-amd-apple-deal-is-28nm-notebooks [fudzilla.com]
"Fusion goes Apple 28 / 32nm
It all started here, when AMD’s Senior VP and Chief Sales Officer Emilio Ghilardi was brave enough to show an image of several Apple products in a Fusion presentation. After we wrote our part AMD was quick to deny it, perhaps a bit too quick, which gave us a reason to dig some more, only to find that we were on the right track.
We asked around and some sources close to Intel / Nvidia have denied the rumour saying that they know nothing about it. However, just a day later we managed to confirm that the leak is real and that Apple will indeed use Fusion, here.
Our industry sources have indicated that the deal will be announced at some point in 2011, that it will involve 28nm and 32nm Fusion parts, particularly Krishna, and that Apple plans to launch notebooks based on AMD chips. Apple is also not cold-hearted on Trinity 32nm Fusion parts.
The announcement could be as much as a year away, as 28nm parts won't materialise until the second half of 2011, and since AMD doesn't have a tablet chip, it won't happen in the iPad segment. At this point Apple doesn't plan to use any AMD chips in desktop or server parts, but in case Bulldozer impresses us all, maybe Steve might change his mind.
So if you like Apple and love AMD, start saving money, as roughly a year from now you should be able to buy an Apple notebook with a Fusion Krishna / Trinity class APU."
And if you want Fusion benchmarks, check the usual suspects:
http://techreport.com/articles.x/19981 [techreport.com]
http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked [anandtech.com]
Time to move away from NVidia now? (Score:5, Interesting)
Long ago, I went with ATI video because it had the best support for Linux. Eventually, NVidia caught on to this trend and started supporting Linux too... and better than ATI. So I switched. Now NVidia has screwed the community that helped it grow in popularity by putting "Optimus" hybrid graphics everywhere, then refusing to update their Linux drivers to support it and refusing to release any details about it either. So now, the best anyone has been able to do is disable the nvidia GPU to reduce power consumption in laptops that can't utilize the nvidia hardware.
With AMD/ATI doing this, perhaps my next selection will be to the exclusion of NVidia (again).
When will these jerks ever learn? The future of computing is in embedded devices and those devices will run Linux. Get Linux developers using YOUR hardware and it will have a better shot at a prosperous future as well. So far, Intel and ATI are the only options.
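For what it's worth, the "disable the nvidia GPU" workaround described above is usually done through the kernel's vga_switcheroo interface (mainline since 2.6.34). A minimal sketch, assuming debugfs is mounted at the standard path and the driver actually supports switching, which Optimus hardware often does not (that's the complaint):

```shell
# Power off the unused discrete GPU via vga_switcheroo (sketch; the
# debugfs path is the kernel's standard location, taken as a parameter
# here so the logic can be exercised without real hardware).
gpu_power_off() {
    switch="${1:-/sys/kernel/debug/vgaswitcheroo/switch}"
    if [ -w "$switch" ]; then
        # "OFF" powers down whichever GPU is not driving the display
        echo OFF > "$switch"
    else
        echo "vga_switcheroo not available here" >&2
        return 1
    fi
}
```

Run as root; `cat /sys/kernel/debug/vgaswitcheroo/switch` first lists the GPUs (IGD for integrated, DIS for discrete) and which one is active.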
Re:Is Fusion any good? (Score:3, Interesting)
I'm not sure whether that's something AMD has any designs on, though I'd be shocked if they didn't.
Re:When AMD turns to 28nm production... (Score:5, Interesting)
Re:Time to move away from NVidia now? (Score:4, Interesting)
NVidia had their opportunity, but since AMD got its ATI department's act together, its GPU performance and, importantly, its Linux support have come on in leaps and bounds.
With NVidia being squeezed out of the chipset market by AMD and Intel, and even consumer graphics cards able to play most FPS games at more than adequate frame rates, I (sadly) see NVidia slowly but surely going the way of Novell's NetWare, i.e. to an inevitable death.
They really need to buy an ARM licensee and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-litre Dodge Charger.
Re:Time to move away from NVidia now? (Score:1, Interesting)
It's always been time to 'move away' from Nvidia.
I asked years ago which card to use. Linux gaming fanboys said to go Nvidia, since their closed blobs have great 3D support.
WTF do I care about that?!?! When they decide to release new models and want to force everyone to upgrade, they quit maintaining the old blobs. So I'm left in simple open-source 2D land with both vendors, plus some marginal 3D here and there with both, but whatever. And further, when that blob starts crashing as I update around it, exposing bugs, getting cracked, etc., guess what: it's closed, so there are no further releases. Oops.
Now it's this year, and guess what, ATI has the same decent open 2D support they always have.
And looky here, even more 3D specs are being released, again as always... so GOGOGO AMD/ATI.
Intel has had their own capable graphics for well over a decade. They don't need a graphics partner.
AMD did not have one, and wisely bought the less encumbered ATI.
Excepting all the Windows gaming fanboys, Nvidia's graphics division is just flapping in the breeze at the moment. It has no purpose in life; no one needs to buy from them.
And further, both AMD and Intel have in-house or partnered north/south bridges and BIOS partners.
So Nvidia's other lines of business are similarly moot.
This news just cements my prior conclusion to go with AMD/ATI in the future and shows I was right back then.
NEVER follow the advice of fanboys when you're seeking to keep your long term investments usable.
Re:Time to move away from NVidia now? (Score:3, Interesting)
I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset. The "fglrx" driver does not support it as it is too old, and the open source driver causes massive display trashing and lockups. Whatever they did to it, it's not a faithful R2xx, and so the existing driver (which works on genuine, old R2xx stuff) does not work on it. But that's not all; AMD also didn't bother to contribute proper support for the power saving features of this processor or chipset, nor decent documentation for either... for anything but Vista.
In short, I wouldn't trust AMD to actually provide you proper Linux support. I'm sitting here at an AMD "netbook" (subnotebook really) without it. Indeed, this machine is really only properly supported under Vista; power saving doesn't work under Windows 7! That's actually an artifact of the video driver, though, which lags behind the Vista version. If I load the VGA driver then power saving works right. However, since AMD makes the whole chipset, I get to blame them for it no matter how you slice it.
Re:Time to move away from NVidia now? (Score:1, Interesting)
I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset.
The GPU in rs690 is actually a 4xx variant, not 2xx. Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature ?
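For reference, the power-saving knobs that wiki page describes are plain sysfs writes under the KMS radeon driver. A sketch, with the card index assumed to be 0 and the device path taken as a parameter so the logic can be exercised without real hardware (valid profile names per the wiki are low, mid, high, auto, default):

```shell
# Select a fixed radeon power profile via sysfs (KMS radeon driver).
set_radeon_profile() {
    profile="$1"                             # e.g. "low" to clock down
    dev="${2:-/sys/class/drm/card0/device}"
    [ -w "$dev/power_method" ] || return 1   # no KMS radeon PM here
    echo profile    > "$dev/power_method"    # fixed profiles (vs. dynpm)
    echo "$profile" > "$dev/power_profile"
}
```

Run as root, e.g. `set_radeon_profile low`; writing `dynpm` to power_method instead enables automatic reclocking.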
Re:Time to move away from NVidia now? (Score:3, Interesting)
The GPU in rs690 is actually a 4xx variant, not 2xx.
Not according to the X driver.
Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature [x.org] ?
It's the CPU power saving that AMD did not contribute to Linux, not the GPU's. I can't USE the GPU long enough under Linux to get to the point where I need power saving.