AMD Releases Open Source Fusion Driver 126
An anonymous reader writes "Yesterday AMD released open source Linux driver support for their Fusion APUs, primarily for the first Ontario processor. As detailed on Phoronix, this includes support for kernel mode-setting, 2D acceleration, and 3D acceleration via both Mesa and Gallium3D graphics drivers."
Fusion (Score:1, Insightful)
Fusion is going to be important. AMD will finally have a portable product that rivals Intel. Integrated video hardware is now commonplace on the desktop [dell.com]. Embedded AMD hardware is beginning to appear [innocoregaming.com] and Fusion will accelerate this.
Intel doesn't have a 3D core they can integrate onto the CPU die. Bottom line is AMD has an edge.
Linux drivers - stable?? (Score:1, Insightful)
Drivers that finally enable full capability of the hardware are a must, be that OSS or closed source.
nVidia has long-term support in their Linux drivers: they are essentially the same as their Windows or OS X drivers, with a GPL glue layer added. Are the AMD drivers going to be long-term supported too? Stable??
To me,
stable > long term support >>> OSS > closed-source
only because I'm not planning on debugging video drivers!
Re:Time to move away from NVidia now? (Score:5, Insightful)
Not so much with cutting-edge gaming rigs, but especially with older computers it's fairly common for video cards to outlive their manufacturer support while still containing a few bugs or optimization problems.
Re:Time to move away from NVidia now? (Score:5, Insightful)
Time? You are late. ATI has been releasing specs and employing engineers to write opensource drivers for some time already. I haven't bought a Nvidia GPU for years, and I have no plans to do it in the future.
Re:Linux drivers - stable?? (Score:3, Insightful)
It all comes down to "supporting old stuff does not bring new sales, therefore is really bad in the long term" vs "I still use/want to use old stuff, therefore I want it to work, and as long as it fits me, I'll support it."
In the OSS community, the only hardware that's not supported is really the hardware that's not used. Hell, they even managed to support closed Nvidia hardware without any help from Nvidia (see the nouveau 2D/3D driver).
Also, I'm more confident about OSS drivers being stable than closed-source ones. Agreed, the OSS ones are still a bit unfinished right now, but they really are working fine on r600-r700 with classic Mesa. In fact, I've been playing through all the Stalker games recently with decent performance.
Chances are, OSS drivers are good enough for the vast majority of you. Maybe hardcore gamers will bitch, but that's all.
Re:Time to move away from NVidia now? (Score:3, Insightful)
That said, nVidia has provided very long support for old graphics cards. The primary reason they support Linux is OpenGL workstations, which demand long support cycles, and regular users get the benefit. And as even AMD admits, you get bigger benefits from cross-platform code (Win/Mac/*nix) than you do from open source collaboration, as long as it's not possible to open up the closed source drivers due to DRM licensing, licensed code, and so on.

The open source team is very small compared to the Catalyst/fglrx team, whether you count just the AMD employees or all the contributors. I have an HD5850 and the open source drivers are still not in any way on par with nVidia's (or AMD's) closed source drivers, despite it being 14 months since release; in some ways they'll probably never get there. As long as it's possible to fix bugs with the given documentation on how things should work, you're good; if you're triggering some kind of undocumented lockup, it might not be that easy getting resources on it except to say "don't do that".
Re:Price of Android pod touch (Score:4, Insightful)
That's the same argument fanboys always use to call Apple products cheaper. Hand-pick the specific criteria that must be included (app store) and excluded (an actual phone . . .) until you get just the right oddball combination of features that lets you call it cheaper.
Meanwhile, when you compare the iPod Touch to other touch-screen media players, its pricing is atrocious, and Apple's laptops, desktops, and servers all fare equally poorly against their general competitors.
As a matter of fact, the only segment in which Apple competes well on price is the iPhone; it's about the same price as other similar smartphones. Other than that, though? You're definitely paying your turtleneck tax.
Re:Time to move away from NVidia now? (Score:3, Insightful)
I don't know about "time to", but in any case where the software is open vs. closed, the open-source community will not make the effort with the closed system. This will absolutely make Linux hackers choose AMD graphics now, which will almost certainly result in improved reliability of AMD cards in Linux systems overall, and eventually almost total domination of the consumer Linux segment by AMD graphics.
Re:Time to move away from NVidia now? (Score:3, Insightful)
Intel has all but formally announced their intention to lock Nvidia out of everything they can, as fast as the feds will let them: on-die video, no QPI license, trimming PCIe lanes off lower-end products, etc. AMD has not been as frankly rude about it, but their on-die video will be even more competent than Intel's, and they control a smaller slice of the market in any case. Pretty much across the board, Nvidia can reasonably expect to be shoved out of anything too small, power-constrained, thermally constrained, or cost-constrained to have either a full discrete GPU (in laptops) or a populated full PCIe expansion slot (desktops/servers).
Unless they can think of some fairly clever pushback, and fast, this will leave them with a market of A) enthusiast gamers (who tend to run Windows and replace GPUs frequently), B) serious CAD/visualization guys (who may or may not run Windows, but whose Very Expensive software packages depend on Nvidia's "makes the trains run on time" approach to OpenGL support, rather than software freedom, seamless OS X integration, or still working in 5 years), and C) GPU compute types (who, again, are running very expensive software on very expensive hardware, and care that it works and, if they are large enough, that they can get engineering support). None of these markets place a premium on FOSS drivers, and most of them make driver quality and featurefulness a major part of Nvidia's competitive advantage (going from "foremost provider of GPU computing solutions" to "just another fabless silicon vendor whose stuff will work if you target Gallium3D" would be bad news for Nvidia...).
AMD and Intel, on the other hand, while deadly rivals, are in virtually identical positions RE: FOSS drivers: For their low-end stuff, drivers are just a pain in the ass. Especially for Intel, if team Linux will overlook their suckitude because their ttys come back after suspend, or whatever it happens to be, that is a pure win. They are both racing to make low to midrange GPU capabilities just part of the CPU, and it is very much to their advantage if all their CPU capabilities are Just Supported on whatever OSes the market cares about. I would expect to see increasing divergence in strategy between Intel/AMD on the one hand and Nvidia on the other.
Re:When AMD turns to 28nm production... (Score:5, Insightful)
And you think Apple customers are that worried about price?
Apple customers are going to pay a premium no matter what. It's Apple that wants the discount. The less Apple pays for the hardware, the larger the margin they get with each product. Apple's customers aren't going to see any discount, even if Apple's discount is $100 per processor to move to AMD.
Apple has $50B in cash. Considering what they sell, that's an absurd amount of money. Enough to buy Sony outright. It just goes to show you the enormous margins that consumers pay for Apple products. It's like Sun / Oracle / Cisco in the 90s, except these are consumers paying the outrageous margins rather than large, money-fat corporations.
Re:Time to move away from NVidia now? (Score:3, Insightful)
I'll grant that the situation's always sucked for non-x86 platforms, but Nvidia's done a remarkable job of supporting their older hardware in Linux.
Drivers for GeForce FX Cards, Updated 10/18/2010 [nvidia.com]
Drivers for GeForce2-4 Cards, Updated 11/16/2010 [nvidia.com]
Drivers for the Riva 128 (?!) through GeForce256, Updated 08/04/2010 [nvidia.com]
There are also supported drivers for all of these products for AMD64 Linux. It's no substitute for an open source driver - I support nouveau - but declaring that they leave their old cards unsupported is patently false. They're still one of the only games in town for CUDA and GPU computing. And, as someone who has a house full of systems running Nvidia graphics cards, Nvidia has treated me very well.
Re:Time to move away from NVidia now? (Score:4, Insightful)
They really need to buy an ARM user and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-liter Dodge Charger.
Doesn't NVidia make the Tegra/Tegra2 processors for mobile devices?
Re:Take a step back, look at the big picture. (Score:3, Insightful)
Uh, software is still written to specific hardware. You may write it in C, but C doesn't determine the register mappings and semantics. *addr=value is still just mov [addr],$value
Yes, but largely NO!
I write my software in C. The same code compiles and runs on x64 and on x86. The COMPILER translates my cross-platform "*addr=value;" into the appropriate machine-level instructions. My C programs do not concern themselves with the specific register mappings and processor semantics; this has been abstracted away by the C compiler.
I agree that driver software may be written to the specific hardware, but the purpose of a driver is to abstract said "register mappings and semantics" so that all other software except the driver doesn't have to worry about the "register mappings" and/or other "semantics".
Inline assembly code is not cross-platform, and in many cases a compiler can generate better code than the hand-written assembly in question.
When is the last time you used a significant program that was written entirely in assembly?
Again, I must reiterate: Take a step back, look at the big picture.
You're focusing on the little edge part which may get cut off without significantly changing the picture at all.