AMD Graphics Open Source Linux Hardware

AMD Releases Open Source Fusion Driver

Posted by kdawson
from the knock-yourselves-out dept.
An anonymous reader writes "Yesterday AMD released open source Linux driver support for their Fusion APUs, primarily for the first Ontario processor. As detailed on Phoronix, this includes support for kernel mode-setting, 2D acceleration, and 3D acceleration via both Mesa and Gallium3D graphics drivers."
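For those curious whether the new open stack is actually in use on a Fusion box, a minimal check might look like the sketch below. It assumes the standard Linux DRM sysfs layout (card numbering may differ) and that glxinfo from mesa-utils is installed; treat both as assumptions, not requirements of AMD's release.

```shell
# Minimal sketch: which kernel and Mesa drivers are driving the GPU?
# /sys/class/drm/card0 is the usual first GPU; adjust if you have several.
DRM_DRIVER=$(readlink /sys/class/drm/card0/device/driver 2>/dev/null || echo unknown)
echo "kernel driver: ${DRM_DRIVER##*/}"   # "radeon" would indicate the open KMS driver

# Mesa reports the userspace 3D driver in its renderer string (needs mesa-utils):
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo 2>/dev/null | grep "OpenGL renderer"
else
    echo "glxinfo not installed"
fi
```

On a machine without the hardware, the first line simply reports "unknown".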
  • SALE ? (Score:1, Interesting)

    by Anonymous Coward on Tuesday November 23, 2010 @12:15PM (#34319774)
    Aren't there a lot of Phoronix stories lately? Are they on sale?
  • by Joe The Dragon (967727) on Tuesday November 23, 2010 @12:16PM (#34319790)

    Core 2 is dying, and Intel's next on-board video is only at the Nvidia 9400M level, but it also locks out better on-board video.

    Some Apple systems may not fit a full x16 PCIe video chip.

    Apple may use an i3 / i5 CPU with an added ATI or Nvidia chip, but they don't like to use add-in video in their low-end systems.

  • by IYagami (136831) on Tuesday November 23, 2010 @12:27PM (#34319972)

    Any chance Apple could use that for the next versions of Mac mini and MacBooks? Or is a Core 2 Duo with nVidia 320M still better than Fusion?

    ... according to Fudzilla.com

    http://www.fudzilla.com/notebooks/item/20888-amd-apple-deal-is-28nm-notebooks [fudzilla.com]

    "Fusion goes Apple 28 / 32nm
    It all started here, when AMD’s Senior VP and Chief Sales Officer Emilio Ghilardi was brave enough to show an image of several Apple products in a Fusion presentation. After we wrote our part AMD was quick to deny it, perhaps a bit too quick, which gave us a reason to dig some more, only to find that we were on the right track.

    We asked around and some sources close to Intel / Nvidia have denied the rumour saying that they know nothing about it. However, just a day later we managed to confirm that the leak is real and that Apple will indeed use Fusion, here.

    Our industry sources have indicated that the deal will be announced at some point in 2011, that it will involve 28nm and 32nm Fusion parts, particularly Krishna, and that Apple plans to launch notebooks based on AMD chips. Apple is also not cold hearted on Trinity 32nm Fusion parts.

    The announcement could be as much as a year away, as 28nm parts won't materialise until the second half of 2011, and since AMD doesn't have a tablet chip, it won't happen in the iPad segment. At this point Apple doesn't plan to use any AMD chips in desktop or server parts, but in case Bulldozer impresses us all, maybe Steve might change his mind.

    So if you like Apple and love AMD, start saving money, as roughly a year from now you should be able to buy an Apple notebook with a Fusion Krishna / Trinity class APU."

    And if you want Fusion benchmarks, check the usual suspects:
    http://techreport.com/articles.x/19981 [techreport.com]
    http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked [anandtech.com]

  • by erroneus (253617) on Tuesday November 23, 2010 @12:33PM (#34320032) Homepage

    Long ago, I went with ATI video because it had the best support for Linux. Eventually, Nvidia caught on to this trend and started supporting Linux too... and better than ATI. So I switched. Now Nvidia has screwed the community that helped it grow in popularity by putting "Optimus" hybrid graphics everywhere, then refusing to update their Linux drivers to support it and refusing to release any details about it either. So far, the best anyone has been able to do is disable the Nvidia GPU to reduce power consumption in laptops that can't use the Nvidia hardware.

    With AMD/ATI doing this, perhaps my next selection will be to the exclusion of Nvidia (again).

    When will these jerks ever learn? The future of computing is in embedded devices, and those devices will run Linux. Get Linux developers using YOUR hardware and it will have a better shot at a prosperous future as well. So far, Intel and ATI are the only options.
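For what it's worth, the kernel does ship a generic mechanism for powering off an idle discrete GPU: vga_switcheroo (mainline since roughly 2.6.34). The sketch below assumes debugfs is mounted and that the GPU driver registers with vga_switcheroo, which the Optimus-era Nvidia blob notably did not; it's an illustration of the interface, not a supported recipe.

```shell
# Hedged sketch: power down the inactive GPU on a hybrid-graphics laptop
# via the kernel's vga_switcheroo debugfs interface.
SWITCH=/sys/kernel/debug/vgaswitcheroo/switch
if [ -w "$SWITCH" ]; then
    cat "$SWITCH"          # lists the GPUs; '+' marks the active one
    echo OFF > "$SWITCH"   # power off whichever GPU is currently inactive
    RESULT=switched
else
    RESULT=unsupported     # kernel, driver, or hardware lacks vga_switcheroo
fi
echo "$RESULT"
```

On unsupported hardware (the Optimus complaint above), the file simply isn't there, which is exactly the problem being described.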

  • by hedwards (940851) on Tuesday November 23, 2010 @12:41PM (#34320130)
    I think there was some speculation that it could be used alongside the main GPU, as in some of the newer multi-card setups: basically, to do things like calculating what is visible so the processor doesn't have to send that geometry over the bus. Normally the GPU itself does that after the data crosses the wire; doing it on-chip would be a lot cheaper, and probably quite doable if you've got another chip doing most of the rest of the work. I suspect they'll find a way of adding that sort of flexibility.

    I'm not sure if that's something AMD has any designs on, though I'd be shocked if they didn't.
  • by hedwards (940851) on Tuesday November 23, 2010 @12:43PM (#34320162)
    One of the complaints I've had about Apple is that they don't have any products at all that use AMD chips. Not really a deal breaker, but I prefer AMD because, for as long as I can recall, they've had the best performance for the price. Sure, Intel is almost always faster, but just about anybody can be if they're not worried about price.
  • by RotateLeftByte (797477) on Tuesday November 23, 2010 @12:50PM (#34320252)

    Nvidia had their opportunity, but since AMD got their ATI department's act together, their GPU performance and, importantly, their Linux support have come on in leaps and bounds.
    With Nvidia being squeezed out of the chipset market by AMD and Intel, and even consumer graphics cards able to play most FPS games at more than adequate frame rates, I (sadly) see Nvidia slowly but surely going the way of Novell's NetWare, i.e. to an inevitable death.
    They really need to buy an ARM licensee and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-litre Dodge Charger.

  • by Anonymous Coward on Tuesday November 23, 2010 @01:08PM (#34320520)

    It's always been time to "move away" from Nvidia. I asked years ago which card to use, and the Linux gaming fanboys said go Nvidia, their closed blobs have great 3D support. WTF do I care about that?!?! When they decide to release new models and want to force everyone to upgrade, they quit maintaining the old blobs. So I'm left in simple open-source 2D land with both vendors, plus some marginal 3D here and there, but whatever. And further, when that blob starts crashing as I update around it, exposes bugs, gets cracked, etc., guess what: it's closed, no further releases. Oops.

    Now it's this year, and guess what: ATI has the same decent open 2D support they always have. And look here, even more 3D specs are being released, again as always... so GOGOGO AMD/ATI. Intel has had their own capable graphics for well over a decade; they don't need a graphics partner. AMD did not have one, and wisely bought the less-encumbered ATI. Excepting all the Windows gaming fanboys, Nvidia's graphics division is just waffling in the breeze for the moment. It has no purpose in life; no one needs to buy them.

    And further, both AMD and Intel have in-house or partnered north/south bridges and BIOS partners, so Nvidia's other lines of business are similarly mooted. This news just cements my earlier conclusion to go with AMD/ATI in the future, and shows I was right back then. NEVER follow the advice of fanboys when you're trying to keep a long-term investment usable.

  • by drinkypoo (153816) <martin.espinoza@gmail.com> on Tuesday November 23, 2010 @02:22PM (#34321578) Homepage Journal

    I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset. The "fglrx" driver does not support it as it is too old, and the open source driver causes massive display trashing and lockups. Whatever they did to it, it's not a faithful R2xx, and so the existing driver (which works on genuine, old R2xx stuff) does not work on it. But that's not all; AMD also didn't bother to contribute proper support for the power saving features of this processor or chipset, nor decent documentation for either... for anything but Vista.

    In short, I wouldn't trust AMD to actually provide proper Linux support; I'm sitting here at an AMD "netbook" (subnotebook, really) without it. Indeed, this machine is really only properly supported under Vista; power saving doesn't even work under Windows 7! That's actually an artifact of the video driver, though, which lags behind the Vista version; if I load the plain VGA driver, power saving works right. But since AMD makes the whole chipset, I get to blame them for it no matter how you slice it.

  • by Anonymous Coward on Tuesday November 23, 2010 @04:03PM (#34323074)

    I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset.

    The GPU in rs690 is actually a 4xx variant, not 2xx. Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature ?

  • by drinkypoo (153816) <martin.espinoza@gmail.com> on Wednesday November 24, 2010 @06:27AM (#34329406) Homepage Journal

    The GPU in rs690 is actually a 4xx variant, not 2xx.

    Not according to the X driver.

    Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature [x.org] ?

    It's the CPU power saving that AMD did not contribute to Linux, not the GPU's. I can't USE the GPU long enough under Linux to get to the point where I need power saving.
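For completeness, the GPU-side knobs the RadeonFeature page refers to look roughly like the sketch below. It assumes a radeon KMS kernel (2.6.35 or later), card0, and root privileges; and note it throttles the GPU only, while the CPU-side power saving the parent says AMD never contributed is a separate cpufreq/PowerNow! matter.

```shell
# Hedged sketch: the radeon KMS power-management sysfs interface
# documented on the X.org RadeonFeature wiki page.
PMDIR=/sys/class/drm/card0/device
if [ -w "$PMDIR/power_method" ]; then
    echo profile > "$PMDIR/power_method"    # fixed profiles instead of dynpm
    echo low     > "$PMDIR/power_profile"   # lowest clocks; also mid/high/auto
    PMSTATE=set
else
    PMSTATE=unavailable                     # no radeon KMS PM interface here
fi
echo "$PMSTATE"
```

On a box where the open driver locks up before X even starts, of course, none of this helps, which is the parent's point.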
