AMD Releases Open Source Fusion Driver 126
An anonymous reader writes "Yesterday AMD released open source Linux driver support for their Fusion APUs, primarily for the first Ontario processor. As detailed on Phoronix, this includes support for kernel mode-setting, 2D acceleration, and 3D acceleration via both Mesa and Gallium3D graphics drivers."
SALE ? (Score:1, Interesting)
Re: (Score:1)
Fusion (Score:1, Insightful)
Fusion is going to be important. AMD will finally have a portable product that rivals Intel. Integrated video hardware is now commonplace on the desktop [dell.com]. Embedded AMD hardware is beginning to appear [innocoregaming.com] and Fusion will accelerate this.
Intel doesn't have a 3D core they can integrate onto the CPU die. Bottom line is AMD has an edge.
Re: (Score:3, Insightful)
Re:Fusion (Score:4, Informative)
huh? [geek.com]
Double huh? [anandtech.com]
It's rare to read someone post something both factually and subjectively wrong at the same time in so few words. Congratulations.
Re: (Score:2)
1) Sandy Bridge doesn't exist yet, and won't until next year. It'll be great when it does exist, though.
2) The GP was probably talking about OpenCL support, which is generally lagging on Intel IGPs. Apple is into OpenCL in a big way, and the lack of OpenCL support was reportedly one of the reasons they stayed with the aging Core2s (paired with NVIDIA GPUs) for the current generation of MacBook Air. Maybe we'll see OpenCL support next year with Sandy?
Re: (Score:1)
Well, he was saying Intel doesn't have a "3D chipset they can integrate onto the processor die" when in fact they have already done so. And if we're talking about released products, where's Fusion?
And actually their IGP is reasonable these days, even pre-SB. It's not fit for high res gaming, of course, but it's suitable for a majority of non-gaming needs.
I don't think SB will have OpenCL support; from my limited understanding, Intel's GPU solution is less flexible than ATI/NVidia's.
Re: (Score:2)
I'm using an Atom D510 right now, with on-die integrated graphics. Of course, these are not GPUs in the modern sense of "general-purpose processing unit", but at least it is an Intel 3D chipset on the processor.
http://en.wikipedia.org/wiki/List_of_Intel_Atom_microprocessors#.22Pineview.22_.2845_nm.29_2 [wikipedia.org]
Re: (Score:1)
Time to move away from NVidia now? (Score:5, Interesting)
Long ago, I went with ATI video because it had the best support for Linux. Eventually, NVidia caught on to this trend and started supporting Linux too... and better than ATI. So I switched. Now NVidia has screwed the community that helped it grow in popularity by putting out "Optimus" hybrid graphics everywhere, then refusing to update their Linux drivers to support it and refusing to release any details about it either. So now, the best anyone has been able to do is disable the nvidia GPU to reduce power consumption in laptops unable to utilize the nvidia hardware.
As AMD/ATI is doing this, perhaps my next selection will be to the exclusion of NVidia (again).
When will these jerks ever learn? The future of computing is in embedded devices and those devices will run Linux. Get Linux developers using YOUR hardware and it will have a better shot at a prosperous future as well. So far, Intel and ATI are the only options.
Re:Time to move away from NVidia now? (Score:5, Insightful)
Not so much with cutting edge gaming rigs, but with older computers especially it's fairly common for video cards to outlive their manufacturer support and still contain a few bugs or optimization problems.
Re: (Score:3, Insightful)
That said, nVidia has given very long support on old graphics cards. The primary reason that they support Linux is OpenGL workstations, which demand long support cycles and regular users get the benefit. And as even AMD admit, you get bigger benefits from cross-platform code (Win/Mac/*nix) than you do from the open source collaboration, as long as it's not possible to open up the closed source drivers due to DRM licensing, licensed code and so on. The open source team is very small compared to the Catalyst/
Re: (Score:3, Insightful)
Intel has all-but-formally announced their intention to lock Nvidia out of everything they can, as fast as the feds will let them. On-die video, no QPI licence, trimming PCIe lanes off lower-end products, etc. AMD has not been as frankly rude about it; but their on-die video will be even more competent than Intel's, and they control a smaller slice of the market, in any case. Pretty much across the board
Re: (Score:1)
Re:Time to move away from NVidia now? (Score:4, Interesting)
NVidia had their opportunity, but since AMD got their ATI department's act together, their GPU performance and, importantly, their Linux support have come on in leaps & bounds.
With NVidia being squeezed out of the chipset market by AMD & Intel, and even consumer graphics cards able to play most FPS games at more than adequate frame rates, I see (sadly) NVidia slowly but surely going the way of Novell's Netware, i.e. to an inevitable death.
They really need to buy an ARM user and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-litre Dodge Charger.
Re:Time to move away from NVidia now? (Score:4, Insightful)
They really need to buy an ARM user and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-litre Dodge Charger.
Doesn't NVidia make the Tegra/Tegra2 processors for mobile devices?
Re: (Score:2)
Yeah, I'd forgotten that. But like AMD & Intel, their market share is minuscule. Most other CPU/GPU builders are far better placed in the market.
The Register has a piece about the ARM A15, and a 1U server rack with the current A9 CPU inside. Now this is interesting: low-power, powerful servers. I use a couple of EeeBox machines, but the A15 has certainly made me think again about dumping x86 CPUs for this use.
Re: (Score:2)
So for an ATI Radeon 4870 video card (512 MB) in Debian/Linux, will I get easy, good support and speeds from ATI's closed binary drivers, like NVIDIA's? I am currently using my old NVIDIA GeForce 8800 GT and GeForce 5200 FX video cards in my old Debian/Linux boxes. They work well even with games, Google Earth, 3D screen savers, etc. I do want speed. I am not crazy about the open drivers since they tend to be slow and I do 3D stuff. :(
Re: (Score:2)
Normally I'd shake my head and walk away. However, with the Boxee Box and other embedded devices running on Atom and other low end x86 CPUs and Linux, having the community bash out bugs in your drivers means that nVidia can get to the top of the embedded heap.
Re:Time to move away from NVidia now? (Score:5, Insightful)
Time? You are late. ATI has been releasing specs and employing engineers to write opensource drivers for some time already. I haven't bought a Nvidia GPU for years, and I have no plans to do it in the future.
Re: (Score:2)
I had considered ATI but heard bad things about the driver situation at the time (the open source problems hadn't fully been worked through yet).
How is $YOUR_LINUX_DISTRO these days with open-source ATI drivers? Any problems with Linux games, other programs requiring 3D (like KDE Stars), and Windows games running on Wine?
Re: (Score:2, Troll)
Any problems with Linux games
Nope, they both work.
I know, "both", we all know the obvious one, but can you name the other one?
Re: (Score:2)
I regret getting an ATI for my desktop at work. (yes Linux)
- Using 2 monitors is sketchy at best... the performance dropped with the 2nd monitor. 3D support is OK on one monitor, but not so good on dual. I miss the cube. Xinerama or separate desktops?! That's crap, ATI! NVidia's TwinView is better.
- I have frequent system crashes because of the proprietary video driver (once a week is frequent), and have to hand configure the xorg.conf to remove crap from the display, etc. Waste of time.
The OSS for the vi
Re: (Score:1)
On my laptop, which has an ATI Radeon Mobility X1800 chipset (R300 series), support from the open source driver is basically OK. Desktop Effects in Gnome/Compiz work well, although not all effects are available. The subjective performance is good. With recent Linux kernels there is also support for power management of the graphics chipset.
BUT:
1.) VGA output does not work. The video signal is distorted.
2.) With KDE 3.5 this driver/chipset configuration is blacklisted for Desktop Effects.
3.) The machine freezes
Re: (Score:1, Interesting)
It's always been time to 'move away' from nvidia.
I asked years ago which card to use. Linux gaming fanboys said go nvidia, their closed blobs have great 3d support.
WTF do I care about that?!?! When they decide to release new models and want to force everyone to upgrade, they quit maintaining the old blobs. Therefore, I'm left in simple open source 2D land with both vendors. Plus some marginal 3D here and there with both, but whatever. And further, when that blob starts crashing as I update around it, expose b
Re: (Score:3, Insightful)
I'll grant that the situation's always sucked for non-x86 platforms, but Nvidia's done a remarkable job of supporting their older hardware in Linux.
Drivers for GeForce FX Cards, Updated 10/18/2010 [nvidia.com]
Drivers for GeForce2-4 Cards, Updated 11/16/2010 [nvidia.com]
Drivers for the Riva 128 (?!) through GeForce256, Updated 08/04/2010 [nvidia.com]
There are also supported drivers for all of these products for AMD64 Linux. It's no substitute for an open source driver - I support nouveau - but declaring that they leave their old cards un
Re: (Score:2)
Re: (Score:2)
Notice that the release notes for the 96.43.19 version, released a week ago, include "support for X.org xserver 1.8 & 1.9". Yes, xserver 1.8 wasn't supported until a week ago. It was released in April, seven months ago. In other words, if you wanted to run a current xorg for over half of this year with a GeForce 4 and the binary blob, you were out of luck; and their official position (which they fortunately reversed) was "we still support those cards, but only on xserver 1.7.999". This is not "remarkable support" at all.
Note that Debian has nothing but experimental support for 1.9 and just recently only stable support for 1.8, while being 4 versions of the kernel behind in Sid [still stuck on 2.6.32]. Yes, they just released 2.6.36 in experimental but basically view it as ``SOL if something goes wrong because we're busy getting another stable out whenever the hell we figure out the time for it to be ready to come out...'' I'm stuck waiting for even Nvidia's 260.19.21-1 [only got a build because a few of us bugged them to
Re: (Score:2)
Quite right. As Steve Max noted in the grandparent, it can take time for new X server support to manifest in Nvidia's drivers for older products; however, adding that support is a non-trivial undertaking, and I daresay that most systems running pre-GeForce 6 hardware probably enjoy relatively stable and conservative Linux configurations. While that's still a nettling inconvenience for some, I'll take it over the relatively uneven and sometimes slipshod support available for other manufacturers' cards in Li
Re: (Score:1)
From what I've seen in forums nouveau is not so bad actually.
I like that the team is working hard on using Gallium effectively (for example they used it to make VA-API work using VDPAU (decode only)).
As far as I can tell the nouveau driver is more feature-complete than the open source ATI driver, though maybe not as performant. (I'm waiting for open source VA-API support for h.264 encoding and decoding; I am not too concerned about overall performance if it can run Compiz.)
Intel appears to be the cl
Re: (Score:3, Insightful)
I don't know about "time to", but in any case where the software is open vs. closed, the open-source community will not make the effort with the closed system. This will absolutely make linux hackers choose AMD graphics now, which will almost certainly result in improved reliability of AMD cards in linux systems overall, and eventually almost total domination of the consumer linux segment by AMD graphics.
Re: (Score:2)
Re: (Score:3, Interesting)
I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset. The "fglrx" driver does not support it as it is too old, and the open source driver causes massive display trashing and lockups. Whatever they did to it, it's not a faithful R2xx, and so the existing driver (which works on genuine, old R2xx stuff) does not work on it. But that's not all; AMD also didn't bother to con
Re: (Score:1, Interesting)
I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset.
The GPU in rs690 is actually a 4xx variant, not 2xx. Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature ?
Re: (Score:3, Interesting)
The GPU in rs690 is actually a 4xx variant, not 2xx.
Not according to the X driver.
Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature [x.org] ?
It's the CPU power-saving that AMD did not contribute to Linux, not the GPU. I can't USE the GPU long enough under linux to get to the point where I need power saving.
Linux drivers - stable?? (Score:1, Insightful)
Drivers that finally enable full capability of the hardware are a must, be that OSS or closed source.
nVidia has long-term support in their Linux drivers - they are the same as their Windows or OS X drivers, just with an added GPL glue layer. Are the AMD drivers going to be long-term supported too? Stable??
To me,
stable > long term support >>> OSS > closed-source
only because I'm not planning on debugging video drivers!
Re: (Score:3, Informative)
But now there is support for Windows, OSX and Linux. If your OS isn't on that list then they aren't providing you with anything, or even the ability to do it yourself without doing some real funky stuff with wrappers.
Re: (Score:1)
There was GPL glue to get them working on AMD64 pretty quickly, as I remember it. It worked fine, and was pretty easy.
I actually can't verify it took less than a year, but it was available when I purchased my computer in Sept '04, with the first Clawhammer released Sept '03.
Not an official driver, just a little fragment to make the official one work (it was a patched NVIDIA installer, so easy to do).
Re: (Score:2)
The AMD drivers AFAIK are supported almost entirely by the community. There are some devs at AMD that contribute to the drivers, and AMD has been releasing specs and documentation in increments. 3D is still not fully supported on most chipsets, so you need the FGLRX drivers for that, but for normal 2D desktop operations the radeon drivers are actually faster and more stable (fewer artifacts, etc.). At least, that's been my experience. The radeon driver also supports KMS (FGLRX does not) meaning it's the
Re: (Score:3, Insightful)
It all comes down to "supporting old stuff does not bring new sales, therefore is really bad in the long term" vs "I still use/want to use old stuff, therefore I want it to work, and as long as it fits me, I'll support it."
In the OSS community, the only hardware that's not supported is really the hardware that's not used. Hell they ev
Re: (Score:2)
You can get long term support in closed ecosystems, even very closed ones(just ask the nice chap from IBM Mainframe sales...); but it'll cost you. Often a great deal and you had better be sure that the vendor is contractually on board; because they can, otherwise, pull the rug out from under you at their option(If they stop selling product X licenses/support contracts, you pretty much have to stop using product X. Game over. Copyright law. Have
Re: (Score:2)
The issue facing AMD on their older GPUs/chipsets is the licensed tech/code that was included. Due to the contractual obligations, AMD is having to go through all of the code/designs and remove the information they licensed. On the new 4xx/5xx/6xx GPUs, though, AMD has redesigned the chips with OSS support in mind, as it will allow them to eventually quit writing a driver at all and simply support the community that will be creating it.
Ontario Processor? (Score:5, Funny)
I went to buy an Ontario processor, but cheaped out -- I ended up with a Quebec processor. Now, I can't understand a thing it says, it never seems to do anything, and I keep having to give it money!
Re: (Score:1)
Ontario ones are better than the cheap China ones (Score:3, Funny)
Ontario ones are better than the cheap China ones.
Re: (Score:3, Funny)
Also, it periodically runs a system-wide poll to separate and form its own machine that stays in the same case, draws from the same power supply, and uses all the same resources :)
Re: (Score:1)
I am a Quebecker you insensitive clod!
Fusion (Score:2)
Re: (Score:1)
We were all waiting for the years of "Fusion" and "Linux on the desktop", but instead we got "Linux on the Fusion".
Re: (Score:2)
No, net power gain from fusion is still a few decades away.
3d Video cards have been sucking in power to create a small fusion reaction in their GPU for at least a few years now.
Re: (Score:2)
That's just what AMD marketing wants you to think. They chose the "Fusion" trademark so they could perpetually delay their products (they're already pushing 2 years late compared to the date they first announced after buying ATI).
Open Source Fusion Driver (Score:1)
Take a step back, look at the big picture. (Score:2)
Originally computers were huge proprietary things.
Now they are small and commonplace.
In the past software was written to a specific hardware, now it's not ( C is cross platform compared to assembly, folks ).
Games no longer draw graphics by reading and writing raster data directly into hardcoded "video memory" regions.
Abstraction layers (such as a graphics API + Drivers) are a must in todays software environment. Why? To support cross platform software development. Many of todays games sit on top
Re: (Score:2)
Uh, software is still written to specific hardware. You may write it in C, but C doesn't determine the register mappings and semantics. *addr=value is still just mov [addr],$value
Re: (Score:3, Insightful)
Uh, software is still written to specific hardware. You may write it in C, but C doesn't determine the register mappings and semantics. *addr=value is still just mov [addr],$value
Yes, but largely NO!
I write my software in C. The same code compiles and runs on x64 and on x86. The COMPILER translates my cross-platform "*addr=value;" into the appropriate machine-level instructions. My C software programs do not concern themselves with the specific register mappings and processor semantics; this has been abstracted away by the C compiler.
I agree that driver software may be written to the specific hardware, but the purpose of a driver is to abstract said "register mappings and semantic
Re: (Score:2)
My C software programs do not concern themselves with the specific register mappings and processor semantics; this has been abstracted away by the C Compiler.
Your compiler knows jack about the registers on a video card.
If you're writing a video device driver, as here, you're going to be poking values into board-register addresses, streaming data in and out of specific port addresses, and that code isn't portable. It may seem very familiar from board to board, but unless the features you implement are completely trivial it's not. Even at the library or application level you'll have to deal with the different features of the cards, which means a whole new set of
Re: (Score:2)
Re: (Score:2)
An open source driver would give details on hardware internals.
More so than the hardware itself? I think not. If you want me to see a movie I must be able to see the movie, ergo, it is impossible for you to keep me from recording the movie if you let me take the movie home.
If you let me take the hardware home, and the hardware is expected to function for me, then I will be able to operate and analyze the hardware and write my own damn driver which, you guessed it, will expose the same details as the Mfgr's driver would if it were open source... Hell, not giving me t
Fusion for (light) servers? (Score:2)
There's still a niche that isn't very well served, where these low-power Fusion CPUs appear they could kick some major ass: the always-on-24/7 lightly-loaded server. I'm currently using Athlon II xxxe for this, but I'd happily downgrade processing power in exchange for lower wattage.
Shit, Atom would be good enough, if the motherboards had enough SATA ports or slots for me to add SATA cards, but I never found any that did. Gimme a 9W or 18W processor on a board that I can somehow hook up 8 drives to, and
Re: (Score:2)
Re: (Score:2)
You get quite far with a single 2.5" disk these days, and that's easy to fit in every tiny server. 8 spinning disks is niche.
Re: (Score:2)
One drive is too few. All mechanical drives must be used in RAID1 pairs. (Or RAID5 if you're super-stingy and have extreme disk space requirements but IMHO that's almost always the wrong answer.)
Then, for PVRs, I figure the ideal number of pairs (mythtv storage groups) is tuners+1, so you can record and play without any seeking or fragmentation of the recordings, thus 6 drives is the ideal for an hdhomerun-fed system. (The +1 is a simplification; it can theoretically rise to approach the number of fronte
Re: (Score:2)
Damn, that looks exactly like what I wanted. Very nice find. Thank you, AC.
Anything missing? (Score:2)
Last big announcement about an AMD code drop, there was still something missing, though I don't remember what. Features, performance, whatever, there was still something not there that was either present in proprietary AMD drivers or nVidia drivers.
Are we past that yet? Is it finally time to dump nVidia for AMD?
Still not 100% open... (Score:2)
Let me know when I can buy a GPU where every single feature of the card (INCLUDING the on-board dedicated circuitry for decoding video) can be used in the open source drivers and then maybe I will care...
Core 2 is dying; Intel's next on-board video is 9400M-level (Score:2, Interesting)
Core 2 is dying; Intel's next on-board video is at nVidia 9400M level, but it also locks out better on-board video.
Some of Apple's systems may not fit a full x16 PCI-e video chip.
Apple may use i3/i5 CPUs with an added ATI or nVidia chip, but they don't like to use add-in video in their low-end systems.
Re: (Score:2)
Hey scrub, Fusion is the same size as intel's onboard video. RTFA.
Re: (Score:2)
Re: (Score:3, Interesting)
Requires running the vertex shader first (Score:2)
Basically to do things like calculating what things are visible so that the processor doesn't have to send those over the bus.
Calculating occlusion requires knowing where the points are relative to the camera's line of sight, which requires running the vertex shaders first. How much of a speed boost would result from running the vertex shaders on an on-die GPU and delegating pixel shading to the discrete GPU? I'd imagine that pixel shaders, which are run for each pixel, need a lot more time than vertex shaders.
When AMD turns to 28nm production... (Score:3, Interesting)
Any chance Apple could use that for the next versions of Mac mini and MacBooks? Or is a Core 2 Duo with nVidia 320M still better than Fusion?
... according to Fudzilla.com
http://www.fudzilla.com/notebooks/item/20888-amd-apple-deal-is-28nm-notebooks [fudzilla.com]
"Fusion goes Apple 28 / 32nm
It all started here, when AMD’s Senior VP and Chief Sales Officer Emilio Ghilardi was brave enough to show an image of several Apple products in a Fusion presentation. After we wrote our part AMD was quick to deny it, perhaps a bit too quick, which gave us a reason to dig some more, only to find that we were on the right track.
We asked around and some sources close to I
Re:When AMD turns to 28nm production... (Score:5, Interesting)
Re:When AMD turns to 28nm production... (Score:4, Funny)
And you think Apple customers are that worried about price?
Re:When AMD turns to 28nm production... (Score:5, Insightful)
And you think Apple customers are that worried about price?
Apple customers are going to pay a premium no matter what. It's Apple that wants the discount. The less Apple pays for the hardware, the larger the margin they get with each product. Apple's customers aren't going to see any discount, even if Apple's discount is $100 per processor to move to AMD.
Apple has $50B in cash. Considering what they sell, that's an absurd amount of money. Enough to buy Sony outright. It just goes to show you the enormous margins that consumers pay for Apple products. It's like Sun / Oracle / Cisco in the 90s, except these are consumers that are paying the outrageous margins rather than large money-fat corporations.
Re: (Score:2)
It just has to be shiny and have an Apple logo. The rest does not matter.
Re: (Score:2)
And you think Apple customers are that worried about price?
Those price points guarantee Apple a higher profit margin.
Re: (Score:3, Informative)
I'm not sure about Android portable media players, but there are tablets that could be regarded as equivalent in intended use to an iPad.
Re: (Score:2)
I'm not sure about Android portable media players, but there are tablets that could be regarded as equivalent in intended use to an iPad.
In order for your objection to be carried, you are going to have to show that there are Android-based media players with an app store which are also otherwise comparable to the iPod Touch and/or iPad. So far tablets don't have app store access. Perhaps in Gingerbread, but I wouldn't hold my breath.
Re: (Score:2)
Re: (Score:2)
Their tablets are uniformly poorly reviewed compared to Apple's offerings and the battery life is not even in the same ballpark.
Re: (Score:2)
Re:Price of Android pod touch (Score:4, Insightful)
That's the same argument fanboys always use to call Apple products cheaper. Hand-pick your specific criteria that must be included (app store) and excluded (an actual phone...) until you get just the right oddball combination of features that you can call it cheaper.
Meanwhile, when you compare the iPod Touch to other touch-screen media players, its pricing is atrocious, and Apple's laptops, desktops, and servers all fare equally poorly against their general competitors.
As a matter of fact the only segment in which Apple competes well on price is with iPhone. It's about the same as other similar smartphones. Other than that though? You're definitely paying your turtleneck-tax.
Re: (Score:1)
Last time I specced an Apple desktop vs. any others, it was the cheapest.
It had to be compared to workstations, as there were no other brands matching Apple's in the desktop range.
Re: (Score:3, Informative)
Funny. Last time I bought a PC, the cheapest Apple option for my needs was the most expensive iMac. It would have cost twice what I paid, and performed worse. Apple simply isn't competitive in the midrange.
Re: (Score:2, Informative)
Just looked at their prices too, they've gotten worse.
And the crappy displays on the iMacs (maybe this has improved) kill them for serious use, leaving the cheapest desktop at $2500, and it's only one CPU.
But trying to match their 5k computer at Dell runs 6k.
Re: (Score:2)
when you compare the iPod Touch to other touch-screen media players, its pricing is atrocious
What other touch screen media players can I try in stores, even those without an app store?
Re: (Score:2)
Archos, Zune, etc.. There are other little guys in the market as well, but I can't remember them off the top of my head. Archos has pegged on Android, so they'll probably get app store access and Zune is tied to the Microsoft ecosystem for better or worse.
Re: (Score:2)
Archos has pegged on Android, so they'll probably get app store access
Any evidence of that? Archos has had Android media players out for years, but none have Android Market access. Archos even set up its own "AppsLib" store out of frustration with Google not opening up Android Market to devices other than telephones. But as far as I know, the selection on AppsLib is nowhere near that of Android Market.
Re: (Score:2)
Why try it in a store? Buy it online and if you do not like it return it.
Welcome to the 21st century.
The 15% restocking fee (Score:2)
Buy it online and if you do not like it return it.
And pay how many 15% restocking fees?
Re: (Score:1)
You don't need to handpick criteria in order to make Apple products seem cheaper. All you really need to do is include some criteria other than clock speeds and memory capacities. Geeks have a tendency to ignore advantages that aren't trivially quantifiable, even when those advantages have real monetary value to most consumers.
When you look at Apple's product line, you find that many products have no true head-on competitors. Most obvious are the iPod Touch, the iMac, and the Mac Mini. Those are products th
Re: (Score:2)
As I understand it the android market is not a requirement for installing apps.
Re: (Score:2)
As I understand it the android market is not a requirement for installing apps.
It is if the publisher of the application that you want to use has decided not to make it available other than through Android Market. I imagine this to be the case especially with paid apps that have no close freeware substitute.
Re: (Score:2)
AMD CPUs you mean? Apple uses Radeon parts.
As for why they don't go AMD, it's probably due to AMD's supply line for midrange CPU parts. Note that Apple doesn't sell Celeron chips; they sell C2D, Core i5, i7 and Xeon.
Re: (Score:2)
For quite some time, Paul Otellini, board member of Google, was also sitting on Intel's board while Eric Schmidt sat on Apple's board. While this linking of boards in itself is not proof of anything it is suggestive that some "out of band" communication may have been taking place. (In my opinion, Google as a huge consumer of data center hardware should be expected to avoid on its own recognizance all appearance of bias towards one processor manufacturer or another.)
Re: (Score:2)
And if you want future Apple news to happen, don't leak it.
Steve Jobs has canceled actual products and ripped up supplier contracts for much less than this.
Re: (Score:2)
Does that work also for "inventing" new products / making sure Apple won't go into certain market segments? ;)
Re: (Score:2)
Any chance Apple could use that for the next versions of Mac mini and MacBooks? Or is a Core 2 Duo with nVidia 320M still better than Fusion?
... according to Fudzilla.com
http://www.fudzilla.com/notebooks/item/20888-amd-apple-deal-is-28nm-notebooks [fudzilla.com]
"Fusion goes Apple 28 / 32nm It all started here, when AMD’s Senior VP and Chief Sales Officer Emilio Ghilardi was brave enough to show an image of several Apple products in a Fusion presentation. After we wrote our part AMD was quick to deny it, perhaps a bit too quick, which gave us a reason to dig some more, only to find that we were on the right track.
We asked around and some sources close to Intel / Nvidia have denied the rumour saying that they know nothing about it. However, just a day later we managed to confirm that the leak is real and that Apple will indeed use Fusion, here.
Our industry sources have indicated that the deal will be announced in at some point 2011, that it will involve 28nm and 32nm Fusion parts particularly Krishna and that Apple plans to launch notebooks based on AMD chips. Apple is also not cold hearted on Trinity 32nm Fusion parts.
The announcement can be as far as a year away, as 28nm parts won't materialise until the second half of 2011 and since AMD doesn’t have a tablet chip, it won’t happen in iPad segment. At this point Apple doesn’t plan to use any AMD chips in desktop or server parts, but in case Bulldozer impresses us all, maybe Steve might change his mind.
So if you like Apple and love AMD, start saving money as roughly a year from now you should be able to buy Apple notebook with Fusion Krishna / Trinity class APU."
And if you want Fusion benchmarks, check the usual suspects: http://techreport.com/articles.x/19981 [techreport.com] http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked [anandtech.com]
Invest that savings into AMD Stock and when the additions happen enjoy the ride. Apple will never use Intel or AMD in their embedded devices [iPhone, iPod, iPads] as their A# ARM based CPU/GPU combo they can control and develop with incredibly high ROI.
Re: (Score:2)
The word is that they are seriously considering it at least. (And "the word" is the best you get when discussing Apple)
Re:Cool (Score:4, Informative)