Torvalds Slams NVIDIA's Linux Support 663
New submitter jppiiroinen writes "Linus Torvalds received the Millennium Technology Prize last week for his work on the Linux operating system. He was already in Finland, so Aalto University arranged a talk session with him (video). During the Q&A, someone asked why NVIDIA does not play well with Linux. Torvalds explained briefly that NVIDIA has been one of the worst companies for the Linux project to work with — which makes it even worse that NVIDIA ships a high number of chips for Android devices (which run Linux inside). Torvalds even summarized it ('Nvidia, f*** you!') in a playful manner. What has been your experience with NVIDIA drivers on Linux?"
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:5, Interesting)
Better than AMD. (Score:2, Interesting)
I recently decided that I'd put up with my Radeon for too long and dumped it for an NVIDIA GPU on my desktop. I like how AMD is more willing to work with the open-source community, but the driver they are working on isn't coming along fast enough, and development seems more focused on adding support for newer hardware (not a bad goal, of course) rather than squeezing out more performance. Unfortunately, with desktop environments like Gnome 3 demanding a decent level of 3D acceleration, I found that the open driver couldn't handle a multi-display desktop with decent performance with more than a couple of windows open (it gets worse if I also want to play a video on my second screen) on my Radeon 5850/4850. My 5-6 year old PC using just 2D acceleration felt snappier than Gnome 3 did with my Radeon, which makes me sad. NVIDIA's proprietary driver does a lot better in that regard. Catalyst appears to be dropping support for all but the most recent GPUs; I'm not sure how well it does with Gnome 3, since they are so far behind with kernel and xorg support that I can never actually run it without downgrading a whole bunch of stuff.
Only good experiences, using the binary blobs (Score:4, Interesting)
I have had far fewer issues with nVidia cards than with ATi/AMD ones. I have had both for more than ten years.
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:2, Interesting)
I suspect your definition of "realistic time frame" is a bit different from some people's (especially those who consider Linux their primary OS and won't touch Windows).
When I buy new hardware, I wouldn't want to wait months/years to use it... So nVidia clearly considers Linux a second-class citizen, which may be OK for you, but not for some.
It doesn't mean that the other companies are any better - but nVidia's "high road", as you make it out to be, really just makes them the bottleneck when it comes to hardware driver support. It puts them in a position where they MUST create the drivers in a timely fashion, because there is no other choice.
Whereas, with other vendors, the existing reference drivers can often be fiddled with to gain partial support for new hardware, and as specifications are released, anyone with the know-how can begin adding support for that hardware - the bottleneck becomes the availability of talent and motivation.
Anyhow, I take a different road - I avoid high-end graphics hardware entirely, and since I'm not a "gamer", it doesn't matter to me. I just use hand-me-down hardware that people give me and I'm content with it - but I do usually favor AMD's graphics chips since they are more open by nature.
Re:Problems? Really? (Score:2, Interesting)
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:1, Interesting)
A lot of Android phones are stuck at 2.2/2.3 because the old drivers won't run on the new kernel.
Then Linux should keep its driver interfaces stable.
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:2, Interesting)
Hum. Actually, yes. Yes it is. Or their cards would just be a complex mass of metals, plastic and rare earths with no use whatsoever. Except maybe as a suboptimal heater.
N.
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:4, Interesting)
Then whose job is it?
Whose job is it to create reference drivers and release specifications for the hardware that nVidia makes?
nVidia seems to make it their job by refusing to release specifications for the hardware they create. They create this hardware to work with other open-specification hardware and software, and yet, they intentionally keep their own specifications secret.
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:4, Interesting)
> They only support a narrow range of product before they fall off, and you get the 2D only version.
News to me. I have pre-AMD-buyout ATI 9200s on AGP slots still in production. They run Compiz pretty well. And they ran right out of the box: no futzing with odd repos, no limiting to specific kernels, etc. Just install and go.
Avoid the newest and AMD just works. Nvidia doesn't. Yet.
On the other hand, I have an Nvidia card in my MythTV box, and I do have to futz with drivers in exchange for accelerated HD playback with GPU-assisted deinterlacing.
It is stupid, keeping the specs secret is a lose/lose for everyone.
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:0, Interesting)
Then whose job is it?
When the OS is your biggest money maker (Windows), it's Nvidia's job.
When the OS is mostly a loss to the company (Linux), then it's Linux's job.
Re:Problems? Really? (Score:5, Interesting)
> What I know is when nvidia came out, I was seeing
> thousands of posts from people desperately seeking
> answers on how to get them to work, and thousands
> more on how to make their X Window survive upgrades.
Yeah, and they solved that problem. An entire module rebuild facility for kernel upgrades was probably developed just for Nvidia.
That benefits ATI blob drivers too. This is a good thing since you are unlikely to get suitable performance (if you care about that sort of thing) without the blob driver.
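That module-rebuild facility is essentially what DKMS does today. A minimal sketch of the flow it automates after a kernel upgrade (the module name `nvidia` and version `304.64` are placeholders, not a claim about any particular release; the function only prints the commands rather than running the real `dkms` tool):

```shell
#!/bin/sh
# Sketch of the rebuild step DKMS performs for an out-of-tree module
# (e.g. an NVIDIA blob) after a kernel upgrade. Module name and version
# are hypothetical placeholders; this prints the commands instead of
# executing dkms, since real runs need root and the module source.
dkms_rebuild() {
    mod="$1"; ver="$2"; kernel="$3"
    echo "dkms build $mod/$ver -k $kernel"
    echo "dkms install $mod/$ver -k $kernel"
}

dkms_rebuild nvidia 304.64 "$(uname -r)"
```

In practice the distro's kernel packaging hooks invoke DKMS automatically, which is why blob drivers mostly survive kernel upgrades now.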
Re:Alot better than ATI (Score:5, Interesting)
> OMFG NOOOOO there is no possible way that NVIDIA could operate like every other chip maker on the face of the planet.
No, we want them TO operate like 'every other chip maker' and get with the program. Name another major chip vendor who hasn't figured out that getting into the Linux kernel is a required checkoff for market success. Doubly so for any product used in the enterprise vs the fanboi market. NVidia is just about the entire list these days, the last major holdout.
A few still maintain a desperate final stand in the embedded market, but few new vendors go the closed route, and every year brings another of the dead-enders over to the open camp. First to fall were the storage products, then Ethernet and CPU makers. Wifi is holding out on the blobs due to fear of the spectrum regulators, but most now support an open driver for the kernel-to-firmware interface.
In defence of Linus (Score:5, Interesting)
I like Linus. He speaks his mind, and that is good. He does not strive to be a politically correct suck-up like most of the soulless corporate talking heads you see all over the place.
He has every right to say "Nvidia, fuck you". How should his message be sugar-coated? Should he write a 500-page NY Times bestselling book about the matter? Hold a seminar? Issue a press release? Have a meeting with Nvidia's CEO, CTO and CIO, presenting empty Powerpoint fluff and wanking around the issue in such abstract terms that it can be interpreted in any which way, after which everyone thinks they've done their part but nothing happens as a result? No. Three small words suffice. Why waste more time and effort? To avoid hurting someone's feelings? Don't be such a baby.
I think part of how Linus comes across as he does is a cultural thing. Although he has lived in the USA a lot, he's still a Finn. If you need to deliver a message to someone who is not behaving, you deliver the message; wrapping it up in a pink box with a greeting card full of hearts is pointless. And let's face it, Nvidia hasn't been a model citizen - if you're a dick, don't be surprised that others are dicks towards you.
And you ask: what incentive does Nvidia have to support Linux? Well, how about not making life hard for the people who pay actual real money for Nvidia's products? And not making life hard for the people who try to support Nvidia products on a great OS in their free time?
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:5, Interesting)
It is stupid, keeping the specs secret is a lose/lose for everyone.
When I worked for a chip company, we weren't allowed to release specs.
Why? Guess.
Patents. We had no idea whether we were violating some obscure patent that no-one had ever heard of, and we weren't willing to put the specs out there where any troll could sue us for millions.
Comment removed (Score:4, Interesting)
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:4, Interesting)
Of course, but the point still stands. No one owes anyone anything. Nvidia has absolutely zero responsibility to make their chips work with Linux. They are free to decide exactly which platforms they want to support. Some people in the Linux community are, however, so used to getting everything they want for free that they think companies that don't provide it are somehow evil. No, they are not evil; it just happens that they are not especially friendly either. It is fair.
No, it's totally unfair. Have you shopped for a graphics card recently, with the goal of putting it in your Linux box? There are currently only two choices: Nvidia or ATI. Both have totally horrible drivers in Linux, because the chip makers aren't being COOPERATIVE. That is, they're just not giving enough so that someone can make a decent driver. The problem isn't only that Nvidia isn't helping; the problem is that they aren't helping AND we have no other choice.
I hate being in a position where I'm forced into buying a product that:
1/ I don't like (cards are too big, often with stupid fans)
2/ Is too expensive for what it does (I'm just asking for my computer's output on DVI / HDMI / VGA... what's so hard and expensive about that?)
3/ Has features which I don't care about (eg: 3D and gaming shit...)
But there are simply no alternatives.
The only thing I'd like is a fucking decent card that I can plug into a screen (if possible, multiple screens, all supported). But in this day and age, it's not possible to have simply THAT working correctly. So yes, we have every reason to hate Nvidia and ATI/AMD.
Oh, another thing. Again, nobody is asking for software, just documentation. It'd be even better if they shipped no Linux driver at all, because their non-free stupid software is crap. In fact, I HATE the bloated screen-configuration tool that Nvidia delivers on their site as well. It's simply not convenient at all. To switch from the laptop's screen to the TV, you need around 15 clicks. Also, why should we use something special for Nvidia, and not the screen manager that's in Gnome by default? This proprietary software needs to DIE, especially since we have no choice but to use it.
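For comparison, the stock X tooling already does the laptop-to-TV switch in one command. A sketch using plain `xrandr` (the output names `LVDS-0` and `HDMI-0` are assumptions; run `xrandr --query` to see the real names on your machine — the function prints the command rather than executing it, since names vary per setup):

```shell
#!/bin/sh
# One-command switch from the laptop panel to an HDMI TV using plain
# xrandr instead of a vendor tool. Output names are hypothetical
# placeholders; this prints the command it would run.
switch_to_tv() {
    internal="$1"; external="$2"
    echo "xrandr --output $internal --off --output $external --auto"
}

switch_to_tv LVDS-0 HDMI-0
```

One shell alias around that and the "15 clicks" become zero, on any driver that exposes RandR.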
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:5, Interesting)
No, that's because they're major releases, when it's OK for the ABI to change. Again, something that every other remotely major OS manages quite well.
Yes it does, and yes it is. Drivers breaking between minor revisions within a major release is very uncommon on pretty much every platform except Linux.
Heck, it's not unusual to see drivers working across even major revisions.
Windows, OS X (more recent versions, at least), Solaris, FreeBSD, etc. It's a struggle to find any remotely major OS other than Linux without stable ABIs.
Comment removed (Score:5, Interesting)
Re:Which is why I find it doubly funny (Score:5, Interesting)
Basically it is a really complex problem, and of course each new version of the graphics hardware brings in a new setup to deal with.
I think particularly the last part. Unlike CPUs, which have to be 99.9% the same to support already-compiled binary code, graphics drivers only care about the DirectX/OpenGL layer. Everything about how you accelerate those commands is being rewritten constantly. For example, the AMD OSS drivers cover three very different architectures: VLIW5, VLIW4 and GCN. And within each architecture you have different generations with different ways of doing things and different instruction sets. The hardware API keeps changing because the hardware team works closely with the driver team, who are the only ones talking to the hardware - until you try writing an OSS driver. Third-party chips like HDMI change both suppliers and versions, so the hardware API changes; without code changes, practically nothing works on a new card. There's a lot of upkeep.
The other part is that the generic code is woefully behind the times, regardless of the driver code. Mesa still only supports OpenGL 3.0, which was released in July 2008, and that support arrived only this year - at that point 5.5 years behind the specification. So if you want to run recent OpenGL code, you need closed-source drivers, because the whole stack is missing, not just the driver code. Basically, even if AMD is doing the same bits as it does for Windows, nobody's doing the OpenGL equivalent of Microsoft's work on DirectX. Or well, obviously some are working on it, but not enough to keep up.
The last part, which makes sharing code between the open and closed source drivers hard, is DRM. AMD simply can't let the open source driver contain any code that would make it easy to poke at what the closed source driver is doing - for example, patching it to dump a BluRay to disk (despite AACS, BD+ and HDMI all being broken). Same goes for audio and PAP. Even just keeping the DRM bits in a little blob by themselves would be painting a big sign saying "reverse engineer this". This means things have to go back and forth with the lawyers all the time, and you need this information because of what I wrote in the beginning.
On the bright side, Intel seems to want to use more of their own graphics in coming Atoms - google "Intel Valley View" for more - because PowerVR has been the absolute worst of the bunch when it comes to Linux support, and pretty terrible at Windows support too from what I gather. And at least according to AMD, their OSS support is getting better with each generation, even though it has a long way to go...
Re:Which is why I find it doubly funny (Score:4, Interesting)
Re:THEN YOU DO IT MISTER HIGH AND MIGHTY !! (Score:2, Interesting)
Re:Once upon a time (Score:5, Interesting)
I was looking for an excuse to expand on this already overlong story but didn't want to be rude and self-reply. Thanks for giving me the opportunity.
One must be mindful that these offers were all carrot and no stick. The developers came with a plausible story: we have experience and insight into the big company's software, as many of us came from there. We know how to pass validation. We have the inside track to getting on the CD, and can speed your way to market. We can use our secret ways to optimize it because we have special insight we can't share even with you. All we ask (other than pay) is that the interfaces become private between us. We will help you develop your hardware so that the hardware interfaces presented are optimal for interfacing with the software, and we don't want to share that work with others for no pay, which is fair, right? They had good stuff to offer too: the benefits of some deep research into compositing that the hardware vendors couldn't get some other way - but it always came with this catch. And it seemed like such a little catch at the time, since there were no credible challengers to the big company's ware. And it seemed quite reasonable to work together and not share with outsiders. But the devil is in the details.
Only rarely would the stick come out, in reference to some other company: "oh, that seems to be a smart way to think. So-and-so thought so." So-and-so being a dead company who failed to come around to the "right" way of thinking. The threat implied was never stated outright.
Later, when hardware vendors wanted more, they got more committed. Implement that new hardware feature in the OS game engine rendering interface? Sure... but there's more cost than just money. Want the standard user interface to leverage high-end blurring, transparency and shadow features? Sure... but how that works has to remain private between us. That requires a specially committed level of partnership. Along the way there were more patents to incorporate and license, and a stronger bond to build, until the hardware manufacturer is committed to the big vendor's software and none other - in a way they can't be free of even if they want to be. These aren't just patents and copyrights: they're trade secrets too, and those are immortal. Each is as much to blame as the other, as they used each other to mutual advantage. There's enough dirt in there to get mud all over everybody, and nobody wants that.
Every now and then some PFY trying to implement a feature for X will call up the hardware vendor hoping for some help. "So I've got some app in the debugger, and I can see it load a texture in the buffer and trigger the interrupt that submits it to your hardware. But there are mode-setting things in here that have been deserialized and I can't see which one goes first, or the right grammar for the call so when it doesn't crash it looks like crap. Throw me a bone. Feed me just a tiny little hint please, I'm dying here." These calls used to be fielded by actual developers who might be conflicted and want to say the easy truth but would instead give the same bored answer every time: "sorry, but that's a trade secret." And never would they say the big secret: "and it's not our trade secret so we'll never be able to answer these questions." Now it's probably handled by some flunky in Bangalore who couldn't give the right answer if he wanted to. It might as well be a recording - but they still want to pretend that they care.
This is all in the desktop and laptop arena of course. Servers are different. The big software company didn't have tyranny over server vendors like they did over desktops. Servers had to support Unix at first, and then Novell, and then Linux - to the point where no server company could survive or even be taken seriously with servers that could only run the big company's software - though they did try, notably with Broadcom network chipsets. The special features of the software/hardware interface just weren't as important there.