


NVIDIA Releases New Video API For Linux

Ashmash writes "Phoronix is reporting on a new Linux driver nVidia is about to release that brings PureVideo features to Linux. This video API will reportedly be in nVidia's 180 series driver for Linux, Solaris, and *BSD. PureVideo has been around for several nVidia product generations, but it's the first time they're bringing this feature to these non-Windows operating systems to provide an improved multimedia experience. This new API is named VDPAU, and is described as: 'The Video Decode and Presentation API for Unix (VDPAU) provides a complete solution for decoding, post-processing, compositing, and displaying compressed or uncompressed video streams. These video streams may be combined (composited) with bitmap content, to implement OSDs and other application user interfaces.'"
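The stages the VDPAU description lists (decode, post-process, composite, present) can be modeled as a simple pipeline. The sketch below is a conceptual toy in Python, not the real VDPAU interface: libvdpau is a C library with a very different surface, and every class and method name here is invented purely for illustration.

```python
# Toy model of the decode -> post-process -> composite -> present
# pipeline that VDPAU exposes. All names are invented for illustration;
# the real API is a C library (libvdpau) with a different surface.

class Surface:
    """A video frame: here just a label plus the layers drawn on it."""
    def __init__(self, label):
        self.label = label
        self.layers = []

class Pipeline:
    def decode(self, bitstream):
        # Real hardware would turn compressed data into decoded surfaces.
        return Surface(f"decoded({bitstream})")

    def postprocess(self, surface):
        # Deinterlacing, scaling, color conversion, etc.
        surface.layers.append("postprocessed")
        return surface

    def composite(self, surface, osd):
        # Blend bitmap content (subtitles, player controls) over the video.
        surface.layers.append(f"osd:{osd}")
        return surface

    def present(self, surface):
        # Queue the finished frame for display.
        return f"{surface.label} [{', '.join(surface.layers)}]"

pipe = Pipeline()
frame = pipe.decode("h264-frame-0")
frame = pipe.postprocess(frame)
frame = pipe.composite(frame, "volume-bar")
print(pipe.present(frame))
# -> decoded(h264-frame-0) [postprocessed, osd:volume-bar]
```

The point of the design is that each stage hands off a surface, so an application can composite its own UI bitmaps before presentation, which is what enables OSDs without a round trip through the CPU.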
This discussion has been archived. No new comments can be posted.


  • Form follows code. (Score:3, Interesting)

    by Ostracus ( 1354233 ) on Friday November 14, 2008 @07:41PM (#25766857) Journal

    Fine. Now what programs use this API?

  • Does this mean... (Score:1, Interesting)

    by Anonymous Coward on Friday November 14, 2008 @08:26PM (#25767179)

    that my Linux media box can now do video decoding on the video card instead of the processor?

  • Re:ATI (Score:5, Interesting)

    by Ash-Fox ( 726320 ) on Friday November 14, 2008 @08:30PM (#25767213) Journal

    ATI was opening up their drivers. The OSS drivers were working well, and Nvidia wasn't doing anything. Nvidia addressed their horrible Linux XRender support, and now this. I may just have to stick with Nvidia in the spring.

    It is actually quite far from the truth.

    You might want to read a blog post I wrote about why nVidia rocks when x.org does not [livejournal.com]. It's likely to give you more reasons to move over to nVidia over ATi.

    The only thing nVidia is not doing is making their enhancements open source.

  • Re:ATI (Score:5, Interesting)

    by Jherek Carnelian ( 831679 ) on Friday November 14, 2008 @09:21PM (#25767587)

    You might want to read a blog post I wrote about why nVidia rocks when x.org does not. It's likely to give you more reasons to move over to nVidia over ATi.

    I don't find your arguments compelling.

    For one thing, you assert that "because of vocal powers in the foundation that demand that things should stay compliant to a specification and they should work around the architecture rather than strip out certain pieces and implement them, add proper new features (memory management and API functions to go with it)" -- yet my reading of the Xorg mailing lists suggests that is exactly what is being done with the GEM memory manager and APIs [phoronix.com]. Previously there was the TTM memory manager, but the APIs were not satisfactory, so they ripped it out and started again.

    The bulk of your argument seems to be that Nvidia has a much more complete OpenGL implementation than anyone else. Never mind that almost all of it is simply code duplicated from their MS Windows driver; your argument is really the age-old "if it works, then who cares if it is closed source" argument we've heard time and time again.

    Of course the fallacy of that approach becomes obvious the second it stops working and you are helpless to do anything about it.

    That happened to a guy I know: he spent about $600 on a pair of top-end nvidia cards a few years back, all based on nvidia's highly touted support for linux. Except the cards did not work with his IBM T220 [wikipedia.org] monitor. It had nothing to do with the ultra-high resolution; it was a trivial bug in the nvidia drivers: if the card could not read an EDID, the drivers assumed the card had a single-link DVI transmitter. A stupid, stupid bug, because the actual nvidia chip had the DVI transmitters onboard and they were always dual-link; there was no way for any card in that generation to even be single-link. And of course, no matter what directives we specified in the config file, the driver "knew better."

    He had to go out and spend another ~$150 for two Gefen DVI Detectives [gefen.com] just to enable the nvidia card to see an EDID so that the driver would correctly turn on the chip's DVI transmitter.

    Nvidia's vaunted customer support? Totally clueless and useless; they completely dropped the ball, just ignoring the issue once they realized it was more than a "did you plug in the power cord" level issue.

    And don't think that problem was unique to an odd-ball monitor - the same lack of an EDID is an issue for anyone using unidirectional fibre DVI extender cables.

    So, while it is great for you personally that Nvidia's drivers work perfectly with the hardware you own, I'm pretty sure your tune would change right quick if you had to just bend over and take it due to such a trivial bug - the kind that could easily be fixed with a line or two of code, if you just had the source.
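The failure mode described above can be sketched. One real detail worth knowing: a base EDID block is 128 bytes, starts with the fixed header 00 FF FF FF FF FF FF 00, and its bytes sum to 0 modulo 256 (the last byte is a checksum). Everything else below is hypothetical driver logic, not NVIDIA's actual code, showing both the buggy "no EDID means single-link" fallback and the one-line fix of honoring a config-file override.

```python
# Hypothetical sketch of the bug described above; not NVIDIA's code.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block):
    """A base EDID block is 128 bytes, starts with the fixed header,
    and sums to 0 modulo 256 (the final byte is a checksum)."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

def pick_dvi_link(edid, forced_link=None):
    """Decide the DVI link mode. In reality the driver would parse the
    monitor's capabilities out of the EDID; here we just return
    dual-link when an EDID is readable."""
    if edid is not None and edid_is_valid(edid):
        return "dual-link"
    # The one-line fix: honor an explicit config-file override before
    # falling back to a guess.
    if forced_link is not None:
        return forced_link
    return "single-link"   # the "safe" default that broke the T220

# No readable EDID (e.g. a fibre DVI extender), but an explicit override:
print(pick_dvi_link(None, forced_link="dual-link"))
# -> dual-link
```

The anecdote's complaint is exactly the missing middle branch: without it, any monitor path that can't deliver an EDID gets the wrong guess, and no config directive can correct it.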

  • Re:ATI (Score:4, Interesting)

    by Ash-Fox ( 726320 ) on Friday November 14, 2008 @09:44PM (#25767757) Journal

    Here is to hoping that Wayland addresses some of these issues.

    Wayland is not a new x11 implementation; it's a completely new windowing system, similar to Aqua, as it would have widgets built into the server, among other things.

    Personally, I love x11; it's great. The majority of the issues currently with x.org and xfree86 are not x11-related, but architecture problems in x.org itself. A clean new implementation of x11 would probably benefit us a lot more than another Y Windows, Wayland, Aqua, etc.

  • Re:Why? (Score:3, Interesting)

    by AaronW ( 33736 ) on Friday November 14, 2008 @10:07PM (#25767863) Homepage
    I use whatever works. For me, the nVidia closed source driver works better than any open source driver.

    I struggled for months with ATI and cursed it every few minutes when it would screw up the text in my editor (closed source driver). By the time the open source driver came out I had already dumped the computer for one with an nVidia card because the drivers just work, out of the box. Intel wouldn't recognize the monitor, ATI would constantly screw up, if I could even configure it for the monitor, but the nVidia one just worked. They provide better features than any other driver I've seen, both open and closed source, and their performance has always been better.

    Similarly with photo software. I use the Linux version of Bibble Pro [bibblelabs.com] because I have yet to find an open source equivalent that is anywhere close. Gimp doesn't come close in its RAW handling or ease of use for workflow processing.

    I've tried to use ATI but I had way too many problems and terrible performance. Similarly with the open source Intel drivers which did not work at all.

    Open Source has many benefits, but when it doesn't do what I need, I'll pay for something that does.

    I write software, both open and closed source, and am currently hacking on the Linux kernel. I think both have their place and neither side is perfect.
  • by Kjella ( 173770 ) on Friday November 14, 2008 @10:31PM (#25768005) Homepage

    Yeah, people always say that, until a show-stopper bug comes along in the 2% that's closed and they can't do a thing about it.

    As opposed to the 100% that's closed, which is not nearly as terrible as you make it out to be? The reality is that most open source bugs I can't do anything about either, for practical values of "can't". I could file a bug report; been there, done that, and often it falls into the same black hole as with closed source software. I could try to dig around in the code myself, but just getting all the build requirements and trying to figure out the code base usually takes hours of time I don't want to spend. Bonus points if it's written in a language I don't know well. Or I could hire someone to do it, but I'd have to negotiate, put together some reproducible cases (real fun if it happens randomly or is specific to my hardware) and follow up with testing and payment. If nobody gives a shit about my problem and doesn't want to fix it to improve the application, only for the money, then hiring your personal developer gets real expensive real quick. None of my home PCs run anything mission critical; the solution to 99% of my issues is to simply roll back or not upgrade, if I've wisely tested something in advance.

    I think open source is really great in that people can borrow code from other projects, applications can fork, and groups can develop specific functionality independently, which makes it possible for development teams to build on everything that's been done before rather than starting from scratch. That process, I think, in the end creates very mature software, but to be honest I really don't see it being that much help with fixing acute problems. Open source means there's competition in providing support, and that you can hire people to do custom development, but unless you're paying very well there's no guarantee at all that your problem will be solved in a timely fashion. Which is like closed source support, in my experience: some bugs get fixed while others are largely ignored. I suppose the escalation options are better with open source, but I really don't see a situation where I, in my capacity as a home user, would ever use them.

  • Re:is jesus real? (Score:2, Interesting)

    by Anonymous Coward on Friday November 14, 2008 @10:56PM (#25768105)
    You mean that some scruffy, middle eastern looking dude with a history of extremist rhetoric [wikipedia.org], violence [wikipedia.org], and worst of all Marxist communist socialist redistributionist tendencies [wikipedia.org] might not find a warm welcome?

    I bet we'd send him to Gitmo.
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Saturday November 15, 2008 @01:59AM (#25768815) Journal

    usually takes hours of time I don't want to spend.

    But at least you have the option of spending that time.

    My personal example, from a while ago -- Linux kernel 2.4, which didn't have native support for AGP 3.0 / AGP 8x. No matter what I did, I couldn't force it back to an older standard, and I wouldn't have wanted to anyway. Which was all fine -- ATI implemented AGP in their drivers to compensate -- but it was broken: it detected my card as AGP 2 instead of 3. So AGP didn't work; I don't remember if this meant no hardware acceleration or no X at all, but it did suck.

    So I cracked open the source -- that AGP stuff was in the open part, at the time -- found the detection algorithm, commented the whole block out, and hardcoded it to AGP3.

    Now, granted, there's no reason I should have to dig into the source for that. The detection should just work, and failing that, it should be possible to override that autodetection without recompiling your kernel.

    But either way, I was able to work around the issue in a way which would have been impossible if it was closed source. My only alternative was to either buy new hardware (and hope it was compatible this time), or go back to Windows.
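The workaround described, hardcoding past a broken autodetection, is the classic argument for an override knob. Here is a sketch of what such a knob looks like, modeled loosely on a kernel module parameter; the lookup table, function names, and hardware IDs are all invented for illustration:

```python
def detect_agp_mode(bridge_id, card_id):
    # Stand-in for buggy autodetection: a lookup table that is wrong
    # for some hardware, much as the driver above misread an AGP 3.0
    # setup as AGP 2. IDs are hypothetical.
    known = {("via-kt400", "radeon-9700"): "agp2"}   # wrong for this combo!
    return known.get((bridge_id, card_id), "agp2")

def agp_mode(bridge_id, card_id, forced_mode=None):
    """Autodetect, but let the user override at load time without
    recompiling -- the moral equivalent of a module parameter."""
    if forced_mode is not None:
        return forced_mode
    return detect_agp_mode(bridge_id, card_id)

# Detection guesses wrong for this combination...
print(agp_mode("via-kt400", "radeon-9700"))            # -> agp2
# ...but the user can force the correct mode without touching the source:
print(agp_mode("via-kt400", "radeon-9700", "agp3"))    # -> agp3
```

With that escape hatch, the fix in the anecdote would have been a boot parameter instead of commenting out a detection block and recompiling, which is the commenter's own point about what autodetection should allow.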

    I was 16 at the time. I'm paid more now, but still not enough to keep buying new hardware until something works. I always lean towards open source, unless there's a compelling reason not to.

    That said, I'd be a hypocrite if I didn't mention -- I'm typing this on a Dell which came preloaded with Ubuntu. It's got an nVidia chip in it, which actually works fairly well. I do use Skype to talk to my brother in Taiwan.

    But for things like that to happen, the proprietary version has to be sufficiently better that I'm willing to give up the ability to fix things myself. I'd be very wary of buying a car with the hood welded shut -- but if it's, say, an affordable Porsche, it might bother me less.

  • by Yfrwlf ( 998822 ) on Saturday November 15, 2008 @05:54AM (#25769563)
    It's also what the open source community gets for not having at least one actual cross-distro packaging standard. With the push for ODF and other great standards available on Linux and other OSes, I'm very sick and tired of distro companies promoting that kind of lock-in. All package managers should be compatible with at least one packaging standard, and ideally with all the formats that exist. Any format that can't easily be adopted by the most common package managers and made cross-distro needs to die off until it is friendlier to a community that should be all about getting along and interoperability, especially within open source platforms.
  • by Ed Avis ( 5917 ) <ed@membled.com> on Saturday November 15, 2008 @07:33AM (#25769867) Homepage

    It's one thing to have a packaging standard for third-party applications which install in their own directory and require well-known libraries defined in the Linux Standard Base. I agree, there should be a cross-distro standard for installing these programs (and there is: LSB defines a package format, the only problem is getting the third-party vendors to use it). But the Nvidia drivers are not just any old application; they want to overwrite standard system files and otherwise mess around with things. It's unreasonable to expect all distributions to support that.

    BTW, the moderation of your comment as 'Troll' is a sad reflection on the Linux-groupthink around here.

  • Re:ATI (Score:2, Interesting)

    by Ash-Fox ( 726320 ) on Saturday November 15, 2008 @11:19AM (#25770579) Journal

    Of course X does direct rendering. It's called Direct Rendering Interface - DRI. And the new improved DRI2 being worked on now.

    It might be being worked on, but the current state is that it isn't there and it isn't available right now.

    His other argument is that Xorg will never be able to have a unified memory manager... which is exactly what TTM and its successor GEM do.

    You are putting words in my mouth; I never stated it will never have a unified memory manager. I was saying that there are conflicts within the management of x.org over getting this done. I'm aware of GEM; are you aware of how it got implemented and stripped out a few times already?

    And noone in the Xorg team claims that indirect rendering is as fast as direct rendering.

    Where did I say I talked to the Xorg team?

    Companies like NVidia just replace chunks of Xorg without contributing anything back.

    nVidia replaces chunks of Xorg to make it work properly without making their changes open source. We can't really say they aren't "contributing anything back", because they aren't really taking anything in the first place, other than a properly working X11 setup.

    Whereas its companies like Intel that actually contribute to improving X for everything - pushing a unified memory manager (TTM/GEM) into the kernel etc.

    Where is it? I don't see it in Ubuntu intrepid.

  • Re:ATI (Score:4, Interesting)

    by Ash-Fox ( 726320 ) on Saturday November 15, 2008 @11:27AM (#25770603) Journal

    Some of the programs you mentioned plan on being X.org-compatible from what I understand. If so, that'd basically be the same thing as making a "new" X.org. But, it would help adoption to keep the name.

    The thing is, I don't see anything wrong with the current X11 protocol. I see plenty of things wrong with the current architecture provided by x.org and xfree86.

    We don't need yet another windowing system, as the current limitations we have are purely due to the implementation, not the protocol design.

"We don't care. We don't have to. We're the Phone Company."