Experimental Virtual Graphics Port Support For Linux
With his first accepted submission, billakay writes "A recently open-sourced experimental Linux infrastructure created by Bell Labs researchers allows 3D rendering to be performed on a GPU and displayed on other devices, including DisplayLink dongles. The system accomplishes this by essentially creating 'Virtual CRTCs', or virtual display output controllers, and allowing arbitrary devices to appear as extra ports on a graphics card."
The code and instructions are at GitHub. This may also be the beginning of good news for people with MUX-less dual-GPU laptops that are currently unsupported.
Comment removed (Score:5, Interesting)
Re: (Score:1, Informative)
Let's hope it doesn't "work" like PulseAudio.
Re: (Score:2)
Re: (Score:2)
It doesn't sound to me like this is that much like either PulseAudio or Jack. Those are both sound systems based on userspace daemons focused on flexible sound mixing, while this virtual graphics system is within the Linux kernel and seems to be focused on simply moving pixels from one hardware device to some other device.
Wayland [freedesktop.org] is more like PulseAudio or Jack for graphics. Its proponents think it has advantages over the much thicker, more complex daemon we've used for decades called the X11 server.
Re: (Score:3, Funny)
No, no, "work" is not the word you are looking for when describing PulseAudio.
I had a nightmare last night that PA was holding ALSA captive, demanding that /dev/null, using /dev/rand, speak a few words on its behalf in exchange for the release of the 1,000,000 CPU cycles the system was keeping for thread scheduling. In the end we used an SCSI driver to nuke the damn thing to an NPTL. Unfortunately, when we stormed the desolate daemon we found out the cruel things it had been doing to ALSA all along, leaving it a mutated and deformed carcass.
Re: (Score:2)
If you want to see the worst abuse of PulseAudio, check out the Nokia N900 Linux phone. It uses a combination of PulseAudio and a few other projects, along with a bunch of closed-source blobs, to do much of the audio processing in the phone. (The closed-source blobs exist to protect proprietary algorithms for things like speaker protection and other functions a cellphone needs; exactly what they do is unknown, since Nokia hasn't documented them.)
Re: (Score:1)
Um, PulseAudio was a nightmare on *every* distribution for a *long* time. It might still be; I wouldn't know. I'm still holding on to my SoundBlasters so I don't need that crap.
PA works (Score:2, Informative)
PA works just fine as long as whoever sets it up more or less knows what he's doing. Ubuntu and most user-friendly distros had packagers who didn't, hence massive problems. Of course, there are real problems too, like the Skype breakage, which mostly comes from Skype using ALSA in an arguably incorrect way; when that is combined with PA, it shows why directly accessing guesstimated hw: devices is a bad idea. But the things people almost always complain about were caused by inept Ubuntu devs, not real problems with PA.
Re: (Score:1)
Re: (Score:2)
PA works just fine as long as the one who sets it up more or less knows what he's doing. Ubuntu and most user friendly distros had packagers who didn't, hence massive problems
And obviously, it is reasonable to expect users to know more about setting up their distro than the people who put the distro together.
Re: (Score:3)
Sometimes I am somewhat impressed that Lennart never even tried to sue Ubuntu for all the badmouthing, ill will, and extra time he had to handle because Ubuntu did not know how to use PulseAudio, broke it in more ways than one thought possible, and then shipped it in a "stable" release...
I feel pain in your words, and it sounds like the pain I had after my last Ubuntu upgrade. I ditched PulseAudio because of all the problems I had, and now they install it by default. I can't believe how many things they break every time. After so many years of being faithful to Ubuntu, I will switch to Debian unstable to get a more stable system!
Re: (Score:2)
PulseAudio works great for me and makes my life a lot easier, so it would be fine if it did.
Re: (Score:2)
Let's hope it doesn't "work" like PulseAudio.
Fortunately, it seems not. These people actually seem to know what the fuck they're talking about.
Comment removed (Score:4, Informative)
Re: (Score:2)
On the other hand, if you have spare CPU cycles, you could take that output video and compress it to MP4, which VNC doesn't yet support. Still, that's far less efficient than sending the 3D commands over the wire and letting the device on the other end render them.
Re: (Score:3)
Re: (Score:2)
However, it would let you turn a laptop or tablet into a second monitor, which could be rather useful at times if you don't normally have a dual-screen setup.
Re: (Score:2)
Re: (Score:2)
I did read the parent post, though I don't know if you posted elsewhere.
So what would the use case for this be? Just because there are better options doesn't mean it couldn't be used like that.
Re: (Score:2)
Re: (Score:2)
What you don't seem to realise is that a high-level API *is* compression: it specifies low-level details in a less cumbersome, more compact, more efficient way.
But being high level, it packs those compact and efficient routines in seven layers of abstraction, so the end result is bigger and slower.
High level APIs are good for making it easier for programmers, and producing consistent code, but neither for size nor for speed, unless you compare with another high level API that uses less efficient algorithms.
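The "API is compression" point can be made concrete with rough arithmetic. The command encoding below is hypothetical, just to illustrate the orders of magnitude involved:

```python
# Back-of-the-envelope: one "draw command" vs. the pixels it produces.
# Assumes a hypothetical wire format for a 100x100 filled rectangle:
# a 4-byte opcode, four 4-byte coordinates, and a 4-byte RGBA color.

command_bytes = 4 + 4 * 4 + 4      # opcode + x/y/w/h + color = 24 bytes
pixel_bytes = 100 * 100 * 4        # the same rectangle as raw 32-bit pixels

ratio = pixel_bytes / command_bytes
print(f"command: {command_bytes} B, pixels: {pixel_bytes} B, "
      f"ratio: {ratio:.0f}x")
```

Of course, the ratio collapses once the content is photographic or already rasterized, which is the case the parent is describing.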
Re: (Score:2)
X11 has done network-transparent video since forever. Screens that don't exist have been around a long time too (Xvnc).
The part where this is better than existing solutions is you get a hardware-accelerated framebuffer without having to attach it to a physical monitor. Thus, you could get a hardware-accelerated Xvnc, or create a virtual second head and network-attach it to a second computer. You might even do that over VNC, so it's not really an alternative to VNC... it's a new capability.
Re: (Score:2)
Does anyone know if this would provide a performance boost over something like VNC for similar things? Or how about the possibility of passing rendered output to a virtual machine as input from a fake video capture card? I think I get what this does, but I'm kind of wondering how exactly it's better than current solutions to these problems.
An obvious way to use this would be to target some kind of virtual frame buffer in regular RAM that VNC or other remote protocol could take advantage of. Currently, you have to point VNC to a real frame buffer that is displayed on a GPU's output to take advantage of the acceleration. However, if you switched the virtual frame buffer the GPU renders to, you could have acceleration for an arbitrary number of them as long as applications don't need to use acceleration features all the time.
Need some help here (Score:2)
I get the sending-info-to-multiple-places part. Are they talking about sending different streams to these monitors/what-have-you? Otherwise it just sounds like tossing a splitter into the video signal.
Yes, I did read TFA, and I guess I'm missing something.
Help please?
Re:Need some help here (Score:5, Informative)
"In a nutshell, a GPU driver can create (almost) arbitrary number of virtual CRTCs and register them with the Direct Rendering Manager (DRM) module. These virtual CRTCs can then be attached to devices (real hardware or software modules emulating devices) that are external to the GPU. These external devices become display units for the frame buffer associated with the attached virtual CRTC. It is also possible to attach external devices to real (physical) CRTC and allow the pixels to be displayed on both the video connector of the GPU and the external device."
Re: (Score:1)
Re: (Score:3)
I've used DMX with Chromium to get 3D-accelerated X over 28 monitors on 7 machines. It works, but the performance can be terrible if you don't have the interconnect to deal with what you're rendering. With gigabit, basic X applications could cope, but Firefox with Google Maps would take seconds per redraw. Depending on the 3D app, you /can/ get decent performance though.
Current Bottlenecks? (Score:2)
These external devices become display units for the frame buffer
Looking at the HDMI specs for guidance, a high-res frame buffer might run 10Gbps. That's still considered a hard amount of data to push around inside a PC, right?
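The 10 Gbps ballpark is plausible for the raw pixel stream alone; a quick back-of-the-envelope (resolutions chosen for illustration; real links like HDMI also carry blanking intervals and encoding overhead on top of this):

```python
def scanout_gbps(width, height, bpp, hz):
    """Uncompressed frame-buffer bandwidth in gigabits per second."""
    return width * height * bpp * hz / 1e9

print(f"{scanout_gbps(1920, 1080, 24, 60):.1f} Gbps")   # 1080p60 -> 3.0 Gbps
print(f"{scanout_gbps(2560, 1600, 24, 60):.1f} Gbps")   # 30" panel -> 5.9 Gbps
```

So yes: even a single high-resolution head is several gigabits per second of sustained traffic, which is why doing this over USB 2.0 or Ethernet implies compression or a damage-tracking protocol rather than raw scanout.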
GPU accelerated sound? (Score:3)
Wonder if this could be used to create a GPU accelerated sound system?
Take the scene model, texture objects based on their acoustic properties, create light sources for every sound source, and output the scene to a sound device that translates the visual frame into a soundscape for output.

Or am I just not up to date with audio acceleration technologies (since I've never upgraded beyond a cheap headset)?
Aureal3D (Score:4, Informative)
That's basically what the old Aureal technology did a decade ago--took the 3D scene data and passed it to the audio card for processing. It was awesome--Half-Life with four speakers was eerily realistic.
Re: (Score:1)
A3D did great positional audio with only two speakers, like that demo that had bees flying all around you.
Re: (Score:2)
God I remember that demo! It was actually a little spooky... they'd fly behind you and the hairs on the back of your neck would stand up because your brain was telling you there was a huge bee back there.
That came on my brand-new Compaq, which had Windows 98, an AMD K6-3D at about 200MHz, 32MB of RAM, and a 4GB HDD.
And now I feel old...
Re: (Score:2)
I think the bee demo was from Sensaura. I worked there for a few happy years until Creative, ermmm... 'nuff said.
Maybe both companies had a bee demo...
Re: (Score:2)
Nothing special, just an implementation of HRTF [wikimedia.org].
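For those unfamiliar, an HRTF encodes how a sound arriving from a given direction is filtered by the head and ears; the two simplest cues are the interaural time and level differences. Real HRTFs are measured filter sets, not closed-form formulas, so the sketch below is only a textbook approximation of one cue:

```python
import math

# Woodworth's spherical-head approximation for interaural time difference
# (ITD): the extra distance sound travels around the head to the far ear.
# Constants are standard textbook values, not a production model.

SPEED_OF_SOUND = 343.0   # m/s, air at room temperature
HEAD_RADIUS = 0.0875     # m, average human head

def itd_seconds(azimuth_deg):
    """ITD for a distant source at the given azimuth (0 = straight ahead)."""
    a = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (a + math.sin(a))

# A source fully to one side gives roughly 0.66 ms of delay, which is
# about the commonly cited maximum human ITD.
print(f"{itd_seconds(90) * 1e6:.0f} us")
```

That is what let two speakers (or headphones) place a bee "behind" you: delay and filter the signal per ear as the geometry dictates, and the brain does the rest.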
Re: (Score:2)
Well, mainstream soundcards have been good enough for realistic sound since the 90s, so it isn't really a problem that needs to be offloaded to anything else. It's a lot easier to fake realistic audio in realtime than realistic graphics. Half-Life with an EAX setup sounded amazing, but it wasn't exactly photorealistic.
Re: (Score:2)
Though your idea is interesting, I doubt it could benefit from the virtual graphics approach described in TFA. This is about rendering pixels to arbitrary outputs, while it sounds like you're talking about much higher level manipulation. There are already capable, programmable DSPs for advanced audio processing such as the EMU10k1 series from Creative and I expect you could use OpenCL or something like it to do sound processing on a GPU if desired.
Pff, nothing new (Score:2)
Stick a webcam in front of the screen, compress/pipe webcam output to the remote client. Voila, instant 3D remote display!
Re:Pff, nothing new (Score:5, Interesting)
Indeed -- not new, at all.
Similar tricks were used a dozen or so years ago by Mesa 3D to get standalone 3dfx Voodoo cards to output accelerated OpenGL in a window on the X desktop. The 3D stuff rendered on a dedicated 3D card, and its output framebuffer was eventually displayed by a second, 2D-oriented card that actually had the monitor connected.
Re: (Score:2)
Perhaps doing it in a generic hardware agnostic way is new?
Re: (Score:3)
Perhaps, depending on how hardware-agnostic the APIs in question were/are.
Then again, VirtualGL [wikipedia.org] has been around for a bit too, which throws network transparency into the mix. I don't know how much more hardware-agnostic such a thing could be...
Re: (Score:2)
Yes, I think that's exactly why this is interesting. Increasingly, PCs have multiple video outputs of various types as well as multiple GPUs. If you can decouple the GPU used to render something from the output used to display it without a huge performance hit, that opens up all kinds of possibilities.
Re: (Score:2)
You neglect to mention that said standalone 3D cards were physically connected to the 2D card via a pass-through cable which was what sent the video signal from one card to the other, allowing it to appear on your monitor.
This is a software solution with the same effect that will work on any card, even remote cards on different machines. Hardly the same thing.
Re: (Score:2)
You neglect to remember that the Voodoo 1 and 2 were only capable of full-screen output using that passthrough cable, and had no conventional 2D processing capabilities of their own. The pass-through cable was essentially just a component of an automatic A/B switch: You could either visualize the output of one card, or of the other, but never both at the same time. (At least not by those means.)
To render 3D stuff on a Voodoo 1/2 and have it displayed inside of a window instead of full-screen required[1]
Re: (Score:2)
I was too young/inexperienced/poor to actually lay hands on relative big-iron like SGI back in their heyday, but it wouldn't surprise me at all if that was true: There was a lot of really awesome tech being sold by them around that time, and it's an
plan9 (Score:1)
Re: (Score:2)
Reminds me of plan9, beautiful design and concept.
I agree about it being a beautiful design and concept. Why send expensive aggressive robots to dominate a new species you find on a new planet, when you can just raise their dead and control the masses with slow moving zombies?
I sure hope I'm not misunderstanding your reference.
In the Kernel please (Score:5, Informative)
David Airlie's HotPlug video work is really cool. I'm not surprised something bigger is coming out of it. What I really like are Ilija's thoughts on putting it in the kernel so the support covers more than X. Below is from the dri-devel thread. http://lists.freedesktop.org/archives/dri-devel/2011-November/015985.html [freedesktop.org]
On Thu, 3 Nov 2011, David Airlie wrote:
>
> Well the current plan I had for this was to do it in userspace, I don't think the kernel
> has any business doing it and I think for the simple USB case its fine but will fallover
> when you get to the non-trivial cases where some sort of acceleration is required to move
> pixels around. But in saying that it's good you've done something, and I'll try and spend
> some time reviewing it.
>
The reason I opted for doing this in kernel is that I wanted to confine
all the changes to a relatively small set of modules. At first this was a
pragmatic approach, because I live out of the mainstream development tree
and I didn't want to turn my life into an eternal
merging/conflict-resolution activity.
However, a more fundamental reason for it is that I didn't want to be tied
to X. I deal with some userland applications (that unfortunately I can't
provide much detail of yet) that live directly on the top of libdrm.
So I set myself a goal of "full application transparency". Whatever is
thrown at me, I wanted to be able to handle without having to touch any
piece of application or library that the application relies on.
I think I have achieved this goal and really everything I tried just
worked out of the box (with an exception of two bug fixes to ATI DDX
and Xorg, that are bugs with or without my work).
-- Ilija
CRTC??? (Score:1)
WTF. Cathode ray tube controller? What an antiquated concept.
Re: (Score:2)
No, the CRTC [crtc.gc.ca].
Re: (Score:2)
Console Redirect Transfer Controller?
CEASE AND DESIST! (and pay up, succa!) (Score:2)
Re: (Score:2)
What do you mean? An African or European ton of bricks?