Topics: Displays, Linux, Technology

New Oculus SDK Adds Experimental Linux Support and Unity Free For Rift Headset

An anonymous reader writes: Oculus, creator of the Rift VR headset, has released a new version of its SDK that brings long-sought-after Linux support, which the company describes as "experimental." Linux support had been unavailable since the launch of the company's second development kit, the DK2. The update also adds support for Unity Free, the free version of the popular game-authoring engine; previously, Unity developers needed the Pro version, priced at $1,500 or $75/month, to create experiences for the Oculus Rift.
Comments:
  • about time (Score:5, Funny)

    by Nyder ( 754090 ) on Saturday October 25, 2014 @03:05PM (#48230739) Journal

    This is the year of the Linux VR.

  • by Anonymous Coward

    This is slightly old news. I spent a chunk of yesterday setting up Unity Free and creating some test environments with the Oculus camera rig and player controls in them. I can confirm that so far it works great and is extremely easy to do, even for a complete Unity noob like me.

  • by fuzzyfuzzyfungus ( 1223518 ) on Saturday October 25, 2014 @03:48PM (#48230941) Journal
    As I understand it, the DK2 hardware interacts with the host computer at three points: an HDMI video input, which feeds the two screens (presented as a single 1920x1080 monitor, but physically split into two 960x1080 panels); a USB interface for the in-device sensors and housekeeping purposes (accelerometer, magnetometer, gyroscope, firmware updates, latency-testing device); and a USB-connected IR camera for head tracking based on the IR LEDs on the head-mounted portion of the device.

    How much OS-specific work needs to happen, and how is it distributed?

    I'm assuming that the HDMI-in is fairly normal, unless they really broke the EDID/DDC or something (but it's obviously not going to be very pleasant unless the application drawing to the '1920x1080 monitor' knows that each of my eyes is only getting half of it).

    Barring very good reasons (probably involving latency), I'd assume that the camera is just a UVC device; but that actually using it as anything but an expensive webcam requires the OR-specific head-tracking software to have access to it (the meat of which is presumably cross-platform; but DirectShow vs. V4L2 and other interacting-with-the-system stuff won't be).

    The headset's USB interface presumably needs a specific driver, since 'read the outputs of a bunch of sensors and also firmware update' isn't exactly a USB Device Class; but it would presumably be a comparatively lightweight 'wrap the sensor outputs and get them to the host as quickly as possible' thing, with the bulk of the motion- and position-tracking logic being mostly OS-independent except for the layers it has to interact with to get headset and camera data.

    Is this largely the extent of it (two mostly standard interfaces, one device-specific driver, plus having the motion- and position-tracking software running on Linux and interacting with the OS-specific interfaces to the drivers)? Do I fundamentally misunderstand how work is broken up within the Oculus system? Or do I basically understand it, but it turns out that latency demands are so stringent that a variety of brutal modifications to the typical OS graphics system and GPU drivers are also required? (Some minimal Linux-side code sketches of these interfaces follow after the replies below.)
    • by tlhIngan ( 30335 )

      How much OS-specific work needs to happen, and how is it distributed?

      I'm assuming that the HDMI-in is fairly normal, unless they really broke the EDID/DDC or something (but it's obviously not going to be very pleasant unless the application drawing to the '1920x1080 monitor' knows that each of my eyes is only getting half of it).

      Barring very good reasons (probably involving latency), I'd assume that the camera is just a UVC device; but that actually using it as anything but an expensive webcam requires the OR-spec

    • The headset's USB interface presumably needs a specific driver, since 'read the outputs of a bunch of sensors and also firmware update' isn't exactly a USB Device Class;

      03h [wikipedia.org] (HID) and FEh [usb.org] (Application Specific, which covers DFU firmware updates)?

      An accelerometer is really just a joystick with six axes, as is a 3D locator.
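
On the display half of fuzzyfuzzyfungus's breakdown above: a minimal sketch (not Oculus SDK code) of what "the application knows each eye only gets half the panel" looks like on the host side, assuming the DK2 really does scan out as an ordinary 1920x1080 monitor. It uses GLFW and plain OpenGL; the per-eye scene drawing and the SDK's lens-distortion pass are left as comments, and a real application would target the headset's monitor explicitly rather than a desktop window.

    /* Side-by-side stereo to a single 1920x1080 scanout:
     * one 960x1080 viewport per eye. Build with: cc stereo.c -lglfw -lGL */
    #include <GLFW/glfw3.h>

    int main(void) {
        if (!glfwInit())
            return 1;
        GLFWwindow *win = glfwCreateWindow(1920, 1080, "stereo sketch", NULL, NULL);
        if (!win) {
            glfwTerminate();
            return 1;
        }
        glfwMakeContextCurrent(win);

        while (!glfwWindowShouldClose(win)) {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

            glViewport(0, 0, 960, 1080);      /* left eye: left half of the panel */
            /* ... draw the scene from the left-eye camera pose ... */

            glViewport(960, 0, 960, 1080);    /* right eye: right half of the panel */
            /* ... draw the scene from the right-eye camera pose,
             *     then apply the SDK's distortion correction ... */

            glfwSwapBuffers(win);
            glfwPollEvents();
        }
        glfwTerminate();
        return 0;
    }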
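
On the camera point: if the tracking camera does enumerate as a plain UVC device, the Linux-specific glue is mostly V4L2 plumbing. Here is a minimal capability probe; /dev/video0 is a guess at the device node, and nothing here does the actual IR-blob tracking, which would live in the cross-platform part of the SDK.

    /* Ask V4L2 what the device calls itself and whether it can capture video.
     * Build with: cc uvc_probe.c */
    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void) {
        int fd = open("/dev/video0", O_RDWR);   /* device path is a guess */
        if (fd < 0) {
            perror("open /dev/video0");
            return 1;
        }

        struct v4l2_capability cap;
        memset(&cap, 0, sizeof(cap));
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
            perror("VIDIOC_QUERYCAP");
            close(fd);
            return 1;
        }
        printf("driver: %s, card: %s\n",
               (const char *)cap.driver, (const char *)cap.card);
        if (cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)
            printf("device supports video capture\n");

        close(fd);
        return 0;
    }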
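
On the headset's USB sensor interface: if it is exposed as vendor-specific HID, the "wrap the sensor outputs and get them to the host as quickly as possible" layer could sit on top of hidraw on Linux. A sketch that just dumps raw input reports; the device path is a guess and the report layout is deliberately not decoded, since the DK2's format is vendor-specific.

    /* Read a few raw HID input reports from a hidraw node.
     * Build with: cc hid_dump.c */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/hidraw.h>

    int main(void) {
        int fd = open("/dev/hidraw0", O_RDONLY);   /* device path is a guess */
        if (fd < 0) {
            perror("open /dev/hidraw0");
            return 1;
        }

        struct hidraw_devinfo info;
        if (ioctl(fd, HIDIOCGRAWINFO, &info) == 0)
            printf("bus %u, vendor 0x%04hx, product 0x%04hx\n",
                   info.bustype, info.vendor, info.product);

        unsigned char buf[64];
        for (int i = 0; i < 10; i++) {             /* dump a handful of reports */
            ssize_t n = read(fd, buf, sizeof(buf));
            if (n <= 0)
                break;
            printf("report: %zd bytes, first byte (report ID?) = 0x%02x\n", n, buf[0]);
        }

        close(fd);
        return 0;
    }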
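
And on the "an accelerometer is really just a joystick with six axes" comparison in the reply above: if the sensors were surfaced through the generic HID/input path rather than a vendor-specific driver, they could in principle show up on the classic Linux joystick interface. A sketch that reads axis events from /dev/input/js0; whether the Rift actually appears there depends entirely on how its HID descriptor is exposed, so treat the path as hypothetical.

    /* Print a few axis events from the Linux joystick interface.
     * Build with: cc js_axes.c */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/joystick.h>

    int main(void) {
        int fd = open("/dev/input/js0", O_RDONLY);   /* device path is a guess */
        if (fd < 0) {
            perror("open /dev/input/js0");
            return 1;
        }

        unsigned char axes = 0;
        ioctl(fd, JSIOCGAXES, &axes);                /* number of axes the driver exposes */
        printf("%u axes reported\n", axes);

        struct js_event e;
        for (int i = 0; i < 20 && read(fd, &e, sizeof(e)) == sizeof(e); i++) {
            if ((e.type & ~JS_EVENT_INIT) == JS_EVENT_AXIS)
                printf("axis %d -> %d\n", e.number, e.value);
        }

        close(fd);
        return 0;
    }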
