Valve & Intel Collaborating On Open-Source Drivers 66

Posted by Soulskill
from the straining-out-the-blobs dept.
An anonymous reader writes "It looks like Valve's still-growing Linux team has found much interest in open-source graphics drivers. Intel's Linux graphics driver developers and Valve's Linux team have been meeting over the past week to look at each other's code, work out performance goals, and collaborate on new features. Ian Romanick of Intel blogs, 'The funny thing is Valve guys say the same thing about drivers. There were a couple times where we felt like they were trying to convince us that open source drivers are a good idea. We had to remind them that they were preaching to the choir. :) Their problem with closed drivers (on all platforms) is that it's such a blackbox that they have to play guess-and-check games. There's no way for them to know how changing a particular setting will affect the performance. If performance gets worse, they have no way to know why. If they can see where time is going in the driver, they can make much more educated guesses.' Perhaps the companies are paying attention to Linus Torvalds' memo to NVIDIA?"
  • Intel's been too busy singing in harmony to open-source their drivers all these years.

  • by sl4shd0rk (755837) on Friday July 20, 2012 @03:41PM (#40716961)

    I've had a gamut of issues with OpenGL support on Linux over the years. NVIDIA was the easiest to get working and had by far the best support (in my experience, anyway), but it was by no means bug-free. Intel drivers and chipsets remain schizophrenic at best, and let's not mention S3 or other laptop chipsets.

    Hopefully these guys can add some weight to the push for better video support from both Intel and NVIDIA.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      I'm not here to flame, but does AMD not exist anymore? You said nothing of them, good or bad.

      • AMD is really good, imo. The only issue I have is using my 40-inch TV as a second monitor on an old laptop, where I get a few random glitches. I don't know if that's due to driver issues or if my graphics card just isn't up to the job.

        That or (most likely) it's yet another Ubuntu 12.04 regression.
  • by Anonymous Coward

    Graphics, Intel, drivers (and better, driver quality): if I wanted to talk about shit drivers, shit behaviour, and utter suckage, Intel is *right* there. Counter to this, their later stuff has been a bit better, but the HD3000/HD4000 are still poor for serious GFX work.

    If Valve is serious about gaming on a Linux base, it can't be at the ground zero of current Intel GFX. Well, it can - but I won't be the slightest bit interested.

    • by Bengie (1121981)
      It was my understanding that Intel's graphics drivers are open source, along with the hardware specs. I thought the open-source philosophy was to fork and fix.

      As much as I love open source, I find it kind of funny/sad how end users complain about not getting open-source drivers and how open source is so much better, then when the drivers are provided they just complain about how bad the drivers are.
      • Re:Erm (Score:4, Interesting)

        by ZeroSumHappiness (1710320) on Friday July 20, 2012 @04:29PM (#40717607)

        Agreed, this is about Valve and Intel /teaming up/ to make their drivers /better/. Intel hasn't had the man-hours or budget to work on graphics that NVIDIA has had over the years. They've only started caring about gaming-class graphics what, two years ago, if that? Now that they do care, and now that they're going to town with a first-rate gaming group, maybe they'll get better than NVIDIA and AMD really quickly. They don't even need to reach the same level as NVIDIA and AMD to be on my radar -- they just have to beat two- or three-generation-old graphics, since I tend to play three- or four-generation-old games -- I /just/ got Mirror's Edge and I'll be running it on an NVIDIA 400-series card. If Intel can beat the 500-series by the time I build a new computer, I'm not buying a discrete graphics card for it.

    • Re:Erm (Score:5, Insightful)

      by Gadget_Guy (627405) * on Friday July 20, 2012 @04:40PM (#40717787)

      If Valve is serious about gaming on a Linux base, it can't be at the ground zero of current Intel GFX. Well, it can - but I won't be the slightest bit interested.

      Well, Valve can't be serious about Windows gaming either, because even their most recent games still run pretty well on Intel graphics.

      • by Jonner (189691)

        If Valve is serious about gaming on a Linux base, it can't be at the ground zero of current Intel GFX. Well, it can - but I won't be the slightest bit interested.

        Well, Valve can't be serious about Windows gaming either, because even their most recent games still run pretty well on Intel graphics.

        Valve seems to understand better than many game developers that pretty frames that take a lot of GPU power to render do not necessarily make good games.

        • Re:Erm (Score:5, Insightful)

          by gman003 (1693318) on Friday July 20, 2012 @08:04PM (#40719929)

          To put it more succinctly:

          Valve understands that a *fun* game will be fun. As long as the graphics are good enough to support the gameplay, the game will be fun whether you're running it with 2006-era graphics or 2016-era graphics.

          Valve understands this. They make a fun game, then make it run on the lowest hardware they expect will be commonplace. They design their system to be scalable. They allow features to be disabled, and have an extensive set of shader fallbacks. Examine this somewhat-outdated wiki page [valvesoftware.com] detailing the features enabled and disabled for each DirectX level in the original Half-Life 2. That's no longer current, I believe - they patched it to use a newer engine revision that I think dropped support for some of the lower levels, and I know it added higher ones.
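The tiered-fallback idea described above can be sketched roughly like this (a hypothetical illustration, not actual Source engine code): each effect declares the minimum DirectX level it needs, and the engine enables only what the detected hardware supports.

```python
# Hypothetical sketch of tiered graphics fallbacks. The effect names and
# required levels below are illustrative, not taken from any real engine.

EFFECTS = {
    "water_reflect_refract": 9.0,  # cosmetic: needs pixel shaders
    "rim_lighting": 9.0,           # cosmetic
    "soft_shadows": 9.0,           # cosmetic
    "normal_maps": 8.1,            # cosmetic
    "dynamic_lights": 7.0,         # gameplay-critical: always kept if possible
}

def enabled_effects(hardware_dx_level: float) -> set[str]:
    """Return the set of effects the given hardware level can run."""
    return {name for name, required in EFFECTS.items()
            if hardware_dx_level >= required}

# An old DirectX 7 card still gets the gameplay-critical dynamic lights,
# while a DirectX 9 card gets everything.
print(sorted(enabled_effects(7.0)))  # ['dynamic_lights']
print(sorted(enabled_effects(9.0)))
```

The point of the design is the same one the comment makes: the gameplay-critical features sit at the lowest tier, so disabling the cosmetic ones never removes the fun.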

          I have played that game many times on many different computers. It was fun on my Athlon 3000, Radeon X700 build. It's fun on my dual-Xeon, Radeon X1900 rig. It was fun on my Core 2 Duo, GeForce 9600M laptop. It was fun on my Phenom II X3, Radeon 4830 build. It would probably be fun on this new Core i7, GeForce 660M laptop, but I haven't replayed it yet on this.

          The only machine it wasn't fun on? My ancient Pentium II, Rage Pro laptop, and that was because it glitched like crazy - corrupted textures, BSOD after a few minutes. The machine just could not handle some of the things that were actually necessary for gameplay - the Havok physics (used in puzzles), the fade-in shaders (used for one-way gates), the dynamic lights (used to highlight gunfire). Remove those, and it wouldn't have been a fun game, so Valve just didn't remove them.

          But the rest? Water refract/reflect shaders? Rim lighting? Normal maps? Soft shadows? Turn them off if necessary. They don't make the game less fun. Less immersive, perhaps - that's why they have them as an option - but the fun doesn't change.

          And the fun is what is important.

          • by Anonymous Coward

            I wouldn't call that succinct, but it was insightful, interesting and verbose.

          • Support old DirectX levels and shaders and you'll have to spend more man-hours and create a team just to make sure your game runs correctly on those. Add support for the higher DirectX levels and you've got another team. Just look at the market, see what people currently have, and do the math. In the end, they would spend more money if they supported both the old and new "technology". They have to drop the old stuff. It sucks for people who don't have high-grade DirectX hardware, but the team can concentrate more on
  • If you want to improve Intel, make a graphics processor that doesn't get beaten by a $40, six-year-old GeForce.
