Linux Software

Best Supported Video Card For Linux/XFree86?

Crixus asks: "I'm about to build a dual-CPU box on which to run Linux. What is currently the best-supported video card under the latest XFree86 releases? Which card(s) can I buy that would be obvious 'can't go wrong' choices?" This question pops into the submissions bin quite a bit, even though we have discussed this issue several times in the past. However, times change, and as the years pass the technologies change. What does this year offer in the way of compatible video cards for XFree86 and Linux? Those of you who have this question might also want to check out AnandTech's October Video Card Comparison.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    The Matrox Millennium cards are the only cards that have given me 100% support and 0% trouble. I've built many high-end systems (both Intel and Alpha) and run different OSes (Linux, NT, W95, BeOS, Solaris). The Matrox cards are wonderfully engineered and are supported by most every h/w and s/w vendor out there. Go for it. You won't be disappointed.
  • According to Matrox, the binary-only part cannot be open sourced due to third-party licenses, as well as the Macrovision implementation, which cannot be revealed yet.

    However, they're planning to release more documents for their cards..

    I have a Matrox G400 here (dual head) and a GeForce at work. Although the NVidia driver is way faster in 3D, it crashes a lot and leaves you stuck at a graphical screen, and you have to reboot in order to exit this mode..

    Sigh...
  • Nope.

    The binary-only part of the Matrox drivers has now been compiled on Alpha too, so Alpha users can use dual head now..
  • You say multihead is there in XFree86 4.0.2; does that include those single G400 cards with multihead on them? It'd be wonderful to throw out one of these cards I currently use. :)
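For anyone trying the second head under 4.0.x, the config ends up looking roughly like this. This is only a sketch: the Identifiers, BusID and layout names are made up for illustration, and (as other posters note) the second head still needs Matrox's binary driver component. It writes to a local example file rather than the real /etc/X11/XF86Config-4:

```shell
# Hypothetical XF86Config-4 fragment for a single dual-head G400.
# Identifiers and BusID are placeholders; written to a local example
# file so nothing on the system is touched.
cat > XF86Config-4.example <<'EOF'
Section "Device"
    Identifier "G400 head 1"
    Driver     "mga"
    BusID      "PCI:1:0:0"
    Screen     0
EndSection

Section "Device"
    Identifier "G400 head 2"
    Driver     "mga"
    BusID      "PCI:1:0:0"   # same chip, second CRTC
    Screen     1
EndSection

Section "ServerLayout"
    Identifier "DualHead"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
EOF
# For one big desktop instead of two separate ones, start the server
# with Xinerama enabled:  startx -- +xinerama
```

(The matching "Screen" sections, one per head, are omitted here for brevity.)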
  • I recently got a new box with a nvidia card, so I thought I'd throw in my $0.02 here.

    With X 3.3.6, the Voodoo3 2000 and 3500 (which I have direct experience with) work great. They are fast, provide nice 3D through Glide, and Quake3 plays wonderfully.

    Ditto with X4. Using the tdfx driver, and the tdfx and agpgart kernel modules, gave me the same performance (and fps) under Debian's X 4.0.1.

    I upgraded to a K7-900 and nVidia GeForce 2 GTS (32mb) card and have had a chance to play with it. The card dropped into X4 just perfectly, and no great pains were gone through to get it all working.

    I have yet to get Q3 going under it though. This is partly my laziness, however. The provided nvidia drivers work fine, but when starting Quake there is serious stuttering (think switching to software GL emulation). I downloaded and threw in the nvidia drivers off their page, and then I was bitched at that the system couldn't find /dev/nv* when it booted up. I assume that means I should get off my ass and build the kernel module that I downloaded, huh? I'm guessing that will solve my problems.

    However, if anyone has any advice to a gamer about what kernels are supported/patched for the nvidia drivers, or which files should be downloaded/installed, I'd appreciate it.
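For what it's worth, the missing /dev/nv* error above usually means exactly what the poster guesses: the kernel module was never built and loaded, and the device nodes were never created. A rough sketch of a check follows; the major number 195 comes from NVidia's driver README of this era, so treat it as an assumption, and run the printed commands as root only after insmod'ing the module:

```shell
# Print (rather than run) the mknod commands for any NVIDIA device
# nodes that are missing. Major number 195 is taken from NVidia's
# README; adjust if your driver version says otherwise.
cmds=""
for minor in 0 1 2 3; do
  [ -c "/dev/nvidia$minor" ] || cmds="$cmds
mknod /dev/nvidia$minor c 195 $minor"
done
[ -c /dev/nvidiactl ] || cmds="$cmds
mknod /dev/nvidiactl c 195 255"
echo "$cmds"
```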
  • I'm working out this very same issue on the supremely helpful matrox support linux forum [matrox.com].

    Sign up and visit this discussion [matrox.com].

  • I would say my Voodoo3 2000 is a very nice card. One of the first supported by XFree86 and OpenGL/Mesa GLX GLU.

    I would bet that the newest Voodoo5 is a damned decent card on any linux box.

    Just my $0.02 worth
  • by Tom Rini ( 680 ) on Friday December 22, 2000 @07:18AM (#543368) Homepage
    It does depend a lot on what you're doing. If you want a nice card for driving big monitors and getting work done, along with the occasional 3D game, I'd say go with a Matrox G400. They're really nice, well supported, and for all the features that work now and will work someday (DVD), they're a good deal. I'd stay away from the 450s for now, just because they're still rather new, and from the reviews I've seen they only really beat the 400 in price. I've had a Millennium 1, 2, G100, G200 and now a G400 and all have worked wonderfully under Linux.
  • There's no hardware gamma correction with Matrox cards in Linux; that's why I went with ATI (r128) instead. It works in XF4 with some glitches, like problems switching to text mode, but I haven't tried XF 4.0.2 with it yet.
  • You just have to use XFree86 4.0.x with the NVidia binary drivers.. the current version they have released is 0.95.

    Apparently, the GeForce2 MX won't work right if you use the standard 'nv' driver that comes with XFree86 4.0.1, but with the 0.95 X driver and the matching GLX driver, the GeF2MX works great, and has very impressive performance in Unreal Tournament to boot.
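Concretely, the switch to the binary driver comes down to two edits in XF86Config-4, per NVidia's README of the time. The Identifier below is made up, and the snippet writes to a local example file rather than the real /etc/X11/XF86Config-4:

```shell
# The two edits NVidia's 0.9x README asks for: load their GLX module
# and change the Device section's driver from "nv" to "nvidia".
cat > XF86Config-4.example <<'EOF'
Section "Module"
    Load "glx"            # NVidia's GLX module, not XFree86's stock one
EndSection

Section "Device"
    Identifier "GeForce2 MX"
    Driver     "nvidia"   # was "nv" (the open, 2D-only driver)
EndSection
EOF
# Then restart X.
```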

  • I have a VD3000 in my box and it has given me no problems at all. I am really happy with it because it just does what I want and I have not had to mess with it at all.
  • Installation under BeOS:

    1. Install sound card and Ethernet card.
    2. Boot.
    Finished.

    --
  • I've had a Geforce DDR since this summer, with mixed results.

    With the default 2.2 kernel, XFree86 4.0 setup I had at the time, following instructions I got the NVidia drivers compiled and set up pretty easily. Hell, if you're running an untouched Red Hat (or some others) without even recompiling the kernel, then you can pretty much download their RPM drivers, change a line in XF86Config, and restart X.

    Sure, there's something that made me uneasy with using binary-only drivers and loading a half-megabyte kernel module, but oh well. Oh, and did I mention the annoying bugs? There's the "X hosed if a 3D app gets killed suddenly bug", and the "horrible graphics glitches when switching between multiple X servers" bug, and the "if anything goes wrong, your text consoles will be permanently black until you reboot, even if restarting X works fine" bug, and well, you get the picture. Some of their problems got a little better in the 5 revisions of their Linux drivers I've seen. Some didn't. Will the rest of them get fixed, including the "we'll have this fixed in a few days messages" on their months old Linux FAQ? I don't know, but I don't like that the answer depends solely on the NVidia Marketing Department's appraisal of my value as a future customer.

    But anyway, X was pretty stable, Quake III was damn fast, and life was good.

    Then I decided to try 2.4, because of better support for my motherboard's onboard sound, among other reasons. The NVidia kernel module wouldn't compile against test7, test8,... I uninstalled all the binary stuff, switched to the open source 2D drivers, and forsook 3D support outside of Windows for months. Fortunately, the beneficial effect this had on my GPA prevented the loss from angering me greatly.

    Finally, someone came out with a patch to compile the kernel module against 2.4.0-test11, and stuff ran great again with the NVidia drivers. Same niggling bugs, but 3D was golden.

    Then came 2.4.0-test12. Now NVidia's drivers compile but don't work, so I'm back to the open source stuff. Of course, when I say they don't work, I mean they *really* don't work: if I run X with the closed nvidia driver, it fails and trashes the screen until the next (blind or remotely logged in) reboot. If I run X with the open nv driver, it works, but I don't get 3D, and if I were to foolishly run an app linked to the binary NVidia GLX drivers (like 3D Xscreensaver hacks, before I turned them off), it crashes the kernel.

    That's right. Do you want blazing fast 3D support, or do you want months of uptime? NVidia's drivers remind me of software I had to fix at work last summer: use them in exactly the situations that the original coders tested with, and you're gold. Try something that should be within the design spec, but is still out of the ordinary, and down it goes. You can debug code by understanding what can go wrong and preventing it, or you can debug it by running it, fixing whatever leads to a crash/error, and repeating it until you stop seeing crashes. nvidia_drv.o smells like a product of the latter method.

    I've had some problems (including crashes that resemble the black screen Linux crashes, but with no known cause) with the card under Windows too, but not too much more than the problems I have with everything under Windows. I had an epiphany the other day, realizing that if just one binary-only third party driver under Linux can suck like this, what happens in Windows where *every* driver is code that neither I nor Microsoft gets to see or debug? I almost pity the Windows OS developers now; they probably get blamed for three times the crashes they're actually responsible for.

    Anyway: if you want blazing fast 3D, there's really no choice but NVidia. Their hardware is a generation ahead of ATI and 3dfx, and two ahead of everyone else out there. If you want a card that's excellent at everything else, and are willing to trade excellent Linux drivers for mediocre 3D acceleration, go with Matrox.
  • What are you talking about? The nvidia drivers are very fast in Linux, slower than Windows by a couple of fps. DRI doesn't mean shit; it's just an extra layer of stuff that would make the drivers SLOWER. nvidia has its own direct rendering system.
  • I have a Voodoo3 3000 AGP/16MB in my Linux desktop machine right now. The drivers are great, all resolutions supported and great 3D support. Beyond that, I wanna say you can get a Voodoo3 3000 AGP for about $80 [pricewatch.com] now.
  • I agree. I use a G400 with a Millennium II PCI in a dual-head configuration. These cards are completely supported, from 2D to multi-head to direct 3D rendering. As a bonus, the Matrox cards are the only cards with hardware acceleration for the Render extension in XFree86 4.0.2.
  • I have the multihead model, but I haven't tried to use both outputs. My understanding is that the second head of the G400 1) can only be used with the Matrox binary drivers, and 2) won't achieve the resolution and refresh that I desire. I don't mind burning a PCI slot for the trusty Millennium II.
  • by Jeffrey Baker ( 6191 ) on Friday December 22, 2000 @01:42PM (#543378)
    Let me be more specific. According to my manuals, the RAMDAC on the Millennium II is 250 MHz, while the RAMDAC on the second head of the G400 is only 135 MHz. The main RAMDAC of the G400 is 360 MHz.

    This means that the best mode I could get from the second head of the G400 with a vertical refresh > 70 Hz would be 1360x1020. The Millennium II can do 1800x1350, giving me 75% more pixels on the screen. Thanks to whoever wrote this modeline calculator [inria.fr].
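These numbers check out with a back-of-the-envelope rule: required dot clock is roughly visible pixels times refresh rate times ~1.4 for blanking overhead. The 1.4 factor is a rough assumption; a real modeline calculator does this properly.

```shell
# Rough dot-clock estimate in MHz: width x height x refresh x ~1.4
# blanking overhead. A sanity check only, not a modeline generator.
estimate() {
  awk -v w="$1" -v h="$2" -v r="$3" 'BEGIN { printf "%.0f\n", w*h*r*1.4/1e6 }'
}
estimate 1360 1020 70   # ~136 MHz: right at the G400 second head's 135 MHz RAMDAC
estimate 1800 1350 70   # ~238 MHz: fits under the Millennium II's 250 MHz RAMDAC
```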

  • I have a Voodoo3 2000 AGP in my box at home.

    Works great under XF 3.3.6 and only ran me about 60 bucks when I bought it.
  • Well, you have to remember that issues that used to be associated with cards are no longer a real problem because of the softbooting of the BIOS by XFree86 4. I have an AGP Voodoo3 3k and a PCI Cirrus Logic 5446 and it works fine with dual heads.
  • by Cosmo ( 7086 ) on Friday December 22, 2000 @07:20AM (#543381) Homepage
    I've been running a multihead system (Riva 128 AGP & Voodoo 3000 PCI) for a while and I really like it. I was surprised when the two cards came up together without any real effort (XFree86 -configure rocks, BTW). However, when I recompiled my 2.2.x kernel to include support for SMP, the machine would lock up about half the time I tried to start X for the first time after a reboot. Once I got X started, it was no problem until the next reboot. I suspect there's some sort of problem in the int10 code for softbooting the second monitor with SMP. Oh well. I just upgraded to Debian woody and compiled a 2.4 test kernel, and the problem seems to be fixed; I haven't had any problems softbooting the second card since.

    Just my two cents, someone else might find them interesting or useful.
  • So, let me get this straight: you're comparing the install of a whole line of video cards with a specific video card / motherboard / sound card combination that, thanks to bad drivers, is a pain, and making it look like Windows 2000 is inferior because of it. I'm sorry if I find that a bit of a weak argument if you're going for the Linux Is Better Than Windows rally chant. I mean, I've had my share of incompatibilities and driver peculiarities, but I rarely blame the OS as much as I blame the driver writers and component manufacturers. Mainly I'll blame or praise the OS for the ease of actually installing the drivers, or the ease of figuring out a problem, and for the most part I do have to say that Linux can be quite daunting in that respect.

    matguy
  • There is no single choice for a good video card. There are tons of choices, and even quite old ones.

    For 3D, generally NVidia cards will do, especially if you wanna play the few (unfortunately) OpenGL games that run on Linux. However, if you are doing something like scientific work where you need some tough 3D stuff, forget about TNTs and GeForces. They are good and fast, but NVidia's hacks are miserable when you come out of the game arena. Conflicts happen, you frequently fall into segfaults and, more curiously, NVidias are horribly slow with some apps. Well, it's NVidia's fault; some standards should be followed anyway. In the "scientific" case, even a Voodoo3 will do a much better job than the latest GeForce. Maybe other cards do just as badly, but I didn't test any.

    And if your taste is 2D graphics, then things get worse. NVidia and late 3dfx make only an average-quality 2D system. They work relatively fast but the quality is HORRIBLE, at least if you have something to do with design. The more professional you are, the worse it gets. In this case even many modern AGP cards score 0 in comparison to their old PCI sisters. To this day people use Diamond cards like the FireGL for graphics, as colours are much truer on them than on their modern counterparts. Only recently I saw one ATI Rage AGP card where I could get right into design tasks without thinking too much about how the thing is set up. On NVidia I always get things wrong. On Voodoo I always curse their reddish mess.

    So you may try to get an "ideal card". OK, my question: what do you wanna do?
  • And no, the current cards suck in 2-D speed

    Are you running a 300dpi monitor or something? Jesus, the current cards are fast enough in 2D; quality is much more important.

  • Agreed. Since we're on the ATI subject, I've got an All In Wonder 128 card (the 16 MB version, not the Pro/32MB one).

    OpenGL support does exist by default in XFree86 4.0, but it's quite slow. I get 50-60 fps with Windows on my Celeron 400, but only 35-40 fps in Linux. When explosions come in Quake 3 I actually see what I call the "webcam effect".

    Video capture is out of the question, of course, since there's no software for it. There's a program called xatitv that works well, but it hasn't been updated in about 5 months, is quite beta and doesn't enable bilinear filtering on the TV image, giving you the 'line effect' when the image moves sideways. (Remember that PAL/NTSC are interlaced)

    2D support is pretty good, but what card can't handle that nowadays?

    In any case, stay away from ATI if you want to play 3D games in Linux.

    [before someone comments on it: XFree 4.0.2 gave me a 5% speed increase in exchange for a very unstable server]

    Flavio
  • And don't tell me this doesn't happen all the time, because I work in an office full of gamers most of whom have gone through the above...

    Actually, it only happens all the time if you don't know what you're doing. As soon as you understand the finer points of partitioning with FAT/NTFS4/NTFS, MBRs, and installation order, it only needs doing once.

    I started with a 95/NT4 dual boot, then upgraded to Win98, then took NT4 to 2000, then wiped out 98 and put WinME on that partition. And I have a copper-heatsinked GeForce2, and play a mean Barbarian or Paladin in D2, so you can consider me a gamer; I support a corporation full of NT4-based domains for a living, so I guess you can consider me an office guy.

    It works fine.

    And to stay on-topic, My Linux box is running a very nice VLB ATI card with 1meg of VRAM under RH7.0 to proxy my DSL to the iMac and the PC. Price? The cost of some dinner for the guy that gave me the PC and helped me set up the firewall. :)

  • I agree that the GeForce2 performs well with XFree 4.0, but it should be pointed out that NVidia's drivers are closed source and still considered beta by nvidia. Some would complain that they are not open source for ideological reasons, others for practical reasons (fewer bugfixes, less frequent releases, not in the main XFree tree), while others will just be happy that they work relatively well.

    I'll let you make up your mind on that issue.

  • Furthermore, if NVidia open-sourced the Linux drivers, they'd be giving away trade secrets.

    If you're interested in why NVidia's drivers are closed-source, I would recommend reading this brief interview [thedukeofurl.org] of an nvidia developer.

  • I guess you're not a fan of porn. If you've been on the Internet for 7 years, how could you have avoided it?

    My choice for best video card under Linux: anything by Matrox.

  • "The card performs very well in windows (200+fps easily in q3a)" I doubt it! My roommate has an Athlon 900 with a 5500 and the max fps I have seen is 80. Also, my little brother has a PIII 800 with a 5500 and he gets about the same fps... I wonder what drivers you use to get 200+ fps.
  • Quake III on my GeForce 2 MX under Win2K would freeze after a few minutes until I turned my AGP setting in BIOS from 4x to 2x. The frame rate is still good and I haven't had any problems under Linux, except that it takes a few seconds to switch into graphics mode.

    Brent
  • If you're intending to game a lot, shelling out a few hundred dollars for a card from NVidia wouldn't seem like a losing proposition.

    If the machine is destined to sit in a closet, or maybe even on a desk, where its main role would be anything less graphics-intensive (development, server tasks), then why shell out extra money for a graphics processor you'll never need? You can pick up an 8 meg ATI Xpert@Work AGP card (so it doesn't take away from your PCI slots...) for less than $40 or $50. And for me, it's been rock solid. You could even get away with spending less if you knew you'd never be plugging the machine into a display running higher than 1024x768.

    It's oh so hard to make recommendations to people without having all the information that you need.
  • Maybe Slashdot should run an annual feature, or since X is released roughly every 6 months, a semi-annual feature/questionnaire.

    I think the best place to look is to start with is Big Ed's Tech Site http://bigedstechsite.com/ or Tom's Hardware review (not sure of the URL). There is also the linux hardware database http://lhd.datapower.com/ .

    Lots of cards are supported now, and the LHD is a good place to get the reviews. Now if people would only use it and submit their ratings of hardware....

    Well, what are you waiting for? Go to it!

    I don't want a lot, I just want it all!
    Flame away, I have a hose!

  • My G400 has no fan, either. Killer card, too (:


    --
  • I'm doing that with X 4.0.1. 4.0.2 had some... problems, so I downgraded for now (might play with it over the weekend).

    I am using the secondary head to do TV-out, but I have tried connecting two monitors, and it worked fine.

    You do need the Matrox binary driver.


    --
  • I use that resolution at work (actually slightly higher; modelines rule). It really helps to have a 21" monitor.

    At home I'm stuck with what Sun calls a 17" (actually 15.7"), which lives at slightly more than 1280x1024. At least under Linux... Windoze for some reason refuses to go over... um. 1148xsomething?... I forget.


    --
  • I love BeOS, but I have to point out that this is true if the cards are supported in the first place, which is a big if.


    -------
  • I got a GeForce 2 GTS for a visualization project at work. Boy, what a miserable experience.

    The last driver released by NVidia was on 9/5, so I'm hoping they come out with another one, maybe for the 2.4 kernel (the current one only works with 2.2).

    You have to build a kernel module that works with the agpgart module (which is mature in 2.4, but only experimental in 2.2, BTW). Then you update XF86Config to use their GLX driver. I had some problems compiling the kernel module because of problems in the kernel headers supplied with Red Hat 7 - the SMP #defines were screwed up.

    But eventually I did get it all set up and working - with some GL programs. But many GL programs - including several of the Xscreensavers and GLUT demos - make my computer crash instantly with no warning. It just *POP* resets. I've traced through several GLUT demos with a debugger and I still can't find the exact thing that does it but I think it has to do with display lists. I've tried setting it up on SMP and non-SMP configurations, but to no avail.

    The bottom line: The NVidia drivers crash my computer HARD every time I try to do anything meaningful with OpenGL.


    -------
  • Plug in card.
    Start gaming.

    You forgot some steps:

    Start cursing because the games you want to play don't work properly under Windows 2000.
    Repartition hard drive.
    Re-install Windows 98 for games.
    Find driver CD because new video card is too new for Windows 98 to have shipped with drivers for it.
    Re-install games in Windows 98.
    Re-install Windows 2000 because Windows 98 blew away Windows 2000 partition.
    Re-install all of your other software in Windows 2000.

    And don't tell me this doesn't happen all the time, because I work in an office full of gamers most of whom have gone through the above...

  • This card works just fine for me, with one exception (and I'm also wondering which card can do this): I would like a card that could do a color depth of 32 on Linux. I have to stick with 16 right now because GIMP and some other applications create a white bar across the screen every time I undo (with GIMP) or resize the window with other apps (not all of them GTK). I used to have a Matrox G200 at work and that was a great card. I was able to have a 1280x1024 console (a nice IRC session size without the need for X; being able to use w3m for a good majority of web browsing, and midnight commander was great too in this aspect), which is just a treat to have.

    I wouldn't recommend the Voodoo3 2000 (I have it because I dual boot, and the Voodoo3 2000 lets me play Diablo II in 3dfx mode), but definitely go for a Matrox G450 if you can afford one.
  • selling the thing for 80 bucks. EIGHTY BUCKS.
    Not a bad price to pay for something you can't do or don't want to do yourself.
  • Hmmm - But Xi Technologies is not ATI.

    A third party is providing the "driver".

    Also, although I'm not sure, I expect Xi Technologies actually provides the X server for the card and not the driver, but I may be wrong here.
  • I too, am building a new server for Xmas.

    I'm looking at a 1.1GHz Athlon, and lots of memory/disk space. Debian unstable will be used to power the whole thing. One thing I'd really like, though, is to be able to pipe the cable TV into the box, and then also pipe the XFree display out to my 60" TV. The idea being that with the amount of HDD space I'm installing, I should be able to pull off using my computer as an intelligent VCR. Plus it'll make Quake III just rock.

    The video card I've been looking at is the Matrox G450 eTV, as described here: http://www.matrox.com/mga/products/marv_g450_etv/home.cfm [matrox.com] .

    Any experiences with doing TV signals under Linux (or even just URL's to places to check) would be appreciated.

  • I can fully confirm this. I own a GeForce MX and an AMD750-based board.
    For me, turning off AGP support also helped to bring my Hauppauge TV card back to stability.

    CU...
    Marcus Camen
  • I've always configured boxes with linux in mind ... for 2D machines I usually use: Quantum IDE HDD, BX board, ATI RagePro 8MB, 3COM 3c59x compatible and a standard ATAPI cd-rom drive.
    Works A-OK out of the box with X3 and X4, for all my purposes ...
    For 3D I am using a Voodoo 3000 card ... since I bought it, I've had 0 problems in Windows or Linux ... works 100%, no issues whatsoever. I don't give a ratsazz about frame rates; who cares about FPS ... I find the visual quality of Q3 and other games to be just grand on my V3.
    GREG
  • What's your problem with RagePros??? They cost $10 and work faster for 2D than anything else I've tried.
  • Yes but that driver is still 2d only... Basic 3d support is still forthcoming. (The 2d driver is pretty sweet though, looks great, nicely accelerated, and supports both XRender and XVideo extensions)
  • Cirrus Logic and Trident. 3dfx too. All work with their generic drivers at high res. Never had a problem configuring to get 1280x1024 (for you developers).

    ---
  • Bottom line, you may as well stick to Windows if you are going to allow closed source binaries into your kernel.

    Yes! Yes! Because lord knows it'll take FOREVER for NVidia to come out with the patch, since they hate their customers and love nothing more than to cackle with evil mercenary capitalist glee while the pathetic fools who bought their products go down in flames; whereas the champions of open source stand always at attention, since they have nothing better to do in their lives than write bugfixes.

    I mean COME ON. By providing the Linux community with drivers (open or closed), NVidia has shown that they value Linux users as customers. They have a vested interest in maintaining a good reputation in the Linux community, which means keeping drivers updated, stable and secure. Not to mention your whole closing statement, which basically spits in the face of the entire Linux community. God forbid Linux stand on its own merit as an operating system. If Micro$oft released all its application/OS source code tomorrow, would you switch to Windows? This type of pig-headed Open-Uber-Alles attitude is really irritating. Just because Open Source development has some benefits over closed source development does NOT mean that it is a panacea.
    ----
    Dave
    MicrosoftME®? No, Microsoft YOU, buddy! - my boss
  • According to the matrox linux forum (enter via http://www.matrox.com/mga) the g450 doesn't seem to have a working tvout for X as yet...



    Tv out on a g450 doesn't currently work and should hopefully be fixed in future versions of the driver.

  • Please explain to me why a company in the business of developing software is a "scumbag company" for charging money for software that they've funded the development of? The possibility that they might give a product away for free seems counterintuitive. This is a fairly common thing within a capitalist system. A company makes something, and then they sell it at a profit and make money. Seems logical, eh? Especially if you want that thing to continue to be made. I like that a whole lot better than, "company makes something, gives that thing away for free and then goes out of business because they didn't make any profit."
  • I have to agree with this. I've been running an SMP box for around 2 years now (first with PII 300s, now with PIII 450s, soon to go to 700s), and Matrox is definitely your best bet for compatibility and overall X functionality. I used a Matrox G200 with a cheap 17" monitor, no problems. I haven't used the G4XX series, but from what I hear they should perform similarly. I'm now using a GeForce 2 GTS (last summer my G200 got fried, along with my motherboard) and I had to buy a new monitor for it (19") because it wouldn't support 1280x1024 on my old monitor, and it still won't go up to 1600x1200 in X (it will in W2K), and I get persistent flicker due to a 60Hz refresh rate (I have a small desk fan running beside the monitor; 60Hz power + 60Hz refresh rate = flicker). If you're interested in 3D modelling, there are also some useful Mesa patches for Matrox cards; I got fairly good FPS in Blender.
  • Yes, stay away from the GeForce 2 MX, UNLESS YOU HAVE THE ABILITY TO READ AND/OR OPERATE A COMPUTER.

    Sure you have to be careful to get rid of old GL stuff, but it is pretty well documented, and pretty logical which files will cause problems.

    The GeForce2 MX is pretty damn sweet under XFree86, although I believe my shady motherboard's APM was causing crashes.
  • by tippergore ( 32520 ) on Friday December 22, 2000 @07:20AM (#543414) Homepage
    Much to my chagrin, when Slashdot recently linked to the 'cheap video card lineup', I bought an ATI Radeon for my Linux workstation.

    Don't do this, because there are no X drivers for it. Well, technically, there are drivers for it, but unfortunately some scumbag company called Xi Technologies [acceleratedx.com] is selling the thing for 80 bucks. EIGHTY BUCKS.

    Third party Radeon drivers for linux are expected Q1 2001, but I suppose we shall see. Also, as an added bonus, ATI has a java applet on their page that happily crashes linux netscape after a couple of page views.

    ATI may be willing to part with design specs for driver development, but I'm not exactly sure if that necessarily makes them linux friendly.

    Be aware.

  • Trident 8900 ISA

    Works great.

  • by Pont ( 33956 ) on Friday December 22, 2000 @08:25AM (#543416)
    Speaking from experience here. I have a GeForce 2 GTS.

    nVidia's Linux drivers are very fast. Maybe the timedemos show better scores in Windows, but subjectively, playing Q3 in Linux is smoother. I haven't played in W2K, though. Maybe it has to do with 9x's sucky multitasking.

    The only problem is that these fast drivers are closed-source. The only thing that ever crashes my system is X. They don't always wake up after apm puts the graphics card to sleep. The machine is completely hung and only a hard-reset will do. Whenever I recompile my kernel and forget to recompile the nVidia kernel module, my graphics card locks up hard and there's no way to get it back. I have to "use the force" and switch to a virtual terminal and login as root and reboot without being able to see what I'm typing. These are exactly the kind of annoying things that would have been fixed in open source drivers by now.

    I understand why nVidia has closed source drivers. They have other people's IP in them and they have fancy tech in there that would give an edge to the competition (at least that's what they believe and we don't have the evidence to determine otherwise). Their drivers are fast, but they are closed source and that is a serious downside to consider.

    To make a long story short, if 3D gaming on an x86 under linux is your highest priority, get an nVidia card from a good manufacturer that is based on the reference design.

    If stability is more important than 3D performance, get something else.

    Although, I guess now with XFree86 4.0.2 you could have the best of both worlds. Use the open source drivers that come with XFree86 normally, but switch to the nVidia drivers when you want good 3D. GF2 and MX weren't supported in 4.0.1.
  • If you want a CRISP and sharp 2D display and reasonably fast 3D, go with the Matrox G200. You can get these used for next to nothing. I recently got an 8 meg AGP for $20.

    The G400 is a step up. It has much better 3D performance than the G200. Matrox 3D is not industry top-of-the-line at current. But like the G200, the G400 has the sharpest display in the business. It also has dual head support, either on the card or as a cheap add-on option, depending on the model. Dual head is great if you can find a second, cheap monitor. Like the G200, you can find G400s cheap. I've seen them go for $60-80 for the 16 meg single head version.

    The G450 is pretty much the G400, except that multi-head is the only version sold, and the second display shares the same fast RAMDAC as the first display, meaning you could run two very sharp displays in the 2048x1536 range. (The second display on the G400 multi-head loses significant clarity or refresh rate (your choice) above 1600x1200.) There are, however, some issues with drivers for the second head destabilizing the system. (Hopefully someone knows more about when this may be resolved?)

    If you want FAST FAST FAST 3D above all else, are willing to sacrifice a bit of crispness at the higher resolutions, and aren't militant about demanding open source drivers, have a listen to the nvidia and ATI advocates. Both are excellent cards, though ATI's driver support is currently a little behind nvidia's.

  • I second this: for unbeatable 2D performance and open driver support, the Matrox G400 is the way to go.

    If you want extremely fast GL support and don't mind waiting around for months before getting the latest and greatest of features (RENDER for example) because the drivers are closed, try an NVidia card.

    -- iCEBaLM
  • by iCEBaLM ( 34905 ) on Friday December 22, 2000 @07:19AM (#543419)
    Visit us at #nvidia on irc.openprojects.net and we'll try to help you out.

    -- iCEBaLM
  • A few extra points to consider about the G450. First of all, the Rainbow Runner G-Series is an addon card for the G400 (possibly the G200 as well) that adds TV input to the mix. This is nice because you don't have to deal with overlay cables and whatnot, and it is supported on Linux. Unfortunately, according to an e-mail I received from Matrox, there will be no RRG for the G450.

    Also, the G450 requires that you compile a binary library into the standard G400 drivers in order to get dual head support, and this library is only for x86 Linux AFAIK. However, if that doesn't bother you and you don't want TV input, you'll probably be happy with the G450.
  • Well,

    I think that G400 from Matrox is currently the best choice, it has really fast 2D, high freq. and resolution support (I have it running at 1600x1200x85Hz at 32bpp on a Sony 500PS and it really rocks!). With Matrox you get all the goodies of the new XFree 4.0.x, and have support for DRM (Quake3 at 40-60fps), XVideo (DivX, BTTV, DVD) and soon full dualhead/Xinerama support.

    And most important, Matrox is a company that really supports Linux, releasing detailed info on their products rather than binary drivers that only work occasionally.

    Just my opinion, but unless you are planning to run only quake3 and need 120fps because you have fly-like eyes, you won't go wrong buying a G400 (I would suggest a MAX for the dualhead and the extra speed, but the other G400s are also very good).

    - german

  • And I had a similar problem. I have an Abit BP6 with two o/c'ed Celerons. When it came to a gfx card I chose the ATI Rage Fury Pro VIVO 32mb (Rage128Pro chipset). I can't say I've tried the 3D in Linux - but everything is really sweet elsewhere. X is fast and, more importantly, the image is very reliable (no disappearing icons or weird colors). Like the question says, SMP with the latest XFree86 (4+) - this card is GOOD!

    It's also an excellent price atm.
  • I am assuming that you are going to be using XFree86 and not AccelX or another X server.

    If you plan on using XFree86 3.3.6, then I would suggest going with a Voodoo 3 3000. The Voodoo 4/5 drivers are extremely immature and most likely will not ever reach the performance level they should.

    I would suggest, however, that you go with XFree86 4.0.2 and get an NVidia Geforce 2 (GTS/MX/Ultra). I personally have 2 dual CPU Intel boxes and use the NVidia Geforce 2 in both of them. XFree86 4.0.1's included nv driver did *not* support the Geforce 2. This has been fixed in 4.0.2.

    If you are wanting multi-monitor display, go with a Matrox G400. It is an extremely nice card, but doesn't quite have the power when it comes to gaming.
  • Second that. Matrox Millenium 2 (PCI) has been my card of choice for a while for 2D under Linux. Rock solid, no problems whatsoever, good performance.
  • ... responses want to do a lot with one box. I also see that same behavior with a lot of "sysadmins" running web/mail/dns/db/etc.. all on the same box.

    Just an observation.

    I know that we are not made of money. But if you have money to blow, I would get two separate systems. Crank out 2D performance with a Matrox card on your Linux system and reserve this for work. Dork out a W2K/98 system to play games and run the biggest and baddest 3D card out there because you know that Windows will support it.

    I run Linux as my workstation because I am ten times more productive with Linux than I am with Windows; (I actually know Windows a lot better and I could only dream to be a called a Unix geek but I digress).

    When I get home, I do not work. My home rig is there to check e-mail, play mp3's and, of course, play games. Linux is great for everything else, but Windows is the champ for gaming (only... nothing else). (Please do not respond with Quake III on Linux, because we all know that UT is the best FPS game out there).

    But if I had to settle on a card, Matrox G4xx. Dual head, superb 2D performance and good (not superb) 3D performance.

  • And ethically responsible from a Linux point of view.
  • I would say my voodoo3 2000 is a very nice card. One of the first supported by Xfree86 and OpenGL/mesa GLX GLU.

    I have a Voodoo3 3000 PCI. 2D performance is very good; I haven't tested 3D, as I don't play any games. But I'm a bit disappointed with 3dfx: if they sell me a card that gets very hot, and it was not the cheapest one, they should include some fans. I had to buy two, and they make a lot of noise...:-(

    Michael
    Nothing special, I just run X (1280 x 1024 /16bpp). The "CPU" of the card gets very warm, but there is a small part, I would call it a voltage regulator, with a passive heatsink attached, that gets extremely hot without cooling - you can't even touch it....:-(

    Michael
    You can email me, if you would like more info on this topic.
  • Is the NVidia driver secure? >>>>>>>>>>>> Who cares? Are you doing 3D rendering on your server? Before you mention networked 3D rendering farms, let me remind you that none of the pro 3D rendering machines (Intergraph, SGI, Sun, etc) have open drivers.
  • If you're looking for

    Performance: Get a GeForce2. In everything from 3D OpenGL to 2D X performance, it whips everything else out there (including the G400)

    Features: Get a Matrox G400. They tend to have the most feature support.

    Picture Quality: Get a Radeon. The Radeon's 3D and 2D quality is amazing, though the 2D performance is a little limp.
  • nVidia's linux drivers are very fast. maybe the timedemos show better scores in Windows, but subjectively, playing Q3 in linux is smoother. I haven't played in W2K though. Maybe it has to do with 9x's sucky multi-tasking.
    >>>>>>>>>>>
    Screw Win2K, try NT4. It still has the smoothest QuakeIII experience out there.
  • The card performs very well in windows (200+fps easily in q3a),
    >>>>>>>>>
    Not really. GeForce cards are cheaper and faster. And it takes a P4 1.5GHz with a GeForce2 Ultra to run Q3 at 204fps at 640x480. I seriously doubt that a V5 gets anywhere close to that.
  • Where do you shop? On pricewatch, a Voodoo3 is about $50, and you can get a GeForce2MX for the $80 you quoted.
  • get a card that supports VESA 2.0 (or higher) and then use the VESA framebuffer kernel driver and the framebuffer X server

    if you later switch to another VESA 2.0 compatible card, it will magically work with X without any configuration changes (unless it doesn't have enough memory for that resolution and color depth...)
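
    A minimal sketch of that setup (the mode number, labels and paths here are illustrative for a typical install of the era, not taken from the post): enable vesafb at boot with the kernel's vga= parameter, then point X at the framebuffer device.

```
# /etc/lilo.conf -- boot with a VESA framebuffer console
# vga=0x317 selects 1024x768 at 16bpp (VESA mode numbers can vary by BIOS)
image=/boot/vmlinuz
    label=linux
    vga=0x317

# /etc/X11/XF86Config-4 -- use the generic framebuffer driver
Section "Device"
    Identifier "Generic FB"
    Driver     "fbdev"
EndSection
```

    Because the fbdev driver just draws into whatever mode the kernel set up, the same XF86Config keeps working across any card vesafb supports - which is the portability the parent describes.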
  • uhmm .. no. I wish it were so. IIRC the GeForce2 with its binary-only driver is the fastest one for 2D X11. I don't remember where it was, but they showed benchmarks (x11perf) where the GeForce was about 30-40% faster than the G400 MAX.

    Now for political reasons, this is a totally different story. I wish nvidia would open at least the 2D specs of their cards.
    (so that new features won't run only on MGA chips ;)

    besides ... does anyone know of any card company doing research on 2D cards? I don't play fps games. I want FAST 2D cards. And no, the current cards suck at 2D speed.


    Samba Information HQ
  • I have had some personal experience with NVIDIA cards and Linux running on an SMP system. To sum it up briefly:

    NVIDIA + SMP = BAD CHOICE!!!

    In the graphics lab where I work we have about 10 dual-CPU Dells with GeForce2 MX graphics cards, which the lab purchased on my recommendation :-(. We are running Linux 2.2.17 and XFree86 4.0.1, with the latest NVIDIA drivers (version 0.9-5). The only way we could get the system to work reliably was to turn the hardware 3D off and use the Mesa libraries (that come with XFree86). With the hardware 3D acceleration turned on the machines kept locking up, and always under different circumstances - mostly, though, when more than one OpenGL context was active at a time. But some credit is due to NVIDIA - Descent 3 always ran without any problems with 3D h/w accel. turned on.

    If you check NVIDIA's Linux FAQ, they acknowledge the SMP problem exists and promise it will be fixed in the next release. Maybe I would believe them, except that that's exactly what they promised in the FAQ that came with the previous release :-( ...draw your own conclusions. I personally don't think there are enough SMP users out there yet to make it a serious enough concern for NVIDIA to fix this problem.

    If you decide to go with a single-CPU system, the card is quite reliable and fast, although I did lock up the system after a couple of days. With an SMP system, you have basically two choices: disable h/w 3D acceleration or run only Descent 3 :-)

    We tried 4 different versions of NVIDIA's cards (TNT2, GeForce, GeForce2 MX and GTS) with 3 different kernels (2.2.14, 2.2.16, 2.2.17) but the results were all the same - lockups, lockups and more lockups. We even tried experimenting with turning DMA on/off for our hard drives, without success. Sometimes the entire system would freeze up - we could not even ping it; sometimes only the X server locked up, and it could be re-started remotely. The lockups would occur sometimes after only 20 seconds of work. On occasion, all you had to do to freeze the system was start a single OGL application. Other times the system would work fine for up to 30 minutes, with five OGL windows running simultaneously, but then freeze when you quit one of them... a completely unstable and unpredictable system.

    I have not tried the h/w 3D acceleration with the new 2.4 kernel yet. Perhaps someone else has, in which case I would love to hear their story.

    Pavol Federl

    email: pfederl@netscape.net
  • Doesn't Xfree 4.x probe for refresh rates? It seems to have got my monitor refresh rate correct with no mode lines in the XF86Config file...
  • by Greyfox ( 87712 ) on Friday December 22, 2000 @07:27AM (#543444) Homepage Journal
    Matrox cards seem to be the most thoroughly documented and therefore the least restrictive. If you need to rely on your manufacturer for binary-only drivers, you won't be able to upgrade to the kernel that will inevitably break it. I had the same problem with an intel card for a while and got stuck on an older 2.2 kernel until I upgraded to Xfree86 4.x. That's a real drag, let me tell you.

    While I'd definitely go for a Geforce2 if they had open source drivers, I'll never buy one of their cards while I have to rely on them for a binary module. My Matrox G400 at home is very nice, and if I needed a little more I'd go for a G450. You won't get the FPS of the Geforce but you won't have to worry about Matrox deciding to stop supporting Linux either.

  • I think there are two very different questions:

    1) Which card will give a screen, right out of the box, with XFree86 standard configuration
    2) Which card has the most goodies, acceleration, 3D instructions, multiprocessing, cheese-grating, etc., best supported under XFree86

    For question one, lots of popular cards work, if they've been out for a while.

    For question two, that's a whole debate.

  • I've got a Voodoo5 5500 in my box, and I tried some beta drivers (available at linux.3dfx.com [3dfx.com]) that really didn't perform very well.

    I was unable to get any color depth greater than 8bpp as I recall. This was on xfree 4.0.1 and I followed their instructions to the letter. I tried this on slackware-current (current as of a month or so ago).

    The card performs very well in windows (200+fps easily in q3a), but if you're considering buying one to run in X, I'd suggest you stay away from the voodoo4/5 line until they produce some better drivers.
  • It depends on if you need fast 3D or not. As much as open source zealots hate to admit it, NVidia has the best Linux 3D support out there right now. I've been using their drivers since they were released, and they haven't crashed in months (since the latest version was released). Setting up the drivers can be a pain, however, unless you've done it before. So, if you just want 2D, and maybe limited 3D, then go for Matrox. Once the ATI Radeon drivers are available (I'm not sure if the 4.0.2 ones support 3D), you may want to try that, too.

    ------

  • The current version of Red Hat Linux is 7, which uses XFree86 4's support for ATI Rage 128 cards - try it, the support is vastly improved; you even get accelerated 3D out of the box :)
  • You might get more focused and informative (+1) responses if you would state the purpose of the box you are building.

    • Do you need 3D acceleration?
    • Only fast 2D?
    • Is dual head something you're interested in?
    If all you need is good 2D support, there are many cheap cards that will work for you.

    If you need good 3D performance, then your options are more limited, though I'm sure most of your responses will focus on this area.

  • Got to love the tendency for people to get snide when someone doesn't know the answer to something. Especially when they have no idea who they're talking to or what areas they may have expertise in.
  • What if the problem is fundamental to the driver design? We have seen plenty of times where such a risk has been accepted in a closed environment because it is too hard / too much work to fix. They might say that it will be fixed in the next major release, tell you in the meantime not to run 3D apps while you are on a network, and expect you to be pleased with it!

    Yes, it's nice to see a company do something for Linux, BUT if it is not open source I would not let it at my kernel....period! If MS opened Windows tomorrow I would not go near it; the system is monolithic and would take a long, long time to make secure (if ever - and if it were made secure, I suspect that many of its "features" would be destroyed, such as its ease of use). If someone is asking for a video card for Linux you can safely assume they regard Linux as more important to them than Windows. I would also assume that security, source access or stability would be the reason. If you are letting a closed piece of software into your Linux kernel, afaic you are no longer running Linux; you are running a hybrid that depends on the closed piece of software for its stability and security. If you are willing to do that....that's your choice.

    BTW I know I can't spell :-)

  • Forgive me if I am talking shit but.....

    X drivers have nothing to do with the Linux kernel

    The point of this eventual 3D revolution on Linux is that the X server now DOES have access to the kernel, bypassing extra levels and therefore speeding the process up. The standard way to do this is with DRI (Direct Rendering Infrastructure), and DRI is the piece of the kernel that lets it happen. NVidia use their own method instead, shipping a kernel module that implements their own version.

  • by tjwhaynes ( 114792 ) on Friday December 22, 2000 @08:29AM (#543470)

    The GeForce2 MX is pretty damn sweet under XFree86, although I believe my shady motherboard's APM was causing crashes.

    I had instability nightmares for ages with my TNT2U on an AMD 750 chipset mobo, until I turned off the NVidia AGP support... After about six hours of coding (and maybe a couple of sessions of Quake III Arena :-) ) I suddenly realized that it hadn't crashed at all. Since then I have had only two crashes in four months, which is still pitiful for a Linux machine but much better than a crash every two hours or so before the change.

    Scan your /etc/X11/XF86Config-4 file for the Section "Screen" ... EndSection. Add the following line

    Option "NvAgp" "0"

    It isn't guaranteed to fix every NVidia crash, but I've had reports from a few people that this fix has radically improved stability. Especially if you happen to have an Aureal Vortex soundcard in your system.
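
    In context, the edited section might look something like this (note the keyword is Option, singular; the Identifier and Device names here are placeholders, and some driver versions also accept driver options in the Device section):

```
Section "Screen"
    Identifier "Screen 1"
    Device     "NVidia Card"
    Monitor    "My Monitor"
    # Disable the NVidia driver's own AGP support to work around
    # AGP-related lockups (e.g. on AMD 750 chipset boards)
    Option     "NvAgp" "0"
    DefaultDepth 24
EndSection
```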

    Cheers,

    Toby Haynes

  • This of course all assumes that you have X 4.0.1 and don't mind the closed source drivers. If 3D performance needs to be *really* fast, this may be true; but if you want nice solid 2D performance and pretty good 3D, Matrox has open drivers that work very well. Also, if you don't need the 3D (the question was not about 3D, after all), the Matrox cards cost less and are very well supported.
  • I guess you mean the GeForce 2 MX...which is the card I have. Honestly it works very well, but it is quite tedious to get working. What you need to do is:
    • Download XFree 4.0.1 (or later, I guess) and install it. I didn't recompile, just took binaries.
    • Download the sources from NVidia and recompile them (I know, they are wrappers with binaries)
    • Beware of library (.so) clashes, I had a duplicate library and I just deleted the oldest one, after that no problem
    • Read the (quite complicated) FAQ at NVidia very very well. Very important is the change of "nv" into "nvidia" in XF86Config
    • To make it work with my screen I had to specify the resolution I wanted to use in XF86Config, and only that one. Otherwise it would go for the lowest resolution - I have no clue why. (I have a fancy LCD flatscreen, which could be the problem)

    Hope this will help you. I'm not a guru or so, I followed the instructions and (after some searching and trying) I made it work.
    I use Peanut Linux 8.1 (it is some small Slackware derivative, methinks) with KDE (no Gnome, sorry).
    Good luck!
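
    As a sketch, the "nv" to "nvidia" change in XF86Config amounts to something like this (the section contents are illustrative placeholders; NVidia's own README is the authority):

```
# Before: the open-source 2D-only driver that ships with XFree86
Section "Device"
    Identifier "GeForce2 MX"
    Driver     "nv"
EndSection

# After: NVidia's binary driver (requires their kernel module)
Section "Device"
    Identifier "GeForce2 MX"
    Driver     "nvidia"
EndSection
```

    As I recall, the NVidia FAQ of the time also had you load their "glx" module and drop the stock "dri"/"GLcore" entries in the Module section.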

  • There are currently 2D Radeon drivers in the DRI cvs and possibly in XFree86 4.0.2. The DRI team is also working on the 3D drivers.

    Ranessin
  • Video capture is out of the question, of course, since there's no software for it. There's a program called xatitv that works well, but it hasn't been updated in about 5 months, is quite beta and doesn't enable bilinear filtering on the TV image, giving you the 'line effect' when the image moves sideways. (Remember that PAL/NTSC are interlaced)

    I guess you're not aware that the developers of xatitv have been spending quite a while merging their code into the r128 Xv extension, allowing users to use xawtv to view video. There is also an effort underway to get the video4linux loopback device to use the Xv extension as an input source, thereby making the Rage 128 cards (and hopefully any card with an Xv extension) v4l capable.

    Check out the livid-gatos [linuxvideo.org] website for more info.

    Ranessin
  • by xoror ( 207092 ) on Friday December 22, 2000 @07:40AM (#543507)

    Mesa has been updated to the 3.4 stable release.

    A driver for ATI Radeon adapters has been added.

    ATI driver support for multi-head configurations and non-Intel platforms has been improved.

    I copied this piece out of the changelog for XFree86 4.0.2 ...

  • Not a bad price to pay for something you can't do or don't want to do yourself

    True. If it were not something that you should get for free.

    Think of it this way: who in their right mind would pay $80 for "drivers" just to get Windows to run a piece of hardware? The company making such hardware, rightfully, should be penalized (i.e. go broke). If ATI doesn't want to support Linux, then they deserve whatever they get as a result. People would, and should, buy other hardware that does support their OS of choice. (Note: I'm not anti-ATI.)

    What you say is true, for instance, for an application. A spreadsheet. Or word processor. Or specialized application such as a cafeteria management system designed specifically for hospitals. (you get the idea of specialized application.)

    I personally find the idea that I would have to pay for a driver to operate my hardware absurd. I would take my hardware business to someone who wants my business. Some hardware vendors may not want to support Linux (directly or indirectly), and that is their prerogative.
  • at work we've been having problems with the Savage cards with the S3 chipset, so we switched to Diamond Stealth 550's; they work pretty nicely.
  • by BlowCat ( 216402 ) on Friday December 22, 2000 @08:08AM (#543517)
    But the G450 has no fan. This determined my choice - it was the only reason. I paid extra bucks to have less noise on my desktop. And I don't regret it.

    By the way, the latest kernel prepatch supports framebuffer on G450. I haven't tried it yet.

    I found this out the hard way -- the Rage 128 Pro is not supported in RedHat 6.2/XFree86 3.3.x. SuSE has patches for it, I understand, but if you've got one of these cards, you'll be better off running XFree86 4.0.x.
    Thus sprach DrQu+xum.
    # grep dos /etc/fstab
    /dev/da1a /msdos vfat rw 0 0
  • I had this question last night from a customer, and that was my choice. Sure, I could've picked ATI because of its 2d performance, but between the bad drivers and the worse hardware problems (capacitors breaking off of the Radeons), I ditched that idea. Sure, the NVidia Linux drivers are not open source, but that's one of my reasons for picking it; the only driver source is from NVidia itself, a company that has chosen to make its own Linux drivers. Furthermore, if NVidia open-sourced the Linux drivers, they'd be giving away trade secrets. Unlike other moronic hardware companies (Digital:Convergence, anyone?), they wish to hold on to profitability.

    I'd really like to see the entire XF86 video code revamped, though. Why can't there be automatic probing of valid refresh rates, like there is in every other OS with a decent SVGA system? The technology for doing so has only been there for six years (like that isn't enough time for the X crew to pick up on it).

  • Xfree86 is quickly approaching the point where this will no longer be an issue.

    Most modern and legacy cards now have decent support under XFree86. And with the recent addition of a VESA driver, we are getting to the point where you will hardly have to consider the video card issue when you install Linux.

    My advice is to find a card that advertises itself to be capable of meeting your requirements, then check the XFree86 docs to see what kind of support that card has. If these two items are a close enough match for your needs, go for it!

    Personally, I only use Laptops, so I am stuck using whatever graphics chipset the manufacturer included on the board. But since 1993, I have never had a problem getting at least 16bpp out of a laptop.
  • ONE LAST SUPER-IMPORTANT DETAIL!!!

    This isn't mentioned ANYWHERE in NVidia's FAQ (thanks a lot, NVidia). If you're using a Geforce2 on a Via KX133 or KT133 chipset motherboard, you'll need to do one more thing.

    NVidia's driver comes with a kernel module that needs to be loaded. That kernel module depends on the agpgart.o kernel module, and as of yet agpgart.o doesn't have explicit support for those chipsets (although it works fine with them). When you try to load it with insmod or modprobe (as the Makefile in the tarball does) it will fail.

    This article at www.tomshardware.com explains the fix for this.

    http://www5.tomshardware.com/graphic/00q4/001002/index.html [tomshardware.com]
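
    (One workaround I've seen for unrecognized AGP chipsets - an assumption on my part, not necessarily the fix that article describes - is agpgart's agp_try_unsupported parameter, which tells it to attempt setup on chipsets it doesn't explicitly know about:)

```
# Load agpgart first, letting it try the unlisted Via chipset
insmod agpgart.o agp_try_unsupported=1
# Then load NVidia's module, which depends on agpgart
insmod NVdriver
```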
  • The Matrox G400 (for OpenGL) and the G200 (for 2D) really are great cards for several reasons:

    • DRI [sourceforge.net]. You can speed up your OpenGL performance in XFree86 4 with direct rendering, almost to the point of Windows' OpenGL performance. Matrox cards are consistently supported first when a new project springs up for X (Mesa, Utah-GLX, DRI).
    • Support [matrox.com]. Matrox has a full-time technical support person working in the Linux Forum to help users with Matrox cards and getting them to do things like dual-head and OpenGL.
    • Drivers [matrox.com]. Matrox actually helps by making Linux drivers for their own cards. Better yet, they even give the source code [matrox.com] out.

    There are other reasons as well, but these are my favorites.
