Why Linux Has Failed on the Desktop 995

SlinkySausage writes "Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs, says kernel developer Con Kolivas. Kolivas recently walked away from years of work on the kernel in despair. APCmag.com has a lengthy interview with Kolivas, who explains what he sees as wrong with Linux from a performance perspective and how Microsoft has succeeded in crushing innovation in personal computers."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • It hasn't (Score:4, Informative)

    by jshriverWVU ( 810740 ) on Tuesday July 24, 2007 @11:22AM (#19970125)
    Been using it as a desktop since '96, and I have several friends who've been using it as a desktop for more than 5 years. Even my girlfriend uses it as a desktop now, and she needed only a day to "convert", and she's not that computer savvy.

    Now it's all in the marketing and politics, but on the software side it's there.

  • by spun ( 1352 ) <loverevolutionary@@@yahoo...com> on Tuesday July 24, 2007 @11:29AM (#19970235) Journal
    Let's just nip this little tangent in the bud, shall we? He's saying the Linux kernel is so bloated with enterprise-level crap, and so optimized for the server role, that it performs poorly on the desktop.
  • by Compholio ( 770966 ) on Tuesday July 24, 2007 @11:32AM (#19970279)
    The article really focuses on how quickly the desktop responds to user operations. I haven't personally found this to be a problem on the 2.6 kernels; however, it's unfair to say that no work is being done in this area. Kernel Trap has had several articles on people working on CPU schedulers to address this problem, and the Completely Fair Scheduler was recently merged as a potential solution: http://kerneltrap.org/node/11773 [kerneltrap.org].
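    The fairness idea these schedulers are built around can be sketched in a few lines: always run the task that has accumulated the least weighted CPU time ("virtual runtime"). A toy simulation of that selection rule (illustrative only, not the kernel's actual code; the task names and weights here are made up):

```python
# Toy sketch of the fairness idea behind CFS: always run the task that has
# received the least CPU time so far ("virtual runtime"). Illustrative
# simulation only, not the kernel's actual implementation.
import heapq

def schedule(tasks, timeslice, total_time):
    """tasks: dict name -> weight (higher weight = larger CPU share)."""
    # Each heap entry is (vruntime, name); vruntime grows more slowly for
    # heavier-weighted tasks, so they receive proportionally more CPU.
    heap = [(0.0, name) for name in sorted(tasks)]
    heapq.heapify(heap)
    cpu_time = {name: 0.0 for name in tasks}
    elapsed = 0.0
    while elapsed < total_time:
        vruntime, name = heapq.heappop(heap)
        cpu_time[name] += timeslice
        elapsed += timeslice
        heapq.heappush(heap, (vruntime + timeslice / tasks[name], name))
    return cpu_time

# Two equal-weight tasks split the CPU evenly; a weight-2 task gets twice
# the share of a weight-1 task.
print(schedule({"a": 1, "b": 1}, 1, 100))
print(schedule({"x": 2, "y": 1}, 1, 90))
```

    Equal weights end up splitting the CPU 50/50, and a double-weight task gets twice the share, which is exactly the property a "completely fair" scheduler aims for.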
  • by slickwillie ( 34689 ) on Tuesday July 24, 2007 @11:33AM (#19970301)
    It's been working fine on my desktop since Slackware '96.
  • Coral Cache (Score:3, Informative)

    by Frankie70 ( 803801 ) on Tuesday July 24, 2007 @11:44AM (#19970475)
    Here [nyud.net]
  • by mhall119 ( 1035984 ) on Tuesday July 24, 2007 @11:49AM (#19970567) Homepage Journal
    For me:

    Virtual Desktops
    Bash (not sure what shells OS X comes with)
    Beagle (not sure how Spotlight compares)
    Apt
    Beryl (ok, not really a need, but a definite want)
    Evolution
  • by mhall119 ( 1035984 ) on Tuesday July 24, 2007 @11:53AM (#19970653) Homepage Journal
    I forgot middle-click to paste in my list, I try to do that all the time when I'm using windows and it drives me crazy.
  • by jedidiah ( 1196 ) on Tuesday July 24, 2007 @12:08PM (#19970907) Homepage
    Utter rubbish.

    I use Linux as a PVR and it's more than up to the task. It can maintain adequate performance and responsiveness even when doing heavy number crunching. My MythTV boxes are quite often running at 100% cpu and a load average of 5 or 10.

    Forget "audio skipping".

    Let's try realtime video capture + realtime video decoding + 3 video transcoding jobs all going at the same time.

    I can even still use my mythbackend as a desktop with very respectable responsiveness while all of this is going on.

    "most people" are at a loss to see what his problem is.
  • by Chirs ( 87576 ) on Tuesday July 24, 2007 @12:11PM (#19970953)

    Yeah, actually his patches were pretty good. He taught himself C, grokked the kernel coding style, and was a presence on the kernel mailing list. He maintained the -ck set of patches for quite a while, and wrote a couple of new schedulers (staircase, staircase deadline, rotating staircase deadline) based around the concept of fairness.

    After quite a bit of discussion, Ingo Molnar produced the CFS (Completely Fair Scheduler), which just recently got merged. The bulk of the new scheduler was written in 62 hours, then fine-tuned over many weeks on the kernel mailing list. He gave credit to Con for proving the fair-scheduler design concept, and for some of the tuning.

    A number of people were disappointed by the perceived nepotism, where it appeared that Ingo's scheduler got merged because he was in the "in" crowd. I expect this is part of what triggered Con's decision to leave. On the other hand, the two schedulers are very different, and it may be that one really is technically better than the other--I haven't compared the two in detail.
  • PVR != Desktop (Score:1, Informative)

    by AHumbleOpinion ( 546848 ) on Tuesday July 24, 2007 @12:15PM (#19971021) Homepage
    Utter rubbish

    I would avoid phrases like that if you are going to compare an embedded application to a desktop.

    I use Linux as a PVR and it's more than up to the task

    A PVR proves nothing about a desktop environment. A PVR is a far simpler application and easy to tune for since it is an embedded application. A desktop has a far greater load and a much more unpredictable one at that.
  • by AHumbleOpinion ( 546848 ) on Tuesday July 24, 2007 @12:20PM (#19971087) Homepage
    Maybe the author doesn't grok the idea of setting the kernel to be responsive for the desktop. It's not rocket science, you know.

    Of course not; Microsoft does it for the customer so they don't need to learn how to do it themselves. Would it be so hard for a Linux distro to do the same when it is doing a "workstation" rather than a "server" install? Some distros already ask for this info regarding intended use.

    I think you are exemplifying the "by nerds for nerds" attitude that the author of the article would probably argue is holding back Linux adoption.
  • by rs232 ( 849320 ) on Tuesday July 24, 2007 @12:30PM (#19971259)
    "At that time the IBM personal computer and compatibles were still clunky, expensive, glorified word processing DOS machines"

    "Enter the dark era. The hardware driven computer developments failed due to poor marketing, development and a whole host of other problems. This is when the software became king, and instead of competing, all hardware was slowly being designed to yield to the software and operating system design"

    "However, the desktop PC is crap. It's rubbish. The experience is so bloated and slowed down in all the things that matter to us. We all own computers today that were considered supercomputers 10 years ago .. So why on earth is everything so slow?"

    "I watched the development and to be honest... I was horrified. The names of all the kernel hackers I had come to respect and observe were all frantically working away on this new and improved kernel and pretty much everyone was working on all this enterprise crap that a desktop cares not about"

    "Or click on a window and drag it across the screen and it would spit and stutter in starts and bursts. Or write one large file to disk and find that the mouse cursor would move and everything else on the desktop would be dead without refreshing for a minute"

    --

    Why Linux Has Failed on the Desktop

    "Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs", Zonk quoting SlinkySausage.

    Quoting him out of context and making him say something he didn't say ... for shame Zonk ... the headline is also misleading.
  • Re:PVR != Desktop (Score:5, Informative)

    by AusIV ( 950840 ) on Tuesday July 24, 2007 @12:36PM (#19971349)

    Utter rubbish
    I would avoid phrases like that if you are going to compare an embedded application to a desktop.

    And I would avoid correcting people when you don't know what you're talking about.

    MythTV is not an embedded application, it's a software application that runs on a general purpose PC. I, like the GP, have a desktop computer that runs MythTV. It can record two channels at once while flagging commercials or transcoding a third TV show while I use it as a desktop or watch a fourth TV show. The audio doesn't skip nor does the desktop feel slow (as the GGP suggested) until I'm functioning at 100% CPU, which is fairly rare.

  • Re:Don't think so (Score:5, Informative)

    by ardor ( 673957 ) on Tuesday July 24, 2007 @12:43PM (#19971459)
    His point is that the kernels are optimized for servers. That is, focus on throughput, performance, but not latency or responsiveness. A desktop has the latter two as priorities, while sacrificing the former two. As an example, it doesn't matter if that mpeg4 video I/O eats a little more CPU, as long as other tasks don't interrupt its playback.
  • by Khaed ( 544779 ) on Tuesday July 24, 2007 @12:45PM (#19971485)
    Virtual desktops, which another user pointed out. MythTV. I'm sure there's an analogue somewhere in Windows, but I doubt it's free.

    It works. I had more trouble getting my current printer to work with XP than I did in Ubuntu.

    I prefer the Gnome interface. I have a few panels with different purposes, and each one has a hide button (but no arrows). I keep them collapsed on the left side of my screen. It's become instinctual for me to click in certain places for shortcuts, the menu, virtual desktops, etc.

    This one could probably be done in Windows with some work: The left Windows Key minimizes all windows, and the right one mutes sound. I know Windows+M does the former in Windows, but this is a single key, not a combination. Also, scroll lock opens a file browser, etc. Shift+Left-Win_key opens Firefox, Shift+Right-Win_Key opens Thunderbird.

    In my experience, Linux IS more stable. And as I'm the kind of nerd who installed Slackware and spent eight months in it, it should be apparent I don't have a problem tweaking my system.

    The thing it has over OS X is pretty simple: Linux runs on my desktop PC. I'm sure I COULD get OS X on here, but I COULD also hack a boat engine to run in a car. It doesn't make it a good idea.
  • Re:Utter rubbish (Score:3, Informative)

    by Sancho ( 17056 ) on Tuesday July 24, 2007 @12:46PM (#19971487) Homepage

    All of the applications I use on my Linux desktop are developed with highly portable (yep, cross-platform) toolkits such as GTK+ and Qt. Most run very well on many architectures and many kernels (Linux 2.4 and 2.6, *BSD including Darwin, Solaris, etc).
    You must be new here.

    On Slashdot, it's not cross-platform if there's a single platform that it doesn't run on. See the various complaints about Flash not being "cross-platform" since it doesn't natively run on 64-bit Linux.

    But seriously, most of the time when people say "cross-platform," they mean that it should run natively on Windows, Linux, and OS X. You can often force Linux applications to run on Windows if there's not a native port, but it's usually a pain in the ass.
  • by mhall119 ( 1035984 ) on Tuesday July 24, 2007 @01:08PM (#19971813) Homepage Journal
    Have you tried Krita? It's part of the KOffice suite, but I think it handles up to 32-bit color and has CMYK and color profile support. Also, have you tried Gimp 2.3? I think it's added some more of these things that have been missing.
  • Solution (Score:2, Informative)

    by tinkerghost ( 944862 ) on Tuesday July 24, 2007 @01:15PM (#19971897) Homepage

    His point is that the kernels are optimized for servers. That is, focus on throughput, performance, but not latency or responsiveness. A desktop has the latter two as priorities, while sacrificing the former two. As an example, it doesn't matter if that mpeg4 video I/O eats a little more CPU, as long as other tasks don't interrupt its playback.

    nice -n 10 totem

    If that's your issue, create a daemon that renices the priorities of preset programs to some given level; better yet, tweak whatever launches programs to nice them as they start. That works better than starving background tasks by bumping everything running under a user's UID, while still delivering the lower latency.
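    A minimal sketch of that renice-daemon idea (the policy table, program names, and nice levels here are hypothetical; applying an action is equivalent to `renice -n N -p PID`):

```python
# Sketch of a "renice daemon": given a table of desktop programs and target
# nice levels, decide which running processes to renice. The policy table
# and process names below are made-up examples.
import os

DESKTOP_NICE = {"totem": 10, "mplayer": 5}  # hypothetical policy table

def plan_renice(processes, policy=DESKTOP_NICE):
    """processes: iterable of (pid, command name). Returns [(pid, nice)]."""
    return [(pid, policy[name]) for pid, name in processes if name in policy]

def apply_renice(actions):
    # On a real system, apply each action; same effect as `renice -n N -p PID`.
    for pid, level in actions:
        os.setpriority(os.PRIO_PROCESS, pid, level)

# Only processes named in the policy table get reniced:
print(plan_renice([(101, "totem"), (102, "gcc"), (103, "mplayer")]))
# → [(101, 10), (103, 5)]
```

    A real daemon would rescan the process list periodically (e.g. from /proc) and call apply_renice on new matches.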

  • Re:Don't think so (Score:5, Informative)

    by munpfazy ( 694689 ) on Tuesday July 24, 2007 @01:29PM (#19972193)
    But that's not the title of the article. It's just the title of a horribly written Slashdot post. The article itself is pretty reasonable, and makes some excellent points.

    But, I suppose, "why linux has failed on the desktop" sounds catchier than "a well known kernel hacker muses on the relationship between software and hardware in PC innovation and discusses the problems he sees in the way the mainline kernel developers address desktop user needs."
  • Re:Don't think so (Score:5, Informative)

    by BlueStraggler ( 765543 ) on Tuesday July 24, 2007 @01:34PM (#19972279)

    All that enterprise crap is what keeps the platform solid and almost crash free.

    I want to agree with you, I really do. But my SuSE 10.1 desktop regularly has fits where it becomes completely unusable - if I can manage to get a shell, I find that the load has spiked to 5-10 (on a single-core system) when the system was doing *nothing*. Just this morning, I woke up, poured a bowl of cereal, walked over to it to read some Slashdot over my Cheerios, and found the system thrashing and refusing to come out of screensaver because the load was so high. This happened while I was sleeping. I had to ssh in from my Powerbook and kill off any processes that appeared to be using CPU before the system would respond to the mouse.

    Meanwhile at work, we just tossed an Ubuntu server that should have been reasonably swift, but was regularly DoS'ing itself by spiking to loads of 40 or more several times a day under normal use. A load of 40-60, on a single-core machine! We "fixed" it by spending thousands of dollars replacing it with a pair of multicore beasts with scads of memory and fast disks, which seems to overpower the problem.

    Then there's that server belonging to a client, a RHES 4 system. When I ssh in through a tunnel to update it, it insists on running the update program as an X client for crissakes. Then it tells me to register the system at a URL, but the URL cannot be selected or copied to the clipboard. This is "enterprise" quality software?

    Back at work, the dev server is still a RedHat 7.3 clunker. It has a half dozen developers fine-tuning their infinite loops, fork bombs, broken joins, buffer overruns, and spaghetti code, all day long. It simply never crashes or hangs, never gets slow, and never complains about the abuse it receives. It's a rock-solid dream. Except that it's a damn nuisance to update, since it's so old. And it's only hobbyist-quality software, after all, built before RedHat went all enterprise-centric.

    Posted, with regrets, from my Powerbook. I'm starting to think that software built for the home user is a safer bet than the "enterprise" shite I'm dealing with every day.

  • by Animats ( 122034 ) on Tuesday July 24, 2007 @01:40PM (#19972385) Homepage

    First, the Slashdot article is terrible. The article isn't about "why Linux is failing on the desktop", it's about why a kernel developer who was trying to improve scheduling performance quit.

    The scheduling issue is interesting. I used to work on mainframe schedulers, I've done real-time work, and I'm familiar with the issue in game implementation, so I know how hard this is. We could do better than what we have now, but not by some magic fix to the scheduler. We have to look at interactivity as a real time problem.

    It is, too. Alan Kay used to say that there is no more excuse for a delay between pressing a key on a computer and having something happen than there is on a piano. We haven't been faithful to that, and it subtly drives users nuts.

    One useful idea from the real time world is explicit "sporadic scheduling". Some real time operating systems have this. A process can explicitly request that it wants, say, 10ms of CPU time every 100ms. The scheduler must reject that request if the system is overbooked. If it does accept the request, the scheduler has committed that much resource to the process. If the process overruns its time slot, it loses priority and an overrun is tallied.

    This is what an audio or video player should be using. This is how you get audio and video that don't pause or skip. For this to work, the player must be able to calculate, for each system it runs on, exactly what resources are needed to play the current content. This may take more analysis and benchmarking than many programmers are used to doing. It's worthwhile to make overruns visible to tools outside the application, so that users can detect broken applications. To a real time programmer, overrunning your time slot means "broken". You have to think that way.
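    The admission test and overrun accounting described above can be modeled in a few lines (a purely illustrative toy, not a real RTOS API; the class and method names are invented for this sketch):

```python
# Toy model of the "sporadic scheduling" idea: each process reserves a CPU
# budget per period; the scheduler rejects a reservation that would overbook
# the CPU, and tallies overruns when a process exceeds its slot.
class SporadicScheduler:
    def __init__(self):
        self.reservations = {}  # name -> (budget_ms, period_ms)
        self.overruns = {}

    def reserve(self, name, budget_ms, period_ms):
        """Admit the request only if total utilization stays at or below 100%."""
        load = sum(b / p for b, p in self.reservations.values())
        if load + budget_ms / period_ms > 1.0:
            return False  # overbooked: the scheduler must say "no"
        self.reservations[name] = (budget_ms, period_ms)
        self.overruns[name] = 0
        return True

    def run(self, name, used_ms):
        """Record one period of execution; overruns are made visible."""
        budget, _ = self.reservations[name]
        if used_ms > budget:
            self.overruns[name] += 1  # to a real-time programmer: "broken"

s = SporadicScheduler()
print(s.reserve("audio", 10, 100))      # True: 10% of the CPU reserved
print(s.reserve("video", 60, 100))      # True: 70% reserved in total
print(s.reserve("transcode", 50, 100))  # False: would exceed 100%
s.run("audio", 12)
print(s.overruns["audio"])              # 1 overrun tallied
```

    The point is that rejection happens at admission time, before the system is overloaded, and overruns stay visible to tools outside the application.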

    On the interactivity front, it's useful for a thread to be able to request a high priority for a short period after an event, with a priority drop to follow quickly if it keeps the CPU too long. That's how you get the mouse cursor to track reliably. Of course, the thread that handles mouse events has to pass off all the real work to other threads, not stall the thread handling fast events.

    It's also probably time to end paging to disk. When it works, paging at best doubles the effective RAM. But paging inherently results in long unexpected delays. If you want interactivity, don't page. Real-time systems don't. Neither do game consoles. RAM is so cheap that it's not worth it. (1GB starts at US$56 today at Crucial.) Paging devices have maxed out around 10,000 RPM since the 1960s and haven't improved much since then. Give it up. Today, paging is in practice mostly a means for dealing with memory-hogging apps. (Hint: open "about:config" in Firefox and turn off "browser.cache.memory.enable" so it doesn't save screen dumps of each page for faster tab switching.) It's probably time for Linux to not page interactive processes by default.

    This implies an operating system that says "no" when you put on too much load, instead of cramming it in and doing it badly. Open too many windows of video, and at some point the player won't open another one. There's nothing wrong with that, but most Linux/Unix apps don't handle resource rejections from the operating system well.

  • Re:Yes he is (Score:1, Informative)

    by Anonymous Coward on Tuesday July 24, 2007 @02:58PM (#19973563)
    Enterprise users ARE normal users. In fact, in a large enough enterprise, you will find a greater diversity of users than anywhere else. Enterprise organizations employ people who barely know how to check email. They employ people who have their own domain name that they got off of Yahoo. They even employ /. readers.

    Enterprises are the reason why the majority of home computers run Windows. It is easier for the average or normal user to only have to learn ONE OS, not the half dozen different OS's I multi-boot into.

    That really is the whole point of why Linux has trouble on the Desktop. The average user doesn't understand how to modify the Windows Registry. Why would they understand the 'nice' command?
  • by kollywabbles ( 645848 ) on Tuesday July 24, 2007 @03:04PM (#19973641)
    Uhh... Open-Xchange 5?

    Integrates with Outlook clients, even.

    Way more features than Exchange, too. Better, faster webaccess... groupware, collaboration, doc management, company forums... you name it.

    I've been replacing Exchange servers with it all year long.
  • Re:Applications (Score:2, Informative)

    by Jthon ( 595383 ) on Tuesday July 24, 2007 @03:23PM (#19973931)

    Would I rather run Linux? Yes. Vista thrashes the disk around like crazy the whole time the machine is on, and it can only see 2.5 gigs of the 4 gigs of RAM I have installed. I suppose I could shell out a few hundred for 64-bit Vista, but who knows what drivers will and won't work in that.

    Well, if you purchased the retail version of Vista, you can go to http://www.microsoft.com/windows/products/windowsvista/editions/64bit.mspx [microsoft.com] to get a free DVD of the 64-bit version. It also appears that you don't get a new product key, so if you have access to the 64-bit media you can just use your 32-bit key to install 64-bit Vista. Though I have no idea if this works or is allowed with an OEM copy of Vista.

    This might work for you until there's an update to Ubuntu which allows your 8800 GTX to be fully supported.

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday July 24, 2007 @03:53PM (#19974379) Journal

    Granted. But this goes back to the age-old argument that one should never buy a new computer, because 6 months later it will be half-priced anyway.

    What I do is, I look for the sweet spot.

    That is: 4 cores is an improvement over one, certainly. However, back when I bought this computer, one core running at 2 ghz or 2.4 ghz was something like twice to three times the price of one core running at 1.8 ghz.

    In other words, I'm not telling you to wait 6 months, I'm telling you to buy the hardware that was cutting-edge 6 months ago, which is 80-90% of what's cutting-edge now, but half the price. It also means you can do this twice as often as you could afford to before.

    Generally, I find that this just-behind-the-curve hardware is almost as good, costs much less, and is more likely to be physically more reliable, as well as have better drivers in any OS.

    If I am forbidden to buy the best computer I can afford, when I need said computer in order to run Linux, then that could be considered by some to be a failure of Linux on the desktop. It is not necessarily the fault of Linux, but it is indeed a problem.

    I'll agree with you there, and in your situation, it is kind of worse -- but take your soundcard. There are all kinds of high-end soundcards supported by Linux.

    So, in some cases, bleeding-edge hardware really won't be supported. In other cases, it's that you can get hardware just as good at about the same price that will be much better supported, if you do the research ahead of time.

    And if that research takes too much time, buy a Dell with Ubuntu and let them do the work for you.

    By the way -- would it really cost $200 for you to upgrade to 64-bit, or is that the full version? Is it possible to buy just an upgrade?

  • Re:Don't think so (Score:3, Informative)

    by Frumious Wombat ( 845680 ) on Tuesday July 24, 2007 @03:57PM (#19974455)
    Basically, your complaints (and I remember the mysterious "box doing nothing with a load of 25" issue, probably X-related) and the replies (recompile everything from scratch after patching) point toward why it's time for you to try *BSD or, more likely, Solaris 10. If your boss will spring for new hardware, of course, you should just get a copy of OSX Server with unlimited client access. Fifteen years ago in grad school, my boss watched me IPL a VMS box from tape, read the console for a few minutes, and then emphatically told me that machines like that were supposed to be extinct by now. One might argue the same for OS's which still require you to manually configure the source and then start recompiling the kernel and libraries. I'm running my lab off an OSX server box, with all the standard services (it manages both the desktops and the compute nodes on the private network), and would really be hard pressed to give a reason to change to Linux, Windows, or anything else. Nobody has ever suggested I recompile anything on that box to make it work right, and there's very little you can't tweak from the GUI or by editing rc.local (good old BSD: one file for tuning).

    A few years back, when Sun was pushing its Linux desktop, it made its sales reps (at least the ones in the upper midwest) use it on their business laptops. Within the year, every one of those laptops had somehow mutated into PowerBooks running OSX, with far happier reps using them. I somehow transitioned from a dual Xeon to an old G4 without noticing, because of the apps and seamless experience which just let me work, which is what I suspect you've seen with your Powerbook as well. Someone took the time to add some polish, and it shows.
  • Re:Don't think so (Score:5, Informative)

    by a.d.trick ( 894813 ) on Tuesday July 24, 2007 @04:05PM (#19974567) Homepage

    Two words: Direct Rendering

    The issues you're describing have almost nothing to do with Linux and everything to do with your graphics card driver (or the lack thereof). If you've ever run Windows XP on a system without your graphics card driver, you will have experienced the same thing. In fact, in my experience it's quite a bit worse.

    There certainly are some things that could be optimized in Linux, but I think those are relatively insignificant.

  • choppiness (Score:4, Informative)

    by hawk ( 1151 ) <hawk@eyry.org> on Tuesday July 24, 2007 @04:31PM (#19974935) Journal
    I would blame linux, not X :)

    While the difference isn't nearly what it used to be, FreeBSD has always had far less of that on the same hardware and the same version of X.

    Back on a 486 (and even my K6, iirc) linux could freeze for seconds under loads under 4, while at least the mouse kept working at 20 and up.

    The last time I compared on the same hardware (a couple of years ago), Linux was merely "annoying" under load, rather than the older "unusable".

    hawk
  • by Movi ( 1005625 ) on Tuesday July 24, 2007 @05:18PM (#19975597)
    Seems someone didn't RTFA. CK is saying that while he was pushing the kernel to be speedy on the desktop, all the other developers were pushing it to be speedy on the server, sacrificing the desktop performance (and this is because all the others were big-corporation workers).

    And yes, I know it isn't "either the desktop or the server," but I can see his POV as being somewhat right.
  • by ckolivas ( 1132603 ) on Tuesday July 24, 2007 @09:52PM (#19978507)
    It seems they were sensitive to my complaint and have changed the title of the story at apcmag now. The slashdot title for the interview and their misquoting was... unfortunate.
  • by ghostunit ( 868434 ) on Wednesday July 25, 2007 @12:42AM (#19979617)
    I work as a programmer and use a Linux server at work. I really like the idea of an open system where I can modify whatever I want, make scripts to control the behavior of the system, etc. So I try it at home, and here's what happens:

    Mandriva 2006 worked on a friend's laptop so I try to install it at home. Works fine for a few hours until, for some still unknown reason, the headphones suddenly emit ear-jarring static at the maximum possible volume no matter what I'm doing or what settings I use. I try reinstalling. The problem repeats.

    A few months later Mandriva 2007 comes out and I give it a try. Installation finishes and on first boot-up the system freezes on the "setting hardware clock" step. Ridiculous. I try reinstalling, same problem. Try again, this time updating from Mandriva 2006. It works, but random stuff is broken. I trash it.

    But it's fine since there's this great new distro called Ubuntu that everyone's praising, right? I can't stand gnome so I download kubuntu and proceed to install. I live in Japan so I tell the installer to set the system language as Japanese. First thing I try is surfing the web, but the text is all monospaced and is hard to read. I try messing with firefox and the system settings, no good.

    Ok, moving on. I need Wine for my Japanese word processor, so I go to the system admin menu and click on the Wine icon. It tells me it will install it for me. Repository linking and such and such. Result? "It appears you don't have wine installed on your system". I try installing it from the package manager GUI and the command line. It doesn't even know it exists.

    Well, let's look at some videos. I open them with MPlayer, but an error message window pops up intermittently, non-stop, telling me "cannot find PCM audio controller" even though the audio is fine. Ok, let's try finding help on the net. There's only one forum post describing the same problem, and the answer is: he knows how to fix it, but he won't bother because it's in the man page. An hour of messing with the program's config files later, the error message is no longer plastered all over my screen. Fine. Hey, how do I display this file's subtitles? Not supported?

    Well, let's try Xine. Um, why is it that when I press the scan forward/backward key it moves like 2 minutes? I just want it to skip 5 seconds! Shouldn't that be the default behavior? I dig into the config files. An hour later I find the key, but it will only give me 7 seconds. Fine. I'm content watching anime until weird artifacts appear. Whole parts are covered in random green pixels.

    I try moving files around with the file system GUI, Konqueror. I can't move them because I'm not root. That would be fine, except there's nowhere I can give a su command to this thing. While doing whatever, I get random application crashes; a window with a KDE gear with a bomb inside appears.

    I shut down the system incorrectly, and ReiserFS won't start until it's rebuilt the tree or something. 2 hours later it's done. It once ruined one of my directory trees. NTFS wouldn't even bother me with any of this.

    Linux is slow to boot up. Maybe Linux is fast but X is damn slow and clunky. When Windows XP is done booting, Linux is just starting X, or finishing mounting the file systems.

    The display on Kubuntu irritates my vision. I install the nvidia driver and set the refresh rate correctly, and yet I can't look at the thing for 15 minutes without starting to feel bothered. Mandriva was fine.

    Conclusion: Linux works well running things that have had a lot of work and testing put into them, like Apache and WebSphere, but on the desktop it's a mishmash of poorly coded apps with an even more poorly integrated system beneath. Oh, and it won't work with my TV card.
