
Why Linux Has Failed on the Desktop

Posted by Zonk
from the just-maybe-games-are-involved dept.
SlinkySausage writes "Linux is burdened with 'enterprise crap' that makes it run poorly on desktop PCs, says kernel developer Con Kolivas. Kolivas recently walked away from years of work on the kernel in despair. APCmag.com has a lengthy interview with Kolivas, who explains what he sees is wrong with Linux from a performance perspective and how Microsoft has succeeded in crushing innovation in personal computers."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Don't think so (Score:4, Insightful)

    by jasonmicron (807603) on Tuesday July 24, 2007 @10:21AM (#19970097)
    how Microsoft has succeeded in crushing innovation in personal computers.

    I found that rather funny. Blaming Microsoft for your own lack of creativity and ingenuity.

    Besides, Steve Jobs would very much disagree.
  • by halivar (535827) <bfelger.gmail@com> on Tuesday July 24, 2007 @10:21AM (#19970099) Homepage
    Some of us find it quite up to the task. The choice of desktop OS is up to the consumer and their individual needs. Some people need Windows, some people need Mac. Some of us need Linux because Windows and Mac have failed on OUR desktops.
  • Applications (Score:4, Insightful)

    by grasshoppa (657393) <skennedy.tpno-co@org> on Tuesday July 24, 2007 @10:23AM (#19970127) Homepage
    The answer is simple: Application support. That's why desktop linux has failed. Nevermind the rest of the chatter; I can tell you that had I had the applications needed, I would have switched two organizations over to linux desktops by now, possibly more.

    And it's not a problem of performance; It's a question of politics. We have to convince enough software vendors to start coding in a cross-platform language/way.
  • by ArcherB (796902) * on Tuesday July 24, 2007 @10:25AM (#19970169) Journal
    One of the strengths of Linux is also its biggest weakness. If someone has a computer and for some strange reason needs to install an OS, which Linux distro do they choose? I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.

    Another problem is the MS dominance over the OS market. It's hard to buy a computer without Windows and even harder to purchase one with Linux preinstalled. Your average computer user is not going to purchase a computer that won't run (because of no OS), and even if they did, when they go to the store to pick up an OS, all they see is Windows.

    Linux users need to stick to a distro that works, is easy, is well known, and comes as an option to be preinstalled on computers from the majority of manufacturers, even if it is alongside Windows or as a bootable DVD thrown into the box.
  • by A nonymous Coward (7548) * on Tuesday July 24, 2007 @10:27AM (#19970193)
    Right, blame it all on the kernel performance, as if the average user could even notice, say, a 10% difference in any kind of speed.

    Nothing to do with a monopoly.

    Nothing to do with existing applications that WINE can't handle.

    Just kernel speed. He's a freakin' genius, this boy is.
  • Re:Don't think so (Score:5, Insightful)

    by jellomizer (103300) * on Tuesday July 24, 2007 @10:29AM (#19970219)
    Creativity is very rarely an out-of-the-blue thing. It is about looking at many alternatives, trying to take what you like about them and make it your own, with perhaps something extra to get them to work correctly together.

    Having many different OSes and computers around, we would be much better off seeing what works, what doesn't, why it does, and how to improve on it. Back in the '80s, if I were asked how a desktop system would look in 2007, I would have given a much different answer (in my mind a 2007 desktop would look more like Plan 9 and less like Windows). But during the '80s the only GUI I had experience with was GEM Desktop, and I didn't particularly care for it. I expected graphics in 2007 to be a bit better than they are now, but the OS in my mind would have frames, not windows.
  • Not failed, niche (Score:2, Insightful)

    by athloi (1075845) on Tuesday July 24, 2007 @10:29AM (#19970223) Homepage Journal
    Microsoft designs software like GM builds cars: for the average person, which is defined by having average needs. For checking email, web surfing, and using Quicken, Windows is the better product. For those with either far broader needs, or much more specialized ones, there's Linux or FreeBSD. However, Kolivas makes a good point: Linux has not adapted to the desktop paradigm and so alienates many potential users with its somewhat doctrinaire requirement that they learn entirely new ways of doing common tasks.
  • by RedHat Rocky (94208) on Tuesday July 24, 2007 @10:31AM (#19970251)
    I find it hard to get any work done with just a kernel.

    Oh, perhaps what the poster is really talking about is Linux distributions. Ah! In that case, Ubuntu has made major major progress and I would say is "mother" ready. I, in fact, have my family of four and my parents doing their daily computing on Ubuntu.

    Failed on the Desktop? Hardly.
  • Again??? (Score:4, Insightful)

    by Pedrito (94783) on Tuesday July 24, 2007 @10:31AM (#19970269) Homepage
    Everyone has a take on it. Haven't we had this discussion a hundred times on Slashdot?

    My personal opinion, after having used Linux quite a bit, is simply that Linux isn't ready for the desktop. While many apps have easy to install packages, a lot of apps don't. Particularly smaller, single-developer shareware kind of apps. Many of these require getting source and compiling, something my mother or grandmother won't be able to do.

    Speaking of my mother and grandmother, the other thing they already find confusing enough is the Windows directory layout. Linux is FAR more complicated in that department. They'd find organizing their documents much more difficult.

    Finally, frankly, I don't find the UIs all that intuitive to use. I've used Gnome and KDE. I prefer KDE, but I have issues with both. It took me a while to figure out how to drag and drop gzip compressed files from KDE. I can't even remember how it works off the top of my head, I'd have to go do it again. But it definitely wasn't as intuitive as drag and drop from say WinZip to a folder in Windows.

    The fact is, Linux just isn't ready for the desktop. Don't get me wrong, huge strides have been made over the past few years in usability and I suspect it'll get there eventually, but it's not there.

    Another issue is the community, which in many places is hostile to newbies. I've been insulted on more Linux support forums for asking questions than I've ever been on Windows support forums. There are places to get good support for Linux, but there are a lot of really hostile ones too. Windows may have some hostile ones, but I just run into it far less frequently.

    This is just my personal opinion, based on my experiences with it. Other people may have had different experiences. I still love Linux for certain things and I run a Linux box as a file server, firewall, database server and for video editing. I'd never trust connecting a Windows box directly to the internet, but I've always trusted Linux for that. But as a desktop environment, it just doesn't work for me.
  • by Culture20 (968837) on Tuesday July 24, 2007 @10:33AM (#19970297)
    When did that happen? I keep convincing people to try out linux (or at least OSS on their MS Windows boxes), and quite a few make the switch. The people I see sticking with Windows are the gamers (which might explain MS's drive to push the Xbox360).
  • Re:It hasn't (Score:2, Insightful)

    by pete.com (741064) on Tuesday July 24, 2007 @10:36AM (#19970343)
    ... and here I thought there were more than several people and one girlfriend using desktop computers; shows what I know!

    The fact that you have a girlfriend makes your opinion suspect anyway.
  • by Hideaki (1115733) on Tuesday July 24, 2007 @10:36AM (#19970353)
    I disagree. If all I wanted to do with my PC were surf the web, check email, chat with friends and use office apps like the "average" computer user does, I'd switch to Linux in a heartbeat; in fact, I did for a while. But the one thing that keeps me bound to this monopoly that is Windows is gaming, which goes back to application availability. I don't think what the article is saying is true at all. I ran Linux for months and didn't notice any of this "slow speed"; in fact, it ran faster than Windows, if anything. Even for games running through Wine.
  • by spun (1352) <loverevolutionary&yahoo,com> on Tuesday July 24, 2007 @10:38AM (#19970371) Journal
    Note too that much of the work being done in open source these days comes from companies like IBM, Redhat, and Novell, not from Joe Q. Randomhacker. These companies see the server market as the largest, most profitable Linux market. That's where they're throwing their development dollars. Hey, here's an idea: why not make a desktop distribution without all that enterprise crap in the kernel?
  • by hey! (33014) on Tuesday July 24, 2007 @10:38AM (#19970381) Homepage Journal
    What is "Linux" for that matter?

    If by "failed" you mean "failed to achieve X market share", I should think the answer is obvious: normal people don't give a flying fuck what kernel their operating system uses. And since their computers come with Windows preinstalled, they are not going to swap operating systems to get a better kernel -- or a better license. Even MacOS wouldn't be where it is, if it was developed and sold as a purely OS product, instead of being bundled with Apple computers.

    On the server end, people are concerned about capacity, performance, and licensing restrictions, so it's a different ball game.

    People have only two problems with the Linux kernel, and neither of them is due to the existence of enterprise features: (1) the USB doodad they just bought doesn't work automatically and (2) the specific application doesn't support any version of Linux. As to why this is so, it all comes back to the fact they don't care what kernel they have and they already have Windows, so people in the business of catering to them don't bother to do anything to fix these problems. If they did, user apathy means it wouldn't make a big difference in Linux desktop adoption.

    In the end, this is a situation that only Microsoft can change, and that by screwing up. Maybe they have with Vista, but I think not. Vista will be like the old 640K DOS memory limit. Industry (other than MS) will move heaven and earth to accommodate it, should it become the status quo, which given user indifference will probably happen.
  • by Bizzeh (851225) on Tuesday July 24, 2007 @10:40AM (#19970407) Homepage
    This isn't a flame, or a troll; this is the truth, which you can go to any non-fanboy site and find out for yourself.

    GUI in Linux is slow... face it... it's true...
    GUI in Linux is clunky...
    GUI in Linux isn't centralised around one goal... you have several parties all throwing in their two pence.
    GUI in Linux isn't intuitive. (It is more so than it was, but still not enough.)

    Linux as a kernel is slower than NT and BSD as kernels.
    Linux, until recently, was terrible at scheduling.
    Linux is still terrible at threading.

    All these combined make for a bad experience for your average Joe user (nobody on Slashdot can relate to average Joe, since if you read Slashdot, you are not average Joe), which will leave them frustrated and annoyed that they can't use the system they paid a lot of money for.
  • by LWATCDR (28044) on Tuesday July 24, 2007 @10:42AM (#19970453) Homepage Journal
    Wow, you really didn't bother to read the link, did you?
    The author was speaking about how poorly Linux performed ON the desktop. Things like audio skipping and the desktop feeling slow. He was talking about how the kernel was so slanted toward big iron and the server market that it has ignored desktop performance. He was also talking about how hard it is to create benchmarks that show interactive responsiveness.
    He also talked about how hard it is for "normal" users to communicate problems to Kernel developers.

    What he is talking about is how Linux has failed to perform as well on the desktop as it does on a server.

    What most people have failed to notice or care about is that this is a person who actually tried to fix problems by writing code! He was truly working under the FOSS ideal and has given up.

    Too bad so many people are dismissing what he has to say.
  • Re:Applications (Score:5, Insightful)

    by nyctopterus (717502) on Tuesday July 24, 2007 @10:43AM (#19970455) Homepage
    ... and Photoshop, Illustrator, and video editing apps. There is a hell of a lot of media software that people need that you just can't get on Linux (either natively or as a replacement). I want to switch (from OS X), but I just can't. For example, what do I replace After Effects with?
  • Re:Applications (Score:4, Insightful)

    by slackmaster2000 (820067) on Tuesday July 24, 2007 @10:45AM (#19970501)
    Agreed.

    There are lots of applications out there, but perhaps too many. So many applications have everything but that one critical feature that an organization has come to rely on. "But it's open source and you can implement it yourself!" True, but that costs real money too. Quite a bit actually.

    And then there are the vertical applications that we can't move away from because we've got years and years worth of data stuck in the swamp. Yes, we could migrate but at what cost? Business doesn't care about operating systems or information philosophy, it cares about getting the job done to make money. It would take a considerable cost advantage to move an organization of medium size or larger from a Windows environment to a Linux environment.

    I spent years on and off trying to figure out how to move my company to Linux both on the desktop and the server. It's just too much, even still. Our business is manufacturing Widgets, and we get along just fine in our Windows world. If we were starting over from scratch today with the 5 or so employees we had when I first started a decade ago, I would make different choices. I despise 3/4 of Microsoft server products, and I hate the cost of MS Office.
  • Re:Applications (Score:2, Insightful)

    by shelterpaw (959576) on Tuesday July 24, 2007 @10:46AM (#19970511)
    That may be true for the business environment, but you're not accounting for the multimedia environment. I can't use Linux as a main desktop machine because it just doesn't have powerful enough applications. Some are great, but they're not up to snuff with products from Adobe and others. Though you can do more with some Linux applications, they tend to be a hodgepodge of apps; even iLife is a better suite of applications. iLife applications work together so effortlessly, and I've tried a slew of applications on Linux which can't touch it. Then take into account professional music apps. There are some for Linux, but there's just not enough support.

    At least in the business environment, I can live with OpenOffice.
  • by Anonymous Coward on Tuesday July 24, 2007 @10:46AM (#19970515)
    recently the Completely Fair Scheduler was merged

    That _might_ be one of the reasons he's pissed off, you know ...

  • Re:Don't think so (Score:5, Insightful)

    by ImaLamer (260199) <john.lamar@g[ ]l.com ['mai' in gap]> on Tuesday July 24, 2007 @10:47AM (#19970531) Homepage Journal
    The whole thing is dead wrong. All that enterprise crap is what keeps the platform solid and almost crash free.

    Sure, some extra code may slow things down, but since Linux, Windows and even MacOS are now all based on server kernels (Linux's own, VMS/WNT for anything newer than Windows 2000, *BSD), they don't crash much. YOU may have problems with XP or 2000, but you shouldn't. I've had an XP install going for more than four years, Windows 2000 running for months. (If you can't do this, you should not be using it, 'nuff said.)

    Code doesn't care how many employees you have. Maybe this guy belongs at Ubuntu, where things are moving towards the 'desktop'. Just ask my new Ubuntu installation on my laptop - it's running like a desktop just fine. I just finished 5 hours of World of Warcraft on it!
  • by dlenmn (145080) on Tuesday July 24, 2007 @10:48AM (#19970539) Homepage
    From TFA:

    Although I'd never learnt how to program, looking at the code it eventually started making sense.

    I was left pointing out to people what I thought the problem was from looking at that particular code. After ranting and raving and saying what I thought the problem was, I figured I'd just start tinkering myself and try and tune the thing myself.
    It could be that he was a natural and had great intuition, or it could be that he had no idea what he was really doing. Does anyone know? Were his patches any good? I'd have some doubts if some dude with no programming experience came along and started claiming that everything was wrong with the kernel but he knew how to save it...
  • Exchange, bitches! (Score:5, Insightful)

    by LibertineR (591918) on Tuesday July 24, 2007 @10:50AM (#19970611)
    Competing OS are not keeping Linux off the desktop in big numbers, EXCHANGE is, period.

    Anyone familiar with how Microsoft locks in customers will tell you the same thing.

    We have reached a point where neither the desktop OS nor the server OS matters as much as the apps they run. Exchange is the one app that is almost a must-have. Anyone can list the non-proprietary stuff that runs 80% of Exchange functionality, or 50%, but does it better, and so on and so on.

    Give it up, and start building something that takes Exchange on directly, feature for feature, with better recovery, and message pushing to handheld devices.

    Or, maybe just shut up? This has been obvious for years. Microsoft keeps improving Exchange, enterprises keep buying it, and everything else that goes with it.

    Linux cannot exist on its own with a bunch of 50-to-80% solutions, expecting to fill the gap by the temporary pleasure of giving Microsoft the finger from time to time.

    Either compete or change the game. Only Google and Apple seem to get this.

    And can we stop asking this question over and over again?

  • by ArcherB (796902) * on Tuesday July 24, 2007 @10:51AM (#19970613) Journal

    I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.

    I can't name them, either. But I also can't name all the available versions of Windows. So what?
    Let's see how I do:
    Available?
    XP Home, Server, and Enterprise
    Server 2k3
    Vista Home, Professional, and Ultimate.

    Let's see how I did by comparing to this [wikipedia.org] Wiki page. I missed:
    Windows Vista Starter
    Windows Vista Enterprise
    and the Windows 2k3 server editions. There are six, which I lumped into one.

    Of course, I skipped the embedded versions. Also, many of those are Enterprise editions of 2003, so you probably won't find people confusing them for a desktop OS (well, they might until they see the price tag!). Also, I skipped older versions of Windows since they can no longer be purchased. They would include Windows 1-3, NT4 and Windows 2k.

    I'd copy and paste the Linux list from Distrowatch, but they only list top 100.
  • by cyphercell (843398) on Tuesday July 24, 2007 @11:02AM (#19970795) Homepage Journal
    the title of the article is flamebait, of course people are going to respond to it.
  • by Anonymous Coward on Tuesday July 24, 2007 @11:04AM (#19970819)
    It's reasoning like yours that Linux has never *taken hold* on the desktop. Failure would connote that Linux at one point made a big hit on "everyone's" desktop and was left for something else. This NEVER HAPPENED.

    Mod article 100% flamebait. (Though that might be a little redundant seeing how that's pretty much how every article is.)

    --Parasonic
  • by smooth wombat (796938) on Tuesday July 24, 2007 @11:04AM (#19970825) Homepage Journal
    Easy as hell! I have my grandmother using WindowMaker and I set up four buttons for her - Word Processor, Email, Web Browser, and Instant Messenger.


    Stop. Reread what you just posted. First you say it's easy to use. Then you say that you configured your grandmother's machine with four buttons she can use to access the things she uses most.

    If it's so easy, why did you have to configure those buttons? Why couldn't your grandmother do it herself?

    I'm not saying whatever version of Linux your grandmother is using isn't easy to use. What I am saying is that well-meaning folks like you who support Linux on the desktop always use an example such as the one you gave to show how easy Linux is to use yet, by your own admission, you had to do the setup. You had to do the configuration.

    This isn't to say that configuring Windows is necessarily easy or even intuitive. However, either through force of repetition or blind luck, the average person is able to configure a Windows environment more easily than a Linux environment.

    I don't personally use Linux though I have fiddled with Slack 10 and Debian so maybe my perceptions are off, but the overall point is that those who support Linux and who say how easy it is to use ALWAYS say they got a family member/SO/whomever to use it AFTER they configured it for them so therefore, it must be easy to use. That's looking at it from the wrong angle.

    I wrote in a post a while back about documentation and how the biggest problem with it is that it isn't detailed enough for the average person. People, despite the innate intelligence we are supposedly born with, like to be handheld the first few times when doing something. Particularly if they have never done it before.

    You and I may be able to program our VCRs and DVD players (well, not me yet. See my journal for why) without reading the manual, but that is only because we have been exposed to the general process for so long that we can draw upon our past experiences to get us through the configuration. Joe Average can't (or won't, depending upon how militant they are).

    I don't know what the answer is because installing an OS, even as streamlined as Microsoft, Apple and the various Linux distributions have made it, is still not easy. There are still questions that need to be answered to configure it that I'm certain your grandmother couldn't answer without your guidance.

    Yes, once the OS is installed and configured things will just work, but as has been said a billionteen times before, people don't want to have to go through a long configuration process. They want to be able to put in a floppy/CD/glass block/whatever and, other than double-clicking on an icon, have the software installed and ready to go.

    I realize this is somewhat of a rant but those of you who work with Linux on a daily basis think that using your distro is simple and easy. Which it is but ONLY because you've been working with it for X months/years/decades/eons and know it pretty much inside and out. Take someone off the street and have them do an install of the OS or a piece of software on Linux and I can guarantee you they will tell you to do things to yourself which are not possible (except if you're a master contortionist).

    Easy is a relative term. What is easy for you or me is not easy for our parents or grandparents. Those who produce Linux distros need to understand this and have it plastered all over their work spaces so every time they do something they should always ask themselves, "Is this something that Joe Average can do?" not, "Well shoot, this is simple. All one has to do is rm -f *%!@, then grep for dlist -t to be sure it was disjoined at which point they can do an apt get something and finally a make something. I can do that in my sleep!" (and yes, I know what I wrote makes no sense. That is exactly what the outside world hears when you folks talk about doing something)

  • by Anonymous Coward on Tuesday July 24, 2007 @11:06AM (#19970859)
    That may be fucking fine for you but some fucking people like fucking choices. But at the same fucking time, I fucking agree that too much fucking diversity can cause a fucking lack of fucking direction.
  • Re:Applications (Score:5, Insightful)

    by Asphalt (529464) on Tuesday July 24, 2007 @11:07AM (#19970877)
    I don't think Linux on the Desktop has failed. I have been running it in one form or another since the mid-90s, and have an Ubuntu machine now. As a matter of fact, every other machine but this one is running Linux on the desktop.

    But ... I am currently using Windows Vista as I am typing this.

    Why?

    Because I got a new machine. Quad Core with 2x8800GTX cards powering 3 monitors and an X-Fi Sound Card. It looks and sounds great.

    But when I tried to install Ubuntu 7.04 on it over the pre-installed Vista, I got a blank screen. Apparently the 10-month-old 8800GTX drivers are not included on the Ubuntu install disk. Yes, there are some workarounds, using a text install, installing Envy from a shell, and some other tips that may or may not work (results have been mixed), but it's a leap of faith. People running the 8800 cards in Ubuntu have been generally disappointed at their performance, from the reading I have done on the Ubuntu forums, finding them slower than 7xxx series cards, and even slower than 6800s. What a waste of expensive graphics cards. And there does not even exist a driver to power my X-Fi soundcard. So I would not get sound. Sweet, a computer with no sound. All that music I downl ... I mean BOUGHT ... would never get heard.

    And that may be a bit of a problem for the Linux desktop. It is hard to start out with a Linux desktop if you have pseudo-cutting-edge hardware. Many people buying new machines have to wait some time for stable and easily installable drivers to appear for their hardware, and by the time they appear, they are already fully entrenched in Windows, have their file structure laid out, etc.

    I am sure I will eventually have Linux installed on this machine, but it will be long after it is a high-end machine. I am not going to waste good hardware on drivers that don't work, or work sub-optimally.

    But this is not the fault of Linux. The folks who release the drivers just don't care too much about Linux. That is the problem.

    In two years, this will be a killer Linux workstation. Today, it would make a shitty Linux workstation.

    So, Vista it is for the time being. I have already gotten a BSOD. The OS is nuttier than a squirrel's turd and is a general pain in the ass. But my applications run, everything installs the moment I plug it in (joystick, pocket PC, Bluetooth adapter, SD cards, etc.) ... I can see what I am doing, the audio sounds great and I can get things done.

    Would I rather run Linux? Yes. Vista thrashes the disk around like crazy the whole time the machine is on, and it can only see 2.5 gigs of the 4 gigs of RAM I have installed. I suppose I could shell out a few hundred for 64-bit Vista, but who knows what drivers will and won't work in that.

    But at least I have audio and video on the OS I have now. It's an imperfect world.

    The Achilles heel for Linux desktops has been and always will be fast and easy driver support, IMHO.

    Linux works great on slightly older hardware, but by the time the hardware is slightly older, it is more difficult to get converts. People tend to dance with who brought them, and on most machines, that is Windows.

  • by vivaoporto (1064484) on Tuesday July 24, 2007 @11:07AM (#19970883)
    You know. The same thing happens to me, but instead of with Linux distributions, it happens with cars, televisions and other goods.

    If someone has a faraway job and for some strange reason needs to own a personal vehicle, which car do they choose? I've driven cars for years and I still can't name all the available models. I doubt ANYONE can.

    Car drivers need to stick to a model that works, is easy, is well known, and comes as an option to be sold at every auto shop and used car fair, even if it is alongside the models from the brand that particular auto shop represents.

    I cannot imagine a line of reasoning more beaten up and less relevant than this one. While it is true that people prefer pre-packaged goods, too much choice was never a problem in other markets. There is a multitude of car brands, TV brands, beer brands, all of them differing in one way or another but every one of them catering to their target audience. And we do not see people fighting to get this or that car (beer, TV set) brand to dominate the market because of a supposed technical superiority, better taste, features, etc.

    That is because what is the best alternative for one may not be the best for another, because people's tastes differ, because people's needs differ. The only difference between that and computer operating systems is that the collaborative culture brought by the microcomputer "revolution" makes people expect a level of interoperability and interchangeability between these differently branded machines that they don't expect in other markets, like cars, for instance.

    And for the lack of interoperability we have nobody to blame but certain proprietary software companies (there are too many of them for me to enumerate by name, but you know the ones I'm talking about), which could have agreed on standards that would drive interoperability (imagine where the industry would be if they hadn't agreed on ASCII, for instance), but instead put their short-term gains over it and helped push the whole industry back a couple of decades.

    To summarize: too much choice happens everywhere, and it is a good thing, including in computing, as long as there is interoperability among the choices. Linux (the kernel) and most of its userland are open source, open specs, so the lack of interoperability can't be blamed on them.
  • and she's not that computer savvy.

    Perhaps that was the cause of her lack of problems. My guess is that she didn't need to do much with the computer anyway, apart from writing stuff for homework.

    An OS is like a car. The more you know it, the more you tune it, and you optimize and squeeze every little bit of performance from it to fit your needs. You install new seats, a more comfy steering wheel, cup holders, etc etc.

    The problem comes with the change. One day you're sitting in your perfectly tuned American FR car with a V8 engine, and then you switch to a European car. And suppose it's a 4WD or FF with a rotary engine. For starters, the steering wheel is on what you knew as the passenger's seat. You have to change gears with the left hand instead of the right hand. You have to look to the right instead of the left.
    It's much worse when you realize that the knowledge and tools that helped you to tune your old car don't work with the new car (how the heck do you fix a rotary?). It's a completely different monster, and you have to RELEARN EVERYTHING FROM SCRATCH. Lots of knowledge lost.

    For example, to quickly search for a file in Windows, I open a command line and type dir *mask* /s /b. In Linux it's a different command (find -name), and if you're not logged in as root, you get all these "access denied" warnings (where the heck did I put that web server root directory?).

    To get help, you don't type "command /?". You type "man command", and then you have to scroll through pages of explanations that you don't fully understand. (And don't get me started on the config.)
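
    For what it's worth, a closer Linux equivalent of that Windows one-liner, with the "access denied" noise thrown away, might look something like this (the paths here are made up purely for illustration):

    ```shell
    # Build a tiny directory tree to search (hypothetical paths, illustration only)
    mkdir -p /tmp/demo/sub
    touch /tmp/demo/sub/old-mask.txt

    # Rough equivalent of Windows `dir *mask* /s /b`: recurse from a starting
    # directory, print bare full paths, and discard "Permission denied" errors
    find /tmp/demo -name '*mask*' 2>/dev/null
    ```

    Redirecting stderr to /dev/null is what hides the permission warnings; run as an unprivileged user, it still prints every readable match.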

    Back to the cars analogy. If you're just LEARNING to drive, "ah, it has a steering wheel and pedals." It's easy. Of course it's easy! Because you don't know ANYTHING.

    The real problem with switching to Linux is having to UNLEARN every bit of knowledge you've gained about windows with the years. It's much more painful when you're a Windows power user.
  • Re:Applications (Score:3, Insightful)

    by jedidiah (1196) on Tuesday July 24, 2007 @11:13AM (#19970979) Homepage
    Multiple platform support is a boon to software quality. It encourages better engineering practices and allows the QA department to subject the codebase to a far more interesting collection of obscure configurations.

    Of course most companies just see quality (of any sort, not just software) as a needless cost.
  • Re:Applications (Score:2, Insightful)

    by christianT (604736) on Tuesday July 24, 2007 @11:16AM (#19971029)
    I think we need to change our focus. So much emphasis is put on migrating existing behemoth environments to Linux. Why don't we focus more on new startup businesses that aren't yet locked into their current platform by data migration costs or by that one little function they have come to rely on?
  • by Anonymous Coward on Tuesday July 24, 2007 @11:24AM (#19971151)
    One word. Ubuntu.
  • by DragonWriter (970822) on Tuesday July 24, 2007 @11:25AM (#19971157)

    One of the strengths of Linux is also its biggest weakness. If someone has a computer and for some strange reason needs to install an OS, which Linux distro do they choose? I've run Linux for years and I still can't name all the available distros. I doubt ANYONE can.


    So? I can't name all the different kinds of laundry detergent (or even all the Tide laundry detergents), but it's not hard for me to find one that works well enough and use it.

    Why would I need to be able to name all of the Linux distros in order to use one? I don't need to use all of them.

    Another problem is the MS dominance over the OS market. It's hard to buy a computer without Windows and even harder to purchase one with Linux preinstalled. Your average computer user is not going to purchase a computer that won't run (because of no OS) and even if they did, when they go to the store pick up an OS, all they see is Windows.


    I'm pretty sure that, e.g., Fry's sells both Linux pre-installed systems and retail-box versions of some of the commercial Linux distros. Dell sells systems with Linux pre-installed. And, of course, plenty of computers come with MacOS pre-installed, which, while it isn't Linux, is certainly "without Windows".

    Linux users need to stick to a Distro that works, is easy, is well known, and comes as an option to be preinstalled on computers from the majority of manufacturers, even if it is along side Windows or as a bootable DVD thrown into the box.


    Since no distribution exists that "comes as an option to be preinstalled on computers from the majority of manufacturers", and it's clear that none will until consumer demand for Linux is great enough to force that option, you seem to be saying that to become popular, Linux must first be popular. (I disagree that Linux users have to stick to one distro for this to be the case; as long as there is application compatibility across distros, it wouldn't matter if each hardware manufacturer offered a different distro.)

    But there is a kernel of truth there in that Linux won't achieve desktop competitiveness (in the market sense rather than the quality sense) until lots of people get exposed to Linux other than by choosing to buy it individually, so that they don't have to take the risk of what they perceive as "the unknown". Which means that the only way Linux will compete is the same way MS got its desktop dominance back in the days of DOS: providing enterprises a reason to standardize on it so that it's what people need to use for work.

  • Downward Spiral (Score:2, Insightful)

    by imadoofus (233751) on Tuesday July 24, 2007 @11:26AM (#19971171)
    The article appears to be unavailable at the moment, but the issue appears to be a downward spiral. All the enterprise features are there because Linux is primarily used in the data center, and that usage in turn drives demand for still more enterprise features.

    Like others have posted, my family has been using Linux on the desktop for the past three years, but the real question is "how do we break the cycle and place more priority on the desktop?"
  • by Anonymous Coward on Tuesday July 24, 2007 @11:29AM (#19971233)

    For me, Linux offers ease of use. It "just works" on my laptop (A Dell Inspiron 9100). With Windows, I need to download a driver from ATI before I can get a resolution of greater than 800x600. Ubuntu automatically recognizes my card, and correctly sets the resolution to 1680x1050. With Windows, I need to download a driver for my wireless card, Ubuntu recognized my card and configured it automatically.
    Lucky for you, but don't pretend your experience is typical.

    On my desktop, Windows recognized the monitor and was able to use the full resolution with its generic drivers. Performance was terrible, but once I installed the specific drivers, it worked fine.

    With Ubuntu, it simply reported "sync out of range" and there was nothing that could be done. Safe mode generated the same error, and with no UI to interact with, that's the end of it.

    Likewise, when I tried Ubuntu on a laptop, it recognized the wireless card and then refused to use it. (It just doesn't work - trying to set the WEP key does nothing, it just says "activating device" and then returns to not working.)

    Windows on both machines just worked. Granted, drivers had to be installed, but once they were installed, it just worked. No additional effort. No "sync out of range".

    Now this experience obviously isn't typical either, but it demonstrates the main problem with Ubuntu: when it fails, there's no way to get help. Your options are basically to whine on forums, and then get completely useless advice like editing configuration files on a read-only CD with an OS that doesn't display a UI.

    With Windows, there's a support number you can call, or you can take it to a local computer store, or ask for help among the massive number of Windows users - in short, you're not stuck with snobs on forums who think you should be able to hand-edit configuration files without being able to see anything on the screen.
  • Rubbish (Score:3, Insightful)

    by soccerisgod (585710) on Tuesday July 24, 2007 @11:31AM (#19971275)

    I can agree to some of the things said by this guy, but all in all, it's rubbish. Sure, response times are one thing and I think they've been addressed very well by preemption features and configurable scheduler frequency, but to blame a slow desktop experience on the kernel is just stupid. Really stupid. If you wonder where all your megahurzes go, try looking at your KDEs and Gnomes first, your animated gizmos, your 3d desktop gimmicks and applets and your java crap.

  • by caseih (160668) on Tuesday July 24, 2007 @11:35AM (#19971337)
    Since I don't have mod points, I'll just burn karma instead.

    In short, you don't know what you're talking about. He's not talking about *throughput*, which Linux does very well at. He's talking about latency and interactive performance. A system where the desktop is snappy and responsive, where the CPU wastes cycles if need be just to make sure the mouse doesn't lag and that windows are redrawn in a prompt, synchronized way. A kernel optimized for desktop performance (have you ever *used* Quartz on OS X?) will sacrifice overall throughput and raw total performance for low-latency servicing of the things a user actually looks at on the screen. It's this perception of performance that matters on the desktop. If a user sees a fraction of a second of delay or stuttering in his UI, it is perceived as "slow!"

    For example, my Fedora Core 6 box, running on an older AMD Athlon XP 2800+, is plenty fast at lots of things. I can compile large programs fairly quickly, and do all kinds of things. But dragging a window across the screen is not only slow, it can also cause my audio to skip.

    On the same processor (even under VMware!) Windows XP is smooth and the UI responsive. Of course, under the hood Windows doesn't fare so well. I can't compile with as much raw speed, and although the UI is responsive, the code connected to it may not be executing in a speedy manner, causing me to have to wait for the computer. But the important part is that the windows draw smoothly and quickly. Resizing or moving a window is silky smooth.

    Even Vista, though it ultimately is slower than XP and Linux, has a UI that appears to be super fast and slick, much faster than any Linux desktop (remember perception *is* reality). Just try to use it sometimes.

    Now his patches, combined with, say, Compiz, go a long way toward giving Linux the responsiveness that desktop users require, but the apparent schizophrenia on the part of Linux developers in relation to the desktop has frustrated him and driven him away. This is a tragedy.
  • by IGnatius T Foobar (4328) on Tuesday July 24, 2007 @11:36AM (#19971347) Homepage Journal
    Exchange is on its way out ... or at least the idea that Exchange can lock an organization into Windows is on its way out. Slowly but surely, open source is building Exchange alternatives. Citadel [citadel.org] is one such alternative, and it's rapidly gaining in popularity.
  • by geoffspear (692508) on Tuesday July 24, 2007 @11:37AM (#19971373) Homepage
    Firefox instead of Safari.

    You can set up a Mac to dual boot in OS X and Linux and you're too dumb to figure out how to install Firefox on it?
  • by LighterShadeOfBlack (1011407) on Tuesday July 24, 2007 @11:41AM (#19971433) Homepage

    It's because of reasoning like yours that Linux has never *taken hold* on the desktop. Failure would connote that Linux at one point made a big hit on "everyone's" desktop and was left for something else. This NEVER HAPPENED.
    "Failure" connotes that Linux, or at least some Linux vendors, have attempted to target the desktop audience and have not substantially penetrated it. Are you saying that isn't accurate?
  • by LWATCDR (28044) on Tuesday July 24, 2007 @11:49AM (#19971539) Homepage Journal
    What ticks me off is that the person in the interview was trying to take FOSS to the next level.
    He wasn't saying that Linux was a worse desktop than Windows. In fact, he said that Windows had many of the same problems! He wants Linux to be the best that it can be. Not just as good as Windows, or just better than Windows, but the best that it could be.
  • by LWATCDR (28044) on Tuesday July 24, 2007 @11:53AM (#19971591) Homepage Journal
    What CPU are you using?
    How much RAM?
    I had an old PIII that I used as a server and backup desktop. Frankly, it is SLOW. Yes, it is an old box, but the desktop was PAINFUL to use. It ran Windows 2000 just fine, but Linux worked better as a server on it.
    Why shouldn't an old and slow machine make a good desktop?
    I tend to blame Gnome and KDE for the low speed, and I have yet to play with any of the light desktops, but the person who was interviewed has some very interesting points.
  • by cpuh0g (839926) on Tuesday July 24, 2007 @11:58AM (#19971655)

    How do you know what the "views and leanings" of the founders are - are you one of them or are you just assuming that they are open source zealots who are programmed to automatically hate all that does not agree with Richard Stallman's manifesto?

    Why does supporting Open Source mean that you have to assume that the big corporations are somehow evil and out to corrupt? Are you even aware that the biggest contributors to open source are some of the biggest corporations - IBM, Sun, Apple, Intel, etc. ? It is not a black-and-white world out there, there are good ideas being advanced on both "sides", to ignore that fact and adopt an antagonistic stance towards all that don't agree with you is childish and ignorant.

    The rest of your post really makes no sense - if you call FUD, be specific - what part is FUD and why do you think it is so?

  • by FreeGamer (1001924) on Tuesday July 24, 2007 @12:09PM (#19971817) Homepage
    Anybody who has used Linux on a system that is not swimming in RAM or running a >1GHz CPU will have been afflicted by performance problems at some point.

    My 700MHz Celeron machine would frequently grind to a halt if it was overloaded. If I compiled whilst listening to music, the audio would usually become laggy and choppy. (I had an SB AWE32.) Of course, with the newer hardware you kids are used to lately, none of these things will happen, but computers that are not new should not be consigned to the dustbin, and if we make Linux more responsive on old machines, then maybe it will perform better on new machines too?

    This article is certainly not bullshit. Con Kolivas is the voice of many Lusers who have witnessed such problems over the years.
  • by Anonymous Coward on Tuesday July 24, 2007 @12:26PM (#19972143)
    Can Citadel talk to Exchange clients? No? Game over.
  • by timrichardson (450256) * on Tuesday July 24, 2007 @12:27PM (#19972167) Homepage
    Halfway through the interview I slammed on the mental brakes. Linux is so famous for getting more from old hardware. My Debian distribution boots much more quickly than Windows. And waiting for me in apt-get is an upgrade to a new kernel with a new "fair" scheduler. After slamming the brakes, I didn't get off the bus, though. Con is a great guy, looking for 120% activity in his life. His insights are more to do with kernel development than Linux on the desktop. Con: success with your further endeavors, and for sure you will find something related to computers quite soon. An Amiga user never gets that out of their system.
  • Re:Solution (Score:5, Insightful)

    by bladesjester (774793) <slashdot&jameshollingshead,com> on Tuesday July 24, 2007 @12:32PM (#19972245) Homepage Journal
    The average user has no idea how to do that, nor should they have to.

    That's part of the reason why Linux will never really hit it big on the desktop.
  • Re:warning: ot (Score:5, Insightful)

    by pintpusher (854001) on Tuesday July 24, 2007 @12:36PM (#19972293) Journal
    ISTM there are two kinds of computer users: those who've learned how to use one system and those who've learned how to use computers in general. (no comment intended here as to which type the parent is)

    The first can apply to any OS: the user has learned the specific, repetitive motions that get done for them what they need to have done on their specific system (usually this would be windows, but could be any OS). Their "skills" are easily translated to another system of the same type and maybe to a similar but not too distantly related system. An XP user could probably move down to win98 or up to say vista without too much difficulty, though they'd find certain things don't work the way they'd expect. An XFCE user could move fairly easily to gnome, not so much to KDE, but still okay. But if you ask this user to move to a completely different system (from any win to any linux, or from XFCE to say fluxbox or even worse to wmii (I love wmii, BTW)) then there's trouble. They don't actually have "skills", they have a set of rote responses that look like "skills" but aren't.

    The second set have not necessarily learned any particular set of actions to get desired results. Instead they have learned about how computers work in general and have developed a set of "skills" for _determining_ which actions get desired results. The difference is subtle but important. Someone in this set can sit down at a computer they've never seen or even heard of before and figure out how to get something done in relatively short order. They will likely never be as productive on any one particular machine as the folks from the other set, but they will be proficient at _any_ machine in short order.

    To determine which set you belong to, try something radically different from what you normally use for a good period of time and see what happens. If you give up within hours, then you're probably from the first set. If you find that you've forgotten exactly when you changed, then you're likely from the second set and are probably already looking for something even more different to try.

    So, after "Preview" (see what a good boy I am!), I have to add that there is probably a third set: those who probably belong in the second set, but have never fully developed that set of skills choosing instead to dive deep into the particular system of their choosing and learning it intimately. These folks can do _anything_ on their machine, but have given up their potential to learn breadth in exchange for depth.
  • by Kaeles (971982) on Tuesday July 24, 2007 @12:42PM (#19972409)
    of changing their operating system, most people I know who are not computer savvy think that Windows is a hardware part of the machine (seriously!).

    To them, changing OS's would be like changing engines in a car (too many car analogies but hey :P )

    Anyways,
    I have a XUbuntu box running on an 800MHz machine with 256MB of RAM, and I can browse, chat, and watch a movie with no lack of responsiveness whatsoever....

    I think the biggest problem in desktop Linux is the windowing systems; perhaps if the distros would auto-detect (or ask) what speed your computer is and install a WM that was fitted to the task, no one would have a problem with Linux.

    That and we need to educate people and let them know that Windows is no different from any other software on their computer (also that pushing buttons will NOT "break" the hardware hehehe).
  • by allcar (1111567) on Tuesday July 24, 2007 @12:53PM (#19972619)
    In the article, he expressly states that the sort of intensive number crunching you refer to is fine. The concerns he IS expressing are related to interactive responsiveness. I, for one, sympathise with these concerns. We should not need cutting edge hardware to smoothly re-position a window.
  • Sigh (Score:2, Insightful)

    by localman (111171) on Tuesday July 24, 2007 @12:54PM (#19972639) Homepage
    All or nothing. Winner take all. There can be only one.

    Is it okay if Linux and Windows and Mac (and the rest) just go along and play their part in the big picture? And over time they'll shuffle around a bit, too? Can we get over worrying about who is the top dog?
  • Re:Solution (Score:5, Insightful)

    by Jaqenn (996058) on Tuesday July 24, 2007 @12:55PM (#19972657)

    That's part of the reason why your attempt at trolling will never really hit it big on Slashdot.
    He's not trolling! He's not saying that your approach fails to fix the problem. He's not even saying that he doesn't like to type.

    He's saying that someone afraid of their computer can't do it. And until Linux can be used by people afraid of their computer, it won't appeal to the majority of the desktop PC market.
  • Re:Don't think so (Score:4, Insightful)

    by Qwavel (733416) on Tuesday July 24, 2007 @12:58PM (#19972699)

    Yes, Apple has succeeded to take market share from MS on the desktop, while Linux has failed, but you are leaving out an extremely important piece: MS was working very hard to make sure that Apple displaced Linux. Tons of evidence has surfaced in e-mails and in interviews with ex-MS people that MS saw Linux as a real threat, whereas they saw Apple as safe and even as useful (in relation to the Justice Department).

    Now, MS never intended that Apple should take over digital media, marginalize WMA, etc. They miscalculated on that, but on the desktop, Apple has managed to make MS look much better. Not only can they claim that there is competition on the desktop, but now it is harder to blame them for charging too much or for promoting lock-in.

    So, I think the statement that you are repudiating could be modified to "how Microsoft has succeeded in crushing Linux in personal computers", because it is Linux and open source that they wanted to crush, not innovation.
  • Re:Don't think so (Score:5, Insightful)

    by MBGMorden (803437) on Tuesday July 24, 2007 @01:05PM (#19972801)
    As a user of all 3 (and a few more), I must disagree. EVERY operating system has its little pauses like you describe, but Linux in particular drags the whole time, just in small increments.

    Mr. Kolivas in the article hit the nail on the head. Take Linux windows. Drag one around. It chops around into various little segments and such as it moves. Drag an icon. Select stuff. Reposition a toolbar (or buttons on it). There are these fraction-of-a-second delays. It's almost like walking on stilts. You're on the floor, and you feel when your feet hit the floor, but there is a feeling of some layer in between where you're not REALLY touching the floor. The same applies to Linux and its GUI (or at least the most common collection of tools that we call its GUI).

    Now personally, I'm not so sure the problem is in the kernel; I've always been more apt to blame Xfree86 (and now X.org) instead, but the fact remains that it just doesn't feel right.

    Mac OS X, on the other hand, has a MUCH better flow to it. BeOS's approach to such things was practically perfect (my 450MHz K6-II with 128MB of RAM running BeOS feels faster than an Athlon XP 2100 with 1GB of RAM running Gentoo Linux). Even Windows, despite its many other problems, feels more responsive on the desktop than Linux.

    What the problem is for sure, I don't know, but I'd certainly like to see it fixed. Windows is, well, Windows (boring and evil). Macs work too well for their own good for a tinkerer (they work, work well, and not as many people feel like fiddling with them, so the development community is much smaller). BeOS is dead. Other operating systems like SkyOS and Syllable are just too obscure. Linux is where it's at for programming enthusiasts. It would sure be nice to be able to use it for more than that, though :).
  • by FragHARD (640825) on Tuesday July 24, 2007 @01:08PM (#19972853) Journal
    Yeah, they tried for 6 months to get some M$ games to run properly in Wine or some other converter ;) and no matter how hard they tried, the games would not run as fast or at all in Linux. So they went back to Windows to play, and now all is fine as they are hard at work playing games again at full speed, with occasional BSODs and freezes.
  • Re:Don't think so (Score:4, Insightful)

    by inKubus (199753) on Tuesday July 24, 2007 @01:15PM (#19972965) Homepage Journal
    With Linux, there's no cohesive community backing it and forcing it into hardware manufacturers. But, it's free. So it gets put into things by default (like Tivos, wireless routers, etc), and ends up working great. Desktop-wise, people are willing to pay to not have to be responsible. Just like people get totally ripped off at a dry cleaning shop because they don't want to wash and iron their shirts.

    Things I would like to see in Linux:

    Standardized single sign-on/authentication solution. Yes, I know there's Kerberos, but someone needs to build an easy-to-use API over Kerberos which allows you to make a simple call like "bool isTrusted()" to handle security throughout the app. ONE SIGN ON. ONE KEY staying with the user session, whether they open a shell, click on an app in KDE or Gnome, SSH or NFS to another machine or disk. One sign on. Please. This is one thing that Windows does so simply and elegantly. And yes, I know they crippled Kerberos and stuff. But it works. It really does. One of the most impressive things about Windows to me, with no real Linux analogue. To get the same thing in Linux, you have to know what you're doing. In Windows, you check the "Trusted for Delegation" box and make sure the computer has an account in LDAP.
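    The "isTrusted()" idea can at least be faked at the shell level today (a minimal sketch, assuming the MIT Kerberos client tools are installed; "is_trusted" is a made-up name for illustration, not a real API):

```shell
# Hypothetical wrapper over the user's existing Kerberos ticket cache.
# "klist -s" is silent and exits 0 iff a valid, unexpired credential
# cache is present - exactly the yes/no answer an app would want.
is_trusted() {
    klist -s 2>/dev/null
}

if is_trusted; then
    echo "session already authenticated"
else
    echo "no valid ticket - would fall back to kinit here"
fi
```

A real solution would of course sit below every application (PAM, GSSAPI) rather than shelling out, but it shows how little the calling code would need to know about the machinery underneath.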

    That's about it. I have about 4 Linux boxes, 1 MacOSX, and several Win2k3 servers. I enjoy working with Linux the most because I have a lot of control. But when it comes to getting something "good enough" set up from scratch to live, Windows beats Linux hands down. Thus, CIOs and CEOs buy it. If it were possible to have a nice standardized teaching method to teach nice standardized Linux installs, and to get enough people through there to make a difference, it would be possible to stage a serious invasion of MS shops. The reason is that they already have "good enough", but they are starting to get new ideas that the Microsoft stuff is not capable of doing quickly, and MS themselves have become too big and bloated as a company to get anything done in a timely fashion, whereas a small consortium is much more nimble. The problem is there's NO LEADERSHIP. It's a bunch of nerds leading each other around, arguing about the correct text editor to use and/or what window manager is best. When there emerges a clear leader, not a technology leader but someone with the vision of truthful computing who can get us all thinking the right way, then we can make a push. This leader will not be in it for the money, although he/she may already have a lot of it. This person will DEFINITELY not be from the academic sector, CS or otherwise. Perhaps a politician, but more likely a businessman. Above all, a great leader with the vision to provide something better than good enough, and the army to build it.

  • Re:Don't think so (Score:5, Insightful)

    by sxeraverx (962068) on Tuesday July 24, 2007 @01:15PM (#19972971)
    Linux CANNOT have a killer app, because it contradicts what Linux stands for: Freedom, Openness, Choice, to name a few. If the Linux community creates something, it's damn well going to be F/OSS, and therefore, portable to just about any other platform. The fact that something is proprietary is the essence of what makes it "killer," and that just might be why Linux hasn't been able to dominate.
  • by maillemaker (924053) on Tuesday July 24, 2007 @01:19PM (#19973033)
    >if that's your issue, then create a daemon that renices the priorities of pre-set programs to
    >some given level - better yet tweak the module that starts programs to nice them as they start. Works
    >better than blocking the background tasks by bumping everything that's happening under a users uid, while
    >still providing the lower latency issue.

    Here is what the average computer user will think of your solution:

    1) What's a daemon?
    2) What does "renices" mean?
    3) What are priorities?
    4) What is a pre-set program?
    5) What is a module?
    6) What does it mean to block a task?
    7) What is a background task?
    8) What is a UID?
    9) What is latency?
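    For those who do know the words, the suggestion amounts to roughly the sketch below (a hedged illustration; the "cpu_hogs" list and nice values are invented for the example, not taken from any real tool):

```shell
# What "renice the pre-set programs" boils down to, stripped of jargon:
# push known CPU-hungry background programs down to the lowest scheduling
# priority so the interactive desktop stays responsive.
cpu_hogs="make gcc bzip2"               # the "pre-set programs"
for prog in $cpu_hogs; do
    for pid in $(pgrep -x "$prog"); do  # pgrep -x matches the name exactly
        renice -n 19 -p "$pid"          # 19 = nicest, i.e. lowest priority
    done
done

# Raising the niceness of your own processes needs no root at all:
renice -n 10 -p $$                      # deprioritize this very shell
```

Which neatly illustrates the grandparent's point: every line of that is trivial to a Unix hand and gibberish to everyone else.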

  • Re:It hasn't (Score:3, Insightful)

    by nuzak (959558) on Tuesday July 24, 2007 @01:20PM (#19973043) Journal
    > she'll have to think twice about dumping me!

    About the only thing worse than being lonely is living with someone who hates you.
  • Re:Yes he is (Score:1, Insightful)

    by bladesjester (774793) <slashdot&jameshollingshead,com> on Tuesday July 24, 2007 @01:27PM (#19973145) Homepage Journal
    Actually, knucklehead, I was the exec editor of an open source enterprise magazine. I also know how average users go about things.

    I grew up using the command line (both DOS and Unix), and while I like Linux, it was not designed with the average user in mind. No Unix or Unix-like distro has been, and they've been around a heck of a lot longer than Windows. That's part of the reason why it won't truly catch on in the desktop market for "normal" people.

    I'd say the trolls are you and the one who claimed he would teach the world how to do it.
  • Re:Don't think so (Score:5, Insightful)

    by Doctor Crumb (737936) on Tuesday July 24, 2007 @01:40PM (#19973331) Homepage
    Depends which side you're on...
  • Re:Solution (Score:1, Insightful)

    by cbreaker (561297) on Tuesday July 24, 2007 @01:55PM (#19973529) Journal
    Hey, who's the kiddo? You *did* say "never" didn't you?

    It doesn't matter how much the kernel is built for Joe Schmoe. It doesn't matter. If you've installed any recent distribution lately, you'd see that there's a lot of attention to the desktop experience, and things have gotten a great deal better. Yup, building a viable OS these days, with computers and software being so complex and user demands differing so much between users, is very difficult. It's amazing how far they've come.

    It also doesn't matter how long UNIX systems have been around. They were never intended to be a serious desktop replacement (partly because there WAS no such thing in the early UNIX days), whereas recent Linux has been.

    You need to separate what you think Linux is and what you think a Distribution is. Ubuntu is not the Linux kernel. The Linux kernel is not Mandriva.

  • Pre-installation (Score:3, Insightful)

    by houghi (78078) on Tuesday July 24, 2007 @01:56PM (#19973535)
    There is NOW just one reason, and that is that pre-installed systems are not readily available. Yes, if you LOOK for them, you can do it, but the reality is that as long as you do not specify, you get a Windows machine, and even if you DO specify, you often do not have a choice.

    Now after 15 years, people have been brainwashed into using Windows.

    Another is that change costs money, and managers have three issues with changing. First, they need to admit that they were wrong the previous years. Second, they will need to train/hire new people. And third, it will not look good on their budget for the next year. Perhaps great over a 5-year period, but not many companies are interested in 5-year plans.
  • flamebait (Score:4, Insightful)

    by thegnu (557446) <thegnu&gmail,com> on Tuesday July 24, 2007 @01:58PM (#19973559) Journal
    Actually, knucklehead, I was the exec editor of an open source enterprise magazine.
    Right, and flamebait is insulting people just because you're an asshole. And I don't mean "you" in the general sort of sense.

    No Unix or Unix-like distro has been and they've been around a heck of a lot longer than windows.
    I know several people on Ubuntu who struggle with Windows. Plus, saying Unix has been around a long time and it's going nowhere is like saying, in 1994, that the internet has been around a long time, and it's going nowhere. The initiative towards a widespread Linux desktop has been around a few years, max. OSS movements take a while to rev up, and this one's doing quite nicely.

    Be quiet, be schooled, thank the nice man as he leaves.
  • Re:Solution (Score:4, Insightful)

    by cronot (530669) on Tuesday July 24, 2007 @02:04PM (#19973645)

    Unix and Unix-like systems have been around for a very long time (a lot longer than windows) and have yet to hit big on the desktop.

    Really? Like OSX?

    Granted, it doesn't have as much market on the desktop, but it's still second. And it is second exactly because of the reasons the GP pointed out.

  • Re:Don't think so (Score:5, Insightful)

    by electroniceric (468976) on Tuesday July 24, 2007 @02:08PM (#19973705)

    The good thing is that Linux, GNU, and Open Source development are moving along at a faster pace than Windows is, and sooner or later it will begin to surpass other OSes and GUIs in features, stability, flexibility, future potential, etc (if it already hasn't). There are weak spots, as all products have them. I think Open Source will respond better to enhancing those features faster than a monolithic monopoly ever could. Not to mention there are huge numbers of potential developers that will be creating prior art and even IP that companies such as Microsoft can only steal if they want to move ahead. That's a tremendous boon.
    Wow, those are some big shoes to fill, and filling them rests on some pretty big ifs.

    Read The Mythical Man-Month. One of the most cogent things Brooks has to say is about project coherency, best exemplified in the desktop world by Apple. What Macs give you above all, their primary value proposition, is coherency of design.

    Coherency tends to be one of the weakest suits for many or most Open Source projects, especially those without a central entity to define the direction. The exceptions tend to be server or kernel-side: Apache, Linux Kernel, databases, etc, and I'd claim this is because there's a well-defined set of CS problems being solved there. KDE, which I use daily, has absolutely no coherency of design. That's why it does well as a testbed for new features, approaches, etc, but very poorly at consistency of experience.

    Brooks' argument, which is pretty credible, is that coherency comes from having one or a few project architects and consistently returning to their vision. They absolutely need to spend a lot of their time listening to users and developers and reacting to their feedback, but ultimately someone's vision is what makes a codebase hang together. Con is saying that the architects of Linux are basically not that interested in the desktop experience on vanilla hardware, because they're most interested in more traditional CS questions that tend to play themselves out much more in the enterprise space than the desktop space. As a non-CS guy in the software development world, this really strikes a chord with me. The Linux desktop is built on very similar components to the Mac desktop, yet is worlds away in usability. And that's basically because a) nobody is defining, shepherding and advocating usability requirements at the OS level, and b) the desktop projects don't have an architect/requirements definer at all.

    The rest of the article, and particularly the extravagant claims about success and failure, are pretty much what you'd expect from a smart, non-CS, hardworking, disgruntled community member who has not been taken as seriously as he ought. The same dynamic pretty clearly played itself out in the climate change debate over the "hockey stick", where Mann et al. were too dismissive of smart, hardworking, somewhat contrarian, non-climate-science authors of counterclaims, McKitrick and McIntyre (M&M). Mann's work has withstood M&M's criticism well, and frankly M&M dropped the ball on some key items (like not properly modeling how various quantities vary with latitude - a big blooper in climate science), but the whole debate would have had less drama (and therefore been less ripe for political cherry-picking) had M&M not been seen to be marginalized by the climate science community. To me the lesson is not the technical merits of Con's solutions, but the lack of serious attention to his points about where the focus is in kernel development. That's the interesting part of this story, and one that Linus should really take to heart.
  • by bogie (31020) on Tuesday July 24, 2007 @02:14PM (#19973799) Journal
    Just get over that fantasy already. At the bare minimum Patents and DRM guarantee that in the long run Linux will never function as a drop-in replacement for Windows or OS X for Joe User. Certain font settings can't be turned on by default, most audio/video codecs are patented and designed for Windows/OS X use, hardware vendors want to keep their secrets and still don't care about providing drivers for Linux, and worst of all Windows and OS X work well enough to make justifying the move to Linux a difficult proposition. Personally I'm OK with all of this, I just wish people would stop beating this dead horse.
  • Re:Don't think so (Score:3, Insightful)

    by Dare nMc (468959) on Tuesday July 24, 2007 @02:16PM (#19973827)

    His point is that the kernels are optimized for servers.

    I guess I read into it differently.
    I thought it was more about how no TiVoization was allowed. I.e., the original OS was just an OS: you could build on it what you will (enter different file managers, etc.), and those additions were allowed to A) be compatible without too much hacking and B) have a chance to make money without being bought out by MS.

    With MS not publishing the APIs used for their apps, third-party developers are at a disadvantage. With so much integrated into the OS, and without documentation about how to unravel/replace/build something similar...

    As for how that stifles development: independent developers/consumers can't reasonably take their known, working OS and make it work as a base. So PC manufacturers/board manufacturers aren't building very many specialty spin-offs. EXAMPLE: very, very limited options in buying a quiet PC in a small form factor that could, for example, make a nice TiVo-like PC that is 1) low power and 2) built upon an OS that people are familiar with.

    I don't think MS is to blame for this; people wanted simple PCs they could use without having to learn anything to start. Their market wasn't command-line hackers, so that slowly faded out of the MS development cycle. And developers are getting stuck with a shrink-wrapped package as a starting platform. Thus you are limited to thinking inside the box.
  • Re:Don't think so (Score:4, Insightful)

    by Daniel Phillips (238627) on Tuesday July 24, 2007 @02:16PM (#19973831)

    His point is that the kernels are optimized for servers. That is, focus on throughput, performance, but not latency or responsiveness.
    Actually, the poor interactive performance of the Linux scheduler was due to a combination of a server-oriented performance hack (O(1) scheduler) and an ineffective attempt to propagate the notion of "interactivity" between processes. So in this case, both a server hack and a desktop hack contributed to the problem.

    Thankfully fixed now, due to Con figuring out how to satisfy both efficiency and latency objectives with a single scheduler, and Ingo rudely but efficiently pushing his own interpretation of Con's work into mainline. Moral of the story: sometimes the process is bumpy and feelings get hurt, but the code doesn't care, it just keeps getting better.
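
    The "fair scheduler" idea being argued over here fits in a few lines, so here's a toy sketch (in Python, with made-up task names and weights): always run the task with the smallest weighted virtual runtime, so a heavier (higher-priority) task accumulates vruntime more slowly and gets picked more often. This illustrates the principle only; it is not the actual CFS or SD code.

    ```python
    # Toy illustration of fair scheduling: always pick the task whose
    # virtual runtime (vruntime) is smallest. Lower-weight tasks accrue
    # vruntime faster, so they get the CPU less often.

    def schedule(tasks, ticks):
        """tasks: {name: weight}, higher weight = higher priority.
        Returns how many ticks each task actually ran."""
        vruntime = {name: 0.0 for name in tasks}
        ran = {name: 0 for name in tasks}
        for _ in range(ticks):
            # The task that has received the least weighted CPU runs next.
            name = min(vruntime, key=vruntime.get)
            ran[name] += 1
            # Heavier tasks accumulate vruntime more slowly.
            vruntime[name] += 1.0 / tasks[name]
        return ran

    if __name__ == "__main__":
        # A weight-2 "desktop" task should get about twice the CPU of a
        # weight-1 "batch" task.
        print(schedule({"desktop": 2.0, "batch": 1.0}, 300))
    ```

    Run it and the weight-2 task ends up with exactly twice the CPU of the weight-1 task, which is the fairness property that nice levels are supposed to map onto.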
  • by LighterShadeOfBlack (1011407) on Tuesday July 24, 2007 @02:34PM (#19974103) Homepage

    Yes, they have attempted to target it. However, few oems will support linux in fear of microsoft's wrath. Of the few that do, the "choice" given to users is pathetic at best. If it is failing on the desktop, it is not because of inferiority in any way; rather, it is due to microsoft's threats to oems.
    This one always comes up. I just don't get it. I could start building and selling machines with Linux pre-installed right now if I wanted to. OK so it wouldn't be a huge endeavour at first and to start with profits would be limited due to lack of an economy of scale, but if the demand is as great as Linux supporters claim it should only be a matter of time before all that changes. In other words, the threat to existing OEMs is not the be-all-and-end-all of pre-installed Linux computers. It's a free market and if I or anyone else so chooses they can go out and start their own business, and the fact that nobody is doing that says one of two things:
    - People have tried and failed. ie. the demand for Linux is far lower than people like to think it is
    - Nobody's really tried it. In which case the Linux community (if not the Linux OS itself) has only themselves to blame.
  • Re:Solution (Score:3, Insightful)

    by jeti (105266) on Tuesday July 24, 2007 @02:56PM (#19974427) Homepage
    Interestingly, Kolivas provided a patch that would have made "nice" work properly. According to the article, the patch was turned down because people relied on the kernel to guess priorities instead of being fair (which sabotaged re-nicing).
  • Re:Solution (Score:4, Insightful)

    by tinkerghost (944862) on Tuesday July 24, 2007 @03:00PM (#19974489) Homepage

    The average user has no idea how to do that, nor should they have to.
    I think you entirely missed the point:

    if that's your issue, then create a daemon that renices the priorities of pre-set programs to some given level - better yet tweak the module that starts programs to nice them as they start.

    A tweak to the start module in the kernel should be able to set the nice level of any program when it starts, giving latency-sensitive software more priority and dropping those that are insensitive. Skype cares about latency; a terminal doesn't care all that much. The whole point of the comment was that a mechanism already exists to deal with the majority of the latency issues described, and either a daemon or a tweak to the start module should be able to use that mechanism and adjust priorities per program, without user intervention.

    The administrator creates a file /etc/nicety that ranks programs as needed:

        [programs]
        /usr/local/bin/Skype 15
        /usr/local/bin/gterm -4
        /usr/local/bin/totem 15
        ...
        [user]
        bob 45
        alice 18
        ...

    With the user section, you could even prevent a slob from killing the system in a multi user environment by limiting his total niceness. Anything over nice(max) results in everything being trimmed back proportionally. If you boot into single user mode, that section is ignored.

    ulimit does provide some of these constraints, but it works on the whole userspace for its memory & process quotas & per process for the nice limit.
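
    For what it's worth, a /etc/nicety file like the one above would be trivial to parse; here's a rough Python sketch. Both the path and the format are this thread's invention, not any existing Linux convention, and the entries are the example values from the comment.

    ```python
    # Sketch of parsing the hypothetical /etc/nicety format proposed above.
    # Neither the path nor the format is a real Linux convention.

    def parse_nicety(text):
        """Return ({program: nice_level}, {user: max_nice}) from config text."""
        programs, users, section = {}, {}, None
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line == "[programs]":
                section = programs
            elif line == "[user]":
                section = users
            elif section is not None:
                # Last whitespace-separated field is the numeric level.
                name, value = line.rsplit(None, 1)
                section[name] = int(value)
        return programs, users

    CONFIG = """\
    [programs]
    /usr/local/bin/Skype 15
    /usr/local/bin/gterm -4
    [user]
    bob 45
    """

    if __name__ == "__main__":
        programs, users = parse_nicety(CONFIG)
        print(programs, users)
    ```

    A daemon built on this would then walk /proc and apply the levels with os.setpriority() (or shell out to renice).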

  • Re:Don't think so (Score:4, Insightful)

    by rrkap (634128) on Tuesday July 24, 2007 @03:09PM (#19974635) Homepage

    Tell me, do you compile your shit natively or do you install binaries (such as using the RHEL bootable CD, which in general installs binaries)? Because I've never had any problems if I've taken a day to uninstall everything, download the newest SOURCE and recompile natively on my box with my library versions and my compiler, optimized for my memory controller and my CPU. After recompiling my Kernel image with same and rebooting. If you expect Open Source, in most cases amateur, developers to make their software automatically detect and work with older library versions, compile portable enough binaries to run on your hacked together system, you are sorely mistaken. Do it right, trust me. Binaries ARE NOT PORTABLE. They sort of work, sometimes. C source is PORTABLE. USE THE SOURCE.

    I think you've just perfectly summarized why Linux is not popular as a desktop platform.

  • by monkeyboythom (796957) on Tuesday July 24, 2007 @03:25PM (#19974837)

    But it takes top down leadership to run things.

    Clearly, Con Kolivas wanted to participate because he felt Linux could be and should be improved on the desktop, and he set out to do that. However, from his account, he appears to have run into indifference and outright rejection of some of his solutions. Now, if Linux were run like a company, say Microsoft, would this happen? Maybe, if it wasn't his main line of work. As a hobby, most suggestions are simply that. But if he is asked to work on the kernel and he doesn't work well with his boss, then whether or not the code is good, most times he will be let go or reassigned to another department.

    Yes, Linux does have leadership and a hierarchical structure. But it isn't run by investor-supported, market-driven shareholders. If anything, Linux runs on donations. And here is where I think the problem is twofold. First, people participate and then leave citing burnout, because they feel that since they volunteered their time, the things they do must be worth their effort. And when their effort isn't acknowledged (or used), ego comes into play and causes ill will.

    How to mitigate this issue? Leadership needs to take an overall view of progress from a homogenized as well as server and client distro view. Clearly, there are incompatible things going on between server and desktop that warrant separation. And too, recognize that some things may slip by and just recognize not everything can be perfect. For the individual, this is harsh, but fork it and build your own distro if you think they are wrong and you are right. Time will tell and then perhaps it will unfork and then everyone can kiss your ass.

    The second problem with this issue is that businesses do not want a product that has been built with love and sideline passion. They want a product warranted by wage slaves and a company driven by profit. Companies are outright scared of using and investing in a product that someone built in their spare time and only works on when it suits their own schedule.

  • Re:flamebait (Score:3, Insightful)

    by thegnu (557446) <thegnu&gmail,com> on Tuesday July 24, 2007 @03:32PM (#19974961) Journal
    1998 (it was not uncommon to see Redhat boxes on store shelves)
    Red Hat is hardly a movement. I understand that we had Mandrake and Debian, but have you objectively examined the curve in the rate of progress? No, you haven't.

    I don't do things without question, and experience has taught me better than to believe crap that's being spewed by rabid fanboys.
    Right, so that rules out Torvalds, RMS, Ballmer, Jobs, and, err.... maybe something like 10% of Windows users, 50% of Mac users, and 30% of Linux users (I'm not only talking about desktops here)? Not everybody. I think that you're much more likely to be the kind of person who labels people he doesn't agree with knuckleheads, rabid fanboys, and kiddo. In other words, a pompous ass.

    As for being schooled, like I said, I've already been a voice in the community.
    So you're saying that you can throw out all your books once you're on TV? (or maybe livejournal in your case (nested parenthesis needed to point out that I'm being facetious, because you've demonstrated that it may just go right over your head))

    I know you're probably all 40 or something, but grow up. I used to know this guy who would always use the argument, "I have 37 years on your ass," and you remind me of him. My thoughts on the argument? Just because you're old doesn't mean you're not stupid.
  • Re:Don't think so (Score:3, Insightful)

    by BlueStraggler (765543) on Tuesday July 24, 2007 @03:36PM (#19975011)

    Tell me, do you compile your shit natively or do you install binaries

    I understand what you're getting at, but the situation is not what you'd expect. The problematic SuSE system is hand-tweaked, which fixed a bad disk performance problem, but doesn't stop the load thrashing. The rock-solid RH7.3 system is an out-of-the-box binary, which furthermore was cloned and moved to an entirely new system when the original system disk started to show IO errors - so it's been a solid binary on two generations of hardware.

    If you expect Open Source, in most cases amateur, developers to make their software automatically detect and work with older library versions, compile portable enough binaries to run on your hacked together system, you are sorely mistaken.

    I don't expect the amateur developers to build robust, portable binaries. But they did! I *do* expect the enterprise developers at Novell, IBM, etc. to build robust, portable binaries. But they don't.

  • Re:Solution (Score:3, Insightful)

    by Stamen (745223) on Tuesday July 24, 2007 @03:43PM (#19975115)
    "UNIX-Like"?

    True, if by "UNIX-Like" you mean "is UNIX". OS X is no less of a UNIX than any other UNIX. And OS X Leopard has officially been certified as UNIX 03:
    http://www.opengroup.org/openbrand/register/brand3555.htm [opengroup.org]

    People just can't get their idea, which they formed years ago, of what UNIX is out of their head. UNIX often has GUIs. UNIX can be very user friendly. UNIX doesn't require you to edit your documents in VI. Just because UNIX can power a mainframe calculating quantum physics, doesn't mean you can't take nudie pictures, of your girlfriend, easily with your iPhone running UNIX.

    Serious question, is there anyone but Microsoft not using Unix?

  • by ZorroXXX (610877) <{moc.liamg} {ta} {ladvolh}> on Tuesday July 24, 2007 @04:06PM (#19975437)

    If they don't know what's latency, then why do they bother about the problem at all?

    A couple of years ago Thomas Hesse, president of Sony BMG, managed to say "Most people, I think, don't even know what a rootkit is, so why should they care about it?", and this was rightfully frowned upon as an invalid argument. I fail to see how yours is different; am I missing something?

  • Re:Don't think so (Score:3, Insightful)

    by toriver (11308) on Tuesday July 24, 2007 @04:34PM (#19975795)
    You forgot to add "four-eyes" at the end there.
  • Re:Solution (Score:5, Insightful)

    by tinkerghost (944862) on Tuesday July 24, 2007 @04:39PM (#19975839) Homepage

    We're talking about linux on the DESKTOP. Your average home user does NOT have an admin that works on their desktop.

    And how exactly does the average home Windows user set the parameters on the software they install? Oh yes, it's all done automagically. Why, I do believe that installation software for Linux currently adds things to the SELinux contexts when it installs. But I suppose that would be impossible to do to a simple file when you're creating a module to do it.

    Here, how about rather than tossing an idea off the top of my head, I spell it out in a step-by-step process:

    1. A developer [not the desktop user] creates a daemon that runs with the ability to renice a program above 0.
      or
      A developer [not the desktop user] modifies the module in the kernel that starts processes to set the nice level of the process according to a record already defined in a file [in /etc/nicety for our example].
    2. A developer [not the desktop user] creates a config tool that allows the definition files [in /etc/nicety for our example] to be edited in a graphical manner.
    3. This daemon or modified module and config tool are included in a distro.
    4. The packager of a program includes a file for a directory [/etc/nicety for our example]. This file contains a list of the executable files in the package and the nice level they should be run at.
    5. Someone with administrative rights [not an administrator, because your average home user doesn't have an admin] to the computer installs the software with the appropriate package management system.
    6. Users start the program & miraculously their programs are niced to the appropriate level.
    7. Someone with administrative rights [not an administrator, because your average home user doesn't have an admin] uses the config tool [if needed], permitting them to increase or decrease the nicety to alleviate problems caused by the default settings.

    There, did that include enough detail that your straw men are dispelled? The core mechanism exists to minimize the problem: nice. A patch to set the nice level on start shouldn't be all that hard to do, and a daemon to reset levels after starting is even easier. Your argument that a home user needs an 'admin' who understands daemons & patches etc. to do this is only valid if you also feel the average Windows box needs an 'admin' to install AIM.

    As several people pointed out, Con's dissatisfaction with the kernel dev team is that they wouldn't change the way nice works to better implement its stated purpose: splitting CPU time based on the interactiveness of a program. However, even in its current state, a system to automate nicing the processes would resolve most of the issues people are seeing with desktop responsiveness. In combination with any of the new schedulers, it should make just about anyone happy.
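
    A sketch of what step 1's daemon loop might look like, with the process scan stubbed out as a plain dict so the matching logic is visible. A real daemon would read /proc/<pid>/exe for each process and would need CAP_SYS_NICE (or root) to renice below 0; /etc/nicety and the table values here are still hypothetical.

    ```python
    # Sketch of the renicing daemon from step 1. The process list is a
    # stub; a real daemon would enumerate /proc. The nice table stands in
    # for a parsed, hypothetical /etc/nicety file.
    import os

    NICE_TABLE = {"/usr/local/bin/Skype": -4, "/usr/local/bin/totem": 10}

    def plan_renices(processes, table):
        """processes: {pid: (exe_path, current_nice)}.
        Return {pid: target_nice} for processes whose level needs changing."""
        actions = {}
        for pid, (exe, current) in processes.items():
            target = table.get(exe)
            if target is not None and target != current:
                actions[pid] = target
        return actions

    def apply_renices(actions):
        # Requires CAP_SYS_NICE to lower nice values below the current one.
        for pid, level in actions.items():
            os.setpriority(os.PRIO_PROCESS, pid, level)

    if __name__ == "__main__":
        procs = {101: ("/usr/local/bin/Skype", 0), 102: ("/bin/bash", 0)}
        print(plan_renices(procs, NICE_TABLE))
    ```

    Running plan_renices() on the stub data flags only the Skype process (pid 101), since /bin/bash isn't in the table; a daemon would repeat this scan-and-apply loop every few seconds.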

  • Re:Don't think so (Score:4, Insightful)

    by synthespian (563437) on Tuesday July 24, 2007 @05:01PM (#19976099)
    The Linux desktop is built on very similar components to the Mac desktop, yet is worlds away in usability. And that's basically because a) nobody is defining, shepherding and advocating usability requirements at the OS level, and b) the desktop projects don't have a architect/requirements definer at all.

    And whose fault is this? How many usability studies has GNOME conducted? Novell, IIRC, has done only a handful, many years later. And KDE set up a usability group only one or two years ago (and I've yet to read any paper from it). Not only that, GNOME has adopted the practice of not even paying attention to bug reports (look up Eugenia Loli-Queru's argument with the GNOME project on this).

    Almost all the free software GUIs are not innovating *at all* on usability. They are all about little cosmetic changes. Mac OS X and Vista have left them behind the curve (and don't mention Beryl... what's the point of a spinning cube?! How does that increase usability? Or wobbly windows?!). Sometimes they innovate a little, but in the opposite direction, like Ion.

    And frankly, when someone tries something new, nobody pays attention. Like OpenCroquet. Like some experimental Java desktops. You can't really expect anything else from developers hellbent on C programming... What can you expect from GNOME? All I expect from a C project of that size is that it's going to fall further and further behind the curve... We can't even expect anything from the likes of Novell: their Mono is not really being developed as a multiplatform tool, is it? (So, no FOSS desktop like GNOME or KDE.)

    The real shame is having companies that are basically full of non-creative individuals injecting money into FOSS.

    By the way, "Linux" is not the only Unix-like OS that uses GNOME and KDE.
  • Re:Don't think so (Score:3, Insightful)

    by owlstead (636356) on Tuesday July 24, 2007 @05:38PM (#19976595)
    No, it doesn't. If it does, then *you* are on the wrong side.
  • Re:Solution (Score:4, Insightful)

    by Brad1138 (590148) * <brad1138@yahoo.com> on Tuesday July 24, 2007 @05:59PM (#19976851)
    Boy, you are really missing the point. bladesjester is correct. Your "simplified" answer gave me a headache, and I have been into computers since the early '80s. The "computer geek" who understands that makes up probably less than 5% of computer users. Which is as high as Linux desktop market penetration will ever get if they continue to keep things so complex/complicated.
  • by rtssmkn (900096) on Tuesday July 24, 2007 @06:47PM (#19977429)

    At our company, we use a heterogeneous setting: Windows XP and Linux (Kubuntu, that is).

    The XP machines we set up for development purposes (Eclipse, PHP, XAMPP) are now, not quite seven months after their first installation, at the bottom of usability, so to say. Everything on these machines takes longer and longer, with nothing having been installed in between except the required updates for the development platform.

    On the Linux machines it is different: quite a few updates to the installed development software, upgrades to the system, and installation of a few gadgets, and still everything works fine and, most of all, stays responsive.

    And, what is more, yesterday I received a notification from XP (Professional, that is, based on the proven NT platform) saying that the MS-DOS driver would not function properly after inserting a certain CD-ROM... remember that this is an off-the-shelf installation of the system.

    By my account, Linux is more desktop-ready than any other platform, except perhaps the also Unix-based Mac OS X environment.

    Just my 2 cents.

    Regards.

  • Re:Solution (Score:1, Insightful)

    by Anonymous Coward on Tuesday July 24, 2007 @07:33PM (#19977883)

    Boy you are really missing the point.
    No, you and "bladesjester" are really missing the point. The point was that there are things that Linux developers can do and already are doing for computer users to make Linux run better and differently on the desktop versus on servers. Things that computer users need to know nothing about in order to benefit from.

    To repeat, the suggestion about what things can and are being done was aimed at what Linux Desktop developers could do to make it simple and fast for computer users, not just for themselves.

    For a computer user, Linux can be and is just as simple to run as Microsoft's Windows is.

  • Re:Don't think so (Score:4, Insightful)

    by kklein (900361) on Tuesday July 24, 2007 @08:06PM (#19978153)

    Okay, I'll be as clear as I can be here: Linux will never take over the desktop. Ever. Ever. Why? Because it's a pain in the arse.

    Never, in all my years of working on the Mac and Windows, have I been required to type something like "sudo vim /etc/X11/xorg.conf" and then try to tell my computer to display something over 640x480 resolution--and even then not having it work, even after following 3 different, progressively complex, methods of getting an nVidia driver to work.

    Every year or so, I try to set up a Linux machine with whatever the new darling distro is. Only once have I gotten one to work acceptably, but there were still issues I wasn't happy with. And that took about a week of reading poorly-written manpages. Just the other day I gave Ubuntu 7.04 a shot. I gave up after 2 hours of fiddling to get working video.

    That is after having to futz with my CMOS to boot it--a step most people wouldn't know to do.

    Linux people are, and I'm going to be brutally honest here, morons. Not computer morons, obviously, because they have the skills and general knowledge required to get Linux to at least boot and display video properly, but morons because they lack even a basic understanding of what other people want from computers. Linux people are, and this will be news to precisely no one, geeks. As such, their opinions on computers are absolutely irrelevant to anyone other than fellow geeks.

    People do not want to fuss. They want to buy a computer, turn it on, and start putting in software they bought at Wal-Mart without ever even thinking about what is going on below the UI. Hell, as far as most of them know, there ISN'T anything below the GUI. That's what it has taken to get the computer into every home in every developed country in the world: compatibility and ease-of-use.

    Linux offers neither of these things.

    Ultimately, the FOSS model is fundamentally flawed. People write things they find fun or that they really need--motivations we in the education business refer to as intrinsic, which is the best kind of motivation there is. The problem is that no one finds things like video drivers fun. There's no huge drive to make sure all the features of the video card are supported, because you won't need them anyway. So, without some kind of extrinsic motivation, like profit, certain jobs just never get done--or at best, get done half-assedly.

    This problem is exacerbated by the fact that the people doing the developing are uber-geeks (we know this for certain because they are evidently coding for fun), and therefore don't sweat having to tweak a text file here and there. They pat themselves on the back for getting it to run at all (as they should--it's quite the accomplishment, and something to be marveled at!) and get so excited that they mistake this small success to be proof that everybody can and should be running Linux just like them. But they shouldn't, because (polishing off my old Slashdot chestnut)...

    Linux is a toy.

    It is a hobby OS. People have gotten this claptrap toy to do some pretty great things, and it's a no-brainer for any kind of application where the computer isn't expected to do anything very exciting (games, iTunes, iMovie/Windows Movie Maker, hook up any random scanner you buy--Only geeks are "excited" by hosting webpages and/or directing network traffic) or where you need a really small footprint (embedded). But that does not a desktop OS make. Not for the unquantifiably vast majority of computer users, anyway.

    Look, everyone hates Microsoft. Apple has their own hassles to deal with. But both are so astonishingly better at serving the customer's needs and desires than the Linux distros will ever be that the fact that some people even need that pointed out to them simply demonstrates, clearly and unequivocally, that those people are, as I have already stated above, morons.

    I'm sorry, but it's true.

  • by sl3xd (111641) * on Tuesday July 24, 2007 @11:01PM (#19979381) Journal
    Ashton (the interviewer) chose the title that says why linux failed on the desktop without consulting me. If you actually read the interview I never once say that linux failed on the desktop.

    Well, now you have a personal understanding of why a lot of people are turning from "mainstream" journalism to alternative sources. The journalistic process isn't exactly honest or honorable, is it?

    I did think it odd that after arguing against fair scheduling for quite a while, Ingo et al. decided to implement it (and how rapidly it was dropped into the kernel). I've read a few articles about the sudden change of heart. I'm sorry things worked out that way; I can definitely see how disappointing it is that you didn't even get any credit for championing fair scheduling, nor were you given any involvement in implementing the CFS.

    On the other hand, I also recall reading a paper that was given at OLS 2006 that was more or less stating that "Userspace Sucks"; there's a lot of work to be done there.
  • by arivanov (12034) on Wednesday July 25, 2007 @01:20AM (#19980027) Homepage
    This is his take on that. That is not necessarily the reality of what was happening.

    While I agree with his take that 2.6 was a sequential rewrite and breakage of working parts, a number of those rewrites were essential for having a fast working kernel in desktop and end-user applications. For example, the full TTY layer rewrite around 2.6.15-16 was essential for improving the speed of newer high-end modems (GPRS/3G, anyone?) as well as other apps using the tty layer (there are still a lot of those). Same for the NFS improvements and the introduction of NFS 4.0 in 2.6.14, and so on and so forth. Overall, 2.6 should not have been released. It should have been numbered 2.5.99 until circa the 2.6.16 level. Alternatively, we should have been at 2.8 or 3.0 by now.

    He was mostly pushing towards scheduler improvements, which were somewhere low on the priority list in early 2.6. While I do not necessarily agree with Linus rephrasing "Wall Street" as "Scheduler is for wimps, real hackers do VM", it is a fair description of the necessities of early 2.6 development. The VM in early 2.6 sucked, only to be broken around 2.6.10, followed by some cascading breakage into filesystems, and just barely fixed around 2.6.16+. From there on, the scheduler was paid some attention, and quite a lot actually. Some of the suggestions that came in that period are actually quite good. Most importantly, they have some mathematical basis behind them. They are not hacked together, and their behaviour is predictable. Unfortunately, they are considerably harder to tweak or tinker with. So I am not surprised he is slamming the door. What I find annoying is that he is blaming everyone but himself for it. Not nice.

The speed of anything depends on the flow of everything.

Working...