


The True Challenges of Desktop Linux 505

olau writes "Hot on the heels of the opinion piece on how Mac OS X killed Linux on the desktop is a more levelheaded analysis by another GNOME old-timer, Christian Schaller, who doesn't think Mac OS X killed anything. In fact, in spite of the hype surrounding Mac OS X, it seems to have barely made a dent in the overall market, he argues. Instead he points to a much longer list of thorny issues that Linux has historically faced as a contender to Microsoft's double monopoly on the OS and the Office suite."
  • by future assassin ( 639396 ) on Friday August 31, 2012 @08:03PM (#41195939) Homepage

who think that when they buy something it belongs to them to do with as they wish, there will always be Linux. As it is, it seems that with Windows 8 MS is taking that away, and so is Apple.

As a non-developer and non-programmer, it seems to me Linux is stronger than ever.

  • by execthis ( 537150 ) on Friday August 31, 2012 @08:10PM (#41195993)

WordPerfect was already being used extensively by legal offices. It would not have been a huge jump to get legal offices to switch to Linux running WordPerfect. But after version 8, WordPerfect was not a native Linux port but a convoluted thing that ran through an emulation layer, which was insane. Then, not long after, it died. That was the end of the chance for Linux to advance onto the corporate/business desktop.
I'm sure some other things didn't help either. I still think one major issue is that package managers have no way to screen out crusty projects. There should be a way to ignore all software that hasn't been developed or changed in X amount of time, with X = 6 months, 9 months, whatever: some value that cuts out the immense amount of crust.
    I also think Linux should have done more to entice hardware and software makers to use it. In fact, it should have done everything absolutely possible to make life easy for hardware and software makers, including more flexible licenses. I don't think people were realistic enough to realize that, without the needed support of hardware and software makers, everything else is almost a moot point.
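The staleness filter suggested above could, in principle, be a thin layer over release metadata the archive already tracks. A minimal sketch in Python, assuming a hypothetical list of (package name, last-release date) pairs; no real package manager exposes exactly this interface, and the function name and sample packages are invented for illustration:

```python
from datetime import date, timedelta

def filter_stale(packages, max_age_days=180, today=None):
    """Drop packages whose last release is older than max_age_days.

    packages: iterable of (name, last_release_date) pairs.
    Returns the names of packages still considered maintained.
    """
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    return [name for name, released in packages if released >= cutoff]

# Hypothetical package list; dates are made up.
pkgs = [
    ("activetool", date(2012, 7, 1)),   # released recently
    ("crustyapp", date(2009, 3, 15)),   # untouched for years
]
print(filter_stale(pkgs, max_age_days=180, today=date(2012, 9, 1)))
# → ['activetool']
```

The cutoff would of course need tuning: a mature, finished tool can go a year without a release and still be perfectly healthy, which is exactly why a user-settable X rather than a hard-coded value makes sense.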

  • Linux fails itself (Score:1, Interesting)

by 10101001 ( 732688 ) on Friday August 31, 2012 @08:23PM (#41196109) Journal

    Let me tell you a little story about a recent experience I've had with Linux (Ubuntu to be specific) that should give you an idea on why, I think, Linux is something of a failure on the desktop.

I've been using Ubuntu 10.04 LTS for a long time. Well, three years. You see, Ubuntu only supports (as in, will fix security bugs in) a version of their distro for a maximum of three years--and even then, you have to use a Long Term Support release to get that. Compare that to XP, which has seen a much longer period of support. Oh, you say, but an upgrade is free. Well, let's talk about that upgrade to 12.04 LTS.

On my system, I had to d/l ~6GB of package updates for Ubuntu itself and then another ~3GB for "Third Party Sources"--let's ignore those "Third Party Sources" for the moment since that's its own thing. Most of that comes down to the point that, as another poster noted, libraries go through regular ABI breakage. Hence, an LTS version can't readily upgrade a library progressively, indefinitely. Instead, a break in the form of a new distro version, or constant heavy-lifting backporting (what Debian does), has to happen--the latter of which merely delays the inevitable--and it is a very arduous process. Why? Well, the biggest reasons are as follows:

The installation, once started, can't really be aborted. Because of the interconnected nature of Linux distros, which package managers help manage, there's no way to do a clean break to pause and resume progress. It helps none that Linux itself does a piss-poor job of supporting things like hibernation, replayability, or a general framework for containers, so that one could, given enough disk space, simply have the older and newer distro installed at the same time and make resuming almost trivial. Instead, one is left with an installation that could take a day or more--more so because the replace/keep prompts for changed config files aren't grouped, so the installation will repeatedly stall unless you're willing to nurse a 5+ hour install.

    And that doesn't even get into the obvious stuff: ndiswrapper either moved packages or something, which resulted in a lack of wireless support on my newly installed distro version--which rather hampers going online to find out where it moved to and downloading the new ndiswrapper package. Thankfully, due to a "feature" of older kernel images not being removed when their package is removed--which violates the concept of a package manager managing things (which further raises the subject of configuration files, but I digress)--I was able to resolve the issue. Meanwhile, umount still segfaults on hal-based mounts. :/

Now, I'm sure people could argue "well, that's just an issue with Ubuntu" or "I've never had problems upgrading my distro". But the point is, the underlying architecture isn't robust enough to deal with the sort of issues endemic to the FOSS world: library upgrades, or even drivers moving or disappearing. And the argument that "well, Windows/Mac OS X is no better/is worse" does nothing to show why Linux is better and something people should want to choose. And I do agree Linux is better. It's just marginally better in a lot of areas, and those better areas are aggravating at times and, at least in short bursts, worse than the alternatives.

  • by Blakey Rat ( 99501 ) on Friday August 31, 2012 @09:42PM (#41196545)

    So the solution there is to ship BIG EXPANSIVE libraries with the OS, and keep on top of them so new stuff is supported by those libraries ASAP. You don't have 75 copies of zlib.dll, you have one-- and it's owned and updated by the OS.

    Take Microsoft's .net for example. The library covers pretty much everything you can imagine wanting to do with a computer, and it's constantly updated as new file formats/etc arrive. But since there's only ONE .net, the library is still one holistic thing that can be updated when security problems arise without breaking anything.

    That's not to say that .net is the perfect solution to all problems, but it's definitely worth examining how other vendors solve the problems in Linux.

    For what it's worth, I come from Mac Classic, a platform that never had DLLs in the first place (but did have a huge expansive built-in library). Frankly, I've never been convinced that shared libraries were a good idea, even when HD space was expensive. But that's just me.

  • by Anonymous Coward on Friday August 31, 2012 @10:17PM (#41196703)

Yes, the binary should still run, and the SAME binary should run across several distros and several versions of those distros. Even in the current messed-up environment it is possible if you are very careful: use the oldest compiler you can find, so that all of your users have newer versions of libc and libstdc++; build and bundle all the rest of the libraries yourself, including the GUI libraries; and be careful with the X11 options on configure, since you can't count on xfixes. This is why commercial development has little patience for Linux. From: a Linux user since the 0.92 kernel and principal developer of a commercial desktop Linux statistical visualization product. The product is still sold, and thriving on Windows and Mac--there's even an iPad version--but it is now discontinued on Linux! Sadly, without ABI stability, or at least compatibility libraries, Linux will not be more than a niche on the desktop.
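The bundle-everything discipline described above can at least be audited after a build. A rough Python sketch that parses `ldd`-style output and separates dependencies resolved from a bundled directory from those resolved on the host system; the sample output string is invented for illustration, and real `ldd` output varies by distro:

```python
def classify_deps(ldd_output, bundle_dir="./lib"):
    """Split ldd-style output into bundled vs system-resolved libraries.

    Anything resolved outside bundle_dir (libc and friends aside) is a
    dependency the binary silently takes on the host distro's ABI.
    """
    bundled, system = [], []
    for line in ldd_output.splitlines():
        line = line.strip()
        if " => " not in line:
            continue  # skip the vdso/interpreter lines
        name, _, path_part = line.partition(" => ")
        path = path_part.split()[0] if path_part.split() else ""
        (bundled if path.startswith(bundle_dir) else system).append(name)
    return bundled, system

# Invented sample output for a hypothetical binary.
sample = """\
    linux-vdso.so.1 (0x00007fff5b5fe000)
    libgui.so.1 => ./lib/libgui.so.1 (0x00007f2a1c000000)
    libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f2a1b800000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f2a1b400000)
"""
bundled, system = classify_deps(sample)
print(bundled)  # → ['libgui.so.1']
print(system)   # → ['libz.so.1', 'libc.so.6']
```

In the spirit of the parent's recipe, everything in the `system` list except glibc and libstdc++ would be a candidate for bundling (with an rpath pointing at the bundle directory).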

  • by Kjella ( 173770 ) on Friday August 31, 2012 @11:12PM (#41197049) Homepage

    Windows is a consumer/business product geared to people who want (and are convinced they need) a high level of support.

    Lolwut? I know there are many businesses that want support because it's their bread and butter, but my home desktop isn't supported by anyone and I think that's pretty common. The reasons are more:

1) It's what most other people run, meaning most shit has been found by somebody else and fixed. Maybe the driver developers should care that they have crap support for the 1% that's Linux or the 5% that's Mac, but they sure as hell care if 90%+ of their market think they're crap. Of course this is a chicken-and-egg situation: if Linux had 90%+ market share it'd be the one with stellar support, but it isn't. How many laptops still have problems with power management and suspend/resume? How many dare ship a laptop with those functions broken in Windows?
2) Because most people are on Windows, most software is written for Windows. I'm sure you can try arguing that quality beats quantity, but it doesn't hold up in practice. Most commercial software has people to do all the boring and tedious work and polish that is so often skimped on in the OSS community. Not to mention most OSS is available on Windows: sure, if you want to use GIMP you can, but you can also get Photoshop or whatever else you fancy. The list of Linux-exclusive killer apps is short if not empty.
    3) With lots of users, there's also lots of people that might be able to help you. If people have any kind of installation instructions or guides or tutorials for something, it's likely to be for Windows and possibly Mac. How to do it in Linux? You're on your own. It's not that I can't find out on my own and there's usually something analogous but it's still time spent and if you don't like to play with those details then it's time wasted. If you get any training at work it's likely to be for Windows or Windows applications.

Using Windows is travelling down the well-worn path; if you're using Linux, you're far more paving your own way. I'd also wager that any person able to manage a Linux box could just as easily have managed a Windows box. Seriously, if you spend any significant time managing your home desktop then you're doing it wrong.

  • Re:Blames (Score:5, Interesting)

    by Doctor_Jest ( 688315 ) on Saturday September 01, 2012 @12:50AM (#41197617)

    I think both the hardcore and "user-friendly" Linux versions can coexist fine. I have used the former and the latter, but I settled on Debian when I got my 64-bit computer (a cheapazoid dell AMD refurb...)

The old farts like myself who cut their teeth on Commodore 64s and Atari 800s are still looking for something to tinker with (there are exceptions, of course), and Linux fills that need nicely. I can remember installing Slackware from floppies while in college, because I wandered into the computer lab and started dinking around with HP-UX. Back then there wasn't much of a WWW... Of course the hacker in me grew substantially when I found I could use a free OS on my PC. Sure, it was a beast to get my ET4000 card recognized by X (I never got it fully working), but having my own shell prompt on my lowly PC rekindled my love of tinkering. Not since I got my first Atari 800XL for Christmas (with a datasette) had I felt like using a computer was fun again...

    That is not to say I want to force my love of tinkering on anyone else... heck, I remember trying to free enough RAM in DOS to run certain games... that was the "not fun" side of tinkering, and I can see why people are reluctant to return to those lawless days of yesteryear. :)

    Linux can thrive and succeed without 95% of the marketshare. Sure there are some high profile things the commercial OS vendors will always keep close to their chest... but for everything else, there's always an alternative. Linux represents that, but cannot gain traction because those who have an idea about going to Linux remember the stories the "old farts" liked to tell about the hell it was getting the OS to work....

    Would I like to kick Microsoft to the curb? Most assuredly. Would I like to see Apple again become a niche player? Without a doubt... but where Linux is going is exciting enough, and well it should make others take notice. (and your sig is so appropriate... mine was 2004 - 2011, though. ;) heheheh.)

  • by scream at the sky ( 989144 ) on Saturday September 01, 2012 @01:02AM (#41197669) Homepage

    This is exactly what happened with me.

    I'd been using Debian and its various derivatives since Woody was the unstable distribution, and I had always been happy with it (so I thought)

    Then, in April it was time to buy myself a new laptop, and I bought a 13" MacBook Pro on a whim, knowing that I could install Debian if I wanted to with no issues, but I figured I would try OSX out to see what the deal is.

5 months later, Debian has been relegated to running in a VMware Fusion instance that takes up 8GB of disk space and gets booted once a month or so, and I am really wishing I had just bought a Mac back in '99 when I first started pissing around with Debian.

  • by Yaztromo ( 655250 ) on Saturday September 01, 2012 @01:44AM (#41197799) Homepage Journal

No, his conclusion makes sense for precisely this reason. If OS X increased by more than Linux's share, then the increase cannot be explained by Linux users switching to OS X. Thus there are other reasons besides Linux that users switch to OS X.

    Two big problems with that reasoning:

1. It assumes that the number of users out there has remained stagnant for the last 10 years or so, and
    2. It assumes that users never switched from platforms other than Windows

    ...both of which are incorrect. Had OS X not existed, there is every possibility that at least some current OS X users would be Linux users (assuming in part that they want to run a UNIX-like OS). I'd fall into that category -- I switched to OS X 10.3 from OS/2 WARP v4.5, in significant part because OS X was UNIX with an excellent user interface, on really nice hardware (particularly for portable systems). If OS X didn't exist, I would have turned to desktop Linux. Many OS/2 users made the same jump (I chatted with David Barnes [wikipedia.org] two years ago, and he was using a Mac as well), and while I can't speak for all of them, many had no desire to fall into the Windows ecosystem, and would probably have become Linux users if not for OS X.

    All that said, I'm also a Linux user -- I have two headless Debian systems on my network running various services, with Xquartz installed on my Macs for when I want to run graphical Linux applications.

The overall point being, it's possible that if OS X didn't exist, many more of the new computer users of the last decade, and those who switched (particularly from now-legacy non-Windows platforms), may have chosen Linux instead of OS X. I don't think it's so much about people moving away from Linux as it is about more people choosing one over the other in the first place.

(One of my main claims to fame is having RMS himself tell me I'm not into OSS enough because I use a Mac -- even though I've led several OSS projects, and contributed to a dozen more. Oh well, can't please everyone, I suppose.)


  • Closeminded? (Score:4, Interesting)

    by RanceJustice ( 2028040 ) on Saturday September 01, 2012 @02:53AM (#41197975)

Absolutely incorrect and unfortunately myopic - there is a much wider user base behind FOSS, with motivation ranging from the altruistic to the selfish, realistic or idealistic. Look at RMS and the FSF, look at Debian, look at all the elements of FOSS that are designed primarily with philosophical purity in mind. Look at those who just want to have total control over the software they need to get their business done, don't care about philosophical purity, but want to ensure their coders have the access, understanding, and control necessary to write a module, update, or fix. Look at those who believe in privacy and openness in the face of the many moneyed interests that are seeking to lock down everything they control for profit while harvesting any information they can find that belongs to others, and who create projects to provide alternatives, believing in the betterment they bring to society. Tor is a prime example - there are many alternative darknets, proxies, VPNs etc... for hackers, but Tor is made to be easy enough that someone with a relatively modest knowledge base can make use of it.

There are most certainly elements of the Linux and FOSS community that are altruistic and create software for a wide variety of non-technical users. Mozilla is a great example - they created some of the best-known FOSS in the world and provided a browser (and mail/news client) that, both at the technical/code level and the usage level, puts the software completely under the control of the user AND has successfully been made easy to use. Firefox and Thunderbird, for instance, aren't like lynx and pine/mutt; they're software that adheres to FOSS tenets and allows the guru to inspect and modify to their liking, while also being easy enough that anyone who has used a browser before can make use of them. Even more impressive is that, because of great design with respect to addons and the like, AdBlock Plus, NoScript, and HTTPS Everywhere can be loaded with a few clicks, as opposed to being the kind of thing that required expert-level scripting to use. Granted, these addons (like the software itself) were created by the knowledgeable, but they were made accessible by design. These weren't tools hacked together to solve the problem of a particular user and little more; they were inclusive, and because of that, they thrive.
