The True Challenges of Desktop Linux

olau writes "Hot on the heels of the opinion piece on how Mac OS X killed Linux on the desktop comes a more levelheaded analysis by another GNOME old-timer, Christian Schaller, who doesn't think Mac OS X killed anything. In fact, in spite of the hype surrounding Mac OS X, it has barely made a dent in the overall market, he argues. Instead he points to a much longer list of thorny issues that Linux has historically faced as a contender to Microsoft's double monopoly on the OS and the Office suite."
This discussion has been archived. No new comments can be posted.

  • by MrEricSir ( 398214 ) on Friday August 31, 2012 @07:57PM (#41195893) Homepage

    FTA:

    The core of his argument seems to be that the lack of ABI stability was the main reason we didn't get a significant market share in the desktop market. Personally I think this argument doesn't hold water at all...

    This is one argument I really don't get, and yet the FOSS library maintainers seem to be adamant that they must be able to break their ABIs whenever they want.

    Yes, I know keeping a stable ABI is hard. But here's the deal: as a maintainer, it's your job.

    Let's not forget that the point of libraries is to develop software on top of them. If the library ABIs are shifting all the time, then those libraries have failed at their most fundamental task.

    There's absolutely zero excuses for why an app written three years ago shouldn't run fine today. None. If MS and Apple can do it, then so can you.

    But it's worse than that. Writing a GUI application that runs on just the past two or three versions of Ubuntu requires writing your own compatibility layers, or at least peppering your code with #defines. Why on earth would we want to put this burden on application developers?
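
    A minimal sketch in C of the sort of guard being described, using GTK's real compile-time GTK_CHECK_VERSION macro; the specific accessor pair (gtk_widget_get_visible, added in GTK 2.18, versus the older GTK_WIDGET_VISIBLE flag macro) is just one assumed example:

        #include <gtk/gtk.h>

        /* Report whether a widget is visible across GTK 2.x versions.
           gtk_widget_get_visible() only exists from 2.18 onward, so
           builds against older versions must use the legacy flag
           macro instead. */
        static gboolean widget_is_visible(GtkWidget *widget)
        {
        #if GTK_CHECK_VERSION(2, 18, 0)
            return gtk_widget_get_visible(widget);
        #else
            return GTK_WIDGET_VISIBLE(widget);
        #endif
        }

    Multiply that guard by every accessor a real application touches, and by every distribution release being targeted, and the maintenance burden being complained about becomes clear.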

  • by Anonymous Coward on Friday August 31, 2012 @07:57PM (#41195895)
    There are lots of challenges, most of which you're probably familiar with if you've ever helped a family member set up Ubuntu on their own computer. Yeah, it sucks; some people say we will never be better than Apple. But you know what, Apple *really* didn't succeed, they only went from 5% market share to 7.5% market share, according to my memory and what I read today on Wikipedia. My conclusion? We have some challenges to overcome. Oh yeah, and, O'Doyle rules!
  • by hawguy ( 1600213 ) on Friday August 31, 2012 @08:04PM (#41195949)

    At my company, out of 500 computer users, we have around 60% Windows and 40% OSX; Linux users (including me) don't even account for 1% of our desktops (but they factor heavily in our servers - we're around 50% Windows, 40% Linux and 10% OSX, and the OSX servers will be moved to Linux before the end of the year). Most of the OSX users are normal business users (finance, IT, etc.), not graphic designers or other users that traditionally have preferred OSX.

    There's little reason for anyone here to run Linux to do their work - Office 2011 runs well on OSX and gives users an Office Suite and Outlook that's compatible with the rest of the corporation. And there's the whole Apple Ecosystem that some people like to be inside of.

    Even though I run Linux, I still do most of my work on a Win 7 virtual machine because some apps just don't run well (or at all) on Linux. I tried Crossover Office/Wine for a while to run Office, but it wasn't worth dealing with the quirks; Office runs much better on Windows. Plus, some of our corporate tools and infrastructure management tools run only on Windows (or require MSIE for full functionality). We run a terminal server for OSX users that need to run Windows apps.

    OSX may not have killed Linux, but it sure has kicked it into the corner.

  • by cynyr ( 703126 ) on Friday August 31, 2012 @08:12PM (#41196005)

    but does the binary have to run or just work if you configure; make; make install again? Right, the OSS world assumes that software can be recompiled, and most software only needs that. Sometimes it needs a simple patch, but yes, breaking ABI isn't really an issue. Breaking an API is much more of one.
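
    To make the ABI-vs-API distinction concrete, here is a hedged sketch in C around a hypothetical library (libfoo, foo_config and set_height are made-up names): a change that breaks the ABI while leaving the API intact.

        /* Hypothetical libfoo public header. Version 1.x declared:
               struct foo_config { int width; int height; };
           Version 2.x inserts a field in the middle: */
        struct foo_config {
            int width;
            int depth;   /* new in 2.x; shifts 'height' to a new offset */
            int height;
        };

        /* This caller compiles unchanged against either header, so
           the API survived the change: */
        void set_height(struct foo_config *c)
        {
            c->height = 480;
        }

        /* A binary compiled against the 1.x header, however, stores
           480 at the old offset of 'height', which 2.x code reads
           back as 'depth'. A recompile (configure; make; make
           install) fixes it; the old binary silently misbehaves. */

    That is exactly why "just recompile" papers over ABI breaks but does nothing for anyone shipping binaries.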

  • by MrEricSir ( 398214 ) on Friday August 31, 2012 @08:16PM (#41196037) Homepage

    but does the binary have to run or just work if you configure; make; make install again?

    First of all, if you do that it's no longer the same binary.

    Secondly, why would you place that burden on the user? The whole point of software is to solve problems for users, not to create new ones.

  • I have one. (Score:5, Insightful)

    by Anonymous Coward on Friday August 31, 2012 @08:20PM (#41196069)

    There's absolutely zero excuses for why an app written three years ago shouldn't run fine today.

    You sound like you're a paying customer or their boss. If said maintainers are volunteers and doing this in their spare time and juggling work and family and just having a life, I think they have an excuse.

    If it were me and I heard horseshit like your post, I'd say, "Here's the code. Knock yourself out. I'm taking my kid to the movies like I promised him three releases ago."

  • by __aaltlg1547 ( 2541114 ) on Friday August 31, 2012 @08:20PM (#41196071)

    I think the real root of the difference is that Linux serves a different market. Apple Mac OS X is a consumer product pitched for people who want their computers to "just work." Windows is a consumer/business product geared to people who want (and are convinced they need) a high level of support. Linux is not either of those and never will be. It's a system made by and for programmers and other techies who want to be free of the monopolistic practices and have full control of their own machines from top to bottom.

    I think Linux may in fact be close to saturating that market. It may make inroads into the business and consumer user spaces. I think it will and should, because businesses shouldn't be using things that are very expensive and promote lock-in when there are good-enough alternatives that meet most of their needs. But corporate customers are very conservative about risk, and they perceive that buying a professionally supported commercial product is a lower-risk option. And they've drunk the Kool-Aid regarding how efficient their office applications are.

    In reality, Windows customers probably pay the steepest price for their OS choice. It requires tons of support in a corporate environment and exposes you to a much higher risk of malware infections and security breaches. Maybe you need Windows on a few of your machines -- those of people who need to establish an appearance of "Corporate" credibility. And maybe you need some Macs for certain applications where the Mac apps give you enough of a productivity improvement to pay for the expensive system. But most of the worker bees can do as well or better on Linux at much less cost. But it will never come with support. Support will be either hire-your-own or contracted separately.

  • Basically your argument is that it's all about Microsoft Office? I agree with you, but then it has nothing to do with how good or bad GNOME vs. OSX is. The Linux Desktop will not happen on any serious scale until the corporate world stops revolving around Office, and there isn't a damn thing we can do about it.

  • by NoNonAlphaCharsHere ( 2201864 ) on Friday August 31, 2012 @08:28PM (#41196133)
    You'd probably have a point there if every single Windows app didn't ship with 42 DLLs that only work with/for that particular app, providing a shim between the app and the OS. In contrast, Linux apps are actually expected to interface with shared libraries not directly under the particular app developer's control.
  • Casual User Here (Score:5, Insightful)

    by Iskender ( 1040286 ) on Friday August 31, 2012 @08:36PM (#41196183)

    As a single-booting but casual Linux user I don't really know if these libraries are what makes distributing software such a pain, but whatever the reason is, something needs to change; the point about software distribution was spot on.

    Package management is nice, but if something isn't available through it I won't install it. Why not? Because:
    * I have to compile it myself. This often results in errors which I can't handle.
    * I have to edit config files. Might be xorg.conf, might be something else. All I know is someone failed to make it work out of the box properly. Things will break.
    * I have to find the application. Yes, that's right: often applications leave no trace after installing, especially when using a manager. They're buried in the complex-just-cause Unixey filesystem. Typing the name into the CLI fails too of course.

    Now all of these problems can be solved, some seemingly trivially. This doesn't matter - the fact that I can edit xorg.conf means I'm probably in the top 3-5% of all computer users as far as Linux goes, meaning it could just as well be impossible for a normal user.

    Users are used to the Windows XP interface and Linux is frequently more like it than Windows 7 is, so the exterior isn't a problem. The ACTUAL usability problem is installing software - it needs to work universally so people can actually do things and therefore be interested in and dependent on the OS.

  • by perpenso ( 1613749 ) on Friday August 31, 2012 @08:46PM (#41196245)
    No, Mac OS X has not killed desktop Linux. However, it has halted Linux's advance into the desktop market. Much like Linux did not kill MS Windows Server: it halted the advance of Windows Server into what had been traditional *nix server territory.

    That said ...

    So he argues that Mac OS X has not displaced Linux because its overall market share has only gone from 5 to 7.5%?

    That seems to be an odd conclusion. That growth is nearly twice the entire Linux market share, according to his cited numbers. If he wanted to argue that Mac OS X is not displacing Windows he would have a point. As for Linux, he really offers no evidence.

    Yet Mac laptops at Linux-specific conferences, and long-term Linux users confessing they moved to Mac OS X, are so common as to be far more than mere anecdotes.

    The truth is that a bunch of people out there wanted a *nix environment. Workstations were beyond their reach, and Linux filled an empty niche by delivering *nix on PC hardware. Many historic Linux users just wanted an affordable *nix and didn't care about the politics and drama of the FSF and the "free software" movement. So when Mac OS X delivered another affordable *nix implementation, one that runs side by side with a nice consumer GUI environment and has support from many commercial software publishers, they switched. It also helped that Mac hardware delivers the "holy grail" of running Mac OS X, Windows and Linux. Sure, you can emulate, but for things like games you are probably better off booting into Windows. Something many Linux users do too.
  • by perpenso ( 1613749 ) on Friday August 31, 2012 @08:54PM (#41196295)

    I was a Linux user beginning with Redhat 3. I went through Redhat, Mandrake, Fedora, Gentoo and Ubuntu. I've also used Solaris for a daily workstation.

    Then I was assigned a Mac at a new job (running Tiger), and have never used anything else for a desktop since. I've had no reason to. I still keep an Ubuntu box in the house, but it's a server.

    My name is Anecdotal Evidence, it's true, but whatever. I went Mac, and never looked back.

    Your experience is so common it goes beyond anecdotal. Many Linux users just wanted a *nix environment. They did not care about the FSF, the GPL, the free software movement, etc. They just wanted to run some *nix applications and tools. Linux was originally their only affordable alternative to workstations back in the day. Mac OS X comes along and they have another affordable *nix option. One that also gives them a consumer-oriented desktop and off-the-shelf consumer and business productivity software. Mac OS X basically offers a superset of the software they can run under Linux.

  • by rtb61 ( 674572 ) on Friday August 31, 2012 @08:55PM (#41196297) Homepage

    As for Linux's share of the desktop, well, everyone knows that as long as you continue to count all dual boots as Windows and all OS-free hardware as nothing, Linux will continue to have a far smaller market share in mass-media fantasy than in actual reality.

    Both Apple and M$ wet their pants in fear of Android, and Android is Linux.

  • by monkeyhybrid ( 1677192 ) on Friday August 31, 2012 @08:55PM (#41196301)

    but does the binary have to run or just work if you configure; make; make install again?

    First of all, if you do that it's no longer the same binary.

    So? If most of your software is FOSS and can be recompiled, why do you care if it's the same binary or not?

    Secondly, why would you place that burden on the user? The whole point of software is to solve problems for users, not to create new ones.

    It's not often that burden is placed on the user; package maintainers for each Linux distribution generally take care of compiling and making sure the relevant libraries are in place. With every distribution upgrade I do there's been less and less reason to compile anything myself. In fact, IIRC, I've not compiled a single piece of third-party software for my use for at least a year or two.

    A moving ABI really isn't a problem at all for the vast majority of Linux users, especially if most of the software we use is FOSS and available from a distribution's repositories. Now, that's not to say it doesn't cause a few headaches for package maintainers...

  • by Zombie Ryushu ( 803103 ) on Friday August 31, 2012 @08:56PM (#41196305)

    No more "New Distros". No more new package managers, If you have applications, make meta-packages. What really needs to happen is, DEB and RPM need to talk to each other. Stop making "New Distro that changes everything needlessly again."

    Make applications that solve problems, make meta-packages for large suites of applications, make it so RPM distros can talk to DEB databases and vice versa. Agree on a system. And give the "I'm going to make a new distro where the Wallpaper is blue rather than brown" a big glass of shut-up juice. There needs to be one overlording Linux.

  • by obarthelemy ( 160321 ) on Friday August 31, 2012 @09:01PM (#41196329)

    I think the basic issue is that Linux is an OS by nerds, for nerds. Which is fine, as long as they don't pretend they're something else.

    - While using a preinstalled Linux system can be OK (if the system is vanilla, well installed, and you don't want to change anything), installing/admin-ing a Linux system requires the CLI within 10 minutes.
    - The code might be good, but the documentation is horrendous. Codenames are fun except when you don't care about them and have to keep a post-it note to remember if Carmic Crap is 8.10 or 9.14; once you know that, you've got to try to find relevant info (man pages are often out of sync and/or a bit unclear; forum posts rarely state which versions they apply to...). I think this is both accidental (writing doc is boring and unglamorous) and by design (if only a few people can make head or tail of something, their market value increases).
    - The feature set is chosen to impress your programmer peers, not to seduce/help non-techies.
    - Many distros, GUIs... are *released* in what is barely a beta state (early Unity, KDE4...). People howl at MS putting out crap v1s... Linux does worse with v4s...

    Engineers often wonder what the world would be like without marketing- or business-men. The answer is: Desktop Linux.

  • by otuz ( 85014 ) on Friday August 31, 2012 @09:07PM (#41196363) Homepage

    I don't think the parent is trolling. This is practically the sorry state of Firefox, which would probably be something like version 4.6.7 under the old versioning system. WebKit has left Mozilla in the dust; maybe they should switch bandwagons and just release a Firefox-y application wrapper built on WebKit?

  • by Tyler Eaves ( 344284 ) on Friday August 31, 2012 @09:14PM (#41196411)

    Hard drive space is cheap.
    My time isn't.

    I know which situation has caused me more heartache.

  • by 0123456 ( 636235 ) on Friday August 31, 2012 @09:21PM (#41196431)

    Hard drive space is cheap.
    My time isn't.

    I know which situation has caused me more heartache.

    You mean, finding all seventy-five copies of zlib.dll strewn through random directories on your system, each with exploitable security holes, so you can individually replace them all with a patched version?

  • by LodCrappo ( 705968 ) on Friday August 31, 2012 @09:31PM (#41196491)

    Let me preface this by saying I am not trying to be mean or disrespectful in any way. I'm probably assuming things about you that aren't true, yadda yadda. Mostly I think you're a great example of why Linux is not appropriate for "the Desktop" if that means "anybody that feels like using it", and probably never will be.

    The topic you complain of (complexity of installing software) is one that can be mastered in very little time. Gaining a working understanding of the Linux filesystem, paths, editing config files, and basic use of make would take the average person only a few hours of study. Add the ability to copy/paste messages into Google and follow instructions, and installing software simply will not be a difficult task any more.

    However, instead of learning how to do these things, you'd prefer that someone develop some amazing automated installation system that Just Works. I can understand the appeal, but I just don't see any motivation for anyone to create such a thing.

    Most open source software exists because some capable person needed or wanted or was just interested in something and decided to make it. They may add on requested features, others may join the project and extend it far beyond the original scope, but at the core there was that original personal desire.

    I think it's logical to believe that most people who are skilled enough to write software of the complexity required for a "universal magic just works installer" have very little need for such a thing. Installing software simply isn't a challenging task. It's tedious, sure, but not particularly interesting. The number of open source hackers who would volunteer a massive chunk of their time so that the "average guy" doesn't have to spend a couple hours learning is just not very high.

    just my $0.02.

  • by martin-boundary ( 547041 ) on Friday August 31, 2012 @09:41PM (#41196541)
    The OP is talking about ABIs (Application Binary Interfaces), but as your post implies, that's a red herring. Who cares if the low level binary interface that handles OS and library system calls changes? Just recompile the software for the most recent version of everything you've got.

    We can do that in the FOSS world, because we ship the source to everything and the APIs are what matters. The ABI "problem" is a nonproblem that's really a side effect of the misguided commercial belief in secrecy.

    If you're a company that only wants to sell a compiled binary to a bunch of clients, then you don't get to complain if the binary you prepared fifteen years ago for some distro running Linux 2.1 no longer works in 2012.

    Just tell your clients to run the older distro, or else recompile your code for a modern distro. Or you know, you could make your code open source, and reap the benefits of community support.

  • Re:I have one. (Score:4, Insightful)

    by shutdown -p now ( 807394 ) on Friday August 31, 2012 @10:11PM (#41196687) Journal

    Which is precisely why Linux on the Desktop is still confined to 1%.

  • Blames (Score:5, Insightful)

    by Taco Cowboy ( 5327 ) on Friday August 31, 2012 @10:13PM (#41196689) Journal

    I am using Linux

    I have been using Linux since the early 1990's

    In other words, I am no fanboi of Windows or Apple

    But, reading TFA and the previous one (the one accusing Apple of killing the Linux Desktop), I got that uneasy feeling that the people behind the Linux Desktop are adopting the stance of blaming others for their own failures

    No, I am not saying that the Linux Desktop people haven't put much work into making the Linux Desktop a reality - they have - or else we wouldn't have so many choices as we have today, from KDE to GNOME to Enlightenment to many others

    But what I am saying is that whatever failure there is regarding the Linux Desktop should be examined within the Linux context

    Blaming Microsoft or Apple or even the Almighty Himself won't make the Linux Desktop a better choice

    If we really want the Linux Desktop to be used by more people, we must explore ways to make the UI truly intuitive, and that by itself has been a constant challenge for the Linux Desktop people

    In fact, we don't need to look further than "Unity / Gnome 3" to find what's WRONG with the Linux Desktop

    Maybe you will disagree with what I have said, but the truth is sometimes hard to swallow

    We must admit that the Linux Desktop is a failure, and we must find a way to re-make it so that it doesn't suck so badly

  • Bingo (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Friday August 31, 2012 @10:19PM (#41196713)

    You can't have it both ways. If you are happy with "whatever, it is free, the quality can vary, and people can do whatever they like," then cool. That's great, but understand and accept it'll never be the mainstream desktop. If you want that mainstream desktop, then you have to start to support users. You have to make things easy for them. "Just get the source code" can NEVER be something you utter.

    So if you want Linux to always just be the "geek OS" on the desktop or something people use when they are going to stack a bunch of custom shit on (like Android) then no problem. However if you want to advocate Linux for all and a Linux desktop then you need to accept that some shit has to change.

  • by Anonymous Coward on Friday August 31, 2012 @10:30PM (#41196791)

    I'm someone who's spent months dicking around with OpenMW (among other compiled apps) on Gentoo thanks to a combination of ABI incompatibility and differing build options (which can cause random segfaults in SOME but not ALL apps! OGRE being a notable one here). I've also just had my video card dumped by ATI, with no apparent intention of matching features being added to the open source drivers: the Radeon HD4770 supports OpenCL 1.0 in the fglrx drivers, but a mailing list comment about the OpenCL support in clang asks why they should bother supporting OpenCL 1.2, and especially 1.0, since it'd take too much work to document the assumptions made regarding OCL 1.1/1.2 API support - and the current experimental OpenCL support only covers 5450 and above cards. Why am I pissed off by this? Because that particular card is still within about 15-25 percent of the performance of the 7770 cards, and if you compare them on double-precision FP support, it spanks everything short of the HD7950+ in the current generation of cards.

    Combine that with the constant breakage that's led to 'deprecated arches' in both the Linux kernel and the GNU toolchain (try using either on non-ARM/x86 nowadays and tell me how well the vanilla packages work for you!), and you have to wonder how 'open source' is working for us. Open source has just become the new proprietary mess: now, not only do you get stuck hoping the developer won't change the ABI on you, but you can't even be sure the API will work the same between toolchain versions. A good example of this was the aforementioned segfaults, which happen with a mixture of g++-4.2 and 4.6 compiled packages, but don't occur when everything is compiled one way or the other. Combined with a nightmarish assumption of warnings and -f features in various packages, actually getting a non-trivial package built can be a nightmare (I say CAN be, because it really varies from package to package, arch, and the particulars of your build environment). The point, though, is that in some cases it can be almost impossible to track down the particular failing point, even using tools like strace to narrow down roughly where in the process it's failing.

    In the case of OpenMW it was some quirk of the threading between Boost, OpenMW, and OGRE: with any of those compiled against gcc 4.2 instead of 4.6, all the individual pieces compiled fine, but failure showed up down one specific path involving threads (OGRE supports something like 3 different threading models, and the wrong settings could potentially break a pre-compiled app).

    It's been 15 years since I started with Linux, and in that time the build environment has only become a bigger mess - less due to the tools than due to developer apathy toward the plights of USERS. There are plenty of developers who take it seriously, but all it takes is one or two in the chain of dependencies to ruin it for everyone. This is even more true during this mass migration to C++, even among build tools. One seemingly minor change in the wrong place can lead to ABI breakage in a corner case that nobody tested for, and now you've got contaminated binary code in god only knows how many places before you catch it.

    What are you going to do to remedy that now?

    - A crotchety but not that old Linux user.

  • Re:I have one. (Score:5, Insightful)

    by MobileTatsu-NJG ( 946591 ) on Friday August 31, 2012 @10:37PM (#41196831)

    You sound like you're a paying customer or their boss. If said maintainers are volunteers and doing this in their spare time and juggling work and family and just having a life, I think they have an excuse.

    Look, if you're going to pull this 'you get what you pay for' nonsense then you're not allowed to try to convert people over to OSS. You can't have it both ways.

  • Re:minimalist (Score:5, Insightful)

    by devphaeton ( 695736 ) on Friday August 31, 2012 @11:03PM (#41196975)

    Perhaps Linux needs a minimalist leader. Throw everything out. Then step by step, bring back features and see what works, and what doesn't. In the process make sure that everything has a consistent look and feel.

    Believe it or not, that used to be Ubuntu. Back 8 or 10 years ago, there were all these distributions that offered 'choice!' by loading the biggest Gnome or KDE desktop crammed to the gills with EVERY, and I mean EVERY, app that was available. Stable, beta, working or not. You opened a panel and there were 17 calculators to choose from, 23 IRC clients, about 15 web browsers, 7 different terminal apps... you get the idea. Most of it was half-broken shit.

    The beauty of Ubuntu in the beginning (I thought) was that they cut all of that out. You got a slick graphical installer that installed Debian Unstable (which we'd all known for years was fine for everyday use). You booted up to a nicely themed Gnome desktop with only the best ONE of each type of application installed. They were smart about choosing what apps to include by default, and I felt that their choices resonated very closely with experienced Linux users, who generally all agreed on the best app for a particular usage. The whole Debian repository was mirrored and available, but you didn't have to dig through a bunch of crap to find the stuff that you most likely would have chosen to install yourself. Configs were all clicky-clicky, but all your fave Debian CLI tools like aptitude still worked as expected.

    I really thought that Ubuntu was going to become the polished distro that brought Year Of The Linux Desktop(tm) from fantasy to reality. I still think they had a real chance to pull that off. (At least up until about 8.0; then it started to get weird.)

    My $0.02 plus tax.

  • by Eil ( 82413 ) on Friday August 31, 2012 @11:05PM (#41196981) Homepage Journal

    This is one argument I really don't get, and yet the FOSS library maintainers seem to be adamant that they must be able to break their ABIs whenever they want.

    Yes, FOSS library maintainers want to be able to break their ABIs. They do it often. And that's fine. Why? Because we have this thing called versioning. You can write your application against libfoo.so.2, and the author of libfoo can rewrite the thing from ground-up and call it libfoo.so.3. And guess what? Your application works just fine because libfoo.so.2 didn't disappear from the face of the earth. You just install libfoo.so.2 and libfoo.so.3 side-by-side and everybody's happy. This is a primary strength of open source, not a weakness.
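
    For the curious, the mechanism behind this is the ELF soname: the major version an application needs is recorded in its binary at link time, and the dynamic linker resolves that exact version at startup. A minimal C sketch that performs the same versioned lookup explicitly, assuming a hypothetical libfoo (the library name and the foo_frob symbol are made up; dlopen/dlsym/dlerror are the real libdl calls - build with cc demo.c -ldl):

        #include <dlfcn.h>
        #include <stdio.h>

        int main(void)
        {
            /* Ask the loader for a specific ABI version by soname.
               An app built against the old ABI keeps requesting
               .so.2 even after .so.3 is installed beside it. */
            void *lib = dlopen("libfoo.so.2", RTLD_NOW);
            if (!lib) {
                fprintf(stderr, "dlopen: %s\n", dlerror());
                return 1;
            }

            /* Resolve a symbol from that specific version. */
            int (*foo_frob)(int) = (int (*)(int))dlsym(lib, "foo_frob");
            if (foo_frob)
                printf("foo_frob(42) = %d\n", foo_frob(42));

            dlclose(lib);
            return 0;
        }

    Ordinary applications never need the explicit dlopen; the linker records the soname dependency automatically, which is precisely what lets old and new library versions coexist.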

  • by Anonymous Coward on Friday August 31, 2012 @11:18PM (#41197079)

    Or you just don't bother with Linux. You really just don't get it.

  • by enos ( 627034 ) on Friday August 31, 2012 @11:20PM (#41197097)

    It's called a package manager and every major distribution has one.

    Every major distribution has its own, incompatible with every other major distribution's, even though the package systems do the same job. Even distros that use the same package management system don't share compatible repositories.

    So you just turned supporting "Linux" into supporting Ubuntu, RedHat, SuSE, etc.

  • by Burz ( 138833 ) on Saturday September 01, 2012 @12:00AM (#41197347) Homepage Journal

    Package managers do not solve the problem of compatibility across different distros. In fact, not even across the semi-major upgrades you see each month with a single distro. PMs also contribute to making the development environment app-unfriendly because they don't work well with anything that hasn't been subsumed into "the repository"... i.e. independent software distribution is really an uphill slog to the point where even Mozilla gave up on packaging apps for Linux-based distros long ago; Mozilla packages apps for Windows and OS X.

    Really, if you don't make it easy for curious types to make something interesting and to then share it easily with others, then the platform doesn't work. People will continue cutting their programming teeth on OS X and Windows and will stay there or with other platforms that satisfy the same criteria. So-called "Desktop Linux" doesn't even have an SDK! The longbeard hacker politics affecting the Linux Foundation demand that it doesn't have an SDK. Skittering around in Google's wake, they saw fit to create an SDK for Mobile Linux but heaven forbid if we get one for ye olde desktop.

    The subculture stubbornly refuses to standardize both the user experience and that of the app developer. And so it drives both groups away.

  • by saikou ( 211301 ) on Saturday September 01, 2012 @01:50AM (#41197811) Homepage

    I know I'm probably missing some (ok, many) points, but at this stage I gave up on the Linux Desktop -- using Mac and Windows (when I have to) is easier.

    So...
    > We are trying to compete with a near monopoly (Windows)
    Duh, that's the whole point, right? To be better than Windows and lure corporate and home users to the glory of the Linux Desktop. Try again.

    > Companies tend to depend on a myriad of applications to run their business, and just a couple of them not running under Linux would be enough to derail a transition to Linux desktops
    So what, home users are already done for? Fine. Why not pick a few large corporations, find those pesky migration derailers and fix them? Oh, busy with something else, you say, okay.

    > We were competing not only with other operating systems, but with a Office productivity application monopoly
    But didn't the Linux community provide something that was "totally able to replace Office" and kinda compatible? Oh, you mean that's not a Desktop Issue at that point, but an Office Issue. I see.

    > We are trying to compete by supporting an unlimited range of hardware options
    Well, Windows does it mostly by giving manufacturers a relatively straightforward way to provide binary drivers, so as long as you don't yank the compatibility rug for some reason, things "just work". Oh, you're saying binary drivers only over your dead body? Okay. Have you tried, I don't know, supporting less of a range of hardware options if it strains resources?

    > We divided our efforts into multiple competing APIs (GNOME vs KDE)
    Because nothing tickles people's fancies as endless fights over which one of two incompatible ways to do things is The Only Way. With inevitable hissy-fits, splits in the groups and forks into those very competing APIs. I see.

    > There was never a clear method of distributing software on Linux outside the distro specific package system.
    I guess the reason why there isn't one clear method is the previous reason of constant in-fighting?

    > Many of our underlaying systems were a bit immature
    This has been "The Year of Linux Desktop Breakthrough" for many years now. Still a bit immature?

    > Software patents on multimedia codecs made it hard to create a good out of the box experience for multimedia
    But the manufacturer-supplied binary dri.... Okay, okay, you think not having a binary driver is important. No multimedia for you then.

    > Competing with free applications is never a tempting proposition for 3rd party vendors
    Given how free applications don't seem to be a runaway success, I kinda doubt it. Given the quality of some of those free apps, my doubt rises rapidly.

    > We never reached a critical mass where porting to desktop Linux tended to make sense
    A bit of a chicken-and-egg thing, no? If the reason desktop Linux doesn't get new adopters is that it has too few adopters, better give up right now.

    > An impression was created that Linux users would not pay for any software
    Well, the target is Windows users, right? Linux users are already on a Linux desktop. Windows users are known to pay for Windows licenses (well, mostly).

    > The different update cycles of the distributions made it hard to know when a new API would be available ‘everywhere’
    Oh, so no universal APIs, conflicting distros and general "herding cats" type of problems. I see.

    > Success in other areas drained resources away from the desktop
    That's essentially saying "well, we didn't even want it to succeed that much".

    I suppose if only someone had a good set of developers, clearly set goals, no in-fighting, stable APIs, predictable release schedules, support for binary drivers, and whatever the end user wants/needs rather than what's "ideologically right", the whole Linux Desktop Takes Over Windows World thing would happen. But our individual preferences are more important. And nobody is willing to sacrifice anything for the seemingly important goal of luring users.

  • Re:Blames (Score:5, Insightful)

    by evorster ( 2664141 ) on Saturday September 01, 2012 @02:20AM (#41197891) Homepage
    There is no desktop war. There is no "Problem with the linux desktop". Use whatever suits your needs.

    Most of the developers in Linux could not care less what you use on your desktop, and conversely they do not care about your opinion of what they use on their desktop.

    This "Desktop war" only exists in the minds of people that feel threatened by what's running on someone else's desktop.

    KDE has been my default desktop for many years, and it does everything I need, and looks great doing it. I have yet to visit one site where Adobe's Flash Player does not work. LibreOffice is damn close to M$, and the differences there exist only because M$ does not follow any standard when making their applications, not even the standards they have written. You can hardly blame Linux or any of the applications it runs natively for the lock-in tactics of Microsoft.

    For me, at least, I don't see a reason to add to Apple's war chest, or support Microsoft, when both of them use the money to harm the computing field in general. This means I can't play EVERY game out there, but there are a lot of games that run natively on Linux, and a mind-boggling number that will run fine in wine or dosbox.
    I am an avid photographer and video editor, and DigiKam and Kdenlive have me covered there.

    There is only one reason I have a dual-boot partition with Windows 7: I have an Android phone that only allows its firmware to be updated from Windows. Same with the Linux-based GPS I have. It's a slap in the face from the developers of these devices, and if there were any Linux-only options, I would buy those devices instead.

    I guess what I am trying to say is this:
    Vote with your wallet, and only support companies whose business practices you approve of. There is something about voting that seems to be lost on some countries. Sure, you only have one voice... but that voice counts. Vote for something you agree with, no matter if it's in third place or last. When enough people agree with you, that underdog BECOMES the leading party... but this won't happen unless you vote properly, instead of for the lesser of two evils (MS/Apple or Rep/Dem).

    :)
  • by Anonymous Coward on Saturday September 01, 2012 @02:49AM (#41197967)

    Who cares if the low level binary interface that handles OS and library system calls changes?

    It used to be my job. I worked on a commercial program used by digital hardware engineers. We supported many flavors of UNIX.

    Just recompile the software for the most recent version of everything you've got.

    You want me to compile, ship, and support a binary on every flavor of Linux a customer has? We tried to figure out what customers had, and supported the top four. This only covered about 60% of requests.

    We can do that in the FOSS world, because we ship the source to everything and the APIs are what matters. The ABI "problem" is a nonproblem that's really a side effect of the misguided commercial belief in secrecy.

    Paying money for closed source software is how 99.99% of users expect to get software that does the job and is well supported. If you refuse to support this use case, do not expect a significant number of users on your platform.

    Just tell your clients to run the older distro, or else recompile your code for a modern distro.

    I don't think you thought this through. Every customer standardized on a very specific version of Linux. No other version could be used at their company. They had to, because a lack of ABI compatibility meant they would have to recompile everything for each Linux configuration. If they run Red Hat Enterprise 4 release 56, they cannot use a binary built for anything else. I watched many customers buy tens of Sun machines to run our software, instead of using far cheaper Linux boxes, to avoid installing one of the top four Linux distros that we supported.

    Or you know, you could make your code open source, and reap the benefits of community support.

    That doesn't work for the vast majority of software out there. If you have ideological reasons for excluding that vast majority, then don't complain that they don't support your movement.

  • by Gordonjcp ( 186804 ) on Saturday September 01, 2012 @03:05AM (#41198023) Homepage

    No. You just ignore Linux and sell to people who use Windows

    I don't see the problem with that. Don't let the door hit you as you leave.

    Something that commercial software companies just don't appear to get is that the Linux ecosystem just isn't prepared to let them sell a big binary blob and never update it for ten years. We're not here to give them an easy time and help them make money.

    If you want to make money from your software, you have to earn it.

  • Re:Blames (Score:5, Insightful)

    by smash ( 1351 ) on Saturday September 01, 2012 @03:10AM (#41198039) Homepage Journal

    The problem hasn't been with the Linux desktop UI for about 10 years now. Gnome 2 was fine. KDE 2 or 3.x was fine.

    The problems are to do with driver support, upgrade breakage, package manager brain damage and ABI breakage scaring off commercial software.

    The desktop itself is fine, and no amount of faffing about with 3D rotating desktops and other nerdgasm-worthy stuff will change that. Stop fucking around with eye candy and fix the core issues the operating system has first. The current progress seems much like the old "re-arranging the deck chairs on the Titanic", and I say that as a former Linux-on-the-desktop user who started in 1995.

    I got sick and tired of the core issues remaining while the shiny new desktop of the month kept being released, replacing all my somewhat stable apps with buggy new ones.

  • Circle rhetorics (Score:5, Insightful)

    by dutchwhizzman ( 817898 ) on Saturday September 01, 2012 @03:58AM (#41198181)

    You are talking about current Linux users and application suppliers who seem not to bother about ABI stability. If you want to get the other 95+ percent of desktop computer users using your product, you may want to look at their needs, and not solely at the needs of the few you are already catering for.

    Diversity is good for an ecosystem; evolution depends on it. However, with too much instability and chaos, evolution loses, because most of the deviations are too crippled to grow into something useful, even if they carry some very good mutations. This is true for the development of the organisms themselves, but also for people wanting to "farm" those organisms.

    Large corporations making enterprise software don't want to bother with supporting variations that quickly run into thousands of different possible software combinations, each requiring adaptation in their product or service to make it work. Why do you think Oracle supports only a few Linux distributions for its RDBMS? It's not just because they want to promote their own distribution; it's because it simply is a pain in the behind to have to support someone's Arch or Gentoo box, and to find out, after dozens of expensive analyses by expensive software-debugging experts, that some flag was set differently at compile time, or that a minor version of some library has an obscure bug that only gets triggered in specific circumstances. Just a few of those cases and your profit model is out of the window. It's just way too risky.

    Both Microsoft and Apple tend to announce well ahead of time when they want to retire some framework or binary compatibility, so application developers can adapt their product to the new alternative and still support older versions of their product for years to come. Windows still offers most (if not all) 16-bit Windows ABIs in some form on some OSes supported today. Apple took many years to kill "Classic" support and PPC CPU support, and legacy frameworks were around for years before it stopped supporting anything but Cocoa.

    If you compiled an app for OSX or Windows XP 5 years ago using the then-latest standards, the chances that it will run without any modification or extra work on a freshly installed system with OSX 10.7 or Windows 7 are very high. Try that with a graphical application for a Linux desktop and, at the very minimum, you'd probably be looking at installing "compat libs", if your distro supplies them at all. This is a support nightmare, and a nuisance at the least even for people able to deal with this sort of problem themselves. For Linux to make it to the desktop successfully this needs to change. Linux needs its VisiCalc, WordPerfect, Office, Photoshop or similar "must have killer application" to get a decent share of desktop usage, and making it hard for application makers to choose Linux isn't going to make that happen.

  • by Burz ( 138833 ) on Saturday September 01, 2012 @08:30AM (#41198857) Homepage Journal

    Package managers do not solve the problem of compatibility across different distros.

    That's correct. That's not the aim.

    Then stop referring to "Linux" as if it were a single desktop OS.

    The subject here is "Desktop Linux" in the sense that a typical consumer can use it with confidence. The term isn't intended to mean "Linux for workstations and people with IT skills". It has to fit in with personal computing culture, which demands that a typical user can realistically install/manage/uninstall a typical application, with a GUI, and do so independently.

    The tarballs that Mozilla has available for "Desktop Linux" users do not fit the expectations that go along with personal computing, and the obtuse response you gave about this exemplifies what is wrong with the Linux aficionado mentality (and also what's wrong with the rest of your reply). The fact that your defense of the status quo didn't even mention RPM is all sorts of wrong, because RPM is one of the few standards defined in the LSB that is relevant to software installation. If you like the idea of users getting the latest Firefox by untarring as root into a location in /usr (part of the OS), overwriting the rpm/dpkg-managed version that came with the OS, and being left with no way for auto-updates to occur (remember, this is the Linux version, which doesn't auto-update for the same reason that Mozilla doesn't provide a *real* package), that's your prerogative -- just keep in mind that what you're advancing may be closer to a circus than a healthy desktop environment.

    But hey, users should just rely on updates from the repository. It only took the better distros 5+ years to start supplying major application updates to a handful of titles like Firefox without having to do a complete OS upgrade.

    I personally like the way linux works and find development on it much easier. If the penalty for attracting app developers like you is to make it more Android/Windows/iOS like, then please, stay well away for ever.

    The list of OS undesirables in your world is pretty substantial (though it's interesting you omitted OS X). In any case, I predict that anything capable of breaking out of the distro-asylum model would, as a prerequisite, *not* be called Linux. This has actually happened with Android.

    Does (Linux-based) Android give you sleepless nights or ruin your ability to compute the way you wish? If not, then why would you care if another Android-like phenomenon were to come along, only this time for x86 desktop hardware?
