Linux Distributors Work Towards Desktop Standards

WebHostingGuy wrote to mention an MSNBC article discussing a move by several Linux distributors to standardize on a set of components for desktop versions of the operating system. From the article: "The standard created by the Free Standards Group should make it easier for developers to write applications that will work on Linux versions from different distributors. Linux has a firm foothold as an operating system for servers -- it's popular for hosting Web sites, for instance -- but has only a few percent of the desktop market."
  • by jellomizer ( 103300 ) * on Saturday April 22, 2006 @06:22AM (#15179878)
    After the talks there will be two major factions. One may win, but the second will say "screw you" and make its own design in spite of the talks. That is the problem with ego-driven software vs. profit-driven software. Both have their advantages and disadvantages: ego-driven software may produce better-quality code, but its authors have a much harder time agreeing with other people, while profit-driven software tends to be more consistent, though the quality tends to be a little lower.
    • and microsoft is a combination of both? they have the low quality of profit-driven software as well as the egoistic recreation/bastardization of standards, just to be as incompatible as possible with the rest of the world.
    • That is exactly what I thought when I read this. This sort of thing has been tried before, but so far it has never worked for Linux. Let's hope this time people will lower their testosterone levels a bit for the greater cause, because having three major desktop operating systems is way better than the two we have now!
    • by asuffield ( 111848 ) <asuffield@suffields.me.uk> on Saturday April 22, 2006 @08:01AM (#15180077)
      You're half right. The bit you got wrong is that the profit motive does not inspire people to produce consistent software. Most commercial software is just inconsistent, with everything around it and sometimes even with itself. This happens because each piece of software has a different project leader, and nobody in management above them understands enough to impose a single vision on the whole system. Given a choice, an individual project team will usually attempt to differentiate its project from all the others, in the hope of getting more money and/or recognition.

      So the conclusion is probably that different software created by different people is usually going to be different. That's probably a good thing and you should just get used to it. Nobody can invent a single way to do things that is right for every piece of software you might want to use in the future.
    • What did/do they have in common? All were/will be killed off by a company that is profit-motivated and uses illegal tactics. The ego stuff was done by these companies, and then one low-quality, profit-driven company has managed to kill all but two, and those are just a matter of time (Intuit is doing OK for the moment, but TurboTax is slowly being gutted, and MS will move into its markets soon; Intuit will begin a slow decline).
    • by swillden ( 191260 ) <shawn-ds@willden.org> on Saturday April 22, 2006 @09:31AM (#15180381) Journal

      After the talks there will be two major factions. One may win, but the second will say "screw you" and make its own design in spite of the talks.

      History disagrees. While the Linux Standards Base and Freedesktop.org projects haven't solved all of the problems -- and probably aren't fully adhered to by any distribution -- they have already made a huge difference in the compatibility of Linux distributions, and I think efforts like this are exactly what we need to continue pushing interoperability forward.

      I say this, by the way, as a developer who just finished developing a cross-platform, commercial, binary-only application for Linux. The app I was working on definitely pushed the limits of interoperability, since it was an authentication system that replaced key system components, and in spite of that it went very smoothly. The differences between the half-dozen Linux distros I had to tweak the package for were very small. Actually, the more difficult issue was making things work in spite of customizations the admin may have made -- I just had to punt on that one, making the installer intentionally brittle in the face of unanticipated modifications to, for example, the X startup scripts, and then giving the admin the ability to customize the installer to adapt to local changes.

      After my experience of the last year, I wouldn't have any hesitation about developing more "normal" applications to run on multiple Linux platforms, and I expect initiatives like this one (which is from the same consortium that brought us LSB) will continue to reduce the platform differences that cause problems. I think we may even be able to get to the point where app developers may actually be able to target LSB (or whatever its successor is called) rather than having to tweak for individual distributions.

    • You mean like in the commercial world, the Apple faction said "screw you" to the Microsoft/IBM faction and did everything their own way? Or like the MS Office group said "screw you" to both the MFC and Vista groups and keeps violating GUI guidelines on Windows? Or like Mac developers can't agree on a consistent toolkit (there are half a dozen different ones in common use), consistent look, or consistent installer?

      It's good that the Linux desktop is being unified further, but it certainly has to fear no com
    • The second one will say "screw you" and make its own design in spite of the talks. That is the problem with ego-driven software vs. profit-driven.

      Oh, we've never seen that in the non-free world. Have you ever seen a free software advocate hire Madonna for a release, or make a speech by projecting their head onto an 80-foot screen? Do you remember a little antitrust trial where a parade of computer industry giants testified about how often and how hard M$ would screw them?

      Ego-driven software may produce better-quality c

  • reasons why (Score:4, Interesting)

    by fl!ptop ( 902193 ) on Saturday April 22, 2006 @06:26AM (#15179883) Journal
    interesting that msnbc bills the move as 'making the operating system compete better with windows' instead of 'making it easier for developers to write applications that work on different flavors.'

    i would think the former is a result of the latter, instead of the other way around.
  • by MichaelSmith ( 789609 ) on Saturday April 22, 2006 @06:28AM (#15179888) Homepage Journal

    I can run KDE applications under fvwm and Gnome, as long as the runtime libraries are there. I don't see why it is hard to have Qt and GTK libraries on each system.

    The only remaining issue is cut and paste with rich content, but the article doesn't talk about that.

    • by jellomizer ( 103300 ) * on Saturday April 22, 2006 @06:38AM (#15179904)
      It is an issue of consistency. If I am running Gnome, I know when I am running a KDE app because it looks and feels slightly off. The same goes when I am using a straight X11 app. Linux for the desktop is not about window managers. It is about giving developers the tools to make their apps desktop-friendly, and the ability to make sure Linux apps look good no matter what WM you are using.
      • i wouldn't start my kde if it looked and behaved like gnome ;)

        now about the `issue` itself: redhat is dragging along a bunch of people to push some kind of one-standard-for-all (cough-cough-bs-cough-cough-profit-cough). they want to unify some things (the article didn't really elaborate what ...), and therefore make all the stuff more the same.

        i don't know about you, but if i'd want everything to look the same, i could as well choose osx or winblows (nah, not really win, it's not ...). i
        • by MarkByers ( 770551 ) on Saturday April 22, 2006 @07:12AM (#15179988) Homepage Journal
          i understand that this will help to push linux into the streets blabla, but is this really what we all want? or is this the beginning of the end of linux as we know it?

          No. There will always be distributions that do it their own way despite what any standards organisations say. You will always be free to use these distributions. No-one can force standards into Free software (if you try, people can fork), but you can make the standards so good that distributions (and their users) want them. If people don't want them, they won't be successful.
    • I don't see why it is hard to have QT and GTK libraries on each system.
      Because:
      - it's an ugly design
      - it involves lots of code duplication
      - it sucks on lean platforms (for example Maemo)
      - it doubles your chances of being hit by a security flaw
      - it produces a lot of unmaintained basic infrastructure code (like VFS) where the implementation is the spec.
      - standards are a Good Thing
    • The standards in question are things like "how do I install a menu item in a way that works across distributions" and "how do I distribute C++ apps in ways that don't randomly crash" and "what libraries can I expect a Linux system to have".

      The whole "as long as the runtime libraries are there" catch is what it's all about. It's not reasonable to expect people to deal with dependencies.

      • The whole "as long as the runtime libraries are there" catch is what it's all about. It's not reasonable to expect people to deal with dependencies.

        It's the package maintainer's job to deal with library dependencies. If a Linux distro is unwilling to do this, why should I use it in the first place, since it is obviously of low quality?

        • World of Warcraft is not in most distros, is it low quality? Compared to, say, bzFlag?

          Don't get me wrong, I love a good game of bzFlag, but saying "If a Linux distro is unwilling to do this, why should I use it in the first place, since it is obviously of low quality?" isn't going to fly with most people ...

        • Well, it would lighten the load on tracking library dependencies, that's for sure. One of the easy ways of figuring it out is to try to compile the package in question in a chrooted environment and resolve missing libraries as you find them, but that's also a crapshoot if you're looking for its optional libraries without futzing around with the source code.
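
          If you want to automate that first pass, ldd already reports unresolved libraries, so the grunt work is scriptable. A rough sketch, assuming a dynamically linked binary and glibc ldd's usual "=> not found" output format:

            #!/usr/bin/env python3
            """List the shared libraries a binary needs that cannot be
            resolved on this system, according to ldd."""
            import subprocess
            import sys

            def missing_libs(binary):
                # glibc's ldd prints lines like "libfoo.so.1 => not found"
                out = subprocess.run(["ldd", binary],
                                     capture_output=True, text=True)
                return [line.split()[0]
                        for line in out.stdout.splitlines()
                        if "not found" in line]

            if __name__ == "__main__":
                for lib in missing_libs(sys.argv[1]):
                    print("unresolved:", lib)
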
    • I guess the issue isn't what widget set you use, but the fact that you have two stacks underpinning them with little sharing of libraries or settings. The day KDE & GNOME learn to play nice and share the same settings, desktop files, theme engine, and behaviour is the day Linux moves on from pointless widget wars (which no end user cares anything about) and focuses on delivering a high-quality desktop.
      • Well said. The fact that the Linux end-user even knows what "toolkit" a given program uses is geek-marketing gone insane. Talk about a totally irrelevant choice-point foisted on poor users.

        A typical Windows install has dozens of toolkits in use, and the user is none the wiser. This is mainly because most common settings are shared in the registry, providing a rough consistency -- not because the toolkits are so awesome.
  • I have to ask... (Score:4, Interesting)

    by MobileTatsu-NJG ( 946591 ) on Saturday April 22, 2006 @06:35AM (#15179900)
    This question is going to seem rude, and I apologize for this, but why didn't this happen years ago? I'm asking out of curiosity, not as a jab at the community. It seems to me that this sort of standard would have been quite valuable as soon as GUIs became prevalent with Linux.
    • by Pneuma ROCKS ( 906002 ) on Saturday April 22, 2006 @07:39AM (#15180037) Homepage

      I think GUIs, despite being prevalent for quite some time, have been very, very low on the priority list of Linux developers. The community has focused more on the low-level kernel and architecture areas, and the rest has suffered for it. IMO, GUIs in Linux have always been an afterthought, and that's the reason they suck so much (again, IMO).

      This sheds light on a key problem with open-source software: developers will work on what they want to work on, not necessarily on what needs to be done.

      Yeah, mod me down, see if I care.

      • by Kjella ( 173770 )
        Well, if I get to pick my preference it goes like this:

        1. Make it work
        2. Make it work well
        3. Make it work fast
        4. Make it work fancy

        Besides, I don't think it's wrong to say that for many, Linux has been and is only a server OS, and they couldn't really care about a GUI. But somehow I get the impression that those most deeply into the inner mechanics of kernels and drivers might not be the most qualified to make good GUIs either, so I'm not sure it's suffered that much. And even so, GUIs are really subjective and h
    • Once upon a time there was one major desktop environment for Linux, KDE.

      Some people didn't like the licensing of the Qt toolkit KDE was built on, so they started work on GNOME, which was to be the truly free alternative.

      The licensing of the Qt toolkit was changed. However, out of ego the GNOME developers continued, and so we ended up with two major desktop environments.
    • There are a few reasons why LSB compliance and the like isn't considered critical by many devs:

      1. Most distros use package managers -- which let you put all of an application's files wherever you want, so the application doesn't need to know where the right place to install itself is.

      2. Most distros compile their own binaries from source. So, if the original developer linked his binary version against glibc-2.0, and RedHat has v2.3, they'll just recompile it themselves, and distribute their own versi
  • Dell, HP, Toshiba, etc. etc. STILL package Windows with every new PC that leaves the shop. I have seen no indication that they plan on changing that any time soon. Sure... Dell might say he likes Ubuntu, but I'll believe it when the first Dell ships with Ubuntu and an Ubuntu sticker on the front where the Windows sticker used to be... you know, "This PC specifically designed for Ubuntu Linux."

    I don't know too many people that are going to go out and buy a white-box PC (other than geeks) and load Linux, when f
  • Yes, and consistency can only be achieved by standardizing. Unfortunately this doesn't only hold true for the desktop; it's equally or even more important for the applications. So far Jim Zemlin, executive director of the Free Standards Group, doesn't seem to realize this, or else the FSG would have already standardized on a single set of application guidelines as outlined in wyoGuide ( http://wyoguide.sf.net/ [sf.net]). Since this isn't the case, we still have to wait for the breakthrough of the Linux desktop.

    If any
    • Because unpopular software on Windows like Winamp religiously follows the Windows design guidelines. How the hell did this blatant turfing for his own, really tangential site get modded up? People will deal with new and non-standard apps quite well for the most part. This isn't about the interface presented to the user; it's about the parts that are common to all desktops, like menus and hooks to the WM.
  • Finally! (Score:3, Interesting)

    by yootje ( 770109 ) on Saturday April 22, 2006 @06:40AM (#15179911) Homepage
    What Linux needs is standardization. Having 921034 options to choose from is sometimes a good thing, but sometimes you have the feeling: why don't they all just work on one fantastic piece of software?
    • Re:Finally! (Score:5, Insightful)

      by Coryoth ( 254751 ) on Saturday April 22, 2006 @06:59AM (#15179953) Homepage Journal
      Having 921034 options to choose from is sometimes a good thing, but sometimes you have the feeling: why don't they all just work on one fantastic piece of software?

      Because the world's open source developers are not a giant slave pool designed to do your bidding.

      Open source will always be chaotic and involve a great deal of duplication, because that's the nature of the beast. The gain you get for that cost is much more open software that develops rapidly and tends to work as a free market for ideas: the better ideas eventually win out (though that may take some time). If you want something different, then you want Apple or Microsoft, with their rigid top-down control structure which ensures that everyone is working toward a single unified goal (as much as is possible) and that all the work is directed. The upside is consistency and a unified vision; the downside is that the whole thing is more locked up, with an often slower development cycle and a tendency to get hit with the same stupid mistakes release after release after release, just because it appeals to the guy at the top.

      It's a choice and you can pick the software ecology that suits your needs. Just don't go expecting one to behave like the other on your whim - there are deep fundamental philosophical divisions about how to develop software (to let it evolve from the bottom up, or direct it from the top down) that are largely irreconcilable.

      Jedidiah.
      • It kills me how on the one hand you guys go absolutely nuts about web standards, document standards, etc. -- "just code to the standard and it'll magically work!" is the mantra around here. But as soon as someone says that the Linux desktop or Linux distributions need to standardize on this or that, the tired old "but that would stifle choice!" line gets trotted out.

        Hmm, maybe that's what Microsoft thinks. They break standards because standards would just limit their choice...
        • It's simple: standards for communication and interoperability = good. Standards for implementation = bad.

          For example, HTTP is a communication standard, which is good. HTML is a communication standard, which is good. However, having one master implementation of the HTML standard (say, if we all decided to standardise our web browsers on IE) is bad - the implementation might, for example, be insecure or corrupt, or it might just work badly. You might have one person who wants to display the HTML in all it
        • It kills me how on the one hand you guys go absolutely nuts about web standards, document standards, etc. -- "just code to the standard and it'll magically work!" is the mantra around here. But as soon as someone says that the Linux desktop or Linux distributions need to standardize on this or that, the tired old "but that would stifle choice!" line gets trotted out.

          Oh, I'm all for standards for Linux; it's just that, given the development model, it's going to have to be standards for protocols, communication, e
    • First off, Linux is a kernel.

      As for the distros, yes, there is redundancy. It's annoying. I tried to tell Redhat and SUSE to merge but they refused. For the most part, outwardly they're all the same. You get some un-optimized, heavily modified kernel that you can't trace back to the vanilla source, and a plethora of pre-built tools with wacky --enable-* flags set. It's annoying and highly unproductive.

      As for the options, keep in mind unlike [say] Windows a Linux based distro can target a variety of actual real wo
    • Re:Finally! (Score:3, Insightful)

      by JanneM ( 7445 )
      why don't they all just work on one fantastic piece of software?

      Because there is no one answer to what makes a piece of software fantastic.

      When intelligent people can reasonably disagree on it, don't be surprised - or dismayed - when the end result is several divergent designs. That is truly a case where any one of the designs is good and, importantly, better than a compromise between them.

    • Re:Finally! (Score:5, Insightful)

      by TERdON ( 862570 ) on Saturday April 22, 2006 @07:12AM (#15179986) Homepage
      why don't they all just work on one fantastic piece of software?

      Because there couldn't be such a thing - it's an oxymoron.

      Basically, the requirements of the piece of software would be heavily contradictory - dead-easy to use, but still incredibly powerful. Few such programs exist - because they are virtually impossible to make.

      Example: file managers. On the one hand, you have Explorer, Finder, Nautilus et al., which are all at least relatively easy to use, even for a newbie. Many find them far too limited in power, especially on /., where the favourite is probably raw /bin/bash, which is far more powerful but also really hard to learn.

      The same principle holds for most other software. Either you make an easily usable version or a powerful one. The powerful version will, by definition, require a lot of learning on the part of the users, and thus can't be easily usable.

      When you try to unite these two conflicting requirements, the most likely outcome is one of:

      1) A cluttered interface, which intimidates the newcomer
      2) A clean interface, but with all the powerful features hidden away from sight, so the advanced user has to look for them.
      3) Millions of settings in an unmanageable settings dialog, toggling the different features on and off.

      Conclusion: one piece of software normally can't be the great software for every single user. The shifting requirements of different individuals will without doubt make them prefer different software - and that isn't really a bad thing. If everybody ran the same software, there wouldn't be as much incentive for developing new, powerful features!
  • by i_want_you_to_throw_ ( 559379 ) on Saturday April 22, 2006 @06:43AM (#15179917) Journal
    I have tried using Linux on the desktop MANY MANY times and always found myself stymied by getting printers to work and so forth. I have always been adamant about using it for servers, where it's very much worth the time to figure out Linux to get the benefits of it as a server product (bulletproof security, etc).
    As a desktop product, though, I wasn't about to spend all day dicking around trying to get it to work. That was then... this is now...

    I have been using Linux as a desktop for several months now and it has flawlessly detected all my peripherals, and I have now been able to spend more time doing development, which is what I get paid to do.

    Linux is getting better in this area and Linux is going to start making inroads. Slowly but surely...
    • Again, this is an example of faulty logic.

      You're saying that because your printer manufacturer hasn't followed the 25-year-old PostScript standard, Linux is broken? Why not buy "Linux ready printers" [some Samsung laser printers, for instance].

      After that driver install is easy and you basically print through CUPS.

      But again, this is totally a manufacturer problem not Linux. It isn't like Linus can force manufacturers to include Linux drivers for their non-standard proprietary shit.

      Tom
    • it's making inroads, but it still needs some more work done, IMHO. I'm trying Linux for the first time now on a second computer I have around (Ubuntu, to be more specific), and while I was pleasantly surprised at how usable it was right off the bat (even the internet worked), it has disappointed me so far with how many times I have had to resort to the command line from the beginning (to get MP3s working), and how hard it was to get something I would have thought would be as simple as installing a video driver to get dece
  • by gimpimp ( 218741 ) on Saturday April 22, 2006 @06:47AM (#15179926) Homepage
    We've had standards bodies for a long time. LSB, Freedesktop, etc -- none of them will help increase market share. Sure, they make life easier for developers, i.e. a GNOME icon theme will soon work on a KDE desktop. But the single major problem on Linux is dependency hell. I have nightmares about this.

    Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem. Until we can install and remove applications as easily as OSX users can, we don't stand a chance.

    If you were a new user to unix, what would you prefer:
    A) open synaptic, search the thousands of packages, hope you find what you're after, install it.
    B) download an app folder, drag it to your applications folder. go.

    Without this ease of use, there's no chance. I still laugh at people who say Linux is ready whilst at the same time they can't install the latest Firefox on their box because it depends on the latest gtk, which depends on the latest glib, which depends on....
    • by IamTheRealMike ( 537420 ) on Saturday April 22, 2006 @07:04AM (#15179963)
      Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem.

      Well, autopackage was designed to deal with many of the problems repository based distribution has, so, I would strongly disagree with the notion that it's just "a pretty frontend around the same problem". We've put many, many times more effort into things like reliable/easy installs than making it pretty (though there is still much to do).

      Without this ease of use, there's no chance. I still laugh at people who say Linux is ready whilst at the same time they can't install the latest Firefox on their box because it depends on the latest gtk, which depends on the latest glib, which depends on....

      This problem affects any OS. You can't install Safari on MacOS X 10.1 either, if I remember correctly. It's true that Linux suffers this problem worst of all, though, because there's no unified platform, and because there's no profit motive there is little incentive for developers to go "the extra mile" to reduce system requirements. But it's a separate (though related) problem from how you install software.

    • by Florian ( 2471 ) <cantsin@zedat.fu-berlin.de> on Saturday April 22, 2006 @07:10AM (#15179980) Homepage
      download an app folder, drag it to your applications folder. go.
      Unfortunately not. OS X programs often spread their files all over the file system, with a mess of binary configuration files, possible netinfo entries (akin to the Windows registry...), etc. There is no standard method in OS X to cleanly remove them - just deleting the application won't do the trick in most cases. Even Windows is superior in that respect.

      Besides, downloading binary code from somewhere on the Internet and installing it on your system is a security nightmare and a practice that should be abandoned ASAP. I find the Linux/BSD model of providing all software in distribution-provided repositories, blessed by the distribution's maintainers, vastly superior to OS X, with unmatched clean and safe installation, removal and upgrading of software. (How, for example, do you upgrade all your Mac OS X software with one command or click?) I use both Debian and Mac OS X and find Debian vastly superior in this respect.


      • OS X programs often spread their files all over the file system, with a mess of binary configuration files, possible netinfo entries

        Wha...? What you say may hold true for server software .pkg installs, but these days those are few and far between, and by definition they include a bill of materials that tells you where and what files are being installed (select "Show Files" during the install to see). Most OS X application software these days is .app drag-and-drop installs, and they will "spread their files
      • Well, the big players pushing LSB tend to be companies that don't want to let anybody see their source.

        I'm sure RedHat/SUSE/Debian/Mandrake/Gentoo/etc would be more than happy to build and package Oracle and Websphere so that they work perfectly on every version of their OS. However, to do that they would need to have the source, and Oracle and IBM don't want to give that up.

        For open-source software LSB isn't really a big deal. If they install their netscape plugins in the wrong place, then Debian will ju
    • so basically the thing you want is full support. someone who would test all the dependencies and stuff when every package is released. and you want it for free.

      good luck, i hope you get a ferrari for free too.
    • by i_should_be_working ( 720372 ) on Saturday April 22, 2006 @07:19AM (#15179998)
      If you were a new user to unix, what would you prefer:
      A) open synaptic, search the thousands of packages, hope you find what you're after, install it.
      B) download an app folder, drag it to your applications folder. go.


      You forgot the part in B) where you search the internet for the home page of the application. Then you read the home page trying to find out how to download it. Once you see the "download" link, you go through a couple of pages asking what version you want and what mirror you want to use. Then, after waiting for the download, you finally start the actual installation.

      Whereas with A) it's more like: Open Synaptic, use the search field to find the app faster than you would on the net, install it.

      I prefer option A. It's more convenient for me, and the repository-based system has other benefits I'd rather not do without. I can see where you are coming from, but different people prefer different things. I'm just glad the distros agree with me (or rather, I agree with them).

      And for the record, it's not the distribution or Linux devs who are stopping app folders from coming to GNU/Linux. They already exist. Nothing stops someone from bundling everything a program needs in a self-contained folder. That's how most of the proprietary apps I use are packaged. Open source devs could do this with their programs too, but it would be more effort without much benefit when the distros are going to package it anyway.
      • by asuffield ( 111848 ) <asuffield@suffields.me.uk> on Saturday April 22, 2006 @08:16AM (#15180114)
        Nothing stops someone from bundling everything a program needs in a self-contained folder. That's how most of the proprietary apps I use are packaged. Open source devs could do this with their programs too, but it would be more effort without much benefit when the distros are going to package it anyway.

        Actually, it's not because it's more effort. It's because it is fundamentally a bad idea.

        If you bundle everything you need into one blob for each application, then suddenly your system has installed several hundred copies of gtk, all at different versions. Obviously this is quite wasteful of space, but even that is not the real problem. This is:

        A security advisory was just released for all copies of gtk before a given version.

        What exactly do you do now? You don't know which of your hundreds of applications has got that code included in it. Even if you could figure it out, you now have to either rebuild all of those by hand (if you can), or go to each individual upstream developer and download an updated version from them. If you're a desktop user then you probably aren't going to get this done, so you'll be running with known security holes in some applications. If you're a sysadmin then you're probably going to find a new job.

        I would say that the ability to install security updates in a reasonably painless and secure manner is one of the most fundamental tests of any distribution method. Applications-as-self-contained-blobs fails it badly.
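
        To make that concrete, here is roughly the audit you'd face. This is only a sketch: the bundle root and the three-part version in the filename are assumptions, not a real layout.

          #!/usr/bin/env python3
          """Walk every application bundle looking for private copies of a
          library older than the fixed version. APPS_ROOT and the version
          scheme are hypothetical."""
          import os
          import re

          APPS_ROOT = "/opt/apps"     # assumed bundle location
          FIXED = (2, 8, 17)          # assumed first fixed gtk version

          PATTERN = re.compile(r"libgtk.*\.so\.(\d+)\.(\d+)\.(\d+)$")

          for root, _dirs, files in os.walk(APPS_ROOT):
              for name in files:
                  m = PATTERN.match(name)
                  if m and tuple(map(int, m.groups())) < FIXED:
                      print("vulnerable copy:", os.path.join(root, name))

        And that only finds the copies; you would still have to obtain a fixed build of each one.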
        • For the most part, Windows apps bundle all the DLLs they need to run with the installer. This is inefficient when it comes to bandwidth, but it isn't necessarily the mess you suggest it is, if the developer has done their job correctly.

          For a start, these DLLs should be installed into a shared location (Common Files, or the System folder). Secondly, most installers now warn you and ask before overwriting a file that is newer than the version being installed, and all is well.

          I don't see h
          • Couple of things -

            1. On Windows the bundled DLLs definitely cause problems. I'm sure I still have PCs in my home which are vulnerable to that gif/jpg/whatever vulnerability that came out a year or so ago (the one where the flaw was in a series of DLLs that got bundled and repackaged with just about everything). On Linux you use shared libs (which support multiple installed versions) and you can dodge this mess.

            2. The .so setup on linux is designed so that you can have multiple versions of the same libra
      • Actually, all that is needed is a background script that can (for example) parse the output of an attempt to run a program, detect "liblablabla not found" (or preferably "_somecall.o"), hit the repo database, and prompt the user with:

        "This application requires XXX to work, which is available from repository Y.
        Would you like to install it and try to run the application again?
        (PGP keys match)"

        With yum/apt/urpmi's existing capabilities (the latter two have almost always worked better for me, BTW) this shouldn't be a
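
        That loop is easy enough to prototype. A sketch of the idea: the "not found" pattern below matches the glibc loader's actual error message, and apt-file search is one real way to map a library name to a package on Debian-style systems (assuming apt-file is installed and its cache is current); the prompt-and-retry logic is simplified.

          #!/usr/bin/env python3
          """Run a program; if the dynamic loader complains about a missing
          library, look up which package provides it and offer to install
          it and retry. A simplified prototype of the script described above."""
          import re
          import subprocess
          import sys

          LOADER_ERR = re.compile(r"error while loading shared libraries: (\S+):")

          def run_with_rescue(cmd):
              proc = subprocess.run(cmd, capture_output=True, text=True)
              m = LOADER_ERR.search(proc.stderr)
              if not m:
                  sys.stdout.write(proc.stdout)
                  return proc.returncode
              lib = m.group(1)
              # apt-file maps file names to the packages that ship them
              hits = subprocess.run(["apt-file", "search", lib],
                                    capture_output=True, text=True).stdout
              pkg = hits.split(":")[0] if hits else None
              if pkg and input(f"{cmd[0]} needs {lib}, provided by {pkg}. "
                               "Install and retry? [y/N] ").lower() == "y":
                  subprocess.run(["sudo", "apt-get", "install", "-y", pkg])
                  return subprocess.run(cmd).returncode
              return proc.returncode

          if __name__ == "__main__":
              if len(sys.argv) < 2:
                  sys.exit("usage: rescue.py COMMAND [ARG...]")
              sys.exit(run_with_rescue(sys.argv[1:]))
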
      • Not arguing with you, but here are two points you may find interesting, if you are interested in up-and-coming package management.

        1. Look at the "Smart Package Manager". It's surprisingly good. It works with APT repositories (as well as YUM, APT-RPM, ZYPP, YaST2, whatever), and is actually a good deal smarter than apt. You can run them side by side, as well. Smart can be less strict about "broken" system configurations, which is quite useful. It also load-balances repositories based on response time.

        2. Look at klik:// . It's
    • I agree.

      You know, sometimes I wish I could just go to Help -> Check for Updates in Firefox on Linux as easily as I can on MS Windows. It's laughable that the most well known of open source software doesn't function as seamlessly on an open source operating system as it does on a proprietary Microsoft one.

      Hell, if my repository doesn't have the latest version of Opera (it doesn't), I say sod it and get it from the source, run Opera's 'install.sh' and I'm happy if it works (it does). Yet there's no safe way t
        You know, sometimes I wish I could just go to Help -> Check for Updates in Firefox on Linux as easily as I can on MS Windows.

        If you're using Ubuntu, Update Manager [osnews.com] will take care of the updating for you. You don't even have to ask it to check for updates, it does that automatically and notifies you [ubuntu.com] if there are any updates. Plus, it works the same for all of your software, not just one application.

        Other distros have similar things.

    • This might have been an issue years ago, but these days there isn't any serious "dependency hell" anymore. Tools like yum sort that out. As long as you pick a sane combination of repositories, things will "just work".
      For Fedora (the only one I'm familiar with), there's freshrpms, Dag and a few others that work great. For the distro I use (CentOS) I maintain my own repository, so all other users just have to click to get what they need.

      And if you want one-click install, have a look at Klik, which is now availa

    • by cozziewozzie ( 344246 ) on Saturday April 22, 2006 @08:00AM (#15180076)
      Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem. Until we can install and remove applications as easily as OSX users can, we don't stand a chance.

      We can do this already: Klik [atekon.de]

      The problem is that you end up with 200 versions of the same libraries, and the resulting memory and disk space overhead.

      That's why this sort of installation is generally used for easy testing of things instead of a sane installation procedure.
    • If you were a new user to unix, what would you prefer:
      A) open synaptic, search the thousands of packages, hope you find what you're after, install it.
      B) download an app folder, drag it to your applications folder. go.

      I have both Macs and Debian Linux boxes, and (A) wins, hands down, almost every time. It is *much* easier for me to find and install software on Linux than it is on a Mac. Uninstallation is a breeze, too. The "drag-to-install" idea is nice, but to make it really nice Apple needs to add

    • If you were a new user to unix, what would you prefer: A) open synaptic, search the thousands of packages, hope you find what you're after, install it. B) download an app folder, drag it to your applications folder. go.

      Download from where? How did you find the place to download from? You did a Google search to find it first -- on what term? the name of the software? the purpose? If the first step is to do a search in synaptic (btw... Adept in Ubuntu is nicer) that looks only for packages for your distro, ve
      • People who say that repositories are not up to date are not reasonable. Most people want software that has undergone some testing,

        Testing is done by the software's developer, not the packager. The idea that software in repositories undergoes any real testing, beyond what the original developer does, is a myth, nothing more.

        The real trouble with repositories is that they won't work at all for current software. They are great for yesterday's software, but if you read somewhere about the cool new version of some piece

        • first:
          The developer is not some lone wolf performing creative work in a vacuum. A smart developer will take all the help they can get from other sources. The whole point of open source is that testing is done by anybody with an interest. If you think no testing is done by packagers, then you know squat about Debian. In the Debian model, there are 'maintainers' (the people who put software into Debian packages) as a completely separate group from 'upstream' (the original developers). The maintainers try to make the
        • Testing is done by the software's developer, not the packager.

          BS!

          Repositories are just a tool to make your packages available. There are various degrees of public exposure and testing.

          The real trouble with repositories is that they won't work at all for current software; they are great for yesterday's software.

          BS!

          Either your packages are shiny new (and only tested by their developers) or they are old and ironed out. You cannot claim both.

          That again depends on the repository/distribution you point to.

          Debi
    • Option A, and I didn't even have to think about it. Installation management is the biggest strength Linux has over other operating systems. Instead of having to search all over the place for applications, and having no easy and consistent way to install, manage and update them, it's all done through a single interface that keeps track of everything.

      Why on earth would anyone think this is a weakness?
    • Just a couple of years ago, we would have discussions about how to do cool things with software. These days, nothing seems to matter because it's all about market share? Why should I care about market share? Why should you?

      I thought this was a site for nerds, not for people who play the stock market. What's important for us is that we have cool software to play with, not that we have cool software to sell to random people's grandmothers.

      Why don't you guys just fuck off back to business school with all your f
    • Repository based installation is NOT the way to go. Autopackage is just a pretty frontend around the same problem. Until we can install and remove applications as easily as OSX users can, we don't stand a chance.

      You already can: 0install.net [0install.net]. More easily, in fact, because Linux will automatically fetch the dependencies and check for updates. Things have moved on since the days of centralised APT repositories where you have to be root just to install something.

      Take a look at the screenshots [0install.net]!

    • If you were a new user to unix, what would you prefer: A) open synaptic, search the thousands of packages, hope you find what you're after, install it. B) download an app folder, drag it to your applications folder. go.

      You have skipped a common step. You go to Google and look for the package that does what you want. This is something that's actually harder to do in the non-free world, where people lie their asses off in the trade rags. Once you have the package name, installing software with synaptic, apt

    • Well, there's always klik://, which, IMHO, one-ups OS X. I say this as someone who is typing on my MacBook Pro (while lying in bed), with my PowerMac Dual 2.7GHz sitting in the next room.

      Furthermore, I think you exaggerate the problems of "dependency hell". I run SuSE on most of my systems (including my MacBook Pro), and I don't experience RPM hell anymore. I use the "Smart Package Manager", and in recent history I haven't experienced dependency problems, even for packages I download. Furthermore, unlik
  • by Psychotria ( 953670 ) on Saturday April 22, 2006 @07:09AM (#15179979)
    Unfortunately, those added software libraries differ among Linux distributors, making it hard to know if an application like a word processor will function on a particular Linux computer.

    What a load of rubbish...

    When I read a comment like this, I have to question a) the qualifications of the article's author; and/or b) their motives. Any assertions made in the article need to be critically examined and their validity questioned after such false hoohah.
    • don't take the word "function" too literally. This could be as innocuous as "does it look right?" -- akin to running a Windows 3.1 version of software on XP. It might work fine, but it certainly doesn't look or "feel" the same.

      The motive is to better the chances of Linux being on the desktop. This means that you're going to have some badly worded observations that say one thing to one group and something entirely different to another.

      It is the casual user who controls the market. They buy what is sold to them and what is
  • by penguin-collective ( 932038 ) on Saturday April 22, 2006 @07:29AM (#15180021)
    A few percent desktop marketshare is what Macintosh has. Seems to me that the "fractured" Linux desktop is doing pretty well already.
  • ...in Redmond tonight.
  • by standbypowerguy ( 698339 ) on Saturday April 22, 2006 @09:47AM (#15180445) Homepage
    At the end of TFA I found the following quote: "Installation by the user is easy..." Imagine that! An acknowledgement that linux installation is easy, published in a major media outlet. Hopefully this will encourage some folks to try linux. Installation of any OS may be beyond the "joe sixpack" crowd, but IMHO most linux distros' installation routines now rival or exceed Windows' simplicity, and you don't have to type in a long, cryptic CD key ;-)
  • This article sounded familiar, and sure enough, it was rejected on March 28th and has been sitting in my Journal ever since. While not the exact same article, the concept is the same.

    In my case it was Neil McAllister who penned the writeup for InfoWorld. For Neil's take on the subject, you can read it here [infoworld.com].

    Never let it be said that providing folks with recent information was ever a strong suit of this site. Unless you're counting the dupes.

  • UNIX wars redux (Score:4, Interesting)

    by Danathar ( 267989 ) on Sunday April 23, 2006 @01:05AM (#15183467) Journal
    I love LINUX... use it... endorse it... but...

    The fact of the matter is, NOT having standardized methods for things like graphical installation of software (like MS installer) is a BIG drag on desktop adoption.

    Having so many linux distros is good for competition between distros and innovation, but horrible for commercial software vendors wanting to create products that will be bought by many people.

    Graphical installers that pull software from repositories are still (generally) too complicated. I have to hand-hack X11 config files to get multi-monitor configurations to work. Stuff still just does not work "out of the box" as well as windoze in many important respects.

    Get ready... if Apple ever decides to use the LINUX kernel (unlikely), it should put a WHOLE lot of pressure on LINUX distros to clean up their acts.

    you can flame me now...I have my asbestos fire suit on
