
The Future of Packaging Software in Linux

Posted by Zonk
from the come-together-right-now dept.
michuk writes "There are currently at least five popular ways of installing software in GNU/Linux. None of them are widely accepted throughout the popular distributions. This situation is not a problem for experienced users — they can make decisions for themselves. However, for a newcomer in the GNU/Linux world, installing new software is always pretty confusing. The article tries to sum up some of the recent efforts to fix this problem and examine the possible future of packaging software in GNU/Linux."
  • The solution! (Score:5, Insightful)

    by Stormie (708) on Monday February 19, 2007 @01:43AM (#18064606) Homepage
    Why do I have a sneaking suspicion that the solution will be to create a sixth way of installing software, which will also not be widely accepted throughout the popular distributions?
  • Nonissue (Score:2, Insightful)

    by eklitzke (873155) on Monday February 19, 2007 @01:47AM (#18064624) Homepage
    The diversity of packaging formats is definitely a nonissue, because EVERYONE has the source code. Any software that is even moderately popular will be packaged by volunteers. If I need some software that isn't already packaged for me, I grab the source code and compile it. If it's something I plan on sharing with other people, I'll write a spec file and distribute the RPM I build.

    I understand that it would perhaps be more optimal if there was a single package format, but that just isn't going to happen. Debian-based distros have an enormous time investment in the .deb format, RPM distros have a huge investment in the .rpm format. Likewise for Gentoo, Arch, Slack, and all the other distros with their own formats. There are legitimate reasons for sticking to native formats because of distributions' build infrastructures and installation backends. As long as the source code is available, everyone will be able to install your software, and everyone will be able to use the format of their preference.
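    The "write a spec file and distribute the RPM" workflow mentioned above, sketched as shell commands (a minimal sketch; "myapp" and its version are placeholder names, and rpmdev-setuptree comes from the rpmdevtools package):

```shell
# Hypothetical RPM packaging workflow for a source tarball you want to share.
rpmdev-setuptree                            # creates ~/rpmbuild/{SOURCES,SPECS,...}
cp myapp-1.0.tar.gz ~/rpmbuild/SOURCES/
$EDITOR ~/rpmbuild/SPECS/myapp.spec         # Name, Version, Source0, %build, %install, %files
rpmbuild -ba ~/rpmbuild/SPECS/myapp.spec    # -ba builds both the binary and the source RPM
rpm -qpl ~/rpmbuild/RPMS/*/myapp-1.0-*.rpm  # sanity-check the package contents
```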
  • Re:Nonissue (Score:5, Insightful)

    by rsmith-mac (639075) on Monday February 19, 2007 @01:58AM (#18064696)

    I understand that it would perhaps be more optimal if there was a single package format, but that just isn't going to happen

    Then realize you're basically accepting that Linux will never make a significant dent in the Microsoft+Apple consumer desktop market. You may be able to compile the source code; the rest of us either don't know how or don't care. Either Linux is going to be an OS for users, or an OS for geeks. It can't be both, because geeks will try to escape an OS that is too user-centric, and users will escape an OS that reflects the inconsistency caused by groups of splintered geeks.

  • by croddy (659025) on Monday February 19, 2007 @01:58AM (#18064702)

    The reason Linux distributions have not been trembling to adopt the OS X style of package management, if you can call it that, is that it would be a poor fit for the Linux software ecosystem.

    The vast majority of software used on Linux systems is licensed under the GPL; what is not is almost always under another license permitting free redistribution. This gives Linux distributors great freedom in selecting and assembling a compatible collection of versions, tested and working with the same versions of dependent libraries. In a larger distribution (such as Gentoo, Debian, or Fedora), most of the software you will ever need is already a part of the OS -- you just need to use the built-in package management tools to summon it from the distributor's repository.

    OS X-style package management is best suited for a software ecosystem in which users draw software from a large number of heterogenous third-party sources, while the core OS and iLife suite are maintained and updated by Apple. A third-party distributor who wishes to distribute something that must link against a particular version of a library can include it in the application bundle, knowing that the exact version needed will be available. This can lead to many copies of the same libraries being installed, facilitating compatibility with applications that require different versions, but consuming (small amounts of) disk space unnecessarily and increasing the attack surface when multiple copies of an exploitable library are installed on the system. A system such as APT does not need to provide a facility for private copies of libraries, since it does all of the dependency computation, and all software in the repository is built and linked against the libraries in the repository.

    Certainly, once you have resigned yourself to visiting a third-party distributor's web page, manually downloading a binary package, and then manually installing the binary package, drag-and-drop installation is very convenient. But the Linux software ecosystem does not require this concession from the user -- the Linux distributor is free to provide a repository and tools for finding, installing, and updating software, without the need for manual installation.

  • by westyvw (653833) on Monday February 19, 2007 @01:59AM (#18064706)
    Debian and Ubuntu don't even get a mention on what they DO use? This article makes it sound like RPM is THE package management system. Give me a break, at least a mention that a similar package approach (and more successful IMHO) is used by Debian etc.
  • goddammit (Score:5, Insightful)

    by scenestar (828656) on Monday February 19, 2007 @02:00AM (#18064712) Homepage Journal
    We have apt and *.debs

    I'm not in the mood for a holy war right now, but for fuck's sake, Debian perfected package management a decade ago.
  • Re:Nonissue (Score:3, Insightful)

    by oberondarksoul (723118) on Monday February 19, 2007 @02:00AM (#18064718) Homepage

    The diversity of packaging formats is definitely a nonissue, because EVERYONE has the source code... If I need some software that isn't already packaged for me, I grab the source code and compile it.

    Most users don't know what compiling software is, let alone how to do it. While having the source available is absolutely a good thing, there needs to be an easier way for new Linux users to install software. With so many different formats, some specific to some distributions, some to others, and so on, it's almost impossible to keep track - if Linux is to gain more share on the desktop it needs to be more accommodating to new users, and the current situation frankly isn't.

  • by khasim (1285) <brandioch.conner@gmail.com> on Monday February 19, 2007 @02:13AM (#18064788)
    #9. Good point. Being able to easily roll back an "upgrade" that didn't work would be a very nice feature. So I've marked this as number nine.

    In fact, Ubuntu might be switching to the Smart Package Manager http://labix.org/smart/faq [labix.org] which seems to support this functionality.

    I also left out ...

    #10. Mark packages so that they will NOT be upgraded. The same as I can do with apt.
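    The apt-style "hold" mentioned in #10 looks something like this on a Debian-based system (package name is hypothetical, and the commands need root):

```shell
# Pin a package so upgrades skip it, then verify and release the hold.
echo "some-package hold" | dpkg --set-selections     # mark the package as held
dpkg --get-selections some-package                   # shows: some-package  hold
echo "some-package install" | dpkg --set-selections  # un-hold it again
```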
  • by AKAImBatman (238306) * <akaimbatman@gUUU ... inus threevowels> on Monday February 19, 2007 @02:13AM (#18064790) Homepage Journal

    As does clicking on a checkbox. Seriously, I don't need it to be any easier.

    Unless the software you want isn't in the Synaptic repository. Then it's hell on earth for the average user. The only response they get from support and developers is, "Why would you want to use software that isn't in the repository?"

    Actually, that's not true. There are plenty of other fun responses:

    "You should compile it from source."
    "The vendor should spend his time getting his software added to our repository!"
    "Use RPMFind. I'm a developer and I've never had a problem installing binary packages on the distro I work on." (Conveniently ignoring that when something breaks, the "developer" fixes it himself.)

    Not that there's much point in harping on this again. I'll just get the same, "U R STUPID", "You need to try distro XYZ", and "Everything is in my distro's repository!" answers I've gotten before.

    Blinders on, and full speed ahead cap'n!
  • by Overly Critical Guy (663429) on Monday February 19, 2007 @02:43AM (#18064990)
    OS X-style bloat? How about Linux-style bloat? If you run Firefox, OpenOffice, GIMP, and KIllustrator, that's four entire windowing libraries and widget sets loaded into memory, including libraries from two different desktop environments.

    I love your evidence though. "Appears to require 4GB of ram." Right, dude. Right.
  • Re:Nonissue (Score:2, Insightful)

    by Lord Kano (13027) on Monday February 19, 2007 @02:51AM (#18065030) Homepage Journal
    Either Linux is going to be an OS for users, or an OS for geeks.

    Exactamundo!

    Maybe this makes me a heretic, but I don't think that Linux should ever try to become a mainstream user OS. A part of the allure of linux is that it is hard. The learning curve makes users feel like they've achieved something by becoming proficient.

    If you made a distro, how would you prefer to spend your time? Would you rather deal with "XYZ_lib_devel is out of date, we need a package for the new version." or "How do I get AIM, ICQ and Yahoo messenger to work?", "WTF is this GIMP thing? Is it like pulp fiction?" and "How do I install MS Office on this"?

    LK
  • sigh... (Score:3, Insightful)

    by element-o.p. (939033) on Monday February 19, 2007 @02:53AM (#18065038) Homepage
    Why, oh why does everyone always have to gripe that "distro x doesn't do things the same way as distro y?"

    Linux, unlike proprietary, closed source software is about choice. That's what I LIKE about Linux--I can choose the way that I prefer, be that how to install packages, which desktop environment to use, which CLI shell to use, if Linux boots into a CLI shell or if it goes straight to X-Windows, etc.
  • by NeuroManson (214835) on Monday February 19, 2007 @02:55AM (#18065046) Homepage
    Look. If you want mainstream acceptance, then appeal to the mainstream. THAT is what will determine the best distro.

    One of the previous episodes of Drawn Together put it best:
    Spanky (to the TV Reviewer): No wonder you hate the show so much. You're everything we make fun of! You're a Jewish, conservative, pro-life, born again, overweight, Asian, homophobic lesbian broad who cuts herself!
    Reviewer: So?
    Spanky: So, maybe someone who doesn't happen to be a Jewish, conservative, pro-life, born again, overweight, Indian, homophobic lesbian broad who cuts herself might not be offended by our show.
    Reviewer: I have every right to tell people what I think of your show.
    Spanky: Yes! But people should know you're not our audience, asshole!

    You aren't making an OS to appeal to the guy in the cubicle next to you in the CS class in college. You're making an OS, by your own claims basically, to overthrow the evil overlords (AKA Microsoft, if you ain't got it yet). So why is this STILL a debate today?

    Keerhist, I'm a furry artist, and even I recognize the concept of a limited market margin, but I don't spend my time in debates and having epileptic fits or Tourette's outbreaks in order to try forcing non-furry fans to accept what I draw. Jeeze.
  • by wizrd_nml (661928) on Monday February 19, 2007 @02:58AM (#18065064) Homepage

    #1. It must make installing new software as easy as it currently is with apt.
    #2. The same for upgrading the software.
    #3. The same for removing the software.
    #4. The same for handling dependencies. Including the order in which dependencies must be installed.
    #5. The same for validating the installed software against the original software (checksums or whatever).
    #6. The same for re-installing the software over the existing installation when you accidentally delete or over-write something.
    #7. The ability to point the updater at your own repository or multiple repositories.
    #8. The ability to recompile (automatically) any software that you install for your specific hardware.
    ...and it must do all of this without telling me what it's doing, because I don't care what it does as long as the software then works.
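    For comparison, most of the points in the list above already map onto existing apt tooling; a rough sketch (the package and repository names are hypothetical, and debsums is a separate Debian package):

```shell
apt-get install some-app               # 1 & 4: install, with dependencies resolved in order
apt-get upgrade                        # 2: upgrade everything that has updates
apt-get remove some-app                # 3: remove
debsums some-app                       # 5: verify installed files against checksums
apt-get install --reinstall some-app   # 6: re-install over a damaged copy
echo "deb http://repo.example.org/debian stable main" >> /etc/apt/sources.list  # 7: extra repository
apt-get -b source some-app             # 8: fetch the source and rebuild it locally
```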
  • by pandrijeczko (588093) on Monday February 19, 2007 @03:03AM (#18065082)
    There are many ways software is packaged in the Linux world, I agree 100% on this issue. But I also know that until software becomes portable *across* distributions, the chances of Linux gaining a foothold in Joe User's mind and on his desktop will continue to be elusive at best. This is not good enough.

    You're making the somewhat dangerous assumption that a general policy of "one size fits all" is what the Linux user base both wants and needs - this is entirely incorrect.

    For example, as an experienced Linux user, the last thing that I want is a single, binary-packaged method of distribution of software. I use a source-code based distro called Gentoo which means that I get to compile the stuff I run my way on the basis that, if something goes wrong with compilation (as it does sometimes) then it's up to me to try to work out why. But the advantage is that I get to optimize all my applications the way I want to, all of them (hopefully) linked nicely to system libraries as they should be.

    Sure, this isn't the way Joe Public wants it, but if he wants something simpler then, great, good luck to him - use something simpler. I've used Ubuntu a couple of times and it seems to have a pretty good package management mechanism which I guess is based on the Debian system. (Please don't flame me if I'm wrong here, BTW, but Gentoo is the only Linux I really use these days so I fully admit to not being up to speed on other package management methods.)

    I have always wondered why bright minds, working for "free" and able to produce an OS that is giving corporations with big budgets a run for their money, cannot agree on how best to package software. To many users, we in the Linux world are still a bunch of jokes.

    This has absolutely *NOTHING* to do with "agreeing" to anything and you have totally missed the point of Open Source. Open Source is about a single or bunch of programmers thinking that they have a neat way of doing something with software and then making that software available for others to improve. Ultimately, if you're looking for Open Source software to achieve a specific task, then you probably have a number of different applications to choose from which will achieve at least some of what you want. This view of the world is typified by Vi and Emacs, for example, both of which at their heart are text editors but can be extended in certain ways to do a whole lot more. Consequently, some people prefer Vi, others prefer Emacs, that's just what happens when people get choices.

    Unfortunately, as things stand currently, you cannot come into the OSS world with a "Windows mindset". In the OSS world, you do not hand over some money and have a piece of shrinkwrapped software fall into your lap. Instead, you have to take some responsibility for your computer and what you run on it and there's an expectation that you take the time to research what's out there and decide what you're going to use and how you're going to use it. Nobody's forcing you to use Open Source - it's there if you want it but if you don't, then stick with Windows and enjoy it.

    Linux and OSS is *NOT* a fashion statement - it's not about being "cool" or different. If you use either, then be an adult and accept the ramifications of that decision. OSS will not come to you, you need to go to it.

    Sadly, it appears that because of bigotry, selfishness and ego, it will be a few more years before those that command authority in the Linux world wake up. I hope we'll still be relevant by then.

    Sorry, but now it is quite clear you've lost it - you're now sounding like a bitter little man who's frustrated with Linux and/or OSS but is not prepared to put in some effort to helping himself.

    "Bigotry"? Where? If you mean that certain people have rejected the Windows way of doing things and have decided to do things a different way, then surely that's their choice, isn't it? I really can't see how it's impacted Windows users in any way - apart from in a good way where OSS surel

  • by LesFerg (452838) on Monday February 19, 2007 @03:10AM (#18065126) Homepage
    You left out the parts that usually alienate new users;
    - Link it into the menu/desktop system
    - Also link in help or documentation, or at least a relevant URL

    Even somebody who has used Linux for many years and feels comfortable with apt, rpm etc. can still occasionally be annoyed as all hell when an application is installed and then you have to go searching all over the web to find some basic configuration guide, let alone finding how to start the app.

    In fact, maybe part of the packaging system could include linking in the wiki that everybody uses to tell others how they made the demmed thing work.
  • Re:The solution! (Score:5, Insightful)

    by M. Baranczak (726671) on Monday February 19, 2007 @03:15AM (#18065146)
    I would like to see GUI tools get the smarts to automatically figure out dependencies across all formats, allowing all distros to become package agnostic.

    The package formats are easy. The real bastard is that each distro has subtle differences in how the packages and the dependencies are organized. The only way that I can see to fix that is to design a universal package tree, and convince all the major distros to conform to it. Which is not impossible, but it ain't easy, either. And it might cause other problems.
  • by ploss (860589) on Monday February 19, 2007 @03:18AM (#18065148)
    #10 - The ability for 3rd parties to create their own packages that have the same advantages as being in the repository. For example, I currently download xyz from SourceForge as a .deb, and have it install. Great. Why not provide some notification of a new version, like a link to an RSS feed inside the package file that is checked on every apt-get update? Why not list it in synaptic, adept, yumex, etc.? Also, make it easy for the developer.

    #11 - The ability to have a user install their own package easily and transparently, under their home-dir (not applicable to all packages, of course.) Then, when that package is installed on the base system, it should also remove the user package and symlinks (/home/user/bin.) It's not cool to need root to run yum or apt, or require sudo privs when I just want to get something simple. This is especially important in a multi-user system.

    Just some additional thoughts.

  • by wordisms (624668) on Monday February 19, 2007 @03:18AM (#18065150)
    Dude, until I can click on setup.exe, and it just works, and then there is an "Uninstall Program" menu in the program folder on the program menu... I just don't have the time. I've used all 5 methods, and they are great for server management, but for general desktop use, people need click and run. Maybe CNR will take off.
  • by Anonymous Coward on Monday February 19, 2007 @03:21AM (#18065166)
    Any suggestions on what would make them even better?

    How about making it easy to install packages for yourself, without having any special permissions? I did that with .tar.gz's 10 years ago (./configure && make), but I don't know if it's even possible with a .deb, short of ripping apart the archive with ar and setting up an elaborate environment for the binary.
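    For what it's worth, a .deb is just an ar(1) archive whose payload lives in data.tar.gz, so a rough per-user install is possible without root; a sketch, assuming a downloaded "some-app.deb" (maintainer scripts won't run, and paths compiled into the binary may still point at /usr, so this won't work for every package):

```shell
prefix="$HOME/local"
mkdir -p "$prefix"
dpkg -x some-app.deb "$prefix"     # unpack usr/bin, usr/share, ... under ~/local
# equivalent without dpkg: ar x some-app.deb data.tar.gz && tar -xzf data.tar.gz -C "$prefix"
export PATH="$prefix/usr/bin:$PATH"
```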

  • by aero6dof (415422) <aero6dof@yahoo.com> on Monday February 19, 2007 @03:23AM (#18065174) Homepage
    I'm only exaggerating a little here, but no one really cares about the packaging format per se. They care that they can find, download, install, and run a package without hassles. Most formats take care of the mechanics of that process, but still need a community of people to track down and fix issues - mostly inter-package issues. Rpm and deb both have that kind of community behind them (both with sub-groups). If there is any technology to be improved here, it should be making package repositories better and reducing the workload of the supporting communities.
  • Re:Nonissue (Score:5, Insightful)

    by zzatz (965857) on Monday February 19, 2007 @03:52AM (#18065314)
    "...never make a significant dent in the Microsoft+Apple consumer desktop market."

    Linux will never make a significant dent in the Microsoft+Apple market by doing the same things the same way as Microsoft and Apple.

    Look at markets where Linux has succeeded, such as servers and embedded systems. Linux succeeds *because* it doesn't follow the Microsoft license model, the Microsoft development model, the Microsoft business model, and so on. You can't win if you play by Microsoft rules.

    Linux can be, and is, an OS for users. It isn't an OS for third party closed source binary distribution. Don't read that as non-commercial; commercial software was distributed in source form before Microsoft and will be again. Distribution in binary form makes sense for games and art, but not for general purpose computing. The value of doing things in software rather than in hardware is that software is malleable. But you need the source to realize the full value; binary distribution removes value.

    So yes, Linux will not make a significant dent in the Microsoft+Apple consumer desktop market, if that means the closed binary sales market. If Microsoft played in the NFL, they'd be the Super Bowl winning Colts. But the Colts will never win the World Cup, which is worth more. Don't complain about Linux not hiring a bigger front line when the game Linux is playing is soccer, and doing rather well at it.
  • by Chandon Seldon (43083) on Monday February 19, 2007 @03:56AM (#18065326) Homepage

    Why is there this obsession with the awful Windows package system? Have you legitimately used a repository-based system with a GUI?

    Setup.exe + an Uninstall menu item is strictly worse than, say, the way packages work in Ubuntu. If you want to just distribute a package file and have the user double click to install, that works great. But... there's also a giant fully-supported package repository.

    I guess it basically comes down to one thing: As they would say on Fark.com... "No you can't have Linux be an exact copy of your favorite version of Windows. Not yours. (Picture of pony)"

  • by gbobeck (926553) on Monday February 19, 2007 @04:04AM (#18065366) Homepage Journal

    Gentoo will never be mainstream until it supports a user friendly installation method like Vista's MSI files.

    I respectfully disagree with your statement.

    1. Gentoo already has a (somewhat) user friendly installation method. For packages, it happens to be called 'emerge'. For the Gentoo install itself, there is a graphical installer. Now, if someone out there would like to create a GUI interface to emerge, that would be cool.

    2. Gentoo doesn't really care about being mainstream. There happens to be a fairly large group of users who like bleeding edge. Gentoo is bleeding edge. Some people want to rice out their make.conf. Gentoo makes it easy for you to annoy everyone by ricing out your make.conf. (Hey, don't laugh, but all of my ricer settings in my make.conf file have really improved my server's performance by a whopping 0.0003%. That increase in performance makes doing absolutely nothing at idle go so damn fast!)

    By the way, .MSI is a (poorly) reverse-engineered version of the InstallShield Installer.
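    The make.conf "ricing" joked about above usually amounts to a few lines in /etc/make.conf; the variable names are real Portage settings, but the values here are illustrative, not recommendations:

```shell
CFLAGS="-O2 -march=athlon-xp -pipe"   # tune compiles for this exact CPU
CXXFLAGS="${CFLAGS}"
MAKEOPTS="-j2"                        # parallel build jobs
USE="X gtk -kde -qt"                  # compile optional features in or out
```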
  • by Anonymous Coward on Monday February 19, 2007 @04:05AM (#18065372)
    You make some good points, but then:

    Certainly, once you have resigned yourself to visiting a third-party distributor's web page, manually downloading a binary package, and then manually installing the binary package, drag-and-drop installation is very convenient. But the Linux software ecosystem does not require this concession from the user -- the Linux distributor is free to provide a repository and tools for finding, installing, and updating software, without the need for manual installation.

    Most people would not consider being able to download a program from the web and just run it a "concession".

    Many of the great innovations I've seen have been some variant of "this can fail in way X ... instead of pretending this is a rare case, admit that it's common, and design around that". Heck, open-source has been described as just that: "Open-source software has fewer bugs because it admits the possibility of bugs".

    Why do so many open-source system designers pretend that any program you will ever want to use will be in the distro's repository? It's not true today, and it will probably never be true. Pretending that it is will only serve to distance you from the needs of real people.

    I use Debian and apt is pretty neat (and RPM sounds decent too), but why can't we have a system where I can click on a "some-app.source" file on a web page, which contains source code and dependencies, and it automatically installs the deps, compiles the source, and installs that using my native packaging system (so it can check this same webpage for updates later)? Would that really kill you?
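    Debian already gets partway toward that wish for anything that is in the archive; a sketch of the "fetch source, resolve deps, build, install natively" pipeline (package name hypothetical, run with root where needed):

```shell
apt-get build-dep some-app    # install whatever the build requires
apt-get -b source some-app    # download the source and build a .deb from it
dpkg -i some-app_*.deb        # install it through the native package system
```

    The missing piece is being able to point this pipeline at an arbitrary URL instead of the distro archive.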
  • by davmoo (63521) on Monday February 19, 2007 @04:27AM (#18065498)
    Why is there this obsession with the awful Windows package system?

    Because my 72 year old mother can, and does, install programs herself in Windows. If it requires anything more complex than "double click on setup.exe" or "double click on the program icon when you save it", you've lost her completely and I have to tunnel in to her machine or make a 125 mile drive.

    In the course of my work, I use Mandriva, Redhat, and Slackware distributions (I have never been able to get everyone else's darling Ubuntu to install on any machine I own or control). I would not dare let my mother install something on any of the three. Hell will freeze over in ice multiple feet thick before she would understand things like "differences in the file tree", version dependencies, etc etc.
  • by value_added (719364) on Monday February 19, 2007 @04:32AM (#18065524)
    OS X-style package management is best suited for a software ecosystem in which users draw software from a large number of heterogenous third-party sources, while the core OS and iLife suite are maintained and updated by Apple.

    Reading the above (which, incidentally, is little different than the traditional Windows ecosystem) raises the question whether this defines what end users demand and expect of package management, which is that they can install anything and everything when and how and from wherever they want.

    The article presents the issue as follows:

    Mike Hearn ... claims that the project did not gain enough popularity ... because of the dominating doctrine which states that it's the distributor who should be responsible for the whole operating system. It is kind of similar to another popular opinion: that the administrator (root) should be responsible for the whole machine, and all the administrative operations should be performed from a separate root account. These kinds of centralized management systems fit the server market, but they don't really reflect the desktop reality that well. On desktops, administration is only one of the roles the user plays, and waiting for half a year for a new version of an application is often unacceptable. It is often a decade in the free software world!
    So if the "desktop reality" conflicts with the "server reality", then there may be a problem. I'm inclined to believe that, ignoring the comfortable niche of OS X, the above is mostly rubbish and the problem lies in the desktop reality itself. For example, I doubt anyone could disagree that the prevalent desktop reality of the Windows ecosystem has led to an unending stream of security issues facilitated in part by the bad habits of their users who are encouraged to be just that, desktop users, and similarly encouraged to be oblivious to the underlying reality that administration is a fact of life, irrespective of patience, laziness, or knowledge.

    Personally, I don't see a problem with single or multiple centralised management systems, and I certainly don't see a problem with a root account. To say that what's good for the server is good for the desktop is simply redundant, ignoring the questionable premise that there exists such a distinction.

    People are free to choose a distribution based on their particular needs and preferences. Perhaps they need to be reminded from time to time of the tremendous effort made on their part by the maintainers and settle for a less than perfect or ideal world?
  • Re:Nonissue (Score:3, Insightful)

    by HBI (604924) <kparadine&gmail,com> on Monday February 19, 2007 @04:39AM (#18065558) Homepage Journal
    Not very popular around here, apparently, but I'd much rather it stay a geek OS.

    It's nicer that way.
  • Re:The solution! (Score:3, Insightful)

    by kripkenstein (913150) on Monday February 19, 2007 @04:48AM (#18065588) Homepage

    The package formats are easy. The real bastard is that each distro has subtle differences in how the packages and the dependencies are organized. The only way that I can see to fix that is to design a universal package tree, and convince all the major distros to conform to it. Which is not impossible, but it aint easy, either. And it might cause other problems.
    Regarding the issue of dependencies, they might just be ignored completely for desktop applications (the focus of TFA). For server apps, you want to rely on shared libraries in your distro's repositories, but for the odd desktop app, the simplest solution may be to just include all dependencies inside it. Yes, this creates some security issues (patching the included dependencies), so it isn't a perfect solution. Also it can waste some RAM. Still, if a new user has to have some brand-new desktop app, this might be the easy way to do that.

    Other related issues of installing in various distros with various directory setups can be solved (partially, at least) with app virtualization. I believe Klik is doing something along those lines.
  • by burner (8666) on Monday February 19, 2007 @05:06AM (#18065674) Homepage Journal
    Sure, she can do the install, but what about getting updates and bug and security fixes? And what of spyware?

    I much prefer (as do my sister and mother) the simplicity of going to the Applications menu and clicking the entry for "Add/Remove...". You can browse around or search for a particular program by type or name. Click a checkbox, click OK, it's installed, unclick a checkbox, click OK, it's gone from your computer.
  • by Wiseman1024 (993899) on Monday February 19, 2007 @05:24AM (#18065752)
    The need for package managers in Linux is a consequence of a design defect. First, there's the "lol freedom" philosophy of not having one, two or even three different OS setups and layouts, but a gazillion of them, which causes trouble.

    Second, there's the FHS, which is the worst idea to ever make it to Unix. You spread your application files like you deal cards in some card games, being completely unable to copy or relocate them, pack and unpack them effectively, or install several versions of the same program, besides being illogical and semantically wrong in many parts.

    Third, there's the defective LD_LIBRARY_PATH behaviour that makes "." mean the launch directory, not the application directory (holy retarded idea, Batman!). This means you can't rely on putting copies or hardlinks of the required libraries in the application executable directory to keep everything using the right libraries and versions and make them easy to distribute. This led Mozilla to find a workaround with a shell script. When you run Firefox or Thunderbird on Linux, what you're running is a shell script (requires an extra sh instance) that properly sets the environment for the software to be able to use its own libraries regardless of the crap you may have in your FHS "boxes of random crap". Consequently, software like Firefox and Thunderbird do not require a package manager (in fact, PMed versions of them are usually spoiled and crappy), and can be safely copied, relocated, or made to coexist with other versions.
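    The Mozilla-style wrapper described above boils down to a few lines of shell; a minimal sketch, with "app-bin" as a placeholder for the real binary:

```shell
#!/bin/sh
# Launcher that prefers the application's own bundled libraries.
appdir=$(dirname "$(readlink -f "$0")")   # directory the script lives in
LD_LIBRARY_PATH="$appdir${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
exec "$appdir/app-bin" "$@"               # replace the sh instance with the app
```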
  • Re:The solution! (Score:2, Insightful)

    by muuh-gnu (894733) on Monday February 19, 2007 @05:51AM (#18065858)
    > Like when you go to the dealership, you pick the car model, the engine, color and such.

    When you're a newbie searching for a Linux distro to check out, it isn't any different. You either just pick the one you saw working nicely somewhere, or go with one of the widely recommended biggies: Fedora, Ubuntu, SUSE. They mostly provide sane and easy-to-learn defaults, and do not confront the newbie with stuff like vi/emacs, kernels, shells and window managers unless he _deliberately_ goes looking for them.

    Saying that the mere existence of some obscure source-based distro or window manager he wouldn't even be able to install makes the first choice so difficult for a newbie that he shuns Linux altogether is like saying people would stop driving and buying automobiles because of the existence of limousines, sports cars, monster trucks, forklifts, lawn tractors, and so on. You do not get confronted with those at a car dealership. You also don't get confronted with advanced-level distros and apps when you're a newbie.
  • Re:GNU/* (Score:1, Insightful)

    by Anonymous Coward on Monday February 19, 2007 @06:01AM (#18065908)
    It's not so hard to write a compiler, linker, command interpreter, etc

    No? Which of them have you written?
  • by cortana (588495) <samNO@SPAMrobots.org.uk> on Monday February 19, 2007 @06:37AM (#18066010) Homepage
    Dear lord, no. Bad & outdated third-party documentation is the software world's biggest usability problem.

    Software packages should *include* the upstream documentation. That way, the user gets correct documentation that matches the version of the software they installed. If the documentation is very large, it can go into a separate foo-doc package.

    The other advantage is that people using the software offline can access its documentation. :)
  • Its very simple (Score:2, Insightful)

    by Stevecrox (962208) on Monday February 19, 2007 @07:19AM (#18066174) Journal
    Installing things in Linux is confusing and hard for a new user (be they computer illiterate or not). Windows has an incredibly easy installation system that even complete novices can understand, and OS X has a simple way of installing that people can understand.

    Linux has five, none of them simple. Give me something simple that doesn't involve typing sudo something-or-other and I'll take to it. Why should I have to deal with the source code at all? When I get open source products on Windows, I get an installer that installs the application and puts the source files into a folder for me. I like that.

    You guys may love the various install methods, but give me and the average Joe a simple way to install and get used to the OS first.
  • by cortana (588495) <samNO@SPAMrobots.org.uk> on Monday February 19, 2007 @07:23AM (#18066194) Homepage
    Not to mention https://bugzilla.redhat.com/bugzilla/show_bug.cgi?id=119185 [redhat.com] which is still not really fixed.
  • Re:The solution! (Score:2, Insightful)

    by juergen (313397) on Monday February 19, 2007 @09:16AM (#18066654)
    Now that's one of my pet peeves.

    Why the fsck do some users insist on installing software outside their own distributions?

    1. They have a weird distro. Switch. All modern distros include virtually everything a common user needs.

    2. Common users should stay away from bleeding-edge versions. Treat them as nonexistent, just as you don't see half-finished products in the MS world. (Well, ...)

    3. They have more than basic or intermediate needs. They are not common users and should be able to cope with the current situation.

    4. They just don't know that they can get all they want from their own distro, and follow instructions on a product's webpage that are directed more at packagers. No wonder they are frustrated.

    Another *unified* approach will not work in a world where distros mainly distinguish themselves by doing (package) management differently. It'll just create more fragmentation, like another LSB failure.

    Few users expect to be able to install Windows software on Macs either, and nowadays it is even the same hardware. The key to winning this is education, and letting distros compete on their core business, not forcing a one-for-all solution on everyone.

    -Jürgen
  • Re:The solution! (Score:3, Insightful)

    by trianglman (1024223) on Monday February 19, 2007 @09:33AM (#18066760) Journal

    and do not confront the newbie with stuff like vi/emacs, kernels, shells and window managers unless he _deliberately_ looks after them.

    Unless, of course, the newbie does want to install a software package that doesn't come pre-installed with the chosen distro. Once that happens it is a crap shoot at best whether it is even possible to install the new software, which is what this article is all about. What needs to happen is for the core Linux user base to get over their egos (I know I am going to get an "Are you new here?" for that) and realize that ease of use for the general public is more important than doing things the way they have always been done. Windows has gone far past the opposite extreme and made everything so easy that one can't do anything, but there is a middle ground. Doing things in an interoperable manner wouldn't preclude custom, build-it-yourself distros; it would simply make it easier for newcomers to start and get the software they want to use.

  • Re:The solution! (Score:4, Insightful)

    by trianglman (1024223) on Monday February 19, 2007 @10:00AM (#18066926) Journal

    packaging is the most major part of a distribution's job.

    This is one of the places Linux gets it wrong. My operating system should not be responsible for all the software I might at some point want to install. Windows messes this up too at times (IE), but MS is much less of an offender than Linux is. The OS should be responsible for making it easy to install new software, among many other things, but it should not be responsible for every software program out on the web.

    An operating system should be responsible for the kernel, file system, and the nuts and bolts of keeping the system running in general. The program creators should be responsible for packaging so that their software can be installed (with the help of the operating system), and should also be responsible for dependencies. It should not be my job to spend three hours searching the web for some obscure package that the program creators just couldn't do without. If they see it as necessary, and they know it's not readily available, they should package it with their own program (GPL and BSD licenses both allow this, and it is one of the strengths of these licenses).

  • Re:The solution! (Score:2, Insightful)

    by Serious Callers Only (1022605) on Monday February 19, 2007 @10:12AM (#18067004)

    That's exactly how Microsoft tried to solve it and why their distributed software is huge. It also has led to problems of competing dlls which are often incompatible. If you think you have dependency problems now, just wait until you implement your idea! Imagine installing 16 applications each having a dependency on 16 different versions of the same lib.


    The idea of the post you're replying to was to package dependencies inside the application. That way, you can indeed install 16 applications each having a dependency on 16 different versions of the same library, with no problems whatsoever.

    OS X does it this way right now, and it works very well; apps are sometimes a few MB bigger (not a huge amount for most apps), but it's a huge gain in convenience for the user, who shouldn't even have to know what a dll/shared library is, let alone worry about it. As the grandparent pointed out, packaging applications should be a job for developers, not the OS.

  • Re:The solution! (Score:3, Insightful)

    by trianglman (1024223) on Monday February 19, 2007 @10:17AM (#18067032) Journal
    Why do Linux fundamentalists believe that all users are idiots and should go somewhere else? Until the majority of Linux users and developers get past this mentality, we will never see Linux accepted into the mainstream desktop market.

    Yes, most general users can find all the software they need in a single distro, but most users don't know Ubuntu from Fedora from SUSE. If they pick a distro that doesn't include a software package they want, it shouldn't require uninstalling the OS and installing a new one. Nor should distros have to include every single piece of software a user might want; if they stopped doing this, distros wouldn't require 5+ CDs or a DVD or two.

    Now, don't get me wrong, I appreciate having most of the programs I will need available on a set of five CDs, but this shouldn't be a requirement of distros. MS took that mentality with lots of things (specifically IE and the media player) and got slammed with anti-trust lawsuits. Non-system software should be independent of the operating system; making other software easily available is a feature of the different distros, but you shouldn't have to pick a distro based on the software that comes prepackaged.
  • Re:The solution! (Score:5, Insightful)

    by mrsbrisby (60242) on Monday February 19, 2007 @10:40AM (#18067166) Homepage

    Which is why, as it currently stands, this year will not be Year Of The Linux Desktop. Consumers won't just accept that they can't install software X because it's an RPM and alien doesn't work
    Now my daughter just received a "game" on Windows - a brand new (2007) game that insisted on running in some "compatibility" mode in Windows, and in a resolution that her LCD display couldn't cope with. The fact is that Windows users have run into this problem attempting to install software that isn't for their particular operating system, and have flailed around on the Internet for a few hours. They just assume that Linux users run into the same problem.

    They don't. Linux users install software out of their software catalog. Occasionally the brave ones go to the author's website, and download the software from there.

    This is also one reason people are willing to pay for an operating system that has a standardized and dependable way of doing things.
    Bzzt. Wrong. Nobody is willing to pay for Windows; that's why Microsoft doesn't let OEMs give you a choice. Duh, I'll use the Windows I already bought. And don't spread that lie about how I don't have a license to.

    Microsoft even released the WiX toolkit that allows anyone to create MSI installer packages.
    But not the MSI format specification, which would allow me to cross-compile into an installable package. As it stands, my users who run Windows have to deal with no installer.

    MSIs are one of the best ideas for Windows in a while ... No more dealing with poorly-written homebrew installers or 10-year old, 16-bit InstallShield programs.
    You're wrong, and you want proof? Look how many programs - even programs that come from Microsoft - are still distributed as exe files. That shiny new Zune's software comes in exe form.

    Once that 16-bit InstallShield program was written, it's supported forever. You can't put the setup.exe genie back in the bottle, and you have to live with that. With Free Software, we can take our software library with us, which is why Free Software always gets better and non-Free software atrophies.

    Instead you have a fully scriptable installer that's transaction-based and has near 100% support coverage.
    You are wrong on all counts. Pull the power plug while installing and you'll see just how transactional it is. I don't even think you know what coverage means: Microsoft Support will tell you to reinstall your operating system if a broken/corrupt/poorly-written MSI breaks your system. Even if they make it.

    I like apt, but downloading a gzipped file of source or a deb that complains about dependencies still can't compare to an MSI package.
    No, of course not, but that's why you used a straw man. MSI is an executable format, and it just made Microsoft's security problem worse: it introduced yet another executable file format. Nobody downloads a "gzipped file of source or a deb that complains about dependencies", ever. They say "apt-get install xyz" and it goes and figures out the dependencies itself.

    It doesn't have to - Linux users could waste disk space by including the dependencies with every program, and some Linux distributions even do this (!), but that makes upgrades very difficult. For example, when libz had a vulnerability discovered, only one copy needed to be upgraded on most Linux systems. On Windows, almost every program that dealt with gzip or deflate-compressed data (like PNGs or zip files) needed to be upgraded. Worse still, that library or program can be anywhere on your hard drive, and you might never know it.
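    The "figures out the dependencies itself" step mentioned above can be illustrated with a toy resolver in POSIX sh. The package names and the fake repository are invented for illustration; real apt additionally handles versions, conflicts, downloads and signatures.

```shell
#!/bin/sh
# Toy dependency resolver: a fake repository maps each package to its
# dependencies, and install_pkg walks them depth-first so the user never
# has to hunt for a dependency by hand.
deps_of() {
    case "$1" in
        xyz)    echo "libfoo libbar" ;;   # invented package names
        libfoo) echo "libc" ;;
        libbar) echo "libc" ;;
        *)      echo "" ;;
    esac
}

install_pkg() {
    for dep in $(deps_of "$1"); do
        install_pkg "$dep"                # resolve dependencies first
    done
    case " $INSTALLED " in
        *" $1 "*) ;;                      # already installed: skip
        *) INSTALLED="$INSTALLED $1"; echo "installing $1" ;;
    esac
}

INSTALLED=""
install_pkg xyz   # installs libc, libfoo and libbar before xyz itself
```

    The depth-first walk plus the "already installed" check is why shared libraries work under this model: a dependency needed by several packages is installed exactly once.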
  • Re:The solution! (Score:3, Insightful)

    by Jesus_666 (702802) on Monday February 19, 2007 @11:11AM (#18067364)
    It should not be my job to spend three hours searching the web for some obscure package that the program creators just couldn't do without.

    Hence package management. You either distribute everything statically linked (which means that if a critical bug is found in a library, all programs using that library need to be updated) or you make the programs and the libraries they use separate packages, which allows fixes to be distributed more easily. With a good package repository you should never need to hunt down anything.

    Linking everything statically would mean no problems - provided that all programs come with an auto-update routine, that the auto-update routine is implemented in a way that makes sense to deploy on a sensitive network, that all authors keep up to date regarding the libraries they use, and that fixed builds are made available as soon as possible.
    As long as that scenario is not reality, package managers are useful.
  • Re:The solution! (Score:3, Insightful)

    by MBGMorden (803437) on Monday February 19, 2007 @12:31PM (#18068060)
    Last time I tried Ubuntu, "apt-get install package_i_want" failed to locate the program more than half the time. I use Gentoo, and while Portage has the largest selection of "packages" available that I've seen, even it does not contain everything.

    Relying on distros for your software has led to the sad state we're in now. I don't rely on Microsoft to hand-stamp and prepare every piece of software I use on Windows, and I certainly shouldn't have to do the same on my Linux machine. Until we get a method by which I download a file, click on it, and install a program (regardless of which distro I'm running or which version of GTK I'm running), Linux will lag behind. SEVERELY.
  • by ElleyKitten (715519) <kittensunrise@gmail . c om> on Monday February 19, 2007 @02:53PM (#18070164) Journal

    Cause, meet effect. How easy is it for someone to create a closed-source application (Tool of the devil, I know) they charge money for (Anti-Christ! Burn!) and distribute it in a way that's easily installable on most linux systems?
    Stick a deb and an rpm on a CD. Charge for the CD. You could also put it in a commercial repository. Linspire's CNR will soon be available for all the major distros, and it handles payment.
  • by bcrowell (177657) on Monday February 19, 2007 @03:37PM (#18070876) Homepage

    The reason Linux distributions have not been trembling to adopt the OS X style of package management, if you can call it that, is that it would be a poor fit for the Linux software ecosystem.
    In addition to the reasons you mention, there are at least two other reasons why it's a poor fit to Linux.

    One reason is that the typical Linux user is running 90+% OSS, while the typical Mac user is running 90+% proprietary software. The individuals and organizations distributing OSS are generally giving it away for free on the internet (although of course some people do pay money for free-as-in-speech software); since they don't have a revenue stream from the software itself, they have to keep their bandwidth costs low, and that means they don't want to serve up a 30 MB binary, of which 28 MB is someone else's libraries. Also, users who are downloading OSS don't want to download 30 MB when they could download 2 MB.

    The other reason has to do with security. The standard Linux security model is that when a buffer overflow is found in a library, you just update the library, and the problem is solved. That doesn't work with statically linked binaries.

  • Re:The solution! (Score:3, Insightful)

    by mstone (8523) on Monday February 19, 2007 @05:58PM (#18073138)
    ---- Why the fsck do some users insist on installing software outside their own distributions?

    Uh.. because they want to?

    Because they think the software might do something they want?

    Because they think an OS should be able to.. oh, I dunno.. run software?

    Because they want to choose software for themselves, not leave the decision to some central committee that will tell them what options they're allowed to have? [1]

    Because you've got the question precisely backwards, and the one that really matters is the user's question of, "does this OS/distro let me do what I want?"

    Let's face it, if the answer to that question is, "no," you can pretty well forget about the world beating a path to your door. But that's okay, because apparently you don't want them there, anyway.

    Stupid damn users.. thinking they should be allowed to choose software..

    [1] - Seriously dude.. drop the "why can't they just do what we tell them to" meme.. NOW. Friends don't let friends evolve into spokesmen for the RIAA/MPAA.
