Linux Software

Building A Better Package Manager 431

SilentBob4 writes "Adam Doxtater of Mad Penguin has published a preliminary layout for his proposed cross-distribution package manager, capable of adding and removing software from any locale. He suggests the interface will allow installation of several major package formats, including RPM, DEB, and TGZ, as well as source code, with the ability to pass build-time options. All of this will come at the price of standards, of course, including naming, documentation, and package structuring. If this idea were to catch on, it would signify a major leap in desktop Linux usability. This is a project that UserLinux might benefit from. Read the full column here (complete with GUI mockups)."
This discussion has been archived. No new comments can be posted.

  • Autopackage? (Score:5, Informative)

    by Deraj DeZine ( 726641 ) on Tuesday February 10, 2004 @01:06PM (#8239047)
    So this is a similar effort to Autopackage [autopackage.org] except that it plans on using the native package formats? Intriguing...
    • Sounds like everyone is looking for something like the FreeBSD ports collection [freebsd.org].
    • Re:Autopackage? (Score:5, Insightful)

      by pubjames ( 468013 ) on Tuesday February 10, 2004 @01:56PM (#8239701)
      So this is a similar effort to Autopackage except that it plans on using the native package formats?

      Except that this guy has just stated the idea and made a couple of mock-up screenshots, whereas the autopackage guys are coming up with a complete, sensible solution and are leaving the interface until the end.
    • Re:Autopackage? (Score:5, Informative)

      by IamTheRealMike ( 537420 ) * on Tuesday February 10, 2004 @02:17PM (#8239929)
      By the way, it's funny this should be mentioned now, but autopackage.org is in the middle of a DNS propagation - it was switched to point to sunsite.dk literally hours ago.

      For now, if it doesn't work, use autopackage.sunsite.dk [sunsite.dk] and bear with us as we fix up the broken links etc.

    • Re: A-A-P? (Score:3, Informative)

      by oever ( 233119 )
      Here's a similar project, funded by a Dutch institution:

      A-A-P [a-a-p.org]

  • I hope they will include Ebuilds [gentoo.org] as they are a great way to install software, and are becoming more and more popular as Gentoo does.

    (No, this isn't a troll, I'd really like to see that. :) )
    • by The Night Watchman ( 170430 ) <smarotta AT gmail DOT com> on Tuesday February 10, 2004 @01:20PM (#8239227)
      My main recommendation for Gentoo...

      Let's say I update my Portage tree, and then I want to upgrade a package - GAIM, for instance. GAIM's dependencies are GTK and a bunch of other stuff. When I try to upgrade my version of GAIM and there happens to be a newer version of GTK available, Portage will upgrade GTK first, regardless of whether you actually need the very latest GTK to run GAIM. I'd rather see Portage track the minimum version a dependency needs to be for a program to run. As far as I know, it'll just upgrade everything in the dep tree.

      Unless I'm mistaken, at least. I've been using Gentoo for a while now, and for the most part I just do an "emerge -u world", which takes care of me pretty nicely. It just takes a while.

      • I use Gentoo also, but I must admit that doing "emerge -u world" scares the crap out of me. I'm always scared of messing up my system. But I've been lucky so far.

        • I use Gentoo also, but I must admit that doing "emerge -u world" scares the crap out of me. I'm always scared of messing up my system. But I've been lucky so far.

          I can certainly relate, but so far, I haven't had any emerge world trouble on any of the Gentoo machines I've set up. Occasionally it'll fail and you'll have to wait for another portage tree to fix the problem, or you've got to go in and tweak something, but I've found the online message board to be extremely helpful in troubleshooting things like...
        • by Tony Hoyle ( 11698 ) <tmh@nodomain.org> on Tuesday February 10, 2004 @03:19PM (#8240645) Homepage
          Last time I tried that it'd upgraded glibc, screwed it up, and I had to fdisk/reinstall to get the damned system back.

          My major gripe with portage is that it doesn't record USE flags - e.g. if you install links with 'USE=-X' to avoid its dependency on XFree (which moron decided that one???), then the next time *anything* decides to upgrade links, it'll rebuild all the X dependencies back in.

          That's a real pain when you're talking about things like the java dependency on libdb, for example, which doesn't actually work...
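
          For what it's worth, a sketch of how pinning that choice could look, assuming a Portage recent enough to read /etc/portage/package.use (the package atom here is illustrative):

          # /etc/portage/package.use - per-package USE flags that persist across upgrades
          net-www/links -X    # always build links without the X dependency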
      • by polin8 ( 170866 ) on Tuesday February 10, 2004 @01:32PM (#8239404) Homepage
        "emerge -u gaim" will upgrade its immediate dependencies.

        "emerge gaim" will just upgrade to the needed packages, or only gaim.
      • True, it doesn't handle that intelligently or automagically, but Gentoo has stated that their distro is oriented toward the more capable Linux types. So if you know GAIM doesn't need it, don't install the deps.

        emerge --nodeps gaim

        Though I totally agree with the spirit of the post, and maybe portage will have that capability someday... unneeded upgrades are always risky.
      • by daserver ( 524964 ) on Tuesday February 10, 2004 @01:35PM (#8239441) Homepage
        That is because you use the -u flag. If you leave that out it will only update gaim or whatever you were updating.
      • by Jerf ( 17166 ) on Tuesday February 10, 2004 @01:40PM (#8239491) Journal
        When I try to upgrade my version of GAIM and there happens to be a better version of GTK available, Portage will upgrade GTK first, regardless of whether you actually need the very latest GTK to run GAIM. I'd rather see Portage know what the minimum version a dependency has to be in order to get a program running. As far as I know, it'll just upgrade everything in the dep tree.

        Basically, this is wrong. Sorry. ;-)

        The "-u" parameter to emerge will make it work as you described. However, if you just typed "emerge gaim", it would only emerge the minimum required. You have to ask for the "emerge all depencies, too" behavior.

        I quite frequently emerge -up world, then just pick and choose what I want updated.

        (I just checked "emerge -p world" against "emerge -up world", and "emerge -up" covered significantly more packages on my system, where over 100 packages can be updated. On Gentoo, IIRC, the "world" is the list of things that you explicitly emerged; "emerge mozilla" will put mozilla in the "world" list but not any of its dependencies. So "emerge world" can update the packages you cared enough about to explicitly ask for, and -u will add all possible dependency updates.)
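
        To see the difference without touching anything, pretend mode works for both forms (gaim is just an example package here):

        emerge -p gaim     # only what gaim actually requires
        emerge -up gaim    # also updates everything in gaim's dep tree
        emerge -up world   # the full-system equivalent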
    • by Valar ( 167606 ) on Tuesday February 10, 2004 @01:27PM (#8239322)
      This reminds me of something I read the other day in the gentoo forums: Installing Portage on Other Distros [gentoo.org]
  • Mnyah! (Score:5, Funny)

    by American AC in Paris ( 230456 ) * on Tuesday February 10, 2004 @01:07PM (#8239059) Homepage
    Bah, you kids, always inventing your own names for things!

    When I was your age, we called 'em by their proper name--athletic supporters!

    "Package manager", indeed...

  • They have it. (Score:2, Informative)

    by Anonymous Coward
    It's called pkgsrc [netbsd.org].
  • Again? (Score:5, Insightful)

    by avalys ( 221114 ) * on Tuesday February 10, 2004 @01:10PM (#8239101)
    This topic comes up how often? Like twice a year?

    And yet nothing ever changes.

    • Re:Again? (Score:5, Funny)

      by Deraj DeZine ( 726641 ) on Tuesday February 10, 2004 @01:18PM (#8239196)
      What are you talking about? These mockups are clearly slightly different from past years! Perhaps no code has been written, but everyone knows that impressive mockups are the key to building successful open source software with your newly-created SourceForget account.
    • Re:Again? (Score:5, Interesting)

      by pyros ( 61399 ) on Tuesday February 10, 2004 @01:22PM (#8239254) Journal
      And yet nothing ever changes.

      Not true. Red Hat's up2date supports apt repositories, and the dpkg format is getting GPG signature/hash checking. From discussion late in the Fedora Core 1 beta stage, it seemed there was internal pressure at Red Hat to include apt in the core distro. Those are big changes, I think. I stopped reading the article since it's getting slashdotted, but if the author[s] can implement a single database that tracks installation by RPM, deb, and tgz, then I'd wager those features will be added to RPM and dpkg down the line. I honestly can't see either Debian or Red Hat jumping ship to a new system, but they both borrow features from each other, so why not from this too?

      • Re:Again? (Score:3, Interesting)

        by rsax ( 603351 )
        the dpkg format is getting GPG signature/hash checking

        Do you have any more information regarding that? Every time I've asked on #debian I get very vague answers. Usually the argument is that all the package maintainers and contributors would need to send in their keys and sign all of the packages they contribute, which would require a lot of effort. Given the recent security issues with trojans, I think it would be well worth the effort.

    • Re:Again? (Score:5, Insightful)

      by Hard_Code ( 49548 ) on Tuesday February 10, 2004 @03:16PM (#8240604)
      Because any time anybody (like me) points out that the traditional "unix" file system with /usr and .../bin/... and .../sbin/... and ../local/.., etc. etc. etc., is vestigial and retarded, everybody flames back: U R n0t a rEEL hax0r d00d, with a thousand apologies about why that has to be to compensate for some stupid clustering strategy cooked up in the early eighties (don't know about you, but it seems obvious to me that all configuration for all my applications goes in /etc/, while the cryptic /usr/local/sbin translates to "user-level administrative applications deployed for the local network of this machine". Boy, that taxonomy is USEFUL!).

      The nearest thing I can find to a decent package manager is 'stow' which essentially does *nothing* but hide all that hierarchical complexity behind top-level directories. A new package *format* is not the problem, it's layout and management that is the problem. Everything else is a new wrapper on the same old problem.
  • by ajagci ( 737734 ) on Tuesday February 10, 2004 @01:11PM (#8239107)
    People get so hung up on the package format. It really isn't about the package format, it's about the people and organization behind the packages and whether they produce a consistent distribution. A "better" package format or a better installer isn't going to help you when a piece of software expects libraries to be there that just aren't available, or when an install script assumes functionality you don't have.
    • Exactly. While it may be nice for me to use a SuSE package on a RedHat system or a Fedora package on a Debian system, would I really want that? The distros still put some stuff in different places (even though that problem is almost gone thanks to the LSB), and they use different installation and removal scripts, different /etc/init.d scripts, different default configurations, etc. The boot scripts may be very different (compare, say, Gentoo's runlevels and config files with Debian's, RedHat's, and SuSE's).
    • There are also some fundamental problems with a software infrastructure that's built around C++ and shared libraries. In C++, any time you change the public data or methods of a class, it becomes binary incompatible with the old version. Couple that with shared libraries, and you have a nightmare for end-users. It might work in a centrally controlled environment where a close-knit team of developers worked really hard to keep everything in working order, but AFAICT the best anyone's been able to do in the wo...
  • Please explain....? (Score:5, Interesting)

    by Anonymous Coward on Tuesday February 10, 2004 @01:11PM (#8239112)
    For a simple Windows user, what are these "packages" and why do they need to be managed?

    What is so special about this? It seems just eliminating the whole concept of packages would make life so much easier. Installation programs (like MSI files) are simpler, aren't they?

    This is not a troll. Please answer my question, don't mod me down.
    • by cca93014 ( 466820 ) on Tuesday February 10, 2004 @01:18PM (#8239203) Homepage
      A package is basically the same thing as a Windows MSI file.

      The problem is that different distros have different directory layouts, configuration file layouts, different places to put binary files, different ways of updating the internal library database etc. etc. etc.

      The problem is basically a manifestation of there being more than one distro of Linux, with distro maintainers who have not agreed on a common standard for this stuff. It's Linux's major Achilles' heel, IMHO.
      • The problem is basically a manifestation of there being more than one distro of Linux, with distro maintainers who have not agreed on a common standard for this stuff. It's Linux's major Achilles' heel, IMHO.

        I would dispute this, actually. Just because it's a problem that needs fixing does not make it an Achilles' heel. Think about it: when we come up with an elegant solution that is cross-distro (and possibly cross-platform), it will make Linux that much stronger.
        • by aardvarkjoe ( 156801 ) on Tuesday February 10, 2004 @01:40PM (#8239503)
          When we come up with an elegant solution that is cross-distro...

          There are already lots of them: .deb, .rpm, and others. The problem is that most geeks (at least the ones in charge) are stubborn idiots. If they decided five years ago that Debian packages were better than RPMs because the only distro they'd ever used was Debian, there's no way in hell they will ever admit that another package management system could do the job, and agree to standardize.

          There's no technical reason why we can't get some people together to iron out the last differences and either create a standard package manager, or create well-defined interfaces that allow any front end to access any kind of package. However, if you did that, nobody would use it anyway.
    • by zapp ( 201236 ) on Tuesday February 10, 2004 @01:23PM (#8239274)
      Packages *are* the installers (like MSIs)... only each distribution of linux supports a different one (well, some of them support the same formats).

      In windows, "Add/Remove Programs" is the "Package Manager". Think back to Windows 3.11 where if you installed a program and you wanted to remove it, you had to delete the directory, find any files it dropped in c:\windows, delete them, edit your autoexec.bat, config.sys files... etc.

      Since there is no uniform package manager for Linux, and a lot of stuff is just distributed as source (i.e. NO package manager support - you're back to the plain old file-drop method of Win 3.11), it can be kind of frustrating.

      For example: Redhat, Mandrake, Suse (and others) all use RPM.
      Debian uses DEB files
      Slackware uses .tgz files
      And anything can usually be found in source format, typically with the extension .tar.gz or .tar.bz2

      It's rather sad when you're on Redhat and you find a package that's either only in DEB format, or in SuSE RPM (which has different dependencies than Redhat, so you might not be able to use it), or... (you get the idea, it's a pain).

      So the point is, we need something equivalent to "Add/Remove Programs" that just *works* on all linux distros.
      • So the point is, we need something equivalent to "Add/Remove Programs" that just *works* on all linux distros.

        So what's wrong with this old song and dance?

        ./configure &&
        make &&
        make install

        Other than taking a while to compile the occasional large program, it's always worked for me. As far as making a desktop Linux for dummies, the idea shouldn't be to have some magic whiz-bang tool that does everything and works on every platform and... you get the picture. If there is a massive...

        • by pyros ( 61399 ) on Tuesday February 10, 2004 @01:53PM (#8239671) Journal
          So what's wrong with this old song and dance? ./configure && make && make install

          No easy dependency tracking, no easy uninstall, no easy upgrade, no audit trail. On a server you don't usually want a compiler installed, as it can be a security risk. It's really nice having a database of all the software installed, what versions of what other software it depends on, and a reliable way to remove it without keeping the build tree around (assuming the build system used even has an uninstall method). The only way I would feel confident about not accumulating cruft from upgrading big packages (GNOME, KDE, X) from source is if they were installed 100% into a single folder (like /opt/kde/3.2/{bin,lib,conf,man,...}). Then I could safely uninstall by deleting that top version folder. Even then, I don't want to take the time downloading and compiling the source; I don't find it very recreational. I'd rather run `apt-get install kde`, `apt-get upgrade kde`, or `apt-get remove kde`. With that remove command, it also removes packages that kde depended on but nothing else needs. You don't get that with source installations; you have to keep track of it yourself.

          In the long run, unless you are meticulous about tracking which packages need which other packages, and where they were all installed, you are ensuring you will have to rebuild your system from scratch at some point. Package managers like APT and Yum, and even up2date, let you avoid this.
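
          As a rough illustration of that audit trail on a Debian-style system (kdebase is just an example package name):

          dpkg -l             # every installed package, with its version
          dpkg -L kdebase     # every file that package installed
          dpkg -s kdebase     # its status, including what it depends on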

      • only each distribution of linux supports a different one

        The real problem isn't package formats (as your example with Suse and RedHat shows).

        It's simply that distributions are different. Pretty much any non-trivial Linux program needs a lot of libraries that are non-standard (as in: not in the LSB). Different distributions make different choices, which is probably a good thing, but it breaks binary compatibility between distributions. This has (pretty much) nothing to do with package formats.

    • by KGBear ( 71109 )
      OK, I'll bite, on the chance that you really mean it. Unlike Windows, Linux has multiple distributions, each with its own notion of what a complete GNU/Linux system should have and where everything should go. Each distribution caters to a different audience, and that means people using Linux have a lot more choices as to what their systems should have and how they should work. Unfortunately, that also means it's very hard to come up with a (clueless) user-friendly way to install and remove programs. Red...
    • by theantix ( 466036 ) on Tuesday February 10, 2004 @01:39PM (#8239483) Journal
      For a simple Windows user, what are these "packages" and why do they need to be managed?

      With many windows programs, the source is closed and the developer creates a binary package and controls how the program will be distributed. But with free software, many people take those source files and distribute them in whatever way works best for them -- a package is simply a way to put programs in a file for distributing to others.

      If you'd like, you can think of a package as an installation program -- with modern end-user distributions the distinction is minor. RedHat, Mandrake, and SuSE all have programs that will automatically install an .RPM package with a GUI front end, not unlike what you would find in a .MSI file on Windows -- even simpler, to be honest.

      But it gets more complicated than that, because of the increased complexity of the *NIX world. Certain programs depend on external libraries (think of them like .dll files) to run properly, so the package knows which libraries are required for it to install. Debian, Gentoo, and FreeBSD have great systems for automatically installing those dependencies when the user requests a package, and the .RPM-based Linux distributions are getting better at this too.

      It seems just eliminating the whole concept of packages would make life so much easier. Installation programs (like MSI files) are simpler, aren't they?

      Some applications, like the Sun Java JRE, OpenOffice, and the binary NVIDIA drivers (I'm sure there are many others) have their own installation programs. It's ugly and messy and doesn't work that well compared with how each distribution handles packages natively.

      To put it in more practical terms: if I download OpenOffice from openoffice.org and run their installer, I see a custom installation program that they have developed. I have to answer a lot of questions about how my Linux distribution is set up, and do it all in an unfamiliar environment. However, if I install the OpenOffice .RPMs or use Debian/Gentoo to install the program, the package management system knows how to handle many of the default questions, installs everything in an expected place, and presents any questions in a familiar manner.

      I hope this helps answer your question.
    • by brett_sinclair ( 673309 ) on Tuesday February 10, 2004 @01:47PM (#8239586)
      OK, I'll bite...

      Packages in typical Linux distributions pretty much do the same things as MSI files on Windows, except that they do much more.

      1. They describe how to build from source. That is (obviously) a big deal on an open source platform, since it makes builds repeatable, and so not depending on the magical build environment of one company or person.

      2. They deal with dependencies: package "foo" can dictate that it needs package "bar" to work correctly, and that it needs package "foobar" version 2.32 or higher to build. This is a Good Thing, as you don't have to find out what the dependencies are the hard way.

      This causes some problems from time to time, since distribution X may not have package "foobar", but the real problem here is that distributions are different. This may also be seen as a good thing: package management is a way to deal with diversity.

      3. Standardised package management in a distribution makes other Good Things possible, such as automatic installations of all dependent packages, or automatic upgrades, thanks to tools like apt and yum and the dependency information in packages. That means that you can make sure that every program on the system is up-to-date with just one command.

      Another really Good Thing is that package managers allow a lot more control over installations: they know which files are installed by which packages. That makes it possible to check, say, /usr/lib for any shared libraries that are no longer in use, or if any files have been altered. Thanks to dependency handling, it is also safe to remove unneeded old stuff (i.e. you don't have to put up with a gazillion old .dlls in c:/WINDOWS/SYSTEM32).
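
      By way of example, the "have any files been altered" check might look like this (debsums is an optional Debian tool; the rpm flag is standard):

      rpm -Va          # verify checksums, sizes, and permissions of all installed files
      debsums gnupg    # compare one Debian package's files against its recorded MD5 sums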

    • Packages (Score:5, Informative)

      by arrianus ( 740942 ) on Tuesday February 10, 2004 @01:51PM (#8239637)
      A package is a file that contains the information needed to install and uninstall a program. They are similar to MSI files, but have a number of advantages, mostly stemming from the fact that free software is, well, free, and so you can get it without buying it. Proprietary software comes on CDs, whereas free software comes over the Internet. Upgrading free software is very "lightweight," whereas upgrading proprietary software is usually very "heavyweight." This gives a different distribution model.

      This has several effects. If I distribute a nonfree 10MB program UberTool that requires the nonfree 20MB MegaLib, I'd better distribute MegaLib with UberTool. If both are free, I can distribute them separately -- if the user already has MegaLib, he'll just install UberTool.deb. If he doesn't, the package management system will know where to grab MegaLib from, will download MegaLib.deb, and install it.

      Furthermore, if I'm going from Office 97 to Office 2000, it's because I paid money for a CD, and I'm running an installer. In the free software world, upgrades are no-brainers, since they cost no money, and most free software programs evolve smoothly rather than putting out major versions every several years. As a result, I'll generally be running the latest version of my office suite (as well as every other little utility on my system), and it is convenient to be able to do the upgrades all in one step (apt-get update; apt-get upgrade will grab all packages with newer versions and install them, cleanly removing the previous ones). Most people never reinstall Debian -- I know of installs from '96 that are still running today at the latest versions, and there are almost certainly older ones. I don't know of anyone who went from DOS/Windows 3.1 through Windows XP with just upgrades, and without a reinstall.

      The next thing is that Windows has a problem of bit rot. If you leave a Windows system without reinstalling the whole thing, adding and removing programs, crap builds up. You get all sorts of registry keys you don't need, .dll files you don't need, weird interdependencies, and the system gets slower, more bloated, etc. This doesn't happen on Debian -- I installed my box maybe 3 or 4 years ago, and it's functionally identical to a box installed yesterday. Package management, well implemented, buys you that. You never reinstall the overall system, and upgrades are well-managed and don't break things.

      The other place package management helps is in centrally-maintained networks. You can install the same package, with the same configuration settings, very easily from a centralized location.

      So package management is, in effect, a fancy way to install and uninstall files. However, the fanciness buys you a lot. The new Windows installer is a form of package management, and gives some of the same advantages, although it's not yet as mature as the GNU/Linux ones (.deb has been around since at least '95, and .rpm even longer).
    • zapp's comment below is good, but I feel the need to add some things.

      There are still PLENTY of Windows applications that don't use Add/Remove Programs. You have to find their uninstaller, if they have one. This is the same as downloading a tar.gz with the source and hoping it has a "make uninstall" target. However, free software is available to track the packages you compile and the files they install. Software is probably available to help uninstall stuff under Windows too.

      With Debian, I can find out all...
    • by IamTheRealMike ( 537420 ) * on Tuesday February 10, 2004 @02:36PM (#8240137)
      Installation programs (like MSI files) are simpler, aren't they?

      I think once you'd spent hours disassembling and debugging these "simple" installer programs to make them run on Wine, you'd have a different view on the matter ;)

      Let's do a quick review of how things are done on Windows:

      • InstallShield/NSIS type installer programs. These embed all the data needed inside themselves, or in the case of InstallShield the actual installer is wrapped by the equivalent of a WinZip self extractor. Ever wondered why InstallShields take so long to start? Well, the first thing it does is extract a bunch of files (setup.exe, data.cab, some dlls etc) that comprise the installer, then it runs setup.exe which in turn extracts the InstallShield Engine to your local hard disk, possibly upgrading it in the process. Then it runs it and makes RPCs to it using DCOM, which starts the actual installation - done by iKernel.exe.

        This is sort of how autopackage works, except we do it in a much simpler way and don't rely on CORBA (the nearest equivalent of DCOM on Linux). These installers have no dependency management beyond "is this file the right version? No? replace it then" which has caused some truly horrific hacks like Windows File Protection.

      • MSI packages. These are the closest Windows has to traditional RPMs/DEBs. You need to install the runtime support separately for MSIs to work. They are based on a bizarre kind of database, with its own dialect of SQL. MSIs are mostly data but are extendable via COM, iirc. They even deal with dependencies, via MSMs (merge modules).

        Yes, Windows apps have dependencies too. Check out this list [installshield.com] to see.

        MSIs "deal" with dependencies by including the least common ones inside themselves, causing huge and bloated downloads, and leaving the user to figure out random breakage for ones that don't exist (how many times have you found that an app assumes the presence of some VB/VC++ runtime lib that just wasn't there?).

        They can get away with this because Windows is a platform, and a whole pile of technology is guaranteed to be present. For instance, you know without needing to check that graphics support is available, because it's part of the kernel and cannot be removed. On the server that's an Achilles' heel; on the client it's an advantage (in terms of packaging).

      • MSI/InstallShield hybrids. [shudder]. Let's not go there. These things take evil to a new level.

      • Zip files. All MS Windows binaries are relocatable. In contrast, virtually no Linux binaries are. That's partly because it's not at all obvious how to make them so - there is no glibc API available to figure out your own absolute path, rather stupidly (and I'm too scared of Uli Drepper to submit one ;). We wrote a simple drop-in C file to help C/C++ programs do this - making a program relocatable makes many users' lives a lot easier, so do it today.

      Because there is no standard Linux platform (the LSB never caught on), and the user can basically arbitrarily add, remove or upgrade individual components as they see fit (from the kernel to the theme in use) package managers are used to manage and maintain it all. Unfortunately, because there is no standard platform, the distro becomes the platform - of which there are many.

      The freedesktop.org platform effort and the LSB are both worthy steps forward in this area and I hope they pick up steam. In the meantime, approaches like autopackage, being dependency-smart and having communities of packagers are the way forward.

    • The differences are:

      A package declares what other packages are required to install it. Imagine you were installing a program that *required* IE 6.0 and Media Player 5.0: a Windows installer will start up, run, and then barf saying you need to install something. A package would allow you to determine BEFORE you start that you need other things installed.

      A package lists all files to be installed, and a package manager tracks who installed what. Thus, when you encounter a file you don't recognize, you can ask...
  • by Anonymous Coward on Tuesday February 10, 2004 @01:11PM (#8239113)
    I am sick and tired of poorly planned package management. What we need is a system where third parties can build binaries from the same source as the original vendor and get the exact same binary. Then, using PGP, we can verify the authenticity of the packages. There is no reason the encryption can't be hidden away and transparent to the end user - checking for signatures from the original vendor only, and wide open to enhancement by power users who want to check multiple sources, etc.

    Let's do this from the beginning, rather than slapping it on later.

  • OpenPKG (Score:5, Informative)

    by chipster ( 661352 ) on Tuesday February 10, 2004 @01:12PM (#8239122)
  • by El Cubano ( 631386 ) on Tuesday February 10, 2004 @01:12PM (#8239128)

    APT already handles debs and rpms; tgzs should not be much of a stretch. The problem is establishing standards and getting everyone to follow them. For example, all debs in the Debian archive follow the Debian packaging standard, else they would not be accepted into the archive.

    Naturally, third parties are free to create their own non-conformant debs. This is just the same as someone creating an rpm for RH9 that doesn't conform to the conventions used by Red Hat.

    I assert that the tools already exist, i.e., we don't need a new one. The emphasis needs to be on getting people to follow the standards, and possibly on creating a cross-distro standard for everyone to follow.
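
    As a sketch of how that already looks in practice, an apt-for-rpm sources.list entry follows the same shape as a Debian one (the URL, path, and components here are hypothetical):

    # /etc/apt/sources.list on an apt-rpm system
    rpm http://apt.example.com fedora/linux/1/i386 os updates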

      I assert that the tools already exist, i.e., we don't need a new one. The emphasis needs to be on getting people to follow the standards, and possibly on creating a cross-distro standard for everyone to follow.

      I totally agree with that. But I think there should be two options for each distribution - the native one, and a cross-platform one. For filesystem layout, cross-platform packages should assume everything defined in the FHS - Filesystem Hierarchy Standard [pathname.com] (part of the LSB [linuxbase.org]). And the distributions should...

    • Which is, of course, why projects like this never go anywhere. Because if you're going to require that every piece of software be repackaged anyway, to conform to your standard, there's no additional burden in packing them all in the same format. In which case, you've got the Debian project all over again, except that Debian has a ten-year head start and a lot more package maintainers in place.

      Sure, there's scads of obscure software lying around out there in TGZs. But the fact that it's still a TGZ would...
  • by codepunk ( 167897 ) on Tuesday February 10, 2004 @01:12PM (#8239131)
    All smoke and no fire... Don't talk about it, just build it. Personally, I think someone should sit down and hack together an install-package builder based on something like gdialog and Python that outputs an executable compressed image as a single .bin file.

    A good installer is not hard to accomplish if the desire for it really exists. It is, however, one of the most overlooked things where open source programs are concerned.

    Don't make me go hunting down 20 dependency packages - offer to install them for me. A simple script based on wget could do that...
  • by derphilipp ( 745164 ) on Tuesday February 10, 2004 @01:13PM (#8239139) Homepage
    Yeah !
    Unify those packages.
    I am so often confused that RedHat comes in a red box and SuSE in a green one. - Which of those should I buy?
    And Fedora comes with a box you have to fold yourself...

    Oh you mean these packages....
    (Fedora Linux is included in the RedHat magazine - which has a foldable page for creating a suitable box)
  • *BSD ports system? (Score:5, Interesting)

    by Anonymous Coward on Tuesday February 10, 2004 @01:13PM (#8239140)
    Why not leverage the BSD ports system [freebsd.org]? It already builds directly from source, checksums the downloads to ensure security, and applies BSD-specific patches. It shouldn't be too difficult to grow this so that source patches and binary packages are platform-neutral.

    ps: BSD trolls are dying!
  • 0Install (Score:5, Informative)

    by Sanity ( 1431 ) * on Tuesday February 10, 2004 @01:14PM (#8239145) Homepage Journal
    What about 0-Install [sourceforge.net]? It is simple, elegant, doesn't require root to do an installation, seamlessly downloads libraries and other dependencies as they are needed, and integrates nicely into the filesystem. I really think 0Install could be the future of installers, if only they can get someone to build a distro around it.
  • Well, for conversion from RPM to DEB (and supposedly the other way around), alien has worked quite well for me (apt-get install alien). Thus far I've managed to convert and install several RPMs that otherwise would not have worked very well on my Debian system.
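
    A minimal round trip with alien looks roughly like this (the filename is illustrative; note that by default alien bumps the release number by one):

    alien --to-deb foo-1.0-1.i386.rpm   # produces foo_1.0-2_i386.deb
    dpkg -i foo_1.0-2_i386.deb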
  • It doesn't have any metadata, like what other packages this package requires, etc. It's fine for what it is, but it ain't no package format.
    • It is when used on Slackware. Slackware packages add the metadata in a specially named file in the archive, very much like a Java JAR manifest. (A JAR is basically just a .zip with a manifest.)

      When you use pkgtool to install from the tgz file, the package database is updated with info about the package that you just installed. It can also check for dependencies, etc.
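
      For the curious, the Slackware side of that looks roughly like this (the package filename is an example):

      installpkg foo-1.0-i386-1.tgz   # records the file list under /var/log/packages/
      removepkg foo                   # removes exactly the files that were recorded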

  • What we all need ... (Score:3, Interesting)

    by phoxix ( 161744 ) on Tuesday February 10, 2004 @01:20PM (#8239229)
    Something like Debian's debconf, but much more powerful and versatile.

    Essentially there would be some glue between the package management system, the "configurator", and the actual config file.

    RPM/DPKG/ETC <--> Glue <--> /etc/sysconfig/foobar
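
    As a hint of what such glue can already do on the debconf side (the package name and question are hypothetical):

    dpkg-reconfigure foobar   # re-ask a package's config questions and rewrite its files
    echo "foobar foobar/port string 8080" | debconf-set-selections   # preseed an answer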

    Sunny Dubey
  • by Hornsby ( 63501 ) on Tuesday February 10, 2004 @01:28PM (#8239346) Homepage
    Don't they know that it's MozillaFirefox and not MozillaFirebird? This installer obviously isn't going to be able to keep up to date.

  • Learn from Apple (Score:5, Insightful)

    by Octos ( 68453 ) on Tuesday February 10, 2004 @01:29PM (#8239360) Homepage
    Maybe Apple can do this because they have a standardized directory structure, but what can be easier than dragging an app package to the Applications folder? Poof, it's installed. Don't like it? Delete it. If it's more complex, there's an installer program. Playing with dependencies and makefiles is the reason I gave up on Linux.
    • I agree. I've never used Apples much, but when I got a bit of software from someone and took it to my buddy's place to use on his computer, I just had to copy the directory into the Applications dir. It worked great, right away. It's a good idea for more than that, too: you can insist every application maintains its own configuration files and directories, and leave /etc (does OS X have one of those?) for operating-system-only stuff.

      cool indeed.

    • by kinnell ( 607819 ) on Tuesday February 10, 2004 @02:57PM (#8240375)
      Maybe Apple can do this because they have a standardized directory structure, but what can be easier than dragging an app package to the Applications folder?

      The Apple way rocks; the directory structure doesn't matter: you can execute any properly written application from anywhere. There are two problems with this in Linux. Firstly, most applications are written with full path names in the code for the files they reference. Secondly, Linux applications are built around a wide range of supporting libraries, few of which are standard across distributions - this means either huge packages or a dependency resolution mechanism, like apt.

      A Linux distribution that wanted to do this would have to first pick a core set of standard libraries, then port all the applications it wanted to support.

    • by IamTheRealMike ( 537420 ) * on Tuesday February 10, 2004 @03:00PM (#8240409)
      Maybe Apple can do this because they have a standardized directory structure

      Nope, they can do it because they have a standardised platform. All MacOS X apps have at least one implicit dependency - on some version of MacOS. That one dependency is very easy to satisfy (well, assuming you are rich), but it gives you a whole pile of functionality.

      Because desktop Linux is evolving at such a rapid rate, and because nobody controls it, and because people like being able to control what's on their system to a fine degree, we have no such platform currently.

      Unfortunately the first attempts at such a platform foundered when KDE made the bum choice of a toolkit with a problematic license, so Gnome was started in reaction, and now we have two, with neither being widespread. The Gnome guys are busy moving infrastructure out into non-Gnome libraries like GTK, Cairo, and DBUS that are explicitly designed to be acceptable to everybody, but unfortunately the KDE guys are still having big internal arguments about whether freedesktop.org (the new attempt) is "forcing" things upon them (see the discussions at kdedevelopers.org).

      This is unfortunate because a standard platform that you could depend upon with only one dependency would go a long way towards making software on Linux easier to install.

  • by Timbo ( 75953 ) on Tuesday February 10, 2004 @01:32PM (#8239399) Homepage
    alien [kitenet.net]
  • by arrianus ( 740942 ) on Tuesday February 10, 2004 @01:32PM (#8239402)
    The theory is fine. The problem is that package managers are, in many ways, incompatible. Debian packages, for instance, track dependencies based on the names of other Debian packages (libfoobaz-dev requires libfoobaz). I've seen package management systems that base dependencies on files (libfoobaz-dev requires /usr/lib/libfoobaz.so). The former system won't recognize dependencies from packages installed in the latter format. Worse, the packages don't overlap: one distribution will have libgnome, whereas another will have 50 different packages, one for each of the Gnome libraries. Dependency resolution there breaks almost completely.

    There's also a matter of versions and security updates. On Debian, I run 'apt-get update; apt-get upgrade' and have a new version. Since the packages are all maintained by the Debian project (and a few smaller projects that target Debian), this works. Versions aren't linear -- Debian back-ports security fixes. The package manager has no way of knowing whether kernel-2.4.24 is newer than kernel-2.4.19-with-patches.
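
    dpkg does expose its version-comparison logic, and epochs are the standard escape hatch for exactly this non-linearity (the version strings below are illustrative):

    dpkg --compare-versions 2.4.24 gt 2.4.19 && echo "plain comparison"
    dpkg --compare-versions 1:2.4.19 gt 2.4.24 && echo "an epoch of 1 outranks any epoch-less version"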

    Basically, there is no clean way to install .rpm packages on a .deb system, or vice versa, without breaking something. It is possible to install packages of one sort on the other system, but eventually things will break. Each package management system relies on some set of information about packages to work, and each system has a different set of information it provides and needs.

    There is room for improvement in package management -- a really good GUI for finding and installing packages would be nice. I wouldn't mind having more information about the packages I'm about to install -- links to project web pages, the ability to browse installed files (the packages.debian.org/freshmeat.net/etc. databases either installed locally or quickly accessible from the system), the ability to view screenshots of GUI programs, etc. There's a lot of meta-information that could be added, and better search functionality that could be implemented.

    At the same time, on the package-build side, it'd be pretty simple to have a system where you write a configuration file of information about the package, and it builds .deb, .rpm, .tgz, etc. packages and gives easy-to-read information about what systems they'll work on. I've heard of tools similar to this, but I haven't seen them used. Adding something like this to the standard autoconf/automake/... process would certainly be nice.

    The last solution is to have the groups work together to make sure all packages have the same set of metainformation (more than is needed for any given package system), so that cross-platform package installs become possible. In practice, I don't see this scaling across versions, as package management systems evolve.

    One more thing to bear in mind is the perspective of the author of the article -- he says he runs Slackware and builds most packages from source (something I stopped doing maybe 3-5 years ago). Slackware's package management tools are very basic, manual, and crude. That gives a very different attitude towards package management than someone running a distribution like Red Hat, which has a much heavier-weight, more technologically advanced, but somewhat fragile, somewhat inflexible package management system, or a user of Debian, which has a state-of-the-art ubermaintainable, uberupgradeable package management system, but one that primarily relies on grabbing packages from one (or a small number) of sources. I apologize for the stereotypes in this paragraph -- they're not entirely true, but the package management systems differ a lot (more than most people realize - you notice it if you've ever tried to build the packages), and I'm just trying to make people aware that users of each of them will have a very different world view, and it's important to keep that in mind when reading these articles.
  • by Doktor Memory ( 237313 ) on Tuesday February 10, 2004 @01:41PM (#8239516) Journal
    Let's see if we can actually go for six months without somebody announcing Yet Another Binary Package Management System or Meta-System. That would actually be newsworthy.

    The amount of time and money that's been wasted on this problem for over twenty years in the unix world is just mind-boggling. We really do not need to reinvent this wheel again.
  • KDE + Gnome (Score:4, Insightful)

    by G3ckoG33k ( 647276 ) on Tuesday February 10, 2004 @01:42PM (#8239520)
    Yes, I know, this might be considered offtopic, but...

    If we can get KDE and Gnome to work together, then we might also, eventually, get a common installer too!!! :)

    PLEASE!!!

    Let KDE 4.0 and Gnome 3.0 be the same!!!
  • A few things: (Score:5, Insightful)

    by Wordsmith ( 183749 ) on Tuesday February 10, 2004 @01:43PM (#8239529) Homepage
    Why hasn't anyone developed a system that, from the end-user perspective, works similarly to MSI installations (which work very well)? Point, click, next, next, next. In principle, DEBs/RPMs work similarly to MSIs, but the installation isn't as obvious a procedure to end-users.

    And for that matter, why not make the installer intelligent about the distro? Use a single package/installer, but one that includes all sorts of scripting information about installation in various circumstances. The installer checks to see if it's on RH9, and if so it puts files where RH9 expects them, editing any configurations and making RPM database entries as necessary. If it's on Debian, it takes the appropriate measures there. And so forth.

    Why do we see such absurd dependencies that don't seem to happen in the Windows and Mac worlds? Install a new version of a KDE app, and you need the latest minor revision of the core KDE libs, which in turn requires a minor upgrade to the font server, etc. In the Windows world, occasionally you need to update something big like DirectX to install a latest-and-greatest app, but even then the dependencies are often packaged with the app itself. Why isn't this practice more common in Linux/Unix (not counting Mac OS X)? I understand that many of these apps are under CONSTANT quick-release development and are often tied to bleeding-edge versions of their libs, but why aren't major releases at least more dependency-friendly? Installing an app can be a real pain in the ass even with something like apt, if you don't have the dependencies in the repositories you've defined. And adding new repositories isn't exactly grandma-friendly.
    • Re:A few things: (Score:4, Insightful)

      by MobyDisk ( 75490 ) on Tuesday February 10, 2004 @02:35PM (#8240132) Homepage
      Why hasn't anyone developed a system that, from the end-user perspective, works similarly to MSI installations (which work very well)... In principle, DEBs/RPMs work similarly to MSIs...
      MSIs are fundamentally different from packages, although they aim for the same purpose.

      First, MSIs typically run executable code and/or scripts to perform the install, but packages usually contain just a list of files and locations, with no scripting required. This is important, since a security-conscious administrator won't want to run an MSI-like package when nothing stops it from doing rm -rf /. It also makes them easier to create and maintain. Technically, an RPM or DEB can run scripts, but it is uncommon, and you can easily tear one apart and see the scripts it runs with a few simple commands.

      Second, MSIs don't have a central database where all the file information is stored. There is some use of the registry for this, but it isn't quite as good. With RPM/DEB, I can ask the package manager: "What package does /usr/libfoobar.so.1.2.3 belong to?" and "What versions of /usr/libfoobar.so are installed, and where?" These are just a few simple examples; lots more can be done.
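
      Concretely, those questions map onto one-liners on either system (the library path is the parent's example):

      rpm -qf /usr/libfoobar.so.1.2.3   # which RPM owns this file?
      dpkg -S libfoobar.so              # which Debian package ships a matching file?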

      As far as the granny meter goes, I rate APT distributions as easier than MSI, because most MSIs display between 3 and 10 screens where you must click "Next" or scroll down and click "I agree." As a consultant to many such people, I have been amazed that many of them cannot get through these installs! An RPM/DEB install with APT handling dependencies does not have any prompts - you just click it, the progress bar goes by, and it is done!

      Why do we see such absurd dependencies that don't seem to happen in the windows and mac worlds?

      Fundamentally, Linux packages don't have the dependency conflicts that Windows packages have. Windows packages overwrite shared files all the time. By splitting the dependencies into separate packages, that DLL hell should go away: nobody will overwrite another package's files. But instead we get it in another form, because dependencies aren't stated properly in the packages. I just hope that things will improve over time.

      Dead on, bro. Remember when Microsoft added "self-healing" to Windows XP because so many companies were making crap installs? Deleting DLLs, installing dupes, putting them in the wrong directories. This is the exact same problem Windows has/had, but in RPM & DEB form rather than EXE & MSI form. It seems like coders just don't get package management. And if you look at APT, it even has a healing-like feature now!

  • I've never been terribly concerned about incompatible package formats, as long as source is available. The common system is that everything managed by the package manager goes into /usr, and everything I build myself goes into /usr/local.

    Once in a while I have to copy something manually into /etc/init.d, but that's about it.

    Of course, it helps that for almost anything I want to install, my distribution includes it or at least includes the dependencies.
  • Standards (Score:5, Insightful)

    by Milo Fungus ( 232863 ) on Tuesday February 10, 2004 @01:45PM (#8239556)

    All of this will come at the price of standards of course...

    Standards are not a price, they are an investment. I use standard XHTML, CSS, and SVG in my web design because I care about the future of the web. Besides, if a standard is well-designed (like W3C recommendations tend to be), it actually makes development and maintenance easier. Anyone who has migrated from HTML 3 (or some nonstandard IE/Netscape HackML) to HTML 4 or XHTML with CSS knows what a pleasure it is to work with modern hypertext (and probably also has an abiding and bitter hatred for IE). The same could be true of package installation in Linux if the standard is well-designed.

  • by Gilesx ( 525831 ) * on Tuesday February 10, 2004 @01:53PM (#8239666)
    Looking at this from a newbie's point of view, is this really such a great idea? I mean, at face value, the idea of us living in a utopia where all the differing packaging standards are compatible is nice, but how many "green" Linux users would even understand what the difference is? They would see themselves as using Linux as opposed to Windows, not abc's Linux as opposed to xyz's Linux or Windows...

    Total package compatibility would most likely lead to someone using Red Hat trying to install a Debian package, and then getting frustrated, confused, and pissed off at the inevitable failure due to the entirely different internals of Debian and Red Hat.

    Unfortunately, it is a sad fact of life that Linux distros are deviating from the once common base they shared. An example of this is Mandrake - I used to use Mandrake around versions 6 and 7 and quite often installed Red Hat rpms successfully. However, as those crazy French spend more time tweaking Mandrake in weird and wonderful ways, it becomes further and further removed from Red Hat. Sure, they both use the .rpm standard, but can you imagine trying to install a Mandrake RPM with a *lot* of deps on a Red Hat system?

    All of this leads me to conclude that perhaps, rather than concentrating on unifying packaging, we should instead focus on making deliberately incompatible packaging systems for each major distro. IMHO, it would be much easier for a newbie to distinguish between what will and won't work if they were guaranteed that an rpm would ALWAYS work on Red Hat, and some other kind of package (MPM?) would ALWAYS work on Mandrake...
  • by Stonent1 ( 594886 ) <stonentNO@SPAMstonent.pointclark.net> on Tuesday February 10, 2004 @02:25PM (#8240009) Journal
    Which is why I'm working on getting gentoo portage going for Solaris. Portaris! [gentoo.org]

    If you feel you have some ideas that could help with this mini-project, feel free to join in!
  • In Windows Hell... (Score:3, Interesting)

    by Spoing ( 152917 ) on Tuesday February 10, 2004 @02:33PM (#8240099) Homepage
    I'm struggling with Windows problems, mainly due to the fact that Windows installation programs do not track dependencies. They use an incrementing counter and self-registration to track which files are in use. Because of that, tools like Dependency Walker -- very handy, BTW -- have to be used, and the process is labor-intensive if you want to be sure everything that should be there actually is. Even Dependency Walker can't be used to examine files that aren't executables or libs.

    Too much trust is placed in the installation program getting it right, and no built-in way is available to check if dependencies are broken.

    You can ding the different distributions -- and quite rightly -- for package problems, though in comparison most of them are dreams when stacked up against Redmond's latest offering.

  • Gentoo and Portage (Score:5, Informative)

    by SwansonMarpalum ( 521840 ) <redina.alum@rpi@edu> on Tuesday February 10, 2004 @02:45PM (#8240246) Homepage Journal

    As usual I'll come out with my Gentoo zealotry, but I'd like to address some of the problems I'm seeing mentioned here.

    Gentoo is a Linux distribution largely centered on the Portage package manager (there are other features of Gentoo, but Portage is by far the most conspicuous).

    Portage is a package manager loosely inspired by FreeBSD's ports system. Portage maintains a global software configuration file called make.conf, which holds meta-configuration settings about your system. As Portage builds all programs from source for your machine, make.conf is the place where you describe your machine to Portage. make.conf also holds a collection of USE flags. USE flags are global binary switches: they have a default value if unspecified; if you include a flag (e.g. USE="java"), it turns that feature on, and if you include -flag (e.g. USE="-java"), it explicitly disables that feature, which is globally recognized by Gentoo.
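
    A minimal make.conf along those lines might read (the values are illustrative, not recommendations):

    # /etc/make.conf
    CFLAGS="-Os -march=athlon-xp -pipe"
    USE="java -X"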

    I see complaints that "emerge vi" tried to build X, and thus Portage is "smarter" than you as a sysadmin. This is patently false and ignorant. Portage lets you do your job as a sysadmin once and then never have to worry about doing it again. If you do not want X on a machine, you need merely put "-X" in your USE flags.

    It puts control in your hands. If you want an application built to support certain things, you can have it. If you do not want to support other things, it will explicitly avoid them. It defaults to doing what's sensible for most people who use Linux casually. If you aren't a casual user, spend a week or so getting familiar with Portage and its configuration. emerge is an incredibly potent tool. All of my systems are patched automatically every day, from source, with the configuration I have specified for each system. My binaries are all built with -march for the CPU, and -Os. And I've never once had any of my systems fail because of misconfigured dependencies. They stay up to date and I don't have to worry about it.

    If you want to do all your dependency checking yourself, you're welcome to. However, there's a good solution available, freely, to the world, that takes care of all the issues revolving around this. Click here to find out more about it. [gentoo.org]

  • by Wolfier ( 94144 ) on Tuesday February 10, 2004 @04:11PM (#8241196)
    In other words, there should not be any "pre-install", "post-install", or "during-install" scripts inside packages.

    Packages should contain data. That's it. Leave all the executable work - dependency checking, startup scripts, etc. - to the package manager.

    Letting packages run their own stuff during installation is the root of all the issues associated with uninstalls.
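
    Until then, existing tools at least let you audit those scripts before trusting them (the filenames are examples):

    rpm -qp --scripts foo-1.0-1.i386.rpm   # print an RPM's pre/post-install scripts
    dpkg -e foo_1.0-1_i386.deb ctrl/       # extract a deb's control files, incl. preinst/postinst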

  • by Cthefuture ( 665326 ) on Tuesday February 10, 2004 @04:19PM (#8241283)
    Does anyone else see what's happening?

    Too many package formats, too many window managers, too many GUI toolkits, too many desktop environments, too many Linux distributions, etc, etc.

    I like choice, I really do, but this is madness. Not only does a great deal of time go into creating competing software (KWord, AbiWord, OpenOffice), but now we're creating more work for ourselves by trying to integrate it all (package managers, Red Hat trying to unify GNOME and KDE, etc). Wow, this can't be good.

    How is all of this going to compete with entities that have a more focused approach? I believe the only reason anything has gotten done at all is that there are just so damn many people working on things. That causes serious talent dilution, though. Things are nowhere near as good as they could be (or could have been).

    This is quite disturbing.

    It's interesting to note the areas where this hasn't happened. Just try to create a competing standard to HTML, XML, SQL, or OpenGL (note that I'm talking Linux/FreeBSD, etc., not Windows). Not that people don't try, but they never gain momentum. I have to think that if there were an ANSI, ISO, or whatever standard desktop environment, it would help. I seriously doubt something like that could be done in a reasonable time; I'm just saying it might help.
