Building A Better Package Manager 431
SilentBob4 writes "Adam Doxtater of Mad Penguin has published a preliminary layout for his proposed cross-distribution package manager capable of adding/removing software from any locale. He is suggesting the interface will basically allow for installation of several major package formats including RPM, DEB, TGZ, as well as source code with the ability to pass build time options. All of this will come at the price of standards of course, including naming, documentation, and package structuring. If this idea were to catch on, it would signify a major leap in desktop Linux usability. This might be a project that UserLinux might benefit from. Read the full column here (complete with GUI mockups)."
Autopackage? (Score:5, Informative)
FreeBSD ports collection (Score:3, Interesting)
Re:FreeBSD ports collection (Score:5, Informative)
Don't confuse the FreeBSD ports with a packaging system. FreeBSD has its own nice packaging system [freebsd.org]. (If you've used Solaris a lot you'll feel right at home) The FreeBSD ports all create and install packages for you "behind the scenes", and you can install any package as binary on FreeBSD as simply as:
# pkg_add ftp://ftp2.freebsd.org/pub/FreeBSD/releases/i386/5.2-RELEASE/packages/archivers/rpm-3.0.6_8.tbz
Re:FreeBSD ports collection (Score:5, Insightful)
I'm not going to call you a troll. But I am going to assume you dislike the RPM format due to the dependency-hell problems of the past; if that's not why you dislike it, feel free to ignore the rest of my post, except the PS. The problem is that many distro maintainers selected the RPM format (Red Hat, Mandrake, Suse, Ark, even the LSB chose RPM as the standard format), and then packaged software with conflicting package names and file system layouts. So you go looking for an RPM for Red Hat, and find one from Suse, and it says it needs xfree86 3.0.3, even though you have XFree86 3.0.3 installed. Or perhaps it needs some particular .so to be in /usr/lib but the Red Hat package owning that file put it in /usr/lib/ssl.
These aren't flaws in the RPM format, these are the problems this project aims to fix. You would see the same problem with dpkg if there were as many popular distributions which used dpkg but didn't base themselves off of debian's repositories.
The other complaint commonly made is that apt is better. Apt needs a tool like rpm or dpkg behind it to actually be useful. Apt is purely the part of the system that locates dependencies, like the part of portage that knows to build and install X before building and installing GNOME. It doesn't actually install the package or maintain the database of installed software. Apt also runs very nicely on Red Hat. Conectiva and Ark use it as the default system. Mandrake and Suse implement their own dependency tracking systems.
PS - In addition to Red Hat and Mandrake, I've also tried FreeBSD (years ago, didn't support all the hardware needed to install), Debian, Suse, Ark, and Slackware.
Hear Hear (Score:5, Interesting)
Either your package uses packages from some standard repository (Linux standard base, anyone?)
OR
You will provide all needed packages that are NOT in that standard set in your APT repository.
So if I provide Foo.1.2.3, and it requires Narf.2.4.pentium, and the standard repository is providing Narf.2.3, then I must provide Narf.2.4.pentium on my site.
Of course, I would also pimpslap anybody who actually depended upon Narf.2.4.pentium as opposed to simply Narf.2.4.
And to address the tweakophilia of the Gentoo types - what about a program that could be run from a cron job that would examine all recently installed packages, pull the source packages, rebuild them with the locally provided options, and upgrade them? Thus, I could *quickly* install Poit.9.1, and then tonight my machine would pull Poit.9.1.src, build it with "-Os -march=athlon-xp -mcpu=athlon-xp -mfpmath=sse,387", and install it.
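A dry-run sketch of that cron job idea, assuming a hypothetical log of recently installed packages; the plan string stands in for the real fetch-and-rebuild step (which would call emerge or ebuild):

```shell
#!/bin/sh
# Dry-run sketch of the nightly rebuild idea. The package names are
# hypothetical, and the real fetch/rebuild step is stood in for by
# appending to a plan string.
FLAGS="-Os -march=athlon-xp -mcpu=athlon-xp -mfpmath=sse,387"
recent=$(mktemp)                  # stand-in for a "recently installed" log
printf '%s\n' "Poit-9.1" "Narf-2.4" > "$recent"
plan=""
while read -r pkg; do
    # a real version would pull $pkg.src and rebuild with $FLAGS here
    plan="${plan}${pkg}:rebuilt;"
done < "$recent"
echo "$plan"                      # Poit-9.1:rebuilt;Narf-2.4:rebuilt;
rm -f "$recent"
```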
Re:Hear Hear (Score:3, Insightful)
-fomit-frame-pointer makes it a bitch to debug things - I don't like making debugging harder.
Yes, APT is a great piece of the puzzle, but it doesn't solve everything - for example, in Fedora, there was a package checked in for Bittorrent-GUI, which required a package wxPython, which exists NOWHERE on the 'Net. Smooth.
That is one thing I will give the Debian folks - you make your package work or you don't get in stable.
Package System Opinion (Score:3, Insightful)
Re:Autopackage? (Score:5, Insightful)
Except that this guy has just stated the idea and made a couple of mock-up screenshots, whereas the autopackage guys are coming up with a complete, sensible solution and are leaving the interface until the end.
Re:Autopackage? (Score:5, Informative)
For now, if it doesn't work, use autopackage.sunsite.dk [sunsite.dk] and bear with us as we fix up the broken links, etc.
Re: A-A-P? (Score:3, Informative)
A-A-P [a-a-p.org]
Re:This is Tuesday, isn't it? (Score:4, Funny)
Did you even READ the name of the project I referenced? It's called Autopackage. It takes care of that for you.
Re:Dependancies (Score:4, Interesting)
Whoa, slow down. That's not right at all. Firstly, some RPMs are compatible across distributions, but not all - and it's basically hit and miss.
The linker fixup problems are one issue yes, but to be honest these occur rarely, especially once you start stripping unnecessary DT_NEEDED entries from your binary (unnecessary -lfoo options).
Eventually we'll need to change it to be more like how Windows does it, but it's not a high priority. Fixing build tools to not dump piles of bogus (in the case of recent toolchains) -lfoo options on the compiler is a more important issue, but I have no idea how to fix this in a general way. Possibly extensions to pkg-config would need to be made; certainly this is something that will require large-scale changes to people's build systems.
I'm hoping that when autopackage is released and stable, it'll be so fantastically popular that it will motivate people to fix these myriad binary portability issues in their apps (and none of them are unsolvable or inherent in the way Linux works). We're writing a guidebook to help people do this, but it's not really released yet.
Re:Dependancies - Sandboxing (Score:3, Informative)
On Windows, the linker records which DLL a symbol comes from and you can explicitly specify it in the source/header files using some simple language extensions. On Linux, that isn't done, and worse ELF specifies unscoped symbol fixup semantics.
To rephrase the original question: if libA1.so is linked against foo_func() in libA2.so, and libB1.so depends on foo_func() in libB2.so, and binary Z links against libA1 and libB1, then the results will not be
Don't leave out Gentoo! (Score:2, Insightful)
(No, this isn't a troll, I'd really like to see that.
Re:Don't leave out Gentoo! (Score:5, Interesting)
Let's say I update my portage tree, and then I want to upgrade a package, like GAIM, for instance. GAIM's dependencies are GTK and a bunch of other stuff. When I try to upgrade my version of GAIM and there happens to be a better version of GTK available, Portage will upgrade GTK first, regardless of whether you actually need the very latest GTK to run GAIM. I'd rather see Portage know what the minimum version a dependency has to be in order to get a program running. As far as I know, it'll just upgrade everything in the dep tree.
Unless I'm mistaken, at least. I've been using Gentoo for a while now, and for the most part I just do a "emerge -u world", which takes care of me pretty nicely. It just takes a while.
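The minimum-version behavior wished for above can be sketched with a small version comparison. This assumes GNU sort -V, and the version numbers are made up for illustration:

```shell
#!/bin/sh
# Sketch of a minimum-version dependency check: only upgrade the
# dependency if the installed version is below the required minimum.
# Relies on GNU sort -V for version-aware ordering.
version_ge() {    # true if version $1 >= version $2
    [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}
installed="2.2.4"; required="2.0.6"   # hypothetical GTK versions
if version_ge "$installed" "$required"; then
    result="keep"        # installed GTK already satisfies the minimum
else
    result="upgrade"     # only now would the dependency be rebuilt
fi
echo "$result"           # keep
```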
Re:Don't leave out Gentoo! (Score:2)
Re:Don't leave out Gentoo! (Score:3, Insightful)
I can certainly relate, but so far, I haven't had any emerge world trouble on any of the Gentoo machines I've set up. Occasionally it'll fail and you'll have to wait for another portage tree to fix the problem, or you've got to go in and tweak something, but I've found the online message board to be extremely helpful in troubleshooting things like
Re:Don't leave out Gentoo! (Score:4, Insightful)
My major gripe with portage is it doesn't record USE flags - e.g. if you install links with 'USE=-X' to avoid its dependency on XFree (which moron decided that one???) then the next time *anything* decides to upgrade links it'll rebuild all the X dependencies back in.
That's a real pain when you're talking about things like the Java dependency on libdb, for example, which doesn't actually work...
Re:Don't leave out Gentoo! (Score:5, Funny)
Two years huh? How's the good ol' 386SX going?
Re:Don't leave out Gentoo! (Score:5, Informative)
"emerge gaim" will just upgrade to the needed packages, or only gaim.
Re:Don't leave out Gentoo! (Score:4, Informative)
"emerge gaim" will just upgrade to the needed packages, or only gaim.
Mod up parent please! I don't have any mod points and this concisely explains how portage works.
Re:Don't leave out Gentoo! (Score:2)
but Gentoo has stated that their distro is oriented more toward the more capable Linux types. So if you know GAIM doesn't need it, don't install the deps:
emerge --nodeps gaim
though I totally agree with the spirit of the post, and maybe portage will have that capability someday... unneeded upgrades are always risky
Re:Don't leave out Gentoo! (Score:5, Informative)
Re:Don't leave out Gentoo! (Score:5, Informative)
Basically, this is wrong. Sorry.
The "-u" parameter to emerge will make it work as you described. However, if you just typed "emerge gaim", it would only emerge the minimum required. You have to ask for the "emerge all dependencies, too" behavior.
I quite frequently emerge -up world, then just pick and choose what I want updated.
(I just checked "emerge -p world" against "emerge -up world", and "emerge -up" did significantly more packages on my system, where over 100 packages can be updated. On Gentoo, IIRC, the "world" is the list of things that you explicitly emerged; "emerge mozilla" will put mozilla in the "world" list but not any of its dependencies. So "emerge world" can update the packages you cared enough about to explicitly ask for, and -u will add all possible dependency updates.)
Re:Don't leave out Gentoo! (Score:5, Informative)
Re:Don't leave out Gentoo! (Score:5, Informative)
I don't really see a point to including ebuilds in this package manager, though, as the package manager should already be doing the work ebuilds do to maintain source packages.
Besides, portage will kick this package manager's ass any day.
Re:Don't leave out Gentoo! (Score:5, Interesting)
Funny, all I ever see is Gentoo bashing. Are we reading the same Slashdot?
Re:Don't leave out Gentoo! (Score:3, Insightful)
If you just stare at the screen and watch the code compile, then yes, that would be a waste of time. Most Gentoo users, though, are smart enough to know how to multitask. While program foo is compiling, you do something else, either on the computer, or (and here's the really shocking bit) away from the computer.
So with that in mind, please explain to me how typing
Re:Don't leave out Gentoo! (Score:3, Insightful)
The proper way to do things like this, if you must, to a production server is to have a similar machine that you can compile on, and then just transfer the binary over.
Mnyah! (Score:5, Funny)
When I was your age, we called 'em by their proper name--athletic supporters!
"Package manager", indeed...
They have it. (Score:2, Informative)
Again? (Score:5, Insightful)
And yet nothing ever changes.
Re:Again? (Score:5, Funny)
Re:Again? (Score:5, Interesting)
Not true. Red Hat's up2date supports apt repositories and the dpkg format is getting GPG signature/hash checking. From discussion late in the Fedora Core 1 beta stage it seemed that there is internal pressure to include apt in the core distro at Red Hat. Those are big changes, I think. I stopped reading the article since it's getting slashdotted, but if the author[s] can implement a single database that tracks installation by RPM, deb, and tgz, then I'd wager those features will be added to RPM and dpkg down the line. I honestly can't see either Debian or Red Hat jumping ship to a new system, but they both borrow features from each other, so why not from this too?
Re:Again? (Score:3, Interesting)
Do you have any more information regarding that? Every time I've asked on #debian I get very vague answers. Usually the argument is that all the package maintainers and contributors would need to send in their keys and sign all of the packages they contribute, which would require a lot of effort. Given the recent security issues with trojans, I think that this would be well worth the effort.
Re:Again? (Score:3, Interesting)
The only sensible way to work is with source patching, it is far easier to download smallish source patches and co
Re:Again? (Score:5, Insightful)
The nearest thing I can find to a decent package manager is 'stow' which essentially does *nothing* but hide all that hierarchical complexity behind top-level directories. A new package *format* is not the problem, it's layout and management that is the problem. Everything else is a new wrapper on the same old problem.
it's not the package format (Score:5, Insightful)
Re:it's not the package format (Score:3, Insightful)
Re:it's not the package format (Score:3, Interesting)
Please explain....? (Score:5, Interesting)
What is so special about this? It seems just eliminating the whole concept of packages would make life so much easier. Installation programs (like MSI files) are simpler, aren't they?
This is not a troll. Please answer my question, don't mod me down.
Re:Please explain....? (Score:5, Informative)
The problem is that different distros have different directory layouts, configuration file layouts, different places to put binary files, different ways of updating the internal library database etc. etc. etc.
The problem is basically a manifestation of there being more than one distro of Linux and having distro maintainers who have not agreed on a common standard for this stuff. It's Linux's major Achilles heel, IMHO.
Re:Please explain....? (Score:3, Insightful)
I would argue this, actually. Just because it's a problem that needs fixing does not make it an Achilles heel. Think about it: when we come up with an elegant solution that is cross-distro (and possibly cross-platform), it will make Linux that much stronger.
Re:Please explain....? (Score:5, Insightful)
There are already lots of them.
There's no technical reason why we can't get some people together to iron out the last differences and either create a standard package manager, or create well-defined interfaces that allow any front end to access any kind of package. However, if you did that, nobody would use it anyway.
Re:Please explain....? (Score:5, Informative)
In windows, "Add/Remove Programs" is the "Package Manager". Think back to Windows 3.11 where if you installed a program and you wanted to remove it, you had to delete the directory, find any files it dropped in c:\windows, delete them, edit your autoexec.bat, config.sys files... etc.
Since there is no uniform package manager for Linux, and a lot of stuff is just distributed as source (ie: NO package manager support, you're back to the plain old file drop method in Win 3.11), it can be kind of frustrating.
For example: Redhat, Mandrake, Suse (and others) all use RPM.
Debian uses DEB files
Slackware uses
And anything can usually be found in source format, typically with the extension
It's rather sad when you're on Redhat, and you find a package and it's either only in DEB format, or it's in SuSE RPM (which has different dependencies than Redhat, so you might not be able to use it) or
So the point is, we need something equivalent to "Add/Remove Programs" that just *works* on all linux distros.
Re:Please explain....? (Score:3, Informative)
So what's wrong with this old song and dance?
./configure &&
make &&
make install
Other than taking awhile to compile the occasional large program, it's always worked for me. As far as making a desktop linux for dummies, the idea shouldn't be to have some magic whiz-bang tool that does everything and works on every platform and.... you get the picture. If there is a massive
Re:Please explain....? (Score:5, Insightful)
No easy dependency tracking, no easy uninstall, no easy upgrade, no audit trail. On a server you don't usually want a compiler installed as it can be a security risk. It's really nice having a database of all the software installed, what versions of what other software it depends on, and a reliable way to remove it without keeping the build tree around (assuming the build system used even has an uninstall method). The only way I would feel confident about not accumulating cruft due to upgrading big packages from source (gnome, kde, X) is if they are installed 100% into a single folder (like /opt/kde/3.2/(bin|lib|conf|man|...)). Then I can safely uninstall by deleting that top version folder. Even then, I don't want to take the time downloading and compiling the source; I don't find it to be very recreational. I'd rather run `apt-get install kde` or `apt-get upgrade kde` or `apt-get remove kde`. With that remove command, it also removes packages kde depended on but nothing else does. You don't get that with source installations; you have to keep track of it yourself.
In the long run, unless you are meticulous about tracking which packages need which other packages, and where they were all installed, you are ensuring you will have to rebuild your system from scratch at some point. Package managers like APT and Yum, and even up2date, allow you to avoid this.
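One common workaround for tracking a source install is staging it into a scratch DESTDIR first, so you get a file manifest before anything touches the live system. A toy sketch, where a cp stands in for `make DESTDIR=... install` and the path and file name are invented:

```shell
#!/bin/sh
# Toy sketch of manifest tracking for a source install: stage the
# "install" into a scratch DESTDIR, record every file it would add,
# then (in real life) copy to / and keep the manifest for uninstall.
stage=$(mktemp -d)
mkdir -p "$stage/usr/local/bin"
echo '#!/bin/sh' > "$stage/usr/local/bin/foo"  # stands in for `make install`
manifest=$(cd "$stage" && find . -type f | sed 's|^\.||')
echo "$manifest"                               # /usr/local/bin/foo
rm -rf "$stage"
```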
Re:Please explain....? (Score:3, Informative)
When was that, the initial design phase? up2date was written specifically to address the problems of dependencies and automated errata installation. In Fedora Core, up2date supports APT repositories (and YUM repos as well). My rhn_applet lets me know when the fedora.us and freshrpms.net Apt repos have
Re:Please explain....? (Score:3, Insightful)
The real problem isn't package formats (as your example with Suse and RedHat shows).
It's simply that distributions are different. Pretty much any non-trivial Linux program needs a lot of libraries that are non-standard (as in: not in the LSB). Different distributions make different choices, which is probably a good thing, but it breaks binary compatibility between distributions. This has (pretty much) nothing to do with package formats.
Re:Please explain....? (Score:4, Insightful)
*Windows*
Pros
+ Install is double-click, then run through the options for install (directory to be used, full or partial install (games and such).
+ Library files are either included with windows or are installed by the program, libraries are placed in the install directory, or are placed in windows/system32. Registry entries are entered by the program, as well as menu shorcuts and/or desktop icons.
+ All packaging is handled by the programmer; from the user's standpoint, all installers function the same.
+ Anything, including drivers can be installed from a gui installer, installing video drivers does not require exiting to a CLI.
+ For most programs there is only one version, compatible across all Windows versions (occasionally, there will be a version for the 9x family and one for the NT family.) Drivers and such are for each version. Some will specify which service packs must be installed.
+ Program's files are contained in their directory in "Program files". There is one main location for installed programs (easy to find)
+ Even programs that do not use a
+ I never have to deal with dependencies, ever.
Cons
- No central repository for updated versions of software. User must download patches or new versions from the software author, or a third party. Some software provides its own update mechanism.
- Poorly coded uninstallers may leave behind libraries or files/folders and registry entries.
- Binaries may be larger than needed since they must include all libraries not included with Windows. (Somehow, Firefox is a smaller download for Windows than it is for Linux.)
- Some programs may require a reboot after installation. Some windows updates require a reboot.
Neutral
= No CLI installation or uninstallation of programs.
= Multiple installers exist (InstallShield, cab self-extractors, Nullsoft installer, sfx archives, etc.). All of these are executables that function approximately the same from the user's point of view.
= All updates to windows components are handled through windows update.
= Newer programs may save user files (profiles, saved games, etc.) in a folder in My Documents. Microsoft is said to be pushing centralization of saved files to the "My documents" folder. For example, image files are saved to "My Pictures" by default, and new saved games will be saved under "My Games".
I'd really be interested to hear what you think the pros and cons are with the current Linux methodology. I guess I'm having trouble understanding why the ease I experience with Windows programs would somehow be unfit for Linux.
Re:Please explain....? (Score:3, Insightful)
Re:Please explain....? (Score:5, Informative)
With many windows programs, the source is closed and the developer creates a binary package and controls how the program will be distributed. But with free software, many people take those source files and distribute them in whatever way works best for them -- a package is simply a way to put programs in a file for distributing to others.
If you'd like, you can think of a package as an installation program -- with modern end-user distributions the distinction is minor. RedHat, Mandrake, and SuSE all have programs that will automatically install a
But it gets more complicated than that, because of the increased complexity of the *NIX world. Certain programs depend on external libraries (think of it like a
It seems just eliminating the whole concept of packages would make life so much easier. Installation programs (like MSI files) are simpler, aren't they?
Some applications, like the Sun Java JRE, OpenOffice, and the binary NVIDIA drivers (I'm sure there are many others) have their own installation programs. It's ugly and messy and doesn't work that well compared with how each distribution handles packages natively.
To put it in more practical terms, if I download OpenOffice from openoffice.org and run their installer I see a custom installation program that they have developed. I have to answer a lot of questions about how my Linux distribution is set up and do this all in an unfamiliar environment. However if I install OpenOffice
I hope this helps answer your question.
Re:Please explain....? (Score:4, Informative)
Packages in typical Linux distributions pretty much do the same things as MSI files on Windows, except that they do much more.
1. They describe how to build from source. That is (obviously) a big deal on an open source platform, since it makes builds repeatable, and so not depending on the magical build environment of one company or person.
2. They deal with dependencies: package "foo" can dictate that it needs package "bar" to work correctly, and that it needs package "foobar" version 2.32 or higher to build. This is a Good Thing, as you don't have to find out what the dependencies are the hard way.
This causes some problems from time to time, since distribution X may not have package "foobar", but the real problem here is that distributions are different. This may also be seen as a good thing: package management is a way to deal with diversity.
3. Standardised package management in a distribution makes other Good Things possible, such as automatic installations of all dependent packages, or automatic upgrades, thanks to tools like apt and yum and the dependency information in packages. That means that you can make sure that every program on the system is up-to-date with just one command.
Another really Good Thing is that package managers allow a lot more control over installations: they know which files are installed by which packages. That makes it possible to check, say,
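The dependency resolution described in points 2 and 3 is, at heart, a topological sort; coreutils even ships tsort, shown here on a made-up dependency graph (the package names are illustrative, not a real dep tree):

```shell
#!/bin/sh
# Toy install ordering with tsort (coreutils). Each input line is a
# "dependency dependent" pair: glibc before gtk, gtk before gaim, etc.
order=$(printf '%s\n' \
    "glibc gtk" \
    "gtk gaim" \
    "glibc openssl" | tsort)
echo "$order"
first=$(echo "$order" | head -n1)   # glibc: the only node with no deps
```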
Packages (Score:5, Informative)
This has several effects. If I distribute a nonfree 10MB program UberTool, that requires the nonfree 20MB MegaLib, I'd better distribute MegaLib with UberTool. If both are free, I can distribute them seperately -- if the user already has MegaLib, he'll just install UberTool.deb. If he doesn't, the package management system will know where to grab MegaLib from, will download MegaLib.deb, and install it.
Furthermore, if I'm going from Office 97 to Office 2000, it's because I spent money on a CD, and I'm running an installer. In the free software world, upgrades are no-brainers, since they cost no money, and most free software programs are a smooth evolution, rather than major versions every several years. As a result, I'll generally be running the latest version of my office suite (as well as every other little utility on my system), and it is convenient to be able to do the upgrades all in one step (apt-get update; apt-get upgrade will grab all packages with newer versions, and install them, cleanly removing the previous ones). Most people never reinstall Debian -- I know installs from '96 that are still running today, at the latest versions, and there are almost certainly ones from before. I don't know of anyone who went from DOS/Windows 3.1 through Windows XP with just upgrades, and without a reinstall.
The next thing is Windows has a problem of bit rot. If you leave a Windows system without reinstalling the whole thing, adding and removing programs, etc. crap builds up. You get all sorts of registry keys you don't need,
The other place package management helps is in centrally-maintained networks. You can install the same package, with the same configuration settings, very easily from a centralized location.
So package management is, in effect, a fancy way to install and uninstall files. However, the fanciness buys you a lot. The new Windows installer is a form of package management, and gives some of the same advantages, although it's not yet as mature as the GNU/Linux ones (.deb has been around since at least '95, and
Lacking some key features. (Score:3, Informative)
There are still PLENTY of Windows applications that don't use Add/Remove programs. You have to find their uninstaller, if they have one. This is the same as downloading a tar.gz with the source and hoping it has a "make uninstall" target. However, free software is available to track packages you compile and the files they install. Software is probably available to help uninstall stuff under Windows too.
With Debian, I can find out all
Re:Please explain....? (Score:5, Interesting)
I think once you spent hours disassembling and debugging these "simple" installer programs to make them run on Wine you'd have a different view on the matter ;)
Let's do a quick review of how things are done on Windows:
This is sort of how autopackage works, except we do it in a much simpler way and don't rely on CORBA (the nearest equivalent of DCOM on Linux). These installers have no dependency management beyond "is this file the right version? No? replace it then" which has caused some truly horrific hacks like Windows File Protection.
Yes, Windows apps have dependencies too. Check out this list to see. [installshield.com].
MSIs "deal" with dependencies by including the least common ones inside themselves, causing huge and bloated downloads, and leaving the user to figure out random breakage for ones that don't exist (how many times have you found that an app assumes the presence of some VB/VC++ runtime lib that just wasn't there?).
They can get away with this because Windows is a platform and a whole pile of technology is guaranteed to be present. For instance, you know without needing to check that graphics support is available, because it's a part of the kernel and cannot be removed. On the server that's an Achilles heel; on the client it's an advantage (in terms of packaging).
Because there is no standard Linux platform (the LSB never caught on), and the user can basically arbitrarily add, remove or upgrade individual components as they see fit (from the kernel to the theme in use) package managers are used to manage and maintain it all. Unfortunately, because there is no standard platform, the distro becomes the platform - of which there are many.
The freedesktop.org platform effort and the LSB are both worthy steps forward in this area and I hope they pick up steam. In the meantime, approaches like autopackage, being dependency-smart and having communities of packagers are the way forward.
Re:Please explain....? (Score:3, Informative)
A package declares what other packages are required to install it. Imagine if you were installing a program that *required* IE 6.0 and Media Player 5.0: a Windows installer will start up, run, and then barf saying you need to install something. A package would allow you to determine BEFORE you start that you need other things installed.
A package lists all files to be installed, and a package manager tracks who installed what. Thus, when you encounter a file you don't recognize, you can ask
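That file-ownership query is a one-liner on most systems; a guarded sketch (which tool exists depends on the distro, and `ls` is just a convenient example file):

```shell
#!/bin/sh
# Ask the package database which package owns a file. Guarded so it
# degrades gracefully on systems with neither dpkg nor rpm.
f=$(command -v ls)                       # resolve the real path of ls
if command -v dpkg >/dev/null 2>&1; then
    owner=$(dpkg -S "$f" 2>/dev/null)    # Debian-style: "coreutils: /usr/bin/ls"
elif command -v rpm >/dev/null 2>&1; then
    owner=$(rpm -qf "$f" 2>/dev/null)    # RPM-style: owning package name
else
    owner="no package database tool found"
fi
[ -n "$owner" ] || owner="file not registered in the package database"
echo "$owner"
```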
And no mention of encryption (Score:3, Insightful)
Let's do this from the beginning rather than slapping it on later.
Re:And no mention of encryption (Score:2, Insightful)
Re:And no mention of encryption (Score:2)
OpenPKG (Score:5, Informative)
Why reinvent the wheel? (Score:5, Insightful)
APT already handles debs and rpms. tgzs should not be a far stretch. The problem is establishing standards and getting everyone to follow them. For example, all debs in the Debian archive follow the Debian packaging standard, else they would not be accepted into the archive.
Naturally, third parties are free to create their own non-conformant debs. This is just the same as someone creating an rpm for RH9, but it not conforming to the conventions used by Red Hat.
I assert that the tools already exist. I.e., we don't need a new one. The emphasis needs to be on getting people to follow the standards, and possibly creating a cross-distro standard for everyone to follow.
Re:Why reinvent the wheel? (Score:2)
I totally agree with that. But I think there should be two options for each distribution - the native one, and a cross-platform one. For filesystem layout, cross-platform packages should assume everything defined in the FHS - Filesystem Hierarchy Standard [pathname.com] (part of the LSB [linuxbase.org]). And the distributions shoul
Re:Why reinvent the wheel? (Score:3, Insightful)
Sure, there's scads of obscure software lying around out there in TGZs. But the fact that it's still a TGZ woul
Re:Why reinvent the wheel? (Score:2, Insightful)
Re:Why reinvent the wheel? (Score:3, Insightful)
apt-get update && apt-get install 'package' - how hard is that? Cron a weekly or nightly update and you don't even need to look at it except to install new stuff, and you're always patched.
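As a crontab entry, that nightly job might look like this (the time, the -y flag, and the log path are arbitrary illustrations, not a recommendation):

```shell
# hypothetical /etc/crontab entry: refresh package indexes, then upgrade
30 4 * * * root apt-get update && apt-get -y upgrade >> /var/log/auto-apt.log 2>&1
```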
Only one easier is urpmi [for Mandrake], which also has a GUI interface available.
There are tools, maybe we need to just pick one and go with that. Apt works on mu
Well then build it... (Score:5, Insightful)
A good installer is not hard to accomplish if the desire for it really exists. It is, however, one of the most overlooked things where open source programs are concerned.
Don't make me go hunting down 20 dependency packages but offer to install them for me. A simple script based on wget can do that...
Re:Well then build it... (Score:2, Informative)
c.f. apt, apt-get
Unifying the Packages (Score:5, Funny)
Unify those packages.
I am so often confused that RedHat comes in a red box and SuSE in a green one. - Which of those should I buy?
And Fedora comes with a box you have to fold yourself...
Oh you mean these packages....
(Fedora Linux is included in the RedHat magazine - which has a foldable page for creating a suitable box)
*BSD ports system? (Score:5, Interesting)
ps: BSD trolls are dying!
Where ports excels.. (Score:5, Informative)
Also, it is pretty easy to make a custom "ebuild" file (which is a shell script) in Gentoo, and relatively difficult to create a new
There is also a lot less political activity in Gentoo, and they seem to Get Things Done.
0Install (Score:5, Informative)
Re:0Install (Score:3, Insightful)
[snip]
Link your app statically, or include all your dependencies in your package. That is the only solution to this problem.
Um, you realize that both of those options basically mean "send all your dependencies with the package"...which doesn't decrease the download time at all? Particularly if more than one program does it? (you can end up installing 50 copies of part of libXYZ.a instead of one shared copy)
Daniel
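That cost is easy to put numbers on. A back-of-envelope sketch (the 2 MB library size and 50-app count are illustrative, not measured):

```shell
#!/bin/sh
# If 50 programs each bundle their own 2 MB copy of libXYZ instead
# of sharing one installed copy, the duplicated bytes add up fast.
apps=50
lib_mb=2
bundled=$((apps * lib_mb))   # every app ships its own copy
shared=$lib_mb               # one shared copy on disk
echo "bundled: ${bundled} MB vs shared: ${shared} MB"
```

And that's just disk; the same multiplier applies to every download.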
Alien? (Score:2)
TGZ is a package format (Score:2)
for what it is but it ain't no package format.
Re:TGZ is a package format (Score:3, Informative)
When you use pkgtool to install from the tgz file, the package database is updated with info about the package that you just installed. It can also check for dependencies, etc.
What we all need ... (Score:3, Interesting)
Essentially there would be some glue between the package management system, the "configurator", and the actual config file.
Sunny Dubey
So Yesterday.... (Score:5, Funny)
Learn from Apple (Score:5, Insightful)
Re:Learn from Apple (Score:3, Insightful)
I agree. I've never used Apples much, but when I got a bit of software from someone and took it to my buddy's place to use on his computer, I just had to copy the directory into the Applications dir. It worked great, right away. It's a good idea for more than that, too: you can insist every application maintain its own configuration files and directories, and leave /etc (does OS X have one of those?) for operating-system-only stuff.
cool indeed.
Re:Learn from Apple (Score:4, Insightful)
The Apple way rocks; the directory structure doesn't matter: you can execute any properly written application from anywhere. There are two problems with this in Linux. Firstly, most applications are written with full path names in the code for the files they reference. Secondly, Linux applications are built around a wide range of supporting libraries, few of which are standard to all distributions - this means either huge packages, or a dependency resolution mechanism like apt.
A linux distribution which wanted to do this would have to firstly pick a core set of standard libraries, then port all the applications they wanted to support.
Re:Learn from Apple (Score:4, Insightful)
Nope, they can do it because they have a standardised platform. All MacOS X apps have at least one implicit dependency - on some version of MacOS. That one dependency is very easy (well, assuming you are rich) to satisfy, but it gives you a whole pile of functionality.
Because desktop Linux is evolving at such a rapid rate, and because nobody controls it, and because people like being able to control what's on their system to a fine degree, we have no such platform currently.
Unfortunately the first attempts at such a platform foundered when KDE made the bum choice of using a toolkit with a problematic license. Gnome was started in reaction, and now we have two, with neither being widely spread. The Gnome guys are busy moving infrastructure out into non-Gnome libraries like GTK, Cairo, and DBUS that are explicitly designed to be acceptable to everybody, but unfortunately the KDE guys are still having big internal arguments about whether freedesktop.org (the new attempt) is "forcing" things upon them (see the discussions at kdedevelopers.org).
This is unfortunate because a standard platform that you could depend upon with only one dependency would go a long way towards making software on Linux easier to install.
Re:Learn from Apple (Score:5, Informative)
Isn't this what alien does? (Score:3, Interesting)
Good theory, poor practice (Score:5, Insightful)
There's also a matter of versions and security updates. On Debian, I run 'apt-get update; apt-get upgrade' and have a new version. Since the packages are all maintained by the Debian project (and a few smaller projects that target Debian), this works. Versions aren't linear -- Debian back-ports security fixes. The package manager has no way of knowing whether kernel-2.4.24 is newer than kernel-2.4.19-with-patches.
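That ambiguity is easy to demonstrate with a naive version comparison (a sketch using GNU coreutils' sort -V; the version strings are made up):

```shell
#!/bin/sh
# A purely numeric version comparison says 2.4.24 is newer, but it
# cannot know that the distro's patched 2.4.19 already contains the
# backported security fixes.
newest=$(printf '%s\n' "2.4.19-with-patches" "2.4.24" | sort -V | tail -n 1)
echo "naive winner: $newest"
```

This is why distro security trackers compare against their own package revisions rather than upstream version numbers.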
Basically, there is no clean way to install
There is room for improvement in package management -- a really good GUI for finding and installing packages would be nice. I wouldn't mind having more information about the packages I'm about to install -- links to project web pages, the ability to browse installed files (the packages.debian.org/freshmeat.net/etc. databases either installed locally or quickly accessible from the system), the ability to view screenshots of GUI programs, etc. There's a lot of metainformation that could be added, and better search functionality that could be implemented.
At the same time, on the package build side, it'd be pretty simple to have a system where you make a configuration file of information about the package, and it builds
The last solution is to have the groups work together to make sure all packages have the same set of metainformation (more than is needed for any given package system), so that cross-platform package installs become possible. In practice, I don't see this scaling across versions, as package management systems evolve.
One more thing to bear in mind is the perspective of the author of the article -- he says he runs Slackware and builds most packages from source (something I stopped doing maybe 3-5 years ago). Slackware's package management tools are very basic, manual, and crude. That gives a very different attitude towards package management than someone running a distribution like Red Hat, which has a much heavier-weight, more technologically advanced, but somewhat fragile and inflexible package management system, or a user of Debian, which has a state-of-the-art, ubermaintainable, uberupgradeable package management system that primarily relies on grabbing packages from one (or a small number) of sources. I apologize for the stereotypes in this paragraph -- they're not entirely true, but the package management systems differ a lot (more than most people realize; if you've ever tried to build the packages, you know), and I'm just trying to make people aware that users of each will have a very different world view, which is important to keep in mind when reading these articles.
I've got a much better idea. (Score:5, Funny)
The amount of time and money that's been wasted on this problem for over twenty years in the unix world is just mind-boggling. We really do not need to reinvent this wheel again.
KDE + Gnome (Score:4, Insightful)
If we get KDE and Gnome to work together, then we might also, eventually, get an installer!!!
PLEASE!!!
Let KDE 4.0 and Gnome 3.0 be the same!!!
A few things: (Score:5, Insightful)
And for that matter, why not make the installer intelligent about the distro? Use a single package/installer that includes all sorts of scripting information about installation in various circumstances. The installer checks to see if it's on RH9, and if so it puts files where RH9 expects them, editing any configurations and making RPM database entries as necessary. If it's on Debian, it takes the appropriate measures there. And so forth.
Why do we see such absurd dependencies that don't seem to happen in the windows and mac worlds? Install a new version of a KDE app, and you need the latest minor revision to the core KDE libs, which in turn requires a minor upgrade to the font server, etc. In the windows world, occasionally you need to update something big like DirectX to install a latest-and-greatest app, but even then the dependencies are often packaged with the app itself. Why isn't this practice more common in Linux/Unix (not counting Mac OS X)? I understand that many of these apps are under CONSTANT quick-release development and are often tied to bleeding-edge versions of their libs, but why aren't major releases at least more dependency-friendly? Installing an app can be a real pain in the ass even with something like apt, if you don't have the dependencies in the repositories you've defined. And adding new repositories isn't exactly grandma-friendly.
Re:A few things: (Score:4, Insightful)
First, MSIs typically run executable code and/or scripts to perform the install, while packages usually contain just a list of files and locations, with no scripting required. This is important: a security-conscious administrator won't want to run an MSI-like package, since nothing stops it from doing rm -rf /. It also makes packages easier to create and maintain. Technically, an RPM or DEB can run scripts, but it is uncommon, and you can easily tear them apart and see the scripts they run with a few simple commands.
Second, MSIs don't have a central database where all the file information is stored. There is some use of the registry for this, but it isn't quite as good. With RPM/DEB, I can ask the package manager: "What package does /usr/lib/libfoobar.so.1.2.3 belong to?" and "What versions of libfoobar.so are installed, and where?" These are just a few simple examples; lots more can be done.
As far as the granny meter goes, I rate APT distributions as easier than MSI, because most MSIs display between 3 and 10 screens where you must click "Next" or scroll down and click "I agree." As a consultant to many such people, I have been amazed that many of them cannot get through these installs! An RPM/DEB install with APT handling dependencies does not have any prompts - you just click it, the progress bar goes by, and it is done!
Fundamentally, Linux packages don't have the file-overwrite conflicts that Windows packages have. Windows installers commonly overwrite shared files. By splitting dependencies into separate packages, that DLL hell should go away: nobody overwrites another package's files. But instead we get it in another form, because dependencies aren't stated properly in the packages. I just hope that things will improve over time.
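The package file database also makes conflicts detectable before install rather than after. A toy sketch with two hypothetical package file lists:

```shell
#!/bin/sh
# Each package declares the files it owns; a manager can refuse to
# install two packages that claim the same path. pkgA and pkgB are
# made-up examples.
printf '%s\n' /usr/lib/libfoo.so.1 /usr/bin/foo > pkgA.files
printf '%s\n' /usr/lib/libfoo.so.1 /usr/bin/bar > pkgB.files
conflicts=$(sort pkgA.files pkgB.files | uniq -d)
echo "conflicting paths: $conflicts"
```

This is essentially the check rpm performs when it reports a "file conflict" instead of silently clobbering another package's file.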
Dead on, bro. Remember when Microsoft added "self-healing" to Windows XP because so many companies were making crap installs? Deleting DLLs, installing dupes, putting them in the wrong directories. This is the exact same problem Windows has/had, but in RPM & DEB form rather than EXE & MSI form. It seems like coders just don't get package management. And if you look at APT, it even has a healing-like feature now!
/usr/local (Score:2)
Once in a while I have to copy something manually into
Of course, it helps that for almost anything I want to install, my distribution includes it or at least includes the dependencies.
Standards (Score:5, Insightful)
Standards are not a price, they are an investment. I use standard XHTML, CSS, and SVG in my web design because I care about the future of the web. Besides, if a standard is well-designed (like W3C recommendations tend to be), it actually makes development and maintenance easier. Anyone who has migrated from HTML 3 (or some nonstandard IE/Netscape HackML) to HTML 4 or XHTML with CSS knows what a pleasure it is to work with modern hypertext (and probably also has an abiding and bitter hatred for IE). The same could be true of package installation in Linux if the standard is well-designed.
Is this such a good idea? (Score:3, Interesting)
Total package compatibility would most likely lead to someone using Red Hat trying to install a debian package, and then getting frustrated, confused and pissed off with the inevitable failure due to the entirely different internals of Debian and Red Hat.
Unfortunately, it is a sad fact of life that Linux distros are deviating from the once common base they shared. An example of this is Mandrake - I used to use Mandrake around versions 6 and 7 and quite often installed Red Hat rpms successfully. However, as those crazy French spend more time tweaking Mandrake in weird and wonderful ways, it becomes further and further removed from Red Hat. Sure, they both use the
All of this leads me to conclude that perhaps, rather than concentrating on unifying packaging, we should instead give each major distro its own deliberately incompatible packaging system. IMHO, it would be much easier for a newbie to distinguish between what will and won't work if they were guaranteed that an rpm would ALWAYS work on Red Hat, and some other kind of package (MPM?) would ALWAYS work on Mandrake....
I just want more stuff for Solaris (Score:3, Interesting)
If you feel you have some ideas that could help with this mini project, feel free to join in!
In Windows Hell... (Score:3, Interesting)
Too much trust is placed in the installation program getting it right, and no built-in way is available to check if dependencies are broken.
You can ding the different distributions -- and quite rightly -- for package problems though in comparison most of them are dreams when stacked up against Redmond's latest offering.
Gentoo and Portage (Score:5, Informative)
As usual I'll come out with my Gentoo Zealotry but I'd like to deflect some of the problems I'm seeing mentioned here.
Gentoo is a Linux distribution largely centered on the Portage package manager (there are other features of Gentoo, but Portage is by far the most conspicuous).
Portage is a package manager loosely inspired by FreeBSD's ports system. Portage maintains a global software configuration file called make.conf, which holds meta-configuration settings about your system. Since Portage builds all programs from source for your machine, make.conf is where you describe your machine to Portage. make.conf also holds a collection of USE flags: global binary switches. Each has a default value if unspecified; including a flag (e.g. USE="java") turns it on, and including -flag (e.g. USE="-java") explicitly disables that feature everywhere Gentoo recognizes it.
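A hypothetical make.conf fragment illustrating the above (the values are examples only, not recommendations):

```shell
# Example /etc/make.conf fragment. CHOST/CFLAGS describe the machine
# Portage builds for; USE flags toggle optional features globally.
CHOST="i686-pc-linux-gnu"
CFLAGS="-march=pentium4 -Os -pipe"
CXXFLAGS="${CFLAGS}"
# Turn java support off everywhere, turn X support on everywhere:
USE="-java X"
```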
I see complaints that "emerge vi" tried to build X, and that Portage is thus "smarter" than you as a sysadmin. This is patently false and ignorant. Portage lets you do your job as a sysadmin once and then never worry about doing it again. If you do not want X on a machine, you need merely put "-X" in your USE flags.
It puts control in your hands. If you want an application built to support certain things, you can have it. If you explicitly do not want to support other things, it will do that too. It defaults to doing what's sensible for most people who use Linux casually. If you aren't a casual user, spend a week or so getting familiar with Portage and its configuration. emerge is an incredibly potent tool. All of my systems are patched automatically every day, from source, with the configuration I have specified for each system. My binaries are all built with -march for the CPU, and -Os. And I've never once had a system fail because of misconfigured dependencies. They stay up to date and I don't have to worry about it.
If you want to do all your dependency checking yourself, you're welcome to. However there's a good solution that takes care of all of the issues revolving around this available, freely, to the world. Click here to find out more about it. [gentoo.org]
We need complete separation of data and programs. (Score:3, Insightful)
Packages should contain data. That's it. Leave all the executable work - dependency checking, startup scripts, etc. - to the package manager.
Letting packages run their own stuff during installation is the root of all the issues associated with uninstalls.
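A toy sketch of that separation, using a hypothetical one-line-per-file manifest format ("source destination"): the package ships only data, and the only code that runs is the manager's own copy loop:

```shell
#!/bin/sh
# The "package" is pure data: a payload file plus a manifest.
mkdir -p pkg staged
echo "hello" > pkg/foo.conf
echo "pkg/foo.conf staged/foo.conf" > pkg/manifest

# The "package manager" is the only thing that executes anything;
# uninstall is just the same loop with rm instead of cp.
while read -r src dst; do
    cp "$src" "$dst"
done < pkg/manifest

cat staged/foo.conf
```

Because the manager knows every file it placed, a clean uninstall falls out for free, which is exactly the property ad-hoc install scripts lose.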
Egads, so much effort (Score:3, Interesting)
Too many package formats, too many window managers, too many GUI toolkits, too many desktop environments, too many Linux distributions, etc, etc.
I like choice, I really do, but this is madness. Not only is a great deal of time used to create competing software (KWord, AbiWord, OpenOffice), but now we're creating more work for ourselves by trying to integrate it all (package managers, Red Hat trying to unify GNOME and KDE, etc). Wow, this can't be good.
How is all of this going to compete with entities that have a more focused approach? I believe the only reason why anything has gotten done at all is because there's just so damn many people working on things. This causes serious talent dilution though. Things are nowhere near as good as they could be (or could have been).
This is quite disturbing.
It's interesting to note the areas where this hasn't happened. Just try to create a competing standard to HTML, XML, SQL, or OpenGL (note that I'm talking Linux/FreeBSD, etc, not Windows). Not that people don't try, but they never gain momentum. I have to think that if there were an ANSI, ISO, or whatever standard desktop environment, that would help. I seriously doubt something like that could be done in a reasonable time; I'm just saying it might help.
Re:Gentoo Portage is closer than anything. (Score:2)
At first I thought that was a typo... that it's called Emerge or something. But no, it really is named Eshit ("merde" is French for "shit"). Weird Gentoo people.
Re:An idea (Score:3, Interesting)
Re:Distribute deps (Score:4, Insightful)
It is possible to fake this, we do it with our commercial Linux software, and it is obvious that other commercial software does this:
*All* files for your software are dumped into a single directory, so installation and removal are trivial (this is similar to OS/X app directories). In that directory is a single "wrapper" executable, usually with the name of your program, dependent only on glibc. The main thing this program does is read
We then install Gnome and KDE shortcuts on their start menus directly pointing at the full pathname of the executable. Anybody who wants to run it from a shell has to type the full path or symbolically link it themselves into a directory on their path. Deleting our software leaves these start menu items and any symbolic links pointing at nothing, which as far as I can tell is not a real problem.
Intelligent users can easily peek into our directories and see that we ship libtcl and some others in an attempt to hide system differences. They can rename or remove these to better integrate the software into their system. But at least it works, first time, for any users.
I would like to see Linux altered to handle this cleanly, by adding direct support for "app directories". Unlike OS/X, I think they should be named with no extension. The rules are simple: if the user tries to run "foo" and it is a directory, the system instead runs "foo/foo", after first adding foo to LD_LIBRARY_PATH, fixing up argv[0] to be the full expanded path to foo, and possibly making other fixes for stupid Posix rules. This would get rid of 99% of the "install" headaches.
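A rough user-space approximation of that rule, sketched in plain shell (the foo/foo.bin layout is hypothetical; here foo.bin is just a stand-in script rather than a real glibc binary):

```shell
#!/bin/sh
# Build a throwaway "app directory": foo/ contains everything,
# including a wrapper that sets up the environment and execs the
# real program.
mkdir -p foo
cat > foo/foo.bin <<'EOF'
#!/bin/sh
echo "running from: $(dirname "$0")"
EOF
chmod +x foo/foo.bin

cat > foo/foo <<'EOF'
#!/bin/sh
# Wrapper: resolve our own directory, expose it to the dynamic
# loader, then exec the real binary with all arguments.
APPDIR=$(cd "$(dirname "$0")" && pwd)
export LD_LIBRARY_PATH="$APPDIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$APPDIR/foo.bin" "$@"
EOF
chmod +x foo/foo

./foo/foo
```

The wrapper trick is roughly what the commercial installers described above ship today; kernel-level support would make the wrapper unnecessary.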