
CDE — Making Linux Portability Easy

Posted by timothy
from the namespace-collision dept.
ihaque writes "A Stanford researcher, Philip Guo, has developed a tool called CDE to automatically package up a Linux program and all its dependencies (including system-level libraries, fonts, etc!) so that it can be run out of the box on another Linux machine without a lot of complicated work setting up libraries and program versions or dealing with dependency version hell. He's got binaries, source code, and a screencast up. Looks to be really useful for large cluster/cloud deployments as well as program sharing. Says Guo, 'CDE is a tool that automatically packages up the Code, Data, and Environment involved in running any Linux command so that it can execute identically on another computer without any installation or configuration. The only requirement is that the other computer have the same hardware architecture (e.g., x86) and major kernel version (e.g., 2.6.X) as yours. CDE allows you to easily run programs without the dependency hell that inevitably occurs when attempting to install software or libraries. You can use CDE to allow your colleagues to reproduce and build upon your computational experiments, to quickly deploy prototype software to a compute cluster, and to submit executable bug reports.'"
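Based on that description, a typical CDE session would look roughly like the following. The command names follow Guo's screencast, but treat the exact invocations and file names as illustrative rather than authoritative:

```shell
# On the source machine: run the program once under CDE's supervision.
# CDE traces the process and copies every file it touches -- the binary,
# shared libraries, config files, fonts -- into a cde-package/ directory.
cde python my_experiment.py --input data.csv

# Ship the self-contained directory to the target machine.
scp -r cde-package/ user@otherhost:

# On the target machine: re-run the command from inside the package.
# cde-exec redirects file accesses into the packaged filesystem tree,
# so nothing needs to be installed and no root access is required.
cd cde-package && ./cde-exec python my_experiment.py --input data.csv
```

The only hard requirements, per the summary, are a matching hardware architecture and major kernel version on the target machine.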
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday November 12, 2010 @08:46PM (#34212490)

    If those libraries are GPL or LGPL, then when you deliver the binary of the library, you must also deliver the source or an offer to deliver the source, and you must also deliver a copy of the (L)GPL, as part of the CDE. Is this done?

  • by h4rr4r (612664) on Friday November 12, 2010 @08:47PM (#34212494)

    Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason; use them. Do not bring the errors of Windows to us.

  • by tommeke100 (755660) on Friday November 12, 2010 @08:48PM (#34212504)
    This sounds like an easy way to copy installed proprietary software?
  • by h4rr4r (612664) on Friday November 12, 2010 @08:50PM (#34212516)

    Wow, static linking, did anybody for even a second think it is kinda weird to have the same lib on the machine over and over and in every old exploitable version you can find?

  • by maestroX (1061960) on Friday November 12, 2010 @08:52PM (#34212532)
    he could always use tar
  • by countertrolling (1585477) on Friday November 12, 2010 @09:05PM (#34212598) Journal

    That would all be nice if not for each program replacing shared libraries with its own and breaking the other programs. A program should not disturb the system or mess with its libraries. I prefer it remains as isolated as possible, where it can do the least damage.

  • by h4rr4r (612664) on Friday November 12, 2010 @09:12PM (#34212652)

    That method guarantees security problems. Applications and their dependencies should be managed by a proper package management system.

  • by Haeleth (414428) on Friday November 12, 2010 @09:15PM (#34212670) Journal

    I am still waiting for Gnome or KDE to catch up with the efficiency and usability of these older environments.

    KDE is getting closer now that it's possible for the desktop menu to present a list of applications rather than a handful of useless wallpaper-changing commands, but both major environments seem to be stuck on the stupid Windows 95-derived taskbar paradigm. Give me spatial management of running applications dammit! I want to develop muscle memory, not scan slowly across a list of tiny icons that are never in the same place twice.

  • by Haeleth (414428) on Friday November 12, 2010 @09:24PM (#34212722) Journal

    That would all be nice if not for each program replacing shared libraries with its own and breaking the other programs.

    Do, please, show me just one widely-used program that does this on a recent UNIX or Unix-like platform.

    A program should not disturb the system or mess with its libraries.

    Right. That's why you should put programs you install under /usr/local, not straight under /usr. Or of course many programs like to be installed in their own self-contained directories under /opt, which is, er, basically exactly what you're asking for and has been common practice for decades.
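    For anyone unfamiliar with the convention being described, a source install that stays out of the distro-managed /usr tree looks something like this (the package name is made up for illustration):

```shell
# Unpack and build, keeping the install out of the package manager's way.
tar xzf foo-1.2.tar.gz && cd foo-1.2
./configure --prefix=/usr/local   # or --prefix=/opt/foo for a fully
make                              # self-contained directory
sudo make install

# With an /opt-style prefix, everything lives under one directory,
# so removal is simply: sudo rm -rf /opt/foo
```

That per-directory layout under /opt is exactly the "isolated program" model the parent is asking for.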

  • Re:Limitations (Score:3, Insightful)

    by h4rr4r (612664) on Friday November 12, 2010 @09:29PM (#34212764)

    You could just use SElinux, which would already let you do that.

    This looks like a solution looking for a problem.

  • by countertrolling (1585477) on Friday November 12, 2010 @09:29PM (#34212766) Journal

    I prefer to avoid the disagreements over what is a "proper package management system". In fact with each program in its own "sandbox", protected from each other, I see better security.

  • Re:It's About Time (Score:3, Insightful)

    by CannonballHead (842625) on Friday November 12, 2010 @09:32PM (#34212794)

    ... so isn't this basically just a way to gather all the appropriate dependencies and put them all into one spot?

    Hm. I guess this is a way to do it without installing. Reading comprehension fail...

    Still, I can see how it could be useful in some situations, just like having certain programs that don't require installation on Windows can be helpful.

  • by icebraining (1313345) on Friday November 12, 2010 @09:35PM (#34212808) Homepage

    Firstly, you don't need 5000, you need 4 or 5 for the most used distros. Ubuntu, Fedora, OpenSuse, Debian and Red Hat. Let the others figure it out from a tar file.
    And if a company like Skype can produce those packages, so can e.g. Adobe.

    Secondly, that already exists [wikipedia.org].

  • by h4rr4r (612664) on Friday November 12, 2010 @09:35PM (#34212812)

    It is also sure to piss off users who now have to have another Documents directory for each application.

    Else my bad application could edit a document that another application that relies on an old outdated insecure library uses.

    I am starting to think you are not thinking this through.

  • by Kjella (173770) on Friday November 12, 2010 @09:41PM (#34212842) Homepage

    For packages provided by the distro, it makes sense to have them all use their complex dependency tree. For installing some other version side by side, this sounds like a great tool. The problem with dependencies is that often a pebble turns into an avalanche by the time you're done. If you want the new version of *one* KDE app, it can drag pretty much the whole of KDE and every library they in turn depend on with it in an upgrade. I've had that happen and ended at 450MB to download and install, and that would pull almost all packages out of LTS support.

    From the user's point of view it's completely illogical to upgrade the whole system just because you want a new feature in amaroK 2.4 while your distro only packages 2.3; you expect one application to install or upgrade independently of any other application. That does not happen with Linux. It is not just about new library versions: via dependencies you pull in unwanted version upgrades. As for security, I'd rather have one potentially insecure package on my system than pull most packages out of support; that probably opens up more vulnerabilities than it prevents.

    I wouldn't want to run a dozen applications like that. But if it's one or two? I got no problem taking the extra overhead of a bit more memory use. And honestly, a lot of software I use isn't in contact with the "outside world" as such. Even if there is an exploit in a library, I'd never open any file crafted to exploit it. Obviously it is good in general to patch stuff, but it's not always that critical...

  • by Junta (36770) on Friday November 12, 2010 @09:45PM (#34212846)

    Dear god no.

    I do not want to execute installshield or any similar crap/wizard for every little thing I install.

    I do not want to have a system tray/task manager full of two dozen vendor's update checker processes, each individually bugging me about how I'm running WidgetFoo 1.8.1.20.1.3, and it is critically important that I execute WidgetFoo's custom one-off graphical update wizard with 3 or 4 pages to click through to get to 1.8.1.20.1.4. Then rinse and repeat once per app instead of knocking them out in one shot/dialog/icon/process.

    I do not want each application to bundle their ancient ass directx library or ancient library from visual studio or any other similar crap.

    Windows installs were not historically 'easy' due to any effort on MS's part (installshield and friends made an entire business out of covering for MS' lack of help, even as MSI matured into a usable solution). Linux (specifically Debian) really got this right first. Apple recognized that model and made it a great success on the iPhone, setting the tone for all of modern mobile devices. Debian did it right first and never gets the credit.

  • by countertrolling (1585477) on Friday November 12, 2010 @09:49PM (#34212856) Journal

    That's why you should put programs you install under /usr/local, not straight under /usr.

    Not the issue. That's a given. It's when I suddenly find out I don't have some bizarre version of gtk, or ncurses (great name, because that's what I'm doing when I find it missing), and I'm suddenly without internet, it gets a bit tense. I prefer the portability over raw efficiency. It is far and away one of the best things about a Mac. I can take something as bloated as MS Office or Photoshop straight from one machine to the next.

  • by bmo (77928) on Friday November 12, 2010 @10:07PM (#34212946)

    Where something like this "CDE" might be handy is for software that is not in the package manager. Suppose you've written a program that is only of interest to a handful of users. There's no way it's going to find package maintainers for every major distro, and your users might not be happy building from source code.

    So do the packaging yourself. It's not hard. And when you're done, you have something sitting in the RPM or DEB database with all the others so you can keep track of it.

    There are also classes of software that are not allowed in the main repositories for some major distros like Fedora and Debian. For example, the authors of indie games might want to let Linux users play without making the whole game open source. Even if they open-sourced the engine, some distros will not permit it in the repos if its only use is to access non-free data.

    So set up your own ppa (or rpm equivalent) repository. Your customers can add the repository to their list and then keep track of the package. You seem to be under the impression that repositories are only for "approved" software or that package managers can only handle a small number of entries. I have over 150 entries in /etc/apt/sources.list. Adding another one is no big deal. You also seem to think that licensing issues affect what you can put in a repository. It doesn't matter if you have your own repo. You could put commercial software in there, like Sun/Oracle with their VirtualBox.
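    Concretely, pointing apt at a third-party repository is a couple of lines (the repository URL, key, and package name here are made up for illustration):

```shell
# Add the vendor's repository and its signing key, then install normally.
# From then on, the vendor's packages upgrade through the same apt
# mechanism as everything else on the system.
echo 'deb http://repo.example.com/apt stable main' | \
    sudo tee /etc/apt/sources.list.d/example.list
wget -qO - http://repo.example.com/apt/key.asc | sudo apt-key add -
sudo apt-get update
sudo apt-get install example-app
```

This is exactly how Skype, VirtualBox, and various other commercial vendors distribute for Debian/Ubuntu today.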

    Package management and repositories as they exist in the Linux world are better ways of handling the distribution of software both free and commercial than anything else I've seen on any platform.

    This "CDE" doesn't solve any problems, but introduces its own "DLL hell".

    --
    BMO

  • by h4rr4r (612664) on Friday November 12, 2010 @10:29PM (#34213044)

    File extensions mean something?

    Wow, that takes me back. That is another big windows flaw you bring up there.

  • by tepples (727027) <tepples&gmail,com> on Friday November 12, 2010 @10:30PM (#34213054) Homepage Journal

    apt-get install foobar

    This brings four problems that I can think of:

    • foobar is not available in the predefined repositories because it does not match the political views of the operating system's publisher. Some GNU/Linux distributions such as Fedora package only free software, and not all classes of application are amenable to distribution as free software.
    • foobar is not available in the predefined repositories because nobody has packaged it for your distribution. This could be because the maintainer hasn't yet learned how to package, because the maintainer has never had a chance to travel to get his PGP key signed by more established developers, or because the maintainer uses a different distribution from you.
    • foobar or a library on which it depends is in the "community-maintained" part of the repository, not the "core" part, and a change to an underlying system library broke the application. This is the case with, for example, games that use the Allegro library after Ubuntu started using PulseAudio, which doesn't support the audio sample format that Allegro has used for years.
    • The feature that you seek was not introduced until a later version of foobar than the version included in the predefined repositories. The program hasn't changed since the package freeze over a year ago except for backported fixes to security defects.
  • Re:It's About Time (Score:3, Insightful)

    by visualight (468005) on Friday November 12, 2010 @10:33PM (#34213070) Homepage

    I've spent time on work like this myself, even used code like this in production... it's really, really hard to be sure you've got everything and don't have unnecessary deps, particularly when you've got scripts (that you didn't write) that call scripts that call scripts.

    I know a few others that have invested time in this also, when you spend much time on a cluster that you don't own eventually you at least think about doing this.

  • by Hach-Que (1524899) on Friday November 12, 2010 @10:36PM (#34213082)

    While I can see this being useful for scientists, it's probably not going to be so useful in an end-user environment (not that it couldn't be used in one). For starters, this is really only good for command-line packages: GUI applications, the ones that end-users are most likely to use, would end up including most of the X and UI libraries during CDE detection, even if the files on the target computer are compatible with the machine.

    The other issue with having different projects for different distribution purposes (Klik for applications, CDE for scripts, etc.) is that it induces fragmentation; the very thing these projects are trying to get rid of by no longer relying on the package management system.

    Disclaimer: I work on AppTools (http://code.google.com/p/apptools-dist) and therefore I'm biased :)

  • Re:It's About Time (Score:5, Insightful)

    by ozmanjusri (601766) <aussie_bob@[ ]mail.com ['hot' in gap]> on Friday November 12, 2010 @11:00PM (#34213214) Journal
    I'm not sure why you're getting "Troll" mods for this.
    1. It does allow you to have outdated and insecure software where it is likely to be used.
    2. Windows does lack proper package management.
    3. There are better ways to achieve the goal.
    4. It isn't a good idea to turn Linux into Windows. In fact, most mainstream OSs are switching to package managers.

    Guo comes from a Windows background (he interned at Microsoft last year), so it's understandable why he might have a Windows perspective. That doesn't make it good for Linux to adopt that mindset.

  • by countertrolling (1585477) on Friday November 12, 2010 @11:11PM (#34213252) Journal

    In addition to the AC's very valid point below, I might have an app that requires a previous version of an installed library. Remember, the issue is portability. You can't have it any other way. Copying a program to any machine and running it should be just as simple as doing the same with any text document.

  • by countertrolling (1585477) on Friday November 12, 2010 @11:22PM (#34213280) Journal

    If a program needs its own BSD-style jail to move from one machine to another, then perfect. I'm not worried about space. I'll probably run the program from the removable drive itself, so I don't even need to copy it over. Hmmm... kinda like having an old Atari game cartridge.

    Convenience, reliability, and simplicity are the goal here.

  • Re:It's About Time (Score:3, Insightful)

    by ischorr (657205) on Friday November 12, 2010 @11:37PM (#34213370)

    Why is this a kludge?

    App portability and dependency problems have been one of the Achilles' heels of Linux since, well, forever. We laughed at Windows for DLL hell, and if anything package managers seem like the ugly kludge way to resolve it to me. I wonder how many tens or hundreds of thousands of man-hours have been lost dealing with these sorts of issues. It's by far the #1 thing that's prevented me from using Linux for those purposes, and I'd REALLY like to use Linux (though there are others). Hell, it's the main thing that's kept me from taking Linux seriously outside the server room. Particularly since people really don't seem to get why this is a problem.

    - I should be able to install an application QUICKLY and easily. There's no reason why "installation" should be more complicated than "copying/extracting the binaries to wherever I want them to go".
    - I should not be dependent on some third party to get around to porting each version of software to my flavor of Linux. When a new version of Wireshark or VLC or whatever software comes out, I should be able to install it *quickly and simply* without waiting on package maintainers to get around to it (even if some are very responsive)
    - Along with the above, I shouldn't be in a state where I can no longer easily install applications because I'm using a somewhat older distribution (and packages are no longer being maintained for that version)
    - Although the option should be there, it should never, ever, ever, ever, ever, ever, ever, EVER be considered "acceptable" to expect users to compile an application to install it (and all the potential headaches of getting that to work, including hacking make files, dealing with dependencies, patching the software, actually doing C++ debugging, etc).

    Balloon up my applications with static libraries. Please. The trade-off in system administration headache would pay for itself 100s of times over.

  • by grumbel (592662) <grumbel@gmx.de> on Friday November 12, 2010 @11:53PM (#34213450) Homepage

    Package mangers exist for a reason, use them.

    Except that distribution-specific package managers do *nothing at all* to address the problem of distribution-independent binary packaging. On top of that, package managers are a really lousy solution to the software packaging problem, as they don't actually solve the underlying problem of duplication and incompatibilities; instead they have a single monolithic repository that they declare the one and only source for software out there. As long as your software is in that repository, in the version you want, you are of course fine, but if you want a newer version, an older one, two different ones at the same time, or a piece of software not yet packaged, you are back at square one, as the package manager doesn't deal with those at all. You have to completely bypass the package manager, compile stuff from source, and install it somewhere where it doesn't conflict with existing stuff. It's pretty much one big ugly hack.

    Now, of course, CDE isn't exactly a beautiful solution either; it's a hack to solve a problem which the distribution builders have preferred to ignore for the last 15 years.

  • by Pentium100 (1240090) on Saturday November 13, 2010 @01:05AM (#34213712)

    I do not want to compile from source every time I want to install some app on an older or uncommon system.

    I do not want to have a system tray/task manager full of two dozen vendor's update checker processes, each individually bugging me about how I'm running WidgetFoo 1.8.1.20.1.3

    WidgetFoo could just as well check for updates itself and not have a system tray app running for that. Example: Firefox. When it is running, it checks for updates, but it does not leave a system tray icon to do that when it's not running.

    I do not want each application to bundle their ancient ass directx library or ancient library from visual studio or any other similar crap.

    Then maybe you should ask the out-of-business (or not) game or app developer to update its product to make it run on a Windows version that was released 8 years after they released the product?

    Linux (specifically Debian) really got this right first.

    Yes, until an app somehow clashes with the political view of Debian creators.

  • by the_womble (580291) on Saturday November 13, 2010 @02:32AM (#34213964) Homepage Journal

    Suppose a security flaw is found in a commonly used library, do you think you will get more timely security updates by

    1) the packager for that library providing an updated package, or,
    2) every single application that uses it providing an updated package

    The CDE site gives two reasons for doing this:

    1) To solve "dependency hell". This is a rare problem these days - except with Skype!
    2) To provide guaranteed reproducible results for researchers. This is a specialist concern, and not necessarily a good thing: it means that results that are the product of a bug in a particular version of a library will be duly reproduced.

  • Re:It's About Time (Score:3, Insightful)

    by martin-boundary (547041) on Saturday November 13, 2010 @03:58AM (#34214206)

    I DREAM of being able to just download an application archive, extract it *anywhere I want*, and just run it.

    Why? No offence, but you dream small. This seems barely one step up from what we've had in the past. Why SHOULD you have to do any of this? A real improvement shouldn't involve any of this mundane stuff. You shouldn't have to go download anything yourself, and you shouldn't have to think about managing where you want to extract things.

    The true goal should be to automate all this and make it transparent. If you want to run a program, that's all that should be needed from you, by the interface. The computer should figure everything out for you and do it, including putting things in your preferred locations if that's what you like. We're half way there with package management systems already. Take a look at Debian/Ubuntu. You choose a package, and you run the program. No manual downloads, extractions, configurations, hunting for dependencies etc. If the community can figure out how to make the packaging literally trivial for developers, then we'll be that much closer to the goal. I'd like to see Linux distributions able to figure out all the details for creating their own packages, just by reading a single URL that the project developer would publish. And I'd like this to be so ROUTINE, that every developer does it.

  • by The Mighty Buzzard (878441) on Saturday November 13, 2010 @05:00AM (#34214376)
    I can't tell if that should have been +1 Funny or -1 Fucktarded. I have a suspicion that it's the latter but it is 3am and I'm starting to nod a bit, so I'll leave it to someone else to decide.
  • by Bert64 (520050) <bert@slashd[ ]fi ... m ['ot.' in gap]> on Saturday November 13, 2010 @06:18AM (#34214618) Homepage

    Generally, Linux distributions follow a fairly standard naming/location convention for files; most of the variations exist in specialised Linux distributions (e.g. Android) where there is good reason for the differences.
    Most software also allows you to choose where to install it at compile time, although the default will usually be /usr/local.

    A linux system is often far less messy than a windows system for instance, where all kinds of files are under the windows and system32 dirs.

    Package managers are actually a very good solution to many problems. Not only do they handle dependencies, but they provide a centralised database of installed software; a file integrity database (both on the system, storing checksums of everything, and off the system, because the checksums corresponding to a given package version's files are known); clean removal of software; a single place and standardised interface for installing software (thus removing the need to download programs from potentially untrustworthy websites: you only have to trust your OS vendor, not hundreds of third parties); and, most important of all, a centralised update mechanism for applying important security patches to all of your software...
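    The file-integrity point is concrete: both major package formats can check installed files against their recorded checksums. A quick sketch (package names chosen only as examples):

```shell
# RPM-based systems: verify a package's files against the rpm database.
# Mismatches are flagged per file: S = size, 5 = checksum, M = mode, etc.
rpm -V openssl

# Debian-based systems: compare installed files against the md5sums
# shipped in the .deb (requires the debsums package).
debsums openssl
```

No per-application updater or vendor tool is needed for any of this; it falls out of the centralised database for free.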

    Other software vendors have chosen different methods to try and resolve the same problems, but most of them are lacking in one way or another, or make different compromises...

    The OSX method of program bundles avoids dependency problems, but introduces the inefficiency of reducing code sharing, this has less impact on closed source software where code is rarely shared anyway, but for open source one of the key advantages of the open development model is reduced by this approach. On the other hand, this method does provide clean removal and makes it easy to have multiple versions of something installed.

    The Windows method is rather chaotic: individual programs are expected to create their own installation and removal programs as well as handle their own update mechanisms. This has resulted in a whole range of software which behaves in different ways, stores files in different places, etc. Update mechanisms and uninstall routines are down to the individual application and may not exist at all, or may not work correctly. This has resulted in lots of very poorly behaved software which assumes you are a privileged user and can write to system locations, and subsequently, in order to retain compatibility, Microsoft has been forced to implement all kinds of dirty kludges to make such applications think they are able to write to system dirs when they can't.

    The only potential downside to the Linux system is that application suppliers don't have a fixed list of system libraries which will always be present. Under OSX or Windows you know that a core set of libraries will always be there, and anything else is typically provided by the app (sometimes redundantly), whereas different Linux distributions may provide different base libraries.

  • by m50d (797211) on Saturday November 13, 2010 @10:06AM (#34215274) Homepage Journal
    It's not a flaw. Unlike mimetypes or resource forks or any of the other various non-solutions around, they are preserved over all transfer protocols. They are under the user's control, without getting in the user's way. They're the best way to record filetypes I've seen, I mean it.
  • by Svartalf (2997) on Saturday November 13, 2010 @10:35AM (#34215394) Homepage

    The big problem with this is that he's dragging along a complete sandbox (Incl. X11 for X apps...) for each application.

    FUN.

    It avoids dependency hell, yes, but it's vast overkill considering that I can manage it the other way (i.e. going from the 4-year-old distribution forward), and I have done it with one indie title that I've ported for the studio, and am about to do it with another one for a different studio.

    Now, this is not to say that it's not an interesting program, or that I won't have uses for the concept he's come up with there- but to call it "easy" portability or a great answer for Linux on things is to clearly misunderstand the real problems. If it's an answer for all of that, you asked the question wrong.

  • by Svartalf (2997) on Saturday November 13, 2010 @10:38AM (#34215408) Homepage

    The thing is...typically, mind...most people don't encounter the fun you're describing. It's when you want something SPECIFIC that it typically gets you the way you're describing it. Seriously.

  • by countertrolling (1585477) on Saturday November 13, 2010 @11:24AM (#34215560) Journal

    Maybe I'm misunderstanding things, I don't know, but I see two distinct points of view. For the developer, shared libraries are far more convenient for efficiency's sake, but the end user couldn't care less. The convenience of portability outweighs those concerns.

    The big problem with this is that he's dragging along a complete sandbox (Incl. X11 for X apps...) for each application.

    Time to swap out that 80 meg hard drive :-) And bits don't actually weigh that much, so I don't mind schlepping them around with me. Being able to run my program directly off the USB stick sure is nice. But then, what the hell, I could just put a whole "live" system on it, so maybe portability really isn't an issue.
