
CDE — Making Linux Portability Easy

ihaque writes "A Stanford researcher, Philip Guo, has developed a tool called CDE to automatically package up a Linux program and all its dependencies (including system-level libraries, fonts, etc!) so that it can be run out of the box on another Linux machine without a lot of complicated work setting up libraries and program versions or dealing with dependency version hell. He's got binaries, source code, and a screencast up. Looks to be really useful for large cluster/cloud deployments as well as program sharing. Says Guo, 'CDE is a tool that automatically packages up the Code, Data, and Environment involved in running any Linux command so that it can execute identically on another computer without any installation or configuration. The only requirement is that the other computer have the same hardware architecture (e.g., x86) and major kernel version (e.g., 2.6.X) as yours. CDE allows you to easily run programs without the dependency hell that inevitably occurs when attempting to install software or libraries. You can use CDE to allow your colleagues to reproduce and build upon your computational experiments, to quickly deploy prototype software to a compute cluster, and to submit executable bug reports.'"
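
A minimal sketch of the workflow described above, driven from Python. The command being packaged (a hypothetical analyze.py reading input.dat) is made up, and the cde-package/cde-exec names on the receiving end are my recollection of CDE's output layout rather than something stated in the summary, so treat them as assumptions and check the docs/screencast.

    import subprocess

    # On the source machine: prefix the command with cde so it records and
    # copies every file the command touches into a package in the current
    # working directory (per the description above).
    subprocess.run(["cde", "python3", "analyze.py", "input.dat"], check=True)

    # On the destination machine (same hardware architecture and major kernel
    # version), copy the package over and re-run the command from inside it.
    # The directory and launcher names below are assumptions, not from the
    # summary:
    #
    #   subprocess.run(["./cde-exec", "python3", "analyze.py", "input.dat"],
    #                  cwd="cde-package", check=True)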


  • by Anonymous Coward on Friday November 12, 2010 @07:38PM (#34212442)

    CDE will always mean Common Desktop Environment to me.

    • Re: (Score:3, Interesting)

      Me too.

      Common to Sun and HP. :-)

      I guess Ultrix, too.

      Regarding this development - it's really what NeXT and later Mac OS X application bundles do. In the Windows world they have ThinApp and Microsoft's App-V.

    • CDE will always mean Common Desktop Environment to me.

      Hear, hear! Of course, I was always an OpenWindows fan, since CDE rendered so slowly on our SPARC LXes.

      • by h4rr4r ( 612664 )

        XFCE on my Ultra 5.

        • by afidel ( 530433 )
          Agreed, XFCE was my preferred WM as well; it worked really well with Hummingbird Exceed from Windows =)
      • by Haeleth ( 414428 ) on Friday November 12, 2010 @08:15PM (#34212670) Journal

        I am still waiting for Gnome or KDE to catch up with the efficiency and usability of these older environments.

        KDE is getting closer now that it's possible for the desktop menu to present a list of applications rather than a handful of useless wallpaper-changing commands, but both major environments seem to be stuck on the stupid Windows 95-derived taskbar paradigm. Give me spatial management of running applications dammit! I want to develop muscle memory, not scan slowly across a list of tiny icons that are never in the same place twice.

    • Re: (Score:2, Insightful)

      by maestroX ( 1061960 )
      he could always use tar
    • No! I'm not going back! I'M NOT going BACK! MOTIF IS DEAD TO ME!

    • by Waffle Iron ( 339739 ) on Friday November 12, 2010 @09:06PM (#34212936)

      CDE will always mean Common Desktop Environment to me.

      I only used CDE briefly, but I remember that it combined the sheer visual elegance of Tk's widgets with the lush color scheme of a bordello.

    • Re: (Score:3, Funny)

      by Greyfox ( 87712 )
      Yeah, and I was wondering why anyone else would even want to take that acronym, given the memories of revulsion it evokes in those of us who were forced to use it for any length of time. CDE is right up there with SCO for how quickly it makes me recoil in horror, evoking memories of clunky Motif controls and a single-threaded, inconsistent desktop environment. If you long for the halcyon days of Windows 3.0, give CDE a try; it's just what you're looking for!
  • CDE 2 (Score:3, Informative)

    by ukpyr ( 53793 ) on Friday November 12, 2010 @07:41PM (#34212460)

    I'm just pointing out that a major application - one that's not so major anymore - the Common Desktop Environment, already uses this acronym :)
    Does sound like a neat tool though!

  • To more quickly prepare software for easy installation...

  • by Anonymous Coward

    If those libraries are GPL or LGPL, then when you deliver the binary of the library, you must also deliver the source or an offer to deliver the source, and you must also deliver a copy of the (L)GPL, as part of the CDE. Is this done?

  • by h4rr4r ( 612664 ) on Friday November 12, 2010 @07:47PM (#34212494)

    Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason, use them. Do not bring the errors of Windows to us.

    • Re: (Score:3, Informative)

      Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason, use them. Do not bring the errors of Windows to us.

      Or you could just use OpenStep, and get dynamic libraries and portable apps. This is a long-solved problem.

    • by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Friday November 12, 2010 @09:12PM (#34212974) Homepage Journal

      One method is to have a tool for interrogating the API version and also testing the API against some set of tests that relate to the application being installed. You'd then apply the following:

      • If the API version is within bounds, do not install the library.
      • If the API version is outside of bounds but the tests succeed, do not install the library.
      • If the API version is greater than the latest supported, the tests fail, and the archive contains a backwards-compatibility library whose API is within bounds, install the backwards-compatibility library.
      • If the API version is greater than the latest supported, the tests fail, and no backwards-compatibility library is usable, install the supplied library locally to the application via the package manager, under an alias so there's no name clash with primary libraries.
      • If the API version is less than the minimum supported, the tests fail, and the user authorizes an upgrade, use the package manager to upgrade to the supplied library.
      • In all other cases, install the supplied library locally to the application via the package manager, under an alias so there's no name clash with primary libraries.
      • Where the library is installed locally, all information regarding the supplied API must be removed, since it's vital that it doesn't clash with anything else; however, there must be a maintenance tool for cleaning out such local libraries when they are no longer required.

      This should keep redundancy to a minimum. There will be some, since there's nothing in this scheme to coordinate between apps using it, but it's a start. (See the sketch of the decision logic below.)
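
      Here's a minimal sketch of that decision logic in Python, just to make the branching concrete. The version bounds, test results, and action names are all hypothetical stand-ins; a real implementation would sit on top of an actual package manager.

      from dataclasses import dataclass

      @dataclass
      class LibStatus:
          api_version: int       # API version reported by the installed library
          min_supported: int     # oldest API version the application accepts
          max_supported: int     # newest API version the application accepts
          tests_pass: bool       # did the app-specific API tests succeed?
          has_compat_lib: bool   # does the archive ship a backwards-compat library
                                 # whose API is within [min_supported, max_supported]?

      def decide(lib: LibStatus, user_authorizes_upgrade: bool) -> str:
          """Return the action to take for one bundled library (hypothetical policy)."""
          in_bounds = lib.min_supported <= lib.api_version <= lib.max_supported
          if in_bounds:
              return "skip"                    # system library is fine as-is
          if lib.tests_pass:
              return "skip"                    # version out of bounds but behaves correctly
          if lib.api_version > lib.max_supported:
              if lib.has_compat_lib:
                  return "install-compat-lib"  # bundled backwards-compatibility shim
              return "install-local-aliased"   # private, aliased copy via the package manager
          if lib.api_version < lib.min_supported and user_authorizes_upgrade:
              return "upgrade-system-lib"      # package manager upgrades to the supplied version
          return "install-local-aliased"       # all other cases: private, aliased copy

      # Example: system library is too old and the user declined an upgrade.
      print(decide(LibStatus(3, 5, 7, tests_pass=False, has_compat_lib=False), False))
      # -> install-local-aliased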

    • Re: (Score:3, Insightful)

      by grumbel ( 592662 )

      Package managers exist for a reason, use them.

      Except that distribution-specific package managers do *nothing at all* to address the problem of distribution-independent binary packaging. On top of that, package managers are a really lousy solution to the software packaging problem, as they don't actually solve the underlying problem of duplication and incompatibilities; instead they have a single monolithic repository that they declare the one and only source for software out there. As long as your software is in that repository, in the version you want,

  • This sounds like an easy way to copy installed proprietary software?
  • Knowing what files a program will open without running the program is impossible, and since a program can dynamically change what files it opens from run to run, it would be impossible to predict every file that a program would require in all situations. The best that a tool like this could do would be to record the files that were used during a given run of the program and, assuming that the program would use the same files when run later with the same inputs, support running the program with exactly those

    • That's exactly what the tool does. Quoting from the CDE homepage:

      CDE is easy to use: Simply prepend any Linux command with cde, and CDE will execute that command, monitor its actions, and automatically copy all files it accesses (e.g., executables, dynamically linked/loaded libraries, plug-ins, scripts, configuration/data files) into a package within your current working directory. Now you can transfer the package to another computer and run that same command without installing anything. In short, if you ca
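
      A rough way to see the record-the-accessed-files idea in action is to run a command under strace and copy whatever it successfully opens into a package directory. This is only a sketch of the general technique, not CDE's implementation (the quote above only says it monitors the command), and the mini-package layout here is made up for illustration:

      import re
      import shutil
      import subprocess
      import tempfile
      from pathlib import Path

      def trace_and_package(cmd, package_dir="mini-package"):
          """Run cmd under strace, then copy every file it successfully opened
          into package_dir, mirroring the original directory layout."""
          root = Path(package_dir)
          with tempfile.NamedTemporaryFile(suffix=".trace") as log:
              # -f follows child processes; watch only open/openat calls.
              subprocess.run(["strace", "-f", "-e", "trace=open,openat",
                              "-o", log.name] + list(cmd), check=True)
              trace_text = Path(log.name).read_text(errors="replace")

          # Lines look like:
          #   openat(AT_FDCWD, "/lib/x86_64-linux-gnu/libc.so.6", O_RDONLY|O_CLOEXEC) = 3
          # A negative return value (e.g. "= -1 ENOENT") means the open failed.
          opened = set()
          for m in re.finditer(r'open(?:at)?\((?:[^"]*, )?"([^"]+)"[^)]*\)\s*=\s*(-?\d+)',
                               trace_text):
              path, ret = Path(m.group(1)), int(m.group(2))
              if ret >= 0 and path.is_file():
                  opened.add(path.resolve())

          for src in sorted(opened):
              dest = root / src.relative_to("/")
              dest.parent.mkdir(parents=True, exist_ok=True)
              shutil.copy2(src, dest)
          print(f"Copied {len(opened)} files into {root}/")

      # Example: capture everything a trivial Python one-liner touches.
      if __name__ == "__main__":
          trace_and_package(["python3", "-c", "print('hi')"])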

      • Re: (Score:3, Insightful)

        by h4rr4r ( 612664 )

        You could just use SELinux, which would already let you do that.

        This looks like a solution looking for a problem.

  • but I can't tell which license this software adheres to*?
    *just in case anyone wants to take a look and change some source code...
  • Uhm, the packaging system already present can do the same with less waste, better support for distribution, and with existing tools. File-based dependencies like in RPM may be slightly deficient here, but with package-based dependencies like in .DEB you have full control over what you want.

    There have already been multiple such systems that work around the packaging system, like autopackage, and they turned out to be a disaster. I fail to see how this one is any different.

    • by Burdell ( 228580 )

      Not sure what you mean by "File-based dependencies like in RPM may be slightly deficient here". RPM only depends on specific files if you need them; e.g. if you have package A with a config file /etc/a.conf that is also required by package B, package B will depend on the file /etc/a.conf (commonly found with scripts and dependencies on /bin/sh, /usr/bin/perl, etc.). If package B just needs package A to work, it'll depend on "A".
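
      If you want to poke at that distinction on an RPM-based system (assuming the rpm CLI is available), a throwaway sketch:

      import subprocess

      def rpm_q(args):
          """Run an rpm query and return its output (RPM-based distros only)."""
          return subprocess.run(["rpm", "-q"] + args,
                                capture_output=True, text=True).stdout

      # Which package provides the file /bin/sh? (Typically bash.)
      print(rpm_q(["--whatprovides", "/bin/sh"]))

      # What does a package require? File paths such as /bin/sh show up here
      # alongside plain package names and library sonames.
      print(rpm_q(["--requires", "bash"]))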

  • I very recently published a tool [xgoat.com] that performs a similar task. dynpk (my tool) bundles programs up with packages from your system, then wraps them with some other stuff to create a bundle that essentially allows you to run a Fedora program on a RHEL machine (and probably Ubuntu or Debian, but this is outside my needs...).

    Recompiling loads of libs for RHEL isn't fun or particularly maintainable. Therefore, use the ones from Fedora!

  • Seriously though, this could end up making collaboration on software a lot easier among trusted people.

  • Wait...this sounds like it's just a couple half-steps shy of an automated, app-specific chrooting system.

    Concerns about outdated libs aside, that sounds...awesome.

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken

Working...