CDE — Making Linux Portability Easy
ihaque writes "A Stanford researcher, Philip Guo, has developed a tool called CDE to automatically package up a Linux program and all its dependencies (including system-level libraries, fonts, etc!) so that it can be run out of the box on another Linux machine without a lot of complicated work setting up libraries and program versions or dealing with dependency version hell. He's got binaries, source code, and a screencast up. Looks to be really useful for large cluster/cloud deployments as well as program sharing. Says Guo, 'CDE is a tool that automatically packages up the Code, Data, and Environment involved in running any Linux command so that it can execute identically on another computer without any installation or configuration. The only requirement is that the other computer have the same hardware architecture (e.g., x86) and major kernel version (e.g., 2.6.X) as yours. CDE allows you to easily run programs without the dependency hell that inevitably occurs when attempting to install software or libraries. You can use CDE to allow your colleagues to reproduce and build upon your computational experiments, to quickly deploy prototype software to a compute cluster, and to submit executable bug reports.'"
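For the curious, the workflow from the screencast is roughly the following. This is a sketch: the script name, user path, and the exact ".cde" wrapper naming are from memory of the demo, so treat the details as illustrative rather than gospel.

    # on your machine: run the command once under cde, which uses ptrace
    # to watch every file the process opens and copies each one into a
    # self-contained package directory
    cde python weird_experiment.py
    # => creates cde-package/ holding the binary, data files, and all
    #    shared libraries, fonts, etc. that the run actually touched

    # on the other machine (same arch and major kernel version),
    # no installation and no root needed:
    scp -r cde-package/ otherhost:
    ssh otherhost 'cd cde-package/cde-root/home/alice && ./python.cde weird_experiment.py'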
Isn't that three-letter acronym taken? (Score:5, Informative)
CDE will always mean Common Desktop Environment to me.
Re: (Score:3, Interesting)
Me too.
Common to Sun and HP. :-)
I guess Ultrix, too.
Regarding this development: it's really what NeXT and later Mac OS X packages do. In the Windows world they have ThinApp and MS's App-V.
Re:Isn't that three-letter acronym taken? (Score:5, Insightful)
That method guarantees security problems. Applications and their dependencies should be managed by a proper package management system.
Re:Isn't that three-letter acronym taken? (Score:4, Insightful)
I prefer to avoid the disagreements over what is a "proper package management system". In fact, with each program in its own "sandbox", isolated from the others, I see better security.
Re: (Score:2)
So now you want everything in its own BSD-style jail?
That is sure to waste a lot of space.
Re:Isn't that three-letter acronym taken? (Score:4, Insightful)
It is also sure to piss off users who now have to have another Documents directory for each application.
Otherwise my bad application could edit a document used by another application that relies on an old, outdated, insecure library.
I am starting to think you are not thinking this through.
Re: (Score:3, Insightful)
File extensions mean something?
Wow, that takes me back. That is another big Windows flaw you bring up there.
Re:Isn't that three-letter acronym taken? (Score:4, Insightful)
Suppose a security flaw is found in a commonly used library, do you think you will get more timely security updates by
1) the packager for that library providing an updates package, or,
2) every single application that uses it providing an updated package
The CDE site gives two reasons for doing this:
1) To solve "dependency hell". This is a rare problem these days - except with Skype!
2) To provide guaranteed reproducible results for researchers. This is a specialist concern, and not necessarily a good thing - it means that results that are the product of a bug in a particular version of a library will be duly reproduced.
Re:Isn't that three-letter acronym taken? (Score:4, Interesting)
No, dependency hell is far worse today. Building from source back in the tarball-only days, you had problems if version W of library X was not installed. Building from source today, you have that, plus issues if your distro of choice does not have version W of library X in the repository, plus the case where you have built version W of library X from source yourself but your package manager refuses to install anything dependent on it, because it refuses to acknowledge the existence of anything outside its list of installed packages.
You also have issues like cpan which is currentish vs your distro's package manager which is usually anything but.
If it weren't for checkinstall I'd seriously consider LFS over package management in situations where I was constantly having to build things from git/cpan/etc... And I'd probably have a huge dent in my desk from where I constantly banged my head instead of the only moderately sized one I have now.
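For anyone who hasn't met it: checkinstall just wraps the usual install step so the result stays visible to the package manager. A minimal session (the metadata prompts are omitted here):

    ./configure && make
    sudo checkinstall    # runs "make install", but records the installed
                         # files into a .deb/.rpm/.tgz and registers the
                         # result with your package database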
Re: (Score:3, Informative)
Now it depends on your needs: if you're constantly building stuff from git/cpan, it sounds like you're intentionally living on the bleeding edge and/or doing development...
For most users, and especially production servers, having non-cutting-edge packages managed centrally is far more desirable. Personally, when I want cutting edge I use Gentoo, so I can have a mix of stable core packages installed with a select few being cutting-edge versions, while still being package managed. If I want to install something not c
Re: (Score:3, Interesting)
It's really more of a problem when one or two packages on a non-bleeding edge system need to be bleeding edge for some specific reason but actually updating those packages would cause backwards compatibility breakage of some other essential something.
Trust me, it's far from gone even on arch/gentoo/slackware.
Re: (Score:3, Insightful)
The thing is...typically, mind...most people don't encounter the fun you're describing. It's when you want something SPECIFIC that it typically gets you the way you're describing it. Seriously.
Re: (Score:3, Insightful)
The big problem with this is that he's dragging along a complete sandbox (Incl. X11 for X apps...) for each application.
FUN.
It avoids dependency hell, yes - but it's vast overkill considering that I can manage it the other way (i.e., going from the 4-year-old distribution forward...) - and I have done it with one indie title that I've ported for the studio, and am about to do it with another one for a different studio.
Now, this is not to say that it's not an interesting program, or that I won't have uses for the concept
Re: (Score:2)
Applications and their dependencies should be managed by a proper package management system.
RPM or DEB? And which version of each package? Long-term support distributions are known for having packages that are months to years out of date except for backported security fixes.
Re: (Score:2)
Which ever you need.
Long term support distros are for servers.
Re: (Score:2)
So build a package, if the only things in it are your code that will make it very easy.
Just about no commercial software avoids using outside libs in some shape or form.
Re:Use a package manager (Score:5, Insightful)
Generally linux distributions follow a fairly standard naming/location convention for files; most of the variations exist in specialised linux distributions (eg android) where there is good reason for the differences.
Most software also allows you to choose where to install it at compile time, although the default will usually be /usr/local.
A linux system is often far less messy than a windows system for instance, where all kinds of files are under the windows and system32 dirs.
Package managers are actually a very good solution to many problems. Not only do they handle dependencies, but they provide: a centralised database of installed software; a file integrity database (both on the system, storing checksums of everything, and off the system, because the checksums corresponding to a given package version's files are known); clean removal of software; a single place and standardised interface for installing software (thus removing the need to download programs from potentially untrustworthy websites - you only have to trust your OS vendor, not hundreds of third parties); and, most important of all, a centralised update mechanism for applying important security patches to all of your software...
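To make the "file integrity database" point concrete, here are the standard queries on the two big package families (package names illustrative):

    rpm -Va                   # verify size/checksum/perms of every installed file
    rpm -qf /usr/bin/ssh      # which package owns this file?
    debsums --changed         # Debian/Ubuntu: list files that no longer match
    dpkg -L openssh-client    # list every file a package installed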
Other software vendors have chosen different methods to try and resolve the same problems, but most of them are lacking in one way or another, or make different compromises...
The OSX method of program bundles avoids dependency problems, but introduces the inefficiency of reducing code sharing, this has less impact on closed source software where code is rarely shared anyway, but for open source one of the key advantages of the open development model is reduced by this approach. On the other hand, this method does provide clean removal and makes it easy to have multiple versions of something installed.
The Windows method is rather chaotic: individual programs are expected to create their own installation and removal programs as well as handle their own update mechanisms. This has resulted in a whole range of software which behaves in different ways, stores files in different places, etc... Update mechanisms and uninstall routines are down to the individual application and may not exist at all, or may not work correctly. This has resulted in lots of very poorly behaved software which assumes you are a privileged user and can write to system locations; subsequently, in order to retain compatibility, Microsoft has been forced to implement all kinds of dirty kludges to make such applications think they are able to write to system dirs when they can't.
The only potential downside to the linux system is that application suppliers don't have a fixed list of system libraries which will always be present. Under OSX or Windows you know that a core set of libraries will always be there, and anything else is typically provided by the app (sometimes redundantly), whereas different linux distributions may provide different base libraries.
Complete and utter bullshit! (Score:5, Interesting)
Sorry if topic sounds a tad personal, but hey...
> The real problem is that Linux distributions, taken together or individually, presents developers with too many completely unnecessary choices as to where essential library files can be put, and also, there is no standard version naming and locating convention.
Do you need it to boot? Prefix is /
Do you need it after boot? Prefix is /usr
Do you want to install custom stuff that is not handled via the system's default software handling solution? Prefix is /usr/local or /opt
Do you want to install into home dir? Prefix is ~/local or ~/opt
If you are in a heterogenous environment with shared home between lots of architectures etc, /import/x86 etc is a good place
This leads to a clean & clear separation of software, following a system people poured a lot of thought into. Is it easy to grasp at first sight for someone used to Windows? No. But that is _not_ the priority. Sorry, it's not. People writing code need to learn how the language works. Why shouldn't they learn how the system works?
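A concrete example of the per-user end of that scheme, using the standard autotools convention:

    # system-wide, but outside the package manager's turf:
    ./configure --prefix=/usr/local && make && sudo make install

    # strictly per-user, no root required:
    ./configure --prefix=$HOME/local && make && make install
    export PATH=$HOME/local/bin:$PATH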
> Package managers are a complex solution to a problem that need not have existed in the first place, if it was realized that unnecessary choice is deadly dangerous, in the world of large-scale software interoperability.
Yeah, cause grabbing random downloads of .bat, .exe, .msi, .whatnot turned out to be awesome. Especially the integrated updates. Oh, what's that? Everyone is implementing their own system leading to dozens of parallel update mechanisms on a single machine? Now _that_ is efficient! And the programs that don't have an update routine? Simple, just write them bug-free, without holes and a complete feature set in 1.0!
> There does not need to be any choice for where on a file system a given application or a given library should be located.
That is true if you consider every machine to be an island. Unix thrived and continues to thrive cause you can create huge shared environments with almost no work.
> That should be completely determined by the app or library name, version (using a standard versioning scheme), variant (using a standard variant naming scheme), and origin person-or-organization, using a standard organization identifying scheme.
My custom mplayer is in /usr/local/mplayer. My custom git is in /usr/local/git. My custom vim is in /usr/local/vim. I can delete any of those and remove the program, along with all its libraries and whatnot, with one single rm.
If devs simply don't know that they should default to /usr/local for stuff, again... It's their problem, same as if they did not know how to open() a file.
> It goes without saying that there should also be a standard globally unique URI for such libraries and apps (including the unique name, version, variant, origin identification).
No. No. No. This breaks any and all assumptions about being able to install different versions of stuff for different reasons. Use prefixes and use LD_LIBRARY_PATH, etc.
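i.e., something like this (the library name is made up for illustration):

    # install the second version under its own prefix...
    ./configure --prefix=/usr/local/libfoo-2.0 && make && sudo make install

    # ...and point just the one program at it, leaving the system copy alone
    LD_LIBRARY_PATH=/usr/local/libfoo-2.0/lib ./myapp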
> So there should be no choice about where on the internet to get it (except for the choice involved in a standard mirroring URI scheme), and no choice about where to put it.
Maybe you are too young to have seen this yourself, but after a few years, most URLs are dead. With gittorrent, ideally with a DHT sprinkled on top, this might change in the longer run, but what if the next VCS that whoops git's ass comes along? Static information on the internet is mostly a myth. (Also, git would need to get rid of SHA1 for fully automated code distribution, imo)
> With this discipline, obviously needed in today's universe of code, all such package management, as well as dependency acquisition and installation, could be managed by a single unified and incredibly simple automated package manager; call it the
Re: (Score:2)
So when a library that 20 apps use needs to be updated, it gets downloaded 20 times?
Making a package isn't hard at all, and you can completely automate the process, making it effortless for subsequent versions of the app.
Also, there's no central updater for the system. Talk about madness.
Re: (Score:2)
CDE will always mean Common Desktop Environment to me.
Hear, hear! Of course I was always an OpenWindows fan, since CDE rendered so slowly on our SPARC LXs.
Re: (Score:2)
XFCE on my Ultra 5.
Re:Isn't that three-letter acronym taken? (Score:5, Insightful)
I am still waiting for Gnome or KDE to catch up with the efficiency and usability of these older environments.
KDE is getting closer now that it's possible for the desktop menu to present a list of applications rather than a handful of useless wallpaper-changing commands, but both major environments seem to be stuck on the stupid Windows 95-derived taskbar paradigm. Give me spatial management of running applications dammit! I want to develop muscle memory, not scan slowly across a list of tiny icons that are never in the same place twice.
I almost started screaming (Score:4, Funny)
No! I'm not going back! I'M NOT going BACK! MOTIF IS DEAD TO ME!
Re:Isn't that three-letter acronym taken? (Score:5, Funny)
CDE will always mean Common Desktop Environment to me.
I only used CDE briefly, but I remember that it was like a combination of the sheer visual elegance of Tk's widgets with the lush color scheme of a bordello.
Re:Isn't that three-letter acronym taken? (Score:5, Funny)
I only used CDE briefly, but I remember that it was like a combination of the sheer visual elegance of Tk's widgets with the lush color scheme of a bordello.
I'm unfamiliar with this 'CDE' but you're compelling me to try it.
CDE 2 (Score:3, Informative)
I'm just pointing out a major application - that's not so major anymore - Common Desktop Environment uses this acronym :)
Does sound like a neat tool though!
Next up, the Flash Transport Packager (Score:4, Funny)
To more quickly prepare software for easy installation...
GPL Compliance warning! (Score:2, Insightful)
If those libraries are GPL or LGPL, then when you deliver the binary of the library, you must also deliver the source or an offer to deliver the source, and you must also deliver a copy of the (L)GPL, as part of the CDE. Is this done?
Bad idea or worst idea ever? (Score:5, Insightful)
Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason; use them. Do not bring the errors of Windows to us.
Re: (Score:3, Informative)
Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason; use them. Do not bring the errors of Windows to us.
Or you could just use OpenStep, get dynamic libraries and portable apps. This is a long solved problem.
Re:Bad idea or worst idea ever? (Score:5, Interesting)
One method is to have a tool for interrogating the API version and also testing the API against some set of tests that relate to the application being installed. You'd then apply the following:
This should keep redundancy to a minimum. There will be some, since there's nothing in this to coordinate between apps using this method, but it's a start.
Re: (Score:3, Insightful)
Package managers exist for a reason; use them.
Except that distribution-specific package managers do *nothing at all* to address the problem of distribution-independent binary packaging. On top of that, package managers are a really lousy solution to the software packaging problem, as they don't actually solve the underlying problems of duplication and incompatibilities; instead they have a single monolithic repository that they declare the one and only source for software out there. As long as your software is in that repository, in the version you want,
copying proprietary software (Score:2, Insightful)
Limitations (Score:2)
Knowing what files a program will open without running the program is impossible, and since a program can dynamically change what files it opens from run to run, it would be impossible to predict every file that a program would require in all situations. The best that a tool like this could do would be to record the files that were used during a given run of the program and, assuming that the program when run later with the same inputs would use the same files, support running the program with exactly those files.
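That record-one-run approach is essentially what you can approximate by hand with strace (CDE does it with ptrace); a rough sketch, program name made up:

    # log every file the program opens during one concrete run
    strace -f -e trace=open,openat -o trace.log ./myprog input.dat

    # extract the successfully opened paths (crude, but close enough)
    grep -v ENOENT trace.log | grep -o '"[^"]*"' | tr -d '"' | sort -u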
Re: (Score:3, Insightful)
You could just use SElinux, which would already let you do that.
This looks like a solution looking for a problem.
Maybe it's the fatigue (Score:2)
*just in case anyone wants to take a look and change some source code...
already done, already proven a bad idea (Score:2)
Uhm, but using the packaging system already present can do the same with less waste, better support for distribution and with existing tools. File-based dependencies like in RPM may be slightly deficient here, but with package-based dependencies like in .DEB you have full control over what you want.
There were multiple such systems for working against the packaging system already like autopackage, and they turned out to be a disaster. I fail to see how this one is any different.
Re: (Score:2)
Not sure what you mean by "File-based dependencies like in RPM may be slightly deficient here". RPM only depends on specific files if you need them; e.g. if you have package A with a config file /etc/a.conf that is also required by package B, package B will depend on the file /etc/a.conf (commonly found with scripts and dependencies on /bin/sh, /usr/bin/perl, etc.). If package B just needs package A to work, it'll depend on "A".
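You can see both kinds of dependency with a couple of queries (the /etc/a.conf path is the hypothetical one from the example above):

    rpm -qR bash                   # list requirements; note file deps like /bin/sh
    rpm -q --whatprovides /bin/sh  # which package satisfies a file dependency
    rpm -qf /etc/a.conf            # which installed package owns this file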
My extremely similar tool: dynpk (Score:2, Interesting)
I very recently published a tool [xgoat.com] that performs a similar task. dynpk (my tool) bundles programs up with packages from your system, then wraps them with some other stuff to create a bundle that essentially allows you to run a Fedora program on a RHEL machine (and probably Ubuntu or Debian, but this is outside my needs...).
Recompiling loads of libs for RHEL isn't fun or particularly maintainable. Therefore, use the ones from Fedora!
Sharing nethack "bones" file has never been easier (Score:2)
Seriously though this could end up making collaborating on software a lot easier among trusted people.
chroot (Score:2)
Wait...this sounds like it's just a couple half-steps shy of an automated, app-specific chrooting system.
Concerns about outdated libs aside, that sounds...awesome.
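The manual version of that half-step, for a single binary, looks roughly like this (ldd-based, so it misses dlopen()'d files and data - which is exactly the gap CDE's tracing closes):

    mkdir -p /tmp/jail/bin
    cp /bin/ls /tmp/jail/bin/
    # copy every shared library the binary links against, keeping paths
    for lib in $(ldd /bin/ls | grep -o '/[^ ]*'); do
        cp --parents "$lib" /tmp/jail/
    done
    sudo chroot /tmp/jail /bin/ls /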
Re: (Score:3)
This does not make anything easier, it just makes it wrong.
The Ubuntu app center makes things easier without this sort of nasty kludge.
Re:It's About Time (Score:5, Informative)
I think most people here are not understanding the target audience for this tool (hint: it's not for your typical linux environment). It's not about package management or having a universal installer... it's about being able to run your application in a different environment where you don't have admin rights.
In a lot of university clusters or compute grids, researchers have access to a large collection of compute nodes, but they usually don't have any rights to those machines. In fact, most of the time the programs are run in a sandbox and have a restrictive environment. To run their codes reliably, researchers often have to perform some sort of static linking or package up all of the dependencies with the executable. apt-get or yum are not options in these environments... you may not even be able to ssh into them. Ideally, you could ask the system administrator that controls the cluster to install certain packages, but again, this is not always possible, particularly if the researcher requires a niche package used in their domain.
Moreover, the cluster may be composed of a heterogeneous set of machines with different versions of Linux. Package management does not help you here. The only way to reliably execute your programs on such a heterogeneous cluster is to statically link or include your dependencies. If you are wondering who would use such a maddening environment where you have no admin rights... google Condor, OpenScienceGrid and Globus. This is how a lot of research computation is done.
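For a simple C code, the static half of that is one flag (glibc's NSS pieces are the usual caveat; file names made up):

    gcc -static -o mysim mysim.c -lm
    file mysim     # should report "statically linked"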
Of course, the hot new thing is virtual machines and clouds... but firing up a VM each time you want to run an application is very heavyweight... especially if your application has a short run-time.
TL;DR: this isn't for your typical ubuntu or fedora install; it's for scientific research that is done on restrictive computing clusters and grids.
As a side note, I made and use a much cruder tool http://bitbucket.org/pbui/starch/ [bitbucket.org] that packages everything up (executables, libraries, and data) in a self-extracting tarball which can be executed on remote hosts. It's not as slick as CDE, but it's been used with success by various research groups that I collaborate with.
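For anyone wanting to roll the same trick by hand, the classic stub-plus-payload pattern is only a few lines (starch's internals differ; run.sh is a made-up entry point inside the tarball):

    cat > stub.sh <<'EOF'
    #!/bin/sh
    # everything after the marker line is a gzipped tarball; unpack and run
    LINE=$(awk '/^__PAYLOAD__$/ {print NR + 1; exit}' "$0")
    tail -n +"$LINE" "$0" | tar xz
    exec ./run.sh "$@"
    __PAYLOAD__
    EOF
    cat stub.sh payload.tar.gz > myapp.run
    chmod +x myapp.run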
Re: (Score:3, Insightful)
I've spent time on work like this myself, even used code like this in production... it's really, really hard to be sure you've got everything and don't have unnecessary deps, particularly when you've got scripts (that you didn't write) that call out to scripts that call out to scripts.
I know a few others that have invested time in this also, when you spend much time on a cluster that you don't own eventually you at least think about doing this.
Re:It's About Time (Score:5, Interesting)
That's probably another use, but I really don't think that's the main place where it'd be useful. I DREAM of being able to just download an application archive, extract it *anywhere I want*, and just run it. Just use it, without having to worry. Any application - not the apps (and versions) that some distribution maintainer has gotten around to porting to my flavor.
Re: (Score:3, Insightful)
Why? No offence, but you dream small. This seems barely one step up from what we've had in the past. Why SHOULD you have to do any of this? A real improvement shouldn't involve any of this mundane stuff. You shouldn't have to go download anything yourself, and you shouldn't have to think about managing where you want to extract things.
The true goal should be to automate all this and make it tra
Re: (Score:3, Informative)
Why dream?
Buy a Mac.
Software is most frequently distributed in .dmg (disk image) files. You download the file, maybe gunzip it, then you click on it. That mounts the volume. Then you click on the application. That runs it.
If you want to turf the .dmg and put the application in your application folder, you just drag it from the dmg to the application folder (or fan) and let go.
The biggest piece of the puzzle to making this work from a systems POV is the Mach-O linker. The linker understands executable-relati
Re: (Score:3, Interesting)
apt-get or yum are not options in these environments...
Really? Why not?
Put another way: Rubygems can install to a home directory, and only requires Ruby itself to be in the path. Are you saying that the sandbox environment doesn't allow you to have reliable filesystem access or modify your environment variables? Because that's all it takes.
I realize apt-get or yum may require system-level access in their current configuration, but if they can't be configured per-user, that's a limitation in apt-get or yum, not in the idea of a package manager.
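Right - per-user installation already works fine at the language level, no root anywhere (package names made up):

    gem install --user-install somegem    # lands under ~/.gem
    pip install --user somepkg            # lands under ~/.local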
firing up a VM each time you want to run an application is very heavyweight...
Not necessarily, esp
Re: (Score:3, Insightful)
Why is this a kludge?
App portability and dependency problems has been one of the Achilles Heels of Linux since, well, forever. We laughed at Windows for DLL-hell, and if anything package managers seem like the ugly kludge way to resolve it to me. I wonder how many tens or hundreds of thousands of man-hours have been lost dealing with these sorts of issues. It's by far the #1 thing that's prevented me from using Linux for those purposes, and I'd REALLY like to use Linux (though there are others). Hell, i
Re: (Score:3, Informative)
You can also run different profiles at the same time with -no-remote -P . You can run different versions or the same.
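E.g. (the profile names are whatever you created in the profile manager):

    firefox -no-remote -P work &
    firefox -no-remote -P testing &
    firefox -ProfileManager     # create/manage the profiles themselves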
Re:It's About Time (Score:5, Interesting)
Making applications portable is handy for doing things like running them from a USB stick. It also makes backup much more convenient.
Copy the program and its data in one shot, carry it with you, and use anywhere.
Windows apps are ahead of the game on this one:
http://portableapps.com/ [portableapps.com]
Re: (Score:2, Interesting)
If you need the same app everywhere it is easy enough to either make the data format portable or run the entire OS from the usb stick. Your method just lets you move outdated and almost guaranteed insecure software around. Windows only has this because it still lacks proper package management.
Bringing that sort of braindead thinking to linux is a curse not a blessing.
Re: (Score:2)
Ubuntu package management requires online access, unless you keep an updated DVD of the latest normal dependencies.
That means, if you have a Linux distro sitting on a non-internet-ized computer, it can be difficult to run a random program because of the dependencies and libraries you need.
Granted, that's not going to happen in too many random user's setups, but it could happen.
Re: (Score:2)
So you sneakernet the debs over. How else are you planning on installing/running something anyway?
Unless you somehow write software yourself that depends on libs you do not have installed. Which is a pretty silly thing to do.
Re: (Score:3, Insightful)
... so isn't this basically just a way to gather all the appropriate dependencies and put them all into one spot?
Hm. I guess this is a way to do it without installing. Reading comprehension fail...
Still, I can see how it could be useful in some situations, just like having certain programs that don't require installation on Windows can be helpful.
Re: (Score:2)
Those are only useful because windows is broken. If you could install apps as easily on windows as you could on linux from one central source no one would bother with that.
They might make a launcher that calls firefox with their portable preferences folder, but no reason to go carting firefox around.
Re:It's About Time (Score:5, Insightful)
Guo comes from a Windows background (he interned at Microsoft last year), so it's understandable why he might have a Windows perspective. That doesn't make it good for Linux to adopt that mindset.
Re:Party like it's 1988 (Score:5, Insightful)
Wow, static linking, did anybody for even a second think it is kinda weird to have the same lib on the machine over and over and in every old exploitable version you can find?
Re: (Score:2, Insightful)
That would all be nice if not for each program replacing shared libraries with its own and breaking the other programs. A program should not disturb the system or mess with its libraries. I prefer it remains as isolated as possible, where it can do the least damage.
Re: (Score:2, Informative)
No the program should just use the libraries the system has. This is what package management is all about. What you prefer is only the product of brain dead OSes that lack proper package management.
Re: (Score:3, Interesting)
No the program should just use the libraries the system has.
Then the package manager will fail to install the program because the program requires a later version of a given library than is available in the long-term-supported distribution that the end user is running. For example, PHP 5.3 made PHP far less of a pain in the ass, but Ubuntu 8.04 LTS (the version found on Go Daddy dedicated servers, for example) still has 5.0.something or other.
Linux is not Windows (Score:3, Informative)
No, it won't. Unlike Windows, Linux is perfectly happy having multiple versions of any given library installed and available: Fortunately, on Unix-like systems (including Linux) you can have multiple versions of a library loaded at the same time, so while there is some disk space loss, users can s [faqs.org]
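That's exactly what soname versioning buys you; an illustrative listing (paths and the "some-app" binary are made up):

    $ ls /usr/lib/libjpeg.so.*
    /usr/lib/libjpeg.so.62  /usr/lib/libjpeg.so.8    # both installed, no conflict
    $ objdump -p /usr/bin/some-app | grep NEEDED     # each binary records which
      NEEDED               libjpeg.so.8              # soname it was linked against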
Re: (Score:2)
That's why dpkg prevents installation if the package wants to override others' files.
You don't need static linking to do damage prevention, you need a decent package manager.
Re:Party like it's 1988 (Score:5, Insightful)
Do, please, show me just one widely-used program that does this on a recent UNIX or Unix-like platform.
Right. That's why you should put programs you install under /usr/local, not straight under /usr. Or of course many programs like to be installed in their own self-contained directories under /opt, which is, er, basically exactly what you're asking for and has been common practice for decades.
Re: (Score:3, Insightful)
That's why you should put programs you install under /usr/local, not straight under /usr.
Not the issue. That's a given. It's when I suddenly find out I don't have some bizarre version of gtk, or ncurses (great name, because that's what I'm doing when I find it missing), and I'm suddenly without internet, that it gets a bit tense. I prefer the portability over raw efficiency. It is far and away one of the best things about a Mac. I can take something as bloated as MS Office or Photoshop straight from one machine
Re: (Score:3, Interesting)
I can pirate software too, but I prefer to use superior free software.
Re:Party like it's 1988 (Score:5, Insightful)
For packages provided by the distro, it makes sense to have them all use their complex dependency tree. For installing some other version side by side, this sounds like a great tool. The problem with dependencies is that often a pebble turns into an avalanche by the time you're done. If you want the new version of *one* KDE app, it can drag pretty much the whole of KDE and every library they in turn depend on with it in an upgrade. I've had that happen and ended at 450MB to download and install, and that would pull almost all packages out of LTS support.
From the user's point of view it's completely illogical to upgrade the whole system just because you want a new feature in amaroK 2.4 while your distro only packages 2.3, you expect one application to install or upgrade independently of any other application. That does not happen with Linux. It is not just about new library versions, via dependencies you pull in unwanted version upgrades. As for security, I'd rather have one potentially insecure package on my system than pull most packages out of support; it probably opens up more vulnerabilities than it prevents.
I wouldn't want to run a dozen applications like that. But if it's one or two? I got no problem taking the extra overhead of a bit more memory use. And honestly, a lot of software I use isn't in contact with the "outside world" as such. Even if there is an exploit in a library, I'd never open any file crafted to exploit it. Obviously it is good in general to patch stuff, but it's not always that critical...
Re: (Score:3, Informative)
From the user's point of view it's completely illogical to upgrade the whole system just because you want a new feature in amaroK 2.4 while your distro only packages 2.3, you expect one application to install or upgrade independently of any other application. That does not happen with Linux.
Sure it does. It just doesn't happen with Ubuntu, or Debian, or RHEL/Fedora/SUSE/etc.
If you use a source-based distro then unless the newer version depends on some API exposed by a newer library then it will build just
Re: (Score:3, Interesting)
LTS is good for production environments where you have lots of machines to support.
Suppose you run Ubuntu on 1000 workstations. Each of those runs a variety of programs, not all the same, and some aren't from the PM (they might not be open source, or widely used - they could even be homegrown). Every time there is an upgrade one of these programs could break.
The idea of LTS is that for the most part everything stays put but you still get security updates. For something like a corporate desktop that is ex
Re: (Score:3, Informative)
OS X is routinely first to lose in the "pwn to own" challenge, among others.
Re: (Score:2)
Did anybody, for even a second, think that tracking where files are "splattered" might make it less splattery?
Re: (Score:3, Funny)
[ducks outa the way]
Re: (Score:2)
It's no weirder than the opposite, which is for your documentation to be scattered all over the disk in a whole bunch of different directories, and your commands to be scattered all over the disk in a whole bunch of different directories, and then when you want to fix a major security hole in a common library you have to completely reinstall 50 programs from scratch
Re: (Score:2)
Installing software on linux is easier than on windows or osx.
Either
apt-get install foobar
or find foobar in the GUI package manager/app store/software center.
This just brings us the hell that is software on Windows: every app with its own outdated versions of every other damn lib, never mind the fact that it wastes space and provides a ton of security bugs.
Re: (Score:2)
Yes, if you ignore the case when the version of the package you want doesn't exist in the repos and it also requires dependencies of a higher version than are in the repos. And this isn't an uncommon case.
Re: (Score:3, Interesting)
For software that is in the package manager, yes.
Where something like this "CDE" might be handy is for software that is not in the package manager. Suppose you've written a program that is only of interest to a handful of users. There's no way it's going to find package maintainers for every major distro, and your users might not be happy building from source code.
There are also classes of software that are not allowed in the main repositories for some major distros like Fedora and Debian. For example, the authors of indie games might want to let Linux users play without making the whole game open source. Even if they open-sourced the engine, some distros will not permit it in the repos if its only use is to access non-free data.
Re:I really like where this is going. (Score:5, Insightful)
Where something like this "CDE" might be handy is for software that is not in the package manager. Suppose you've written a program that is only of interest to a handful of users. There's no way it's going to find package maintainers for every major distro, and your users might not be happy building from source code
So do the packaging yourself. It's not hard. And when you're done, you have something sitting in the RPM or DEB database with all the others so you can keep track of it.
There are also classes of software that are not allowed in the main repositories for some major distros like Fedora and Debian. For example, the authors of indie games might want to let Linux users play without making the whole game open source. Even if they open-sourced the engine, some distros will not permit it in the repos if its only use is to access non-free data.
So set up your own PPA (or RPM equivalent) repository. Your customers can add the repository to their list and then keep track of the package. You seem to be under the impression that repositories are only for "approved" software or that package managers can only handle a small number of entries. I have over 150 entries in /etc/apt/sources.list. Adding another one is no big deal. You also seem to think that licensing issues affect what you can put in a repository. It doesn't matter if you have your own repo. You could put commercial software in there, like Sun/Oracle does with VirtualBox.
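Mechanically, it's one file and one key (URLs made up):

    echo "deb http://repo.example.com/apt stable main" | \
        sudo tee /etc/apt/sources.list.d/example.list
    wget -qO- http://repo.example.com/key.asc | sudo apt-key add -
    sudo apt-get update    # from here on, updates arrive with everything else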
Package management and repositories as they exist in the Linux world are better ways of handling the distribution of software both free and commercial than anything else I've seen on any platform.
This "CDE" doesn't solve any problems, but introduces its own "dll hell"
--
BMO
Re:I really like where this is going. (Score:4, Informative)
You're not even replying to anything I said, except to post an ill-informed rant.
There are *three* package types. And what these do is far more than anything seen in the Windows universe.
Windows doesn't even *have* a packaging scheme. Sure, there are installers, but that's all they do. There is no dependency resolution. There isn't any updating except manually or if the individual program checks by itself. Things really haven't come very far from the self-extracting .zip file method of installing. What we've gotten over the years is a graphical front-end to an archive extraction and a script to tweak the registry, and a script to uninstall. That's it. It's arcane. It's stone knives and bear skins.
At least Microsoft dispensed with the bogus "add and remove software" and renamed it "remove software" in the Control Panel. In XP if you came from Debian, you'd expect the "add and remove software" would be where you'd find the Windows equivalent of Synaptic. Sorry, chuckles, it's not. Now in Windows 7 it's finally fixed. It's a small amount of honesty, but it's still better than before.
And besides, do you know how long it takes to make a package for Linux after compiling? Literally less than 5 minutes, and it can be automated. Doing packaging for 3 package formats (.deb, .rpm, tar.gz) is not a lot of work at all.
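For the skeptics, here is the whole of a minimal hand-rolled .deb (the metadata values are obviously illustrative):

    mkdir -p myapp-1.0/DEBIAN myapp-1.0/usr/local/bin
    cp myapp myapp-1.0/usr/local/bin/
    cat > myapp-1.0/DEBIAN/control <<'EOF'
    Package: myapp
    Version: 1.0
    Architecture: amd64
    Maintainer: you@example.com
    Description: my little app
    EOF
    dpkg-deb --build myapp-1.0    # produces myapp-1.0.deb, ready to install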
What is available for software packaging and distribution for Linux is light years ahead of anything seen in the Windows universe. The Apple "app store" is the only thing that comes close, and even that is clunky compared to a fully functioning Debian or RPM repository.
Your rant is typical of the Windows user who thinks he knows anything about Linux but doesn't really. I suggest you acquaint yourself with what you're talking about before you open your mouth here again and sound even more silly.
--
BMO
Four GNU/Linux package mgmt problems (Score:3, Insightful)
apt-get install foobar
This brings four problems that I can think of:
Re: (Score:2)
hopefully it'll get to the point where installing software on Linux will be as easy as on Windows and OSX.
Why would we want that? I prefer the current situation, where installing software on Linux is much easier than installing software on Windows and OS X.
Re: (Score:2)
I prefer the current situation, where installing software on Linux is much easier than installing software on Windows and OS X.
Like when you have to upgrade your entire Ubuntu install just to get the latest version of Firefox? Whereas I can install Firefox 4 beta on a 10 year old version of Windows?
Re: (Score:2)
And have all the existing, open bugs of a 10 year old version of Windows along with all the bugs of a beta version of your browser?
Why not just upgrade when things are fixed?
Re:I really like where this is going. (Score:5, Insightful)
Dear god no.
I do not want to execute installshield or any similar crap/wizard for every little thing I install.
I do not want to have a system tray/task manager full of two dozen vendor's update checker processes, each individually bugging me about how I'm running WidgetFoo 1.8.1.20.1.3, and it is critically important that I execute WidgetFoo's custom one-off graphical update wizard with 3 or 4 pages to click through to get to 1.8.1.20.1.4. Then rinse and repeat once per app instead of knocking them out in one shot/dialog/icon/process.
I do not want each application to bundle their ancient ass directx library or ancient library from visual studio or any other similar crap.
Windows installs were not historically 'easy' due to any effort on MS's part (installshield and friends made an entire business out of covering for MS' lack of help, even as MSI matured into a usable solution). Linux (specifically Debian) really got this right first. Apple recognized that model and made it a great success on the iPhone, setting the tone for all of modern mobile devices. Debian did it right first and never gets the credit.
Re: (Score:3, Insightful)
I do not want to compile from source every time I want to install some app on an older or uncommon system.
I do not want to have a system tray/task manager full of two dozen vendor's update checker processes, each individually bugging me about how I'm running WidgetFoo 1.8.1.20.1.3
WidgetFoo could just as well check for updates itself and not have a system tray app running for that. Example: Firefox - when it is running it checks for updates, but does not leave a system tray icon to do that when it's not running.
I do not want each application to bundle their ancient ass directx library or ancient library from visual studio or any other similar crap.
Then maybe you should ask the out-of-business (or not) game or app developer to update its product to make it run on a Windows version that was released 8 years after they
Re: (Score:3, Informative)
Why in the world would you need an installer for something like this? I don't understand why 99% of applications are ever distributed that way, except for poor OS design or just bad developer habits.
OS X (and OpenStep) did it right earlier in the sense that the app is a self-contained unit, well-laid out and hierarchical but a single unit to the user. "Installation" typically involves a single move command or drag-and-drop of a single "file" (bundle), and portability across systems cannot be beat (providi
Re: (Score:3, Interesting)
AppBundles are beautiful and easy, but also not flexible enough to handle many cases, which is why quite a bit of software on Mac OS X comes as a .pkg with an installer instead of as an AppBundle. What Mac OS X did right, however, was making the installer application a standard part of the OS, not something that has to be reinvented by every software developer like on Windows (.msi should probably fix that one day).
Re: (Score:3, Informative)
4) It needs to be open-source for this model to work. Some software isn't. =)
Not true. Adobe publishes their closed flash plugin via their own proprietary apt and yum repositories. The distros allow arbitrary vendors to hook into the single update management infrastructure with nothing more than the end user's permission.
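On the yum side the hook is a single .repo file dropped into /etc/yum.repos.d/ (values illustrative, not Adobe's real ones):

    $ cat /etc/yum.repos.d/vendor.repo
    [vendor]
    name=Vendor packages
    baseurl=http://repo.example.com/yum/
    enabled=1
    gpgcheck=1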
Re: (Score:2)
You seem to be pretty bad at counting.
2 versions would be fine. RPM and Deb.
Re: (Score:2)
You seem to be pretty bad at counting.
According to this page [freedesktop.org], libjpeg is not guaranteed to be binary compatible between releases.
Re:cross distribution compatibility (Score:5, Insightful)
Firstly, you don't need 5000, you need 4 or 5 for the most used distros. Ubuntu, Fedora, OpenSuse, Debian and Red Hat. Let the others figure it out from a tar file.
And if a company like Skype can produce those packages, so can e.g. Adobe.
Secondly, that already exists [wikipedia.org].
Re: (Score:2)
Which distro is "your" distro? And what are people who use a different distro supposed to do? Write their own packages?