Binary Package Formats Compared
jjaimon writes "There is a document on different package formats used in Linux/Unix systems. Worth reading." Another reader sends in this guide to creating Debian packages which seems apropos here.
Good God, You're starting a flamewar! (Score:4, Funny)
Apt-get, no emerge, no make world, ARRRGH!
Re:Good God, You're starting a flamewar! (Score:3, Funny)
Typed on my PowerBook, BTW ;-)
Slackware packages (Score:2, Insightful)
Come with everything you need; especially created to make you learn what the hell you're doing before you do it.
Re:Just curious... (Score:2)
One thing less to break
Hmmm... (Score:3, Funny)
Why? I have no clue, but you could if you wanted all the features. Kind of like putting a roll bar into an SUV so that you can start amateur racing.
Interesting... (Score:5, Funny)
Not compared: my favorite package [atomandhispackage.com]
Re:Interesting... (Score:2)
First time I've ever seen guys compare packages and size doesn't seem to matter.
From what I hear, it only matters to women, and we all know how many of them hang around here...
Whats the point? (Score:3, Interesting)
While at first glance it seems heavily slanted towards Debian's .deb packages (it's first and contains more "YES"es than the competition), as a developer I'd be far more concerned with basics like "market penetration" than with whether it allows me to assign my package a "priority" over other packages.
I suppose it might be of use to folks building their own distribution, but I expect that's a pretty short list.
But personally, I tend to grab the source when I'm adding something that RedHat didn't include (or that seems woefully out of date)
Deb vs RPM (Score:5, Interesting)
Part of the reason, I think, that the deb format has always seemed to hold together really well is that almost all of the deb-using distributions are so tightly integrated with the main Debian distribution that packages are almost always interchangeable (and are very good about notifying you when they will not work).
RPM, on the other hand, is adopted by many different and slightly incompatible distributions, so finding the libraries and applications you want to install is often difficult: not because RPMs are hard to find, but because RPMs that work in your current setup are hard to find.
This is simply why the management tools on both ends (creating packages and maintaining installed packages) matter far more than the package formats themselves. Debs are very complicated but sometimes easier to deal with because of all the good debhelper tools. RPMs are more often 'hand-crafted', but they are a lot easier for many people to create from scratch.
The thing I really hate about debs is the lack of signature verification. It's absolutely central to the development/upload/build process, but until very recent efforts it has been a total pain to use on the installation front. There is no good reason for this, either.
~GoRK
The difference is the Debian community. (Score:3, Insightful)
This allows more standardization than amongst the various
Standardization means fewer problems for the rest of the users.
But it means more work for the people developing the packages.
APT (DEB) vs ??? (RPM) again. (Score:5, Insightful)
There are different levels of package management which often confuses the newcomer into believing (dogmatically) that one is better than the other.
The installable packages themselves have to have flexible dependency markings and coherent version markings. The low-level package tool has to be able to install and uninstall packages cleanly and repeatably. Seems like the dpkg/deb suite and the rpm suite are quite comparable here.
The package manager has to be able to build a requirements tree for a desired package, and then fetch all of the required packages to fulfill those dependencies on the local system. It should offer trust or signature verification to ensure only trusted repositories and trusted packages are used. The apt tool seems to be cross-platform, while non-Debian distros often spin their own service model here: up2date, Red Carpet, and whatever Mandrake and Lindows offer are each commercialized with some amount of sample access.
Lastly, the most important criterion is the repository itself: it should contain packages which are clean and trustworthy. There have been cracking incidents, and there will be more. The quality of code between distro-produced packages and externally-produced packages can be as different as night and day. The package's meta-data and manifest information can be crap, or it can be carefully constructed. The embedded installation scripts can be trivially exploitable, or they can be carefully scrutinized against unexpected results.
Even if your package format is cool, and your package manager is cool, consider the repository. If the repository is not secure and offers poorly tested packages, many folks are going to unfairly blame it on the tools.
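The checksum half of that trust chain is easy to sketch. A minimal example using only coreutils (file names are invented; real repositories additionally sign the checksum list, which this sketch skips):

```shell
# Hedged sketch (file names invented): verify a downloaded package
# against a checksum list published by the repository.
set -e
echo 'pretend-package-bytes' > pkg.deb   # stand-in for the real download
md5sum pkg.deb > SUMS                    # what the repository publishes
md5sum -c SUMS                           # what the client checks
```

A checksum alone only proves the download wasn't corrupted; without a signature over SUMS, a compromised mirror can rewrite both files, which is exactly the repository-trust problem the parent describes.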
Re:APT (DEB) vs ??? (RPM) again. (Score:4, Insightful)
Re:APT (DEB) vs ??? (RPM) again. (Score:2)
Re:APT (DEB) vs ??? (RPM) again. (Score:3, Interesting)
My biggest gripe is not the commercial aspect of it, but it would be nice if you could add
Re:APT (DEB) vs ??? (RPM) again. (Score:2)
If you had bothered to read the text, you'd see there's a reason I put ??? there. up2date is only for Red Hat Linux users, and only covers Red Hat Linux official packages. I use it. I subscribe. But it's not the only tool out there and it doesn't cover all it should.
It's slashdotted- here is the google cache (Score:2)
>> or for those with text browsers or aol
http://216.239.37.104/search?q=cache:x0Hrwxt537
Have a nice day
5 years old! (Score:5, Informative)
LINUX needs to tell apps where they live! (Score:5, Interesting)
system. Imagine a package of...oh, say, g++, where g++ runs properly
even if you move the whole g++ package to a different dir (say from
themselves to run in one location, and they'll get confused if you
move 'em.
Some packages (eg Tomcat) let you move them and they'll still
work...but only if you set an environment variable (eg TOMCAT_HOME)
so that Tomcat now knows where it lives. In a proper environment, an
application could easily & consistently know where it currently
resides on the filesystem *cough* OSX *cough*.
What Linux needs is some standard 'run-app' script that would inform
a package of its location. For instance:
% run-app tomcat
Run-app could be as simple as the following:
#!/bin/sh
app=$1; shift
# derive the package's top-level dir from where its binary lives
location=$(dirname "$(dirname "$(command -v "$app")")")
env "${app}_home=$location" "$location/bin/$app" "$@"
That would enable Linux to devise a package format (or better yet,
improve rpm, deb, etc) for more flexible package management.
A package would no longer need to place its binary, libraries,
manpages, etc. all in hardwired locations in the OS...it could just
leave them in its original dir. (or maybe create a 'obj' dir that you
can remove if you wish to clean up the package.)
Re:LINUX needs to tell apps where they live! (Score:5, Informative)
RPMs are relocatable (at least most, if not all, of the packages Mandrake distributes are; hardcoded directories are against Mdk policy and are caught by rpmlint). Just edit your .rpmmacros and set macros like %{bindir}, %{libdir}, etc.
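For concreteness, a hedged sketch of what the parent describes; the macro names are standard rpm ones, but the paths are made up, and relocation only takes effect for packages actually built as relocatable:

```
# ~/.rpmmacros -- hedged sketch; paths are invented
%_prefix   /opt/myapps
%_bindir   %{_prefix}/bin
%_libdir   %{_prefix}/lib
```

Alternatively, a one-off relocation can be done at install time with something like `rpm -ivh --prefix /opt/myapps package.rpm`, again assuming the package permits it.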
Re:LINUX needs to tell apps where they live! (Score:2)
Yes, that's so much easier than just dragging an icon a la OS X.
The point is not that it's theoretically possible to move apps or RPMs under Linux, or that it can be automated if you do some fiddling under the hood (and anything that involves touching a file that starts with a '.' is almost by definition under the hood), but that Linux should offer this functionality automagically. Installing or moving apps in Linux can be a nightmare. In
Re:LINUX needs to tell apps where they live! (Score:4, Interesting)
Personally, I find the directory layout of most Linux systems to be painfully baroque, with the BSDs just a step behind. Both kick the crap out of Windows system layouts, esp. when it comes to quick configuration tweaks and the like, but the simple fact that you have to know how to do shell scripting to install applications for yourself only is ridiculous IMHO. I can do it, but it'd be a lot easier for people I'm trying to get started on Linux to never have to worry about entering a password every time they want to install a new version of the Same Game.
Re:LINUX needs to tell apps where they live! (Score:2)
-molo
Re:LINUX needs to tell apps where they live! (Score:2)
Running out of space on a given partition. Deciding you want to put gcc, ld, and friends inside a "development" directory and "xpdf" inside a "viewers" directory. Keeping configuration files with the program that uses them so when you start using a new machine you can copy over one bundle of files and everything works the same...
Re:LINUX needs to tell apps where they live! (Score:2)
Running out of space? Increase your logical volume size and grow your filesystem.
Want to copy your configuration over? Copy all of
Guess w
Re:LINUX needs to tell apps where they live! (Score:2)
I'd note that libprefix(db) is not so users can pointlessly drag icons around all day and mess about with filing system structures in the process. It's so you can install to peoples home directories, or /opt, or /usr, or /usr/local, or perhaps a path on another mounted drive. Having relocatable programs is just conveni
Location almost irrelevent (Score:2)
Huh? No they don't. Most packages don't give a rat's ass where the binary is located. Historically, GCC was one of the few that did, and recent versions have changed that so that you can move the install tree around.
The remaining few packages that care are mostly just suffering from bad design. Fortunately, as you say, they usually pay attention to environment variables telling them
Re:Location almost irrelevent (Score:2)
On Linux at any rate, virtually any program that uses glade, or loads data files/artwork from an external file, will have paths hard-coded into it by the C preprocessor at build time. Removing these hardcodings is a royal pain in the ass.
I'm hoping once we get a nice API and strong implementation projects will begin to deprefix their software with our library.
Re:Location almost irrelevent (Score:2)
I've seen this, too, and along with hard-coded absolute paths throughout GNOME files and library
I swear that somewhere along the line, open source took a really big step backwards with respect to libr
More to the point... (Score:3, Insightful)
You see, I know there are folks out there like you... so I don't have to be like you too. Enough hobbyists and security folks will bang on popular newly released code to pacify my concerns. For specialty apps coming from unknown sources, care is taken and source code reviewed. But for code from the likes of a major OS distributor (RedHat, Debian, etc.) or a major code project (Moz, Apache, etc.), I don't have to bother.
I want it fast and pain-free. Binaries please.
RPMs, an' all. (Score:5, Informative)
Best thing about RPMs? GPG signatures built in. Try rpm -K whatever-x.x.x.rpm next time. Second best thing? rpm -Va.
Ah yes, packaging (Score:5, Interesting)
Definitely one of the features which makes Linux a powerful OS. A good and well-configured packaging system can be a blessing, automagically resolving dependencies or at least telling you where and how things will fuck up. The problem with package managers is that there are quite a few of them around. Normally, diversity would be a good thing, but those package managers don't seem too willing to process each other's packages...
For example, RPM packages are common these days; most open source software has a few packages ready to be installed. Pretty much the same thing with Debian packages, because of the large userbase. Chances are that a few hours after the release of a major product someone has made a .deb somewhere, ready for you to install. However, if you look beyond these two packaging systems, you get a few nasty surprises...
The TGZ packaging scheme (also mentioned in the article, along with RPM and DEB) just... Well... Sucks. Or at least in Slackware; I don't know if any other distributions use it differently, but let's use Slackware's TGZ system as an example for now. What's wrong with it, you ask? First of all (and possibly the foremost reason), it's almost unused. Apart from the Slackware packages themselves, I've never seen anyone distribute something in the TGZ format which worked. That excludes the few things which I found and which simply refused to install. It doesn't do dependency checking, conflict checking, heck, it doesn't do anything, or so it seems. I'd continue, but ranting about the bad parts of Slackware isn't the issue at hand.
The issue at hand is the two remaining package systems, which might be technically sound and quite usable, but still won't see a lot of use. Who here has ever heard of SLP and PKG packages? And even then, who here knows of any major applications which distribute their software using those package systems? Sure, SLP and PKG might be a dream to use, but without any actual packages to install, they're (sadly?) not really of any value.
Which brings us back to RPM and DEB, apparently the two most common systems, courtesy of Red Hat and Debian. Looking at the list of summed-up data, it's really no miracle those two are more common: both support most of the options listed, both are backed by a large number of users/developers, and both are relatively easy to use, yet still distinct. Perhaps a system which allows multiple systems to cooperate (regarding dependencies, conflicts and the like) would be a nice complement to both RPM and DEB?
Re:Ah yes, packaging (Score:5, Informative)
> just... Well... Sucks.
Not intended as an attack against your comment (you are fully correct, it sucks) but to clarify a point:
Slackware created a rather elegant hack at the time, of having a
Imagine if you will, a
That is all a
This is why it supports no dependencies or checking: because it's just an archive file.
Technically speaking, this isn't a package format as much as a creative way to run a shell script after extracting some files.
* I realize you were just replying to the article's claim that it is a package format, and from your own experiences. I just wanted to explain why your experiences sucked... It was more of a design flaw to use an archive as a package format than the package format itself sucking.
From an archive stand point,
Re:Ah yes, packaging (Score:2)
The other cool part about them was that the final byte in the file told you what version of SLP the package was, and a single fread() ca
Ah, another linux-only discussion (Score:5, Interesting)
All of these formats could be done better. The OpenPackages project had a design project underway to consider the features of an ideal, multi-platform package format early last year but it seems to have died from lack of input. It'd be great to see it get a breath of new life. If nothing else, this article could serve as a starting point for what we do and don't like about current formats.
OSX Packages (Score:3, Insightful)
Re:OSX Packages (Score:2)
Re:OSX Packages (Score:2)
1) Open up your web browser
2) Find it on the web.
3) Download the DMG
4) Open the DMG (I know Safari does that for you; good usability that, magic self-destructing folderfiles)
5) Decide - is the icon an AppFolder, or an installer? Installers are rather common on MacOS these days, primarily because a pure appfolders system is too limited.
5a) If it's an AppFolder, open up the finder
5b) Navigate to Applications
5c) Drag and drop the appfolder in
Re:OSX Packages (Score:2)
Re:Ah, another linux-only discussion (Score:2)
One problem with humans is that they easily forget alternatives in light of the current "one true way". Linux will become the next Windows. Just wait.
Gentoo and FreeBSD ports (Score:4, Insightful)
InstallShield (Score:2)
Because you haven't looked? (Score:2)
htt
Everyone and their brother is probably writing an installer (although more people are apparently writing MP3 jukeboxes, Web image galleries, and CMSs. Trust me.) Can't say I'm seeing a "clear winner" though, which is also the case with apt front-ends.
Re:InstallShield (Score:2, Insightful)
Each of these does what InstallShield et al. do: install a program, keep track of how and where it was installed, and give the user the ability to uninstall it.
I've had problems with un-installing things on my Windows XP box at work: too many software makers try to get around it, and there's no way to figure out where things were put. At least with Unix, I can do a find(1) before installing, then after, do a diff(1) and figure things out from there if I don't trust t
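The find-then-diff bookkeeping described above can be sketched end to end, with a throwaway directory standing in for the real filesystem:

```shell
# Sketch of the before/after bookkeeping described above, using a
# throwaway directory in place of the real filesystem.
set -e
mkdir -p fakeroot/usr/bin
find fakeroot | sort > before.txt
touch fakeroot/usr/bin/newtool          # stand-in for the "install" step
find fakeroot | sort > after.txt
diff before.txt after.txt | grep '^>'   # the files the install added
```

The lines prefixed with `>` are exactly the paths the "install" created, which is the manual uninstall manifest the parent is describing.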
Technically... (Score:5, Informative)
Similarly, I could install a Debian binary package if that were all that existed for my particular environment, with a simple
(I digress, simple may be relative)
On the other hand, since RPMs have a special binary header, the lazy would be forced to install RPM and Berkeley DB on non-Red Hat machines in order to build an RPM package. Though it is possible to extract the gzip'ed+cpio'ed data in an RPM without using rpm.
So, in my view, Debian has a bit of an upper hand in simplicity, from a technical standpoint, but not by much.
Re:Technically... (Score:5, Insightful)
This is not something anyone but highly technical users will encounter, and it is not for the faint of heart. However, it is something that has undermined my ability to recover from catastrophic failures on machines with RPM that do not have CD or network access. I have even been reduced to binary manipulation of RPM files to extract the cpio-compatible archive (not a task I would undertake lightly).
In contrast, with Debian packages, I have been able to rebuild a machine from scratch with ar, tar, and gzip, which are extraordinarily unlikely to break. Even in the event that they are unavailable, one can copy them, statically compiled, to lightweight media, and then they have no real dependencies. Even if dpkg or apt fails (the latter more likely than the former, in my experience), it is almost always possible to recover from catastrophic mistakes.
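The ar+tar+gzip recovery path described above can be demonstrated with a hand-built stand-in for a .deb (file and package names are invented; a real .deb also carries a populated control.tar.gz):

```shell
# A .deb is an ar archive holding debian-binary, control.tar.gz
# and data.tar.gz -- so ar, tar and gzip are all you need.
set -e
mkdir -p pkgroot/usr/bin
printf '#!/bin/sh\necho hi\n' > pkgroot/usr/bin/hello
echo '2.0' > debian-binary
tar czf data.tar.gz -C pkgroot .
tar czf control.tar.gz -T /dev/null     # empty control, for the sketch
ar rc demo.deb debian-binary control.tar.gz data.tar.gz
# the recovery path: extract the payload with only ar + tar + gzip
mkdir -p extracted
ar p demo.deb data.tar.gz | tar xzf - -C extracted
ls extracted/usr/bin/hello
```

`ar p` streams one member to stdout, so the whole recovery is a single pipeline; nothing here needs dpkg, apt, or a database.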
In summary,
Re:Technically... (Score:2)
Though I do thank you for adding that bit of information.
The problem with RPM... (Score:5, Insightful)
But it's very difficult to create those management tools for RPM when the API is a "black art" known only to a few. Questions on the RPM mailing list/newsgroup will generally be met with the advice to "use the source, Luke"--all several hundred thousand lines of it!
Slashdotted..... (Score:3, Informative)
Here is the link to googles cache of the site:
CLICK HERE [216.239.57.104]
Its about how you can get them... (Score:2, Interesting)
apt-get and rpm (Score:3, Insightful)
Cause and effect (Score:2, Funny)
Effect: The word apropos is used on Slashdot.
Re:Cause and effect (Score:2)
Ergo, the otherwise contradictory systemic anomaly displays the emergent grotesqueries of our race, thus leading to the far slowe
Binary packages don't mix with source packages. (Score:2, Informative)
The RPM/DEB ideas are really good. The main problem I have, however, and I don't really know how it can be solved, is combining binary packages with source code packages (e.g. when you compile your own X). When the time comes that you want to update, say, libc, you will be unable to do so because the dependencies include almost every package.
Or when you compile a program that is listed as a dependency of another program, you cannot install that other
Where Is that Lib. (Score:2, Informative)
Portage has binary packages too, kinda (Score:2)
When you merge a package, do "emerge -b package". It builds and installs the program like normal, then creates a signed tarball file that you can use to install on other machines. emerge can then take that
What we need (Score:3, Interesting)
And instead of the old PATH environment variable idea, think of something new. How about a central file (with user-modifiable sub-files) that contains a list of all binaries to be called by default?
Or how about a package tree in bash's memory that holds the information about which binaries are callable... etc.
There are so many ways to get rid of PATH, and with PATH gone, nothing speaks against installing every package in its very own directory anymore, making administration and package management so much easier...
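One way to sketch the "every package in its own directory" idea without abandoning command lookup entirely is a symlink farm (all names here are hypothetical; this is essentially the approach GNU Stow takes):

```shell
# Every package lives in its own dir; one symlink per binary lands
# in a shared "farm". Uninstalling = deleting the package dir and
# its (now dead) links.
set -e
mkdir -p pkgs/hello-1.0/bin farm/bin
printf '#!/bin/sh\necho hello\n' > pkgs/hello-1.0/bin/hello
chmod +x pkgs/hello-1.0/bin/hello
for f in pkgs/*/bin/*; do
  ln -sf "$(pwd)/$f" farm/bin/
done
PATH="$(pwd)/farm/bin" hello    # lookup succeeds via the farm alone
```

Each package stays self-contained in pkgs/, yet a single directory on PATH (the farm) is enough to find everything, which is most of what the parent wants from replacing PATH.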
Zero Install (Score:5, Interesting)
Zero Install [sf.net]
"The Zero Install system makes software installation not merely easy, but unnecessary. Users run their applications directly from the Internet from the software author's pages. Caching makes this as fast as running a normal application after the first time, and allows off-line use."
Experimental, but give it a try. See especially the comparison with apt-get [sourceforge.net] and the security model [sourceforge.net] documents.
Re:Zero Install (Score:2, Interesting)
what a day to post this (Score:5, Informative)
uPM anyone ? (Score:2)
Does anyone have more information to add about uPM?
Bad Grammar (Score:2, Funny)
Always looking towards the past... (Score:3, Insightful)
* relocatable packages
* support for arch name in metadata, arch indep packages
* multiple versions of the same package can be installed simultaneously (is this really a package format issue?)
Sigh. The guy has an entire section on how well "standard" tools can manipulate these file formats, as if the typical user has any desire to do home surgery on their software.
(Well, why shouldn't he? The typical linux user does want this level of control...)
But there, at the end, in the neglected "ToDo" section, are the real issues. Features that put the user in control of their software instead of the other way around. Is anyone ever going to write a package management system that addresses the needs of the user, instead of the sysadmin?
-pmb
Why it's not as important as you might think. (Score:3, Insightful)
The only conclusion I've come to is this: the package format itself isn't so important... what matters is the whole-system approach to packaging and distribution.
Take Debian. Everyone agrees, I think, that the Debian package format and apt together make for a great system... but that's because of the method of package distribution and tracking, not the packaging system itself; that, and the fact that it's fairly universal in the Debian world. Several apt repositories make up basically all software available for Debian... and it's a lot. So the overall experience is "great package management". It's not just about the format, but the people... people know what's in the standard packages, and can refer back and forth to them, checking for compatibility and whatnot. The overall approach to package management is what rocks... not the binary format.
Look at OSX... they have Fink. Fink, if you don't know, is basically apt-get for OSX. Works fine, no problem... except it puts stuff in its own folder (/sw) and it doesn't necessarily know about Apple stuff already installed... it only tracks stuff that is in the Fink repositories.
In other words.. it's useful, but it doesn't have the feel of a really great package system.. because the system itself isn't based on it.
People say "ports rocks" in BSD land... but why? Because it's superior? No... just because it's a big collection of useful stuff that handles dependencies well. The actual package management system is extremely basic. But the system is more or less based on it, so it works very, very well.
Redhat.. is kind of a mess. Is it because rpm sucks? Heck no.. it's just because, well, the overall approach wasn't right.
OSX... (yeah, okay, I'm a Mac fiend now... I admit it.) What package management? Apps tend to be one single file, which is a package containing all the bits and pieces. No real package management system to see what's installed or not... and who needs it... you can just go to the Apps folder and toss stuff in the trash to get rid of it. The system was designed to work that way, so it works really well. You don't say "Gee, I wish the system tracked apps" because it's so very simple to get rid of them, and to ferret out any pieces they may have left behind, which is rare...
So overall... the complexity of the package management isn't as important as everyone sticking together on how things are going to be installed and removed. If everything works the same way, it doesn't really matter how sophisticated it is.
OOOOPS (Score:3, Interesting)
He states that rpm is not unpackable by standard tools.
Can an experienced user, when presented with a package in this format, extract its payload using only tools that will be on any linux system? They can remember a few facts to help them deal with the format, but remembering file offsets and stuff like that is too hard.
The first problem I have is with "any linux system". Ummmm, I've got a Linksys router running Linux that can't do jack with any of these. Next, an RPM is actually a cpio archive; rpm2cpio is just a tool to shortcut what is doable with cpio. This applies as well to all of the "standard tools" statements. I would also like to point out that "standard" depends on which standard you use: POSIX, LSB, etc. rpm is a standard of the LSB but not of POSIX.
His statement that binary programs are not allowed.
Must these programs be scripts, or can compiled binaries be used as well?
This is very unclear. Can I execute a binary from within rpm? The answer is yes; I do it all the time. Can an RPM be made directly from binaries (skipping all of the build etc.)? Yes it can. Can I embed the binary in the RPM and not have it ever get installed? No. But I can run it and then remove it before RPM finishes.
Suggestions ... he states that RPM doesn't have them.
... but it's up to the packager to use the tool. Second the author needed to get a little deeper into rpm's queryformat (info here) [rpm.org] He would have found much of what he needed.
A suggestion says a package may sometimes work better if another package is installed. The user can just be informed of this as a FYI
This is really the fault of the packager not of the product. There are two areas for comments which can give you this kind of data
Statement that RPM can't do Boolean Relationships.
This means that a package can depend, conflict, etc on a package AND (another package OR a third package). Any boolean expression must be representable, no matter how complex.
RPM does have conflicts and depends parameters that can be set. Once set, you can't install A without B and C, or without removing X and Y.
HOWEVER, he is very right about the boolean "or" being missing... I've been championing this one for a while (I've talked with some of the developers), but it seems it hasn't up till now been high enough on the horizon for someone to take a shot at it. (Sorry, but it's beyond my ken to work on this personally.) So I will keep politely advocating for it until it does break the plane of need.
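For concreteness, a hedged spec-file sketch of what can and cannot be expressed (package names are invented; the commented OR line is hypothetical syntax, since that is exactly what's missing here):

```
# Spec-file fragment (package names invented):
Requires: liba, libb           # AND: both must be present -- supported
Conflicts: oldlib              # conflicts -- supported
# Requires: (liba or libc)     # boolean OR -- the missing piece
```

Much later rpm releases did eventually grow boolean dependencies with roughly this parenthesized syntax, but nothing of the sort existed at the time of this thread.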
New Section
Sorry, but this statement is just too nebulous. RPM has been coping with the unforeseen for years, just as Debian has. That's why it gets upgraded; that's in fact a lot of the reason for the new version coming out now: to make the format more modular and easier to mutate as times change.
All in all the article seems well done. However, I'd say the chances are pretty strong that the author is a Debian fan. My personal recommendations would be: one, lose the subjective nature of a number of statements. Next, when doing research, be careful how you ask a question. Often, asking "Can this product do X?" will yield a no, but asking "On a system that uses this product, how do you do X?" will yield a completely different answer.
I'd give this article, all in all, a 6 on a 1-to-10 scale for research, a 3 for new info, and a 7 for layout and style.
Re:counterproductive (Score:4, Insightful)
The distros are quite divided. How 'bout we look at what is specific to each distro so people can choose which is best for them? That is a better way to garner support from the outside community, doncha think? Don't leave them in the dark; that's what scares people away from Linux.
Re:counterproductive (Score:2)
It will be so good all the other packages will run and hide!
um... (Score:3, Funny)
Yeah!
Cause since when has using direct comparisons for determining
the best method for a process worked for anyone?
Re:Binary packages: Security suicide (Score:5, Insightful)
Having a compiler available on all of your systems to compile C code is a far greater risk than the "threat" of getting trojaned builds from Red Hat.
Take your tinfoil hat off and breathe.
Crackpot security (Score:2, Interesting)
Yes, and it is completely impossible for users to (gasp) compile their trojans elsewhere and FTP them over?
Also, are you saying that you're compiling your stuff as root? Bad idea, since compiling software does not require root privileges. A better idea is to compile your software as a user, and then "make install" as root through sudo.
There have been cases in the past in which open source software source code has been backdoored, so that running the
Re:Binary packages: Security suicide (Score:3, Insightful)
If you're paranoid enough not to trust the RPM builders, the checksums, the download process, etc., then download the src.rpm, unpack the contents, and do your voodoo tricks. Then run rpmbuild to build your own damned binary package. I've done it (OK, I'm an RHCE) and it's cake.
Basically, this isn't a reason to dis binary packages unless your paranoia is well into the tinfoil hat level.
Re:Binary packages: Security suicide (Score:5, Interesting)
I'll download the source and compile it on my nice fast zippy P4 2.6ghz machine, then build a
My three router computers are all P133 or P166 machines. No way am I compiling anything there. Routers also don't need gcc installed.
I run my own private apt repository for this. (It's just Apache and some config files in text format, and one more line in my apt sources file.)
This way I can tell whichever machines to apt-get it, and later I can apt-get remove it as well.
I also don't have a huge server farm; I just have 8 machines at home for different purposes. Below the P4 mentioned above, my next fastest system is a P2 450. The others get way slower below that (P2 200 and the like, or worse).
I also don't want a compiler on any system but my development system. The machines in my DMZ don't need compilers; if they do happen to get rooted, that's one less tool for the attackers to use against me.
Open source is also about choice. Please let us have ours, even if you have made your own.
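A hedged sketch of the private-repository setup described above (hostnames and paths are invented): a directory of .debs, an index, and any web server are all it takes.

```
# On the build box:
#   mkdir -p /var/www/myrepo && cp *.deb /var/www/myrepo/
#   cd /var/www && dpkg-scanpackages myrepo /dev/null | gzip -9 > myrepo/Packages.gz
# And one extra line in each client's /etc/apt/sources.list:
#   deb http://buildbox.lan/ myrepo/
```

After that, `apt-get update && apt-get install whatever` on each router pulls the pre-built binaries, so no compiler ever needs to exist outside the build machine.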
Tradeoffs (Score:4, Interesting)
So, how much do you need to trust your packages? Do you have enough work and not enough top-secret data that you can trust the package maintainer, the upstream maintainer, and your copy of MD5? For most people, the answer is "yes". This does not apply to X Random Freshmeat App; if you're downloading a new program and installing it yourself, you should check it out first (if you have the means), since sometimes even good authors do things that are unintentionally destructive. But most people can afford to trust that a package which has been around for a while and comes from a reputable distributor is reasonably safe, especially if they're doing the work of 3 people, maintaining 5 platforms, and just trying to keep up.
Unless you're in a situation where many people want your very important data, you can usually afford at least a little well-placed trust. Otherwise, just keeping up with updates is going to consume an inordinate amount of your time, and the rest of your duties will suffer.
Re:Binary packages: Security suicide (Score:2)
Second, there is the simple case of ease. It is a serious pain in the ass to check each and every program installed on a modern desktop OS. It is so much easier to just click on an app in Red Carpet, wait for it (and dependencies) to download and install, and then get on with doing real work. I don't have the time nor the inclinati
Installing from source can tend to be easier... (Score:2)
Re:Installing from source can tend to be easier... (Score:2, Informative)
For a desktop linux distribution, I run Mandrake. Currently, I'm running Mandrake 9.1.
Let's say that today, I want to install a common package that I don't have, but want to use, like kismet, the wireless sniffer (http://www.kismetwireless.net).
So, this is what I do:
# urpmi kismet
and how long does it take?
# time urpmi kismet
ftp://ftp.club-internet.fr/pub/unix/linux/distributions/Mandrake/9.1/contrib/RPMS/kismet-2.8.1-2mdk.i586.rpm
installing
Re:Binary packages: Security suicide (Score:2)
I use source packages when I need to make something fit in the system, even if it means hacking the code.
Generally, only if a program is of critical importance, or I am curious do I ever bother source browsing. I am not lazy, just practical. I have real work to do, and wasting my time being -anally- paranoid about legitimate distribution channels doesn't help me do my job.
Re:Binary packages: Security suicide (Score:5, Informative)
The binary distributions usually come with a way to compile your own binary packages if you wish to inspect the source code. Just download the source RPMs or use "apt-get source". The "binary distributions" are generally binary and source distributions where you can choose whether you use prepackaged binaries or compile your own.
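For example, that workflow looks roughly like this (the package names are hypothetical, and the commands need network access plus the distribution's build tooling, so they're shown commented out):

```shell
# On a Debian system, fetch and rebuild a package from source:
# apt-get source foo                       # downloads the .dsc, .orig.tar.gz, and Debian diff
# cd foo-1.0 && dpkg-buildpackage -us -uc  # inspect the source, then build an unsigned .deb
#
# On an RPM system, rebuild from a source RPM:
# rpmbuild --rebuild foo-1.0-1.src.rpm
echo "sketch only; run the commented commands on a real Debian or RPM system"
```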
Re:Binary packages: Security suicide (Score:5, Funny)
Every time I buy a car, I bring a socket wrench set with me and tear apart the engine right there on the dealership floor. If I didn't, how could I know that Al Qaeda hasn't set me up to be a pawn!?! Not checking your equipment like this is tantamount to supporting terrorists.
Re:Binary packages: Security suicide (Score:2, Insightful)
N
Re: Security suicide - not necessarily (Score:3, Insightful)
Binary packages are a great boon to any sysadmin that manages more than one box..
I admin a few dozen Linux boxes (all Slackware
For example, not too long ago I updated Squid (due to a security hole), which was running on 20 or so of my servers.. can you imagine HOW
Re:Binary packages: Security suicide (Score:2)
However, it is reasonable to expect several hundred packages to be in place on a moderately loaded machine. I get paid to admin Linux, but even I have better things to do than sit and go over all the packages, even with grep. On 9 different distributions. Across 30+ machines. And it's not like I have a large installation to deal with.
Re:Binary packages: Security suicide (Score:4, Insightful)
Obviously, for large software packages, you probably don't have time to read every last line of code.
That is the understatement of the year. I would dare say that in order to read and understand a program that is on the order of five million lines, it could take you a year or two. For a non-expert programmer (or even an expert with no operating systems experience) it could take forever to just begin to understand something like the Linux kernel source.
But what I generally do is untar the source and then grep through it for suspicious things.
That's great if you are an expert programmer (and if you think that a simple grep will help you that much). But what of the small business that doesn't employ you? Do they need to perform that same review? Of course, they could skip it and just compile the thing, but that is the same as just using the binary packages!
You may as well just hang your Linux box out on the net with 500 open ports and no firewalls.
Baloney.
Because a well-hacked program will allow the hacker to get at your data, firewall or not.
A well-hacked program will be completely invisible to you, as well. Your grep methodology is too simplistic to catch any sort of sophisticated trojan. Even if you were to laboriously go through the code, line by line, you still wouldn't catch anything but the most obvious of hacks/problems.
The only way you can be completely sure is to read the source.
No, the only way is to not run it. Software is not a mathematical formula that you can "prove". Large programs are horribly complex, as you most likely already know. Binary packages serve a very useful purpose for many people. If you choose not to use them and to perform some limited form of code review, then that is great for you, but don't try to demean anyone who doesn't do the same.
Re:Binary packages: Security suicide (Score:3, Insightful)
Mandrake RPMs are security-signed, and if they don't have a valid Mandrake GPG signature, you get prompted as to whether you really want to install them. I'm sure most other distributions which use binary packages do the same.
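For the curious, checking a package signature by hand looks roughly like this (the key path and package name are made up, and the commands need an RPM-based system, so they're commented out):

```shell
# Import the vendor's public key once, so rpm can verify signatures against it:
# rpm --import /usr/share/doc/RPM-GPG-KEY
#
# Check the GPG signature and digests of a downloaded package:
# rpm --checksig foo-1.0-1mdk.i586.rpm
echo "signature-check sketch; run the commented commands on an RPM system"
```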
Re:Binary packages: Security suicide (Score:2)
Re:Binary packages: Security suicide (Score:4, Insightful)
Re:Binary packages: Security suicide (Score:2, Troll)
On trusting trust (Score:2)
I have two questions. What percentage of the packages you use did you download the source for and grep through? A
Re:Binary packages: Security suicide (Score:4, Insightful)
You haven't read the entire source of the GNU/Linux system you're running, so you have no business telling us that we ought to do it!
Why am I so confident you haven't read the source? Because it's not possible. Even a relatively basic Linux install will correspond to over 2GB of source code. That would be about 800,000 pages of a typical book (2500 to 2800 characters per page). Source code is typically much less dense in terms of characters per page, so it would be millions of pages. It would take you several years to read it, by which time the bits you'd read first would be long obsolete.
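The arithmetic above is easy to check (a rough sketch; the 2GB and characters-per-page figures are the parent poster's estimates, not exact values):

```shell
bytes=2000000000        # ~2GB of source, one character per byte
chars_per_page=2500     # a dense book page, per the estimate above
echo "$((bytes / chars_per_page)) pages"
```

That yields 800,000 pages, matching the figure in the comment; at a lower characters-per-page count for source code, the page count only grows.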
How about just configuration hell? (Score:2)
Mostly you have no idea what options the binary package builder chose. Some co
Re:Binary packages: Security suicide (Score:2)
Or you can choose not to and use a binary. It's about choice and flexibility, not my-way-is-best or "just trust me." I've had some cases where my compiled versions simply aren't quite up to snuff, either because my compiler version is off or for other reasons, and the binary worked much better.
The whole my-distro, my-way attitude is just going badly for all of open source. Really, if you have the skills and time to edit so
Re:Why get binaries (Score:2, Interesting)
Re:Why get binaries (Score:2)
Gentoo provides binary downloads of the larger packages. The stage 3 tarballs (available in multiple flavors to accommodate different processors) provide the basic system, while v1.4rc2 includes the Gentoo Reference Platform (GRP). GRP provides prebuilt X11, GNOME, KDE, Mozilla, and OpenOffice on x86 and PowerPC systems. (It doesn't seem to have been updated for newer Gentoo versions, so it's no
Re:Why get binaries (Score:2, Insightful)
Re:Why get binaries (Score:5, Insightful)
Before giving me an explanation as to why you (read, the parent poster) in particular would not have a use for binary packages, allow me to explain why binary packages are useful. In the majority of instances, binary packages are useful when installing the userland on a system, or when installing a compiler and you have no other system to build the compiler on. Binary packages are also handy for systems where compiling from source would be inconvenient, resource-intensive, and time-consuming.
Also, there are some proprietary applications that are not available as source, so a logical manner of packaging is with a standard binary packaging system such as RPM or dpkg.
Even NetBSD has its own binary package format (no, not the sets, those are for the base install and are just tarballs without package information).
All in all, binary packages are very convenient, despite the inconveniences caused by vendors who do a poor job of managing their package collection and dependencies. Binary packages are single files, smaller than source archives in most instances, and are installable in a uniform manner.
Let's not get into rogue "package vendors" who package trojans. They are the minority, and most reputable software developers release their own binary packages along with sources anyways.
I think I need a glass of water.
Re:Why get binaries (Score:5, Insightful)
Because some people run dozens or hundreds of machines with identical configurations, so compiling the same package on every single machine is pointless?
Because companies prefer working against a known build of a piece of software for support reasons?
Source distributions are far from a panacea.
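As a sketch of why the identical-configurations point matters, pushing one prebuilt package to a fleet is just a short loop (the hostnames and package name are invented, so the remote commands are only echoed here rather than run):

```shell
# Build once on the build host, then deploy the same binary everywhere
PKG=foo_1.0_i386.deb
for host in web1 web2 web3; do
  # On a real network this would be: scp "$PKG" "$host": && ssh "$host" dpkg -i "$PKG"
  echo "deploy $PKG to $host"
done
```

Compare that with waiting for a full compile on every box, including the P2 200s mentioned elsewhere in this thread.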
Re:Why get binaries (Score:2)
Even more pertinent of a question: (Score:2)
Why should we choose Gentoo when we have the actual sources, with which we can download and build our own optimized binaries?
Re:Because of /.ing (Score:2)
Re:Because of /.ing (Score:2, Funny)
suggestions: yes / no / no / no / no
conflicts: yes / yes / no / yes / yes
Ahh, I understand...
Re:Take it into context (Score:2)
Accept more,
grab aptitude,
learn aptitude,
love aptitude.