Ulrich Drepper On The LSB
Sam Lowry writes "In a recent post on his LiveJournal, Ulrich Drepper criticizes the LSB standard and urges distributions to drop it." It's an interesting piece; Ulrich raises some good points.
who? (Score:4, Insightful)
Re:who? (Score:4, Informative)
He also has an ego that could drag Theo de Raadt's ego into a dark alley and beat it senseless. He is an asshole.
How he is considered qualified to talk about the LSB when it doesn't have much of anything to do with Glibc, I don't know.
Re:who? (Score:5, Informative)
Probably because the LSB was created so that commercial binaries can run on any LSB-compatible distro. A key part of this is also related to symbol versioning in Glibc. As Ulrich is the maintainer of Glibc, and as he works for Red Hat, which has to guarantee LSB certification, I guess he's entitled to talk about the LSB.
Re:who? (Score:4, Informative)
As I understood that somewhat incoherent rant, his complaints are actually about the LSB test suite, not the spec itself, and specifically about linker- and threading-related bugs in the suite.
Re:who? (Score:3, Interesting)
And simply working together to fix that test suite may be a more pragmatic way of fixing things. The LSB organisation has been open to feedback, is fixing things and is, like all such organisations, resource-constrained. Exactly the sort of thing open source volunteers are so excellent at helping overcome. Especially those employed by companies, like Ulrich's employer, who really want LSB certification.
And linux desperately needs LSB. At the very le
Re:who? (Score:3, Interesting)
Riiight, you have apps. that are checking for an include file in a specific location, which is _also_ provided by dietlibc ... and I can guarantee they can't use that.
Instead of say just running /lib/libc.so.6 and comparing the version (which, of course, isn't ideal and it could even
Re:who? (Score:5, Interesting)
If the test suite is broken, then the LSB guarantees are worthless.
Re:who? (Score:5, Informative)
AFAIK, GLIBC is one of the components required for LSB compliance.
And he's right: the LSB was a poorly thought out attempt to make all distributions compatible with Red Hat rather than an attempt to come up with a common ground for all distros. For example, why oh why is RPM support required for LSB compliance? It doesn't affect the execution of software on the system, and only serves to create a mess for distros that use another packaging system.
Far more frustrating than that, however, is the fact that the LSB only covers the very core of the system. The APIs that 90% of programs rely on are not even mentioned in the LSB spec. Rather, the spec simply states that a few very basic libraries must exist, then goes on to detail the signatures of the functions those libraries provide. Not particularly useful unless you're Sun Microsystems looking for a way to convince people that you're compatible with Linux.
Re:who? (Score:3, Interesting)
A better approach. (Score:3, Interesting)
#2. Define the functionality needed by the package management system to install, update/upgrade, remove those packages.
#3. Let the various distributions add that functionality to their own systems IN ADDITION to the functionality they already have.
Never define an app as the "standard".
Always define the functionality so anyone can write an app to that standard.
Re:who? (Score:3, Informative)
So there is some standard way of packaging a program for all LSB distros. Joey Hess's Alien can turn an LSB RPM into about any package format you might need.
LSB only covers the very core of the system.
Right; that very core has taken years to standardize.
The APIs that 90% of programs rely on are not even mentioned in the LSB spec.
What programs? It's designed to be sufficient for commercial binaries, which historically statically link everythin
Re:who? (Score:5, Insightful)
I take that right back. I'd forgotten that LSB goes as far as defining the ABI, which is clearly the realm of Glibc and something which Ulrich is more than qualified to comment on.
I've always thought that the biggest problem with LSB was that it didn't go nearly far enough, which means that distributors and users can't all use the same binary and we end up with these ABI issues that Ulrich complains about.
From what Ulrich says, the idea of the LSB is good but the implementation is deeply flawed. The standards board is separated from the implementors, who are separated from the testers, and communication and understanding between the groups is poor. Which is a shame, but the LSB has always struck me as a bit of a lame duck.
Re:who? (Score:3, Insightful)
Re:who? (Score:5, Interesting)
The LSB has nothing to do with glibc? Really? Strange. I always thought the LSB was designed to ensure binary compatibility between distributions, and hence has quite a lot to do with glibc.
Personally, I still think the LSB has some value, but Uli's concerns are valid. IMHO, they seem to point to problems with the current LSB test suite that should be fixed, rather than leading to the conclusion that the whole concept is broken, though. In its current form, there is little value to be had in LSB compliance, true. But it needn't always be that way. A decision needs to be made to either fix the LSB or abandon it altogether. Uli prefers the latter approach. I favour the former.
currently leads Glibc (Score:3, Interesting)
Re:currently leads Glibc (Score:5, Insightful)
Were there one available, I would still be unlikely to use it. The fact remains that after you've seen through all the marketing hype, XML remains inappropriate for many tasks, and configuration files are right at the top of the list. You only have to look at Jabber or Tomcat to see some perfect examples of that.
XML is great for configuration files (Score:5, Informative)
The fact remains that after you've seen through all the marketing hype, XML remains inappropriate for many tasks, and configuration files are right at the top of the list.
In fact, it's the opposite: XML makes a lot of sense for configuration files. For instance, suppose that you need to write a script that automatically adds a line to /etc/X11/xorg.conf or a similar configuration file. If a file like that is in XML, this is trivial: you can write an XSL transformation or use any of a billion tools to apply the change in a correct way. But if it's in some ad-hoc file format (as it is right now), you either have to write a parser and unparser (which would have been unnecessary if it had been in XML; and how do you know for sure that your code is entirely correct?) or use some hacky combination of sed/grep/etc. to perform the change (which is, alas, the "Unix way"). The latter will of course fail unpredictably in lots of cases. E.g., are you handling those sections correctly? Comments? What if the line was already present? And so on.
Of course, XML is a horribly bulky format. But who cares? It's not like configuration files will take up a lot of disk space either way. The important thing is to have a universal standard format that can be easily manipulated using standard tools so that you don't have to implement parsers and printers all the time or approximate them using broken sed/grep hacks.
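As a concrete (if contrived) illustration of the point, here is roughly what such an edit looks like with libxml2 in C. This is my own sketch, not anything from the parent post; "example-config.xml" and the <modules>/<module> element names are invented for the example, not taken from any real xorg.conf schema.

/*
 * Sketch: programmatically add an entry to an XML config file with
 * libxml2 instead of a sed/grep hack. File name and element names are
 * hypothetical. Build: gcc add_module.c $(xml2-config --cflags --libs)
 */
#include <stdio.h>
#include <libxml/parser.h>
#include <libxml/tree.h>

int main(void)
{
    xmlDocPtr doc = xmlReadFile("example-config.xml", NULL, 0);
    if (doc == NULL) {
        fprintf(stderr, "could not parse example-config.xml\n");
        return 1;
    }

    xmlNodePtr root = xmlDocGetRootElement(doc);
    if (root == NULL) {
        xmlFreeDoc(doc);
        return 1;
    }

    /* Find (or create) the <modules> section, then append a <module>.
     * The parser has already dealt with comments, whitespace and
     * nesting -- exactly the things an ad-hoc one-liner gets wrong. */
    xmlNodePtr modules = NULL;
    for (xmlNodePtr cur = root->children; cur != NULL; cur = cur->next)
        if (cur->type == XML_ELEMENT_NODE &&
            xmlStrcmp(cur->name, BAD_CAST "modules") == 0)
            modules = cur;
    if (modules == NULL)
        modules = xmlNewChild(root, NULL, BAD_CAST "modules", NULL);

    xmlNewChild(modules, NULL, BAD_CAST "module", BAD_CAST "glx");

    /* Re-serialize; checking for duplicates could be added here too. */
    xmlSaveFormatFileEnc("example-config.xml", doc, "UTF-8", 1);
    xmlFreeDoc(doc);
    xmlCleanupParser();
    return 0;
}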
Re:currently leads Glibc (Score:3, Interesting)
Gotta love those specially formatted comments! Need to generate or parse this code which is embedded after initial comments and before shell script code? Or need to make a programmatic way of modifying i
Re:currently leads Glibc (Score:3, Insightful)
Re:currently leads Glibc (Score:3, Insightful)
Re:who? (Score:3, Insightful)
Some of these rules are *nix standards and make sense in an old-fashioned, traditional way. Or they make sense in a "we need a standard place to always find these files across different systems, in order to assume some sort of compatibility across platforms" way.
But they don't always offer the best solution. Sometimes they have unnecessary rul
Re:who? (Score:3, Funny)
The MAIN GCC developer... (Score:2)
Re:The MAIN GCC developer... (Score:5, Interesting)
And while he happens to be right in this case, I don't think very highly of him. He's clearly very bright, but the poster above who said that Ulrich had a bigger ego than Theo was spot on. Too often, he lets his ego and NIH syndrome get in the way.
For example, glibc is the only major C library that doesn't support the new buffer-protected string functions originally written by OpenBSD (at least last time I checked). These functions are faster, safer, and easier to use than the POSIX ones and are supported not just on the BSDs but on almost every commercial UNIX. Source compatibility alone would dictate including them.
Drepper, however, has repeatedly refused to include them because they work and they make it too easy to not code buffer overflows (no, this is not a joke). According to Drepper, programmers should be good/smart enough not to mess up something as simple as a string buffer, so including a de facto standard that makes it easy to get it right is inappropriate. WTF?
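To make the argument concrete, here is a simplified sketch of the strlcpy() interface being discussed. This is neither OpenBSD's implementation nor anything from glibc, just an illustration of the semantics: copy at most size-1 bytes, always NUL-terminate, and return the source length so truncation is a single comparison to detect.

/* Simplified sketch of strlcpy() semantics -- illustration only. */
#include <stdio.h>
#include <string.h>

static size_t my_strlcpy(char *dst, const char *src, size_t size)
{
    size_t srclen = strlen(src);
    if (size > 0) {
        size_t n = (srclen >= size) ? size - 1 : srclen;
        memcpy(dst, src, n);
        dst[n] = '\0';      /* always terminated, unlike strncpy() */
    }
    return srclen;          /* caller compares against size to detect truncation */
}

int main(void)
{
    char buf[8];

    if (my_strlcpy(buf, "a string that is too long", sizeof(buf)) >= sizeof(buf))
        fprintf(stderr, "warning: truncated to \"%s\"\n", buf);

    return 0;
}

Compare that with strncpy(), which neither guarantees termination on truncation nor tells you that truncation happened.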
Re:The MAIN GCC developer... (Score:5, Informative)
While Ulrich has his faults, the above is completely false. The reason they weren't accepted into glibc was IIRC:
1) They are non-standard and did not have a usable standard-like definition apart from the implementation, and had no tests (Solaris implemented them slightly differently, for example, and Input Validation in C and C++ [oreillynet.com] from O'Reilly also screwed it up -- and that was written by people selling a Secure Coding in C book).
2) It doesn't solve the problem better than asprintf(), which had been around for years (although also non-standard), as you still have problems with truncation [and.org] (and both APIs have the problem of requiring the programmer to correctly pass around the metadata about the string, i.e. its size/length) -- see the sketch after this list.
3) Given the above, and the fact that the implementation is "free", anyone wanting to use them can just include the source in their apps and rely on autoconf (and they'll also be guaranteed to have the "correct" implementation).
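For what it's worth, here is a rough sketch (my own, not from glibc or from the original discussion) of the asprintf() alternative mentioned in point 2. asprintf() is a GNU/BSD extension rather than ISO C (hence _GNU_SOURCE); it allocates the result, so there is no fixed buffer to overflow or truncate, and the caller only has to remember to free() it. The ".lsbrc" file name is made up for illustration.

#define _GNU_SOURCE
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *path = NULL;
    const char *home = getenv("HOME");

    /* asprintf() allocates a buffer big enough for the whole result. */
    if (asprintf(&path, "%s/%s", home ? home : "/root", ".lsbrc") < 0)
        return 1;           /* allocation failed */

    printf("%s\n", path);
    free(path);
    return 0;
}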
Re:The MAIN GCC developer... (Score:3, Interesting)
First, what I said above is true: at the time, Ulrich said specifically that strlcat and strlcpy weren't necessary because programmers could just check their code for the common mistakes the strl* functions are intended to solve.
1) It is true that they are not in the POSIX, ANSI, ISO, or Single UNIX standards, but neither is a ton of the other stuff in glibc. However, they are supported on almost every non-GNU libc -- making them a de facto standard. Many open source apps use
who cares? (Score:4, Insightful)
It could have been written by Bill Gates or my mom.
Why does the author have to be so important if the facts are laid out and verifiable? You don't have to agree with his analysis or his conclusions, but the facts should stand or fall regardless of the author.
Re:who cares? (Score:3, Insightful)
There has always been spin and FUD, but these days it has developed into a very organised, very slick phenomenon. This means that you need to give increasing weight to background motivations to pierce the veil.
Re:who cares? (Score:4, Insightful)
ummmm... at some point someone has to produce content to gain credibility. You say that FUD has become slick? Just because someone produces a slick info shot doesn't mean you shouldn't STILL be checking the facts.
I think we're probably on the same side here, but you don't need anything to "pierce the veil" except verifiable references.
Which this guy has. You can go to the bugzilla database that he talks about and discover for yourself if most of the bugs submitted are indeed bugs that show the tests are broken
Re:who cares? (Score:3, Interesting)
Example 1: Interest Rate Prediction Markets
Traders make judgements based on lots of market data - consumer confidence surveys, growth predictions, oil/stock/property price trends, major company results, etc. When you have a particular outlook it is almost always easy to come up with LOTS of figures to support that position. These are all verifiable statistics! To objectively ensure you are taking into account a representative sam
Re:who? (Score:2, Insightful)
My advise: but the losses. To some extend, I think, the claims a scaled back meanwhile, if I understood Art correctly.
To quote Lisa Simpson, "I know those words, but that sentence makes no sense to me!"
Can someone.... anyone convert that into English?
Re:who? (Score:2)
Re:who? (Score:2)
Perhaps he meant "to some extent, the claims should be scaled back"? If they are already scaled back, then what's he ranting about?
Re:who? (Score:2)
extend => extent
a => have been
meanwhile => recently
Re:who? (Score:2)
Re:who? (Score:2)
Ulrich Who? (Score:3, Insightful)
Re:Ulrich Who? (Score:2, Funny)
Wait... they haven't reinstituted floggings for those yet, have they?
Re:Ulrich Who? (Score:2)
It's a play on information asymmetry. When slashdot's URL is pronounced out loud, it sounds like "H-T-T-P colon slash slash slash dot dot org". To the normal person, that's uber nonsense. But to those "in the know", it's like a secret pun.
Granted, most people would now pronounce it like "H-T-T-P colon forward-slash forward-slash slash dot dot org", which loses some of the cleverness.
Re:The primary maintainer for GCC right at the mom (Score:2)
Re:The primary maintainer for GCC right at the mom (Score:2)
You mean glibc, not gcc.
Ulrich Drepper... (Score:4, Informative)
False Alarm! (Score:4, Funny)
I agree, but something needs to happen (Score:5, Insightful)
Something needs to be done. Even with the source, half the time I have to make all sorts of include changes. What is so hard about providing a common build and install process? If you get Apache, OpenOffice, and Mozilla to adopt a convention, everything else will follow. Why not have something like Apache Ant that simply installs either to a user directory or to a common directory and links to every user directory? Then provide a nice GUI on top of it, where it will either compile if the source is there and then install, or just install otherwise? How hard could that be? Forget this
Regardless, this is a perfect example where sometimes it really does make sense to have "management" provide leadership by imposing structure. Ideally, they would be serving and representing the interests of users and helping to overcome the disinterest of joe programmer who doesn't do the psychologically difficult work of catering to someone other than themselves. The "scratch an itch" metaphor breaks down when other people don't know how to "scratch" themselves and need the help of a division of labor to serve their needs. Before you say that they should learn how to "scratch", consider that as a community, society, and economy we all scratch each other's itches in an incredibly diverse number of ways. This comes about because of intentionally trying to fulfill a demand. In the case of the Linux stack of Free/Open Source software, the developers have not taken responsibility for how their product is consumed.
Re:I agree, but something needs to happen (Score:2)
Then the program is broken. Report the bug. The autoconf/automake scripts should take care of all that.
It sounds as if you'd like autopackage [autopackage.org].
Re:I agree, but something needs to happen (Score:3, Interesting)
Re:I agree, but something needs to happen (Score:5, Insightful)
No kidding. You'll find some decent-looking project, and it's no big deal, the developers just require this neat toolkit that they consider standard, and all the l33t distros have it, just not the old ones like Red Hat, Slackware, and SuSE. Of course, the most recent build is from two years ago, because after a year of development all the kids got egos and couldn't stand each other.
Of course, then you find out that the neat toolkit they use depends on an old version of Python, and naturally it's built to do a hard-coded check for a specific version of python in the configure - not the current one of course. And naturally the references to the old version of python are strung throughout the config file. And as it turns out, if you fix all the references in the config, that will break the calls somehow. So you can either install yet another version of python, or forget about this neat little program.
I really prefer compiling from source, but it's getting to the point where it's just not worth the crap.
Re:I agree, but something needs to happen (Score:3, Interesting)
After ditching some old Linux installs a few months ago (a Gentoo system that had gotten hopelessly snarled up and a YDL with a broken RPM database) I tried out a few different options. Conclusion: the most important thing in Linux is a good package archive. The other 10,000 Linux annoyances mostly need to be solved once. The package stuff is just going to keep on biting and biting at you.
Re:I agree, but something needs to happen (Score:4, Interesting)
Right now, so long as you pick one of the "big three" (Debian, Red Hat/Fedora, SuSE), you will have very little package/software install trouble.
Most companies that release Linux software offer the following downloads (as do most OSS software websites for individual products):
1.
2. RPM for Red Hat/Fedora
3. RPM for SuSE
4. DEB for Debian
I have been in the Red Hat family since Red Hat 5 or so and I can tell you that beginning with Red Hat 8 things started to get really easy, and by the time the Fedoras had come around, I was spending nearly zero time compiling my own software or chasing package dependencies. Tools like yum/apt even make it so that you don't have to FIND a download site and double-click on an icon; you just type in a command that says "I WANT IT!"
But even for commercial software like Flash or Java, it's cake, I just install the package. The reason is because the package is DESIGNED FOR MY OPERATING SYSTEM.
Sorry, but most of the other Linux operating systems (Slackware, Mandrake, Yoper, Xandros, whatever) are too small for packagers to target them, and that's generally what results in package hell--you are trying to use a package that assumes the components installed by default in another operating system. So even if they are both RPMs, installing a Red Hat/Fedora RPM on Mandrake will cause you trouble. Even once you get the packages all installed, the configuration and support files are likely to be located in all the wrong places.
And yes, generally the packages ARE clearly labeled. So I guess my answer is the one people hate to hear, but if you're going to ask the question about "package hell" then you're going to get this answer: switch to a bigger distro (best case is probably Red Hat/Fedora) and the problem will generally go away.
Re:I agree, but something needs to happen (Score:3, Interesting)
I will probably get modded flamebait, but I agree.
I just went through the process of adding Bugzilla [bugzilla.org] to my installation of Fedora Core 3 [redhat.com]. I run Fedora because that is the default Linux installed by my provider and anything else would more than double my costs. I just checked the LSB Certified Distribution List [opengroup.org], and sure enough Fedora is not on it. I tried upgrading my system using Yum [duke.edu], but th
Re:I agree, but something needs to happen (Score:3, Informative)
When I installed Bugzilla I issued this command: apt-get install bugzilla. Debconf asked me a few questions, and it worked fine.
Where'd this go? (Score:2)
From the fine article:
This applies also to the code which is written by the presumed professionals paid by the OpenGroup to write tests. Want an example? Look at this [linuxbase.org]. This is no isolated incident, I've found this kind of problems on many occasions.
Thought-Out, or Whining? (Score:2, Interesting)
I'll grant I'm not familiar with all the politics and the specific methodology by which a Linux distro tests or achieves LSB compliance, but this blog entry sounds a lot like whining. Ulrich whines that it's hard, that the audit raises many bugs, that it's tedious, that other distros "somehow" achieve their compliance but he's not sure how, that the audit process itself has bugs, and that the LSB group must be pushing this agenda down people's
Re:Thought-Out, or Whining? (Score:2)
And why shouldn't he be allowed to whine if, as he claims, the bugs are in the *tests*? If he's the maintainer of glibc, I'd assume he knows more about this domain than the average hacker.
Re:Thought-Out, or Whining? (Score:3, Insightful)
Well, you can... But then they are a hindrance, not a benefit.
Re:Thought-Out, or Whining? (Score:5, Insightful)
All other points raised are shown to be consequences of this.
The specific example he cited is a rather enormous bug (a thread which has been detached can, by definition, not be joined. "Detaching" a thread means telling the system that you are not interested in its exit status... and join()ing is reading the exit status).
(This doesn't mean that other examples are as clear cut. It could still be that most tests do actually show genuine glibc bugs, and that he just picked up the right example to bolster his point.)
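For readers who don't live in pthreads land, here is a minimal sketch of the bug class being described. This is my own example, not the actual LSB test case; POSIX leaves joining a detached thread undefined, and in practice pthread_join() typically fails with EINVAL or ESRCH, so a test that expects it to succeed is simply wrong.

/* Build: gcc detach_join.c -pthread */
#include <errno.h>
#include <pthread.h>
#include <stdio.h>
#include <string.h>

static void *worker(void *arg)
{
    (void)arg;
    return NULL;
}

int main(void)
{
    pthread_t tid;
    if (pthread_create(&tid, NULL, worker, NULL) != 0)
        return 1;

    pthread_detach(tid);               /* "I don't care about the exit status" */

    int rc = pthread_join(tid, NULL);  /* ...and then asking for it anyway */
    printf("pthread_join on a detached thread: %s\n",
           rc ? strerror(rc) : "succeeded (got lucky; behaviour is undefined)");
    return 0;
}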
that the audit raises many bugs
that other distros "somehow" achieve their compliance but he's not sure how
I'd say, if Ulrich is right about the test cases, the situation should be fixed by removing/rewriting the dodgy test cases altogether. Deliberately running distros with non-standard shared libraries or on dog-slow hardware to make them pass the tests is pointless. If that is indeed how "somehow" some distros manage to pass the tests, Ulrich is indeed right on the mark that it makes the test suite completely meaningless. You are not certifying a distribution, you are certifying a distribution tweaked to run the tests...
Better fix the suite, and run the distro under "normal" conditions (i.e. the same as normal users would do).
Re:Thought-Out, or Whining? (Score:5, Insightful)
His argument is: no set of Linux software could pass the LSB suite by actually consistently giving the desired results, because there's no libc that consistently gives those results (when run on sufficiently fast hardware to expose the bugs in the tests, for example); yet distros do claim to pass the suite; therefore, the LSB is not ensuring compatibility, because it certifies things that don't work by their rules.
Furthermore, he argues that programs that don't work tend not to work because they rely on undefined behavior. Certifying that the environment behaves in accordance with the standard doesn't help, because the software developer's environment and the user's environment may do different things in some cases, while both comply with the standard. Unless the programs are tested for doing non-standard things, they won't necessarily work. And the undefined behavior is undefined for a reason: you can't improve the system without changing it (especially when the thing not defined is which takes longer: executing a certain function or waiting
The sections that you dismiss as whining are actually providing examples, which is important in engineering (or science). There are theoretical flaws in any process; it is always important to know whether those situations ever actually occur. If he didn't have an example of a program relying on undefined behavior which should vary between systems, one could say that nobody would actually write code like that and think that it worked; but it turns out that people actually do write such code, and these people happen to include the people writing LSB tests, which is why they're flawed tests.
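As an illustration of the timing-dependent pattern being described -- again my own sketch, not one of the real LSB tests -- here is a test that only "passes" because of how fast or slow the machine happens to be:

/* Build: gcc racy_test.c -pthread */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static int ready = 0;

static void *worker(void *arg)
{
    (void)arg;
    ready = 1;          /* deliberately no synchronization at all */
    return NULL;
}

int main(void)
{
    pthread_t tid;
    pthread_create(&tid, NULL, worker, NULL);

    usleep(1000);       /* "surely 1ms is enough for the thread to run" */

    /* Nothing in the standard says which thread runs first or how long
     * the worker takes, so this check can go either way depending on
     * load and hardware -- and certifies nothing either way. */
    if (!ready)
        fprintf(stderr, "FAIL: worker did not run yet\n");
    else
        printf("PASS (by luck, not by specification)\n");

    pthread_join(tid, NULL);
    return 0;
}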
I agree (Score:2)
/. needs a new lameness filter (Score:2, Interesting)
The problem is the LSB does not PUSH LINUX FORWARD (Score:3, Insightful)
The problem with the LSB is it does not do much. What is needed is not a standard for "thou shalt have this version of libc in this directory"; instead, a standards body needs to come up with "this is the way you will perform your system initialization", "this is how you will set and store your IP networking configuration", etc. This would make YOUR skills transferable from distro to distro, and would allow the community to come up with BEST OF BREED solutions for things like system configuration tools.
Having 1000 different distros do this stuff in 1000 different ways is WORSE THAN not being able to run Oracle on a particular distro without a little tweaking.
Let's forget binary compatibility (Score:5, Interesting)
What matters is source compatibility. And right now GNU/Linux has that in spades. Not just GNU/Linux, but the BSDs, Mac OSX, Solaris and even Windows have it. If the source code is properly written, and properly packaged, then it will compile on any machine that is up to the job of running it. If you make any really drastic changes -- the standard C library for instance -- you might well have to recompile some applications. Is that a major hardship? I don't think so. Back when we changed from round-pin 5 and 15 amp plugs to rectangular-pin 13 amp plugs, people had to have their houses rewired. When we went from artificial gas to natural gas, people had to have their cookers and heaters modified. When Channel Five launched, many VCRs needed their RF output shifted. These were all necessary changes for the better {ironically enough, we probably will be going back to artificial gas in future
Binary compatibility was never more than a nasty hack, fudged in for the benefit of those who want to lock up the source code of their software. These people are pure evil. By not sharing their code with you, they are just one very tiny step removed from stealing from you. It had the beneficial {at least, it was beneficial when processors were slow and disk space small} side effect that you did not have to spend CPU time and disk space compiling applications locally; but now that disk space and processor power are cheap, the benefits of pre-compiled applications are diminished substantially.
There's even a good argument to be made in favour of deliberately introducing binary incompatibility. If programs compiled on my computer would only ever be able to run on my computer, and any program compiled on anyone else's computer would never be able to run on mine, then there would be no such thing as viruses or buffer overrun vulnerabilities. {Unfortunately, this raises the question of how to ever get any computer up and running}.
Re:Let's forget binary compatibility (Score:3, Insightful)
Binary compatibility is EXTREMELY important to Linux if you want acceptance on the same level as Windows or OSX.
If you make any really drastic changes -- the standard C library for instance -- you might well have to recompile some applications. Is that a major hardship? I don't think so.
This is laughable. That I even have to compile an app is laughable.
End users do not want to compile an application. They do not want to debug it, figure out
Linux File System Standard (Score:3, Insightful)
The LSB is overrated imho.
standards testing (Score:3, Informative)
The testing process was to run a test, and when it failed, try to figure out if the problem was in the test suite or the tested code. Simple enough.
The tests certainly at some point worked.
No. That wasn't the case. I found myself fixing obvious bugs in the test suite, then attempting to use the fixed version against the target. It was often clear that the test suite could never have worked.
Some distributions still somehow manage to pass the test suites of a new version of the spec. And all this without the people reporting any problems and requesting waiving the test.
We'd report the bugs, with suggested fixes, but we could not wait for fixes to come back and retest. We had to plow forward. We claimed compliance when we had a test we thought tested the assertions and passed it. We never asked for a waiver.
Another nice things we came across during the LSBv3 testing are numerous timing problems.
Been there. Done that, though I didn't have to find some slow machine. What is the value of such a certification? What assurance does this give you? Is "don't use fast SMP machines" an acceptable answer in any universe, especially when it comes to thread tests?
If you have need of slow machines, I can provide approximately 25 working 486/33's. I'd put this on his blog, but he doesn't allow comments. I thought this was strange, because I use livejournal [livejournal.com] primarily as a place where people can comment. However, he talks about his choice there, too. To each their own.
It is not possible to achieve the goal of 100% binary compatibility...
All good points. And it's worse than that. Yet, the exercise was valuable. For us, it uncovered many bugs in SVr3. Many. This was ultimately a good thing for our customers.
We were also a Unix porting house. We fixed lots of bugs in our prior ports of Unix. We offered our fixes to AT&T for free. They declined. We had to apply our fixes to each port - without the benefit of CVS. And we had thousands of patches. And all this for a basically stable system. It was around then that I was convinced of the incredible inefficiency of proprietary software. This would never happen to gcc.
My advise: but the losses.
I read this as "My advice, cut the losses." Oddly, many versions of this misspelling pass my spell checker. Ulrich needs an editor. Perhaps I'll volunteer. Perhaps he can check my work. Will you be a swap editor for me? I'll check your work, you check mine.
So, I agree that the test suite is a horrible way to deliver on the idea that one might assure customers that their old software will still run, or will run on compatible platforms. I agree that the last bug will not be found. However, that is not an excuse to give up the search.
Re:WE NEED STANDARDS (Score:4, Funny)
Dude thanks! I finally know how to install this game on Linux. The last time I tried, I ended up causing my mother's computer to wardial her friends from her recipe club.
Re:WE NEED STANDARDS (Score:3, Funny)
Dude thanks! I finally know how to install this game on Linux. The last time I tried, I ended up causing my mother's computer to wardial her friends from her recipe club.
That was you?!!?? My mum's gonna kick your mum's ass! ;)
YES, we need standards... (Score:4, Informative)
Combine this with silly requirements such as needing Sendmail (uhm, shouldn't it be more along the lines of "we need an MTA of some sort" - so long as it's handled properly, who cares which one, right? Sendmail's the least desirable of all of them, and it tends to get turned off in favour of Postfix or Qmail most of the time anyway!) and it's about as useful as an appendix is to a human these days.
Yes we need standards. API standards, possibly ABI standards- but not what we're getting here.
Re:YES, we need standards... (Score:2)
Re:YES, we need standards... (Score:2)
Re:WE NEED STANDARDS (Score:4, Informative)
1. Insert CD
2. Double-click on installer icon when file manager window pops up
3. Enter root password when prompted
4. When all is said and done, choose Quake3 from the start menu
From what I can tell, there's only one difference between this and the Windows version that you described, and that's the entering of the root password. And we don't want to do away with that, because it's what makes Linux 90% less susceptible to malware.
Anyway, what distribution and version of Quake3 are you using?
Re:WE NEED STANDARDS (Score:2)
Re:WE NEED STANDARDS (Score:2)
Re:WE NEED STANDARDS (Score:2, Troll)
Re:WE NEED STANDARDS (Score:3, Insightful)
then knock yourself out.
Better yet, use Synaptic.
Even better, try not being a plagiarizing [slashdot.org] troll [slashdot.org]. Go outside and get some fresh air; perhaps also try dating. You'll be happier.
Mod copied-and-pasted troll down. (Score:3, Informative)
When I read this, I had a curious sense of deja-vu, as if I had responded to this retarded argument once before. And looky here:
http://www.google.com/search?ie=UTF8&q=User%3A+%2
Come on. It wasn't even insightful the first time.
Re:WE NEED STANDARDS (Score:3, Informative)
Comment 1 [slashdot.org]
Comment 2 [slashdot.org]
Comment 3 [slashdot.org]
Comment 4 [slashdot.org]
Comment 5 [slashdot.org]
And some more! Stop it!
PARENT IS A TROLL (OR BOT) - IGNORE/MOD DOWN (Score:4, Informative)
Please Do Not Feed The Trolls.
Mod down or ignore... for Christ's sake don't reply - it only encourages them
Re:WE NEED STANDARDS (Score:2)
>1% market share means greater than, not less than.
Also, Linux already has >1% share in a lot of market segments. Hey, guess what, there are a lot of segments where Microsoft has next to or exactly 0% share! Gasp!
It's not like Linux has shareholders to please, so why should I give a crap if my grandma wants to use it?
The fact of the matter is Windows XP is pretty easy to use and pretty stable. I've had the same install on my laptop since February 16, 2002, and I use it every day; I install and uninstall and travel around plug
Re:4 posts so far... (Score:3, Informative)
*Sigh* your post on the other hand, does indicate that the average
Re:4 posts so far... (Score:2, Funny)
Guys, you're posting to the wrong web site. This is /. , not ./
Re:is this really livejournal? (Score:5, Funny)
Current mood: Sad :(
Re:is this really livejournal? (Score:2, Funny)
Seriously, wtf is a technical article doing on a site full of whiny emo-kids?
Maybe it's an exchange program of some kind and the teenage girls from LiveJournal are going to start turning up on tech forums?
Re:Three Key Issues (Score:3, Informative)
I'm a fairly technical user
You certainly have mastered the cut & paste operations.... see here [slashdot.org].
Re:Three Key Issues (Score:2, Insightful)
A) You should never have to recompile the kernel or break anything by upgrading the kernel in a recent distro. And in the Red Hat product family, since Red Hat 8 (i.e. 8, 9, FC1 through FC4) all you do to install software is double-click on the package icon.
B) In the same lineage of Linux systems just mentioned (i.e. RH8 and later), all administration, from firewall through Apache setup, can be done with graphical configuration t
Re:Linux is too fragmented (Score:2)
Re:Linux is too fragmented (Score:2)
BTW, the menu in my 9.3 installation has a single volume control applet, and it's pretty damn simple to understand. Actually, it's pretty much a copy of the Windows mixer. WTF is wrong with your install?
Try Mandriva (Score:2)
Re:Linux is too fragmented (Score:5, Insightful)
The poster points out some of the same frustrations many non-Linux people have when they try to use the OS. Keep in mind that anyone switching to Linux still has to do work. This means any switching-to-Linux research is going to occupy spare time. That time is better spent getting Linux to do my work better, not making Linux work at all.
Re:Linux is too fragmented (Score:5, Insightful)
You've been incrementally learning Windows for 10 years now. Every time you change versions you have to go through another learning-curve bump. "Where did they put "ODBC Drivers" now?". If you were suddenly presented with learning Windows on a tabula rasa, your learning curve and frustration level would be just as high as they are for a Windows user moving to Linux for the first time.
If you're a programmer, let me ask you this: How many text editors have you had to learn? Isn't it a pain in the ass learning a new one? "Hell, I already know 43 editors, I have no desire to learn another one". This does not make any of the editors you already know superior to the one you don't, nor does it make the new one inferior just because you don't know it. Different isn't a priori bad, it's just different.
Re:Linux is too fragmented (Score:3, Insightful)
Should I have to learn a text editor? Sure, I had to learn emacs and vim. What about nano? Nano was obvious. It listed the commands at the bottom. Sure it's not the most powerful editor around, but still. It's a freaking text editor. I should be able to open a file, type stuff in, save, and quit, without ever having seen the editor before, and without having to read the man page. None of the GUI editors really suffer from this problem, since they have menus and
True? True?! (Score:2)
Windows:
Paint Shop Pro
Photoshop
Corel Draw
Ulead
(And umpteen others...)
Linux:
GIMP
It's not that there are too many applications and all - it's that it's different enough from Windows to throw off the people who work by rote. (Never mind that Bill and Company change
Re:Linux is too fragmented (Score:2)
First, Suse bugs have nothing to do with the Linux community being fragmented.
Second, these bugs are not representative of most Suse users. They appear to be taken from a message board where people post their problems, then aggregated as if they all happened to the same person. Do that with any OS and it'll look bad.
Third, it appears as though some people are being paid to post negative comments about Linux. This is happening on other message boards a
Re:grammer police (Score:2)
Re:Huh??? Sorry, this story makes no sense. (Score:2)
Re:Apple did what redhat should have, train gone.. (Score:2, Funny)
Go back to fellating your iPod.
Re:Apple did what redhat should have, train gone.. (Score:2, Interesting)
The other segment is for desktop OSs that run on generic multi-source hardware. That is over 90% of the market, and that is where the BSDs, Windows and Linux compete.
The hardware part of this market segment is not dominated by anyone, there are low entry barr
Re:Apple did what redhat should have, train gone.. (Score:2)
How could Red Hat have given us "OSX based on Linux"? Download the Aqua source from Apple under the GPL and recompile it?
How much does Apple pay you shills anyway?
How redhat could have given us OSX.. (Score:2)
Single clean theme for KDE or GNOME.
Video and multimedia that work in linux without problems.
Specified set of working hardware and a distro deal with Dell / whomever.
Unified, cleaned up development tool package.
Application developer support for ^^^^
Clean application installer, no hassles.
Red Hat had enough funding to make those things happen. They didn't.
Re:What *IS* the LSB ??? (Score:4, Informative)
http://www.linuxbase.org/ [linuxbase.org]
Re:Linux needs a stable ABI *BADLY* (Score:3, Insightful)
Guess what? Tons of products now offer "Linux support" via ugly wrappers or, worse yet, per-distro builds, or even WORSE, no drivers at all.
Hell yeah I want open source drivers in the kernel. It's nice when the drivers come built-in! Lots of companies agree with this concept in the Win32