NY Times on "the Fragmentation of Linux"
Weramona writes "The Times is running an article on the possibility of Balkanization of Linux, due to commercialization. To be fair, both sides are presented, and it isn't all that sensationalist. The article is aimed rather low ("Unix was created in 1969 by..."). What's funny to me is, a couple months ago, this was a favorite "Damn the Man" conspiracy theory on /. " It's the Times, so you need a free account to read the story, but it's a pretty good piece, so it's worth it.
GUI design philosophy (Score:1)
Justin, of Gnuidea Software [169.233.22.137]
Couple points......... (Score:1)
same old same old (Score:2)
Not bad (Score:2)
I don't think the article was aimed low, btw; it's another example of a mainstream paper covering a topic with which a lot of -- but by no means all -- readers are familiar. It makes sense to include background, and it's a further example of Linux being brought to the masses.
Recreating the history... the right way (Score:2)
Linux has been backed by a process which is more democratic, unlike the older UNIXes, which were essentially maintained by companies for economic reasons - what I would call the true capitalist way of management.
The FSF, Linus, and the rest of the gang around the world play a very vital role in regulating code, which was absent in previous UNIXes.
However, that's just my feeling.... others might have a different view of it.
Nothing new (Score:1)
Depends on what you mean (Score:3)
I mean, do you include people creating distributions of the same basic kernel, and a different selection of utilities? (In which case, how is that any different from computer companies bundling different selections of software?)
Do you include distributions with different kernels (eg: L4Linux), but the same utilities? (Here, how would the average user be able to tell that there was a difference at all?)
How about a.out/elf, or libc5/glibc? Well, everyone has migrated to elf, and most have finished moving to glibc, so there seems to be a drive to standardise, there.
What else is there? Window managers & underlying X toolkits seem to be one battle, but I'd put that in the same category as bundled utilities - no different from any other computer market, since (time *) began.
There's the directory the config files are put in, yes, but that seems to be working itself out.
There's the X vs. Berlin battle, but that won't be anything more than a possibility (not even a certainty) for a long time to come. Berlin looks promising, but it's not ready for the Prime Time.
What's left? The installer? Oh, wow! Like you have to worry about that, after you've installed the distribution.
The Package Manager? That might have been a really serious contender for causing fragmentation, but Alien and similar utils make it almost a non-issue. As far as your computer is concerned, all package managers can effectively interchange packages with each other.
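As a sketch of what that interchange looks like in practice: alien(1) is the real tool, and --to-deb/--to-rpm are its real conversion flags, but the package file names below are hypothetical. The wrapper just picks the conversion direction from the file extension and prints the command it would run, so the sketch stays side-effect free.

```shell
#!/bin/sh
# Pick the alien(1) conversion direction from the package's extension.
# Package names are hypothetical; alien itself must be installed to
# actually run the printed command.
convert_pkg () {
    case "$1" in
        *.rpm) echo "alien --to-deb $1" ;;   # RPM -> Debian package
        *.deb) echo "alien --to-rpm $1" ;;   # Debian package -> RPM
        *)     echo "unknown package format: $1" >&2; return 1 ;;
    esac
}

# Print the command rather than executing it:
convert_pkg somepkg-1.0-1.i386.rpm
```

In real use you would drop the `echo` and let alien do the repackaging, but the point stands: the on-disk format is a thin wrapper around the same files.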
AFAICS, that pretty much wraps up all the possible causes of fragmentation.
Some Fragmentation Illusory, Some Good (Score:3)
The main places where differences between Linux distributions persist are these two things:
This is arguably a matter for more concern.
Tools include rpm/dpkg, and the recent proliferation of distributions based on Debian means that RPM is no longer quite as "worshipped" as it used to be.
I regard the increased interest in Debian-based distributions as a good thing, since Debian has more automated tools for managing packages and validating their integrity, an area where RPM had "gotten pretty stuck" for a long time.
Aside from package management, there is then "system management," with tools like COAS and Linuxconf, where different distributions are promoting different tools. (And I'd put in a plug for the OS-independent tool cfengine [hioslo.no] that's good for lots of purposes...)
There's some fragmentation, but my old essay Linux and Decentralized Development [hex.net] has the thesis that the net results are positive. I haven't seen compelling evidence to the contrary yet.
I tend to disagree (Score:3)
Linux is fragmenting into specialised tools with a common base. The tools are aimed at certain core markets where it performs very, very well. Microsoft is a good example of a product that hasn't fragmented to exploit markets: Win9x doesn't know if it wants to be a server or a desktop system, and NT has grown so large trying to be all things to all people that it's nearly unmanageable; each release seems to get heavier and more unstable (the Win2k test shows that even Microsoft has realised this).
Linux must retain and expand these areas and make sure people understand why this is the case. If you are presented with a project that requires multi-user access, take a look at all the Linux distros. Somewhere in there is a distro that will provide exactly the base you need to build your application on. In some cases all you need to do is change a few variables and design a webpage.
There is no major infighting between developers over distros - this is where bad things would happen (though there is a bit of mumbling and finger pointing). The developers tend to either ignore one another or work with each other. This is good.
The current trend of articles is to portray Linux as a fragmented, infighting collection of geeks. There need to be more PR and education projects to get journalists to realise that this is not always the case.
If Linux were a corporation, it would take a selection of editors off and wine and dine them somewhere expensive, and pick up the tab. It would take a selection of journalists off on a jaunt somewhere and get them drunk.
The problem Linux faces is that until recently it hasn't had the financial backing to do this. The RedHat IPO does give them the money, but it remains to be seen if they will follow this way of doing business. I think they probably won't (at least not for a while yet).
Mostly well researched (Score:2)
----
Still have a few years (Score:2)
We will, however, see a flurry of activity on the desktop side. There are a ton of people who do not need to run a server, but instead want a fast, stable, and cheap platform to surf the web, play games, and write letters and resumes. These people are willing to pay US$60 a pop for this (or part of it) and have paid US$400 for just the software to gain this functionality. Aside from installation, there isn't much support required, and where there is, the established companies are already charging per incident.
Unfortunately, this Linux desktop will probably not come from today's larger distributors. It will be a company that adopts the Linux kernel and extends it with its own proprietary GUI. It won't be X compliant, or even have X available. The winner will eventually get X support through a company like Hummingbird. Early entrants will make developers pay a couple thousand for the privilege of developing for "their" platform. That will eventually stop as competing desktops vie for developers. Free tools and "open" APIs will finally arrive.
The Linux you know and love will still be strong, serving enterprises and power users' home desktops, but your mom will be running Linux without even knowing it. From a casual inspection, you might not know it either.
What is the world coming to? (Score:2)
HH
Re:I tend to disagree (Score:3)
Most Dedicated... (Score:2)
I would perceive Redhat to be a likely candidate for the latter, possibly only because they seem to be the leading distribution here in the US. Although, I admit, I haven't seen them do anything I didn't like. Caldera also comes to mind.
If there is any danger, I think it comes from the most popular distributions. The momentum of the sales of a large distribution like Redhat could cause fragmentation even if the rest of the community realized what was going on. In the article, the quote from the Redhat guy seems to say that there might be a problem, although most of the posts I've seen so far discount most of this fear. My reason for wanting to know the answers to the two questions above is this: if I'm going to support a company with my dollars, I want to make sure that I'm supporting someone who is devoted to the Linux community.
Strength in Diversity ... (Score:1)
Naturally this requires some smarts on the part of the integrator and, of course
Oh well, different courses for horses.
LL
Linux != Unix (Score:3)
The Linux core (the True Linux, or kernel) will always be the same among the distros. Any distro that forks will fail, since it will no longer be compatible with the rest - or you won't be able to keep up with the "latest" by downloading.
This brings up one exception. And this was stated in the article about Unix. If different hardware architectures arise, then we may see a split with Linux. But even then, the GPL will allow any "enhancements" to be shared among all distros.
So far I have had no problems in keeping my Slackware and RedHat Linux boxes up and running the same utilities and applications. I'll raise a concern once I start seeing a problem.
Steven Rostedt
some evidence... (Score:4)
distro is good for us hard-core Linux types, but bad for the general user.

A week ago I went into the local Best Buy store and went to the Linux section just to see what all they had. There was a lady there who looked confused, and just kept picking up different distro boxes, not sure which to buy. I felt bad, because she can go right over to the windoze section and buy *the* windoze 98 box. She had no idea that SuSE, RedHat, OpenLinux, et al. were all just Linux.

In that regard, this is a Bad Thing[tm], because it confuses the average Joe user. I do think there would be some advantage to having The Linux Distribution.

Now I'm sure you'll all reply "well, we don't want people that don't know to buy Linux!", but if we want global desktop domination, this spread of distros will NOT help. People don't like actually doing research when it comes to technology. They want to be told what to buy or have no choice. Hence the popularity of windoze.
Alarmistic and misses the real issues (Score:1)
Of course it is possible for a vendor to add value like that, but it won't be a huge problem.
The REAL issue, and one that I'm surprised the LSB spokesperson does not even acknowledge, is the difference in how the different distributions compose packages, regardless of distribution format. It's not enough to be able to force the import of foreign packages. One wants to enforce the dependency constraints. That is one of the main strengths of Linux (and IRIX, BTW, where inst/swmgr is still much better than RPM, although the latter indeed satisfies the basic needs).
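The dependency constraints being discussed are machine-readable in both major formats - `rpm -qR pkgname` and `dpkg -s pkgname` are the real query commands. As a hedged sketch (the Depends: line below is made up for illustration), here is a tiny parser that pulls the bare package names out of a Debian-style Depends: field, dropping version constraints like "(>= 2.1)":

```shell
#!/bin/sh
# Extract package names from a Debian-style "Depends:" line.
# Real queries: `rpm -qR foo` or `dpkg -s foo` (package "foo" hypothetical).
deps_of () {
    echo "$1" | sed -e 's/^Depends: //' -e 's/ *([^)]*)//g' \
              | tr ',' '\n' | sed 's/^ *//'
}

# Hypothetical control-file line:
deps_of "Depends: libc6 (>= 2.1), xlib6g (>= 3.3.4), libstdc++2.10"
```

The point is that enforcing constraints across formats means translating this metadata, not just unpacking the files - which is exactly what a forced import skips.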
It's not really a fragmentation issue because... (Score:2)
There's incentive for most vendors to package their distro in a standard format (or at least support RPM installation), because they'll have off-the-shelf compatibility with the increasing number of applications available for the platform. Forking carries the penalty of breaking that compatibility - now you've lost control of your commercial applications market. Where things get proprietary is in places like install procedures and/or bundled goodies (or sometimes in system management, as SuSE does), but once installed, Linux is Linux. All praise the penguin.
- -Josh Turiel
When will they understand........ (Score:1)
Chas - The one, the only.
THANK GOD!!!
Fragmenting - Sort of - Depends (Score:1)
Re:some evidence... (Score:1)
What's all the fuss about? (Score:1)
The consequence of Fragmentation under Linux (Score:3)
Second, there are 2 kinds of fragmentation: API and binary. If you change the API, then you fragment, and may the hordes of angry Linux hackers persecute you for the rest of your miserable days. In terms of binary fragmentation, we're there already (at least we were back when some distros had already changed to glibc while others were still using libc). This, however, I don't see as a major problem, as it can (in most cases) be fixed by a recompile.
In summary: yes, someone could fork the kernel tree, but at what price? I would hate them for it (as probably/hopefully millions of other people also would), which would automatically reduce their chances of successfully marketing whatever it is they make. Plus, they would have to run like hell to keep up with the rest of Linux development. I really do think that the Linux development model (i.e. the speed at which Linux evolves) is actually a pretty good defense against fragmentation, both from a technical standpoint and from a social one. Let's not forget that even companies like Toshiba can be swayed by enough angry emails threatening to boycott them.
The Balkanization Of Linux? (Score:2)
People use computers to perform various tasks other than running an OS. If software is not available for an OS, no matter how good the OS is, it wanes in popularity and possibly dies. If companies only support RedHat with software, then no matter how good Linuxen like SuSE and Debian may be, they're going to eventually decline in favor of RedHat, because people need software to do work.
Also, many people can't handle recompiling software, so if they've got a Linux variant with a nice installer, and they can get commercial software in proprietary binary formats that have nice installers, then they are using a computing paradigm that is familiar to them...
Linuxen that have nice, easy installers and are supported by commercial software with nice, easy installers will be the ones that have the best chance of combating Microsoft... but they'll also fuel the "balkanization" of Linux...
Software vendors that don't commit to releasing cross-linux software are basically just pushing Linux towards the same situation that arose with other OSes... the difference is that it's easier with Linux to make cross-builds, so that a commercial package could be released on RedHat, SUSE, Debian, and perhaps others without as much effort as, say, a cross-build between Solaris and IRIX, and certainly more easily than a Windows and Mac cross-release...
It's up to the users to demand such things, by contacting commercial software vendors and requesting it, letting them know there is money in it for them... they're not going to do it out of the kindness of their hearts, they're in business to make a living, not prevent Linux balkanization...
However, I think most Linux vendors would be willing to support several Linux variants if they knew the customers were there, and the best way for them to find out that is the case is for the customers to let them know directly...
Does anyone notice a pattern here? (Score:1)
Beer recipe: free! #Source
Cold pints: $2 #Product
Cypherpunks for login - good article though (Score:2)
I thought this was a fairly good article, in terms of expressing many people's opinions about code drift. But I still don't feel like it's anything like when Unix split. Perhaps that will happen once MSFT ports their apps over onto a commercial GUI shell, but I think it's just the usual paranoia about lack of control expressing itself.
Is there really a fragmentation problem? (Score:2)
Thanks
Bruce
Username / Password: slashdoted / slashdot (Score:2)
Username: slashdoted
Password: slashdot
Note that there's only one "T" in "slashdoted", for some reason.
I'm also posting this at the top level of the discussion tree.
-----
The real meaning of the GNU GPL:
Re:But do bear in mind...... (Score:1)
A plea for cooperation among FUD vendors (Score:4)
The balkanization of FUD is causing numerous problems, most importantly several not quite compatible variants of FUD. I have seen FUD from one company saying that since Linux is free it is worthless, and FUD from a different company saying that Linux is in fact more expensive to deploy than, say, Windows.
I think it is important that all producers of FUD work together so that needless incompatibilities can be avoided. It is of course important for vendors to be able to differentiate their FUD in the market, but this need not cause incompatibility. I applaud the efforts Microsoft makes to provide basic FUD to VARs such as ZD and the NY Times, who are then able to add their spin, creating different but compatible FUD.
Benny
Username / Password: slashdoted / slashdot (Score:1)
Username: slashdoted
Password: slashdot
Note that there's only one "T" in "slashdoted", for some reason.
-----
The real meaning of the GNU GPL:
Re:Depends on what you mean (Score:1)
Mind you, I'm speaking as a developer of closed-source software. I've been in meetings discussing which distributions of Linux we can and cannot support. It's very frustrating.
A Little bit of Reassurance... (Score:1)
In a CNET article about Red Hat supporting Linux on Compaq machines, I saw this:
"As part of the Compaq deal, fixes created by Red Hat personnel will be contributed to the open-source community, Red Hat said. "
-johnE
the article [cnet.com]
Re:Depends on what you mean (Score:3)
WRT the meetings, I can understand such concern, but I think it's largely born of fear, uncertainty and doubt. (That is VERY different from saying that such discussions produce FUD - rather, if the technical issues were understood and the fears allayed, they would never have occurred in the first place.)
A case in point:
Let's say that you want to produce some program, Z, which needs to run under Linux. However, you don't know which distribution of Linux it's going to use. What do you do?
Answer: Simple. Scan the distribution to see what resources exist and where they are, then install anything extra you need. (Configure isn't confined to Makefiles - I've used it as a nice installation tool, as well.)
But what about versions of libraries? Not a problem! Just install your own, and make sure your installation directory is at the head of LD_LIBRARY_PATH.
What about Gnome/KDE/Motif? I answered this in an Ask Slashdot, not too long ago. Write or use a generic interface, and dynamically link to the toolkit, via a symlink. To change the toolkit, change the symlink. A single ln -sf operation.
What about directories? Most directory layout information can be plucked out of the environment variables and standard utilities. ("which" is VERY handy, and "find" is invaluable.)
What about different processors? Do what the old DOS programs did - probe! In this case, it's easy, as you can find out with uname. Then, just have binaries for each processor and install the right binary.
What about different kernel versions? Same as above, for toolkits and processors. Anything that is kernel-specific goes in a separate .so file, and which .so file you use depends on which kernel uname returns.
Conclusion - you CAN guarantee software will run on ANY distribution that exists or ever will exist, without having to maintain it specifically FOR that distribution.
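The probing steps above can be sketched in a few lines of shell. This is only a sketch under stated assumptions: the product name "zprog", its file names, and the /opt/zprog paths are all hypothetical; uname(1) and ln(1) are the standard tools the post names.

```shell
#!/bin/sh
# Probe-and-install sketch. "zprog" and its paths are hypothetical.

# Map a kernel release string (as printed by `uname -r`) to the matching
# kernel-specific .so, per the post's suggestion:
pick_kmod () {
    case "$1" in
        2.0.*) echo "zprog-kern-2.0.so" ;;
        2.2.*) echo "zprog-kern-2.2.so" ;;
        *)     echo "unsupported kernel: $1" >&2; return 1 ;;
    esac
}

# Per-processor binary, probed from the running system:
BIN="zprog.$(uname -m)"              # e.g. zprog.i686 or zprog.alpha

# Kernel-specific helper (shown with a sample release string):
KMOD=$(pick_kmod "2.2.12")

# Ship private copies of needed libraries, loaded ahead of the system's:
LD_LIBRARY_PATH="/opt/zprog/lib:$LD_LIBRARY_PATH"; export LD_LIBRARY_PATH

# Toolkit selection is a single symlink flip, as described:
#   ln -sf /opt/zprog/lib/zprog-ui-gtk.so /opt/zprog/lib/zprog-ui.so

echo "install $BIN with $KMOD"
```

The same pattern extends to `which` and `find` for locating existing resources before deciding what extra pieces to install.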
Fragmentation of O/S (Score:1)
What I find interesting is the fragmentation taking place within MS O/Ses - e.g., Win95, Win98, WinNT Workstation, WinCE, embedded Windows, Win2k (which has enterprise, desktop and home user flavors). I think this is a bigger story, when one organization cannot standardize within itself.
Any input?
Re:Username / Password: slashdoted / slashdot (Score:1)
How about next time we use FuzzyPenguin / LinuxRules when they shut off this one.
Re:Does anyone notice a pattern here? (Score:1)
As I'm basically solely a user of SunOS and MacOS, I'm not qualified to discuss potential Balkanization of Linux, but the article seemed pretty even-handed. It didn't say that Linux was doomed. It said that given past issues with commercial Unix flavors (like, uh, SunOS) and the recent flurry of interest in Linux, fragmentation was a danger to be watched out for.
I personally don't think the KDE vs. Gnome issue is a big one, but the article mentioned it. And it brought up the counterargument, citing Linus and the Linux Standard Base. I think it was about as good as coverage of a basically technical issue will get in the mainstream press.
Re:Username / Password: slashdoted / slashdot (Score:1)
All-In-One Linux (Score:1)
Caldera and fracture of the Linux community (Score:1)
I personally don't worry about it too much at this point.
Re:Fragmentation of O/S (Score:1)
Re:The consequence of Fragmentation under Linux (Score:1)
To start with I want to point out that I agree with you on this. It's some of your other arguments that I'm not "at one with".
I don't agree with this, nor do I think it is a worthwhile goal either.
Let's say I add a utility that allows one to specify a power maximum or a temperature maximum. This changes the command line API (i.e. it adds a new command, probably one only the "system manager" would bother with, and most likely only on portables, but still...). I don't think anyone will obliterate me (assuming I release the source to all the needed kernel mods). I do think almost nobody will pay attention to me unless I release the source to my command line utility.
Same for the system interface: let's say I add a setsockopt(2) option that tells the kernel to include timestamps on each packet read via recvmsg(2). Assuming I release source (required, as the code in question is GPLed), nobody will obliterate me.
In both cases most people would ignore the changes unless they made it into a major distro. Even then many people would ignore them unless they made it into almost all major distros.
However, some small number of people who found the functionality very useful would use it. This would be good for them. They might lobby some of the distros to include the changes. They might not. A very few people might use some of these things and only later discover that they aren't very portable from Linux to Linux. That would be bad for them.
I could do a similar example with device drivers, but I leave that as an exercise for the interested reader. :-)
Then how do we get any change that isn't driven directly through Linus? I think the issue is less about forks in the tree, but forks that never join back up. It is good if the tree forks a little. It is good if some of the forks are deemed bad and allowed to die off. It is good if some of those forks are deemed good and folded back into the main line. It is very very bad if some of the forks are deemed good, but don't make it back into the main line.
The GPL makes sure that the code formed by forking a GPLed package (the kernel, and many but not all user level utilities) is available for folding back. That is a major strength of Linux. It doesn't prevent forking (as the sun community licence more or less does). I for one think that is also a major strength of Linux.
Re:The Balkanization Of Linux? (Score:2)
(Love your login name [kultur-online.com], BTW....)
I'm curious what "support" means in this context. (NOTE: in the following, I'm using "Red Hat" because it's the one people seem to most fear becoming the Only Linux For Which People Release Software.)
Does it mean "we're releasing a version that can be installed on, and run on, a Red Hat system, but that depends on stuff (installer, libraries, file system layouts, etc.) on a Red Hat system, so it won't work, or won't work quite right, on a different distribution"?
Or does it mean "well, we're not trying to make it Red Hat-only, but we're only going to test it on Red Hat, and are only going to offer support for customers running Red Hat - if you call us up because it doesn't work on OpenLinux or TurboLinux or SuSE or Debian or..., we'll tell you how sorry we are to hear that, and then we'll suggest you install Red Hat if you want to run our software"?
(It may well be that different vendors mean different things by "support".)
To some extent, the first of those could perhaps be worked around by adding stuff to your non-Red Hat system (unless the changes needed to get the software to run are incompatible changes - but I don't know how many users, other than technoids, will want to do that). Perhaps that'll provide an incentive for vendors to make their distributions more Red Hat-like, for better or worse.
The second of those may be less of a problem, in that software that's not "supported", in that sense, on other distributions may Just Work on those distributions - but there may be customers for whom "it works, but we won't answer your phone calls" may not be good enough.
...and those vendors might, in turn, apply pressure on developers of Linux distributions to try to make it easier for software to work on multiple distributions - and for software vendors to test software without having to do a ton of testing on N different distributions. (The LSB [linuxbase.com] appears to be intended to have a sample implementation; however, the LSB Organization page [linuxbase.com] says:
so it won't necessarily be usable as a distribution on which vendors can do testing of their applications.)
(Hmm. I'm curious how vendors of Windows applications handle Windows OT, e.g W95 and W98, and Windows NT? I wouldn't be at all surprised to hear that they have to test applications on both platforms if they're going to support them on both platforms - and to test them on different versions of those platforms, e.g. W95 and W98, or NT 4.0 and NT 4.n^H^H^H4.0 SPn. Heck, the Windows OT and Windows NT implementations of the Win32 API probably differ a lot more, in their kernel and API libraries, than would the kernel and API-library implementations of two 2.2-kernel/glibc-2.1 Linux distributions.)
Re:Linux != Unix (Score:2)
Well, applications often sit atop more than just the kernel - either they're dynamically linked (and thus sit atop the system shared libraries), or they're statically linked (and may have wired into their binaries assumptions about, say, the locations of files used by the library routines).
Fortunately, it appears that most distributions on which you'd run shrink-wrapped applications (as opposed to, say, a "slap this on a PC with multiple network interfaces and you have a router/firewall" distribution) may be converging on glibc 2.x (although, if the shrink-wrapped application is called "Netscape Communicator 5.0", or whatever the next release is, it may require glibc 2.1 or later, as per Mozilla's requirements); I don't know if any other libraries those applications might use differ widely between distributions.
I note, though, that "Linux core (the True Linux or kernel) will always be the same among the distros." isn't necessarily entirely the case - they aren't all using the same kernel version (Debian's still on 2.0[.x] - no, Potato isn't "done" yet - but I think the other "major" distributions have gone to 2.2[.x]), and they might make local changes (which, of course, other distributions could adopt - blah blah blah GPL blah blah blah - but that doesn't mean they will).
In some cases, local changes are just "enhancements", in which case an application vendor might choose Just To Say No and not use features added by a distribution. Of course, the trick there is how to discover what's distribution-unique; an LSB "reference implementation" might be useful there - if it doesn't run on the reference implementation, it might not run on all LSB-compliant distributions.
I hate to ask this... (Score:1)
has this been thought out?
-------------
7e [sevenelements.com]
Things are different in this case (Score:1)
Today, the tide has shifted. Linux vendors use hardware as just a means to run the software well enough so that they can sell it (and services for it) for a lot of money. They don't have a need or desire to change the guts of Linux. Sure the outsides are changed but that is true of other very successful software products as well. Windows comes in many forms, including versions "adjusted" for certain hardware vendors. There are applications that run on Win9x that do not run on Win3.1, apps that run on WinNT that do not work anywhere else, and apps that run on Win9x that do not run anywhere else. Heck, there are applications that seem to only run on specific versions of Win9x! Has this hurt the market share Windows has enjoyed? Nope.
Finally, if the fragmentation of Unix was so bad... why are all these versions of Unix still alive and kicking? The only reason Irix is fading out is because SGI is in trouble, and that isn't even from Irix problems. Sun's Solaris, HP's HP-UX, IBM's AIX, and Compaq's Tru64 (the OS formerly known as Digital Equipment Corporation's Digital Unix) are all still around and not going anywhere soon. All the companies still make money off of them and the hardware they run on. Fragmentation didn't kill them. Unix is not dead. Unix is alive, kicking, and profitable.
So relax; there is nothing to worry about.
Non-Story (Score:1)
C'mon folks! Here's ESR telling reporters from the Times that this is a NON-STORY, and that they should better examine their motives for writing it! Kudos to Raymond for being so politic about it that the writer didn't catch it. The single quote above is the only thing in the article that actually made a modicum of sense.
--B
Re:GUI design philosophy (Score:1)
Different opinions are good, different whims are good, different design philosophies can be very bad. In my opinion, that is the single biggest weakness of Linux at this point. The tools are all over the place. With a commercial package, you tend to get a fairly integrated set of tools.
System administration in Linux is a good example. It's totally disjointed. Linuxconf here, netcfg there, make over there. And I personally think Linuxconf has a looong way to go. For established Unix users, it's no problem -- they just use the command line. But what about everyone else?
- Scott
------
Scott Stevenson
fragmentation due to commercialisation? (Score:1)
The article raises a couple of points that aren't new but worth addressing.
First, Unix is fragmented. This is definitely true: every Unix system is a toolbox of utilities and components; competing tools are often installed together on the same system. Most every tool or utility has to face the consequences of being compatible with a couple of different methods for doing the same thing. Even if the end users don't always notice, it is a serious concern for the sysadmins, distributors, and developers.
Second, fragmentation is due to commercialisation. This is a highly questionable statement. Commercial ventures have an active interest in protecting their unique added value, and some added software can be part of that. But on the other hand, they have an interest in building a well integrated system that appeals to a wide range of users. Therefore, with commercialization you see many incompatibilities between vendors, but once you go with one, a well integrated environment; in a free software environment it's a jungle of competing developments that all try to be compatible with whatever the authors happen to be familiar with. To me, a Linux system looks more fragmented than a commercial Unix system.
Re:Linux != Unix (Score:2)
may have wired into their binaries assumptions about, say, the locations of files used by the library routines).
I wonder why it hasn't gone to (*shudder*) the MS way. If the DLLs (or shared objects) don't exist, then just insert them. I don't see a problem since shared objects have ways of versioning that DLLs don't. So it won't be a problem to add libX.2.1 if it doesn't exist.
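Following the post's hypothetical "libX.2.1" (written here with the conventional soname libX.so.2.1), the "if it doesn't exist, just add it" idea can be sketched with ldconfig(8), the real tool that maintains the dynamic linker's cache. The check function reads `ldconfig -p`-style output from stdin so it can be exercised against canned data.

```shell
#!/bin/sh
# Does the dynamic linker already know a given soname? Reads `ldconfig -p`
# style output ("\tlibX.so.2.1 (libc6) => /usr/lib/...") on stdin.
# libX.so.2.1 is hypothetical, after the post's example.
has_lib () {
    grep -q "[[:space:]]$1 "
}

if ldconfig -p 2>/dev/null | has_lib "libX.so.2.1"; then
    echo "libX.so.2.1 already present"
else
    # Versioned sonames mean installing our copy can't clobber an app
    # that wants libX.so.2.0 - the versions coexist side by side.
    echo "would run: cp libX.so.2.1 /usr/local/lib && ldconfig"
fi
```

This side-by-side versioning is exactly the property Windows DLLs lacked, which is why the same trick was riskier there.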
I know all the distros use different versions of the kernel. The first thing I do when I install a new distro is download and compile the latest kernel. And I have yet to have a problem with this.
Steven Rostedt
Fragmentation happening rapidly! (Score:1)
No longer can the user be sure that any generic code will work on any one distribution. No longer can the user even be sure the basic functionality of the kernel will work consistently from one distribution to another.
The source of all this incompatibility? How do I loathe these commercial distros, so let me count the ways!
Lack of strong, pro-active support for the LSB by the commercial distros: lip service spewed simply to avoid getting flamed doesn't quite serve the purpose of getting a solid LSB.

The commercial vendors really don't want an LSB, at least their marketing folks don't: one very strong concept in marketing is DIFFERENTIATION! You need to make your product different enough, and drone on about the "superior" aspects of the variety, to get the consumer to buy the product.

Money counts more than quality. The commercial vendors have to be concerned with money first, and their products show it. Redhat is buggy crap when running X; Caldera's install won't even let you make a boot floppy during install (hey, you know those newbies just gotta love that); SuSE has so much proprietary patching done to their kernels that I often can't get common drivers to work; and the list goes on and on and on....

The frickin' long-term libc vs. glibc6 mess. This has opened the door to all sorts of opportunities for the differentiators to make trouble. Any LSB should deal with this ASAP! Perhaps dual-library cross compilers as a standard feature? Make the effort to ensure glibc6 is fully inclusive of libc5?
To sum it up: the commercial distros are desktop-manager happy and want the entire Linux world to look and act like a Microsoft product, apparently to the point of being sloppy, unreliable crap just like their favored model. The commercial distros care far more about making money than they do about providing a quality product. One commercial variety of Linux will not be consistent in how it works, or in the programs the user can run on it, when compared with another commercial variety of Linux.

What do I use? My control testing box is Slackware-based. I don't use either of the slow, unreliable desktop managers (both Gnome and KDE sucketh in a big, bad, buggy kinda way) except when I'm testing X/video related stuff. I've tried using both Gnome and KDE; they are both buggy, unreliable, and offer very little functionality for the loss of speed and increase in instability that comes with them. IMHO, both are still beta-stage code.
A prediction? If there isn't a strong LSB in place soon, Microsoft will continue to dominate the desktop, will make a turn-around in server space, and Linux will have been a flash-in-the-pan. Why? Because users won't abandon one buggy, unreliable mess for another: better a known evil than an unknown evil, to paraphrase an old saying.
Be prepared, Linux-folk: the commercial vendors will try every way possible to either sink the LSB or render it a toothless (i.e., worthless) tiger, because it is not in their best interest, which is making money.
Re:The consequence of Fragmentation under Linux (Score:1)
As long as there is no confusion about which version is the real Linux then I don't see a problem.
People can choose.
I've been thinking about OS design for a while (although I haven't played around much at that level before). So I'm thinking the best way forward is to take the Linux source, learn how it works (am I being naive?), and then tinker. If something cool (but not suitable for a Linux kernel patch) came out of it, why shouldn't I put it on an FTP server somewhere and tell people about it? How does that harm anyone?
But you're talking about company X trying to make money and keep up with Linux updates, right? Agreed, forking wouldn't make much sense, but I wouldn't feel the need to damn them.
Please enlighten me if I've missed something here.
Re:It's not really a fragmentation issue because.. (Score:1)
Not in security patch administration it isn't. There things are already quite fragmented.
Don't get me wrong, I don't want to see fragmentation, but I'm not going to pretend it doesn't exist if it does, either.