NY Times on "the Fragmentation of Linux" 67

Weramona writes "The Times is running an article on the possibility of Balkanization of Linux, due to commercialization. To be fair, both sides are presented, and it isn't all that sensationalist. The article is aimed rather low ("Unix was created in 1969 by..."). What's funny to me is, a couple months ago, this was a favorite "Damn the Man" conspiracy theory on /. " It's the Times, so you need a free account to read the story, but it's a pretty good piece, so it's worth it.

NY Times on "the Fragmentation of Linux"

Comments Filter:
  • There is significant fragmentation in the Linux community, but that is not a bad thing... People have different opinions, whims, and design philosophies... It's all a part of the Linux culture and not something that should be looked at with a negative spin.

    Justin, of Gnuidea Software [169.233.22.137]
  • Seems like the open source model means that a lot of this can be avoided, but it also means that the user is going to have to know more to use the system, which may be good, but will keep Linux from being a true desktop (or general public desktop) solution.
  • While this presents both sides of the story, I still feel like it's just a regurgitation of old worries. There is not a valid argument in this article that would suggest in any way that Linux is more susceptible than any other open source project to fragmentation.

  • by rde ( 17364 )
    It's not so much a piece on balkanisation as it is one on fears of same. To this end it's a pretty good piece, but one I felt was written purely because the author knew it was an issue with some people. The main quote that some vendors "are more inclined to chase money and less inclined to share all their toys with their friends" isn't substantiated, and as the author pointed out, the Linux Standard Base should head problems off at the pass.
    I don't think the article was aimed low, btw; it's another example of a mainstream paper covering a topic with which a lot of -- but by no means all -- readers are familiar. It makes sense to include background, and it's a further example of Linux being brought to the masses.
  • While I still personally believe that Linux is breaking up, I can also see unification on the horizon. The way Linux is maintained and the way UNIX was maintained differ in every possible way.
    Linux is backed by a process that is more democratic, unlike the older UNIXes, which were essentially maintained by companies for economic reasons; I would call that the true capitalist way of management.
    The FSF, Linus, and the rest of the gang around the world play a vital role in regulating code, a role that was absent in the previous UNIXes.
    However, that's just my feeling... others might have a different view of it.
  • Nothing new in the story. Linux does not consist of the applications running on top, but of the kernel. As long as the underlying libraries are the same, I see no fragmentation problems. Hell, just pick an executable off a machine running RHL and put it on a machine running Caldera. With the same libraries, it should run.
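
    A minimal sketch of checking that, with a hypothetical binary path: on the target box, ldd lists the shared libraries the executable wants, and every line should resolve, with none reported "not found".

        # On the Caldera box, inspect the binary copied over from Red Hat.
        ldd /usr/local/bin/someapp
        #     libc.so.6 => /lib/libc.so.6 (0x40010000)
        #     /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
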
  • by jd ( 1658 ) <`imipak' `at' `yahoo.com'> on Monday October 18, 1999 @03:17AM (#1605610) Homepage Journal
    It depends on what you mean by fragmentation.

    I mean, do you include people creating distributions of the same basic kernel, and a different selection of utilities? (In which case, how is that any different from computer companies bundling different selections of software?)

    Do you include distributions with different kernels (eg: L4Linux), but the same utilities? (Here, how would the average user be able to tell that there was a difference at all?)

    How about a.out/ELF, or libc5/glibc? Well, everyone has migrated to ELF, and most have finished moving to glibc, so there seems to be an inclination to standardise, there.

    What else is there? Window managers & underlying X toolkits seem to be one battle, but I'd put that in the same category as bundled utilities - no different from any other computer market, since (time *) began.

    There's the directory the config files are put in, yes, but that seems to be working itself out.

    There's the X vs. Berlin battle, but that won't be anything more than a possibility (not even a certainty) for a long time to come. Berlin looks promising, but it's not ready for the Prime Time.

    What's left? The installer? Oh, wow! Like you have to worry about that, after you've installed the distribution.

    The Package Manager? That might have been a really serious contender for causing fragmentation, but Alien and similar utils make that almost redundant. As far as your computer is concerned, all package managers can effectively interchange packages with each other.
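
    As a sketch with a hypothetical package name, the round trip looks like this (Alien bumps the release number by one when it converts):

        # On a Debian-based system, convert and install an RPM...
        alien --to-deb fooapp-1.0-1.i386.rpm
        dpkg -i fooapp_1.0-2_i386.deb
        # ...or go the other way on an RPM-based system.
        alien --to-rpm fooapp_1.0-1_all.deb
        rpm -ivh fooapp-1.0-2.noarch.rpm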

    AFAICS, that pretty much wraps up all the possible causes of fragmentation.

  • Consider that most of the critical pieces of software (things like GCC, glibc, Perl, Samba, and the Linux kernel come to mind) involve Dipping Into The Same Source Code Stream. Thus, while distributions may pick different versions of these components, the differences are not persistent, since the next releases will pick a later version from the same stream of development.

    The main places where differences between Linux distributions are persistent are these two:

    1. Installation tools

      ... In the case of "initialization" stuff, the custom tools built by Caldera versus RHAT versus SuSE versus ... may be permanently different, but this is relatively uninteresting since you only run this stuff once.

    2. System Management tools

      This is arguably a matter for more concern.

      Tools include rpm/dpkg, and the recent proliferation of distributions based on Debian is resulting in RPM no longer being quite as "worshipped" as it used to be.

      I regard the increase in interest in Debian-based distributions as a good thing, since Debian has more automated tools for managing and validating packages, which is an area where RPM had "gotten pretty stuck" for a long time.

      Aside from package management, there is then "system management," with tools like COAS and Linuxconf, where different distributions are promoting different tools. (And I'd put in a plug for the OS-independent tool cfengine [hioslo.no] that's good for lots of purposes...)

    There's some fragmentation, but my old essay Linux and Decentralized Development [hex.net] has the thesis that the net results are positive. I haven't seen compelling evidence to the contrary yet.

  • Linux commercialisation has fragmented things a little bit, but in a good way.

    Linux is fragmenting into specialised tools with a common base. The tools are aimed at certain core markets where it performs very, very well. Microsoft is a good example of where a product hasn't fragmented to exploit markets. Win9x doesn't know if it wants to be a server or a desktop system, and NT has grown so large trying to be all things to all people that it's nearly unmanageable, and each release seems to be getting heavier and heavier, and more unstable (the Win2k test shows that even Microsoft has realised this).

    Linux must retain and expand these areas and make sure people understand why this is the case. If you are presented with a project that requires multi-user access, take a look at all the Linux distros. Somewhere in there is a distro that will provide you with exactly the base you need to build your application on. In some cases all you need to do is change a few variables and design a webpage.

    There is no major infighting between developers over distros - this is where bad things would happen (but there is a bit of mumbling and finger pointing). The developers either tend to ignore one another or work with each other. This is good.

    The current trend of articles is to portray Linux as a fragmented, infighting collection of geeks. There needs to be more PR and education projects to get the journalists to realise that this is not always the case.

    If Linux were a corporation, it would take a selection of editors off and wine and dine them somewhere expensive, and pick up the tab. It would take a selection of journalists off on a jaunt somewhere and get them drunk.

    The problem Linux faces is that until recently it's not had the financial backing to do this. The RedHat IPO does give them the money to do this, but it remains to be seen if they will follow this way of doing business. I think that they probably won't (at least not for a while yet).
  • They make no conclusions, but offer opinions on both sides of the issue. A good article, except for one glaring error. Free Software does not equal public domain! They really need to put an accurate Free Software entry into their style guide.

    ----
  • by Anonymous Coward
    Linux as a server OS will be OK; with pressure placed on commercial distributors to adopt the LSB, many of these problems will be minor. Even Caldera's integration of proprietary tools will be moot as some of the more interesting protocols mature (i.e., LDAP, XML-RPC, etc.).

    We will, however, see a flurry of activity on the desktop side. There are a ton of people who do not need to run a server, but instead want a fast, stable, and cheap platform to surf the web, play games, and write letters and resumes. These people are willing to pay US$60 a pop for this (or part of it) and have paid US$400 for just the software to gain this functionality. Aside from installation, there isn't much support that is required, and when there is, the established companies are already charging per incident.

    Unfortunately, this Linux desktop will probably not come from the larger distributors of today. It will be a company who adopts the Linux kernel and extends it with their own proprietary GUI. It won't be X compliant, or even have X available. The winner will eventually get X support through a company like Hummingbird. Early entrants will make developers pay a couple thousand for the privilege of developing for "their" platform. That will eventually stop as competing desktops vie for developers. Free tools and "open" APIs will finally arrive.

    The Linux you know and love will still be strong, serving enterprises and power users' home desktops, but your mom will be running Linux without even knowing it. From a casual inspection, you might not know it either.
  • What is the world coming to, when there's a piece on Linux in a big mainstream newspaper that's been written by a journalist with a clue? Initially, I thought that this piece was going to be another addition to the mountain of FUD that's been written in the last few months, but was very pleasantly surprised by its balance and accuracy. The article quite cleverly and subtly suggests to its readers that Linux fragmentation is something to be a little wary of, but is not likely to be cause for great concern.


    HH

  • by hey! ( 33014 ) on Monday October 18, 1999 @03:54AM (#1605616) Homepage Journal
    What I don't understand is why some people think that competition is always good except in the software arena. It seems to me that people who create their own distributions do it because they think that they can put together a better mix of components than anyone has thought of before, or because they see a niche that hasn't been addressed. It seems to me that open source enables the right combination of cooperation and competition. If everyone who wanted to create their own distro had to become part of Debian (my personal favorite distro), Debian would not have any focus at all. This way, developers are free to break away and form whatever combinations seem best, and consumers benefit from both diversity between competing offerings and coherence within a single distro. The market for commercial software has not shown the ability to support diversity between products, and has for some time been showing a tendency towards loss of coherence with each subsequent release of new versions.
  • Which distribution appears to be the most dedicated to maintaining standards? Which appears to be the most likely to jump ship and cause fragmentation in a key area when the situation proves profitable enough?

    I would perceive Redhat to be a likely candidate for the latter, possibly only because they seem to be the leading distribution here in the US. Although, I admit, I haven't seen them do anything I didn't like. Caldera also comes to mind.

    If there is any danger, I think it comes from the most popular distributions. The momentum of the sales of a large distribution like Redhat could cause fragmentation even if the rest of the community realized what was going on. In the article, the quote from the Redhat guy seems to say that there might be a problem, although most of the posts I've seen so far discount most of this fear. My reason for wanting to know the answers to the two questions above is this: if I'm going to support a company with my dollars, I want to make sure that I'm supporting someone who is devoted to the Linux community.
  • Hmmm ... the title is starting to sound like a 1984 byline. I thought the strength of Linux was supposed to be its customisation and flexibility? That you could mix and match components/applications as you wish (for example, there are at least 3 variants of mail transport agents). Thus you can get a version that can be optimised to fit simplified hardware needs rather than a single size fits all. If someone wants to, say, throw together a simple drawing appliance for kids, they could whip together a MIPS board, a touch screen, and a stripped-down Linux and GIMP. Having access to the source and packages allows one to hit small niches where a big expensive PC wouldn't be practical or affordable.

    Naturally this requires some smarts on the part of the integrator and, of course ;-{, as it is well publicised, we all know that a certain company claims to have purchased the smartest CS brains alive so obviously this can't be the case :-).

    Oh well, different courses for horses.

    LL
  • by nevets ( 39138 ) on Monday October 18, 1999 @04:06AM (#1605620) Homepage Journal
    Why do I always see this comparison, that Linux will follow in the footsteps of Unix? Unix was able to hide its source, whereas Linux can't. The GPL is probably the strongest reason that Linux will not fork. Any packages on top of Linux that are not GPL have the potential to do so. But even with KDE and GNOME, I see them merging more than I see them separating, and that is because of the ability to look at the other's code and make updates or "compatibilities".

    Linux core (the True Linux or kernel) will always be the same among the distros. Any distro to fork will fail since it will no longer be compatible with the rest. Or you won't be able to keep up with the "latest" by downloading.

    This brings up one exception. And this was stated in the article about Unix. If different hardware architectures arise, then we may see a split with Linux. But even then, the GPL will allow any "enhancements" to be shared among all distros.

    So far I have had no problems in keeping my Slackware and RedHat Linux boxes up and running the same utilities and applications. I'll raise a concern once I start seeing a problem.
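
    For what it's worth, the routine that keeps them in sync is just building the same source on each box (tarball name hypothetical); configure papers over most of the remaining differences:

        tar xzf fooutil-1.2.tar.gz
        cd fooutil-1.2
        ./configure        # adapts to whatever each distro provides
        make
        make install       # as root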

    Steven Rostedt
  • by Darth Maul ( 19860 ) on Monday October 18, 1999 @04:11AM (#1605621)
    Well, the fact that everyone can make their own distro is good for us hard-core Linux types, but bad for the general user.

    A week ago I went into the local Best Buy store, and went to the Linux section just to see what all they had. There was a lady there who looked confused, and just kept picking up different distro boxes, not sure which to buy.

    I felt bad, because she can go right over to the windoze section and buy *the* windoze 98 box. She had no idea that SuSE, RedHat, OpenLinux, et al. were all just Linux.

    In that regard, this is a Bad Thing[tm], because it confuses the average Joe user. I do think there would be some advantage to having The Linux Distribution.

    Now I'm sure you'll all reply "well, we don't want people that don't know to buy Linux!", but if we want global desktop domination, this spread of distros will NOT help. People don't like actually doing research when it comes to technology. They want to be told what to buy or have no choice. Hence the popularity of windoze.
  • by Anonymous Coward
    First, it talks about the balkanization of Unix as if the different flavors were insurmountably different. The real barriers arise with usage of proprietary very high level APIs (eg: IRIS Performer, ImageVision, video libraries, etc.)

    Of course it is possible for a vendor to add value like that, but it won't be a huge problem.

    The REAL issue, and one that I'm surprised the LSB spokesperson does not even acknowledge, is the difference in how the different distributions compose packages, regardless of distribution format. It's not enough to be able to force the import of foreign packages. One wants to enforce the dependency constraints. That is one of the main strengths of Linux (and IRIX, BTW, where inst/swmgr is still much better than RPM, although the latter indeed satisfies the basic needs).
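
    A sketch of the distinction, with a hypothetical package: the dependency metadata is exactly what gets lost or mistranslated when a package crosses distribution lines.

        # List what a foreign package claims to require before installing...
        rpm -qpR fooapp-1.0-1.i386.rpm
        # ...and note that forcing it in bypasses precisely those checks.
        rpm -ivh --nodeps fooapp-1.0-1.i386.rpm
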
  • Regardless, Linux is still Linux. The APIs are the same, the system's resources and libraries are the same, and the file system is just about the same. There are a few differences here and there (mainly in things like library/kernel versions and install script methods), but it's not the issue it is on other Unix versions, and because of the Open Source model, it never will be.

    There's incentive for most vendors to package their distro in a standard format (or at least support RPM installation), because they'll have off-the-shelf compatibility with the increasing number of applications available for the platform. Forking costs you the penalty of breaking that compatibility - now you've lost control of your commercial applications market. Where things get proprietary is in places like install procedures and/or bundled goodies (or sometimes in system management - like SuSE does), but once installed, Linux is Linux. All praise the penguin.

    - -Josh Turiel
  • That abundance of choices doesn't necessarily indicate fragmentation. In fact, the use of different formats for various applications creates a niche for compatibility, either through development by the primary application projects (Gnome and KDE) or through third-party tools (RPM, debs, and Alien).


    Chas - The one, the only.
    THANK GOD!!!
  • I actually have a case in point: recently I installed Linux-Mandrake 6.0 (RedHat 6.0 plus StarOffice and some other junk). Yes, slap me with a dead trout, because that was *dumb*. First of all, installing any RH *.0 has been known to be a dangerous pastime. Add Mandrakesoft's tweaking to it and you might have a problem. In essence, regardless of the WM, XFree86 version, utils, or whatever, X would destroy my memory and wreak havoc on the ext2 because after reboot the fs was out of sync. I ran a bunch of X stuff through gdb and efence to no avail, and frankly, I was running out of time on a project.

    I snagged Slackware 4.0 (I started out on Slackware 1.0.13) and ran it through a variety of "as close as I can replicate it" tests, and nothing even near the same level of disasters was occurring (although I did manage a nice xemacs core dump; I am rather proud of that). So, if I had the time, I would have found the problem (and am in fact still running the Mandrake distro on a stronger system to see if I can find the evil code) because I know how, but not everyone does. Here there may have been a case of code tweaking (I have not gotten a confirmation or denial from Mandrake; actually, I haven't heard anything at all) or some mismatched guts.

    Now, as a matter of record, I did buy Mandrake 6.0 in a rush without thinking it through. On that occasion I had bashed my previous distro, the CD was shot, and, once again, I was on a project and needed a system like THAT DAY, so I pretty much deserved what I got. With Slackware 4.0 I mulled over it for several weeks until finally asking around and getting the answers I wanted.

    I do not think Linux is fragmenting nearly as much or as rapidly as UNIX did, but I do think that potential users need to be aware that the fallacy of composition does exist within the confines of Linux distros: what works on one does not necessarily work on the other, even if the kernel is the same. The point to remember is that Linux can be just as complex as UNIX (which is how some of us like it), and critical thinking has to come into play before picking up a distro. In the past it was easy: you had Slack and RH, one non-commercial and one commercial (respectively). It is much more complex now, and users need to be capable of understanding the differences, where to find good data about distros, and how to implement them.
  • Is this not partially an apples/oranges comparison? If the customer buys the windows98 box then (s)he is basically just buying the OS - most applications have to be obtained separately. In Linux distributions, though, the OS forms just a small part of the package. Many distributions include a substantial number of applications as well.
  • by Anonymous Coward
    I'm new at this, but from what I can tell, the differences between AIX, Solaris, the BSDs, Debian, and Red Hat are smaller than the differences between Win 3.x, Win 95, and NT. Software porting is different, but a Unix admin should be able to work with any of them. What is the huge problem with balkanization (besides perception)?
  • by RNG ( 35225 ) on Monday October 18, 1999 @06:21AM (#1605630)
    I think this is all blown out of proportion. Let's say company X wants to implement their own Linux version and adds some stuff to the kernel that Linus won't accept into the main distribution. I would think that the resulting backlash of such an action (shipping a custom kernel, or for that matter a custom libc) by the hordes of Linux hackers would be enough to change that company's mind. If they persist in keeping their custom enhancements, then they will have to re-apply their patches against every new kernel (or every new libc); no small feat in the Linux world, where releases are measured in days rather than months.

    Second, there are 2 kinds of fragmentation: API and binary. If you change the API, then you fragment, and may the hordes of angry Linux hackers persecute you for the rest of your miserable days. In terms of binary fragmentation, we're there already (at least we were when some distros had already changed to glibc while others were still using libc5). This, however, I don't see as a major problem, as it (in most cases) can be fixed by a recompile.

    In summary: Yes, someone could fork the kernel tree, but at what price? I would hate them for it (as probably/hopefully millions of other people also would), which would automatically reduce their chances of successfully marketing whatever it is they make. Plus, they would have to run like hell to keep up with the rest of Linux development. I really do think that the Linux development model (ie: the speed at which Linux evolves) is actually a pretty good defense against fragmentation: both from a technical standpoint as well as from a social one. Let's not forget that even companies like Toshiba can be swayed by enough angry emails threatening to boycott them.
  • What really will balkanize Linux is software which is made binary incompatible amongst Linux systems. In the BSD and SysV world, as with Linux, there is plenty of software that can be recompiled cross-platform, but it's software that was locked into releasing only on proprietary binary formats that fueled the competition between systems like IRIX, Solaris, HPUX, SCO, and OSF.

    People use computers to perform various tasks other than running an OS. If software is not available for an OS, no matter how good the OS is, it wanes in popularity and possibly dies. If companies only support RedHat with software, then no matter how good Linuxen like SUSE and Debian may be, they're going to eventually decline in favor of RedHat, because people need software to do work.

    Also, many people can't handle recompiling software, so if they've got a Linux variant with a nice installer, and they can get commercial software in proprietary binary formats that have nice installers, then they are using a computing paradigm that is familiar to them...

    Linuxen that have nice, easy installers and are supported by commercial software with nice, easy installers will be the ones that have the best chance of combating Microsoft... but they'll also fuel the "balkanization" of Linux...

    Software vendors that don't commit to releasing cross-linux software are basically just pushing Linux towards the same situation that arose with other OSes... the difference is that it's easier with Linux to make cross-builds, so that a commercial package could be released on RedHat, SUSE, Debian, and perhaps others without as much effort as, say, a cross-build between Solaris and IRIX, and certainly more easily than a Windows and Mac cross-release...

    It's up to the users to demand such things, by contacting commercial software vendors and requesting it, letting them know there is money in it for them... they're not going to do it out of the kindness of their hearts, they're in business to make a living, not prevent Linux balkanization...

    However, I think most Linux vendors would be willing to support several Linux variants if they knew the customers were there, and the best way for them to find out that is the case is for the customers to let them know directly...
  • It seems that many mainstream publications have picked up on Linux, run one good general background article on it, and then waited a few months to run a gloom-and-doom article dismissing it (usually for some drummed-up reason like "Linux is too fragmented" or "Red Hat is another Microsoft"). I'm not sure why they would want to do this, unless they're just fad-crazy and they want to dump Linux in favor of another fad. One thing is for sure: this is sensationalism, not responsible journalism. First it's "Linux is the revolution that will topple Microsoft". Now it's "Linux is all washed up." How about being realistic and trying to find some middle ground?
    Beer recipe: free! #Source
    Cold pints: $2 #Product
  • I can't see any of the 19 replies (don't know why), but if no one has posted it yet, use cypherpunks as login and password for the NYTimes.

    I thought this was a fairly good article, in terms of expressing many people's opinions about code drift. But I still don't feel like it's anything like when Unix split. Perhaps that will happen once MSFT ports their apps over onto a commercial GUI shell, but I think it's just the usual paranoia about lack of control expressing itself.

  • The fact that the important components (kernel, C library) are GPL-ed means that everybody can copy everybody else's changes. Given this, and given the standardization efforts currently running, I don't think that fragmentation is the bogeyman people claim. It's fundamentally different for GPL software, because you can't hold your changes close as you can with proprietary software.

    Thanks

    Bruce

  • It seems the NY Times noticed the heavy activity on the "cypherpunks" account and shut it down. Someone (not me) has created a new, easy-to-remember username/password combination, though:

    Username: slashdoted
    Password: slashdot

    Note that there's only one "T" in "slashdoted", for some reason.

    I'm also posting this at the top level of the discussion tree.
    -----
    The real meaning of the GNU GPL:

  • The lack of a "standard" User Interface for the commercial vendors to design around is an advantage. Different User Interfaces suit different users. Currently the user/developer can choose the optimum UI for their use/the application. We are not forced into using the "standard" UI as Windows users/developers are.
  • I'm concerned.

    The balkanization of FUD is causing numerous problems, most importantly several not quite compatible variants of FUD. I have seen FUD from one company saying that since Linux is free it is worthless, and FUD from a different company saying that Linux is in fact more expensive to deploy than, say, Windows.

    I think it is important that all producers of FUD work together so that needless incompatibilities can be avoided. It is of course important for vendors to be able to differentiate their FUD in the market, but this need not cause incompatibility. I applaud the efforts Microsoft makes to provide basic FUD to VARs such as ZD and the NY Times, who are then able to add their spin, creating different but compatible FUD.


    Benny

  • IMHO, anytime a commercial vendor chooses to support one particular distribution of Linux, we're seeing the results of fragmentation. The technical reasons are unimportant (sometimes even non-existent); what counts is the perception that there is a difference and one is better/faster/more supported.

    Mind you, I'm speaking as a developer of closed-source software. I've been in meetings discussing which distributions of Linux we can and cannot support. It's very frustrating.
  • ...from RedHat.

    In a CNET article about RedHat supporting Linux on Compaq machines, I saw this:

    "As part of the Compaq deal, fixes created by Red Hat personnel will be contributed to the open-source community, Red Hat said. "

    -johnE

    the article [cnet.com]

  • by jd ( 1658 ) <`imipak' `at' `yahoo.com'> on Monday October 18, 1999 @09:10AM (#1605647) Homepage Journal
    You're right that it is a case of perception, rather than reality. (Stampede is one of the few -possible- exceptions, but only if you're talking about a raw install, rather than progressive upgrades.)

    WRT the meetings, I can understand such concern, but think that it's largely born of fear, uncertainty and doubt. (That is VERY different from saying that such discussions produce FUD; rather, if the technical issues were understood and the fears allayed, they would never have occurred in the first place.)

    A case in point:

    Let's say that you want to produce some program, Z, which needs to run under Linux. However, you don't know which distribution of Linux it's going to use. What do you do?

    Answer: Simple. Scan the distribution to see what resources exist and where they are, then install anything extra you need. (Configure isn't confined to Makefiles - I've used it as a nice installation tool, as well.)

    But what about versions of libraries? Not a problem! Just install your own, and make sure your installation directory is at the head of LD_LIBRARY_PATH.
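
    A rough sketch of that trick, with hypothetical paths, is a wrapper script that ships its own libraries and puts them first:

        #!/bin/sh
        # Prefer the libraries bundled with the application.
        LD_LIBRARY_PATH=/opt/progz/lib:$LD_LIBRARY_PATH
        export LD_LIBRARY_PATH
        exec /opt/progz/bin/progz "$@"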

    What about Gnome/KDE/Motif? I answered this in an Ask Slashdot, not too long ago. Write or use a generic interface, and dynamically link to the toolkit, via a symlink. To change the toolkit, change the symlink. A single ln -sf operation.
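
    Something like this, say, with hypothetical wrapper libraries for each toolkit:

        # The program always links against the generic name...
        ln -sf libui-gtk.so.1 /opt/progz/lib/libui.so.1
        # ...and switching to another toolkit is one more ln -sf.
        ln -sf libui-qt.so.1 /opt/progz/lib/libui.so.1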

    What about directories? Most directory layout information can be plucked out of the environment variables and standard utilities. ("which" is VERY handy, and "find" is invaluable.)

    What about different processors? Do what the old DOS programs did - probe! In this case, it's easy, as you can find out with uname. Then, just have binaries for each processor and install the right binary.

    What about different kernel versions? Same as above, for toolkits and processors. Anything that is kernel-specific goes in a separate .so file, and which .so file you use depends on which kernel uname returns.
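
    As a sketch (file names hypothetical), an install script can do both probes with uname alone:

        ARCH=`uname -m`    # e.g. i586, alpha, sparc
        KVER=`uname -r`    # e.g. 2.0.36, 2.2.12
        # Install the binary built for this processor...
        cp bin/progz.$ARCH /opt/progz/bin/progz
        # ...and the kernel-specific module for this kernel version.
        cp lib/kernhooks-$KVER.so /opt/progz/lib/kernhooks.so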

    Conclusion - you CAN guarantee software will run on ANY distribution that exists or ever will exist, without having to maintain it specifically FOR that distribution.

  • A friend and I were discussing this a few days ago. UNIX is fragmented in the sense that each vendor places a particular "spin" so to speak on each type to differentiate it from the competition. It seems to me to be a natural process that HP, IBM, Sun etc. would all do things differently.

    What I find interesting is the fragmentation taking place within the MS OS line. E.g., Win95, Win98, WinNT Workstation, WinCE, embedded Windows, Win2k (which has enterprise, desktop, and home user flavors). I think this is a bigger story, when one organization cannot standardize within itself.

    Any input?
  • Cool. Glad to see someone did this for those amongst us with securaphobia.

    How about next time we use FuzzyPenguin / LinuxRules when they shut off this one.
  • Did you read the article? I felt it was anything but "all washed up" FUD.

    As I'm basically solely a user of SunOS and MacOS, I'm not qualified to discuss potential Balkanization of Linux, but the article seemed pretty even-handed. It didn't say that Linux was doomed. It said that given past issues with commercial Unix flavors (like, uh, SunOS) and the recent flurry of interest in Linux, fragmentation was a danger to be watched out for.

    I personally don't think the KDE vs. Gnome issue is a big one, but the article mentioned that. And it brought up the counterargument, citing Linus and the Linux Standard Base. I think it was about as good as coverage of a basically technical issue will get in the mainstream press.

  • Thanks. I'm now reading the article on behalf of Mr. slashdoted. I will summarize the gist and forward it to him. I wish to emphasize that I am not reading on my own behalf! Legally, Mr. slashdoted is now reading.
  • All-In-One Linux is due out soon, which should break the barriers set by different Linux distributions, making installation of a package designed for one distribution a breeze on AIO Linux. The only catch is that AIO Linux did not get its name for its supposed ability to bring the distributions together. No, no, it got the name because it will not support subdirectories of any kind. Everything is installed in one directory. You won't be able to make new directories, but you will be able to have files of the same name, accessible by a special character sequence plus a function key pressed while running commands. The ls program is being re-written from scratch, featuring over 2000 switches to facilitate directory listing. Spintek software has already released a freeware program called AIO Mark, a shell which will cross-reference files in a large master file, creating virtual subdirectories.
  • Well, Red Hat releases all their software under the GPL, as do most of the other Linux distributors. Caldera is one of the few exceptions. They didn't open-source the Netscape FastTrack e-commerce server, but it wasn't theirs to do. Also, they didn't open-source the Novell NDS stuff, but that may soon be pointless, since I have heard from several Novell higher-ups that Novell is strongly thinking of open-sourcing their NDS to all platforms, especially Linux. If that happens, then there really isn't much that the distros have that is proprietary. Caldera has their own installer (Lizard) that they are open-sourcing, along with some other things, but not much commercial software anymore.

    I personally don't worry about it too much at this point.
  • I think that as long as Linux distros keep the kernel and libs all common, there's not much to worry about as far as "Balkanization" is concerned. I agree that Linux is made to be flexible and there is no right distro. Unlike with M$, Linux and its users can be as flexible as they want, not confined to a certain order of things. As for the different variations of MS products, I believe it comes down to money. It's human nature to want bigger and better, and MS knows that if they come out with something new and "fix" the problems with the old, they'll have more money to go buy small companies that challenge them :)
  • I think this is all blown out of proportion.

    To start with I want to point out that I agree with you on this. It's some of your other arguments that I'm not "at one with".

    If you change the API, then you fragment and may the hordes of angry Linux hackers persecute you for the rest of your miserable days.

    I don't agree with this, nor do I think it is a worthwhile goal either.

    Let's say I add a utility that allows one to specify a power maximum or a temperature maximum. This changes the command line API (i.e., it adds a new command, probably one only the "system manager" would bother with, and most likely only on portables, but still...). I don't think anyone will obliterate me (assuming I release the source to all the needed kernel mods). I do think almost nobody will pay attention to me unless I release the source to my command line utility.

    Same for the system interface: let's say I add a setsockopt(2) option that tells the kernel to include timestamps on each packet read via recvmsg(2). Assuming I release source (required, as the code in question is GPLed), nobody will obliterate me.

    In both cases most people would ignore the changes unless they made it into a major distro. Even then many people would ignore them unless they made it into almost all major distros.

    However, some small number of people who found the functionality very useful would use it. This would be good for them. They might lobby some of the distros to include the changes. They might not. A very few people might use some of these things and only later discover that they aren't very portable from Linux to Linux. That would be bad for them.

    I could do a similar example with device drivers, but I leave that as an exercise to the interested reader. :-)

    Yes, someone could fork the kernel tree, but at what price? I would hate them for it (as probably/hopefully millions of other people also would), which would automatically reduce their chances of successfully marketing whatever it is they make.

    Then how do we get any change that isn't driven directly through Linus? I think the issue is less about forks in the tree, but forks that never join back up. It is good if the tree forks a little. It is good if some of the forks are deemed bad and allowed to die off. It is good if some of those forks are deemed good and folded back into the main line. It is very very bad if some of the forks are deemed good, but don't make it back into the main line.

    The GPL makes sure that the code formed by forking a GPLed package (the kernel, and many but not all user level utilities) is available for folding back. That is a major strength of Linux. It doesn't prevent forking (as the Sun community license more or less does). I for one think that is also a major strength of Linux.

  • (Love your login name [kultur-online.com], BTW....)

    If companies only support RedHat with software

    I'm curious what "support" means in this context. (NOTE: in the following, I'm using "Red Hat" because it's the one people seem to most fear becoming the Only Linux For Which People Release Software.)

    Does it mean "we're releasing a version that can be installed on, and run on, a Red Hat system, but that depends on stuff (installer, libraries, file system layouts, etc.) on a Red Hat system, so it won't work, or won't work quite right, on a different distribution"?

    Or does it mean "well, we're not trying to make it Red Hat-only, but we're only going to test it on Red Hat, and are only going to offer support for customers running Red Hat - if you call us up because it doesn't work on OpenLinux or TurboLinux or SuSE or Debian or..., we'll tell you how sorry we are to hear that, and then we'll suggest you install Red Hat if you want to run our software"?

    (It may well be that different vendors mean different things by "support".)

    To some extent, the first of those could perhaps be worked around by adding stuff to your non-Red Hat system (unless the changes needed to get the software to run are incompatible changes - but I don't know how many users, other than technoids, will want to do that). Perhaps that'll provide an incentive for vendors to make their distributions more Red Hat-like, for better or worse.

    The second of those may be less of a problem, in that software that's not "supported", in that sense, on other distributions may Just Work on those distributions - but there may be customers for whom "it works, but we won't answer your phone calls" may not be good enough.

    However, I think most Linux vendors would be willing to support several Linux variants if they knew the customers were there, and the best way for them to find out that is the case is for the customers to let them know directly...

    ...and those vendors might, in turn, apply pressure on developers of Linux distributions to try to make it easier for software to work on multiple distributions - and for software vendors to test software without having to do a ton of testing on N different distributions. (The LSB [linuxbase.com] appears to be intended to have a sample implementation; however, the LSB Organization page [linuxbase.com] says:

    The sample implementation will be used to compare and evaluate features that are being considered for inclusion in the standard. The sample implementation is not meant to be used as a reference implementation for the purpose of resolving conformance issues among distribution vendors, though it may be used during the investigation of such issues.

    so it won't necessarily be usable as a distribution on which vendors can do testing of their applications.)

    (Hmm. I'm curious how vendors of Windows applications handle Windows OT, e.g. W95 and W98, and Windows NT. I wouldn't be at all surprised to hear that they have to test applications on both platforms if they're going to support them on both platforms - and to test them on different versions of those platforms, e.g. W95 and W98, or NT 4.0 and NT 4.n^H^H^H4.0 SPn. Heck, the Windows OT and Windows NT implementations of the Win32 API probably differ a lot more, in their kernel and API libraries, than would the kernel and API-library implementations of two 2.2-kernel/glibc-2.1 Linux distributions.)

  • Linux core (the True Linux or kernel) will always be the same among the distros.

    Well, applications often sit atop more than just the kernel - either they're dynamically linked (and thus sit atop the system shared libraries), or they're statically linked (and may have wired into their binaries assumptions about, say, the locations of files used by the library routines).

    Fortunately, it appears that most distributions on which you'd run shrink-wrapped applications (as opposed to, say, a "slap this on a PC with multiple network interfaces and you have a router/firewall" distribution) may be converging on glibc 2.x (although, if the shrink-wrapped application is called "Netscape Communicator 5.0", or whatever the next release is, it may require glibc 2.1 or later, as per Mozilla's requirements); I don't know if any other libraries those applications might use differ widely between distributions.
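
    (One quick way to see which glibc a distribution actually shipped - glibc's shared library prints its own version when executed directly, assuming the conventional path:

        /lib/libc.so.6
        # GNU C Library ... release version 2.1.2, by Roland McGrath et al.

    which makes the "is this 2.1 or later?" check easy.)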

    I note, though, that "Linux core (the True Linux or kernel) will always be the same among the distros." isn't necessarily entirely the case - they aren't all using the same kernel version (Debian's still on 2.0[.x] - no, Potato isn't "done" yet - but I think the other "major" distributions have gone to 2.2[.x]), and they might make local changes (which, of course, other distributions could adopt - blah blah blah GPL blah blah blah - but that doesn't mean they will).

    In some cases, local changes are just "enhancements", in which case an application vendor might choose Just To Say No and not use features added by a distribution. Of course, the trick there is how to discover what's distribution-unique; an LSB "reference implementation" might be useful there - if it doesn't run on the reference implementation, it might not run on all LSB-compliant distributions.

  • But what will happen when Linus shuffles off this mortal coil (god forbid)?

    Has this been thought out?



    -------------
    7e [sevenelements.com]
  • When Unix fragmented, the situation was very different from today's Linux vendors. Back then, software was just a means to make hardware do enough so that you could sell it (and services for it) for a lot of money. In that vein, each company needed an OS that could take full advantage of their particular hardware. At the same time, they had spent a lot of time and money developing this hardware. They couldn't share their OS code without giving away bits of information about their hardware.
    Today, the tide has shifted. Linux vendors use hardware as just a means to run the software well enough so that they can sell it (and services for it) for a lot of money. They don't have a need or desire to change the guts of Linux. Sure the outsides are changed but that is true of other very successful software products as well. Windows comes in many forms, including versions "adjusted" for certain hardware vendors. There are applications that run on Win9x that do not run on Win3.1, apps that run on WinNT that do not work anywhere else, and apps that run on Win9x that do not run anywhere else. Heck, there are applications that seem to only run on specific versions of Win9x! Has this hurt the market share Windows has enjoyed? Nope.
    Finally, if the fragmentation of Unix was so bad... why are all these versions of Unix still alive and kicking? The only reason Irix is fading out is because SGI is in trouble, and that is not even from Irix problems. Sun's Solaris, HP's HP-UX, IBM's AIX, and Compaq's Tru64 (the OS formerly known as Digital Equipment Corporation's Digital Unix) are all still around and not going anywhere soon. All the companies still make money off of them and the hardware they run on. Fragmentation didn't kill them. Unix is not dead. Unix is alive, kicking, and profitable.
    So relax; there is nothing to worry about.
  • The following quote from the article said it all:


    Others disagree. "Only the trade press is really squeaking about this," said Eric S. Raymond, president of the Open Source Initiative, a programmer group, because it makes a good story hook when you have nothing real to write about and the advertising department is pressuring you to make closed-software vendors look good.

    Raymond added: "The actual Linux developers know better. Fragmentation isn't going to happen, because developers outside of the Linux distributors effectively control all the key pieces." That is because Torvalds and a small group of his colleagues control the Linux standard and subject all modifications to peer review.



    C'mon folks! Here's ESR telling reporters from the Times that this is a NON-STORY, and that they had better examine their motives for writing it! Kudos to Raymond for being so politic about it that the writer didn't catch it. The single quote above is the only thing in the article that actually made a modicum of sense.

    --B
  • There is significant fragmentation in the Linux community, but that is not a bad thing... People have different opinions, whims, and design philosophies... It's all a part of the Linux culture and not something that should be looked at with a negative spin.

    Different opinions are good, different whims are good, different design philosophies can be very bad. In my opinion, that is the single biggest weakness of Linux at this point. The tools are all over the place. With a commercial package, you tend to get a fairly integrated set of tools.

    System administration in Linux is a good example. It's totally disjointed. Linuxconf here, netcfg there, make over there. And I personally think Linuxconf has a looong way to go. For established Unix users, it's no problem -- they just use the command line. But what about everyone else?

    - Scott

    ------
    Scott Stevenson
  • You echo my thoughts exactly. (Or perhaps I should say, mine echo yours.)

    The article raises a couple of points that aren't new but are worth addressing.

    First, Unix is fragmented. This is definitely true: every Unix system is a toolbox of utilities and components; competing tools are often installed together on the same system. Most every tool or utility has to face the consequences of being compatible with a couple of different methods for doing the same thing. Even if the end users don't always notice, it is a serious concern for the sysadmins, distributors, and developers.

    Second, fragmentation is due to commercialisation. This is a highly questionable statement. Commercial ventures have an active interest in protecting their unique added value, and some added software can be part of that. But on the other hand, they have an interest in building a well integrated system that appeals to a wide range of users. Therefore, with commercialization you see many incompatibilities between vendors, but once you go with one, a well integrated environment; in a free software environment it's a jungle of competing developments that all try to be compatible with whatever the authors happen to be familiar with. To me, a Linux system looks more fragmented than a commercial Unix system.

    Well, applications often sit atop more than just the kernel - either they're dynamically linked (and thus sit atop the system shared libraries), or they're statically linked (and may have wired into their binaries assumptions about, say, the locations of files used by the library routines).


    I wonder why it hasn't gone to (*shudder*) the MS way. If the DLLs (or shared objects) don't exist, then just insert them. I don't see a problem, since shared objects have ways of versioning that DLLs don't. So it won't be a problem to add libX.so.2.1 if it doesn't exist.
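
    The versioning in question is the soname symlink chain; a sketch with a hypothetical library:

        # Two incompatible major versions living side by side...
        ls -l /usr/lib/libX.so*
        # libX.so.1 -> libX.so.1.0
        # libX.so.2 -> libX.so.2.1
        # ...and ldconfig rebuilds the soname links and the linker cache
        # after a new one is dropped in.
        ldconfig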

    I know all the distros use different versions of the kernel. The first thing I do when I install a new distro is download and compile the latest kernel. And I have yet to have a problem with this.
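
    For the record, the 2.2-era routine for that is roughly (version number illustrative):

        cd /usr/src
        tar xzf linux-2.2.12.tar.gz
        cd linux
        make menuconfig        # pick drivers for this box
        make dep               # required before building on 2.2
        make bzImage modules
        make modules_install
        # Then copy arch/i386/boot/bzImage into place and re-run lilo.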


    Steven Rostedt
  • As a person who tests bleeding-edge hardware against the four major Linux commercial distros daily (and others as well, but mainly the big four - Caldera, SuSE, RedHat and TurboLinux) I strongly agree with the NYT article: Linux is fragging to the point of looking like a massive gibbing in a Quake fest.

    No longer can the user be sure that any generic code will work on any one distribution. No longer can the user even be sure the basic functionality of the kernel will work consistently from one distribution to another.

    The source of all this incompatibility? How do I loathe these commercial distros? Let me count the ways!

    Lack of strong, pro-active support for the LSB by the commercial distros: Lip service spewed simply to avoid getting flamed doesn't quite serve the purpose of getting a solid LSB. The commercial vendors really don't want an LSB, at least their marketing folks don't: one very strong concept in marketing is DIFFERENTIATION! You need to make your product different enough, and drone on about the "superior" aspects of the variety, to get the consumer to buy the product.

    Money counts more than quality. The commercial vendors have to be concerned with money first, and their products show it. Redhat is buggy crap when running X; Caldera's install won't even let you make a boot floppy during install (hey, you know those newbies just gotta love that); SuSE has so much proprietary patching done to their kernels that I often can't get common drivers to work; and the list goes on and on and on....

    The frickin' long-term libc vs. glibc6 mess. This has opened the door to all sorts of opportunities for the differentiators to make trouble. Any LSB should deal with this ASAP! Perhaps dual-library cross-compilers as a standard feature? Make the effort to ensure glibc6 is fully inclusive of libc5?

    To sum it up: The commercial distros are desktop manager happy and want the entire Linux world to look and act like a Microsoft product, apparently to the point of being sloppy, unreliable crap just like their favored model. The commercial distros care far more about making money than they do about providing a quality product. One commercial variety of Linux will not be consistent in the way it works and the programs the user can use with it when compared with another commercial variety of Linux.

    What do I use? My control testing box is Slackware-based; I don't use either of the slow, unreliable desktop managers (both Gnome and Kde sucketh in a big, bad, buggy kinda way) except when I'm testing X/video related stuff. I've tried using both Gnome and Kde; they are both buggy, unreliable, and offer very little functionality for the loss of speed and increase in instability that comes with them. IMHO, both are still beta-stage code.

    A prediction? If there isn't a strong LSB in place soon, Microsoft will continue to dominate the Desktop, will make a turn-around in server space, and Linux will have been a flash-in-the-pan. Why? Because users won't abandon one buggy, unreliable mess for another: better a known evil than an unknown evil, to paraphrase an old saying.

    Be prepared Linux-folk, the commercial vendors will try every way possible to either sink the LSB or render it a toothless (i.e., worthless) tiger because it is not in their best interest, which is making money.
  • I'd be quite happy to see fragmentation of the kernel tree.

    As long as there is no confusion about which version is the real Linux then I don't see a problem.

    People can choose.

    I've been thinking about OS design for a while (although I haven't played around much at that level before). So I'm thinking the best way forward is to take the Linux source, learn how it works (am I being naive?), and then tinker. If something cool (but not suitable for a Linux kernel patch) came out of it, why shouldn't I put it on an FTP server somewhere and tell people about it? How does that harm anyone?

    But you're talking about company X trying to make money and keep up with Linux updates, right? Agreed, forking wouldn't make much sense, but I wouldn't feel the need to damn them.

    Please enlighten me if I've missed something here.

  • Linux is Linux?

    Not in security patch administration it isn't. There things are already quite fragmented.

    Don't get me wrong, I don't want to see fragmentation, but I'm not going to pretend it doesn't exist if it does, either.
