The Question Of Too Many Linux Distributions

evenprime writes "In this ZDNet column, Evan Leibovitch responds to Linux critics who say that Caldera and Red Hat will be the only distributions to survive. Evan points out that the diversity of available distributions, and the ability to roll your own, is a great strength." The argument Evan makes is one that, IMHO, is correct - and people need to remember that the diversity of the distributions isn't necessarily a bad thing. Sure, maybe the commercial variants will narrow down to a couple, but I think for the overall community, diversity is a strength.
  • The business field will be narrowed because the market just isn't there for 500 distros (though the standards issue will play a role). Look at the auto industry in the '20s versus the auto industry today.
  • The problems you describe are likely to be less of an issue for a big business. They're likely to standardize on a comparatively small number of packages- just the ones that they really need- and stick with them for a good long while. They're not going to be trying to upgrade the version of their CD-ripping software every time it comes out with a new bugfix. They're also capable of doing neat tricks like compiling their own to solve some of the dependency problems that you mention. (Actually, the next time that sort of problem happens, you might consider downloading the source RPM and running rpm --rebuild to see if it can be compiled with the software you already have; a rough sketch of this follows at the end of this comment. One major flaw of compiled RPMs is that they often require much more specific versions of packages than is strictly necessary.)

    Of course you can also solve this problem by using a more advanced package management system. Debian users are constantly (and correctly, IMO) bragging about the ease of using apt-get for package management; it deals with all those annoying dependency details for you. The still-under-development Ximian Red Carpet is also quite nice about resolving RPM dependencies and downloading any updated packages you may need. Both systems do depend on having a blessed source of guaranteed compatible packages, though.
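
    A minimal sketch of the rebuild trick, assuming a hypothetical package name and an RPM 3.x-era command line:

        # rebuild a binary package from the source RPM against whatever you have installed
        rpm --rebuild someapp-1.0-1.src.rpm
        # the freshly built binary RPM usually lands under /usr/src/redhat/RPMS/<arch>/
        rpm -ivh /usr/src/redhat/RPMS/i386/someapp-1.0-1.i386.rpm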

  • They use Red Hat Linux at UCLA? AFAIK a majority of the servers run NT and SEAS uses AIX/Solaris... and at the installfests the LUG generally installs Mandrake.


    Zetetic
    Seeking; proceeding by inquiry.

    Elench
    A specious but fallacious argument; a sophism.
  • This sounds like very good advice; unfortunately, a lot of companies have taken a different approach. We have a large customer which is a hardware vendor. They develop special purpose systems, and have their own Linux distribution which is tailored to their needs. They support a variety of operating systems, but Linux's open source nature is a nice advantage when dealing with custom systems. When problems arise, they don't have to deal with intellectual property issues with the OS vendor when working with their customers to resolve the problems.

    When working on a large system, the fast pace of changes in Linux can be a problem, so they have all the different vendors use their version of Linux, which lets them control how often it gets updated.

    The problem then arises that Linux doesn't have a binary driver interface. Changes to the kernel often break device drivers. It's very difficult to release a driver in binary form at all, and making it work on a variety of systems and distributions is even worse. (A rough sketch of what per-kernel module building looks like follows at the end of this comment.)

    We can't just release a source code version of the driver and have the customer recompile their kernel with our driver built in for three reasons.

    1. The customer doesn't want their customers to have to recompile the kernel to install a new driver. The people installing and updating the drivers are not Linux experts. The installation must be simple. If running Linux on the system means they have to have better trained users, then Linux is no longer cost effective.

    2. The source code contains intellectual property of the ASIC vendor that we have access to under NDA. We cannot legally release the source.

    3. The ASIC we use is available to and used by other vendors. Our software is what makes our products superior to our competitors'. Making the source available so that our competitors, who have been promising the same features for a year but can't get them to work, can catch up is not a good business decision.

    The problems get even worse because the customers want to use software from a third party which they think should work with our hardware. Unfortunately, that software contains kernel patches which are poorly hacked into the kernel. Now we're spending our development time patching a customer-specific kernel and third-party software.

    After a couple of months of this we think we've got it worked out. My managers are happy, mention that they have some other customers who want Linux drivers, and can't wait to give those customers the new code. Of course the customer has a different distribution with a different kernel version. My boss was not happy to find out that changes to the kernel break our driver, and more development needs to be done to get the driver working for that customer, who also has some very specialized needs.

    Our product has a lot of features. Fully running our verification tests takes several weeks to a month. Supporting a large number of OSes is expensive. Supporting an OS like Linux, where everyone cooks their own version, is very expensive. Support for anything but Red Hat or Linux PPC (yes, we support PPC) will likely not be a free product from us in the future. It just isn't cost effective unless there's a very large order involved.
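
    To make the versioning problem concrete, here is roughly how an out-of-tree driver gets built on a 2.4-era system; the file names are invented and the exact flags vary by kernel configuration, so treat it as a sketch only:

        # compile the module against one specific kernel's headers
        gcc -O2 -Wall -D__KERNEL__ -DMODULE -I/usr/src/linux/include -c driver.c -o driver.o
        # load it into the running kernel
        insmod ./driver.o
        # a kernel built from a different version or distro config will typically
        # refuse the module (or misbehave), which is why one binary can't cover them all
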
  • Friendly graphical installers like the one that comes with Red Hat, or like the one that comes with Windows? There are only 5 varieties of Windows, but their installer still confuses most 1-2 year Windows admins I've seen.

    Of course, most Windows desktops come pre-installed; if that were the case for Linux, installers wouldn't really be the issue.
  • Ok, so why not add Piranha to Red Hat then instead of making a new distro entirely? Even better, why not just develop Apache to use mod_proxy and make it a non-Linux-specific solution?

    What's the benefit or why does it matter if Linux fits on a floppy for a firewall? Let's say the environment is grown to 200-300 Linux boxes strong and each one is totally different. How does having a machine that's nothing like the rest help make the admin's job any easier at 2:00am when it goes down? Or what if a new person walks in the door to take over the environment and has to figure out wtf is going on and how each one works, learning curve goes waaay up.

    Not trying to start a war, but it's the customization and optimization that makes it harder to administer, in my opinion. It makes it an elegant hack job of bubble gum and duct tape...

  • I love this really dumb argument that we shouldn't have too many choices. We should have only one car company, one airline, one oil company, .... That way we would always know for sure which one was best. Is this just a tech problem with Microsoft, or are all consumers fools?
  • Considering that I turn down several job offers a month and am now looking at another one pretty closely, yes, I do think being an admin is a long-term good thing.
    Yup, and of course winders *never* loses its mind and goes all to hell and needs to have people called in to fix it. The simple fact of the matter is that when it is up and running it is just as easy, if not more so, than winders, and when it breaks Susie is not going to be fixing winders any more than she can fix *nix.
    You will notice that I used the past tense in those statements. Yes, it is somewhat better with winders 2000 but still nowhere near as nice as *nix.

  • I'm a firm believer in free competition. It's never a good idea to have a monopolist in this area (see Windows). One of the things that must be observed is cross-compatibility. If you constantly have to change software when you change distribution, people feel like they're stuck inside a distrib. Take the installation procedure, for instance. Sometimes it's a tarball, then an rpm, then a binary with an install script, always assuming a different configuration (read: distribution). Usually it's possible to make them work, but it annoys me every time.

    Slackware rulez! But I would appreciate it if Slack 8.0 would be released soon...
    Since I use Slackware 7.1 it usua
  • b) and c) are already supported on Debian.

    Debian has this menus program. When you install an X program on Debian, a new menu item will be created which is available no matter which WM you are using (I tried KDE, GNOME, Enlightenment and WindowMaker).

    Debian also supports dependency checks and downloads.

    • Typical desktop workplace environment
    • Too many variations of Linux
    • Gnome vs. KDE vs. etc.

    When it comes to deciding whether or not to use Linux in your work environment, these arguments are all very poor cop-outs.

    Here is why. I used to do tech support for IBM Global Services for the Lucent account. Massive outsourcing effort. One of the first things my group did was enforce standardization of the desktop for our users. At the time, the standard desktop was Windows95 with Office95 and very specific network shares for specific groups. If you installed anything else on your system, you had to support it yourself. We only supported Windows95/Office95. Period.

    So what is keeping the CTO of a given company from mandating that servers shall run Debian stable (for example) and nothing else? Workstations shall run Debian stable with Gnome/Gnumeric/Abiword and nothing else? Hell, you can enforce running RedHat servers sharing samba and Windows98 clients running KDE inside X servers for all I care.

    The point is, Linux provides you with not only options and choices but the opportunity to build and control *any* type of environment that suits your business needs. If you cannot enforce standardization of your own business's computing environment, that is your own fault and is independent of OS or GUI.
  • Uhm, then why did you post?
  • My job would be *much* harder if we only had one distro.

    Then you must be a developer, because you definitely aren't a Systems Admin...

  • Not to start a flamewar, but SuSE probably has by far the greatest market share in Europe.

    At least in the German speaking parts...

  • Diversity in Linux is a good thing. Darwin will take care of the rest!
  • Ok, sport, I use my Linux machines for work and not for CD ripping, etc. And life was much simpler before everyone started using rpm; tar is just so much better. So let's not throw personal barbs, OK? Especially when you don't know squat about me.

    Second, the problems I describe are a huge issue for businesses large and small regardless of the OS, and they are not easily solved. All distribution packages are a nightmare to use and, unless you're deploying on a very large network, are oftentimes far more work than not. Ever try to deploy an update to an application only to have it fail on half the machines for various and sundry reasons? Many update packages are fine, in theory, but in reality are not too helpful.

    My point remains: Linux distros are just as bloated and top heavy as other OSes, and just because it says Linux doesn't make it easier.

    ps: Can't install the frigging source RPM because I need rpm 4.0, which I can't install ...

  • If people would actually hack at Windows as much as they hack at Linux, Windows would never "blow up". I run Windows 95 OSR2. It never crashes and runs nice and quick. Why? Because I actually manage my DLLs, VXDs and registry to the point that I do not have such problems.

    I have never seen a Linux user just install a program that they have downloaded off some wayward site without checking it, yet I see the same people just grab the latest version of whatever and install it into Windows. These people have full system trays with the most useless things and several files loading at boot that they'll never use. Then they complain because Windows is slow and buggy. The same user will also spend hours working on a config file in /etc but will never use Regedit in their life.

    I like Linux and Windows for their individual strengths, and I don't have a huge problem with either. This has become sort of a rant, but I think both are fine in their own way once you actually learn to use them both to the proper extent.

  • Have you tried the RedHat beta yet then?
    It's beta (so by definition it sucks :), but is very easy to install compared to previous versions and once installed it's child's play.
    The only thing that might be hard is choosing what you want to install.
  • How can Apple do so easily with a BSD kernel what Linux geeks can't do with the Linux kernel? Put a good GUI, management features, etc. on it!


    Linux geeks have been fighting internally for a long time; wait a bit longer and OS X "might" take over the *NIX world, if and only if they allow some open source/licensing route for their GUI.

    Pondering...

  • And this is just an instance of the versioning problem in monolithic OSes, only it tracks vertically instead of horizontally.

  • As has been pointed out before, there is plenty of room for diversity and personalization - and distribution-specific wackiness, even - but a simple set of agreed standards would solve most of these problems. If the major distributions follow a standard, open and reasonable set of installation standards - oh, say the LSB - then commercial and noncommercial software makers' lives will be made easier. I am impressed with the variety of places various RPMs want to put their files (/usr/lib, /usr/local, etc, etc, etc). Each user can still place tarballs where they want to, and minor distributions can continue to follow any old wild idea that comes into their heads. Vive la difference!
  • "No offense, but I think you must know some really stupid "windows admins" then"

    Of course, these are not the most clued in people ever. I find the Linux installers to be easy to use (a little on the feature-rich side, which can be confusing), but I know folks who see "what kind of keyboard are you using" and their brains lock up. AND THESE ARE PROFESSIONALS.

    We live in a world where we're damn lucky most machines come with an OS because most people who do admin for a living could not install one to save their lives. Most of the admins I know wouldn't break a sweat on a Linux, Windows or BSD install. But, then most of the admins in the world are nothing like most of the admins I know....

    That said, I've been baffled by Windows installs when they get... odd. The thing most people forget is that 99% of the time, when you're installing Windows you're doing so on a system that was meant to run Windows. When you try to install Windows on a home-grown system with a blend of old and new parts, it can be a refreshing dream to install Linux on it ;-)

  • Download the latest kernel, then the latest version of XFree86, the latest version of GNOME, KDE, Apache, etc., then assemble it yourself for a no-name Linux. We do it with computers all the time by buying the parts separately to get our own tailored computer, so why not the OS too? Then you know exactly what you've got and won't have the needless bloat of full Red Hat or Mandrake distros. Not that they don't have their uses: I installed Mandrake 7.2 on my machine so I could learn Linux, play around with it, and try out all the apps to see what they do. But then I discovered www.linuxfromscratch.org.
  • I wish they would specialize a little more. Mandrake uses KDE and Aurora, Red Hat uses GNOME and a beta gcc. SuSE is like Mandrake with a different default config tool. Other than that they look the same. It's like having twelve different flavors of ice cream that are all strawberry.

    I would much rather have a couple great server distros, a couple great desktop distros, and a couple low-cpu requirement distros. Better than having twelve that each try to do all three.
  • Don't forget:

    4. A couple of dozen "boutique" distributions aimed at specialized purposes. There will long be a need for distributions that are optimized for specific applications. These might include single-floppy versions for rescue disks, versions for single purpose devices like routers and firewalls, versions for embedded devices that have to boot off a ROM, CD-ROM-only versions for internet appliances, etc. There might even be versions that are aimed at markets that we don't think of as being separate right now but that would take off when available. I'll bet, for instance, that blind people would appreciate a distribution that had an audio-based install and included lots of packages aimed at making it more accessible. These special purpose distros may originally be derived from one of the general purpose distributions, but they'll probably wind up taking on a life of their own and being maintained separately.

  • "Thake the installation procedure for instance. Sometimes it's a tarball, then an rpm, then is a
    binary with an install script allways assuming a different configuration (read: distubution)"

    This is why installing binary-only packages is a bad idea. Ultimately the nature of glibc and gcc is to lead the field forwards in their own sweet way. If "your" distro draws a baseline saying "we shall use glibc-2.0.7 for OurDistro-6.0" then that's its problem for quantizing the evolution process. What's even more silly is producing binaries that will only work on a few such versions: specifically, I'm thinking SuSE-5.2 and 6.0, RH 6.0, 6.2 and 7.0; they all use different glibc2 versions, 2.0.7, 2.1.x, 2.2.2 (oh woops, that's Debian Unstable, never mind, it'll be 2.2.3 before you can blink). Any company producing a binary-only package "for SuSE" or, as is rather more frequently encountered, "for RH" is screwing their business model over badly, because I just simply won't be able to install it - and around here, what I have installed wins over what you might be able to provide. (A quick way to check which glibc a binary actually wants is sketched below.)

    Oh yeah. Now what was that about commercial distributions? What about the real linux distro?
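
    One rough way to see the mismatch for yourself; the binary name here is invented:

        # list the shared libraries a prebuilt binary expects
        ldd ./some-vendor-app
        # show the versioned glibc symbols it was linked against
        objdump -T ./some-vendor-app | grep GLIBC_
        # if those say GLIBC_2.2 and your distro ships 2.0.7, no "supported
        # distribution" sticker is going to make the binary run
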
    ~Tim
    --
    .|` Clouds cross the black moonlight,
  • by fantom_winter ( 194762 ) on Wednesday March 14, 2001 @04:30AM (#364044)
    I think that's the whole idea of Linux. From a design aspect, Linux is more of a bazaar, where everyone puts a piece in. It works pretty well for the hobbyist. If you want something more consistent and slower moving, try FreeBSD, or a commercial Unix.

    Distros do nothing but give people options..

  • by blueg3 ( 192743 ) on Wednesday March 14, 2001 @04:31AM (#364048)
    we Linux geeks wouldn't be able to spend nearly so much time installing different ones. Where's the fun in that? We'd even be stuck using friendly graphical installers... (shudder)
  • Diversity good. If mutations weren't everywhere, not as many things could survive a changing environment.

    As artificial life research (read: hundreds of underpaid graduate students :o) has shown, while it is true that mutations are an important part of evolution and adaptability, the most efficient tool for optimizing a population is still sexual reproduction (i.e. crossing-over between similar-but-different genotypes).

    We can see a brilliant example of this in the distro problem: distros evolve not only by inventing new things, but also by borrowing from each other. Imagine if Red Hat or Caldera or Debian could prevent others from using this or that package in their distro! Same thing for Gnome/KDE (disclaimer: I use X & Blackbox [alug.org], period :o): they get better and better by inventing new features, but also by borrowing each other's inventions and adding their own little tweaks to them.

    For the benefit of us all :o)

    Thomas Miconi
  • I have to disagree with you there. I've been a big Red Hat user for many years, and the 7.0 release persuaded me to run other distros in concert. At home I have a firewall machine, laptop, Oracle/Apache server, and a desktop. At first I was a little hesitant about having a mix of distros on different machines for "self-support" reasons, but my fears were quickly proved unfounded. I now run Mandrake 7.2 on my firewall, SuSE 7.0 on my laptop, Red Hat 6.2 on my server, and both Red Hat 7.0 and Mandrake 7.2 on my desktop. I've found that they all have their little quirks, but for the most part they are still very much alike. I think the only exception is Debian. I gave it a quick glance, only to determine that while I liked what I saw, I wasn't prepared to go down that road just yet.

    BTW, I definitely wouldn't use Red Hat 7.0 on my Oracle box (tried, not pretty), but it's hardly bullshit. Well, I suppose that's just my opinion and you have yours. C'est la vie.

  • It's not difference for difference's sake, it's differences for choice's sake! We get plenty of choice to run our system the way we want, whether it's to run apt-get or rpm, use BSD printing management or CUPS, or even how many virtual desktops you want. This isn't just a distro specific thing, this is the UNIX philosophy that allows us to string together programs the way we want. The Linux distros are simply an extension of this. It's a matter of choosing the right tool for the right job.

    I don't know if you actually use linux, but I've never had something really break because of some inane thing like a different window manager, screensaver, or font. The things that break things are much like those on Mac or Windows... missing dll's (lib.so's), broken programs and drivers, and misconfigured systems. These things are no different in Mac or Windows, it's just easier to mess up in linux because the user is generally trusted.

    And there are higher rules to the system, but even a system that says "screw that" can still take a piece of software, rearrange it, and redistribute it so that it will run and install just fine on their system. It's choice. You can't do these things most other places. And if diversity and choice doesn't spawn innovation, then I don't know what does.

    "I may not have morals, but I have standards."
  • IMHO, Steven J. Vaughan-Nichols [zdnet.com] should open his eyes.
    Only two distributions left?
    Not even a mention of my favorite [mandrake.com] desktop distribution.

    What about special distributions?
    Heck, there are so many niche markets that even Red Hat has almost a dozen different versions [redhat.com], and that list doesn't include the CheapBytes [cheapbytes.com] or embedded versions.

    Don't get me wrong, Red Hat makes a great distribution, but do you really expect them to fit every market niche?
  • You don't need the command line to install RPMs. Both KDE and Gnome have graphical RPM install tools.


    Most users never need to install drivers.


    I think that, if you are comfortable with the command line, then you will not tend to explore the alternatives. That doesn't mean they are not there.

  • Yes it has. That has nothing to do with my argument. Windows still has a long way to go before it's easy to use as well. When I go six months without anyone asking me for help on Windows then I will concede that it's easy to use.
    And if your company will hire just anyone to admin their systems then they've got real problems in their IT department.
  • You forget that every one of those kernel hackers is sharing their code, thus eliminating duplication of effort.

    I think that right now, there are two major distribution types out there: Red Hat and Debian. All the others are basically based on one of those two, with Slackware being the only real exception. Other than that, you have a bunch of specialty distros like Maragda (which boots off a CD) or uClinux (a small distro for PDAs).

    There is some duplication of effort between Debian and Red Hat (they produce different package manager programs, etc.), but they ultimately base themselves off the same code. The specialty distributions must have lots of duplication, but only because they are specialized and what works for someone else doesn't work for them.

    Diversity is a strength.


    ------

  • by SquadBoy ( 167263 ) on Wednesday March 14, 2001 @05:02AM (#364060) Homepage Journal
    That would be wrong. Let's take a look at the facts.
    1. Most distros use a stock kernel, just with different compile options. Thanks to the GPL, anyone anywhere working on the kernel is helping everyone.
    2. Many distros are derived from RH or Debian, and in the case of Progeny help Debian out while working on their own goals. This is a good thing.
    3. People are going to work on what they think is cool. Even with some of the economic woes going on, if you have the skills to be a Linux developer you are still not going to have any trouble paying the rent. This means that almost every person out there who is paid to write Linux code is doing it because they want to; if they did not want to work on that bit of code, they could very easily go somewhere else, do it there, and get paid. The people who are doing it on their own are going to do what they want, end of story.
    4. Point to a distro that has caused fragmentation in the kernel. There is not a major one with more than a handful of people working on it. Go back to 3 for comments on fragmentation of desktops, packaging systems, etc.
    5. My job would be *much* harder if we only had one distro. For example, I have a client who really wants a phone number to call in case I fall off the face of the earth. I need RH (shudder) for that person. Another wants the smallest, cleanest system I can give him. Debian to the rescue. Yet another needs a very small, braindead-easy-to-manage firewall system. God, I love the guys who work on floppyfw. In short, multiple distros are a great thing for anybody who really understands and works with the code.
  • "However, there needs to be a consistent UI that can be chosen at install that configures the machine a particular way. Users NEED consistent UIs."

    How do people manage to cope with the enormous variety of private telephone systems, then? There might be a case for a consistent UI within an organisation, or within a department. But that is up to the people running the system to configure appropriately.
    It's not as if people even get a consistent UI with Microsoft. Every version of Windows and every version of MS Office changes the UI, even before you start using corporate customisations. (Or as MS calls them, "Resource Kits".)
  • I think the biggest issue comes down to convenient packaging, software installation, and a single consistent interface. If people want Linux to keep growing on the desktop, these three issues are extremely important. The variety offered by different distributions is a great thing, but it can also result in a lot of headaches. Any time you have a lack of standards or competing standards it causes confusion for the end users. That said, I think the Linux community is doing a good job of developing and incorporating standards on what is an extremely fast growing platform.
  • "There are only 5 varieties of Windows, but their installer still confuses most 1-2 year Windows admins I've seen." Except that there are about 4 versions of 95 and 2 versions of 98. How long before different versions of ME and 2000 pop up?

    "Of course, most Windows desktops come pre-installed; if that were the case for Linux, installers wouldn't really be the issue."

    In most corporate settings the first thing you will find done is to use a drive image program to set things up how they should be. OEM preloads are generally a waste of time in many situations.
  • "Unless Susie the secretary installs a distro from '96, she will probably never have to use the command line. Do you need to know how the Windows kernel works to use Word 2000? Also, if Susie has problems with her machine, she calls tech support, who log on remotely and fix it for her."

    Actually, if Suzie the secretary is working in a typical business environment, she won't be doing the install anyway. Instead, the system will be installed and configured by professional sysadmins who will set up /home/suzie so that she'll have access to all of the programs she needs from her GNOME/KDE desktop. I know the admins at my workplace would be right pissed if our secretaries tried doing any serious administrative tasks for their own computers.

    As an aside, I actually wonder why people view secretaries as the perfect example of computer incompetents. My experience has been that they're using their computers for most of the day and eventually become quite adept at doing all of the computer related tasks that are required as part of their everyday job- much more so than the rest of the people around who only use computers occasionally. In fact, I suspect that they're exactly the kind of people who might appreciate the customizability and flexibility of Linux the most. All of the secretaries at my workplace have their desktops customized on Windows (while just about nobody else does), and I strongly suspect that they'd be the people who would have the most fun fiddling with getting just the right window manager and theme.

  • Still sounds easier than Windows to me... (shrug)

    I do not find in orthodox Christianity one redeeming feature.

  • by SnapShot ( 171582 ) on Wednesday March 14, 2001 @06:11AM (#364070)

    Now I'm not a Linux expert, but it seems to me there could be a market for three or so commercial distributions, plus Debian acting as the reference platform for the others. What might we see in a stable Linux marketplace?

    1. A simple Windows migration distribution with two-click setup into KDE or GNOME. This could be the AOL of Linux distributions; the kind of distribution Slashdotters sneer at, but the one that would introduce many people to Linux in the same way that AOL introduced many people to the internet. It's too bad that Corel gave up, since they seemed to be working in that direction.

    2. Two or three competing hobbyist/professional distributions. Each would have the kind of subtle differences that people who know what they are talking about (and those that don't) could argue about for days... This competition would drive the bleeding edge of Linux evolution.

    3. Finally, Debian -- the Amish farmer of Linux -- carefully examining what should and should not be included in the software, probing the political, ethical, and social differences between Free Software and Open Source, asking the question "whether we should", and acting as a standard: the "Microsoft seal of approval" for the Linux world.

    These opinions are from someone who is just starting to enter this world...

  • by jimlintott ( 317783 ) on Wednesday March 14, 2001 @06:12AM (#364071) Homepage

    People often base their perceptions about computing on what they understand. What they understand is overwhelmingly MS Windows. In this world virtually anything that isn't Microsoft is incompatible. They extend this reasoning and assume that multiple distributions mean multiple incompatibilities.

    It is part of the *nix advocate's job to point out that all Unices are functionally compatible with each other. Exchange of information between the different platforms is easy because of the very simple text file standard. Legacy binary compatibility is a non-issue due to source code distribution. When a new architecture arrives, the *nix world just recompiles.

    The number of distros doesn't matter. Underlying architecture doesn't matter. Users matter.

  • I'm a crusty old greenscreener who lived through the Unix fragmentation of the 80's and I fear this is happening all over again with Linux. Ask any ISV (independent software vendor) how much of a nightmare their porting center is if they support more than a couple flavors of Unix.

    The key to widespread Linux acceptance is ISV support. That's right, commercial, closed source products people have been using for years, whether or not there's an open-source "equivalent" (e.g., Photoshop vs. Gimp).

    The big applications have to run on Linux for Linux to penetrate anywhere other than, well, here. And as the whole FrameMaker thing showed us, that's not a foregone conclusion even for applications with Unix origins, much less things that come from the M$ world (e.g., convincing Intuit to port Quicken to Linux).

    ISV support on Linux, over the long term, will be inversely proportional to how difficult it is to develop a distribution-independent application.

    [And I won't even get into whether or not an open-source product can ever avail itself of "closed" resources like the Pantone color database Photoshop uses or Avery's office stationery specs that M$ Office knows about.]
  • This is silly. If you listed every version of every Linux distro (including the embedded ones), along with every patch, you'd get a heck of a lot more than 20 entries.

    And even so, with all these "Windows distributions", there is one vendor, and one place to go for tech support.

  • If Linux didn't have so many different distros with different configuration files, different GUI libraries, different window managers... it would make the job of developing and installing programs on all platforms much easier. How many times do you have to download and install a bunch of extra libraries before you can install the main program you wanted to use? This is one of the things that makes installing applications on MS-Windows so easy: all the basic libraries applications need are already on the system. Linux, on the other hand, has so many different libraries competing with each other that no distro can include every single one, and so users have to go around downloading all these libraries just to install a simple program.

    This brings up another point. With Open Source, libraries shouldn't be fighting against each other. All developers have access to the libraries' code, so they should work together and make one very good library. If they go off and make their own library, then we end up with a bunch of poor libraries that are pretty much the same. Open Source means people should be working together, yet for most applications this isn't true.

  • Not necessarily! I was once a sysadmin and I would say that my job would have been much harder if we only had one distro.

    My company used to use a couple different distros for different things. We used Red Hat 6.2 for our Oracle machines, because it was the best Linux at the time for Oracle. TurboLinux's Cluster server for our web farms, because it was a breeze to set up and manage. We also used Caldera, Red Hat, and Mandrake on desktops based on that user's personal preference.

    The fact of the matter is that there is no such thing as "one size fits all". Windows, Solaris, AIX, and Mac OS X will never be able to solve everyone's IT needs all by themselves. If they could, then they would have already. They each have their strengths and their weaknesses. Linux's strength is that it can very easily be customized/optimized (without vendor approval) to suit _your_ needs. Try to shoehorn WinNT, or even Solaris for that matter, onto a floppy disk to provide the complete operating system for a firewall system.

    One distribution could never have achieved what many have already done.

  • Red Hat managed to pollute the distro pool by adding a bunch of premature 'admin tools' that teach you NOTHING about managing a machine.

    To stretch an analogy, a motor mechanic probably isn't much interested in the driving controls; they want to be able to use their tools easily.
    The problem is that many people don't seem to get that, just as a driver doesn't need to be a mechanic, a computer user doesn't need to be a sysadmin. With the result that we end up with systems like Windows which expect the end user to perform sysadmin tasks (or mix up sysadmin and end user tasks; e.g. changing the screen resolution is an end user task, changing which video card driver is used certainly is not), often at the same time making things more difficult for an actual sysadmin. (E.g. imagine if people had cars with a knob to alter the fuel/air mix on the dash.)
  • That said, I've been baffled by Windows installs when they get... odd. The thing most people forget is that 99% of the time, when you're installing Windows you're doing so on a system that was meant to run Windows. When you try to install Windows on a home-grown system with a blend of old and new parts, it can be a refreshing dream to install Linux on it ;-)

    I've got to agree with you there. I've been very confused on my last couple tries at installing Win98.

    I was thinking that it might be nice to have my home machine dual boot so I tried to install Win98 on an unused partition (vfat formatted even). Turns out that it's impossible to install Win98 anywhere but /dev/hda1. So I just can't do it at all without wiping my hard drive. Needless to say, it's still a Linux only box.

    Then last week, I was upgrading from 95 to 98 on my work machine (I develop Windows apps; I'd love to have Linux at work but it's not practical). After the 6th reboot I got an error message stating that my display was configured incorrectly and asking if I would like to choose the correct driver from the list below. The only thing in the list was "PCI bus".

    I find that to be quite unacceptable, especially for a supposedly user-friendly piece of technology. How can you not properly identify something as basic as a PCI bus in fewer than 6 reboots? And why would you misidentify the problem as a misconfigured display adaptor? And why must Windows assume that it is a monopoly and refuse to take the back seat to any other OS? I'll take Mandrake on my desktop any day. It's much easier to install than Windows and works better once you've done that.
    _____________

  • I think the number of distributions should be very large. However, they should try to standardize (as in this story yesterday [slashdot.org]) as much as they can - reduce the number of arbitrary differences between distros and focus on the meaningful ones.

    My school made its own Linux distribution [umich.edu]. It's little more than a slightly altered Red Hat 6.1. But, for a Linux newbie such as myself, it was great to be able to install a distro with working AFS/Kerberos that was designed to be interoperable with UofM's other systems.

    Lots of customized distros are good. Lots of generalized distros are good too.

  • I see several strong commercial releases that draw from the 'lower', smaller releases. In other words, a two-tiered approach. The commercial versions will appeal to the conservative business mindset, while the smaller distributions allow for innovation and growth. The conservative versions can introduce innovations and advancements after a Darwinian process determines the strengths and weaknesses of those innovations - which open source will then clean up as needed.

    This is far better than Microsoft, which simply swallows the ideas of others and then regurgitates them in a closed (and usually badly reimplemented) form.

  • Having lots of Linux distributions focused at specific target markets is probably good; the more the better.

    The problem is the incompatibility and fragmentation between the distributions. I'm not talking about different kernel or library versions, or even about different package management. I'm talking about different init levels, different init scripts, different paths for add-on software, and totally different layouts. (Maybe even different packaging, but I like different packages; I would only ask that they maybe consider standardizing the metadata, which is something they are quietly talking about already.) A small example of the kind of difference I mean is sketched after this comment.

    I agree with many of the article's points, but I say that he is dead wrong, and companies like Chilliware (or whatever it's called), Conectiva, and others have plenty of room to create specific niche application- or language-focused distributions.

    The problems all stem from lack of standards (again), which makes ALL distributions result in some "roll your own" needs when you start adding stuff you want (like Adobe, Oracle, or any ISV applications) and integrating different systems into the same network. Time will tell, but (again, IMHO, and probably flamebait) it's just because the standards groups are more worried about finding a middle ground through the path of least resistance, rather than doing what they should and laying down a few laws based in logic and substance.
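
    For instance, even something as small as where the init scripts live forces every third-party installer into guesswork. A hedged sketch of the kind of workaround vendors end up shipping (the service name is made up; the two paths are the usual Red Hat and Debian conventions):

        #!/bin/sh
        # pick whichever init-script directory this distro actually uses
        if [ -d /etc/rc.d/init.d ]; then
            INITDIR=/etc/rc.d/init.d    # Red Hat-style layout
        else
            INITDIR=/etc/init.d         # Debian-style layout
        fi
        cp myapp.init "$INITDIR/myapp"  # install the hypothetical service script
        chmod 755 "$INITDIR/myapp"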

  • There is some truth to this, but I was at a small bank the other day and noticed they had small, weak machines running Windows, but they used some kind of proprietary app to do everything in. This took up the whole screen and I don't think that they really even had to deal much with Windows to do their work. So this could easily be done with Linux to save on OS costs, but then again, they couldn't use the computers for personal stuff, so I don't know if this would be a plus or a minus.
  • Yes, you're right, there is a difference, but not as big a difference as you think. I currently have to deal with a server room full of Solaris, HP-UX, and Irix. Of those three operating systems alone, there are no fewer than three disparate versions of each. Across those versions you are going to find different revs, slight differences in commands, missing commands, and even different software install methods, all within the same OS family. Our Unix Admins have to deal with all of this and do so on a daily basis. A couple of Linux distros in the mix would be child's play for these guys. Hell, about half of them run Linux on their work laptops.

    I think my point, which I invariably did a poor job of conveying, is that the minute differences between distributions, especially those within the same family (i.e. Red Hat-based versus Debian-based), are nothing that SysAdmins can't handle. Sure, it would be better from an administrator's perspective if you could have all your machines running the same version of the same operating system, but it isn't always practical. It's a fact of IT life that AppX runs better on SomeOS4 than it does on AnotherOS5, or even SomeOS2 which you decided to standardize on. While a homogeneous infrastructure may get the job done, more often than not it's nothing but a pipe dream that PHBs still cling to in hopes of lowering their beloved TCO. IMHO, the goal should be to use the best operating system available for a given task, whether it's Solaris, Linux, Windows, or Mac.

    Now don't get me wrong, I'm not defending Linux for the sake of defending Linux. In fact, I agree with you completely that there are some serious issues like those you mentioned. However, I don't think there is anything wrong with it that a few standards can't remedy.

  • The "ease-of-use" claim for Windows has never made all that much sense to me. It doesn't hold water for either of the two classes of "users" (as opposed to admins/hackers/developers/hobbyists):

    User Type #1: Suzie the Secretary - she runs 5 programs total and was utterly mortified by the DOS prompt back in the day. She still can't install a piece of shareware if her life depends on it. I've got the perfect setup for her - a drawer in GNOME that contains StarOffice, Netscape, a mail client, XMMS, and GAIM. It's like the Win XP "common tasks" list, except StarOffice won't disappear if I use AbiWord for a couple of days.

    User Type #2: Joe the Tinkerer. Joe's job may not be different from Suzie's, but he's comfortable doing more things with his machine - he has a shareware program that randomly changes his desktop image every day, and he installed Napster for spare-time usage. He could swap a PCI card if he had to. Joe used to be somewhat handy with the DOS prompt. You give Joe a DOS-to-UNIX cheat sheet, explain the fundamental structure of UNIX directories and "./configure", "make", "make install" (the classic routine is sketched at the end of this comment), and in a couple of days he's happy.

    What boggles my mind is the CTO who's willing to pay current MS prices. $199 for an Office upgrade?!?! Multiply that by x licenses (and multiply by 0.yy for volume discounts) and you've just dropped many thousands of dollars on a program with a free alternative that's 100% compatible. Ditto for the operating system. Repeat this process every two years, or until replaced with subscription-ware.

    When the average corporate user needs only Windows and Office, why not use RedHat and StarOffice instead? Doesn't software cost play any factor in these decisions?
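
    For the record, that cheat sheet really is about this short; the classic source-tarball routine, with a made-up package name standing in:

        tar xzf someapp-1.0.tar.gz    # unpack the source
        cd someapp-1.0
        ./configure                   # probe the system and write the Makefiles
        make                          # build
        su -c "make install"          # install, usually under /usr/local
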
  • I think Win2k would be a better workstation solution, unless your company is having financial difficulty. It doesn't "blow up" either.

    BTW, VMWare fully emulates the instability of the emulated OS, so it won't really help in the way you suggest. Win4Lin -- haven't used it, but I doubt it is more stable than running native. (Especially since running native Win2k is just as stable as running native Linux in my experience.)

    ------

  • I found that out the hard way....After I tried.
  • Your views are correct! :-)

    You have to remember that in larger companies or organizations, it's good to try to standardize on system configurations as much as possible. That makes it much easier to do company-wide systems management and upgrades.

    What I find interesting is that many so-called roll your own distributions are actually modified versions of commercial Linux distributions, customized for the local organization's needs. A good example of this is the Linux used at UCLA, which is essentially Red Hat Linux modified to take advantage of the network infrastructure at UCLA.
  • I remember the Unix fragmentation of the 80s too.

    I don't think Linux forking is nearly as bad. Everyone's using the same kernel. Most kernel upgrades don't seem to break most program binaries. The most painful compatibility issue I can recall was when some distros had upgraded to glibc and others had not.

    You can often use binary RPMs from one distro with another (although following the dependency tree to ensure all the necessary libraries are in place may get painful; a quick way to check is sketched below). Things will get even better as the LSB standards emerge. You shouldn't need to turn your code into a rat's nest of #ifdefs like you did back in the 80s.
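
    A hedged example of scoping that out before installing a foreign package (the file name is invented):

        # list what a not-yet-installed package claims to require
        rpm -qpR someapp-1.0-1.i386.rpm
        # dry-run the install to see which requirements are actually unmet
        rpm -ivh --test someapp-1.0-1.i386.rpm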

  • We would like to think Linux would overtake MS as the most used OS, but the fact remains that most of the people jumping online and working on the "typical" PC-based application only need simple functions out of their PC, and having to gcc -o something something.c or ./configure --with-some-new-package ; make ; make install is just not going to cut it.

    Do you really have to compile your own stuff these days??? Sure, some crazy hackers (like me!) compile everything from scratch, but it's hardly necessary. Run Debian? It's by far easier to install a program on Debian than on Windows: `apt-get install `. On Windows, you have to click the program, click through the license agreement written by Nazi war criminal lawyers, decide where to install it, sit there waiting for it to install (and hope it doesn't blue screen in the process), and then you're done.

    Then what do you do when you want to get rid of a program on Windows? *shudder* No, the little uninstall icon is not enough, as it often leaves cruft in the system (this is why it's a good idea to do a clean reinstall of Windows a lot, especially if you install and uninstall a lot of stuff). To *really* get rid of a program on Windows, you have to go digging through the registry, and the \windows directory, and the \windows\system directory, and the \windows\system32 directory, and the \Program Files directory, and the . . . well, you get the point.

    Want to get rid of that program on Debian? `apt-get remove `
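
    For the sceptics, the whole lifecycle on Debian looks roughly like this (gnumeric is just an example package):

        apt-get install gnumeric          # fetch it plus anything it depends on
        apt-get remove gnumeric           # remove the program, keep its config files
        apt-get --purge remove gnumeric   # remove the config files too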


    ------

  • I think that multiple distros are actually hurting Linux. If there were only two or three, each would be constantly striving to best the others, and a staunch competition would ensue that would improve the product of each. But as it stands right now, there are literally dozens of distros, each with its own "specialty", if you will.

    Except that what is "best" depends greatly on what the distribution is wanted for. A distribution oriented around trying to emulate the look and feel of Windows might be fine for the home user but an utter pain for a network administrator.
    Too few distributions could easily result in distributions which are less than useful to many potential users.
  • I think this is both true and untrue. Assuming Linux is widely adopted in some markets, I think we'll end up with more distros than that for commercial use. I don't think the Linux market will look like the auto market, with a Big Three of distros or something like that. The Linux market will look more like the OEM market, with a couple of very large vendors, dozens of second-tier vendors, and more little shops than you can count.

    Why? The cost to enter the Linux market is pretty low - you can take somebody else's GPL software, modify it a bit, and make your own distribution at relatively low cost. Then you charge less for support and try to undercut them. Like the OEM market, the Linux market is based completely on an open standard - there are no proprietary secrets needed to bust in, and everything you need is readily available. The difference will be in levels of support. I have no hope that my no-name OEM will answer the phone if I call - they might even be out of business by tomorrow, but the price was right. RedHat and other first-tier distributions will offer any level of support you care to buy. Other commercial distros will limit support to a minimum and keep it cheap.
  • by cyber-vandal ( 148830 ) on Wednesday March 14, 2001 @05:16AM (#364118) Homepage
    "Susie the secretary will not understand *Nix vs. point and click."

    Unless Susie the secretary installs a distro from '96, she will probably never have to use the command line. Do you need to know how the Windows kernel works to use Word 2000? Also, if Susie has problems with her machine, she calls tech support, who log on remotely and fix it for her.
    As for 'which one is better', I suppose having several choices of server company is also a bad thing. How are you supposed to know who's the best?
    How about anti-virus software, groupware, fault-logging software? Is the plethora of choices also a bad thing?
    I'm so tired of bullshit arguments like this. Linux is no harder to use than Windows. I have to help my family out on a regular basis because Windows plays up on them, and unlike Linux there is no way for me to find out why.
    If Windows is so easy to use, then obviously no tech support should be required either.
    Linux nowadays needs as much command line intervention as Windows does, which is to say occasionally, usually when network information is required.
    If you can name me a situation when command line is the only option I'll be impressed.

  • Please forgive me for playing the devil's advocate here, and I totally agree with most of your points; hell, I've been using BSDs for some time now and Windows is completely banned (as I jokingly put it) from my home.

    "This is why sys admins exist and why we are paid rather well."
    Take a quick look at the sagging markets, and truthfully ask yourself if things will continue to flourish for sysadmins who get paid well. I'm in the sysadmin/security field and get paid well, but when it comes down to the nitty gritty, I often wonder whether it's really necessary, when by using simple products most people can be taught to fix things on their own, leaving a sysadmin out in the cold.

    "What about Linux with X up and a well configured WM is *not* point and click. My wife can do it. I have given it to many secretaries all of whom with ~1 hour of training can do it."
    What about when X has a huge gaping security hole, or she gets an error like "connection refused"? Should companies spend their money waiting for admins to fix problems that anyone with simple, relevant skills could handle, eating minutes' worth of unproductive downtime each time?

    "With M$ it has been the case that people have had to play amateur sys admin either because there was no good way to stop them from doing it or there was no really good way to do it for them."
    Uh, yes, MS' Windows NT has perms just like Unix-based systems, and sure, people are going to want to modify their own systems since they are the ones using them. Now take a *nix-based system, and by chance you get a curious user who asks "what does rm do?" on a live machine. Well, I hope the admin took the time to ensure everything was in the proper groups.

    "With *nix I can use SSH to admin boxen on the other side of town and not get out of my chair."
    If you haven't noticed, Windows has remote administration tools, including ssh, now. It can easily be modified remotely.

    Anyway, I don't want to sound at all like an MS advocate, or even a Linux advocate; I just wanted to point out instances which are simple for us geeks, but difficult in the wider, simpler world.
  • If I remember from a comment on an older Slashdot article, the Linux given to any interested UCLA student for some time was essentially Red Hat Linux 5.0 with a pre-canned configuration so the user could quickly log onto the UCLA computer network.

    Mind you, Mandrake Linux is in many ways derived from Red Hat Linux. :)

    I think for neophyte Linux users they should be using either Mandrake 7.2 or Caldera OpenLinux 2.4.
  • SnapShot,

    If I could moderate this message thread, I'd mod your message up at least two levels. :)

    Your assessments are completely correct in regards to Linux. I think Caldera OpenLinux 2.4 and Mandrake 7.2 make excellent Linux distributions for first-time Linux users.

    The professional distributions for corporate use should be Red Hat, TurboLinux and S.u.S.E.

    And Debian should be the base reference for Linux distributions.
  • Three reasons why multiple distributions are better than a few.

    Direction

    If you've ever worked on a team of equals (or close to it) you will have noticed that it is often very difficult to agree on a common direction. Half the group may want to work on X while the other half wants to work on Y. When this happens, project leaders have to get their developers to compromise more than they collaborate. Force the issue and you'll end up with a product that fails to meet either goal.

    I think this philosophy is best illustrated by the recent "fork" of the Samba project. A contingent of developers decided that they really wanted to try some new avenues - avenues that didn't necessarily coincide with the short-term vision of the main development effort. Rather than try to compromise, the project leads thought it a better idea to split their forces and hopefully meet again somewhere down the road. While it's true that this decreased the number of eyes on the "core" Samba effort, it may very well lead to more quantum leaps in the future.

    The same holds true for Linux because the code is open. If Mandrake sees that SuSE has made some great strides in a given area, they can incorporate those changes and vice versa. This in effect puts them in the same development community, but allows them the freedom to explore those areas that interest them.

    Competition

    Competition fuels innovation. Red Hat, Mandrake, SuSE, TurboLinux, and Caldera are all trying to make a buck on Linux, a market without a lot of bucks to go around (at least at the moment). If they want to sell more boxes they've got to demonstrate why their Linux is better than the next guy's. Competition is a huge motivator and one that diminishes with fewer players.

    Choice

    Choice isn't really a reason all on its own; in fact it greatly depends on the other two. If they don't exist then we won't have many choices at all. Limit choice, and the next thing you know a young college student is writing his own kernel for his own enjoyment and educational purposes (which is a good thing, by the way, because it reintroduces choice).

  • Somebody PLEASE give this person karma!
    "If you can name me a situation when command line is the only option I'll be impressed."

    You're kidding, right? I can't imagine going through a day without the command line in Linux, even if I tried.

    You want examples? Well, with most Linux programs, you need to use the command line to build and install them. In some cases (ok, a lot of cases) you might be lucky enough to find an RPM. Well, so what? You install the RPM, and then what? How do you start the program? (A sketch of the sort of digging involved follows at the end of this comment.)

    This is a problem my cousin was having. He complained that he'd try to install an RPM, and it would claim to have succeeded, but then the program wasn't installed. In fact, the program simply wasn't showing up in his start menu (or whatever you call the GNOME equivalent -- the foot menu if you will). When I heard him say this, I was shocked. I said, "You mean you don't know how to use the command line?" I then explained to him that you can do very little without the CLI in Linux, and I taught him how to use it.
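
    For what it's worth, the trick I ended up showing him was just to ask rpm itself where it put things. A minimal sketch, assuming a hypothetical package called foo:

        # list the files the installed package dropped on the system
        rpm -ql foo
        # or, if you only have the .rpm file you downloaded
        rpm -qpl foo-1.0-1.i386.rpm
        # the binaries usually land in /usr/bin or /usr/local/bin,
        # so you start the program just by typing its name
        foo &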

    Another example -- drivers. How do you install new drivers in Linux? Well, frankly, every freaking driver is different. My sound driver is a kernel module. My video driver is a combo kernel module / X driver. Both of these had to be installed from the command line, and in very different ways. On Windows, OTOH, you have the convenient hardware manager, which is a nice, consistent interface for any kind of driver installations and updates. I don't have to read the readme when I download a Windows driver -- I just go to the hardware manager, click the hardware I want to update, and update the driver.
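
    To make the sound-driver example concrete, here is roughly what loading my sound module looks like -- the module name and card settings are just placeholders, your hardware will differ:

        # load the OSS SoundBlaster module by hand for this session
        modprobe sb io=0x220 irq=5 dma=1
        # to make it stick across reboots, tell modutils about it
        # by adding these lines to /etc/modules.conf (2.4-era systems):
        alias sound-slot-0 sb
        options sb io=0x220 irq=5 dma=1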

    I have been using Linux for two and a half years, BTW. It has been my primary OS for most of that time. I am running on a custom-compiled 2.4.2 kernel under Debian/unstable right now -- I know how Linux works. But the simple fact is that Linux is not (yet) as easy to use as Windows. I eagerly await the day when it is, but there is just so much work that needs to be done before that happens. I'm sorry, but it's the truth.

    ------

  • But these aren't applicable examples. When was the last time Susie the secretary installed a driver on her Windows machine at work? I would be willing to bet that Susie has no clue what a "driver" is.

    Susie wouldn't be installing new software on her machine at work, either; she would expect everything that she needs to be there, and if it's not, she'd call for help to get it installed. Even in Windows, the sys admins would be doing these things, and the users could go along happily without the command line.

    How about some examples that would affect Susie?

  • Hmm. I've had plenty of problems with Windows and buggy video drivers, not to mention drivers for other hardware devices. Susie may not know what to do with those error messages, but at least they're there so that the tech support guys can help her.

    What's she to do in Windows when she sets the video driver to a resolution that her monitor can't handle and all she gets is trash across the screen? (And yes, I've seen it happen.)

    All the Linux installations I've done lately don't require you to type startx. They set up the system so that it boots in the graphic runlevel. You never see the command line unless you open up a term (which I do, first thing).
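
    For the curious, the "graphic runlevel" bit usually comes down to a single line; on Red Hat-style distros it lives in /etc/inittab (runlevel numbers differ elsewhere, so take this as a sketch, not gospel):

        # /etc/inittab -- runlevel 5 is the graphical login on Red Hat-style systems
        id:5:initdefault: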

    No, Linux isn't perfect yet. ;) But it's not as far behind Windows as you imply.

    --

  • This is why sys admins exist and why we are paid rather well. What about Linux with X up and a well-configured WM is *not* point and click? My wife can do it. I have given it to many secretaries, all of whom can do it with ~1 hour of training. Hell, my 5-year-old son can navigate E + GNOME. While it is not possible to admin a Linux system without the command line, it is very possible to use one without it.

    The simple answer to too much choice (which is what your other two points come down to) is to have a good admin test, demonstrate and explain. Then make a decision based on your needs and wants and stick with it. Most arguments I have heard of this type stem from a confusion between using a system and admining a system. With M$ it has been the case that people have had to play amateur sys admin, either because there was no good way to stop them from doing it or because there was no really good way to do it for them.

    With *nix I can use SSH to admin boxen on the other side of town and not get out of my chair (see the sketch just below). Thus we can once again have a sharp division between admins and users. And then Susie the secretary can sit down at her machine and do her work instead of having to worry about doing admin duties on her machine. This is a good thing.
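
    By "admin boxen on the other side of town" I mean nothing fancier than this sort of thing (the hostnames are made up, and the init-script path is Red Hat-style):

        # restart the mail daemon on a box across town without leaving my chair
        ssh root@mailhost.example.com '/etc/rc.d/init.d/sendmail restart'
        # or pull a log file back to look at locally
        scp root@mailhost.example.com:/var/log/maillog /tmp/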
  • by Masem ( 1171 ) on Wednesday March 14, 2001 @05:16AM (#364138)
    Given all the available distributions, I can think of only two things that are important in how the various distributions vary: package management method and default security installation. The former is probably the biggest one, distinguishing Slackware from Redhat/Mandrake/etc. from Debian, and is probably where people have the most 'religious' conviction about their preferred distro. The latter is basically a more 'personal' touch, using one's opinion on what is 'secure' to set up how the box is initially installed. Given that this can vary from person to person, there is definitely potential for an infinite number of Linux distros out there; and unless apt and rpm combine into one tool, there will never be a consolidation of distros.
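
    To make the "package management method" point concrete, the day-to-day difference really is just a couple of commands (the package name here is hypothetical):

        # Debian: apt-get fetches the package plus anything it depends on
        apt-get install foo
        # Red Hat/Mandrake/etc.: rpm installs the file you hand it,
        # but leaves chasing down dependencies to you
        rpm -Uvh foo-1.0-1.i386.rpm
        # Slackware: plain tarball-style packages
        installpkg foo-1.0.tgz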
  • by flipper9 ( 109877 ) on Wednesday March 14, 2001 @05:31AM (#364142)
    Just think of the many distributions of Windows out there in current use...
    Windows 95
    Windows 95 OSR1
    Windows 95 OSR2
    Windows 95 OSR3
    Windows 98
    Windows 98 SE
    Windows ME
    Windows NT 4.0
    Windows NT 4.0 SP1...Windows NT 4.0 SP6
    Windows NT 4.0 Server
    Windows NT 4.0 Server SP1...Windows NT 4.0 Server SP6
    Windows CE 1.0
    Windows CE 2.0
    Windows CE 3.0
    Windows 2000 Professional
    Windows 2000 Server
    Windows 2000 Advanced Server
    Windows 2000 Datacenter Server
    Windows.NET

    Sounds fragmented to me!
  • I don't think that multiple Linux distros are beneficial, because instead of having everyone contribute to the same project, they are contributing to their own projects which just happen to have a kernel in common. You have said that multiple distros are in accordance with the principle of "everyone puts a piece in", but I disagree -- everyone is contributing pieces, but to different projects.
  • Basically I agree with everyone else here that diversity is a good thing. However, I do not see the point in difference for difference's sake. When stuff starts to break just because you're using a different distribution, window manager, screen saver or font, It Just Ain't Worth It Anymore. In such situations something is missing, and if you can't conform and agree on the current level, there should be guidelines so that everything can conform on the meta-level. Higher rules to rule different distros could be a way to take the OS to a whole new and (damnit) innovative level.

    - Steeltoe
  • I totally agree.

    I'd certainly like to see some _meaningful_ standards in the Linux space. It bugs me that some of the differences between distros serve no real purpose at all.
  • by bonzoesc ( 155812 ) on Wednesday March 14, 2001 @04:33AM (#364150) Homepage
    This type of story will probably start a flamewar, but here's my take:
    Diversity good. If mutations weren't everywhere, not as many things could survive a changing environment. If Critter A makes millions of subtly different copies of itself, while every instance of Critter B is identical, and a large disaster comes along that only rare mutations of Critters can survive, Critter A will be the only one to carry on its genes. Critter B, due to its lack of mutations, would not live to see another day.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • Why do people here sound apologetic about choosing a desktop environment?

    It's your choice and nobody else's. If you want or need to make the info public, just do it and wait for the responses: "you should check out the new version of mine. It's better" or "way to go. You have good taste, just like me."

  • Ok, so why not add Piranha to Red Hat then, instead of a new distro entirely? Even better, why not just develop Apache to use mod_proxy and make it a non-Linux-specific solution?

    Because at the time Piranha was still in its infancy, and Red Hat didn't introduce the HA server for at least another couple of months. By that time I wasn't there, and I don't know if they ever switched. Furthermore, the Cluster Server from Turbo was very reliable, quick, and easy to administer.

    What's the benefit or why does it matter if Linux fits on a floppy for a firewall?

    For one, it's small and highly portable. This can be a very useful tool, but more specifically it was merely an example of the flexibility of Linux. There are a couple of distros out there that fit onto a floppy disk. I can't remember the names, but we had a couple tailored for a number of tasks, most of which were diagnostic in nature. Don't read too much into this; it's only an example.

    Let's say the environment has grown to 200-300 Linux boxes strong and each one is totally different. How does having a machine that's nothing like the rest help make the admin's job any easier at 2:00am when it goes down? Or what if a new person walks in the door to take over the environment and has to figure out wtf is going on and how each one works? The learning curve goes waaay up.

    First of all, 3 to 4 different distros is a long way from 200-300 in a server room. Secondly, anyone fool enough to install 200-300 "completely" different operating systems is insane. Furthermore, the differences between Mandrake, Red Hat, Caldera, and TurboLinux are minute. So much so, that an admin who can administer one but not the other is obviously an idiot. In fact, I would rather admin a server room with five different "specialized" distros of Linux than I would one with a mix of AIX, Solaris, HP-UX, Irix, and Digital Unix.

    Not trying to start a war, but it's the customization and optimization that makes it harder to administer, in my opinion. It makes it an elegant hack job of bubble gum and duct tape.

    Think of it this way. Even if a company standardizes on one platform (I'll use Solaris for example) there are going to be differences, possibly even significant ones, from one system to the next. My Solaris boxes for Oracle are going to have different kernel parameters for larger shared memory. My web servers are going to have optimizations for iPlanet or some other web server. This is going to continue right down the line from mail servers to NFS servers to NIS servers all the way down to the workstations. So even though I've theoretically standardized on Solaris, my admins are going to have to approach each box differently, by role that is.

    You wouldn't use a framing hammer to tack up a picture and you wouldn't use a tack hammer to frame a house. Carpenters don't use just one hammer and SysAdmins shouldn't limit themselves to a so-called "one size fits all" solution. Hell, even NT has optimizations for specific roles.

  • by S1mon_Jester ( 223331 ) on Wednesday March 14, 2001 @04:38AM (#364159)
    installations - not customization.

    Commercial companies (and single developers) want their packages to be installed easily and correctly.

    I doubt if anyone cares if you use Debian's package format or RPM, but the installation routine must be the same regardless of distribution, and it must be easy enough that anyone can do it. (Furthermore, it should be uninstallable.)

    ./configure; make config; make is way too difficult for most folks.

    If I was smart, I'd suggest a database for each system that would tell the package what to install where.
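
    (For what it's worth, rpm already keeps a per-system database in roughly that spirit, though it records what is installed rather than dictating where new things should go. A couple of quick queries, with made-up names:)

        # which package owns this file?
        rpm -qf /usr/bin/foo
        # everything the system thinks it has installed
        rpm -qa | less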

  • by micromoog ( 206608 ) on Wednesday March 14, 2001 @04:38AM (#364161)
    I think the people the author is responding to were suggesting the Linux field will be narrowed for business. Sure, there will always be 500 separate distributions out there, but businesses really like standards. Two or three distributions will prevail in a big way for business, and the other 497 will still be around.
  • You've got to be kidding me if you think that this is OS fragmentation like Unix or Linux distributions.

    The vast majority of apps work perfectly well and are generally upwards compatible - especially on Windows 95, 98, 98SE, and the absolutely horrible ME. Lots of apps even work perfectly well on Windows NT and Windows 2000. While not perfect, the Microsoft camp has got binary compatibility between "distributions" down. Try that on the Linux side and the situation gets a lot muddier. I first need to worry about having the right glibc or libc library if I want binary RPMs. Ever try to install newer RPMs onto an older version of Mandrake or Red Hat? Secondly, things change in the Linux world. Try installing a new graphics driver or sound driver - now you might need to match the kernel version if only a binary is available. Plus, you have to worry about different processors; believe it or not, a friend of mine downloaded an RPM for the Alpha thinking it was an alpha version of XFree86 with the required driver.

    How do you do most of the administration? Linux systems are different enough that administrators can have a lot of difficulty with the various sysadmin tasks. Do I use DrakConf, linuxconf, or edit the files by hand? Even if I use linuxconf, not all options can be set correctly, so you have to go back to text editing. Try setting up dhcpcd using Linuxconf alone on LM 7.2 -- you need to add DHCP_HOSTNAME to the ifcfg script. Or how about changing video drivers: try installing the Nvidia drivers and compare that to the Windows method. Do I use Xconfigurator, DrakConf or sax? Use sndconfig or Draksound for your soundcard - works sometimes, but I always resort to hand editing.
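
    For anyone hitting the same dhcpcd problem, the workaround is a one-line addition to the interface config -- the interface name and hostname below are just examples:

        # /etc/sysconfig/network-scripts/ifcfg-eth0
        DEVICE=eth0
        BOOTPROTO=dhcp
        ONBOOT=yes
        # the line Linuxconf won't add for you:
        DHCP_HOSTNAME=mybox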

    Ask yourself how many times you've recompiled the kernel, or know someone who did, in order to get something to work: DVD support, CD-RW support, NTFS support, etc. How many people recompile Windows to add that kind of support?
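
    And "recompile the kernel" is itself a small ritual -- roughly the following on a 2.4 kernel, assuming LILO and an i386 box (details vary, so check your distro's docs):

        cd /usr/src/linux
        make menuconfig        # turn on DVD, CD-RW, NTFS support, etc.
        make dep               # still required on 2.4
        make bzImage
        make modules
        make modules_install
        cp arch/i386/boot/bzImage /boot/vmlinuz-custom
        # add a lilo.conf entry for the new image, then re-run lilo
        lilo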

    The Windows way may be buggy sometimes and can cause headaches trying to fix it, but it is consistent. The Linux way is a complete jumble of different methods and interfaces, but if you edit it properly the problem is fixed and fixed for good.

  • There was an article on informationweek.com [informationweek.com] about this last week. The article [informationweek.com] talks about all the different flavors of Linux, and how great it is to have something which you can make so unique to you with so little effort, but how bad this can be for business. The main point of the article is that, without standards, Linux isn't a viable platform for businesses.

    It's an interesting read; check it out.

    [Note: Informationweek updates a lot, so the article may get flushed to an archive soon. I suggest someone paste it below if they find it as useful as I did.]


    ---
  • It increases competition, brings down costs, increases availability, etc... The highly competitive PC clone market of the late 80s is what gave Microsoft its current advantage. They were smart (and lucky) and rode the wave.

    Linux moves fast because it does for the OS what the IBM PC platform did for the hardware. I don't mind if the masses continue to use Windows. Windows has to improve more because of Linux, too. It's a win-win situation for the masses all around, and with the steady improvement of Linux/KDE/GNOME/ReiserFS/etc. (and the distros making it available in whatever fashion is best for their target audience), many newbie complaints will disappear with time.

    Maybe I'm just optimistic, but I think it's a fair analysis. Competition almost always helps things out.

    - Tom

  • Oh but learning to install service packs, how to use regedit to keep it stable, how to rescue self-corrupted office files and reinstall the whole mess when it finally dies... that's easy enough? But ./configure && make && su -c 'make install' is too hard?

    I do not find in orthodox Christianity one redeeming feature.

  • I think the people the author is responding to were suggesting the Linux field will be narrowed for business. Sure, there will always be 500 separate distributions out there, but businesses really like standards.

    You may be confusing that with a business wanting to standardise within their own business.
    The other would only make much sense were there a standard entity called a "business".
  • I have mixed views on the number of distributions, but when it comes down to it, most of the reasons I can think of why one would like to see fewer distributions out there are all based around FUD.

    I may think a friend who uses a GUI different from the one I use is a little weird, but when it comes down to it, any thoughts I have concerning its inferiority are because I don't use it and am not used to it, rather than based on what I actually know. Sure, it doesn't do some things as well as what I use, but there are certainly other things it does better.

    All distributions have their good points and their bad points; if you don't like them, don't use them.

    When I go to the supermarket I don't complain about the huge choice of washing detergent on sale, and how difficult it is for me to choose the right one, even though I really know nothing about washing powder. If I find myself suddenly not liking one, I try something else.

    The big difference between Linux and more commercial things like washing powder and cars is that the openness of Linux means that improvements made by one particular distributor are often made available to others. The friendly competition between KDE and GNOME should be considered a good thing.

    The one area where the variation can cause hassle is when releasing binaries. Having to produce 6 or more different binaries for different distributions, or even versions of the same distribution, can be a pain. But even then there are usually people who use distributions I don't that are prepared to help out, and in the long term this probably just encourages better coding techniques to allow for the necessary portability.

    People are always going to be ardent supporters of one version of a product, but when it comes down to it, you only know how good your product is when there are others to compare it to.

  • Windows doesn't have a diversity problem at all

    If I had mod points this would be +1 Funny.

    Windows currently has two desktop versions (ME & W2K Pro), two embedded versions (CE & Embedded NT) and three server versions (W2K [Advanced|Datacenter] Server). As far as I know, only the server versions actually run all of each other's binaries.
    --
    #include "stdio.h"
  • I concur that mutations are a good thing for Linux. However, there needs to be a consistent UI that can be chosen at install time and that configures the machine a particular way. Users NEED consistent UIs. What we need is a way so that, on most popular distributions, you can check a box during install that will configure the machine with a commonly agreed-upon GUI and window manager that people are familiar with. This wouldn't be necessary on all distributions, but it would be a great start toward making the Linux GUI similar on all ends for end users who simply want a consistent interface.
  • by Syberghost ( 10557 ) <.syberghost. .at. .syberghost.com.> on Wednesday March 14, 2001 @09:19AM (#364181)
    In other words, the Anna Kournikova virus could come along and wipe out Critter B.

    -

  • Recently I was in a discussion with some older CTO-type people and the arguments of Linux vs. MS came into play. Attempting to explain why (I thought) Linux can make some steps in the future to be where Microsoft is, I brought up the different distros, their functions, benefits over MS, etc. One of the gentlemen made some very sharp points, though, which no one can really refute; sure, Linux is better at certain aspects of computing vs. other OS', but here are some of the failures that came into the picture, for which I could not think up a good rebuttal.

    Typical desktop workplace environment
    Susie the secretary will not understand *Nix vs. point and click.

    Too many variations of Linux
    Which one is really better as they all claim one or two niches over the other.

    Gnome vs. KDE vs. etc.
    Why so many desktop environments? Sure, alternatives are good, but when work needs to be done, money is burnt by time spent figuring out what's what on Linux vs. point-and-click MS.

    And this went on for hours. We would like to think Linux would overtake MS as the most used OS, but the fact remains, most of the people jumping online, and working on the "typical" PC based application solely need simple functions out of their pc, and them having to gcc -o something something.c or ./configure --with-some-new-package ; make ; make install is just not going to cut it.

    "The Big Breach" [antioffline.com]

  • I think that multiple distros are actually hurting Linux. If there were only two or three, each would be constantly striving to best the others, and a staunch competition would ensue that would improve the product of each, but as it stands right now there are literally dozens of distros, each with its own "specialty", if you will. This cannot possibly be good for the advancement of Linux.

    Allow me to explain: There are a fixed number of kernel hackers in the world. Granted, more and more people are switching to Linux each year, but at any given moment there are a limited, fixed number of people who write code for Linux. Dividing these people up between so many distros only serves to impede the process of advancement. Let's say there are 10,000 kernel hackers out there (the number is probably much larger, but hey, it's an example). If these hackers are distributed among 20 separate entities working toward 20 separate goals, each goal will be served less well than if they were all working in unison for a common cause. This is why fragmentation and code forks can only hurt Linux as a viable MS alternative.

  • It's one of the nightmares of MS Windows that application distributors package the system libraries with their programs. You may honestly believe that all the libraries are already there, but installers like InstallShield (TM) actually grab all the dependencies and include them in the program installer file. How many times have people had an old version of DirectX overwrite their newer one because they wanted to play an old game that shipped with D-X 3.0? By shipping the libraries separately, Linux avoids these issues. By using good tools, like gnorpm (some day), you can have those dependencies updated at the same time as the program you're installing. It's not unreasonable for an application (like Mozilla) to ship on CD with several other RPMs (like glibc 2.1, etc.) included, which get installed if needed.
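
    The "good tools" point is mostly about dependency bookkeeping; even without gnorpm you can at least see up front what an RPM is going to demand (the file name here is hypothetical):

        # list what this package file requires before you install it
        rpm -qpR mozilla-0.8-1.i386.rpm
        # and check whether, say, the glibc it wants is already on the box
        rpm -q glibc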
  • I'm talking about corporate users. I, personally, would use the command line a lot, but there would be no requirement to do so. Set up the corporate desktop with the required start menus, host it remotely on an X server and have Win4Lin or VMWare for any apps that only run on Windows. Support nightmare is diminished due to Linux not blowing up incomprehensibly and the only difference the user might notice is that the fonts are uglier.
  • This is very similar to the complaints I've heard from other Windows-favorable journalists. I'm sure you've also heard about the common complaint that Linux needs a consistent desktop and all vendors should use that one.

    Linux is not about consistency, linux is about choices and customization. About picking your favorite distribution, your favorite window manager, your favorite theme within that window manager.

    If the skeptics are right - that linux must be a single distribution with a single interface to become the operating system of the "masses", then I hope it never happens.

    Honestly, I like linux for what it is - diversity, choices and customization. After all, isn't that why we have the source code in the first place? So if we, as users, think an application or feature could be made to better fit us - then we can make that change?

    If it takes sacrificing diversity to appease the masses, I don't want the masses. This is just my honest opinion. Even if the masses don't adopt linux, it will still be a success - and I will still use it.
    --
    Twivel

  • I was just generally using mutations as a blanket term, because it was early in the morning and I couldn't think of complex things like combining different genotypes.

    One thing I failed to address was the fact that some Critters need to die off as a result of the evolution of the whole.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • What? Pay Microsoft yet again for something that doesn't work properly? I've got it on my desktop, a PIII-500 Compaq with stock parts. It took 3 hours to get it on there because it would fail silently after trying to load loads of unnecessary drivers. I started browsing the web, and after 5 minutes it rebooted automatically. Buggy drivers? Possibly, although I doubt I'll ever know. Linux has yet to do this to me, although I'll admit to a crash or 2 due to me not knowing what I was doing with X (and therefore I could sort it out -- unlike Windows).
  • by grovertime ( 237798 ) on Wednesday March 14, 2001 @04:45AM (#364207) Homepage
    The idea that Linux could be too widely distributed (by that I mean by too many competitors) is not only ridiculous but appears to be ignorant. If Linux is to continue resembling and possibly leading the open-source generation of programming, it is not just important but ESSENTIAL that many distributions co-exist. Sure, there will be bottom-feeders and top-dogs, but this is a quality-not-quantity situation where we all just want the bugs worked out. The prediction that we will be down to 2 distros is not just ludicrous, it's damaging, ignorant and wholly implausible.

    1. what the? [mikegallay.com]
  • Oracle doesn't work with Red Hat Linux 7, as it does wacky things glibc 2.2 doesn't like - they're relying on undefined behaviour.
