Linux Business

Linux Vendors to Standardize on Single Distribution 497

Jon James writes "eWeek is reporting that a number of Linux vendors will announce on Thursday that they have agreed to standardize on a single Linux distribution to try to take on Red Hat's dominance in the industry." The vendors in question are SuSE, Caldera, Conectiva, and Turbolinux. However, as the article also points out, Red Hat has a very well established lead in the corporate market, and Sun's decision to create Yet Another Linux Distribution (Sun Linux! Now With McNealy Vision!) will muddy the waters even further.
This discussion has been archived. No new comments can be posted.

  • by Brown Line ( 542536 ) on Wednesday May 29, 2002 @08:25AM (#3601220)
    This reminds me of the time in the late 1980s when hardware manufacturers tried to unify UNIX. They just made matters worse. Fragmentation is a sign of a healthy market, after all: if we wanted unity, we could all just bare our throats to Microsoft.
  • by MichaeLuke ( 50412 ) on Wednesday May 29, 2002 @08:36AM (#3601263)
    If they're going to standardise on one distribution, why don't they standardise on Red Hat? No, seriously.
  • by justsomebody ( 525308 ) on Wednesday May 29, 2002 @08:39AM (#3601280) Journal
    Actually not, they are just adapting to a new form of existence. Targeting geeks was simple: every geek has his own needs and would like his own distribution.

    Targeting the masses requires being more organized and more uniform. This way Linux development actually speeds up, which is one of the main points of this merger.

    It means setting one standard and spreading the work across a few companies that each had to do all of it until now. Speed increases, uniformity improves, and most importantly, there is a higher level of organisation.
  • Re:Gentoo (Score:1, Interesting)

    by Anonymous Coward on Wednesday May 29, 2002 @08:48AM (#3601327)
    When Gentoo first started making waves, my ears pricked up, I got interested, and I subscribed to the Gentoo mailing lists. It didn't exactly fill me with a great desire to rush out and install it, because nobody seems to actually /do/ any real work with it; people just post messages everywhere saying "Gentoo r0ckz".
  • by CynicTheHedgehog ( 261139 ) on Wednesday May 29, 2002 @08:52AM (#3601342) Homepage
    SuSE already does. And as a SuSE user, I'm curious to see what exactly they do mean. To me SuSE is way ahead of the other distributions mentioned, in terms of exposure, partnership, market share, and functionality. Just what do they stand to gain by unifying their distribution with the others?

    But you could be right. This might be a move to 1) provide strong industry support for LSB and/or 2) put pressure on RedHat to adhere to standards. Either way it's a good thing.
  • Re:To be honest... (Score:5, Interesting)

    by jilles ( 20976 ) on Wednesday May 29, 2002 @09:03AM (#3601373) Homepage
    The one reason I have not yet downloaded this is recompilation. I mean, compiling is pretty much a deterministic activity. Given similar compiler settings you'd expect the result to be the same each time. Apart from being deterministic it is also time consuming. Just compiling a pretty bare-bones installation with GNOME, KDE, OpenOffice and Mozilla would likely take me weeks, by which time most of the packages would need recompilation because of updates!!! I'm all in favour of optionally compiling a few key things, but I'm even more in favour of using pre-packaged binaries. Most of us would probably go for the i686-type code, so create binaries for all popular variants of x86 and distribute those (and maybe also other processors).

    An alternative, admittedly far-fetched, idea would (imagination going berserk here) be P2P compilation. Compilation can be distributed over computers, and there is likely a small subset of all possible compiler settings that is most frequently used. Simply cache the results of such compilations and, given a match in source-code version, processor architecture, and compiler settings, reuse the result (and offer the replicated binary for download). If there is no match, compile yourself and offer the result. This should quickly eliminate redundant compilations and offer most of the advantages of compiling everything yourself.
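The cache-on-match scheme described above can be sketched in a few lines: a result is reusable exactly when the source version, target architecture, and (order-normalised) compiler settings all match. This is a toy model, not a real tool; the function names and the in-memory dict standing in for the peer network are invented for illustration.

```python
import hashlib

# Sketch of the shared compile cache: identical (source version,
# architecture, compiler settings) triples map to the same key, so one
# peer's build output can serve everyone else.

def cache_key(source_version, arch, compiler_flags):
    """Derive a lookup key; flags are sorted so equivalent settings
    given in a different order still hash identically."""
    canonical = "|".join([source_version, arch] + sorted(compiler_flags))
    return hashlib.sha256(canonical.encode()).hexdigest()

cache = {}  # key -> compiled binary (here just bytes, for the sketch)

def fetch_or_compile(source_version, arch, flags, compile_fn):
    """Reuse a cached result on a hit; otherwise compile locally and
    publish the result for other peers to download."""
    key = cache_key(source_version, arch, flags)
    if key not in cache:
        cache[key] = compile_fn()
    return cache[key]
```

Given the same package and settings, only the first requester pays the compile cost; every later request is a cache hit, which is the redundancy-elimination the post is after.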
  • by goldspider ( 445116 ) on Wednesday May 29, 2002 @09:05AM (#3601381) Homepage
    I agree with the moderation; quite amusing!

    But the post (unintentionally?) brings up a good point. If RedHat were to gain a relatively large desktop environment market share, and consequently earn more profits, will Microsoft be alone atop the Evil Empire pedestal?

    I hope that these vendors will compete by trying to create a superior product that can take some share away from Windows, not just from RedHat.

    Infighting among the open-source community is one of those things, I believe, that is keeping Windows atop the OS market. Until the distributions stop fighting each other, MS isn't going to lose an inch to Linux.

  • by Arethan ( 223197 ) on Wednesday May 29, 2002 @09:14AM (#3601419) Journal
    You know, I couldn't care less if there are 5 or 5000 Linux distributions out there. But I'm really getting tired of the lack of binary compatibility between distributions. And when I say that, I mean lack of binary compatibility all the way from libc up to the desktop environment. I can compile simple command-line apps and have them run on most distributions, but the second I start using extra libraries (like GTK+) I start running into compatibility problems between distros.

    Distro A has the library, but it's a different filename since it's a newer version than the one in Distro B. Bah! The best tech that MS stole was COM objects. Just cram all the necessary versions into a single file, and let the runtime linker figure it out on the fly.

    Well, I'm not trying to say that we need that sort of extra functionality/overhead, but I do want to say that Linux will take off like a shot as soon as developers have a steady target to aim for. The sooner all the major distros decide on a list of libraries that make up a standard Linux distribution, the sooner I'll be able to start telling my friends and family that they should switch.

    RPM, apt, deb, and even slack's TGZ all have the same problem. The application/library is compiled and packaged for a single version of a single distribution. Sometimes you can take them to another version or distro and it will work, but most often not. With a little fussing, you can usually put together some symlinks on a few libs that will at least get the app to run, but certain features won't work correctly, or the app will crash because a certain interface isn't exposed by this version of a lib. Even if it did run 100% correctly after you made the necessary symlinks, that still isn't good enough, since you had to manually manipulate the system in order to get the app to run. I don't tell my family to run regedit when they can't get an item out of the "Uninstall Application" menu (I fix it for them next time I'm over there), so I'm not going to tell them to "Just make a few symlinks in /usr/lib and you'll be okay" either!

    Man this continual problem pisses me off...
    It's so basic that I was sure it would have been worked out by now. I've looked and looked and found nothing. The Linux Standard Base doesn't even come close to defining everything that is necessary for binary compatibility between distros, and Google hasn't given me any other good leads.

    If I'm missing the big red neon sign that points to the solution, then please do share it with me. But if I simply haven't found it because it doesn't exist, then we should definitely evaluate the value this would add to Linux, and seriously consider its immediate implementation.
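For what it's worth, the symlink hack described above works (when it works) because the runtime linker matches libraries by soname, not by full filename. A toy model of that matching follows; the library names and version numbers are made up for illustration, and real ld.so of course does much more.

```python
# Toy model of soname resolution: an ELF binary records the soname it
# was linked against (major version only), while the file on disk
# carries the full version suffix. Names and versions are invented.

installed = {
    # file on disk           -> soname embedded in it
    "libgtk-1.2.so.0.9.1": "libgtk-1.2.so.0",
    "libjpeg.so.62.0.0":   "libjpeg.so.62",
}

def resolve(needed_soname):
    """Return the installed file providing the requested soname, as
    ld.so effectively does, or None if nothing provides it -- the
    case that tempts people into making symlinks by hand."""
    for filename, soname in installed.items():
        if soname == needed_soname:
            return filename
    return None
```

A binary linked against `libgtk-1.2.so.0` loads fine even though the file on disk is `libgtk-1.2.so.0.9.1`; a binary wanting a different major soname simply fails to resolve, and a hand-made symlink only papers over what is a genuinely different interface, which is exactly why the app then crashes on the missing entry points.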
  • Re:To be honest... (Score:4, Interesting)

    by FreeUser ( 11483 ) on Wednesday May 29, 2002 @09:21AM (#3601448)
    The one reason I have not yet downloaded this is recompilation. I mean, compiling is pretty much a deterministic activity. Given similar compiler settings you'd expect the result to be the same each time.

    What isn't deterministic is what packages (and what versions of those packages, and what compile-time options for those packages) you've chosen to install. If you're using somelib.so.1.0.1 and someone else is using somelib.so.1.0.2, there is a small (but real) chance that a minor incompatibility will result in a binary compiled against one displaying occasional flakiness when run against the other. This isn't terribly common (and it represents a mistake on the library maintainer's part when it does happen ... incompatibilities should mean major revision number changes, not minor), but it does happen. When borrowing packages and binaries from other distros this becomes more acute.

    Compiling on your own machine eliminates this.

    There is also the problem of binaries compiled with different versions of the GCC compiler behaving in subtly different ways ... again, this is very acute when moving from GCC 2.9.x to 3.x, and again, compiling everything yourself fixes that problem.

    If you have a decent processor, compiling isn't really that burdensome (the initial installation excepted of course). Most people start their daily or weekly upgrades in the evening before going to bed, making the burden effectively zero. In any event, the advantages are well worth the trouble, and the speed improvements are dramatic.

    Your P2P idea is interesting (sort of a shared cpu cycle approach a la Seti@home). Again, the problem with having others compile for you (rather than sharing cpu cycles you use yourself) is that they will likely have slightly different libraries than you do, for some things at least, possibly compiled with different optimizations, so you cannot be 100% certain that what you are getting is exactly what you want. With Gentoo and Source Mage's approach you can be 100% certain that you are getting precisely what you want, and that it is compiled against precisely what is on your system.
  • by Te1waz ( 453498 ) on Wednesday May 29, 2002 @09:22AM (#3601450)
    I agree wholly with this. With one proviso.

    The inclusion of a standard Package management tool or process. The process in itself would ensure each distro adheres to the paths/library locations standard. It would also greatly assist in the usability of the GNU/Linux system.

    I use Suse Linux personally (have done regularly since 7.0) as my home desktop. There's a lot I like about Suse.
    Most reviews I've read rate it higher than Red Hat, which keeps me coming back.
    The regular and relatively well priced releases mean I can depend on regular base upgrades in addition to the online updates.
    Good KDE (my choice of default) and improving GNOME and other window manager support.

    My only gripe is the environment scripted setup, but I don't usually have too much of a problem with installation as long as I compile from source.

    While YaST2 et al. are by no means perfect, applying a standard among the allied distributions would be an ideal way of pooling resources and avoiding duplication of effort (very apt-get to the Open Source philosophy), and it would free up paid distro programmers to work on further enhancements (hint hint, the package management tool).
  • Re:SEC approval? (Score:3, Interesting)

    by IamTheRealMike ( 537420 ) on Wednesday May 29, 2002 @09:26AM (#3601472)
    Car analogies are often flawed - remember that cars already ARE standardised in that they all are roughly the same width, height etc. They all have similar turning circles: basically they all work on our roads in the same way. In computers we have a situation where some cars can only go down some roads and it's a mess. Roll on LSB.
  • by BitMan ( 15055 ) on Wednesday May 29, 2002 @09:44AM (#3601555)

    Anyone who has messed with Caldera, SuSE and Turbolinux knows that they do NOT produce a 100% redistributable version of their commercial offerings. In addition to not allowing redistribution of their CDs, most either omit major packages or limit usage to "personal." As of 8.0, SuSE has gotten even more restrictive, no longer offering free downloads of many components. This alone has turned off this user from considering their software.

    Conectiva, on the other hand, has gained a lot of recognition for its efforts. The two biggest being the use of Apt for RPM, and one of their lead developers managing a Linux kernel branch alongside Alan Cox and only one other. I have not used their distro, and DistroWatch.COM does not differentiate between "free download" and "100% redistributable", so I cannot tell if they maintain the same GPL-anal approach as RedHat. For now, I'll assume so (please let me know if otherwise?).

    So, for this strategy to work, assuming the rumor is true, I make the following 2 recommendations to the resulting conglomerate:

    1. Make a 100% Redistributable CD set, then value-add

      These vendors don't have to stop value-adding to their distros. In fact, this approach could still allow them to do so. But they really need to build some mindshare with those of us who like RedHat and Debian because of their 100% GPL-focus. Release a 100% Redistributable CD set which they all agree on. This has kept me from using Caldera, SuSE and Turbolinux over the years.

      Then each can include their own CD #1 binary, "alternate," non-redistributable boot CD in their commercial, boxed sets so the value-added stuff can be installed (in addition to other, non-redistributable CDs). The idea is that the install packages should be the same for both the freely redistributable and commercial non-redistributable versions, even if the default/base freely redistributable ones are replaced by those in the commercial, non-redistributable CD(s). Simple, no?

    2. Leverage Conectiva's Apt focus, build a Debian-like "universal" repository

      This will get the masses to join them. If the new conglomerate can build a new, 3rd party software repository for Apt like Debian has for Deb, this would get me to use this new distro. And they would quickly find that a number of 3rd party free software / open source projects would make sure their packages are built for and distributed in this new RPM-Apt repository. God knows I'd be sold in a heartbeat, assuming the distro quality is as good as RedHat. With SuSE in the mix, I don't see this being an issue, since I have used their kernels before (and trust them as much as RedHat).

      Right now I use a custom distro (usually installed via NFS so I don't have to build CDs that become outdated quickly) based on RedHat with Ximian and FreshRPMS added. Ximian is Ximian, and I don't foresee not using their GNOME set (this new "standard" distro will make it easy for them to support). FreshRPMS is RedHat-focused and uses RPM-Apt, but it is far from "comprehensive", with only about 50 packages or so. This is a far cry from Debian's 10,000+. Going to RPMfind or the older contribs is just not viable, and I don't bother much anymore. But I don't have nearly the package selection with RedHat that I would with Debian, and this frustrates me, since I will not use Debian for other reasons (I'm not going to expand on them here, just note I said *I* will not use Debian -- not that Debian is "bad," not at all).

  • by tig ( 6017 ) on Wednesday May 29, 2002 @10:04AM (#3601681) Homepage
    Combining forces is a step in the right direction. But how can 4 companies survive competing for the same small services market for servers, where you are selling to system admins who know what to do anyway, and will buy only hardware related support? Yeah, some will sell as insurance, but I'd rather buy my insurance from IBM who will be around tomorrow.

    The only market that would have bought in droves, and did between 1999-2002, is the academic and technical workstation market, where the need was plentiful and the expertise thin. Where was the user-centric product then? Today's Linux companies are making the same mistakes the Unix guys did: concentrating on the server and leaving the desktop to be picked up by Microsoft. And today, OS X is replacing Linux as the desktop of choice for power users.

    And what's with 1000 packages with 10 email clients, all substandard? Why package 2 desktops? Make a courageous decision and pick one! Why duplicate work, decreasing productivity? Do RedHat and Ximian both need to package GNOME or Evolution? Why not contract it out? Why not pay the gphoto developers a royalty, something you could do if you had just 15 desktop apps? Linux is presently sustaining programmers through VC, not through profits. This isn't a get-rich-quick scheme. Support the developers. Provide user testing for them. Give them a chance to live at least part-time off their software and consulting.

    But most of all, don't leave the desktop. For they who don't have the desktop today won't have the server tomorrow.

    I wrote some more about this stuff; see http://3point0.nareau.com and the 2 links on that page. Also see
    http://reno/~rahul/venn.jpg for an example of how an ecology of companies around a Linux distribution and an application server (spacestation on desktop and cloudserver on server) could work.

    And email me at tig@nareau.com if you want to do something about creating a distribution with one desktop (GNOME, as GTK is LGPL; I believe in letting developers choose their own license, so no religious nut jobs pls), a few well-done apps, attention to quality, user interface and simplicity rather than an emphasis on service contracts, and a biz model in which work is distributed to the actual package developers on a per-product-sold royalty basis, and selling the software is supposed to bring in the money.
  • by goneaway ( 224677 ) on Wednesday May 29, 2002 @10:21AM (#3601767) Homepage
    about all this icky patent stuff here [redhat.com], which I'm assuming was the impetus for this announcement's timing, despite the fact that these four have been scheming for a while.

  • by elflord ( 9269 ) on Wednesday May 29, 2002 @10:44AM (#3601908) Homepage
    Distro A has the library, but it's a different filename since it's a newer version than the one in Distro B. Bah! The best tech that MS stole was COM objects. Just cram all the necessary versions into a single file, and let the runtime linker figure it out on the fly.

    Actually, Linux also lets you install different versions of shared libraries. For example, check your system to see how many different versions of libstdc++ you have. I have 6. The main problem is that distributions often differ at the very core. For example, if the glibc versions are different, there's really not a whole lot of hope of any binary compatibility. And if the gcc versions are different, all of the C++ binaries will be incompatible.

    The problem with "base level" differences is that you typically need a parallel set of libraries-- for example, if you have a program that needs an old libc and libjpeg, then you actually need a special libjpeg version that is compiled against the old libc. In other words, you can't just install an old libc, you need to install an old libc subsystem. I do this a lot because I need to have a gcc 3 based development platform. For me, the subsystem means gcc + qt (thankfully, the default glibc is OK, otherwise I'd also need glibc, libpng, libz, libjpeg, and the X11 libraries). The reason your simple command-line apps run on most distributions is that most distributions have a minimal (libc + X11 + libstdc++) compatibility subsystem, but that subsystem doesn't include GTK. If you want your GTK apps to run on any distribution, you'll need to link statically to GTK and any other graphics libraries (jpeg, etc.)

    My suggestion as far as solutions are concerned is to forget about running anything that's built against the wrong version of libc, and avoid running anything that's not built on your distribution.
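The "old libc subsystem" trick above comes down to search-path ordering: put a directory holding a complete, matched set of old libraries ahead of the system directories, and the loader finds the old copies first. A minimal sketch of that first-match-wins lookup, with invented directory names and library sets:

```python
# Toy model of runtime library search order (the effect of prepending
# a directory via LD_LIBRARY_PATH): directories are tried in sequence,
# so a compatibility directory listed first shadows the system copies
# with a consistent set of old libraries. All paths are illustrative.

search_path = ["/opt/oldlibc/lib", "/usr/lib"]

libs_on_disk = {
    "/opt/oldlibc/lib": {"libc.so.5", "libjpeg.so.6"},  # matched old set
    "/usr/lib": {"libc.so.6", "libjpeg.so.62", "libpng.so.2"},
}

def find_library(name):
    """First directory on the search path containing the library wins."""
    for directory in search_path:
        if name in libs_on_disk[directory]:
            return directory + "/" + name
    raise FileNotFoundError(name)
```

Note that the old libjpeg lives in the same compatibility directory as the old libc, so the program needing the old ABI gets a consistent pair; libraries absent from the compatibility set still fall through to the system copies.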

  • by Interrobang ( 245315 ) on Wednesday May 29, 2002 @10:53AM (#3601957) Journal
    Here's a slightly different perspective: I am still relatively new to Linux (>2 years), and I'm just a rank beginner when it comes to programming. I sit in front of a Windows box all day at work because I have to, and I have a Windows box at home, too, mostly for (in)convenience.

    Why do I use Red Hat? Why do I use Linux at all? Well, frankly, the more I use Windows, the more I like Linux. It's stable, powerful, non-stupid, (don't even get me started about Stupid Automagical Windoze Tricks) and it does exactly what I need in a way that works well for me. Also, I think the interfaces are fascinating, so I'm writing a paper about them (for the arts/social sciences community) now.

    On the other hand, I neither have the skills nor the inclination (yet) to spend hours tweaking and reprogramming config files so that I can get something up and running. I like that it works. I like that I can do what I want with it, and I don't have to tinker with it incessantly.

    Sorry if that sounds kind of anti-hackerish (it's not meant so), but I'm still trying to master the basics, and I wouldn't try to drive a Formula 1 racer while on my learner's permit, either.
  • by mcrbids ( 148650 ) on Wednesday May 29, 2002 @10:54AM (#3601961) Journal
    I use Red Hat Linux for my servers. I'd not dream of anything else at the moment. Why?

    1) Excellent support - whatever software I want to install, I can be quite sure that there's an RH version - often in RPM form. This reduces the cost of maintenance dramatically.

    2) The RED HAT NETWORK is fantastic! I simply type "up2date -u" and 10 minutes later, I have all the relevant security patches installed! Just $5 per month, and their download servers are FAST. (I routinely see 15-20 Mbit connections - 10x-15x FASTER than an unfettered T1!)

    3) Reliability. My Red Hat systems are stable. They work today, tomorrow and next year.

    4) Stability of the distro. Red Hat has been around. They are profitable, or at least not burning capital very fast. I can feel good knowing that I'm investing my considerable time, money, and energy into a platform that will be there in the future, too.

    With the above, I can fulfill my support contracts easily and cheaply, and focus on the delivery of service rather than simple maintenance.

    Is Red Hat perfect? No. But it satisfies the above, and that is what I need to found my business upon.
  • slightly hypocritical (Score:3, Interesting)

    by sixSecondsOfDefeat ( 580997 ) <user@garyCondit.com s/$RAPIST/aol/> on Wednesday May 29, 2002 @10:55AM (#3601972)
    Let us not hate Red Hat just for finally being able to do what we have all wanted to do for years: turn OSS into a viable marketable tool.
  • Competition is good (Score:3, Interesting)

    by toolz ( 2119 ) on Wednesday May 29, 2002 @11:01AM (#3602005) Homepage Journal
    We all know that competition is good. It encourages innovation, progress and new directions. One of the reasons why there has been so little real innovation in the closed-source world has been the lack of competition to Microsoft's products (other than Windows Servers - which are seriously challenged by Linux).

    Over the years, SuSE, Caldera et al. have offered little serious competition to RedHat when it comes to *marketing* themselves (technically, RedHat is in no way superior to any of these distributions).

    A "UnitedLinux" would actually be a good idea. It will encourage (spelled f-o-r-c-e) RedHat to improve their product (I am an RHL user, but I'll be the first to admit that RHL is about as exciting as a glass of water these days).

    At the same time this will give the players of UL a chance at a bigger market, which in the end is good for Linux and OpenSource.

    However, just like Linux chewed up the Unix market before it started spreading its wings, it is very likely that the initial gains UL would achieve would be at the cost of RedHat's share. There will probably be a bit of seesawing before things stabilize.

    And *that's* where the fun really begins. ;-)
  • Re:They don't exist (Score:3, Interesting)

    by rseuhs ( 322520 ) on Wednesday May 29, 2002 @11:13AM (#3602052)
    I won't even bother with the other two.

    Funny, because SuSE has certification programs, books, courses and everything else.

  • Re:Gentoo (Score:2, Interesting)

    by Wizy ( 38347 ) <`moc.liamg' `ta' `chgtaggerg'> on Wednesday May 29, 2002 @11:25AM (#3602116) Journal
    I don't see what you mean. It's got many people doing "real work" with it: keeping up the ebuilds, working on Portage, and so on. It also has many users. I have switched all of my production and development servers to it. I know many others who have as well.

    I imagine Gentoo as the second coming of Linux. It is so far above the rest of the distros in the way it feels to use. I just love it.
  • Confused journalists (Score:3, Interesting)

    by osolemirnix ( 107029 ) on Wednesday May 29, 2002 @11:37AM (#3602198) Homepage Journal
    Yeah, exactly. I think they had to conjure up some kind of competition that doesn't exist.
    Besides, the Red Hat dominance is a myth that won't become any more true by repeating it ever more often. It's bullshit; check out the latest sales figures [linuxplanet.com].
    I guess the only reason incompetent market analysts still believe this "Red Hat market dominance" hype is that Red Hat is much better at marketing, so they are ever more "present" in people's minds as *the* Linux distribution. And SuSE (being mostly Germans) are unfortunately notoriously bad at this kind of new-economy marketing.

    From a technical point of view they are already so similar that it's a piece of cake to support them both with a product. It takes a lot less than supporting other different Unices for sure.

  • by fsmunoz ( 267297 ) <fsmunoz@m[ ]er.fsf.org ['emb' in gap]> on Wednesday May 29, 2002 @11:56AM (#3602305) Homepage
    Within 1.5 years, we will see only 3 "major" players in the Linux distro market, with Debian taking a distant 3rd in revenue.

    Revenue has hardly any influence on Debian development, and as such it can't be used to predict which distribution would be more popular in your hypothesis. I will package things and help development without revenue in mind, at least for myself. That's one of the biggest strengths of Debian (well, it can also be the root of some 'features' in development timing): it isn't really dependent on business pressure or traditional revenue models.
    As such, you can pretty much assume Debian will always be there, and that's, well, comforting :)

    regards,
    fsmunoz
  • by Anonymous Coward on Wednesday May 29, 2002 @01:57PM (#3603178)
    It couldn't last forever, guys. Eventually the "open-source" concept would be beaten by sheer $$$. Thus we end up with 2 large Linux distributions.... then 1.... then it's Linux vs M$..... and who knows... maybe a good fight.

    Anyone who thought open-source was going to last in a capitalistic society is kidding themselves. Sure, you can jack around with whatever you want on your home machine, build a Beowulf cluster out of old machines, but when it comes down to business, you'd better have sales guys and a company to relate to, which means a corporation, which turns the distros against each other, thus forcing the fight.

    Anyone who thinks I'm full of crap feel free to flame on but it won't change the future.
