Why Aren't More Distros Becoming LSB Certified?

mydoghasworms asks: "I have done much thinking lately about the Linux Standards Base. The idea makes lots of sense: adopt a standard which will ensure that if some piece of software is compiled on one LSB-compliant system, it will run on any other LSB-compliant system. This would be great for members of the general public who are looking for an alternative to Windows, don't want to pay for a Mac, but are looking for a platform where installing and running software is as easy as on the platform they are used to. Seen in that light, if LSB lives up to its promise, it could be the step in Linux's evolution that sees it adopted by the general public. That leaves the question: why is LSB not seeing greater adoption?"
"Is it because it is not marketed well enough? Is the certification process too difficult? Are there perhaps technical challenges to LSB certification not often discussed? If people agree that LSB is in fact what Linux needs right now to ensure widespread adoption, what should be done to create awareness of LSB? Should communities developing Open Source/Free Software projects be encouraged to provide LSB binaries? Your input would be most welcome here."
  • by Tuxedo Jack ( 648130 ) on Wednesday April 20, 2005 @03:06PM (#12295688) Homepage
    LSB-certified rootkits for the bastard.
  • by esconsult1 ( 203878 ) * on Wednesday April 20, 2005 @03:07PM (#12295693) Homepage Journal
    Mac applications are cool because of the contained environment that is OS X (except Apple did not create enough of their own native applications). Microsoft is successful with their applications because they built a container that is at least perfect for them -- Windows.

    Why is Linux not gaining on the desktop? Because there is no "perfect Linux desktop container". The properties of such a container are that it should be standardized, accept new client programs easily from a wide variety of sources, have easy-to-use services, and have a well-known API that is well documented and defined so that programmers can easily write to it.

    Instead we have a bunch of fragmented containers (KDE, GNOME, lots of lesser-known desktop environments) that are incomplete and immature. Heck, it's a pain in the ass sometimes to get simple brain-dead stuff such as printing and mounting a drive working. So you have projects like OpenOffice having to write their own container!!! And Miguel (bless his heart) making a version of Microsoft's .NET container (Mono) for Linux that is still incomplete and sits atop an incomplete container -- GNOME, which is sitting on top of an incomplete desktop container -- Linux.

    I know this is a rant, but my shop recently switched back to Windows from Linux desktops (about 40 people). Why? Because the new CEO (and me too) were sick and tired of people trying to get things to work together properly. We were sick of not having an Exchange replacement (don't get me started on the open source ones now "available"). And new hires and our clients were just plain used to using the dominant containers out there (Windows/Mac).

    At least Linux as a server container works, because it has the same API as standard UNIX.

    • Linux is definitely not for the desktop. Put any mid-level computer person on it and they will have trouble. The newbie will be fine after it is set up, because everything they need to do is there and they won't expand beyond it. The expert will find their way. But the mid-level people are the ones Linux is really failing. As you stated, jobs that mid-level computer users do, like printing and file sharing, are lacking and difficult.
      • Neither you nor the poster you replied to has it right. The problem isn't Linux itself but the variety of applications and the indecisiveness of application writers in picking proper standards.

        Example: if everyone chose the CUPS printing model, Linux would have better printing than Windows. The fact that KDE still doesn't treat CUPS as the prevalent and *best* printing platform confirms this.

        Also, that application vendors like Mozilla futz these things up (OSS locking while ALSA exists???) does not mean Linux is worse off. In fact,
      • That's completely wrong; what mid-level people can't do on a corporate network based on Linux is just f*ck things around: upload that cute bug-riddled fish-bowl screensaver from the Internet and use it, or change the background picture to one of their kids, because any savvy tech won't let them hook their digital camera up to the USB.

        And even if that pisses off those mid-level users, that is *just* fine if you intend to get actual work done.

        • by jacksonj04 ( 800021 ) <nick@nickjackson.me> on Wednesday April 20, 2005 @04:53PM (#12296894) Homepage
          You got that wrong. It should have read "properly configured corporate network", without the "based on linux".

          A properly configured Windows network can be just as secure as a properly configured Linux network. Possibly more so, since 99.9% of things are forced to 'just work' with group policies.

          Likewise, a piss-poorly configured Linux network is just as insecure as a piss-poorly configured Windows network. Possibly more so, since 99.9% of things have a million options, all of which have the possibility of doing arcane things to the filesystem.

          I'm not a big MSFT fan, it just gets to me when I see people going "Linux is uber-kewl and can beat everything in Windows!" when for things such as corporate networks it is on a par with (if not less effective overall than) Windows. Admin dependent, of course.
    • by syousef ( 465911 ) on Wednesday April 20, 2005 @03:22PM (#12295902) Journal
      Thank you for saying this in a way where you haven't been modded down. I've definitely seen Linux go backwards over the last few years and it pains me, because by now it COULD have been ready for the desktop. It's just not, and the splintering of hundreds of different distros hasn't helped at all.

      Trouble is I don't know how you fix a beast that's this fragmented and distributed amongst so many individual groups of programmers. Most people here seem to just want to bury their heads in the sand and chant RTFM repeatedly at the top of their lungs, and if you shatter their fragile fantasy you'll feel their wrath.
    • Heck, it's a pain in the ass sometimes to get simple brain-dead stuff such as printing and mounting a drive working.

      For some reason I always get modded down for saying this, but I'll say it anyway. I can't ever figure this opinion out. I have problems almost daily in Windows XP trying to print to a network printer (it randomly decides I don't have permission to print), but I never have a problem with this in Linux. I've also never had a problem mounting a drive. For example, I can plug my new Seagate extern

      • by ACNiel ( 604673 ) on Wednesday April 20, 2005 @04:14PM (#12296533)
        You do have a point about Windows having the very same problems. You don't have a point when you dismiss people pointing out Linux's shortcomings as spreading FUD.

        I have trouble printing to some of my MS printers all the time. There are people in my office, on the same network, with the same admins, that don't have any problems.

        If you don't have a problem with something, it doesn't mean there isn't a problem, it means you haven't had enough experience with that to have had to deal with the problems.

        If you want to know where "it doesn't work right", go ask the people that can't get it to work right. Don't ask the guy that has 5 computers in his basement, with 1 user, and says he has no network problems at all. Ask the people that have to support 40 workstations, with 40 users, all who poke and prod BEFORE they call someone who knows what they are doing, and then deny it like some child.

        Things don't work right all the time. Whether it is perception, or reality, they seem to work right more often in Windows.
    • by override11 ( 516715 ) <cpeterson@gts.gaineycorp.com> on Wednesday April 20, 2005 @03:25PM (#12295937) Homepage
      We use the LTSP project with about 45 users network-booting to an XFCE desktop right now. They browse the web, access our Exchange 5.5 server using Thunderbird, and have an LDAP directory with auto-name completion as they type email addresses. They access our 5250 iSeries system, and use OpenOffice for their Word/Excel needs on a Windows NT shared drive. We love it, works great. Some more 'advanced' end users chafe some because they can't download their own screen savers or games, but frankly we LOVE that part of LTSP!
    • by yamla ( 136560 ) <chris@@@hypocrite...org> on Wednesday April 20, 2005 @03:26PM (#12295944)
      It's a pain in the ass to get simple brain-dead stuff like printing and mounting drives working in Windows, too.

      For printing, my home desktop needs new (and uncertified) drivers from Brother. My brother's computer can't use the printer shared from my sister's computer, and I've spent a couple of hours trying to figure out why. All the sharing _seems_ to be set up correctly, it just doesn't share.

      And at work, I had to write up a document showing how to remap drives when my coworkers plug in removable drives to their systems. Windows kept on assigning drive letters that were already in use. Why on earth do we still use drive letters, anyway?

      NONE of these things are things I would expect average users to be able to do. Linux certainly has plenty of problems, but so does Windows.
    • by MooCows ( 718367 ) on Wednesday April 20, 2005 @03:27PM (#12295960)
      Also, let's not forget how easy software installation on Windows and MacOS is.
      Download setup.exe, install, run.
      No dependencies (except a few possible DLLs, which can be included with the application), no compiling, no need for 50 libs on your system to match a certain version number. It just works. More often than not, anyway.

      For many users it would make the transition to a Linux desktop much easier if (desktop) Linux software could be installed as easily. Just a simple package which doesn't have to care about the rest of the system.

      Yes, Java/Python/.NET etc. are a possible way to go, but those applications still often depend on a bunch of libs and aren't known for their CPU and memory friendliness either.
      • Just hope that DLL you installed doesn't overwrite another and crap all over previously working software. Dependencies exist; Linux just seems to do better with them. Running apt-get install software-name beats the crap out of Windows' distribution methods. It is easier, faster, and the most convenient way I've yet found to install software. It really just works. Commercial distribution makes this harder, but it shouldn't be impossible to move away from CDs.
      • Gee. For me, I don't even have to go find setup.exe. Apt knows where it all is already. It also does all the dependency checking for me.
      • There's no magic to Windows that makes that work. You could make the same sort of package/installer for Linux. Heck, what do you think Doom3 and UT2004 do? Distros don't do it because they all have package managers (which are better!).
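        For illustration, here is a minimal sketch of the self-extracting trick such installers use: a shell stub with a tarball appended to it. The payload name and install prefix below are made-up examples, not what any actual game ships.

            #!/bin/sh
            # Minimal self-extracting installer (the classic stub-plus-payload
            # pattern). The vendor builds it with:
            #   cat stub.sh payload.tar.gz > install-game.sh
            PREFIX="${1:-$HOME/game}"
            # Everything after the __ARCHIVE__ marker below is the tarball,
            # not shell, so find the line where it starts.
            ARCHIVE_LINE=$(awk '/^__ARCHIVE__$/ {print NR + 1; exit 0}' "$0")
            mkdir -p "$PREFIX"
            # Extract the payload appended to this very script.
            tail -n "+$ARCHIVE_LINE" "$0" | tar xzf - -C "$PREFIX"
            echo "Installed to $PREFIX"
            exit 0
            __ARCHIVE__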
    • by FreeUser ( 11483 ) on Wednesday April 20, 2005 @03:27PM (#12295975)
      The LSB is RPM-centric. It also has other flaws (in filesystem organization, to name one, although that is improving).

      Different distributions use different package schemes. Debian uses .debs, Source Mage uses tarballs+spells, Gentoo uses Portage, etc.

      The "perfect container" is a tarball. Anything else you want to do (install wizard, compile script, install script, what have you) belongs outside of the package container. Need a one-click installation procedure? Include the script in the tarball, and provide a GUI that reads the contents of the tarball and lets you run a program from within the tarball (KDE has apps that can do this, for example).
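      As a sketch of that idea (the package name, the install script, and its --prefix flag are hypothetical), the whole "container" could be just:

          # Vendor side: the payload plus its own install script in one tarball.
          tar czf myapp-1.0.tar.gz myapp-1.0/

          # User side: "one-click" install is unpack-and-run; a GUI wrapper
          # would simply perform these two steps on the user's behalf.
          tar xzf myapp-1.0.tar.gz
          ./myapp-1.0/install.sh --prefix="$HOME/.local"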

      RPMs are flawed in various ways, and centric to the particular distributions that happened to have representation early enough in the LSB process to push through a standard favoring their way of doing things over the broader, more portable standard (tar.gz).

      Until the LSB becomes a standard that is no longer Red Hat/SuSE-centric, its adoption by other distros will be lackluster at best, and rightly so.

      As to your 40+ workstations that have been switched to Windows ... welcome to hell. If you think a little integration work in a heterogeneous environment is hard, just wait for what Redmond's incompetence has in store for you. Your CEO won't be the one suffering; you (or the poor schmuck who replaces you after the next round of worms/trojans/viruses and other Microsoft goodies goes around) will be. *BSD and Linux aren't perfect, but they're a damn sight better and easier to administer than Windows, and have the added benefit of actually working. Frankly, if you and your CEO were so hell-bent on having something easy to integrate and use, and are obviously so willing to exchange flexibility to get it, you should have chosen to go with Apple for both your clients and servers. You would have traded less of your flexibility away, ended up with something much more solid and reliable than Windows, and much easier to administer, and prevented a whole lot of heartache down the road. But then, I suspect your post is more of a dig at Linux and promotion of Windoze than it is a true history of some company actually being stupid enough to dump Linux for Windows.
      • The LSB is RPM-centric.

        Oh, good grief, RPMs work just *fine* on non-RPM-based distributions. (Try apt-get install rpm on a Debian box some time.)
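        For instance, on a Debian box (the package file name here is a made-up example):

            # Pull in the rpm tool itself via apt (as root).
            apt-get install rpm

            # Inspect a vendor RPM without installing it.
            rpm -qpi someapp-1.0-1.i386.rpm    # show the package metadata
            rpm -qpl someapp-1.0-1.i386.rpm    # list the files it would install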

        --Bruce Fields

    • At least Linux as a server container works, because it has the same API as standard UNIX.

      There is no such thing as "standard UNIX", so even on that level it doesn't work.

      Personally, I think the biggest problem with standards is that there are so many to choose from. Think about how you get software for your Linux boxen. .RPM? Tarball? .DEB? tgz? tar.bz2? Source?

      Each method has some benefits over the others, and a team of zealots waiting to tell you why theirs is best.

      Now think about CD distributions.
    • Copy/Paste Much? ;) (Score:5, Interesting)

      by Mad_Rain ( 674268 ) on Wednesday April 20, 2005 @04:02PM (#12296393) Journal
      Dude. I know it's probably nit-picking, but you really should cite someone you're quoting [slashdot.org], and save the plagiarizing of yourself for when you're alone and in private. ;)

      by esconsult1 (203878) on Wednesday April 20, @01:07PM

      I know this is a rant, but my shop recently switched back to Windows from Linux desktops (about 40 people). Why? Because the new CEO (and me too) were sick and tired of people trying to get things to work together properly. We were sick of not having an Exchange replacement (don't get me started on the open source ones now "available"). And new hires and our clients were just plain used to using the dominant containers out there (Windows/Mac).

      by esconsult1 (203878) * on Wednesday April 20, @07:23AM

      I know this is a rant, but my shop recently switched back to Windows from Linux desktops (about 40 people). Why? Because the new CEO (and me too) were sick and tired of people trying to get things to work together properly. We were sick of not having an Exchange replacement (don't get me started on the open source once now "available"). And new hires and our clients were just plain used to using the dominant containers out there (Windows/Mac).

    • Agreed.

      OSS (I refrain from using the term "linux" since it is just a small part of a desktop) has a HUGE thing going for it right now: a complete lack of market penetration.

      While Windows has all of this cruft for the sake of backward compatibility, OSS has next to none. This means that OSS can take all of what is wrong with Windows and do it properly. The people who pull the strings NEED to sit down and get things right BEFORE critical mass happens. At that point, there's no turning back.

      As it sits, i
      • by Hast ( 24833 ) on Wednesday April 20, 2005 @04:40PM (#12296785)
        You have no idea what you're talking about, right?

        Points 1 and 3: Linux distros typically put settings in /etc. That's your "registry", only human-readable, and you can back it up and restore it (relatively) effortlessly or move it to another computer.
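        As a quick sketch of what that buys you (plain tar, the obvious paths, run as root):

            # Back up the whole "registry" -- it's just text files under /etc.
            tar czf etc-backup.tar.gz /etc

            # Restore it, or carry it to another machine, with one command.
            tar xzf etc-backup.tar.gz -C /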

        Points 2, 4 and 8: Any modern Linux distro has a package-handling system. You don't use the tar.gz files yourself, or even at all. It keeps track of all software on your system and keeps it all up-to-date.

        Points 5, 6 and 7: That's the work of the desktop app.

        Finally: no-one NEEDS to do anything to get Linux out to everyone. OSS is not a product, it's a process; you can join now, or in a year, or never. If you want to change what is happening, then involve yourself in the process of making it happen.
  • by klipsch_gmx ( 737375 ) on Wednesday April 20, 2005 @03:07PM (#12295696)
    Yes, Linux is better in how it handles hardware (ONE reboot AFTER install is complete is all I ever seem to need with a Linux install; Windows takes at least two for JUST the OS, let alone drivers, updates, etc.). But it's lacking in several other areas that would scare developers away.

    What exactly is the purpose of the LSB spec these days? When I last worried about it, I was under the impression it was so that ISV's could distribute software packages in such a way that they would work and integrate well on a variety of distributions, and nothing more. That is, it wasn't about providing consistent functionality across distributions in general, or about standardising things for standardisation's sake. The "Purpose" section in the LSB spec doesn't seem to clarify this for me, but rather describes what the LSB is composed of, rather than why it's composed that way.

    The big one is: will it run out of the box? Right now, given the way compatibility works between distros and even between versions of the same distro, the odds are against it. They would probably have to ship a game with a spare CD containing all the variations on the binaries needed just to work on most of the mainstream distros.

    And as much as I laud and love the way Linux distros install in one go without reboot hell, and deal well with hardware changes, games need good video-card drivers, and that requires getting ATI and NVIDIA on board with optimized Linux drivers. Though this last point is something of a chicken/egg problem, as is the next one.

    Linux still does not have the installed user base to make porting a worthwhile effort for many game/app developers.

    The concept behind the LSB was a good one and a step in the right direction even if the implementation had its detractors.
    • Not so much nonexistent as too many, especially on the GNOME/KDE integration of printing etc., as you say.

      Maybe the Mono stuff will help here, giving standard hooks the desktop environments can plug into. Or maybe the LSB needs to include Mono??
    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Wednesday April 20, 2005 @04:17PM (#12296564)
      What exactly is the purpose of the LSB spec these days? When I last worried about it, I was under the impression it was so that ISV's could distribute software packages in such a way that they would work and integrate well on a variety of distributions, and nothing more. That is, it wasn't about providing consistent functionality across distributions in general, or about standardising things for standardisation's sake. The "Purpose" section in the LSB spec doesn't seem to clarify this for me, but rather describes what the LSB is composed of, rather than why it's composed that way.
      The ORIGINAL purpose of the LSB was to make it easier for COMMERCIAL ISV's to port software to Linux.

      The LSB people would write the standard...

      The various distributions would adopt that standard...

      The various ISV's would develop to that standard.

      I'm sure everyone can see the problems, right?

      #1. The LSB people couldn't get a complete standard written and published. Their current "standard" still doesn't include GNOME or KDE so it isn't going anywhere on the desktop.

      #2. The various distributions are different because the people running them have different approaches to solving the same problem. What incentive IS THERE RIGHT NOW for them to wait and adopt the LSB? That's right, they need an incentive.

      #3. The ISV's, seeing the delays, skipped the LSB and formed partnerships directly with the distributions (like Oracle did with Red Hat).

      So, what we have right now is a bunch of ISV's who are not writing LSB apps forming partnerships with distributions who are not abandoning their old ways to support the LSB which has not released a workable standard for either the ISV's or the distributions.

      The LSB, as it is currently focused, will always be a failure. Even if they managed to release a standard, it would only hold back the current speed of development.

      What the LSB really needs to do is focus on the things that would make a huge difference right now.

      #1. Fix the FSB. Right now, the location of a file depends upon how I install it. If I compile it myself, it goes in one directory. If I apt-get install it, it goes in a different one.

      #2. Expand the FSB (part 1). Standardize the naming of each file, right down to the version number. If some app depends upon libfoo-1.0.0.3 then that should be the same file, with the same name on each distribution.

      #3. Expand the FSB (part 2). Standardize the packages that contain the files that were standardized in #2. Package foo-1.0.0.3 would be named the same for each distribution and contain the exact same files of the exact same versions.

      #4. Get rid of the RPM requirement. Instead, specify the BASIC functionality that the package management system will have and the basic information contained within a package and the format. That way, the various systems can ADD that functionality to their existing systems.

      And the best thing is that those can be implemented over time. No more waiting for the LSB standard to be published BEFORE the distributions can become compliant BEFORE the ISV's can write and TEST their apps on those LSB compliant distributions.

      In the end, the apps can have stated dependencies that should be easily verified because of the file and package standardization.
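      To make that last point concrete, here is a sketch of the dependency check an installer could run if file names were standardized down to the version, as in #2 above. The dependency list and paths are hypothetical, purely for illustration.

          #!/bin/sh
          # Hypothetical check under the proposed standard: since the file
          # name is identical on every distro, a plain existence test is a
          # real dependency check.
          DEPS="libfoo-1.0.0.3 libbar-2.1.0.0"
          for dep in $DEPS; do
              if [ ! -e "/usr/lib/$dep.so" ]; then
                  echo "missing dependency: $dep" >&2
                  exit 1
              fi
          done
          echo "all stated dependencies present"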
  • Suppose you comply with a standard, but the standard doesn't evolve very fast and the process for improving it is long-winded.

    This creates a situation where developers feel restricted, and many open source developers develop for the fun and the achievement. If they wanted restrictions, rules and regulations, they would program commercially.

    I'm not against standards, but there must be a reason for the slow adoption of this one. We have to look and see who the standards are created by, who th
  • by jellomizer ( 103300 ) * on Wednesday April 20, 2005 @03:08PM (#12295727)
    I see Linux as a kernel, not the OS. There is a popular Linux-based OS, GNU/Linux, which has many distributions, all based on the same design. With the Linux kernel you are able to make your own Linux-based OS that is not like GNU/Linux and works more like Windows or BeOS or Mac OS. TiVo is a good example: it is an OS, but I wouldn't call it GNU/Linux. It is its own OS, based on the Linux kernel to handle all the grunt work, but how files are handled and interfaced is completely different.

    If you are forced to follow standards, the amount of innovation you are allowed is cut back. Linux is great, but there is still room for improvement, and being forced to follow standards may force a person to work inside a box they may not necessarily want to be in. It is like saying TiVo should use X11 as its method of display rather than its own, even though its own is optimized for the job of video playback. Why should working with Red Hat and SuSE be so similar? Why can't they be different OSs with the same kernel?

    As for adoption: if a person doesn't like Red Hat, the chances are they are not going to like SuSE, because they are so similar. Perhaps they need an OS that fits their way of thinking. Linux will be far better adopted when it is no longer thought of as Linux, but as whatever OS is built on it (powered by Linux).
    • I see Linux as a kernel, not the OS. There is a popular Linux-based OS, GNU/Linux, which has many distributions.

      Richard, is that you?
  • by deanj ( 519759 ) on Wednesday April 20, 2005 @03:08PM (#12295729)
    They tried this before, more than 10 years ago with other UNIX systems.... Didn't end up working then, and it won't end up working now.

    People will always want to change things, and make their products "different" or "better". Whether or not they really do... well, that's up to the people who end up buying and using what they come out with.
    • by jaywhy ( 567133 ) on Wednesday April 20, 2005 @03:56PM (#12296328)
      You're absolutely correct. UNIX vendors tried this a long time ago and failed. The problem became that you had multiple UNIX vendors accomplishing the same thing in multiple different ways, with no standards between them. This, of course, was one of the major downfalls of UNIX, and in part why it failed and how NT and Windows prevailed.

      The Linux server world and ESPECIALLY the desktop world are falling into the same trap: multiple vendors solving the same problem different ways. It is becoming more and more obvious that standardization is the next big test of Linux. Linux will NEVER grow out of its niche if vendors and developers don't start participating in standards.
      • The problem is treating Linux on the desktop the same as Linux everywhere. I can run Linux off a floppy on a 386 or on a thousand-node GRID supercomputer costing millions - or anything in between.

        LSB is a Good Idea, as it lets commercial developers release binaries that Just Fucking Work on a machine that would otherwise be running Windows XP. People releasing software for low-MHz devices or massively parallel processing systems will not be releasing MS Word replacements, and accepting LSB as a global standard
      • by aussersterne ( 212916 ) on Wednesday April 20, 2005 @08:20PM (#12298562) Homepage
        You're missing the point. Think about your experiences in corporate America thus far.

        Imagine that you work in development for Vendor X producing Vendor X Linux. You have a marketing department and some managers over you, all hungry for targets and bonuses.

        As a developer, you have spent the last three months bringing the product in line with LSB for the alpha test. Now, as you detail your changes in a meeting, both marketing and management jump on you:

        "Wait, you mean our Vendor X Linux is now the same as Vendor Y Linux and Vendor Z Linux?"

        "NO!" You answer, almost in a huff. "It just shares a fundamental compatibility with them. A common set of file locations, libraries, etc., so that customers know that what runs on Vendor X Linux will also run on Y Linux and Z Linux."

        "So what you're saying," the manager responds, "Is that you're doing your best to lower barriers to out-migration among our existing customer base, while at the same time creating just the sort of backward-compatibility headache that is most likely to encourage it?"

        "Plus," the marketing person adds, "you're diluting the brand! We have a strong brand and are proud of the value adds that our differences from other distributions represent. If we're LSB and Y is LSB and Z is LSB, we're really saying to the customer that we're the same as they are. We don't want to be the same. We want to be better. We have a strong brand and we shouldn't be afraid to use it! We want to be the standard; we want to make sure that Y and Z match us. We certainly don't want to go around saying that we're doing our best to match them."

        Next thing you know, you're walking out of that meeting with instructions to roll back the changes you've just spent the last few months making, to ensure that the product is NOT LSB-ready.
  • by Ubergrendle ( 531719 ) on Wednesday April 20, 2005 @03:09PM (#12295737) Journal
    Certification costs money. To have credibility it must be peer reviewed, or reviewed/audited/approved by an external body. Then there's the QA and testing process. And this activity is not a one time activity, but a long term commitment to regression testing "every patch".

    Given that many Linux distros are pretending to be enterprise-ready without enterprise sales or revenue, it would seem they are unable, incapable, or unwilling to be certified. Basically, they can't afford it.

    Of course, I am speaking in general terms about Linux distributions and the industry; there are numerous examples which can be used to refute my generalisations. However, I think there's a LOT of consolidation required in the Linux world yet to achieve some of the loftier goals of open source.
  • LSB Compliance (Score:2, Interesting)

    I don't see that for Linux to become accepted it has to go to one standard, because it's becoming accepted without one. Part of it is most likely the whole RPM choice; Debian-based distros can use alien to convert RPMs to .deb packages, but other distros don't have that option. But this brings up the whole point of splits from a base, like last week with Debian vs. Ubuntu: Ubuntu is using the new Debian models, and there are more Ubuntu desktops being used than Debian, though Debian is still you're
    • Most (all?) distros can at least force-install an RPM package. It's pretty convenient even if you aren't on an RPM distro. I do it sometimes on Gentoo for binary packages like Cedega.
  • Cost (Score:5, Informative)

    by gordon_schumway ( 154192 ) on Wednesday April 20, 2005 @03:10PM (#12295745)

    In case anyone else is curious, from this 2002 article [mozillaquest.com]:

    The cost for LSB certification testing is $3,000 for a Linux distribution. Certification testing for applications is only $1,200. The Open Group conducts the certification testing.
    I didn't find this info on the Open Group's website...
  • by p3d0 ( 42270 ) on Wednesday April 20, 2005 @03:11PM (#12295758)
    Adopt a standard which will ensure that if some piece of software is compiled on one LSB-compliant system, it will run on any other LSB-compliant system.
    No, that's not what LSB does at all. Even overlooking the obvious architectural differences between, say, PowerPC and Pentium LSB-compliant systems, you still have the various extensions that individual distros add. (Otherwise, why do we need different distros?) If you use one of those distro-specific features, then your code won't run on another LSB-compliant system.
  • Personally... (Score:5, Interesting)

    by ssj_195 ( 827847 ) on Wednesday April 20, 2005 @03:14PM (#12295790)
    ...I feel that as long as your repositories are up to date and reasonably extensive (as is the case with, say, Gentoo, Ubuntu, SUSE(?), but not Mandrake), installation of software under Linux is way better than under Windows. Seriously, it is completely awesome to just be able to bring up a GUI tool with neatly categorised software, check off 100 pieces of software, walk away, and find them all installed without having had to answer a single "Where shall I install this? Agree to this EULA!" etc.

    I was once playing UT04, and all of a sudden the hard-drive went crazy, the frame-rate dropped and I rolled my eyes - obviously Linux was misbehaving again. It subsided after a minute or so (I kept on kicking ass the whole time, by the way, as I am hardcore :)) and a while later I quit. I then had a brainwave, and checked through the "Office" section of the K-menu - sure enough, OO.o was there. Turns out, I'd done an urpmi openoffice a while before playing UT, left it downloading, forgot about it completely, and the hard-drive thrashing while I played was the download completing and the installation taking place. I'd installed an entire fucking Office Suite without even lifting a finger. Cool stuff :)

    Of course, if you want something that is not in your repository, then prepare for the worst pain ever or go without. It would be nice if some measure existed to ease the burden on packagers, as it seems that keeping them up to date is a tedious and thankless task.

    • Re:Personally... (Score:3, Insightful)

      by Anonymous Coward

      ...I feel that as long as your repositories are up to date and reasonably extensive (as is the case with, say, Gentoo, Ubuntu, SUSE(?), but not Mandrake), installation of software under Linux is way better than under Windows. Seriously, it is completely awesome to just be able to bring up a GUI tool with neatly categorised software, check off 100 pieces of software, walk away, and find them all installed without having had to answer a single "Where shall I install this? Agree to this EULA!" etc.

      Yeah, imagine if

      • Re:Personally... (Score:4, Insightful)

        by _Sprocket_ ( 42527 ) on Wednesday April 20, 2005 @06:37PM (#12297849)
        Windows lacks this type of thing because it's popular, not because it is flawed.

        No. Windows lacks this type of thing because it operates in a completely different culture. The Windows world is dominated by a culture of control and marketing.

        First off there is the issue of proprietary software. Even when things are "free" as in "no fee", there is often some degree of control reserved for the distribution and even use of said software. That alone puts a damper on the "hundred thousand entries" you're expecting. But it goes further than that.

        While something may be available because of the functionality - it is also likely to be there because of marketing or sales strategies. That covers your dig at Microsoft's recent trouble over multimedia. But also includes finding Yahoo Search installed with Adobe's Acrobat reader.

        That's not to say that all of the above is "bad". It's simply a different environment. And it runs by a different common culture.

        And that's not to say that Linux is immune to this culture either. You're not likely to find UT2004 available after your next "apt-get update". And if you do install Adobe Acrobat Reader 7 for Linux, you're going to find it comes with Yahoo. But then, I can "apt-get install evince" and have a nice PDF reader for a ~1.7M download vs. the ~98M that I need to pull for Adobe's version.
  • The idea makes lots of sense: Adopt a standard which will ensure that if some piece of software is compiled on one LSB-compliant system, it will run on any other LSB-compliant system.

    Please forgive me if I am wrong, but I think that statement is not true, due to the multitude of incompatible library versions and such, although I do think the LSB is a step in the right direction. The best we can hope for is that LSB compliance will guarantee source-level compatibility (and even that is probably an unattainable goal due to d
  • by m50d ( 797211 ) on Wednesday April 20, 2005 @03:19PM (#12295858) Homepage Journal
    Not all of them, obviously. But there are some horrible things in the LSB standards. IIRC it mandates FHS compliance, which requires the utterly horrible /media. Also, on the apps front, LSB apps have to be mostly static, whereas good dynamic binaries and libraries are Linux's greatest strength, and necessary, since every app statically including Qt or GTK would be nightmarish - your RAM goes poof. And yet you can't make those libraries part of the LSB standard, because important distributions don't have them installed, and don't want to. LSB needs a way to have apps depend on libraries, and it needs to take a serious look at where distributions aren't meeting it and why, because often it's because the standard is wrong and should be changed. The suggestion of multiple levels of LSB compliance could improve things a bit, if they can specify dynamic Qt and GTK in one of them for a start.
    • I agree with the parent but would like to add on.

      I think that not only is the LSB concept flawed because they have picked some very POOR standards to comply with, BUT they are also fundamentally going against Linux tenets.

      Linux distro creators shouldn't have to spend a great deal of MONEY to get a little sticker. We are angry when Microsoft does it; why should we be softer 'cause it's Linux?

      If the LSB project wants to be a noble amalgamation of Linux on the desktop, it shouldn't cost money to be certified, or
  • by Anne Honime ( 828246 ) on Wednesday April 20, 2005 @03:20PM (#12295876)
    Why would anyone want to have binary compatibility? The main strength of Linux (as of UNIX) is source compatibility. It has proven easier to fix things up in source code than in Windows binaries, and most of the troubles faced by Windows users, such as viruses, worms and pretty much everything else, lie in various binary incompatibilities, mis-interactions, and other obscurities.

    Why would Linux aim to have just that?

    • by Vellmont ( 569020 ) on Wednesday April 20, 2005 @04:42PM (#12296802) Homepage

      Why would anyone want to have binary compatibility?


      Because not everyone wants to be an open source software company. If Linux is ever going to be used by businesses, regular end users, etc., it has to be able to support closed-source programs. That means binary compatibility, so a software maker doesn't have to support 15 different builds of the same piece of software, one for each distribution.


      most of the troubles faced by Windows users, such as viruses, worms and pretty much everything else, lie in various binary incompatibilities, mis-interactions, and other obscurities

      No, viruses and worms are caused by foolish users, insecure applications, poorly maintained computers, etc. It has NOTHING to do with binary compatibility/incompatibility.
  • Because.... (Score:4, Insightful)

    by devphaeton ( 695736 ) on Wednesday April 20, 2005 @03:21PM (#12295882)
    A number of things that make up the LSB have been in dispute as to whether they're the best way to do something or not. The one that comes immediately to mind is RPM-based package management. -I- prefer APT or compiling directly from source, but there are a dozen different ways to do it and they've all got their merits and pitfalls.

    These are Holy Wars, they'll never be solved, and they'll keep certain people from using an LSB system alone. (here it comes:)

    "Oh, but then you just install XYZ and you can do it your way."

    So you start with an LSB system, then install all these other apps and utils to bend it to your will. Now ask yourself how different that is from what we've got now, with all the 750 fragmented Linux distros.

    There are other things that are harder to change, i.e. filesystem layout. Once again, it's a holy war. The community will *never* come to an agreement.

    There is no "one size fits all" linux, and there never will be. Different people have different needs, and most linux users (well, or at least this used to be the case) have some extraordinary needs. That's why they use linux.

    Most of the people who would want a standardized base like that probably use a BSD. This is not a criticism of anyone or any system, it's just an observation.

  • by photon317 ( 208409 ) on Wednesday April 20, 2005 @03:29PM (#12296000)

    Because of the complexity and differentiation of linux platforms and whatnot, LSB will likely never be fully adhered to in a consistent manner by all vendors/distros.

    What I'd really like to see is a much simpler subset of really basic standards, with a different name, that would be relatively easy for all the vendors and distros to be compliant with. For example, I would expect this to be the nature of things it enforces:

    * Documentation other than man pages is always in /usr/share/doc for vendor supplied packages.

    * Man pages are always in /usr/share/man for vendor supplied packages

    * Init scripts should always exist in the location /etc/init.d/SVCNAME, and should always usefully accept the arguments "start", "stop", and "restart".

    * The following environment variables are always set to some correct-ish value in the default environment based on user configuration of the OS: TZ, HOSTNAME, PATH, USER, etc

    * The following basic *nix commands are available in /bin: [...], ditto for /usr/bin, /sbin, /usr/sbin.

    * The following list of common shells and language interpreters will always be installed at these pathnames: [bash, pdksh, perl, python, etc.] (There might be an alternative "lite" version of the standard which excludes a requirement like perl or python or specific shells, for minimal/embedded environments.)

    You get the idea - these are things that *most* distributions already do *mostly* the same anyways. After a few quick tweaks any distro should be able to re-release themselves as compliant with this standard. And once it's popular, vendors have a document to look at that tells them certain things they can rely on when writing linux-specific applications at the operating system level (aside from the stuff at other levels, like the linux and glibc and whatever else API/ABI stuff).
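    A standard that small could even ship a trivial self-test; the checks below are an illustrative sketch of the guarantees proposed above, not an official suite.

        #!/bin/sh
        # Toy compliance self-test for the basic guarantees sketched above.
        fail=0

        # Standard locations for init scripts, docs and man pages.
        for dir in /etc/init.d /usr/share/doc /usr/share/man; do
            [ -d "$dir" ] || { echo "FAIL: missing $dir"; fail=1; }
        done

        # Required environment variables are set in the default environment.
        for var in PATH USER; do
            eval "val=\${$var}"
            [ -n "$val" ] || { echo "FAIL: \$$var not set"; fail=1; }
        done

        # Required shells and interpreters at fixed pathnames.
        for prog in /bin/sh /usr/bin/perl; do
            [ -x "$prog" ] || { echo "FAIL: $prog missing"; fail=1; }
        done

        [ "$fail" -eq 0 ] && echo "PASS: all basic checks"
        exit "$fail"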

  • I would build a Linux box for any noob (read: a person who really doesn't do anything on their machine but email, surf the web, type the occasional letter, and print photos). I can build a machine and throw Linux on it, save hundreds on the OS and productivity software, and it will be the perfect machine for grandma or another non-techie person. For example, Fedora 3 comes with Firefox, an email client, a good messaging client, a media player, and a good word processor. That is pretty much all
  • by evil_one666 ( 664331 ) on Wednesday April 20, 2005 @03:30PM (#12296016)
    Why does nobody care about Linux Standard Base?

    1) A standard has been arrived at already; it is known as POSIX (http://www.knosof.co.uk/posix.html)

    2) The Linux Standard Base is yet another self-appointed 'governing body' comprised of corporate 'industry leaders'. In other words, the LSB has nothing to do with those who have made Linux great, and therefore its 'ideas' will continue to be met with indifference.
  • LSB isn't the answer (Score:5, Informative)

    by sofar ( 317980 ) on Wednesday April 20, 2005 @03:35PM (#12296070) Homepage

    DISCLAIMER: IAADM (I Am A Distro Maintainer)

    Put simply, the LSB doesn't solve the desktop problem. It wasn't meant for that.

    The LSB was written to make sure that all those booming distros, back in the days when they were booming, were somehow unified by a common file system structure, library setup, etc.

    It really only means to cover the (B)ase. That base has since been widely adopted, and almost any distro now conforms to it more than 95%. Only outliers like Slackware diverge, and often only minimally.

    This puts the burden on distro maintainers of getting certified on something that is completely obvious, and non-beneficial. It's like getting a prep school diploma when you're in high school already.

    Also, the LSB is needlessly strict on some rules that hinder progress (init handling - chkconfig etc.), where we should have moved to completely new solutions already (I loved that Makefile approach).

    so, expect more from freedesktop.org than from LSB...
  • I'm reading a lot of backlash against standards, and I suspect that most people responding don't understand the first thing about them. The LSB does not a vanilla Linux installation make. It's a standard by which, hopefully, one can download a binary and it will "just work", whether you're on a "by hackers, for hackers" distro or one that holds your hand. And complying with the standard doesn't necessarily inhibit creativity or progress, as the end-user/sysadmin is the ultimate authority.
    Example: Slackware
  • Suppose I am Redhat. Why would I bother with LSB? In fact, wouldn't this be detrimental to my business? Right now, there is a huge community creating rpms for Redhat, and most commercial entities, if they offer rpms at all, offer them for Redhat. This preponderance of easy-to-install software encourages people to use Redhat, and encourages those already using Redhat to stick with it.

    In that light, what does LSB buy me? An easy escape route for my customers to switch distros.

    (used Redhat in this exa


  • UnitedLinux is what killed the LSB.
    Distro maintainers were presented with two standards to choose from: UnitedLinux or the LSB. Two standards is no standard.
    Then SCO killed UnitedLinux and no one was interested anymore.
  • Disclaimer: I have more b0xen running Linux and BSD than I have running Windows.

    Because the situation (I won't call it a problem, but it is for some) is that Linux development, especially in the areas of the kernel and non-commercial distros, is about what the developers think is cool, rather than what makes a practical and stable (in terms of applications running from kernel version to kernel version) OS. In many ways this is fine for a hobbyist OS, and liveable as an enterprise OS if you have someone like
  • Three problems (Score:5, Insightful)

    by iabervon ( 1971 ) on Wednesday April 20, 2005 @03:49PM (#12296258) Homepage Journal
    The main problem is that LSB specifies some stuff that people think are broken. (In particular, RPM and a C++ ABI that is both obsolete and newer than what people actually use). To the extent that the things LSB specifies are not broken, all the distros do things that way, and you don't need any certification to know you can use them. For applications, they only care about standardization if they need something that can't just be assumed, and these things aren't covered by LSB.

    It is, however, useful, but only really as documentation of common practice. You don't have to wonder whether you're ahead of the curve on adopting a version of zlib, because the LSB says what version you can expect.
  • xxx-config --path (Score:3, Interesting)

    by abes ( 82351 ) on Wednesday April 20, 2005 @04:05PM (#12296446) Homepage
    In line with what KDE and GNOME do (e.g. `gconf-config --libs-dirs`), why not have a single program that reports where different things are supposed to go? This would avoid the difficulty of getting these companies/orgs to actually agree on things, and would make it easy to ensure things always go in the right place (e.g. a makefile can simply run 'sys-config --install-bin-dir' to figure out where to install the resulting binaries). You don't even need to get the distros to agree, as these things can be fairly easily maintained by a third party. All you need to do is make sure this program always goes in the same location (e.g. /sbin/sys-config). It might even be able to replace autoconf/automake by letting the program advertise the capabilities of the system (i.e. programs can register/unregister capabilities).

    Just a thought...
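    As a sketch, such a tool could be a few lines of shell; the option names and the paths it reports are made up for illustration:

        #!/bin/sh
        # Hypothetical /sbin/sys-config: one well-known binary that reports
        # where things go on *this* system.
        case "$1" in
            --install-bin-dir) echo /usr/local/bin ;;
            --install-lib-dir) echo /usr/local/lib ;;
            --doc-dir)         echo /usr/share/doc ;;
            --init-dir)        echo /etc/init.d ;;
            *)
                echo "usage: sys-config --install-bin-dir | --install-lib-dir | --doc-dir | --init-dir" >&2
                exit 1
                ;;
        esac

    A makefile would then do something like cp myapp "$(/sbin/sys-config --install-bin-dir)" and never hard-code a distro-specific path.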
  • Gee, I dunno (Score:5, Informative)

    by OverflowingBitBucket ( 464177 ) on Wednesday April 20, 2005 @05:00PM (#12296964) Homepage Journal
    One gem from the LSB:

    Applications shall either be packaged in the RPM packaging format as defined in this specification, or supply an installer which is LSB conforming (for example, calls LSB commands and utilities). [2]

    [2]

    Supplying an RPM format package is encouraged because it makes systems easier to manage. A future version of the LSB may require RPM, or specify a way for an installer to update a package database.

    Which basically says: use RPM, or you might be forced to use RPM in the future to remain in compliance.

    There really shouldn't be a requirement to use a particular package management system in the spec, unless there happens to be a quality, proven, popular system to choose from. Unfortunately, there isn't, and rpm really doesn't fit the bill. I'm not going to get into a debate over the shortcomings of rpm (suffice to say that I packaged software using it and hated it with a passion) as my feelings on it aren't important to the point. My point is there are valid reasons why multiple distros are trying their own package management solutions rather than settling on rpm. Forcing a particular solution arbitrarily (and the selection of rpm is arbitrary) is not going to encourage adoption of a standard.

    Add to this a number of other valid concerns from a whole bunch of people (flick through the replies above for a ton of examples) and you may start to find reasons why LSB hasn't been more warmly received.
    • by Anonymous Coward
      Your quote from the LSB is about third-party applications, not the OS. Choosing a single package format for third parties to use when distributing binary packages makes a lot of sense, and the choice of RPM was a logical one because RPMs have been the most popular package format for Linux.

      All an LSB-compliant OS needs to do is provide a way to install these foreign, third-party application packages. Debian uses the "alien" tool for this, for instance.
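      For example, on Debian (with a made-up package file name, run as root):

          apt-get install alien                  # get the converter
          alien --to-deb someapp-1.0-1.i386.rpm  # convert to a .deb
          alien -i someapp-1.0-1.i386.rpm        # or convert and install in one go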

  • by tlambert ( 566799 ) on Wednesday April 20, 2005 @06:01PM (#12297513)
    I'm really disappointed with this discussion.

    There are a couple of posts that get at part of the answer to the question being asked, and none of them has been moderated higher than a 3 (and that one was somewhat off-topic).

    A few years back, I tried to do something similar to part of what the LSB attempts to do, and it was like pulling teeth to get anyone to even talk about it. The initiative was called FABIO, for "Free Application Binary Interface Objective". The intent was to get all the x86 Linux and BSD distributions to sync up with a single ABI, hopefully derived from a commercial ABI - the front-runner at the time, by far, was Solaris.

    Nobody would do it then, and nobody is doing it now, for the same reasons that FABIO was stillborn - and the LSB is significantly more far-reaching than FABIO ever was:

    1) Loss of editorial control

    This is a big one for some projects. What if the LSB suddenly includes a library with a license that Debian can't live with, for example? What if I'm building an enterprise version of Linux, and I don't *want* to include graphics drivers that are part of the LSB 3.x specification? This is much less about what to put where than about what to include or not include in a distribution, and about acceptable per-distribution licensing policies and practices. The LSB throws in the kitchen sink.

    2) Commoditization

    If everyone conforms to a standard, what differentiates one product from another? This was touched on in that other posting. So far, no one has used the phrase "UNIX Wars", so I will. The UNIX Wars were about product differentiation. The other posting suggested that this was a result of market forces toward stratification, where different products rise up to meet different sets of needs. This is incorrect. FABIO only intended to standardize the ABI - far less than the ambitious LSB. Further, it wanted to pick an existing commercial UNIX to standardize against, and finally, it wanted to define two levels of compliance. In the lowest level, you would be guaranteed that the standardized APIs were present. In the highest level, you were able to turn off all APIs which were not standard: a guarantee that you could write code without unwittingly using a vendor extension that would make the resulting binary non-portable. A mass exodus of developers to level 1 compliant platforms (to obtain the largest possible market) was expected... *if* FABIO made it. Neither the Linux nor the BSD camps bought into the idea: it would have rendered them commodities, differentiated only by philosophy and license. This is the same thing that drove the UNIX Wars: "I can't/won't compete against Microsoft, so I'll drive this other UNIX vendor out of business and take his market instead".

    3) It's too big to be meaningful in any real sense

    The LSB is too big to implement everything, and if you don't implement everything, you aren't LSB compliant. Face it, it's a superset of POSIX, and there's not one Linux yet that can claim full POSIX conformance for its system, let alone add in the other parts of the specification to get to LSB conformance. It's too damn big, and you can't turn off the things that are optional (you can barely do this with POSIX, using unistd.h, and if you do that with too many things, your system is useless anyway). There's no agreed-upon mandatory subset that lets you turn off the non-mandatory parts, not get them at all, and know that all other mandatory compliance is there. POSIX has this problem in spades; the unistd.h mechanism is really poor at letting you pick interfaces to *NOT* be there: you can't. You also can't know, without a lot of research, what things are mandatory for conformance with standards built on top of POSIX - this is left as an exercise for the developer, who can say "if this interface is there, use it", but can't go anywhere and ask "what interfaces can I safely use, always, as long as a platform is conformant with standard XXX?". The LSB does a worse job: it includes POSIX, and then adds things on top of
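    [An aside on the unistd.h point above: from the shell, getconf is the standard way to ask what a system claims to conform to, but the answers only say what is present; there is no way to request "only the mandatory interfaces", which is exactly the problem described.]

        getconf _POSIX_VERSION           # e.g. 200112 for POSIX.1-2001
        getconf POSIX2_VERSION           # the shell-and-utilities part

        # Per-option probes: -1 or "undefined" means the option is absent.
        getconf _POSIX_ASYNCHRONOUS_IO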
