Linux Software

The Linux Development Platform Specification : Beta

Daniel Quinlan writes: "The Free Standards Group is publishing a beta release of the Linux Development Platform Specification (LDPS) which tells third-party software providers how to best achieve binary-compatibility across different Linux distributions (well, at least until the Linux Standard Base is done). It's important to note that third-party software providers include not just the large vendors like Oracle and IBM, but also anyone who creates a binary package for use on more than one version of a single distribution."
  • Not at all.

    It's just recommendations. If you want to install your daemon in /etc, then go right ahead, be like 7th Edition. But what this is is a set of ideas on how best to do things, so that users/administrators know what will happen when they install your program and also so that it interoperates nicely with the system.

    -Dom
  • Now what we need is a list of distributions that deliver the minimums that the LDPS recommends, and/or what is necessary to bring any particular distribution up to snuff. Without such a list, we're making the game harder for the new Linux developer to play, which will not endear Linux to anyone.

    I'm relatively certain that RedHat 6.2 meets the minimum criteria, but what updates (if any) would be necessary for earlier RedHat releases? What about SuSE, Mandrake, Debian and everyone else's Favorite Distribution?


    Are you moderating this down because you disagree with it,
  • When Slackware conforms to the rest of the distributions, it will be added. They can't add it to the list if it's not compatible. Sure, you can hack stuff to get it running on Slackware, but it won't necessarily work out of the box. It being on the list or not isn't going to change anything.
  • For your Linux distribution to comply to the Ether Trogg Linux Standards Base, your distribution of Linux must contain the following:

    The Linux Kernel
    Those nifty-neato GNU tools
    A C compiler would be nice
    Hmm... that X Window System thingy
    Oh, yeah, and the source code for all the above.

    Follow this guideline, and you, too, can have a fully compliant Linux distribution!

    See, now that wasn't so hard, was it?

    Heh...

  • by zatz ( 37585 )

    Heh... the FHS is linked to from it. Doesn't look "proposed" anymore, I should check more often :)

  • Read the link from the story, why don't you:

    Example distributions that meet this specification
    • Caldera OpenLinux 2.4
    • Conectiva Linux 5.1
    • Linux-Mandrake 7.0
    • Red Hat Linux 6.2
    • SuSE Linux 6.4
    • TurboLinux 6.0
    • To be released: Debian GNU/Linux 2.2
    • To be released: Corel Linux OS Second Edition

    I don't think that's "making the game hard"... unless you are a Slackware or Debian user, in which case your file system is misorganized or your libraries are two versions behind where they should be, respectively :) (Not trying to start a distro flamewar, just offering some possible reasons those distros are not listed here. Redhat IMO is kinda bleeding-edge with the libs, and I hate them all for using SysVile init.)

  • I imagine the +1 bonus was from the original poster's high karma; it doesn't look like it's been moderated at all.

    -Karl
  • Well.. since this 'standard' is about binary compatibility, perhaps including a bunch of interpreted languages (and Java, which already has the binary compatibility thing covered) doesn't make much sense.
    Then again, you are a troll...


    -----
  • from the CodeWarrior system requirements:

    "RedHat Linux 5.2 or later"

    They also make a different one for SuSE.

    So now Metrowerks needs two products to work with only two distributions of Linux.

    There is nothing wrong with Linux standards. Standards are good; they make things easier. Nothing like trying to edit text files in 400 different formats and use 200 different graphical interfaces. Can you imagine the Internet without standards in how TCP/IP works?
  • I recently spent several hours trying to make the Mandrake version of Helix Gnome install properly on my nephew's computer, while on my Red Hat box I've had way fewer problems. Seeing how Helix can have problems with Mandrake (which is hard to tell apart from Red Hat), imagine how difficult it must be to make their installer work under all the various distributions. Personally, I'd love to see these efforts have to worry less about installation differences.
  • It is such a great idea to have a minimum set of libraries, kernel versions, and X versions as well as some development guidelines.

    As long as the distributions meet these minimum standards and the developers follow the guidelines, they will both meet somewhere in the middle and everyone will benefit from easier-to-install software.

    Keep in mind that these documents are the first of their kind and that there will need to be some negotiation and some give and take between developers, distribution makers and these standards bodies. Things may even get a little messy and people's feelings may get hurt, but that is just human nature.

    Good Luck Everyone!

  • My only gripe about this is that Slackware is treated as too deprecated a distribution in this text. Now, many people say that Slackware is behind the times, but as far as I am concerned there are many reasons to say Slackware is the best. What, is a little text editing of /etc/rc.d/rc.modules gonna kill anyone? Slackware makes everything clear and simple as pie. Down with confusing GUI setups! Add Slackware to the list dammit!!!
  • I wouldn't call it bigotry. I'd call it laying the foundation for the rest. If you want to support all of those languages, you still have to support C first, since the kernel interfaces, standard libraries (like X), and standard tools all need C library support.

  • Surely a source-compatible specification would be a lot better than a binary one. That way, along with the work being done by the Linux Standard Base [linuxbase.org] project, applications would be compatible with all Linux distributions and not just the x86 ones.

    Users of non-x86 distributions (of which I am one with LinuxPPC) are left out a great deal of the time by the big companies when they release Linux binaries.

    Although a "common binary format" exists in the form of Java bytecode, I cannot see world+dog porting all of their apps to Java. Personally, I'm now writing all of my new apps in Java so that they run on all the platforms that I have without a lot of tweaking for each of them.

  • This is the kind of thing that Linux needs to really be a factor

    I agree. All the little Linux companies are gonna band together now, in common opposition against M$.

    It's still gonna be a couple years before Linux crushes M$, and the bickering begins ...

    ... Suse-only "extensions" ...
    ...Redhat-only "enhancements" ...
    ... Corel "innovations"

    ;)

  • I suppose this is where the open source people would say it doesn't affect them. I think it affects all users of Linux. I have had problems with certain packages (including source distros) where I could not use (or compile) them unless I had a particular distribution of Linux. This solution, while being nothing more than a kludge, is at least some way forward towards the standard base. Go Redhat, or was it Debian, oh I forget.
  • haha! that paper is so famous, ac should not matter. Anyway, the paper describes how free software is the best way for hardware and consulting companies to make money. For example, if you give free oil changes, you make lots of money off of radiator fixes when you "accidentally" stick a screwdriver through the guy's radiator while changing the oil. Obviously free software doesn't give you an incentive to make the software run efficiently on hardware. If you are a consulting/services company, then there is no incentive to make the software easy to use, otherwise nobody would hire you to teach them... It is a very good paper...
  • I think that you are forgetting the main point of the LDPS. It is being created to give developers a standard set of library utilities to maximize portability. Non-Linux developers find themselves entering into a daunting and unknown world when they begin porting and developing on Linux, and this standard gives them a good idea of where to start.

    Another issue relating to standardization is that you don't have to follow it. If you do decide to use it, you know that you are going to achieve the maximum compatibility among many Linux systems. That is not going to stop the innovators from looking to new places and new resources to create cutting-edge products. Relating to your analogy, I surely doubt the first proponents of fuel-injection technology would have let a "standard" slow them down. Sure, they were sacrificing support amongst mechanics and the like, but who cares when you have something so much better that is likely to change the entire environment you are working in.


  • you are grasping at straws. nobody is going to attain "commercial success" until there are some standards in place.

    -=(V)0(V)0cr0(V)=-
  • >And then nobody will achieve commercial success without sticking to that standard.

    It's OSS, right? There isn't meant to be a commercial success involved...

    --
    Cause she's the cheese and Im the macaroni
  • Although I believe I see where your fears are coming from, there must be some level of standardization or writing for Linux-based systems becomes difficult if not futile. The real trick is to write standards that are as non-restrictive as possible while still getting the job done.
  • This is the kind of thing that Linux needs to really be a factor, I believe. If we can achieve binary compatibility, I think we can avoid the intimidation no doubt felt by companies who fear that they'll have to release xx million versions of their software...
  • Shouldn't the American National Standards Institute jump in and take a whack at Linux? Or do they refrain from messing with OSes? I can't tell, their website is a mess. See if you can unravel this mystery yourself! http://www.ansi.org Hell, I probably don't know what I'm talking about, but it would be much snazzier coding my ANSI C game with ANSI graphics on an ANSI OS platform (nuts - couldn't figure out any way to work ANSI SQL into that...)
  • by Anonymous Coward
    The majority of the superior development environments that you listed add a layer of abstraction on top of the native environment. As such, how consistent the kernel and C library feature sets remain is not as much of a concern. One of the few exceptions might be Java, where the JRE for a newer kernel might provide certain capabilities relating to multi-threading which do not perform as expected on the older common kernel. But if you write a 100% Python application, then you shouldn't have to worry about which kernel or version of glibc is being used by the user.
  • by Anonymous Coward
    Oh gosh. The official spec "approves" C and gives a nod to C++.

    Let's think about many superior development environments (in no particular order):

    • Eiffel
    • Ada
    • Modula3
    • Java
    • Smalltalk
    • Scheme
    • Sather
    • Ruby
    • Python
    • Perl
    • ML
    • (and many other worthy entries)

    Face it. The year is 2000. The new millennium is 5 months away and the official Linux development spec is 20 years behind the times. This spec needs to address the requirements of other modern development tools.

  • We don't need or want binary compatibility. First, having binary compatibility holds back adoption of new software. Suppose for example that the "standard" adopts gcc 2.95.2 and glibc 2.1.3 as base components for compatibility purposes. How many people, then, will use gcc 3.0 when it comes out? Isn't anyone annoyed that libc 5 still exists? When old technology is superseded, it should be replaced, not used forever in the name of compatibility. A secondary problem here is that not all platforms can use version X of tool Y. For example, glibc 2.2pre works fine on MIPS, but 2.1 is hopelessly broken. La de da, all the world's a peecee, ho hum.

    Even if that weren't a problem, binary compatibility, were it implemented across various distributions, would only encourage more closed-source development targeting Linux. This just gives would-be binary-only vendors a good excuse.

    Just say no to binary-only vendors' initiatives. And just say no to any distribution that goes along with it.

  • We've waited more than a year and this is what we get???

    I was on the LSB mailing list during its first few months of existence. The traffic was, like, a message a day, if that. Now a year later I can see what you can accomplish with a message a day. A minimal spec which doesn't really do anything except to specify the base versions for some common software? Well surprise, surprise - all major Linux distributions already meet this "Linux Development Platform Specification". Guess they just surveyed what's out there, came up with a least common denominator, and called it a standard. Bah.

    Now granted, I have no real right to complain as I could have been an active participant in the LSB and worked to make the result better, and I didn't. Still, I thought that the LSB was going to define real standards - standard APIs, standard ways for the linker to work, the filesystem to work, etc, etc ... eventually culminating in a standard desktop. Not one that couldn't be tweaked and reconfigured to one's heart's content, mind you, just a standard desktop so that users would have some chance of using Linux as easily as most can use Windows.

    And this is what we get instead? Bah.

    As always, I really blame RedHat et al., since they have the money to make real standards happen, but they can't seem to understand that the future of Linux will live or die by its standardization, or lack thereof.

    Remember, the opposite of standardization is fragmentation. Linux will fragment, and die. Thanks LSB! Thanks RedHat!

  • OK. I'm an idiot. I retract my complaints. Thank you for pointing that document out ...
  • (well, at least until the Linux Standard Base is done)

    At the risk of starting a flamefest over a now-obscure language, I have two words for the LSB: ANSI FORTH. Like many other standards, this one took so long to come to fruition and was so far from any existing implementation that it was largely irrelevant by the time it finally crawled out of committee because FORTH had been eclipsed by other languages.

    I'm not saying that Linux will be obsolete by the time the LSB produces results, but if the LSB waits much longer, it will be obsolete. In the meantime, most of the larger vendors have fixed on Red Hat as the de facto standard, for better or worse.

    The LSB is now about two years out of the gate. That's not an unreasonable space of time yet. But where will we stand at the three-year mark? Four years? C'mon, fnarking Mozilla will probably produce usable software before the LSB even has a partial standard at this rate.
  • by Parity ( 12797 )
    This spec talks about how to get -distribution independence- of a -binary- package. You're not going to have an architecture independent binary package (ignoring Java, etc, for the moment.)

    In any case, the recommendations in this specification look like they are compatible with a cross-platform application coding approach, though, naturally, there's more than just what's in the distribution-independence spec involved in cross-platform compatibility.

    In other words - I think you're criticizing this spec for not meeting a set of goals that have nothing to do with the purpose or intention of this spec.

    --Parity
  • Draft Text of the LSB [linuxbase.org]
    The document cited in this article is -not- the LSB, and the reason all those things are already in most distributions is because they analyzed the distributions to see what they had in common! The LDPS is saying 'if you distribute a binary, compile it with this, because that's what people have' and 'if you maintain a distribution, make sure you have at -least- this, because that's the basics'.
    --Parity
  • I can't resist...

    Did you really expect that a document designed to help people get their apps running on Linux would say...

    Yes, you may have written your application in C, and you may have heard that Linux is a good platform for applications written in C. But really, you should throw it away, and retrain all your programmers, and get all new development tools and fscking
    REWRITE YOUR APP in Sather or Ruby, just to show how FSCKING MODERN you are.

    Yeah, that would certainly convince Adobe to port Photoshop to Linux! Great idea!

    Torrey Hoffman (Azog)
  • Requiring a standard set of apps to be there is nice, but one thing that would greatly improve the clue level of people out there is to specifically point out the hardware platform/architecture. Many many times, a vendor will release a package for "Linux" when what they usually mean is "Linux/x86". I think any standards document related to building Linux apps should point out that not only might the C compiler be different (or the shell, or whatever), but pointers might be 64 bits, or other odd things like, oh, say, the machine language.

    They want to standardize software versions across platforms. Don't be fooled into thinking that just because you conform to the spec, your x86 RPM is going to be useful to people running Linux on Alphas, PPCs, s/390s, etc. ad nauseam.
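
    To make that concrete (a minimal sketch of my own, not anything taken from the spec), here is the classic x86-centric mistake: assuming a pointer fits in a plain int. That happens to work on 32-bit x86 and silently truncates on 64-bit machines like the Alpha.

        #include <stdio.h>
        #include <stdint.h>   /* uintptr_t: an integer type wide enough for a pointer */

        int main(void)
        {
            int x = 42;
            void *p = &x;

            /* Non-portable: on 64-bit platforms (Alpha, UltraSPARC, ...)
             * this would throw away the upper 32 bits of the address:
             *     int addr = (int) p;
             */

            /* Portable: uintptr_t is defined to hold any object pointer. */
            uintptr_t addr = (uintptr_t) p;

            printf("pointer-sized integer: %lu bytes, plain int: %lu bytes\n",
                   (unsigned long) sizeof addr, (unsigned long) sizeof(int));
            return 0;
        }

    Nothing distribution-specific about that; it's purely an architecture issue, which is exactly why a list of library versions can't save you from it.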

  • It's basically just a set of library dependencies.

    A distribution maker (redhat, mandrake, debian) can make a dummy RPM package 'standard-2000' that has dependencies on the named libraries. An app developer makes sure their RPM depends only on this package. Voila, everything works.

    The library numbering scheme means I can have multiple versions of libraries on my system.
    A distribution can move to glibc-2.2 when it comes out, and place glibc-2.1 on the CD to be available for 'standard-2000' if a package requires it. And later you can repeat the process adding a 'standard-2001', etc package as required.
    Then the CD box can say "Supports standards 2000,2001,.. " and the user knows that the third party app will work cleanly on this system.

  • Well, it didn't take long for that site to be cracked. As of 10:26 EST, www.geekflavor.com has a nice little ASCII penguin on it.
  • What's wrong with MIME? If I were to choose a UNIX standard as an example of something that wasn't broken, I'd have chosen MIME...
  • Does Slashdot read Slashdot? I guess no, because else they'd know that LSB = FREESTANDARDS.ORG (look here [slashdot.org])
    well... what the heck
    ps this is not a flamebait, name it offtopic or insightful or something ...
    _
    / /pyder.....
    \_\ sig under construction
  • Do you even know anything about Open Source?

    Standards are supposed to be hacked on by lots of people, and the more open they are the better. Microsoft fails to fulfill this requirement because most of the standards they push are designed to break interoperability so that they can get ahead.

    Remember: Open Source does not imply chaos. I wish there were a universal "standard" for binary packaging for Linux and other free OSes; if we had one, interoperability between current distros would probably increase tenfold. Now, the current binary packaging standards that exist are IMHO all Open Source, so yes, people can choose which one to use, but no, it's not trivial to move packages between different distros with the same kernel. I would think this is something you wouldn't want.

    But I digress: standardization of library versions and basic files (now, you did read the LDPS before posting, didn't you?) is very much needed, especially for third-party developers and brand new developers (such as myself). We can use this to make sure apps run on all distros that adhere to this, and given the list on the site, most do anyway. Also, the list of recommendations is basically what you would need to create a usable Linux system, so you aren't forced into being compatible with something you won't use.

    As Linux continues to grow, projects/organizations such as these will advance Open Source software, as Open Source development seems to revolve around Linux. It would be wise not to look for FUD where there is none.

    Marcus

  • A conforming Linux Development Platform must contain the following packages:

    * Linux kernel 2.2.x (x >= 14, use latest if possible)
      ftp://ftp.kernel.org/pub/linux/kernel/v2.2/

    And that is all very nice and sweet, but why exclude the Linux compatibility modes of SCO/BSD/Solaris? BSD/SCO/Solaris have all worked hard to provide compatibility with this "standard" http://www.telly.org/86open/index.html [telly.org]

    Why are these "Linux initiatives" (redhatisnotlinux.org is an example) exclusionary of GPL/anything not Linux? What is wrong with trying to support EVERYONE who is willing to provide 'linux elf' support?

  • Do we really want some organization telling third-party developers how to achieve binary compatibility?

    You are so right. That's why I hate TCP/IP too! We should all just be doing our own thing, work up all our own protocols, and use whatever we feel most comfortable with. Here we are, stuck with only 4 little octets for addressing. We shouldn't be so restricted by "the man" when deciding how we want to browse the web.

    While we're at it, why do we have only one mark up language? I want 5 or 6 to choose from at all times so I don't have to deal with things being dictated at me.

    Standards are just some way to have "the man" keep us down.

    [Sarcasm:Off]
  • No, this isn't interesting. It's just another knee-jerk response to a perfectly reasonable idea.

    Nobody's telling anybody to do anything or forcing you to do anything or enforcing any rules. What they are doing is starting to put some ideas down in some sort of standard so that those people who want to can conform to it and state this fact. A standard isn't a regulation, it's just a flag you can wave that says "we do stuff this way".

    the worst case, of course, is that the standard turns out to be a pile of shite, nobody ends up using it, a whole ton of time is wasted and we all go back to square one a little bit wiser.

    the main point here is that nobody's pointing a gun at your head. literally. this means you have the choice whether or not to even think about this. you could, for all the rest of us care, switch off your computer and climb the nearest mountain, meditate on how misguided your life has been up til now, fast by way of repentance and starve to death, lonely and unloved.

    the rest of us, however, will go on and enjoy a new era of cooperation between application developers on Linux. something that's been way too long in the making.

    it will be a good thing, or it won't be a thing at all. stop being such a bloody pessimist.

  • Pardon me for being a party-pooper, but I'm a bit worried about this situation. Do we really want some organization telling third-party developers how to achieve binary compatibility?

    Remember, one of the reasons we all switched to Linux from Windows or MacO$ is that Linux doesn't force people to conform to a certain standard -- the way that "Compatible with MacOS" or "MCSE Certified" stickers do. If we start setting a standard for binary compatibility, everyone will start basing their Linux configurations on that standard. And then nobody will achieve commercial success without sticking to that standard.

    I'm not trying to say that there's some "conspiracy" out there. I'm sure everyone has the OSS community's best interests at heart -- but the road to hell is lined with good plans. As soon as we introduce specific standards into the Linux community, even by suggestion, developers and retailers will end up adhering to them. No, they don't have to, but economics dictate that they will, or their products won't be able to compete with the "Linux Development Platform Approved" products. And then we end up locked into generic, formulaic garbage, just like Wintels and Macs are.

    This reminds me of that graphics card that lets players "cheat."

  • The point is not to more strongly entrench C or C++, it is to encourage the deployment of applications that won't merely Run Best on Red Hat Linux.

    If you want to use Sather-based applications on multiple distributions, then the tasks are twofold:

    • Make sure that there are, ubiquitously-available, Sather RPMs or DEBs or whatever that will run on the various distributions.
    • Make sure that the secondary libraries (stuff like a "libncurses-sather" or "libgtk-sather") have some well-thought-out location to live in, one that can be compatible across distributions, as well as suitable packaging.
    • Make sure that there's actually someone still doing development work on Sather. (Last I heard, the sponsors at Berkeley weren't working on it anymore.)

      Oops. That's three things.

    At any rate, the "killer apps" on Linux are largely written in C, and all this standard is doing is to recognize that. It is entertaining to note that the standard somewhat discourages the use of C++ in that the libraries still haven't "settled down."

    If you want to start deploying "killer apps" written in OCAML (and there are some interesting OCAML apps! [hex.net] ), the requirements will parallel what the "LDPS" provides, albeit by changing "C" to "OCAML."

    Feel free to write the OCAML Development Platform Specification. No one will stop you, and if it's good, and is worth doing some rework of packages for, this may be a very good thing.

  • by Azog ( 20907 ) on Tuesday July 25, 2000 @06:08AM (#908068) Homepage
    ...pure bigotry...

    (snort.) Chill out. It's just a recommendation for people who are trying to get their application to run on as many Linux'es as possible. Most of those applications are written in C and C++. If you had read the document, you would note that most of the major distributions are already compliant. If you want to write your app in some other language, how does this document change anything for you?

    This is a good thing. Now that it's out there, maybe people will use it instead of just releasing their app for Red Hat 6.2! That is the problem this document is addressing. Now, all the Pointy Haired Bosses who make the decisions will at least think about all the other Linuxes besides Red Hat.

    For the rest of you, who have actually read the document... I have only one comment. I'm a little surprised that they recommend only X 3.3.6, and don't mention X 4.0.x. Are there backwards-compatibility issues with the 4.0 series, or are they just not considered stable enough?


    Torrey Hoffman (Azog)
  • by Chalst ( 57653 ) on Monday July 24, 2000 @11:17PM (#908069) Homepage Journal
    In other words, the LDPS isn't intended to be a standard which
    tells distributions what to do. Rather, it's a recommendation to
    third-party developers about how they can create binaries that are
    likeliest to be portable.


    Not exactly totalitarian.

  • by zatz ( 37585 ) on Tuesday July 25, 2000 @12:35AM (#908070) Homepage

    I'm seeing a lot of comments that find this proposed standard threatening. I think you should look for evil elsewhere... try MIME or X11 if you want to gripe about imperfect standards that everyone has to deal with. This one just looks like common sense.

    From the phrase "binary-compatibility" I figured there wouldn't be much meat to this... and reading the beta, it seems to mostly be about which library versions to link against, and maybe which shell to expect. All quite harmless; nothing prevents you from having other versions of libc or bash lying around. And the idea of having a convenient checklist instead of actually testing your program on different distributions is nice, although I dunno if I would trust it myself.

    I think the file system layout standards (standards efforts, last I checked) are more interesting, and (slightly) more likely to actually cause someone a problem... assuming that ttys can all be found directly inside /dev is more likely to *need* to change than anything addressed here. This effort seems aimed at making it less likely that novice users will have to play games with ldconfig to get some random package to run.

    I have a linux box somewhere that still has libc4 programs running on it... only problem I've noticed in five years was due to a kernel change, and it was just poor coding in the app that kept it from handling some unexpected return values from sendto() gracefully. So from my point of view, things are fine anyway, and this will just provide guidelines for those that like official guidelines. The spec itself lists a bunch of popular distributions that meet it already, so it's not as if this was designed in secret and is now being forced on people... it's no more aggressive than documenting library calls in a man page, and offering developers some guarantee that said behavior is stable.
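
    For what it's worth, here is roughly the kind of defensive handling that app was missing (a sketch of my own, not the actual code): sendto() may return -1 with errno set to something recoverable like EINTR, and robust code retries or reports the error instead of assuming success.

        #include <errno.h>
        #include <sys/types.h>
        #include <sys/socket.h>

        /* Send one datagram, retrying if a signal interrupts the call.
         * Returns the number of bytes sent, or -1 (with errno set) on error. */
        ssize_t send_datagram(int sock, const void *buf, size_t len,
                              const struct sockaddr *dest, socklen_t destlen)
        {
            for (;;) {
                ssize_t n = sendto(sock, buf, len, 0, dest, destlen);
                if (n >= 0)
                    return n;      /* success */
                if (errno == EINTR)
                    continue;      /* interrupted by a signal: just retry */
                return -1;         /* real failure: let the caller decide */
            }
        }

    That kind of checking is the application's job no matter which kernel or libc the distribution ships.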

  • by whm ( 67844 ) on Tuesday July 25, 2000 @02:42AM (#908071)
    Something this spec completely fails to address is the problem of architecture independence. Many applications these days are written in an extremely x86-centric manner. The authors typically have nothing against their product being recompiled on other architectures (like, say, the Alpha), but their coding methods prevent it.

    I don't think authors need to test their product on every architecture out there, but if they would pay attention to certain good programming guidelines, it would allow their product to be ported much easier.
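
    One concrete example of such a guideline (my own illustration, not a quote from the spec): read multi-byte values out of a buffer byte by byte rather than casting the buffer pointer, so the code doesn't depend on the host's byte order or alignment rules.

        #include <stdint.h>

        /* Read a little-endian 32-bit value from a byte buffer.
         * Behaves identically on x86, Alpha, PowerPC, s/390, and so on. */
        uint32_t read_le32(const unsigned char *p)
        {
            /* The x86-centric shortcut would be:
             *     return *(const uint32_t *) p;
             * which gives the wrong answer on big-endian machines and can
             * fault outright on strict-alignment architectures. */
            return  (uint32_t) p[0]
                  | ((uint32_t) p[1] << 8)
                  | ((uint32_t) p[2] << 16)
                  | ((uint32_t) p[3] << 24);
        }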

    ~whm
  • by Frymaster ( 171343 ) on Monday July 24, 2000 @11:22PM (#908072) Homepage Journal
    I admit that I love playing with my SE/30, but if I have to hear the word "backward compatible" (okay, two words) once more...

    The real issue that we have to be concerned with is "forwards compatible". If we establish a written-in-stone set of rules for universal compatibility, advances by one or more distros that violate those rules may be squelched.

    Permit me one more damn car metaphor. Imagine in 1970 that the government, in order to ensure that garages and mechanics were capable of working on all models of vehicles, set down exacting standards for all car manufacturers on the design and implementation of carburetors. The results, in the short term, are good: parts are cheap and interchangeable, mechanical knowledge is more widely applicable.

    But then who would have dared introduce fuel injection?
