Linux Software

The Linux I18N And Standard Base Merge

Leo Comitale writes "According to this press release the Linux Standard Base and the Linux Internationalization (I18N) project have merged and are calling themselves the Free Standards Group. I think it is really important for Linux to have a basic, low level standard for file system layout and support for international languages. These areas are critical to keeping Linux from splintering into a bunch of incompatible variants, and it seems these efforts are not getting as much support as they probably should be."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • So much software is USA-centric that even English-speaking Brits and Aussies feel left out. Imagine if your name is Müller and your software won't even let you spell your name correctly.

  • Shouldn't Linux and gang be involved heavily in this? You would think that the people with the most in-depth knowledge of the 'low-level' workings of the OS would know what is best. If Linux says "this filesystem is best," I would tend to believe him.
  • C'est in cochon. Du bist ein schwien ich war dri seconde auf ien erste post??

    BG ;-) et alles.

  • err...that would be 'Linus' not 'Linux.' I know I know, my fault for not using 'preview.'
  • From what I've seen, Linux seems to be pretty standardized already. Sure, most distributions have their own tastes and flavours, such as Slackware trying to stay on the traditional/BSD side of things. But altogether they seem to try to adhere to filesystem standards etc.

    Except one: RedHat (and thus Mandrake, etc). Kind of careless of me to drop such flamebait-rich material, but I cannot escape the impression that RedHat does seem to have its own mind, sometimes for the better and sometimes for the worse.

    My biggest fear is that one or two of the big distributions will not team up. It's pretty much an all-or-nothing issue. If even one decides not to adhere to the standards, things are pretty messed up, for they would not really be standards anymore.

    Then again, I've also learned that as long as you stick to *one* package tool you won't run into trouble. I prefer source; someone else prefers [rpm|deb|tgz|foo]. As long as we keep things a little organized ourselves, we'll be all fine.

    I would not hire any administrator who could not overcome the differences that exist. And those same differences should not be a big problem for the non-power user either, since they are most likely to stick to one distribution anyway. Even if they didn't... if you can make the transition from another OS to Linux, surely you can also learn to make the transition (either in thought or actual OS transfer) between distributions.

    That, or I have a way too bright view of the future and computers (and their users).

  • While I like the idea and the initials [FSG, next item after FSF in the Jargon File, perhaps? ;-)] I wonder if the new group will actually be able to do what they say they are trying to, which is to accelerate the use and acceptance of open source technologies through the application, development and promotion of interoperability standards for open source development.

    I mean, sure, they've got a bucket full of endorsements from some of the big players, but since when does adding members to a group speed up the decision-making process? I mean, each of the major Linuxes has a financial interest in its own success, even if it's at the expense of the others, right?

    That said, I supported the idea of the LSB, and think this is a good thing, but is this any closer to something like a W3C decision-making body than its two predecessors?
    I also wonder how the FSG's members propose to deal with applications for non-Intel processors. If software is written for an x86 Linux, and is "FSG compliant", does that mean it should be recompilable for Linux PPC, etc.? My other (probably dumb) question is that if I write something that works on a distribution that is supposed to be "FSG compliant", but don't have the various distributions to test it on, how am I supposed to be able to get my software tested and certified?

  • First of all, I thought that the LSB [linuxbase.org] was already managing a standard on filesystem layout.

    Second, I don't see it particularly important to internationalize this layout. What kind of ugly precedent would this set? If I wrote a program for a German-language compiler, would my code have to read:

    wenn (foo != foobar) {
        schreib ("foo und foobar sein anders.\n");
    }

    As it is, UN*X is pretty far removed from English, anyway. Don't mess with /etc...

    --

  • by Gurlia ( 110988 ) on Monday May 08, 2000 @08:38AM (#1084447)

    I'm all for a standard that allows applications to be written that can work seamlessly across Linux distributions. BUT. I hope this does not degenerate into uniformity. Let me explain.

    When Oracle first decided to release version 8 for Linux, I was very interested and downloaded it. However, I found out that apparently it was only targeted at RedHat, and I, a Debian user, couldn't get it to work properly. Libraries were missing, and the installer insisted upon a particular path to the JDK, which was extremely annoying. There wasn't even an option for me to specify an alternate search path or anything. It was hard-coded, and so were certain libraries that had Debian equivalents but were in a different location.

    What I'm trying to get at is, although a unified standard for where stuff should go in Linux is good, I fear that this may encourage developers to become lazy and simply assume a certain configuration and give no option for reconfiguration. For example, assuming that JDK must be installed in /usr/local/jdk/ or some such. I mean, I don't have a problem if they used /usr/local/jdk/ as a default, but if they don't allow you to re-configure where you want your stuff to be, that's very bad.

    Now, I realize that the initial download of Oracle I had was probably a hurried hack, so it's probably not their fault (that was probably the first thing they ever released for Linux IIRC -- not surprisingly they went for the commercially better known RedHat). But it underlines a problem with a lot of application developers: hard-coding platform-specific assumptions and leaving no room for reconfiguration.

    I know that it's easier (and arguably, results in more efficient products) to code for a specific platform, perhaps for the future Linux Standard. However, not allowing any room for reconfiguration is very annoying, and limits the scope of the application. An app that can be configured (with various amounts of effort) to run on, say, any UNIX system, would be more successful than one that just had to run on Linux because it was hard-coded with some directory paths or some obscure libraries that only existed on Linux.

    Programs should be portable, more or less (ie. it should come with reasonable defaults that can run as-is on the specific system it was designed for, but it should not require massive recompilation or source-level hacking to get it to work on a slightly differently-configured system). A standard is meant to serve as a reasonable default, and not as an excuse for lazy, non-portable programming habits.


    ---
  • I'm trying to remember - the LSB are the "Good" guys that want everything open... and the LSA were the "Bad" guys that wanted to control everything and have everyone get licenses from them - right/wrong?
  • I think you "yanks"* are missing the point.

    The single most important piece of internationalisation is being able to display the characters on the screen. Just taking the EU countries in Europe, there are at least 7 accents you can put on the letter A -- ä à â ã is all I can manage on my current keyboard.

    The other important piece is having a decent spell checker for your chosen language.

    * yanks --> derogative for citizen of USA.

  • primo alberino
    primeiro borne
    primer poste

    but would that fall under /dev/firstpost or /dev/first-post?

  • by Anonymous Coward
    I think you misunderstand the way LSB is currently going. It's more or less standardizing on the Red Hat way of doing things. Red Hat doesn't have to bend to the LSB's will, because they and their clones pretty much run that dog and pony show.
  • Why do we need this? We've already got Red Hat to define the standard. Take a look at all the software that only runs on Red Hat Linux. Geez, why don't you just give up running incompatible distributions, and use the one true distro? I don't understand why people can't just follow the group. No, they have to do their own thing. These people are ruining Linux for the rest of us!
    --
  • Well, seeing as Linus Torvalds and Alan Cox are Finnish and Welsh respectively... of course there are no foreign programmers :)
  • by Matt2000 ( 29624 ) on Monday May 08, 2000 @09:03AM (#1084454) Homepage
    There's a lot of debate for and against standards in the Linux community and in the open source community in general.

    Arguments against: Stifles innovation.
    Arguments for: Prevents fragmentation.

    My take is that certain administrative OS things should be standardized just to make our lives easier. I mean who really cares whether files go in /usr or /local, just pick one! Let's leave the room for innovation in things that really matter and make sure that simple things like deploying applications are as easy as possible.

    Hotnutz.com [hotnutz.com] - Funny
  • One of the problems with the existing standards projects is that they not only define what the layout is, but they also define the package manager and the init-scripts format... those being rpm and Sys V, respectively. I personally prefer the non-SysV version of the init files that Slackware uses... but I am somewhat appalled by possibly being forced to use rpm.

    ttyl
    Farrell
  • by dsplat ( 73054 ) on Monday May 08, 2000 @09:05AM (#1084456)
    Flame on, Pike. Some of us are actually doing something about it. I hardly think that people should use a different OS or a different compiler simply because they have different native languages. That doesn't make any sense. However, with free software, it doesn't make any sense to whine about not having translations into your own language. If you want them, do them yourself. I'm doing exactly that.

    However, there are a few pieces of underlying support necessary:

    1. The underlying software must actually read translated messages from somewhere (the GNU gettext mechanism works pretty well).
    2. Character sets and fonts that support your alphabet must be supported.
    3. The messages to be translated must be made available, and preferably, the translations should be rolled back into the main distribution of the program.


    As for native speakers of American English (of which I am one), even if you are monolingual, there is a decent chance that you would like to have customers in other countries with names constructed out of funny characters. Having the software you run handle their names correctly doesn't hurt. Frankly, I'm not at a disadvantage if all the menus, error messages and help files are in English. But I need to be able to enter data that contains foreign characters. Internationalization benefits more people than you realize.
  • Do these two groups really need to merge to make decent internationalization a "fundamental" part of Linux? What is gained by a merger, rather than having the Standard Base folks read the I18N folks papers and incorporate them? I'm not implying that either group is unimportant, but merely that I don't see a significant change offered by an actual merger.

    An announcement that there was a draft standard for implementing multi-language or unicode support in Linux distributions would be important. This isn't.
  • I think you are missing the point.

    I have no problem with the full recognition of the letters of all character-based languages. You should be able to pick and choose whatever filename you want, and considering the fact that my Linux machine has a file named "*.*" at the root (full filename: "/*.*"), I think that's already covered. The "accent" problem you describe only goes as far as your keyboard, and is pitiful in comparison to Asian computer users that use symbol based print.

    As far as reorganizing the filesystem goes, NO! Use symlinks if you don't like the way things are named. Personally, I think /etc/rc.d is a stupid place to put startup scripts, but I deal with it, so that other people can use my system.

    I also disagree that this is an "American"-only issue. The basis for most UN*X directory names is Latin anyway...

    --

  • by dsplat ( 73054 ) on Monday May 08, 2000 @09:12AM (#1084459)
    The Free Translation Project [umontreal.ca] has been handling the internationalization and localization of free software (primarily, but not exclusively, GNU software) for quite some time. If you are interested in helping internationalize a program, or in participating in a translation team, it is a good place to start.
  • Of course the intentions of this project were good, but watch out: Linux must not and shall not be dominated by one company. But even if it isn't, too much uniformity can be harmful to an OS as modular as Linux. Considering that Red Hat is seen as the only Linux variant by many, I support counter-projects such as Red Hat is not Linux. Uniformity is good, but it must not lead to a decrease in modularity.
  • In a way, I would have to agree with you. I am a little nervous myself about Linux getting too "Vendor" specific. I don't want to see another Unix scenario.
  • Read the goofy things that have come up so far... The FSSTABLEFS or whatever is an exercise in pointlessness.

    Section 1: Philosophy [50 pages of authors blar blar patting on back blah blah blah]

    Section 2: We think important binaries should go in /usr/sbin. Don't you DARE make a top-level-directory that isn't on this list. [XXX: Are symlinks a good idea?]

    Section 3: [Unfinished, awaiting subcommittee 32.623.GNU/SMla,emGLS:11/2/3. Report 325.423]

    The rest of it is equally boring.

    Then onto the LSB... What is the POINT of making a reference distro? I like Linux because I can update anything with impunity... They're trying to create a reference platform with glibc-2.???? and blah.224.323 and blah 3223.23.so, and absolutely no one can come up with an idea as to how far the reach should extend... Determine default window ordering behavior? Startup sounds?

    Once a reference distro or whatever goes out that is stamped STANDARDS COMPLIANT LINUX, the value-add of distros becomes much less. Old crap will need to be included... And we will be STUCK with immature libraries and all sorts of backwards-compatible cruft and a BORING UNIX PLATFORM FOR THE NEXT 10 YEARS...

    All I truly want is a nearly universal dependency/packaging/installation/uninstallation system. rpm comes close, but badly formed packages are an issue. Shit still gets left around... rpms refuse to integrate with tarballs and vice versa... blah blah blah... Without a common capability database or SOMETHING, it's a bitch to integrate packages of any type and configmakemakeins shit...

    And I also want a COMPLETE rethinking of autoconf and automake and m4 and configure and all that shit. Most of my linux shit won't compile on other shit, and that's just damn dumb considering they all are so much alike. And it's so goddamn wordy and verbose and fragile and fucking flimsy.. Stupid fucking make syntax.... try a make -d on ANYTHING sometime and be astonished..

    Shouldn't you people be groveling or something? And BRING ME SOME SHOES! Nice ones.

    FART!
  • RMS has traditionally been opposed to the LSB because they are not called the "GNU/Linux Standard Base". Now they are the "Free Standards Group", I wonder if he can now bring himself to endorse them?

    Admittedly, their press release was full of "Linux" with no "GNU", but that's a press release and they're traditionally full of s**t anyway :)

    Stuart.
  • If I'm not mistaken, Linus T. is from Finland.
  • I don't understand it sometimes. We (the open source movement) want to have our software (Linux) take on the superpowers of the software world (Microsquat), yet we have a hard time agreeing on whether or not there should be standards. It's scary, and I hope this is a move in the right direction.
  • Shouldn't Linus and gang be involved heavily in this?

    I corrected your post for you :)

    Actually, Linus (and Linux in general) should have nothing to do with these kinds of standards. They are usually distribution specific; all distributions already use the same GNU/Linux kernel, so therefore all distributions are already adhering to the (only) kernel standard.

    Things like file locations drive me insane when moving from one distribution to the other, especially the /etc/rc* crap. I've mastered Debian's only to find it (slightly) different on other distros.

    p|ng steps off the soapbox.

  • Does anyone know if this includes support for BiDirectional languages? (Arabic and Hebrew).

    For languages based on the Latin character set (or even a non-Latin set like Cyrillic, Kanji, etc.) it is a piece of cake to "localize" an application. Just make sure everything is in a message file for the appropriate language, for the labels, menus, errors, etc., and you are home free.

    However, for Arabic (and Hebrew) the challenge is totally different. Display goes from right to left, and characters have different shapes depending on their position in the word and what other characters follow/precede them.

    I am using Windows and Internet Explorer because there is no Linux product that supports all this in full.

    There are projects to Arabize Mozilla, and the new KDE/Konqueror 2 has Arabic support. This is all very encouraging, but nothing is in production yet. Also, we need Arabic spreadsheets, word processing, calendar and scheduling PIMs, etc.

  • The other day I stumbled across a program in the console-tools package called unicode_start (and unicode_stop). Haven't had time to play... do these do anything useful? What unicode support is there in the console driver?
  • I see a lot of people bursting arteries because we Americans actually write software in American English

    No, no one is flaming (or should be flaming) people writing software in their own language. I don't know where you got that from. If you're talking about closed-source apps, I might agree that people might complain about English being the only choice. But with open source - no. The standard "do it yourself" often applies, interpreted as "translate it yourself". No need to rewrite the entire app, if the app was made cleverly. gettext [umontreal.ca] will parse many C programs just fine.

    No one expects you to translate your software into 11 zillion different languages. What you might do, however, is to make it easier for translators. This may be such things as not hard-coding US-ASCII everywhere. This may sound simple, but I've seen many programs not accept filenames with non-US-ASCII characters, or where such characters simply break the app.
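
    As a quick illustration of the filename point (a sketch only -- the filename and contents are invented for the example), a program that treats the name as opaque data has no problem with non-US-ASCII characters:

```python
# Sketch: a filename containing non-US-ASCII characters is just data.
# Nothing below should care what the bytes "mean" -- the app neither
# validates nor mangles them, so the name round-trips intact.
import os
import tempfile

d = tempfile.mkdtemp()
path = os.path.join(d, "Müller-notes.txt")  # hypothetical filename

with open(path, "w", encoding="utf-8") as f:
    f.write("ä à â ã\n")

with open(path, encoding="utf-8") as f:
    print(f.read(), end="")
```

    Programs break here when they filter input through an ASCII-only check, or store names in a fixed 7-bit encoding; passing the string through untouched is usually all that is required.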

    It might also mean writing the strings in your app so that they are easily understandable even out of their context. This helps translators a lot. Avoid TLAs when you can and write easily understandable sentences.

    Also try to avoid assuming that everyone else would like the same localization as you. Don't hard-code settings like these in your application, for example:

    • AM/PM clock
    • Legal paper format
    • Weeks begin on Sunday
    • Date formats and date strings
    • Inches
    I could go on and on, but you get the idea. These are the things that can make people "burst their arteries" if hard-coded.
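
    A minimal sketch of the alternative (using Python's stdlib locale and time modules, which wrap the same C-library facilities; it is pinned to the "C" locale only so the output is reproducible -- a real program would pass "" to pick up the user's environment):

```python
# Sketch: ask the locale for its conventions instead of hard-coding them.
import locale
import time

# Real code would use locale.setlocale(locale.LC_ALL, "") to honor the
# user's environment; "C" is used here only for a reproducible example.
locale.setlocale(locale.LC_ALL, "C")

# %x / %X are the locale's own date and time formats -- no hard-coded
# MM/DD/YYYY or AM/PM assumptions baked into the program.
stamp = time.strftime("%x %X", time.gmtime(0))
print(stamp)  # -> 01/01/70 00:00:00 in the C locale

# Number formatting conventions come from the locale too:
print(locale.localeconv()["decimal_point"])  # -> "." in C, "," in de_DE
```

    The point is that date order, clock style, and decimal separators are looked up at runtime, so a German or Swedish user gets their conventions without the program changing at all.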

    As for American programmers writing in English: Don't assume that most programmers writing applications in English are American. If you look at the contributor lists of many free software projects (like the GNOME [gnome.org] and KDE [kde.org] ones) you'll see that a lot of them are not from the US, maybe even the majority. English just happens to be the default language that applications are written in, and then translated into as many other languages as possible.

    Disclaimer: I am a Translation Project [umontreal.ca] translator, translating GNU software.

  • irstfay postay? Deine Deutsch ist furchtbar. [Pig Latin for "first post?"; the German: "Your German is terrible."]

    Speaking of spell checkers in your previous post, you apparently need one for English and German =).

    No hard feelings, just a good ribbing...

    --

  • Character sets and fonts that support your alphabet must be supported.

    Good point. For example, what if the Unicode [unicode.org] character space doesn't include a particular alphabet [google.com], the alphabet your language [google.com] uses? All the i18n and l10n in the world won't save you in that case.

  • So why incorporate?

    Up until this point, the LSB [linuxbase.org] and Li18nux [li18nux.org] were operating as unincorporated organizations, which is bad for a number of reasons: legal liability, the inability to accept and distribute funding for development and other expenses, no entity to hold copyrights for the group, anti-trust issues (you need to be careful when you have competitors meeting in the same room), and more. We needed to incorporate (as a non-profit, of course).

    As far as the Li18nux and the LSB are concerned, they will more or less continue as before, although we'll be able to put more resources on each project so things will speed up. We'll be working closer together and referring to each other's specification, but the LSB and Li18nux specifications will probably be separate standards for some time.

    Why incorporate together? It makes sense and it's less overhead. We didn't need separate legal entities for these open-source standardization efforts.

    Some LSB specifics:

    Will the LSB be multi-architecture? Yes, although x86 is the main target, we are trying to draft the specification to apply for multiple architectures. Recompile the sample implementation and test suite and everything should work fine for other architectures. (The reality is that most third party software is released for Linux on x86.)

    Another thing: the whole "LSB stifles development" argument is very misleading. You can ship development libraries along with stable LSB versions if you want both environments. (It will be up to the distribution and system administrators.) Kernel developers like Alan Cox, Ted Ts'o, and H. Peter Anvin have been participating in the LSB for a long time - I don't think that would happen if we were going to stifle forward progress.

    Will having more members slow us down? Quite the opposite, actually. The main thing slowing us down is the amount of work to be done, not slow decision-making or the lack of consensus.

    Finally, recall that the word "base" is part of the Linux Standard Base name. Distributions will still have the same amount of room to add value, innovate, and distinguish themselves. We like the fact that there are different Linux distributions, each with something unique to offer. We just disagree about requiring commercial and non-commercial providers of software to port and test their software for five or ten different Linux distributions.

  • Have you been to the Unicode site [unicode.org] lately? But there is one problem: there are more distinct characters in this world's writing systems than there are 16-bit integers; some scripts [google.com] will never be included in the codespace.
  • Given that having many distributions is *GOOD*, and that it would be awful if we should end up having only one, these distributions have to be different from one another (otherwise why would they exist?). And being different means putting files in different places, giving directories different names, having different startup sequences, and so on. This great pursuit of file organization standards seems to underlie the desire for a single distribution, which is wrong and should be resisted. Variety is richness; uniformity is poverty.
  • I disagree that standardization stifles innovation. Rather, standardization frees the developer from having to worry about where to find libraries in the system, and where to install his programs by default. I'm not saying you should hardcode all your programs to a certain structure--rather, make the standards the default, and let autoconf handle the rest. It's not really different from what it is now--except RedHat started a slightly (in some cases very) annoying habit of dumping everything under /usr. /me really dislikes non-relocatable packages. :(

    Anyhow, the way I see it, we do need a firm low-level standard. All vendors should have that universal standard to shoot for, to be "compliant". Vendors and users should be guaranteed that in a "compliant" distribution, certain libraries will be found in whatever directory, and that no matter which package manager is used, the program will be able to find the resources it needs (be it libraries, or locale), and also that a particular package will install its libraries and resources in the standard location (ok, deep breath--what a mouthful!!).

    I do not think that this is stifling at all. What it does is provides a standard framework against which programs can be designed and implemented. In my mind, I equate what needs to be done with the LSB (nka FSG), with ANSI C. Most C compilers I've seen lay claim to, and do a good job with, ANSI C source code. They are not prevented one whit from adding extensions and "improvements". But there is a definition out for the C language, and in order to be ANSI-compliant, a compiler must at least meet that definition's specs.

    The same should be true of Linux distributions. There should be a base standard against which they are measured. Then, when a vendor releases a package, they can say, "This will work on all FSG-compliant distributions. Minimum requirements are yada version 1 and yadayada version 2". In other words, provide a single target for compliance for vendors to look to, and for distributions to look to. Where the twain shall meet, there will be happy users (unless you are using a devel kernel, of course--then you have *crazy* users--oh, wait, I use devel...).

    It's a good deal all around. "Buy my C compiler (distribution)!! It's guaranteed ANSI/C (FSG) compliant!!" Good for vendors/developers, good for distributions, and most definitely good for users.

    Note that I don't think that versions should be included in any standard--only a hierarchy of where to find resources. As for other standards, like what libraries are included on a compliant distribution, etc--I think that's a long and hard road to travel. Might be better to have the vendor specify what's needed, preferably in the docs and install process, than to try and define what libraries make up a compliant system. It's probably something that we're going to have to look at sooner or later, though.

  • It's really funny. I see a lot of people bursting arteries because we Americans actually write software in American English and not Esperanto or Norwegian. Aren't there any software programmers or companies in your countries? Write your own software if it's such a bad problem not being allowed to spell your name with an umlaut or a thorn or an edh.

    This is so common it's almost funny. And very American... "hey, they're complaining about us using our own language!"

    If you just took a look at the problem, even you could understand. My language has three additional letters: å ä ö. Most Asian languages use a whole different character set. How funny would you think it was if you couldn't make your application accept your name, or type a filename in your own language?

    It's not about the language you code in, it's about the lower-level language possibilities of the system.

    I support the internationalization stuff, but whoo boy, what's with all this: "those stupid Americans write software in American English! What, do they think theirs is the world's only language? When will they wake up and start writing software in Spanish and Portuguese?!?"

    Really, please, show me one example of where someone said anything similar to that... I've surely never seen anything remotely like it.
    It just makes you look stupid.

  • This same discussion has been going on for as long as I can remember (which is not that long, but a good few years anyway).

    'We need a standard to stop Linux fragmenting and becoming incompatible', etc etc. But has it happened so far? Hardly. Can you name an application which runs on one distribution but not another (for reasons other than corporate stubbornness)?

    De facto standards seem to have served Linux well so far, why bother getting all formal and churning this sort of thing through endless committees?
  • s/Redhat/Slackware/

    Though I think it is much more a case of being stubborn than not playing nice...

    I'm curious though: if Redhat doesn't "play nice", what other distribution does? Certainly not the other RPM-based ones, right? And you certainly don't mean Slack, right?

    So, that leaves us with Debian and supposedly Storm. Fair enough; I think Debian is actually being used as the LSB reference...
  • by jilles ( 20976 ) on Monday May 08, 2000 @11:17AM (#1084479) Homepage
    "Arguments against: Stifles innovation"

    This is a myth, in fact it promotes innovation because after a good standardization process there is no need to reinvent the wheel poorly. Essential is that the standardization process is done properly (i.e. it should be done by people who know what they are doing and who represent all major stakeholders) based on existing practice.

    "Arguments for: Prevents fragmentation."

    And makes it easier to switch distribution and makes it easier and more worthwhile to learn the details of the system since they are less likely to change. It makes it easier to write distribution independent applications (or am I the only one who thinks it sucks that many applications come in 4 or even more distribution specific packages).

    I think that standardization is good. Delaying it makes it unnecessarily hard to create applications for Linux. If the linux platform has to fragment then let it at least fragment for a good reason and not because vendor X stores programs in path A while vendor Y insists on using a different path.

  • I think it's /dev/null...
  • IINM, the Linux console already does Unicode (more or less). The Unicode character set has 65536 characters in its 16-bit form, whereas VGA text mode can only have a 512-character charset, so there is some mapping that goes on (hence the Linux charmaps).
  • "The majority of programmers in the USA speak American-English. What fucking language do you think they should be programming in?"

    Let's assume for a second that not all programmers are in the USA... Linus himself didn't start out there.

    "Oh yeah, I forgot. Its totally in fashion to hate anyone from the US and dump all kinds of lame stereotypes on them while pretending to be "open minded.""

    How can you manage to bash everyone outside of the USA, and in the same comment say that everyone else is not being open-minded? I can see why you posted as an Anonymous Coward...

  • I don't think you know what you're talking about. This has nothing to do with the kernel whatsoever, unless you consider the 0.3ms gain from accessing /usr as opposed to /usr/local to be a criterion for filesystem layout standards.
  • The reason that I'm not that supportive of these attempts to standardize the unix file system is because they aren't fresh approaches based on practical experience.

    The standard appears to include every single nuance that the Unix system has ever had. Even those that are contradictory. I read half-way through it and was no better off than if I hadn't. It's like looking at a street map of a town. There isn't any logic to it; just a lot of dots to memorize.

    What I would have preferred is a standard that is forward thinking. Start out explaining the contradictory goals of file management (multiple architectures, read-only partitions, local partitions), and resolve that they can't all be solved at once. Pick an "axis" (I'd prefer "logical" organization over speed or read-only concerns). Then, try to design a layout that "makes sense" for "general purpose computing and development."

    What I got was a document of how things are done and "why" but you have to memorize many, many "why's."

    P.S. Of course, if I had my way, we'd banish generic "bin" directories and have one software product per one directory with its own "bin." (The PATH variable would have to be re-thought.) The exception is common libraries for all software. But, then again, as those stand, they are over-burdened with libraries most people never use anyway. See? Much to discuss.
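    One way that PATH re-think could go is to generate the search path from the per-product directories instead of maintaining it by hand. A rough sketch, assuming a made-up /tmp/optdemo/<product>/bin layout (all directory names here are invented for illustration, not part of any standard):

```shell
# Hypothetical layout: one product per directory, each with its own bin/.
# (These paths are invented for the demo.)
rm -rf /tmp/optdemo
mkdir -p /tmp/optdemo/d1x/bin /tmp/optdemo/netscape/bin

# Rebuild the search path from whatever product directories exist,
# instead of hand-maintaining one giant /usr/bin.
NEWPATH=""
for dir in /tmp/optdemo/*/bin; do
    NEWPATH="${NEWPATH:+$NEWPATH:}$dir"
done
echo "$NEWPATH"
```

    A login script could do the same over a real /opt, so installing a product into its own directory would make its binaries reachable without dumping anything into a shared bin directory.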

  • No offence, but you must be pretty sad if you consider "typing in your name" to be equivalent to "programming in your favourite language". Or, wait a second, could you be talking out of your ass without actually reading the original post?
  • >The newly formed Free Standards Group was organized to accelerate
    >the use and acceptance of open source technologies through the
    >application, development and promotion of interoperability standards
    >for open source development.

    Ok, so where are the calls to make sure this "Linux Standard" will be including Linux compatibility options?

    >any LSB-compliant application will run successfully on any LSB-compliant Linux distributions.

    Again, what about vendors that use a Sys V or BSD kernel and wish to support Linux via a Linux compatibility layer?

    >increasing compatibility among Linux and other open source distributions
    >and in helping to support software vendors and developers to port and
    >write software for open source such as Linux.

    Nice, but vague.
    If it is Open Source, it doesn't NEED the LSB like shrink-wrapped binaries do, does it? Look at all the code shipped as FreeBSD ports and the same code shipped by Linux distros as rpm/deb/whatever.

    >LSB would also help SCO's Server Software Division by increasing
    >Linux compatibility with SCO operating platforms.

    So why is SCO a 2nd class citizen in this? Why is the concept of Linux compatibility layers buried at the end? It should be at the top of the press release, because the LSB is the 1st *REAL* chance of unified, shrink-wrapped Unix binaries in the history of Unix.
  • Hmm, so you're saying RedHat should decide the file layout? Where is the community control that Linux is famous for?
  • Your gripe seems a little unfounded farrelli, I don't recall ever seeing any requirement of RPM in the LSB. After all, if I'm not mistaken, they've used Debian as a reference.

    As for Slack, don't take it as a flame, but if you want BSD, you know where to get it...
  • "All I truly want is a nearly universal dependancy/packaging/installation/uninstallation system. rpm comes close"

    Last I knew, rpm is dependent on a database (not binary-based like Windows) to figure out those things. That means it is nowhere near universal and depends on everything else being an rpm. You can do cross-packaging with alien, but you soon learn that for it to really work rpm needs to
    1) Better catalogue its own dependencies.
    2) Play nicer in configuration file placement.

    Some of the rest of your vitriol is kind of cute, somewhat insightful, but overall meaningless, since I would rather judge an idea's technical merits than motivations (patting themselves on the back, etc...) ascribed by some misinformed outsider.

    Also, a post merely questioning "Why are they doing what they are doing?" in the middle of a sea of posts explaining exactly that makes one sound impetuous (jumping before looking).
    ^~~^~^^~~^~^~^~^^~^^~^~^~~^
  • Redhat is the Microsoft of Linux... Let them have the control, and they'll pay you back handsomely....

    (j/k)
    ^~~^~^^~~^~^~^~^^~^^~^~^~~^
  • Hey Rob,

    With all the trolls that have been elevated to a +2 automatic moderation, shouldn't someone who is proven informative and levelheaded like DQ get the same status?
    ^~~^~^^~~^~^~^~^^~^^~^~^~~^
  • "RedHat [...] does seem to have its own mind, sometimes for the better and sometimes for the worst."

    Then again, I've also learned that as long as you stick to *one* package tool you won't run into trouble.

    (-: I think you've hit the nail on the head here as to my concern. I think the higher-ups at Red Hat know that locking a market is easier than some of us realize.

    You get Joe Clueless Sys Admin at 500 big corporations to load up Red Hat, the choice of big business (tm), now with new, improved GUI installation. The Trojan Horse is the Red Hat file structure. Then you have 10,000 Joe Clueless programmers hard-coding the paths and other "special cases" into their work. Mr. Clueless always does this for some reason. Now Red Hat is half-way to locking people into their "proprietary" file structure, upgrade cycles and distribution prices.

    But other distros can use the same structure, you say? Yeah, THAT will happen. Plus, we should all be too familiar with "embrace and extend" by now. Red Hat just keeps moving. They can even make a tool to rebuild the RPMs overnight. I guess that sounds paranoid, but it was just an example of the possibility.

    But you can download it for free you say? Maybe it's just me, but I also noticed some of the bigger distributions have made it relatively hard to download over the net. And they "bundle" their docs in and charge relatively high bucks for the distributions. (I don't find the docs THAT useful.) Thank goodness for LinuxCentral and the $2 distro CD, but most of the Joe Clueless aren't going to know about that. Plus they want a support package.

    So don't hard code paths, you say? One article here has already pointed out that Oracle works only with Red Hat because of the hard-coded paths and libraries. If I were REALLY paranoid, I might wonder if they have a deal. (-:

    All Red Hat has to do from here is hype up their name in front of the word Linux constantly -- Red Hat Linux, Borland C++ for Red Hat Linux, Red Hat Linux, ... and soon Mr. Clueless will simply say "Red Hat" instead of Linux. From my experience people love to claim allegiance to a leader as long as he doesn't stab too many people in the back.

    Already a bit of hand-waving, but you get the point. It doesn't take much to fill it in.

  • Then why use 16-bit integers? Use 21-bit integers, like Unicode actually does, and we can include everything, including Tengwar, Cirth and Klingon, which are actual proposals for Unicode being worked on.
  • Both those alphabets have proposals for Unicode, and will be included sooner or later. If you're too impatient, the ConScript Registry (http://www.ccil.org/~cowan/csur/ ) has standards for encoding those languages in the Private Use Block.
  • How about this:

    The current "plan-of-record" is to specify RPM as the file format. It is supported either directly, or indirectly by the widest number of distributions, and so far, no one has pointed out any deficiencies.

    Linux Standard Base Specification 0.2pre - chapter 13 [linuxbase.org]

    And personally, I find the SysV init scripts a braindead idea. Who needs to stop a server? Who needs a restart? What do they add that I can't do with 'killall -HUP .....'?

    Jeroen

  • I'm pretty sure that your birth language is not usually written in Feanorian script (or the Tengwar of Rumil, either). If it is, you had some f**cked up parents. Now, the Cirth, on the other hand . . .
    Sam TH
  • "major stakeholders" :

    I don't know what you exactly mean by that, but I know a lot of people will read "major corporations that have an interest in the thing".

    Corporations vs individuals is not a debate that I have to introduce to the slashdot crowd, but I really think it shows here.

    When I read an RFC, of course companies did put money into the thing and are trying to influence it so they can cash in. But at least, at the top of the RFC, I don't only see company names! I see the names of the actual individuals who worked on it. People who are competent, whom I can email, and who actually try to do a good job because their name is on the thing. If they push too hard for their company, they get flamed on the mailing list and they lose credibility, THE only thing that really matters on a mailing list.

    But then I check www.linuxbase.org, and what I see as "current members" is almost only company names (at least there is Debian & SPI). This makes me think this linuxbase.org thing isn't going to be as wonderful and open as linux and the IETF.

    Maybe I didn't have enough sleep :)
  • "Again, what about vendors that use a Sys V or BSD kernel and wish to support Linux via a Linux compatibility layer?"

    As long as they can provide a layer that conforms to the ABI and provides the described functions, there is no problem.

    "So why is SCO a 2nd class citizen in this? Why is the concept of Linux Compatibility layers buried at the end? Such should be at the top of the press release, because the LSB is the 1st *REAL* chance of a unified-shrink-wrapped-unix binaries in the history of Unix."

    First of all, SCO is not a second class citizen. At the meetings I have attended thus far (both in person and conference calls), SCO has always had a representative. SCO is very interested in being able to provide an LSB compatibility layer on top of their UNIX offerings (via lxrun), and has been involved for quite some time. However, the primary focus of the Linux Standard Base is to provide a standard for Linux-based operating systems. Making sure an app compiled on Red Hat runs on Debian is more important (right now) than making sure that same app runs on SCO.
  • by autechre ( 121980 ) on Monday May 08, 2000 @01:10PM (#1084499) Homepage
    bash-2.03$ whereis netscape
    netscape: /usr/lib/netscape /usr/X11R6/bin/netscape /usr/bin/X11/netscape
    bash-2.03$

    This is on my Debian system. I also do not like the fact that RedHat really likes to throw everything into /usr/bin. So, it is incorrect to say that "Linux" does this. Debian has defined things like this very specifically in their Debian Policy Manual for package maintainers, and certain others would do well to follow.

    Debian also does not install telnetd, etc. (and enable them!) by default. If you haven't tried Debian GNU/Linux before, I would highly recommend it...I used to use RH, but I've been completely sold on Debian and now run it on all of my machines. Even if it were not for the directory issue, apt-get would have won me over.
  • Flame away... but do we really need an imaginary language to be added to Unicode?

    I like LOTR as much as the next geek...but come on.

  • Differences between distros can be harder to overcome.

    If I'm in Windows, I know the config data is in the registry. If I'm in Linux I know it's mostly in text files in /etc....

    But how do I remember the six different flavours of Linux, and the specifics of their Java distros? Especially when the distros look a lot alike when you're just sitting in a directory.

    And there's no reason for us to have to know where specific things go in each distro, they could either keep everything in the same places, or they could read the locations out of standardized config files. Anything but the confused mess it's in now.
  • Hear, hear.

    "soon Mr. Clueless will simply say 'Red Hat' instead of Linux"

    I am afraid we are already at this point. I've heard it - haven't you? It may be a bit paranoid to expect those kinds of moves, but that kind of foresight is what keeps corporations like Redhat.
  • I think you meant schreibF.
  • I was under the impression that RPM could be used by anyone - other distros (eg. Mandrake) are certainly having no problems with it. How can a file structure be "proprietary" for fuck's sake? Redhat can't prevent any other distros from being more or less compatible without relying on non-free software, I'm sure that would be noticed in an instant and cause a huge stink.

    Plus, we should all be too familiar with "embrace and extend" by now. Red Hat just keeps moving. They can even make a tool to rebuild the RPMs overnight. I guess that sounds paranoid but it was just an example of the possibility.

    And break compatibility with all third party programs, piss off all their customers and make their programmers' work even harder - yeah, I can really see them doing that.

    So don't hard code paths, you say? One article here has already pointed out that Oracle works only with Red Hat because of the hard-coded paths and libraries. If I were REALLY paranoid, I might wonder if they have a deal. (-:

    So blame Oracle - has anyone even bothered pointing this out to them anyway? They are hardly a clueless company and I doubt they want to release a version of their product hampered by such stupidity.

    I'm not naive enough to think that Redhat is doing everything for the good of Linux, but I don't see how they can possibly control Linux. They can try to dominate the market, but the second they pull anything nasty all of their customers can pull out and go somewhere else.

  • Arabic spreadsheets, word processing, calendar, and such will all come once KDE 2 is far enough along that development on those can grow. After all KOffice and such all use the same basic technology that Konqueror does.

    And, with a hard freeze of kdelibs approaching, that's probably coming faster than you think.
  • Why is SCO a second-class citizen? They are, because SCO wants the benefits without sharing.

    Once SCO shares their code, in the way we share our code in the Linux world, SCO will become a first-class citizen.
  • There IS such a thing as source incompatibility. Just look at the problems everyone had when all the distros were making the transition to glibc 2.x: there were lots of broken apps because libc 5 wasn't installed by default. Most Linux code can be compiled to run on Solaris and FreeBSD, but there are plenty of times when you've got library problems. And there are differences between distros: SuSE installed Netscape to /opt, and I can't think of other distros that do (RedHat by default installs it to /bin IIRC). It's problems like these that the LSB is trying to fix.
  • SuSE installs Netscape into an /opt directory. KDE, GNOME, and Applix are also installed to this directory. I prefer it over RedHat's overuse of the /usr/bin directory.
  • Well, I consider users and independent developers stakeholders as well. Individually, of course, it is difficult for them to raise their voice, but when united in a lobbying organization (e.g. the FSF) they can have a meaningful vote.

    I think large companies are entitled to some vote for the simple reason that they invest a lot of money into Linux and open source.
  • The whole point of marketing a distro is to put in some candy other people don't have and to package some 3rd party (sometimes non-GNU/open) software. I don't know of any distros that make it hard to download their source or binaries. Yup, SuSE still lets me install from an FTP server. I would bet RedHat still does also. That's not very hard to get ahold of, I don't think. Point Mozilla over to Corel's WordPerfect site; they have a script you can easily download to fix directories and such in case WP doesn't work properly on your distro.
  • I've always pictured RMS sitting at the table eating alphabet soup reading a newspaper picking out all the other letters except G, N, and U.
  • 'scuse the ignorance, but what the FUCK has this to do with i18n issues?

    This thread focussed on the Linux Standard Base. But it would also hold up pretty well for i18n: there's a great benefit if all distributions used the same translations, and it would be odd and a-bad-thing(tm) if one or two included their own separate ones.

  • Take a look here [pango.org]!

    Szo
  • It should read:

    Sotto la panca la capra campa, sotto la campa la capra crepa.
  • Programs should be portable, more or less (ie. it should come with reasonable defaults that can run as-is on the specific system it was designed for, but it should not require massive recompilation or source-level hacking to get it to work on a slightly differently-configured system).

    I couldn't agree more. One of the biggest problems I have with the CorelDRAW beta for Linux is its insistence on installing into /usr. Make /usr the default, maybe (although the FHS says to use /opt), but let the user choose. My /usr filesystem is full, but I have gigabytes free in /usr/local and /opt. Sigh.

  • Redhat is the Microsoft of Linux... Let them have the control, and they'll pay you back handsomely...

    This is so tiring... Just because RedHat has the largest share of the market doesn't make them Microsoft. If SuSE gained popularity then people would say that SuSE was the Linux Microsoft.

    Making these comparisons is really unfair to RedHat and also stupid. Microsoft is a predatory and proprietary company. Just because people tend to package their products for the most popular dist (esp. among corps) doesn't mean RH is evil... it just means developers are shooting for the setup a person is most likely to have.

    Besides even on the Win platform everything has a default install location (Program Files) but any decent program lets you change this.

    Anyway last but not least, RH pays Alan Cox to work on the kernel and that benefits everyone. :)

    Flame Away!

  • I'm saying the different distributions can pick the best way to do feature X; they'll probably each choose something similar to what the others have, although oddballs like Red Hat and Debian might want to go their own way.

    Then a consensus emerges and we end up with something fairly close to the 'best' solution. That's the way things have always worked in the past, not by having somebody decide in advance what the answer should be.

    Once things have settled down, then - and only then - somebody might write a standard for it, if necessary. The reason why standards are created for network protocols is because of the need for compatibility and to avoid proprietary protocols taking over. But compatibility between different Linux distributions will mostly be enforced by the free choice of users - would you switch to the Foo distribution if it broke your software? - and there isn't a threat of proprietary, closed systems, at least not from vendors like Mandrake who keep all their code free.
  • Well, what's /opt good for anyway? I mean, we certainly have enough directories already?

    Is it /opt as in optional? Then what shouldn't be in there, except the kernel?

    I don't quite grok this /opt thing, please explain to the ignorant-little-me

    --
    Erik
  • Use 21 bit integers

    So how will I store them? As 32-bit integers? That will sure increase the file size. It seems I'm going to have to gzip(1) _all_ my text files once that gets done.

  • ESR wants the tengwar in Unicode so he can use them to write the Lojban [tuxedo.org] language.
  • While your point is obviously not entirely serious, it is valid. As far as I know, the intent of Unicode is to encode the glyphs of every human language. However, there is the very real issue of encoding dead languages which may not have been discovered yet, and new sets of symbols for fields of study that have not yet been formalized.

    As for i18n and l10n saving you, that is one of the joys of free software. It adheres nicely to well-documented interfaces for internationalization. Build yourself a .xmodmaprc file to map your keyboard the way you want, create a set of fonts, and design a MULE definition of your new character set and language so that you can edit the text in Emacs. I18n can make it possible for you to accomplish that, which was my point.

    Part of the reason that I spoke up is that I am working on localization of free software into Esperanto. It is going rather slowly for a number of reasons. The obvious one that most people would cite isn't actually correct. Esperanto has 1-2 million speakers, so there are plenty of potential users. The problem is that all of them are at least bilingual. None of the translators on the team need the localization. And all of us have real jobs to keep us busy.

    Esperanto is frequently viewed as a fictional language. In a sense that characterization is honest if misleading. It was artificially created in the 1880s. Its creator is long dead, and it has been evolving naturally since then. There are even a few families that use it at home, generally because the couples met through Esperanto conventions and don't share a native language. Thanks to free software, we don't have to convince the world that we are worthy of support. We do it ourselves.
  • "So how will I store them? As 32-bit integers?"

    No, as UTF-8 or UTF-16. In UTF-8, characters 0-127 (your American-centered ASCII) take 1 byte; higher code points take two or three bytes.

    A pure ASCII file thus takes just as little room as before.
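    This is easy to check from a shell, assuming a UTF-8 locale (a quick demo, not part of any standard; wc -c counts bytes, not characters):

```shell
# Plain ASCII: one byte per character in UTF-8, so 5 characters are 5 bytes.
printf 'hello' | wc -c     # 5
# The u-umlaut in "müller" is encoded as two bytes, so 6 characters take 7 bytes.
printf 'müller' | wc -c    # 7
```

    So existing ASCII text files don't grow at all; only the non-ASCII characters pay the extra bytes.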

  • Oh, so it is about code sharing?

    Ok, then explain why BSD and the BSD's Linux compatibility layer isn't even mentioned in the press release.

    The BSD licence is about code sharing, isn't it? BSD allows sharing with closed commercial projects, GPL projects, and anything but public domain. Why is BSD a 3rd class citizen (not even mentioned)? Remember that Gary Johnson has said that he supports full LSB compatibility for BSD.

    (I look forward to your reply, Mr. Polk.)
  • >There IS such thing as source incompatibility,

    Then what does SOURCE incompatibility have to do with Linux standard BINARIES?

    Source incompatibilities have more to do with programmers not being versed in portable code. In the old days of Unix we wrote code to work on the many flavors of Unix. New programmers need to read the old documents on code portability...to be educated that the code world does not begin and end with Linux. But, if you are busy pushing a "world domination" 'tude, then how will the new coders come to understand code portability?

    >Most Linux code itself can be compiled
    What is "linux code"? The Kernel? Or Userspace? A large part of the "linux" userspace is old-fashioned Unix code ported to Linux, as Yet Another Unix Version.

    So, to say 'linux code' what DO you mean by this?

  • I agree strongly.
    Although I am but a newbie in Linux, I do have a few years of MS under my belt.
    Last night I installed d1x (open source Descent) onto my laptop via an rpm file. In Windows I would expect it to install to c:\program files\d1x\d1x. You know, keep a game file off the path and put it somewhere organised. The rpm dumped it in /usr/bin.
    Why? A quick look confirms that there are over 2100 files in /usr/bin, and a lot of them are files that I would have assumed would live somewhere more elegant. I really do fail to see the logic of adding unimportant apps to an already huge dir.
    ps I run Mandrake
    pps If there is a bloody obvious reason for all these non-related files living in the same place could somebody please tell me..
  • No flame, I think you just missed the humor. The original poster was proposing that RedHat be the de facto standard based on market share. It is the same reasoning PHBs used to make Microsoft the de facto standard it was 3-8 years ago. The salesmen and OEMs pushing that economically centralized philosophy were well rewarded (then raped.)

    So it was more a post in response to the poster, rather than RedHat's business practices, which are much better than Microsoft's. I have no problems with RedHat or even their perceived dominance.
    ^~~^~^^~~^~^~^~^^~^^~^~^~~^
  • And a mixed Latin/Tengwar UTF-8 text (e.g. a dictionary) takes up how much space? What about if some fella were to write "Klingon 4dummiz" (can't write "For Dummies™"), including some longish samples of Klingon text?

    gzip is my friend.
  • Actually, I can go download the code to FreeBSD, OpenBSD, or whatever any time I want!

    I actually don't happen to be one of those BSD haters. I *like* the BSD license, and actually prefer it to the GPL.

    FreeBSD, OpenBSD, and NetBSD share their code as much as Linux does, and more, if you consider that the BSD license is compatible with more open licenses than the GPL is.

    I wonder if this is the reply you were looking forward to.
  • I never said that sharing code was the #1 priority of the LSB, now did I?

    I just said that SCO will always be a second-class citizen in the free software world, that's all.

    Ignore my earlier reply; I somehow mis-read what you said, mr.
  • At least on Debian, half of /usr/doc is gzipped anyway, so what's new?

    Really, is going to UTF-8 for all your Klingon/Tengwar/Japanese needs going to make that big of a difference in size? The only texts I have that come close to the size of my mp3 collection or my jpg/png/xcf collection are the GCC sources (for a wide variety of versions). That size isn't going to change by going to UTF-8 unless they stop using English (not likely).
  • Well, fucking pardon me if I'd rather not deal with that. It's bad enough now, so I'd rather stop it before it gets to that point. And yes, I do have some experience with older Unixes, though I never admined them.

    But I still see small differences between Linux distros to be more damaging because we want Linux distros to work together. The unix manufacturers didn't want their product to coexist with other products.

    I'd like to see one package be able to install and run on any x86 Linux, at a minimum, and preferably, on any full-featured Linux on any chip, with the exception of things like Q3 that require special hardware, or chopped down distros like for embedded systems or very old hardware.

    And I find it a bit irrelevant if old Unixes were worse. I'm not using them, nor, for the most part, is anyone with a choice, because they did suck. I'd just like to keep Linux from ever ending up like that.

    And yes, you are a coward, too afraid to post a flame under your own name.
  • apt rocks!

    Imagine, oh poor benighted soul who does not have apt-get, how much better life would be if, next time you want to install a new package, say the "hugs" Haskell interpreter, you could just type "apt-get install hugs" and it would download hugs (and any other packages that hugs requires) and install it within a few seconds. As far as I can tell, hugs (and hundreds or perhaps thousands of other packages) is not available at all in pre-packaged form for RedHat.

    Zooko

  • I mean Linux code as in an app specifically written for Linux. And since you're a whiney old bastard read some of my comments on "world domination", I think any code written ought to be as portable as possible. I use three different OSes daily, I'd love to see certain programs on all of them because it would make my life much easier.
  • /opt is short for optional, SuSE makes many apps an optional (stuff you'd probably only get if you bought the SuSE CD set rather than downloaded it) component. It draws a line between standard packages and stuff you may or may not want. IIRC most of the things in /usr/bin and /bin are considered "default" packages to SuSE. If you want a better answer try it for yourself.
  • My point is that you were talking about a de facto standard. RedHat IS that de facto standard. RedHat controls the layout of the Linux file system, and most major distros follow it to some degree. Certainly most commercial apps are tested first with RedHat. If Linux came up with an LSB-compatible type seal, then distros could choose to follow that, and if it got sufficient support (why not? the only one who has anything to lose is RedHat) then developers would use that as the standard. Thus, people are free to use whatever distro they like (for whatever reason they need something custom), but people can use only distros that adhere to the standard.
  • I have tried it myself.

    But still, "optional"... come on. Anything besides the base package and the kernel is optional. The smallest SuSE "distro" I have seen was about 8MB; everything else was optional.

    I think it is stupid, but I won't start a major argument :-). I could very well be wrong.

    --
    Erik
