Linux Software

Designing Good Linux Applications 209

An Anonymous Coward writes: "A guy from IBM's Linux Impact Team in Brazil has written a guest column on Linux and Main describing how applications should integrate with Linux. It's Red Hat-centric, but there is a lot of material about the FHS and LSB that most users probably don't know."
  • First of all, (Score:4, Insightful)

    by ultrapenguin ( 2643 ) on Monday March 25, 2002 @05:22AM (#3219994)
    before going off to design NEW Linux applications,
    PLEASE take the time to DEBUG the current ones.
    The collection of half-abandoned, bug-ridden software that nobody uses (perhaps because of those bugs) is absolutely huge.
  • by October_30th ( 531777 ) on Monday March 25, 2002 @05:22AM (#3219997) Homepage Journal
    Integration is bad, as the bloated Microsoft applications show.

    It's far better to do things in the *nix way: small, interacting but separate utilities that communicate via pipes.
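
    As a minimal illustration of that style, using only standard tools (the particular pipeline is just an example):

    $ cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn

    Each stage is a small program that knows nothing about the others; the pipe does the integration.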

  • by Bronster ( 13157 ) <slashdot@brong.net> on Monday March 25, 2002 @05:29AM (#3220017) Homepage
    While he doesn't mention Debian [debian.org] at all, it's clear that the article is strong on packaging. I actually prefer Debian's approach, having a list of sources from which you obtain software, and providing search tools for that list.

    The other important thing is that programs often don't work very nicely with each other, or need certain versions to work. This is where having a central system for controlling dependencies is rather important. I don't actually think Debian goes far enough at the moment (not really handling Recommends with apt), but it's getting there.

    The other important part of packaging is handling upgrades automatically. Packages have security problems, and they have new features added. If you have to work out (a couple of months later) which --gnu-long-opts-enable --with-features --without-bugs options you had to put on the ./configure command line to get a build that did what you wanted, you're likely to put off upgrading.

    # echo "http://debian.brong.net/personal personal main" >> /etc/apt/sources.lists
    # apt-get update
    # apt-get install bron-config

    Whee ;)

    (note - that URL doesn't exist yet, but it's my plan for the future).

    (note 2 - no ssh private keys in that ;)
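
    And on the "which configure flags did I use" problem above: assuming an autoconf-generated ./configure, the invocation is recorded in config.status (or config.log, depending on the autoconf version), and config.status can re-run it, so the flags are at least recoverable:

    $ grep '\./configure' config.status
    $ ./config.status --recheck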
  • by Osty ( 16825 ) on Monday March 25, 2002 @05:35AM (#3220034)

    From the article:

    /usr/local, /opt

    These are obsolete folders. When UNIX didn't have a package system (like RPM), sysadmins needed to separate an optional (or local) application from the main OS. These were the directories used for that.


    I understand that this is directly from the FHS, and not some evil concoction from the mind of the author, but dammit, I think it's wrong. Perhaps /usr/local is obsolete with respect to package managers, and that makes some sense (because the package manager should handle proper management of placed files, though in practice that's not always the case), but as long as open source is around, there will always be software that is compiled rather than installed through a package manager. There will also always be applications that are not distributed in your package format of choice (as long as there is more than one package management system, this will always hold true). In these cases, it's still a good idea to keep around /usr/local and /opt. Personally, I'll have /usr/local on my systems for a long time to come, because I prefer to use the Encap [encap.org] management system.
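
    For the curious, an Encap-style install looks roughly like this (the package name is made up, and the exact epkg invocation may vary by version):

    $ ./configure --prefix=/usr/local/encap/foo-1.0
    $ make && make install
    $ epkg foo-1.0

    Each package lives in its own directory, and epkg maintains the symlinks into /usr/local.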

  • by slasho81 ( 455509 ) on Monday March 25, 2002 @05:39AM (#3220045)
    Designing Good Linux Applications

    The word 'Linux' is completely unnecessary - "Designing Good Applications" should suffice.
    Application design couldn't care less about the OS the application is planned to run on.
  • by Tet ( 2721 ) <(slashdot) (at) (astradyne.co.uk)> on Monday March 25, 2002 @05:58AM (#3220097) Homepage Journal
    I understand that this is directly from the FHS, and not some evil concoction from the mind of the author, but dammit, I think it's wrong.

    Actually, no. It is from the diseased mind of the author of the article. He first cites the FHS and explains how good it is to have a standard like that, and then proceeds to ignore everything it says. /usr/local is explicitly reserved for local use, and therefore no package should *ever* install itself there (my /usr/local, for example, was NFS-mounted, and RPMs that tried to install there would fail because root didn't have write access to it). So far, so good, and we're in agreement with the article. But then he goes on to say that /opt should never be used. What? According to the FHS, /opt is exactly where IBM should be installing stuff. Quite how he's decided that the two directories are obsolete is beyond me. Both have well-defined and useful purposes, both in common usage and in the latest FHS spec (see http://www.pathname.com/fhs/ [pathname.com]). I'm afraid IBM has just lost a lot of respect from me for this...
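
    For reference, the FHS scheme being cited gives each add-on package its own subtree (the names here are placeholders):

        /opt/<package>/bin      executables
        /opt/<package>/lib      libraries and support files
        /etc/opt/<package>      host-specific configuration
        /var/opt/<package>      variable data

    Everything for the package stays in one predictable, removable place.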

  • by NinjaGaidenIIIcuts ( 568607 ) on Monday March 25, 2002 @06:26AM (#3220173)
    Your post could be interesting if you had explained to us why integration is bad. IMO, the advantage of separation over integrated packages comes from the fact that you can compile separate pieces of code, such as libraries, device drivers and the kernel, instead of being tied to a Windows-like central management and interpretation layer for the kernel and virtual machines. So "separation" benefits *nix, while "integration" benefits Windows.
  • Re:First of all, (Score:4, Insightful)

    by CanadaDave ( 544515 ) on Monday March 25, 2002 @06:50AM (#3220226) Homepage
    Programs that were abandoned were abandoned for a reason: either there were too many bugs, the design was poor, or there was just no demand for them. There's no sense in working on an application just because it doesn't work. It's the natural selection process of Linux programs: the strongest survive. If a program is buggy and people really want to see it work, it will get some attention eventually. Linux is a near-perfect supply-and-demand scenario in most cases: when developers want Linux to do something, they just write an application. The advantage over Windows, of course, is that other people usually come in to help out, and the code is all over the Internet.

    There are more and more stable applications out there now, however. Take Mozilla, for example: the long-awaited 1.0.0 should be out in a month or so. There's XMMS, an MP3 player that's as good as they get (thanks, of course, to huge demand for a good MP3 player); OpenOffice.org, which is slowly creeping toward its 1.0 release and beyond; and KDE3/KOffice (KOffice doesn't have many developers, partly due to low demand, but I think that will change soon). Things have really improved in the last year, I think, and 2002 will be a big year as well.

  • /opt vs. RPM (Score:4, Insightful)

    by HalfFlat ( 121672 ) on Monday March 25, 2002 @07:05AM (#3220247)

    The author states that /opt is obsolete, and that everything should use RPM and install in /usr. Maybe this is the ideal in a system where everything is binaries-only, but I firmly believe it is poor administration practice.

    The RPM database is binary and fragile. Once it is corrupted, the data describing what belongs to what goes out the window. RPM packages have to be trusted not to clobber existing files or make changes to configuration files that one wants left alone. The alternative is per-application directories and symlinks (or a long PATH variable); there are tools which automate this, such as stow. The advantage is that the file system is - or at least should be - the most stable thing in the system. One can just examine a symbolic link to see which package a file belongs to. This makes removing and updating applications very easy, and also makes it easy to see if there are any links left around from older installations. Removing an application is typically as simple as removing the corresponding application directory.
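
    A minimal sketch of that approach with GNU stow, following its documented /usr/local/stow convention (the package name is made up):

    $ ./configure --prefix=/usr/local
    $ make
    $ make install prefix=/usr/local/stow/foo-1.0
    $ cd /usr/local/stow && stow foo-1.0
    $ cd /usr/local/stow && stow -D foo-1.0

    stow creates the symlinks from /usr/local into the package directory; stow -D removes them again, which is the entire uninstall.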

    RPMs which install in the /usr tree require root privileges, whereas applications that can work from a self-contained directory can be installed by a non-privileged user in their own directory. Also, /usr can in principle be mounted read-only, which will certainly slow down any attempts at installing software in it!
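
    The unprivileged case needs nothing special at all, assuming an autoconf-style build:

    $ ./configure --prefix=$HOME/apps/foo-1.0 && make && make install

    No root, no package database, and removal is one rm -rf.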

    I have had Red Hat's installer corrupt the RPM database on multiple occasions, and I've had to override the dependency checking innumerable times in attempts to update packages under both Red Hat and SuSE, thus rendering useless the other purported benefit of RPM. New software typically comes in source form before RPMs, and the RPMs that do become available are almost always third-party ones that don't necessarily play well with your system. By the time a vendor-created RPM becomes available, the distribution version you are using is no longer actively supported, and you'll need 300MB of updates to other packages just to satisfy dependencies. I've been there; it's horrid.

  • by mmusn ( 567069 ) on Monday March 25, 2002 @07:11AM (#3220259)
    A good Linux application is a command line application.

    Seriously, a lot of Linux applications try to duplicate the Windows world and end up being just as bad. For example, in audio software, a monolithic executable with a GUI is a Windows-style application: hard to reuse, hard to extend. A bunch of command-line applications that can be piped together and come with a simple scripted GUI - that's a good Linux application, because its bits and pieces can actually be reused.
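
    A sketch of what that can look like for audio, assuming sox and dialog are installed (the file names and effects are placeholders):

        #!/bin/sh
        # The real work is done by pipeable command-line tools;
        # the "GUI" is a disposable wrapper around them.
        choice=$(dialog --stdout --menu "Effect to apply" 10 40 2 \
            1 "Reverse" 2 "Half volume")
        case "$choice" in
            1) sox in.wav out.wav reverse ;;
            2) sox in.wav out.wav vol 0.5 ;;
        esac

    Throw the wrapper away and the tools underneath are still usable from any script or pipeline.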

  • by Oink.NET ( 551861 ) on Monday March 25, 2002 @07:19AM (#3220279) Homepage
    This guy's ideas would be way more useful if he could think outside the stereotypical structure of today's Linux apps.

    Ditch the concept of spreading pieces of your app all around the FHS. This is organizationally similar to Microsoft's registry. It becomes a maintenance nightmare. Yes, RPM keeps track of some pesky details that let us get away with a messier install. Yes, the FHS does impose a common structure on what is an otherwise unstructured mess. But programmers are human beings, subject to the whims of ego, ignorance, and yes, even creativity and sheer brilliance. We're going to deviate from the suggested standards if given the opportunity, for one reason or another.

    Give me one main point of access to everything the application does. If you need to use config files, give me the option of manipulating them through the application itself, preferably in the context of my current task. Give me one place to go looking for all the bits and pieces of the app. No, the FHS isn't simple enough. Give me context-sensitive documentation so I don't have to wander outside the app to get my job done. Don't make me wade through a spaghetti-code config file, with the documentation propped open on a separate screen to keep from getting lost.

    Programmers are lazy. I should know, I am one. The last thing I want to do when I'm getting ready to release a program to non-techie users is tie up all the loose ends that seem ok to me, but not to the non-techie user. I'd rather document how to get a tricky task done than write the code that automates the tricky parts. I'd rather tell the user how to go tweak the flaky data in the database by hand than add another error-correcting routine. And it's more work to give the user one simple, full-featured point of entry to each piece of a complex application. But that additional work will make the application more usable, for the expert and the novice alike.

  • Re:My two cents: (Score:1, Insightful)

    by Anonymous Coward on Monday March 25, 2002 @07:34AM (#3220305)
    Lighten up. The OS mascot is a penguin, for God's sake. Pros use it because it's good, not because the documentation was written while wearing a tie.
  • by morbid ( 4258 ) on Monday March 25, 2002 @08:55AM (#3220508) Journal
    The fact that he refers to them as "folders" does not bode well for his credibility in the rest of the article.
  • Re:dependency hell (Score:5, Insightful)

    by Skapare ( 16644 ) on Monday March 25, 2002 @10:21AM (#3220813) Homepage

    It's the creation of the spec file that's a chore. I have to know what dependencies the package has in order to make it. If I already know, such as from RTFM'ing the original source package's docs, then I know all I need to manage the package without RPM. I still see making an RPM here as a redundant step.

    I do some programming, but I still don't RPM-ify those programs ... yet. But when someone comes up with an "autospec" or "autorpm" program that figures out everything needed to make the RPM file, so that making it becomes as trivial as installing it, I might be more interested. Right now I'll stick with "./configure" and "make install", which work just fine for me.
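
    For comparison, here is roughly the minimum a hand-written spec file involves (everything below is a hypothetical example, not a real package):

        Name:       foo
        Version:    1.0
        Release:    1
        Summary:    Example tool
        License:    GPL
        Source0:    foo-1.0.tar.gz

        %description
        Example package.

        %prep
        %setup -q

        %build
        ./configure
        make

        %install
        make install DESTDIR=$RPM_BUILD_ROOT

        %files
        /usr/bin/foo

    Most of it is boilerplate, but the dependency information and the %files list still have to be worked out by hand, which is exactly the redundant step described above.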

  • by Oggust ( 526634 ) <d3august@dtek.chalmers.se> on Monday March 25, 2002 @11:33AM (#3221218) Homepage
    Am I the only one who thinks Unix used to be friendlier than it is now?

    Do you remember...

    • When all commands had a manpage? And it actually described the program! And it would have all the proper sections like "Synopsis", and most often it would include an example or two.
    • When all kinds of other things had manpages too, and "man nfs", "man kernel" etc. actually worked?
    • When the "learn" program was around? Remember that? (It was a program that did interactive little courses on newbie topics. You'd do "learn vi" or "learn files". If you stopped at some point, it remembered how far you had gotten and would start there the next time. It was simple, but useful.)
    • When vendors actually shipped usable termcaps for common terminals? (OK, the Linux vendors are pretty good at this, but some of the "real" Unix vendors... Brr.)
    On the positive side, I really do like Debian's manpage policy. Maybe there's hope.

    /August, feeling old today.

  • Re:First of all, (Score:2, Insightful)

    by winchester ( 265873 ) on Monday March 25, 2002 @12:41PM (#3221725)
    Well documented (for developers): because it's hard to grasp the big picture just by looking at the sources when the codebase is large, you end up seeing a lot of trees but losing yourself in the forest. Sources tell a developer how something is implemented and how it is supposed to be extended, but usually they say very little about why things were implemented that way. Intelligent comments in the code are good, but when a concept spans several source files, a README on the subject or a tutorial is definitely needed.

    I always thought design documents were supposed to tell me this. I guess I must have been building too much software in corporate environments.

    On a more serious note, I see a disturbing lack of design documentation in open source software. This is, in my opinion, one area where open source definitely should improve, together with project management. But that would make OSS development a lot more formal, and a lot of people probably do not want that. Choices, choices.

"Floggings will continue until morale improves." -- anonymous flyer being distributed at Exxon USA

Working...