
What Needs Fixing In Linux

An anonymous reader writes "Infoweek's Fixing Linux: What's Broken And What To Do About It argues that the 17-year-old open-source operating system still has problems. Leading the list is author Serdar Yegulalp's complaint that the kernel application binary interfaces are a moving target. He writes: 'The sheer breadth of kernel interfaces means it's entirely possible for something to break in a way that might not even show up in a fairly rigorous code review.' Also on his list of needed fixes are: a consistent configuration system, to enable distribution; native file versioning; audio APIs; and the integration of X11 with apps. Finally, he argues that Linux needs a committee to ensure that all GUIs work consistently and integrate better on the back-end with the kernel."
This discussion has been archived. No new comments can be posted.

  • new mascot (Score:5, Funny)

    by Anonymous Coward on Monday December 01, 2008 @10:37AM (#25943709)
    I'm tired of that penguin
  • by MaxwellEdison ( 1368785 ) on Monday December 01, 2008 @10:38AM (#25943715)
    I am sure that only sane, rational, and courteous debate will follow. Finally an argument-free thread!
  • by JackassJedi ( 1263412 ) on Monday December 01, 2008 @10:38AM (#25943725)
    that Linux IS pretty much a mess, it's just that there are enough hands around at all times to fix things quickly whenever something breaks. That's pretty much how it works at the moment, and this could indeed be better.
  • by ishmalius ( 153450 ) on Monday December 01, 2008 @10:41AM (#25943775)

    I am so happy that he has volunteered to do this. I was afraid that the article might be about wanting someone ELSE to do the work.

    • by sukotto ( 122876 ) on Monday December 01, 2008 @11:19AM (#25944477)

      So many people tut and say "Someone should do something", but so few step forward and say "...and that someone is me"
      -- Terry Pratchett

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday December 01, 2008 @10:42AM (#25943791) Homepage Journal

    Is more vendor support. Every supposed real problem with Linux is based on or related to a problem with a driver; nine times out of ten this problem is caused by the manufacturer being unwilling or unable to release specifications. The various vendors out there need to realize that Linux may not be the future, but it's a more likely future than Windows, and they need to put some effort into support. Of course, some of them have, and if you reward them by purchasing their hardware, they may do more of it. Regardless, having multiple GUIs isn't actually a real problem - it's an opportunity, not a setback, and meanwhile you can trivially use libqt to draw GTK+ apps [ubuntu.com] or use GTK+ to draw widgets for libqt programs [launchpad.net] (Sorry I haven't updated in a while, my last build FAILED on the build servers but worked at home, and it was a compiler error, NOT a library I forgot to specify. Nice work, Ubuntu!)

    • Re: (Score:3, Insightful)

      by internerdj ( 1319281 )
      This is probably the big reason that commercial OSes are popular. When traditional businessmen hand over information to another company, they can accompany it with a contract that can, through the legal system, inflict some pain on the other business if the information is used against the company rather than for it. How are you going to do that against open source? It is their job to protect the company from the competition. If they fail to do that then they can lose their paycheck. To convince the
      • by Alex Belits ( 437 ) * on Monday December 01, 2008 @12:29PM (#25946013) Homepage

        Except, of course, in reality the "proprietary information" is something that a competitor should never ever consider using for product development because it's worthless details of a particular (usually bad) implementation of a trivial idea. The real purposes are:

        1. Break compatibility and extort money from everyone who tries to achieve it.
        2. Hide embarrassing details that demonstrate low professionalism of developers or expose underperforming products.

    • Re: (Score:3, Insightful)

      by karstux ( 681641 )

      If Linux had a stable interface for binary non-free drivers, we might see more support from the vendors. It's not a crime to not want to disclose your hardware.

      • Re: (Score:3, Insightful)

        by cowbutt ( 21077 )

        The kernel would also end up with multiple different ways of doing the same thing, with no-one ever knowing whether the old ones can be removed, for fear that some obscure driver hasn't been updated to use the current way.

        Windows aims to provide binary compatibility, at the cost of complexity within the OS. UNIX, and Linux doubly-so, aims for source compatibility and improved architectural simplicity at the cost of some administrative complexity, aka 'Worse is Better' (http://en.wikipedia.org/wiki/New_Jersey_style [wikipedia.org]).

      • by howlingmadhowie ( 943150 ) on Monday December 01, 2008 @11:38AM (#25944861)
        quite apart from the fact that it has very little to do with 'not wanting to disclose your hardware', it is not the linux-way. the linux-way is to make the best possible kernel. for this reason, the binary interface keeps changing to make it better. the drivers just need to be maintained by people who are aware of this fact. if that means that the driver disk you got with your all-in-one scanner-copier-coffee machine doesn't work after 2 years, so be it.
      • by mrsbrisby ( 60242 ) on Monday December 01, 2008 @11:41AM (#25944927) Homepage

        We don't want that support. Those vendors have a tendency to produce low-quality drivers, and reduce the overall stability of the system.

        Most Vista and XP (and previously, Windows 98) apologists agree that the biggest reason Windows is perceived as unstable is due to low-quality drivers for low-quality hardware.

        By selecting hardware known to work with Free Software, I'm pretty much guaranteed a solid and stable experience.

        • Re: (Score:3, Informative)

          by Sigma 7 ( 266129 )

          We don't want that support. Those vendors have a tendency to produce low-quality drivers, and reduce the overall stability of the system.

          This is countered with a simple rule - if it malfunctions or stalls, it no longer executes.

          In the current state, Windows Vista can alert the user to a malfunctioning driver or device (in my case, it states nvlddmkm has failed and been restarted.) If a hardware driver is malfunctioning, the kernel can kill it and send a message to the appropriate monitoring program that a driver has crashed.

  • OS (Score:3, Funny)

    by ohxten ( 1248800 ) on Monday December 01, 2008 @10:48AM (#25943881) Homepage

    the 17-year-old open-source operating system

    First person to make a fuss about this gets a prize!

  • by Bizzeh ( 851225 ) on Monday December 01, 2008 @10:48AM (#25943887) Homepage

    The problem with Linux is that too many people want it to be too many things. There is no centralised effort to get it to do one thing.

    There are several GUI solutions rather than a centralised effort, there are several browsers gunning to be the main browser, there are several sound sub-systems/servers... why can't these people learn to play together and come up with something that fits everybody?

    I know I will get comments that it's all about "choice", but it's not, not at all, for the common user... the common user wants to switch the computer on, check their Hotmail account, check Facebook, and then talk to their friends on Live Messenger... THAT'S IT... that's what the common computer user does now... they don't care how their computer does it, they don't care about the morality behind it, they don't care if the guy who made their file system killed his wife or not... they don't even know what a file system is.

    It's only the very advanced users who care about these things, and I'm afraid to say that these users don't account for even 1% of all computer users.

    If Linux-based operating systems are to become as big as they want to be, they need to stop fighting among themselves and centralise their efforts. Otherwise, we will be having this same story in another 17 years.

    • Re: (Score:3, Interesting)

      by gfxguy ( 98788 )

      I disagree... it IS about choice, but more importantly, the competition to be top dog keeps everybody on their toes, and they are all doing some great work.

      I disagree that Linux is a mess, it's no more of a mess than Windows, and in many ways it's a lot better. We may have different distributions competing with each other, but MS OSs still have to compete with each other... XP, XP Pro, XP Media Center, Vista, Vista Ultimate.... all with at least slightly different capabilities.

    • Re: (Score:3, Insightful)

      by Draek ( 916851 )

      there are several GUI solutions, rather than a centralised effort, there are several browsers gunning to be the main browser, there are several sound sub-systems/servers... why cant these people learn to play together, and come up with something that fits everybody.

      Because you can't. There isn't a car that fits everybody, not a house, not a pen, not a shirt, nothing. And if we can't even agree on a simple *pen* that fits everybody, what makes you think we can agree on something as complex as a computer interface?

      If life can teach us anything is that there's no such thing as the "average" human, we're all fucked up in various, wildly different ways, we've got different tastes, and we *will* cry foul when somebody tries to impose theirs onto us. Plus, we may have severa

  • by suso ( 153703 ) * on Monday December 01, 2008 @10:49AM (#25943891) Journal

    and the integration of X11 with apps. Finally, he argues that Linux needs a committee to ensure that all GUIs work consistently and integrate better on the back-end with the kernel.

    Call me old fashioned or whatever the cute term is now. But fuck that! If I ever see programs like cp become bloated with X library calls because some news reporter needs to see a GUI progress bar, I'm going to be very angry.

  • NetworkManager (Score:5, Insightful)

    by Xabraxas ( 654195 ) on Monday December 01, 2008 @10:49AM (#25943893)
    My biggest issue lately has been NetworkManager. It isn't absolutely necessary, but wireless connections are quite annoying without it, and more and more applications are becoming NetworkManager-aware, which means it is increasingly important to have it. It hasn't progressed that much since its inception and it's still not possible to configure most networking options to work with it. The NetworkManager homepage makes it clear that they are not interested in profiles, and their application makes it clear they are not interested in bridge interfaces or any other kind of advanced networking. So your options are to disable it and configure networking through your init scripts, or deal with the extremely limited options of NetworkManager. My biggest complaints are that I cannot get a static IP on my home wireless while getting DHCP everywhere else, and that it's a real pain in the ass to set up bridged networking for use with a VM.
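    A rough sketch of the usual fallback, doing it by hand after stopping NetworkManager (the interface name, ESSID, key and addresses below are invented for the example, and a WPA network would need wpa_supplicant instead of the iwconfig key line):

      # stop NetworkManager (init script name/path varies by distro)
      sudo /etc/init.d/NetworkManager stop
      # associate and assign a static address to wlan0 by hand
      sudo iwconfig wlan0 essid "homenet" key s:secretkey
      sudo ifconfig wlan0 192.168.1.50 netmask 255.255.255.0 up
      sudo route add default gw 192.168.1.1
      echo "nameserver 192.168.1.1" | sudo tee /etc/resolv.conf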
  • by Millennium ( 2451 ) on Monday December 01, 2008 @10:50AM (#25943903)

    Basically, what he seems to be complaining about is the lack of standards, and on this he has a point. But he clearly doesn't understand the difference between standard-as-implementation (the Microsoft way of doing things) and standard-as-protocol (the superior way).

    You can see some examples of standard-as-protocol, for example, when he talks about kernel ABIs, audio APIs, and such. But most of what he speaks of is mere whining about how there isn't Just One Way to do something, calling for standard-as-implementation when that simply isn't necessary: for example, the single configuration format or the "tight integration" between X and the kernel.

  • by Neil Watson ( 60859 ) on Monday December 01, 2008 @10:50AM (#25943911) Homepage

    1. Why is Linux blamed for configuration files that are written by application developers? Linux is a kernel and is not responsible for Sendmail. Further, I fail to see how the point-and-click method of configuration is better than editing a text file that can be searched, backed up, and version controlled (a short illustration follows at the end of this comment).

    2. Why is it the responsibility of Linux distribution maintainers to provide a means for commercial vendors to package their product? Vendors had to spend money to get certified for other operating systems. How about putting a little work into understanding and using a Linux distribution?

    3. X freezing? Umm...

    Perhaps I've just fed the troll, but I'm sure the pointy-hairs will read the article and think it's all true.
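    A short illustration of that first point, using nothing but stock tools (the file and setting names are only examples):

      grep -rn "relayhost" /etc/postfix/            # find where a setting lives
      cp -a /etc /root/etc-backup-$(date +%F)       # snapshot before editing
      diff -u /root/etc-backup-2008-12-01/postfix/main.cf /etc/postfix/main.cf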

  • by MostAwesomeDude ( 980382 ) on Monday December 01, 2008 @10:54AM (#25943979) Homepage

    http://www.freedesktop.org/ [freedesktop.org] is the link. Was that really so hard?

  • by Ed Avis ( 5917 ) <ed@membled.com> on Monday December 01, 2008 @10:57AM (#25944029) Homepage
    I'll leave others to comment on the rest of the article but I liked this one nugget:

    One thing that might help is a kind of meta-package format: a file which, when downloaded, is run by a client native to the given distribution. The client then goes to the software repositories for the program maker and obtains all the right packages to make that program run on that particular machine.

    We have the LSB, and distributions which make some effort to ship binary 'compat' packages, so that third parties can distribute their software in RPMv3 format (n.b. not the same format as currently used by RPM-based distros, which are on RPMv4) and it will just install and work on any i386 or x86_64 Linux system. But I wonder if that is slightly the wrong model. At the moment if you want some particular library you have the choice of statically linking it into your executable, or just relying on it being there in the target system; neither is very appealing.

    For example, suppose you want GTK version 2.16 or later but LSB specifies an older GTK (actually, it specifies a set of interfaces, but that corresponds to a particular GTK version). You could statically link your app with gtk-2.16, or you could include your own private copy of the library to be stuck in /opt/myapp/libs, but then what about Fedora 10 which does include a new enough GTK?
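    The private-copy route usually boils down to a small launcher script; a sketch, reusing the hypothetical /opt/myapp layout above:

      #!/bin/sh
      # prefer the bundled libraries, fall back to the system ones
      LD_LIBRARY_PATH="/opt/myapp/libs${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
      export LD_LIBRARY_PATH
      exec /opt/myapp/bin/myapp "$@"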

    Instead of providing a single RPM (or worse, lots of different binary RPMs for different distros), we should encourage vendors to set up a yum repository. Then to install their software you could add the third-party's repository to your software sources list and use the normal GUI tools to update and install packages. If they want to use some newer library which is not included in Ye Olde Enterprise Linux 1.1, then they can just add a package for that library to the repository, and it will be installed only on systems that need it. This also takes care of automatic updates, which are not provided if you just give people an RPM file to install manually.
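    For what it's worth, the mechanics of that are already fairly small; a sketch with invented package and host names (createrepo on the vendor side, a one-file .repo stanza on the user side):

      # vendor: generate repository metadata for a directory of RPMs
      createrepo /var/www/html/myapp/

      # user: drop a stanza like this into /etc/yum.repos.d/myapp.repo ...
      #   [myapp]
      #   name=Example third-party repository
      #   baseurl=http://packages.example.com/myapp/
      #   enabled=1
      #   gpgcheck=0
      # ...then install (and later update) through the normal tools
      yum install myapp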

    Of course, we don't live in a world where you can just 'encourage' third-party software vendors to do things and they'll jump to it; otherwise Nvidia would long ago have released free drivers. So you need to make it as easy as possible to set up a repository for yum or apt-get or smart or whatever packaging tool distros are using. It needs to be trivially easy. So I would suggest enhancing yum and the other tools to work from a plain directory of rpm files served over http. Just dump the files on a webserver, let Apache serve the directory listing and let yum point to that and Just Work. Or, if that's too dirty for you, use a directory on an ftp site (which at least has a defined protocol for listing the files available).

    I think a repository for package management programs like yum satisfies what the author is talking about when he asks for a 'meta-package'.

  • by line-bundle ( 235965 ) on Monday December 01, 2008 @10:58AM (#25944043) Homepage Journal

    In the sense that there is little originality, and it seems anything added to linux has to have occurred in another operating system.

    Linux/Unix has plenty of shortcomings, but its evangelists believe it's so perfect it cannot be improved. Here is my short list of major peeves.
    1. Filesystem metadata/permissions. Why do files still have only rudimentary metadata? Drives are massive and a few extra bytes would do no harm. MacOS has added metadata. An example would be that a file should be able to keep a list of all the dates it was accessed. Why can a file only have one owner/group?

    2. Root is God. This must really be fixed. There should be a way for root to irrevocably divest its powers; root does not need to access users' files. A user should explicitly grant root permission to read his files. It will always be a major security issue because all one has to do is become root. Plan 9 managed to do that.

    3. The claim that everything is a file is a lie. Why not extend it to networking resources? ('cd http://www.gnu.org/ [gnu.org]' would be cool.) Plan 9 also succeeded there.

    I am sure linux evangelists are going to propose (hack-filled) workarounds or reasons it can't work, but I don't buy it. That is why I left linux.

    • by Anonymous Coward on Monday December 01, 2008 @11:12AM (#25944299)

      > 1. Filesystem metadata/permissions. An example would be that a file should be able to keep a list of all the dates it was accessed.

      Makes things slow. Most distros turn off logging the 'atime' (access time) because this requires writing to the disk on every read.

      > Why can a file only have one owner/group?
      To keep things simple, the GUI is kept this way. You can make it as complicated as you want, though, with Access Control Lists - just like you do in Windows.
      For a GUI way to set this, see something like: http://rofi.roger-ferrer.org/eiciel/?s=5
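      A command-line sketch of the same idea (the user, group and file names are only examples):

        setfacl -m u:alice:r /home/bob/report.txt     # let a second user read it
        setfacl -m g:designers:rwx /srv/projects      # grant a second group as well
        getfacl /home/bob/report.txt                  # show the resulting ACL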

      2. Root is God. This must really be fixed. There should be a way for root to irrevocably divest its powers, and root does not need to access users file.

      This is called SELinux and is installed with pretty much every distribution. But for what you want, the users should instead use encrypted home directories.

      > 3. They lie about everything is a file. Why not extend this to networking resources ('cd http://www.gnu.org/ [gnu.org] would be cool ).

      This is called FUSE, and is included with every distribution.
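      For instance, with sshfs (one of the FUSE filesystems) a remote machine really does become a directory you can cd into (host name invented for the example):

        mkdir -p ~/mnt/remote
        sshfs user@host.example.org:/srv/files ~/mnt/remote   # mount over SSH
        cd ~/mnt/remote && ls                                 # use it like any directory
        fusermount -u ~/mnt/remote                            # unmount when finished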

    • by amorsen ( 7485 ) <benny+slashdot@amorsen.dk> on Monday December 01, 2008 @11:15AM (#25944381)

      An example would be that a file should be able to keep a list of all the dates it was accessed.

      Fixed already. Extra attributes have been available for a long time. Feel free to use them.
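      For example (assuming the filesystem is mounted with the user_xattr option):

        setfattr -n user.reviewed -v "2008-12-01" notes.txt   # attach your own metadata
        getfattr -d notes.txt                                 # dump the user attributes back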

      Root is God.

      Fixed. SELinux.

      Why not extend this to networking resources ('cd http://www.gnu.org/ [gnu.org] [gnu.org] would be cool ).

      Hard to do in kernel space. We're getting there in user space.

    • Re: (Score:3, Insightful)

      by myz24 ( 256948 )

      Here are some answers

      1. Linux has ACL support and all of the major distributions I've used recently include it by default. You may however need to modify your fstab to enable it. From there you can use setfacl and getfacl to set and get ACL entries. I use command line 99% of the time so I can't say if gnome or KDE will allow you to manipulate ACL entries.

      2. While I understand what you're saying I don't agree. As a person who also administers Windows networks one thing that always bothered me is there wa

  • by Kjella ( 173770 ) on Monday December 01, 2008 @10:58AM (#25944047) Homepage

    ...and none of them are listed in this article. Most of these are lame rehashes of old stuff that just isn't important. How about stuff like Flash not crashing on me every two minutes? An IM client that doesn't freeze on file transfers with native MSN clients (I've tried several and they just don't work), some real compatibility with MS Office (the locked Excel sheet for travel expenses breaks every time and I have to unlock it to actually make it work), fixing the dual-screen setup so that it actually works, making the side buttons on my mouse work without hacking xorg.conf, all the ways WINE fails me, and so on. I don't care that there's plenty of choice, I just want at least one choice that works...

  • by petes_PoV ( 912422 ) on Monday December 01, 2008 @11:00AM (#25944075)
    The list's the same as it was 10 years ago - and will be in 10 years time.

    USB barely works. It's OK for mass-storage devices, but sucks hugely for high-bandwidth devices, or anything that's removable - and gets removed.

    Video: just as bad. Put these two together and you have a mess of non-functional webcams and video applications which sometimes hold together if you're prepared to spend hours and days hunting down just the right combination of codecs, libraries and applications.

    However, the worst part of Linux is the parlous state of the documentation. A morass of different styles: man pages, info documents, HOWTOs, HTML, text files. Almost none of it is available in more than one language and hardly any is kept up to date. Even less is declared obsolete, to stop people trying techniques that haven't worked in years but are still heavily linked to on the web.

    Frequently, the best documentation for an application is the strings command.
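    Which looks something like this (the binary name is just an example):

      # fish usage text and option names out of an undocumented binary
      strings /usr/bin/some-tool | grep -iE 'usage|--[a-z-]+' | less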

  • flatbed scanners (Score:4, Interesting)

    by viridari ( 1138635 ) on Monday December 01, 2008 @11:00AM (#25944083)

    I use Linux and FOSS almost exclusively for my photography workflow.

    Almost.

    See, when I work in film, I need to have a Mac around to handle the flatbed scanner. Because, unfortunately, Linux support for flatbed scanners really sucks rocks.

    gimp has some shortcomings as well but I understand they are being actively addressed so I won't bitch about that.

  • Hm (Score:4, Insightful)

    by Eddy Luten ( 1166889 ) on Monday December 01, 2008 @11:01AM (#25944091)
    During the periods that I felt brave and tried out Linux, there were several things that brought me back to Windows.
    • Unix-like filesystem design and partitioning
    • Native ISV support w/o Wine (Adobe, etc.)
    • IHV support such as good drivers
     • Clear end-user documentation (bought SuSE and Red Hat, and the manuals gave me nightmares).
    • A full featured IDE like Visual Studio that's not Eclipse

    I guess these are the main things without wasting too much time on this topic.

  • Well... (Score:4, Insightful)

    by ralphweaver ( 1406727 ) on Monday December 01, 2008 @11:01AM (#25944097) Homepage
    ... it depends heavily on what the goal is. If the goal is to overtake Windows on the desktop, then largely, yes, I agree. However, Linux is in good shape on the server, actually far better shape than Windows 2003 Server in reality. It's easier to manage, more reliable, cheaper, and harder to exploit. However, if Linux is going to make a serious attempt at taking desktop market share from Windows, then there are two things that must be done: simple, flawless working audio, and simple, flawless working video. It takes many times more effort in Linux to get audio and video working cleanly than it does in Windows, and until that changes there is no hope of Linux gaining serious market share in the desktop environment. (On the other side of that coin, once it's working in Linux it never breaks, unlike Windows... and you can simply copy your old configs over your new ones when you reinstall and everything works again.)
  • One thing... (Score:4, Insightful)

    by rickb928 ( 945187 ) on Monday December 01, 2008 @11:02AM (#25944109) Homepage Journal

    Documentation

    Everything else is secondary. Well, most everything. But without usable documentation, all else is futile.

    Oh, and would someone do some work on documentation?

    Thanks!

    • Re:One thing... (Score:4, Interesting)

      by 99BottlesOfBeerInMyF ( 813746 ) on Monday December 01, 2008 @03:55PM (#25950091)

      Oh, and would someone do some work on documentation?

      I wrote documentation for professional software for years, including Linux software. I had numerous colleagues ask me if there was a way they could write docs for OSS projects as a way of contributing, and I tried really, really hard to hook them up with projects... any projects. It was almost always a failure. Developers of OSS don't seem interested in the help and rarely take the time to explain parts of the software the writer can't figure out on their own, even though that explanation would then have made it into the hands of all the users. Good documenters are also user advocates who approach software from the end-user perspective and thus tend to come across and point out a lot of usability issues in the course of their job. OSS developers tend not only to ignore such input, but often become rude and abusive when provided with such feedback (which companies usually have to pay big bucks to get). Finally, I've done graphics work professionally and can tell you the same is true for their attempts at contributions. One of the best graphics guys I know, who makes big bucks for his work, was treated so poorly when he tried to submit some free textures to an OSS game he played that he has vowed never again to try to work with "OSS nutcases", as he now calls them.

      All that said, if you're developing an OSS project and would like documentation help and you're willing to commit to actually working with a professional documenter, I can probably hook you up with a recent college grad, or a professional looking to do some free work to expand their resume to include OSS.

  • by Eunuchswear ( 210685 ) on Monday December 01, 2008 @11:03AM (#25944123) Journal

    He complains:

    1. Poor package management.

      The way packages are managed within any individual distribution is entirely up to the maintainers of that distribution.

      Who else should do it?
      He complains the distribution differences make life hard for people selling software. Well, tough, if they want money maybe they should work for it?

    2. Configuration files.

      There needs to be a consistent -- and whenever possible, self-documenting -- configuration system throughout, from the kernel to userland tools and user applications

      I know! Let's recreate the Windows registry, but this time better! Yawn.

    3. Unstable Kernel ABI. FUD.
    4. He wants a versioning filesystem. Like Windows has. (Does it?) I want a pony.
    5. Audio API. He says there are too many of them.
    6. The GUI is anarchic. (I see no black flags).
    7. X11 is not integrated with the apps. What the fuck does this even mean?
    8. He wants "commercially hosted backup and restore". Maybe if he thinks there's money in it he should start a company instead of sitting on his fat ass and whining.
    9. Conclusion: "Most of what's wrong with Linux isn't fatal" - apparently replacing it with a Vista look-alike would solve all his problems.

    Just about the shittiest article I've read for a long time.

  • Not bad, but... (Score:5, Interesting)

    by mlwmohawk ( 801821 ) on Monday December 01, 2008 @11:19AM (#25944459)

    What I find in *most* of these sorts of pieces is that they are either cynically or subconsciously pushing for the winozification of Linux. He makes some good points along with the bad.

    (1) Package Management
    This is a good point. If the Debian people and the Red Hat people could work toward a solution, it could be fixed, as both systems have a great deal in common.

    (2) Configuration Files
    Bzzzt. Wrong. The foolish part of this subject is that while the Windows registry provides standardized access to the data store, it only defines types and not what they are supposed to be. Linux configuration files under /etc are, IMHO, better and can be backed up and diff-ed.

    (3) Kernel Application Binary Interfaces
    I would like to see a stabilized and standardized device interface API for standard devices, something exposing a limited subset of the kernel that would simplify simple devices like block, serial, and network types of devices.

    (4)Native File Versioning
    Bzzt. It's called automatic backup, people. This is a relatively new feature on Macs and barely works in Windows. It would be nice, but I can't characterize it as something that's broken.

    (5)Audio Application Programming Interfaces
    This I 100% agree with. Choice is nice, but the geometric product of "choice" in system services means that rich multimedia applications are much harder to develop.

    (6)Graphical User Interface
    He sort of has a point about this and it has often been a problem.

    (7)Integration Of X11 With Apps
    Bzzt Wrong. X11 is a HUGELY powerful system and if you encounter a bug that crashes your session, that's a bug. Fortunately I haven't seen one of these in about 6 years.

    (8)Commercially Hosted Backup And Restore
    Bzzt Wrong. This is not "Linux" being broken, it is 3rd party vendors being stupid.

  • Video performance (Score:4, Interesting)

    by Sgs-Cruz ( 526085 ) on Monday December 01, 2008 @11:21AM (#25944513) Homepage Journal

    Video drivers. Video drivers. Video drivers. Or possibly X11. Or possibly the whole multi-layered graphics approach.

    I'm not sure whether it's the fault of the Linux kernel or the graphics card companies that won't release their hardware specs / open-source drivers. To the "newbie" Linux user, that's more-or-less irrelevant. I just know that I installed Linux for the first time two months ago on a brand new computer (AMD Athlon X2 6400+, Asus M3N78-VM with onboard GeForce 8200 graphics, 2 x 2 GB DDR2-800 [PC2-6400] memory) and gave up last week and installed Windows XP after spending eight weeks dicking around with video drivers / KDE vs. GNOME / xorg.conf / etc. trying to get the desktop performance up to the level where it doesn't take a good half-second for a bloody Firefox window to stutter its way up to full-screen from the minimized state.

  • Remote desktop (Score:5, Insightful)

    by lord_sarpedon ( 917201 ) on Monday December 01, 2008 @11:24AM (#25944569)

    I wish there was a Windows Remote Desktop equivalent. Yeah, I can forward X11 apps over SSH! Network transparency! Cool! But over the internet it's usually painful... high latency? Oops. Connection dropped? App exits. Hope it autosaves.

    Ok, so let's use VNC. A lot better to be sure. Or NX, with its shockingly awesome speed and responsiveness.

    But how do I get at the apps I already have running? Nifty, I can ssh in to my desktop machine at home. I know I'm logged in to a gnome/kde/whatever session. Screen locked. What if I have Eclipse open and want to pick up where I left off?
    -Start a vncserver? That's fantastic. I just bypassed the display manager, so no warning about concurrent sessions. Let's hope that _all_ of my apps are careful about this weird case and don't barf all over my data.
    -Forward just eclipse? Maybe if I kill it first from my shell it won't complain.
    -Use x11vnc (hoping my session is on display :0, and setting environment variables appropriately)? Oh, look at that! Screen's locked. I'll just type in my password and get going. Works fine, except for the fact that my _monitor woke up_ and _everyone can see what I'm doing or hijack my session_ (keyboard and mouse working). Maybe I'll just quickly logout so I can start something in VNC...

    It's ugly, all of it.
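    For completeness, the tunnelled x11vnc route described above looks roughly like this (host name invented for the example):

      # attach to the existing :0 session, reachable only through the SSH tunnel
      ssh -t -L 5900:localhost:5900 user@home.example.org \
          'x11vnc -display :0 -auth guess -localhost -once'
      # then, in another terminal on the laptop:
      vncviewer localhost:0

    It still wakes the local monitor, of course, which is exactly the complaint above.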

    On the windows side, as most everyone here has seen, a) a session started locally can be connected remotely b) a session started remotely can be connected to remotely c) in either case, a "locked" screen is displayed as appropriate and nobody gets to see a haunted cursor and d) none of this breaks 3D acceleration or video overlays when switching back to local display. It's _incredibly_ useful. This is something you'd expect Linux to be _better_ at, a big selling point of desktop Linux...afraid not.

    I tried to pick some brains once about even the simplest hacks - like being able to poll X for display updates when it doesn't have a VT. And from that, I don't get the impression Linux will catch up in this department anytime soon.

    • by slifox ( 605302 ) *

      You must not have looked into NX very thoroughly... Particularly, these options:

      EnableSessionShadowing: Each user can require to attach to an already running session.
      EnableDesktopSharing: User can require to attach to the native display of the nodes.

      I use NX to share my desktop session over my VPN, so that I can login with my laptop while I'm away from home.

      Not only is NX very fast, but it also does not require a running program in the background (like x11vnc), since the SSH server doubles as an NX server i
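      (For anyone looking for them: those two options live in the NX server configuration file. On the NoMachine packages that is /usr/NX/etc/server.cfg, though the exact path and value syntax may differ between NX variants, so treat this as a sketch:)

        # /usr/NX/etc/server.cfg -- restart the NX server after editing
        EnableSessionShadowing = "1"
        EnableDesktopSharing = "1"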

  • by SirGarlon ( 845873 ) on Monday December 01, 2008 @11:32AM (#25944719)

    One of the axioms of free software is that users are free to fix whatever they want, when they want. So after 17 years of Linux evolution, why are these "problems" not fixed yet?

    In most cases, it's because the cure would be worse than the disease.

    This is one of the many fragmentation problems that makes it difficult for commercial software vendors to offer their products for Linux. No one package format will do the trick across distributions -- not without hassle, anyway.

    So obviously what we need is yet another package management system that's different from all the ones that exist now. Developed from scratch, of course.

    To that end, there's little or no centralized configuration: everything in the system is controlled through a welter of files, and there's no guarantee that the syntax of any one configuration file will apply to any other.

    Obviously the solution is to rewrite every program in the OS to use a standard configuration file format. Instead of, you know, writing a man page that explains how the configuration file works.

    If there is one complaint that comes up more often than any other about developing for Linux, it is the way the kernel application binary interfaces are a moving target.

    So we should freeze all kernel development until proposed changes go through a 2-year approval process by a configuration control board. We all know that keeps the Debian distro moving along smoothly.

    And so on.

    Bottom line: the author doesn't like Linux, doesn't bother to understand it, and wishes it were more like a proprietary OS controlled by a single vendor.

  • by GooberToo ( 74388 ) on Monday December 01, 2008 @11:32AM (#25944721)

    consistent configuration system

    What a dope; because we know this has worked so well for windows. The registry is a nightmare on Windows. Linux/Unix does have a consistent model and it is known as text configuration files. It's powerful and can be leveraged on even the slowest of links. One size does not fit all - although I've seen far too many applications use XML for this where it makes absolutely no sense whatsoever.

    native file versioning

    Seems Linux is now held to a higher standard. Again, what a dope. Outside of the VMS crowd, I've not seen a huge outpouring of demand for this feature. Having said that, I do believe a versioning FS is in the works and for all I know, some may already be available. Realistically, few people want this and most have no clue what it even means. For the general use case, revision-control software already exists to fill this niche. His complaint is empty.

    audio APIs

    As far as I'm concerned, it's done. Pulseaudio [pulseaudio.org] and ALSA [alsa-project.org] are all that you need. If you have more specialized needs, then JACK Audio [jackaudio.org] takes care of you. For the majority of people, Pulseaudio has what you need and is also portable to Windows. Many (most?) distros are already moving or have completed their move to Pulseaudio. As far as I'm concerned, this issue is addressed, save only for migration time for slow adopters.
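    For anyone who wants to sanity-check their own stack, the usual poking around looks roughly like this (the sample wav file ships with alsa-utils on most distros):

      aplay -l                                            # list ALSA playback devices
      speaker-test -c 2 -t wav                            # test straight through ALSA
      pactl list short sinks                              # list PulseAudio output devices
      paplay /usr/share/sounds/alsa/Front_Center.wav      # test through PulseAudio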

    integration of X11 with apps

    This means nothing. What a dope. All GUI applications which communicate with X are integrated.

    and integrate better on the back-end with the kernel

    Again, what a dope. This means nothing.

    In a nutshell, his complaints are silly, meaningless, or have been addressed. As far as I can tell, his only complaint which has any merit is audio API standardization and that has been achieved.

  • by SmallFurryCreature ( 593017 ) on Monday December 01, 2008 @11:42AM (#25944949) Journal

    Car analogies are of course famous on Slashdot, but I won't do that. I will instead do something radically new: a MOTORCYCLE analogy. Yes, you saw it here first!

    What is wrong with motorcycles?

    • They offer no protection against the rain.
    • There is no seatbelt to keep you on your seat in a collision.
    • With just two wheels they easily fall over.
    • Only room for a single passenger.
    • The law says you got to wear a helmet making it impossible to either drink coffee or fix your make-up depending on gender/sexual preference.

    The list goes on. Any of them can be fixed BUT the moment you do that, you no longer have a motorcycle. You got something else.

    Back to linux. The NATURE of linux is that it is free. Not just free as in beer or whatever but free as to what anyone does to it. Anyone can create a distro or a new UI or whatever because that is what it is all about.

    Change that and you change the nature of linux and you get... well depending on who does it. A Red Hat or a Windows or an OSX. It might be BASED on linux, but it ain't Linux. Same as OSX ain't BSD. It uses it at its core but it ain't BSD. Similar to how Mandriva ain't Suse.

    The suggestions made CAN apply to a distro, even perhaps a collection of distros, but NEVER to Linux.

    It ain't just about the name; the author talks about kernel interfaces and X11 as if they are the same thing or indeed have anything to do with each other. They don't.

    There are already efforts to standardize Linux distros, making them use the same directory layout.

    But making any such effort too official is the way into the development hell that Windows, and for that matter all gigantic software, has become.

    Notice that Linux constantly improves, constantly changes. Well, those two things go together. Either you get Linux, which is a constantly moving target, or you get MS Windows, which doesn't change for years and then breaks everything at once. Oh yeah, remember how Vista was such a dud because all its interfaces changed and none of the drivers worked? Well, guess what: to fix that, the next Windows won't change... so you get a NEW OS in a couple of years that hasn't improved at all, just a few bug fixes, if you are lucky.

    Apple knows this and just threw everything overboard two times already, last time with OSX because sometimes if you want to move on, you just got to break things.

    GNU/Linux is what it is because of what it is. Change that and you get something different, and that might be too big a price to pay to end up with yet another committee-developed OS. We had those. You can get PLENTY of Unixes developed by a single entity. Not a single one of them is as popular as Linux. (Oh, alright, OSX might be more popular, but that ruins my argument so I am ignoring it.)

  • Committee?? (Score:5, Informative)

    by ThePhilips ( 752041 ) on Monday December 01, 2008 @11:48AM (#25945095) Homepage Journal

    the kernel application binary interfaces are a moving target.

    That's why we have glibc, which abstracts that ABI from applications.

    Kernel driver interface - the horse [mjmwired.net] was already beaten to death many times ( see here [slashdot.org]).

    a consistent configuration system, to enable distribution;

    Windows tried that with the Registry - and it didn't work. And it will never work, since one size never fits the requirements of all applications.

    native file versioning;

    It was tried many times before and failed miserably. As long as the majority of files are blobs, versioning at the file-system level makes no sense. Versioning at the application level is already implemented more or less everywhere it was needed, and SVN/git is there for the rest of the applications.
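    And where per-file history really is wanted, putting a directory under git takes about a minute; a minimal sketch:

      cd ~/documents
      git init
      git add . && git commit -m "initial import"
      # ...edit files as usual, then record another version:
      git commit -am "updated the budget"
      git log --pretty=oneline budget.ods     # history of one file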

    audio APIs;

    See ALSA [alsa-project.org] and its user-space libraries.

    See SDL [libsdl.org].

    and the integration of X11 with apps.

    As the FreeDesktop [freedesktop.org] initiative has shown, it's not really needed, nor do the X [x.org] folks want to be bothered with all the end-user bells and whistles.

    Finally, he argues that Linux needs a committee to ensure that all GUIs work consistently and integrate better on the back-end with the kernel.

    Committee?? [quotegarden.com] Buahahhahahaha!!!1!!cos(0)!!!!!!!

    Everything he says was tried before (see (11) [faqs.org]) and can generally be described as "failed".

  • Summary (Score:3, Insightful)

    by characterZer0 ( 138196 ) on Monday December 01, 2008 @11:57AM (#25945307)

    1) Some distributions have lousy package management tools (RHEL) and proprietary vendors insist on releasing software only for RHEL, taking advantage of every stupid little thing it does differently so the software does not work anywhere else.

    2) I want to use complicated programs that no end user needs to touch (e.g. sendmail) but I do not know what I am doing and I screwed up the configuration.

    3) Some hardware vendors refuse to give out specifications so that other people will write drivers for free, but cannot be bothered to maintain drivers themselves.

    4) I want to use a filesystem that has a specific feature, and they exist. I am grumpy that one of them is not the default for my distribution.

    5) There are several audio APIs, each with advantages and disadvantages. I apparently do not know what I want to do, so I want one that has all of the advantages and none of the disadvantages so I do not have to plan ahead.

    6) I do not understand how graphical environments work on UNIX and think that Linux should be responsible for most of what X11 and graphical toolkits do.

    7) I want 'screen' for X11, but do not know about 'xmove'. But what I really want is for my damn proprietary video driver to stop crashing.

    8) There is no company that provides a backup solution because there is not enough market share because most Linux users learned how to use rsync and a handful of other tools.
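    On point 8, the rsync invocation in question is usually no more than this (destination host invented for the example):

      # plain mirror of /home to another machine over SSH
      rsync -a --delete /home/ backupbox.example.org:/backups/home/
      # add --link-dest=<previous snapshot> for cheap hard-linked daily snapshots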

  • by TheZorch ( 925979 ) <thezorch.gmail@com> on Tuesday December 02, 2008 @03:47AM (#25956671) Homepage
    There is a major problem in Ubuntu which the Ubuntu community seems unable to comprehend. In the Screen Resolution dialog in Ubuntu 7.10 it was possible to change what type of monitor you were using if X.org was unable to properly detect your hardware. This functionality was removed from Ubuntu 8.04, Ubuntu 8.10, and Kubuntu 8.10 with KDE 4. Users who try Ubuntu and install the drivers for their video cards can find themselves locked at a resolution of 640x480 with no clear way of fixing the problem. In 7.10 it was as simple as opening the Screen Resolution dialog and changing what monitor you have, but now that functionality is gone. This is a big problem that can put off new users from ever giving Ubuntu, or Linux in general, any serious consideration, and that is unacceptable.
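    Until that dialog returns, the least painful workaround is usually to add the missing mode by hand with cvt and xrandr; a sketch (the output name VGA-0 and the 1024x768 mode are examples, check what xrandr reports first, and note this may not help with the proprietary NVIDIA driver, which handles modes its own way):

      cvt 1024 768          # prints a Modeline for 1024x768 @ 60 Hz
      xrandr --newmode "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 798 -hsync +vsync
      xrandr --addmode VGA-0 "1024x768_60.00"
      xrandr --output VGA-0 --mode "1024x768_60.00"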
