
OSDL's Review of Desktop Linux In 2006

Posted by kdawson
from the it's-a-wrap dept.
derrida writes "The OSDL's Desktop Linux Working Group has published its first year-end report on the state of the overall desktop Linux ecosystem. The report provides insight into the year's key accomplishments in functionality, standards, applications, distributions, market penetration, and more. Of great interest is the Market Growth part. Quoting from there: 'Most observers believe that much of the growth will take place outside of the United States. "It will be in the BRIC (Brazil, Russia, India, and China) countries," said Gerry Riveros, Red Hat, "because of the price and because they aren't locked in yet."'"


  • Other countries (Score:3, Insightful)

    by Mazin07 (999269) on Saturday January 27, 2007 @10:16PM (#17787386) Homepage
    So they're going to say that Linux will really grow in countries like China and India, where street vendors hawk a variety of Microsoft bootlegs for less than $0.50?

    I'm not seeing the appeal.
    • by Hao Wu (652581) on Saturday January 27, 2007 @10:49PM (#17787534) Homepage

      in countries like China and India, where street vendors hawk a variety of Microsoft bootlegs for less than $0.50?
      Untrue. Chicken or dog cost at least many dollar. Half amount of carcass will trade for quality software more or less.
      • Re: (Score:2, Funny)

        by mgiuca (1040724)

        Half amount of carcass will trade for quality software more or less.

        So just to clarify, you mean, "Will trade for software of the same quality as half a carcass."

        Sounds about right.

    • Re: (Score:3, Insightful)

      by canuck57 (662392)

      So they're going to say that Linux will really grow in countries like China and India, where street vendors hawk a variety of Microsoft bootlegs for less than $0.50?

      And you pay $179....

      Or should you pay $98 or $95...

      The OS is a commodity; $9.99 at Walmart with Office is the Windows future. Let's face it, a $5,999 dual-core system from not too long ago goes for $599 today. Everything in personal computing has gone down but Microsoft. Eventually the cost and perceived benefit is going to change. I even predict

      • I call this PONIIC, Price Of Not Investing In Change.

        It's actually called opportunity cost.

      • OS as a commodity (Score:3, Insightful)

        by LauraW (662560)

        The OS is a commodity, $9.99 at Walmart with Office is the Windows future.

        I wish this were true, and it probably is true in the enterprise market. But I think it's unlikely in the home or consumer market in the near term. Items become commodities (in the non-pork-belly sense) when there are many suppliers, producing nearly interchangeable products, competing mostly on price. We're not there yet in the OS world. There are still only a few major players: Windows, Linux, various flavors of Unix, and assorted niche OSs. Price doesn't seem to matter much to consumers and OEMs

      • The price paid for not investing in change is bankruptcy. Did you know the largest maker of horse-drawn wagons in the USA survived into the second half of the twentieth century? They did it by switching from horse wagons to making automobiles [wikipedia.org].

        But it's very hard to do. Very few companies adapt to a big technological change. Motorola did it but most of the companies that dominated electronics manufacturing in 1950 couldn't survive the switch to solid-state devices, just like wagon makers couldn't switch to ma

    • The appeal comes not from the minuscule lower cost of the OS, but from the lower cost of the system required to run it.
    • Oh, the Irony! (Score:5, Interesting)

      by Bill, Shooter of Bul (629286) on Sunday January 28, 2007 @12:55AM (#17788070) Journal
      Your comment is correct. I spent two years in Haiti, the poorest country in the Western Hemisphere. Did anyone run Linux? No. They used old hardware everywhere, old hardware that would not run Linux. I tried replacing the pirated copies with Linux and failed! Bandwidth is very expensive there, at least when compared to income levels, so downloading Linux "for free" is actually much more expensive than the 50-cent Devils Own copy of Windows. You'd have to find an older version of Linux that would run on the hardware (unlikely in most cases) and then compare it to the Windows equivalent. I'm sorry, I love Linux, but Red Hat 5 doesn't compare to Win2k on the desktop for new users. No, DSL Linux didn't work either; it doesn't matter, because the OSS applications running on top that are necessary for real work (OpenOffice or AbiWord) perform terribly on older hardware.

      Technology flows from the first world to the third. They will be the last to get linux on the desktop, not the first.
      • by mangu (126918)
        You'd have to find an older version of linux that would run on the hardware ( unlikly in most cases) and then compare it the the windows equivalent

        That's exactly what I did: installed Slackware from diskettes on an old notebook with 16MB memory and a 1.3GB HD. It runs AbiWord and Gnumeric fine, which is what you need for the majority of office work. It originally had Windows 95, but do you know what the newest version of a Microsoft OS is that will install on a machine with 16MB RAM? And how would you fit a Microsoft OS

        • Re: (Score:3, Informative)

          by the_womble (580291)
          The reason for not using Linux I hear most often is "I can easily find people who know Windows when it needs to be fixed".

          Unfortunately, most of the people who fix consumer hardware are incompetent. People who fix PCs for consumers know how to re-install Windows and some apps, and that is it. The fixed PC will, no doubt, be re-infected (well, what do you imagine the commonest problem will be with pirated software and no systems administration) within hours of being hooked up to the net again.

          On the other hand, it i
        • NO! NO! NO! The hardware support isn't there. The machines won't accept anything except Win 9x. I tried all of the distros you linked to. They didn't work!! Win 98 will run on any old piece of hardware, because they were built to run it. Ignorance would be a factor, but they are a good 7-15 years behind hardware-wise. It has nothing to do with ignorance. I'm sorry, but your post is absolute unmitigated bullshit! To run Linux reliably on the desktop you NEED newer hardware. Sure Linux would save them money IF th
      • First, Haiti is not BRIC. It really shows ignorance to state otherwise. Plenty of it.

        Second, somehow those people managed to run Windows on old hardware... and you claim they couldn't run Linux?!?! Obviously, they haven't tried.

        • NO, I TRIED. I was there for two years. I tried at the best businesses in the country: the textile factories, the Prestige brewery, Barbancourt, hardware distributors. Most wouldn't accept ANY DISTRO. TRUST ME.
      • Re: (Score:3, Informative)

        by vtcodger (957785)
        ***Your comment is correct. I spent two years in Haiti, the poorest country in the Western Hemisphere. Did anyone run Linux? No. They used old hardware everywhere, old hardware that would not run Linux. I tried replacing the pirated copies with Linux and failed! Bandwidth is very expensive there, at least when compared to income levels, so downloading Linux "for free" is actually much more expensive than the 50-cent Devils Own copy of Windows.***

        Only too right, I fear. Windows 3.1 will run satisfactorily with 8MB of

      • by Comsn (686413)

        ( openoffice or abiword) perform terribly on older hardware.
        OpenOffice is a Microsoft Office competitor. You can't run a current MS Office on old hardware either.
  • by NorbrookC (674063) on Saturday January 27, 2007 @10:17PM (#17787388) Journal

    I think the interesting thing about this is the projection of the greatest growth in the "BRIC" countries. I don't think it's so much that they aren't locked into Windows, as much as Microsoft has (inadvertently on their part) pushed it along. When MS started its big "anti-piracy" crackdown, it mostly hit in these parts of the world. Add in the high cost of Windows, and the ever-increasing hardware requirements for it, and a free OS that can run on existing hardware looks pretty darn good.

    The problem desktop Linux is still facing is getting more penetration in the biggest market: the United States. There are still areas where improvements need to be made, and in some areas, applications to be developed. One thing that we have to recognize is that MS is not going to give up its stranglehold on the OEM installed market. The only way Linux is going to be able to make any strides is to recognize that the user is going to have to do the install, and to make it easy for them. There's a project going on for Ubuntu which shows some promise, called Winbuntu; it's a Windows installer for Linux. I don't know how it'll work out, but it shows the concept.

    • the high cost of Windows, and the ever-increasing hardware requirements for it, and a free OS that can run on existing hardware looks pretty darn good.

      Is there any place where this is not true?

    • by westlake (615356)
      MS is not going to give up its stranglehold on the OEM installed market. The only way Linux is going to be able to make any strides is to recognize that the user is going to have to do the install.

      You have written the obituary for Linux in the domestic consumer market.

      This market is middle class.

      This market is antithetical to everything the Geek stands for. If the Geek hasn't learned that lesson by now, he will have it pounded into him again and again over the next five years.

      This market can and will bear

    • by gbulmash (688770) * <semi_famous@y[ ]o.com ['aho' in gap]> on Sunday January 28, 2007 @01:07AM (#17788090) Homepage Journal
      You're forgetting a major factor here. Most people didn't learn the applications they use. They were *trained on them*. They never learned the program conceptually. They learned it procedurally... step-by-step.

      If you change the steps, the order of the steps, or the location of the steps, they're LOST. Not only are they lost... they're angry, unhappy, less productive, complaining, and in need of re-training. It doesn't matter if the new software is better or "just as good". It doesn't matter if the platform is better. They know how to be productive when they're following these specific procedures. If changing the software changes the procedures, you have to re-train them in the new procedures, and you have to deal with all the productivity lost while they learn and adapt. And that doesn't include the pushback from the ones who resist change out of fear or inertia.

      Any exec or front-line salesperson who uses ACT!... never going to switch platforms until ACT! supports that platform. Seriously.

      And that's where your hurdle comes in. Change is neither easy nor painless. Imagine a pain meter on a scale from 1 to 10. Let's say that Windows is a 5 and Linux or Mac is a 2. But the adjustment of switching is an 8. People will opt to stay with the 5. They know the 5. They know they can tolerate the 5. Because even though the 2 is promised, the 8 looms large in the immediate future.

      What's going to prompt people to switch is when the combination of Microsoft arrogance and aggressive bad guys raise the pain of Windows to a 6.5, while the efforts of Linux and/or Apple developers lower the pain of switching to a 6.5 or lower. When switching is no more painful than staying the course (or possibly even less painful), you'll see the needles start to move in bigger ways.

      • You know, I think I'm going to bookmark this post, because it describes the situation with every government office and employee that I've ever encountered in my life, to perfection. And to a lesser extent, most large corporations.

        I'm not sure you know how right you are. (In fact, I hope you don't; and if you do, I feel your pain.)

        The "training problem" is something that most technical people fail to appreciate, because it almost universally doesn't apply to them, because they generally have some conceptual understanding of how their software and hardware operates. Once you have that conceptual understanding, it's nearly impossible to imagine how it would appear without it. It changes the way you think about the tools you use, on a fundamental level.

        Unfortunately, imparting that type of conceptual understanding to someone who isn't interested in learning it is nearly impossible as well -- even when in the long run, it's almost certainly to their benefit to have it.
    • Re: (Score:3, Insightful)

      by mikearthur (888766)
      Technically, the biggest market is the EU, which has seen and will see far greater growth than the US, according to the article.
  • by timmarhy (659436) on Saturday January 27, 2007 @10:17PM (#17787396)
    It's already been identified that the majority of OSS gets developed outside of the USA. I think you will find that America's court system and patent laws are going to make doing software business inside the USA very unpopular over the next decade. You'll end up with only massive corporate entities like MS able to cope with these entry barriers, and if you think you can rely on companies like MS for innovation......
  • My 2006 report (Score:5, Informative)

    by br00tus (528477) on Saturday January 27, 2007 @10:19PM (#17787402)
    I should begin by saying that I am a confirmed Debian booster. I run Debian at home and love it. Anyhow, at work an old server was decommissioned and I was told I could have it, so I burned the latest unstable CD and tried to install Debian. No go. I have heard a lot about Gentoo but have never really played with it, so I decided to try that. No, didn't work. So then I had some Red Hat CDs lying around, so I tried that. No go again.


    In years past I have always noticed that FreeBSD makes it easy to install. Easy meaning it recognizes hard drives, network cards, even 56K modems, without a problem. I installed FreeBSD with two 3.5" standard FreeBSD install disks a few years ago over a 56K modem with no problem. Like the Apple commercials say: "it just works".

    I prefer Debian and Linux to FreeBSD, but Linux distros have a lot to learn from FreeBSD in terms of ease of installation. FreeBSD makes it really easy to install itself on a PC without barfing on network cards, hard drives and so forth. It was the same situation ten years ago when I was installing Slackware on multiple floppies versus my FreeBSD network installs. And from my experience last week, I see it still holds true.

    • by istewart (463887)
      On old hardware, I typically find myself reaching for FreeBSD before anything else. I came into possession of an old Pentium 133 with a CD-ROM drive that couldn't be booted from directly. The only Linux that would install was Damn Small Linux, and I disliked its behavior of booting from an image on the hard drive rather than installing to its own filesystem on the hard drive, so the next candidate was FBSD. It installed and ran great... the only caveat was that the 1GB hard drive was too small for a ports-t
    • Re: (Score:3, Insightful)

      by Firehawke (50498)
      I really don't want this to sound like I'm trolling, based on the content, so be warned in advance.

      About three years ago, I ran into a server (HP, if I recall correctly) that had the strangest problem I'd ever seen: neither Gentoo nor FreeBSD would run on the thing. You could install it, but the thing would randomly kernel panic within 5 minutes of being up, and you couldn't trace where exactly the crash was.

      If, on the other hand, you shoved Windows XP onto the thing.. it would run perfectly fine.

      To this day
      • by strider44 (650833)
        This is a post for 2006, not 2003. I'm serious here: even the difference between kernels 2.4 and 2.6 is a huge one, so you shouldn't get discouraged by one problem that happened 3 years ago.

        The answer to your problem, though, is that it's probably a hardware issue; I'd guess RAM. You should have run memtest86 on it. It's anecdotal evidence though and pretty irrelevant: I had a graphics card that would work fine on Windows but, due to a fault I believe was in the framebuffer, got image corruption on Linux. Th
        • by Firehawke (50498)
          In a way, by misunderstanding my point you've proven it. I'm saying exactly what you're saying in reply to br00tus -- I'll quote:

          'I should begin by saying that I am a confirmed Debian booster. I run Debian at home and love it. Anyhow, at work an old server was decommissioned and I was told I could have it, so I burned the latest unstable CD and tried to install Debian. No go. I have heard a lot about Gentoo but have never really played with it, so I decided to try that. No, didn't work. So then I had some Red Ha
  • Is OSDL funded by Red Hat as well? It seems to have a Novell slant, to the extent of (ludicrously) claiming that cross-platform development is "finally" available with Mono 1.2. Like it wasn't with Java?

    Anyone care to estimate how many companies put Linux in place to run Mono vs. Java?
  • by Anonymous Coward on Saturday January 27, 2007 @10:45PM (#17787502)
    The press on Vista is so bad right now, it should really be used to push Linux adoption on the desktop in '07.

    I myself switched to Ubuntu on my home desktop because of Vista-fear.
  • It's not happening (Score:5, Insightful)

    by Animats (122034) on Saturday January 27, 2007 @10:47PM (#17787514) Homepage

    Every year, I see these "Linux is ready for the desktop" articles. But it never happens. Back in 2004, WalMart offered a $499 Linux laptop. They don't do that any more. [walmart.com] Lenovo, HP, and Dell have fooled around with Linux laptops, but try to order one online. Search for "linux laptop" on Dell, and you get back "Dell recommends Windows Vista(TM) Business." There are some off-brand Linux laptops available, but they're overpriced.

    Linux on the desktop looked closer three years ago than it does now.

    • Okay, first off, three years ago Ubuntu was not out yet. It came out in late 2004. Desktop Linux has come a long way since Ubuntu's release (hell, Ubuntu itself has come a long way; now there are distros that are based off of Ubuntu and fix their parent distro's shortcomings, like Mint Linux).

      Two, I think the focus was on the desktop. Notebooks are slightly different animals. Sleep/Hibernate and all that fun, as well as many of them having their custom buttons on the keyboard.

      Three, when I saw Walmart selling Linspir
  • by starseeker (141897) on Saturday January 27, 2007 @10:50PM (#17787540) Homepage
    KDE in its current form is quite usable for most common purposes, and those abilities it doesn't have can probably be added as widespread adoption takes place. OpenOffice has its faults but it usually does the job. I would say at this point, it's not Linux as Linux that's the holdup. It's:

    1. Legacy systems, documents, and most importantly user training in said systems and documents. "If it ain't broke, don't fix it" rules when computers are the tool rather than the end goal in and of themselves, and it's hard to fault that logic. If you change your systems you're effectively "breaking" your employees in terms of their productivity, and fixing them is quite a job. It's only justified when the end benefits are worth the pain, and to be fair in most cases they probably aren't, at least in the short term. And we all know how good capitalism is at thinking long term.

    2. Compatibility with the largest possible market segment. If your customers/suppliers insist on dealing in old formats (see #1) then it's rather hard to force them to change. And every minute spent dealing with such issues is one less spent on work related to producing something.

    3. Costs of retraining your IT department and switching your software/machines. Yes it will take time - hardware support, IT helpdesk training, identifying and testing replacements for currently used apps, etc. Not painless at all.

    I would say Linux was "ready for the desktop" several years ago, or at least as ready as Windows. KDE and Gnome are excellent systems for most users, once installed and configured properly. (That's what admins are for - work PCs are not normally maintained directly by users, regardless of OS.) Now the problem is revealed as being rather deeper than originally anticipated - it's not JUST Linux that's the problem, it's change period.

    For home use, people want to play media and install thousands of commercial specialty packages, which are all written for Windows. More legacy software issues, with no budget or interest on the part of the people writing them (why target an uncertain platform populated by geeks who give stuff away?)

    The problems aren't technological now - I would say they can be more accurately characterized as inertia. It's hard to give people reasons to switch from something that works, even when the new thing is BETTER than the current one. Linux, due to legal constraints as well as not quite 100% compatibility with things like Word formats, is not and probably CANNOT become (legally) a drop-in which is better in all cases.

    Personally, I think the only hope for a massive switch to an open source OS is one where the software is written in such a fashion that it can be PROVEN (mathematically) to be secure/crash proof/what have you. Such a verifiable guarantee might gain enough interest/momentum to be worth the massive shifts that still have not taken place, but I am aware of no other lack in the marketplace severe enough to warrant it.
    • Re: (Score:2, Interesting)

      by Skewray (896393)
      You will note that, according to the Executive Overview, there is no KDE, only GNOME. Was the report written by Americans?
      • by dhasenan (758719)
        But the executive summary doesn't mention GNOME, either. GNOME isn't mentioned until Section 3. KDE would have been mentioned had it had recent developments regarding Portland, ODF, accessibility, or Samba. Whether this means that KDE already has the functionality and GNOME is catching up, or that GNOME has these features and KDE does not, I do not speculate -- I don't think about those features; the only one I use is ODF.

        Both KDE and GNOME are mentioned in Section 5. Neither are mentioned in sections 1, 2,
    • by twitter (104583) on Saturday January 27, 2007 @11:39PM (#17787790) Homepage Journal

      Legacy systems, documents, and most importantly user training in said systems and documents. "If it ain't broke, don't fix it" rules when computers are the tool rather than the end goal in and of themselves, and it's hard to fault that logic. If you change your systems you're effectively "breaking" your employees in terms of their productivity, and fixing them is quite a job. It's only justified when the end benefits are worth the pain, and to be fair in most cases they probably aren't ...

      Funny how Microsoft has gotten away with just that. Every version of Windoze has a few pointless GUI changes and little real improvement, yet the Dells of the world push it out. Vista and Office 2007 mark the largest GUI change in a long time. Legacy software is broken. Where does that leave the user's "faultless" logic?

      Free software interfaces are more stable. Window Maker is a NeXT clone, and its basics have not changed in fifteen years. There are several others, like fvwm, olvwm, and Enlightenment, that have been just as rock stable. At the same time there have been many other excellent interfaces that have grown up. All of them are extensively customizable, so you can have as much change in each as you like, and they all work together, so you can mix and match. The same performance from Microsoft would mean the Windows 3.1 GUI would still be adequate, still customizable, and easily interchangeable with a dozen other excellent window managers. Right.

      The same arguments apply to file formats and hardware. Vista is bringing with it .DOCX, the M$ "open" format with a 6,000-page spec. It's also going to obsolete 54% of existing computers, and 94% of them are not really "premium" ready, so their users will soon be disappointed by an upsell that degrades their actual performance. DRM promises to make it all that much worse.

      The real hope is that Vista goes nowhere. XP did not move hardware, and it had much better driver and legacy application support at launch. It took four long years for it to become the majority. People want new hardware, and it's time for it to move. There are major improvements that are good for both performance users and people who want something small and quiet. If Vista's changes are so bad that they actually harm sales, look for Dell, HP, and others to follow Lenovo's lead to make up the difference. That would break the M$ monopoly once and for all, and then we would not have to worry about this upgrade train nonsense.

      Vista - the Ow is Now.

  • My experience (Score:2, Informative)

    by DarkWicked (988343)
    As a relatively new linux user, I can say that I've seen significant progress in the ease of use and functionality in just one year.

    I started with Fedora Core 5, got jealous of some of the functionality of my girlfriend's Ubuntu, and I'm now extremely satisfied with Fedora Core 6, which brought all the functionality it lacked and even great extras I didn't know I needed, like the desktop effects (Xgl/Compiz).
  • If anyone wants to get an idea of just how tough it can be to get a binary graphics driver running, just check out the ATI forums at phoronix [phoronix.net]. It's a painful process, though I will say one thing: while I can't run Beryl, I now have one hell of an FPS speed on my older ATI 9000 chipset graphics card.

    Next time, it's going to be Nvidia.

    • by MooUK (905450)
      I've had absolutely no trouble getting the binary nvidia driver working under Ubuntu Edgy. Nor any trouble getting Beryl to run. (And the results are most prettiful.)
  • OSDL = Open Source Development Lab. I had to look it up. Thought someone else might be wondering.
    • OSDL are your hosts here at Slashdot. Few people don't know them, and you could just look at the very first links on the /. page.

      • by AusIV (950840)
        I recognized the acronym, but I didn't know what it stood for. I know slashdot hardly qualifies as a journalistic publication, but even so I think it's bad practice to have an entire summary without ever mentioning what an acronym stands for.
      • OSDL are your hosts here at slashdot

        Nope, actually, Slashdot is hosted by the OSTG [ostg.com], this acronym meaning Open Source Technology Group.

  • I switched my Gnome desktop theme about 20 times last year, unable to decide whether a Vista-styled theme was glorifying or mocking the competition.

    Then I decided that I still prefer Clearlooks.

    Boy, what a year.
  • Printing (Score:3, Informative)

    by smitty_one_each (243267) on Saturday January 27, 2007 @10:58PM (#17787592) Homepage Journal
    Some kind of corner has been turned for the GNU/Linux desktop in 2006.
    I light off CUPS (that is, go to http://localhost:631/ [localhost] in FF), enter the IP address of the printer in the obvious place, and stuff works.
    It's a cheesy home wireless network; I really want the Dumbest Thing That Works, realizing that if there is a reset, DHCP may re-jigger things.
    Trying to figure out how to set a printer by IP in that other OS has baffled me. It's an Easter Egg hunt gone ronngg. The quest for simplicity has been abandoned at a variety of levels.
    At least I only have to suffer that OS at work.
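The web-interface step described above has a command-line equivalent worth sketching. The IP address and queue name below are invented for illustration, and the command is printed rather than executed, since lpadmin needs root privileges and a running cupsd:

```shell
#!/bin/sh
# Sketch: adding a network printer to CUPS by IP, the command-line
# equivalent of the http://localhost:631/ web interface.
# PRINTER_IP and QUEUE are made-up example values.
PRINTER_IP=192.168.1.50
QUEUE=homeprinter

# socket:// is the raw JetDirect backend (port 9100); ipp:// works for
# printers that speak IPP. -E enables the queue and accepts jobs on it.
CMD="lpadmin -p $QUEUE -E -v socket://$PRINTER_IP:9100"

# Printed, not executed: running it requires root and a live cupsd.
echo "$CMD"
```

Once such a queue exists, `lpr -P homeprinter file.ps` would submit a job to it.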
    • Some kind of corner has been turned for the GNU/Linux desktop in 2006. I light off CUPS (that is, go to http://localhost:631/ [localhost] in FF), enter the IP address of the printer in the obvious place, and stuff works.

      That ass Raymond not too long ago held up CUPS as an example of things FOSS is doing wrong. I hate that sometimes. A lot of very talented people did hard and thankless work to make things work as well as possible, and all some can do is bitch about what doesn't. I'm no fanboi. I know there are things th

      • Re:Printing (Score:5, Insightful)

        by zcat_NZ (267672) <zcat@wired.net.nz> on Sunday January 28, 2007 @12:14AM (#17787922) Homepage
        Browsing to http://localhost:631/ [localhost] in firefox to configure your printer is one of the totally counter-intuitive things ESR was complaining about. Browsing to some random port on localhost is like having to tweak a registry key in XP, and it should not be necessary or tolerated for anything a 'normal user' is likely to do.

        If you want to add a new printer there should be an "add new printer" tool somewhere obvious, like under the System menu. Bonus points if it already detects the attached printer for you, and if the system can be configured to pop up the add-printer dialog any time you plug in a new printer.

        • Sorta... (Score:3, Interesting)

          by Xenographic (557057)
          In terms of being cryptic and user hostile, I agree that editing the registry and browsing to some random port on localhost are about the same.

          However, there IS one important difference--it's quite easy to screw over a machine by mucking around in the registry, either by accident or because the instructions you found were incorrect. I can't compare that to simply browsing around localhost. What's the worst you could possibly do? Hit the wrong port and get a screen full of crap from chargen?
          • If you're configuring the computer by going to a privileged TCP port, then you had best hope that you can't reach through the port to interact with, say, any of the CUPS configuration interfaces; they're notorious for containing buffer overflows and other exploitable conditions. What did you say? That's a direct link to those interfaces? You're joking, right?
        • by unapersson (38207)
          What like:

          System -> Administration -> Printing

          followed by "Add New Printer".

          Which takes you to a Wizard. So it's part of GNOME already.
        • by MooUK (905450)
          On some distros, there *IS* an add new printer option in the system menu. Ubuntu, for example.

          'Course, clumsy users can still cock things up, but they do that on whatever OS.

          (I misconfigured a printer recently and couldn't work out why; I actually found it EASIER to fix the problem within the CUPS web interface)
  • by alan_dershowitz (586542) on Saturday January 27, 2007 @11:01PM (#17787602)
    I first tried Linux in 1997. At the time I couldn't imagine using it as a desktop. However, there were a few turning points for me:

    1) GOOD package management. I started out on Redhat. Whenever anyone brings up RPM problems, they get reamed on Slashdot "RPM IS NOT A PACKAGE MANAGEMENT SYSTEM!" Well, once upon a time, there wasn't Yum or Red Carpet, and the best thing there was (RPM) was still hell to use. Now between RHEL and Gentoo, I rarely have to worry about not finding dependencies. Thank God.
    2) 2.6 Kernel. The reason is that before 2.6, X under Linux always "felt" slow.
    3) Firefox.
    4) More expansive community and documentation. I remember in 1997 trying to get help:

    ME: "I'm trying to do X and it's doing Y. Does anyone have experience with this? "
    THEM: "RTFM"
    ME, (looking): "The man page doesn't say anything"
    THEM: "+b You've been banned, troll."

    When I look at the Gentoo install documentation and user forums now, I am just in awe. Likewise for many of the other major distros.

    Now that wireless is going smoothly, the only thing I have left to complain about is that no matter what I do, font rendering is inconsistent and often ugly. But as of two years ago, I am a happy full-time Linux user! Take this for what it's worth; I just wanted to share my experience.
    • by DrDitto (962751)
      I first tried Linux in 1995 (Slackware). It was cool, but the install lasted a few weeks. Used HP-UX at work/school for most of the next 4 years. My home PC ran Windows NT 4.0 for most of the following 4 years. I never saw one crash or BSOD with NT 4.0. Stable as a rock.

      Nonetheless, I tried Linux on my home PC again in 1999 (Redhat). Install lasted about a month. Tried Linux on my home PC again in 2001 (Debian). Better, but not as good as Windows 2000. Win2K was a great Microsoft release....just
    • by jimicus (737525)
      When I look at the Gentoo install documentation and user forums now, I am just in awe. Likewise for many of the other major distros.

      Things have improved drastically there; this is so true.

      It pains me to say it, but there are still a few (some surprisingly major) OSS applications with mailing lists for which your "you've been banned, troll" comment would be an understatement.
  • Interestingly, the article mentions the BRIC countries. BRIC http://en.wikipedia.org/wiki/BRIC [wikipedia.org] is a set of countries poised for great economic growth this century.

    So it's obviously a set of countries that Linux should be in.

  • by zx-15 (926808)
    I think that desktop Linux is not ready because it is still plagued by the problem of text configuration files. I'm perfectly OK configuring my Debian box from various files in the /etc directory, but most users, i.e. normal people, aren't, and as long as proper GUI configuration tools, like the Control Panel in Windows, are absent from the KDE/GNOME desktop environments, I don't think the majority of people would want to use it. And these tools won't be there for some time, because a few distros currently supp
    • by pembo13 (770295)
      What do they lack? YaST? system-config-*? The KDE Control Center?
    • like the Control Panel in Windows, are absent from the KDE/GNOME desktop environments

      Like System Settings (kcontrol) in KDE and the Gnome Control Centre?

      The few occasions when I have resorted to editing text files have been when doing things that the majority of people do not do.

      There are lots of things for which Windows users need to edit the registry [google.co.uk]. This is usually more complicated than editing a Linux config file.
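      The difference can be sketched concretely. This is just an illustrative stand-in run against a throwaway temp file, not any real /etc config:

```shell
# Text configs can be read and changed with ordinary tools.
# Uses a throwaway copy instead of a real /etc file.
conf=$(mktemp)
printf 'Port 22\nPermitRootLogin no\n' > "$conf"  # stand-in config file
grep '^Port' "$conf"                              # reading a setting is just grep
sed -i 's/^Port .*/Port 2222/' "$conf"            # changing it is one sed line
grep '^Port' "$conf"                              # prints "Port 2222"
rm -f "$conf"
```

      The Windows counterpart is typically an opaque `reg add ... /t REG_DWORD` against a binary hive, which is much harder to eyeball afterwards.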

      Stop Spreading FUD.

    • by miro f (944325)
      openSUSE YaST2 and control center actually manage to configure pretty much anything that you would need with the desktop without any issues (well, except for the fact that it's damn slow)

      it's a shame no other distro has come up with anything nearly as good.
    • So use Mandriva. It has MCC, which works better than 'control panel'.
    • Re: (Score:2, Interesting)

      by Dilaudid (574715)

      I think that desktop Linux is not ready because it is still plagued by the problem of text configuration files. I'm perfectly OK configuring my Debian box from various files in the /etc directory, but most users, i.e. normal people, aren't, and as long as proper GUI configuration tools, like the Control Panel in Windows, are absent from the KDE/GNOME desktop environments, I don't think the majority of people would want to use it.

      I think you're wrong. I had to know what autoexec.bat was on Windows 95, and the Control Panel is not usable by most users; anyone who can use it can use a text file just as easily. Most users are quite content with an "I can't make it work, it's broken, but I can live without it" attitude, so the key thing is that the computer (mostly) configures itself, which on Ubuntu, it seems to do. Thank god.

    • "I think that desktop Linux is not ready because it is still plagued by the problem of text configuration files. I'm perfectly OK configuring my Debian box from various files in the /etc directory, but most users, i.e. normal people, aren't"

      As a confirmed Debian user I find it strange that you don't know about Synaptic [debian.org], a GUI front-end to the Debian package manager. Have you mentioned Xandros [ataglance.co.za], Ubuntu, or Linspire to the 'normal people'? All three are based on Debian, with not a text config file in sight.

      "as
    • by MooUK (905450)
      Ubuntu seems to have easier-to-find config GUIs than windows in some ways, although if you're too fixed on "this option is in this control panel section" then naturally you'll have to relearn where it is.

      (This is the third post I've made in this discussion praising Ubuntu. This worries me. I'm no fanboy, or at least I thought I wasn't.)
  • Overlooked... (Score:3, Insightful)

    by Anonymous Coward on Sunday January 28, 2007 @12:57AM (#17788074)
    Is STANDARDIZATION. Linux will never make it until there is a hell of a lot more standardization. Not 50000 formats for releasing packages, but ONE that works everywhere. JoeSchmoSoft doesn't want to install 20 different distributions to test that their packaging works and to create, host, and support those 20 different package formats.

    In Windows, one InstallShield package does everything on any Windows version.

    Nor does Grandma Gertrude want to download all 20 to figure out which one works on her system if all she knows is she's running Linux. Even if she does grab the correct package, she certainly isn't going to be able to open a shell, su to root, and dpkg -i or rpm -Uvh the file. She will double-click it, see nothing happen, and give up.
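    For reference, the root-shell dance being described looks roughly like this; the block just prints the cheat sheet, and the package filenames are made-up placeholders:

```shell
# Print the manual-install steps referred to above.
# The .deb/.rpm filenames are placeholders, not real packages.
cat <<'EOF'
su -                             # become root
dpkg -i someapp_1.0_i386.deb     # Debian/Ubuntu family
rpm -Uvh someapp-1.0.i386.rpm    # Red Hat/SUSE family
EOF
```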

    Something like VFW would be welcome: I install a video/audio codec, and it works in all applications. Also, some cleanup of the video code in general: why is there no decent video output? OpenGL has tearing issues even with __GL_SYNC_TO_VBLANK = 1, Xv is really pixellated; I forget the others now, but none look nearly as good as the standard overlay in Windows.

    Standardization of directory hierarchy. Does that executable go in /bin, /usr/bin, /usr/local/bin, /opt/bin? What about the conf file, is it in /etc, /etc/progname/, /opt/progname, /opt/progname/etc, ...? This may be my being 'used' to Windows, but I prefer each program has its own directory where all of its files are, that way I know exactly where to look, and I can have a nice overview of what is installed.
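    In fairness, you can usually ask the system where things went instead of guessing. A small sketch using stock commands (exact paths vary by distro):

```shell
# Locate a binary without guessing among /bin, /usr/bin, /usr/local/bin:
command -v ls      # prints an absolute path such as /bin/ls
# On dpkg-based systems, list every file a package installed:
#   dpkg -L coreutils
# On rpm-based systems:
#   rpm -ql coreutils
```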

    Some just general stupidity as well: I was bored and decided to try SuSE, so I popped in the 10.2 minimal CD, which lets you install over the network. First off it takes like 2 minutes to boot into the installer, and it's running like I'm on a 386; if you've ever booted off the minimal CD, you'll see it has no reason to run like that. Anyway, I sucked it up and figured it would be okay once everything was on the hard drive. The initial question is where you want the installer to get the files from. I was on a laptop, so I hit network, wireless, entered my WPA key, and off it went: grabbed the installer files, launched the installer, grabbed all the packages, installed, and rebooted to finish the configuration. Except... this time it didn't ask for my network settings, nor save the ones I entered earlier, so I ended up hitting skip on a bunch of files it needed for the final configuration because it couldn't see my freaking network anymore.

    So I figure oh well I'll just configure it by hand when it boots, so I reboot and after about 5 minutes of waiting for it to boot up to a login screen I just hit the power button and boot off the windows CD and remove all the partitions and reinstall Windows.

    Note: It is not the hardware, I had a Gentoo install on it ever since I bought it that worked fine, I just got tired of waiting for crap to compile all the time, and was hoping for a 'no-hassle' installation.

    Other general stupidity...

    - I have to install like 300MB of libraries in order to run Firefox if I'm running KDE.
    - I have to wait for all those libs to load every time I open Firefox off of a fresh reboot.
    - Total lack of standardized advanced GUI tools. What tool is good for administering what programs start when the system starts up? What about when the user logs in to X? How about something that tells me what video codecs are installed, what audio codecs?
    - No apps seem to be 'lightweight'. Look at my two favorite windows media applications, Foobar2000 and Mediaplayer Classic. Foobar2000 is a 1.6MB download, supports every audio format under the sun, and loads almost instantly on a cold start. The closest thing in Linux is AmaroK, which is a 20MB download, and loads slow as holy hell, and doesn't offer nearly the range of audio support foobar does. Now MPC is a 1MB download (3mb? uncompressed s
    • by ardor (673957)
      Most of your points are valid only if you manually install the components. Distros like Ubuntu do take care of the 300MB libraries etc.
      Some others:

      - Today RPM and DEB are the de facto standard package formats. There are also GUIs for installing these packages that handle dependencies as well (gdebi for DEBs; I don't know what the RPM one is called).
      - GStreamer is something like VFW, and is gaining popularity. (Ubuntu uses it by default.)
      - The directory hierarchy is irrelevant for the home user. Again, take Ubuntu
  • I know I'll get smacked for this, but I've installed Edgy Eft to build a modest home theatre PC with MythTV. I did this because I wasn't 100% happy with the Vista Media Center.

    It's still not working correctly because of some pretty glaring issues with usability and interoperability in Linux.

    Installation - whilst installation was in fact a breeze and as simple as a Windows install (although, Vista offers better hard disk controller management, so if things go screwy with your install, Vista would be the s
    • by petrus4 (213815)
      Depending on your hardware, try FreeBSD. VLC works a treat via ports, and installs a heap of codecs as well. Granted, the Open Sound System perhaps isn't quite as nice as ALSA, but it offers perfectly good stereo, and there are actually a number of sound filters available for XMMS at least which do various different things, so if you want something that boosts volume or cleans sound up in some way, you can probably get that from ports too.

      I wouldn't bother with Samba, personally...use FTP. You can put an
      • What do you advocate as a replacement? The Registry? ;) The Registry uses text in places...the DWORD is actually an assembler variable...but it also uses a lot of non-text and is completely non-transparent in places. There are good reasons why UNIX (and hence Linux) uses text config files...you can read about those here if you want.

        Ideally, as an end user in this particular case, I should have no idea whatsoever where the configuration is kept. It should just be "done for me" through GUI dialogs. At least
