Linux Software

Linux on the Desktop Doubles in 2007

00_NOP writes "According to a report on Softpedia, citing Net Applications, Linux usage on the desktop doubled over 2006-07: though from a miserable 0.37% to a still not brilliant 0.81%. Given that Linux is free, is based on peer-reviewed source code (and so inherently more secure in the longer term) and that hardware support is now pretty good, how long are we going to have to wait for the big breakthrough?" Of course the focus of the article is that Vista is kicking butt over Mac/Linux, which is not particularly surprising.
  • by __aajwxe560 ( 779189 ) on Sunday October 07, 2007 @09:21AM (#20887349)
    Background: I am a sysadmin for a 300+ node Linux shop, and have fairly lengthy experience in Solaris, Windows, and AIX as well.

    I still run Windows XP as my desktop of choice. I only run it because it came with the laptop that was provided to me by IT, or I would probably still be running Windows 2000. Very simply, I use the OS as a tool to get my job done, and Windows 2000 was doing the trick. Windows XP is now doing the trick. When there is something I want to do that Windows XP can no longer do, I will look beyond it. If Linux starts to pioneer new features in areas that Windows and Mac OS cannot answer, then I will certainly consider it for my desktop OS. Meanwhile, I deal with enough headaches from users at the server level that I don't feel like battling my Linux wifi drivers, sound card strangeness, or other hurdles just to stay productive. Of course there are patches and workarounds for most or all of the issues I have seen, but that doesn't mean it's acceptable to me.

    Now, cue over to the server arena, and Linux is certainly replacing Windows boxes for all standard day-to-day servers. It does what I need, it does it well, and even offers features and ease of use that the Windows boxes simply cannot match. That was the compelling reason, with cost a close second, that we now run so many nodes.

    Meanwhile, who really cares? If _XXXX_ does what you want, use it.

  • by speaker of the truth ( 1112181 ) on Sunday October 07, 2007 @09:21AM (#20887351)
    Actually that would be more than a 100% increase. 0.81% is not a 100% increase over 0.37%; doubling 0.37% would only give 0.74%.
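    The arithmetic is easy to check (a quick sketch using only the figures from the summary):

```python
# Market-share figures from the summary: 0.37% at the start, 0.81% at the end.
old_share = 0.37
new_share = 0.81

# A true doubling (a 100% increase) would land at exactly twice the old figure.
doubled = old_share * 2
print(doubled)  # 0.74

# The actual relative increase is a bit more than that.
increase = (new_share - old_share) / old_share * 100
print(round(increase, 1))  # 118.9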
  • by jonathan3003 ( 797920 ) on Sunday October 07, 2007 @09:35AM (#20887425)
    Shitty website, low quality news. Just as an example, the same site has a different article that doesn't favor Vista: http://news.softpedia.com/news/Vista-Is-Nothing-Compared-to-XP-Move-to-Mac-OS-X-and-Ubuntu-Linux-65786.shtml [softpedia.com]
  • by Anonymous Coward on Sunday October 07, 2007 @10:06AM (#20887611)
    Any scientific journal worth its salt has a mandatory peer-review process, no matter how obscure the topic may be to the mainstream.
  • by Trax ( 93121 ) on Sunday October 07, 2007 @10:30AM (#20887795)
    Have you looked at the linux-uvc drivers found over at http://linux-uvc.berlios.de/ [berlios.de] ?

    To quickly install and test the driver, do the following:

    1. Install subversion
    2. Execute 'svn checkout svn://svn.berlios.de/linux-uvc/linux-uvc/trunk' (without the quotes)
    3. Execute 'make' in the source directory
    4. Execute 'sudo su' to become root
    5. Execute 'make install'
    6. Plug in the camera and use either luvcview or ekiga to test it out
  • by ozmanjusri ( 601766 ) <aussie_bob.hotmail@com> on Sunday October 07, 2007 @10:34AM (#20887817) Journal
    That's right, it's not exactly a 100% increase; it's slightly more.

    It's fortunate some of us can do the maths.

    In the past month, the open source operating system only increased its footprint on the market by 0.4%, from 0.77% to 0.81%.
    Maybe you can teach the "Technology News Editor" a thing or two...
  • Re:meh (Score:3, Informative)

    by Bert64 ( 520050 ) <bert.slashdot@firenzee@com> on Sunday October 07, 2007 @10:36AM (#20887833) Homepage
    Closed source is negative because it stifles progress...
    Each vendor has to reinvent the wheel, and can't legally learn from the others. With open source you can reuse other people's code and build upon it. Closed source ensures that only vendors with enough cash to develop a complete application can enter the market, with open source it's easy to build upon an existing project.

    Smaller companies or individuals who want particular features have very little chance of getting them in a closed source world, they would have to pay whatever fees a given vendor demanded *if* that vendor was even willing. With open source sufficiently capable people can implement those features, while other people can hire coders to do it for them.

    New hardware architectures are far less likely to succeed; just look at IA64 as an example, failing miserably even with the backing of Intel and HP, because people can't run their closed-source apps on it. And vendors won't port those apps until there's a market, so you have a catch-22. Processor makers are therefore constrained by choices Intel made 30 years ago, as they try to develop new chips while maintaining compatibility. As another example, Apple had to spend considerable time and effort on Rosetta to allow legacy PPC apps to run on their Intel-based Macs. In an open source world many of those apps could simply be recompiled, and doing so for a large number of them would probably have taken Apple less time and effort than writing Rosetta.

    There's also the matter of trust, some large companies and governments are paranoid and want to see the source code and actually build it (so they can be 100% sure the binaries they have came from the source they've seen). A lot of people are equally paranoid, and some of them do have the capability to audit and compile the source.

    Long term support - closed source software is at the mercy of its vendor, so there is a chance of the product being discontinued, or the source code being lost. Users of closed source software have no fallback in situations like these.

    Multi vendor support - with the source open, any vendor can begin providing support services around an open source application, customers are free to choose the vendor and support package that suits them, instead of being stuck with a single source of support. As a consequence, vendors are forced to compete. If you want a commercially supported linux you have plenty of choices, for commercially supported windows you have only one source.

    Less lock-in, with open source you are far less likely to find your data locked away in a secret format known only to one company.

    There are many negatives associated with closed source, and virtually no positives as far as the customers are concerned. If you have evidence to the contrary I'd like to hear it.
  • by Rakshasa Taisab ( 244699 ) on Sunday October 07, 2007 @10:40AM (#20887867) Homepage
    The OP didn't pay to be published in a peer-reviewed paper.
  • by kmhofmann ( 1038332 ) on Sunday October 07, 2007 @10:44AM (#20887909)
    But as long as the software is doing exactly what I *need*, and it's keeping me productive, I don't need to control it... I understand and respect the philosophy behind free software, but in the end, if non-free software helps me to get a certain job done quicker and/or better, the choice will be obvious.
  • by rtyhurst ( 460717 ) on Sunday October 07, 2007 @11:08AM (#20888051)
    Well, this, from the article seems like typical Microsoft spin and FUD:

    >Windows Vista, with all its overcriticized faults, evolved from 0.16% in December 2006 to 7.38% at the end of the last month. During the same period, Windows XP dropped from 85.30% to 79.32%, a percentage slip which makes it obvious that XP users upgraded/migrated to Vista and not to Mac OS X and Linux. While of course there is also a small segment that did in fact make the jump to the two alternative platforms, it is clear that the vast majority of XP users remain loyal to the Windows brand.

    "Overcriticized faults"?

    Ha! Vista is a buggy bloated piece of poo which it took them *7 years* to come up with.

    I got booted from a MS friendly site for saying so.

    Example 1: Firefox is cleaning house in the browser biz.

    Example 2: My local computer store can't keep up with the demand for free Ubuntu CDs.

    Example 3: "Songbird" is an elegant and good-looking open source media player which, even in the current developer release, works better and has far more interesting features than WMP.

    I'd say MS is running scared, and for good reason.
  • by ozmanjusri ( 601766 ) <aussie_bob.hotmail@com> on Sunday October 07, 2007 @11:22AM (#20888139) Journal
    Linux still won't install on my Intel DG9965WH Motherboard

    That's 'cause Intel has never made a DG9965WH Motherboard.

    Why would you expect Linux to install on imaginary hardware?

  • by afabbro ( 33948 ) on Sunday October 07, 2007 @11:43AM (#20888303) Homepage
    It's also why Japan is having its densely populated cities (along with other areas) wired with fibre optic while we're stuck with inferior methods of internet access. Japanese businesses are willing to look at the long term while American businesses only look to the next quarter.

    Yeah... that must be it. It couldn't be because the entire country of Japan is smaller than California, and when you subtract the uninhabitable mountains, volcanoes, etc. it's more like Nevada. Or that it has some of the densest metro regions in the world, including the world's largest, Tokyo.

    Nope, couldn't be that running fiber everywhere is a much smaller and easier task. Must be that the Japanese are so clever and the Americans so dumb.

  • by speaker of the truth ( 1112181 ) on Sunday October 07, 2007 @11:54AM (#20888381)
    Well, as I said, while I'm not completely happy with it, I accept it. However, continuing to wait for the next Windows release has been fruitless, while Linux keeps improving (even a cursory search already reveals one advantage). So that's why I'm switching.
  • by foobsr ( 693224 ) on Sunday October 07, 2007 @12:39PM (#20888721) Homepage Journal
    DG9965WH Motherboard

    This must be so new even Intel does not know about it.

    If you instead meant 'Intel® Desktop Board DG965WH', Intel [intel.com] thinks it supports Linux.

    Maybe the offered BIOS upgrade will do, just a wild guess.

    CC.
  • by burnin1965 ( 535071 ) on Sunday October 07, 2007 @12:46PM (#20888785) Homepage

    That's quite a flatline if you tilt your head to the side.

    The person who wrote that article either doesn't have a clue what they are doing with the statistics or they have learned to generate proper statistical lies.

    When you look at the chart included with the article it does appear to be a flat line. Funny thing is, they all appear to be pretty much flat lines. Since the scale on the chart runs to 100%, the growth in OS X and Linux is masked by the market share of Windows XP. I have to deal with these lies occasionally, and all you can do is thump them over the head with a statistical clue bat. I'm no statistician, but when people start drawing statistical conclusions from pretty pictures instead of the raw numbers with a proper analysis (ANOVA, t-test, chi-squared, something), their conclusions are suspect.

    If anything the numbers suggest that, with support from PC vendors willing to sell and support PCs with Linux preinstalled, there is a market. With no marketing and only recent support from a large PC vendor (Dell's Ubuntu preinstalls), Linux gets a nice uptick in market share. With more data points a true analysis could determine whether there is a trend.

    Anyone who suggests that market gains by the latest monopoly product in any way indicate acceptance of the product is simply ignorant. What else would anyone expect for the monopoly product from the company that manipulates the market and coerces vendors?
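    The scaling point is easy to demonstrate (a sketch; the pixel height is an assumed value, the shares are the ones quoted from the article):

```python
# On a y-axis running 0-100%, a sub-1% share barely moves, no matter how
# fast it is growing in relative terms.
axis_height_px = 300  # assumed height of the chart's plot area, in pixels

old_share = 0.77  # Linux share last month, per the article
new_share = 0.81  # Linux share this month

# Visible movement on a full 0-100% axis: a fraction of one pixel.
movement_px = (new_share - old_share) / 100 * axis_height_px
print(round(movement_px, 2))  # 0.12

# Relative month-over-month growth, which the flat line completely hides.
growth = (new_share - old_share) / old_share * 100
print(round(growth, 1))  # 5.2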
  • by btarval ( 874919 ) on Sunday October 07, 2007 @01:12PM (#20888967)
    There are currently about 1 Billion PCs worldwide. If Linux had a 0.81% "marketshare", that would equate to a grand total of 8.1 Million systems. Fedora alone provided a fairly accurate measure of installed systems last year (for FC6 IIRC), and the last I heard it was 7 Million and counting.

    But that's just Fedora alone. Ubuntu has a significantly bigger "marketshare" than Fedora. SuSE is also a significant player. Altogether, the Linux marketshare is probably somewhere between 3-5 times what Fedora is reporting, which would put Linux at about a 3-4% marketshare, worldwide.

    But the point remains that the numbers in the article don't jibe with what most other people are reporting. In fact, these numbers are downright silly.

  • Also, no. (Score:4, Informative)

    by btarval ( 874919 ) on Sunday October 07, 2007 @03:23PM (#20890013)
    Please read the fine article. They are explicitly talking about the growth of the operating systems, and they don't make any distinction in terms of desktop or server. In fact, the word "desktop" is nowhere to be found in the article at all.

    Now, one might infer that it's intended for desktops. But that inference is left up to the reader. It is explicitly not what the article is claiming. TFA is only talking about their measurements of the total growth of OSes.

    Had they stated that it was for Desktops only, and that they weren't talking about servers, this article might have more credibility. But they didn't. They are, instead, trying to misrepresent things.

  • by spuzzzzzzz ( 807185 ) on Sunday October 07, 2007 @05:17PM (#20890875) Homepage

    Take a closer look at the advisories instead of just counting them. Windows 2003 had 135 advisories: 61% were vulnerabilities from a remote attacker and 24% were vulnerabilities from the local network. The most common vulnerability type was system access (54%) and 74% of the vulnerabilities were of moderate or higher criticality (and 41% were highly or extremely critical).

    In the same period, the linux kernel had 132 advisories. Only 19% involved a remote attack and 13% involved attacks from the local network. Of the 132 advisories, only 15% were rated moderately critical and none were of higher criticality. The most common type of attack was denial of service (46%) followed by privilege escalation and the exposure of sensitive information. System access (remember, this was a factor in 54% of Windows 2003 vulnerabilities) made up 2% of linux kernel vulnerabilities.

    Ok, so this was only the linux kernel; I'm not necessarily asserting that the whole *NIX software stack is secure. Nevertheless, your approach of looking only at the number of vulnerabilities is highly flawed. Lies, damn lies and statistics indeed.
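    Turning those percentages back into rough absolute counts makes the contrast plainer (a sketch using only the advisory figures quoted above; counts are rounded):

```python
# Advisory totals quoted above for the same period.
windows_advisories = 135
linux_advisories = 132

# System-access vulnerabilities: 54% of Windows 2003 advisories vs 2% for
# the Linux kernel.
win_system_access = round(windows_advisories * 0.54)
linux_system_access = round(linux_advisories * 0.02)
print(win_system_access, linux_system_access)  # 73 3

# Remotely exploitable: 61% vs 19%.
win_remote = round(windows_advisories * 0.61)
linux_remote = round(linux_advisories * 0.19)
print(win_remote, linux_remote)  # 82 25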

  • by cindysthongs ( 1168367 ) on Sunday October 07, 2007 @05:38PM (#20891053) Homepage
    in reality, peer reviewing code does work - whether it is memory leaks, security holes, or even performance problems. Code is complex enough that nobody can be the be-all-end-all expert in every aspect. However, there is a good chance that a specialized expert in security that contributes to open source will see the problem and either fix it or tell you how to fix it. Same for memory leaks and everything else.
  • by Anonymous Coward on Sunday October 07, 2007 @06:12PM (#20891275)
    Using Ubuntu here, and while I do like the app-in-a-bundle approach used by Mac OS (as an aside, don't apps put files in ~/Library or something?) the .deb deal is great from a user perspective. Completely consistent, unlike the Windows experience, and greatly benefited by the completeness of the system, whereby thousands of useful free apps spanning many categories can be browsed and installed using a simple utility. I don't mind installs putting files all over the place as long as a) I don't have to worry about the details and b) the uninstall process will effectively reverse any changes made. The deb package system combined with Synaptic meets those requirements - it's basically abstracted to the point where it's more like you're just enabling the pieces of software that you want, rather than acquiring and installing them. You're probably right that it's more of a PITA for developers, though.

    With regard to the product Linux is chasing - I think there are a few things people need to take into account. OS X has the massive, order-of-magnitude-level advantage of designing for specific hardware. Windows has the not quite so massive advantage (from a compatibility point of view) of being the leader in market share, which means there's a compelling reason for hardware manufacturers to make sure their drivers work well on that platform. Linux on the other hand has neither of these advantages - it needs to support as much hardware as possible so as not to restrict the potential user base, but has to do most of the work itself since the incentive for a lot of manufacturers just isn't there. When you think about it like that, the state that Linux is in in 2007 is amazing. Also, I realise that from a user's perspective none of this makes the slightest difference - if something doesn't work, it doesn't work, and it doesn't matter why.

    My personal experience with Linux on the desktop (I've been using it in some form or another as a file/web/print/mail server for years) has been up and down. I initially had a crack with Corel, then tried Redhat, using variously KDE, Gnome and blackbox, and booted into the Kororaa LiveCD (the first one with desktop compositing) to check that out. Recently though I installed Ubuntu on my laptop, an HP which is a couple of years old now. I'd heard the stories about wireless support, graphics issues, etc, but for me the experience was amazing after my previous aborted attempts. Wireless with WPA, desktop effects, all the media buttons, the card reader, etc, all worked without a hitch on the first boot. Oh, and it booted straight into the native resolution of the panel, which is more than can be said for any Windows installation I've ever performed (I haven't installed Vista). Not to mention having a fully-featured free office suite installed by default, etc. Codecs were a total non-issue - Totem informed me that extra non-free software would need to be installed, and went ahead and did so. I was interested to see how long it would take, this time, to have to drop back to the command line to fix something (obviously I still use the command line for development stuff). I'm still counting, and it's been about three months. Based on this experience, my grandmother would be far better using Ubuntu than Windows.

    Linux has never been offered bundled with a desktop PC until recently, and you can be sure that if you installed the latest Gutsy on a freshly-formatted Dell laptop which originally came with Linux pre-installed, all the hardware would work perfectly out of the box. That kind of compatibility can easily be achieved by judicious hardware selection right now - and hardware support is only going to get better. Once the majority of Linux users are getting there by purchasing systems instead of switching from a Windows install on existing hardware, these issues will become much less of a stumbling block on the road to even more widespread adoption.

    I'm excited.
  • by Anonymous Coward on Sunday October 07, 2007 @09:12PM (#20892529)
    The numbers were taken from marketshare.hitslink.com, which counts the user-agent strings of people who visit their various websites. So this only counts the number of people who install Linux on their PC and then browse the web with it.

    The actual number of Linux installations is much higher than 0.81%, but the rest is mostly servers, routers, embedded systems and specialty applications.
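    That measurement method is worth spelling out: the counter just inspects each visitor's User-Agent header. A minimal sketch of such a classifier (the function name and matching rules are illustrative assumptions, not Net Applications' actual code):

```python
# Hypothetical User-Agent classifier, roughly what a hit counter like
# marketshare.hitslink.com might do. Substring matching only; anything
# that doesn't browse the web (servers, routers, embedded boxes) is
# invisible to this kind of measurement.
def classify_os(user_agent: str) -> str:
    ua = user_agent.lower()
    if "windows" in ua:
        return "Windows"
    if "mac os x" in ua or "macintosh" in ua:
        return "Mac OS X"
    if "linux" in ua:
        return "Linux"
    return "Other"

print(classify_os("Mozilla/5.0 (X11; U; Linux i686; en-US) Firefox/2.0"))
# Linux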
