Linux Desktop Myths Examined

Call Me Black Cloud writes "NewsFactor Network has an overview of the $95.00 Gartner report titled, "Myths of Linux on the Desktop". It's a good look at several points from the perspective of a corporate user, not a home user."
  • by ambisinistral ( 594774 ) <ambisinistral&gmail,com> on Tuesday May 06, 2003 @12:47PM (#5892254) Homepage
    Ummm, is it my imagination or does this article link to a report you have to pay $95 to read?

    How are we supposed to comment on it?

  • Some FUD, not all (Score:5, Insightful)

    by Ars-Fartsica ( 166957 ) on Tuesday May 06, 2003 @12:49PM (#5892278)
    It's true that the initial TCO for Linux will rise - whenever you are switching from one platform to another, there will be costs.

    I also don't believe Linux saves money on hardware compared to Windows - it seems many offices are holding back on Windows upgrades, and IT expenditures on desktop hardware and software in general seem to be slowing. For most people, Win2K is fine.

    What the study fails to mention is security. Linux and open source in general appear to be far ahead of Windows in this regard.

    In any case, most IT people have become inured to these studies - they are often pointless mental exercises without much factual backing.

  • paid support (Score:4, Insightful)

    by jameson71 ( 540713 ) on Tuesday May 06, 2003 @12:51PM (#5892308)
    I don't see why "paid vendor support" is such a big deal with corporations, when it typically amounts to either (a) someone telling you what should have been documented on their web site, or (b) someone telling you to hire a guy at $200 an hour to tell you that you have a bad RAM module and to replace it.
  • by Transient0 ( 175617 ) on Tuesday May 06, 2003 @12:51PM (#5892315) Homepage
    Something I have definitely noticed with initiatives like OSS, which are still largely under the public's radar, is that those promoting them are screaming as loud as they can to get heard and will say whatever gets them a little attention. Things like "Linux won't cost you anything." "You never have to upgrade." "You get support forever."

    All of these things have a kernel of truth to them, but when someone looks a little more deeply at the issue and sees that it's more complicated than that, the original statement starts to seem deceptive. It should be noted that even after the author goes through all the myths put forward by OSS proponents, he still says in the end that he believes Linux on the desktop offers a real cost savings over Windows.
  • A wake-up call (Score:5, Insightful)

    by Thornkin ( 93548 ) on Tuesday May 06, 2003 @12:51PM (#5892318) Homepage
    It is too easy for Linux advocates, surrounded by their Linux friends, to lull themselves into a sense of complacency. Too often they weave tales of easy rollout and lower cost that are simply not supported by reality. While I suspect that this report will be attacked as FUD, it should instead be a wake-up call to the Linux community. It should be used to show us which direction to go and what to improve on. We should take it as a roadmap, not an attack.
  • Half Right (Score:4, Insightful)

    by Apreche ( 239272 ) on Tuesday May 06, 2003 @12:52PM (#5892328) Homepage Journal
    This guy is half right. Every one of his myths is indeed a myth. But there is truth in every myth that he fails to note. For example:

    Linux is Free:
    He says it isn't free because support costs money. Well, if you don't get support it is free. There are lots of CS and IT guys looking for jobs. If you hire them to support you rather than pay RedHat it may turn out to be cheaper.

    So "Linux is free" is a myth, but "Linux can be free" is not. If you're going to talk about what is true and what is not, you had better be precise. He also mentions the TCO myth. I have yet to see real numbers showing it go either way, and there aren't any here either. So don't bother looking for them.
  • Inflama-tastic (Score:3, Insightful)

    by Skyshadow ( 508 ) on Tuesday May 06, 2003 @12:54PM (#5892351) Homepage
    If the summary is indicative of the report (and I'm hoping it's not), let me say: Bullshit.

    Let's examine one of the "myth" bullets:

    Myth: Linux Means Longer Hardware Life

    "It is true that a three- or four-year-old PC that is not powerful enough to run Windows XP and Office XP may be able to run Linux and StarOffice," Silver says. "However, enterprises need to budget for some additional costs to maintain older PCs."

    Notice how the inflammatory, attention-grabbing headline does not actually describe the analysis below it. Rather than suggesting that the average useful lifetime of a PC running Linux is longer than that of a PC running Windows, they point out instead that older PCs might break down.

    They're charging $95 for this brilliant type of insight? The ridiculous idea that PC hardware's average working lifespan is three years aside, they're not making any point about Linux at all.

    *sigh* I've got to keep my resident pointy-hair away from this one, lest he see the P300 workstation on my desk (still completely usable, BTW) and assume I'm damaging company revenues...

  • Such Research (Score:5, Insightful)

    by Gleef ( 86 ) * on Tuesday May 06, 2003 @12:59PM (#5892409) Homepage
    "Linux vendors only support their consumer releases (and free distributions) for a maximum of two years, Silver noted."

    Sounds like the only research the Gartner Group did for this report was to call Microsoft, call RedHat, and find out what they do.

    They don't even bother to say what the TCO issues are between Linux and Windows; they just say that if [enterprise complications result in high TCO] is true with Windows, "we see little reason to believe that the cultural or political issues will change just because the enterprise is now using Linux," he observes. They didn't even check. They didn't do a study of their own, they didn't talk to people who have done TCO studies of this [winface.com], or talk to businesses who have already made the jump [bryanconsulting.com]. They looked at Windows, and they guessed.

    And they charge $95 per copy for their uneducated guess.

    They could at least do some work before charging people for it.
  • by pecosdave ( 536896 ) on Tuesday May 06, 2003 @01:00PM (#5892424) Homepage Journal
    Yes, Linux has lots of bloat. More than Windows when you get down to it. The most important thing you overlooked is that most of that bloat is optional in Linux, unlike Windows. I've installed SuSE from a DVD: bloat is plentiful, and removing it isn't always trivial, but it is doable, and you can optionally start with a bare install. Try removing IE from XP. Optional bloat isn't so bad, and distro makers are moving in the right direction; as time progresses distros get better. Except maybe Red Hat, which seems to be getting worse.
  • fair report (Score:4, Insightful)

    by poot_rootbeer ( 188613 ) on Tuesday May 06, 2003 @01:00PM (#5892425)

    This seems like a pretty fair and unbiased report... the only bullet point I have any issue with is the 'forced upgrade' one.

    While it's true that commercial Linux vendors do not support older versions of their distributions indefinitely, the nature of the upgrade cycle is different with free software than it is with a closed-source product.

    There are some costs that Linux and Windows upgrades have in common:

    * ongoing support

    * training

    * productivity decreases as computers have to be taken out of service temporarily to apply the upgrades

    However with Linux, each upgrade to the OS is available free of charge. Microsoft requires you to give them money each time you upgrade. As such, forced upgrades are not as onerous on a company using Linux.

  • TCO (Score:3, Insightful)

    by Mistlefoot ( 636417 ) on Tuesday May 06, 2003 @01:01PM (#5892444)
    With the Total Cost of Ownership up for debate I think a main point is being missed.

    If I own a foreign car, I expect the mechanic I use to charge a bit more (or a lot more). Plain and simple supply and demand. And I can't hire my friendly neighbourhood backyard mechanic either, because most backyard mechanics won't touch my brand.

    Linux, holding a much smaller share of the computing environment, suffers the same fate these days. Eight out of ten users use something else. If and when that reaches a more equal ratio, there should be more people available to maintain these systems, and less time spent helping out with small issues.

    Imagine an office full of staff who have been weaned on Windows. Toss them Linux and half the maintenance cost wouldn't be maintenance at all, but solving issues the users create. Familiarity is a big part of the big picture.

    As Michael Robertson noted yesterday, Lindows users insist on anti-virus protection. Yet when a virus comes out on Linux, a fix usually appears as fast as detection for the virus does. As Linux becomes more mainstream, small issues such as this will go away.
  • by tuffy ( 10202 ) on Tuesday May 06, 2003 @01:01PM (#5892446) Homepage Journal
    It seems /. has transformed "proof by anecdote" into something both "interesting" and "informative". Bravo.
  • by bellings ( 137948 ) on Tuesday May 06, 2003 @01:08PM (#5892528)
    Why would anyone write a .bat script on Windows to emulate a Bash or Perl script on Unix, when both Bash and Perl are available on Windows?

    I would be interested in any example of a Perl script you've written on Unix that will demonstrate the "basic undeniable fact" that Windows is far less flexible than Unix.

    Otherwise, STFU.
  • by platipusrc ( 595850 ) <erchambers@gmail.com> on Tuesday May 06, 2003 @01:09PM (#5892538) Homepage
    Since when is removing a shortcut the same thing as uninstalling an application?
  • Cost? (Score:3, Insightful)

    by lexcyber ( 133454 ) on Tuesday May 06, 2003 @01:10PM (#5892554) Homepage
    Even if the support and maintenance cost is as high as Windows', it will still be cheaper, since you slash the software license cost off the total price. Even if you only move your servers onto GNU/Linux and/or *BSD, you will save a whole lot of money, since the server and client access licenses are very expensive.

    Isn't the best path right now simply to move to OpenOffice.org for office apps and GNU/Linux and/or BSD for the servers, as an initial move towards OSS and Free Software?

    comments?

  • Re:Do the math... (Score:3, Insightful)

    by Dot.Com.CEO ( 624226 ) on Tuesday May 06, 2003 @01:12PM (#5892570)
    Gartner provides highly valuable reports that are more than worth their price. If you were a CTO trying to convince the board that Linux is ready for the desktop, a Gartner report supporting your suggestion would be a very valuable weapon. $95 is peanuts in corporate land.

    And, yes, they make a very, VERY handsome profit.

  • by GrimReality ( 634168 ) on Tuesday May 06, 2003 @01:12PM (#5892575) Homepage Journal
    that on my computer Linux locked up every 5 minutes after starting GNOME, which I found was due to a four-year-old bug in the Linux kernel (so much for open source fixing bugs quickly) that caused it to corrupt memory and lock up X on my nvidia card.

    Do you know that Microsoft's virtual monopoly means that hardware makers have no incentive to write drivers for alternative operating systems? They could at least release full interface specs so that the work could be done by someone else. They have done neither. And I suspect Microsoft could be bullying (indirectly pressuring) hardware makers not to write drivers for GNU/Linux et al. or release specs. Now, don't say that they have to make money from the drivers. They don't sell their drivers; they have to provide them with the hardware they sell.

    I know Windows costs money but its not that expensive and besides Linux is only
    free if your time is worthless.

    'Free' in GNU/Linux terms means freedom, not moolah. I know this is a (Score: -2000, Overrated and Redundant), but I have no choice but to say it again. It means 'freedom'. Lower cost of acquiring the software is just a perk. Again, consider installing Windows on 30 machines. With GNU/Linux one licence is good for all, while with Windows you pay per workstation for the software alone.

    These are exactly the arguments everyone I have spoken to seems to make. They are partly true, but it is like listening to part of a show that is supposed to be funny, yet can only be funny if you have background information on the show. So where is the background info for this show? Microsoft's dominance, and coercion in many forms against hardware makers.

    Thank you for understanding.
    GrimReality
    2003-05-06 17:09:14 UTC (2003-05-06 13:09:14 EDT)

  • by kavau ( 554682 ) on Tuesday May 06, 2003 @01:14PM (#5892588) Homepage
    it is obvious that it is a deliberate effort by Linux fanatics

    I've always viewed discussions of this kind as some sort of miscommunication, rather than as a conspiracy by Microsoft haters. The problem is, and probably always will be, that one has to distinguish between "Linux for Nerds" and "Linux for the Masses". The latter includes KDE or GNOME, OpenOffice, multimedia tools, and many more applications than anybody would ever install on a Windows machine, or ever use for that matter. Certainly there is a lot of bloat.

    Strictly speaking, though, KDE and Gnome are not Linux. They are applications that run on top of Linux. Hence you should accuse the applications of bloat, not Linux itself.

    What the "Linux on a 486 PC" advocates are saying is that the Linux kernel itself is very compact, and that it is cleanly separated from any GUI. So the knowledgeable user (i.e. the Linux Nerd) can put together his own collection of apps, and his preferred lightweight window manager, to create an entirely unbloated distribution that will run just fine on old machines.

    Obviously this miscommunication is partly the fault of the Linux enthusiasts. Linux advocates have to learn that Joe User will never bother to understand the difference between the GUI and the underlying kernel. And I don't blame him for being a non-nerd. It is the responsibility of the Linux community to put these kinds of statements into language that non-technical people can easily understand.

    But if you decry those people as Linux fanatics, you are clearly overreacting. One can either discuss those issues matter-of-factly, or one can start a flame war.

  • by billstr78 ( 535271 ) on Tuesday May 06, 2003 @01:16PM (#5892628) Homepage
    Web server management, user account management, service startup, firewall management, and hardware configuration can all be scripted using Bash and Perl on *nix.

    Even though those utilities have been ported to Win2K, they cannot perform the same functions on an operating system that hides 75% of its operation from all users.

    THAT is what makes unix more flexible.
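
    A minimal sketch of the kind of thing I mean (untested; it assumes a Red Hat-style box with useradd, chkconfig and the service wrapper, run as root, and the account name is just a placeholder):

    #!/bin/sh
    # set up an account for a new hire, lock it until IT hands out a password,
    # and make sure the web server and packet filter come up now and at boot
    NEWUSER=jdoe
    useradd -m -s /bin/bash "$NEWUSER" || exit 1
    passwd -l "$NEWUSER"
    chkconfig httpd on && service httpd start
    chkconfig iptables on && service iptables start

    Every step is plain text you can read, diff and keep under revision control, which is the point.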
  • by RoLi ( 141856 ) on Tuesday May 06, 2003 @01:18PM (#5892645)
    All this FUD will not help Microsoft at all.

    The more mud MS slings, the more people will try out Linux because they will become curious about what can cause MS so much pain.

    So MS, bring it on!

  • Case Studies? (Score:2, Insightful)

    by redptam ( 602168 ) on Tuesday May 06, 2003 @01:20PM (#5892661)
    Are there actually any companies out there that have switched (either well in the past or recently) from windows to linux? If so, what have been their experiences?

    I ask because I think there is no way to end this religious-like argument until a true full fledged case study is done on a company of at least 200 or more employees running linux on ALL of the company's desktops.

  • by Otter ( 3800 ) on Tuesday May 06, 2003 @01:20PM (#5892673) Journal
    One of the biggest wails heard by the most vocal and fanatical zealots in the Linux community is that Windows and most of the programs that run on it are bloated and slow, while screaming about how fast and "un-bloated" Linux is. Where this myth started I do not know...

    I can tell you -- it goes back to the start of Linux advocacy, around the days of Red Hat 5.x. (Before that, the only people who could possibly want Linux wanted to run Unix on their PCs, so choosing Linux was a no-brainer.) Back then, Linux desktops running fvwm or AfterStep were usable on hardware that couldn't run Windows 95. So while usability, features and ease of installation were lacking, there was the ability to run on old hardware.

    Now, the software and usability have vastly improved, but at the cost of requiring the same hardware as Microsoft OSs. But the old argument of "It runs on low-end hardware!" still floats around, however irrelevant it is to a modern KDE desktop.

    Same, by the way, for "It's more stable!" At this point, I'm more likely to see desktop lockups from Nvidia XFree86 drivers than a Windows crash, but that doesn't stop the shrieking about how XP crashes every ten minutes.

  • by jlusk4 ( 2831 ) on Tuesday May 06, 2003 @01:20PM (#5892674)
    People talk about how secure Linux is, but how do you prevent some executable piece of email from reading the user's *own* address book and deleting the user's *own* documents (or worse -- corrupting them so the backups get hosed, too)?

    The problem isn't security, it's executable content. As long as executable content is never offered in any popular email program (or search-for-ET screensaver) in Linux, we're safe. How long will that last before some vendor brings out the spiffy new macro-language-in-email feature and users snap it up (once we get past the hurdle of even getting linux on the desktop)?

    John.
  • by binaryDigit ( 557647 ) on Tuesday May 06, 2003 @01:23PM (#5892701)
    Web server management, user account managment service startup, firewall managment, hardware configuration all can be configured using BASH and PERL on *Nix.

    You can do basic stuff like "net start w3svc", and most any part of IIS can be controlled through VBScript (adding users, virtual domains, etc.). I don't know if a Perl lib is available, but there certainly could be. What hardware configuration do you refer to?

    Even though those utilities have been ported to Win2K, they cannot perform the same functions on an operating system that hides 75% of its operation from all users.

    First, this is completely false. You can access a HUGE amount of the OS via any scriptable language that can make COM calls. If Win2K were so closed, it wouldn't be so damned easy to write viruses for it. Plus, the things you mentioned above (web server/firewall mgmt) have nothing to do with the OS.
  • Re:A wake-up call (Score:5, Insightful)

    by fishlet ( 93611 ) on Tuesday May 06, 2003 @01:24PM (#5892725)
    I think that is a good point.

    Many of us Linux users are used to making excuses for (or at least working around) problems with the Linux desktop. It doesn't work as smoothly as Windows, and nowhere near as smoothly as a Mac, but there are so many other reasons we like Linux that we tend to minimize those problems. It's just human nature, I think; it's easier to criticize others than to admit our own faults. The first step in making progress is admitting what doesn't work and making it better.

  • by sheriff_p ( 138609 ) on Tuesday May 06, 2003 @01:25PM (#5892734)
    Please, please, could you offer something to back that up?

    An unpatched Linux machine is as vulnerable as an unpatched Windows machine. Security is to do with administration, not the operating system.

    The sooner Linux zealots realise this, and start saying things like "Linux provides an easier patch path", the sooner people will start taking them seriously.
  • by Anonymous Coward on Tuesday May 06, 2003 @01:26PM (#5892745)
    Optional bloat isn't so bad, and distro makers are moving in the right direction; as time progresses distros get better. Except maybe Red Hat, which seems to be getting worse.

    Fighting moronic ignorance on slashdot is like trying to drain the ocean with a thimble.

    Perhaps you'd like to enlighten us as to why Red Hat provides so much non-optional, unremovable "bloat", while the sainted distros like Debian, Mandrake, Gentoo, SuSE, and Joe-Bob's Basement Linux get things right.

    Or maybe you're just trying to look like a badass by railing against "The Man" like a Jr. High student.
  • by IGnatius T Foobar ( 4328 ) on Tuesday May 06, 2003 @01:27PM (#5892767) Homepage Journal

    Gartner cannot view Linux rollouts with an open mind because Gartner insists on looking at Linux as a drop-in replacement for proprietary operating systems. Gartner refuses to alter its frame of reference.

    Deployment of Linux isn't just about Linux itself. It's about changing the rules, shifting the paradigms, that sort of thing. That's the piece that Gartner misses, every single time. To deploy Linux effectively you have to treat it as Linux, leveraging its advantages and steering clear of its (rapidly diminishing) disadvantages. Gartner wants to force-fit Linux into a Windows paradigm, so it's no surprise that they keep finding that it does so very poorly. Linux is not a drop-in replacement for Windows! It is an alternative, just like the Macintosh is an alternative.

    Only when you design for Linux and plan for Linux do you get to take advantage of its strengths.

  • by sheldon ( 2322 ) on Tuesday May 06, 2003 @01:33PM (#5892837)
    "So the web server management, user account management, service startup, firewall management, hardware configuration and the like can all be configured in Win2K using Perl and other command-line utilities?"

    Yes.

    "As we argue, Windows engineers are trying to figure out a way to add a useful file-based configuration and command-line shell to the next release of Windows."

    They did that in Windows 2000.

    You're either ignorant or a troll, or both. Prove otherwise by giving specific examples.
  • by digidave ( 259925 ) on Tuesday May 06, 2003 @01:34PM (#5892849)
    That's only partially true. OS design has a lot to do with how much damage a virus or hacker can do. On Windows, once some executable content runs, it has free rein over the system. On *nix, this is not usually the case.

    Truth be told, security has more to do with users than with the OS.
  • Desktop management (Score:3, Insightful)

    by demaria ( 122790 ) on Tuesday May 06, 2003 @01:35PM (#5892853) Homepage
    In the enterprise, desktop management is a very big issue that still hasn't been solved completely. In the Windows world there is SMS, ZenWorks and a slew of vendors offering application deployment, application management, asset control, metering and patch management. Does anything like this exist for Linux at all?
  • by Renli ( 585818 ) on Tuesday May 06, 2003 @01:36PM (#5892868)
    I don't know what you read, but the article I read wasn't hyping MS. He said in some ways Linux is better (say, the registry of Windows vs. the file system of Linux for troubleshooting), and in some ways Windows is better. I thought it was pretty decent.
  • by banzai51 ( 140396 ) on Tuesday May 06, 2003 @01:38PM (#5892892) Journal
    That's been tried already. It's called mainframe computing. Client/server computing, even with its warts, is still cheaper and more productive in userland. If the bighorkinmachine ever went down, you're SOL, EVERYONE is down. With client/server, while I may lose access to a program or two, I can still work on other things. Or I am smart enough to have redundancy (Citrix) to serve my applications, and I don't have any downtime with a puking server. All still cheaper than the mainframe route.
  • by psgalbraith ( 200580 ) on Tuesday May 06, 2003 @01:38PM (#5892895) Homepage
    When the report states that Linux isn't free because support isn't free, it forgets that it's the licensing that's free.

    How much is the elimination of the threat of a license audit worth to your company?
  • Re:A wake-up call (Score:5, Insightful)

    by delcielo ( 217760 ) on Tuesday May 06, 2003 @01:40PM (#5892913) Journal
    I applaud your overriding position that we should take such reports seriously and address them, but I do think that many of the points raised in the article are true mostly in the short term.

    I haven't called AIX support in almost 2 years, never called Sun/RedHat/HP, etc. And the impression I get from listening to the Windows guys on the other side of the cube wall is that they get very little actual support from MS. Mostly they get pointed to vendors. So I'm not sure support is as big an issue for the OS as it is for the applications.

    Speaking of the Windows guys: I've seen several cookie-cutter MCSEs get hired, but go through an enormous learning curve because they got their cert without really learning anything. I don't see where this learning curve would be more expensive than the curve I initially went through to learn Unix. All of this, of course, depends upon the individual; but I don't think a good Unix tech really takes more time to grow than a good Windows tech. The good Windows techs are the guys who understand the underpinnings. In other words... the geeks like us.

    In the short term, retraining and porting, etc. will cost more; but in the long term these will indeed produce a lower TCO.

    As for the forced upgrades, I don't think I've heard anybody say you don't have to upgrade Linux eventually. The difference is that you won't have to pay for a license every time. You also don't necessarily have to keep your hardware around forever, which the article suggests would cause you to have to support "16 different varieties of hardware." You could, as an alternative, buy cheaper hardware and replace it just as often. Guess what? Lower TCO.

    Most of these "myths" that he has exposed as false would be proven true in the long term.

    The biggest myth I see these days is the myth that you should be able to perfectly duplicate what you're doing without just doing exactly what you're doing.

    That seems short sighted.
  • The Unix tradition is that when a user creates a file, whether directly or by downloading it from an email, its execute permission is off. This means that either the email client or the user has to go out of their way to change the permissions, then execute the binary. Yes, it is still possible to shoot yourself in the foot. The ability to only screw with your own files is a benefit, though. Your personal documents are a lot more likely to be backed up, one hopes, than the full set of applications and system files. A virus which messes with installed programs or system files often means a complete rebuild of the system. Corruption of personal files can usually be fixed by a quick restore from backup.
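
    For instance (a sketch; the path and file name are just examples):

    $ ls -l ~/Mail/attachment.sh
    -rw-r--r--  1 jdoe users 512 May  6 13:40 /home/jdoe/Mail/attachment.sh
    $ ~/Mail/attachment.sh
    bash: /home/jdoe/Mail/attachment.sh: Permission denied
    $ chmod +x ~/Mail/attachment.sh    # the user has to do this on purpose
    $ ~/Mail/attachment.sh             # only now does the program run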
  • by Master Bait ( 115103 ) on Tuesday May 06, 2003 @01:42PM (#5892948) Homepage Journal
    How are we supposed to comment on it?

    Well, being that it is a Gartner report, there's little to comment on except that it targets greybeard IT managers, those same managers who stuck their necks out in their youth to bring PCs into the workplace. These same greybeards are busy doing budgets and attending meetings all day long. These targeted IT greybeards don't know shit about IT anymore and have outsourced in-house IT expertise to leasing hardware from big PC vendors and decision-making to consultants. They have signed the Microsoft Perpetual Motion software license.

    They hire consultants to tell them what to do, to design and install networks, to decide when to lease new PCs, etc. They need those $95 papers from Gartner to assure them (and to 'prove' to the Board of Directors) that they're Doing the Right Thing. Their objective is to get by with a 40-hour week and to meet arbitrary Budget Objectives.

  • by johnnyb ( 4816 ) <jonathan@bartlettpublishing.com> on Tuesday May 06, 2003 @01:45PM (#5892997) Homepage
    Actually, your position puts it _more_ in favor of Linux on the desktop. Why? Because Linux _isn't_ just an operating system, at least not as most people come by it.

    Let's say you install Windows and Red Hat Linux on a PC. Windows comes with:

    * Wordpad

    * IE

    And that's pretty much it. Red Hat, on the other hand, comes with:

    * Mozilla

    * OpenOffice

    * The GIMP

    * Dia for diagramming

    * FTP programs

    * SFTP programs

    * CD-Burning software

    * Evolution

    * 3270 terminal emulator (for AS/400 app connectivity)

    * PDF viewing software

    * Development software

    That doesn't even count the server software it comes with. Other distributions pack even more in. Now, it usually takes ~ 30 minutes to an hour to install Linux. Probably about the same for Windows. However, after you are done installing Windows, you have to spend 10 minutes to several hours (like for Visual Studio) to install each application individually. You can save some money by using Linux applications on Windows, but you still have to download them each individually. How much time have we wasted? And that's assuming that all of your applications play nicely together.

    In addition, the "workspaces" concept on the desktop creates better productivity for workers. The entire experience can be customized by the IT department if they wish. This _can_ be done to a lesser extent with Windows, if you have the right licensing agreements; however, getting all of the licensing together to do a full install of all the software you would need would be a ton of work, assuming all of your vendors wanted to play nicely together.

    Then you have upgrades. With Linux, as long as you have someone in-house who can code, you can keep your setup as long as you desire - no need to follow your vendor. If you don't like that road, you can play follow-your-vendor on Linux, too. In addition, with Linux, you get to pick your vendor, so you can choose one which works like your company works (fast-paced, traditional, etc).

    I would say there are two things that may cause you to have problems with Linux. Those are:

    * specialized software packages

    * technically-savvy users

    Yes, that's right, your technically-savvy users are going to be the ones who notice the change, not your "where's my desktop" type users. The ones who don't know technology at all will simply click on whatever you put in front of them in whatever sequence you tell them. Trust me, they are lost on whatever technology you put in front of them; you just have to give them a sequence of clicks and they will obey and do just fine. It's the medium-technically-savvy users that are difficult, because they've taken the time to learn Windows inside-out and know how to get around all of its quirks. They may not want to learn a new system with new quirks.

    Also, Linux systems are easier to manage. It's more obvious what causes processes to start up and which ones are messing with which resources. In reality, NT has a bunch of tools available for this kind of thing, but, as usual, you have to install them separately - ON EVERY WORKSTATION. Linux comes ready to manage, locally or remotely.
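
    For what it's worth, here is roughly what "obvious" looks like on a Red Hat-style SysV init layout (a sketch; the netstat line wants root):

    # which services start at boot, what is actually listening, and in what order
    chkconfig --list | grep ':on'
    netstat -tlnp
    ls /etc/rc.d/rc3.d/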

    Add LDAP and Directory Administrator and you are set to go for large installations.
  • by Jason Earl ( 1894 ) on Tuesday May 06, 2003 @01:56PM (#5893109) Homepage Journal

    How is that different from the model most companies currently use where all files are stored on a central server?

    If a departmental file server goes down, or if the email server goes down, then I have a pile of folks who can't use their computers for anything but solitaire. Sun was 100% right when they said, "the network is the computer." One of the major benefits of Linux is that it saves you money in licensing costs that can be applied directly to purchasing better (or redundant) hardware.

    The real advantage of thin clients, however, is that instead of hundreds of machines that need to be administered, I now only have to administer *one* machine (or two, actually, because I am going to want redundant servers). Instead of babysitting rooms full of commodity x86 hardware (complete with all of the drive failures, software glitches, etc. that this implies), I now admin only a pair of identical server-class machines. If a thin client breaks, I throw the thing in the trash and get a monkey to install another one. If I want to upgrade the software everyone uses, I simply upgrade the server and I am done. Hardware upgrades are also ridiculously easy. Instead of filling up the landfill with used PCs and spending time configuring new machines, I simply replace the servers and everyone gets a faster machine.

    The pendulum is going to swing back in the direction of thin clients, and Linux is going to be a huge part of that shift.

  • by saada ( 665307 ) on Tuesday May 06, 2003 @01:58PM (#5893121)
    we believe Linux users will feel forced to move to newer releases of Linux just as Windows users feel forced to upgrade to new versions of Windows
    If there are enough people who want to retain the same version of Linux for long periods of time, then there will be companies that support exactly that. It's called competition, something you will never get in the Windows world.
  • by YrWrstNtmr ( 564987 ) on Tuesday May 06, 2003 @02:02PM (#5893160)
    hmm...ISTR a comment similar to this a couple of weeks ago. Obviously, Windows comes with a LOT more than just Wordpad and IE. I thought that was a main gripe about Windows...too much bundling.

    And as far as installations go, especially in the corporate world, ghost images are the rule of the day. We have several standard setups for different user groups. It takes maybe an hour or two (unattended) to fully install everything.

    Not saying that Linux is easier or harder to push out as a new install, or more comprehensive, but let's be objective, shall we?
  • by Anonymous Coward on Tuesday May 06, 2003 @02:03PM (#5893168)
    at least all I've read so far. I manage the computers at my company (about 10 workstations and a server). We typically will stretch a computer as long as possible.

    Why? The replacement costs are staggering! And they have nothing to do with the cost of the machine itself! It is the endless time it takes to replace a Windows machine. M$ has made it as difficult as possible (bordering on impossible) to back up and restore a Windows machine completely. Even if you can image a Windows hard disk completely, it will never run on anything except that exact hardware. The way hardware vendors change machine configurations, you can't get the same hardware makeup if you order two machines on the same day! All applications are hopelessly entwined with the copy of the OS running on THAT machine.

    The only reliable way I have found to do this is to force users to keep the data files they work on on the server, and do a routine weekly backup of e-mail files and bookmarks for each machine. When a machine must be replaced, I spend a minimum of two days reloading all of the software we need on each workstation from the install disks, loading patches for each of those programs, and then restoring e-mail and bookmarks. This doesn't include the 1-2 hour wait on M$'s line to get another authorization number so I can reuse the Office Pro license on the new machine; I went through that twice, then found a pirated copy of the corporate version so I wouldn't have to waste that time anymore.

    Linux, on the other hand, is simplicity itself. I simply back up several subdirectories. If the machine fails or I want to clone the machine to another, different set of hardware, I reinstall Linux on the new machine and restore the backed-up subdirectories. Voila! A complete new machine with every application, all data files and all settings intact.
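
    For the curious, "back up several subdirectories" really is about this much work (a sketch; it assumes the file server is mounted at /mnt/server and that /home, /etc and /var/spool/mail are the directories worth keeping):

    #!/bin/sh
    # nightly dump of the directories that matter; restoring onto a freshly
    # installed box is just untarring this archive over the new install
    DEST=/mnt/server/backups/$(hostname)-$(date +%Y%m%d).tar.gz
    tar czf "$DEST" /home /etc /var/spool/mail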

    M$ is sooooo concerned with piracy that they make preserving my company's data and work environment hell. Frankly, ANY amount of trouble with a different OS pales in comparison to the hassles outlined above.
  • by Anonymous Coward on Tuesday May 06, 2003 @02:06PM (#5893194)
    Thanks for typing all that so I didn't have to. Once again, people lacking experience with BOTH OSes are chiming in with their idiocy. If you don't use both regularly, people, you have no business even discussing the issue. There are tools, features, and options that you don't even know about.
  • by marktoml ( 48712 ) * <marktoml@hotmail.com> on Tuesday May 06, 2003 @02:06PM (#5893195) Homepage Journal
    True, true, true and oh so true. The problem at present is that objectivity has nothing to do with it. Too few businesses (or people) are willing to 'risk' the possibility of being unable to easily share the data generated by said apps.

    Note that 'risk' is a perception thing. There are many fine solutions to this perceived issue, but unfortunately almost all require thought. Something the masses are largely unwilling to do.
  • by jpetts ( 208163 ) on Tuesday May 06, 2003 @02:07PM (#5893210)
    If the bighorkinmachine ever went down, you're SOL, EVERYONE is down.

    You're absolutely right. But this doesn't really need to happen, except in case of a real catastrophe that will take down all the client/server stuff too. People, back in the '70s and '80s I used servers that had uptimes of 2 *YEARS* or more, and these were serving apps out to over 400 people. People are *so* used to the prophylactic reboot (Ooo-er, Missus!) on Windows machines that they seem to accept machines going down regularly as normal. It currently IS normal for Windows servers, but it doesn't NEED to be for other servers.

    The real issue here is control: people don't feel happy about letting IT control the resources. I would urge everybody to read A Unix Guide to Defenestration [winface.com] before they comment on centralised vs client-server computing.
  • by Anonymous Coward on Tuesday May 06, 2003 @02:08PM (#5893222)
    The problem isn't security, it's executable content.

    No, the problem is allowing executable e-mail. If Linux starts to do that, it will suck too.

  • by Anonymous Coward on Tuesday May 06, 2003 @02:17PM (#5893310)

    Myth: Linux Has a Lower TCO

    Management tools have been available for Windows for years, Silver observed, but many enterprises still have not been able to manage their Windows environment. This has often been due to too much complexity, lack of sufficient policies or standards, or cultural and political issues, according to Silver.

    If this is true with Windows, "we see little reason to believe that the cultural or political issues will change just because the enterprise is now using Linux," he observes.

    So, because he can't imagine it, it cannot be.
  • Linux just plainly works! And not just Linux: *BSD, and nixes in general. You can rely on them. If that is not something a business can count on, I don't know what is.
  • by Anonymous Coward on Tuesday May 06, 2003 @02:20PM (#5893336)
    This mirrors politics.

    The left would want us all to use Linux, forgetting that in individualist systems there's no bottom line.

    The right would want us all to use Microsoft, forgetting there are other options besides the bottom line.

    Neither package is ready for the desktop. Face it humanity, you're a failure.
  • Most Linux email software is developed with security in mind. Thus, it prevents users from running executable content without making sure they know what they are doing. Most clients require you to save the file to disk, change the executable permission to "on", and _then_ run the program.

    Macros in documents _may_ come to be problematic, but that's yet to be seen.
  • by b17bmbr ( 608864 ) on Tuesday May 06, 2003 @03:01PM (#5893695)
    there is more to it than price. with microsoft, they own your computer. don't think so? read the eula. they tell you what you can and cannot do. and they own your data. tell me, without .doc, really, what keeps businesses so wedded? there is hardly the anti-os x sentiment. sure, some, but not rampant. and only the core is OSS. and it sure ain't free. having used linux on my desktop for years (and yes, i also have an ibook), there is nothing i am missing by not using windows. (and no, i'm not some IT dude, i'm a school teacher.) the whole linux desktop debate is stupid. if employees are too lame to understand /home/janedoe == c:\my documents, and can't figure out how to use star writer after using word for a few years, hell, then a company deserves the morons they hired. that'll cost them far more than any software will.
  • by Anonymous Coward on Tuesday May 06, 2003 @03:08PM (#5893751)
    > Please give an example of "web server management" that can be scripted on Unix that can not be scripted on Windows.
    lessee...
    #!/bin/sh
    openssl req -new -key ssl.key/www.domain.net-server.key -out ssl.csr/www.domain.net-server.csr
    # openssl req -noout -text -in ssl.csr/www.domain.net-server.csr
    openssl-sign ssl.csr/www.domain.net-server.csr

    #!/bin/sh
    ##
    ## openssl-sign.sh -- Sign a SSL Certificate Request (CSR)
    ## Copyright (c) 1998-2001 Ralf S. Engelschall, All Rights Reserved.
    ##

    # argument line handling
    CSR=$1
    if [ $# -ne 1 ]; then
    echo "Usage: sign.sign <whatever>.csr"; exit 1
    fi
    if [ ! -f $CSR ]; then
    echo "CSR not found: $CSR"; exit 1
    fi
    case $CSR in
    *.csr ) CERT="`echo $CSR | sed -e 's/\.csr/.crt/g'`" ;;
    * ) CERT="$CSR.crt" ;;
    esac

    # make sure environment exists
    if [ ! -d ca.db.certs ]; then
    mkdir ca.db.certs
    fi
    if [ ! -f ca.db.serial ]; then
    echo '01' >ca.db.serial
    fi
    if [ ! -f ca.db.index ]; then
    cp /dev/null ca.db.index
    fi

    # create an own SSLeay config; the CA cert/key paths below assume the
    # stock Red Hat mod_ssl layout, and the db files live in the current dir
    cat >ca.config <<EOT
    [ ca ]
    default_ca              = CA_own
    [ CA_own ]
    dir                     = .
    new_certs_dir           = \$dir/ca.db.certs
    database                = \$dir/ca.db.index
    serial                  = \$dir/ca.db.serial
    certificate             = /etc/httpd/conf/ssl.crt/ca.crt
    private_key             = /etc/httpd/conf/ssl.key/ca.key
    default_days            = 365
    default_crl_days        = 30
    default_md              = md5
    preserve                = no
    policy                  = policy_anything
    [ policy_anything ]
    countryName             = optional
    stateOrProvinceName     = optional
    localityName            = optional
    organizationName        = optional
    organizationalUnitName  = optional
    commonName              = supplied
    emailAddress            = optional
    EOT

    # sign the request, then verify the new cert against the CA cert
    echo "CA signing: $CSR -> $CERT:"
    openssl ca -config ca.config -out $CERT -infiles $CSR
    echo "CA verifying: $CERT <-> CA cert"
    openssl verify -CAfile /etc/httpd/conf/ssl.crt/ca.crt $CERT

    # cleanup after SSLeay
    rm -f ca.config
    rm -f ca.db.serial.old
    rm -f ca.db.index.old

    # die gracefully
    exit 0

    (Additional script to insert/verify the new cert into Apache and gracefully restart not shown)

    >Please give an example of "firewall management" that can be scripted on Unix that can not be scripted on Windows.

    I don't have this script on my boxen, but there exist multiple versions (Google!) that will add deny/reject entries in the firewall for every rule match -- think Nimda/Code Red/etc...
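
    Something in that spirit, written from scratch here rather than copied from any of those scripts (the log path and patterns are only examples):

    #!/bin/sh
    # append a DROP rule for every host that has been probing the web server
    # with Code Red / Nimda style requests recorded in the access log
    LOG=/var/log/httpd/access_log
    grep -E 'cmd\.exe|default\.ida' "$LOG" | awk '{print $1}' | sort -u |
    while read ip; do
        iptables -A INPUT -s "$ip" -j DROP
    done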

    >Please give and example of "hardware configuration" that can be scripted on Unix that can not be scripted on Windows.

    Hotplug comes to mind. Dynamic reconfiguration _without rebooting_. Or loadable kernel modules.

    >Please give an example of the "75% of it's (sic) [Windows] operation [hidden] from all users."

    Wow. This one begs for it. The NT kernel. DirectX, .NET, Samba, etc. Of course, that's just my glib response, since 100% of the Linux operation can be in plain view of its users, if they so choose to view and understand it.
  • by _Sprocket_ ( 42527 ) on Tuesday May 06, 2003 @03:21PM (#5893894)


    An unpatched Linux machine is as vulnerable as an unpatched Windows machine. Security is to do with administration, not the operating system.

    The sooner Linux zealots realise this, and start saying things like "Linux provides an easier patch path", the sooner people will start taking them seriously.


    This hits on a very important point.

    Usually this kind of conversation ends up as a flamewar over the vulnerability counts found on SecurityFocus, etc. Ignoring exactly what these numbers mean, how they are tabulated, and whether they compare apples to apples or not... they only tell part of the whole story. The trouble is, when people think "security", they've become conditioned to think exploit numbers. And patches.

    Ideas like "Linux provides an easier patch path" are a good start. So would be something along the lines of "Linux provides a more modular environment and control over installed components." But then, that's considerably longer than "Linux provides better security," even if the shorter phrase leads to miscommunication.

    But it may be worth the extra effort. After all, at the risk of generating another slew of flames, infosec is one of those subjects that seem to draw a lot of comments from people who really don't understand it. Pointing out the strengths of one's favorite environment might carry more weight if it also included some education in the subject matter at hand.
  • Um...what?? (Score:3, Insightful)

    by SeanAhern ( 25764 ) on Tuesday May 06, 2003 @03:22PM (#5893907) Journal
    Myth: Linux Will Be Less Expensive

    They then go on to explain that the argument is that OpenOffice and Linux will be less expensive than MS Office and Windows. Their attempt to debunk this is to say that OpenOffice is available on Windows.

    Somehow this means that the "myth" is false? Their arguments don't stand to reason.

    First off, the argument of Linux being less expensive is much, much larger than just the cost of an office productivity suite. It has to do with licensing, user support, applications, TCO, uptime, and all sorts of other things. Saying "OO is available on Windows. Q.E.D." is almost a non sequitur.

    And how does saving money on an office suite, even if you don't migrate to Linux, mean that Linux costs more? It doesn't follow! If they argued other costs of migration (apps, user training, etc.), maybe they'd start down a logical line of argument. But the office suite argument is a dead end that doesn't lead to the conclusion that their headline would suggest.

    This article is mostly FUD.
  • by LadyLucky ( 546115 ) on Tuesday May 06, 2003 @03:28PM (#5893975) Homepage
    Am I the only person that *only* cares about my personal files and not about the system? That thinks the computer is here to do stuff for me, not for me to protect the stupid computer?

    Corruption of personal files is *catastrophic*. Imagine your house burns down: what do you want to save most? Do you say, "Oh, we saved the house, but all your personal stuff is gone"? That's just completely backwards. If the OS can't save me from a virus mucking with my personal files, then I don't give a damn about the system files; they can be fixed.

  • Sly Deception (Score:4, Insightful)

    by Euphonious Coward ( 189818 ) on Tuesday May 06, 2003 @03:32PM (#5894030)
    This is a very sly article. Its overall level of articulateness and internal cohesion suggest that it was written by a Gartner customer and published more or less unchanged. Make no mistake, despite the apparent evenhandedness, this report is meant to muddy the water. If Free Software really offers only a "slight edge" here and there, and numerous "problem[s] replicating this [or that] technology", who would dare switch? The section headings, identified as "myths", are meant to be taken as false, when in fact they all remain substantially true despite the author's quibbling.

    Perhaps the slyest bit of sleight-of-hand was the claim that the cost of supporting Linux users would not be significantly less than for Windows users. As support, the author quotes somebody saying that Linux required about as much support staff as Unix -- then just guesses (ignoring contrary reports) that the same would hold versus supporting Windows desktops.

    Another is the suggestion that working well on older hardware actually counts against Free software. The author says, for instance, "After warranty support is over, many enterprises choose not to repair broken PCs, but to replace them with new ones." This is in large part because the repaired PC would not be able to run current MS software versions anyhow.

    Similarly, the author suggests that keeping older hardware means managing many more varieties of hardware. Yet, it is not old, well-understood hardware that is hard to manage, but the forced influx of new hardware needed to run new versions of software. Absent that forced turnover, an enterprise may reasonably stick with substantially the same hardware configuration (with optional upgrades in clock speed and storage capacity) until there are compelling, objective reasons to switch.

    Equally damning are the omissions. The author carefully avoids mentioning lock-in, and never mentions the possibility of obtaining support from independent (and possibly local, and competing) third parties, or from the in-house expertise that can only develop with Free software. For a good comparison, consider the SUNY Faculty Senate resolution published at http://orange.math.buffalo.edu/csc/resolution2_april2003_approved.html [buffalo.edu].

    I could go on and on, but the point is that the opposition has become more sophisticated. This is more clever than "Free software is a cancer that threatens the American Way", but the intent and the conclusion are the same. Now the strategy is "make minor concessions, but sow seeds of fear, doubt, and confusion." The falsehoods reveal the true intent.

    Try to guess which Gartner customer wrote this report.

  • by cgenman ( 325138 ) on Tuesday May 06, 2003 @03:38PM (#5894099) Homepage
    I can tell you though, if you spend more money to get a qualified, competent, hard-working systems administrator, it is twice as good as spending that money on your OS and skimping on your employee quality.

    Indeed. Our administrator just converted an underutilized webhosting box to a much-needed mail server while 500 miles away on a business trip, over nothing more than his iBook. That kind of remote management is unheard of in the Windows world. "Apt-get install mysql-server"?
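
    And that conversion is less magic than it sounds; assuming a Debian-ish box you can reach as root, it boils down to something like this (host and package names are placeholders):

    # run from a laptop anywhere with ssh access; postfix stands in for
    # whatever the real conversion actually installed
    ssh root@oldwebhost 'apt-get update && apt-get -y install postfix'
    ssh root@oldwebhost '/etc/init.d/postfix start'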

    The funny thing is, qualified, competent, hard-working systems administrators with years of experience are surprisingly available these days, and are going for far less than people might expect. A friend with 20 years of experience managing Unix networks summed up the problem like this. "All applications are filtered through the HR person. The HR person knows what Javascript is. The HR person has no idea what AWK is. The HR person is going to pass on the resume of the person with Javascript experience."

    Of course, the issue of the *number* of qualified personnel required should also be brought up. I work in a pure Linux/BSD shop, so I will bring up the experience of a colleague who interned in a Mac/Windows QA shop. For most projects there were 12 PCs, 12 Macs, 3 PC technicians, and 1 Mac technician on setup. Generally speaking, the Mac technician would finish configuring and installing the software on all 12 of his machines before the 3 PC technicians finished theirs. This was quite some time ago and is not directly linked to the Linux/Windows debate, but the point is that the choice of OS can have a tremendous effect on the number of people required to administer the network. In this case, even if the more expensive employee were paid twice what the cheaper ones were, you would only be paying two-thirds of what you would with the cheaper ones, and would have a much happier employee to boot.

    Having seen what a competent Linux administrator can achieve quickly and remotely, for example quickly knocking out scripts to do specific tasks (like migrating data paths) that would otherwise take hours to do manually, it seems pretty clear that you would need fewer administrators for Linux than for Windows. Anybody either technically competent or extremely well trained can set up an IIS server, but it will take either of them quite some time.
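
    As a concrete example, a "migrating data paths" job is often just a few lines (a sketch with made-up paths):

    #!/bin/sh
    # move a data tree onto a bigger disk and leave a symlink behind so
    # anything that hard-codes the old path keeps working
    OLD=/var/www/data
    NEW=/srv/bigdisk/data
    mkdir -p "$NEW"
    rsync -a "$OLD/" "$NEW/" && mv "$OLD" "$OLD.old" && ln -s "$NEW" "$OLD"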

    So your choices (if money is an issue):
    1) a higher paid employee running linux/bsd/etc.
    2) several lower paid employees running win2k.
    and if you are sufficiently small
    3) a regular employee taking over the win2k work.

    (note: this is not attempting to knock the technically competent Windows administrators amongst you, people for whom I have the utmost respect. But even you must admit that setting up a SOHO file server in Windows doesn't exactly tax your abilities).
  • by nitehorse ( 58425 ) <clee@c133.org> on Tuesday May 06, 2003 @03:52PM (#5894276)
    Wrong.

    Internet Explorer runs with Administrator privileges. So does Windows Media Player. And Microsoft Office. Including Outlook. The "finer-grained ACLs" on Windows NT-based OSes don't mean shit when the programs all get to run setuid root.
  • by njdj ( 458173 ) on Tuesday May 06, 2003 @04:21PM (#5894623)
    We hear this again and again: "Proprietary software is supported, free software isn't".

    It's bullshit. If you have problems installing a driver for Windows, do you think Microsoft will give you any support? Have you tried calling Microsoft tech support?

    "Be sure to install the latest service pack". That's your tech support from the vendor. You get effective support for M$ products exactly the same way you get effective support for free software - by posting a question on a newsgroup or forum.
  • by ivan256 ( 17499 ) * on Tuesday May 06, 2003 @05:31PM (#5895321)
    Am I the only person that *only* cares about my personal files and not about the system?

    No...

    Corruption of personal files is *catastrophic*. Imagine your house burns down, what do you want to save most?

    You conveniently ignored the "personal files can be restored from backup" part of the parent comment. Even the best security in the world doesn't protect you from hardware failure, so it's a given that you should be backing up your personal data. It's not that hard or expensive; you just need to get in the habit of doing it. When you take that into account, your house analogy falls apart. You can't easily make a duplicate of all the personal stuff in your house, but you CAN back up your data. If you DO back up your data, all that's left to save is "the house".

    If you're not backing up your data, you will lose it. You're flirting with catastrophe. You've been warned.
  • by Eric Damron ( 553630 ) on Tuesday May 06, 2003 @05:33PM (#5895341)
    that after paying $95.00 for the six-page FUD document, management is going to believe that it must be true.
  • by Carewolf ( 581105 ) * on Tuesday May 06, 2003 @09:00PM (#5897358) Homepage
    If I understand this correctly, you are confusing "root" and privileged mode. Internet Explorer, Outlook and Media Player all run partly privileged, and therefore they can potentially do the same or more damage than running as Administrator, but they are not running as Administrator :)
  • by derF024 ( 36585 ) * on Tuesday May 06, 2003 @09:19PM (#5897487) Homepage Journal
    The execute permission doesn't affect your ability to run scripts.


    well, you can still do "/bin/sh file.sh" or something like that, but you can't run the script directly, which means that it won't run by double-clicking it, and it won't run out of your email program.

    A buffer overflow in a stack smash attack can still fork a shell, the no-execute mount of the filesystem is just a PITA for the users, not the attackers.


    worms smash stacks (although this is now nullified with recent changes to 2.5), not viruses. the execute protection doesn't help against a worm, but a properly set up (firewalled) desktop system shouldn't be attacked by a worm anyway. properly protected, firewalled-by-default systems only need to worry about user error allowing a virus onto the system, and the execute protection effectively stops viruses.

    Very few local->root exploits rely upon the ability to create exec'able files.

    i fail to see what difference this makes on a single-user desktop operating system. in theory, every local user on a desktop machine has the ability to get root after typing in their password, so that they can safely install applications or make system-wide changes. in a corporate environment, a malicious employee can only take down their own machine; they can't send a virus to everyone in their dept., taking down everyone else's systems in the process.
