Linux Desktop Myths Examined 718
Call Me Black Cloud writes "NewsFactor Network has an overview of the $95 Gartner report titled 'Myths of Linux on the Desktop.' It's a good look at several points from the perspective of a corporate user, not a home user."
Ummm, is it my imagination... (Score:2, Insightful)
How are we supposed to comment on it?
Some FUD, not all (Score:5, Insightful)
I also don't believe Linux saves money on hardware compared to Windows - it seems many offices are holding back with Windows upgrades, and IT expenditures on all desktop hardware and software seems to be slowing. For most people, Win2K is fine.
What the study fails to mention is security. Linux and open source in general appear to be far ahead of Windows in this regard.
In any case, most IT people have become inured to these studies - they are often pointless mental exercises without much factual backing.
paid support (Score:4, Insightful)
some very good points (Score:5, Insightful)
All of these things have a kernel of truth to them, but when someone looks a little more deeply at the issue and sees that it's more complicated than that, it makes the original statement seem deceptive. It should be noted that even after the author goes through all the myths put forward by OSS proponents, he still says in the end that he believes Linux on the desktop offers a real cost savings over Windows.
A wake-up call (Score:5, Insightful)
Half Right (Score:4, Insightful)
Linux is Free:
He says it isn't free because support costs money. Well, if you don't get support it is free. There are lots of CS and IT guys looking for jobs. If you hire them to support you rather than pay RedHat it may turn out to be cheaper.
So "Linux is Free" is a myth. But "Linux can be free" is not. If you're going to talk about what is true and what is not you better be absolute. He also mentions the TCO myth. I have yet to see real numbers showing it go either way, and there aren't any here either. So don't bother looking for them.
Inflama-tastic (Score:3, Insightful)
Let's examine one of the "myth" bullets:
Myth: Linux Means Longer Hardware Life
"It is true that a three- or four-year-old PC that is not powerful enough to run Windows XP Latest News about Windows XP and Office XP may be able to run Linux and StarOffice," Silver says. "However, enterprises need to budget for some additional costs to maintain older PCs."
Notice how the inflammatory, attention-grabbing headline does not actually describe the analysis below it. Rather than suggesting that the average useful lifetime of a PC running Linux is longer than that of a PC running Windows, they point out instead that older PCs might break down.
They're charging $95 for this brilliant type of insight? The ridiculous idea that PC hardware's average working lifespan is three years aside, they're not making any point about Linux at all.
*sigh* I got to keep my resident pointy hair away from this one, lest he see the P300 workstation on my desk (still completely usable, BTW) and assume I'm damaging company revenues...
Such Research (Score:5, Insightful)
Sounds like the only research the Gartner Group did for this report was to call Microsoft, call RedHat, and find out what they do.
They don't even bother to say what the TCO issues are between Linux and Windows. They just say: "If [enterprise complications result in high TCO] is true with Windows, we see little reason to believe that the cultural or political issues will change just because the enterprise is now using Linux," he observes. They didn't even check. They didn't do a study of their own, they didn't talk to people who have done TCO studies of this [winface.com], or talk to businesses who have already made the jump [bryanconsulting.com]. They looked at Windows, and they guessed.
And they charge $95 per copy for their uneducated guess.
The least they could do is some actual work before charging people for it.
What you failed to mention (Score:5, Insightful)
fair report (Score:4, Insightful)
This seems like a pretty fair and unbiased report... the only bullet point I have any issue with is the 'forced upgrade' one.
While it's true that commercial Linux vendors do not support older versions of their distributions indefinitely, the nature of the upgrade cycle is different with free software than it is with a closed-source product.
There are some costs that Linux and Windows upgrades have in common:
ongoing support
training
productivity decreases as computers have to be taken out of service temporarily to apply the upgrades
However with Linux, each upgrade to the OS is available free of charge. Microsoft requires you to give them money each time you upgrade. As such, forced upgrades are not as onerous on a company using Linux.
TCO (Score:3, Insightful)
If I own a foreign car, I expect the mechanic I use to charge a bit more (or a lot more). Plain and simple supply and demand. And I can't hire my friendly neighbourhood backyard mechanic either, because most backyard mechanics don't touch my brand.
Linux, holding a much smaller share of the computing environment, suffers the same fate these days. Eight out of ten users use something else. If and when that reaches a more equal ratio, there should be more people available to maintain these systems, and less time spent helping out with small issues.
Imagine an office full of staff who have been weaned on Windows. Toss them Linux and half the maintenance costs wouldn't be on maintenance, but on solving issues the users create. Familiarity is a big part of the big picture.
As Michael Robertson noted yesterday, Lindows users insist on anti-virus protection. Yet when a virus comes out for Linux, there is usually a fix as fast as there is detection for the virus. As Linux becomes more mainstream, small issues such as this will go away.
Re:Dispersing the Linux Myths (Score:2, Insightful)
Re:One Issue Not Contended... (Score:5, Insightful)
I would be interested in any example of a Perl script you've written on Unix that will demonstrate the "basic undeniable fact" that Windows is far less flexible than Unix.
Otherwise, STFU.
Re:What you failed to mention (Score:2, Insightful)
Cost? (Score:3, Insightful)
Isn't the best path just this minute to move over to OpenOffice.org for office apps and GNU/Linux and/or BSD for the servers? As an initial move towards OSS and Free Software.
comments?
Re:Do the math... (Score:3, Insightful)
And, yes, they make a very, VERY handsome profit.
Re:Dispersing the Linux Myths (Score:5, Insightful)
Do you know that Microsoft's virtual monopoly means that hardware makers do not have the incentive to write drivers for alternative operating systems? They could at least release full interface specs so that the work could be done by someone else. They have done neither. And I suspect Microsoft could be bullying (indirectly pressuring) hardware makers not to write drivers for GNU/Linux et al. or release specs. Now, don't say that they have to make money from the drivers. They don't sell their drivers; they have to provide them with the hardware they sell.
'Free' in GNU/Linux terms means freedom, not moolah. I know this is a (Score: -2000, Overrated and Redundant), but I have no choice but to say it again. It means 'freedom'. Lower cost of acquiring the software is just a perk. Again, consider installing software on 30 machines. With GNU/Linux one licence is good for all, while with Windows you pay for each workstation for the software alone.
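The 30-machine licence arithmetic can be sketched in a few lines of shell. The dollar figures below are invented placeholders, not real quotes; the point is only that one cost scales with seat count and the other does not:

```shell
#!/bin/sh
# Hypothetical per-seat licence cost for a 30-workstation rollout.
# WIN_PER_SEAT and LINUX_TOTAL are made-up illustration numbers.
SEATS=30
WIN_PER_SEAT=300      # OS + office suite licence, per machine
LINUX_TOTAL=50        # one boxed distribution covers every machine
echo "Windows: \$$((SEATS * WIN_PER_SEAT))   GNU/Linux: \$$LINUX_TOTAL"
# prints: Windows: $9000   GNU/Linux: $50
```

Doubling the seat count doubles the Windows line and leaves the GNU/Linux line alone, which is the whole argument in one expression.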
These arguments are exactly the ones everyone I have spoken to seems to make. They are partly true, but it is like listening to part of a show that is supposed to be funny, yet can only be funny if you have background information on the show. So, where is the background info for this show? Microsoft's dominance, and its coercion, in many forms, of hardware makers.
Thank you for understanding.
GrimReality
2003-05-06 17:09:14 UTC (2003-05-06 13:09:14 EDT)
Re:Dispersing the Linux Myths (Score:4, Insightful)
I've always viewed discussions of this kind as some sort of miscommunication, rather than as a conspiracy by Microsoft haters. The problem is, and probably always will be, that one has to distinguish between "Linux for Nerds" and "Linux for the Masses". The latter includes KDE or Gnome, OpenOffice, multimedia tools, and more applications than anybody would ever install on his Windows machine, or ever use for that matter. Certainly there is a lot of bloat.
Strictly speaking, though, KDE and Gnome are not Linux. They are applications that run on top of Linux. Hence you should accuse the applications of bloat, not Linux itself.
What the "Linux on a 486 PC" advocates are saying is that the Linux kernel itself is very compact, and that it is cleanly separated from any GUI. So the knowledgeable user (i.e. the Linux Nerd) can put together his own collection of apps, and his preferred lightweight window manager, to create an entirely unbloated distribution that will run just fine on old machines.
Obviously this miscommunication is partly the fault of the Linux enthusiasts. Linux advocates have to learn that Joe User will never bother to understand the difference between the GUI and the underlying kernel. And I don't blame him for being a non-nerd. It is the responsibility of the Linux community to put these kinds of statements into language that can easily be understood by non-technical people.
But if you decry those people as Linux fanatics, you are clearly overreacting. One can either discuss those issues matter-of-factly, or one can start a flame war.
Re:One Issue Not Contended... (Score:3, Insightful)
Even though those utilities have been ported to Win2K, they cannot perform the same functionality on an operating system that hides 75% of its operation from all users.
THAT is what makes unix more flexible.
This will not help Microsoft :-) (Score:4, Insightful)
The more mud MS slings, the more people will try out Linux because they will become curious about what can cause MS so much pain.
So MS, bring it on!
Case Studies? (Score:2, Insightful)
I ask because I think there is no way to end this religious-like argument until a true, full-fledged case study is done on a company of at least 200 or more employees running Linux on ALL of the company's desktops.
Re:Dispersing the Linux Myths (Score:2, Insightful)
I can tell you -- it goes back to the start of Linux advocacy, around the days of Red Hat 5.x. (Before that, the only people who could possibly want Linux wanted to run Unix on their PCs, so choosing Linux was a no-brainer.) Back then, Linux desktops running fvwm or AfterStep were usable on hardware that couldn't run Windows 95. So while usability, features and ease of installation were lacking, there was the ability to run on old hardware.
Now, the software and usability have vastly improved, but at the cost of requiring the same hardware as Microsoft OSs. But the old argument of "It runs on low-end hardware!" still floats around, however irrelevant it is to a modern KDE desktop.
Same, by the way, for "It's more stable!" At this point, I'm more likely to see desktop lockups from Nvidia XFree86 drivers than a Windows crash, but that doesn't stop the shrieking about how XP crashes every ten minutes.
not sure about that "linux security" thing (Score:4, Insightful)
The problem isn't security, it's executable content. As long as executable content is never offered in any popular email program (or search-for-ET screensaver) on Linux, we're safe. How long will that last before some vendor brings out the spiffy new macro-language-in-email feature and users snap it up (once we get past the hurdle of even getting Linux on the desktop)?
John.
Re:One Issue Not Contended... (Score:5, Insightful)
You can do basic stuff like "net start w3svc", most any part of iis can be controlled through vbscript (adding users, virtual domains, etc), I don't know if a PERL lib is available, but it certainly could be. What hardware configuration do you refer to?
Even though those utilities have been ported to Win2K, they cannot perform the same functionality on an operating system that hides 75% of its operation from all users.
First, this is completely false. You can access a HUGE amount of the OS via any scriptable language that can do COM calls. If Win2K were so closed, it wouldn't be so damned easy to write viruses for it. Plus, the things you mentioned above (web server/firewall mgmt) have nothing to do with the OS.
Re:A wake-up call (Score:5, Insightful)
Many of us linux users are used to making excuses for (or at least working around) problems with the Linux desktop. It doesn't work as smoothly as windows, it doesn't work anywhere near as smoothly as a mac- but there are so many other reasons that we like Linux that we tend to minimize them. It's just human nature I think, it's easier to criticize others than to admit our own faults. The first step in making progress is admitting what doesn't work and making it better.
Re:Some FUD, not all (Score:3, Insightful)
An unpatched Linux machine is as vulnerable as an unpatched Windows machine. Security is to do with administration, not the operating system.
The sooner Linux zealots realise this, and start saying things like "Linux provides an easier patch path", the sooner people will start taking them seriously.
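For what it's worth, the concrete form of "an easier patch path" on a Debian-style system is two commands covering the OS and every installed application at once (command names are distribution-specific; Red Hat shipped up2date for the same job):

```shell
# Refresh the package index, then pull pending fixes for the whole
# system in one pass -- no separate patch channel per vendor to chase.
# Assumes a Debian-derived distribution and root privileges.
apt-get update
apt-get upgrade
```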
Re:What you failed to mention (Score:1, Insightful)
Fighting moronic ignorance on slashdot is like trying to drain the ocean with a thimble.
Perhaps you'd like to enlighten us as to why Red Hat provides so much non-optional, unremovable "bloat", while the sainted distros like Debian, Mandrake, Gentoo, SuSE, and Joe-Bob's Basement Linux get things right.
Or maybe you're just trying to look like a badass by railing against "The Man" like a Jr. High student.
It all depends on your frame of reference. (Score:3, Insightful)
Gartner cannot view Linux rollouts with an open mind because Gartner insists on looking at Linux as a drop-in replacement for proprietary operating systems. Gartner refuses to alter its frame of reference.
Deployment of Linux isn't just about Linux itself. It's about changing the rules, shifting the paradigms, that sort of thing. That's the piece that Gartner misses, every single time. To deploy Linux effectively you have to treat it as Linux, leveraging its advantages and steering clear of its (rapidly diminishing) disadvantages. Gartner wants to force-fit Linux into a Windows paradigm, so it's no surprise that they keep finding that it does so very poorly. Linux is not a drop-in replacement for Windows! It is an alternative, just like the Macintosh is an alternative.
Only when you design for Linux and plan for Linux do you get to take advantage of its strengths.
Re:One Issue Not Contended... (Score:3, Insightful)
Yes.
"As we argue, Windows engineers are trying to figure out a way to add a usefull file based configuration and command line shell to the next release of windows."
They did that in Windows 2000.
You're either ignorant or a troll, or both. Prove otherwise by giving specific examples.
Re:Some FUD, not all (Score:3, Insightful)
Truth be told, security has more to do with users than with the OS.
Desktop management (Score:3, Insightful)
Re:Who sponsered this study? (Score:2, Insightful)
We've already done that. (Score:5, Insightful)
What about the cost of a possible audit? (Score:4, Insightful)
How much is the elimination of the threat of a license audit worth to your company?
Re:A wake-up call (Score:5, Insightful)
I haven't called AIX support in almost 2 years, never called Sun/RedHat/HP, etc. And the impression I get from listening to the Windows guys on the other side of the cube wall is that they get very little actual support from MS. Mostly they get pointed to vendors. So I'm not sure support is as big an issue for the OS as it is for the applications.
Speaking of the Windows guys: I've seen several cookie-cutter MCSEs get hired, then go through an enormous learning curve because they got their cert without really learning anything. I don't see where this learning curve would be more expensive than the curve I initially went through to learn Unix. All of this, of course, depends upon the individual; but I don't think a good Unix tech really takes more time to grow than a good Windows tech. The good Windows techs are the guys who understand the underpinnings. In other words... the geeks like us.
In the short term, retraining and porting, etc. will cost more; but in the long term these will indeed produce a lower TCO.
As for the forced upgrades, I don't think I've heard anybody say you don't have to upgrade Linux eventually. The difference is that you won't have to pay for a license every time. You also don't necessarily have to keep your hardware around forever, which the article suggests would cause you to have to support "16 different varieties of hardware." You could, as an alternative, buy cheaper hardware and replace it just as often. Guess what? Lower TCO.
Most of these "myths" that he has exposed as false would be proven true in the long term.
The biggest myth I see these days is the myth that you should be able to perfectly duplicate what you're doing without just doing exactly what you're doing.
That seems short sighted.
Re:not sure about that "linux security" thing (Score:5, Insightful)
Re:Ummm, is it my imagination... (Score:5, Insightful)
Well, being that it is a Gartner report, there's little to comment on except that it targets greybeard IT managers - the same managers who stuck their necks out in their youth to bring PCs into the workplace. These same greybeards are busy doing budgets and attending meetings all day long. They don't know shit about IT anymore; they have outsourced in-house IT expertise, leasing hardware from big PC vendors and handing decision-making to consultants. They have signed the Microsoft Perpetual Motion software license.
They hire consultants to tell them what to do, to design and install networks, to decide when to lease new PCs, etc. They need those $95 papers from Gartner to assure them (and to 'prove' to the Board of Directors) that they're Doing the Right Thing. Their objective is to get by with a 40-hour week and to meet arbitrary Budget Objectives.
Re:Lets take an objective aproach. (Score:5, Insightful)
Let's say you install Windows and Red Hat Linux on a PC. Windows comes with:
* Wordpad
* IE
And that's pretty much it. Red Hat, on the other hand, comes with:
* Mozilla
* OpenOffice
* The GIMP
* Dia for diagramming
* FTP programs
* SFTP programs
* CD-Burning software
* Evolution
* 3270 terminal emulator (for AS/400 app connectivity)
* PDF viewing software
* Development software
That doesn't even count the server software it comes with. Other distributions pack even more in. Now, it usually takes ~ 30 minutes to an hour to install Linux. Probably about the same for Windows. However, after you are done installing Windows, you have to spend 10 minutes to several hours (like for Visual Studio) to install each application individually. You can save some money by using Linux applications on Windows, but you still have to download them each individually. How much time have we wasted? And that's assuming that all of your applications play nicely together.
In addition, the "workspaces" concept on the desktop makes workers more productive. The entire experience can be customized by the IT department if they wish. This _can_ be done to a lesser extent with Windows, if you have the right licensing agreements; however, getting all of the licensing together to do a full install of all the software you would need would be a ton of work, assuming all of your vendors wanted to play nicely together.
Then you have upgrades. With Linux, as long as you have someone in-house who can code, you can keep your setup as long as you desire - no need to follow your vendor. If you don't like that road, you can play follow-your-vendor on Linux, too. In addition, with Linux, you get to pick your vendor, so you can choose one which works like your company works (fast-paced, traditional, etc).
I would say there are two things that may cause you to have problems with Linux. Those are:
* specialized software packages
* technically-savvy users
Yes, that's right, your technically-savvy users are going to be the ones who notice the change, not your "where's my desktop" type users. The ones who don't know technology at all will simply click on whatever you put in front of them in whatever sequence you tell them. Trust me, they are lost on whatever technology you put in front of them; you just have to give them a sequence of clicks and they will obey and do just fine. It's the medium-technically-savvy users that are difficult, because they've taken the time to learn Windows inside-out and know how to get around all of its quirks. They may not want to learn a new system with new quirks.
Also, Linux systems are easier to manage. It's more obvious what causes processes to start up, and which ones are messing with which resources. In reality, NT has a bunch of tools available for this kind of thing, but, as usual, you have to install them separately - ON EVERY WORKSTATION. Linux comes ready to manage locally or remotely.
Add LDAP and Directory Administrator and you are set to go for large installations.
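Once the accounts live in LDAP, checking that any workstation can see them is a one-liner. The hostname and base DN below are invented for illustration:

```shell
# Query the directory anonymously for one user's entry; if the account
# comes back, the client is wired up to the directory correctly.
ldapsearch -x -h ldap.example.com -b 'dc=example,dc=com' '(uid=jdoe)'
```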
Re:Totally misses it on TCO (Score:5, Insightful)
How is that different from the model most companies currently use where all files are stored on a central server?
If a departmental file server goes down, or if the email server goes down then I have a pile of folks that can't use their computers for anything but solitaire. Sun was 100% right when they said, "the network is the computer." One of the major benefits of Linux is that it saves you money in licensing costs that can be directly applied to purchasing better (or redundant) hardware.
The real advantage of thin clients, however, is that instead of hundreds of machines that need to be administered I now only have to administer *one* machine (or two actually, because I am going to want redundant servers). Instead of babysitting rooms full of commodity x86 hardware (complete with all of the drive failures, software glitches, etc. that this implies) I now admin only a pair of identical server-class machines. If a thin client breaks, I throw the thing in the trash and get a monkey to install another one. If I want to upgrade the software everyone uses, I simply upgrade the server and I am done. Hardware upgrades are also ridiculously easy. Instead of filling up the landfill with used PCs, and spending time configuring new machines, I simply replace the servers and everyone gets a faster machine.
The pendulum is going to swing back in the direction of thin clients, and Linux is going to be a huge part of that shift.
Myth: Linux Means No Forced Upgrades (Score:2, Insightful)
Re:Lets take an objective aproach. (Score:3, Insightful)
And as far as installations go, especially in the corporate world, ghost images are the rule of the day. We have several standard setups for different user groups. It takes maybe an hour or two (unattended) to fully install everything.
Not saying that Linux is easier or harder or more comprehensive to push out a new install, but let's do be objective, shall we?
All Linux vs Windows comparisons miss the point... (Score:4, Insightful)
Why? The replacement costs are staggering! And they have nothing to do with the cost of the machine itself! It is the endless time it takes to replace a Windows machine. M$ has made it as difficult as possible (bordering on impossible) to back up and restore a Windows machine completely. Even if you can image a Windows hard disk completely, it will never run on anything except that exact hardware. The way hardware vendors change machine configurations, you can't get the same hardware makeup if you order two machines on the same day! All applications are hopelessly entwined with the copy of the OS running on THAT machine.
The only reliable way I have found to do this is to force users to keep the data files they work on on the server, and do a routine weekly backup of e-mail files and bookmarks for each machine. When a machine must be replaced, I spend a minimum of two days reloading all of the software we need on each workstation from the install disks, loading patches for each of those programs, and then restoring e-mail and bookmarks. This doesn't include the 1-2 hour wait on M$'s line to get another authorization number so I can reuse the Office Pro license on the new machine; I went through that twice, then found a pirated copy of the corporate version so I wouldn't have to waste that time anymore.
Linux, on the other hand, is simplicity itself. I simply back up several subdirectories. If the machine fails, or I want to clone the machine to another, different set of hardware, I reinstall Linux on the new machine and restore the backed-up subdirectories. Voila! A complete new machine with every application, all data files and all settings intact.
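That routine is short enough to sketch. The directory list and archive path below are my own guesses at what holds the local state; adjust them to your site:

```shell
#!/bin/sh
# backup_dirs: archive the handful of directories that hold everything
# local -- config, home directories, mail, locally built software.
backup_dirs() {
    dest=$1; shift
    tar czf "$dest" "$@"
}

# restore_dirs: unpack such an archive over a freshly installed system.
restore_dirs() {
    tar xzf "$1" -C /
}

# Typical invocation (paths are illustrative):
#   backup_dirs /mnt/backup/ws01.tar.gz /etc /home /root /usr/local
```

A rebuilt or cloned box then needs only a base install plus one restore, which is the whole two-step the comment describes.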
M$ is sooooo concerned with piracy that they make preserving my company's data and work environment hell. Frankly, ANY amount of trouble with a different OS pales in comparison to the hassles outlined above.
Re:Lets take an objective aproach. (Score:1, Insightful)
Re:Lets take an objective aproach. (Score:2, Insightful)
Note that 'risk' is a perception thing. There are many fine solutions to this perceived issue, but unfortunately almost all require thought. Something the masses are largely unwilling to do.
Re:We've already done that. (Score:5, Insightful)
You're absolutely right. But this doesn't really need to happen, except in case of a real catastrophe, which would take down all the client-server stuff too. People, back in the '70s and '80s I used servers that had uptimes of 2 *YEARS* or more, and these were serving apps out to over 400 people. People are *so* used to the prophylactic reboot (Ooo-er, Missus!) on Windows machines that they seem to accept machines going down regularly as normal. It currently IS, for Windows servers, but it doesn't NEED to be for other servers.
The real issue here is control: people don't feel happy about letting IT control the resources. I would urge everybody to read A Unix Guide to Defenestration [winface.com] before they comment on centralised vs client-server computing.
Re:not sure about that "linux security" thing (Score:2, Insightful)
No, the problem is allowing executable e-mail. If Linux starts to do that, it will suck too.
if this doesn't say hot air, what does? (Score:2, Insightful)
Myth: Linux Has a Lower TCO
Management tools have been available for Windows for years, Silver observed, but many enterprises still have not been able to manage their Windows environment. This has often been due to too much complexity, lack of sufficient policies or standards, or cultural and political issues, according to Silver.
If this is true with Windows, "we see little reason to believe that the cultural or political issues will change just because the enterprise is now using Linux," he observes.
So, because he can't imagine it, it cannot be.
There's one tiny little thing missing point: (Score:2, Insightful)
Left and Right sides of the brain (Score:1, Insightful)
The left would want us all to use Linux, forgetting that in individualist systems there's no bottom line.
The right would want us all to use Microsoft, forgetting there are other options besides the bottom line.
Neither package is ready for the desktop. Face it humanity, you're a failure.
Re:not sure about that "linux security" thing (Score:3, Insightful)
Macros in documents _may_ come to be problematic, but that's yet to be seen.
Re:Dispersing the Linux Myths (Score:3, Insightful)
Re:One Issue Not Contended... (Score:1, Insightful)
lessee...
#!/bin/sh
openssl req -new -key ssl.key/www.domain.net-server.key -out ssl.csr/www.domain.net-server.csr
# openssl req -noout -text -in ssl.csr/www.domain.net-server.csr
openssl-sign ssl.csr/www.domain.net-server.csr

#!/bin/sh
##
##  openssl-sign.sh -- Sign a SSL Certificate Request (CSR)
##  Copyright (c) 1998-2001 Ralf S. Engelschall, All Rights Reserved.
##

#   argument line handling
CSR=$1
if [ $# -ne 1 ]; then
    echo "Usage: openssl-sign <whatever>.csr"; exit 1
fi
if [ ! -f $CSR ]; then
    echo "CSR not found: $CSR"; exit 1
fi
case $CSR in
    *.csr ) CERT="`echo $CSR | sed -e 's/\.csr/.crt/g'`" ;;
        * ) CERT="$CSR.crt" ;;
esac

#   make sure environment exists
if [ ! -d ca.db.certs ]; then
    mkdir ca.db.certs
fi
if [ ! -f ca.db.serial ]; then
    echo '01' >ca.db.serial
fi
if [ ! -f ca.db.index ]; then
    cp /dev/null ca.db.index
fi

#   create an own SSLeay config
cat >ca.config <<EOT
[ ca ]
default_ca              = CA_own
[ CA_own ]
dir                     = .
certs                   = \$dir
new_certs_dir           = \$dir/ca.db.certs
database                = \$dir/ca.db.index
serial                  = \$dir/ca.db.serial
certificate             = \$dir/ca.crt
private_key             = \$dir/ca.key
default_days            = 365
default_crl_days        = 30
default_md              = md5
preserve                = no
policy                  = policy_anything
[ policy_anything ]
countryName             = optional
stateOrProvinceName     = optional
localityName            = optional
organizationName        = optional
organizationalUnitName  = optional
commonName              = supplied
emailAddress            = optional
EOT

#   sign the certificate
echo "CA signing: $CSR -> ${CERT}:"
openssl ca -config ca.config -out $CERT -infiles $CSR
echo "CA verifying: $CERT <-> CA cert"
openssl verify -CAfile ca.crt $CERT

#   cleanup after SSLeay
rm -f ca.config
rm -f ca.db.serial.old
rm -f ca.db.index.old

#   die gracefully
exit 0
(Additional script to insert/verify the new cert into Apache and gracefully restart not shown)
>Please give an example of "firewall management" that can be scripted on Unix that can not be scripted on Windows.
I don't have this script on my boxen, but there exist multiple versions (google!) that will add deny/reject entries in the firewall for every rule match -- think Nimda/Code Red/etc...
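A minimal sketch of that kind of responder (the log format, the probe signatures, and the echo dry-run are my choices; the versions floating around differ in detail):

```shell
#!/bin/sh
# block_scanners: pull attacker addresses out of a web access log
# wherever a Code Red / Nimda probe signature appears, and emit one
# firewall rule per address.  The echo keeps this a dry run; remove it
# to actually insert the rules.
block_scanners() {
    log=$1
    grep -E 'default\.ida|cmd\.exe|root\.exe' "$log" |
        awk '{ print $1 }' |
        sort -u |
        while read ip; do
            echo iptables -A INPUT -s "$ip" -j DROP
        done
}
```

Run it from cron against the current log, or hook it to tail -f for near-real-time blocking.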
>Please give and example of "hardware configuration" that can be scripted on Unix that can not be scripted on Windows.
Hotplug comes to mind. Dynamic reconfiguration _without rebooting_. Or loadable kernel modules.
>Please give an example of the "75% of it's (sic) [Windows] operation [hidden] from all users."
Wow. This one begs for it. The NT kernel. DirectX,
Re:Some FUD, not all (Score:5, Insightful)
This hits on a very important point.
Usually this kind of conversation ends up as a flamewar debating over the vulnerability counts found on SecurityFocus, etc. Ignoring exactly what these numbers mean, how they are tabulated, and whether they compare apples to apples or not... they only tell a part of the whole story. The trouble is, when people think "security", they've become conditioned to think exploit numbers. And patches.
Ideas like "Linux provides an easier patch path" are a good start. So would be something along the lines of "Linux provides a more modular environment and control over installed components." But then, that's considerably longer than "Linux provides better security," even if the short version leads to miscommunication.
But it may be worth the extra effort. After all, at the risk of generating another slew of flames, infosec is one of the subjects that seem to draw a lot of comments from those who really don't understand the subject. Pointing out the strengths of one's favorite environment might hold more weight if it also included some education in the subject matter at hand.
Um...what?? (Score:3, Insightful)
They then go on to explain that the argument is that OpenOffice and Linux will be less expensive than MS Office and Windows. Their attempt to debunk this is to say that OpenOffice is available on Windows.
Somehow this means that the "myth" is false? Their arguments don't stand to reason.
First off, the argument of Linux being less expensive is much, much larger than just the cost of an office productivity suite. It has to do with licensing, user support, applications, TCO, uptime, and all sorts of other things. Saying "OO is available on Windows. Q.E.D." is almost a non sequitur.
And how does saving money on an office suite, even if you don't migrate to Linux, mean that Linux costs more? It doesn't follow! If they argued other costs of migration (apps, user training, etc.), maybe they'd start down a logical line of argument. But the office suite argument is a dead end that doesn't lead to the conclusion that their headline would suggest.
This article is mostly FUD.
Re:not sure about that "linux security" thing (Score:5, Insightful)
Corruption of personal files is *catastrophic*. Imagine your house burns down, what do you want to save most? Do you say "Oh, we saved the house, but all your personal stuff is gone". That's just completely backwards. If the OS can't save me from a virus mucking with the personal files, then I don't give a damn about the system files, they can be fixed.
Sly Deception (Score:4, Insightful)
Perhaps the slyest bit of sleight-of-hand was the claim that the cost of supporting Linux users would not be significantly less than for Windows users. As support, the author quotes somebody saying that Linux required about as much support staff as Unix -- then simply guesses (ignoring contrary reports) that the same would hold versus supporting Windows desktops.
Another is the suggestion that working well on older hardware actually counts against Free software. The author says, for instance, "After warranty support is over, many enterprises choose not to repair broken PCs, but to replace them with new ones." This is in large part because the repaired PC would not be able to run current MS software versions anyhow.
Similarly, the author suggests that keeping older hardware means managing many more varieties of hardware. Yet, it is not old, well-understood hardware that is hard to manage, but the forced influx of new hardware needed to run new versions of software. Absent that forced turnover, an enterprise may reasonably stick with substantially the same hardware configuration (with optional upgrades in clock speed and storage capacity) until there are compelling, objective reasons to switch.
Equally damning are the omissions. The author carefully avoids mentioning lock-in, and never mentions the possibility of obtaining support from independent (and possibly local, and competing) third parties, or from the in-house expertise that can only develop with Free software. For a good comparison, consider the SUNY Faculty Senate resolution published at http://orange.math.buffalo.edu/csc/resolution2_april2003_approved.html [buffalo.edu].
I could go on and on, but the point is that the opposition has become more sophisticated. This is more clever than "Free software is a cancer that threatens the American Way", but the intent and the conclusion are the same. Now the strategy is "make minor concessions, but sow seeds of fear, doubt, and confusion." The falsehoods reveal the true intent.
Try to guess which Gartner customer wrote this report.
Re:some very good points (Score:4, Insightful)
Indeed. Our administrator just converted an underutilized webhosting box to a much-needed mail server while 500 miles away on a business trip, over nothing more than his iBook. That kind of remote management is unheard of in the Windows world. "apt-get install mysql-server"?
The funny thing is, qualified, competent, hard-working systems administrators with years of experience are surprisingly available these days, and are going for far less than people might expect. A friend with 20 years of experience managing Unix networks summed up the problem like this. "All applications are filtered through the HR person. The HR person knows what Javascript is. The HR person has no idea what AWK is. The HR person is going to pass on the resume of the person with Javascript experience."
Of course the issue of the *number* of qualified personnel required should also be brought up. I work in a pure Linux/BSD shop, so I will bring up the experience of a colleague who interned in a Mac/Windows QA shop. For most projects there were 12 PCs, 12 Macs, 3 PC technicians, and 1 Mac technician on setup. Generally speaking, the Mac technician would finish configuring and installing the software on all 12 of his machines before the 3 PC technicians finished theirs. This was quite some time ago and is not directly linked to the Linux/Windows debate, but the point is that the choice of OS can have a tremendous effect on the number of people required to administer the network. In this case, even if the more expensive technician were paid twice what each of the cheaper ones was, you would be paying only two-thirds of the cost of the three cheaper ones, and would have a much happier employee to boot.
Having seen what a competent Linux administrator can achieve quickly and remotely, for example knocking out scripts for specific tasks (like migrating datapaths) that would otherwise take hours to do by hand, it seems pretty clear that you would need fewer administrators for Linux than for Windows. Anybody either technically competent or extremely well trained can set up an IIS server, but it will take either of them quite some time.
So your choices (if money is an issue):
1) a higher paid employee running linux/bsd/etc.
2) several lower paid employees running win2k.
and if you are sufficiently small
3) a regular employee taking over the win2k work.
(note: this is not attempting to knock the technically competent Windows administrators amongst you, people for whom I have the utmost respect. But even you must admit that setting up a SOHO file server in Windows doesn't exactly tax your abilities).
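The "knocking out scripts to migrate datapaths" point above is easy to picture. Here is a minimal sketch, assuming a layout where each user's data lives in a directory under one parent path; the function name and the symlink-behind approach are illustrative, not from the comment:

```shell
# migrate_datapaths OLD NEW: move every entry under OLD into NEW and
# leave a symlink behind so the old paths keep working. A hypothetical
# sketch of the kind of one-off migration script described above.
migrate_datapaths() {
    old=$1
    new=$2
    mkdir -p "$new"
    for d in "$old"/*; do
        name=$(basename "$d")
        mv "$d" "$new/$name"
        ln -s "$new/$name" "$old/$name"
    done
}
```

Run over ssh, a few lines like this replace hours of moving directories by hand, which is exactly the kind of leverage the comment is describing.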
Re:Anti-windows FUD (Score:5, Insightful)
Internet Explorer runs with Administrator privileges. So does Windows Media Player. And Microsoft Office. Including Outlook. The "finer-grained ACLs" on Windows NT-based OSes don't mean shit when the programs all get to run setuid root.
The "support" issue (Score:5, Insightful)
It's bullshit. If you have problems installing a driver for Windows, do you think Microsoft will give you any support? Have you tried calling Microsoft tech support?
"Be sure to install the latest service pack". That's your tech support from the vendor. You get effective support for M$ products exactly the same way you get effective support for free software - by posting a question on a newsgroup or forum.
Re:not sure about that "linux security" thing (Score:4, Insightful)
No...
Corruption of personal files is *catastrophic*. Imagine your house burns down, what do you want to save most?
You conveniently ignored the "personal files can be restored from backup" part of the parent comment. Even the best security in the world doesn't protect you from hardware failure, so it's a given that you should be backing up your personal data. It's not that hard or expensive; you just need to get in the habit of doing it. Once you take that into account, your house analogy falls apart. You can't easily make a duplicate of all the personal stuff in your house, but you CAN back up your data. If you DO back up your data, all that's left to save is "the house".
If you're not backing up your data, you will lose it. You're flirting with catastrophe. You've been warned.
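The habit being recommended can be as simple as a dated snapshot. A minimal sketch, with illustrative paths and naming scheme, and deliberately not a full backup strategy (no rotation, no offsite copy):

```shell
# backup_dir SRC DESTDIR: write a dated, compressed snapshot of SRC
# into DESTDIR. A hypothetical sketch of the backup habit recommended
# above; a real setup would also rotate old snapshots.
backup_dir() {
    src=$1
    destdir=$2
    mkdir -p "$destdir"
    tar -C "$(dirname "$src")" -czf \
        "$destdir/$(basename "$src")-$(date +%Y%m%d).tar.gz" \
        "$(basename "$src")"
}
```

Ideally DESTDIR lives on a second disk or another machine, since a backup on the same drive does not survive the very hardware failure the comment warns about.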
The sad fact is . . . (Score:3, Insightful)
Re:Anti-windows FUD (Score:2, Insightful)
Re:That is pointless though (Score:3, Insightful)
Well, you can still do "/bin/sh file.sh" or something like that, but you can't run the script directly, which means it won't run by double-clicking it, and it won't run out of your email program.
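The distinction is easy to demonstrate without root. Setting up a real noexec mount needs privileges, so this sketch clears the execute bit instead, which produces the analogous refusal on direct execution (file name and paths are illustrative):

```shell
# Demonstrate the point above: a script the kernel refuses to execute
# directly can still be fed to an interpreter by hand. Clearing the
# execute bit stands in for a noexec mount here; the effect on direct
# execution is analogous.
tmp=$(mktemp -d)
printf 'echo ran\n' > "$tmp/file.sh"
chmod 644 "$tmp/file.sh"

# Direct execution is refused ("Permission denied")...
"$tmp/file.sh" 2>/dev/null || echo "direct execution blocked"

# ...but an explicit interpreter invocation still works.
/bin/sh "$tmp/file.sh"
```

The first command reports the block; the second still runs the script. That is exactly why noexec stops the double-click and mail-client launch paths but does not stop a determined user, or an attacker who already controls a process.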
A buffer overflow in a stack smash attack can still fork a shell, the no-execute mount of the filesystem is just a PITA for the users, not the attackers.
Worms smash stacks (although this is now nullified with recent changes to 2.5), not viruses. The execute protection doesn't help against a worm, but a properly set up (firewalled) desktop system shouldn't be attacked by a worm anyway. Properly protected, firewalled-by-default systems only need to worry about user error allowing a virus onto the system, and the execute protection effectively stops viruses.
Very few local->root exploits rely upon the ability to create exec'able files.
I fail to see what difference this makes on a single-user desktop operating system. In theory, every local user on a desktop machine has the ability to get root after typing in their password, so that they can safely install applications or make system-wide changes. In a corporate environment, a malicious employee can only take down their own machine; they can't send a virus to everyone in their department, taking down everyone else's systems in the process.