The New Linux Myth Dispeller 155
TillmanJ writes: "Just a quick note to let everyone know that the New Linux Myth Dispeller is online at http://www.eruditum.org/linux/myths/myth-dispeller.html
It is not, however, ready for prime time, but it is usable.
If anyone has anything to add/correct/bitch about, send me some email.
In particular, I would like to work with some non-English-speaking folks to translate it into whatever langs we can" Useful for clearing up the misconceptions of PHBs and other folks.
Re:News Flash: windows almost out of Beta!!! (Score:1)
Just curious...
NecroPuppy
Too zealous of a document (Score:3)
Re:Mythology and Reality (Score:1)
Because Linux is easy to install (Score:1)
Well, because that's not what's hard about Linux. If they can't install Linux, I suspect they can't cut-and-paste in a word processor either. The initial install with distributions like Red Hat, Mandrake, or Caldera is a piece of cake, and yes, anyone can do it. It's no harder than installing other OSes, because it's automated.
In my experience, the hair loss begins right after the initial install, when the user starts installing additional packages that didn't come with the distribution. RPMs, which are supposedly easier than source tarballs, are a major pain in the ass at this point. Damn those dependencies and version conflicts!
That is what I want to see Joe Schmoe do, and my bet is that currently, Joe will have a lot of trouble. I know I did, and I'm a computer dude. I couldn't even install GLX in order to play a game until I gave up on RPMs and fell back to ./configure, make, make install.
---
More FUD (Score:2)
It claims that a GPF (i.e. a segfault) can only be caused by hardware failure. A GPF means a segfault in the kernel. This means the kernel had bad code. Capiche?
Now, it calls the Linux kernel "small".
KERNEL32.DLL: 375,056 bytes
The Linux kernel image (Aug 1 23:54): 715,260 bytes
Twice the size of the NT kernel. Yay for small.
Windows is bloated because a *full development environment* takes five times the disk space of a *text editor*? Give me a break.
It lauds Linux as being POSIX compliant, but Windows 2000 is POSIX compliant too (as is Windows NT with Interix installed).
The other points this FAQ makes (re: security, history, Y2K compliance, etc.) are all obvious and only worth bothering with if you are the sort who HAS TO refute a lamer's argument, rather than just call them lame and ignore them.
Is this goy one of those kooky clan schmucks? (Score:2)
Re:'nother myth; not ready for the desktop (Score:2)
Installation is not the acid test for the desktop, and Linux can be preinstalled these days.
Re:'nother myth; not ready for the desktop (Score:2)
For bundling purposes, though, how much of a discount does Dell give you for Linux being loaded? None. So which box is more profitable, the Linux system or the Windows one? I don't know - the support contract may cost something, and Windows is peanuts for Dell - but there's at least more to the economics than you're suggesting.
Re:Win95 is not truly pre-emptive (Score:2)
However, how many Win16 apps does one run? Secondly, you don't seem to quite understand how Windows multi-tasking works. Win16 apps are run inside a virtual machine, which is itself a Win32 application. The machine preemptively multi-tasks all Win32 applications. The Win16 VM then cooperatively multi-tasks all Win16 applications. Thus the illusion doesn't just "vanish" when you run a 16-bit application. All your 32-bit applications continue to be preemptively multi-tasked; it's just that your 16-bit applications are cooperatively multi-tasked (against each other). Thus, a Win32 application cannot hog the processor, and if a Win16 application does, it won't hog the machine, just the virtual machine. Since the virtual machine is a 32-bit application, it can be preempted, so the result is that a Win16 application can only hog the proc from OTHER Win16 applications, not Win32 applications.
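To make that model concrete, here is a toy C++ sketch of the scheme described above. It is not Windows code - the task names, tick counts and yield behaviour are all invented for illustration - but it shows how a cooperatively scheduled Win16 VM can sit inside a preemptive Win32 scheduler without letting a misbehaving 16-bit app hog the whole machine.

    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    // Toy model only: shows how a cooperative Win16 VM can live inside one
    // task of a preemptive Win32 scheduler. All names and numbers are invented.

    struct Win16App {
        std::string name;
        bool willYield;   // a badly behaved app never yields
    };

    struct Win16VM {
        std::vector<Win16App> apps;
        std::size_t current = 0;

        // The VM gets a preemptive time slice like any other Win32 task, but
        // inside that slice only the current Win16 app runs, and it keeps the
        // VM's slices until it voluntarily yields.
        void runSlice() {
            Win16App& app = apps[current];
            std::cout << "  [Win16 VM] running " << app.name << "\n";
            if (app.willYield)
                current = (current + 1) % apps.size();   // cooperative hand-off
            // else: this app hogs the VM, but only the VM (see below)
        }
    };

    int main() {
        Win16VM vm;
        vm.apps = {{"Write", true}, {"BadOldApp", false}, {"Solitaire16", true}};
        std::vector<std::string> win32Tasks = {"Explorer", "Word97", "Winamp"};

        // The kernel preempts on every tick no matter what any task is doing
        // internally; the Win16 VM is just one more Win32 task to it.
        for (int tick = 0; tick < 6; ++tick) {
            for (const std::string& task : win32Tasks)
                std::cout << "[kernel] slice -> " << task << "\n";
            std::cout << "[kernel] slice -> Win16 VM\n";
            vm.runSlice();
        }
        // Result: BadOldApp eventually monopolizes the VM's slices and starves
        // the other Win16 apps, but Explorer, Word97 and Winamp keep running,
        // because the kernel preempts the VM like any other Win32 task.
    }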
Re:Just making sure I have the terms down right... (Score:2)
Dispelling myths is not the same as casting FUD about another product. If you defend a product honestly or point out its merits, it ain't FUD.
If, on the other hand, you were to spread half-truths about a competitive product, and imply bad things will happen when you use a competing product, that would be spreading FUD.
Re:'nother myth; not ready for the desktop (Score:1)
Re:Non-biased version? (Score:2)
FUD is when you spread misrepresentations to cast doubt on a competing product. In general I don't think this is widely done by the Linux community.
I've seen FUD get spread by Microsoft in other areas, it ain't pretty and it's obvious what's going on to the well informed.
If someone did this in the Linux community there would be a chorus of objections correcting the FUD. Heck, just look at what's happened in this thread. Honest comparisons are not FUD.
Re:Wow! Pro-Linux FUD! (Score:1)
I've never, ever had to have multiple glibc versions running on a machine. Also, you only need the base libraries for GNOME and KDE to actually run the apps - most of the stuff is only useful if you're using the full desktop.
In terms of memory usage, Linux blows NT4 out of the water (a bad thing) and is quite close to Windows 2000's bloat.
>>>>>>>
Not in my experience. Obviously, this kind of article is just going to lead to a flame fest all around, but I've run RH 6.1 on a 25 MHz 486 with 16 megs of RAM - WITH X+KDE. And it wasn't noticeably slower than the Win 3.1 it replaced. Try running NT4 on something like that - it ran poorly enough on my 350 MHz P-II with 64 megs.
Linux DOESN'T take full advantage of hardware.
Linux doesn't support DirectX, and thus automatically lacks support for a lot of hardware features that are in DirectX-compliant hardware. The main reason is that transparent usage of hardware was a major design consideration for DirectX. It is based on the concept of supporting many different hardware features, having all applications use them, and then emulating those not supported by hardware. When the hardware supports new features, all apps and the OS automatically take advantage of them. Also, X doesn't have as complete support for many graphics operations that are possible in DirectX.
>>>>>>>
What the HELL are you talking about? DirectX is a development API. Vendors can also write drivers which allow DirectX to use the full abilities of their hardware - just as is done with every other graphics API, such as OpenGL or Glide, both of which run on Linux fine. My games run much faster when I use Glide than DirectX - if DirectX somehow magically makes hardware faster, how do you explain that?
Re:Red Hat != Linux (Score:1)
So what hardware does RedHat support that isn't supported on any other distribution? What, none? The only reason for discrepancies in hardware support is the age of the distribution. RedHat puts out a new version like clockwork, so newer technologies make their way in faster. As an example, Debian releases much more slowly, so it may appear to support less hardware, but after installing newer versions of the shipping software it runs on everything RedHat will.
As a little test I have sat a user down with a machine, the RedHat install manual, and a CD and asked them to install it. I came back the next day and found the system working completely.
What really separates distributions are the tools they ship for configuring the system. RedHat, Mandrake and Corel focus on adding configuration tools, while Slackware doesn't focus as heavily on this. Systems that don't focus on easy end-user tools aren't much of an issue because most users aren't going to start with them. If they aren't satisfied using RedHat they'll try other distributions. If they are satisfied they won't switch.
treke
Re:Myth Dispeller myth dispelling (Score:1)
games? (Score:1)
Linux: It's not as bad as you think it is (Score:2)
Aside from some dated information and inaccuracies, the problem I've had with the LMD for quite some time is that it's couched as a double negative: "Linux isn't evil foo", where foo is some undesirable characteristic. The entire flavor of the document would change if language were changed from negative to neutral or even positive. Otherwise it has this "are you still beating your wife?" flavor. The LMD would answer that question with a topic "Linux is no longer beating its wife".
Just as an example, the document would have an entirely different flavor if the system headings under "4. Systems Myths" were:
Suggestion is a powerful tool. The document should suggest that Linux is the cat's pyjamas. People will tend to believe.
What part of "Gestalt" don't you understand?
better metaphor needed (Score:4)
"suits" or "powers that be", or even Grand Poobahs... PHB does not work.
Needs some touching up yet... (Score:5)
"After getting a Linux CD, you'll probably be up and running within an hour. In the olden days, this has been true, and Linux can be made hard to install."
Not to be a grammar nazi here (we have someone at slashdot filling that position already), but "this" is unclear, and may confuse some people about the facts, as it implies that being up and running within an hour was only true in the olden days.
"Linux is well over twice as fast as NT"
Generalizations like this should have no place in a document of this type, especially considering you don't back this statement up with any data. Statements like these should be what your document is fighting against. Specifically, twice as fast at what? While it's probably true that Linux 2.4 is significantly better than NT at several tasks, there are definitely situations where NT beats Linux 2.2.
Haven't finished reading the rest of it yet.
Re:How about dispelling some pro-linux Myths? (Score:1)
Frankly, I think the best answer to "Linux is difficult to install" is not "Linux is easy to install" which is just another opinion. Things like the possibility of FTP/NFS installs, bootable CDs, and the fact that once something's installed you don't have to look at it ever again are things that can be mentioned. It doesn't have to make any anti-linux statements, but telling the full story is a very good idea.
Hopefully when this site becomes ready for prime-time, its contents will be based more on facts than opinions, even if it stays pro-linux.
the linux/slashdot bias (Score:1)
-who it is recommended by
-results of scientific testing of effectiveness
The problem is a myth page so loaded with language that is too technical for most of the people it is trying to reach out to - the people who don't want to spend time to become educated consumers.
Non-biased version? (Score:2)
I think what we need is a non-biased version of a myths website. Something that will dispel anti-Linux and anti-Windows myths so people can make an informed decision about which operating system they want to use. If people were to see a site admitting that an OS has its problems, they'd probably find it easier to trust.
runs on a DEC Alpha? (Score:2)
Re:Needs some touching up yet... (Score:1)
Another place where Linux is weak is on desktop systems, due largely to the fact that whereas Windows and Macintosh both have very stringent UI guidelines, Linux has virtually none. KDE and GNOME are trying to change this, although I feel the somewhat hostile environment between the two is working more to make things worse than to improve them. As a developer, I should be able to choose either GTK+ or Qt and have the resulting program behave similarly on either platform. And as a user, I should be able to expect that any program I install will try to act reasonably with the preferences I have set in my "preferred" desktop environment. For example, I'd love to see a unification of UI "themes", and a toolkit-independent preferences system where both KDE and GNOME settings are stored via their respective control centres and shared between both where applicable.
Portions of this document are no better than the FUD it attempts to refute, IMO, due to exaggerations on the part of the author. There's no harm in admitting weakness; in fact, it usually shows that you're trying to be honest and unbiased.
WinNT threading not fully preemptive (Score:1)
Hmmm...any ideas on how this could be abused?
much-reviled Windows registry actually a good thing (Score:1)
Re:Non-biased version? (Score:1)
Re:runs on a DEC Alpha? (Score:1)
DEC is associated with things like the PDPs, VAXen, and VMS (Ok, maybe we wanna forget that one).
Which would you rather associate with your 64bit CPU?
Re:'nother myth; not ready for the desktop (Score:1)
I don't honestly think that the average user would even care if Linux came with source code or not. Think about it here for a second. What good is the source code to someone who doesn't have a programming background? Sure, the code is neat to poke through, and may give you an inkling of an idea about how the system runs, but what good is it to the average user? We have to remember that not every Linux user is a programmer.
>>>>>>>
And you have to remember that it doesn't matter whether every user is a programmer. Not every user needs to contribute for the system, as a whole, to work. I'm certain you don't believe that everybody who uses Linux has contributed code for the kernel. But has every user benefited from the stability that comes from code and bugfixes contributed by others because it was free? Of course. Likewise, you don't need to convince most end-users that free software is good - you just need to convince them that people are actively contributing to development. Point them towards the kernel development mailing list. Show them the Sun press releases. Tell them about the contributions IBM is making. Explain how this came about because anybody can contribute freely. They don't need to contribute at all.
What the hell? (Score:1)
Um, every Windows since 3.1 has had preemptive multitasking.
Re:A few corrections (Score:1)
Not quite... NT 3.51 had a clean, microkernel-based design. For marketing-driven performance reasons, 4.0 and up have a bastardized design where a lot of higher-level functions bypass the microkernel and operate straight on the hardware.
How about dispelling some pro-linux Myths? (Score:5)
Re:I hope you're not just complaing here (Score:1)
The truth about installing Visual C++ (Score:2)
In section 4.4, talking about the relative hard drive merits of the differing OSes, Visual C++ is quoted as taking somewhere around "100 Megs". When I installed it off Microsoft Visual Studio 6 Enterprise Edition last week, it took around 330 megabytes. Yes, just for MSVC++. J++ had a CD all of its own; I didn't even dare go there.
That may not even be the worst offender, though. Symantec Cafe for Java (Database Edition) occupies around 580 MB on my hard drive, and I think it was just a typical installation.
Fross
Here is one... (Score:1)
This is definitely a Linux myth!
Re:Needs some touching up yet... (Score:1)
The GUI/desktop myth section needs a lot of cleaning up.
Just more constructive criticism
Re:the linux/slashdot bias (Score:1)
A few issues - (Score:1)
Firstly, it must be pointed out that MS Office data formats cannot be treated as "standards" under any reasonable definition of the word "standard."
Almost 100% of offices use it, and require it. It's a de facto standard, but still a standard.
"8.1 Linux is PC exclusive"
Just a nitpick. Looking at the debunking, this should say "Linux is x86 exclusive".
The end of time? (Score:1)
nobody's gonna buy it. (Score:1)
To be taken seriously, it will have to adequately address the shortcomings of Linux. I'm very pro-Linux, and even I stopped buying the PR on that site when I read about how Linux has more software than Windows.
A more balanced view would be much better than that of a blind advocate.
________
Re:'nother myth; not ready for the desktop (Score:1)
-Peter
A few corrections (Score:3)
Mac OS X uses fully preemptive multitasking. Its core is BSD Unix.
MacOS and Windows 9x simply don't support this.
Mac OS Open Transport is one of the most advanced networking stacks, and supports almost everything you can do on Linux. It's based upon Mentat Portable Streams (used on Novell NetWare, Hewlett-Packard HP-UX, IBM AIX, Compaq Tru64 UNIX...).
Most apps do not crash Linux too badly, but at least on the Linux boxes I used, X Windows had a tendency to crash and take the whole system with it...
Linux advocacy is nice. Impartiality would be nicer. At least it would help differentiate Linux from Apple/Microsoft, whose advocacy is often far from being objective...
Note for the author: how long did it take you to set up a PPP connection with the Debian ppp-config utility? Not more than 3 minutes? Or about 5 minutes? At least be consistent.
Re:Myth Dispeller myth dispelling (Score:2)
With a closed up tight hardware architecture. It really doesn't matter how spiffy Steve's new toy is. So long as Apple and crew continue to dictate what hardware will be used I will continue to wish for their downfall.
Re:A few corrections (Score:1)
Easy to install? (Score:1)
I love Linux, and yes, Linux is a nightmare to install. Many things have improved, that's certainly true, but: I recently added an IDE CD burner to my PC.
I needed to: recompile the kernel, set up SCSI emulation, find burner software... I spent half a day on this. I want Linux to improve - in most cases it's much better than Windows already. I think pretending that it is in all cases is counter-productive.
Re:A few corrections (Score:2)
7,769,104,384 bytes free (Score:1)
That's how much space is on my harddrive after installing Windows 2000 Professional, Office 2000, Visual Studio 6.0, about a half dozen full installs of games off CD, etc.
Disk space is cheap.
"Here is a few of my favorite programs", ha! (Score:1)
Re:'nother myth; not ready for the desktop (Score:2)
The "average" person cannot "install Linux". The average person cannot "install Windows". They can't install BeOS or Solaris, which are commercial products in a similar position to Linux. They can't install their DSS dish without the guy from the the store doing the alignment. They can't install a toilet without asking a plumber to do it.
This has nothing to do with whether they can use the product.
Re:Myth Dispeller myth dispelling (Score:2)
Some tests I ran (network-intensive reading and writing of files) did run more than twice as fast on Linux as on NT on identical hardware.
Some other tests (compute-intensive) ran as much as 10% slower, possibly due to poorer optimization by the gcc compiler.
Re:An example of hardware that Linux omits: USB (Score:1)
Re:Wow! Pro-Linux FUD! (Score:2)
Perhaps you want to rephrase your argument as "DirectX supports a more powerful graphics API", which would make sense.
Neither DirectX nor OpenGL nor X nor BeOS supports graphics APIs that have not been invented yet!
Re:Wow! Pro-Linux FUD! (Score:2)
base libraries for GNOME and KDE to actually run the apps - most of the stuff is only useful if
you're using the full desktop.
>>>>>>>
Are you running Netscape? On a glibc 2.1.3 machine, you need compatibility libraries. As for the libraries, Qt is 2.5 megs, kdesupport is 3.5 megs, kdelibs is 5 megs. That's 11 megs. Assuming the RPM format uses compression, you're talking around 15 megs of libraries.
Not in my experience. Obviously, this kind of article is just going to lead to a flame fest all around, but I've run RH 6.1 on a 25 MHz 486 with 16 megs of RAM - WITH X+KDE. And it wasn't noticeably slower than the Win 3.1 it replaced. Try running NT4 on something like that - it ran poorly enough on my 350 MHz P-II with 64 megs.
>>>>>>>>>>
Obviously our experiences are different. However, Win 3.1 was really bad in terms of performance, especially due to the real-mode filesystem. (NT's filesystem is about 5 TIMES faster.) Also, your comparison is uneven. KDE 1.2 + X lacks a LOT of the features of NT4. My Slackware system runs GNOME 1.2, KDE 2.0b3, X 4.0.1, and kernel 2.4-pre5. Then, load up the latest build of Mozilla, and you've got a system that is comparable to an NT machine. By that I mean it can run all the available software for the platform, it has an object model, most of the features of NT's DE, and a browser (IE is always in memory if you've got Active Desktop on). My machine is very trimmed, 200 megs before X and GNOME and KDE. (BTW, my system partition for NT4 is only 500 MB, and it is presently only half filled. Apps are a different matter.) That config takes up a good DEAL more RAM than NT. A lighter config, GNOME+Mozilla+X+kernel, still takes up more RAM than NT4. Linux + X + KDE 1.2 also takes up (though slightly) more than NT4. At that point, it isn't even a fair comparison because NT's DE has so many more features than KDE 1.2.
What the HELL are you talking about? DirectX is a development API. Vendors can also write
drivers which allow DirectX to use the full abilities of their hardware - just as is done with every
other graphics API, such as OpenGL or Glide, both of which run on Linux fine. My games run much
faster when I use Glide than DirectX - if DirectX somehow magically makes hardware faster, how do
you explain that?
>>>>>>>>>
You miss the point entirely. I said that DirectX apps take much fuller advantage of the hardware than Linux apps. Secondly, it is NOT possible to write OpenGL apps that automatically take full advantage of the hardware. Let me explain. Say DirectX supports rotating bitmaps, scaling them, and blurring them, and you've got a piece of hardware that supports the first two, but not the third. The developer simply writes a driver that exposes those two features and leaves the third to DirectX. Thus a DirectX application can use all three features, though the third feature will be slow on that particular card. However, when the user upgrades their card to one that supports all three features, DirectX will automatically use the hardware version. All this happens transparently to the application; it will just notice that these operations perform faster. Now my point is that hardware with DirectX support tends to have a lot of these features in hardware. However, most Linux APIs don't have nearly as many features as DirectX, so you sometimes end up with situations where there is support for a feature in the hardware, but not in the API. For example, on your Linux machine, your soundcard's 3D sound hardware is going totally unutilized. By supporting a very broad range of features, and emulating those that aren't supported by hardware, DirectX makes sure that developers use those features, and when the user upgrades their card, apps can automatically use the new acceleration features.
As for OpenGL vs. DirectX, it isn't a fair comparison. DirectX is a whole lot more than just 3D; it is more like DirectX vs. OpenGL + ALSA + X (for overlays and input) + SVGAlib. (BTW, the second combination doesn't come close to competing with the first.) If you're talking about D3D vs. OpenGL, read my article on OSOpinion called "Is OpenGL In Trouble". In short, the method that D3D uses to support features is far superior to the method used by OpenGL. Think of it this way. Say I make a graphics card. It supports vertex tweening. Now, this feature isn't a part of OpenGL, so I write an extension to OpenGL called MY_vertex_tweening_extension. What happens here is that apps can use the vertex tweening features of my card even though it isn't part of OpenGL. However, there is a problem. The extension is proprietary, meaning that my vertex tweening extension isn't compatible with ATI's vertex tweening extension. Thus, a developer has to write code for both (often several) cases. Now, the ARB (the people who control OpenGL) has the option to make something a standard extension. Thus, there can be a standard ARB_vertex_tweening extension. That way, I can just write code for that extension, and all hardware that supports it will automatically accelerate it. However, OpenGL moves very slowly. The core API doesn't really change that much, and extensions take a long time to come out. (For example, multi-texturing came out as an extension a lot later than as a feature of Direct3D.) Thus, OpenGL tends to have a lot fewer standard rendering features than Direct3D. However, extra features don't really take that much code to add. What DirectX does is support a very wide range of features. (BTW, it gets its list of features to put in by asking graphics card makers what they're going to put into their new cards, and asking software developers what features they want to use.) Thus, vertex tweening is already a part of Direct3D, and any app that uses it will automatically be accelerated on any hardware that supports it.
Because extensions to OpenGL take so long to get standardized, it often happens that developers (except huge people like id, though I doubt they like coding for each extension) just choose not to support those features, or write their own software version. Worse, a lot of developers may just code for the cards that exist now, and future cards that support that feature will be left out.
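Roughly, the caps-and-emulation pattern described above looks like the following hypothetical C++ sketch. The structure and function names are invented for illustration (the real DirectX API exposes the same idea through driver-reported capability structures); the point is only that the application calls one routine and the runtime decides between the hardware and emulated paths, so new hardware gets used without any application changes.

    #include <iostream>

    // Hypothetical sketch of "query caps, fall back to emulation". These types
    // and names are invented; they are not the real DirectX interfaces.

    struct DriverCaps {
        bool canRotate;   // reported by the vendor's driver
        bool canScale;
        bool canBlur;
    };

    struct Bitmap { /* pixel data would live here */ };

    void blurInHardware(Bitmap&) { std::cout << "blur: hardware path\n"; }
    void blurInSoftware(Bitmap&) { std::cout << "blur: emulated in software\n"; }

    // The application just asks for a blur; the runtime picks the path.
    void blur(const DriverCaps& caps, Bitmap& bmp) {
        if (caps.canBlur)
            blurInHardware(bmp);   // newer card: same app code, now accelerated
        else
            blurInSoftware(bmp);   // older card: slower, but it still works
    }

    int main() {
        Bitmap bmp;
        DriverCaps oldCard{true, true, false};   // rotate + scale only
        DriverCaps newCard{true, true, true};    // all three in hardware

        blur(oldCard, bmp);   // emulated
        blur(newCard, bmp);   // accelerated, with no application change
    }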
Re:More FUD (Score:2)
Running Linux under VMware on NT would be equivalent to this so-called "POSIX compliance".
The real shame is that Microsoft probably would not be exposed to the wrath of the CS community if they had done even rudimentary POSIX compliance correctly in NT. All they needed were raw byte file names (i.e. case-sensitive file names), raw byte files (i.e. get rid of "text mode" and ^M^J newlines), use forward slashes (they do already, but fix the documentation!), some hack so you don't need colons to name objects (like having /A/foo or //disk/A/foo mean the same as A:foo), make all of their NT "objects and services" accessible as named files so at least access() works on them, support symbolic links, and give all processes a working stdin/stdout. All of this would have been trivial to do, and if they had done this I believe Linux would be nowhere today, since they would have produced a friendly programming environment rather than the horror they did.
Re:Here is one... (Score:1)
Re:An example of hardware that Linux omits: USB (Score:1)
Re:hrm... (Score:1)
Seems to me like FUD fighting FUD - if the facts are on your side, why not publish them?
Just making sure I have the terms down right..... (Score:5)
pro-Linux = dispelling myths?
some of it is not entirely true (Score:2)
Re:'nother myth; not ready for the desktop (Score:1)
Still doesn't help me (Score:1)
We cater to mostly Fortune 500 companies and it seems they're not interested in using Linux. If that's the case, it doesn't matter that Linux is stable, low-cost, ghostable, yadda yadda yadda. He said unless customers start demanding it like barbarians at the gate, the company won't invest money and research in creating a secure, scalable managed platform. (Linux is still used mostly as infrastructure servers in these companies, right?)
And just for the record he's a Winbloze engineer, but he's all for anything that'll bring in profits. Even if that means introducing Linux.
He might have a hard time translating... (Score:2)
I don't mean to nit-pick at other people's choice of words, but I couldn't resist this time.
I'm sure, despite the volunteer's superb mastery of the tongue-click language, it's gonna be a bit hard for him to translate the page if he can't understand the language that it's already in... and, I imagine it'd be a bit hard for him to communicate with the benevolent non-English-speaking people who wish to bestow their translatory (I know it's not a word, so what?)...
Okay, okay, I'm being stupid, I just felt a compelling urge to submit an irreverent and vain attempt to be funny; so what if I'm not actually funny? I'll be good from now on...
Re:Non-biased version? (Score:3)
FUD is when you spread misrepresentations to cast doubt on a competing product. In general I don't think this is widely done by the Linux community.
>>>>>>>
That's a joke, right? Slashdot is the home of anti-Microsoft FUD. Not that MS is a perfect company, or Windows is a perfect product, but if you listened to many Slashdotters, you'd think that it was impossible to get ANY work done because of the constant crashes, never mind that 50-100 million people use it every day.
--
Re:Red Hat != Linux (Score:1)
where on earth did you get that slogan? [redhatisnotlinux.org]
Bring something new to the discussion, please. It was getting interesting before you stepped in.
Re:What? (Score:1)
Re:w00t!!!!!!!!!f adsfasdfasdf (Score:1)
Experts have lost perspective (Score:1)
Knowing that /. is a Linux haven, I still disagree that some of these items are myths. Particularly in the "Installation" portion of the Linux myth dispeller.
I contend that the Linux experts have forgotten how difficult Linux can be when you're new to it. In the case I present below, this was my very first fresh installation of a UNIX/Linux system, although I'm familiar with UNIX and have been a system admin for several years now (small network).
I have set up a single Linux machine as my firewall/masquerade and mail server, and I can say the following:
Here are my counters to the myths:
what kinda crap is this? (Score:2)
The gaming options on Windows leave something to be desired when compared to Linux? Ummmmm..... FUD? I love linux, and use it daily, but lies like this aren't very likely to help growth.
Simple question - who's going to read this? (Score:2)
The first thing that struck me was whom it was intended for. My guess is that the actual readership would probably break up as follows:
95% - linux enthusiasts & slashdot readers
2 % - windows supporters
2 % - people who are handed this doc by someone they know who's a linux enthusiast, and who actually read it.
1 % - others (PHBs, ordinary users, curious onlookers).
I can assure you no PHB is going to read a document which devotes an entire page to describing the doc, then goes on into copyright, the structural layout of the document, etc. They lose interest after 3 lines (not kidding). What you would need is an executive summary at the top, followed by bullet points describing each myth and dispelling it, with the most popular myths first.
If the audience is to be your next-door neighbor or friend who hasn't tried Linux and has heard these myths, I can just imagine their nonplussed response. You see, people with a casual interest in something are not particularly interested in going through minute analysis of propaganda battles. Imagine how interested you'd be if your bank handed you a brochure with a 20-page feature-by-feature comparison with its rival.
Newsflash - everyday users are as interested in detailed FUD analysis of OSes as you are in the FUD analysis of banks, tax strategies, hotels, etc. You just want to use it without thinking about it too much. This document is preaching to the choir.
If people shared our passion for debating windows vs. linux, they would already know all this stuff inside out. The whole thing is -1, redundant.
w/m
Proprietary licence (Score:2)
Re:'nother myth; not ready for the desktop (Score:3)
This is the second comment I've responded to about this. The question isn't whether or not some distro is ready to have Joe Schmoe install it... Hell, my dad is on his third Windows box and he's never installed it! The question is whether or not Linux is ready for the desktop. Period.
I'm talking about getting a box pre-installed! My dad wouldn't know how to get any of those things working! I bought him a new video card and a flight sim for Christmas, then I had to go and install everything because he didn't know how to do it.
But windows is ready for the desktop, right? Why did I have to do that? Because, being "ready for the desktop" and being "ready for j. average user to add random hardware" are two different things.
My comment boiled down to this:
Everything is difficult when you're new. But if you don't have preconceived notions of what things should be, then you can get over the difficulty very quickly.
end of story.
-Peter
Red Hat != Linux (Score:2)
Well, you've proved (in your case) that Red Hat has good hardware support, not Linux.
Oh, and I have used Linux before as well. Sorry to burst your bubble
--
Linux does not conform to the X/Open Standard (Score:2)
It would be very nice to see these comments corrected before this document is released. Please see www.motifzone.com
Re:Myth Dispeller myth dispelling (Score:2)
No (Score:2)
Microsoft's lies to the contrary notwithstanding.
Cheers,
Ben
Re:A few corrections (Score:2)
The key to OT performance is to respond to a network interrupt and pass pointers to a STREAMS message up the stack and back down again in less time than it takes to send a 1500-byte Ethernet packet. Since routing occurs at interrupt time, it is not affected by other applications. ... And it's always nice to see the look on people's faces when you explain to them that your Mac is saving the planet by running rings around NT or Cisco routers and is way easier to configure than a Linux machine!
What a liar (Score:2)
we don't do windows
lynx --head http://www.eruditum.org | grep Server
Server: Microsoft-IIS/4.0
Liar.
I hope you're not just complaing here (Score:2)
How about random marketing myths? (Score:2)
Red Hat = Linux
I'm not saying this as a troll; I'm saying this because Red Hat honestly is perceived as equalling Linux, and I don't see Red Hat doing anything to stop it (I'm not saying they're TRYING to steal Linux either. They are just conveniently remaining quiet about this).
Please consider adding this to the myth dispeller document. It may end up being more important than many people think.
An amazingly outdated article (Score:3)
Yet, people think it's still oppressed, and feel the need to defend it further. What are you defending against? Linux has no detractors, except perhaps competition such as Microsoft, but only just, and that represents a tiny portion of the marketplace.
The article reads like it was written in 1995. Windows only cooperatively multitasks? WTF? This was trendy to discuss in 1994, but since the release of Windows 95 (not to mention NT, and 2000!) the issue has long since been resolved. The author needs to pull his head out of the sand, fly back to earth, and check out everything that happened in the last five years, which he has missed.
Re:'nother myth; not ready for the desktop (Score:4)
The question isn't whether people are willing to learn or not. Yes, people can learn how to use Linux if they sit down and learn it. Now how about getting devices like your sound card, your scanner, your printer, your modem, or your digital camera to work efficiently under Linux? Now we hit the snag. Some people may find Linux easy to use, but what about device support? Yes, there is support for such devices under Linux, but it's still not as good as the device support you get under Windows. Not everything will work under Windows, but I'm willing to bet that there's a lot more that won't work under Linux.
Sure, Linux is supposedly easy for the average user to toy with. But it's still behind Windows when it comes to desktop readiness. Before you spout your FUD at me: is an OS that can be easily used (after learning it) yet doesn't have device support for everyday components ready for the desktop? I don't think so.
--
Wow! Pro-Linux FUD! (Score:5)
1) Linux may not be a nightmare to install, but it is still a nightmare to configure. The main problem is not so much that configuration is very text oriented, but that it is not consistent. Some stuff is configured through user-space programs (hdparm and ifconfig). Other stuff is configured through text files. Some stuff is configured through scripts (the old rc.modules style), other stuff via files like modules.conf. Often, there is little feedback if you do something wrong. I still don't know what I'm doing wrong configuring ALSA.
2) Linux multi-tasking.
The site implies that Windows uses cooperative multi-tasking. That is simply not true. Windows 95 and Windows NT both use preemptive multi-tasking and in fact multi-task SMOOTHER than Linux. It is not so much a performance thing as a "feel" thing. The default quantum in NT is around 20 milliseconds or so on Workstation, 50-100 on Server. The default quantum on Linux is 50 milliseconds (newly lowered in kernel 2.4). So on Linux, each app gets a longer time slice. While this may be more efficient, it degrades interactive performance (i.e. the "feel" of the system).
3) Linux IS too huge. In order to get the same experience as one does with Windows, you have to use KDE or GNOME. Otherwise you're pitting a product with more features against one with fewer features. Also, if you don't use GNOME or KDE, some of the other "FUD" becomes true. To get a Linux system comparable to a Windows NT system, you have to have GNOME+KDE (both, so you have full compatibility) +Mozilla+X+kernel. Not to mention the multiple versions of glibc and all the additional (often redundant) libraries all the apps use. In terms of memory usage, Linux blows NT4 out of the water (a bad thing) and is quite close to Windows 2000's bloat.
4) Linux IS playing catch-up. Most new kernel features (journaling FS, new automounter, LVM, etc.) have all been implemented on previous operating systems. Not to mention how much KDE and GNOME are playing catch-up.
5) Other OS kernels do NOT load everything at the same time.
I don't know where they got this. Most of Windows is built out of DLLs, which can be dynamically unloaded. Most Unixes had modular kernels long before Linux. Microkernel OSes like BeOS can turn off entire subsystems if they are not needed.
6) Linux DOESN'T take full advantage of hardware.
Linux doesn't support DirectX, and thus automatically lacks support for a lot of hardware features that are in DirectX-compliant hardware. The main reason is that transparent usage of hardware was a major design consideration for DirectX. It is based on the concept of supporting many different hardware features, having all applications use them, and then emulating those not supported by hardware. When the hardware supports new features, all apps and the OS automatically take advantage of them. Also, X doesn't have as complete support for many graphics operations that are possible in DirectX.
7) Linux threads aren't all they are cracked up to be. I have seen tests showing that NT's threads not only take less time to create, but switch significantly quicker. Also, the site makes excuses for Linux's lack of threaded applications.
FACT: Multi-threaded apps are better. They may have slightly more overhead and are more complex to write, but it really pays off for those with SMP machines. It also pays off in today's systems because of the increasing number of processors in the system - not only due to SMP, but also the specialized chips systems use. Graphics cards can do operations independent of the CPU, so for most graphics apps it makes a lot of sense to have an independent display thread. Thus, the main thread can do things while the graphics card is busy working. Same thing for 3D audio: instead of blocking the CPU waiting for the sound card to finish working, spawn another thread and have them process together. (A rough sketch of this spawn-a-worker pattern follows after this list.) The trend is moving towards PCs with more and more independent chips, and there is no excuse for writing single-threaded applications.
FACT: Threading on NT doesn't use cooperative multi-tasking. Where did they get that? Threads are preemptively scheduled just like applications.
FACT: Linux doesn't use threads nearly as often as it should. By having the kernel and libraries heavily threaded, and with fine-grained locking, performance really improves.
However, BeOS hopelessly outclasses both in the threads department. The same tests that show that NT threads switch quicker also showed that BeOS threads switch 10x quicker (that is due to the different model BeOS uses for threads; I can't find the article at the moment, but I'll post it when I do). Also, the kernel, servers, kits and apps are heavily multi-threaded. The API encourages apps to be multi-threaded. If you've used BeOS on SMP machines, you know how important multi-threading is.
8) Linux really isn't that fast, depending on what you do. For server tasks it is undoubtedly a speed demon, but for desktop tasks, my NT4 machine (not to mention my BeOS machines) FEELS faster. Screens have less visible redraw, and apps switch quickly from one to the other. Not to mention the fact that anything media oriented does much better on Windows than on Linux. (This is partially due to the APIs: X is really not great for fast display, OSS isn't really great for complex sound, the X input system can't compare to DirectInput, there really aren't that many MIDI APIs to speak of (at least not comparable to DirectMusic), and (as of now) 3D is STILL slower than on Windows.)
9) The Linux desktop IS clunky. It's very attractive, but the Linux guys need to steal some ideas from the Mac instead of Windows.
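Regarding point 7 above, the spawn-a-worker idea can be sketched with standard C++ threads as below. The workloads are fake stand-ins (there is no real sound card or frame rendering here); the only point is the structure: the audio work goes to its own thread while the main thread keeps going instead of blocking until the hardware is done.

    #include <chrono>
    #include <iostream>
    #include <thread>

    // Sketch of "spawn a worker so the main thread isn't blocked". The work
    // below is a fake stand-in for real audio/display code.

    void mixAudioBlock() {
        // Pretend this is feeding (and waiting on) a sound card.
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        std::cout << "audio thread: block mixed\n";
    }

    int main() {
        std::thread audioThread(mixAudioBlock);   // audio gets its own thread...

        // ...while the main thread keeps doing interactive work. (Output from
        // the two threads may interleave; that is fine for a sketch.)
        for (int frame = 0; frame < 3; ++frame)
            std::cout << "main thread: drew frame " << frame << "\n";

        audioThread.join();   // wait for the worker before exiting
    }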
Re:Wow! Pro-Linux FUD! (Score:3)
Not so fast...
There is a ton of solid evidence that fine-grained locking can kill performance, not to mention making code more complicated, harder to write for, and harder to maintain.
Granted, threading *can* improve performance, but the impact tends to differ depending on use and the type of machine you're running on. The kernel crew has taken a moderate stance on threading, not wishing to hurt low- and mid-end hardware performance to accommodate slightly higher performance on high-end hardware. IMO, this moderation is a good thing, since it keeps the kernel and associated libraries maintainable over the long run, and it allows Linux to run on things like embedded systems, as well as mainframes, with a minimum of performance penalties on any given platform type.
BTW, Larry McVoy (one of Linus' right-hand men) gave a great talk at the CLIQ in Denver about this very issue. Here are the slides [bitmover.com] from that talk.
Reality check (Score:4)
No matter what side it comes from, FUD is still FUD.
--
Re:Wow! Pro-Linux FUD! (Score:2)
Here is a difference between Direct3D and OpenGL. Say 5 new cards support a new feature (say cubic environment mapping). Now, the philosophy behind Direct3D is to get feedback from developers on what features they are putting in, and support as many of those as possible. Thus, cubic environment mapping is already supported in Direct3D. Since the card makers don't want to leave out Quake, they write 5 different extensions for cubic environment mapping. This is again due to the fact that OpenGL does everything by committee (and a slow one at that). So, in Direct3D's case, all cards that support a feature will accelerate any game that uses that feature. In OpenGL's case, many new features (new as in less than 1 or 2 years old) will go unused until a standard extension is made for them. It is not only a feature difference, but a difference in the way the API is built.
And Direct3D does support features that haven't been IMPLEMENTED yet. This is another design goal of DirectX (in general): MS puts features into the API that no cards support yet. Now games can (usually) use those features because they will be emulated (it's actually slightly more complex than that, but I don't want to get into Direct3D programming). When card manufacturers DO implement these features in hardware, there will be a number of games that already take advantage of them. So, a card manufacturer can put in cubic environment mapping, a feature nobody else has, and games that use it will automatically be accelerated, even though no cards supported that feature when the game was written. This is another major difference between OpenGL and Direct3D: OpenGL doesn't get extensions for features that nobody has yet implemented. However, the Direct3D approach is a very good thing, because if you spend your money on a card with this new feature, the games you already have will automatically use it.
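For contrast, here is what the per-vendor extension situation described above looks like from the application side with classic OpenGL. This is a minimal fragment, not a full program: it assumes a GL context is already current (glGetString needs one), the two vendor extension names are made up to match the hypothetical vertex-tweening example, and a careful implementation would match whole space-separated tokens instead of using strstr.

    #include <cstring>
    #include <GL/gl.h>

    // Classic OpenGL extension check, as an application has to do it when a
    // feature only exists as vendor extensions. Assumes a current GL context.

    static bool hasExtension(const char* name) {
        const GLubyte* all = glGetString(GL_EXTENSIONS);
        // Quick-and-sloppy substring check, common in code of this era.
        return all != 0 &&
               std::strstr(reinterpret_cast<const char*>(all), name) != 0;
    }

    void setupVertexTweening() {
        if (hasExtension("GL_MY_vertex_tweening")) {
            // ...code path for vendor A's proprietary extension...
        } else if (hasExtension("GL_ATI_vertex_tweening")) {
            // ...same feature, different entry points, for vendor B...
        } else {
            // ...fall back to doing the tweening on the CPU...
        }
    }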
Re:BeOS not great. (Score:2)
On my home PC, the thing just didn't cooperate at all. It didn't pick up the Network card, (RealTek
809), there are no drivers for it, and it messed up the graphics quite badly. The mouse cursor looked
like a multicoloured block. I have an NVIDIA TNT2.
>>>>>>
The graphics card shouldn't be a problem. I've got a TNT and it works fine. Your experience is a bit unusual though. In my case I have a Riva TNT, an EtherFast100TX and an EtherPCI II network card, and an AWE64 soundcard. Everything was detected the first time around; I just had to put in the settings for the network, download BeNat from BeBits, and I was off. From install to network NAT server/desktop machine was about 20 minutes.
And my sound card didn't work either. I guess
that can be fixed, but I can't connect to the network to download the driver...beh.
>>>>>>
Again, your experience is a bit unusual. Usually, though, any problems you have you can ask about on the beusertalk mailing list.
Anyway, when I tested the thing on my work PC, I didn't find it useful for the "tasks" that BeOS
claims it should be good at, at all. Sure, I managed to play 10 MP3's at once, but contrary to popular
belief, it DID slow the system down.
>>>>>
What kind of system do you have? I can play 10 MP3s without even hitting 100% on the processor. Which is a little unusual, though; 4.5 used to get up to 12 or 15 without pegging the processor. The trick is to not start them all at the same time, but one at a time. If you just highlight 10 MP3s and hit Open, Soundplayer goes crazy trying to load them all. And watch this: start up 24 MP3s all at the same time. Under Windows or Linux, it's time to reboot. In BeOS, however, the system slows down, but is still usable to the point where you can easily open up some application to close the MP3s. Right now, I'm running 10 MP3s, and I can still browse the 'net fine.
When I tried to play a video, a 600mb MPG from a CD (which
works under Windows), it didn't open it for some reason. It just refused to open large MPG files.
>>>>>>
Are they Sorenson encoded? Also, the built-in media player has troubles with MPEG files. Look on BeBits for a new one that's much smoother. Also, I have a CD with several large (10-25 meg) MPEG files, and I can open 5 or 6 of them without hitting 100% on the processor.
Another area where BeOS falls over is management. Sure it's got Telnet and SSH has been ported,
but why the heck? I mean, you can't manage the thing remotely at all. The FTP servers and other
servers I downloaded relied heavily on the GUI to operate. That's pretty useless.
>>>>>>>
The BeOS wasn't designed as a server OS. Still, I don't see what the management problem is. BeOS has Apache, and you can telnet in and go to
Also, it doesn't seem to have a decent browser. NetPositive was fast, but couldn't do 80% of the pages on the Net properly. I downloaded Opera and when I went to a Java-enabled site, it crashed the system - yes, crashed it. The version was 3.6 (I think). The system slowed down completely at first, but I did manage to bring up a window and kill the process. However, even though the Opera processes were killed, the system was still too slow to use and I had to reboot it.
>>>>>>>>
Yes, NetPositive is weak. As for Opera, I think the version you used was 3.6-beta7. It's still a beta, and one that was abandoned. Still, I can use it fine without any problems. (Except Java sites, mainly because BeOS doesn't yet have a Java VM! I wouldn't be surprised if they didn't bother to protect against a crash if the user loaded a Java site and the OS didn't have a VM, since this was a beta version.) However, Be is working on bringing Java 2 and Opera 4.0 to BeOS. Promises? Maybe, just like widespread OpenGL on Linux?
It seems like the best thing about BeOS is the GNU bash, and we all know that's from the FSF and
can be found on many other OS's. BeOS fans like be-fan over here talk about great things, but deliver
very little. Overall, and I'm not trying to put Be down - Linux or FreeBSD are better choices for the
desktop.
>>>>>>
I have a triple-boot of Slackware 7.1, BeOS 5, and Windows NT 4. I use BeOS maybe 75% of the time. I reboot into Windows to use Visual Studio, program DirectX, and use my 3D and imaging applications. I reboot into Linux to fiddle around trying to install ALSA, recompile the kernel, and compile the latest build of KDE2 (since it is only available as RPMs and source). Right now, I'm waiting for a new version of kernel 2.4-test because 2.4-test6 seems to have broken the NVIDIA driver. Oh yeah, this is a GREAT desktop OS.
I'm not just saying that as a user who's only used Windows, Linux and FreeBSD, but as one
who's used BeOS as well.
>>>>>>>>
Well, that's your opinion. From my POV, I think Windows NT 4.0 is probably a better desktop OS than Linux, and certainly is the best PC OS available if you're doing graphics. BeOS is a much nicer overall-use OS than either. And BeOS probably also appeals to a broader range of people. If you're the hardcore CLI user, BeOS has Bash, POSIX, Perl, Python, Apache, and all those wonderful text-mode programs you've come to know and loathe (not to mention application scripting through hey). If you're a Mac-type GUI user, BeOS is super easy, and everything can be done from the GUI. If you're a Linux/X user, you'll be happy with how well the GUI and the CLI mesh together.
BeOS has potential, but Be need to sort out the instability, the lack of applications, (including a
good Browser - I know you can get Mozilla for it, but you need to compile it yourself unless you want
to use M7), and their management issues, which, for me, are the biggest issues.
>>>>>>>>>>>
BeOS is far from unstable. I just don't see your situation that often. Sure, you had a bad experience with BeOS; I'm sure a lot of people have had the same with Linux. As for me, and a lot of other BeOS users, the OS is fast, stable, innovative, and works great. It just seems that the "average BeOS experience" is closer to mine than yours.
Linux Myth Dispeller Redux (Score:2)
***********************************************
Jon Tillman
LINUX USER: #141163
ICQ: 4015362
http://www.eruditum.org
jon@eruditum.org
***********************************************
Help Jon build a network!
Looking for giveaway computers & parts
Current Need: Tape Drive & PI/PII processors
Email me to find out how you can help
***********************************************
Mythology and Reality (Score:5)
For example, you can't just say "Multitasking under Windows 95 is partially preemptive." True, 16 bit apps run in a shared memory space and the GDI isn't fully reentrant, but a statement like that is just flamebait.
Also, the statement "Hardware is often ignored by other operating systems. On the other hand, Linux takes advantage of all the hardware it can." is ridiculous.
--
Re:'nother myth; not ready for the desktop (Score:2)
Preinstalled Linux isn't exactly the same thing, because Linux is harder to configure: text files versus GUI checkboxes most of the time.
It may be marginally easier to enter, for example, your ISP's DNS server IP address into a Win9x dialog box than it is to place it into a text file in /etc, but the real barrier is that it takes a hundred times more work on the user's part than either of those to learn what the phrase "DNS server IP address" means.
As Peter keeps saying and saying, the average home-user - Hell, the average office user too - seems to be incapable of configuring any OS, Linux, Windows 9x or you name it, with either text files or GUI checkboxes. Better documentation wouldn't help, because home users refuse to read computer documentation - in fact, half the time, they throw it all away the very first thing. Seriously. I know people who have bought new computers just because their Win9x systems have come down, as they so often seem to do, with bit-rot or registry leprosy or whatever you want to call it, so they don't boot right any more. Users like that never will learn how to do any configuration, GUI- or text-based.
For example, how hard is it to set up a modem and a "Dial-up Networking" icon in Windows9x, GUI and all? If you are the computer nerd in a small business, then you already know the answer: way too damn hard for the average home user! That's the reason that it was important to the point of federal lawsuit for AOL to have their AOL icon pre-installed on Win9x boxes; the presumption being that if users had to run the deep and abstruse AOL "setup.exe" program off the CD instead of having the icon already present front and center when the user turns on his new PC, AOL would probably lose half of their potential customers.
So if configuring anything on a PC is so difficult for home users, then how does it ever get done? For example, assuming that an ISP connection is not already preinstalled by the PC vendor, how do home users ever set one up? Well, either a.) some patient cubicle-slave at a help desk at AOL, EarthLink, etc., a human being talking over a 1930s-technology telephone, walks the home user, step by step, through the process of clicking all those checkboxes, or b.) some friend who has some notion what an IP address is comes over to the house and does it for the home user, or c.) the home user brings his box into work and has the office nerd do it all for him. And when the office nerd tries to explain what he's done and how to change all those little clickboxes or whatever in case the home user needs to switch to a different ISP or the ISP changes its DNS address, the home user turns away with his eyes glazed right over. I know; until last month I was that nerd. I've had guys turn glassy and spacey and dial out on their cell phones right in the middle of my explanation, which is quite offensively rude, I think. I can't tell you how much hardware I have installed thanks to my esoteric knowledge of hi-tek procedures such as putting the floppy disc in and double-clicking a:\setup.exe. If you can do this, and if on top of that you are 1337 enough to sniff around on driver CDs for a README.TXT, then these people refer to you as a "guru."
The point to the above being that at least eighty percent of users don't ever "climb the curve" at all, any more than they ever set the time-of-day clocks on their VCRs and coffee makers. For them, if you preinstall Linux, it is pretty much the same as if you preinstall Windows9x; except, of course, if they call Mindspring for help connecting or HP for help plugging in their new printer, and they say they're using Linux, then the tech support guy is likely to say "We don't support that OS.".
Yours WDK - WKiernan@concentric.net
hrm... (Score:2)
However, there are a lot of quips in there that make the pages read like a "hey, fuck you M$" style website. This is destined to make many write it off as anti-FUD FUD, so I would urge readers to take this one with a grain of salt. All in all, however, a very interesting read.
FluX
After 16 years, MTV has finally completed its deevolution into the shiny things network
Re:Wow! Pro-Linux FUD! (Score:5)
You forgot to add: multi-threaded apps are better for some things.
There are two ways of multi-threading an app: with threads that share everything (VM, open fds, cwd) but the stack, or with threads that share nothing but an area of shared memory. Under most OSes, the second is much slower than the first, which is why there's been this push towards multi-threading in the first way. Both Solaris and NT have this problem, and when MS's interests coincide with Sun's, people just get the impression that that is "the way to go".
OTOH, if you look at Linux (and FreeBSD, and even Plan 9), both kinds of threading work well, with very fast context switches. Under Linux, processes switch almost as fast as threads. So you can choose between the two ways of multi-threading, not by "which is faster", but by "which is more appropriate to the app".
Programming with several execution contexts sharing a single VM space is tricky. You need very careful locking, to prevent one thread's data updates from stomping on another's. On SMP machines, you get accesses to the same physical memory from all the CPUs, which means more bus traffic needed to maintain cache coherence. Multi-threaded, shared-memory programming is only worth it if your app calls for it, i.e if your app has a lot of shared state that all the threads will be working on.
Multi-threading with a separate VM space is considerably easier: each thread (or process -- processes are just one kind of thread) runs fully protected from interference, and you can always set up a shared memory zone for whatever shared data is needed. Each processor is mostly working on separate areas of memory, so bus contention is lower. The downside being that this kind of app tends to use more memory than the former.
Finally, single-threaded, event-driven programming should not be counted out. It turns out to be the most appropriate, for a surprisingly large number of problems. In some cases, you're better off running several copies of a single-threaded server (say, one per CPU), than a multi-threaded one.
IMNSHO, the worst thing that has come out of both Java and NT is the unthinking assumption that all programs should be multi-threaded, with a shared memory space, and that all servers should necessarily use one thread per connection. Yes, there are many cases where this is good, but there are also many cases where other solutions are just as good in performance, and much simpler and more maintainable in programming. Just say no to Sun's and MS's thread hype!
Re:'nother myth; not ready for the desktop (Score:4)
And your comment brings up a serious question here. The freedom you get with Linux is great, there's no denying that. But can you sell Linux to the average user on the fact that it's free (as in speech)?
I don't honestly think that the average user would even care if Linux came with source code or not. Think about it here for a second. What good is the source code to someone who doesn't have a programming background? Sure, the code is neat to poke through, and may give you an inkling of an idea about how the system runs, but what good is it to the average user? We have to remember that not every Linux user is a programmer.
You may be able to sell Linux to the masses based on the fact that it's free as in beer, but if you try selling it based on the fact that it's free as in speech, people will get confused and say, "So what?"
--
'nother myth; not ready for the desktop (Score:5)
A friend of mine gave an old laptop to a buddy of his who was in need of a computer. The buddy's previous experience with computers was limited to double-clicking on a Prodigy icon on his dad's computer several years ago. "The computer is free," my friend said, "on the condition that you keep the Red Hat 6.2 that I've installed on there."
At first he wasn't sure if he'd made a mistake imposing that condition on his gift, as his phone was ringing off the hook ("hey, how do I...?"). But then, after a while, the phone stopped ringing. When the two of them eventually met up again, my friend was left slack-jawed as his buddy was talking about joining one of the LUGs he'd seen online after getting the internal modem working.
So you see, the point of this convoluted little story is that Linux *is* ready for the desktop. Everything is new to everyone at some point, so there's no reason that you wouldn't be able to stick a brand new Linux box in front of someone who's never used a computer and tell them, "hey, this is what an OS is supposed to look like, okay?". But see, that's what Microsoft has managed to do with their billions of dollars for people all over the world. They've said, "this is an OS. This is what an OS does. If an OS doesn't do this, then it's *difficult*. If an OS doesn't do this, then it's not ready for the desktop."
But that's just crap. Everyone I've seen can and does learn how this OS works. You've got to get over your preconceived notions of what an OS is and go and find out for yourself. People are willing to learn. I've seen it.
-Peter
Linux installs & desktops - multiple anecdotes (Score:2)
The Linux side (RH6.1) took a couple of minutes and noted that the mouse had moved, and a couple of other things. After that, everything was fine.
Windows, on the other hand, took over half an hour and a handful of reboots, after which it was STILL having trouble. It was a couple of days later that I had all the pieces of the Windows side patched back together.
My first foray into Linux occurred because I was handed a Windows laptop that ate DAYS of my time trying to get it to work with a simple PCMCIA ether/modem card. I got to the point where an elaborate ritual was needed every time I put the box to sleep. After installing a few patches, I could put the machine to sleep, but it crashed every time I tried to shut it down(!).
I installed RH5.1 on the laptop and spent the evening hunting down appropriate drivers. This process was FAR easier than reloading Windows. Once installed, Linux was FAR more stable than Windows. I later upgraded to 5.2.
My roommate at that time was a Windows geek. He loved Windows. He thought it was the best thing since sliced bread.
He spent 6 months as an MS-Windows install expert. Every once in a while, he'd come home with frustration all over his face over an install that was simply NOT working. As someone who specialized in MS-Windows installs, he would sometimes spend a whole day trying to beat a machine's install into submission.
When a new roommate moved in (a complete non-technophile), we started on Windows, and I weaned him over to Linux. This was mostly for my own sanity, since it was far easier to give him his own login than to f*ck around with the Win95 users kludge. It wasn't long before he was glorying in how usable and stable Linux is. I think he almost forgot that the computer even ran Windows. (I created a 'win95' command that allowed him to automagically flip over to Windows. Beyond when I showed it to him, I don't think he ever used it.)
My newest roommates are also relative computer neophytes. I gave them logins, installed the RealAudio extensions, and let them loose. The hardest part was getting them started over the phone (I gave them nasty passwords); I got one running with a text editor over the phone. Since then, I haven't gotten any complaints.
In a recent job, we installed dozens of Linux boxes of various configurations. Other than driver hunts for esoteric hardware, installation either went like a breeze, or the problem was traced to bad hardware. (Firewalls and VPNs were a different story.) A recent addition to our group was such an MS groupie that he helped write a book about Win 2000. He actually complained when it looked like we were going to force him to keep Windows on his desktop. He solved the problem by installing VMware.
A different group in the same company was responsible for NT/95 installations. When their chief installer wanted to install Linux, we gave him a spare install CD and didn't worry about it. It was actually that easy. He still complains about NT/95 installations.
SUMMARY
Windows installs are a pain, Linux installs are a breeze, Linux stability makes for user happiness. The only way that MS can get away with even claiming that Windows installs are easy is that they have people like my first roommate who pulled his hair out so that customers could be handed a nice, clean, working machine. As long as I know that I've got the apps available I'd rather hand someone a Linux box than a Windows box -- especially if I'm going to have to support it later.
Re:Wow! Pro-Linux FUD! (Score:2)
There's a ton of things in this document I object to. For instance, under the multitasking section ("4.1 Linux multitasks only as well as Windows or Mac"), they say "Microsoft and Apple would have you believe that their operating systems multitask (run more than one program at once). Using the term loosely, they do. Using the term strictly, they task-switch only."
Later, he goes on to say that "Microsoft Windows and MacOS started as CMT systems, and have gradually moved towards a PMT model, but still retain some of their CMT roots. Most notably, their graphical infrastructures retain structures that require cooperative/sequential behaviour." I love that phrase, Graphical Infrastructure. What a load of BS. What he neglected to mention is that all the windows for the OS itself are controlled by the same process. When the process is busy, you may have problems moving its windows around, but you should still be able to control winamp.
On a multitasking sidenote: Under Windows 2000, you can now do things like format a floppy and drag a window at the same time. Apparently they finally figured out that you don't have to stop everything on the system to read/write a floppy.
I also like this particular statement a lot: (Under 4.2 Linux crashes frequently) "Programs can never crash the system under Linux, because of the way it's built with things like memory protection, instruction monitoring, and other devices built into any true kernel."
Well, bullsh*t. I've run programs and had linux crash more than once. Where do you draw the line between the application and the OS? Both linux and windows use a C library (In linux's case it's libc, like the other unices. On windows it's MSVCRT??.DLL) which, if buggy, can take down the kernel. Granted, linux is more stable than NT or 2k in my experience, but it's not like I've never run a user-space program (Nearly everything is user-space, of course) which took down linux.
Too bad we don't live in a perfect world.
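For what it's worth, here is a tiny illustration (mine, not the FAQ's) of what the memory-protection claim actually buys you: a user-space program that scribbles through a NULL pointer gets killed with SIGSEGV while the kernel and everything else on the box keep running. It doesn't settle the argument about where the app ends and the OS begins, but it shows what 4.2 is getting at.

/* Illustrative only: a wild pointer write in user space kills this process
 * with SIGSEGV; the kernel and other processes carry on.  It says nothing
 * about bugs in the kernel itself, which is where real crashes come from. */
#include <stdio.h>

int main(void)
{
    int *p = NULL;
    printf("about to scribble on address zero...\n");
    fflush(stdout);
    *p = 42;              /* the MMU traps this; the kernel delivers SIGSEGV */
    printf("never reached\n");
    return 0;
}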
And finally, my very favorite entry in chapter four: 4.10 Linux is fragmenting.
And I quote:
This is nonsense. There are now more linux distributions than there are linux users. While there exists a specification for the ideal shape of the directory hierarchy, most distributions follow it only loosely, or not at all. Red Hat, Slackware, and Debian all disagree as to how things should be done, with most of the distribution-packaging companies falling in behind one or another of them. It is getting better. It's still not converged.
Now, technically, linux is a kernel, and redhat is an operating system, but linux (the kernel) is not very useful by itself in most applications, so you must look at the distribution as a whole. Linux also refers to the mass of all distributions. This is the basis of my disagreement with this guy.
In the end, though, this article of his seems to be predominantly opinion garnished with the occasional fact. He does not cite sources, and so will end up looking just as silly as Micro$haft, except for one thing: he doesn't have a big name. If he intended this to be a rebuttal to M$, then he went wide. If you don't have a big name, you have to look credible, and this doesn't.
Re:Wow! Pro-Linux FUD! (Score:2)
I have to partly agree to this one, as it took me a few DAYS of fiddling with the XFree86Setup horror under Xfree 3.3.6 to get my monitor to work.
However, there is no reason to believe that things aren't getting better. When I upgraded to Xfree 4.01 there was no need anymore to fiddle with the scan
frequencies in a poorly documented text file.
>>>>>>
Undoubtedly things are getting better (though in the case of XFree86, configuration actually got WORSE from 3.3.6 to 4.0), but it is nowhere near Windows yet. XFree86 is a cinch to install. However, try to install ALSA with an ISA soundcard, or configure networking for multiple ethernet cards (which most distros don't handle: Mandrake 7 does it, RedHat 6.1 and Slackware don't), get NAT working, install new drivers, etc. All of these are significantly harder in Linux than in Windows.
2) Linux multi-tasking.
You said that the windows multitasking feels smoother, though I personally doubt most people can tell the difference between 20 msecs and 50 msecs.
What I am more concerned about is overall system stability which is abysmal under Windows 9x and not quite perfect under Windows NT, while Linux
systems are known to go on for years without crashing.
>>>>>>>>>>>>>>
Have you USED Windows NT? (4.0, not W2K) People CAN tell the difference between 20ms and 50ms (and whatever other system things keep NT multi-tasking smoother). They might not be able to say "oh, this is 20, and this is 50," but they can tell, "hey, switching between apps feels smoother." With NT, I can be running a ridiculous load (3D renderer, image editor, IDE, MP3s, etc.) and still switch between apps quickly and easily. More importantly, there is little redraw and the system doesn't FEEL loaded. However, Linux to me feels much more like Windows 98 than Windows NT. Apps just take longer to start up, longer to switch to, and the system actually FEELS loaded under heavy load. It's a perception thing, and perception matters. In the early days of BeOS (when the OS was very immature), Be used to impress people with how fast the UI was. Back then they used tricks like really small quanta to make the OS "feel" fast. (Of course, now that BeOS has matured, it really IS fast, without UI tricks.)
3) Linux IS too huge
Heh, I doubt Windows 2000 is much better. I agree that X is a bit of a mess and so is Mozilla, but KDE or Gnome along with the Linux kernel and modules
don't seem to use more than 10 MBs or so of RAM on my machine. This leaves room for X, Netscape, and a few other things to run quite comfortably on a 64
MB system.
>>>>>>
That's the problem. Windows 2K ISN'T much better. NT4 is a LOT better. At startup, my NT4 machine was using around 18 megs. After initializing both KDE2 and the GNOME libraries, my Linux machine is pushing the high 30s. NT4 + IE + two simple apps takes up a LOT less RAM than KDE + GNOME + X + Mozilla + Linux + two simple apps (one KDE, one GNOME).
4) Linux IS playing catch up
It may be true that Linux is in the process of implementing some functionality that has been lacking, but at least it's not quite hanging off the same ancestral
x86 MSDOS code like the "consumer" Windows versions do.
>>>>>
What's your point? Linux is playing catch up to both commercial UNIXes and Windows.
5) Other OS kernels do NOT load everything at the same time.
I have nothing against Windows' dynamically linked libraries model, except when those DLL files royally screw up your system by being replaced and
corrupted by random programs.
>>>>>>>>
Again, what's your point? DLL files are almost exactly like Linux dynamic libraries (.so files), except Windows inanely installs them all in one directory. My point is that the FAQ makes it seem like Windows is this big monolithic OS that has all drivers compiled in and loads everything into non-pageable kernel memory whether or not it is needed. That's simply not true.
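To underline the parallel, here's a quick sketch (illustrative only, not something the FAQ shows) of run-time loading of a shared library on Linux with dlopen()/dlsym(), which is essentially what LoadLibrary()/GetProcAddress() do on Windows. The choice of libm and the cos symbol is just a convenient example; link with -ldl.

/* Illustrative sketch: run-time loading of a .so, the Linux counterpart
 * of LoadLibrary()/GetProcAddress().  Compile with: gcc demo.c -ldl */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("libm.so.6", RTLD_LAZY);   /* load the library  */
    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    /* look up a symbol by name, just as GetProcAddress() would */
    double (*cosine)(double) = (double (*)(double))dlsym(handle, "cos");
    if (!cosine) {
        fprintf(stderr, "dlsym: %s\n", dlerror());
        return 1;
    }

    printf("cos(0) = %f\n", cosine(0.0));
    dlclose(handle);
    return 0;
}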
6) Linux DOESN'T take full advantage of hardware.
Again, there might be a little truth in this, but the main problem here is closed standards. For example, I am forced to run on an alpha version of a
reverse-engineered CLM driver for my winmodem, which happens to be slow and a bit buggy. Yet because the driver was open source, I was able to go in and
work out an annoying bug myself (which locked up the whole system on a modem retrain) instead of having to pay $50 to M$ support just for the privilege of
dialing their number.
>>>>>>
What does OSS have to do with anything? I stated that, because of the design of DirectX, Windows apps take advantage of more hardware features than Linux apps.
7) Linux threads aren't all they are cracked up to be.
I personally can't argue much here as I am not an expert in the Linux vs. Windows vs. BeOS thread architecture...
8) Linux really isn't that fast, depending on what you do.
I agree with you that X is a speed bottleneck, but that is because of its client server model. Also as I said before, my Internet experience with Linux has been
somewhat slower, but probably because of the driver I use. My machine feels just as fast or faster for pretty much anything else.
>>>>>>>>
X isn't slower because of the client-server model; X is slower because it is bloated and not designed with a high-powered client in mind. BeOS uses a client-server model as well (in fact, theoretically BeOS's display should be slower due to the use of messaging), but its display is quite fast. GDI is decent (faster than X), but only because it is a DLL written in ASM and planted in kernel space.
9) The Linux desktop IS clunky
I wouldn't exactly call it clunky, a little less intuitive perhaps. Overall it offers the same functionality, and much more (i.e. multiple virtual desktops) over
the standard Windows desktop. I am sure that a few years (or months) of evolution can fix that.
>>>>>>>>>>>
Never said that the Windows interface WASN'T clunky. The Linux desktops do tend to be slightly clunkier than the Windows one (due to less use of context help, right-click menus and the like), but both pale in comparison to the Mac GUI (and GUIs that take a lot of cues from the Mac).
Also, you keep saying that these are problems a few more months of evolution will fix. That may be true. However, just because something will be true in the future doesn't mean it's true now.
Re:Wow! Pro-Linux FUD! (Score:2)
Re:Wow! Pro-Linux FUD! (Score:2)
Win95 is not truly pre-emptive (Score:2)
Therefore you can run a whole lot of 32-bit apps and it looks pre-emptive. But run a single 16-bit application from Win 3.1 days in the mix, and the illusion evaporates.
Cheers,
Ben
Re:BeOS not great. (Score:2)