Linux in the Enterprise: Fact vs. FUD

Kelly McNeill writes "When I first started touting Linux as a soon-to-be superior alternative to Microsoft Windows, almost no one at my company had even heard of the product. Nearly two years later, it's difficult to find a computer magazine that does not extol the virtues of Linux. However, these praises are often laced with caveats: that Linux is a "server OS", that it's difficult for novices, that it's "not ready for the desktop". To some extent these concerns are simply due to a reasonable fear of the unknown."
  • If my wife can figure out how to get MacOS -> Linux on our iMac, it's set.

  • by drwiii ( 434 )
    I've always thought of Linux as "the desktop" and BSD as "the server". To each their own, I suppose. Not a flame, just an observation. (:


  • Four years ago, I tried installing linux on my machine. The install was unhelpful, and I finally gave it up as a bad job, not having enough time to make it work.

    With Redhat and others producing Linux for the market, there is support and competition to produce a user-friendly product. When I get ready to install Linux on my new machine (the old one being too kludgy & loaded to bother with), I expect the install & operation to go relatively smoothly. Maybe not as smoothly as a WinNT install, but I haven't done dozens of them either.

    Mike Eckardt
  • by jd ( 1658 ) <> on Friday November 12, 1999 @08:51AM (#1538047) Homepage Journal
    A lot of non-techies, when asked about Linux, aren't even aware it has a GUI. I think many would be pleasantly surprised by X, when using KDE, Enlightenment or XFCE.

    Also, many believe there are no GUI tools for doing anything (for much the same reason). Again, there are many, as anyone who browses Freshmeat (or even the menu options of many X11 window managers) knows.

    Many believe it doesn't support current hardware, unaware of just how much Linux 2.2 and 2.3 support.

    Many believe that Linux doesn't have any software. Star Office, Applixware and KOffice all testify otherwise. But people won't know about these, if they never hear of them!

    This, I think, is the key to it. People have no real, reliable information on Linux. There are no ads on TV, no ads in the papers or the magazines, no posters in the major stores, no demo machines in the windows... These are major sources of information for a lot of people, and Linux doesn't have any of them. Instead, people only hear how Microsoft doesn't make it, how Microsoft doesn't write anything for it, how MS Office won't run on it, etc. All they hear is the negative.

    If you get told often enough that the glass is half-empty, you will never see that it's also half-full.

  • When I first took my current job, I suppressed the urge to push Linux at my boss, because I figured that the cream rises to the top. He told me before I got here that I could run Linux on my desktop, which I do. He let me know earlier this week that he had ordered, for us, a Dell PowerEdge 1300 with 2 processors & RAID & had ordered Linux to go along with it!
    If you build it, and it is quality, they will come.
  • I started out learning Linux about 6 months ago with a RedHat 5.2 installation, dual-booting with Windows NT. I have a P200MMX box that ran NT like a tank. Slow, *relatively* stable (stability is always relative - at least my box wasn't being rebooted 4 times a day like my roomie's). But every once in a while, my NT box will go "balls up" on me, and NT bluescreens are a lot harder to recover from than standard 9x bluescreens. I was blessed with living on a floor with a few geeks on it, and 2 of these geeks were Linux users who showed me the door. They helped me with an install, with recompiling my kernel, etc. Within 2 months I found myself killing the old Windoze distro, and getting a bigger hard drive to mount my root partition on. I still have a dual-boot machine, but the only time I boot into Windoze is to install software to run under WINE.

    Since I've switched almost fulltime to Linux, I've learned far more about my computer than any compsci class can teach me. I'll never have to hear the words "Our software is not designed for that", or "wait for the upgrade". I run a stripped-down version of Win98 on my laptop, but only out of necessity, because there is hardware that my Linux box doesn't support (parallel port scanners, etc., plus stuff that won't run under WINE, such as chem programs).

    As far as getting new users onto my box, the biggest hurdle is the login prompt and adaptation to the GUI. Adaptation to the GUI is pretty easy if I select a non-fancy theme that looks and feels close enough to Windows. And, once that's done, about the only other thing that users miss from a Windows system is a blue screen 8-) I think in order for Linux to succeed in the desktop arena, it should have a standard, stock GUI that's easy to learn. For the newbies who have never seen a M$ box, Linux is not a bad choice, once the box is all configured and ready to go.

    With environments such as GNOME and KDE, it's quite possible that a person can do all their work without ever dropping to a terminal/shell. Having worked in tech support, a standard GUI helps. I have enough trouble getting people to right-click on "My Computer" already; could you imagine supporting the myriad of possible themes? Not saying that the configurability is a bad thing, but corporations should have standards to make the IT staff's life easier. -=- SiKnight
  • by Anonymous Coward
    Haven't we seen this topic on slashdot 50000 times???????????????????? Why don't we just dig up the old archives rather than rehashing this shit once a week?
  • by Anonymous Coward
    As a computer geek I love that Linux is open source and hackable and stable etc etc. But I see stories like this on Slashdot all the time where a Linux advocate is trying to persuade a non-geek friend to switch to Linux. It makes me wonder, how could an architect, artist, musician, or anyone else who isn't a computer expert be expected to use Linux? I have a computer science degree and 15 years of computer experience and I could barely figure out how to configure Red Hat. It took me hours to figure out how to get it to read a floppy and I never did figure out how to change the screen resolution without recompiling code, etc etc. It was immediately obvious that Linux is meant to be a server platform for people who have a good deal of computer experience, not a desktop platform for everyday users.

    And this isn't something that is going to be fixed by slapping a few GUI fronts on some of the configuration files. Linux is not designed for non-geeks and it's silly to keep on pushing it as a desktop OS in its current form. Most of the people who hack Linux and many of the people who advocate it fiercely don't even understand Joe-average-user enough to understand why Linux is so hard for him to use.

    As a computer programmer, there is a lot about Linux that I love. But I also know that designing software that is easy to use by anyone is one of the most difficult and important goals of designing good software, and this is an area where Linux doesn't even attempt to make an effort. And why should it? It makes a damn nice server for people who know what they're doing. Leave it at that and stop harassing those poor dumb Windows users. They have enough problems.

    Ok, you can start telling me how wrong I am now...

  • I've been using Slackware as my desktop OS at work and at home for about 4 years. It especially makes sense at work, where I open a bunch of xterms to Sun servers. Most everyone else struggles with Exceed's X implementation on Windo$e. I never could get it configured right. With Linux, it works right out of the "box".

    BTW, a lot of M$ zealots make claims like "Linux is command line based, doesn't have a stable GUI, etc, etc" I guess they don't know that the first X Window System release was in 1984, about the same time as M$ Windoze version 1.0.
  • by Anonymous Coward
    Captain Kirk said that he only runs Windows, even though Spock said that Linux was the logical choice, and all Bones could say was "Damnit Jim, I am a Doctor, not an operating system".
  • I absolutely agree! Until linux has something like "/Program Files" (universal place for user programs), "Start->Settings->Control Panel" (universal place for drivers/configs - linuxconf is getting there for config stuff), and a good, easy, standard way to install *and remove* software, it can't be considered a user OS.

    My partner (non-techie) will run Linux because she has an admin on hand to config the machine. My folks won't because they don't. It's that simple.

    Folks, we've gotten to the point where we have someone driving the OS - and we've arrived. We have a good OS now. We need someone to drive the user experience. I hope that RedHat will be able to do that for us.
  • I'll get flamed for this, but...

    I love kde, and the gnome and enlightenment combo. And the tools like netcfg and such that linux provides are useful... but people seem to be missing that you still need to know what's going on below the surface. I *seriously* f#cked up my system when python bombed netcfg a while back. And I didn't know what files were being touched. I eventually had to torch the box and reinstall. Kinda like NT. And I sorta knew what I was doing... I can't imagine my parents being able to cope.

    Really, for the desktop, an OS needs to be built from the ground up around the user experience. I *love* the Unices, but user interface has *always* been a secondary consideration. And it shows.

    Just my humble opinion, however. :)
  • that it's difficult for novices

    It is difficult in particular, but not just, for novices. Given the dominance of Windows and the commitment of the hardware vendors to this platform, a novice still has to worry about hardware compatibility issues for Linux, while this is largely a non-issue for Win98.

    I don't think I'm a novice, but after three weekends I'm still trying to figure out how to get Linux onto my sexy new Sony Vaio N505X. It's an all singing, all USB and ilink, single PCMCIA slot laptop, and I can't for the life of me figure out how to plug and play the modem, the video controls only work in Windoze etc, etc. Even softboot from Win to Linux is more difficult than I thought. (But next weekend I'll get it working - I think...)

    Enough about my woes: the point is that as a (novice) user you have to worry about compatibility issues. People who are setting up servers are paid to worry about these issues - and presumably skilled - but my mom just wants to read e-mail. Linux is not a choice for her (yet!).

  • by The Wing Lover ( 106357 ) <> on Friday November 12, 1999 @09:07AM (#1538059) Homepage

    I think we're forgetting something. A large portion of us have been using computers since we were 5 or 6. To us, everything seems obvious -- how to install new software, the difference between root and the rest of the users, why sometimes we have to use the keyboard to do things instead of the mouse...

    But Linux simply isn't ready for non-computer-geeks to be using all the time. It's probably okay for smart non-computer-geeks, as long as they have a bit of support once in a while. But it's still not ready for Aunt Helga who wants to check her email once in a while and run a word processor.

    - Drew

  • by tuffy ( 10202 ) on Friday November 12, 1999 @09:09AM (#1538060) Homepage Journal
    The article says it explicitly: "somebody else set this up for me and I never have to touch it again because it never breaks" is the mentality at work. Nobody expects Joe User to set up a Linux box, but the author does believe Joe User can use one fine once it's set up properly - much like an average user can use a properly installed Windows.

    And if a Linux GUI is the user's first experience, there won't be any Windows training to undo. It seems reasonable to me...

  • I don't think I'd ever recommend Linux as a server OS, actually. While you all may call me a BSD bigot or whatever, I've never seen a Linux server withstand the pressures of the serving environments I was in. You can all say that I needed to tweak it more, and I can counter with, "ok, show me the *SINGLE* tuning point that I can use to increase performance/capacity like MAXUSERS with *BSD". I've never had anyone come back with a reasonable response.

    I've never seen Linux maintain a reasonable uptime while being a heavily loaded server. I've never seen Linux's file system handle a crash well. I still don't think it's secure enough for me to want to deploy it anywhere other than on a desktop.
  • I spent most of yesterday trying to get a Slowlaris machine to run with two NIC cards. (OK, full disclosure: I had one of my staff doing it while I surfed Slashdot and offered helpful manager-type suggestions like "Did you try adding a route?").

    It was a pain. In the end, I don't even know why we got it to work -- it was one of those "fiddle with it until something works" kind of things. There were no GUI tools and the help was lousy. Sometimes, when we did a netstat -r, it would hang for 5-10 minutes. WHY?!

    If you showed a novice the trials we went through compared to the ease with which you can accomplish the same thing using a simple Linux GUI (or even CLI tools that worked), they'd guess that Linux was the expensive commercial operating system.

    We would have been better off installing Sparc Linux over Slowlaris and getting some real work done. I'm totally serious.

    I have an equally low opinion of HPUX. Nowadays, when I'm faced with the prospect of using a commercial OS (and not just NT), I cringe.
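
    For reference, a sketch of the incantation Solaris of that vintage wanted for a second interface. The hme1 device name, addresses, and hostname below are made up for illustration; yours may be le1 or qfe0, and all of this runs from a root shell:

```shell
# Bring the second interface into existence and configure it (Solaris 2.x style):
ifconfig hme1 plumb
ifconfig hme1 192.168.2.1 netmask 255.255.255.0 up

# Make it persistent across reboots; the name in /etc/hostname.hme1
# must resolve to the interface's address via /etc/hosts:
echo "myhost-nic2" > /etc/hostname.hme1

# And a route, if the box keeps forgetting it (old route syntax with a metric):
route add default 192.168.2.254 1
```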

  • by |DaBuzz| ( 33869 ) on Friday November 12, 1999 @09:10AM (#1538063)
    Microsoft also claims that Linux has no journaling file system, ignoring the fact that SGI's XFS is a journaling file system (10). They also ignore the fact that NT 4.0 itself lacks a journaling file system!! (11)

    #11 Try this: go to, select Search, and search the Microsoft web site for NT Journaling File System. You'll get three hits, and the first of these in order of relevance is the "Linux Myths" page! One is a false hit in that it simply links to the "Linux Myths" page, and the third is the Server Operating Systems Newsflash, Volume 5, Issue 40, that quotes from the "Linux Myths" page.

    A search for the exact phrase "NT Journaling File System" gives ZERO hits while a match on all words gives 62 hits.

    In both cases, footnote #11 is completely incorrect no matter how you search.

    It's this sort of thing which makes the analysis no better than MS's Linux myths page.

    FUD by any other person is still FUD.

  • The fear of the unknown is precisely why RedHat, LinuxCare and other future support companies will be making money; certainly the IBM global service arm is not complaining. Let's face it, for the non-cognoscenti, computers are complex, difficult and temperamental (and that's just the installation :-) ). You, as the resident Linux expert, get paid for reducing risks of the IT budget being flushed down the toilet (correct me if I'm wrong, but I believe the track record is that 50% of major IT projects are a complete disaster and have to be scrapped). If IT is not their strength, it makes sense for companies to outsource the operation of their infrastructure to others, much like we just pay for water and electricity without worrying about the piping and dams. Given this model, the logical conclusion is that Linux would be the preferred choice, as less of the profit disappears into vendors' coffers. Expect their app-hosting efforts to redouble once Linux starts taking big chunks of their developer/desktop market.

    The internet does change things in that it refocuses efforts on the services and thus reduces hardware to supporting roles and software to enabling agents. As I've been telling people at this end, the cost is in the infrastructure but the value is in the services. Once the hype of e-commerce dies down, then you might be able to objectively measure the value-cost proposition and work out what needs to be done in redesigning corporations around the flow of information, much like old factories needed to be freed from the constraints of steam-driven belts and pulleys.

  • Why do we constantly insist on oversimplifying the "Linux" package? Marketing reasons from the commercial sector? Whichever reason, here are the facts:

    The entity commonly referred to as "Linux" is based on a kernel called Linux, which does all the low-level dirty work in the OS. On top of Linux we've got the basic GNU tools, and several free-and-stable programs that can make a computer running the operating system a hell of a server. Sysadmins like that.

    But, GNU tools are optional. Add-ons. So is the server software.

    And, the X Window System is optional. You can choose from a variety of X servers, or none at all. You can also use a completely different window system. You can use different GUI-based end-user applications. Each one is unique, and completely optional. They might kick ass, and they may suck. But if they do suck, QUIT BLAMING LINUX AS A WHOLE. This is mainly directed to the thousands of clue-deficient software reviewers and evangelists out there who couldn't tell the difference between a daemon and a watermelon.

    Too many local "Linux experts" who have seen my computer, running Enlightenment in X, say "Hey, my Linux doesn't look anything like that." And most of the naysayers to the Linux movement have less of a clue than that. Frankly, I'm starting to care less. Don't get me started on how many people out there think Linux is a new company in silicon valley.

    FUD is just the end result of ignorance and laziness on the part of members of the press trying to make a quick buck. The same press that keeps discovering "new" technologies like E-mail, multi-gazillion dollar net startups like (for all your lemonade needs!), and internet cell phones.

    I'm not saying that Linux should be touted as complex, but that there are millions of parts to the commonly referred-to whole of Linux. Denying the individuality of different projects commonly included in each Linux distribution is akin to denying that your car has four separate tires, each one of which can be made by a different company or have different characteristics.
  • Well, I think you should try Debian then. Debian has a package called menu which creates a very nice version of the /Program Files concept. Under icewm (which looks convincingly enough like Windows 95), just hit the button where Start would be. All programs that have a nice GUI interface are under categorized menus and are easily launchable. This can give even a newbie a chance to use the programs at first glance. And given that xdm starts up at boot, a user never even has to look at a command line again after the initial install. Now, this is not to say that a person will never need to use the command line, but for most "common" tasks this will work. The commands dpkg -i and dpkg -r work great for upgrading software.
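    For anyone who hasn't seen them, the dpkg commands mentioned above look like this (the package name and .deb filename are placeholders):

```shell
# Install (or upgrade in place) from a downloaded .deb file:
dpkg -i somepackage_1.0-1.deb

# Remove the package but keep its configuration files:
dpkg -r somepackage

# Remove it completely, configuration files included:
dpkg --purge somepackage

# See what's installed:
dpkg -l | grep somepackage
```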
  • It seems to me that there is a growing trend in the computer industry: to sell your trade journal and attract attention to your publication (whether it be print or HTML), mention LINUX. But only mention it as a server, and be sure to scare any users away from using it as a desktop replacement.

    I just want to give some thoughts on the matter.

    I am not going to say that Windows is the best OS possible. I won't even dream of it. However, I do have to say that there is some comfort in familiarity. If you take someone that has never used Linux before and plop them in front of Linux running KDE or GNOME (or whatever you are using for an X GUI), they will be able to do most of the GUI-related functions without problems. Launching apps, browsing the file system, changing the background and all that. However... that is where the similarity begins to end. Trying to explain how the file system or the directory structure works to someone that is DOS-based can be quite a challenge. Basic things like the use of the slash are opposite. DIR is replaced by LS. The GUI isn't as "clean" or "neat" as Windows 95/98/NT4 to most people. It is different enough to have people who are used to Windows pine for what they had previously, because there is a shift in perception that needs to occur... and most people are not willing to put the effort into making that change. Most users who use Windows don't even know that it is just a pretty DOS, nor do they care. They can point and click, and that does what esoteric commands like "copy *.* c:\temp" do (said with tremendous sarcasm).

    On the other hand, most (proficient) Windows users are steeped in DOS history. Something that has caused heartache all around the computing world, because this fear of removing DOS from our everyday life has left us with Windows 95/98, an amalgamation of old DOS and new "32bit" code which, honestly, does not work efficiently (or correctly at times, as people on this group frequently point out). If only MS could have stepped away from DOS (kind of like OS/2 was going to) and created something new that worked better than just patching DOS up to a "usable" "32bit" level. I know most people here are not NT fans, but it is at least a step in the right direction for MS, an attempt to remove DOS from the day-to-day lives we all lead. All they have to do is mask what the "OS" is under the GUI of NT and you can convert all of the people who are relying on DOS to become "deDOSified".

    Thankfully, Linux is a nice opportunity to help us rid ourselves of good ol' DOS and force people into changing their thoughts and mindsets on how things ought to be. However, I have to say that Linux won't make it to the average person's desktop until you can mask what is underneath the GUI in such a manner that they don't have to deal with the underpinnings of the OS to make it run like they are used to.

    Of course, making Linux look and operate like Windows would take all the fun out of playing with it now....but if you want to reach the masses here is one of the many ways you can bring Linux to the desktop.....


  • I've worked in an environment where I tried very hard to get people to start using Linux, and even among technical people there's more resistance than you think, and some of it probably isn't Linux... it's just human nature to resist change, good or bad. Linux will have a significant curve to climb just here.

    But, let's be realistic: Joe User cares zero about administration, or even backups in most cases. They should, but that's just not the case. Linux is very much geared towards power-freak, gadget-head techies, and that's why we love it so much!

    This isn't a bad thing though! What we need is an idiot-friendly version of Linux that installs from Windows with 2 clicks. Something that makes RedHat look technical. An installer that can automatically detect common partition configurations, make Linux a home automatically, and install away! Hell, I'd even like that.

    But it doesn't stop there. You need to have a distribution that is 100% GUI oriented. No complicated user-add procedures - and adduser myname is too complicated. Just boot into E or KDE or whatever, run a web browser, and have a WHOLE $HITLOAD of GUI applications available in the start menu, with lots of eye candy.

    Gnome and KDE are coming a long way towards this goal, but we're a few years off. Everyone working on their own little piece will bring us this goal - are you listening, Corel/Redhat/Debian?


  • Then why not have the M$ penalty phase include a financial penalty that goes towards advertising OSS?

    You won't get the eyeballs if they don't know where to look...

    Your Working Boy,
  • "Microsoft claims that existing Linux GUIs are cumbersome and difficult to use. In fact, my /mother/ sat down and began using the KDE desktop with no training, no prior experience, and not one single problem." Big deal. My mother uses VMS every day. But seriously, KDE is only one of several GUIs available for Linux. Redhat is the most dominant Linux distro and pushes GNOME. And most Linux apps today are either command-line or designed with only the X Window System in mind. The only way Linux is going to be "easy to use" is if everyone decides to stick with one desktop with UI guidelines, so that you know that when you press control-C you are copying. Or was that alt-C :) And that's never going to happen. It's not what Linux is about.
  • XFS is certainly not yet supported under Linux, and probably won't be for a while (though reiserfs, another journaling filesystem, is due in the 2.4 kernels and is currently available as a quite-stable patch).

    Also, the 4 gig file size limit is _not_ filesystem dependent. The 4 gig limit is in the VFS layer, which _all_ filesystems use. The VFS layer on 64-bit platforms supports 64-bit file sizes, though.
  • "Microsoft also claims that Linux has no journaling file system, ignoring the fact that the SGI's XFS is a journaling file system. They also ignore the fact that NT 4.0 itself lacks a journaling file system!!"

    Can anyone clarify the issue of journaling file systems? I took the author's advice and searched for "ntfs journaling file system". A new article popped up (November 1999 Technet) where the following claims are made:

    "NTFS is a journaling file system with fast file recovery. Journaling file systems are based on the transaction processing concepts found in database theory. Internally, it more resembles a relational database than a traditional file system. It is comparable in function to the Veritas file system found on some UNIX implementations." mplete/windows/winnt/winntas/technote/ntunixvw.htm

    Also, is there a version of XFS that will build on Linux?
    claims the following:
    "The code in this directory is original IRIX-XFS xfs_log* code which has not yet been ported to work in Linux. It is intended for viewing, not compiling."

    Hopefully, we aren't replying to FUD with FUD...

    BTW, I'm typing this from my Linux desktop that is also acting as a file server, and a router -- only 22 days of uptime due to a power outage...
  • Actually, HAL was somewhat able to function correctly. The ability to have AI in an operating system would be a very significant advance. I cannot at this time see anything like that being developed for Windows; not even remotely. Basically, if that were said, it would mean that W2k is more advanced than Linux. I realize that perhaps it was an unstable AI, but never concede any point to the enemy if you are to win in debate. If your position is important to you, just stick to bare facts. A great deal of mishaps in this world are directly related to unintended consequences.
  • "Also, Microsoft, when charting throughput of Internet Information Server vs. Linux+Apache carefully refrains from mentioning that it would take at least 5 incoming T1 lines attached to your Linux server before this scalability becomes a factor. How many "common customers" have 5 dedicated T1 lines feeding into a 4-processor server? I'm not sure I know of any."

    It is very common now for people to say that a given Linux configuration could easily saturate a T-1 (or multiple T-1s). For some reason this is still an acceptable benchmark. In a day when people have 10mbit cable connections and 1.5mbit DSL connections at home, the bar for a server should be whether it can saturate a T-3 or above. This T-1 reference nonsense is antiquated and has to stop.
  • FUD = Fear, Uncertainty, and Doubt

    New, hip, generic term for intentional misinformation.

  • by ~k.lee ( 36552 ) on Friday November 12, 1999 @09:26AM (#1538080) Homepage
    I don't mean this to be a flame, but when are we going to stop posting links to osOpinion? The general quality of osOpinion articles seems to be very low: I have yet to read a single piece that does not contain vague handwaving generalizations and even factual errors. For example, in the above piece by Dave Leigh, he writes:

    1. Microsoft also claims that Linux has no journaling file system, ignoring the fact that the SGI's XFS is a journaling file system. They also ignore the fact that NT 4.0 itself lacks a journaling file system!!

      First of all, wasn't there a thread a couple of weeks ago in which we discussed the journalling abilities of NTFS? Second, XFS has not been released for Linux yet. Third, there is a journalling filesystem for Linux, but it's not XFS: it's ReiserFS.
    2. I've got Doom, Quake, and other multiple player games for entertainment (although I'm personally a board game fan).

      This is just silly. Game support under Linux is extremely sparse right now. In a world where even the Macintosh doesn't get ports of the most popular games (witness the recent Half-Life debacle), we'd be really foolish to claim that Linux has enough games for the average home consumer.
    3. Thirdly, even if other Unixes were cannibalized, what would it matter? Linux would remain, and the point I made in the above paragraph works in reverse. Those Unix developers that now exist will move to Linux with no effort, and there will be no discernible effect in the workplace.

      Clearly, this was written by somebody who doesn't know much about Unix. Linux is like Unix, and the transition would likely be easier for commercial Unix developers to make, but it's hardly going to be a transparent, effortless transition.
    4. An entertaining footnote (#40): Again, the Gartner Group plays tug-of-war with themselves. The same short report recognizes that SCO and SGI are competitors and supporters of Linux, but the Gartner Group never bothers to answer the question as so why this may be the case. Clearly, the study in question is severely flawed and displays a shocking lack of understanding.

      No, clearly Dave Leigh displays a shocking lack of understanding about the technology industry, where relationships of simultaneous competition and support are incredibly common. Sun, for example, supports Linux by releasing StarOffice under the GPL; on the other hand, it would be entirely happy cannibalizing the Linux market to grow Solaris/Java if it could. In fact, most astute observers believe this is exactly where Sun wants to lead us.
    And the above list is just a quick sampling of Leigh's errors and misunderstandings. The mistakes are all the more annoying since they appear to be direct regurgitations of things that have been repeated countless times by the less-informed zealots [0] here on /.

    If I want to hear things like this, I'll read an old /. thread with my threshold down to 0. There's no good reason to link it as an article. Most osOpinion articles seem pretty much the same: they may be flattering to Linux, but they don't elevate the level of discourse, and they don't belong here.


    [0] As opposed to the well-informed zealots, who are (unfortunately) all too rare.
  • I guess they don't know that the first X Window System release was in 1984

    I thought that the GUI was invented on UNIX by Xerox. Or was I just wrong? Details, anyone?
  • Ok, I don't mean to be critical, but have you tried a simple task "a third grader in his first BASIC class" could do? Just fire up gdb, or use strace and run apprunner, and see specifically where and when the code is crashing on your machine. Perhaps run the code from a command line and see if it outputs any error messages. On the Windows version of the code you can see specifically where the code fails if you just run the thing from a DOS prompt. Yes, I admit that Mozilla is a bit bloated, but that is all optional bloat anyway. I take it you cannot improve the code with the arguments given thus far. Now, I will admit that I have not contributed to the Mozilla project, but at least take an active part in helping them with some material. I see comments like these as nihilistic and nonproductive without valuable data backing them up.
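    A sketch of what that looks like in practice (apprunner is the Mozilla launcher the parent mentions; the trace filename is a placeholder):

```shell
# Log every system call, following forked children, then read the tail
# of the log to see what the program was doing when it died:
strace -f -o apprunner.trace ./apprunner
tail -50 apprunner.trace

# Or catch the crash live in gdb and ask for a backtrace:
gdb ./apprunner
# (gdb) run
# (gdb) bt
```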
  • It would be nice if people who knew what they were talking about would write these articles. This guy is way out of his league, and his ignorance of both Linux and the "Enterprise" computing space shines through like a cutting laser through Glad[tm] wrap.

    To wit:

    Also, Microsoft, when charting throughput of Internet Information Server vs. Linux+Apache carefully refrains from mentioning that it would take at least 5 incoming T1 lines attached to your Linux server before this scalability becomes a factor. How many "common customers" have 5 dedicated T1 lines feeding into a 4-processor server? I'm not sure I know of any.

    Perhaps he has never heard of Dell? Or Barnes & Noble? Those are two large IIS installations that I'm sure are using at least "5 incoming T-1 lines". Which is not to say that there aren't Linux installations of the same scope (although they escape my mind at the moment). The point is, saying that the Mindcraft benchmark is totally meaningless because most people don't need to scale that high is tantamount to saying, "Well, ummm, we do better on the low-end," which is true, but I wouldn't advertise it.

    While the largest swap file size is 128 MB, you can mount as many as you need. However, most users do not use swap files at all; they use the more stable swap partition, and this is not limited in size.

    This is flat-out WRONG. Swap partitions, at least in 2.0.x, could only be 128 MB. Yes, you could have multiple swap partitions.

    Anyway... it just irks me, since this kind of sloppy advocacy just makes us all look like a bunch of idiots.

  • My wife uses a Linux box that I maintain for her. She doesn't know what to do when fsck bails on a boot check or how to install software, nor does she have the root password. I've had to walk her through some things over the phone. I added mount floppy/unmount floppy commands to her command menu so that she doesn't have to use a CLI. (I'm about to add a rm ~/.netscape/lock command to her menu :^P)

    She does have a reliable system which she can use to do the stuff that she does: Netscape mail, Netscape for browsing, Applix for WP and a bunch of gnome card and strategy games. She can print her email and she's generally happy.

    The problems are in reading MS files. Last night, Applix couldn't read an RTF file from Word 97 that she needed to print. Dust off the laptop that runs Win95, boot it up, plug in the printer (install the fscking printer driver) and print an 8-page document.

    She is continually amazed when I telnet into the box and fix something broken without getting her up from her seat or even getting myself off the couch.

    My wife, I think I'll keep her.
  • I smell a job for a non-profit society. I for one would chip in 10-20 bucks to see an ad in the middle of Futurama.

  • > I spent most of yesterday trying to get a Slowlaris machine to run with two NIC cards.

    Uh, sorry, it's easy. It's also easy with Linux, assuming you have two NICs that _can_ play together; determining this is the hard part. It's also ridiculously easy with IRIX: I have one machine with two ethernets and one FDDI, and it was PAINFULLY EASY to do.

  • about the only other thing that users miss from a Windows system is a blue screen 8-)

    Both KDE and GNOME have a screensaver that emulates all sorts of crash screens from different operating systems, including Win9x and WinNT.

    "I already have all the latest software."
  • OK, a couple of minor technical notes:

    Swap files and partitions are NOT boundless, but they are no longer bound to 128M. That limit was done away with a while back. I believe the current limit, on an i386, is 2G. I've tested that; it didn't care much for more than that. Of course, I wasn't using the BigMem patch (64G physical :)).

    Security: it's plug 'n' play. If you don't like ext2's security you can always switch to a Kerberos/AFS or Coda security system. That will give you much finer-grained control. You would still need a traditional filesystem (or devfs) for /dev. But with that type of setup you would only need to have administrators in /etc/passwd; thus only they could access stuff in /dev once you remove other access. - kimo_sabe --- Free your software, and your ass will follow
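    A back-of-the-envelope check on where the old 128 MB per-area figure came from (assuming, from memory, that the 2.0.x kernel tracked each swap area with a single one-page bitmap, one bit per page):

    ```python
    PAGE_SIZE = 4096                         # bytes per page on i386
    bitmap_bits = PAGE_SIZE * 8              # one page of bitmap = 32768 bits
    max_swap_area = bitmap_bits * PAGE_SIZE  # each bit covers one page

    print(max_swap_area // (1024 * 1024))    # -> 128 (MB per swap area)
    ```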
  • The window GUI was invented by Xerox PARC in the 1970's. It was Smalltalk based, but I don't know what the OS was. Maybe their Smalltalk was also the OS. I'm pretty sure it wasn't Unix.

    BTW, they also invented Ethernet, object-oriented programming, and who knows what else (the mouse they borrowed from Engelbart at SRI). Talk about missed opportunities!
  • Having run an NT shop with five co-workers, we quickly learned that MS's search returns variable results -- a list of returns one day might be a completely different set, in a completely different order. It can be as bad as the time I looked up something, and told someone across the room what to search for, and he got totally different results - less than 90 seconds after I did my search. I did another search, and got the same results that he did -- by reloading the page.

    In summary, the bad results for #11 are at worst mere egg on the author's face, and probably an honest mistake. In no way do I see that as an example of FUD.


  • Now, anyone who reads my comments knows I'm about as anti-MS as anyone, but...

    Where the hell does this writer come off saying that NT 4.0 lacks a journaling file system?

    NTFS has had journaling since 1993, as far as I know.

    OTOH, It fragments very badly, but so does ext2.

    Interested in XFMail? New XFMail home page []

  • Minor nitpick:

    Windows 1.0 wasn't released until 1985, to the best of my knowledge.


  • by Parity ( 12797 ) on Friday November 12, 1999 @09:48AM (#1538105)

    But Linux simply isn't ready for non-computer-geeks to be using all the time. It's probably okay for smart non-computer-geeks, as long as they have a bit of support once in a while. But it's still not ready for Aunt Helga who wants to check her email once in a while and run a word processor.

    Actually, I think that's -exactly- who it's ready for, if she can get it preinstalled. If Aunt Helga has a preinstalled Netscape and WordPerfect and either has KPPP set up or has step-by-step instructions like those handed out by ISPs for setting up Windows... she's all set. The system won't crash, won't get viruses, won't re-mail worms to her friends via Outlook... Okay, okay, I'm spreading FUD against MS now, I'm bad.

    I think who it's -not- ready for is non-geeks who want to do a lot of advanced stuff. It's when you start doing Advanced Stuff(TM) that you start needing the command line. It's also true that Linux -doesn't- have all of the software that Windows does, and the more esoteric the application, the more likely that Linux doesn't have it. (Though we have some pretty esoteric stuff.) The print-separations advantage of Photoshop over The Gimp comes to mind, and I don't think we have a professional CAD program yet.
    But a friendly word processing/web/e-mail environment? Sure. No prob.

  • Hrmmm, I've never had a problem with SAM setting up two NICs. Of course, I've never had a problem with SAM in general. SMIT is another story, though. ;)
    "We hope you find fun and laughter in the new millenium" - Top half of fastfood gamepiece
  • by rjh3 ( 99390 ) on Friday November 12, 1999 @09:57AM (#1538114)
    To be fair, I can and do use Linux as my primary desktop. And a mix of Linux and Solaris on production servers. But my "business desktop" remains NT. Because:

    1.) Corporate file format requirements. I must be able to read, write, and modify documents that are shared activities. These documents are in MS Word, MS Excel, MS PowerPoint, Visio, and PDF formats. I know and sometimes use StarOffice, but it cannot modify MS documents without loss of formatting. There is no Visio. There is only a PDF reader, not a full-function PDF creator.

    2.) Corporate communications requirements. I am required to have dial-in and LAN access to Lotus Notes. There is no Linux client support yet.

    3.) Hardware variation support. Linux cannot support some of the highly integrated devices found in laptops and low-cost PC's. In my particular case it is a laptop. I don't get to pick the model. I have to take what the corporation provides. It is good quality, but has Linux problems. In general Linux support trails hardware availability by 6-12 months.

    You can point in each case to a truthful "we are working on it". But working on it is not the same as available and robust today. These are reasons why I anticipate that within a few years Linux (and probably also *BSD) will be viable on the desktop. But viable in a few years is not the same as viable now.

    Other people will have other particular problems, but the general categories of mandated file formats, mandated corporate communications, and hardware variations will keep coming up.
  • "correct me if I'm wrong but I believe the track record is 50% of major IT projects are a complete disaster and have to be scrapped"

    It's closer to 80%. See AntiPatterns [] for more info...
  • There is a (new, still unfinished) website dedicated to debunking FUD by countering it with true, 100% provable facts.

    Visit the FUD-counter site at: []

    The project is still new and we could use a few volunteers to help us out...

  • The Wing Lover flew in with:
    But Linux simply isn't ready for non-computer-geeks to be using all the time. It's probably okay for smart non-computer-geeks, as long as they have a bit of support once in a while. But it's still not ready for Aunt Helga who wants to check her email once in a while and run a word processor.

    Coulda fooled me. The SO is about as antitechnological as it gets (a can of paint is close to the limit; the kitchen is way too Buck Rogers) and the kids are -- well, kids. Oddly enough, though, it doesn't seem to slow them down any, and they seem to take it for granted that they can do Stupid Net Tricks but not access each others' data (which they used to do for mischief.)

    Now all I have to do is keep the daughter from finding out that I could let her have some of the messaging S/W that she wants but that it doesn't work because I have it blocked at the firewall....
  • An advantage of Linux is that if the customer is on the 'net, you can (if they give the authority) telnet/ssh into their box and, and start a remote X session that shows on -your- desktop. If you're logged into their user account, you'll see exactly the desktop they have.

    I know that -I- wouldn't want to let tech support log into -my- account or root on my box... but, I'm a techie and perfectly capable of fixing my own box. I think the ordinary user would, in most cases anyway, be willing to compromise their privacy in exchange for tech support being able to just go in and -fix- it instead of those tedious phone conversations. "Click Control Panel... Click Gizmo-Driver... Select the 'Advanced Settings' Panel. Please read me the values from top to bottom... "

    Anyway, my entire point being that customizable does not necessarily mean less supportable.
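    What that remote-support session looks like in practice, sketched with ssh (host and account names are placeholders; plain telnet plus xhost works too, just less safely):

    ```shell
    # From the support desk, log into the customer's box with
    # X11 forwarding turned on:
    ssh -X helga@customer-box.example.com

    # Anything started now renders on the support tech's display,
    # showing exactly the desktop the user has:
    netscape &
    ```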

  • Agreed. KDE is surprisingly easy to use.


    1) Both KOffice and StarOffice feel clunky to me (may be a matter of opinion). I'm waiting for the next StarOffice.
    2) No standard and reliable installation and de-installation methods à la Windows' uninstall menus. You can argue for RPM, but it's not exactly easy for a beginner. We need to be able to click on an executable, have it install like InstallShield, and be able to uninstall it from a centralized menu, which brings us to:
    3) wmconf; it's a good idea, but it needs a lot of work. A standard menu format that actually works 100% of the time (and is automatically added to) is a necessity.
    4) Lots of little applications that cater to the end user (not just techie toys). A good example is that application called CompuPic, recently released for Linux. It caters to the end user, it works great, and it has an intuitive interface (probably because they stole it from ACDSee -- or was it the other way around). Also applications such as Dreamweaver, QuickBooks, Quicken and golf games (hey, old people like them).
    5) Games! We need current games to draw people to the platform. Hopefully XFree86 4.0 will be all that it promises.
    6) Video/audio codecs and some types of hardware. DVD, full QuickTime with every codec we'll ever need, a reliable RealPlayer, all the various AVI and MPEG codecs, and final non-beta Flash and Shockwave players. These things in particular keep me bound to the Windows world (well, besides the games).

    Linux (and *BSD) are going to get there soon enough. I wonder how many Red Hat releases we are away from where the user won't have to worry about ever looking at a command line (unless of course they want to), will boot directly into X, and will have a reliable and standard (not that non-standards will necessarily hurt) way of getting things done (or not done, in the case of video games).

    I don't see the problem with having commercial closed software on the platform. It's not like it's stopping open source alternatives from developing (though it may the other way around). People have to make money some way -- and quality software can't always be done where the only revenues are added services and support.

    Of course, that's just my opinion, and I may be wrong.
  • Here we have someone talking about NT, Linux and the Enterprise, who obviously knows very little about either NT or the Enterprise.

    "How many "common customers" use 4-way NT boxes? Very few, in my experience"

    We are talking about the Enterprise. We are talking about 1000+ users on systems - in these circumstances such servers would be common. Just because lots of Linux people work with single CPU linux boxen in small companies doesn't mean that multi CPU machines are at all uncommon in larger companies.

    "(Linux supports many file systems6; NT supports far fewer). Among the file systems Linux supports is SGI's XFS, recently released to Open Source, with a max file size of nearly one million terabytes7. "

    I was not aware that XFS was part of linux - perhaps it has been rolled into the latest kernel version. Or perhaps we are counting third party file systems that can be used with each OS. XFS is brand new to Linux, and I am aware of very few applications that make use of it - maybe Oracle 8 does?? NTFS has been around for years, and is well supported.

    ". Also, Windows NT clustering is limited to failover ONLY. Linux is capable of distributed clustering ("Beowulf" technology 12), which can enhance system performance dramatically. "

    I'm not at all sure I see the relevance of Beowulf clusters in the Enterprise. We are talking about large corporate IT systems, not scientific type systems.

    And do you _really_ believe that Linux failover clustering is as well tested as NT's? And have you administered both kinds of cluster? Or are you in fact merely reiterating a TurboLinux press release?

    ". While your support options for Windows are limited, your support options for Linux are not"

    I see. So you are discounting the many many 3rd party Windows support operations? Are you really saying that HP's windows support is no good? Or that the many large resellers have no idea what they are doing? Are you saying that ICL doesn't support Windows when it uses it in projects?
    There are far, far more people able to support NT than Linux, especially when 'support' means support of large, complex developments, rather than simply supporting a distribution, or providing general Unix Q and A style help.

    "Although you can purchase local support for Microsoft products, such support is strictly limited to training and workarounds. "

    This is utterly untrue.

    ". Microsoft Windows support is simply not in the same league.

    Rubbish. Microsoft may be no good at supporting Windows, but there are plenty of 3rd parties who are.
  • Simple: you didn't have a route set up to your DNS server (or possibly had a route set up that was broken somehow; I forget which causes DNS lookups to hang instead of just break), so "netstat -r" blocked waiting to do a reverse DNS lookup on something like your gateway. On Linux use "netstat -rn" to avoid reverse lookups; it's probably the same for Solaris.
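    Why `-n` makes the hang go away, as a small sketch (the function name is made up; the real netstat does this per routing-table entry):

    ```python
    import socket

    def format_hop(ip, numeric=True):
        """Render one routing-table address, netstat-style.

        numeric=True is what "netstat -rn" does: print the dotted quad
        as-is, never touching the network.  numeric=False attempts a
        reverse DNS lookup, which blocks until the resolver answers or
        times out -- exactly the hang described above.
        """
        if numeric:
            return ip
        try:
            return socket.gethostbyaddr(ip)[0]
        except OSError:
            return ip

    print(format_hop("10.0.0.1"))   # returns immediately: 10.0.0.1
    ```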
  • by Parity ( 12797 ) on Friday November 12, 1999 @10:19AM (#1538136)
    There are a few things that can be meant by a journaling filesystem. What NTFS has is -not- what xfs or reiserfs have.

    What people mean by a journaling filesystem in this context is a filesystem that has a scheme whereby changes are written to the journal, then, in idle moments, marked 'in progress' in the journal, written to the filesystem, and then marked 'done' in the journal.

    With this scheme, if you go down in mid-write, you simply scan the journal for the 'in progress' notation and redo the write. Ta-da, stable filesystem. You -can- lose data, if a write doesn't get into the journal, of course, but you won't get filesystem damage. As a result there is virtually no fsck time on reboot.

    Take an SGI/xfs machine and a Windows NTFS machine. Start them doing some stuff, and then pull the plugs. Now reboot. NTFS needs to run chkdsk, because it is not a true journaling FS. SGI checks its journal, and is up and running in no time.

    I expect true journaling in NTFS-2K. If it isn't there, well... then MS will lose the server market completely in no time.
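    The write-ahead scheme described above, as a toy sketch (purely illustrative; real journaling filesystems like xfs do this at the block layer with careful write ordering):

    ```python
    class JournaledStore:
        """Toy key-value 'filesystem' with a write-ahead journal."""

        def __init__(self):
            self.journal = []   # intent log: list of dicts
            self.disk = {}      # the 'real' on-disk state

        def write(self, key, value):
            entry = {"state": "logged", "key": key, "value": value}
            self.journal.append(entry)      # 1. record the intent first
            entry["state"] = "in progress"  # 2. mark before touching disk
            self.disk[key] = value          # 3. the actual write
            entry["state"] = "done"         # 4. commit

        def recover(self):
            # After a crash: replay anything not marked done.  No full
            # scan of the whole disk (fsck/chkdsk) is needed.
            for entry in self.journal:
                if entry["state"] != "done":
                    self.disk[entry["key"]] = entry["value"]
                    entry["state"] = "done"

    # Simulate pulling the plug between steps 2 and 3:
    store = JournaledStore()
    store.write("a", 1)
    store.journal.append({"state": "in progress", "key": "b", "value": 2})
    store.recover()
    print(store.disk)   # {'a': 1, 'b': 2} -- the interrupted write is redone
    ```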

  • Actually, the article and MS both have it wrong. I have seen Linux perform well on 100Mbit ethernet. That would saturate a T-3 or two.

    What I want to know is what sort of crazy setup would have 4 NICs feeding into the same segment? (That's >8 T-3s, BTW.) The reason that configuration was chosen is because Linux doesn't scale well to 4 NICs. Had they chosen to use a single Gig ethernet card instead, the benchmarks would look different.
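    The line-rate arithmetic, for reference (nominal figures: a T-1 is 1.544 Mbit/s, a T-3 is 44.736 Mbit/s):

    ```python
    T1 = 1.544             # Mbit/s
    T3 = 44.736            # Mbit/s
    FAST_ETHERNET = 100.0  # Mbit/s

    print(FAST_ETHERNET / T3)      # ~2.2: one saturated 100Mbit NIC is "a T-3 or two"
    print(4 * FAST_ETHERNET / T3)  # ~8.9: four NICs really is >8 T-3s
    print(5 * T1)                  # ~7.7 Mbit/s: the article's "5 incoming T1 lines"
    ```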

  • My comments here were based on the article, which is a comparison between FUD-slingers and the Linux community. I understand that there are many developers who code for the sheer joy of coding, but your arguments are flawed. Many of the applications being created by OSS developers are simply functional remakes of products that they miss from Windows, or that they liked in Windows and wish to see in Linux.

    I really did not mean to imply that we are "aiming our missiles" at MS. I meant it more to be like this:

    You say I cannot run a mile in under 6 minutes. I practice and practice until I can not only run a mile in under 6 minutes, but can do it in under 5, just to prove I can. There is no better feeling than accomplishing something that others tell you is impossible.

    Remember, when Linus wrote the original kernel, he was only able to do it because he knew that writing an OS was supposed to be impossible. Guess he proved everyone wrong. =]
  • The wise AC said: "Since when has malicious intent been a prerequisite of FUD or ignorance an exemption from it?"

    Thank you, that was my point... inaccurate statements due to malice or ignorance have the same effect on an unsuspecting sysadmin who's trying to sort out all the "myths" being spewed out these days.

    What makes this inaccuracy glaring is the arrogance with which it is presented, similar to the arrogance many anti-MS folks often cite.
  • I think who it's -not- ready for is non-geeks who want to do a lot of advanced stuff.

    Nothing is ready for non-experts who want to do a lot of advanced stuff. This is pretty much the definition of "experts": "Those who can do advanced stuff". And what is a geek, other than an expert computer user?

    Unless you are an expert driver, can you do a bootlegger reverse? No. Unless you are an expert pianist, can you rip through "The Flight of the Bumblebee"? No. Unless you are an expert accountant, can you do your company's taxes? No.

    Unless you are a geek, can you do advanced stuff with a computer? No.

    Linux is not ready for non-geeks who want to do advanced stuff. No OS is ready for non-geeks who want to do advanced stuff. No OS will ever be ready for non-geeks who want to do advanced stuff. As systems get easier to use, the definition of "advanced stuff" will change to match.

  • An embarrassingly bad article from someone who shows little experience of either Enterprises or NT.

    "How many "common customers" use 4-way NT boxes? Very few, in my experience. "

    What little experience you have, then. Such hardware is commonplace in the Enterprise.

    "Understand that while Linux is not the correct choice for every server application it's becoming increasingly hard to find an application for which it's not the best fit. "

    Well, all applications that want to access storage faster than SCSI. All the ones that require efficient LAN speeds >100Mb/s. All the ones that require not only a journaled fs, but one that is supported by applications. All the ones that require hot-swappable CPUs. All the ones that require more than 4 displays. All the ones that require more than 4 CPUs. I'm not saying NT does all this either - but your statement made it look as though Linux was overtaking Solaris. I think not.

    "While your support options for Windows are limited, your support options for Linux are not. "

    I see. So you are saying that there are fewer support options for Windows? Companies such as ICL, HP, Compaq don't count? All the integrators such as Logica don't count? Resellers don't count?

    FAR more companies support NT than support Linux. More importantly, they support large complex, customised rollouts and systems, not just a particular distro or general unix q and a. Cygnus is a proper support company. There are many, excellent companies like Cygnus in the NT world.

    "Although you can purchase local support for Microsoft products, such support is strictly limited to training and workarounds. "

    This is simply untrue. Call it FUD or a lie.

    "As for the availability of applications, let me simply tell you about my own experience. I've used Linux on the desktop in my home exclusively for the past two years. "

    Right. I thought we were trying to get beyond personal anecdotes, and that we were talking about the Enterprise, not your home. Enterprise desktops have requirements as different from home user desktops as you can imagine. If you had worked in the IT department of an Enterprise on desktop builds, you'd know that.

    "For business use, the major general purpose tool Linux lacks at the moment is a Lotus Notes client. "

    And Office. And Remedy. What's Remedy? Ah, it's the fault-management and order-handling system used in many, many enterprises. Enterprises, remember? What about front ends to SAP and Baan? What about call-center software? What about Pro/E? What about Oracle Financials and OSM? Ah.

    "Now PCs ship with no language at all (unless MS Office shipped with your PC, in which case VBA sort of counts). You have to buy a language on your own, which takes desire and money... lots of it"

    Right. So I have to buy Perl for Windows? And I have to buy the djgpp C compiler for windows? And I have to buy lisp for Windows too, eh?

    Less of the FUD please.

    I think that's plenty of criticism for one evening. This article discredited Linux.

  • netstat -rn. It was probably trying to look up a host; -n will prevent it from doing so. Yes, I agree with you that using such a system requires knowledge of the OS. I've never had problems with such things, but I enjoy tinkering with the OS. However, once you know where everything is, it's not so hard. Editing text files and certain commands isn't hard. It's a combination of a lack of experience and intuitiveness. Actually, if you learn something like Solaris, BSD or Linux in detail, you can often translate that knowledge over to the other OS, providing you learn the nuances and differences, including where the files are stored. In other words, the learning curve gets less steep after you learn your first Unix-like OS.

    Oh yeah. This message was written in Netscape in X, running wmaker + kpanel and kfm, in an X-Win32 XDMCP window off a server in the server room (which also serves 20 other users concurrently) on my Windows 98 desktop :)
  • Now, anyone who reads my comments knows I'm about as anti-MS as anyone, but...

    (just as a side note, it's kinda sad that people feel they have to make this sort of a disclaimer before going up against the

    Where the hell does this writer come off saying that NT 4.0 lacks a journaling file system?

    I'm wondering if anyone can shed some light on this topic? Some people say it does, some people say it doesn't. The "it doesn't" crowd seems to have more empirical evidence, but if it doesn't, then what *does* it mean when people say it's journaling? Microsoft doesn't usually flat-out *lie* (they just distort, embellish, stretch, and spin the truth), so I suspect they're talking about some filesystem feature that exists, even if it isn't really journaling.

    OTOH, It fragments very badly, but so does ext2.

    I'm also not clear on this :) I installed an ext[2]fs defragmenter but haven't used it yet, due to the fact that its documentation essentially says that in most cases you won't see more than a 10% speedup from defragmentation, and given that I don't know whether it'll eat my filesystem alive, I didn't think it was worth it ;-)

    I should've been a little clearer. When I say 'advanced stuff' I don't necessarily mean 'advanced computer hacking and networking things'... I mean things like designing your own animation; photo editing/separating; um...
    I'm stalling out here, I know there are more examples.

    There are people out there who use computers as a means-to-an-end. They don't want or need to know how to do advanced computer stuff to do advanced audio/video/image/science/whatever stuff.

    At least, they -shouldn't- need to know. The computer should facilitate, not inhibit. If people cannot accomplish advanced tasks with a computer without also being computer experts, then computers aren't doing their task correctly.

    Or, to extend your analogy, if an expert driver also needs to be an expert mechanic to do a bootlegger reverse, then there's something wrong with the car.

    I don't really think that you meant to say that, though, I think I wasn't clear enough in what -I- meant. I have to admit 'advanced stuff' is a pretty vague category. ;)

    2) No standard and reliable installation and de-installation methods à la Windows' uninstall menus. You can argue for RPM, but it's not exactly easy for a beginner. We need to be able to click on an executable, have it install like InstallShield, and be able to be uninstalled from a centralized...

    You are 100% on the money. I love rpm (the program) - it's so reliable and it's a joy to use once you get to know it. But I was a Linux newbie a few short months ago - I hardly knew how to man or --help, so I used gnorpm. Gnorpm is - um - really badly designed as far as user interface goes, but at least it knew things I didn't at the time, like where to find files and how to issue the commands...

    Somebody who wants to establish themselves quickly as an open-source star, please create a better interface to rpm. For bonus points, make it work seamlessly with .deb packages as well. What I'd suggest is a shell to rpm that re-factors it into 2 parts:

    Install Package - gives you a file browser window filtered for rpm's (and debs?). Automagically shows the package info as you move the cursor, as well as telling you whether the package is already installed and showing the existing info if it is. Single click to show info, double-click to install.

    Remove Package - gives you a list of installed packages that you can see the info about by single-clicking or remove by double clicking.

    Note how we get away from m$'s Add/remove programs stupidity. (1) Not all packages are programs. (2) I always know whether I want to add or remove, so give me two separate menu items, please (3) I want this primarily accessible from a menu, not from some icon buried way deep in the system. Sure, make it idiot proof, but don't make it user-proof.
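    Reduced to code, the front end sketched above really only needs to build four rpm command lines (the flags shown are the standard ones; a real tool would run these via fork/exec and add the .deb support):

    ```python
    def query_cmd(path):
        # "show the package info as you move the cursor":
        # query (-q) info (-i) from a package file (-p)
        return ["rpm", "-qip", path]

    def install_cmd(path):
        # double-click to install: upgrade-or-install, verbose, hash marks
        return ["rpm", "-Uvh", path]

    def list_installed_cmd():
        # populate the Remove Package window
        return ["rpm", "-qa"]

    def remove_cmd(name):
        # double-click an installed package to erase it
        return ["rpm", "-e", name]

    print(install_cmd("compupic-4.6.i386.rpm"))  # filename is illustrative
    ```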

  • by jd ( 1658 )
    Personally, I'd rather see something done, than waste my breath blaming someone for something I don't even know is true.

    Which is more constructive? To think and be positive, taking positive action, or to imitate a brain-dead berserker wasp, with no capacity for original thought and an obsession for inflicting senseless injuries for no purpose beyond the fact that it -is- senseless?

    If you want to play the part of a brain-dead berserker wasp, go right ahead. I'm sure Red Hat can afford some rolled-up newspaper, and would appreciate the baseball practice.

    Personally, I'd rather see Linux succeed, than throw imagined attacks at some imagined enemy over a wholly imaginary crime. Imagine that!

  • On Mac and Windows machines I select something, go somewhere else and hit "paste", then realize that I forgot to "copy" and have to go back and do it all over again. And don't even talk to me about the semantics of focus or window-stacking. And typing 'ls' in a command prompt gives me "invalid command or filename," what's this nonsense?? (I still can't get a command prompt on the Macintosh)

    Clearly Macs and Windows machines are difficult and unintuitive to use.


    (for those who haven't figured it out, the above is... not sarcasm, but certainly not literal truth, since I can and do use Windows and Macintoshes, minus some back-and-forth flailing when I have to cut-and-paste ;-) )
  • Actually HAL was somewhat able to function correctly. The ability to have AI in an operating system would be a very significant advance.

    I disagree. The last thing you want in an OS is an AI. You may want one running on an OS as an application--or even as a sysadmin.

    The OS is a layer (or set of layers) that sits below your applications, where your applications are the things that actually do your work for you.

    An AI, especially of HAL's caliber, is the highest level of computing we can imagine. Computers perform mental activities, and because we are humans, we cannot conceive of any higher-level mental activity than human thought. We can postulate that such higher levels exist, but we couldn't adequately describe them.

    So if the AI is part of the OS, what the hell is it running? Gods?

  • Yes. As well as the fun involved in changing hardware around. Especially when it involves recompiling the kernel.

    Video card setup should get better when XFree86 4.0 comes out (it already got better with 3.3.5).
  • The notion that Linux (preinstalled) is difficult for beginners to grasp or use is a complete fallacy, perpetuated primarily by the likes of Microsoft and numerous astroturfers lurking about the net. It has absolutely no basis in reality, to wit:

    I gave Linux to my mother. Now she refuses to ever use Windows.

    I gave Linux to my sister. Now she refuses to ever use Windows.

    I gave Linux to a pilot friend of mine. Now he spits on the Microsoft name (I kid you not)!

    What do these people have in common? They are all basically computer illiterate. For a long time they remained that way because they were running Windows. Whenever the machine would break running Windows they would blame themselves for "having done something wrong and broken it." They were afraid to try doing things themselves, without someone (usually me) holding their hand, for the same reason. Lost Saturdays and reinstalls of the hosed Windows OS were all too frequent.

    All three are very pleased with Linux, StarOffice, Netscape, and KDE (GNOME in one case), and aren't afraid to try things out because, after months of having things work right, they have gained confidence in knowing that, as regular users, they CAN'T break anything. The kinds of questions I am now confronted with take about two seconds to answer and are of the form "Jean, how do I do ... with Linux?", which is a breath of fresh air compared to what I used to get: "Jean, my Windows PC doesn't work anymore, can you come over and fix it?"

    Now they can use their computers to reliably get done what they want to get done, and I can get on with my life, spending almost no time having to play tech support for them.
  • That problem HAS to be fixed (which I understand it will be) before Linux can be a desktop OS. You can't have users reinstalling Linux every day because they reboot the computer when Netscape crashes, and then don't know how to fix it when X doesn't come back up.
    Of course, if tweaked right, apache + linux (or FreeBSD in my case) will scale to the utmost bandwidth limits. I run apache on a number of servers on 100mbps net links and they scale incredibly well. I/O (and processing power, if serving a large number of dynamic pages) are the real limits.

    Of course, if you really want to push the limit, pick up the Zeus webserver. It's multithreaded and has been proven to scale to the same levels IIS can. I've lost the page now, but incredibly high-end benchmarks (talking 4-30 processors here) have shown that Zeus can consistently beat IIS. The current champion benchmark is an IBM machine with 12 processors (if I remember correctly); it handled 16,000 more requests per second than IIS could. With the same number of processors, the results were similar (but equally out there in terms of real-world usability).

    Of course, you can always forget that and set up an apache server farm. I run 3 on a 100mbps net link (with mod_perl and php3). This is with stock PII 400 + 512 megs of ram (2 gigs on the db server) and DPT Fibre channel raid.

    Oh yeah; try hosting 6000 ip based chrooted user account domains on each box with database, php and perl access on IIS.
  • I have RealPlayer G2. I run it on my FreeBSD box in Linux emulation. It crashes a ton though (as it is an alpha).
  • by Loge ( 83167 ) on Friday November 12, 1999 @11:32AM (#1538216)
    If the point of this document is to show that the Linux community is just as capable of generating FUD as Microsoft, then it has succeeded. As a tool for realistically positioning Linux's capabilities, though, it is quite useless.

    This is some of the most half-baked blathering I have seen in a very long while, and it is really quite sad: every fool who goes in making claims like these and is promptly shot out of the saddle will set the movement back in the eyes of those who watch it happen. If the Linux community is going to get serious about taking on Windows NT, it will have to do a lot better than this.

    How many "common customers" use 4-way NT boxes? Very few, in my experience.

    This statement is a brilliant testimonial to the sheer naiveté of Linux advocates, showing a very deep misunderstanding of how servers are deployed and used. This kind of comment strongly supports the view that many Linux developers are holed up in a bedroom somewhere, tweaking code on a home PC in their spare time.

    NT loses these same benchmarks when comparing single-to-single processor and dual-to-dual processor machines.

    And which benchmarks are these? I didn't see a footnote here.

    Microsoft also claims that NT performs better than Linux when serving static web pages. However, e-business is not powered by static pages. It's powered by Active Server Pages and CGI5. Windows doesn't fare nearly so well in these comparisons.

    Again, where is the evidence for this claim?

    As for the technical specs quoted by Microsoft... they are out of date. The Linux kernel addresses 4 GB RAM, not two.

    The 4 GB function is a kernel patch to the 2.3 kernel, which means that it is non-production, beta code until 2.4 ships. If I buy Red Hat 6.1 or some other 2.2-based kernel today, it will support no more than 2 GB...period.

    Among the file systems Linux supports is SGI's XFS, recently released to Open Source, with a max file size of nearly one million terabytes

    Again, this is not production code; it is a statement of intention by SGI to release its XFS code to the Linux development community.

    Note that not all of the features supported by Linux are included by default in every distribution, but they all can be added if missing.

    OK, and who will support these functions when they are added? The distribution suppliers? Not if it isn't in their product. The hordes of volunteer help-desk personnel idling on USENET groups? Only if it's a K00l question, dude. The profusion of promising startups dedicated to supporting commercial Linux sites? Better check that fine print again...!

    The fact that this can be compiled into the OS kernel or not, depending on the needs of the users, allows every installation to tailor the smallest, fastest, most stable custom kernel to its specific needs.

    Assuming they know how to compile a kernel...

    Linux's stability is only based on anecdotes. Microsoft seems unable to differentiate anecdotes from testimonials.

    OK, and where are these "testimonials" again (and I'm not counting USENET or Slashdot postings)?

    Be that as it may, there are a lot of these anecdotes. Many of them include documented uptimes ranging from months to years. Footnote: I myself experienced an eight-month uptime between kernel upgrades, and I do not mean scheduled uptime.

    See above.

    Microsoft also claims that Linux has no journaling file system, ignoring the fact that the SGI's XFS is a journaling file system

    Again, this is not production code.

    They also ignore the fact that NT 4.0 itself lacks a journaling file system!!

    This is just wrong, and you are not doing the Linux community any favors by essentially lying about Windows NT's capabilities. Here's a simple test for you: put two identical systems next to each other, one running Windows NT 4.0 and the other Linux. Boot each of them up. Then pull both power plugs out, and reinsert them at the same time. Which system will be up and running faster?

    Also, Windows NT clustering is limited to failover ONLY.

    This is also wrong. Windows NT 4.0 has Web-server load balancing functions built in as part of its Convoy clustering technology. Oracle Parallel Server is widely deployed on Windows NT to create scalable database clusters.

    Linux clustering was developed in association with NASA, an agency having a far stricter definition of "mission criticality" than any commercial entity. NT has no equivalent technology.

    This is wrong again. The PVM and MPI technologies on which Beowulf is based have been running -- and have been widely deployed -- on NT for years.

    Free doesn't mean low TCO. Actually, it does. Microsoft's TCO calculations are based against other commercially marketed Unixes, which have very expensive initial acquisition and support contract costs and traditionally high education costs.

    And Linux support contracts are free? There won't be any education costs to move users to Linux?

    Support costs can be very low, as purchased support can be supplemented with award-winning Usenet support.

    I'm sorry, I am picking myself up off the floor from laughing so hard. I have submitted many, many questions to various Linux discussion groups over the years and the quality of answers is *wildly* uneven, with the majority of answers ranging from irrelevant to plain old wrong. Any IT professional who depends solely on USENET for Linux support should be fired.

    Microsoft itself charges for support for a product that they licensed (not sold!) you at considerable cost without warranty.

    And Linux support programs *do* come with a warranty? Please, show me these programs!

    However, with Linux you're not required to purchase support at all.

    With Windows, you're not required to purchase support either. What's your point?

    The Linux User community, operating at no charge, garnered the 1996 InfoWorld Product of the Year award for Best Technical Support.

    InfoWorld has a long history of anti-MS sloganeering, and seems to give out its awards simply based on the fact that they aren't MS products. After all, this is the magazine that for the previous five years had given OS/2 their product-of-the-year award. If I as an IT professional had made a purchasing decision based on InfoWorld's recommendations in 1994, I would be in fairly deep trouble right now.

    Microsoft claims that your security administrator must be an expert to properly configure security. My own knee-jerk reaction to this is, "When do you NOT want an expert supporting your systems? If you can't afford one full-time you hire a consultant"

    OK, and you just got through going on about how low Linux cost-of-ownership was?

    In point of fact, NT's security is ... not too difficult to crack.

    OK, this is just plain FUD (meaning you have made a claim that is supported by nothing more than the fact that you made the claim).

    Properly configuring Linux security is mostly a matter of removing those services that are not needed and religiously applying security patches as they appear. This should be standard operating procedure for any site, regardless of the platforms used.

    Correct, so why should it be any different for NT?
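    To make "removing those services that are not needed" concrete, here is a minimal sketch of hardening a 1999-era inetd setup. It operates on a local sample copy so the effect is visible; the service lines are illustrative, and on a real box you would edit /etc/inetd.conf itself and then signal inetd to reload:

```shell
# Create a sample inetd.conf (illustrative entries only).
cat > inetd.conf.sample <<'EOF'
ftp     stream  tcp     nowait  root    /usr/sbin/tcpd  in.ftpd -l -a
telnet  stream  tcp     nowait  root    /usr/sbin/tcpd  in.telnetd
finger  stream  tcp     nowait  nobody  /usr/sbin/tcpd  in.fingerd
EOF

# Comment out every service we do not need (here: all three).
sed -e 's/^ftp/#ftp/' -e 's/^telnet/#telnet/' -e 's/^finger/#finger/' \
    inetd.conf.sample > inetd.conf.hardened

# On a real system: cp inetd.conf.hardened /etc/inetd.conf && killall -HUP inetd
grep '^#' inetd.conf.hardened
```

    A commented-out line is a service that simply never listens, which is the cheapest security fix there is, and it applies equally well on NT (disable unneeded services in the Services control panel).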

    Microsoft claims that existing Linux GUIs are cumbersome and difficult to use. In fact, my mother sat down and began using the KDE desktop with no training, no prior experience, and not one single problem.

    OK, now let's put her in front of a GNOME desktop with a stopwatch, and see how long it takes her to figure out how to do things there.

    For business use, the major general purpose tool Linux lacks at the moment is a Lotus Notes client.

    And an Exchange client (although that is hardly surprising). So now we have taken about 90% of the messaging users off the table.

    And the Windows Notes client can be run if you simply install WINE (Windows emulator for Linux).

    And just how long will the program run before a segmentation fault?

    The Future.....

    In the future, we'll all be flying around in air cars! Nuff' said.

  • by Anonymous Coward on Friday November 12, 1999 @11:42AM (#1538227)

    I find it interesting that right after you berate the original poster for claiming that quad-processors are uncommon, you turn around and claim that large scale Beowulf clusters are irrelevant. If the enterprise has such an enormous need for quad processors and up, then surely they would also benefit from rock solid, NASA tested, distributed clustering technology. BTW, if you insist on having real power in a single box, why don't you examine Sun's E10k server line...those boxes go up to 64 processors and support gigabytes/sec bandwidth on the backplane. NT can handle that, right?

    Has it occurred to you that NT needs a quad processor to support 1000s of users because NT is an inefficient resource hog? I'm not saying that's the case, but I regularly see stripped-down linux boxes with yesterday's technology outperforming new NT boxes. Just my personal experience though...

    It seems you don't understand the concept of a file system. Once you set up a particular file system on a disk partition, it's completely transparent to your applications. You set up a partition as xfs, put your data/apps on it, and gain the benefits immediately. Big RDBMSs like Oracle aren't really relevant to the discussion since they often run using raw disk access to the partition directly, without making use of an intervening file system.

    What exactly do you mean when you say NTFS is well supported? Could you please name some other vendors who support NTFS? Filesystems are transparent to most applications, so this statement makes no sense to me. In addition, while xfs is being integrated into linux, I've heard of no plans from Redmond to integrate it into NT.

    The article is not discounting support options for Windows. The author is merely pointing out that because NT is a closed-source product, third-party support vendors can only issue workarounds and training. 3rd-party support cannot fix bugs because they don't have the code. In that sense, if you need a problem in Windows actually fixed, the only one who can help you is Microsoft. If you discover a bug in your open-source Linux code, anyone, including your in-house IT staff, can diagnose and fix it. You're not dependent on one and only one supplier. No matter how good HP's NT support is, I seriously doubt that they can patch the source for any Microsoft application/component and rerelease it. HP can look for ways around core bugs, but they simply can't fix them since they don't have the source. The result is an endless supply of kludges that further destabilize the system and limit users. This is what happens when people aren't allowed to actually solve problems.

  • Then you install the hotfixes and service packs, at up to 20MB apiece (just how are you supposed to get those if you only have one machine?). Each one of these requires a reboot. (5 reboots)

    Bzzt. Try again. Service packs are cumulative so you have one reboot for the latest service pack. How many reboots to upgrade the linux kernel to the latest? Oh, and to answer your question, you order the service pack CD, or do a web install of the service pack if you're connected.

    Then, if you have a laptop, try to get the PCMCIA ethernet to work. Last year I had to pay $80 to SystemSoft to get a software layer that would allow my 10/100 card to work, after buying a new one because the old one wouldn't. (2 reboots)

    A linux user complaining about NT's driver support? Now this is funny. As far as laptop drivers go, just make sure you buy a card that lists NT as a supported system, and then it just works. And if not, complain to the card manufacturer.

    Not to mention that I couldn't get NT to use more than 4GB for a partition.

    This would be caused by what is known in the industry as a 'chair-to-keyboard interface error' or a 'keyboard driver error'. NT supports partitions larger than 4GB, just not as the first partition, due to BIOS limitations on x86 boxes.

    A typical NT workstation install by a competent admin takes about half an hour.

  • by Anonymous Coward
    Here's another quote from MS' TechNet [] indicating that journaling is something that is being ADDED to NT. It's a great quote for another reason in that it explicitly spells out a MAJOR compatibility problem between NT and Win2K; and it admits that Win2K will be even more of a resource hog than NT, likely necessitating hardware upgrades:

    Q: What features has Microsoft added to the NTFS in Windows 2000 (Win2K)? Will I be able to dual-boot Win2K and Windows NT 3.51 or NT 4.0?

    A: Microsoft has added new features to Win2K's file system, such as disk quotas, encrypted files, journaling, and junctions. For example, Win2K's file system will automatically encrypt and decrypt files as the OS reads them and writes them to the hard disk. In addition, the Win2K file system's journaling functionality will provide a log of all changes users make to files on the volume. Reparse points will let programs trap an open operation against objects in the file system and run the program's code before returning file data. For example, Win2K systems will be able to use junctions, which sit on top of reparse points, to remap an operation to a target object. This functionality is a crucial aspect of the Win2K file system. Finally, administrators will be able to use quota levels, such as Off, Tracking, and Enforced, to control users' access to a drive and on a per-user basis.

    To answer your second question, when you install Win2K, the installation upgrades any existing NTFS volumes (i.e., drives) to the Win2K file system. If you're installing Win2K on a system that is running any version of NT other than NT 4.0 with Service Pack 4 (SP4), the installation program will inform you that you will no longer be able to access earlier NT versions. If you install Win2K on an NT 4.0 with SP4 machine, your system can use SP4's ntfs.sys to read and write to Win2K's file system, but it can't use the Win2K file system's new attributes. The first time you access removable media, your system will convert it to the Win2K file system. In other words, this installation process limits your ability to dual-boot Win2K and earlier NT versions.

    In addition, critics have complained about NTFS's slow performance relative to FAT. You don't have to be a rocket scientist to predict that Win2K's file system will be even slower than NTFS. Thus, the never-ending upgrade spiral continues--you'll need new hardware to properly implement Win2K.
  • "USB support? It's still part of the experimental kernel and is not supported by any of the major distributions yet. Corporate desktops cannot use unsupported software. They need the tech support facilities provided by a major Linux distributor."

    How many corporate desktops would be worried about USB anyway? Most corporate operations likely have a LAN, with printers, etc. shared over the LAN, not attached locally via USB/parallel/whatever.

    It seems to me that USB is really geared more toward the home user, who is more likely to have a lot more peripheral devices attached locally to a single machine. "But, that's just my opinion, I could be wrong..." --Dennis Miller

  • Your mileage may vary, but I have a Navigator 4.61 session that has been running on my main Linux box for a couple of weeks and I use it for browsing on a daily basis.

  • Somebody who wants to establish themselves quickly as an open-source star please create a better interface to rpm..... Remove Package - gives you a list of installed packages that you can see the info about by single-clicking or remove by double-clicking.

    Try kpackage []. (Isn't it great how many "I wish someone would make a..." comments are answered by "Try this!"?)

  • I don't know what video card you are using or what you are doing, but I haven't had an X session crash in eons.

  • Okay, we should host that article as a "'Linux Myths' Myths" page. Additionally, I think we should have a "Windows NT Myths" page to refute all the hype MS is spewing.

    - Ed.
  • In my defense, Microsoft has been *claiming* NTFS is a journaling file system since NT 3.1 was introduced. I guess I'm not really surprised to learn that that's not entirely true.

    Apparently it does something kind of, almost like journaling, but not quite.


  • (I am on the far left of the slashdot bellcurve, so pretend I am Aunt Helga for a moment ...)

    I wrote the directions on
    this page for anyone interested in connecting to an ISP using kppp. []

    They're not perfect, but I meant them to be lighthearted and easy to follow.

    Even still, there is an undeniable catch-22 in that someone who wanted to use these directions or any other online documentation won't have it to use from home. This is getting to be less and less of a problem as Internet access gets more and more ubiquitous - hopefully soon we'll all have DSL and kppp will be only a memory ... :) But hopefully someone could print this out at a friend's house or the library, or print it using the OS that came with their Compaq Presario before reformatting and installing something better.



  • I even previewed, I swear it! How did that closing disappear?! I don't know.

    My bad. But the page still works, I think. If not, it's here. []


  • >so what about a local sports team?

    ooooh ...... isn't there a hockey team called the penguins .....

  • It's obvious, I know, but one of, if not the chief reason that many people can't use Linux (actually, can't use anything besides Windows X or Macintosh) at work is the dysfunctional relationship that companies have with Microsoft, which goes something like this:

    SCENE ONE: The recent past

    Company: "I want a word processor."

    MS: "Here's MS Word."

    Company: "Great! It's got an OK interface and lots of features! We'll standardize on it, and figure the cost of buying it is money well-spent."

    MS: "Nice doing business with you. See you in a few months!"

    Company: "What?"

    MS: "Oops -- look at the time. Gotta go."

    SCENE TWO: The even more recent past.

    Company: "Errr ... some of our suppliers now have a newer version of Word and send us files in it all the time. Can we read them with our old version of Word?"

    MS: "ha, ha. No. But you can buy this economical upgrade and then be using the same version they are. ha ha."

    Company: "That doesn't seem very nice, but ... OK."

    MS: "Nice doing business with you. See you in a few months."

    Company: "Hey, do you mean this keeps happening?!"

    MS: "Whoops -- look at the time. Gotta go. You have a nice day, hear?"

    Companies which deal in information exchanged electronically have it in their interest to insist on non-proprietary formats unless they absolutely need them, for functional reasons. (Lotus notes, say. -- but not WP docs.)

    So what are good formats?

    Text, for things that don't need pretty formatting

    HTML, for things that need to be accessed in a variety of formats

    XML - for more complicated things that need to be accessed in a variety of formats

    PostScript - for manuals etc for which you need exact typographic control.

    And yes, this message contains forward looking statements. It's true that not every place of business is flexible or imaginative enough to use a different color of paint, never mind an operating system other than the one in place now. But long term, I think the linux infection will keep casting shadows on hothouse-flower file formats.

    As others have pointed out, now is not the same as Soon. True -- but would you rather get on the boat leaving now or the jet leaving soon?* Best case scenario I would think for companies which are unable / unwilling right now to switch to a Free / free OS would be to at least study the possibility, because sooner or later you'll have to consider it. To mix a few metaphors: Inertia is a powerful force in business, but tides do turn eventually, and this is a full moon.



    *Too many variables left undefined, I know, but you get the point ;)
    The full context of the test was a comparison between NT and Linux serving static web pages using 4 100Mbit ethernet cards. In that setup, it would indeed be crazy to use that configuration, especially if reliability were an issue. (I would use a master server and 2 or 4 proxy servers connected to the master through a private net.)

    For your case, it's not so insane. (Though 32 MBytes/second seems a bit high even for 4 dozen workstations running CAD). A good workaround would be to get a switch with a Gig uplink port, and tie the server into that. raid 1 over nbd wouldn't be a bad idea either. It's not exactly hot failover, but it's not bad. GFS is shaping up to be an excellent option for that sort of environment. In a GFS setup, I would have at least two servers connected to the raid by Fibre channel. That would give hot failover.

    Re: your PS, I agree completely. That's one reason why I consider NT as an enterprise server to be at best insane. Enterprise server space is not very reboot friendly.

  • To the anonymous coward poster below:
    I do. Bite me.

    You shouldn't need anything more than a p100 to do network address translation and packet filtering. I have a p100 here that has an 11mbps wireless (line of sight) vpn connection to work. It also serves files over samba and collects connection information through snmp. It handles this quite well and the limit is most often the link speed (well, that and the crappy weather up here in Canada).

    I also have a p200 on a cable modem and 96 megs of ram and it runs apache w/mod_perl and php3, qmail, snmp, delegate, ftp, ssh, samba and some remote x applications. Base NT with IIS is so bloated it could barely run (uses 48 megs of ram before even doing anything). Linux or *BSD is perfect for lower end hardware.

    Again, I don't see why you can't run the firewall on the same box as the ftp and web box. This is of course unless you're completely out of memory (NT on 32 megs of ram is such a joke btw). As for processor speed, it should be able to take it as long as you don't have a ton of fast ftp sessions going (since most ftp daemons use a lot of processor for some reason).
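    For the curious, the NAT and packet filtering described above is only a few lines on a 2.2 kernel. A sketch, not a complete ruleset; the internal network ( and the default-deny policy are assumptions, and it must run as root on the gateway (2.4+ kernels replaced ipchains with iptables):

```shell
# Turn the box into a masquerading gateway (2.2 kernels, run as root).
echo 1 > /proc/sys/net/ipv4/ip_forward        # enable IP forwarding

ipchains -P forward DENY                      # deny forwarding by default
ipchains -A forward -s -j MASQ    # masquerade the internal LAN
```

    That, plus samba and snmpd, fits comfortably in a p100's CPU budget, since the kernel does the forwarding work.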
  • by hey! ( 33014 ) on Friday November 12, 1999 @03:26PM (#1538315) Homepage Journal
    I have to agree. The person who wrote the article inadvertently committed the same errors that MS does when promoting NT: FUD, vaporware, and plain misinformation.

    I think Linux advocates should get out of the dichotomy game. When you do you're playing Microsoft's game; Sun Tzu would disapprove of letting your enemy choose the battleground.

    MS plays this game because they have to. They only have one product with any kind of future: NT (maybe two if you count WinCE). Consequently, they have a one size fits all approach on everything larger than a breadbox.

    The opposite of NT is not Linux, it's using the right tool for the job. This means perhaps some proprietary Unix on the very high end, Linux for the desktop and an astonishingly wide range of server missions, some OpenBSD where a little paranoia would be healthy, etc. What you end up with is a family of tools that are right for specific jobs, but share a number of common features and utilities so that it is relatively easy to move between them.

    Looked at this way, it is remarkable that NT works as well as it does in so many different roles. It's like having a Swiss Army knife with twenty or thirty blades. You can drive screws with it, but if you're doing a lot of screws you want a real screwdriver, or even a power screw driver. Unfortunately, NT lacks two of the Swiss Army knife's great virtues: it is neither cheap nor compact.

  • An NT Cluster is used in an Enterprise to provide broad robust support to many different users who see it all as a single machine.
    Like a 100-machine MOSIX [] cluster with programs wandering across machines, appearing to be a single machine.
  • Never seen Netscape crash my X session. I noticed some instability with Navigator 4.0x on Red Hat 5.x, but I've had good luck with 4.5 or newer on SuSE 6.x, and I am currently using Navigator 4.61 on both SuSE and Red Hat and it seems pretty stable. In no case have I seen it cause an X crash. I haven't had X crap out on me since the days of the 1.1.x kernel series, and that was a long time ago. It was also likely due to bogus video card drivers. I use mostly Matrox Mystique II or S3 (Virge DX and Trio64), but a couple of my minor boxes have Cirrus Logic 54xx video cards.

  • Ahhh, thanks. That's what I'd thought. So are the claims about having to run chkdsk for hours just nonsense?

    I also thought that ext3fs and reiserfs only did metadata, but I'll have to doublecheck that. (the best thing, of course, would be if they allowed you to choose which method to use) I'd expect that for, say, a workstation/desktop, metadata-only journaling is actually superior because of the performance issues, is this correct?

  • Because the 'real' Unices have them, and now Linux does too, with ReiserFS, and it'll have another when xfs (from sgi) gets out of beta.

    Right now, NT competes because it's a heckuva lot cheaper than a 'real' Unix and comparable in features to Linux & *BSD; if NT doesn't keep up, Linux and *BSD will wipe it out. It can only hold market share as long as it's comparable. Right now, Linux and NT are, despite all hype, reasonably comparable. I expect Linux and NT to remain reasonably comparable with the advent of Win2K and Kernel 2.4. I hope Linux will pull dramatically ahead, and I think we have a big lead in stability, but we'll have to see. Anyway, if the feature discrepancy becomes too large, the less-featured system will lose.

    In other words, the playing-field isn't stable. People want more out of their systems as time goes on. Holding still isn't good enough in an evolving technology.

  • Please remember my conditional 'if it comes

    It's trivial to block the ports by default with tcp_wrappers in the install script, so that only localhost can access them, and to disable the services that aren't needed even locally. This can be done at the shop before Aunt Helga even sees the box.

    Besides, even if it's not done, who's going to set up a 'p0rn and warez' server on a dialup box that's only connected thirty minutes a day? Let's be serious, Aunt Helga doesn't have DSL.
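    The tcp_wrappers setup described above boils down to two small files. A minimal sketch, written here to local sample files so the contents are visible; on a real system these are /etc/hosts.deny and /etc/hosts.allow, which tcpd consults for every wrapped service:

```shell
# Deny-by-default policy. On a real box this is /etc/hosts.deny.
cat > hosts.deny.sample <<'EOF'
# Refuse every wrapped service to every host not listed in hosts.allow.
ALL: ALL
EOF

# Exceptions. On a real box this is /etc/hosts.allow.
cat > hosts.allow.sample <<'EOF'
# Only the local machine may talk to wrapped daemons.
ALL: 127.0.0.1
EOF

cat hosts.deny.sample hosts.allow.sample
```

    With this in place, even services that were left enabled refuse outside connections, which is exactly the kind of preconfiguration a shop could do before Aunt Helga ever sees the box.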

  • A point-by-point rebuttal of a point-by-point rebuttal. Will the wonders never cease ;-)

    >> How many "common customers" use 4-way NT boxes? Very few, in my experience.

    > This statement is a brilliant testimonial to the sheer naiveté of Linux advocates, showing a very deep misunderstanding of how servers are deployed and used. This kind of comment strongly supports the view that many Linux developers are holed up in a bedroom somewhere, tweaking code on a home PC in their spare time.

    I work in a small computer shop in Boise, Idaho. We build "servers" for customers all the time using NT. Most are simply normal machines, PII or III's with 128M of RAM sitting there serving files and printing. We've made dual processor machines twice. They were very cool. I also build Linux machines in my spare time (ha!) and sell them to customers as Samba servers, IP-Masquerading via diald or DSL, and intra-office e-mail servers. This is the so-called normal NT server. My completely unsupported wild claim is that there are more of these types of setups than 4-way monsters serving databases. I think that is what the author was referring to.

    >> Note that not all of the features supported by Linux are included by default in every distribution, but they all can be added if missing.

    > OK, and who will support these functions when they are added? The distribution suppliers? Not if it isn't in their product. The hordes of volunteer help-desk personnel idling on USENET groups? Only if it's a K00l question, dude. The profusion of promising startups dedicated to supporting commercial Linux sites? Better check that fine print again...!

    I think the profusion of promising startups are there for exactly that reason. Where do you turn when BlueHat won't support your install? LinuxHare! (names changed to protect the guilty). Rather, I don't think they would _not_ support you just because you had modified your kernel. Speaking of which....

    > Assuming they know how to compile a kernel...

    IMNSHO, recompiling your kernel is an integral part of using Linux; i.e., if you are installing Linux in the enterprise and you _don't_ know how to make menuconfig, you ought to be fired. Similar to installing NT and not knowing how to configure it to support dual processors.
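    For readers who have never done it, the 2.2-era rebuild alluded to above is a short, well-worn command sequence. A sketch only: the paths and image name are the usual defaults, not universal, and it is run from the kernel source tree as root:

```shell
# Classic 2.2-era kernel rebuild, run from /usr/src/linux.
make menuconfig               # pick the drivers and features this machine needs
make dep                      # rebuild dependency information (2.2 kernels)
make bzImage                  # compile the kernel image
make modules modules_install  # build and install loadable modules
cp arch/i386/boot/bzImage /boot/vmlinuz-custom
# Then add an entry for /boot/vmlinuz-custom to /etc/lilo.conf and run lilo.
```

    The payoff is exactly what the article claimed: a kernel containing only the drivers and features the site actually uses.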

    >> Linux's stability is only based on anecdotes. Microsoft seems unable to differentiate anecdotes from testimonials.

    > OK, and where are these "testimonials" again (and I'm not counting USENET or Slashdot postings)?

    Here he is quoting Microsoft's page. Microsoft says that Linux stability is based entirely on anecdotes and testimonials. He (the author) is not required to do their homework for them by looking some of those anecdotes up. And, by the way, what is wrong with USENET postings? Slashdot postings are suspect, of course, the same as those posted to are. That's a nonissue. However, USENET is a valid forum for discussing the benefits and disadvantages of various OS's and a wonderful place to find anecdotes. I would also direct you to web pages for and against the respective "contenders".

    > This is just wrong, and you are not doing the Linux community any favors by essentially lying about Windows NT's capabilities. Here's a simple test for you: put two identical systems next to each other, one running Windows NT 4.0 and the other Linux. Boot each of them up. Then pull both power plugs out, and reinsert them at the same time. Which system will be up and running faster?

    This is what is known as a strawman. Yes, the NT system will be up and running faster. No, that doesn't mean NT has a fully journaled file system. There is another posting on this page which details explicitly the _new_ features of Win2K, one of which is (you guessed it) a fully journaling filesystem.

    >> Free doesn't mean low TCO. Actually, it does. Microsoft's TCO calculations are based against other commercially marketed Unixes, which have very expensive initial acquisition and support contract costs and traditionally high education costs.

    > And Linux support contracts are free? There won't be any education costs to move users to Linux?

    I think you missed the adjectives. Very expensive initial acquisition costs are very true for most commercial Unices. The initial cost of Linux, as we all know, is very small. Also, a Sun contractor who shops where I work charges $250 per hour. You can undoubtedly find a Linux contractor who charges the same, but I would be willing to bet that the average rate is lower across the board. Yes, this is unsubstantiated conjecture, so sue me ;-) Secondly, the training for users to move to Linux is a (mostly) one-time expense. Yes, it inflates the TCO in the beginning, so you must sit down and weigh the benefits against the disadvantages and go with whatever suits your needs. I don't run your business, so I can't tell you what that is.

    >> Support costs can be very low, as purchased support can be supplemented with award-winning Usenet support.

    > I'm sorry, I am picking myself up off the floor from laughing so hard. I have submitted many, many questions to various Linux discussion groups over the years and the quality of answers is *wildly* uneven, with the majority of answers ranging from irrelevant to plain old wrong. Any IT professional who depends solely on USENET for Linux support should be fired.

    As my debate teacher used to sing, "Strawman, strawman, you've got the brain of a strawman". Please note that he says "supplement". As in, while you are waiting for a call back from your support company (or, while waiting on hold) you can post to USENET to see if you can get an answer that way. I fail to see where he said to rely on USENET for everything.

    >> Microsoft itself charges for support for a product that they licensed (not sold!) you at considerable cost without warranty.

    > And Linux support programs *do* come with a warranty? Please, show me these programs!

    Again, you must love straw. He was referring to the fact that they charge you a _lot_ to basically lend you some software that they don't guarantee. This is to contrast with Linuxland, where they will give you the software for free or at low cost, and still not warranty it. He was also not referring strictly to support programs at this point.

    >> The Linux User community, operating at no charge, garnered the 1996 InfoWorld Product of the Year award for Best Technical Support.

    > InfoWorld has a long history of anti-MS sloganeering, and seems to give out its awards simply based on the fact that they aren't MS products. After all, this is the magazine that for the previous five years had given OS/2 their product-of-the-year award. If I as an IT professional had made a purchasing decision based on InfoWorld's recommendations in 1994, I would be in fairly deep trouble right now.

    And if we had all blindly purchased computers for the last 15 years based on Gates' famous quote "We will never need more than 640K of RAM", we would be in fairly deep trouble. Hindsight is 20/20. I have no beef with or stake in InfoWorld, but to attack them for not foreseeing something is rather small of you.

    >> Microsoft claims that your security administrator must be an expert to properly configure security. My own knee-jerk reaction to this is, "When do you NOT want an expert supporting your systems? If you can't afford one full-time you hire a consultant"

    > OK, and you just got through going on about how low Linux cost-of-ownership was?

    Low does not mean zero. Security is important enough to dedicate some money and/or time to. This applies whether you are running NT or Linux, which is exactly what the author was saying.

    >> Properly configuring Linux security is mostly a matter of removing those services that are not needed and religiously applying security patches as they appear. This should be standard operating procedure for any, regardless of the platforms used.

    > Correct, so why should it be any different for NT?

    It isn't, except for the fact that you mostly have to apply service packs, which include not only security patches but also bug fixes, new features, and other "improvements", requiring extensive testing against all of your software, and which of course must be reapplied every time you change your system. However, this thread was beaten to death a couple of weeks ago, so I won't rehash everything.
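    To make the "remove unneeded services" advice concrete, here is a rough sketch of the drill on a Red Hat-style box of this era. The specific service names and the use of inetd and chkconfig are my assumptions, not a checklist; what you actually disable depends on what the machine is for.

    ```shell
    # Sketch: trimming unneeded services on a Red Hat-style system, circa 1999.
    # Service names below are examples only.
    # 1. Comment out unneeded entries (telnet, ftp, finger, ...) in
    #    /etc/inetd.conf, then tell inetd to reread its configuration:
    killall -HUP inetd
    # 2. Disable standalone daemons you don't need at boot:
    chkconfig lpd off
    chkconfig nfs off
    # 3. Watch your vendor's security announcements and apply the
    #    individual patches as they appear, e.g.:
    rpm -Uvh some-patched-package.rpm
    ```

    Note that each of those steps touches one narrow thing, which is exactly the contrast with an all-or-nothing service pack.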

    >> For business use, the major general purpose tool Linux lacks at the moment is a Lotus Notes client.

    > And an Exchange client (although that is hardly surprising). So now we have taken about 90% of the messaging users off the table.

    If by messaging you mean e-mail, and thereby imply that Linux is incompatible with 90% of the email solutions out there, I would beg to differ. If you are referring only to intra-corporation "messaging", then I have no idea, but you don't back up your numbers either.

    In summary, "If the point of this document is to show that [you are] just as capable of generating FUD as [the next man], then it has succeeded. As a tool for [evaluating the truth of the original article], though, it is quite useless."

    just another techie,

    p.s. my user info is out of date, please send all flames to and rational replies and critiques to

  • Cpt. Kirk: "Scotty...we've got... to have... *more*.. power"

    Scotty: "Aye Captain... just one moment while I turn the power up on the engines"

    --a short pause followed by several loud whirring noises... then loss of speed

    Cpt. Kirk: "Scotty... we're losing speed ! we've got... to have... *more* ... power"

    Scotty: "..uh (several mumbling sounds) Captain... we've got a problem..."

    Cpt. Kirk: "Quick Scotty... what's the problem?"

    Scotty: "Well ... I'm not sure...uh..who's Dr. Watson?"

    Cpt. Kirk: "Dr. Watson ???? Scotty... *what* are you... *talking*... about?"

    Scotty: "Well Captain... My screen is telling me that I've got a problem and Dr. Watson is making a log file for me... but I don't know where the file is... "

    Cpt. Kirk: "Dammit Scotty... just reboot the damn thing."

    --fade to darkness
  • If Microsoft can't support ``Windows'', how can third parties be any better?

    I can't follow the logic. Okay, I write some software and only I have the source code. I can't manage to support it well, yet people who don't have the source code can do a better job? In what universe?

    Like the man said, support is basically limited to training and workarounds. Without access to the code, if an issue arises, all you *can* do is work around it. Some of these third parties are simply good at providing workarounds.

    If I'm wrong, show me one service pack, for any version of Windows, that didn't come from Microsoft.
  • Four years ago, I tried installing linux on my machine.

    Five years ago, I was doing contract programming for Linux at a firm that was using it as a basis for taking their information business to the web. They were also using it to back up the hard drives of Windows machines on the WFWG network, with the help of early versions of Samba.

    I first started using Linux a year or so before that. I knew goofs who were capable of getting it running back in 1993. You must have had a broken distro.

    There were already halfway decent installs four years ago. I have kicking around somewhere a bunch of 1995 InfoMagic CD-ROMs with Slackware. From what I remember, the installation was darn easy. That was four years ago, so there! ;)

    And of course, people were able to get it running as far back as 1991. ;)
  • CS degree and fifteen years experience and can't figure out RedHat Linux? Gotta be kidding! Children who aren't even that many years old are running it, changing screen resolution, reading and writing floppies, etc.
  • There are many advantages to using MS databasing products. With their 'Random Field Swapping' technology (pat. pend.) you can keep your employees on their toes. Imagine the laughs when triple bypass patients are sent to maternity!

    And the 'Random Wait' technology ensures that your employees will take those frequent breaks that are so important for prevention of RSI!

    Finally, MS Exchange guarantees that your executive staff will never have to suffer from too many important emails in their inbox.

"If it's not loud, it doesn't work!" -- Blank Reg, from "Max Headroom"