Linux Business

A Case for Linux in the Corporation

_UnderTow_ writes: "Saw this over at Anandtech. It's a pretty descriptive account of a reasonably large corporation (7000+ employees) transitioning their network infrastructure over to Red Hat Linux. Has details of the company's initial move to NT, and their eventual move to Linux as the cost of licensing gets out of control."
  • advocacy? blah.... (Score:2, Interesting)

    by Lxy ( 80823 )
    I didn't switch to Linux because someone told me to, I switched because I needed an alternative OS. This is a good sign of things to come. Build a better OS, and people will come. Of course, it helps that Microsoft enforces license policies that soak consumers for every penny they're worth, and even corporations that WANT to be legal are unsure of their licensing. The more Microsoft starts bullying people around, the more enticing free software becomes. If Microsoft ever stoops to the level of leased OSes there will be a whole lot more stories like this one.
    • by Anonymous Coward
      It seems to me that the commercial structure of MS's software makes it harder to admin.

      I just wiped my laptop, and as I write this I'm in the process of reinstalling Windows and Office on it. I installed W2K and Office 2000, and I'm in the process of patching everything. This is literally a 4 or 5 hour job. Now admittedly this is a slow machine (233MHz, 228MB of RAM), but that's still pretty crazy. And I have a DSL line -- this isn't a bandwidth problem.

      What if I had to do 700 of these things?

      How does central application installation work under Windows? Is it even possible? How do they keep track of the licenses? Can you patch Office once and have the changes propagate throughout the network?

      Imagine a Linux network where applications are all stored on central file servers. You don't have to worry about whether or not someone has their KWord license. You can just let everyone read the NFS shares.
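
      A rough sketch of that setup, assuming an NFS server named "appserver" (the name and paths here are my own examples, not anything from the article):

        # On the file server: export a read-only application tree (/etc/exports)
        /usr/local/apps    *.corp.example.com(ro)

        # On each client: mount it at boot (/etc/fstab)
        appserver:/usr/local/apps  /usr/local/apps  nfs  ro,hard,intr  0 0

      Every desktop then sees the same binaries, and an upgrade on the server is an upgrade everywhere.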

      My point is that apart from the licensing fees, there's an overhead associated with keeping track of who can run what. To protect their interests, MS has set things up in ways that make administration harder.

      Things like centralized office suite administration haven't been high-profile in Linux up until now -- the focus has been on making usable office apps, things that don't totally suck in comparison to M$ Office.

      But I think there are some real opportunities to do things that MS will have more trouble pulling off, on account of the licensing.

      apt-get is a beautiful thing. What would an enterprise-level apt-get look like? What would allow you to install software or updates on 100,000 machines? What would allow you to roll back a bad update on all of those machines? What would allow you to keep track of different software configurations for different job descriptions or hardware configurations? What would it take for admins to control what users can do with apt-get, so they don't break things?

      What would it take for R3dH@t (or someone else) to feed updates into a large corporation's office application framework automatically?

      It seems to me that Linux has a lot of groundwork laid for this sort of thing, and that it could be made to happen more easily than a lot of people think.

      I think that everyone has had a moment with apt-get. You've set up a new system, it doesn't have much on it, and someone sends you a zip file. So you say, "apt-get install unzip", and 20 seconds later you can unzip the file.

      In a windows environment, that works with zip (although it's definitely harder and slower). But what about Visio? If someone sends you a Visio document, you can't just download Visio.

      We, on the other hand, can deploy a desktop that will download our diagram program on the fly when someone clicks on the file icon.

      What does that do to admin costs? (Or: what does that do to our jobs?)

      I believe that network aware package administration is going to be the thing that wins the enterprise for linux in the end.

      • Great comment. It seems to me that GNU/Linux has many advantages not normally discussed. Your comment begins to show more of the potential advantages.

        Also, Windows has many disadvantages most people don't understand. For example, with Microsoft Windows there is a potential for irreparable operating system corruption. Microsoft Windows has a file called the registry (SYSTEM.DAT) that often becomes damaged beyond repair. Below is a message copied without change from a Microsoft error display. As you read it, please keep in mind that registry damage is extremely common.

        Registry Repair Results

        Windows found an error in your system files and was unable to fix the problem. Try deleting some files to free up disk space on your Windows drive. If that doesn't work then you will need to install Windows to a new directory.

        The computer with the bad registry has gigabytes of free disk space. "Installing Windows to a new directory" also means re-installing ALL the applications, driver updates, and so on. "Installing Windows to a new directory" is equivalent to re-formatting your hard disk and starting over. This is not file system corruption, which is easily fixed. This is irreparable operating system corruption.

        Please also realize that this is only one of MANY such issues.

        One reason to use GNU/Linux is that it is of much higher quality. Linux doesn't seem to have the same vulnerabilities as Windows. I don't think there is a Linux message that says, "The corruption is too great to repair. You will have to install everything again."

        Why does Microsoft use a single file for most configuration information? Apparently Microsoft uses this as a method of copy protection. A user can copy a program's files, but the program will not operate without the registry entries. Unfortunately for Microsoft Windows users, this single file can be corrupted by a buggy application. If the corruption is great enough, the entire operating system becomes unusable and irreparable.
        I just wiped my laptop, and as I write this I'm in the process of reinstalling Windows and Office on it. I installed W2K and Office 2000, and I'm in the process of patching everything. This is literally a 4 or 5 hour job. Now admittedly this is a slow machine (233MHz, 228MB of RAM), but that's still pretty crazy. And I have a DSL line -- this isn't a bandwidth problem.

        What if I had to do 700 of these things?


        You would push the patches out using a login script.

        I do agree that administration is more difficult in the NT world, though. The basic problem is that NT is not very transparent, so when something goes wrong your troubleshooting is pretty ... blunt. You don't get an error like you might see with Apache:
        error loading tomcat.so in line 274 of httpd.conf
        Is it really a valid DSO?

        My average downtime for crashes on NT is about three hours per incident, though I am not that experienced in troubleshooting the monster. With Windows 2000, it is less (1 to 1 1/2 hr) because it runs better on modern hardware, but with Linux, it is about 15 minutes :)

        Linux can definitely show a lower TCO if it crashes less often and is more easily fixed :)
  • by Steveftoth ( 78419 ) on Friday August 31, 2001 @12:59PM (#2239623) Homepage
    Before you flame me, read this whole article. This is a fairy tale of linux winning over microsoft. Not that it couldn't (or didn't) happen, it's just that the author presents it in such a format as to make it unbelievable. Did anyone else get that same impression?
    • I'd have to agree with you on this one. I've been using MS products every day for the last 5 or 6 years now. They are what I know, and I know how to fix them no matter how many times they break. I have decent Linux knowledge, but the trust of always knowing how to fix it isn't always there. With MS products lots of people know what's going on, so they are a resource. If you know your Linux, then yes, this is the way to go, but if you're not sure you can support Linux in business America, then I'd stay away from it.
      • I have to ask... how much did you know about NT when you started? This is what bothers me about my current company. We can't even consider linux because I'm the only person who understands it enough to admin a box. We have no NT people on staff, yet the addition of Microsoft servers was a must. So for whatever reason, the company spent thousands of $$ to get us into all sorts of training classes and set up test networks so we could break the OS and try to fix it and stuff. Why? Because no one here understood the OS. When I mention we can do the same thing with linux but come out far cheaper because of the hardware and software cost savings they tell me "we don't want to go through THAT again". Go figure.
        • by Anonymous Coward
          "We can't even consider linux because I'm the only person who understands it enough to admin a box."

          And that's exactly why. So if one of your Linux systems breaks, and you're unavailable... who does your company call?

          At least with NT, there's an out -- they can call MS support and eventually get an answer. And MS is ALWAYS there. Your company doesn't have to rely on Timmy's little brother or "that guy from the computer store" to solve your linux problems when there's no one else around.

          Confidence that you can find someone else who knows about the OS in a pinch is almost more important than understanding it yourself, especially when you have $$$ on the line.

          ---

          When was the last time you tried getting tech support from Microsoft? I've worked at a company that had about 500 employees and was M$ from top to bottom. The few times that one of the admins wasn't able to figure out a niggling problem, they would call M$ and pay the ridiculous per-incident charges (even though the company was handing M$ somewhere on the order of high six figures a year in licensing fees). More often than not, Microsoft was unable to solve the problem and would invariably revert to "wipe the disk and reinstall the operating system."

          I left there about 2 years ago, and management had just decided to switch most of the servers over to RedHat Linux and wean the company off NT. A few weeks ago, I spoke to one of my friends who remained behind. Apparently, the CTO and CIO were happy enough with RedHat 7.1/StarOffice that most of the employees were going to be migrated over to Linux + KDE. They're keeping a small number of Windows machines around for compatibility reasons, for those particular products that require Windows to function.

          This isn't a fairytale. It's happening in the real world, especially during these days of belt tightening.

          Cheers,

          OK, so I'm a coward. Sue me.
    • by Anonymous Coward
      I read the whole thing. And it rang fairly true to me. There's nothing in there that is factually incorrect about MS or any of the hardware requirements.

      The talk of having to scale up hardware with new versions, add redundant systems, separate server functionality.. Have a PDC and a BDC.. I've heard it all before, because that's how MS makes its money.

      Then in walks Linux, on lower end hardware, and deals a knockout in one section. Then the next. Slowly integrating, not completely replacing.

      If this were a fairy tale, it would probably talk about every NT license being ditched, about Linux on the desktop.. It isn't. It's about Linux showing its true colours as a low-cost high-stability replacement to some NT servers.
    • Yes, this story was written to sensationalize and dramatize the actual conversion. Getting past that, though, all the events mentioned are easily replicable. Replacing a 6-server NT cluster with a Pentium-class machine running Linux? Yup, I believe it. NT likes dedicated machines (I suspect a licensing tactic there -- another machine == more fees) yet Linux can handle many services with incredibly little power. The server may have been a little taxed -- handling 7,000 e-mail accounts on a single Pentium box may stress it a little -- but other than e-mail taking a little longer to send, the end users won't notice. On NT the server would BSOD and e-mail would be down. Linux gets slow under pressure. NT crashes. Big difference in terms of $$ there.
      • by Anonymous Coward on Friday August 31, 2001 @01:21PM (#2239721)
        The server may have been a little taxed -- handling 7,000 e-mail accounts on a single Pentium box may stress it a little -- but other than e-mail taking a little longer to send, the end users won't notice

        I don't think the box would've been taxed that badly... I once worked for a company that had 3500 email accounts on a single-CPU Pentium Pro 150MHz machine with only 64MB RAM running FreeBSD, and it did just fine. We typically had 1800-2000 concurrent users getting their mail via POP3 from that box at any given time during the business day. I can imagine a modern P-III or Xeon box pushing close to a GHz, with hundreds of MBs of today's cheap memory and fast Ultra160 SCSI disks, running Linux or FreeBSD, could handle thousands of simultaneous IMAP/POP users with ease.
      • I find it very, very hard to believe that 1 Pentium-class server could replace what a 3+3 server NT cluster could do at any decent utilization. It's not 6 servers, it's 3 plus 3 backups. And they conveniently left out the utilization of the servers, and the hardware on them. If 3 servers are running at 0.5% total utilization, then yeah, a Pentium-class machine could do the same.

        Does anyone have recent NT vs Linux benchmarks (on the same hardware, please)? I haven't been able to find any.

        This article was short on the technical details needed to make a fair comparison between the NT and Linux setups here.

        As for NT crashing under pressure, that's interesting, because I have yet to have my 2k box crash, and I have seen many 2k servers under heavy, heavy load doing perfectly fine.
        • as for NT crashing under pressure, that's interesting because I have yet to have my 2k box crash,

          Wow, talk about a non-sequitur. Tell me how that statement does not compute.

      • If you look at the article again, they didn't replace the entire corporation's servers with one Pentium-class machine, but just one "particularly busy" section of it. It's not hard to believe a Pentium-class machine could easily do file, print, and email for 500-1000 users. I mean, I've had trouble getting the load up to 0.1 with 100 users doing that.

    • Yes absolutely. I got the same impression. The author claimed that the company in question had their servers going down at least once every 10 days.

      What the HELL were they doing to these servers? Mine has been up for 140+ days at home (I have run NT Stress Test on it for a week as well), and at work here we have a 2K server up for 80+ days right now. It's used a lot: it has 2 printers on it, a stack of hard disks, and email as well, and we've had no problems.

      And if this company was so proud of the change, why didn't they let us know their name?
      • I was curious about this as well. W2K is pretty solid -- maybe not as solid as Unix, but the gap is closing.

      • What version of windows are you running though? From what I've heard (and experienced) win2k is a lot more stable than NT4 was, but in this article they state that they never made the switch to win2k due to the costs, and not wanting to use Active Directory.

        I could easily see NT4 having problems like this.
          • Good call about 2k vs NT4, but still -- every 10 days? I've used NT servers under heavy use before, and as long as we didn't fiddle with things they stayed running for a while.

            Definitely NOT once every 10 days...

          • The few NT shops I've been in usually rebooted their NT 4.0 servers on Fridays just so they'd be likely to stay up over the weekend ...
            • I have known many people who shut down their computer at the end of the day just because they didn't think that leaving it on was a good idea overnight.

              I have known admins who thought the same way. Too much Win98 usage made them think that it was better to reboot servers whenever possible.

          • by irix ( 22687 ) on Friday August 31, 2001 @04:20PM (#2240771) Journal

            When I used to write some web apps that ran on IIS (about 1 year ago - thank god those days are over) we worked with a fairly big NT shop.

            Their policy was to reboot the NT web servers once per month on schedule, because if you went any longer IIS would go into a death spiral and take NT down with it.

            This place was staffed with lots of MCSEs, etc. and this was their answer to problems with NT/IIS. No joke.

      • Try installing junk shareware on it, uninstalling it, and see if it blows up. The difference is that poorly written code and problems don't take down a Linux machine *as easily* as a Windows box.
      • Yes absolutely. I got the same impression. The author claimed that the company in question had their servers going down at least once every 10 days.

        Looking at Netcraft stats, it is not unusual for major web servers to be rebooted every 7 days... Windows 2000 fares better, around 14-20 days per reboot.
  • by joestar ( 225875 ) on Friday August 31, 2001 @01:02PM (#2239633) Homepage
    There are thousands of cases of Linux use in corporations (large and smaller ones as well) on MandrakeBizCases.com [mandrakebizcases.com]. Worth a look.
  • by OSgod ( 323974 )
    ...sounds like a case of poor administration. A well-written article, although not technologically objective.

    What's missing are any verifiable facts. Until some are presented, this article goes in the round file -- i.e., somebody's pipe dream of the way Linux should help.

    All of the major vendors list the company name with most case studies -- it is common practice. Who is the company? Is there third-party verification of the reported shift?

    It could happen -- it might have happened -- but it is useless to use this article to sell management on the benefits of open source; it has few if any real details.

    Please, please present some factual and verifiable accounts that can be used in making OS decisions!

    • by Telek ( 410366 )
      Absolutely. I must confess, I am a Microsoft user (as apparently the majority of /.ers are as well), but this does look like a dream. Providing no names is a rather odd thing. I must admit, however, that I am starting to find Linux more and more appealing.

      I also asked myself this interesting question:

      Hypothetically speaking, of course, if I had not paid for my software yet, out of all of the software that I use, how much of it would I still use if I had to pay for it?

      I'd be over to linux in a heartbeat if I had to pay for everything on my system. So I don't think that MS can claim that piracy isn't still helping them, at least in part.
  • by FatRatBastard ( 7583 ) on Friday August 31, 2001 @01:09PM (#2239662) Homepage
    I don't care if you never implement a Linux/*BSD box, or if you think Linux is the biggest piece of crap ever installed on a computer. The simple fact that it's an alternative to NT (and one that, as this article shows, can be adopted piecemeal) is good for the industry. It keeps MS honest. As an IT director you have one hell of a bargaining chip at your disposal. You may still go with MS tech, but at least you can do it with some leverage on the licensing terms.
  • I don't know if the story is true or not; does anyone know of a Washington State corporation with 7000 users that recently made the switch? I am from the area and am not aware of anything of that magnitude.

    But, fairy-tale nature aside, the article does show how big companies can get trapped in the licensing whirlpool. It used to be that no-one got fired for buying IBM. Now it is Microsoft that can do no wrong. But even that is changing, and companies that need to look hard at their bottom line should take note!

    So I find this to be good ammunition for me as my fledgling company starts to sell GNU/Linux-based business solutions. Of course my target market isn't companies with 7000 employees; more like 70 to 700. But I need all the bullet points I can get, even with them.

    So thanks for this posting!

    Jack
    • I was trying to think of what company that could be.

      Not a huge company like Boeing, as they talk of only 7,000 employees.

      The company is growing rapidly.

      The company runs evening and night shifts.

      The company has point of sale terminals.

      The company must not be highly profitable, judging from the description.

      The company doesn't seem to have very bright IT people.

      Some signs point to Amazon.com, but I can't be sure. :)
  • by Anonymous Coward
    ...this would be a very interesting article.

    As it stands, it's just annoying. How do we know how much is true, and how much is embellished (or even pure fantasy)?

    I was about to pass it along to a colleague but decided not to. It's just TOO unverifiable.

    I happen to be a Mac user with very little personal or professional involvement in either Windows NT or Linux.
  • by idot ( 130605 ) on Friday August 31, 2001 @01:16PM (#2239690)
    While this analysis details very nicely what MS charged for service, the writer completely left out what RedHat charges, in this case or even generally.

    Could someone with experience post some figures?

    How long will RedHat be involved in providing service for a company they have switched to Linux? If all goes so smoothly, why not hire an experienced sysadmin in-house -- why outsource?

    • If all goes so smoothly, why not hire an experienced sysadmin in-house -- why outsource?

      One doesn't exclude the other -- in fact, our high-end contracts include RHCE certifications, so we know we're dealing with someone who knows Linux.

  • by Auckerman ( 223266 ) on Friday August 31, 2001 @01:18PM (#2239706)
    One of the main reasons I have heard time and time again for companies not switching to another lower-TCO OS (MacOS, some open source Unix) is the cost of retraining. Here, MS clearly made the cost of ownership HIGHER than the cost of retraining, and a company noticed. Now, when MS tries to move everyone to .NET and owning a WinTel computer requires annual fees, don't you think more companies will move away from Windows?

    • Good point, but the cost of retraining isn't just the cost of the training. It's also the cost of the lost productivity incurred by having everyone go back to the bottom end of the learning curve and having to figure out how to be efficient again.

      Still, it's good to hear stories about M$ losing customers. Having viable alternatives is what the free market is all about.
  • by astrashe ( 7452 ) on Friday August 31, 2001 @01:19PM (#2239711) Journal
    It seems to me that the commercial structure of MS's software makes it harder to admin.

    I just wiped my laptop, and as I write this I'm in the process of reinstalling Windows and Office on it. I installed W2K and Office 2000, and I'm in the process of patching everything. This is literally a 4 or 5 hour job. Now admittedly this is a slow machine (233MHz, 228MB of RAM), but that's still pretty crazy. And I have a DSL line -- this isn't a bandwidth problem.

    What if I had to do 700 of these things?

    How does central application installation work under Windows? Is it even possible? How do they keep track of the licenses? Can you patch Office once and have the changes propagate throughout the network?

    Imagine a Linux network where applications are all stored on central file servers. You don't have to worry about whether or not someone has their KWord license. You can just let everyone read the NFS shares.

    My point is that apart from the licensing fees, there's an overhead associated with keeping track of who can run what. To protect their interests, MS has set things up in ways that make administration harder.

    Things like centralized office suite administration haven't been high-profile in Linux up until now -- the focus has been on making usable office apps, things that don't totally suck in comparison to MS Office.

    But I think there are some real opportunities to do things that MS will have more trouble pulling off, on account of the licensing.

    apt-get is a beautiful thing. What would an enterprise-level apt-get look like? What would allow you to install software or updates on 10,000 machines? What would allow you to roll back a bad update on all of those machines? What would allow you to keep track of different software configurations for different job descriptions or hardware configurations? What would it take for admins to control what users can do with apt-get, so they don't break things?
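
    Even with stock tools you can fake the fleet case today. A crude sketch, assuming ssh keys are already pushed out and a hosts.txt listing one machine per line (both assumptions mine, not anything from the article):

      #!/bin/sh
      # Push the same package operation to every machine in hosts.txt.
      # Serial, and with no rollback -- exactly the gaps an
      # "enterprise apt-get" would have to fill.
      for host in $(cat hosts.txt); do
          ssh root@"$host" "apt-get update && apt-get -y upgrade" \
              || echo "FAILED: $host" >> failed.txt
      done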

    What would it take for RedHat (or someone else) to feed updates into a large corporation's office application framework automatically?

    It seems to me that Linux has a lot of groundwork laid for this sort of thing, and that it could be made to happen more easily than a lot of people think.

    I think that everyone has had a moment with apt-get. You've set up a new system, it doesn't have much on it, and someone sends you a zip file. So you say, "apt-get install unzip", and 20 seconds later you can unzip the file.

    In a windows environment, that works with zip (although it's definitely harder and slower). But what about Visio? If someone sends you a Visio document, you can't just download Visio.

    We, on the other hand, can deploy a desktop that will download our diagram program on the fly when someone clicks on the file icon.
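
    That on-the-fly install is little more than a wrapper script. A toy sketch -- the choice of dia as the diagram program and its hookup as a file-type handler are my assumptions, not anything from the post:

      #!/bin/sh
      # File-manager handler for diagram files: install the app on
      # first use, then open the document.
      command -v dia >/dev/null 2>&1 || apt-get -y install dia
      exec dia "$1"

    Point the desktop's file association for diagram files at that script, and the download-on-click behavior falls out.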

    What does that do to admin costs? (Or: what does that do to our jobs?)

    I believe that network aware package administration is going to be the thing that wins the enterprise for linux in the end.
    • its called RIS (Score:2, Interesting)

      Look it up: it's called RIS, and it works under Win2k. You set up one server and install all the software and needed changes. Now you start a Win2k install on any box and point it to the server. It's installed exactly to your liking. Most companies just use a hard-drive blaster anyway. Check out this doc for more info:

      http://www.microsoft.com/ISN/whitepapers/p56782.asp

    • 1. Any shop with over 5 identical machines should have Ghost or DriveImage. You install the OS, apps, etc. Make an image via a network boot disk. Put boot disks in machines, boot to them, blow the image on. Change the SID, rename the machine, reboot. Add to domain. Done. All the big cloning software packages support multicast as well. MS also provides some tools.

      2. As the tech lead here, I am responsible for licensing. Yeah, its not fun. But most enterprise software isn't fun either. Recently I spent some time trying to figure out what getting Solaris 7 would cost us if we acquired a machine that could run it - remember, Solaris 8 is free and downloadable - 7 isn't.

      3. Terminal services are viable for NT/2k. You can run apps centrally. It requires serious horsepower at the server side, but people are doing it. That is another way people do app installs and licensing - if you have 50 offices, and 50 comptrollers around the country, make the client binary accessible via terminal services. Centralize the server, and just install terminal services client for those 50 people. Upgrades are a non issue after that.

      4. Application installs - login scripts, as well as all kinds of software packages. MS SMS is a serious package you can do inventory, software pushes/distribution, etc with.

      Office and OS licensing could be MS's downfall. Basically, you need a quick-to-install X client that would allow complete Office functionality through it. It's getting to the point where OS + Office + client access licenses cost as much as the client PC. If you can offer a (not really -- centralized computing ain't new) new paradigm that allows the existing machines to sit as they are, without cutover costs, you have a winner. I don't think network computers will really take off because the price differential between them and real PCs keeps getting worse.

      ostiguy

      • I use ghost to do backups, and to swap OSs around, and I love it.

        But aren't individual machines supposed to have their own license numbers? With their new activation technology, isn't MS going to start making it impossible to slide on this?

        I'm not saying that any of this stuff is impossible for MS to figure out. I'm really just saying that NT administration is hard, and that there is room for improvement here.

        I know you can write code that will do anything, including installing other programs. But I don't have to do anything so complicated with Debian to use apt-get -- I just type the command, and boom, it's over. You don't have to reboot, login, or any of that.

        I do admit that I was off target on a big part of my post -- if I knew more about NT administration, I'm sure I'd acknowledge its ability to admin things centrally.

        I don't think the thin-client thing is the answer, though. There's a real difference between the old-school NFS installs, where the storage is remote but the CPU is local, and the Citrix/Terminal Server strategy. TS is an inefficient way to get central management; it's stupid to have to throw away 97% of the power of the local CPU just for that.

        • by SuiteSisterMary ( 123932 ) <slebrunNO@SPAMgmail.com> on Friday August 31, 2001 @02:49PM (#2240245) Journal
          But aren't individual machines supposed to have their own license numbers? With their new activation technology, isn't MS going to start making it impossible to slide on this?
          No. Take any recent Microsoft product; say, anything from Office 2000 up. Probably even earlier, but I can't say. Drop to a cmd prompt, and navigate to the setup program. Then do a 'setup /a' and watch, in beautiful majesty, as the software makes what is called an 'administrative install', which preconfigures the license key, company name, and all that stuff. Then it installs to a designated location, such as a network share. Then go to microsoft.com, find the Resource Kit page for your software -- let's say Office 2000 again -- and download the core tools. You'll likely find something called the 'Custom Install Wizard', which you run against this administrative install. This will take you through anywhere from 1 to 40 wizard pages where you customize anything and everything about the install. When it's done, you get an MST, or Microsoft Setup Transform, file. Then, using a command such as
          \\myfileserver\myinstalls\office2000\setup TRANSFORMS=mytransformfile.mst /qb-
          you'll get an install, preconfigured, no user input. Just progress bars. Then, using something like SMS, Zenworks, Tivoli, whatever, you automate the installation of these.
      • As far as the Solaris 7 licensing costs go, almost any machine you purchase from Sun will give you the choice of either Solaris 7 or Solaris 8. A license is included with the purchase of a new machine, although you should be aware that several of the newer machines (Netra X series or Sun Blades) will only run Solaris 8 or above.
        Honestly, as a Solaris admin and fan, I cannot name a reason that I would stay with Solaris 7 if it was at all possible to move to 8. I haven't seen any incompatibilities between 7 and 8 except with a few tools like top, which needed to be recompiled. I'd recommend picking up an inexpensive box with Solaris 7 on it (make sure that they transfer the RTU with it) and installing everything you plan to. Then upgrade to Solaris 8 and see if it all works. I think that in almost every case it will work without problems, and you'll have the added benefits of Live Upgrade, more stability, and fewer security problems.
      • 1) Regarding DriveImage or Ghost: that's fine for rolling out machines (though still a bit clunky)... but what about installing software later? That's the hard part. It's a breeze with Unix.
        2) Solaris 8 is free and downloadable, but not for commercial use, so that point isn't really valid.
        3) Terminal services are available... but you need machines with serious horsepower, and given the architecture, it's only so scalable. (You can buy MUCH larger Sun machines, for example, for doing similar things.) Terminal services licensing is also a nightmare.
        4) Application installs can be done remotely, but it's still a pain in the ass, and a far cry from simply installing once on Unix and allowing everyone to use the software.

        Regarding integration....
        A bunch of X workstations (thin clients), a big Unix server, and then something like Citrix and some Win2k servers for those Windows apps you just can't get away from. That makes licensing centralized, and allows you more control.
    • "What if I had to do 700 of these things? "

      You would automate it, either with Ghost or sysprep or RIS, etc.

      "Imagine a Linux network where applications are all stored on central file servers. "

      Yes, you can do that, but you'll have to upgrade your network to 100baseT to the desktop, switched to gigabit in the closet, with each closet having a file/print server that does nothing but provide the read-only executable content to the clients.

      I don't need to imagine, because we used to do things this way. As computers became faster, this way of doing things became less and less efficient. Actually it became less efficient about the time Pentiums first came out in '94.

      "What would an enterprise level apt-get look like? "

      That's the RedHat Network -- the service they charge $20/month per desktop for.
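
      For what it's worth, the client end of that is already about one command -- as I understand it (worth checking RedHat's docs before relying on this):

        up2date -u    # fetch and apply all pending updates from the RedHat Network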

      "We, on the other hand, can deploy a desktop that will download our diagram program on the fly when someone clicks on the file icon. "

      I assume you are speaking of Windows 2000 here, as that is the way it can operate using Windows Installer Services.

    • How does central application installation work under Windows? Is it even possible? How do they keep track of the licenses? Can you patch Office once and have the changes propagate throughout the network?

      There are a number of alternatives. There are third-party solutions like Norton Ghost. Starting with Win2k, there is now the built-in MSI installer, which uses the Microsoft Scripting Host to do the installation work.

      My point is that apart from the licensing fees, there's an overhead associated with keeping track of who can run what. To protect their interests, MS has set things up in ways that make administration harder.

      No, it's not fun. But there are lots of ways to automate it. At GM, they typically have everything loaded on an application server, and then control access through Tivoli, which takes care of a lot of stuff like granting access to the shares, downloading the registry entries and necessary files, installing the icon on the desktop, etc.

      apt-get is a beautiful thing. What would an enterprise level apt-get look like?

      Tivoli. :-P

    • What if I had to do 700 of these things?
      Disk images for installs, and Microsoft SMS for software/patch distribution as well as asset tracking.
    • One of the 'hidden' costs of using MS products is the amount of time & resources spent simply staying current, in case of the feared 'surprise Audit' where MS basically threatens to ruin you if they so much as find one license out of order.

      It's not the cost of the OS for each workstation... it's the recurring costs in upgrading, new licensing schemes, auditing...
      Plus ridiculous non-recyclable licenses such as those for Terminal Services (from what I recall, if you license one workstation to use them, you can't later move the license to a new one if that workstation breaks).

      Network installation? Sure, it can be done... but nothing like what you can accomplish simply and easily and *logically* with a Unix network.

    • by MikeRepass ( 199982 ) on Friday August 31, 2001 @03:06PM (#2240342)
      This draws from my experience administering WinNT 4 and 2k so I might miss lots of things (please flame away), but there are a variety of options for remote installation and management of machines in a Win2k environment.

      First, there's the RIS system, which allows you to set up a server with a custom CD image (the normal Win2k Pro image works fine, but you can also slipstream service packs and updates as needed). Then you create boot floppies. So long as you make a machine account with the proper MAC address (captured in the GUID) of the machine you want to build, you simply boot from the floppy, it finds the RIS server, and it builds itself. You can set up scripts to install/customize applications once the machine build is complete.

      After that, Active Directory can be used to advertise policies, which can include software updates, service packs, and a variety of things. I don't have much experience there, so maybe somebody else can offer info.

      Finally, the big end-all of Microsoft distributed network management is SMS. This behemoth (which is as difficult to administer as Exchange) not only provides a huge SQL DB of all inventory information, but can be used to distribute and control practically any software update necessary, such as remotely instructing a machine to upgrade itself from Win98 to Windows 2000 at 4:00 am (or after the user logs off, if someone is logged on at that time).

      In short -- and it's difficult to say, as I'm in no way a fan of Microsoft (running Debian for two years now) -- Win2k does actually provide a robust and featureful means of remotely managing computers. And quite naturally, there are components for license management. The problem is, it's all so complex. In my group, we looked long and hard at SMS, and even licensed a copy of BackOffice, but we soon realized it was just beyond our scope to implement. It's hard to make the senior guys understand that in order to keep the machines up to date, you need to hire as many additional people as you do for email (Exchange). They say "but what did we hire you for?" The tools Microsoft provides are very powerful, more powerful than I think a lot of people realize, but they're just so complex that I don't think they offer much to the worked-his-way-up-from-tech-support admin. It takes months of planning and education to successfully implement and maximize any of these options, and I don't think many organizations can spare their top admins for that long.

      This is where I think GNU/Linux (specifically Debian) has a great chance, one I'm aggressively trying to push in my organization. All one has to do is set up a server with the Debian mirror scripts and run an in-house mirror that updates nightly (be sure to make a reasonable contribution if you're going to be downloading a lot). Then, using simple boot floppies with some scripts, you can boot and build machines with minimal configuration; they then download and install everything from your local mirror. All you have to do is set up the appropriate servers (once again easy with Debian), have each machine mount /home off a share somewhere, and you're good to go: a robust and nightly-updated (simple cron jobs) system.
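
      The nightly-update half of that is just a sources.list entry plus a cron job. A sketch, with the mirror hostname being a made-up example:

        # /etc/apt/sources.list on every client: point at the in-house mirror
        deb http://debmirror.internal/debian stable main contrib

        # /etc/cron.d/nightly-upgrade: pull the mirror's changes each night
        30 3 * * *  root  apt-get update && apt-get -y upgrade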

      To me, apt-get is a next-generation tool that significantly alters the paradigm of computer usage. Once you make the switch to apt, you never go back. It completely alters how one looks at building, managing, and upgrading PCs, and I think it and Samba are the two best selling points for Linux in the corporate IT world.

      Wow, sorry to have gone off a bit here, but it's Friday and I'm bored. As always, these are just my opinions, and your mileage may vary. Feel free to flame away; I'm interested to hear what people have to say.

      Mike
    • How does central application installation work under Windows? Is it even possible? How do they keep track of the licenses? Can you patch Office once and have the changes propagate throughout the network?

      Yes, it's definitely possible. There's a Microsoft product called SMS (Systems Management Server), and there are some great third-party tools like Ghost. Of course, none of these are Free (or free).

      I think that everyone has had a moment with apt-get. You've set up a new system, it doesn't have much on it, and someone sends you a zip file. So you say, "apt-get install unzip", and 20 seconds later you can unzip the file.
      In a windows environment, that works with zip (although it's definitely harder and slower). But what about Visio? If someone sends you a Visio document, you can't just download Visio.

      You're comparing apples with oranges. When someone sends a Visio document to your Linux computer, you also can't just download Visio. And, as you said, you can download a free unzip tool for Windows.

      We, on the other hand, can deploy a desktop that will download our diagram program on the fly when someone clicks on the file icon.

      I'm not sure whether I'd want software to install over the network automatically, as it can lead to totally inconsistent systems, or even the activation of mail-attachment style viruses (if the software is downloaded from the internet).

      Anyway, this is also possible with newer versions of MS software that use the Windows Installer service, which allows applications to be "advertised", AFAIK including setting file-type associations before the application is installed. When you have the setup files on a network server (I think Microsoft's license explicitly permits that), it can work just the same way. However, I don't know whether clients that have an application advertised, but not installed, need a license.

      What does that do to admin costs? (Or: what does that do to our jobs?)

      Nothing, because Linux doesn't magically give you a self-administering network. It will cut down license costs, might reduce hardware costs, and may increase reliability and security.

      But even with the automatic software deployment you described, you'll still need administrators to set that up before the clients can make use of it. The admins will just have to do less dumb work like walking to each user with the installer CD.

  • by sheldon ( 2322 )
    While I feel Microsoft's software is substantially better than any solution one could deploy with Linux, I do feel their licensing structures have gotten entirely out of hand in recent years.

    Competition on this level will cause Microsoft to revisit their pricing and become more competitive -- essentially causing the same thing to happen to MS as MS caused to Sun, Novell, Oracle, etc. when they came in and undercut those companies by half or more.
    • I feel Microsoft's software is substantially better than any solution one could deploy with Linux

      Please tell me that you mean that in the context of the article -- that MS would have been better in this case, if only the licensing were better. I could believe that. But if you mean to imply that MS is substantially better across the board, that's just absurd. I am doing things with PHP, Apache, MySQL, and Linux that NT and 2000 just can't do. Not "MS is a little weaker" -- MS doesn't even offer it. For example, PHP shared sessions on a server farm -- MS says "wait until ASP.NET!" And working with mod_rewrite for on-the-fly, behind-the-scenes rewriting of URLs (NOT the same as a redirect). And for that matter, the server farm itself -- with Linux and LVS, I put together an easy 3-box farm for $15,000, and it's faster than the $50,000 machine it replaced. That's superior technology.
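
      For the curious, the LVS half of that farm is only a few ipvsadm commands. A sketch with made-up addresses (10.0.0.1 as the virtual IP, three real servers behind it):

        # On the director: define the virtual HTTP service, round-robin scheduling
        ipvsadm -A -t 10.0.0.1:80 -s rr
        # Register the real servers behind it (NAT/masquerade mode)
        ipvsadm -a -t 10.0.0.1:80 -r 10.0.0.11:80 -m
        ipvsadm -a -t 10.0.0.1:80 -r 10.0.0.12:80 -m
        ipvsadm -a -t 10.0.0.1:80 -r 10.0.0.13:80 -m
        # List the table to verify
        ipvsadm -L -n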

    • But you can't undercut 0 or you'll get a divide by 0 error (or Overflow in Visual Basic!?! - how retarded is that???).

      Speaking purely of licensing, MS can never undercut free apps. That's one reason why they've been looking to other revenue sources.
  • This reads *exactly* like what my life was like, late '97 to late '99. Uglier and uglier NT network (we had roughly 35 NT domains with only 2000 users), more and more fragile services (mostly mail and printing because our file serving was from NetWare), higher and higher costs (and more and more time) to get anything done.

    I kept suggesting Linux (yes, back then). I even set up a non-crashing backup print server -- but I was the only one who used it regularly (of course, everybody used it about twice a week...). Unfortunately, three factors worked against me:

    1) Linux wasn't quite as big then as it is now.

    2) The network admin was nearly techno-illiterate. She could do the stuff she had been trained to in a couple of NT classes but nothing else. Linux scared her. And she wasn't the kind of person to educate herself to conquer fear--her method was to insult and ignore the source.

    3) We were about 1 hour from Redmond. It's hard to shield yourself from The Presence when you are that close.
  • I don't buy it (Score:4, Insightful)

    by erroneus ( 253617 ) on Friday August 31, 2001 @01:42PM (#2239849) Homepage
    Wow, I can read that in so many ways...

    First, I don't buy into the credibility of the story. I want to know hard information about this particular case study. While the generalities of the story ring basically true to my ears (probably because I want them to be true), the absence of referenceable specifics makes the story factually questionable.

    Second, maybe it's just my lack of experience on the matter, but there were some licensing costs there that I never even heard of before. Maybe it's simply because I never bothered to notice. But "I don't buy it" also means that I don't pay MS's licensing costs, so I wouldn't know. What I do know is that Microsoft has been riding on the momentum of accepted piracy for so long, and without a doubt it was intentional. It's like a drug dealer -- get 'em hooked and then charge them dearly later. Corporate America and hundreds of thousands of IT professionals are frightened to death of the "withdrawals" from Microsoft, and like an addicted smoker, they would rather pay the costs of continued use than kick a bad habit and do what's best for the "body."

    I'm all for MS Windows as a client, to be honest. It works good [enough] for the end user and it's damned easy. And since MS Office enjoys enough corporate ubiquity, it's still potentially damaging to use anything but MS Office where different companies do business together. HOWEVER, that has no bearing on the server side, which is exactly why the server has historically been an easier market to enter. The geniuses behind the Samba project are probably the biggest heroes in the story of Linux, as they enabled something that simply made it all work.

    So I'd like to see some follow-up with more specifics, such as what company this is and when it happened. Who from RedHat can confirm this story?

    I want to believe it so badly that I almost do. More importantly, I want something I can use later without looking like a moron unable to answer the practical questions.
    • So, Anandtech is a bunch of liars? I've seen enough of their unbiased reporting to believe what they put up on their site, at least as much as a commercial place like cnn.com.
    • You say:
      I'm all for MS Windows as a client, to be honest. It works good [enough] for the end user and it's damned easy.
      T'aint so.

      If what you mean is "Windows is easy for a non-technical user to use (with a skilled sysadmin handling the problems of keeping things running)", I think you're wrong. Windows is HARD, and unintuitive. So is KDE. The only difference is that most folks who've been forced to work with computers for a while have learned what buttons to push, on the Windows applications they use regularly, to do the things they do regularly. When a non-technical user gets that same "what button" knowledge on a Unix system, Unix is easy.

      The secretaries in the Statistics department here have Windows PCs on their desks, and use them largely to run X servers so they can connect to the Unix compute server. After training, they find it easier to get their work done using vi and plain TeX than using Windows applications. They do use IE for web surfing, since it works much better than Netscape 4.7x. They use other Windows applications too, where they find them easier than Unix (it was AIX when I was there), but much of their time is spent using vi.

      If you mean something like: "Windows makes lower demands on non-technical sysadmins", you might be right, though I'm not sure. I have had a hard time getting up to speed on managing my own machine at home, but it works far better now than when I ran windows. The learning time has been well spent, in my case.

      I am firmly convinced that, given a competent sysadmin to set things up right and keep them humming, and users with the same level of experience on the system, a *nix system will be at least as easy to use to accomplish useful work as a Windows system. It may well be harder to do some things that you did on an MS system, such as automatically running viruses, but I'm talking about getting work done.

      So I'd like to see some follow-up with more specifics, such as what company this is and when it happened. Who from RedHat can confirm this story?

      I also would like to see some specifics, but the City of Largo Adopts KDE 2.1.1 [kde.org] story shows that it is indeed possible to put Linux on the desktop, and the back end, of a fair-sized organization. They weren't switching from NT, but if you wanted to badly enough, I think this shows that you could. I would especially like to find out what Linux support and training are costing them.

      Any company is all sweetness and light with a new customer, until you buy. At that point, you're no longer a new customer; you're one of the people who get screwed to subsidise the sweetheart deals for the prospective new customers. MS and Pitney Bowes (and Friden-Alcatel, and Postalia, and ...) can play this game in a particularly mean way, since they get you locked in with a large investment which becomes worthless if you stop leasing (or purchasing upgrades for) their product. The great thing about Linux is that RedHat, SUSE, etc. can't get that kind of lock-in. If this story isn't true, I bet there's one just like it that is.

  • I am a little curious as to why everyone is so concerned about companies adopting Linux. I think I've heard all the arguments: it's good for the Linux community, it's good for the companies (and the economy), it whacks Bill in the balls . . . whatever. But in my opinion, the beauty of Linux lies in the fact that it is used largely by users who want to use it, not those who have to. And it makes no sense to me why you or I should care whether corp X uses Linux, BSD, Windows, or an old Lisp machine unless it personally affects us (through our jobs or investments).

    I am not trying to sound elitist -- I am not saying that "those not enlightened enough to use Linux should not use it." What I am saying is that mindshare, both in terms of users and corporations, is rather irrelevant. Besides, if you believe that Linux is perfect for everything (and I don't -- my Windows machine is a great equivalent of my Dreamcast), then those corporations who use Linux will have lower costs and a competitive edge, resulting in economic success and in the displacement of Windows-using companies. If this is what's happening now with the adoption of Linux, it makes no sense for us to care about it as anything more than a vindication of the OS, and I think there are very few people at Slashdot who need convincing.

    What saddens me is the decline of the hacker ethic and the change of emphasis from "let's make it better so people use it" to "let's yell louder about how good it is so people use it." And what saddens me even more is that I am wasting time writing this and not coding . . . I guess I am being a little hypocritical. But still, I am convinced there is no reason to cheer a company's adoption of Linux and boo after hearing "Windows." The reason people cheer at football games is that they can't come down to the field and help out. Well, in the case of Linux, we can.

  • by John Murdoch ( 102085 ) on Friday August 31, 2001 @01:49PM (#2239898) Homepage Journal

    Hi!

    Like others, I'm a bit disturbed by the anonymous "case study" presented in this article. I'd feel a lot more comfortable knowing who the company is, and having some third-party verification that such a change actually took place.

    But there's no denying the central argument: Microsoft's licensing fees have jumped dramatically in price, and the terms of their licensing agreements have gotten substantially worse. Yesterday, for instance, I received an email from Microsoft regarding SQL Server licensing. In short, I have till October 1 to upgrade all of my SQL Server 7 licenses to SQL Server 2000 -- or I lose the right to the "upgrade" price for SQL Server 2000. If I choose to upgrade after October 1, I will have to pay the full retail price.

    I'm a big believer in the concept of "don't fix what isn't broken." While the move from SQL Server 7.0 to SQL Server 2000 isn't a big deal (at least for our SS7 applications) I see little reason to spend bucks upgrading server databases that don't need to be changed. But if I need to migrate those down the road, I'll have to pay substantially higher fees--the pay-me-now-or-pay-me-later demand from Microsoft just infuriates me.

    But the licensing problem gets worse. Microsoft has dramatically raised their prices and dramatically restricted their terms. Case in point: we're starting to develop a project for a small startup non-profit organization. This is a group that does physical therapy on horseback for handicapped kids--they used to be part of Easter Seals, but Easter Seals has dropped them. (Long, sad story.) They're on their own, and they need to get organized. We want to help them (we're working pro bono publico) and we're recommending a "virtual office" concept. Don't build/buy/rent an office building: instead, let volunteers and paid staff function from home. Manage the office functions in a web application, handle the phones with call forwarding and related telephony stuff, and so forth--it's the 21st century, and there's lots of cool things we can do to hold costs down so program funds can be focused on kids and horses.

    Sounds great, right? Except--we run right smack into Microsoft licensing. We're a Microsoft shop--and part of the benefit of doing pro bono projects like this is the hands-on experience we get with new development tools. This would be the perfect project for Microsoft's dot-Net technologies. That is, until we go live--and have to pay $2500 per processor for the server license for the OS, and another $2500 per processor for the SQL Server 2000 license. I'm entirely willing to develop the site for Equi-Librium pro bono--I am also willing to pay Microsoft a reasonable fee for the software we'll use. But five thousand U.S. currency one-dollar simolians is most definitely not a reasonable fee.

    So this lets-all-get-experience project may well get done with PHP, PostgreSQL, and FreeBSD. And when we're done we'll have experience with a bunch of non-Microsoft tools, and we may have a different answer for clients who want scaleable applications but can't (or don't want to) pay Microsoft's fees.

    Despite the propaganda, Microsoft didn't win the PC wars by skullduggery or deceit. They won by targeting the "influential end user" (their words) and providing lots of information. Software consultants are precisely the kind of people that Microsoft has depended upon, and we've been a very loyal Microsoft shop. We've benefitted enormously from the Microsoft Developer Network program, and we've steered a lot of clients to Microsoft-based solutions (and thus Microsoft operating systems) over the years. But Microsoft's pricing, and licensing, and upgrade policies have us -- among the most loyal of Microsoft loyalists -- actively questioning our relationship with them.

    John Murdoch
    Wind Gap Technology Group [www.windga...]

    • Y'know, I think that's the most surprising comment I've read here in a while, because I remember arguing with you about the costs and benefits of Microsoft technology a year or so ago, John. I respected your opinions at the time because you were really able to back them up, and I have to say that I still do. And if now you're thinking about other alternatives (including FreeBSD on the web server, I see :), then maybe Microsoft really does have a problem.

    • I don't see how offering existing users an incentive to upgrade now rather than later is some sort of evil campaign to raise licensing fees. Lots of companies offer incentives to upgrade; if there isn't a time limitation what's the incentive?

      In fact, the MSRP for SQL 2000 with 5 client licenses is $1499.00; no change from SQL 7.

      If your charity is 501(c)(3) certified, then they qualify for MS charity pricing. Your $10,000 solution is more like $2000 -- with seat licenses -- and many companies get the stuff for free if they apply to MS.

      • Hi!

        Forced upgrading: it isn't evil--but it is most certainly designed to generate licensing fees. Look at Microsoft's own words: "join the Software Assurance program or face substantially higher upgrade prices in the future."

        SQL Server 2000 pricing: the 5-client license price is immaterial: this is a web app, so I have to buy the per-processor license. Charity pricing: we'll look into it, but we're going to host the solution--not Equi-Librium.

        We're still a Microsoft shop--but Microsoft is forcing us to look at other options because of their recent pricing moves.

    • But the problem is that no one "NEEDS" to upgrade to SQL 2000. Windows 2000 Server has code written into it to not allow SQL 6.5 to run. It's not that it won't run; it's built-in obsolescence.

      99.9% of all SQL uses do not need an upgrade past 6.5; there is absolutely no need unless you need those super-advanced added features. It's as stable as 7.0 and 2000, and scales the same (horribly). Just as there is absolutely no reason to upgrade from NT 4.0 -- you don't gain any extra features that are required for security or usage. (In fact, 2000 is just as bad as 4.0 in security: you cannot lock down a machine in a domain environment.)

      Microsoft is shoving every one of their products down everyone's throats. They threaten you by taking away the "discounts" and try to scare you.

      Me? My servers are going to stay at NT 4.0 until they go to Linux. They will NEVER go to 2000 or XP, because both of those OS upgrades offer nothing but fluff... and being an offshoot part of corporate, I can do this... Sometimes it's good to be the bastard stepchild of the company.
  • by gelfling ( 6534 ) on Friday August 31, 2001 @01:57PM (#2239945) Homepage Journal
    Isn't it the final irony that the biggest, wealthiest, and some would say most sophisticated companies will be the biggest consumers of NT/2K/XP, while everyone else just gets by with fast, good, reliable, stable, safe open source? Fortune 500 firms will be able to afford all the convolutions of Windows code and will smugly assume that they're getting the best bang for the buck. They're not that sensitive to support costs, so they'll be fat, dumb, and happy. Smaller firms, nonprofits, and the like will use anything but Windows code.

    But the biggest irony of all will be that MS will finally be an enterprise provider not because their stuff is any good but because large companies can afford it.
  • by blakestah ( 91866 ) <blakestah@gmail.com> on Friday August 31, 2001 @02:35PM (#2240158) Homepage
    Microsoft haters still have something to worry about. The company operates with a 40% profit margin. Only the mob and the phone company can get away with that kind of margin.

    What this means is that Microsoft could substantially reduce all their prices and still make a reasonable margin - one comparable to other companies like AOL whose margin is 1%.

    All Microsoft really needs to do, as free competition arises, is reduce their price structure enough to keep the free solutions out, because it costs too much to switch. This cost of re-tooling will ring true with CTOs, and they will be quite happy to keep paying what they've been paying.

    However, Microsoft wants it all. The new licensing strategy with XP intends to increase company gross by 60% over the next 5 years or so. Or kill it, one of the two. But a monster with 30 BILLION dollars hard cash in the bank is pretty hard to kill. They can come back failure after failure if necessary, and still buy all their competitors.
    As to the credibility of the story, I find it entirely believable. One of the large issues is that the story compares fairly incompetent NT engineers with competent Linux ones. Even so, server administration requires much less admin time on Linux -- we estimate it is a 3-to-1 difference.

  • My School (Score:2, Interesting)

    by Beowulf_Boy ( 239340 )
    Last week I became a tech guy at my school. They called me 3 days before school started and asked me to help set up 80 new computers. While I was putting Windows on a few, I set up Linux on one and showed it to the tech director. He was really impressed, and now I get to set up 2 labs of 30 computers apiece, and find out what happens from there.
  • by ctid ( 449118 ) on Friday August 31, 2001 @04:49PM (#2240901) Homepage
    Just in case nobody has posted this yet, the author of the article at Anandtech explains that there's an NDA in force [anandtech.com]. It'll be eighteen months before he can reveal the name of the company. You'll have to search for "Paul Sullivan" to see his comment.
