Linux Business

Gartner Group Squints At Future OS Growth 222

Icebox writes: "Cnet is offering up this bit from GartnerGroup that includes their predictions for the next few years in the OS market. Their predictions are aimed strictly at the business side of things, but it is interesting to see how their ideas stack up against what Slashdot's readership expects. Pay particular attention to Factor #9."
This discussion has been archived. No new comments can be posted.

Future OS Growth

Comments Filter:
  • My cousin is the CFO for a startup company called I-tech (shameless plug) and he actually went to college with the CEO of the Gartner Group. He has a million stories about how this guy was a real poser and didn't know his arse from a hole in the ground. Again, this is all heresy but my cousin is a real straight shooter so I believe him.

    So, do you think the CEO writes these forecasts?
  • by 11thangel ( 103409 ) on Thursday November 02, 2000 @08:23AM (#654863) Homepage
    While many users may consider Linux more powerful, etc., they often forget to consider the learning curve. Unix systems are harder to use, let's face it. Linux, BSD, etc., will continue to be powerful and crash less often, but Windows, being more "assume the user is an idiot when programming" oriented, will continue to have a good part of the market. IMHO, Unix should be used for servers and for the workstations of advanced users, at least until the interface becomes more standardized and easy to use.
  • From the end of the article:

    The total-cost-of-ownership argument on behalf of Linux will disappear (Unix platform vendors like IBM and Sun already offer their Unix OSs at virtually no charge).

    *COUGH* Microsoft *COUGH*
  • by blogan ( 84463 ) on Thursday November 02, 2000 @08:24AM (#654865)
    They don't forecast the rise of the Amiga? What? Didn't they even attempt any research? :)
  • by Anonymous Coward
    10) After the Justice Department demands the breakup of Microsoft, Gates and Ballmer sequester themselves in Redmond with thousands of the company faithful.

    Negotiations break down after several days, and a fierce assault by BATF troops is quickly repulsed by the computer controlled defenses at Microsoft.

    A second assault two days later succeeds, and Gates and Ballmer are led away in handcuffs. The central defense computer is found with a BSOD.
  • by Soko ( 17987 ) on Thursday November 02, 2000 @08:28AM (#654867) Homepage
    The big hardware players like IBM, HP and Compaq are likely tired of kowtowing to Bill & Co. every time they sell a machine. Linux means they can free themselves from any outside control in product development. It also levels the playing field for everyone's hardware. Lastly, Linux is buying them mindshare from the people they need to support their products - sysadmins, developers and IT personnel in general - by introducing them to a *NIX variant, they get a toe-hold for their proprietary versions of *NIX. No wonder they're all solidly behind Linux - it's good business.
  • by update() ( 217397 ) on Thursday November 02, 2000 @08:31AM (#654868) Homepage
    Check out Gartner's previous pronouncements on this subject [slashdot.org]. In particular, read Gartner Slams Linux [slashdot.org] from just over a year ago. I wish they'd explain what new development has caused them to upgrade Linux's prospects from hopeless to unstoppable. XFree86 4? TuxRacer? Gnome Foundation press releases?

    Of course, the 2.4 kernel is due soon. But that was true last October, too. ;-)

  • The author of this report seems to assume that eventually the market will be divided between 2 or 3 dominating OS's, but that doesn't seem plausible. I mean, if all, or even many, users were likely to take the path of least resistance and go with the biggest, most widely supported systems, Linux itself would not have gotten very far.
    Much as I like Linux, I don't think it alone will satisfy those who reject Windows.
    --from a not-so-secret fan of BeOS
  • I think we shouldn't pay attention to the figures released by the Gartner Group. Why?
    Because you can't aim a prediction strictly at the business side. The business side depends on the customers' opinion, and the customers' opinion (most of them love Linux) is not easily predictable. Maybe Linux is not as user-friendly as it should be, but that is a question of time and a separate issue.
    Predictions from companies like Gartner are also quite dangerous, because Gartner (or other companies) may favor particular vendors and try, with their predictions, to influence the market.
  • by Chalst ( 57653 ) on Thursday November 02, 2000 @08:36AM (#654871) Homepage Journal
    The findings seem plausible to me, but they declined to comment on the new MS strategy. One way of looking at the .NET roll-out is that MS have given up their strategy of trying to convert heterogeneous systems over to MS-only systems, and have attempted to go for peaceful coexistence.

    It's a dangerous strategy if OS X takes off, since now there is a
    UNIX-like OS with a first rate desktop / GUI development environment.

  • My cousin is the CFO for a startup company called I-tech (shameless plug) and he actually went to college with the CEO of the Gartner Group. He has a million stories about how this guy was a real poser and didn't know his arse from a hole in the ground. Again, this is all heresy but my cousin is a real straight shooter so I believe him.

    So you believe that the CEO is a poseur, while still accepting that the Gartner Group is your true faith, thus committing heresy?

    --
  • by the_other_one ( 178565 ) on Thursday November 02, 2000 @08:40AM (#654873) Homepage

    According to the article BSD will cease to exist. Or at least remain completely unnoticed by journalists.

  • by dillon_rinker ( 17944 ) on Thursday November 02, 2000 @08:40AM (#654874) Homepage
    Linux shipped revenue in 2005 will approach 20 percent of the revenue of Unix and 17 percent of that of Windows.

    This, of course, says nothing about the number of units shipped or the number of boxes with Linux installed, as the "shipped revenue" of Linux can be as low as the purchaser desires.

    well-established OS environments that continue to benefit from the research and development resources of Microsoft, Sun Microsystems, Hewlett-Packard and IBM,

    As nearly as I can tell, Linux benefits from their research as well. The lag is a little higher, but the 'geek effect'* means that Linux benefits from any research that benefits any other operating system.

    Much of the beneficial backlash Linux has gained at Windows NT's expense will dissipate by 2002, forcing the Linux community to refocus and re-energize its campaign for wide corporate acceptance.
    Yup. Market growth of any non-MS OS must eventually slow, as only hardcore MS-ites will continue to buy MS products, and everyone who's convinced that MS is non-optimal has already stopped buying MS products.

    Interesting article, but really only worthwhile from the point of view of business people and marketroids. Every business in the world could drop all support of Linux, and guess what?
    It would still exist and grow and improve.

    *geek effect: Geeks work in technical places. Geeks like Linux. Geeks learn new technical things in technical places. Geeks say "Hey, let's add this to Linux!"
  • by fragermk ( 96318 ) on Thursday November 02, 2000 @08:41AM (#654875)
    This report says nothing about how Apple products will affect things. Don't forget, way back in 1979 Steve Jobs brought the personal computer to the mass market. Apple is about to release a modern graphical operating system with a true Unix core. My personal prediction is that Apple will steadily gain marketshare in three critical computing markets: small to medium scale data serving (webserving), home clients and business clients. I see Apple's (not Linux's) marketshare equaling Microsoft's by the year 2005, if not sooner.
  • Gee, machines with a decent OS. What a concept... (Though I saw no mention of OS X (Darwin) in the mix, but that's essentially correct too.)

    Look for consumer machines to run Linux (PC style) and OS X (Mac style), while Windows gets squeezed further and further - unless they give up and go with Linux like Apple went for BSD/Darwin (not {expletive deleted} likely? To survive, M$ could mutate and metastasize - they already make more money from their investments than from their OS!).

    Now are we surprised?
  • Shouldn't that be "when programming assume the user is an idiot"?
  • by Keck ( 7446 ) on Thursday November 02, 2000 @08:43AM (#654878) Homepage
    While many users may consider Linux more powerful, etc., they often forget to consider the learning curve. Unix systems are harder to use, let's face it.
    Yeah, but I'm reminded of various fortunes/quotes:
    • "Unix is user-friendly -- nobody said it was learning friendly..."
    • "Unix is user-friendly -- it's just selective about its friends..."
    • "The learning curve for *nix may be steeper, but the view is better once you scale it."
    • "The learning curve for *nix may be steep, but at least you only have to climb it once!"

    And how true that is! Unix gurus from 20 years ago at least know where to start with Linux, as opposed to a DOS user sitting down to Windows ME, 2000, etc. There is less fundamental difference between any two (or three) *nix "variants" than between any two Micro$oft OS's -- DOS, Win9x, ME, 2000, NT, WinCE (the best-named MS OS ever) and now Whistler...
  • by mikeee ( 137160 ) on Thursday November 02, 2000 @08:45AM (#654880)
    The point of Gartner reports isn't prediction, or somebody would have noticed by now that they're right out to a timeframe of exactly one corporate budget cycle and further out are about as accurate as my magic 8-ball.

    Their real purpose is to codify and legitimize the conventional wisdom among IT executives by making it expensive and official. For only $5K/year you, too, can CYA!
  • by rxmd ( 205533 ) on Thursday November 02, 2000 @08:48AM (#654885) Homepage
    One factor they ignore is the general Microsoft-mindedness of computer users, especially corporate users. For example, while it is known that StarOffice and to some extent KOffice [koffice.org] (even though that's still in beta and is going to remain there for more than just a while) are as compatible as can be expected with Microsoft applications, the average user still thinks it is best to stick with MS Office for "compatibility reasons". The same applies to Internet browsing, and to some extent to OS choice as well ("I've got Windows at work, and I know how to use it, so I'll stick with it at home as well" - swap work and home, same effect).

    Another thing they ignore is that computers are not only used for productive work. This was to be expected from Gartner, them being analysts, but the fact remains that Windows is a better platform for games and will remain so, especially with the anticipated rise of technologies like the X-Box [xbox.com].

    A final thing they ignore is that OS choice appears to be less deterministic than one thinks it should be. While a number of people know that non-MS platforms are better for some applications, they still use Windows; and although Windows does have its advantages, some geeks would rather be hanged than use it, sticking to sometimes less comfortable and/or less powerful solutions.

  • by molo ( 94384 ) on Thursday November 02, 2000 @08:50AM (#654887) Journal
    I just wanted to point something out here. While I agree this does seem to be a good endorsement for Linux's role, I have to wonder about some of these conclusions. One in particular was interesting to me:

    7. The performance advantages of RISC over Intel-based servers will decline by about 20 percent to 30 percent each year [..]

    They mention nothing to back this up, and it just plain doesn't make sense to me. This kind of conclusion, without any presentation of the reasoning behind it, makes me take this whole report with a grain of salt.
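    For what it's worth, here is what the quoted claim would imply if taken at face value - a quick compound-decay sketch. The starting lead is hypothetical, since the report gives no baseline:

```python
# Illustrative only: suppose RISC servers hold a hypothetical 50%
# performance lead over Intel-based servers, and that lead "declines
# by 20 to 30 percent each year" (compounding, as the report implies).
def remaining_lead(initial_lead, annual_decline, years):
    """Lead remaining after `years` of compounding decline."""
    return initial_lead * (1 - annual_decline) ** years

for rate in (0.20, 0.30):
    lead = remaining_lead(0.50, rate, 5)
    print(f"After 5 years at {rate:.0%}/yr decline: {lead:.1%} lead remains")
```

    Even on these made-up numbers, the claim amounts to the RISC advantage becoming marginal within the report's five-year horizon - which is exactly the sort of assumption one would want to see justified.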
  • by Anonymous Coward
    1) Microsoft .NET will crash and burn, leaving a crater the size of Texas, after having millions poured into its development. Nobody wants remote storage of data and software. Look at the JavaStation for an example. Microsoft will lose more market share as they scramble back towards Windows NT-based OSes.

    2) Linux will still be around, but the distros will have to rethink their strategies. RedHat shot themselves in the foot with that garbage they released as 7.0. Cutting edge is one thing, stability and compatibility are another. Debian is on the right track, maybe give them a slight boost in the release department. Stability is one thing, staying up to date is another. I predict that IBM and other hardware vendors of the like may release their own versions of linux and give back some to the community.

    3) Linux moves more into the desktop. KDE2 is nice and may sway more companies to migrate. Look for non-profits and little-profits that are computer-savvy to make the migration. Yes, the learning curve is greater, but the cost savings alone allow for hiring Joe Highschool for occasional maintenance and consulting. Yes, even the non-profit discount still requires a decent cash outlay for MS, etc.

    4) FreeBSD and the like make heavy inroads in the server area. I'm still waiting for a distro of Linux to install on a server that doesn't require gigs of drive space. Yeah, I know you can do min installs and then install what you need, but xxxBSD just seems cleaner to install. And you have to love the ports.

    Just remember, Linux and Unix are now at the place where MS was in the early 90s. Their OS was powerful but difficult to learn. Eventually Linux and Unix will get to the point where the normal user never sees the shell prompt.

    IOW, Microsoft less, linux/unix more.
  • by Otterley ( 29945 ) on Thursday November 02, 2000 @08:53AM (#654892)
    The total-cost-of-ownership argument on behalf of Linux will disappear (Unix platform vendors like IBM and Sun already offer their Unix OSs at virtually no charge).

    What the Gartner Group doesn't understand about TCO (or at least fails to recognize in this report) is that the price of the OS license is only a tiny fraction of TCO. Here's what I think really matters in the calculation of TCO:

    Primary documentation. Is the operating system well-documented? Are the manuals (or man pages) accurate, well-written, clear and concise? Linux doesn't score particularly well in the man page department, unfortunately--I think it loses points here. However...

    Secondary documentation. How good is the secondary, user-contributed documentation out there relative to other operating systems? Are there well-written, current HOWTOs, user guides, tips and tricks, etc out there? If so, how well are they organized? Linux scores very high here. Other UNIXes are not so lucky. Windows has a lot of users out there--there are a lot of helpful tips on the Net but the documentation is not organized well.

    Source code availability. It's an old adage that "the source code is the best documentation," and it's hard to argue otherwise. If I really want to know how a certain function call works, or how the kernel is talking to the hardware, I can dig out the glibc or kernel sources and see for myself, and be 100% certain of the accuracy of my conclusions (well, as long as my understanding of the code is correct). Once again, Linux and other free OSes score high here. With commercial UNIXes and Windows, you have to trust the documentation -- which is often not 100% accurate. In addition, if there's just a bug that needs to be fixed, it's often much easier to fix it yourself if you have access to the source code. Waiting for Microsoft or a commercial UNIX vendor to get a patch out to you can be painfully costly if your product or service's success is dependent on your software vendor's turnaround capability.

    In summary, what I think really counts the most towards TCO is the relative understanding of the managers who plan system rollouts and the administrators who maintain them once they're out there. I've seen way too many Windows NT admins out there banging their heads on their desks because something doesn't work the way the documentation says it will, and they have no hope of getting it right because Tech Support doesn't know either. This leads to a lot of costly experimentation (labour is much more expensive than hardware) to make things work.

    On the other hand, while Linux documentation isn't always perfect, it's plentiful, it's reasonably organized, and if you just can't find what you need from the HOWTOs or the man pages, you can always fall back on the source code.
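    The parent's point - that the license price is only a tiny slice of TCO - is easy to see with back-of-the-envelope numbers. All figures below are hypothetical, chosen only to show the proportions:

```python
# Hypothetical per-server costs over a three-year service life.
# The point is the ratio between the items, not the dollar amounts.
costs = {
    "OS license": 800,              # ~$0 for Linux; assume a commercial OS here
    "hardware": 3000,
    "admin labour (3 yrs)": 30000,  # troubleshooting, patching, experimentation
}
total = sum(costs.values())
for item, dollars in costs.items():
    print(f"{item}: ${dollars} ({dollars / total:.1%} of TCO)")
```

    On numbers anywhere near these, labour dominates, and zeroing out the license line barely moves the total - which is why documentation quality and source access matter more than sticker price.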

  • Gartner makes hundreds of reports of this type, and the people writing them differ, so if you get a pro-Microsoft analyst, then you'll see that viewpoint come across.

    I've read enough Gartner reports to know that most of what they say is:
    a. common sense
    b. too general to be of any use to anyone (of course you can actually hire the analyst who wrote the report for big $$$$ to get a more specific report for your company/industry)
    c. tied up in complex diagrams
  • by chazR ( 41002 ) on Thursday November 02, 2000 @08:58AM (#654897) Homepage
    Over the next five years, a large number of recent graduates who are in sysadmin positions will start to rise to positions of greater purchasing power in IS departments. Many of these people have grown up with Free operating systems.

    Additionally, new recruits into corporate IS departments will also have had significant experience of Free operating systems at University.

    Together, this means that a lot of the traditional barriers to Linux/*BSD in the server room will disappear.

    Coupled with the increasing quality of desktop tools for X (Gnome, KDE, StarOffice, KOffice etc) this *may* cause a gradual acceptance of Linux etc. on the corporate desktop.

    Happy days ahead.... - Mind you, I have been wrong before, and the Gartner Group are not exactly perfect.
  • This report was aimed at your boss, not at you. Apple is carving out its own niche and it is NOT in the boring 9-to-5 world of the work office.

    Apple can rule the living rooms and professional home-offices with style, dash and panache.

    If you're enjoying something, do you want to wrestle with the failings of the OS, or would you rather just enjoy it?

    If you're creating something or advising someone, do you want to wrestle with the failings of the OS, or would you rather just get it done?

    Linux boxes will rule the garages, "we mean business" home-offices and tinkerers' shops.

    If you're tinkering at something you want to do, do you want to wrestle with the failings of the OS, or would you rather just get something accomplished?
  • A way of looking at the .NET roll-out is that MS have given up their strategy of trying to convert heterogenous systems over to MS only systems, and have attampted to go for peaceful coexistence.

    This is a possibility. It is also possible that they are trying to displace Java and take all of Java's promises away from Sun. Right now, Sun is poised in a very interesting position, and I am sure that MS strategy revolves around Sun to some degree.

  • It's a dangerous strategy if OS X takes off, since now there is a UNIX-like OS with a first rate desktop / GUI development environment.

    OS X is awesome, simply put. It puts *BSD right in front of the most GUI-driven group yet - the Mac users. The OS itself is great, but the cost of Apple hardware will keep it from spawning into the next Good Idea(tm). Comparably equipped PCs cost about half the price of a Mac. I want a G4 Cube in all its glory running OS X, but the cost of the thing plus its non-upgradabilityness (word?) pushes me away. I think this is also what will keep OS X from overtaking the desktop, as it very well could.

    "You'll die up there son, just like I did!" - Abe Simpson
  • I wish they'd explain what new development has caused them to upgrade Linux's prospects from hopeless to unstoppable.

    They did - increased vendor support from the mid-range server market players like Compaq, Dell, and IBM.

  • Most likely a lot of developments that they hadn't foreseen, like the widespread adoption of and corporate buddying with Linux. Last year, Sun was saying that Linux was awful and going nowhere. This year they say that a Linux sale is a Unix sale, and because a Unix sale isn't a Windows sale, a Linux sale is good for them. IBM last year was very contemplative of Linux. This year it's going out of its way to support them. And last year, Microsoft wasn't an official monopoly, so many companies were afraid to step forward and attempt to support Linux.

    I don't think it's any new software that changed Gartner's view of Linux. Not kernel 2.4.pre4, XFree 4.0, Apache 2.0, Perl 6, KDE 2, Gnome, et al. It's just that last year, everyone was very skeptical of Linux and very wary of Microsoft. This year, people are interested in Linux and less afraid of Microsoft, so they're willing to experiment a bit more than they would have a couple years back.

    My, what a difference a year makes!
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Thursday November 02, 2000 @09:04AM (#654905) Homepage Journal
    Huh? They think that Monterey is actually happening? That is very optimistic.

    Thanks

    Bruce

  • by Tony Shepps ( 333 ) on Thursday November 02, 2000 @09:05AM (#654906)
    I had access to their research in early 1998 when I worked at a big-5 consulting company. Even in that time frame, Linux was not on their radar screen. Anyone who has access to their archives, please feel free to do searches on Linux for March 1998 and earlier. I'll wager you won't find anything more serious than a casual mention.

    At one point during that time, they predicted that four Unixes would survive. I believe the winners were Solaris, HPUX, Digital Unix, and SCO. That's right; March 1998 or so and Linux was NOT EVEN MENTIONED AS A PLAYER, much less a survivor.

    Now, they sprinkle notes like "0.7 probability" throughout their predictions, so they have an out, but one would rather they show more of their work.
    --

  • The author of this report seems to assume that eventually the market will be divided between 2 or 3 dominating OS's, but that doesn't seem plausible.

    Prior to this year, the computer landscape looked like:

    Microsoft 94%
    Apple 4%
    Everyone else 2%

    so a shift to

    Microsoft 60%
    Linux/Unix 35%
    Apple/everyone else 5%

    is a very radical shift.

    Of course the world will never fully consist of two OSes and only two OSes. It's just that the most popular and visible OS's will change in the coming years.
  • Okay, so let's say that in 2005 we've got a 50/50 split between Windows and *nixes. What I expect will happen is that support for some of the major Unixes will fall off as the market consolidates. My guess would be that HP-UX will be the likely victim of this. Since HP makes its real money on selling hardware, to make their hardware palatable they'll begin cannibalizing HP-UX in favor of Linux, which is already getting better support from vendors.

    So then even if you have the same 50/50 split between *nixes and Windows, I think you'll see Linux slowly start to gnaw away at the proprietary Unixes. Eventually, as these OS's lose market share, they'll divert their existing IP and research efforts into Linux (see also SGI). With each fallen Unix variant, more energy will be diverted into making Linux better.

    So, if my theory holds up, not too far down the road we'll be down to four major enterprise OS's: Windows 2000, Linux, Monterey, and Solaris. So then who's next? Monterey. IBM has invested much in that product, but they've also invested heavily in Linux. Why waste resources on two development efforts when one is clearly growing and beginning to look like a dominant player? So Monterey fades, and you are down to three.

    Sun has been downplaying Linux on big systems, but as this all evolves, if Linux keeps moving full steam ahead, those arguments will cease to be plausible. Sun may resist for a while, but with the majority of enterprise Unix work running on Linux, it will be hard to argue the business case for backing Solaris any further.

    So then it will be just Windows vs. Linux. I dunno, come back here in 10 years and see if I'm right :)

    ---

  • Apple's resurgence won't significantly affect the markets that this study predicts.

    Apple's only server-based products, OS X Server and AppleShare IP, aren't targeted at the high-end server markets that IBM, Sun, and Compaq are jousting for, and at the low-end, neither product has the installed base of Windows NT or the 'geek chic' of Linux. OS X Server and ASIP probably will have more resilience to migration than Novell Netware, since a company wouldn't invest in either Apple product without a specific reason (say, the ability to NetBoot Mac clients from an OS X server, or ASIP's native support of AppleTalk). Until those needs can be met by a competing product, it's unlikely that current installations of OSXS or ASIP will fall.

    On the other hand, Apple can remain a very successful and profitable company without selling a single product in the high-end server market. All those servers have to have somebody to send data to, after all...
    --

  • Comment removed based on user account deletion
  • Too bad they think that the Unices can match GNU/Linux in TCO, just because the vendors give away their software for `virtually no cost'.

    Why is it so difficult for reporters to understand that the cost of acquiring the software for a GNU/Linux system has nothing to do with the total cost of ownership? That gratis software does not necessarily mean that bugs get fixed within the hour, etc.

    No wonder they are having a hard time predicting the Free systems' growth :) Oh well, maybe this time they are less far off than last. In a few decades they may even learn.

  • They don't cite references, but they are essentially correct.

    As the gap in actual core functionality between so-called RISC and so-called CISC architectures continues to narrow, there will be less of a performance lead for RISC architectures. The Pentium III / Athlon / PowerPC G4 architectures are actually remarkably similar - the main differences lie in the translation logic that Intel and AMD use to translate old CISC instructions into RISC-like internal operations. Besides that, "RISC" processors have once again started adding layers of new functions that don't fit the RISC model - the PowerPC G4's Velocity Engine is a good example.

    Ars Technica [arstechnica.com] has some good articles on these issues, but they're not on the front page anymore.

    RISC vs. CISC is an outdated argument - the good debates in the future involve NUMA and other on-the-horizon technologies. This study doesn't really take those into account (beyond a brief mention of IA-64).

  • the report doesn't really seem to appreciate the complexities of free software and the unix world - they focused totally on commercial unices and linux, forgetting about freebsd, openbsd, etc... additionally, there is no accounting for the effect open source has on buying decisions. sure, AIX, etc. are virtually free with hardware purchase, but that doesn't mean you can re-compile the kernel for a smaller memory footprint, fix bugs in-house, or any of the other options open to users of free or open source software...
  • Much of the beneficial backlash Linux has gained at Windows NT's expense will dissipate by 2002, forcing the Linux community to refocus and re-energize its campaign for wide corporate acceptance.

    Gartner factors in the transition to Windows 2000, but clearly has failed to factor in the replacement of the Windows NT backlash with newer, improved Windows 2000 backlash. Even more users upset than with previous versions!
  • KOffice was released last week as part of KDE2.
  • Many of the assumptions made here assume that Linux is like a business, that it will respond to market demands as they change (such as the suggestion that, as Linux's advantage over NT4 dwindles with new 2000 installs, Linux will adapt to improve itself). And while there are businesses based on Linux, Linux itself is not a business and therefore will not act like one. Therefore, it's impossible to make predictions about how Linux behaves, only about how Linux may be used by others.

  • The Gartner Group has had a history of gassy emanations about the future of Linux which have proven wrong. I think the problem is that Linux simply doesn't fit into a mold that makes it predictable by some market research firm. Linux's strength has less to do with the number of units shipped than with being a great idea that's taken hold of bright people worldwide. Linux is a way of thinking, not a new product created by some marketing department. I see it as essentially a reaction by engineers who are tired of being dictated to by marketing departments.

    It's this idea that makes Linux act like an amoeba. It constantly changes shape and slithers around into all kinds of new and unexpected market sectors. I'm constantly surprised and gladdened at the vast array of applications Linux is put to. That's what you get when you have access to all the source code: amoeba-like flexibility. You can't have this in a commercial product. Vendors want to sell you a black box and don't want you trying to look inside it. Commercialized software is inflexible.

    I expect someday soon Microsoft is going to wake up and realize that Linux pseudopods have occupied all the markets they want to enter. Even worse, big nasty pseudopods are slowly slithering and inching their way into all the markets Microsoft now occupies. They also won't be able to fight it, because it's not possible to truly fight such an amorphous enemy. The best they'll be able to do is whine about "that big slimy monster" that has no head to chop off. But we Linuxers will just grin and keep pushing the pseudopods forward. Maybe the Linux mascot shouldn't be a penguin, but an amoeba.

  • I can't see Apple replacing Microsoft. Even if it did we would merely have traded a big closed monopoly for a big more-closed monopoly.

    Macophiles rarely seem to recall the real reason why the Apple computers, which were far superior, fell to the DOS/Intel machines: the fact was that the Win/DOS/Intel machines were a lot more open, while the Apples were pretty much black-box.

    The reason Wintel now feels a threat from Linux is that Linux is to Windows as Windows was to Apple (In terms of openness and accessibility).

    Unless Apple decides to really open their box and allow anyone to make hardware/software compatible with theirs, they have no chance of taking over the market whatsoever.

    Toppling the WinTel hegemony is tough - akin to overthrowing the telephone company, or competing with the interstate highway system. The only way another OS could do that is to be shockingly better for the tasks a desktop user wants to do. Ironic that success in the business arena depends upon how well a platform supports games.

  • That's the conventional reading ;->

    But seriously, that MS decided to compete with Java on its own terms means that they think the write-once-run-anywhere philosophy has more appeal than the ours-is-the-richest-environment-for-any-task line they used to peddle. I'm guessing the difficulties they faced making entry into the enterprise, and the generally cool reception that W2k-alpha received, are precisely the reason for this change.

  • by Ami Ganguli ( 921 ) on Thursday November 02, 2000 @09:21AM (#654924) Homepage

    Gartner's clients are CTOs and managers who quote Gartner reports in order to justify pet projects. It works like this:

    1. Techies play with cool stuff.

    2. The techies start whispering into the ears of their managers about the cool stuff they're playing with. The techies know not to challenge the status-quo too much or they'll be ignored, so the managers only really hear about moderately cool stuff.

    3. Managers (many of whom are has-been techies) start to daydream about the cool stuff the techies have mentioned. Some of it sounds like it might be useful, but of course the big boss (CEO, CFO, etc.) will never go for it. Oh well.

    4. Gartner asks the managers and CTOs what they've been thinking about.

    5. Gartner produces a report that reflects what the managers _would_like_to_do, but don't really have the guts for.

    6. Managers buy Gartner reports and use them to justify their pet projects.

    The conclusion: Gartner is really reporting techie opinions, filtered through a powerful "you can't handle the truth" lens and contaminated with strange manager ideas.

    What will happen now is that all the managers who were dreaming of Windows installations will keep doing what they were doing. All the managers who were dreaming of Linux will have some ammo to justify jumping in with both feet.

    If the techies in the organization like Linux, the Windows projects will fall strangely behind schedule while the Linux projects will go surprisingly well (it's amazing how much better a project goes when the techies are happy). In two years, shortly after the managers have noticed that their Windows projects are going nowhere, Gartner will report that Linux is suddenly the greatest thing since sliced bread.

  • I would venture to guess that they predicted that because the x86 chips are becoming RISC processors with a CISC "interpreter" (for want of a better word, I'm no EE) bolted on.

    As the RISC core gets faster and faster, the overhead of the CISC interpreter becomes less and less important, relatively.

    Of course I am most likely completely wrong...

    "Free your mind and your ass will follow"

  • by fgodfrey ( 116175 ) <fgodfrey@bigw.org> on Thursday November 02, 2000 @09:24AM (#654926) Homepage
    OS X runs a Mach microkernel at its core. This is not Unix. However, the way Mach works is that you put various "OS servers" on top of Mach. I think originally it was designed as (among other things) a way of running multiple OS "kernels" at once while Mach handled all the device driver interfaces and things like that. Part of OS X (since it was part of NeXTstep) is a BSD OS server (I think this is what BSD Lite is). Along with the OS server is a full BSD userland so you can pop open a window and run your favorite shell. Apple does all they can to mask the fact that deep down, sed/awk/grep and friends can run, but they are indeed there.

    I imagine, though I'm not an expert on OS X's kernel architecture, that they will choose to do Cocoa (the official OS X API - or is it Carbon that's the API? Whichever...) as another server directly on top of Mach. So, native OS X apps won't go through the BSD kernel, but native BSD apps will.

    I've seen a few prereleases of OS X and I was quite impressed. If I had a modern Mac, I'd definitely be running it. In the meantime, I'll stick with another OS that the Gartner Group thinks is going to go away - Irix. (Hey Gartner guys! What OS coming down the pipe can do 1024 processors in a single kernel image like Irix and is replacing it? What? There isn't one? Didn't think so :)

  • I would disagree with that. At the time that Apple was losing the OS war, they had two platforms - Mac and Apple II. There is almost no way to be more open than the Apple II, yet the Mac certainly won that battle (Steve Jobs had more than a little to do with that, though). I think the real reason Mac "lost" was that it was too expensive. If Apple could have sold the Mac for the same price as a PC clone, I really think they would have won. In the end, the people making the majority of the purchasing decisions (ie non-geeks) don't care about open vs. closed, they care about price first and ease of use a far second. They would probably care about reliability more if they actually understood what it meant...
  • Man I could not disagree more with you.

    Why use a system if it's hard to use and there is an easier one around? The reasons could be idealism, religion, finances, coercion, etc. Those are not the reasons I prefer a system with decent tools and network connectivity for my work. I have worked with NT, and I've been doing real work there. And I did not like it - why? Well, if you have two machines and you are building networked software, then you had better have a monitor and a keyboard on each one of them, and move both of them into the room where you sit. Not so with most other systems I've touched. OK, so maybe not everyone is developing distributed systems, all right. But the tools - man... When you get an ``enterprise edition'' of a development suite and the tools they ship are mostly of the type ``CPU stress tester'' and ``windiff'', you sort of start wondering... Then you stop wondering.

    Anyway, I fully agree with anyone who claims that the UN*X like systems (such as GNU/Linux) are not the easiest systems for everyone to use. Sure, no system is. But don't generalize. We are *a lot* of people out here who don't care that much about following the pack and doing what we're told the others do, but actually need to do real work that is made possible only because of networked time sharing systems. Free or proprietary, gratis or costly, but all interoperable and the easy choice for some of us.

    That was my 0.02 Euro.
  • The article was dealing with servers rather than workstations. I would agree with you about workstations, although it is possible to set up a dumb-terminal type thing that would be just as easy as Windows as long as the users didn't have to maintain them (I think Sun's Sunray xterms are pretty cool). From the context of using unix/linux as a server, it is definitely easier than Windows 2000 or NT once you learn what you are doing. Now, there is a learning curve in either case for a lot of systems tasks, and since everything in Windows is based on the GUI, it may seem easier to work on to the lazy person who doesn't want to type anything. However, if you look at the tasks you have to do, it is almost always easier to work on unix.

    A good example of where they are equal is adding a user. In Solaris, I can go to the command line and type "useradd b0z" to create a user named b0z on the system. This is simple, but you may not want to remember the commands. That's fine; there is a GUI program called admintool that you can use in a similar fashion to the User Manager in Windows NT. So I would say they are equally easy at that task. Now, when it comes to something such as backing up a server, you can do it easily since you simply have to back up files. For example, if you have some specific things in your configuration that you need to back up, simply copy the contents of /etc to a tape and you're done. In Windows NT, you have to export the registry, and even then, if you back it up and try to import it again, that can be kinda sketchy. Basically, when you have to get down to the guts of the OS, unix is far easier than NT because the contents of /etc are much easier to deal with and more intuitive than the bloated registry with its endless keys that can appear multiple times in the hives.

    Also, when you have to administer an NT server, you can't just ssh in and start and stop services very easily. There is software like SMS that you can use, but since it is GUI based, it takes you longer to fix the problems in this case (if you are on a modem or something equally slow). Yes, it is possible to use the NET command in NT to do some tasks, but it is not nearly as useful as simply opening a terminal window on the server itself. Also, you have much more robust scripting capabilities in unix-based systems. You have perl, tcl, sh, ksh and other languages you can use to automate system administration tasks on unix. On NT, you can use some of these, but they can't interact with the OS as closely, since a lot of tasks in NT require a user to be present, pointing and clicking on things.

    Unix is much easier to work with and maintain for a server OS, but Windows is better than linux for a desktop. Since the article was about servers, I guess you should read the article next time. :oD
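
    The "just copy /etc to a tape" idea above can be sketched in a couple of lines of shell. This is only an illustration: the directory and archive paths are made up for the demo, and on a real Solaris box the tar target would be a tape device such as /dev/rmt/0 rather than a file.

```shell
# Illustrative sketch of the /etc-to-tape backup described above.
# /tmp/demo-etc stands in for /etc, and a plain tar file stands in
# for the tape device a real server would use.
mkdir -p /tmp/demo-etc
echo "127.0.0.1 localhost" > /tmp/demo-etc/hosts

# Archive the whole config directory in one shot...
tar -cf /tmp/etc-backup.tar -C /tmp demo-etc

# ...and verify what went onto the "tape".
tar -tf /tmp/etc-backup.tar
```

    Restoring is the same operation in reverse (tar -xf), which is the whole point of the comparison with the NT registry: the backup is just files.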

  • I agree with the point about the business side, but it doesn't mean we
    should ignore these reports. For example, Gartner's list of weak OS's
    probably hasn't increased any of their lifespans.
  • Over the next five years, a large number of recent graduates who are in sysadmin positions will start to rise to positions of greater purchasing power in IS departments. Many of these people have grown up with Free operating systems.

    That has been happening for the last 30 years.

    -
  • True, end users just wanna get their work done.

    I think there are a couple of other reasons that Windows is perceived to be easier than Unix/Linux. One is that you hear everybody say it all the time. Linux is more powerful; Windows is easier (in part because it's less powerful). But it's not that black-and-white. Windows is actually pretty difficult and non-intuitive for certain things.

    The other is that documentation for Windows abounds and most of it doesn't come from Microsoft. Go buy a modem, look in the manual (or CD) and it will lead you by the nose how to set up dial-up networking under Windows. Is there documentation for Linux? You know the answer as well as I do. Often I have found that doing something in Linux turns out to be extremely easy but arriving at the point where I knew what to do was difficult due to lack of documentation. Case in point: how do you add dial-on-demand to pppd? Answer: add the word demand to the pppd options file. What could be easier? But how long did I have to dig around in man pages, how-tos and comp.os.linux.help to find that out? Perhaps an hour. The mythical average user isn't going to dig around in any newsgroup even 15 seconds, he's going to say "screw it where's my pointy clicky interface?"
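
    For the record, the dial-on-demand fix mentioned above really is that small: a line or two in pppd's options file. The path and the idle timeout below are illustrative; check pppd(8) for the details on your system.

```
# /etc/ppp/options (location may vary by distribution)
demand        # dial only when outbound traffic appears
idle 600      # illustrative: hang up after 10 minutes of inactivity
```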
  • Ahaaaa.

    /me nods wisely.

    I was just talking out of my ass. <grin>

    "Free your mind and your ass will follow"

  • RE: "TCO advantage to Linux will disappear..."

    Linux's big strength isn't just FREE AS IN BEER, it's also FREE AS IN SPEECH, and until I can read kernel source for AIX, Solaris et al., I think Linux and FreeBSD have definite advantages not mentioned in the Gartner article...
  • Toppling the WinTel hegemony is tough- akin to overthrowing the telephone company ...

    From USA Today (print edition) for 11/2/2000:

    ... Just a few years ago, long-distance reigned supreme. ...

    Those changes are the first tremors in a seismic shift that analysts say could mean the death of the residential long-distance business as we know it. ...

    If so, WorldCom joined the death march Wednesday, saying it's forming a new tracking stock under the MCI brand for its slow-growth consumer long-distance and dial-up Internet businesses. AT&T decided last week to split into separate companies for consumer, business, broadband and wireless. Some analysts expect Sprint to issue a similar tracking stock Friday.

    Those familiar brands -- AT&T, MCI and Sprint -- won't disappear any time soon. And you'll always be able to buy home-based long-distance service.

    You just might not recognize it. ...

    So your point was?

    Steve M

  • BSD has already been co-opted by the moron script kiddy masses just as Linux was in 1998. Now an expensive commercial UNIX is the way to show your geek plumage..... and twm :)
  • Larry Wall's keynote at the 1999 Linux World. It's an old saying, though, and I'm not sure where it originates. I'm pretty sure it's not really Larry's or I would have attributed it.

  • by chazR ( 41002 ) on Thursday November 02, 2000 @09:53AM (#654946) Homepage
    I disagree. When I was at University (left 9 years ago) I wasn't aware of *any* free operating systems. Certainly none that were 'industrial strength'. Please let me know which OSs you are thinking about. (I did Maths, so YMMV)

    We used SunOS on workstations, BBC Micros (I kid you not) and a chuffing great Control Data monster running something nasty (but it compiled and ran FORTRAN77, which was all we needed).

    My point is (and this comes from experience of hiring people) that recent graduates in CS *all* run free operating systems out of choice. These operating systems are now totally capable of earning their food in the datacentre. I am *seeing* this pressure to adopt Linux/*BSD where I work.

    Share and Enjoy.
  • Sorry, I haven't listened to a thing the Gartner Group has said when it comes to predictions since the late 80s. You might as well buy the National Enquirer shortly after a New Year and read predictions for the upcoming year.

    For example, and I wish I saved it, I remember quite well all those nifty graphs around 1989 showing total installations of OS/2 and how they'd overtake MS/DOS, Macs, and anything else you could think of within a few short years.

    So, I've committed a /. crime (and tradition) of commenting on the article without reading it.

  • I don't think that the Apple II lost any battle; it just went to the end of its natural life. The IBM PC clone has only lasted as long as it has because the 286 and 386 were used as sped-up 808x's, and when the restrictions of real mode started hurting, we could escape from it. If Intel had made 16MHz & 33MHz 8086's, then the PC platform would probably have died and something new would have replaced it.
  • When I run helix-update, I get all the newest stuff in a neat little window. It gives me warm fuzzies. True, I only get a list of packages that have been blessed by Helix-code, but all I do is double-click on the little soda-can on my desktop. When the smoke clears, all the neat little apps are installed, and I can access them all from my GNOME menu.

    Having said that, what's *really* holding Linux back is NOT ease of use. We're there as far as that's concerned. It's strictly a matter of application availability, ie: We either need more support from mainstream application vendors, or OUR applications need to BECOME mainstream. Either would be fine, and either would totally solve our problems. Ease of use (though there is always room for improvement) is yesterday's news.
  • This report by the Gartner Group, in my view, is flawed. By 2005, the majority of users will treat the OS the same way they treat virtual memory today -- i.e., they will ignore it and not know that it exists.

    Applications and "agents" are all that will matter.

    -- George

  • 4) Freebsd and the like make heavy inroads in the server area. I'm still waiting for a distro of linux to install on a server that doesn't require gigs of drive space. Yea, I know you can do min installs and then install what you need, but xxxBSD just seems cleaner to install. And you have to love the ports.


    Have you heard of ZipSlack? It's a Slackware variant [slackware.com] designed to be loaded from a Zip drive. I think that will satisfy your needs.

    RobK
  • The OS discussion is no different than the game console discussions. It boils down to the software. What "Linux" (why does everyone here say Linux? My business sells BSD systems *shrug*) needs is a lot of good applications that can be used in business. My company has a few projects, but no investment capital, so these are going slowly... 1) Bootable CD: it boots BSD, locates the drive, formats it, runs BSD off the CD, and keeps data on the drive. It will come with Apache/PHP/PostgreSQL installed. Now on some CDs we will put different versions of our software, the main one being Glimpse, a business management/POS package that can be included in the PHP scripts.
    So you pop the CD in a computer, boot it up, and type in its IP address. And bam! Software installed; go around to all the clients' web browsers, make the home page the IP of the server we just installed, and they run all the software from there. There is a profitable world out there, but Linux hits a big hurdle in its license. FreeBSD uses the BSD license, which is a useful one. That's why Apple is using it in Mac OS. Linux is free to use, but making money off it? I dunno; that's why we use BSD and PostgreSQL -- fewer licensing issues.

    Sorry if I offended anyone, but "Linux" as it stands sucks, and everything with it. Everyone is so busy doing nothing, or coding useless features, it's disgusting. You have to make devices work, not by recompiling a kernel, but autodetected and installed on the spot. You have to make standard documentation (check out the manual on www.freebsd.org as a start, but even that is too technical at points), you have to make it so the server can be set up without ever seeing a command prompt, with the steps explained logically, and lastly you need development tools. Last I checked, I was the only one who had even submitted a PHP editor to www.zend.com (someone coincidentally downloaded that and made a GREAT version of it; when I finish moving and get all my computers out of boxes, I'll post the link to it). PHP could propel the "Linux" world if developers could pop a CD in a machine, grab another computer on the network, open the web browser to the server, and start working immediately.

    TurboRoot GS Data Design bleach@theshop.net is my personal email if you are working on a PHP development library.
  • I think originally it was designed as (among other things) a way of running multiple OS "kernels" at once while Mach handled all the device driver interfaces and things like that.

    Anyone know if this means you could run both OS X and Hurd simultaneously on the same machine? (odd thought but it might be interesting for Hurd development.)
  • Start with a little ancient history:

    Linux will hasten the demise of weaker OS variants, such as OS/2, NetWare, older NT versions, SCO OpenServer and UnixWare, SGI Irix and other nondominant legacy OSs.

    Then add some utter stupidity:

    Revenue for AS/400 and S/390 systems continuing to decline slowly through 2005 as in previous forecasts - both AS/400 and S/390 will remain viable investments throughout the next five years -- For what, pre-y2k accounting packages that drive dot-matrix printers?!

    Plus some beating-around-the-bush:

    Solaris momentum "speed bump" in 2000, with increased discount levels until next-generation UltraSPARC III systems appear (especially to replace the UE10000); -- can you say "just don't use the 400 and 450MHz processors with the cache bug?"

    Finally, top it off with a buzzword-gasm:

    Countering this trend will be accelerating server deployments for e-business and services, Web portals, intranets/extranets, application service providers and Internet service providers. The increased storage needs of business-to-business and business-to-consumer activity will drive storage subsystem costs higher in a ratio to total configuration costs -- blah blah blah

    Way to go Gartner Group!

  • Perhaps this [slashdot.org] is a good explanation of the change in thought?

  • I don't agree. One problem with x86 is that it doesn't scale well.
    It's much easier to design RISC architectures to make use of SMP and
    NUMA than try to coax the x86 to do the same.
  • > do you think that the IT people want to fuck around for hours on end trying to get a simple Windowing env. to work?

    In my business environment, they fuck around for hours getting Windows (NT) to work. Then fuck around for several more hours trying to lock it down so the users can't do anything to break it. Then they still have to worry that people won't fill their C: drives with MP3s (which can cause SMS updates to break when they run out of space).

    Once they've got everything as they like, they just ghost it to all the other machines. It's just as easy to clone/lockdown under Linux.

    And, BTW, I installed KDE2 on Tuesday on my Debian machine. apt-get did all the work, including handling dependencies & building menus for all the packages on my system.
  • If you want idiots to be able to run your servers, you will get what you deserve: idiots running your servers. Why managers think this is a good idea, I'll never understand. I guess the point is that idiots are much cheaper to employ. Unfortunately, they're also more expensive to clean up after when they screw up your servers.

    Really, I think that most UNIX administrators know more about what they are doing than NT administrators. I consider that a Good Thing.
  • I use what gets the job done. I used to have a Win95 486 machine running as a file/print server and internet router. It worked fine, and I could set it up with my eyes closed. But the dang thing locked up several times a week, or simply stopped dialling out, or the print queues didn't respond anymore, whatever.

    So I reformatted and installed RedHat. Took me quite a while to wade through HOW-TOs and MINI-HOW-TOs to figure out the IP masq stuff (there seem to be endless conflicting versions floating around). Print serving and Samba was easier, didn't take too long with the right docs. After I finally got it all going, the thing was quite reliable. It worked weeks and months on end without a hitch, even on this paltry platform (486 16MB RAM 250MB HD). However, every once in a while there were glitches with the print queues and dialling out. A quick reboot often worked, but sometimes I had to screw with the config. Fiddling with printcap files only twice a year or so I simply forget all the options and keywords. Same with the diald.conf file. That's when I longed for a GUI. So I got linuxconf which I have mixed feelings about, but it improved matters somewhat.

    Basically I want Linux with KDE and a config system that can actually make sense of all the shit in /etc and provide me with valid options for settings I couldn't be bothered to remember. That's what Windows does better in my opinion, and that's why I can find my way around quicker there. But once KDE or Gnome get augmented with a DEEP config tool that goes beyond fluff configuration and can work on anything or most things in Linux, that's when I'll use Linux A LOT MORE. Until then it'll just run my print and file server in the closet. Oh, and when Kylix finally ships, I'll switch desktops too.
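
    To illustrate why those printcap options are so easy to forget, a minimal /etc/printcap entry looks something like this (field names are from printcap(5); the device and spool directory paths are just examples):

```
# Minimal /etc/printcap entry (illustrative paths)
lp|local line printer:\
        :lp=/dev/lp0:\
        :sd=/var/spool/lpd/lp:\
        :sh:
```

    Two-letter capability codes like lp (device), sd (spool dir) and sh (suppress header page) are exactly the kind of thing a GUI config tool could present as labelled options instead.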

    Paul
  • The claims must be true, since people pay thousands of dollars to Gartner to receive these reports.
  • I'm not sure that, as far as the Gartner Group predictions are concerned, 'end user' friendliness matters. Face it, what this report relates to are enterprise class installations.
    Given that, you aren't going to toss a newbie in front of a full on production Unix environment that a large portion of your company is basing a revenue stream on.
    Well, I suppose you could, but then again, I doubt that company would reap any benefits from said large installation.

    In other words, if a company is using Unix, that company is much more likely to hire experienced admins. Or at least clued individuals.

  • Look, you DOWNLOADED a current copy of kde from their website, but didn't DOWNLOAD the latest qt that it uses, instead you took the out-of-date one from the CD in your hand. Of course that doesn't work. That's the same as, say, in Windows trying to use a game that requires DirectX7, and then installing DirectX5 from your old Win install CD. If you had also downloaded the qt that the KDE site *has a URL pointing you to*, then it wouldn't have been as messed up. I'm not denying that it's a lot of steps, and it's not easy, but your example situation seems contrived to be harder than it needs to be on purpose.
  • Revenue for AS/400 and S/390 systems continuing to decline slowly through 2005 as in previous forecasts - both AS/400 and S/390 will remain viable investments throughout the next five years -- For what, pre-y2k accounting packages that drive dot-matrix printers?!

    For Java. Seeing as how it shattered some Java benchmarks by an order of magnitude. [javaworld.com]
  • I was thinking of the way the x86 handles interrupts. It's a major headache with SMP.
  • I don't think they were talking specifically JUST about servers either. They were talking about vertical integration - Microsoft has proven that in order to effectively compete (in terms of world domination) you have to have not only a good server, but a good client. Actually, it's not even that simple. Sun has a good server and a good client, but that client does not fill the role of a low-end client; it is better suited to the role of a high-end workstation for power users. There's really only one reason why they can't take over the desktop (low-end client) with Sunray, and that's because they don't have the dominant office suite - a platform-dominance issue that is unique to Microsoft. I think if Microsoft didn't exist, or if they only produced an OS, then Sun would have a much better chance at offering a complete solution that made sense to most businesses. But Microsoft owns the low-end to midrange desktop, because Microsoft owns Office.

    Apple might end up in the same rut all the Unixes are in - OS X is a great server (except they don't have any truly server-class hardware - yet), and it's a great power-user workstation OS, but they're going to have to hold onto the Classic OS to keep the low-end people. Either that or do a LOT more work on the consumer version (they need to tier the OS at three levels, not two). Microsoft doesn't understand this. They're trying to get rid of Win95/98/ME, and they don't understand that their customers still NEED it. NT is too high-end, too complicated for normal low-end users. The basic problem, as I see it, is multi-user vs. single-user. Someone's got to figure out how to eliminate the multi-user complexity from a low-end OS, but still let it work well with its multi-user big brother. I think the reason Windows works so well at this is because the multi-user part of Windows, NT, is just multi-user enough to provide basic security, but not soooo multi-user that it prevents Win95/98/ME from interacting, being managed, etc.

    As far as administration goes - from any NT Server box, you can remotely start/stop services on another NT (Server or Workstation) box. It's not obvious, but neither is a lot of Unix stuff. You use Server Manager. NET is surprisingly powerful if you get to know it (and also unreliable as hell). NT's domain security model is a piece of crap for large networks - especially when trust breaks. Its replacement, MAD, is too new, not supported with ubiquity, and eliminates the critical piece of the puzzle, Win95/98/ME.

    Windows also has perl, and you can run sh and ksh with third-party packages. DOS scripting is pretty decent, but like you say, nothing like Unix shell scripting; the lack of a built-in telnet-server equivalent on NT is what I feel is most crippling.

    Personally, I feel that if Apple can fix their OS strategy (by creating a low-end version of OS X - as simple to use as Classic, but with all the features of Unix, without the boggling permissions hassles that you and I take for granted yet cause non-technical people to pull their hair out over basic tasks) - AND if they can keep MS Office, acquire a database like MS Access (I know, but "the market" needs it for OS X to be taken seriously), AND a serious Exchange-compatible mail client (Outlook Express does not cut it in the workplace) - then also take care of their hardware problems with better high-end server offerings, then they have all the important pieces of the puzzle, and will kick major, major ass in the marketplace, despite the overpriced hardware.

    The more MS tries to cut out Win95/98/ME (which they need, but don't want), the more that makes them just like Sun, only less effective in the server space - the only thing MS would have then is Office, and the API marketshare.
  • by Black Parrot ( 19622 ) on Thursday November 02, 2000 @12:24PM (#655021)
    Almost without exception, Gartner offers a description of the situation on the ground today as a "prediction" for what the situation will be like in 2-5 years.

    That's true for at least part of the linked report. Other parts are as vague as a newspaper's daily horoscope:
    3. Hewlett-Packard's transition from PA-RISC to IA-64, with big decisions for users, in 2003
    Now that's really a helpful forecast, isn't it?

    I have come to the tentative conclusion that what Gartner's reports are really for is not to predict the future, but rather to buffer today's news for PHBs.

    A PHB might go into shock if s/he found out that 30% of the internet runs on Linux today, but if s/he reads a Gartner report saying that that might be the case in 2-5 years, then s/he can start getting used to the idea, and maybe not poop pants when a Linux firewall is discovered down in IT. Indeed, s/he might engage in a bit of self-congratulation for being ahead of the curve, rather than firing some staff and ordering the offensive machine removed.

    So while I remain critical of the intellectual content of Gartner's reports (and ditto for others of that ilk), I now also recognize that they are providing an important service to the public, and I applaud them for it.
  • There are a couple of remaining issues, but yes, we're certainly advancing much faster than the competition. Linux is *definitely* easier to install than Windows already, for example [comparing Win2K and WinME with Red Hat 7 - in the WinME install, USB users don't even get mouse support for the first part of the install].

    But here's what I see as some remaining issues to do with ease of use.

    * Many root-requiring GUI apps display error messages rather than asking for authentication details
    * Many error messages are too hard to understand, especially for console apps. Try modifying /etc/fstab to mount a disk [/dev/sda] rather than a partition [/dev/sda4] and look at the error message that pops up during your next reboot. This isn't Linux specific, but it sure does need improvement.
    * The filesystem is still all over the place. Directories like /opt confuse things significantly [what exactly is optional anyway?] Is grep optional? Is KDE? Or is it anything that didn't come with your distro - so Acroread, for example, should be installed in different places on different distributions. Clueless folk like Adobe and Citrix put entire self-contained apps into /usr/lib
    * There's no standard format for configuration files. An XML DTD would help significantly
    * Sysadmins have to deal with multiple permission systems because rwx doesn't offer any fine-grained control. So filesystem rwx, Squid and Samba ACLs, and probably more all have to be managed using different systems. Linux needs POSIX ACLs badly.
    * Lack of anti-aliasing or font smoothing causes accessibility issues for those with vision problems
    * Lack of comprehensive, distribution-specific documentation. This is improving over time, but certainly the HOWTOs are often too generalized in nature [e.g., they often tend to favour kernel recompiles even if they are not necessary for your distribution]. Those that aren't generalized are usually advertising pieces for a certain distro.
    * Unfortunately, RPM lacks apt-like qualities, and is far more prevalent. Once this is sorted, Linux apps WILL be easier to install than Windows
    * Some minor changes need to be made. What the hell is a GNORPM? Why not just call the app `Installer' and put it on people's desktops?

    Well, there's my $0.02

    -------
  • I think you misunderstand my point. I wasn't suggesting that OS X
    might be seen as a rival enterprise UNIX solution, but rather that its
    existence may make an all-UNIX strategy look tenable for companies.
    If MS doesn't coexist well with UNIX, maybe ditching MS is the right
    response. The lack of an unintimidating desktop environment used to
    be a problem with this, maybe OS X will change this.

    .NET is real. The technical part has been under development for a
    while as NGWS, the differences are that this is now being placed as
    the centrepiece of MS strategy, and that a much higher level of
    cross-platform interoperability is being envisaged. It's clearly a
    response to Sun's general strategy, and it differs from it on various
    levels.

  • "Unix systems are harder to use"

    Let's see.. type in user name and password, click on program icon (or click on menu icon then on program name), click on things in the window and type things, click on an icon in the top part of the frame to quit...

    What's the difference between using different windowing interfaces? Oh, yeah, in some you double-click, in some you use more than one mouse button... Yup, that's harder to use.

  • Not only that, but if IBM can "get" Linux accepted in the marketplace on as wide a scale as their current marketshare for Intel/Windows machines, it then makes economic sense for them to stop kowtowing and sucking up to Intel, and to sell PowerPC laptops running LinuxPPC. It would be a nearly seamless transition. They wouldn't even have to suck up to Apple or Motorola, because LinuxPPC won't be dependent on AltiVec (like OS X's eye candy is), so IBM's chips won't need AltiVec. Wouldn't THAT be wicked if IBM shipped ThinkPads with 8hr battery life, running LinuxPPC, on an 800 MHz G3 or G4 sans AltiVec? With software toys that integrated with the rest of IBM's offerings, S/390 connectivity, AS/400, ADSM client, etc.? At HALF the price of a PowerBook running OS X at 400MHz? Stevearoonie would have a SHIT fit.
  • Under Debian Linux it's as easy as:

    apt-get install task-kde

    It figures out all of the dependencies for you and magically installs them on your computer (downloading them from the Internet as necessary). Better yet, when the KDE team decides to rev KDE, upgrading is as simple as:

    apt-get update ; apt-get dist-upgrade

    If, for some reason unknown to me, you decided to install KDE on a system that doesn't do this for you, well then, that's your problem I suppose.

    Just because you installed KDE the hard way does not necessarily mean that there isn't an easy way.

    And Debian's KDE install is probably the hardest one of the lot. After all, you have to install Debian first, and that can be tricky. There are plenty of distributions that install KDE by default (whether you want it or not).

  • The technically better product does not always win. How many years has Unix existed for x86? Yet there are more copies of Windows sold.

    The GPL is one factor that could hamstring Linux. But only if a court case is brought that gets ugly for commercial interests. Look at the VirginConnect Linux box, and the way it is packaged. YET, no lawsuit on this matter exists. (perhaps it is because the VirginConnect members are LEASING the box from Virgin...)

    A radical change in the GPL would also cause a problem with Linux. (if the GPL changes, and no effect is noted beyond some /. flamage... the change isn't radical)

    Another thing which can damage Linux is the 180+ versions of Linux all clamoring for marketshare. Smacks of the UNIX versions of the 1980's.
    (ok, who has a link to the "linux os versions counter page" I have heard reference to... this way we can all see the 180+ versions)

    FreeBSD's total users number about 20% of Linux's TOTAL. Think of any Linux distro that has 20+%? And BSDi is getting money from ISPs to work on and improve FreeBSD. ($5 mil from Yahoo and $5 mil from Livin' on the edge)

    If the final outcome is the Linux ELF binary as the x86 Unix standard binary, the lasting effect of Linux might just be a standard binary format.

  • OS X will not overtake the desktop because it's too complicated. Still. Apple, IMO has failed to put an easy Macintosh GUI on Unix. Although there's still room for success, the current incarnation is GREAT as a high-end workstation or power-user OS, but it's still too complicated for grandma. They need to hide the Unix-ness better.

    In fact, I think they need to offer separate tiers: a Desktop OS, a Workstation OS, and a Server OS. OS X PB is the Workstation OS. OS 9, long-term, will not make it as the Desktop OS, because it does not support Cocoa. I think that the Desktop version should be identical, code-wise, to the Workstation OS, with a few checkboxes enabled to reveal all of the power-user stuff.

    The other thing Apple needs, MUST have, in order to compete, sadly, is MS Office, including MS Access. Sad but true.

    And finally, good high-end Server hardware. Stuff that competes on the level of Sun's high-end hardware. Then, they offer what MS offers, a platform to run Office on - Sun doesn't have that. They offer what Sun offers, high-end bulletproof Unix - and what Sun does not offer, an Office platform, a low-end grandma platform, and WebObjects, which could just be the Next Big Killer App, if only there was a killer platform for it to run on.


  • There's a lot of good sense in Gartner's projections, which, being their projections, probably err a little on the conservative and hedging side of things.

    I think the growing importance of the server appliance market is right on the mark.

    Since hardware is getting so cheap, what you'll find is increasing appeal for the low software cost barriers of Linux based solutions. And, since Linux is advancing on the multiple fronts of

    1. embedded
    2. real time
    3. enterprise(big SMP)
    these solutions will become more ubiquitous on a faster time scale, IMHO.

    Indeed, Linux will tunnel through the mid-range server market to the high end (witness the economical clustering technologies that compensate for Linux's current lack of extreme SMP scalability) to eat the lucrative high-end business before the conservatives in the mid-range start waking up to the advantages Linux solutions hold for them, too (especially if MS starts charging subscription rates to bring in revenue). The current market divisions of hi/med/lo for Unix/W2K/Linux will cause both the first and the second to disappear, in that order.

  • XFCE looks like ass.
    Use WindowMaker or bb.
  • You mentioned NUMA in the post I first replied to. I know hardly
    anything about it, and have a vague idea that it is a bit like SMP but
    looser and with fewer synchronisation problems. Is this right, and
    does it make x86-style interrupts easier?
  • You're wrong. Long distance is a phenomenally profitable business, but it's not a fast growing business, so these companies' stock prices get hammered --- NOT!

    Long distance is not a "phenomenally profitable business".

    Long distance is a good cash flow business. But the profits have been eroded by the price wars between the long distance providers.

    The head guy at Worldcom, Ebbers, in his remarks on a conference call to analysts recounted how they had to bid on the long distance business of Kmart at below cost. No profits there. Essentially Worldcom uses long distance as a loss leader.

    You are correct when you state that one reason these companies are splitting is to jack up their stock prices. But the other side of that is that they are spinning off the no-longer-profitable long distance businesses.

    Steve M

  • OK, as a new Linux user I have to say something. Linux is not any harder to use for the end user. But, (BUT!) trying to make any sort of changes to the system (something a home user would want to do) IS much harder. Even something as simple as installing software, or even changing a monitor.

    I've just recently installed Linux on one of my old systems to run a counter-strike server. I have some limited experience with Solaris from school, so I was not coming into this totally blind.

    The initial install was pretty simple. Not something your average user would want to handle (I wouldn't expect my parents to be able to handle it, or even any of my brothers for that matter). The redhat installer detected everything but my monitor just fine. Setting up partitions should be automated a little better. But I've got enough computer experience that I worked through it.

    After the install, I added in a network card. Piece of cake. Plugged it in, turned it on, and it was detected.

    From there things went bad. The redhat recommended partition sizes would not hold all the files I needed to add for the counter-strike server. So, I do a little research on how to add another partition. I play around for a while, figure out how to create a new partition, add a filesystem, and mount it. But then I royally screw up trying to copy the existing usr directory into it.

    Time for a reinstall.

    So I do a reinstall, setting up the partitions better this time. I get the counter-strike server running pretty easily by following a tutorial. Great, everything is running fine.

    A few weeks later my monitor goes out (not Linux's fault, it was running on my windows box). So I borrow a new one from work. Plug it in, turn on my linux machine. No good. As soon as Xwindows starts up the signal gets scrambled (refresh rate too high). But how do I just boot into text mode? (I had set graphical login in the install) No idea.

    So I spend all night trying to figure out how NOT to boot into the graphical login. By piecing together some info I can find about 'LILO', and Xwindows, and 10 other 'tutorials', eventually I figure out that I need to:

    - hit control-X on the OS loader.
    - enter 'linux 3'
    - use Xconfigurator (capital X, the rest lower case of course, even though most references to it call it XConfigurator or xconfigurator) to change my monitor settings.

    Nice, easy and intuitive eh?
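    For anyone stuck in the same spot, the fix the poster pieced together boils down to this on a Red Hat-style system of that era (runlevel numbers and file locations vary by distribution, so treat it as a sketch, not gospel):

```shell
# At the LILO prompt (Ctrl-X drops you out of the graphical loader),
# append a runlevel to boot into multi-user text mode just this once:
#     boot: linux 3

# To make text login the default, edit /etc/inittab and change
#     id:5:initdefault:      (graphical login)
# to
#     id:3:initdefault:      (text login)

# Then run Xconfigurator to fix the monitor settings before
# switching back to runlevel 5.
```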
  • Yes, well, try to open an MS document in StarOffice, and it takes about 10 times longer, while you wait for it to translate it. Once it's done, sometimes there are formatting problems with tables, sometimes fonts are different, sometimes Macro functions in spreadsheets don't work as expected, you just never know. This is simply unacceptable to a user that does a majority of their work on documents from coworkers who use Office, and heavily use the special Microsoft-proprietary features.

    This is even true for MS Office for Mac. Some formatting just doesn't make it across.

    Personally, I think Adobe could step in and use a PDF based cross-platform (Win/Mac/Solaris/Linux) office suite to wipe Microsoft from the face of the Earth. But, apparently, they don't want to wipe out half their user base (Windows) by undermining the OS.
  • OK, I've finally had it with everybody trumpeting the TCO equation. We have a lot of qualified technical people here, and Gartner has a bunch of reasonably savvy business analysts, but does anyone bother to ask the people who REALLY KNOW about TCO? Oh, no, we just spout our local dogma as the truth.

    Hard facts: the people who know TCO are the System (NOT SOFTWARE) Architects and the MIS/Operations Managers. I've very little faith in Director-level (or above) management knowing about real costs, and Programmers (no matter how smart) don't have a fucking clue about what TCO means.

    TCO has four major considerations, which vary by the company involved. There is no such thing as a "global" TCO. There's the TCO for your company, and there's the one for Yahoo, IBM, etc. all of which are different. And each consideration is of a different weight (importance) depending on your situation.

    1. Upfront costs: Software Licensing and proper hardware (ie. what's the cost for hardware for OS A vs. hardware for OS B if they do the exact same thing). For those on a tight budget, this is usually the overriding concern. They can afford to pay more over the long haul, since the cost is spread out, but one-time outlays are a big issue. In huge companies, even multi-million dollar initial costs can be absorbed if the 5 year outlays are significantly cheaper.
    2. Ongoing Support: The usual: do you need it? (is there enough in-house expertise to avoid it, or do you only need it for 1/2/3 years instead of in perpetuity?) How good is it? What's the cost per response level? Depending on the vendor pricing scheme and the initial cost outlays, this can dwarf initial expenditures, or it can be a tiny drop in the bucket.
    3. Maintenance Costs: The big one here is Qualified Personnel. Do you have enough people with spare cycles to do the care+feeding of this new system? Will they need training? How hard is it to hire new staff with the required skill set (or people willing to be trained for it)? Are we going to have to use Consultants to set up the system, or will the vendor do it for us? How many systems can one person manage? Then there are the software costs: How fast can we get bug-fixes? Who do we get them from? How much do updates cost? Stability? Considerations for large-scale manageability? How do we monitor system performance?
    4. Growth and Obsolescence issues: Given the initial hardware buy-in, how easy is it to plan for moderate growth? What are upgrade costs? What are expansion costs (ie. more users, more processors, etc.)? How long can our initial buy-in last until we have to upgrade it? How long do we expect to need this solution before we replace it with something else? Can this solution be integrated into a larger one, or should it remain a solution unto itself? How hard (and costly) are migration issues? What about application portability - is this important?

    One thing I will point out that seems to go against a mantra here on /.: Open Source OSes have no real maintenance advantage over closed UNIXes. Sorry, but that's the truth for 99% of the businesses out there. Nowadays, installed commercial UNIXes use virtually the same userland toolsets as the FreeUNIXes (who cares that Sun doesn't fix bugs in their version of sendmail? Anybody actually run Sun sendmail? I thought not.) The only real differences are in major applications (which are almost always closed-source, even on FreeUNIXes), and the base kernel itself. Only the largest companies (or those whose business depends on doing kernel-level development e.g. Cobalt) will care. Fixes for important problems in this sort of stuff come out of the commercial vendors as fast as they do from the FreeUNIXes, and honestly, virtually no company has the resources (or even the expertise) in the programming staff to fix subtle bugs in stuff like the Linux kernel (or glibc, or whatever). And if you think the admin staff is going to do this, hey, well, I've got a bridge in Arizona I wanna show you...

    -Erik
    (yes, I'm a System Architect, and yes, I'm pissed off...)

  • I have come to the tentative conclusion that what Gartner's reports are really for is not to predict the future, but rather to buffer today's news for PHBs.

    Definitely. My experience with the analysts, Gartner, Forrester, and the like, is that they take generally available information, make it look pretty and "professional," back it up with some survey data (sometimes), and offer it for very high prices to corporate buyers via a direct sales force that won't stop calling you once they get your name. Fortunately, the news sites are more useful every day, so you really don't need to subscribe, unless you really don't get technology and would rather pay someone else to get it for you.

    Remember, the analysts all said "B2B" would be the next big thing...

  • To reach today's 20,000,000 users Linux had to grow at an annual rate of 537%. (5.37^10 ~ 20M) Gartner is forecasting that it will slow down by 511%. I wonder how much they were paid to think that?
    --
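    The arithmetic behind that claim, for the curious (the ten-year window and the 20M user count are the poster's assumptions, not established figures):

```python
# Compound a 5.37x annual multiplier over ten years, starting from 1 user,
# to check the poster's "5.37^10 ~ 20M" claim.
multiplier = 5.37
years = 10
users = multiplier ** years
print(f"{users:,.0f}")  # roughly 20 million
```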
  • Cool, thanks. Actually, with the help of your pointer, I was able to find what appears to be a more correct (and I think better) version:

    It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.

    I'll change my .sig accordingly.

  • We used SunOS on workstations,

    SunOS wasn't always called SunOS; it's just a flavor of BSD.

    Charging for Unix is a pretty new idea. Unix was free for a long time before it was charged for, and experience with Unix is experience with Unix; especially pre-SVr4 Unix such as SunOS.

    -
  • Until they boot it on a 1024 processor machine, they haven't supported 1024 processors. They are going to find out the same thing that we found out - large hardware exposes large numbers of bugs in your code. We will be (hopefully, pending software and also pending getting enough ceramic covers for our ASICs...) shipping an actual 1024p system in the next few months. We have already shipped a 512p system and have a bunch more on order (two more shipping next week).
  • XFCE is ok
    that's my _opinion_
  • What?
    I use windows, mac and linux. I was
    addressing a specific point that this
    person made. I made a _suggestion_ because
    s/he seemed to be looking for something.

    I was trying to help. I am not a zealot,
    just a good samaritan. If you can't
    understand, then maybe you
    should point the zealot accusation elsewhere.

    I would suggest looking in the mirror.
    ---
    RobK

