Does Open Source Fail the Acid Test?

Norman Lorrain writes "This month's Computer magazine is running an article by Ted Lewis in which he predicts Open Source software will hit a brick wall."
  • I don't understand why Open Source software is so different from normal software that it would fail the acid test. So what if the source code is free? Consumers will still buy the product. There's more to a software product than the intellectual property: there's packaging, manuals, support, and name recognition. The GPL even says that your software doesn't have to be gratis. If I develop software under the GPL, sell the binaries on CD with a manual for $500 with a 90-day support plan, and then sell the source for $1000 with extensive developer support, it's still following the licence. Of course, whoever buys the binaries and source can redistribute them. But who are companies going to trust--a no-name redistributor or the actual developer of the product? It's all about name recognition.
  • Sendmail I can see. But you haven't lived until you've tried to configure sendmail, and I'm sure we all know that. But Emacs and Perl? Bloated? I think not. I've never had Emacs behave in a slow, buggy or crashy manner that would suggest bloatedness. Yes, there are a lot of features I don't use, but how on earth does that make Emacs bloated? Hell, it doesn't even *LOAD* a lot of the features I'm not using. Just because something is extensible, it's bloated?

    Finally, if you're calling Perl bloated, you obviously have never used it for more than half an hour. Perl has got to be the best programming language for rapid prototyping. Things that would take hours to program in any other language take maybe half the time in Perl.

    Frankly, you've got a lot to learn before you go criticizing other people's software.
  • by Zoloft ( 218 )
    as well as the currently snowballing way of things, has already proven him sadly mistaken.

    End of story.
    I'm assuming you used the &nbsp; entity to space out your chart. If this appeared correctly in preview mode but was rendered incorrectly in the submitted comment, be aware that Slashdot tends to translate character codes into the resolved character, then submit these. I've had similar problems with the &gt; and &lt; (> and <) entities.

    The trick is to preview your post, then go back to the original edit screen and clean up any errors. You can't do 'edit' => 'preview' => 'submit'.

  • You want a good Windows Text editor, try Vim. :)

    It's also a good Unix/Be/etc... editor that everyone should use. Of course this is all IMHO. :)
  • > I'm currently enrolled at UNLV and I'm having a tough time finding anyone else that uses Linux... *sheesh*

    So convert your friends :) Since I came here as a freshman last semester, I have seen something like nine new Linux installations just in my dorm building, and my roommate, who works for university software support, says that they are gearing up to formally support Linux along with Windows and MacOS :)

    I'm at the University of Massachusetts at Amherst [umass.edu], by the way.

  • by cduffy ( 652 )
    I use Emacs for all my programming and don't even know the simple key commands you mentioned here. When running in X, it's much (much, much) friendlier.

    I'd hardly call the text editor "archaic". Rather, I find it (with appropriate modes loaded and font-lock enabled) far better than the MSVC++ interface I used to work with. Please be specific in your criticisms of the editor proper. I can't defend the mail/news, calendar and the like, as I don't use them.

    As for Lisp... all I've needed to do so far is read others' code and make very minor changes (setting variables and the like). I don't need to _know_ Lisp to do these things (though I will be needing it for an AI class next year).
  • ...aren't that hard to use, particularly with RPMs. If you're checking RedHat-compiled packages, you need only one key.
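    For what it's worth, here's a minimal sketch of that in Perl (it assumes rpm's --checksig flag and that Red Hat's public key is already on your keyring):

        # check the signature on every package in the current directory
        for my $pkg (glob "*.rpm") {
            system("rpm", "--checksig", $pkg) == 0
                or warn "signature check failed for $pkg\n";
        }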
  • by gavinhall ( 33 )
    Posted by The Orge Captain:

    As I understand the industry at this time, I find the comments made in this article to be fairly uneducated. I believe there is a fad going on, but it is not the open source movement; I believe that will be a trend in the future, as software gets closer to the user, being more custom-built for its intended purpose rather than the one-size-fits-all of today. The fad I see is people who put Linux on their desktop machines, and some of the members of my age group who install Linux thinking they will become some sort of hacker by installing it. Linux is already too complicated for the average user to run, because it was never designed to be run by the average user: the people constructing it were constructing it for their own use, so there was little attempt to make things easy, or even straightforward. There is a tremendous amount of upkeep involved in keeping any Unix system running, particularly in securing it (unlike Microsoft, the Unix community actually fixes vulnerabilities in a timely fashion). I see Linux as NT's nemesis: educated users running the operating system on higher-performance workstations and departmental servers. What I believe is needed in the corporate environment is a less complicated operating system that runs only what corporate desktop users need: a browser, word processing, enterprise ware. This would reduce the number of problems that are created by Windows in a corporate environment (horrendous upkeep and abuse). Linux is an excellent operating system, and I'm glad that I have it.
  • Posted by Steve Linberg:

    > I am not a professional programmer, so I cannot tell if this guy's arguments against Perl are valid or not, although it seems to me that the OO elements in Perl 5.x are completely unneeded by a scripting language, and Perl is not useful for anything but scripting and small to medium CGIs.

    There's debate over the usefulness of OO in Perl - some people love it, some don't - but Perl doesn't force any particular computing paradigm on you. You don't have to use the OO stuff if you don't want to.

    The notion that Perl is only useful for small-to-medium CGIs is ludicrous, however. I use Perl to manage my life. I design and maintain a website with hundreds (and soon to be thousands) of pages, all built in three different versions (text-only, HTML 3.2, HTML 4.0/CSS), which would be impossible without the Perl preprocessing compiler I use (and wrote). I also use ePerl heavily on the server side to interface with the MySQL databases.

    I've been programming professionally for 15 years (and hacking for 20), and I was raised in the trenches of assembler and C. Perl is unlike anything I've ever used. It's incredibly powerful, much faster than any high-level language has any right to be, and it can do just about anything. You might not want to write Photoshop in Perl (or you might), but its creators call it a "Swiss Army chainsaw" for very good reasons. The hundreds of libraries of free, tested code add immense functionality without bloating the language. "Swiss Army tactical nuke" might be a better moniker.

    I can't imagine doing the work I do without Perl.

    My two bits.

  • This silly piece -- insofar as it deserves any attention at all -- is subject to the same refutation used by Dr. Samuel Johnson against another silly theory in the late 18th century:

    Of Bishop Berkeley's theory of the non-existence of matter, Boswell observed that though they were satisfied it was not true, they were unable to refute it. Samuel Johnson struck his foot against a large stone, till he rebounded from it, saying "I refute it thus."
    That is, the writer asserts any number of things that might be true, but (empirically) aren't. Linux isn't showing up in server rooms all over the world because it's the Latest Hot Thing, one of the frequent fads that seem to sweep the IT management community; it's showing up because it is superior to all other available options for solving the problems the individual sysadmins must face.

    If you actually analyze Ted Lewis's article carefully -- an exercise I don't recommend -- you'll see that none of his statistical arguments hold water, even in terms of the mathematics, let alone their factual premises.

    That is, where it isn't simply incoherent. Consider, for example, the following:

    "As the
    Linux 2.0: What It Is and Isn't sidebar shows, Linux has yet to incorporate many of the features and applications that will doom it to a complex future." [p. 127]
    What does this sidebar actually say [p. 125]? That Linux has:
    • Multithreaded kernel
    • SMP
    • Multiple FS support
    • Disk striping and mirroring
    • Multiprotocol networking
    • X11 GUI
    • GNU tools
    • All the Internet tools
    Linux lacks:
    • Video card support
    • Wireless LAN support
    • "A good selection" of productivity tools.
    OK, in other words everything needed for a flexible and high-powered server OS is right there. Notice how lame the "lack" list is, and that the same lacks also apply to the other Unixes -- even granting that there are widely-used video cards that XFree doesn't support (questionable, and how important is this in a server, anyway?), that X.25 doesn't count as a 'wireless LAN' (all the IT managers out there who depend on wireless LANs please raise your hands), and that "a good selection" requires more than two complete suites (StarOffice and Applix).

    Thus adding additional video card support to XFree, additional kernel drivers for wireless LANs, and two or three more office suites will obviously bloat the Linux kernel, while SMP, multiple file systems and net protocols, and RAID support did not. Sure. Right.

    This article would be more at home in a collection of postmodernist academic essays (or perhaps press releases from government bureaucrats) than in a technical journal; it's a wonderful example of content-free writing.

    Craig

  • > Who wants to pay attention to the nuts & bolts and the really CORE mechanics which is dull to say the least ??

    Look at all the contributors to the kernel mailing list on such topics as memory management and file systems (are these "CORE mechanics" enough for you?).

    Everything is fascinating to somebody; Alan Cox got hooked originally because he became fascinated by, of all things, ethernet cards.

    And believe me, if there's a fundamental bug in the memory management or your ethernet card driver, whether these things interest you or not you'll notice it, and probably report it in a whiny and offended tone!

    Craig

  • Yep, damn right I'm greedy. I want great software ASAP. And I know that Microsoft is never going to get me there.
  • One of the biggest advantages of free or Open Source software is that it prevents a single entity such as Microsoft from extending its position through coercion. Bill didn't get to be the richest man in the world by selling great software, and I'm sure this point hasn't been lost on IBM, Compaq, HP, etc.

    TedC

  • If you don't like Emacs, use vi (I like pico).

    If you don't like sendmail, use qmail.

    It isn't like you don't have any choices, unless you install Windows. (Let me know when Windows comes with multiple, independent editors and e-mail servers, and everything else. Then let me know when you find a good *text* editor for it, and a good e-mail server, etc., etc. Then let me know if you had to port UNIX to do it. :)

    As for the article, it was completely wrong. Testing programs for errors has nothing to do with how long their source code is. Seeing as how Linux is currently *the* *dominant* *unix* in the market, I don't see how the other players could do any better. Linux may not scale quite as well on other people's specific enterprise hardware configurations, but it can't do worse than NT. Therefore, Microsoft also fails the acid test.

    What was the test again?
  • by rlk ( 1089 )
    Moral responsibility can be the strongest of all. Linus and RMS and Alan Cox and Larry Wall stake their personal reputations on the quality of their software, and know that if they let their standards slip they'll have to live with the opprobrium of their peers and their consciences. In a typical commercial setting there are a lot of competing interests -- this quarter's revenue, a promise to a key customer -- and this leads to lack of focus on product quality in some cases. Stockholders and VC's don't care if a product has a bug unless it's going to directly impact the bottom line.

    As for the nuts and bolts, many of the best open source programmers work on those aspects. Alan Cox maintains vast swathes of the low levels of the Linux kernel, and owns responsibility for 2.0 maintenance (yes, 2.0.36 is quite recent, and 2.0.37 pre-patches are coming out periodically). Stephen Tweedie and Ingo Molnar do a lot of work on the VM system. Whatever their fame or lack thereof in the outside world, their peers are well aware of their accomplishments.

  • And the macro viruses still wouldn't work on the Linux side of things, because they are dependent on Word, which isn't available for Linux, so a macro virus based on MS Word wouldn't have the slightest idea of how to access the Linux filesystem. Also, since WINE would be handling the actual disk I/O, and not an MS program under it, it shouldn't be too difficult to have WINE look for this sort of activity and redirect it to /dev/null instead of to a hard disk, if it does in fact prove to be a problem.
  • I suspect you and your friends who work for the MS PR department would like very much for people *NOT* to know about these kinds of articles that appear in mags like the one this was published in, so they can go uncontested. Unfortunately for you, Slashdot exposes this kind of garbage to the light of day so it can be subjected to the ridicule that it deserves. Don't for one second think you are fooling anyone with your snarky comments concerning Senegan's motives for alerting people to this article. The tricks you and others at Microsoft pulled with OS/2 and other platforms like the Amiga and Atari ST aren't going to work with OSS and Linux....
  • This guy is obviously twisting the truth here in pursuit of an agenda. The last time I saw someone test Apache vs. IIS (ZDNet, no less), Apache beat it quite handily in the speed department. And Apache also serves to answer another of his arguments, namely code bloat. Apache handles this nicely through the use of modules (as does Perl). Don't want that piece of bloat? OK! It doesn't get loaded at runtime (see the httpd.conf sketch at the end of this comment). The GIMP is a nice illustration of that, too.

    Re: the Solaris vs. Linux code size/defect ratio: the last time I installed Debian, it had *lots* more to it. Of course, in typical Free Software fashion, it's all modularized, so you only get what you ask for. And I've seen Solaris boxes get wedged in X somewhat frequently. kfm even brought a Solaris x86 box here to its knees. It was causing things to go in extreme slow motion, wouldn't die with kill -9, and even survived a switch to single user mode... and NFS was *not* involved. I've never seen that type of thing happen with Linux.
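    To illustrate the module point, here's a hypothetical httpd.conf fragment (the module names are standard Apache DSO modules; paths vary by install):

        # only modules you explicitly ask for get loaded at runtime
        LoadModule status_module  modules/mod_status.so
        # don't want mod_info?  leave it commented out and it never loads
        #LoadModule info_module  modules/mod_info.so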
  • > The emergence of cutting-edge hardware along with the accompanying cutting-edge drivers (for 95/98) happens at such a rapid pace that there's no way Linux could keep up.
    Sure there is. The companies can release their own drivers for their cutting edge hardware. Even if they don't contribute it back as source, as in Creative Labs' case, the drivers can still exist. Even in that worst-case, the overwhelming majority of the code *is* Free.
    > Then there is the last and largest subgroup of the consumer market. I call this group the email-drones. <--snip--> Anyway, back to the point, here is yet another market where Linux will never make any serious inroads. <--snip--> I already get too many calls from them whenever something happens on their machine that they don't understand, and I shudder to think about trying to have them add a directory to their PATH or some such over the phone. Once again, Windows wins because it is easy, not because it is the best.
    You've really got to cut down on the use of NEVER. :) You're starting to sound like those guys who said that airplanes could never fly. I think 'ease of use' for these people is just a matter of how polished the distro is. Besides, do you find it any easier to talk people through fixing Windows when it's scrambled itself? Even if reinstalling is the end-all answer, reinstalling an OS by phone proxy has got to be more of a pain than telling someone how to start pico in non-wrap mode. ;)

    Case in point: I did a phone-proxy Debian installation with a guy who was completely computer-illiterate. Even though I had to supply a special driver for the network card (it wasn't supported in the standard kernels at the time), emailing it to him to be saved on a floppy and transferred to the Linux box wasn't hard at all. And it worked the first time, as expected.

    Remember - the world and its situations are not static. Just because Linux is for the computer-literate now doesn't mean that it won't be accessible to most everyone in a few years. It's already managed to make itself much more accessible than Unix has typically been. :)
  • Agreed, there are a lot of disowned semi-finished, semi-attempted packages out there... personally, I don't think that's such a bad deal. It's a shame there was the wasted effort, but in most cases it appears that these abandoned packages are a result of:

    Another package which is better/farther along

    Real life concerns (usually revolving around real job or school workloads)

    A project leader who doesn't know how to attract people to his banner, or know how to set reasonable goals

    A Boring Project which doesn't capture the interest of the parties involved for the long haul, or never attracts enough people to be completed

    In the case of the first, that's just simple software darwinism... either the coder helps them out or simply looks for something else to do. Saves us from wasted effort in duplication.

    The second, well, not much can be done about that... ya gotta eat, but I've noticed that a lot of people will go and do that stuff and come back, or at the least hand control over to a competent fellow developer.

    The third can be summed up in pretty much one word: Freedows. ;)

    The last is the only one that worries me, and has been identified by others as a 'trouble spot' for OSS software development under the so-named bazaar model... people often shy away from the uncool/boring projects for obvious reasons. I personally see this as an excellent opportunity for profit-motivated coders to fill the gaps in with commercial offerings... YAPMFL (Yet Another Profit Margin From Linux).

    I question this 'untested update' thing you mention though... if you're looking for code stability, chances are good you've already latched onto a decent distro (rh, deb, suse et al) where a centralized body compiles, tests and packages changes prior to release. I've personally found redhat pretty good about supplying stuff which just works out of the .rpm, and seems quick on the draw to fix buggy things (although it took me 2 days to finally DL the wu.ftpd exploit-proofed RPM... they REALLY need to work on that site mirroring from updates.redhat.com).

    --
    rickf@transpect.SPAM-B-GONE.net (remove the SPAM-B-GONE bit)

  • > Just remember, OSS authors: Source code is a LIABILITY, not an asset. We want you to solve problems using *less* source code, not more.

    Source *can* be a liability. Most of the time, it isn't. And remember: we *do* use less source code, because we can spend more time optimizing and reusing the available source, not because we obfuscate it.

    (What was that? 5E+7 LOC for NT? A similarly functional OSS distribution should barely cross 2E+7!)

  • The graph given in the report showing exponential growth in the kernel code is at odds with the data from http://ps.cus.umist.ac.uk/~rhw/kernel.versions.html . Using the tar.gz files listed therein (granted, they're compressed), I made a graph of the kernel sizes over time. Here are the results in ASCII (my apologies for the size and formatting):

    [ASCII chart: compressed kernel tarball size in MB (0 to 12) against release year, 1992 to 1999, rising steadily from under 1 MB to about 11 MB]
    There is no exponential growth in the archive file sizes. I've got graphical versions of the plots, but no site to host them. Anyone want to contribute web space for my extraction source code and graphics?
  • Because it is based on wrong data. Ted Lewis wrote: "Unix has more than 10 million lines of code, while Linux has only 1.5 million." Now, Linus supposedly just said: "Right now its in the amount of 15,000,000 now" (lines of kernel code) (see the transcript of the IRC chat with Linus that was posted here yesterday).

    Linus didn't type that number; someone else was transcribing his phone conversation to IRC, and that person mistyped. The source to the Linux kernel is about 1.5 MLOC.

    However, the article is still wrong, because the error test wasn't comparing kernels but the utilities, and the GNU utilities are not dramatically different in size from the commercial utilities, in terms of KLOC.
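    If you want to sanity-check the kernel figure yourself, here's a rough sketch in Perl, run from the top of a kernel source tree (a crude measure: comments and blank lines are all counted):

        # count lines in all .c, .h and .S files under the current directory
        use File::Find;
        my $total = 0;
        find(sub {
            return unless -f && /\.(?:c|h|S)$/;
            open(my $fh, '<', $_) or return;
            $total++ while <$fh>;
            close($fh);
        }, '.');
        print "$total lines of source\n";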

  • you rock. i think that may be enough. You've got the point. that'll work.
    Reality Warriors are hard to find. Anyone with the hunger must suffice. No surrender.
    Hang on. We'll help.

  • by jd ( 1658 )
    An interesting use of half-truths. Feature-creep DOES lead to bloat... ...BUT Linux doesn't HAVE feature-creep - Linus won't allow it.

    Secondly, the shortage in programmers. A distributed development model means that you almost -can't- have the same kind of shortage a company could have.

    Thirdly, the modular design of Linux solves complexity issues, design issues and stability issues.

  • ...I suppose that proprietary software (for example, Win2000 or Netscape 4.x) is less bloated?

    Daniel
  • IIRC, Emacs is just the core Lisp environment and editor... all those other things are separate programs that happen to run inside Emacs.

    Daniel
  • I sent a response to Computer the day the issue arrived in the mail. I've already received word that it will be published, if they have space.

    I have a copy of my letter on my home machine; maybe I can dig it out and post it.
  • > CORBA is not a solution either, because it lacks something as simple as the shell concept of "let the other program read the output of this program". Let's think about it, and we'll surely find a solution.

    Well, CORBA works on the level of objects and methods, not programs. It does have the concept of methods returning values.

    But the point of the UNIX shell was that programs read and wrote streams of data, which isn't really the case in graphical applications. The closest analogue would be to have one program's window contained in another's, which X of course supports (usually called "swallowing" a window in another). CORBA can be used to negotiate this. This is how BABOON and OpenParts work, AFAIK.

  • For one, the message you were responding to never said that emacs was good for everyone and everything. It just said that it wasn't bloated. Perhaps it's hard to use, and perhaps you don't like its extension language, but that doesn't have anything to do with it being bloated.

    Two, not being a programmer, I'm not surprised that you don't like emacs. I am a professional programmer and it makes my life a hell of a lot easier. I won't get into the details, but it really saves a lot of effort. To be fair, a lot of programmers don't like it either, which is fine for them.

    Three, I agree that having to know lisp to configure it is a pain, but this is no longer the case. There's now a pseudo-graphical configuration mode which requires little to no lisp. I still wouldn't consider it a walk in the park, but it is easier than it used to be.

    Finally, have you ever used Gnus (the news reader)? It's far from bad. I wouldn't call VM bad either, but it's not outstanding.

    That all said, I agree that Emacs is a bit archaic. And it would be better if it were modularized somewhat. As much as I love Gnus, it could do with a real GUI and some machine-native code to speed it up.

    And I'm not sure that all US students do learn Lisp. I did work with AI in college, so I did, but I can't speak for everyone else. Perhaps they should teach Lisp or some other functional programming language to all students, though, just because it's good to be able to think in that paradigm.

  • Just in case anyone is still reading this thread... (sorry, my home computer is broken)

    Sort of. They could be said to read streams of events, and they write drawing requests. But this level of interaction is basically useless for most scripting purposes (though there are a few where it's useful). Just to be clear, the events they read are like "this keycode was pressed; the mouse button was pushed down at coordinate x,y; the cursor entered your window". The drawing requests are like "draw a line from x1,y1 to x2,y2". Also, the event streams are targeted at windows, not programs (where windows include both top-level windows and individual widgets). Piping these together would be basically useless except for things like UI testing.

    If you want to talk about high-level events, that's where CORBA comes in.

  • Complexity management has been a problem in software development from the beginning, and this has been known for a long time (see The Mythical Man-Month by Fred Brooks, a book nearly thirty years old).

    What I find interesting is that there really have been no revolutionary improvements in complexity management in all this time. There have been incremental improvements, such as software tools like more advanced languages to automate some of the most menial work of programming or revision control systems that provide improved concurrency control and communication about software changes, and somewhat improved design methodologies like object-oriented design, but despite some grandiose claims nothing has really solved the problem that designing and building complex software is hard.

    Open Source software is not immune to this problem. What source code availability and open development processes have done is introduce other incremental improvements, mainly increasing the available labor for implementation and debugging. Successful open-source projects tend to remain focused on fairly specific goals (usually development of specific applications) and have efficient centralized management. These usually translate into advantages over commercial software development projects working on projects of the same scale because open development tends to distribute work over more people and more time than commercial developers can afford.

    Lewis doesn't seem to acknowledge that commercial software is equally susceptible to the complexity problem, or that the smaller number of participants in most commercial projects puts them at a disadvantage for doing the grunt work of implementation and debugging. His assertion that open source software is somehow going to be unable to obtain wide market share is, I think, based more on cultural and psychological factors, such as the pervasive belief that you're getting more if you pay for something than if you get it for free, than on any inherent weaknesses of open source software relative to commercial software.

    However, I do think it's important for Open Source software advocates to realize that it is just as susceptible to the complexity problem, and that distributed development is only an incremental rather than a revolutionary improvement in programming technology -- a brute-force solution that uses greater amounts of conventional effort to attack the problem, rather than a fundamental change in approach that would allow similarly complex projects to be done with less effort.

    There are a lot of other flaws in Lewis's article, such as the assumption that Open Source software is about market share rather than people solving problems, that large commercial enterprises are "strong" and open development is "weak", or the various factual errors in his characterization of Linux and other open source software, but the biggest flaw to me is his apparent assumption that commercial development is capable of solving problems that open development is not even though both use essentially the same software engineering methods.

  • by bain ( 1910 )
    Bravo. Spoken like a true prodigy.

    Who would like to volunteer to support a 200-user office moving from M$ to Linux?

    Bain
  • For one thing, this article only looks at the 2.0 Linux kernel; for another, it does not do any comparisons of number of programmers vs. defect density, or try to figure out the defect density of the various Microsoft operating systems.

    Also, if a high-end *nix can have 60% defects and still remain on par with Linux's defect density, then what does that say about bloat? Consider that Linux has virtually all of the functionality of the high-end *nix at the core level.

    As for the argument that defect density and bloat will increase with Linux, that *could* be true if everyone who decided to code open source apps, or work on them, didn't have the Unix mindset of making one thing do what it's supposed to do, and do it well, rather than integrating everything into it.

    As long as there is a will to do something, something will be done.

  • Here we have a magazine with such a clue, they distribute their articles in PDF format (making it harder to read and limiting their ability to make money from banner ads) and require people to log in to read their stuff ("go away, new customers! we don't want you around!").

    And we have their "think tank" consultant from the "Technology Assessment Group" (translation: we can't do it, so we slam everyone who can).

    Even as a columnist, he seems to be lacking on the competence curve. With all the bad spelling ("Torvold"??), erroneous facts (best video support outside Win9x, and a preferred platform for new advances in wireless LAN technology), distortions (the "defect density" BS), bad math (Linux has three times the developers per line of code that NT has, even given his questionable stats, and Linux is undersupported?), and outright lies (Apache "losing the performance battle to IIS"), it's hard to imagine that he even did any research, much less deliver any thoughtful conclusions. The majority of the article is a dissertation on his own theories of how the software market works (all delivered without a single cite or bibliographic reference); thus, when Linux violates some of his "basic principles", it's no wonder that it comes up short in his "analysis".

    If it weren't for the fact that this article is likely to be thrown on my desk as Exhibit A for why someone shouldn't use Linux, I wouldn't even waste my time on it.
  • Anyone have a copy in HTML or something... I hate PDF.
  • This guy is ridiculous. He talks about Linux's reliability being suspect? I don't care what his reasons or arguments are. The proof is in the pudding. Linux development/quality is not slowing down, and it's not going to. OTOH, I can think of one megacorp whose software is gonna be way late :)
    --
  • I really hate to use language like this but it's all BS.

    This is the most monumental pile of fud I have read in years. So let me relax and tear his claims apart.

    1: Linux is compared to Unix when reliability is tested, not Windows NT. This is significant because the only selling point of NT in big offices is that it's cheaper to deploy, and therefore lower quality is acceptable for non-critical uses. As long as Linux is better than NT, it's good enough to crush it. Solaris and AIX can stay where they are for now.

    2: All his embrace-and-extend scenarios assume that the GPL doesn't exist. Simply put, anyone who sells Linux must keep it open. End of story.

    3: He compared Linux's size to that of other Unixes and NT based on the number of lines of code. Sure, Linux is small by that measure. Now put it against SCO based on the features included in the kernel. Suddenly Linux is the lumbering giant.

    4: Apache's once-large dev team has shrunk to fewer than 20 because the hype has died down; or maybe it's because the developers have all the features they want, so only a dozen good men are needed to fix security breaches and tweak performance.

    5: IIS matches Apache in performance. This is the biggest BS of the lot. It's also why I like the idea that apps based on Qt won't be ported to Windows. Apache may be slower than IIS under light loads, when running on NT on a large computer that's barely stressed. Compare Apache on Linux to IIS on NT when both are on identical hardware: huge difference.

    6: Linux is compared to Unix on market share based only on the $$$ spent. Linux is free to cheap; therefore the fact that it's equal to all the other Unixes combined in volume means nothing to this "reporter".

    7: He talks about management difficulties and how Linux development has slowed down since the 2.0 release. In the same section he states how the management problems are being effectively dealt with, i.e., delegation.

    8: He compares the Linux dev team to the NT dev team based on numbers of programmers and testers. Linux has 200 full time while NT has 400 full time. Linux has 1,000 part time; NT has 0 part time. NT has 60,000 beta testers max, and 250 internal testers who can peek at the source code. Linux has over 100,000 testers who peek at the source code. Including me. Too bad that didn't make it in.

    This is the most beautiful FUD fest I have seen in a long time, and yes, it's BS, but it's beautiful BS. We are in the third stage of Gandhi's four-stage struggle. They are fighting us now. Next year we win.
  • The vast majority of the OSS/Linux talking heads seem to neglect the INTENSE stratification that exists in the computer business. I think Microsoft downplayed this very factor when they decided to feed NT to the common user, and are now paying very heavily for it... what's this, there's going to be another consumer-grade OS based on DOS... why? Because NT is not compatible with cutting-edge gaming at its core and never will be. I sure hope that the moron who came up with that strategy has been ripped a new one by his boss.

    I know very little about the requirements of the server market, but I suspect that if reliability and stability are strong factors, then this great rise in NT usage might start to fall away with the advent of W2K, that is, unless W2K is initially released in a very stable form heh heh heh. The next viable solution is a Unix-type platform, and it seems to me that this is where Linux will start to make some serious inroads against NT and the big Unixes.

    Then there's the consumer/business market. I initially clump them together because both of these major markets tend to use similar hardware, whereas the server market uses distinctly more advanced hardware. On the business end, Windows is pretty strong because it's fast. What I mean is that you can order a hundred or more PCs with Windows pre-installed, and with a minimum of effort all these PCs can be hooked up to a central network, and employees can be plunking numbers into spreadsheets and emailing jokes within a very short time. While Linux might offer advantages in maintenance, configurability, and performance, it will be hard to convince the business folks to go for it, because it doesn't come preinstalled (although this might be changing), and when something doesn't work right there isn't a centralized someone to call and bitch at. Never mind that Microsoft charges for support; for a business that isn't a real concern. Add to this the additional cost of training virtually all the employees to use an OS that most have never seen and some have never heard of.

    As far as the consumer market goes, there exist many sublevels. At the top (money-wise) there are the gamers. These are the people who plunk down many thousands of dollars on a predictably regular basis just so they can prance around as cyber-warriors (I have nothing against gamers, but sometimes I wonder if the tons of money might not be better spent). I think this is where Microsoft fumbled badly. The gamers are constantly looking for ways to get their machines to run faster, and with the talk of QuakeIII being multithreaded so it can use multiple processors, and the growing realization that for 32-bit programs that consume a lot of memory NT offers a large performance advantage over 95/98, it should be obvious that Microsoft saw a way to feed an OS to a segment of people who are willing to spend a lot of money on a very regular basis. Unfortunately, the very things that make NT run better on high-end machines also make it a nightmare for games that require direct hardware access. That Microsoft thought (and possibly still thinks) that they could have their cake and eat it too still amazes me. But I have digressed. This is another market that Linux will not have an influence in. The emergence of cutting-edge hardware along with the accompanying cutting-edge drivers (for 95/98) happens at such a rapid pace that there's no way Linux could keep up. Couple this with an already severe lack of game support (there are a lot of mainstream games other than Quake and Heretic), and the stage is set for 95/98 to rule this market segment.

    Now for the consumers who are knowledgeable about their machines and are eager to learn more. Generally these tend to be students at all levels, aspiring young programmers who are trying to learn as much as possible as quickly as possible and don't have to worry about things like a 9-to-5 job. Here is where Linux already lives, and is probably growing rapidly. Personally, this is where I fall. I've only been messing with computers for a couple of years now, but I've already worn out Windows 95, and just the other day I hosed my NT installation so that I might be "all Linux". I maintain a small 98 installation just for games. I think this segment of the consumer market is approximately the same size as the gamer segment, and possibly just a little larger. The only problem is that within this segment are also the people who use computers for what they are intended for, and hence don't need the latest new hardware, so their dollars tend to stay close to home instead of being scattered into the market. While this is a good personal philosophy, it really sucks for the market segment, as it lessens the segment's importance. Linux also amplifies this sentiment. I have two machines at my house: one is a dual PPro that is my primary machine (and even though it is considered antiquated by many, it is still blazingly fast for everything that I ever need to do), and the other is a 486/66. When I was given the 486 it was slogging away with Win95, and I was very impressed at the performance increase gained just by changing the OS to Linux. So the people who never felt like they needed a huge machine to begin with now have yet another reason not to spend their money.

    Then there is the last and largest subgroup of the consumer market. I call this group the email-drones. The people who belong to this group are the vast multitudes who are buying PCs because Betty down the street has one, and if Jenny gets one then she can send email to Betty, who can then reply to her email, all without picking up the telephone. And then there's the chat and the ICQ, and before you know it the beauty salon of the 50s has moved into the den and onto the computer screen. This market group is the single reason for the existence of both the iMac and the $500 PC. Everyone was shocked when the iMac didn't come with a floppy drive... well, when I mentioned this in passing conversation to my mother, she said right away "What do you need one of those for?". I rest my case. I'm also not trying to be in the least sexist by seemingly populating this large market segment with stereotypical images of the "gossiping housewife". I use this image as a behavior descriptor only, as it applies to as many males as females. Anyway, back to the point: here is yet another market where Linux will never make any serious inroads. My parents are still using an antiquated P75 machine, and every time my father grumbles about things taking a long time to load and/or programs crashing, I think about setting up a Linux installation on their machine, as I know Linux could easily fulfill every one of their computing needs, and it would do it faster and better. But I already get too many calls from them whenever something happens on their machine that they don't understand, and I shudder to think about trying to have them add a directory to their PATH or some such over the phone. Once again, Windows wins because it is easy, not because it is the best.

    So what does this have to do with the article? Well, I think that Linux will continue to grow in both market size and stature, and that it will become a real competitor to Microsoft in many areas, especially at the high and low end. But unfortunately the real meat is in the upper and lower middle of the market, and I suspect that the ability of Linux to penetrate this area is slim to none, but not for any of the reasons described in the article; rather, for the simple reason that the demands of those market segments do not mesh with the attributes and philosophy of the Linux OS. Personally I think this is a good thing. There's a reason I have a 98 installation: it plays Age of Empires REAL well, and it gives me a place to go to "relax" at the computer. I like having the "play" separated from the "work" in this manner, and while I remain a strong admirer and advocate of the Linux OS, I think the first question that must always be asked is "What do you plan on doing with your PC?". And I'm glad that Linux is not the solution to the reply "Umm... I don't know... games and stuff..."

    Sean
  • Personally, I've always found the core mechanics the most fascinating, (sometimes to the exclusion of all else). The nuts and bolts are what makes Linux powerful and elegant.

    As far as fame goes, Linus receives the most by dealing with core kernel code, the core of the core as it were. Alan Cox, working (mainly) on core kernel and driver code, currently receives second billing. RMS's contributions to make, glibc, and a long list of other tools get far less attention than they deserve. And for the developers working on the "fancy" projects such as X11, XFree86, GNOME, KDE, StarOffice and the GIMP, great as their contribution is, there is far less name recognition.

  • The author states that code bloat and featurism will become a problem in the long term. Well, free software has been around for a long time, and we've seen no real evidence of it yet. Most free software projects go through an early phase of rapid development and then settle down to a quiet middle age. Remember how often GCC and Emacs were released in the early 90's? Now minor releases are made at about one-year intervals.

    What really happens when a project matures is that the original programmers strike off into new areas (like GUI desktops) and leave their previous work to those interested in maintaining it.
  • I'd be more convinced if he had given some (ANY!) examples of popular open-source software floundering under its own success. Until he can do that, his whole rant is mere speculation.

    BTW, what did he mean by "Linux going commercial?" at the end of the article? It's GPL, ace.
  • This article was so full of factual errors and misconceptions that I think the author didn't do any research at all, and is completely ignorant of the topic he writes about.

    > The business case for Linux and its open source software cousins is based on two fundamentals: the software is free or nearly so, and the availability of the source code attracts a large number of debuggers - again for free or nearly so.

    No, the business case for Linux and OSS is based on these two fundamentals: it performs better than its proprietary counterparts, and it fails less often. The author goes to great lengths to describe how total cost of ownership is a key factor in buying decisions, yet fails to realize that TCO is influenced most by unit failure rate, and studies have shown that the unit failure rate for Linux is far less than for Windows. Thus, TCO should be far less for Linux, simply because you don't need support as often.

    > ... the failure rate for the utilities on the freely distributed Linux version of Unix was ... 9 percent ... Unix has more than 10 million lines of code, while Linux has only 1.5 million. So the Unix defect rate could have been as high as 60 percent and still paralleled that of Linux.

    This is a simple math bug. The failure rate for Linux is 9 percent. If Linux were ten times its current size, the failure rate would probably still be 9 percent. That's what percentages are - ratios. His "defect density" sounds to me like exactly the statistic he is refuting (thus, the defect rate of Linux is 9%, compared to 15-43% for other Unixes), since a density is a ratio. And 60% of 10 million sounds like a lot more than 9% of 1.5 million.
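    To see what his arithmetic is actually doing, here it is in a few lines of Perl (only the percentages and LOC figures come from the article; the failures-per-KLOC framing is my reading of his "defect density"):

        my $linux_rate = 0.09;    # failing utilities / utilities tested
        my $unix_rate  = 0.60;    # his "could have been as high as" figure
        my ($linux_kloc, $unix_kloc) = (1_500, 10_000);

        # reading 1: the failure rate is already a ratio, so size drops out
        printf "failure rates: Linux %.0f%%, Unix %.0f%%\n",
            100 * $linux_rate, 100 * $unix_rate;

        # reading 2: his "density" spreads the rate over code size, which
        # makes a 60% failure rate look exactly as good as a 9% one
        printf "per KLOC: Linux %.6f, Unix %.6f\n",
            $linux_rate / $linux_kloc, $unix_rate / $unix_kloc;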

    "Linux, once again, has had over 1,000 people submit patches, bug fixes, etc. and has had over 200 individuals directly contribute code to the kernel. ... Microsoft's core development team consists of 400 full-time program-mers and 250 testers... When compared to the size of the Windows NT effort, Linux is woefully undersupported.

    Microsoft has only two times the core programmers, but they have six times as many lines of code to work on, and Linux is the one that's undersupported? That works out to roughly 44 programmers per million lines for NT versus about 133 for Linux: Linux has three times as many programmers per line of code as Microsoft.

    > Already, Apache is losing the performance battle against Microsoft's IIS.

    Didn't he see the recent article by ZDNet showing that Apache outperformed IIS by a wide margin?

    > Even the open source community admits to this weakness: "The biggest roadblock for open source software projects is dealing with exponential growth and management costs as a project is scaled up in terms of innovation and size" (http://www.opensource.org/halloween1.html).

    He quotes a Microsoft source saying Linux is in trouble, then attributes it to the Linux community. Of course Microsoft is going to put Linux down. When taken in the correct context with the correct attribution, this statement no longer serves his purpose.

    > So we can rule out any scenario in which Microsoft takes over Linux.

    The GPL would prevent this anyway. Even if Microsoft tried to "absorb and extend", they'd never be able to do it without violating copyright laws, because any extending they did would have to be released as soon as they shipped it to any customer. Even if they try to tie applications to "their" Linux, any work they'd done to make it "their" Linux would have to be released under the GPL, and could be replicated in other Linux variants.

    > A third scenario is most likely: Linux will turn commercial

    Linux is commercial. What do you think Cygnus, Red Hat, Caldera, SuSE, and Pacific HiTech are doing with it? They certainly aren't charities - they're out for profit.

    > To make a lasting impression, software developers still must cope with absorb-and-extend and other techniques of the strong. To do so, they will have to retain a certain amount of proprietary code in their products and charge for their intellectual property, one way or the other.

    First off, as stated before, absorb and extend just won't work with GPL'd code. It can't. Any work Microsoft puts into enhancing Linux (sans applications, which are effectively independent of Linux itself) becomes part of Linux, which only improves it for everyone else too. Any support Microsoft puts into Linux would only help the Linux community. Second, Linux's "intellectual property" isn't the code, it's the people behind it. Companies like Cygnus, Red Hat, and Caldera make lots of money off OSS by having the right people and services, not by having proprietary code. Microsoft can't duplicate that without a significant investment in people and time.

  • I guess if Open Source software succeeded, it would contradict something he wrote in one of his books.

    Grrrr... I don't know why a hatchet job by some clueless economist should make me angry, but it does. My only consolation is, tomorrow I'll have forgotten his article, but he'll still be afraid of Open Source.

  • Who? Guys who enjoy such work.
  • "To qualify as a world-class success and not just a fad, each new product or method must pass the acid test of 'crossing the chasm' that separates early adoption from mainstream acceptance. Linux, and open source in general, fails this acid test."

    Or, more accurately, perhaps hasn't yet passed this acid test. It can only be said to have failed that test if, after some "reasonable" period of time, it hasn't passed it; otherwise, simply by discussing a product early enough in its life cycle, one could dismiss almost any new product as "failing this acid test".

    Perhaps it will fail the test, and perhaps it won't. His article should be treated as a prediction to be tested against future reality, not as a firm description of what will happen. (A phrase that's more and more in my mind these days is "Just because somebody says something, that doesn't necessarily mean it's true.")

  • I was about to try to answer without flaming but I find myself unable to do that. Here are some glaring stupidities:


    "... The concept of free software is a well-known and frequently practised strategy of the weak"

    He then draws a parallel between AT&T's licensing of System V and Linux's openness. AT&T's strategy was a market-driven, commercial venture; Linux's openness isn't.


    "When HP and Sun acknowledge Linux as a viable challenger, we will see its rapid deconstruction as these competitors first embrace and the extend its advantages. Linux is on the verge of being squashed be the strong and will probably not survive in its present form."

    Here's the question of code forking again, and I must admit that it's worried me before. Thankfully, the open licenses under which the Open Projects operate do not permit it, as Linus said in the MSNBC chat.


    About the failure rate of Unix utilities: "... it should not be confused with defect density, a more reliable metric... Unix has more than 10 million lines of code, while Linux has only 1.5 million." His conclusion is that Linux's defect density (failures/lines_of_code) is much higher.

    Wrong! The correct ratio is failures/functionality. If a product does what it should do in fewer lines of code, that's actually a benefit, because that's the mark of better, cleaner code that's much easier to debug.


    "Thus, Linux's reliability is suspect. In fact, we can expect Linux defect densities to get worse..."

    The point is that I've run Linux at home for the last 7 years on four different computers, and at work for the last 5 years on many, many different computers, and in that time I have had four unscheduled crashes. Yup, I can count them on ONE hand. I have also used Windows for the last two months, and since then I have had more than 20 crashes and have had to re-install 3 times.

    Linux gives me more functionality than Windows, for a better price, and keeps my personal data (the most important part of my computer) safe.


    He later goes on to compare the numbers of full-time programmers at work on Linux and at Microsoft, and the numbers of beta-testers, pointing out that Microsoft has so many more.

    Well, that doesn't reflect very well on them, now, does it? Considering that they're challenged, and therefore put on the same footing as the Open people.


    "... Support diminishes still further when the hype wears off an open source application... Mozilla's mailing list declined by 58 percent... Apache..was developed by a cast of thousands is now supported by fewer than 20 core members. Already, Apache is losing the performance battle against Microsoft's IIS."

    Mozilla's mailing list: expected. That's called stabilization. Apache's development team: that's all that's needed, baby! Apache's performance: one wonders where he gets his numbers.


    Page 2.


    "... Unfortunately, these exceptionally talented programmers (Open source) are in limited supply and, as any open source program becomes widely distributed, this talent will become increasingly scarce."

    Limited supply? As far as I know, the human race hasn't become infertile all of a sudden. Anybody can become an Open Source programmer; in fact, zillions of kids now in school are getting in on the act. That's your supply.


    And this is where I quit. The next bits made me angry enough that the only thing I could reply was "F**k off and die, you idiot!" I guess that qualifies as flaming. Oh well.

    DannyC.

  • cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks cypherpunks

    (you get the idea)

    I don't think Slashdot posts any articles that haven't been 'cypherpunks' enabled.

  • Measures of defect density are meaningless, for the most part. I should know, because I have worked as a Software Metrics dude as a full-time job. Now, I must admit that I haven't read the article, but if he says that Linux has a higher defect density than other Unices, then this can be accounted for quite easily.

    Consider one difference between Linux and other Unices, namely the definition of a "standard" distribution. There isn't really any such thing in Linux. As a result, we pretty much have an infinite number of systems, since any two programs that interoperate within the system can be classed as a sub-system. Ergo, we have a lot more scope for interoperability problems. If you want to bump up the defect density for propaganda reasons, you just count as many individual incompatibilities as you want, but treat the "area" as fixed. So you can basically prove anything in terms of your numbers.

    There's another difference here between Linux and other unices. Namely, users are expected to have a bit more common sense when it comes to ironing out the kinks. If there's a problem with one bit of software, they can often leave it aside and work out how to fix it later. They can *still* have a system that works well, and still has a lot more features than the equivalent "other" unices.

    Also, compare this with Microsoft's method of producing software. They don't give a damn about defect density. They realise, too, that it doesn't tell you anything. Instead, what they do is classify bugs according to their impact. Then they trade off testing and bug-fixing so that they only fix the major, high-impact stuff. Then they release what are effectively beta versions and let the customers find the niggling errors that aren't too serious.

    This seems to be a new form of FUD tactic from a pro-Microsoft head. Since Linux is the enemy, simply pit Linux heads against Unix heads. It doesn't matter if the issue is irrelevant. It diverts attention from the real issues.

    Beware of statisticians: numbers are an easy source of divisiveness.
  • I saw a great many legitimate gripes with this article. One thing that has not been mentioned yet is the growth of the Internet's partial responsibility for the OSS revolution. I have to wonder if that was a strategic omission. The net definitely makes it much more feasible for developers to spontaneously coordinate with little to no overhead. That flexibility of the Internet renders figures that do not weigh the recent popularity of IPv4 in fair proportion slanted and antique in nature.

    With the solid statistics just mentioned backing it up, I knew the material in this article had originated from somewhere else. After a little searching I turned up a group who had done similar research and turned up similar results.


    That group's transcript...

    Did you dress her up like this?

    No, no... no... yes. Yes, yes, a bit, a bit. She has got a wart.
    She turned me into a newt!

    A newt?

    I got better.

    What do you do with witches?

    Burn! Burn, burn them up!

    And what do you burn apart from witches?

    More witches! Wood! So, why do witches burn? [pause]

    B--... 'cause they're made of wood...?

    Good!

    We have found a witch, may we burn her?
    What also floats in water?

    Bread!

    Apples!

    Uh, very small rocks!
    I'm not a witch, I'm not a witch!!

    They dressed me up like this.

    No we didn't!

    No! No!

    And this isn't my nose, it's a false one!

    So, how do we tell whether she is made of wood?

    Build a bridge out of her!!
    A duck! Exactly! So, logically... If she weighs the same as a duck, she's made of wood? And therefore?

    A witch! A witch! A witch! A witch!
    Right, remove the supports!
    [whop] [creak]

    A witch! A witch!
    It's a fair cop.
    Burn her! Burn her!
  • Heh.. try running Emacs on a 386SX-16 with 8 megs of RAM. Then you'll understand the true nature of Emacs: Eighty Megs And Constantly Swapping.

  • I started out reading this thing in the hope that I would see a well-done study of the possible limitations of the OS development model. Instead I find the usual crappy, unsubstantiated semi-arguments that we see whenever somebody claims to have given the OS movement a serious, critical look.

    I found Mr. Lewis to be full of himself, and this finding was substantiated when I went to the website http://www.friction-free-economy.com [friction-f...conomy.com] that promotes his book. In the section called "about the book", he goes on and on about how science and technology joined economy in the industrial revolution, and how the current powers that be/were didn't understand what was going on. Let me point to a couple of phrases that I found particularly amusing, given his stance on OS's role in the current "revolution":

    - "Drucker's Law still applies: the people in the midst of the revolution don't know what is hitting them. And they won't know until after the IPO1 is over. Like passengers in a speeding boat, spectators in the software age know the river's current is swift, but they don't know where the raging falls lie."

    Too true. But I find it rather pathetic that he rants about how people failed to understand the industrial revolution, and then commits an article which thoroughly demonstrates that he hasn't properly understood the networking revolution he claims to be preaching. Mr. Lewis has no idea what's hitting him, and thus serves as a poor guide for others.

    - "It may be too fast for royalty [drawing a parallel to the powermongers of the Industrial revolution, here], but the software economy is on its way. It may be a mystery to the establishment, but it is well understood by the Netheads2 in Wired World3. It may violate the doctrine handed down by classical economists, but it does follow a set of laws. It may be just in time."

    A mystery indeed, Lewis, and indeed one that you don't grasp the way you claim. Oh, and by the way, we apologize for violating your doctrine. The final sentence in the previous paragraph points to the next:

    - "The late 20th century marks the beginning of the end. Within the next 20 years we will discover the new laws of the software economy. We have early warning signs - Netscape Communications Corp. parleyed 16 months of software development work into an IPO valuation of $3 billion. Companies like Microsoft and Adobe Systems which were unknown a decade ago are now the darlings of the stock market, and the nouveau riche telecommunications industries like 3COM, Cisco Systems, and Bay Networks have turned from small-cap industries into powerhouses of the new century."

    Okay, people: this guy is a friggin' Ph.D.; he holds degrees in Mathematics and Computer Science, so let's not write him off as an idiot (even though he claims Microsoft was unknown a decade ago). His angle is just skewed. He has published numerous books, and the one that probably sheds the most light on his skewed vision of computing is the one from 1976 called "How To Profit From Your Personal Computer".

    From this, we may postulate that Mr. Lewis is probably hooked on the income side of computing. Don't get me wrong, though; I have no problem with anybody making money by facilitating my work. However, this places Lewis among the current day's power-mongers (I may be inflating his ego here, but the guy is Chairman of Computer Science at the Naval Postgraduate School in Monterey, California, which may or may not be a big deal - kind of hard for me to know from where I sit), and if we were to draw a parallel from his own account of the industrial revolution, he should thus be among the last to know what hits him in this revolution. According to his recent writings, he appears dead on schedule...

    Far from being discouraged by Mr. Lewis' futile attempts to discredit the OS movement (some of his points would be valid, had they not drowned in the rest of his rubbish), we should see this as a confirmation that OS is on the right path. This path will need some adjustment from time to time, and some of the necessary adjustments may stem from people with no better understanding of the cause than that of Mr. Lewis.

    Those who don't know their history are doomed to repeat it...

  • I don't think Mr. Lewis has a firm grasp on the whole reason Free Software (FS) exists.

    First off, Linux and FS in general was not started to squish MS. I still don't think it exists to squish MS. That's not the point. This is a community center that all are welcome in. These community volunteers are not here to build a non-profit organization to put the Embassy Suites out of business. Just a place for people who want a good, down to earth, quality place to go. One that they, too, are free to contribute to. A place that can exist after the founders leave, for whatever reason.

    I don't know about the majority here, but I started using linux for a few core reasons.

    1) I wanted UNIX. Period. UNIX has power that NT just can't achieve. It's flexible, stable, logical, and versatile.

    2) I wanted to learn to code. Having source code to everything is a big plus there. I can peruse the kernel, KDE, Afterstep, sendmail, whatever I want. That makes for some great learning material. Plus, I can reuse this code freely.

    3) I am never held captive to a bug again. Without even claiming to be a programmer (I am not) I can say that I have fixed a bug for myself. A NIC driver. Not some glamorous GUI office suite. Just a NIC driver for a card I needed to get working right at that very time. Stores closed, no money, choice between 2 NICs, one broken, and one with a driver that would not compile. Read the error messages from the compiler, fixed what logically seemed to be the problem, and Shazam... a working network.

    None of this has anything to do with MS other than you can't do it with MS :)

    The biggest reason I try not to use Windows (of any version) is that I have lost work. A lot of work. Windows freezes, and occasionally takes a partition with it. Not often, but how often does it need to?

    I do disagree with MS in their business practices, but that still is not the reason I use linux and GNU software. I use it because it's just better. I can get things done. And I can learn, which is very important to me.

    Do I care if corporate (insert country here) adopts linux? Only to the extent that I may get the one application I really need that isn't on linux yet, and that hardware developers will start contributing open sourced drivers. I think I'll get the drivers and apps with or without this, so in the end, it really doesn't matter.

    As to the remarks about bloat and featurism: I see checks and balances here. The programmers ARE the users. If RedHat decided to start playing "embrace and extend" tomorrow, what would happen? Well, I think the first thing that would happen is that the keepers of the code (us... the users and programmers, by grant of the GPL) would just revert the code. And stop using RedHat. RedHat would sink like a rock, and they must know that. Aside from being able to fix and use software on your own terms, this is the most important aspect of Free Software. Without us, they don't exist. And since anybody can start a company to take their place, using the very package they put out, they hold on by honor and reputation alone. Anything they add that isn't liked by the rest of the FS programming world will be an orphaned bastard child left on the side of the road. There are plenty of other distros to take their current place at the top (they are top in the US, anyway, for the time being).

    Those of us who really want to use linux/*BSD/HURD/etc... are in no danger of losing it. We will still wake up tomorrow and have all the code. Nothing lost, nothing to worry about. And there will always be someone that wants to make it do something THEY need. And we will all benefit. New day, same way.

    This is linux. You will NOT be assimilated, but you are always welcome to join in. We're having fun over here :)
  • > The Unix error defect could have been as high as
    > 60% and still parrelled that of Linux.

    That's absurd. Think about it. In practical use, what matters is how often it fails for you. 60% is unusable. Linux's bugs are far from making it unusable. While his comparison based on lines of code may make sense to him, it doesn't make any sense to me.
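
    For what it's worth, here is the arithmetic I assume is behind the parent's 60% figure (a sketch with invented defect counts; the 1.5M/10M line counts are the ones floating around this discussion):

        # Equal defect *density* lets the bigger code base carry
        # proportionally more defects -- invented numbers, in Python.
        linux_loc = 1_500_000
        unix_loc = 10_000_000

        linux_defects = 1350                  # hypothetical
        density = linux_defects / linux_loc   # 0.0009 defects per line

        # At the very same density, the bigger system ships ~6.7x the bugs:
        print(density * unix_loc)             # 9000.0

        # Which is why density parity proves nothing about practical use:
        # a user experiences failures per task, not defects per line.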

  • Call it OpenNews if you want or OpenJournalism..

    I hate clicking on an article to find out I have to pay to read it.. Just say no!

    StickBoy
  • I fear he may be right about feature creep leading to bugs. Look no further than Microsoft for a perfect example.
  • you can look at it either way. The manufacturers need to feel a greater demand for it than they do now. Wouldn't it be easier for the people who know exactly how the hardware works to write a driver for an OS for which they can get the source than the other way around?
  • His argument seems to hinge on the fact that there are only so many programmers with the will and talent to produce quality open source software. The problem with this logic is that open source software itself actually CREATES more talented programmers, because would-be coders and programming students can learn by reading actual CODE! While I don't believe that ALL code should be open source, I do believe the operating system and core components should be; otherwise you end up with microserfs using the advantages that only they as the OS provider have to squash everyone else. An OS owned by the people is the only way to prevent that.
  • I have an essay overdue on proof by contradiction, but let me try to apply it here...

    It is clearly true that an expensive system with bloat, high defect density, and a remarkable snobbishness towards customer feedback has indeed become quite mainstream, particularly in the operating system market, so this argument is fairly weak from the beginning.

    Now, it is not really clear whether Linux will or will not suffer from the kinds of bloat and creeping featuritis that he describes. Speaking of the kernel itself, its stewardship still lies with one person, who reads and re-reads every single patch that is applied to be sure everything is done correctly. This is a distinguishing characteristic of the Linux kernel which almost certainly contributes greatly to the tightness of its code, particularly given its large featureset.

    As for maintenance, the problem is clearly a lot easier with open source software. One has the same recourse as before (complain to the author/vendor), with the added benefit that one can ask any arbitrary third party to fix the problem, then submit a patch to have it fixed generally if one so chooses.

    Basically, the trouble with his argument is that it is entirely speculation, without any evidence to support it. It is logical to think that free software could not possibly work conceptually, particularly with proprietary software being so much more profitable, but it is also demonstrable that this is not the case.

    In general, it is free software's more efficient nature (compare RedHat's employee count and market penetration to Sun's employee count and market penetration) that will make it work, most likely.

  • I think the MS page is Slashdotted. :-)
    Or maybe their scripting needs debugging...

    Microsoft VBScript runtime error '800a01f4'

    Variable is undefined: 'URL2Code'

    /NTServer/inc/global.inc, line 30
  • While many of his arguments were flawed as has been covered by other contributors, the fact that he butchered Linus' last name severely discredits the entire article IMO.

    Let's face it folks, nobody *knows* what the future will bring. It's all mental masturbation. All I know is it's fuckin' great to be alive in a world where I have a better-than-Ultrix 4.2 (what I was brought up on) Unix workstation on my desk *at home*. I only dreamed of that 5 years ago.

    As far as the future of Linux goes, let the people decide. Linux is what you make of it. Nobody will force bloat down your throat (hehe). The modularization of the kernel is such a wonderful thing. I'm not a kernel programmer, but I can't understand why the central kernel code itself should need to rise exponentially... I'd guess more along the lines of *linearly* if it's being done 'right'. And I imagine the module code will increase with whatever the market brings. But that's just mental masturbation ;)

    Orp
  • Compare Netscape Communicator to Gecko.

    'nuff said. Lewis's claim of bloat killing OSS software is bunk, because the link between feature creep and bloat isn't there.


    Feature creep in OSS means somebody offering new command utilities, maybe a GUI here and there. If I don't put it on my hard drive, it isn't bloat.

  • So according to this guy, if I write a killer app, compile it for Linux, and sell it binaries-only, that will somehow make Linus wither like the wicked witch of the west once watered?

    Worst case RMS won't invite me to his Superbowl party.

    Then there's his statement "but Linux is still a Unix, and Unix is still losing market share". That's not FUD. It's fudging.
  • Hmm. Let's see.

    Sendmail, BIND, Apache, and INN are probably some of the MOST used software on the internet. The versions bundled with most "commercial" *nixes are derived from the OSS versions, and most installations immediately replace the vendor-supplied versions with the latest OSS editions.

    Nope, that's not mainstream. Clearly, we hit the brick wall and just kept going.
  • The Slashdot effect is great if it helps free projects, but if you quote from pay or registration sources, you just dignify their behavior and bring new customers to them.

    I find it really annoying when I click on a link on Slashdot just to find that I have to register or pay some company. No, thanks.

    And I find it preposterous that some people here actually comment on a short quote from a larger article that they could not read in total because you have to pay for it. Read it completely and THEN you can comment on it.
  • Bloatware is the result of one thing: a company iterating through version numbers to get more revenue. Open-source projects are immune to this trend, because open-source projects aren't driven by corporations, but by necessity.

    Programs in the open source model exist to solve a problem, not to get people to buy it.

    As far as 'not main-stream acceptance', what about Apache? sendmail? bind? XFree86?
    Granted, not everyone is using those programs, but if he's trying to predict what Linux (any *nix really) will be as far as a desktop operating system in several years... well... he's nuts.

    Also, he fails to grasp that new talented programmers are created daily. Linux is quite popular on campuses. I can't cross campus without hearing about it somewhere. Linux can easily move in to capture future talented programmers in a way that Apple's and Microsoft's astroturf campaigns cannot possibly match.

    Real computer science majors won't touch Microsoft or Apple products. That's why the business building has the shrine to Microsoft, and not any of the many computer buildings. (Macs are for the liberal arts people).



  • But these are *new* Linux users, not old time "Unix is everything" people. I hear people asking their friends what distro they should install and I see people with Running Linux books on their desks in my Computer Science classes. It was NOT this popular last year or the other three years that I've been attending college.

  • I go to the University of Texas at Austin. There's LOTS of Linux people around!

    Consequently, a lot of the pro-Linux folks from Dell and IBM get hired from here. Many of my professors are also working at IBM and they all talk about how you can't walk through their buildings without seeing penguin stickers on the doors.

    It's encouraging that there will be a future beyond working with the broken Win32 SDK; and with the trial going so well, a future beyond working for Microsoft.

    (Not that I ever would)

  • ...because, while the author questions other people's statistics, he doesn't provide any inkling as to where he got most of his. So when he says that Linux has a higher 'defect density', he doesn't quantify this -- most likely because he cannot.
    But this article is a great argument if one assumes that the industry will stay as it is, which has always been risky. IBM and others thought that mainframes would always rule the earth, Microsoft was once unknown too. Hell, I remember not very long ago when everybody did the BBS thing and life was good. But things change.
    This article also seems to assume that CIOs will continue to have huge supplies of cash to throw at technologies and at blaming others (i.e., look at what the support industry at present really boils down to), and just as the computer has whetted the corporate appetite for more profits, there has to be another target after all is eked out from these machines. And guess who'll be in line? That's right, the ones who keep buying things for them.

    But the largest problem, and the problem that the Linux community has to deal with right now, is that we're in a position where we just don't have to care about being commercial, or being big, or so on. This article treats Linux as a business, which it never intended to become. But more and more now I see people who are treating it as if it should be profit-motivated, and it's a shame that the chase after money has caught so many of us. Yes, Linus can be considered a bottleneck, but only if one feels that a certain schedule of releases must be kept up, which just isn't the case. Too often, we seem to be sacrificing our ideals to win approval of business, and by-and-by most of what business cares about here is making money, not keeping freedom.
  • He may or may not have some valid comments there about why OSS software will succeed or fail.

    Who cares?

    I don't give a rat's fart whether OSS succeeds or fails in the commercial market. I use it because I like it. Not because everyone else says I should use it.

    If businesses decide not to go with OSS software it's their choice, not mine. I'll still stick with Linux.

  • Maybe some of the OSS projects will go down, probably others will arise....
    Stability is a question of moving, not unlimited growth. Take a look at evolution...

    Regards
    kampi
  • I do not have any conclusive refutation of his conclusions (that Linux will die of instability, be embraced-and-extended by MS, or have problems increasing market share), but I can pick a hell of a lot of holes in Ted Lewis' reasoning.

    Firstly, it is a more-or-less established fact that the OS market is not friction-free. Any OS provider experiences increasing returns as support from hardware OEMs and ISVs comes on line. This is roughly where Linux is now.

    Secondly, he seems to be comparing apples with oranges in trying to calculate the defect density. He says Linux has only 1.2 million lines of code. This is the figure for the kernel. But he takes the defect rate for the *entire* *system*, including the bundled utilities, and compares it to the (probably correctly calculated) defect density for some unnamed commercial Unix. He goes on to say that all one-off figures are suspect, but supposes that this somehow casts doubt on Linux's reliability.
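
    To spell out the mismatch, a quick sketch (bug counts invented; the line counts are the ones quoted above): divide the same defect total by two different denominators and Linux looks several times worse without a single extra bug.

        # Apples vs. oranges, in Python -- invented bug counts.
        bugs = 2400  # pretend both sides logged the same defect total

        loc_linux_kernel_only = 1_200_000    # denominator used for Linux
        loc_unix_whole_system = 10_000_000   # denominator used for "Unix"

        print(bugs / loc_linux_kernel_only * 1000)   # 2.0 defects/KLOC
        print(bugs / loc_unix_whole_system * 1000)   # 0.24 defects/KLOC
        # Same bugs, mismatched denominators: Linux "measures" ~8x worse.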

    His list of features missing from Linux is truly pathetic. Linux (or rather XFree86) has video card support second only to Windows 98, and anyway this is an irrelevance on a server platform. The same goes for productivity software. Show me a wireless LAN worth using and I will believe the last point. He goes on to say that the addition of these features will destabilise the system. How, exactly? These are pifflingly small additions compared with the work already done.

    He goes on to make some dubious comparisons between the Linux and Apache core team sizes and those of the competing teams in Microsoft. He does not seem to know that there is no general correlation between team size and productivity or stability of the resulting software, and willfully ignores the fact that the core teams are only a small part of the development effort and an even smaller part of the debugging effort.

    He quotes the suspect statistics from the Halloween document showing a decline in mailing-list traffic on the Mozilla groups (which proves nothing), and the equally suspect statistic that IIS is faster than Apache (yes, but only on Win32; beyond 12 users or so, any Linux platform running Apache thrashes IIS on NT into the ground).

    He believes that Linux is 'just Unix' and Unix is losing market share to NT. I think this misses the salient point. People want to use Intel hardware because it is cheap - they very much do not want to use MS operating systems. Linux runs better on Intel hardware than any other OS - therefore it is primarily competing with NT as a low cost OS, not with Unix as a Unix.

    Finally, he seems to believe that Linux is vulnerable to embrace-and-extend, but he does not say how. Any attempt by Microsoft or anyone else to do so would face ongoing competition from the open source community and an obligation to release any modifications to the core system. That would make Linux a harder target to beat than, say, Netscape. Phew, that was long...

  • Hi all,

    There is something I think we in the Linux community aren't being completely honest about. Sure, the number of new Linux software applications in development is astonishing, and yes, these applications are more than a match for the offerings of conventional software houses. But I think we are perhaps in too much of a hurry to show off these new developments, because they are for the most part in development - a great number in very early stages - and people out there in the world, including the media, are mistakenly beginning to think that these applications are what is on offer from the Linux community.

    The result is that tens of thousands of people who are new to the Linux community - including the media - download the latest development releases of things like Gnome, Enlightenment, etc., and then get frustrated with stability issues, poorly implemented features, or just installing the thing!

    Part of the problem is that so much has been happening over the past year that most Linux applications are in development. There are so many unfinished applications on offer that it is difficult for people to even find stable applications that they can use.

    We really need to emphasise to new users and the media that Linux _is_ super stable and a promising alternative to other OS's, but _only_ if really stable, and usually fairly mundane, applications are used.

    People used to the Windows world are accustomed to downloading every update they can find. This is not a safe policy in the Linux world because most of the time these 'updates' are untested.

    We and the media have created a situation where new Linux users are hungry for every new and glossy app we can develop. The hype, in my opinion, is getting a little out of hand, and people are being misled.

    Why don't we encourage RedHat or one of the other Linux distributors to counterbalance their RawHide distribution with a RockSolid distribution? Then we could always point the media, business users, and new Linux users to the RockSolid distribution, and keep them enthused about Linux by keeping them informed of which applications will soon be declared 'rock solid'.
  • I'm not a professional programmer myself (whatever you define that to be), but the tone of your comment suggests a lack of familiarity with perl or emacs.

    Perl is in fact a good scripting language, but the scope of scripting is much larger than you think, which could explain your disdain for its OO features. Scripting languages are often used to write rather large (100,000+ line) applications. OO provides features for complexity management that become very useful when applications reach that size. Also consider that other popular scripting languages (including Python, and Tcl through the [incr Tcl] extension) provide OO structure. Scripting does not just refer to 20-line Bourne shell scripts that pipe a couple of useful binary apps together.
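
    To make the OO point concrete, here's a minimal sketch of the kind of structure OO buys you once a script grows (Python rather than Perl purely for readability; the class and paths are made up for the example):

        # Minimal sketch: once a "script" juggles several related
        # resources, a class keeps each one's state and behavior together
        # instead of threading loose variables between functions.
        class LogWatcher:
            def __init__(self, path):
                self.path = path
                self.errors = 0

            def scan(self):
                # Count lines mentioning ERROR in this watcher's log.
                with open(self.path) as f:
                    for line in f:
                        if "ERROR" in line:
                            self.errors += 1
                return self.errors

        # One object per log; watching a tenth log changes nothing above.
        for w in [LogWatcher("/var/log/messages"), LogWatcher("/var/log/maillog")]:
            print(w.path, w.scan())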

    As for emacs, I'll first note that I am an avid user of emacs and I disagree with your perspective on it. Emacs was not written to be a "user-friendly" text editor. It is more an extremely extensible text-editing environment. Flexibility appears to have been much more of a design criterion than a short learning curve. I agree that emacs has a steep learning curve if you want to comprehend all the features, but the effort spent familiarizing yourself with it reaps many benefits. I found that once I got over the hump of learning a large portion of the commands in emacs, I could work very productively. Human-computer interface experts have shown (sorry, I have no references) that human motor memory works much better with a keyboard than with a mouse -- try hitting a menu button without looking at your screen. Memorization of key commands makes one much more productive because you rely less on screen feedback to execute commands. Perhaps the difference could be compared to that between having to stare at the keyboard and type with one finger, and being a touch typist. Learning how to touch type takes some time, but the benefits are grand.

    As for usability, I'm not sure what version of emacs you have used, but the versions I use (both XEmacs and emacs) have pull-down menus. I've always preferred a program with menus that let me learn to use the application; over time I memorize the key commands and forsake the menus. I would also note that, at least on unix, the emacs motion and editing key commands are rather pervasive; they work in applications such as Netscape, FrameMaker, and any tcl/tk based gui application. I'm not sure you could call them standard, but they can be found in many places.


  • In a quick look, I loved this one the most, even more than the growth-rate assumption above it. Now this is a scientifically sound claim to use as the basis for an explanation of how the OSS movement will "implode".

    Laugh of the week, if only it didn't reach a load of less-critical readers. I hate the power of clueless media.
  • I think that the development of these 'RockSolid' distros is inevitable, and a good thing. It only makes good business sense...

    We'll probably see this when some distro (probably Red Hat) feels they have enough momentum, and a rock solid app base to make a push for the home desktop environment. I also believe that this may happen in the relatively near future (a year or two.)

    Frankly, we're almost there anyway... I'm a computer guru by no one's definition, but I have very few problems with the applications I use crashing (I'm using SuSE 5.3, btw). I'm not saying that I haven't had some difficulties in getting my system up and running, but these have been almost completely hardware-related... I'm well aware that I differ from most people in that I'm willing to try and dig to solve a problem, even if I've no idea what I'm doing. However, I'm reasonably convinced that the amount of digging I've actually had to do is pretty small.

    Two other things I've not seen mentioned yet.
    1) As time goes on, I think that the portability of Linux will be what makes it viable...particularly since _someone_ will be motivated to get it working on the bleeding edge (at least until something better comes along.)

    2) If Linux becomes too bloated, or can't do the job anymore, then something else will come along that will. This isn't a bad thing...it's healthy.

  • Um, you have to hard-code those spaces or they (all but one) get stripped out by the browser -- use the HTML character entity &nbsp; instead.

    Zontar

    (somewhere in tenn.)

  • >Frankly, a great deal of viruses have gotten into
    >closed source program than one can count...
    >usually these programs are benign and called by
    >the name "easter eggs"! Have you ever seen the
    >little doom clone that the Microsoft programmer
    >put into excel?

    "Easter eggs" started to appear because corporations did not want their programmers to attach their names to the software. (The corporations were afraid that head-hunters could then lure the best people from said companies to other jobs.) But since it is a human trait to want to attach one's identity to one's work, these best people figured out how to leave some sign that a real person did have something to do with the software.

    In the Open Source/Free Software model, this need to attach one's identity to something is met by a line or two in the comments:

    # Written by Joe Blow
    # (c) by Joe Blow, under the terms of GPL

    > How hard do you think it would
    >be for one of those programmer to put a few more
    >as a trap door because they are pissed of at the
    >company?

    Disgruntled employees are endemic at every high tech corporation. Treat the employees better, make the code available for peer review, & this problem will go away.

    Geoff
  • He says that Linux has 1.5 million lines whereas Unix has 10 million lines. What a load of garbage! He is comparing kernel to kernel+applications. Solaris is supposed to have 8 million lines of code, and that refers to the kernel plus the standard utilities that come bundled with it. And what is this crap about Linux having fewer defects because it has less code (thereby subtly implying it does less)?
  • Thanks for a beautifully written article.. It's always good to see something like that on Slashdot..
    There are a few things that maybe we'll have to agree to disagree on though... You say that Linux won't make inroads on the market of the 'average home user' (read: gamer/email drone).. I've found quite a few people who fit into this category who've actually installed Linux and been happy with it.. A small learning curve (maybe the same as going from Win 3.x to Win 9.x) and they're there..
    I've also had a _lot_ of Win users wandering past my workstation at work (where I set up a Linux box to handle the department's webserver, fileserver etc.), looking in awe at my basic Windowmaker screen with the clip.
    Everyone wants one, and now, most of the people on the floor who own PCs are running Linux at home, to check out this OS that seems to do so much more than Windows.
    I'd agree with your points, if you stated that there's a long way to go before Linux makes inroads into those markets...
    I believe it will.. Not to the dominance of the market that MS have.. But I believe they'll maintain a reasonable presence.
    Never is a long long time.
    I still have a nice long list of lots of names back in '94-'95 who were telling me that Linux would never make it mainstream, that it'd never be a commercial viability and you'd never have non-guru users.
    Some of them I still phone up to laugh at before I ask them out for a beer.
    They're now the ones asking for my advice on how Linux can be used to increase the reliability of their company's information systems.
    All that said, it's merely my experience of the situation, so what I write is coloured by my own experience and bias.
    It'll still be interesting to see what, despite all the FUD that's spread, happens.
    These are, indeed, interesting times..

    Malk.
  • ... is what people gain by slagging on OSS. Assume, then, that he is not part of an elaborate plot to FUD the OSS world to death (hatched by MS, of course). What does anyone gain by making the world safe for corporate america? Do middle managers sit in their offices thinking, "Damn, I have to adopt a policy that makes more criminal, white-collar, lowlife a$$h*le$ rich; let's get our publication company to slag on that product that isn't technically owned by anyone"? I see why ZD is a FUD machine: no one there is smart enough to type ls and then interpret the results. But the rest of the world? It just confuses me. You're picking on the little guy.
  • "To qualify as a world-class success and not just a fad, each new product or method must pass the acid test of 'crossing the chasm' that separates early adoption from mainstream acceptance. Linux, and open source in general, fails this acid test."

    I strongly disagree with this. The key question here is: what is the mainstream market? It is interesting to note that the OSS that has been around for a while has been very widely adopted among its intended market: software developers. Would anyone say that Emacs, vi, grep, sed, gcc, gzip, tar, to name just a very few, do not have mainstream acceptance amongst developers? And what about the server world: BIND, Sendmail, Apache, not to mention the Berkeley TCP/IP stack?

    Now, it is true that there is not widespread adoption of GNU/Linux on the desktop. No one is arguing that, and that is not the immediate goal. GNU/Linux is currently being (re)positioned as a server OS. Its original 'market' was simply hobbyist OS hackers. A perfect example of its new market is ISPs. What proportion of hobbyist OS hackers and ISPs run GNU/Linux or *BSD systems? More than enough, I am sure, to qualify as 'mainstream acceptance'.

    Pronouncements that OSS is just a 'fad', and that it 'fails this acid test' are certainly premature. IMHO, the evidence points to the exact opposite conclusion. In the markets it was designed for GNU/Linux and OSS already have mainstream acceptance. The question is whether they will be able to gain such acceptance in other markets, such as the workstation and desktop markets. This has yet to be seen, and to state that OSS has failed the acid test is like stating that Microsoft has failed the acid test because the majority of toasters do not run WinCE.

    "The more the open source paradigm succeeds, the more untenable it becomes because maintenance and support grow more complex, and costs increase due to a scarcity of talented programmers. Success leads to features, and feature creep leads to bloated software."

    Here the author seems to get confused between individual OSS projects, and the 'OSS paradigm' itself. Ignoring such distinctions the argument seems to go:
    1) Maintenance and support for OSS grows more complex as that software succeeds.
    2) [Implied] There is a limited pool of talented programmers willing to work on OSS projects.
    Therefore:
    3) Costs increase because this pool is exhausted due to 1.

    Firstly, the statement that success breeds features is not necessarily sound. Success does not have to equate to feature bloat, and it does not have to mean increased complexity. 'ls' is a very successful utility. Is it bloated? I think not. One key design principle of UNIX, and therefore of GNU/Linux, is that of modularity. This directly counters feature creep. For non-utility apps this principle is not as strong, however it is still important. X is a classic example: it works with all the different WMs, which (should) work with all the different X apps. (Please! No GUI flames! Maybe I shouldn't even mention this as an example, oh well...)
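
    As a toy illustration of that modularity principle (my own sketch, not anything from the article): keep each piece single-purpose, and new behavior comes from new compositions, so no individual component ever has to grow.

        # The shell pipeline "grep sh /etc/passwd | wc -l" rebuilt from
        # three tiny pieces, none of which knows about the others -- Python.
        def read_lines(path):
            with open(path) as f:
                yield from f

        def match(lines, needle):
            return (line for line in lines if needle in line)

        def count(items):
            return sum(1 for _ in items)

        # A "feature" is a new composition, not a patch to any one piece:
        print(count(match(read_lines("/etc/passwd"), "sh")))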

    Also, if a project is succeeding that implies that it is becoming more reliable, getting better documentation, and gaining a larger user base. Now, it is true that as the user base grows the testing becomes more exacting, more bugs will be found, and more features/enhancements requested, but:

    One key aspect of OSS, notably espoused by ESR, is that as the user base grows, so does the support base. This counters the third point. As an app becomes more widely used, more people will want to hack at it, fix bugs, write docs, etc. It could be argued that the proportion of developers in the user base is too small to alleviate the problem. This is simplistic, however. Programmers are not the only ones who contribute to a project. Writing documentation, sending in bug reports, sharing ideas, providing feedback: these are all very important contributions that anyone can make.

    Furthermore, it is perfectly possible for companies to pay people to work on OSS. If the cost of paying for the maintenance of existing OSS software is less than the cost of buying or developing and maintaining alternate versions of the software, then the argument is irrelevant. Note that in some cases it is impossible to maintain proprietary software at any price (e.g. when the owner goes out of business).

    If I have the time I might try to address the rest of the points in the article, but for now these will do.
  • I had my classes in Russia where they taught Pascal and C

    FORTRAN and BASIC anyone?

    What's LISP?
  • The article misses one big point (aside from many inaccuracies and invalid comparisons).

    It is possible for opensource software to be supported by commercial entities w/o becoming proprietary.

    e.g. in http://www.opensource.org/for-suits.html [opensource.org] several models are introduced. I think that with IBM, Dell, and Compaq now (or soon) offering hardware with Linux installed, one bears repeating:

    Widget Frosting

    In this model, a hardware company (for which software is a necessary adjunct but strictly a cost rather than profit center) goes open-source in order to get better drivers and interface tools cheaper.
    .....
    The open-source culture's exemplars of commercial success have, so far, been service sellers or loss leaders. Nevertheless, there is good reason to believe that the clearest near-term gains in open-source will be in widget frosting.

    If IBM and Compaq switched just one tenth of their AIX and Digital Unix teams to Linux programming, they could contribute immensely...

  • The slashdot post now links to a password protected article. I assume that the magazine disliked the slashdot effect. Anyone know where to find another copy?
  • The author is correct to a certain extent in believing that the romantic ideal of volunteer development of very large open-source applications will become increasingly strained in the future.

    But the development of Linux is not simply going to collapse. Because it has become crucial to the businesses of so many companies, it is in their interests to spend a lot of money maintaining and improving it. It's simply a matter of re-adjusting the idea of 'an infinite number of hackers' to 'an infinite number of software / hardware companies etc'.
  • As a member of the scientific community, I can appreciate the problems of scale-up. What works on a lab bench doesn't necessarily work on the factory floor. The points brought to light in this article are worthwhile if they inspire the Linux community to work on solutions for scale-up. The community (in order to attain the lion's share of the market) needs to continuously recognize emerging challenges and develop from its chaos a means of meeting the needs of the community, much the same way a laboratory does. The goal of scale-up is to remove the austerity with which a procedure is completed: take something seemingly complex and fine-tune it such that anyone can see the steps involved. This doesn't make the procedure less complex or the one performing it any smarter; it just carefully defines the parameters needed to complete the job. The so-called shortage of talented people is overcome by a surplus of technicians. Before Win98 was released, I watched an infomercial for it. The two Win98 programmers who appeared on the show were not boy geniuses; they barely had a better working knowledge than I do (and that is pretty pathetic). They were technicians. The framework is already in place for hordes of young semi-experienced programmers to help out; check out freshmeat. Centralized control and distribution of patches, fixes, and new ideas are abundant.

    I strongly disagree with the author's claim that programs need to become bloated with features. Watch the evolution of cars. When a new car comes on the market it is usually complete, functional, and full of features and options. As the line of cars ages and new generations are released, they too get heavier and more bloated with 'luxury' items. This continues until the heavy, bloated car becomes a dinosaur and is remade.
  • This guy is a business man; he has no idea what he's talking about. Besides the problems with his logic (which were already noted in some above posts), he seems to be claiming that free software has a very small market share, that it isn't going to grow, and that companies are going to make the decisions about the future of free software. What a joke. Sure, some businesses have found ways to make a bit of money off free software, and some contribute to it, but in no way do they control it. The majority of free software is simply stuff people do in their spare time, and while they would usually LIKE to see it widely used, they aren't going to go through any corporate-hotshot BS to make it widely used. This is definitely the most ignorant free-software FUD I have ever seen. And oh yeah, this IS FUD all right.
  • I prefer Emacs.

    Just because I criticize it doesn't mean that I don't like it. In fact, I would say I am critical because I like it so much.

    Emacs is an excellent editor. It does what I need it to do really well. It's just that all the other stuff it is expected to do, doesn't make sense for an editor to do.

    vi is nice for fast and dirty editing, but not for long editing jobs (mostly because I'm more familiar with Emacs commands, so I'm more productive with it).
  • Ok, I don't get it. He asserts Linux and Open Source (oddly enmeshed and confused in his article) are doomed to marginality because they are not commercial software.

    Ok, let's look at the success of other commercial Unix/Unix utility vendors. Well, guess what: compared to the installed base of Microsoft, they are all marginal.

    It's not the development model, or commercial vs. non-commercial, or the defect rate in Linux utilities vs. HP/UX - it's that every other player has to compete with the 2000 lb gorilla of Microsoft's installed base.

    Beyond that, his article is riddled with factual flaws, bad logic, and a general statistical illiteracy. Linux doesn't support video cards? Hmmm... I must be imagining what I am seeing on my screen right now.

    Linux will probably never rule the desktop - oh well. It will rise to a position of power in the server OS market, a position based on its technical merit and performance - which is on par with or surpasses that of most commercial Unixes.

    Open Source will always be a valid development model, suffering from its inadequacies just as the commercial development model does. Some OSS products will see commercial success; some will only continue to be used by the Linux hobbyist. Who cares? The OSS movement has produced some awesome software products - and I like to use many of them - I could care less what this guy recommends his Fortune 500 droids do.

    These products will ultimately stand or fall on their own merit, not on the politics of their development model.

    -josh
  • Apparently MSNBC was misquoting Linus somewhat last night. (Last I saw on linux-kernel, Linus was going to check over the transcript. I'm assuming he'll be posting to linux-kernel again after he does that.) The Linux kernel has ~1.6 million lines of code. But the article is not comparing 10 million lines of kernel code in a commercial Unix to the Linux kernel code. Rather, it is comparing the kernel+utilities of a commercial Unix to the kernel code of Linux. It is also interesting that Linux is far more cross-platform than any commercial Unix. Unices like HP/UX and Solaris only have to deal with one, or maybe two, very controlled hardware environments. Linux deals with somewhere around 10-15 (guesstimate). Overall, Linux is in a better position by that benchmark.

    Also, Linux is not the only open source project out there. There are still FreeBSD, OpenBSD, and NetBSD as far as other open source unices go.
  • I'm thinking that they had 'maxservers = 120' left from the default config :P
  • Here I am in my trusty lynx, and I'm asked for a password.


    That's not the frustrating part. I take a guess at a l/p combo which works, being an old crypto fan. ;-)


    But then I found that the article can't be read anyway, because it's in some proprietary format. This is ridiculous. Can't Slashdot consider such formats (as well as passwords) when accepting articles for submission?

  • Okay..before anyone flames me..let me clarify-I love linux.

    Now.. I feel that with large pieces of code (especially OSS), when there is no responsibility except a moral one, there will be instances of bugs coming through. Also, people who work for recognition will work on the "fancier" aspects of things, for which they get attention. Who wants to pay attention to the nuts & bolts and the really CORE mechanics, which are dull to say the least? Let me know!
  • Hey.. you admitted it yourself. I admit the low level is there, but most of the people are working on the GUI. At the end of it all, it's what's deep within that matters. It exists now.. BUT who is working on improving it????
  • You're right, but why are there so many desktops (KDE, GNOME, GNUstep, etc.) and so many window managers?

    Having 3 different desktops is as stupid as having so many 'kinda BSD' OSs

  • Funny, I find XEmacs and an Eterm (or five, sometimes) to be the most powerful development environment. Period. Has it occurred to you that Emacs (and XEmacs) is meant to be a power tool, not a quick editor?
    Try flying an F-14 or a MiG. I bet you that they're a hell of a lot harder to fly than a single-prop airplane. Guess what: they're also a hell of a lot more powerful. Emacs exists for when what you're doing isn't quick and dirty; it's for when what you are doing is big and complex.
    Think of it this way: emacs isn't meant to be intuitive to those who haven't used it, it's meant to give maximum power to those who master it. That's a good thing. When you don't want to master it, you don't use it. If you want to master it, you find that you have an extremely powerful tool at your fingertips, even if you may require all ten of them.
    Think of it like Kung Fu. It's awfully complex and difficult. (Much more so than emacs, Kung Fu takes decades to master.) The result is awfully powerful and effective.
    As for the rest, if you can't learn a language in a week, you're probably not the sort of person (at least yet) who would benefit from emacs anyhow.
    Anyhow, nothing aside from food, shelter, water, and air is good for everyone. Different tools for different jobs, to each his own, etc. To quote some really good people during the interface wars (my term), "The only intuitive interface is the nipple. After that it's all learned."
  • Could someone put the PDF file up on their web page for me, so I could have a look at it? It sounds really interesting, because I thought right from the start that the open source model had some problems with it.

    The only way I figured it could work is if computer communism was established, where tax dollars went to the government to pay the software developers, and everyone got free software.

    Of course, that wouldn't work. :)
  • There are TWO critical issues for good software - programmer quality and motivation.

    It's amazing the good work you can get from fairly average people who are highly motivated. On the other hand, I'm sometimes amazed at the crap I have produced when I couldn't give a damn about the outcome.
