Open Code Has Fewer Bugs 337

ganns.com writes "Reasoning, which sells automated software inspection services, scrutinized part of the code of Linux and five other operating systems, comparing the number and rate of programming defects. Specifically, Reasoning examined the TCP/IP stack and found fewer errors in Linux. 'The open-source implementation of TCP/IP in the Linux kernel clearly exhibits a higher code quality than commercial implementations in general-purpose operating systems,' the company said in a report released last week. Reasoning also compared the code with that used in two special-purpose networking products and found it superior to one of them."
  • Ooh baby (Score:4, Funny)

    by Anonymous Coward on Thursday February 20, 2003 @09:17AM (#5343146)
    But bugs are cool... does that make me a geek for using Redhat?
  • Hmmm... (Score:3, Funny)

    by Craig Maloney ( 1104 ) on Thursday February 20, 2003 @09:18AM (#5343156) Homepage
    I guess it stands to Reasoning that more developers hammering on code leads to fewer bugs. :)
    • Re:Hmmm... (Score:2, Insightful)

      by Anonymous Coward
      I think the enjoyment is also important.

      Take two people with the same skills, where one enjoys a complicated task and the other does not. More than likely the result from the person who enjoys it will be better, because they will show more care.
    • Re:Hmmm... (Score:3, Insightful)

      by cyb97 ( 520582 )
      Depends on how the source-tree is managed...
      Too many cooks spoil the broth!
    • Re:Hmmm... (Score:3, Interesting)

      by Ooblek ( 544753 )
      I suppose you don't find it odd that a consulting company, which has some of the biggest names in commercial software development as clients, finds that the very thing threatening the people who pay them is of higher quality? While I don't doubt there are cases where open source software is of better quality, I also believe the converse: there are cases where commercial software is of higher quality. So one little scan of one subsystem of Linux vs. "commercial software" means that open source is the best, hands down? I doubt it.

      Also, why won't they name the commercial software they scanned on their home page? Why do I have to provide contact information to view their report? Since everyone here is so critical of the BS moves MS makes, why are they not asking the same questions of this for-profit entity?

    • Or... (Score:5, Interesting)

      by intermodal ( 534361 ) on Thursday February 20, 2003 @11:49AM (#5344501) Homepage Journal
      People coding something because they want to (and because they need it for something themselves) produce better code. I know that when I do something for myself, I don't half-ass it.

      Coding for the end result = quality

      Coding for a living = paycheck

      Any questions?
      • Re:Or... (Score:3, Insightful)

        "Coding for the end result = quality"

        Too bad that quality doesn't always bubble up to the UI. That's probably my biggest complaint about open source software: few people actually put serious thought into the UI design. It starts off as a utility written to solve the author's own problem, and eventually it becomes useful enough to share with people. VirtualDub comes to mind. Kick-ass program, hardly intuitive in terms of UI.
        • Re:Or... (Score:3, Interesting)

          by intermodal ( 534361 )
          This is true, though I am sure it suits the guy who made it just fine... that's one of the great things about having the source: you can do it yourself if you are so inclined. While not everyone has to pay to use it, somebody has to donate their time to create it, usually to their own ends. Then they choose to share that improvement or creation with others. Not to lean on something already over-noted here on /., but that's what Linus did with Linux...
  • by Raul654 ( 453029 ) on Thursday February 20, 2003 @09:18AM (#5343159) Homepage
    Companies such as Oracle and Microsoft typically sell binaries incomprehensible to humans rather than the comparatively understandable source code.

    After seeing this [ioccc.org], I think that statement is being a bit generous
    • by Anonymous Coward
      ... Also, opening those binaries in Notepad resulted in a message, "file too large, please use WordPad," which resulted in a message, "file too large, please use Office XP," which resulted in a message, "file unreadable, please open in .NET."

      On the other hand, opening an open source file resulted in a quickly readable file in simple, comprehensible English. The first word was some "bin" or "bash", a proper dictionary word.

      AK
  • Statistics (Score:5, Insightful)

    by Caractacus Potts ( 74726 ) on Thursday February 20, 2003 @09:19AM (#5343168)
    How about using a larger sample of code before making such bold statements? It's probably true that the code has fewer bugs, but when you abuse statistics it just makes things look dishonest.
    • Re:Statistics (Score:5, Interesting)

      by Xtifr ( 1323 ) on Thursday February 20, 2003 @09:42AM (#5343367) Homepage
      This is not the first such study; there was a paper published in the early nineties which tested various standard Unix command-line tools from a variety of vendors. They subjected the tools to horrendous stress and abuse, and found (to their surprise) that the GNU tools were the most reliable, with approximately a 1% failure rate in their bank of tests. The second best was HP, with about an 8% failure rate, and everyone else was between 12% and 20%.

      I don't have a link, but the paper was pretty widely publicised at the time and should be fairly easy to track down. It was the first major study to show an empirical link between openness and reliability, but it was far from the last. This latest one is merely one more in a long list.
      • What would be interesting to find out is whether some enterprising young lad who read this paper decided to go ahead and fix the bugs that caused the 1% failure rate in the GNU tools, just because it was free software and he could. I would like to see the results of the same tests run again today :)
      • Re:Statistics (Score:3, Informative)

        "there was a paper published in the early nineties which tested various standard unix command-line tools from a variety of vendors."

        I believe you're referring to the fuzz papers [wisc.edu]. They basically threw a bunch of random garbage at different commands and then watched for core dumps.
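        For the curious, the core idea of those papers is simple enough to sketch in C. The following hypothetical harness (the name ./tool-under-test is a placeholder, and none of this is the original fuzz tool) pipes buffers of random bytes into a program's stdin and counts how often the program dies on a signal:

            /* Hypothetical fuzz harness in the spirit of the papers cited
             * above: feed random garbage to a program's stdin and count
             * signal deaths (SIGSEGV, SIGBUS, ...). Target is a placeholder. */
            #include <signal.h>
            #include <stdio.h>
            #include <stdlib.h>
            #include <sys/wait.h>

            int main(void) {
                signal(SIGPIPE, SIG_IGN); /* survive the target closing its stdin */
                srand(12345);             /* fixed seed so runs are repeatable    */
                int crashes = 0;
                for (int run = 0; run < 100; run++) {
                    FILE *p = popen("./tool-under-test >/dev/null 2>&1", "w");
                    if (!p)
                        return 1;
                    for (int i = 0; i < 4096; i++)
                        fputc(rand() & 0xff, p); /* one buffer of random bytes */
                    int status = pclose(p);
                    if (status != -1 && WIFSIGNALED(status))
                        crashes++;       /* target died on a signal: a "hit" */
                }
                printf("crashes: %d/100 runs\n", crashes);
                return 0;
            }

        The failure rates quoted above came from batteries of roughly this sort of test.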

      • Comment removed (Score:5, Informative)

        by account_deleted ( 4530225 ) on Thursday February 20, 2003 @10:23AM (#5343669)
        Comment removed based on user account deletion
  • by asmithmd1 ( 239950 ) on Thursday February 20, 2003 @09:20AM (#5343176) Homepage Journal
    Pope is catholic
    Bears are found to sh*t in woods
  • by jfrumkin ( 97854 ) on Thursday February 20, 2003 @09:21AM (#5343184) Homepage
    Over time, successful open source projects that address a particular issue will most likely have fewer bugs; just being open source doesn't mean fewer bugs (or better software). It just means the project has a better chance, if it survives, of being better software.
  • I get it (Score:3, Funny)

    by undertoad ( 104182 ) on Thursday February 20, 2003 @09:21AM (#5343187) Homepage
    Open source has fewer bugs.
    Bill Gates: bugs are cool.
    Ergo, open source is not cool!
  • Let the Linux zealot vs. M$ zealot postings commence!

    and to them I say...

    Someone's boxen are only as secure as their updates. Not all M$ boxes are as secure as Linux boxes, and vice versa. End it there.

    Oh... read the article too... it's not even about M$.
    • The funny part is that people always say the TCP/IP stack in Windows is ripped off from FreeBSD.
      So the conclusion is not open source vs. closed source;
      it's that the Linux TCP/IP stack contains fewer bugs than the FreeBSD one.
  • Is it me, or does the article start out talking about a comparison of code quality and then slide over into "Microsoft bad, Linux/OSS good"?

    It's also kind of strange that they don't even disclose what they compared Linux (kernel 2.4.19) to. Not really a big selling point for Linux. Oh wait, it's free ;)

    On a side note, I would like to see the 0.1 errors per 1000 lines of code. Let people know where the problems are so there can be fewer.

  • by mccalli ( 323026 ) on Thursday February 20, 2003 @09:22AM (#5343199) Homepage
    'The open-source implementation of TCP/IP in the Linux kernel clearly exhibits a higher code quality than commercial implementations in general-purpose operating systems,'

    Really? But I thought most commercial OSes derived their TCP/IP stacks from BSD code in the first place. And since BSD is open-source, shouldn't these commercial OSes show roughly the same level of quality then? Or are they arguing that the Linux TCP/IP stack is superior to the BSD one?

    Cheers,
    Ian

    • by Jimithing DMB ( 29796 ) <.gro.dbwgt. .ta. .efd.> on Thursday February 20, 2003 @09:54AM (#5343454) Homepage

      Actually, you've inadvertently stumbled upon an excellent point.

      No code is perfect to begin with. The BSD stack is still improved from time to time. The BSD stack that companies folded into their code years ago has since had some major changes, and the companies haven't bothered to take many of those changes into account.

      Had they been required by license (GPL) to keep the code open, it could have been fixed by other people. Instead, the implementations have languished. This is in fact one of Stallman's great reasons for keeping all code free.

      However, the reality is that our current environment still favors closed source software. With any luck, people will slowly start to wake up and realize that source code needs to be open for all software projects. Think about it: if it were normal to receive source with binaries, nobody would think twice about it. It's only seen as a bad thing because it's not what Microsoft does. But the reality is that Microsoft has a business model that works well for them: a giant monopoly. The reason their competitors fall on their asses is that they try to play as if they were MS, which they are not. It's not impossible to compete with Microsoft, it's just impossible to compete head-on.

      • by Eccles ( 932 ) on Thursday February 20, 2003 @11:23AM (#5344234) Journal
        However, the reality of it is that our current environment still favors closed source software.

        I'd say it's not environment, it's economics. Apache has flourished because the people who develop it are also people who use it. But what percent of graphic designers are really using the Gimp vs. Photoshop? Maybe Photoshop has more bugs, but it has more usable features (performance also?), and that's what its users want. Unless you can come up with a scheme to fund development of open source in the same way that software purchases fund closed source, closed source is going to be the only way to develop software where the users generally aren't also the developers.

        I develop commercial closed-source software. I'd absolutely love it if some sugar daddy came up to me and said, keep doing what you're doing and I'll keep paying you what you're getting paid, except we're making the code open source. But it isn't going to happen.
        • by Bodrius ( 191265 ) on Thursday February 20, 2003 @12:04PM (#5344658) Homepage
          Aye. It could be that the TCP/IP stack the article mentioned has "flourished" (become better software) because the people who develop it are VERY MUCH using it.

          Linux geeks grok TCP/IP networking, and Linux users DEPEND on TCP/IP (not "it would be nice to have web access and surf porn while I type this memo") across practically all of Linux's market share. Like gcc, TCP/IP is part of the Linux deal.

          It would be biased to regard this as conclusive evidence of the superiority of open-source unless other, less sexy areas of Linux development are compared to their commercial counterparts in the same way.

          As evidence that certain commercial companies have not put priority on the TCP/IP stack of their OS, this could very well be good evidence.

          But this doesn't necessarily mean the commercial companies are inferior; they may very well be right in having different priorities.

          For example, for a Windows user it's more important that the Media Player works perfectly than having an efficient TCP/IP stack. Even on the server side it's not a big issue on their market. It's under so many layers of software, appearances and priorities that their clients would never notice if they made it better anyway.

      • by maynard ( 3337 ) on Thursday February 20, 2003 @12:16PM (#5344789) Journal
        The Linux IP stack is a complete rewrite and doesn't derive from the traditional BSD sockets code at all. In particular, IP packet formation in Linux and BSD is completely different. In Linux, the header and tail portions of a packet are handled in a single pass through a structure called an "sk_buff". In BSD, header and tail formation is handled in two passes, one for the header and the next for the tail, using an "mbuf". The BSD protocol implementation is the traditional one described in TCP/IP Illustrated, while the Linux implementation is completely new. I believe one positive feature of the Linux implementation is that it has allowed for zero-copy networking, though that's a limited benefit, only of use to a very small subset of servers connected to very fast network links. A big positive of the BSD stack is that it's old, rigorously tested, and very well documented. Note that the System V STREAMS implementation is completely different as well, so Solaris and other SysV-derived kernels follow their own method for packet formation. I make no claims that any of these protocol implementations is better than the others, only that the code bases and histories are completely different.

        I've attended a few USENIX kernel internals courses but that's the extent of my competence (have poked through the source out of curiosity though). Please feel free to post additional information or correct any mistakes I may have made.

        Cheers,
        --Maynard
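        To make the single-pass idea concrete, here is a toy C sketch of an sk_buff-style buffer: headroom is reserved up front, the payload goes in once, and each layer then prepends its header in place, so no copying or second pass is needed. All the names here (toy_skb, toy_push, etc.) are invented for illustration; this is not the kernel API, though the kernel's real skb_put/skb_push play analogous roles.

            /* Toy illustration of sk_buff-style packet assembly. */
            #include <stdio.h>
            #include <stdlib.h>
            #include <string.h>

            struct toy_skb {
                unsigned char *head; /* start of the allocated buffer */
                unsigned char *data; /* current start of packet data  */
                unsigned char *tail; /* current end of packet data    */
                unsigned char *end;  /* end of the allocated buffer   */
            };

            static struct toy_skb *toy_alloc(size_t size, size_t headroom) {
                struct toy_skb *skb = malloc(sizeof *skb);
                if (!skb) exit(1);
                skb->head = malloc(size);
                if (!skb->head) exit(1);
                skb->data = skb->tail = skb->head + headroom; /* reserve headroom */
                skb->end = skb->head + size;
                return skb;
            }

            /* Append payload at the tail (compare the kernel's skb_put). */
            static void toy_put(struct toy_skb *skb, const void *p, size_t len) {
                memcpy(skb->tail, p, len);
                skb->tail += len;
            }

            /* Prepend a header into the reserved headroom (compare skb_push). */
            static void toy_push(struct toy_skb *skb, const void *hdr, size_t len) {
                skb->data -= len;
                memcpy(skb->data, hdr, len);
            }

            int main(void) {
                struct toy_skb *skb = toy_alloc(256, 64);
                toy_put(skb, "payload", 7);  /* application data */
                toy_push(skb, "TCP|", 4);    /* transport header */
                toy_push(skb, "IP|", 3);     /* network header   */
                printf("wire bytes: %.*s\n",
                       (int)(skb->tail - skb->data), skb->data);
                free(skb->head);
                free(skb);
                return 0;
            }

        An mbuf-based stack, by contrast, chains fixed-size buffers and may walk the chain more than once while forming the packet.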
      • by Abcd1234 ( 188840 ) on Thursday February 20, 2003 @12:16PM (#5344791) Homepage
        However, the reality of it is that our current environment still favors closed source software. With any luck, people will slowly start to wake up and realize that source code needs to be open for all software projects. Think about it. If it was normal to receive source with binaries, nobody would really think twice about it. It's only seen as a bad thing because it's not what Microsoft does.

        Please! I'm no MS apologist, but this is getting plain stupid. This isn't just about MS, believe it or not. The fact is, open source as a business model is seen as a bad thing because it's not what a huge number of companies making billions of dollars a year do. Have you heard of Oracle? IBM? Sun? Apple (our latest hero)? I could go on... the fact is, there are a TON of companies out there making big bucks selling closed source software. And more power to them!

        In the real world, closed source is, apparently, a viable business model. And thus far, open source isn't. Honestly, how many companies are actually making some real money making products which they also release the source to? Until this starts happening, closed source is going to be predominant... and there's nothing wrong with that!

        Personally, yes, I agree that open source is a good thing. But assuming that all software should be open based purely on some moralistic view is ridiculous. The world is far more complicated than that. A statement like "source code needs to be open for all software projects" is just plain naive, IMHO.
        • The thing is that open source products aren't necessarily made by a company whose primary purpose is selling software. A lot of open source is worked on by people who make companies work. E.g., Company X makes widgets and needs widget inventory control. Company Y makes car parts and has written an inventory control program. Company Y releases it as open source, Company X uses it, and their internal guys find and fix a bunch of bugs. Because it is open source, both companies gain the added benefit. I think you will find that for most open source projects (especially those that are not high profile) this is how they are being financed. Remember, 95% of code written these days is for internal systems that are never released onto the market.
    • by gurps_npc ( 621217 ) on Thursday February 20, 2003 @10:18AM (#5343627) Homepage
      While the commercial OSes derive from BSD code, it is not the same thing. Related to that, there are three sources of bugs that closed OSes will have but open OSes will not: 1) errors in the derivation of the BSD code, since they generally have to make minor changes to the BSD code to get it to work with their product; 2) bugs in the non-BSD code that is wrapped around the BSD code; and 3) errors found in the BSD code after the closed code was written. Usually the closed OS will NOT upgrade the BSD code for a bug found in it, because either 1) they are lazy, 2) they are ignorant of the bug, or 3) doing so would require a rewrite of the non-BSD code.
  • "The Linux" (Score:5, Funny)

    by sczimme ( 603413 ) on Thursday February 20, 2003 @09:23AM (#5343205)

    Reasoning, which sells automated software inspection services, scrutinized part of the code of the Linux and five operating systems,

    Including the Solaris, the Windows, the AIX, and the HP/UX.
  • Bah. (Score:5, Interesting)

    by KefkaFloyd ( 628386 ) on Thursday February 20, 2003 @09:25AM (#5343223) Homepage
    I find the fact that they did not say which OSes they compared very... suspect. What about Mac OS X, FreeBSD, and other open source OSes that have open source TCP/IP implementations in their kernels? Since they did not say which OSes were being used...

    "Reasoning declined to disclose which operating systems it compared with Linux, but said two of the three general-purpose operating systems were versions of Unix."

    How lame. For all we know, they could have tested the Amiga OS, Mac OS 9, Windows 3.1, A/UX, and NeXTStep! Other than this, the article is pretty vague and does not seem to give me much meat on the subject, nor a link to the study (you have to go through some forms and give up personal info to get it at www.reasoning.com).
  • Most cryptographic algorithms do not gain acceptance without being open to peer review to spot flaws and potential weaknesses...

    So why should any of this article be a surprise, or even particularly noteworthy?
    • So why should any of this article be a surprise, or even particularly noteworthy?

      Perhaps because when large numbers of people are uneducated about something they use and make daily decisions about, it is shocking for them to learn that their assumptions (probably brought about by marketing) are erroneous.

      Other notable, and obvious "surprises" in research:

      -Two parents are better than one.
      -More concealed carry = less violent crime.
      -You are more likely to get sick at a hospital than at home.
      -Breast milk is better for babies than formula.

      A lot of money and time have been spent researching these topics, only to find what many of us already knew to be true and obvious.

      Not everyone is educated and experienced in everything, and it can be painfully difficult to disabuse people of their delusions. Especially when those delusions have been formed out of ignorance.
  • No Suprise There (Score:5, Insightful)

    by Greyfox ( 87712 ) on Thursday February 20, 2003 @09:26AM (#5343240) Homepage Journal
    The attitude I've seen in the corporate world is that open source products are made by amateurs and are therefore somehow not blessed by the magical corporate coding fairy that makes all the shit churned out by corporate code shops stink less. This attitude is arrogant and does not take into account the simple fact that all those people who got into programming just for the money tend not to work on open source products. When you've got code that is both written and reviewed by legions of people who love to code and who find good computer programs beautiful, you're going to get better code.
    • by syle ( 638903 ) <syle&waygate,org> on Thursday February 20, 2003 @09:39AM (#5343338) Homepage
      This attitude is arrogant and does not take into account the simple fact that all those people who got into programming just for the money tend not to work on open source products.
      It also doesn't take into account that many people working on open source ARE professional programmers during the day.
    • by sir_cello ( 634395 )

      "get better code"

      Better code is not the only thing in the world. What about better design, better architecture, better dedicated talent, better testing resources, better hardware and tools support, etc.? It's hard to take something about code defect ratios and turn it into a sweeping statement. I can show you plenty of low-defect code that is part of a bad design.
    • by mrpuffypants ( 444598 ) <mrpuffypants&gmail,com> on Thursday February 20, 2003 @09:42AM (#5343366)
      I'm encountering this at the new job I recently took. I walked into a company that was using an antiquated MS Exchange system for most of its communication, old networking hardware (which is another issue entirely), and software packages that hadn't been updated in about five years because the company that originally wrote them went under in recent years (.bomb).

      After looking at everything, I suggested a lot of open-source alternatives to the current software. The price to buy it all was zilch, and upgrading all the hardware could be done in-house, without the help of "contractors" who charge out the ass just to support their own software. The system would work great, a lot better than the antiquated crap we are currently using.

      After I presented my ideas to management, they shot them down totally. They, with their mind for the bottom line, couldn't understand why people would release software totally for free. They kept asking me when the bait and switch would be pulled on us. It's two whole different schools of thought, and the only way that I can implement it now is to do it slowly behind their backs until they don't even know what hit them when they don't have to reboot the server daily anymore =]
      • by mccalli ( 323026 ) on Thursday February 20, 2003 @10:13AM (#5343575) Homepage
        After looking at everything I suggested a lot of open-source alternatives to all the current software....After presenting my ideas to management they shot it down totally.

        What would be their motivation to replace the software? Does the current set-up work? Is there a burning need to replace?

        Often "it would be a better system" isn't enough. If the old system works well enough and takes few resources, then it's doing its job fine and doesn't need a potentially risky replacement. And it sounds like what you proposed was a large change.

        the only way that I can implement it now is to do it slowly behind their backs

        Careful, young grasshopper. These aren't your private machines. If you've presented your ideas and they've been rejected, then do not sneak in those changes anyway. To do so could have serious ramifications for your job. Stick by what you've been told, and do things openly.

        Cheers,
        Ian

        • Well, it wasn't a huge change at all, just a few mail systems and other stuff that could be migrated quickly and easily. They recognize that their systems are ailing, but when I came in they were looking at dropping a huge chunk of change on a Novell system, which I immediately said no to myself. Since they are evaluating their options, I suggested open source software. They rejected it because they didn't understand it and its "ideals".

          Careful, young grasshopper. These aren't your private machines. If you've presented your ideas and they've been rejected, then do not sneak in those changes anyway. To do so could have serious ramifications for your job. Stick by what you've been told, and do things openly.

          As for that, doh, I was just shooting off my mouth... it's too early in the day and I hate using this crap system. =]
    • Re:No Suprise There (Score:5, Interesting)

      by Rary ( 566291 ) on Thursday February 20, 2003 @09:58AM (#5343480)
      I'm quite happy to report that this is not entirely the case everywhere in the industry. I happen to work for a consulting company that has become quite fascinated in recent times with the magic that is open source. And we love selling open source-based solutions to our customers, who, in turn, love buying them.

      Basically, the business logic goes something like this:

      We can build your application in one of two ways.

      1. $5000 for proprietary products (app servers, IDEs, etc.), and $5000 for our time and effort (total = $10000), or...
      2. $1000 for proprietary products (the rest are all open source), and $7000 for our time and effort (total = $8000)
      Needless to say, this goes over well with the client (an $8000 expense is better than a $10000 expense), and also with us ($7000 revenue is better than $5000 revenue).

      Obviously, I'm just picking numbers at random, but I think you get my point.

      Not every client is eager to jump on open source tools, but more and more they're finding that it's a really good idea. Especially when a major consulting company with an excellent reputation (i.e., us) comes along and tells them so. People tend to listen to us, because we tend (historically speaking) to be right a lot of the time.

      PHBs might tend to be stuck in the mindset that "if it's free, it must suck; if it's expensive, it must be worth it." But when they pay a high-priced consultant to come in and give them advice, and that consultant says, "You know, you can buy IBM's WebSphere Portal Server for $140,000 per CPU, or you can use the open source Jetspeed, which is practically the same thing; in fact, WebSphere Portal is basically just Jetspeed repackaged with some extra tools that you probably don't even need," even PHBs can understand that kind of logic.

    • While I would tend to agree that, all things being equal, having millions of people reviewing the same lines of code and a large number actively partaking in the authorship will contribute to quality, it is disingenuous to assert that participation (and hence quality) naturally and necessarily flows out of open source code. Just because code participation can happen does not mean that it does. Linux is exceptional. It is #1 in a very small group of open source code bases (e.g., Apache, Bind... what else?) that really enjoy substantial levels of participation. Most open source code is not widely participated in, even essential and important packages (e.g., OpenOffice), and I do not believe this kind of popular attention can scale in the future to support other kinds of code or a much larger quantity of code. Linux enjoys being the most prominent open source package, and it enjoys a relatively narrow scope, i.e., it's just a kernel. [The open source community loves to use the varying definitions of Linux interchangeably: they talk about Linux as if it were Windows, i.e., a complete OS, but then when, say, security bugs come out for one of the numerous utilities, they assert that Linux is just a kernel (the correct definition).] Under the smaller kernel definition, Linux enjoys a couple of key advantages over most of the areas that the open source community presumes to conquer:

      A) Linux is perceived as a task worthy of a "hacker" (e.g., elite, low level, etc.), as opposed to, say, a word processing suite or one of the many mundane but important features that might save users millions of hours.

      B) It is so popular with users, and such an exclusive focus, that you can be sure a significant contribution will be seen by many geeks; again, unlike a word processing suite.

      C) It is relatively small, especially if you throw out all the drivers and the experimental stuff and keep things like the very popular TCP/IP stack (which is what was reviewed in this article).

      Linux and some of its associated code are very good in some respects as a result of their incremental improvements and bug fixes. The trouble is that when you significantly expand the scope of open source efforts, things start to fall apart. In as relatively unstructured an environment as the popular open source method (little to no centralized development, testing, etc.), there is every reason to believe that overlap is key. In other words, since there is no official QA and no group of individuals who can be told, or expected, to methodically test, evaluate, or fix areas, the open source community essentially depends on random overlap. When you have a sparse group of competent developers or testers, you will run into trouble, as in the case of word processors. Likewise, when you depart from the relatively well established world of the kernel and start having to develop everything from scratch (not just the code, but the framework, the UI, the API, etc.), the dependence on overlap becomes more and more of an issue. It's one thing to accept a small patch for a well established TCP/IP stack; it's another thing entirely to coordinate the changing of a whole API, to make multiple pieces fit together seamlessly, without a more concrete organization (which can just barely happen within the popular open source development method).

      Code review is a good thing in and of itself, if nothing else for its ability to produce those many incremental enhancements and bug fixes. From a strictly technical perspective, "open source" can work as an additive: you develop, say, your word processor with a more traditional software group, and then you allow the public to contribute. The trouble is that for most code there simply isn't a viable business model that can support that sort of development effort, or at least I don't see one and none of the current methods really compute, and as a result open source fails to deliver. I think there are areas where open source code can thrive. For instance, I'd love to see IBM and a coalition of other software and hardware companies band together to make Linux (or some other kernel) into a complete OS that is every bit as easy to run as Windows and more stable, flexible, etc. It'd be good for everyone involved (except MS, of course), and it's quite doable. However, except for cases like that, where you have a very definite common good, i.e., a reasonably priced OS/API that allows strong and equal footing for third-party developers and manufacturers, there simply isn't a formula to actually pay for development. Consequently, open source will not produce better code by and large.
  • by phuturephunk ( 617641 ) on Thursday February 20, 2003 @09:27AM (#5343241)
    The more points of view you apply to solving a problem, the quicker and better you'll solve it. The beauty of human reasoning is that no two people view the world in *exactly* the same way, so each of their respective paths to the solution will be different... and travelling the path to one solution can, as we know, lead to other SOLUTIONS to other PROBLEMS. The more heads that work, the more solutions discovered... and so on.

    • The more points of view you apply to solving a problem, the quicker, and better you'll solve it.

      "Better?" Maybe. "Quicker?" Definitely not. If you've even been to a single high level design review, you'd know that not only does everyone have their own opinion, but they all adamently believe theirs is the only right way to do it. And they'll fight with you and argue for hours trying to convince you that their way is better than yours, John's, and Ted's. Meanwhile, Ted can't believe that John would propose something so stupid, and John thinks your idea will be a memory hog.

      So where does the "quicker" part come in?

      Also, I would like to refute the idea that open source projects have all these eyes scouring them. There are a helluvalotta mothballed projects on SourceForge that looked pretty cool, but there's no interest in them. Sure, there are quite a few people actively working on the latest-and-greatest, bleeding-edge Apache mods and kernel patches, but does anyone care about the Widget Formatter that *my* company needs? No. But throw a little money behind it, and you can have two or three developers working on it, full time, who will produce software *exactly* to your specification, not just how some programmer in New Zealand thinks it should work, in his opinion.

  • by Peter_Pork ( 627313 ) on Thursday February 20, 2003 @09:27AM (#5343243)

    Open Code Has Fewer Bugs

    The study looked at a single part of an operating system (the TCP/IP stack), and then the posting made a very general claim about open source software. This is cheap engineering (a.k.a. bad science). Period. You need a much larger sample to make such a claim; a single data point is meaningless. In fact, I believe that code bugs are much more a function of programmer performance and code complexity than of an open vs. closed source development model. Opening the code may have a positive impact, but it is not the major factor to consider. The last thing open source needs is this kind of marketing strategy...
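    To see why a single scan carries so little statistical weight, here is a back-of-the-envelope sketch in C. All the inputs are hypothetical (an assumed 80 KLOC stack and the reported 0.1 defects per KLOC); treating the defect count as Poisson shows how wide the error bars on one measurement are:

        #include <math.h>
        #include <stdio.h>

        int main(void) {
            double kloc  = 80.0;         /* assumed size of the scanned code */
            double rate  = 0.1;          /* reported defects per KLOC        */
            double count = rate * kloc;  /* roughly 8 observed defects       */

            /* Normal approximation to a 95% Poisson interval on the count. */
            double lo = count - 1.96 * sqrt(count);
            double hi = count + 1.96 * sqrt(count);

            printf("defects: %.0f, 95%% interval: %.1f to %.1f\n", count, lo, hi);
            printf("rate: %.3f to %.3f defects/KLOC\n", lo / kloc, hi / kloc);
            return 0;
        }

    With only about eight observed defects, the plausible rate spans roughly a factor of five, which is the statistical sense in which one data point proves little.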
    • Indeed, code bugs may be a function of programmer performance and code complexity rather than of open vs. closed source. However, how (and how quickly) those bugs are discovered and/or fixed IS a function of open vs. closed source.

      Personally, I use mostly Microsoft software and have experienced many bugs in it, but I have yet to contribute a single patch or even file a bug report. Who dares, when as soon as you pick up the phone they start talking about how much you want to spend today!?

      As for open source I have contributed development and/or patches to

      JHotDraw
      PMD
      Netbeans
      Java

      And I have made specific bug reports on many more open source projects like

      Linux sound drivers
      XDoclet

      I tried unsuccessfully for about two years to get a simple but tremendously annoying bug fixed that really affected the usability of VisualCafe. (I switched to JBuilder, then NetBeans.)

      I have done these things only for my own personal benefit in general. Yet the open source products have benefited, while the closed source ones have not.
    • You're right: this study alone does not prove anything. However, trying to draw out trends that could then be verified through further study is valuable. Taking your argument to an extreme, no one would ever study anything, because a single data point on anything would be useless.

      Perhaps it would be better to say that there is preliminary evidence that seems to show that open code has fewer bugs.

      I believe that code bugs are much more a function of programmer performance and code complexity than open vs. close source development model

      Open source projects have access to many more developers, which means a much larger body of knowledge and skill can be brought to bear on a project. The more eyes that look at the code, the better the code will become.
  • by Apreche ( 239272 ) on Thursday February 20, 2003 @09:29AM (#5343258) Homepage Journal
    Severity of bugs is more important than number of bugs. I could have a program with one bug that stops it working at all, and you could have one with two bugs, a broken feature and a memory leak, that still works. Which is better?

    I also like the assumption in the title that because Linux was found to have fewer bugs than some other OSes, open software in general has fewer bugs. Take a look at the bug lists on some SourceForge projects and tell me that again. Number of bugs varies by project, not by openness.
  • Why did they test only one free software kernel while testing four proprietary ones? I'm not saying that if, say, a *BSD kernel had been used, the results would necessarily have been different, but making general statements about open code by examining only one open project is certainly not very accurate. Although I suspect that these inaccurate conclusions are more on the Slashdot side than in the study.

  • Stating the obvious (Score:5, Informative)

    by seanadams.com ( 463190 ) on Thursday February 20, 2003 @09:30AM (#5343271) Homepage
    'The open-source implementation of TCP/IP in the Linux kernel clearly exhibits a higher code quality than commercial implementations in general-purpose operating systems.'

    Well, of course it does! The Linux and BSD IP stacks are benchmarks. This is where practically all protocol research happens; how would anyone be able to verify your results otherwise? Furthermore, only the free stacks are useful for compatibility testing, because they are so configurable.

    So obviously it stands to reason that this code is much more complete and bug-free than any commercial implementation. THOUSANDS of people are studying every single line of this code on an ongoing basis.

    I've worked on a number of commercial IP stacks, some from scratch and some based on Linux. Any IP stack written from scratch is understandably simpler, but it's not that hard to implement the essential RFC requirements (i.e. the "MUST"s) and make it stable. Now, making it FAST and making it use all of the bleeding-edge TCP stuff... that's another story. Only Linux/BSD are there (and of course any other OSes which use their stacks).
  • They decline to name the commercial OSes. In a world where DBMS makers often refuse to let reviewers disclose performance benchmarks, could this be because no company like Sun is going to disclose the source code for its OS so that some other company can run a comparison of its code quality against its competitors'?

    As for Linux beating one of two "special purpose" networking products, the flip side of beating one of the two is that the other was superior to Linux.
  • Both to the Linux dev team and to the commercial vendors. Pointing out an aggregate number of defects is pretty stupid without telling the developers what they are.

    What I'd love to see is this: if all of the significant defects found were reported, how many *still* exist one month later?
  • I am curious why they chose the TCP/IP stack. My guess would be that TCP/IP is where Linux and the BSDs would be strongest. I am also a little confused, because I thought Win2000 used BSD's TCP/IP stack.
    I am sure those guys are a lot smarter than I am, but I am a little confused about the testing.

    Anyone?

  • by Jeppe Salvesen ( 101622 ) on Thursday February 20, 2003 @09:34AM (#5343296)
    They chose the TCP/IP stack. That is almost certainly the best tested of all the components in Linux. It is used by everyone, so the eyeball count is particularly enormous.

    If they compared the implementations of something less popular, X.25 say, the numbers MIGHT be different.
    • And here you have the fundamental point. The parts of OSS that all the coders use are better than the parts of CSS that the coders use, since CSS is usually aimed at end users, not developers, and is designed with someone other than the author in mind as the end user. CSS is more likely to have a consistent level of code review (i.e. ISO 9000), while OSS is likely to focus on the bugs that cause the developers problems. If you use the software in the same way as the developers do, that's great; if not, OSS can be less stable than CSS. While this article gives some useful propaganda for use on PHBs, it's not actually very objective or useful.
  • by BigBir3d ( 454486 ) on Thursday February 20, 2003 @09:35AM (#5343299) Journal
    For this reason:
    Reasoning declined to disclose which operating systems it compared with Linux, but said two of the three general-purpose operating systems were versions of Unix. The comparison was done with version 2.4.19 of the Linux kernel. For the comparison products, the company had access to the source code that for proprietary software is usually a closely guarded secret.
  • by MondoMor ( 262881 ) on Thursday February 20, 2003 @09:35AM (#5343306) Homepage Journal
    The code for the shuttle's GPCs is closed, and it's regarded by many as probably the most bug-free code around of any real complexity. It's been upgraded several times since the '70s, and errors have rarely been found.

    It probably had one of the longest development times for its size, too. Which helps a lot.

    Quality has nothing to do with whether code is open or closed source. It's got everything to do with the environment in which it was written. Code written under extreme management pressure at a profit-hungry megacorp is just as bad as code written by an ignorant or uneducated dork in his basement.
    • that MS code is just as good as code written by an ignorant or uneducated dork in his basement.

      He'll probably have his mom stitch it into a sampler he can hang on his office wall.

      (Of course, personally, I don't think it's true. Bill has the resources to throw at code to make it much worse than any single dork can, but that's just my opinion)

      KFG
    • How can it be regarded as bug free if no one has the opportunity to review it?

      I'll believe it when I see it.
  • yeah right (Score:3, Insightful)

    by nomadic ( 141991 ) <[moc.liamg] [ta] [dlrowcidamon]> on Thursday February 20, 2003 @09:37AM (#5343325) Homepage
    Specifically, Reasoning examined the TCP/IP stack and found fewer errors in Linux.

    So they just looked at the code and found all the bugs? They must have the best programmers in history working for them if they can just look and find all those bugs that it usually takes mortal programmers years to find.
    • Re:yeah right (Score:3, Informative)

      by gmuslera ( 3436 )

      There is also an article about this here [zdnet.co.uk].

      They did not search for every kind of possible bug; the article says specifically what they were looking for:

      Reasoning looked for programming problems such as memory that was marked as free when it was in fact still in use, memory that was being used without being properly initialised, and attempts to store data that exceeded the space reserved for it. This last problem is often associated with buffer overruns, a major weakness that under some circumstances can let an attacker take over a computer.
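      In modern C terms, those three defect classes look roughly like the sketch below. This is deliberately simplified illustration, not code from the report; the dangerous statements are commented out so the file still compiles and runs cleanly:

          #include <stdio.h>
          #include <stdlib.h>
          #include <string.h>

          int main(void) {
              /* 1. Memory marked as free while still in use (use-after-free). */
              char *p = malloc(16);
              if (!p) return 1;
              strcpy(p, "hello");
              free(p);
              /* printf("%s\n", p);  <- defect: reads memory already freed */

              /* 2. Memory used without being properly initialised. */
              int n = 0;             /* imagine the "= 0" were missing...    */
              printf("n = %d\n", n); /* ...then this read would be undefined */

              /* 3. Data stored past the space reserved for it (buffer
               *    overrun), the classic remote-exploit ingredient. */
              char buf[8];
              /* strcpy(buf, "far too long for eight bytes");  <- defect */
              strncpy(buf, "ok", sizeof buf);
              printf("buf = %s\n", buf);
              return 0;
          }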

  • Much more interesting would be a code comparison between the open source Samba and the Micro$oft CIFS code...
  • Not about Microsoft! (Score:3, Informative)

    by Daengbo ( 523424 ) <daengbo@OOOgmail.com minus threevowels> on Thursday February 20, 2003 @09:38AM (#5343331) Homepage Journal
    Before we begin the bashing, let's note that two flavors of 2.4.19 were compared to two closed source Unix operating systems. Let's try to keep the evil empire out of this one!
  • by DeadSea ( 69598 ) on Thursday February 20, 2003 @09:40AM (#5343346) Homepage Journal
    The Linux TCP/IP stack is an area of code that is known to be robust. It has been analysed again and again. The Windows TCP/IP stack is widely regarded as inferior on many counts. If you choose TCP/IP as your area of study, I don't doubt that you will come out with these results. If you chose another area, such as the USB protocol, you would find very different results.

    TCP/IP is better on Linux because many very talented people have worked on it. This is an area in which open source software development has worked well. However, it does not mean that open source development always works better.

  • by dido ( 9125 ) <dido AT imperium DOT ph> on Thursday February 20, 2003 @09:40AM (#5343349)

    If they found 0.1 errors per 1000 lines of code, did they approach Linus and Co. to point them out? Has Reasoning submitted any kernel patches to address the errors they say they found?

  • by tbmaddux ( 145207 ) on Thursday February 20, 2003 @09:47AM (#5343401) Homepage Journal
    Right? [slashdot.org]

    Were they testing code from 7 years ago?

  • by SatanicPuppy ( 611928 ) <Satanicpuppy@gmail. c o m> on Thursday February 20, 2003 @09:51AM (#5343431) Journal
    The code I write for myself is the cleanest stuff in the universe. I get freaky about extra lines or lines that look "ugly" or inelegant.

    Now when I'm at work I toss out functional, ugly code. Doesn't work quite as well, but 90% of the users will never know that. I'll write catch statements for the most obvious errors, but I don't sit and brood about what some hypothetical idiot might want to do with the code. If there are enough people who hit an error there, I patch it, and move on with my life.

    By and large, high production commercial code is sloppy. There isn't any profit to be made in making it pretty or elegant, and we all know how (for a random example) MICROSOFT feels about profit.

    Open source is just the opposite: if you're not making any money on it, you're doing it for your own personal satisfaction, and I think most people find it more satisfying to have clean, badass code rather than sloppy junk code. Heh. Especially when your NAME is on it, and the SOURCE is available.

    Just my .024 euros.
  • by Anixamander ( 448308 ) on Thursday February 20, 2003 @09:54AM (#5343449) Journal
    Reasoning "sells automated software inspection services."

    The key word here is "sells." They would have a tough time selling this to open sourcers, what with everything wanting to be free and all. Instead, they show the big closed source companies that their code isn't nearly as bug-free as the open stuff, and that therefore they really need to buy this.

    I'm not denying that open source is less buggy, but always question the motivation of the company making the claims. Just because Reasoning's assertions fit your own neat world views doesn't mean that they are without bias or secondary motivation.
  • by Skapare ( 16644 ) on Thursday February 20, 2003 @09:59AM (#5343488) Homepage

    What I am wondering is which is the cause and which is the effect:

    Microsoft source code is defective because it is closed.

    Microsoft source code is closed because it is defective.

  • Great news! (Score:4, Interesting)

    by indecision ( 21439 ) on Thursday February 20, 2003 @10:06AM (#5343531)
    Now if they could please point out the filenames and line numbers in question, perhaps we could eliminate the bugs altogether...
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday February 20, 2003 @10:07AM (#5343538)
    Comment removed based on user account deletion
  • by OldAndSlow ( 528779 ) on Thursday February 20, 2003 @10:13AM (#5343574)
    This study was done with an automated inspection tool. Ergo, it cannot assess the dynamic behavior of the code. Ergo, it can't tell how many defects there really are. The company's web site talks about finding memory leaks. Important, but it doesn't give a true picture of software quality.

    This looks like a publicity stunt by the Reasoning folks.
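    To illustrate the static-vs-dynamic point with two hypothetical C functions (both invented for illustration): a static inspection tool can flag the first defect without ever running the code, while the second contains no memory misuse at all and will only be caught by actually testing the function.

        #include <stdlib.h>

        /* Statically visible: "tmp" leaks on the early-return path. */
        int parse(const char *s) {
            char *tmp = malloc(64);
            if (!tmp) return -1;
            if (!s) return -1;    /* defect: tmp is never freed here */
            /* ... imagine real parsing work here ... */
            free(tmp);
            return 0;
        }

        /* Statically invisible: sums only the first n-1 elements. No
         * leak, no overrun, nothing for an inspection tool to flag,
         * yet the result is wrong for every nonempty array. */
        int sum(const int *a, int n) {
            int total = 0;
            for (int i = 0; i < n - 1; i++) /* should be i < n */
                total += a[i];
            return total;
        }

        int main(void) {
            int a[3] = {1, 2, 3};
            /* Running the code exposes the logic bug: 3, not 6. */
            return sum(a, 3) == 6 ? 0 : 1;
        }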
  • by zimmermantech.com ( 624686 ) on Thursday February 20, 2003 @10:15AM (#5343593) Homepage
    "Reasoning examined the TCP/IP stack and found fewer errors in Linux" The TCP/IP stack in Linux (and for that matter, most operating systems) was borrowed from BSD. Shouldn't this comparison be a testament to quality BSD instead of Linux? Paul Zimmerman http://zimmermantech.com/webcam.htm [zimmermantech.com] "Comments should be like skirts - Short enough to keep your attention, but long enough to cover the subject"
    • (-1) Misinformed

      The Linux TCP/IP stack was not pulled from BSD; it was written from scratch, or at least most of it was. That's why when you see bug fixes for the BSD stack you don't see them in Linux, and vice versa.
  • by realnowhereman ( 263389 ) <andyparkins AT gmail DOT com> on Thursday February 20, 2003 @10:23AM (#5343672)
    This is still an argument for the open source method, but I think that the code quality should be attributed to a different source. Perhaps it is not about an inherently good or inherently bad method. What if age is the key factor?

    The Linux networking code has been in there a long time. Not in its present form, obviously, but each change builds on the last, as it must in open source; it would be foolish to start afresh when you have something that works. So a cycle develops, and at each stage the code gets better. Compare this with proprietary development: can they look at a competitor's code? No. They must start afresh, and so their code is effectively younger.

    Further, if we measure software age not in units of time but in units of updates, open source has the advantage that there are many updates; there is always someone new to look at the code. No company can compete with the sheer quantity of viewings, and therefore updates, that occur in open source development.
  • Already Analyzed (Score:5, Informative)

    by Euphonious Coward ( 189818 ) on Thursday February 20, 2003 @10:49AM (#5343919)
    The Linux Weekly News already has an analysis of this report up at http://lwn.net/Articles/22623/ [lwn.net]

    Two key points are that (1) most of the bugs Reasoning found are false alarms (which is an occupational hazard for this kind of analysis), and (2) one reason Linux does so well is that those lunatics at Stanford have been doing just this kind of analysis for quite some time, so most of the easily-found bugs were found long ago.

    This doesn't invalidate any of their conclusions, of course: the Stanford lunatics haven't been analyzing NT, they've been analyzing Linux, and for sound academic reasons.

  • begin troll

    This may be another feather in the open source cap, but I wonder if open source is a good thing in the first place. Think about it for a second. Linux replaces Unix in the server world (which is happening). Companies that make closed source Unix OSes lose money, then they fire people. Companies get used to not paying for software, so they start using open source more. More closed source companies lose money; more fire people. Just something to think about when you're hacking away at your latest kernel patch. You are writing software so companies can spend less money, executives can give themselves big bonuses for saving money, and vendors can fire people. I'm a consultant for big companies. I've seen it; it happens.

  • and... (Score:2, Redundant)

    by b17bmbr ( 608864 )
    this is news to /. readers?
  • ... so with the luxury of time, it _should_ be less buggy.

    I don't think releasing the source is necessarily a good thing for a commercial app. How would you control updates and mods? Where would the configuration control come from? I just had my first encounter with CVS at SourceForge. _NOT_ straightforward. I don't think you could scale that up to a million purchasers.
  • by Junta ( 36770 ) on Thursday February 20, 2003 @11:52AM (#5344530)
    Open source is still largely developed as a hobby by enthusiasts. Some companies have their hands in the pie too, but even the resulting effort seems more in the style of the hobbyist than of the typical company effort.

    Two factors. When I develop closed source apps for work, especially something I have no real passion about, I tend to write messier code. No one is going to see it anyway. If I ever change jobs, a potential employer isn't going to see that code to review my style. If neither users nor the community in general will ever see it, I'm more likely to take riskier shortcuts and settle for inelegant hacks. As long as no obvious runtime problems occur, that is enough. When I submit patches for open source applications, I take more pride in the work. I want it to be clean, easy to read and follow, and free from amateurish-looking code.

    Secondly, even when I would like to re-evaluate approaches in a commercial environment, the business end of things will push deadlines. Time that I would normally have taken to go back, clean up, and rework the bits that work but are too inelegant is denied. There is a significant amount of attention to market trends, customer demands, and marketing promises that interferes with quality code. With open source, you do it as you feel like it. Take as much time as you want. Sure, there are frequently deadlines in large projects (feature freeze, etc.), but the penalty for not meeting those deadlines is just that your work is delayed to the next release cycle. There is no danger of losing your job, and even if consistently missing feature freezes means you lose CVS write access or are not taken as seriously, it really is no skin off your back, and you can almost always get back in by picking up the pace again...
  • by Eric Damron ( 553630 ) on Thursday February 20, 2003 @03:19PM (#5346546)
    It is natural for open source projects that survive to become very high quality. Look at it this way: If you buy proprietary software from a corporation, you can be sure that they are motivated by the bottom line.

    Corporations are there for one reason only: profit. This in itself does not mean that the products that they put out will be inferior. However, being motivated by profit means that:

    1. They will push their employees to put out a product quickly.
    2. If a product has flaws, it is the bottom line that dictates the priority given to fixing that flaw.

    Open source, on the other hand, is completely different. Although it can be motivated by profit, usually it is not as much. A lot of people do it because they just want to do it. This in itself does not make open source less buggy. I would say that most young projects have as many or more bugs in them as proprietary projects do.

    However, if the projects live for a long time it is because dedicated coders have decided to spend their time improving the product. This dedication over a period of time without the pressure by management to quickly push the product to market is the reason that open source becomes better than proprietary software.
