Open Source == Faster bug fixes

solar writes "SecurityPortal.com is running a comparison between RedHat, Microsoft, and Sun Microsystems on the response time between software bugs being found and patches being released. Find out if open-source is the champion bug squasher we all believe it to be." Interesting bit.
  • by Anonymous Coward
    You've pushed us back another 72hrs with your ranting. Oh the inhumanity! We'll never see the source if people like you keep speaking their minds ;-).
  • by Anonymous Coward
    Unfortunately, SecurityPortal can make no claim comparing Microsoft and Linux.

    Using a one-way ANOVA (or a t-test) even at the 0.05 significance level, the difference in the means between Microsoft's response time and Linux's response time is far below the threshold for statistical significance. In other words, there is a fair probability that Microsoft has as good or better response than Linux. The results show a statistically insignificant difference.

    Statistics lesson #1: if a site does a comparison with just a few samples or without providing variances and t-test results, chances are high that their "results" aren't results at all.
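
    To make the poster's point concrete, here is a minimal sketch of the kind of test the article should have reported. Every number below is invented for illustration; these are NOT the SecurityPortal figures.

      # Hypothetical two-sample t-test; all the data here are made up.
      from scipy import stats

      # Days from disclosure to patch for a handful of advisories.
      microsoft_days = [1, 3, 5, 16, 30, 61, 90]
      redhat_days = [1, 2, 4, 11, 29, 59, 80]

      t_stat, p_value = stats.ttest_ind(microsoft_days, redhat_days)
      print("t = %.2f, p = %.3f" % (t_stat, p_value))

      # With samples this small and this noisy, p lands far above 0.05,
      # so no significant difference in the means can be claimed -- which
      # is exactly the point above.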

  • by Anonymous Coward

    We have our best perl coders here slaving over the Slash release. Patrick, Rob, and Pater are trying to convert their undocumented code

    Sounds like 3 programmers just got a crash course in job security. I bet one of them realized, "Hey, if we release the code, then 300 to 3000 programmers will get to work on this stuff and the suits will see us sitting on our duffs. Then they'll start wondering what they're paying us for. Let's keep the code here so that we'll have something to do."

    Ain't Open Source a bitch? The nastiest thing you can do to a worker is increase productivity. It makes everyone look bad.

  • I'm not one of those test managers who manage but don't test. I've done everything from writing the tests to performing them -- for banks and large government projects. I own my own company, and brought in over $100,000 last year in the Y2K frenzy.

    Mostly I do (test), and don't spend much time managing -- unless it's to fight for more time to test.

    If anyone knows or supports the idea of detailed regression testing, it's me...insisting on a contractual minimum level of testing. I've generated test plans with over 1,000 individual tests -- all specifically targeted to the requirements in the actual contracts. (I don't just make this stuff up!)

    Yet, with Open Source projects, there's little or no formal testing. What testing gets done tends to be tainted: testing by the developers themselves. Do I care? Actually, no. The quality tends to be fairly high since the developers themselves care. They pay attention, and the quality of programmers is visibly increasing.

    This trend -- and the fact that, yes, testing jobs suck -- has convinced me that less testing will be approved in the future and that I'd better adapt.

    Five years ago, I would have promoted VV&T as a good if unrespected career. Now, I'm convinced that it's a place for two groups of people only:

    1. Losers: ex-school teachers, temps, less technical analysts who just pick up a check, and "black group" annoying folk, or

    2. Professionals: The kind of people who know IEEE or CMM test methods and tools or at a minimum have exceptional puzzle solving and logic skills and seek those tools out.

    It's difficult not to be lumped into and treated like group #1. No, I'll amend that. It's nearly impossible not to have someone chew your ass out every few weeks just for doing the tests right and documenting actual, real, problems or deviations from the requirements they, themselves, aren't paying attention to.

    My decision? Drop testing and become a network admin. After a while, when I can do the job well, I'll try to find a new niche and specialize. Five computers in my basement, networked, one of them a Sparc system with Solaris, a pile of O'Reilly books, and countless hours installing/changing/securing them have given me the basics. On a scale of 1 to 5, I'm a 2...and I don't deny it; I know I can do quite a bit of damage and don't yet understand how to be a good admin. In a couple of years, I hope to be at level 4.

    Sure, as an admin I'll catch about as much hell and get thrown off projects by the same political currents as before, yet after going through testing I'll be prepared for that kind of flak and lack of respect. After a while, it won't matter...but I'll have a good job and be in demand. Then, I'll get back to double-digit contract increases...ahhhh!
  • It looks like Slashdot is caught in the classic software trap. An ad hoc application grows in an unforeseeable way through various hacks. Its success leads it to assume mission-critical status, and yet it is unmaintainable and undocumented.

    Does choice of language affect the unmaintainability? You bet it does. And in the opinion of many, Perl is not well suited to large maintainable programs. One way of taming this problem is to choose a more modern language, an Object Oriented language. While the benefits are not realized in many partially-OO languages (such as C++), clean languages like Eiffel and Python are effective tools for ensuring long-term maintainability and extensibility.

    (No flames please. This is offered up as food for thought, and as an object lesson in how even people like Rob Malda ignore the best advice of their Computer Science professors.)

  • >We have our best perl coders here slaving over the Slash release. Patrick, Rob, and Pater...

    Ever stop to think there might be folks out there that could help who are better than your three best perl coders?

    >The Slash code really is hardcoded in many ways and they are trying to unhardcode it for you now.

    And we're somehow not able to figure this out ourselves? Or at least lend a hand finding bugs, hardcoded assumptions, off-by-one errors, etc?

    Y'all need to stop trying to make a release that 'looks good,' and take a little time to re-read CatB. "Release early, release often." Not "when it looks pretty," not "when we think we have most of the bugs fixed," not "after we've added this one new feature."

    Because what that's going to get you is an immaculately-debugged Slash that is going to have to be rewritten anyway since you didn't expose the code to enough eyes to find the bad practices and assumptions that no doubt live in there. You'll have cleaned up a bunch of code that will get thrown away anyway. cf Mozilla....

    Do it now -- 'tar cvfz slash-0.50.tar.gz slash/' and put it out there, warts and all, money where your mouth is. I double dare you -- you might end up surprised at how much faster the code (and therefore your revenue generating site) gets better. Ah, and that takes us right back to the original topic, donnit?


    --
  • The point you're missing is here, and yes, it's an interesting point to discuss: "Once we invested time, effort and money to write this software, how can we avoid having someone larger than us profit from our labors at our expense?"

    If you assume that, for someone else to profit, you have to lose, you will avoid this situation at all costs.

    If you consider it possible for yourself and another to both profit in some manner, then you will be more inclined to allow this situation to happen.

    There are good arguments on both sides - for instance, when you include corporations in the equation you have to understand that, while you may wish for a mutually-profitable situation, the corporation is legally bound to not only try to profit, but to try to hurt you and cause you to lose, if you are in the same line of business. It cannot cooperate with you. But at the same time, if the corporation is copying your GPLed source, there is a limit to how uncooperative it can be. It can take your code, use it, outmarket you and then withhold its changes until release - but then it has to let you have them, and even without using your code it still outmarketed you, get used to it ;)

    The basic issue is simply this: how important is it that you be able to prevent someone else from profiting by your labors? Are your labors so miraculously advanced beyond the rest of the industry that (a) nobody can help you with them, and (b) they'd make a significant difference in the performance of your competitor? This is software, people- there's never been much of a link between quality and profitability. If there seems to be a gain from cooperating, consider the possibility that 'having someone else profit from your labors' is just a chance you'd have to take.

    This whole 'winner takes all' concept seems to have grown out of the years of Microsoft monopoly. I suggest that this is not the only way the world works, and that the software industry is drifting back into regions where developers can profitably cooperate with each other. It's _normal_ to not need to take a 100% hostile ruthless attitude at all times. Such things are quirks of history, and we have lived through such a quirk. Amazingly enough, some things survived, such as Macintoshes and Linux. Now it's time to settle down a bit, quit scorching the earth, and get back to more normal interactions.

  • Have you ever joined a development team that had already been working on a software project for some time? Did they expect you to know where to find every bit of functionality right away? NO!

    [sigh]...
    Have you ever joined a development team that was working on a software project for some time with a buggy API underneath it, that they didn't have the source to? Have you ever just had to learn to live with its quirks (as they already had)? Have you ever wanted to have the source to that d**m thing and be able to fix it yourself?

    No, this doesn't mean you don't have to spend a couple hours staring at the thing going "how the fsck does that work?", but it does mean you at least can do this if you're so motivated (like I am). And it's much easier to understand the code if you understand the API.

    Of course, getting folks to use your revisions is a different matter... :)

    As for making patches that don't mess other stuff up, that's a matter of having properly designed and documented software -- which a well-maintained open source project will be.

  • such as this - if a company knows of a material defect in their product and conceals it from the consumer, resulting in losses to the consumer - said greedheads are liable under the higher standards of gross negligence, recklessness, or even intentional tort, resulting in statutory treble damages or unlimited punitive damages in some circumstances.

    The greedheads are already well on their way to taking care of that little problem. Go read up on UCITA.

    http://www.troubleshooters.com/ucita/ [troubleshooters.com]

    http://www.2bguide.com/nccusl.html [2bguide.com]

    Here's a list of Infoworld articles on UCITA [infoworld.com]

    You can find a whole lot more besides these by doing a Google [google.com] search.

  • Red Hat doesn't do the majority of the bug fixes.

    Red Hat just re-packages programs that other people write. The real bug fixes are being done by the project maintainers.

    The only exception to this is that the Debian project has package maintainers that foster a good relationship with the upstream project maintainer. The Debian project really passes those bugfixes upstream.
    --
    I noticed
  • I'll agree with you on that. I've seen it happen to many people. My opening line of "I'll probably get flamed and moderated down for this" wasn't intended as a carrot, but truly reflected my feeling (as you put it) that you do have to tiptoe around. In hindsight, I don't like it when I see other people make similar comments, so I won't be doing it again.
  • Oh! Was that my tongue in my cheek? ;)
  • Nice.

    I guess that it depends on the platform/market.
  • I'll probably get flamed and moderated down for this (there seems to be a rather vocal fundamentalist open source community here on /.)

    Everybody touts open source and how good it is. These people generally cite certain prime examples to support their arguments. (I also think that many of these people are as biased and tunnel-visioned as the esteemed Mr. Raymond.)

    For some software it works well. Although Linux is stable and its security holes are filled quickly, it's not as successful as other products. Everybody seems to want it to be a success on the desktop - I won't call it a success until it actually puts a dent in Microsoft's huge majority. Mozilla is certainly far from being a success. We await the future on that one. I don't think that there are any hugely successful pieces of open source software (sendmail? but then it basically started off with a monopoly)

    However, I do not think that all software is suited to open source. The last company I worked with developed database marketing software. Nobody is going to work on that kind of software for free. If we'd opened the source, then our already-established (and far bigger) competitors would have been able to leverage our work in putting us out of business. I don't work for free; I have to pay bills. In fact, I'm quite happy to be paid a lot of money for what I do.

    /. doesn't seem to think that open source helps fix bugs. Otherwise the source that they've released would be more up to date, and consistently kept that way.
  • Check out squishdot.org [squishdot.org], where you can find the free software behind technocrat.net and hundreds of other weblogs. We could use some more developers.

    Thanks

    Bruce

  • Do you mean that crackers need a whole bunch of holes before they start breaking into your systems?

    I don't think so ...




  • As you have pointed out, there _is_ so-called "unmaintained" software in BOTH the open-sourced and close-sourced software arenas, and you have also aptly pointed out that the "deadends" in open source are not-so-deadends, because if you have the skill and inclination, there _exists_ the possibility that YOU take up the job of fixing whatever needs to be fixed.

    You have also pointed out that _NOT_ everyone has the skill and inclination, and you hinted that, therefore, the "open-sourced" model doesn't work for every occasion.

    But did you ever try to put your thoughts on the OTHER SIDE? Think of close-sourced software - when it is NOT maintained, it remains DEAD no matter whether those who want to fix it have the skill and inclination or not.

    Take, for example, the software OPTASM - a REMARKABLE assembly language compiler. It is a CLOSE-SOURCE product, and is no longer on the market.

    I heard that the company that used to own OPTASM was sold to Symantec, the people who produce Norton Utilities, so I contacted Symantec, trying to find out if they still sell the OPTASM compiler or not.

    The answer I got is NO. Symantec isn't selling OPTASM anymore, and they have no plan to update the product. That means, essentially, the people at Symantec have ABANDONED a remarkable product that was once one of the BEST assembly compilers for the X86 chip line.

    And when I further enquired with Symantec regarding the possibility of them releasing the SOURCE of OPTASM to the general public - since they are NOT going to sell OPTASM anymore, I figured that they have NO PLAN to make money from that thing anymore, right? - the answer I got from Symantec was a BIG SILENCE.

    I can code. Although I am not a CRACK PROGRAMMER, I have enough experience to do _some_ updates and code cleanups for the OPTASM compiler, if I can get the source to it. And I AM WILLING TO DO THAT.

    In other words, I _HAVE_ the skill and the inclination to update the OPTASM compiler, but because of its close-sourced nature, there is NO WAY I get to do it.

    On the other hand, if the OPTASM compiler were an open-sourced product, with my skill and inclination, at least I would get to TRY to update the thing.

    In summary, a HUGE difference between the Open and Close sourced software arenas is in the MAINTAINABILITY of ORPHANED software - the software that is not being maintained anymore.

    Orphaned but open-sourced software _could_ be updated by ANYONE who has the skill and inclination, but orphaned close-sourced software will be DEAD FOREVER if the owner declines to release the source to the public domain.

    I hope what I am saying here will bring attention to those who hold the rights to the source code of orphaned and previously close-sourced software, and I hope that they will release their sources to the public domain. If they do not want others to PROFIT from the good gesture (releasing source code to the public domain), then they could release it under the GPL.


  • Hmm: the link you give simply asserts that this is an urban legend -- there's no further information.
  • As far as I recall, it has to do with the fact that the bumblebee (and other sorts, for all I know) is continually altering the pitch of its wings during flight -- the scientists who were baffled by its seemingly-impossible ability to fly had only considered the case of static wings.
  • Red Hat is generally considered the least secure distribution of Linux available because of its default configuration, which runs all kinds of servers after boot-up - not because of its security updates.
    They are usually the first Linux distribution to release updates after a security problem arises.
  • If you go and buy Windows 95 off a store shelf (if you could find a copy) it's the first release. Why? Inventory.

    Nope, it's support. They don't want to have to train their OS tech people on original, OSR 2, 2.1 and 2.5. They (Microsoft) want the OEMs to handle that kind of stuff.
  • it is not about writing software "for free"
    ... but it is about releasing all rights you might have to the software you've written.

    it is not about not being paid a lot of money for writing it
    ... although it is about having abuse heaped upon you, your company and your product if you don't give in to the demands of the mob.

    it is not about giving your competitors the ability to put you out of business
    ... unless, of course, your business is writing software.

    Sigh. Same old, same old... There are any number of extremely convincing arguments for using and supporting open source; to spout the useless and emotionally-charged rhetoric you did contributes little or nothing to the discussion at hand. It does not answer the questions raised by the original poster:

    • Once we invested time, effort and money to write this software, how can we avoid having someone larger than us profit from our labors at our expense?
    • What is the economic model that allows me to profit from my knowledge and skills, since you insist I am to give away the fruits of my labor for free?
    • Who would write this highly-specialized software if we didn't, and why would we do it if there was no incentive for us to do so?

    These are serious questions, from someone who is at least willing to listen to an explanation... and instead of trying to answer his questions, or explain to him why you believe what you do, you deride him for not being a believer. I'd say that I'm staggered and shocked by your arrogance, but unfortunately, it appears to be all too common among open source advocates in general, and /. trolls in particular.

  • This is what attracts me to Open Source packages/OSes as security solutions. You can audit the source code yourself for security vulnerabilities (overflows). Plus, the code is audited by security hobbyists/professionals all over the world.
  • Yes, and I've seen scientific arguments that a bumblebee should not be able to fly. But the fact is, it can fly, so any scientific "evidence" to the contrary is probably not taking other factors into account.
    ----
  • NDA?

    I heard *NOTHING* about removing the fix for NDA reasons, and I was in support at the time. From the moment we had a fix up, there was a fix available all day every day until we got our "final" fix.

    The original one was "M310-hangfix", I believe; the !@#*! bug came out days after 3.1 shipped.

    I believe we didn't ship source for the fix until we had a "final" patch - that may have been the NDA deal.

    I do know that the reengineering work was pretty much internal; we were aware of flaws in the initial patch (performance hits, however minor), but we wanted a fix out so people's systems would stay up long enough for them to get the newer fix.

    But yes, the fact that BSDI and Intel engineers were on a first-name basis probably helped immensely in getting the fix out.

    The question this raises is, why did it take longer for Windows to get a fix than it took BSDI? They certainly have their hooks in over at Intel.
  • Would somebody please explain exactly what part of any commonly-accepted definition of "Open Source" dictates what release methodology or schedule the developers are compelled to follow?
  • You are correct that Speed != Quality. This is why OpenBSD [openbsd.org] has solved, sometimes years ago, the identical problems for which Bugtraq/CERT advisories are still appearing on other Open operating systems (Linux, FreeBSD, and NetBSD, to name the most prominent examples). This is also the reason that perl -MCPAN -e shell is one of the cleanest installers available -- including download, checksum validation, compilation (for XS'ed C code), and ubiquitous regression testing.

    It takes dedication and commitment to make a good open product into a quality open product.

    ... and this coming from someone with 5 Linux boxen (among others)...

  • Open source software will always be the quickest for bugfixes assuming that the software is currently being actively
    maintained.

    Aye, there's the rub. There are a fair number of Open Source packages that are no longer maintained. Bug fixes may or may not ever come for those packages.

    Of course, you do have the opportunity to fix 'em yourself, if you have the skill and inclination. So even un-maintained packages are not dead-ends, if you have the time and talent to do your own maintenance. It should be clear that not everybody does.

    The inescapable conclusion is that the User classes are doomed to use buggy software if they don't pick their packages carefully. This is true of commercial closed-source products just as it is of Linux applications, world domination or not.

  • If you want something to be open source, get off your ass and write something that's open source. Criticizing other people for what they choose to do with their own software does nothing but annoy people.

    If you think that you can write a better slashdot, then go out and do it. I wish you well. The folks that run this site are under no obligation to release anything that they do. The author is the one who ultimately has the rights to the software that he writes, no matter how much you demand otherwise.

    Ability To Do Better is NOT a prerequisite to the Right To Complain. If that were true, then:

    all movie critics can't complain unless they can make a better movie.
    all game critics can't complain unless they can make a better game.
    and so on.

  • Here I would like to define the phrase reverse dogma. It is most commonly seen on /. when a poster (i.e. Malc) posts a controversial anti-dogmatic post. Kind of like the anti-hero of Signal 11.

    I especially dig the phrase "I'll probably get flamed and moderated down for this", since almost every post I see with this phrase gets moderated up. Moderators, they are playing with your mind. It's almost like the poster is saying "nah nah, I dare you to moderate me up!"

    I think finkployd is on the money when he postulates that Malc is just trying to be funny, or is craving karma.

    Frac

  • 2k is supposed to have some provisions for not allowing other random progs to overwrite DLLs in system/system32 (which would be nice) - every random Joe Blow app should *NOT* replace system-wide DLLs. Ever. Even MS Office (are you listening, chief of software architecture??)

    Actually, an edict was handed down at MS saying exactly what you're saying - system DLLs are now only to be updated in service packs - not in apps.

    Fingers crossed that it stays that way...
    Si
  • I agree with the sense of your argument, but would like to expand upon it with a few potential reasons.

    1) Multithreaded development and debugging. The well-discussed reason is the distributed model of work. Since there are many potential testers in the world for each piece of Open Sourced software, there are also many potential patchers for said software. The person who finds the problem actually has a possibility of being able to offer a fix.

    2) Risk aversion. Big corporations like M$, Sun, HP, IBM, etc. have reputations to consider. If they offer a "fix" that later has to be fixed itself, they are embarrassed and sales could be hurt. This is bad for the decision makers because they have something to lose for their efforts besides pride -- money. The Open Source community has much less to lose. This has primarily been due to the fact that since their work was volunteer, they could hardly get fired or sued. As the world awakens to Open Source and corporations enter the arena, I wonder if this will change.
  • by / ( 33804 )
    Be the bee, be the wing, happy birthday Martin Luther King?

    Tux is here; Gates to shear; Someone pass another beer!
  • After all, according to theory, a bumblebee shouldn't be able to fly, but I have been stung several times!

    The AFU FAQ Shows this as False [urbanlegends.com]

  • What I found interesting is that Microsoft has more bugs to be fixed. This in itself isn't the surprise so much as the fact that the general public doesn't seem to want to face up to it.

    And, with Windows 2000 on the way, won't it just get worse?

    Not an incredibly insightful comment, I know. That's why I'm hoping for some insightful responses. :-)

  • Well, they *do* have fewer bugs...
    98 claimed something like "3000 bug fixes from 95 and a complete rewrite of the memory management subsystem" (which it really needed)

    Win2k claims that there is some ungodly number of situations where you no longer have to reboot. Which is nice, but there are still far too many. You can change your IP address, for example, without rebooting, but a host name change sends you packing...

    there's something to be said for network restart...

    Better is good. Not having to fix 3000 of 8000 bugs is better 8^)
  • Yes, and how many are "Oh, $h17" bugs which just all of a sudden break stuff (I'm thinking of service pack 6 + Lotus Notes(MUST DIE!)). For a week, the only solution they had was, "um, you'll have to run as Administrator until we get off of our arses." Not that Lotus Notes(MUST DIE!) has a whole lot of trouble breaking things by itself...

    but that is another rant for another day...

  • Bumblebees can't fly, they just don't know that they can't, so they do ;-)

    Steven Rostedt
  • I remember reading an article about how the person who reviewed some Windows source code for the Microsoft v. DoJ trial found that there were plenty of known bugs. Apparently, if my memory serves me correctly, he was allowed one-third of the Windows source code and in that one-third, he found over 3000 documented bugs. Included was exactly how the bugs worked and what they did and, in some cases, even how to fix them.

    Chris Hagar
  • Naw, it's not slashdotted. They're just very, very security conscious. I got back 'Forbidden: you do not have permission to access "ads.html" on this server.' Dang, that's great! Can I enable that feature on my end? "Forbidden: you do not have permission to display ads on my browser." ;)
    --
  • I completely agree with you that open source software will work in just about every situation. I do, however, have an objection to the GPL being good in all situations. Here are two examples:
    ---

    1) It makes for almost perfect competition. Therefore it is only suitable for glue or base framework to your business model. The only way around this is to keep your changes completely internal.

    The converse argument would be to look at companies such as RHAT. I believe this to be invalid because:

    a) Operating systems are a very wide market. Therefore it does not apply to most software.

    b) There's no way in hell they can get a traditional software company profit margin, unless they provide value added closed source or complete solutions.

    2) There are some really stupid limits on commercial extension. The LGPL fixes this.

    ---

    There are some other arguments against open source in general as well:
    ---

    1) Some large projects take substantial resources to maintain and extend. There are a lot of projects that do not begin and end after a program meets specifications. Under open source, the customer has the right to fire you and have someone else extend the product, possibly putting you out in the cold. This can be especially hard on companies who have built reliance on their customers and/or want to expand their business. The problem is magnified many times if the market for your product is limited to only a few potential customers. Therefore, you have to be very careful with your contract (if you even have one, ha).

    In the end, it is more efficient. However, it might not be in my best interest. If anyone can come along and underbid my contract and take over the work I had been doing, I might think twice.

    2) The goodwill of humans can't be taken for granted in business. If I use a personally made toolkit to make a variety of applications, I do not want others using that commercially. Someone may use my code in other applications without asking. To fix the problem, I can restrict use of certain code, and in the event of violation, litigate. Another solution is to LGPL parts of the code to enforce return on modification.

    --

    Anyway, I want to ensure myself a healthy return while being efficient for my customers as well. What I do not want is to make my work a commodity that is easily duplicated.
  • Yep, good ones.

    What I do find bad is the lack of typical end-user applications. Most X applications on freshmeat typically fulfill specific goals of those developing the product. This does not always meet the needs of typical end users, who need an easier-to-use UI or documentation legible to the average computer user.

    Even free applications on www.winfiles.com are usually more user friendly (as well as shareware). Also, since many are recreational projects, they never get close to their commercial alternatives.

    Hopefully this problem will be rectified - either by the big boys moving in and making Linux accessible to the normal user in their own interest - or by beneficial standards agreed upon by an initiative interested in making it accessible.

    If Linux becomes successful enough, we might even see many commercial applications that cater to the end user, such as easy-to-use image editors (I'm talking good UI and easy process hand-holding), and just about everything Broderbund, Sierra and others develop. I am not convinced that it is possible for these particular categories to be open sourced, because recreational projects usually do not go into every possible detail and take the time to plan effectively except for their own use. They also usually are working to fulfill their own goals for a program, and not providing a solution to others' problems for compensation.

    In short, I think it's because those who have input in the program are generally working to fulfill the goals of the community that is contributing.
  • A "fix" also has the possibility of opening up bigger holes.

    The fastest fix is to disable that service until a well-tested patch comes out. This is true whether it is closed or open.
  • the latter [closed source] favors the development of monoliths, since they represent a harder-to-reverse-engineer, and therefore steeper, wall for competitors to climb

    It does something else as well. A monolithic product having once garnered customers for one of its features is more likely to seduce them into using it instead of its competition for its other features. And it can be sold against a range of competitors, none of whom offer its full range of features.
  • 8. Avoid Captive User Interfaces

    • CUIs assume that the user is human
    • CUI command parsers are often big and ugly to write
    • CUIs tend to adopt a "big is beautiful" approach
    • Programs having CUIs are hard to combine with other programs
    • CUIs do not scale well
    • Most important, CUIs do not take advantage of software leverage


    This captures something I was trying to explain to a UI class I took a few weeks ago. The rest of the points tie into it in various ways, but the two things I brought up were:

    • Any interface that I can't automate out of my way is a bad one because no matter how optimized it is, it forces a certain minimum amount of interaction with me. (A minimal sketch of the alternative follows this list.)
    • Internal protocols and file formats should be documented and accessible to readily available tools.
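
    As an illustration of that first point, here is a minimal, hypothetical sketch of the opposite of a captive interface: a filter that reads stdin and writes stdout, so a shell pipeline can automate it out of the way entirely.

      # shout.py - a filter, not a captive UI: no prompts, no menus.
      # Reading stdin and writing stdout lets a pipeline automate it
      # completely, e.g.:  cat story.txt | python shout.py | sort
      import sys

      for line in sys.stdin:
          sys.stdout.write(line.upper())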


    These two points led me to the statement "Open source is exposing the interfaces" a couple of weeks ago right here on Slashdot. The ideas behind that are simple:

    • If an open source project is going to thrive, it needs to interface with existing protocols and/or file formats.
    • Protocols that are published for free on the Web (RFCs, etc.) are the most widely available.
    • The more widely available the interfaces the greater the number of collaborators the project can potentially attract.
    • The code itself publishes the interfaces in open source. Since they can't be kept completely secret, there's not much point in hiding them.
    • Open interfaces encourage interfacing other projects with yours.


    The difference between the interfaces in open source and proprietary software is not clear-cut. But there is a tendency for open source to have a more dynamic view of the world. The programmers working on the project aren't going to have complete control over all of the customization. So there is an incentive to give away a rich configuration mechanism, or someone will build it in.

    One example, among my favorites, is the Free Translation Project [umontreal.ca]. This project is enhancing quite a number of open source projects with translated messages and documentation in a variety of languages. I suspect that my own team are the entire community of users for the Esperanto translations at the moment. No proprietary software project could ever justify the cost of rolling the translations into the distribution and testing them. Our team has taken on that burden. We translate, we test, .... We have that option because the programmers who got there before us wanted to internationalize and gave us the interface (gettext and the strings to translate).
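
    For anyone who hasn't seen gettext from the program's side, here is a minimal sketch in Python. The "slash" domain, the locale directory, and the Esperanto catalog are made-up examples, not taken from any real project.

      import gettext

      # Hypothetical message domain and locale directory; real projects
      # install compiled .mo catalogs under <localedir>/<lang>/LC_MESSAGES/.
      t = gettext.translation("slash", localedir="locale",
                              languages=["eo"], fallback=True)
      _ = t.gettext

      # Marked strings are looked up in the Esperanto catalog at runtime;
      # with fallback=True, untranslated strings pass through unchanged.
      print(_("No new comments can be posted."))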
  • Well, I submitted an article, with several links, about how M$ knew this was a problem and was sliding toward the *nix way of using static libraries. Win2k is really gonna fix about 90% of the stability problems in Win today. I know most slashdotters don't believe that, but I don't care. I KNOW it to be true.
  • To clarify just a bit more, IIRC:

    The bumblebee (and other insects like dragonflies and houseflies that have similar flight surfaces) *uses* the vortexes that its wing flaps generate to cause a slight vacuum above the wing, which creates additional lift, in addition to the normal lift from the wings' downbeat. Dragonflies in particular take advantage of the weird turbulences their wings generate to do all the amazing dragonfly types of things they do in the air.

    I could have sworn I had seen a reference to some scientist at Berzerkely (I think) right here on Slashdot who had recently built like a 50x scale model of a bumblebee to study its aerodynamic properties and come up with the canonical explanation of how they manage to stay aloft.

    -=-=-=-=-

  • People have been saying this for a long time. OSS bug fixes are simply faster because lots of people are reporting bugs and fixing them.

    This is why I think the benefits of OSS strongly outweigh the setbacks. Much better code is produced for worthwhile projects.

    "You ever have that feeling where you're not sure if you're dreaming or awake?"

  • Apples and oranges. There is a vital difference between Linux and NT:

    Linux, as large as it can get, is not One Big Monolithic Operating System(tm), rather a huge collection of programs, utilities and other errata that are all maintained respectively by their individual owners. NT is all Microsoft, and they are responsible for all of it.

    In fact, to go one step further, "Linux" refers to nothing more than the kernel, which is what, a meg and a half? The rest are separate programs and errata from different people and organizations, and therefore they can't actually be called part of the operating system--which is why purists prefer calling it GNU/Linux.

  • open software != good software

    Sure, not every piece of OSS is good - but OSS can be made good.
    If you find an OS tool that does what you always needed to do, but is written badly, fix it up (or hire someone who does).

    The ONLY benefit open source software gets is the off-chance that a programming guru happens to have absolutely nothing to do that day and fixes the bug before the core developers get to it

    And, of course, that if the core developers don't get to it in a reasonable time, someone definitely will fix it (that's part of what Linux distributors are there for) - and when one of us fixes it, the others get the fix as well.
  • That is why glint was broken on the Red Hat 5.0 CD-ROMs, right? Because Red Hat cares about quality, eh?

    Well, all I can say here is that shit happens, even here.
    Also, there's a big difference between QA'ing one updated package (I'm not aware of any errata package needing another update for the same problem), and QA'ing an entire distribution - the more packages you have to QA, the more likely it is that something gets overlooked.
    Also, please keep in mind that Red Hat didn't have as many people to look after bugs back then as we have now.
  • ping is still broken for rh 6.1.

    Get the iputils and netkit-base packages from Raw Hide. This will help.
    We don't usually issue errata for bugs that aren't critical for most users.

    It is an undisputed fact that RH has been aware of this since October 7, 1999. Over three months on something as simple as /bin/ping, and no fix!

    Well, not quite true. The fix has been around for quite a while (in Raw Hide).
    But yes, there was indeed a mistake, it wasn't added as a comment in bugzilla and the bug wasn't closed. I've done that now (ping isn't my responsibility though).

    Stupid stuff like this occasionally happens everywhere, and probably can't be avoided completely. (I wish it could.)
  • Would an all-bugs comparison bring the same results?

    My guess is yes - it would bring at least similar results.

    If you find a bug in Windows, what do you do? Microsoft does not even have an official bug-reporting system. That's (part of) why long known bugs in Windows (such as "can't install driver from directory with long name unless I tell the installer the short name") simply don't get fixed.

    Most Linux distributions, on the other hand, have a bug tracking system (Red Hat's, for example, is at http://bugzilla.redhat.com/bugzilla). The developer responsible for the package you're reporting a bug in is immediately notified.
    If a bug is left unattended in the Red Hat bug tracking system for 7 days, the system sends another mail to the assigned developer (repeated every 7 days).

    Someone WILL take a look at the bug, and probably fix it (stuff like "On my xyz system with the AAA graphics card, my X server hung yesterday and lacking a network card I had to reboot" is VERY hard to reproduce and even harder to fix though), or at the very least decide he doesn't have the time to look into it deeply and pass the bug report on to the maintainer(s) of the base package, and update our package as soon as the maintainer(s) release a new version.
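
    As a sketch of how simple that kind of escalation logic is (a hypothetical illustration, not Red Hat's actual Bugzilla code):

      from datetime import datetime, timedelta

      # Hypothetical bug records; the real tracker reads these from its
      # database, and really sends mail instead of printing.
      bugs = [
          {"id": 4321, "assignee": "dev@example.com", "status": "NEW",
           "last_touched": datetime(2000, 1, 3)},
      ]

      REMIND_EVERY = timedelta(days=7)

      def overdue(bug, now):
          # Unattended means still open and untouched for 7 or more days.
          return bug["status"] == "NEW" and now - bug["last_touched"] >= REMIND_EVERY

      now = datetime(2000, 1, 17)
      for bug in bugs:
          if overdue(bug, now):
              print("Reminder to %s: bug #%d is unattended"
                    % (bug["assignee"], bug["id"]))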
  • I realized I left out another important benefit to open source bug hunting:

    • if a bug is quite serious but intermittent or hard to duplicate, the "all bugs are shallow" rule kicks in. With the source widely distributed, the chance of a person encountering the bug being equipped to fix it goes way up. This should not be overstated as such bugs can still be hard to stamp out.
  • Slow down, friend. It was hard for me to tell which side of the issue you're on because you went so breathlessly fast. However, this part I can answer:

    What I don't understand is: why are you bashing MSDN, the biggest developersource on the planet?

    I did not say that the sorts of information that are in MSDN are not useful; they are indispensable. But they are especially indispensable because so little information is available from other sources. These scraps are not a substitute (in terms of usefulness) for source -- not the source of an example, but the source of the implementation. I'm bashing MSDN because

    • It is an expensive proprietary-software/proprietary-data product from a monopoly. I don't mind paying for things, but monopolists by definition charge higher prices than competition would allow, and Microsoft is such an extensive monopoly that their competitors in virtually every area of software are forced to pay them for information that should be included with the operating system for free.
    • It is a hodgepodge of articles, samples, etc., similar to what one finds if one searches the web: 1000s of hits, very hard to find what's important. Nothing wrong with that as one source (heck, I search the web too) but alone that is not enough. What would be better would be if the source to the API call in question were available.
    • Because it is a monopoly product, they can get away with all sorts of less-than-optimal functionality, including sleazy little things like the way many MSDN links invisibly sneak you onto their website... what if I work at a competitor and I don't want Microsoft to know what we're working on? They go to great lengths to let you view only, with no copying or linking outside of the product. Much of it is in HTML, but can I use my favorite browser? They go out of their way to stop you from doing that, all to extend their monopoly.

      Have you ever worked on an MSWindows-platform product that was competitive with Microsoft? I don't understand why all the Microsoft-boot-licking engineers we see here -- that by no means includes everybody, I'm talking about the people who defend Microsoft with an enthusiasm out of all proportion -- don't resent the many hours they spend searching for a work-around to some poorly documented Microsoft bug, while Microsoft's engineers laugh and sneer at you behind their phony smiles. Are you guys sheep? It's like the athletes from a lesser team volunteering to help rub-down the stars from the dominant team before the game. "Oh, OJ, you're so strong!" I don't get it.

  • So name a bunch of OSS projects that don't rely on people working for free.

    Irrelevant to the discussion at hand, in which it was asserted that Open Source means writing software for free.

    For a more pertinent reworking of your question:

    Name a bunch of non-OSS projects that enjoy the advantages of people working (coding, debugging) on them for free.

    The point being that OSS projects tend to attract lots of free help because there's lots of people out there willing to offer it. Help that closed-source projects typically refuse at the front door.

    That being said, I'd guess there are one or two significant open-source projects that are planned to achieve success without relying on free coding help.

    But, I can't say what they are. (Best guess offhand: IBM Jikes compiler.) Maybe there aren't any.

    Even if there aren't any, that certainly doesn't mean nobody doing OSS development gets paid quite well for doing it, which is what was, in effect, asserted in the original post to which I replied.

    (BTW, even Microsoft relies on lots of free help, in the form of beta testing, etc. I doubt MS could release a quality OS without relying on such free help. The question is, can it do so without relying on free coding help, the sort of help Linux and the BSD's leverage so successfully?)

  • it is not about writing software "for free"
    ... but it is about releasing all rights you might have to the software you've written.

    False. You still have some rights. You've given up the right to restrict other people copying it, though, which is indeed a big restriction.

    it is not about not being paid a lot of money for writing it
    ... although it is about having abuse heaped upon you, your company and your product if you don't give in to the demands of the mob.

    Huh? Sounds like an emotion-laden argument to me. So you're saying Bill Gates would get more abuse heaped on him if MS released all its software under the GPL? Hmm.

    it is not about giving your competitors the ability to put you out of business
    ... unless, of course, your business is writing software.

    False. More correctly: "unless, of course, your business model assumes the ability to charge a lot of money for a substantial percentage of the copies of the software that actually get distributed."

    Sigh. Same old, same old... There are any number of extremely convincing arguments for using and supporting open source; to spout the useless and emotionally-charged rhetoric you did contributes little or nothing to the discussion at hand.

    First, I didn't pretend to offer a comprehensive explanation for why to use open source. In fact, I strongly suggested the poster take the time to research the issues.

    Second, I didn't "spout" anything "useless" and especially nothing "emotionally-charged". All I posted was simple fact, into which you read (not having particularly good comprehension skills, I would guess) your own agenda.

    Note that I didn't say there weren't challenges to be addressed when writing software for open-source distribution. But the poster had essentially claimed things about such an approach that were incorrect, and I was offering some simple, direct statements to illustrate that, and suggesting he follow up with some actual research.

    It does not answer the questions raised by the original poster:

    Of course not, since the poster asked basically no questions, he just spouted some emotion-laden rhetoric (e.g. about having to pay bills) that I explained was false on several counts. So, speak for yourself, if you have questions to ask.

    Once we invested time, effort and money to write this software, how can we avoid having someone larger than us profit from our labors at our expense?

    There are many ways to consider, none of them perhaps quite as easy as resorting to the proprietary model. Ask dress designers (fashion) how they do it -- for all intents and purposes, that field is inherently Open Source, yet they make billions (and, yes, their products are functional as well as artistic). There are other fields that don't allow the equivalent of proprietary software distribution as well. Study them, if you would like to consider whether OSS is worthwhile (before your potential customers require it, of course, at which point you have no choice).

    What is the economic model that allows me to profit from my knowledge and skills, since you insist I am to give away the fruits of my labor for free?

    I insisted on no such thing. Scratch the above "First", i.e. here's item Zero for you: learn to read.

    As far as answering your question, why don't you investigate the various models already in use throughout various industries, not just computing? (And if you think the source code for your product necessarily constitutes 100% of the "fruit of your labors", you need to do some reading up on the OSS industry, and learn why all of its maintenance problems are not 100% solved due to availability of source code, for example.)

    Who would write this highly-specialized software if we didn't, and why would we do it if there was no incentive for us to do so?

    People with incentive. Say, the potential customers, who, if they insist on OSS only, will find a way to fund its development themselves. Then you can accept the funding, develop the software, and make $$ maintaining it. That's just one model, but it works.

    These are serious questions, from someone who is at least willing to listen to an explanation... and instead of trying to answer his questions, or explain to him why you believe what you do, you deride him for not being a believer. I'd say that I'm staggered and shocked by your arrogance, but unfortunately, it appears to be all too common among open source advocates in general, and /. trolls in particular.

    Excuse me, but it was the poster who arrogantly made unsupported, unsubstantiated claims about what OSS development implies. All I did was say "wrong; wrong; wrong" and point him in a better direction. And I didn't deride him for not being a believer -- just for making false assertions in stating why he holds his beliefs. E.g. I don't deride someone for saying they believe in Santa Claus, but if they claim he lives at the North Pole and satellites have verified that, I might take issue with that claim.

    If people want answers to questions, they ought to ask questions, rather than state falsehoods. The latter can work, but, in a public forum, it's a sure way to showcase one's ignorance and/or arrogance.

    (For example, if you weren't so interested in trashing OSS advocates like myself, you'd have just asked the questions, instead of ranting about the content of my post.)

    So here's a question I have for you. You have a great idea for a new software application, believe it'd be worth $2B in the next five years, and would take only two years to develop via the proprietary model (up-front funding, hire programmers, closed development, etc.).

    Problem 1: you can't find good programmers, because they are unwilling to work for such low pay on software they can't maintain themselves after they leave your employ.

    Problem 2: during market research, you discover your potential customers are no longer willing to pay much of anything for a copy of non-GPL'ed software, due to having been burnt by lock-in software so often in the past.

    How do you make your idea happen using the proprietary model, when you can't hire decent programmers and won't be able to sell the final product anyway?

    Now, this isn't likely the case today, but might be for your product's market within a few years. The problems are a bit overstated, but all that has to happen is a) programmers worldwide decide they deserve vastly more $$ to work on proprietary software, due to their not having a long-term relationship with it vis-a-vis OSS, and b) while a) drives your costs up, customers become less and less willing to pay lots of $$ for each copy of proprietary software, figuring that if they would pay lots of $$ for a copy, it must be important to them, and therefore it should be OSS so they don't get locked in to it, which drives your potential revenue stream down.

    Next question: besides the problems described above, do you see any opportunities in this scenario?

  • I'm not sure we can apply this to the whole Linux vs Microsoft thing, other than to say that a new modality changes the whole landscape. But I guess that's what Open Source is all about. In this case, we're the bee.

    More precisely, Open Source software is the bee, we're the wings, and our efforts testing, debugging, and improving software constitute the turbulence that Closed Source development tries hard to exclude but Open Source development, ideally, welcomes with open arms.

    Be the bee, be the wing, be more Open with your sting!

    Okay, I just made it up and maybe it's a bit lame, but it could inspire somebody to come up with an improved version. ;-)

  • If you want something to be open source, get off your ass and write something that's open source.

    Perhaps the people who are being loud about this issue are OpenSource authors? (I am [openverse.org])

    I for one am not interested in writing a new /. I am interested in community involvement in IMPROVING this one, which is impossible without ongoing availability of the /. source.

    I cannot speak for everyone, but I don't think anyone is interested in the code to "steal the slashdot crowd and relocate them to a new forum", nor do I think this is even possible. The net has shown that he who comes first, leads (amazon.com's auction site is a good example, as it lags behind eBay even with the troubles eBay has). Users become loyal to the first comers, and I include myself as a loyal slashdotter (for now)

    What I am shouting about is the hypocrisy which runs rampant from a site which is making money off of our OpenSource model while refusing to participate in it. I'm glad they're making money, don't get me wrong... It's a good thing to have money. It's a bad thing to have made that money off of someone else's idea and then to defecate on the idea with comments like "ask me again and I'll delay it again". At the very least, Andover should make a public announcement as to IF, WHEN, and HOW the source will be released and stick to that schedule. They have nothing to lose, and everything to gain. There are other discussion forums out there. Some may even argue that they are better, more user friendly, etc... But we WANT to be loyal to /. /. is making it hard on a lot of us to remain loyal.


    They are a threat to free speech and must be silenced! - Andrea Chen
  • Trying to judge Microsoft on patch management is a hit-and-miss situation at best. For how much MS claims that its programs have to be part of the core OS, it's obvious that product groups aren't talking much to each other. For example, take Windows NT. It's nowhere near perfect, but you can pretty much count on Service Packs every now and then that have some degree of regression testing. This is at least a passable patch strategy.

    Is it anywhere near IBM or HP for OS patches? Hell no. MS is where HP and IBM were 7 years ago when it comes to patching and bugs.

    Let's take a look at MS at its worst: Microsoft Outlook 98. As Steven Webb of Microsoft Technical Support described it, the patch "strategy" went a little something (paraphrased) like this:

    "I have this printing problem. [Describes problem]

    Well, you know those security patches, or the archive patch? It should really be considered a service pack. See, it has about 150-odd fixes inside of it.

    Is that documented somewhere in Technet?
    No.

    So it's been fully regression tested right?

    Yeah, sure...that's the ticket"



    Luckily, I was a premier support customer. Basically, you pay MS a boatload of money and they assign some dude to you who is supposed to be dedicated just to your company, yet is never at his desk to answer your call. However, you do get to see all the neat little comments in TechNet that are marked confidential.

    For MS, a confidential note is usually the exact steps it takes to reproduce the problem. You can't be letting non-support-contract customers figure out what that intermittent problem is and demand a free hot fix. No sir!

    "You owe me thirty-five dollars!"
    "I don't have a dime."
    "Didn't ask for a dime, thirty-five dollars...cash"

    Other common confidential notes indicate Y2K fixes that are undocumented, or other problems that will get fixed when the hot fix is applied.

    The best part was when I asked the tech, "Exactly what is fixed in the security patch you are recommending?" To which I was told they don't have that information. Even after escalating, MS has yet to document exactly what it fixed in any of the Outlook patches. They said they would do better with documenting patches in Outlook 2000. The best solution I got was to install the patch, check the dates on the DLLs, then correlate those to the file dates documented in the 150-odd hotfixes...

    Long story short, I don't think it's only a matter of open source vs. closed source. It's how committed the vendor is to fixing the problems and how open they are about it. MS in general lacks consistency across product lines. In many cases it seems that patches don't come out when they are needed, but rather when the press puts a problem in the spotlight. And as long as MS pays for IT managers and CIOs to spend a week in Seattle getting brainwashed, I don't think that is going to change.



    --
    Gott'a run, time to reboot the NT box.
  • Please talk more about the bumblebees. These are the threads that make Slashdot a wonderful place!

  • You can audit the source code yourself for security vulnerabilities (overflows). Plus, the code is audited by security hobbyists/professionals all over the world.

    Well, what about the possibility of a very clever open-source developer actually adding a subtle security vulnerability intentionally while helping out on a feature or two of linux? Then when some huge ecommerce company starts using the version... BAM! He strikes and takes all the credit cards in the database or takes whatever else of value there might be to take. Does anybody know if this has ever happened or at least been tried before?
  • I believe good code has an average of 1 bug for every 1000 lines. Windows 2000 has how many lines? 30-40 million (I'm guessing here). So we're looking at 30-40 thousand bugs. And that's if it's good code, and not rushed-out-the-door, sort-of-tested stuff.
    And yes, for all you Windows people out there, if you reboot your machine every day of course you don't have problems. But seriously, what sort of reliability is that?
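    A back-of-the-envelope sketch of that arithmetic in Python (the 1-bug-per-1000-lines rate and the 30-40 million line count are the guesses above, not measured figures):

        # Rough defect estimate from the assumptions quoted above:
        # ~1 bug per 1000 lines of "good" code, 30-40 million lines for Win2000.
        defects_per_kloc = 1
        for mloc in (30, 40):
            bugs = mloc * 1000 * defects_per_kloc  # millions of lines -> KLOC -> bugs
            print(f"{mloc}M lines -> ~{bugs:,} latent bugs")
        # 30M lines -> ~30,000 latent bugs
        # 40M lines -> ~40,000 latent bugs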
  • Don't be too quick to judge based on the statistics Security Focus gave:

    Looking at their results, the time to fix 50% of the bugs is 4 days for Red Hat and 3 days for Microsoft.

    After 1 day, Microsoft fixed 42% of their bugs. Red Hat only 29%.

    I know I'll probably get moderated to hell for this, but the simple fact is the "average" statistic tells nothing at all. What the results seem to be saying is that Microsoft is faster on simple bugs (probably better distribution channels) though they fail on the more difficult bugs (probably more complex code, but who can tell without the source).

    Actually, there is some discussion of this point in the article, and your interpretation of the statistics is probably more of a distortion than the "average" time presented in the article. One point specifically made there is that the time it actually takes to fix a bug is not well captured by their statistics: the time they measure is the time between when the security hole is generally known and when it's fixed, not between when it's first discovered and when it's fixed.

    A significant percentage of the security holes are discovered, worked on, and not publicized until the bug fix is already available. It's literally a case of "We found this hole and here's the patch," and not one of actually fixing the bug in less than a day. Apparently, about 42% of Windows holes and 29% of Linux holes fit into this category. In that respect, the statistics are much more favorable to Linux than to Microsoft. They actually mean that only about 14% (8% of all bugs vs. the 58% not announced and patched simultaneously) of holes in Windows that are publicized by someone other than Microsoft are patched within 3 days. In Linux, though, about 30% are fixed and available as RPMs within 4 days.

    That also means that the average time for bugs not announced by the respective vendors is actually longer than the averages presented. The average time to fix a non-vendor-announced bug is more like 27 days for Microsoft and 17 days for Red Hat. Since the non-vendor-announced holes are the really scary ones (the vulnerability is known and there's no available cure), that's a more reasonable comparison.
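    For the curious, the whole re-derivation fits in a few lines of Python. The totals and same-day percentages are the ones quoted in this thread from the article; treating "announced with the fix in hand" advisories as contributing zero days of exposure is a simplifying assumption, which is why these land near, rather than exactly on, the 17- and 27-day figures above:

        # Re-derive per-advisory averages from the SecurityPortal totals.
        # Assumption: same-day advisories (fix shipped with the announcement)
        # contribute ~0 days, so all the "hacker recess" days fall on the rest.
        figures = {
            # vendor: (total days of recess, total advisories, same-day fraction)
            "Red Hat":   (348, 31, 0.29),
            "Microsoft": (982, 61, 0.42),
            "Sun":       (716,  8, 0.25),
        }
        for vendor, (days, advisories, same_day) in figures.items():
            exposed = advisories * (1 - same_day)  # holes published before a fix existed
            print(f"{vendor}: {days / advisories:.1f} days/advisory overall, "
                  f"~{days / exposed:.0f} days across {exposed:.0f} pre-announced holes")
        # Red Hat: 11.2 days/advisory overall, ~16 days across 22 pre-announced holes
        # Microsoft: 16.1 days/advisory overall, ~28 days across 35 pre-announced holes
        # Sun: 89.5 days/advisory overall, ~119 days across 6 pre-announced holes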

  • Not really. Most people have known and said that Open Source would respond to bug-fixes faster than closed source models. I believe the -average- turn-around time for some parts of the Linux kernel is 24 hours or less.

    Is this important? IMHO, yes. Never mind corporate interest, something like the Bastille Project, OpenWall, Trustees, ACL, or even ReiserFS, could -never- have been written by 3rd parties if Linux had been closed-source, and many of these packages might never have been written at all.

    100,000 developers & debuggers is a lot more than Sun, Microsoft, IBM and Apple can muster, combined. Why should it surprise anyone that such a large number can out-perform such small companies? *Note: The 100,000 is an estimate of the number of people who are on the kernel mailing list, plus the number who aren't but who play with the development kernel and will write bug reports in the event of finding any.

  • As per your request, hugely successful pieces of Open Source software, one list of:

    • BIND
    • MySQL
    • Apache
    • Berkeley Software Distribution, Berkeley Tapes
    • Gnu C/C++
    • X11, Revisions 4 through 6
    • Sendmail
    • Listserv
    • LBL's Traceroute
    • PHP
    • SATAN security scanner
    • Linux kernel
    • [Free|Open|Net]BSD kernels
    • IBM Secure Mailer (Postfix)
    • Tcl/Tk
    • Perl
    • Python
    • UCL VIC/RAT/SDR
    • PGP 2.6/GPG
    • SSH/OpenSSH
    • Sun RPC system
    • Sun OpenWindows
    • MPI
    • PVM
    • Pine
    • NCSA Web Server
    • CERN Web Server
    • IRCII client and server
    • MiniVend
    • OpenCart
    • GateD 3.x
    • MRouteD
    • MRT
    • PIMD
    • PIMDD

    Additions and corrections welcome. And I don't require payment for them, either. :)

  • by Malc ( 1751 ) on Monday January 17, 2000 @01:11PM (#1364476)
    "These people generally cite certain prime examples to support their arguments.

    And we all know THAT is a really dumb way to support arguments. Better to just yell and scream. "


    Let me clarify: everybody seems to use the SAME examples. A couple of sunny days doesn't make it summer.


    "and you actually support linux"

    You've got that one right (I started with Slackware on a 1.1 kernel). Unfortunately I work from home in an NT environment, so other than web browsing, I've had little incentive to reboot into it for a while. I like the way Linux gives me back control of my machine. But in reality, NT is very stable (I haven't had any need to reboot in over two months), and it fills my requirements equally well.

    "find a more popular web server than Apache (suprise, open source). Seems PERL is pretty open"

    I'll give you that. I think the current popularity of perl is partly due to the success of Apache. Thanks to the AC's response for the Netcraft address (http://www.netcraft.com/survey/ [netcraft.com]), which demonstrates Apache's rise to popularity at a time when there was no real competition (the NCSA server quickly proved insufficient). Apache, like many of the other UNIX tools that have been the glue of the internet, is popular because it was among the first successes. As Linux finds it difficult to enter the desktop market, so Microsoft finds it difficult to dominate the internet.

    I think that Apache's popularity will decrease in favour of IIS over the next few years. Whether Microsoft will pass Apache in popularity is another issue, and will certainly be a key point in the open vs. closed source debates.

  • BSDI had a fix on day one or two of the F00F bug announcements. Sources reported that the BSDI fix was reverse-engineered to make a Linux fix. Days later, BSDI came up with improvements to their fix (first enabling it only on Pentium chips, later improving performance even on those systems affected).

    BSDI made their fix first because they had information about the bug direct from Intel (under NDA, before the bug was announced to the general public). They were forced by Intel to remove the fix they posted almost immediately because it violated the NDA. The Linux fix, IIRC, was not reverse engineered from the BSDI fix, but was a separate effort that worked in a slightly different way, without the help of Intel's additional info. As I understand it, BSDI's fix was later reengineered to behave in the same way as the Linux fix.

    PS. I'm not knocking BSDI here, who I think make a great product. I'm merely correcting misinformation (at least, I think I am -- my memory's not great, and I'm too lazy to search the Linux kernel archives to find out for sure :-)

  • They should have, IMHO, tracked how bug fixes made it into production environments. The chief complaint I hear now is "I have to depend on some kid in Nebraska to fix his software, or at least get the patch happening and released?" (no offense to anyone from Nebraska). I had this very conversation at a meeting where I was in a kind of cheerleader/salesman mode for Linux/OSS.

    I think that ANY commercial Linux distributor should, as point one of the business model, establish a means to rapidly and loudly manage bug fixes and updates. Hell, call 'em "Service Packs" so the PHB's will understand what you're talking about. Coordinate with the developers. Try to create a "path of least resistance" for people, esp. those that don't care about technology, just that they can have it fixed. Hire a couple of people to deal directly with developers and customers.
  • by Admiral Burrito ( 11807 ) on Monday January 17, 2000 @11:29AM (#1364479)

    From the web page. Apologies for the formatting.

    1999 Advisory Analysis
    Vendor      Total Days of Hacker Recess   Total Advisories   Days of Recess per Advisory
    Red Hat     348                           31                 11.23
    Microsoft   982                           61                 16.10
    Sun         716                            8                 89.50

    Interesting that Red Hat comes out better overall than both of the others, even though Red Hat is generally considered the least secure distribution of Linux available.

  • by um... Lucas ( 13147 ) on Monday January 17, 2000 @11:20AM (#1364480) Journal
    Microsoft Windows users use their computers in multitudes of different ways from Linux users. While a Linux user would know to use "shutdown -n blah blah blah", a Windows user might just hit the power switch. That's not a direct comparison, but really, I'm sure that there are some flaws in Linux that just haven't been found because Linux users operate their computers (for the most part) in a different manner than Windows users.

    Also, it doesn't help that with their popularity, Microsoft draws the fire of every scriptkiddie, security wannabe, etc, who all want to be the first to find a new bug and either exploit it or publish the fix.

    The article seems to be slashdotted right now, so I can only speculate here... But I'm not convinced that Open Source produces cleaner code. It just allows you to have multitudes of people available to fix flaws AS THEY'RE FOUND. In other terms, though, I think that closed source still has some advantages in a completely different context from security: more direction, rather than having everyone run around coding whatever it is they want to code.

    Here's hoping that Microsoft lives up to its promise of not shipping Win2000 'til it's ready.
  • by seebs ( 15766 ) on Monday January 17, 2000 @01:50PM (#1364481) Homepage
    FWIW, BSDI had a fix on day one or two of the F00F bug announcements. Sources reported that the BSDI fix was reverse-engineered to make a Linux fix. Days later, BSDI came up with improvements to their fix (first enabling it only on Pentium chips, later improving performance even on those systems affected). I assume the Linux folks did too.

    Solaris and MS took weeks, plural, as I recall.

    Conclusion? Competent engineers who care make for faster code fixes too.

    (Disclaimer: I work for BSDI, but honestly, if I didn't really think their engineers were that good, I wouldn't work here either.)
  • by Frac ( 27516 ) on Monday January 17, 2000 @07:40PM (#1364482)
    It's a very offtopic 5, but I think it points out a very big hypocrisy on /.'s part.

    apparently, the moderators also believed that this hypocrisy (especially in the light of Rob's obnoxious answer in the /. interview) should be moderated up for more people to see.

  • by powerlord ( 28156 ) on Monday January 17, 2000 @11:34AM (#1364483) Journal
    Of course, by the same token, there are a fair number of closed source commercial products that have been dead-ended due to lack of developer support. In those cases there is even less likelihood of bug fixes coming out.

    In the Open Source community there is always the possibility that, so long as a project is useful, someone may pick up the torch and keep maintaining/developing it beyond the point that the initial author(s) is/are involved.

    Case in point would be MetaCreations. They produce quite a number of 3D modelling/animation programs (Poser/Bryce/Painter/Ray Dream Studio). There have been recent rumors (it may be fact by now) that they are dropping support and development of all these packages in favor of a 'web-enabled' product called MetaStream (i.e. producing content for a web plugin), and a 'high-end' product called Carrara.
    The current install base either has to pay a much bigger than usual (and for some, unaffordable) 'upgrade' fee to switch to Carrara (versus the modest fee charged to purchase a new version of software they currently own), or else stick with Ray Dream Studio and any bugs that remain.

    If it were Open Source, then there would be more possibility of bugs being fixed even if a 'new and improved' product came out, simply because some people would both care, and have the code to do it.


    Colleen:Its a black-hole.
    Hunter:Is that a good thing?
    C:It is if you want to be compressed into oblivion.
    H:Oh.. coooool.
  • by mOdQuArK! ( 87332 ) on Monday January 17, 2000 @11:24AM (#1364484)
    A nice example of proof-by-anecdote.
  • by NateTG ( 93930 ) on Tuesday January 18, 2000 @12:19AM (#1364485)
    I suppose this is karma bait, but assuming that the same-day fixes are all "friendly" (a reasonable assumption, considering the inertia of companies of these sizes), then the following figures come out:

    Red Hat:   348 days for 22 fixes = 15.8 days per fix
    Microsoft: 982 days for 35 fixes = 28.1 days per fix
    Sun:       716 days for  6 fixes = 119.3 days per fix

    Also Red Hat had 29% (about 1/3 for those non-math inclined out there) friendly bugs, MS had 42% (~2/5) friendly bugs and Sun had 25% (1/4) friendly bugs.

    Draw your own conclusions.
  • by solar ( 94732 ) on Monday January 17, 2000 @11:27AM (#1364486)

    I thought it was interesting that although Red Hat came out on top by a decent margin, the article said that Red Hat could be even faster if they paid more attention to the community.

  • by eric.t.f.bat ( 102290 ) on Monday January 17, 2000 @01:31PM (#1364487)
    Billy, Steve, and all the rest
    Try to make their code the best,
    Flapping hard against the storm,
    Going nowhere, getting warm.
    Linus T. and RMS
    Found the way to true success:
    Hiding errors makes no sense;
    You can fly with turbulence!

    : Fruitbat :
  • However, I do not think that all software is suited for open source. The last company I worked with developed database marketing software. Nobody is going to work on that kind of software for free. If we'd opened the source, then our already established (and far bigger) competitors would have been able to leverage our work in putting us out of business. I don't work for free; I have to pay bills. In fact, I'm quite happy to be paid a lot of money for what I do.

    Maybe someday you will understand what Open Source Software (OSS) is about well enough to not make such strange statements.

    Here are some clues: it is not about writing software "for free"; it is not about not being paid a lot of money for writing it; it is not about giving your competitors the ability to put you out of business.

    And here's the biggest clue the market has yet to particularly appreciate: once customers of software come to appreciate the benefits of insisting on Open Source, and assuming they therefore do insist on it in greater numbers, you won't be able to get paid well for writing software that isn't Open Source.

    Keep in mind the fact that the huge valuations of RHAT and LNUX on NASDAQ; the success of GNU/Linux, Apache, etc.; and so on have all taken place with nearly 0% of the end users of software insisting on Open Source per se. All those end users have cared about so far is faster, better, "cooler", more reliable, etc., all of which Open Source can deliver, more or less.

    Once it becomes clear that Open Source per se delivers advantages closed source cannot -- advantages that can trump all the disadvantages (slower, fewer features, etc.) in any given instance -- how will you make money writing closed-source software?

    Really, do take the time to investigate just how much money people are making writing OSS, how much more freedom we have to change jobs and still take our expertise, even our code, with us to the next job, and so on, before you make public statements about whether OSS is "suited" for certain kinds of software development.

  • how much more freedom we have to change jobs and still take our expertise, even our code,

    I should probably clarify my "even our code" statement.

    It refers to the fact that, of the various products whose source code I've been an "expert" in maintaining during my career, I can be hired to maintain that code by a large percentage of its users only when the code was distributed Open Source (more specifically, GPL'ed).

    Put another way, closed-source development isn't just about locking in customers -- it's about locking in the vendor's programmers as well, by making it harder than it would otherwise be for them to get good jobs maintaining the same code elsewhere.

    So if you're still using PRIMOS, or Numerix machines, or you use Cadence's NC-Verilog on Suns, it really won't help you much that I've got some expertise maintaining portions of those systems, unless you've paid lots of extra $$ for the source. The $$ I was paid by the respective companies (Pr1me, Numerix, Cadence) to work on that code was, indeed, decent, but is about all I can expect to ever earn for that particular expertise.

    But if you're using GCC, g77, etc., you can hire me to work on them. (In theory, anyway; I'm not exactly looking for work these days.)

    That makes OSS development more valuable to me as a programmer than closed-source software development, and in fact it gives me incentive to favor long-term viability of OSS products over the sort of short-term focus that has so characterized closed-source products over the past 20 years.

    I.e. if I can profit from closed-source development for only the duration of my employment at, say, BigSoftwareCo, then it behooves me to maximize my salary & benefits during that duration, leaving it essentially up to BigSoftwareCo to decide how it will maximize its long-term viability. Since I can profit from OSS development for the duration of the practical life of that software, I have more incentive to make it live a long, healthy life, even if that means making less $$ in the short run (not necessarily always the trade-off I have to make, but one I've willingly made a few times already).

    So, as an OSS developer, I'm more interested in making sure the software is, and remains, useful for a long, long time (ideally, with as few changes made by others to my own code as necessary, so I have maximum expertise in it).

    Though I've personally applied similar incentives when writing closed-source software, it hasn't been due to financial or most other incentives, because there really aren't any. In fact, one of the reasons I was attracted (back) to OSS development is that I could continue to apply my own sense of ethical software development while being able to gain some potential for financial reward for it (for a change), or at least while not being punished (say, by management) for doing things such as taking extra time to make sure the software works correctly and as documented.

    (Not that there aren't all sorts of similarly bone-headed people in the OSS movement pushing for "gimme what I want now, you worry about making it work right later" from time to time. But I don't report to them. Besides, it's usually easier for the general public to see these sorts of discussions going on in OSS development than in closed-source development. That allows people to come to more informed conclusions regarding, e.g. the long-term viability of proposed extensions.)

  • by belgin ( 111046 ) on Monday January 17, 2000 @11:26AM (#1364490) Homepage
    If you look at what they were comparing, the numbers they found were pretty much in line with what one would expect. The Linux systems are always going to be fixed fairly quickly, because there is a distributed network working on the problems as they crop up. This is the whole point behind open source software.

    Microsoft does amazingly well when you stop to consider what they have to work with. Their code is probably very complex due to the requirements of backwards compatibility and interaction along unusual connections between types of software. They only have a comparatively small number of programmers working on it at a given time, and they get the hot seat as soon as there is a problem. Everyone in the business world simultaneously expects perfection and low quality from MS, so that they can bitch about something all the time. When you consider the strains they have to deal with, they are doing very well.

    I work for IBM, and Sun is one of our big competitors, so I can't really say anything without risking excessive personal bias. However, I suspect that people are less inclined to roast Sun for every security breach, as there are fewer personal users than for either of the other two systems.

    B. Elgin

  • by re-geeked ( 113937 ) on Monday January 17, 2000 @02:05PM (#1364491)
    Which brings me to an OT rant about small ISV's and (wait for it) open source:

    The consulting company I work for (and many others) makes a lot of money fixing problems created by some dork who was too stupid to realize you can't start your own software company. ;) Usually, the story goes like this:

    Dork writes app with some puny but vital business purpose (If unethical dork, insert "on customer's time" here.), invariably in a lame-ass tool that really should only be used to handle smallish recipe files.

    Dork manages to sell to one big client where an in-law works.

    Dork makes major release and generates marketing pamphlets for distro at industry trade shows, which promise that the app will have user docs and a real-database port Real Soon Now.

    Meanwhile, the app really hasn't progressed beyond "almost alpha" at the one big client who is paying for "integration services".

    Hapless company (my future client!) correctly decides that buy is better than build, but incorrectly assumes that there must be a decent package out there for this. Someone at hapless company randomly stumbles upon dork's marketing pamphlet or web site, and buys in.

    Hapless company is promised by dork that software is ready to go, and the "integration services" should only last a month or so.

    Fast forward six to eighteen months (depending on client IQ)...

    Client has no system, or worse, has a crippled system and turned off or stopped paying for the old one, has spent hundreds of thousands on dork hours alone, has no docs on how to install, operate, or fix, has no source to allow me to fix or diagnose it for them, has no database schema (and often a dork-encrypted/proprietary database), is paying more thousands for staff to babysit and undo the misdeeds of the app, has dork saying that he can't spend any more time with them (assuming his number is still listed), etc.

    Basically, the only good news is that hapless company didn't bring Andersen Consulting in to do the app :-)

    So, how would this have improved with Open Source?

    Well, dork could have:

    Had input on his app from client or consultant help,
    Started a bit smaller before hapless company showed up,
    Made continuous improvement to his app based on experiences at first big client,
    Gotten paid the honest way, for development services, rather than for vaporware,
    And eventually have built a user base big enough to handle the Hapless account smoothly.

    Meanwhile (and, per RMS, more importantly), the client could have:

    reviewed dork's code before betting the company on it,
    brought in extra help that would really fix problems, not just clean up after them,
    been assured that they could still improve the app after dork fled the country,
    had a consultant provide necessary docs and schema,
    and been part of a community of users that would work together to improve the app, whether dork was there to help or not.

    Or, at least I've heard it could work this way...
  • by jjsaul ( 125822 ) on Monday January 17, 2000 @11:35AM (#1364492)
    I used to code at a small accounting software company, and saw the worst side of this issue. Starting with the totally irrational resistance of the luddite owners to posting patches on the web site, there was an economic disadvantage to releasing patches when another (due to city tax schedule releases) would be out a few weeks later: shipping 10,000 diskettes was a substantial cost for a small company. Add to this the understaffing of the testing department and tech support, and even the refusal of the owners to allow us to use point designations for patches (on the theory that it advertised how many times it took to get it right), and you can imagine the confusion and frustration.

    I don't (oh blasphemer!) think open source is the solution to every problem, but I'm sure that my prior employer wasn't the only sociopathic corporate greedhead torturing employee and customer alike. I gave them a lot of unsolicited legal advice (why I'm no longer working there ;-), such as this: if a company knows of a material defect in their product and conceals it from the consumer, resulting in losses to the consumer, said greedheads are liable under the higher standards of gross negligence, recklessness, or even intentional tort, resulting in statutory treble damages or unlimited punitive damages in some circumstances.

    Of course it is common in the industry to hide bugs as long as possible, under the mistaken idea that quietly fixing the bug in a later release saves consumer goodwill by avoiding embarrassment. Sometimes the lag between discovering a problem and coming up with an assured good fix is even justified. But maybe what we need is a good Pinto case, wherein the bean counters at Ford decided that the cost of adding an 8-cent plastic cap to a bolt in front of the gas tank was more than the projected liability from immolation deaths per year. The jury award was a record at the time, nailing Ford for hundreds of millions in punitive damages to demonstrate the moral repugnance of such calculation. Something to think about, at least.
  • by WinTired ( 125929 ) on Monday January 17, 2000 @12:06PM (#1364493)

    The article clearly focuses on plugging security holes, which is just a subset of the vast debugging space out there. Sure, this may be the main concern of a sysadmin, but what about the 95% of us who do not admin for a living? Would security be our prime concern? Would an all-bugs comparison bring the same results?

    Most bugs are just annoying, but some make you waste time, some lead to wrong results with varying consequences, and some lead to data loss. I have never seen an advisory or a mailing list dealing with this kind of bug. I *know* there must be some, but the point is it's so much easier to be informed about security gaps. Isn't anybody paying attention to overall quality, or is this just a natural PR reaction to the known preference mainstream (even underground) media has for security holes, given their theft/trespass-inviting nature?

    It's easy to understand one's motivations to code, but we just debug because we *have* to. So, if these smaller bugs are something software can live with (mainly software planned to last only for a certain period), what would be the motivations for real debugging?

    I am not saying that an intense debugging effort means quality (maybe even the contrary is true), but if the only motivation to take corrective measures is pressure from consumers/clients who can have sensitive data compromised, then we will continue to use buggy software.

    OTOH, when pride, reputation and commitment enter the scene, then we do our best to excel. So, my answer to the question in the first paragraph is that OSS can have a response time orders of magnitude shorter than commercial products if we consider bugs in general, but that, if true, would be something hard to prove.


    -------------------------

  • by jbarnett ( 127033 ) on Monday January 17, 2000 @11:51AM (#1364494) Homepage
    First, let me say this isn't a flame, troll bait, nor any disrespect to the open source community or Red Hat software; it's just my opinion.

    No offense, but I don't think Red Hat is a fair representative of Open Source software, at least in this test here. The test is on open source versus closed source in terms of "turn around" on bug fixes.

    Linux/GNU, which is what Red Hat is pushing, is not coded by Red Hat. It is made by outside developers, and Red Hat only puts the product out. When Red Hat learns about a patch fix, they check it, package it and then upload it.

    The case goes like this: a bug is found in Linux/GNU and people are informed of it. Some guru fixes the bug and posts it to an ftp site. After a while Red Hat finds the fix, reviews it, packages it, documents it, puts it on its ftp site, then announces the fix is available.

    This extra step of Red Hat's is going to cost a lot of time for the open source community. It should not count from when the bug is found to when RH announces the fix; it should count from when the bug is found to when there is a working fix or workaround anywhere, in any form (even if it is source), on any ftp site.

    Red Hat can only work with what the community gives them. Say it goes like this:

    Day 1 - bug that opens a hole in program found in OSS

    Day 2 - nothing
    Day 3 - Maintainer and head programmer of XYZ announces a fix and uploads the source to ftp.

    The problem is fixed; a patch is available to close the hole and fix the nasty bug in XYZ software.

    Day 4 - nothing
    Day 5 - Red Hat reviews the code
    Day 6 - Red Hat tests the code
    Day 7 - Red Hat packages the code
    Day 8 - Red Hat documents the code and uploads it to ftp
    Day 9 - Red Hat announces there is a fix available for Red Hat users that are using XYZ software.

    Now, yes, I know this is REALLY dramatic, but I am trying to make a point. (BTW, I've got some REALLY good fishing stories.) Anyway, I haven't seen a bug in RH last more than 3-4 days at the very most (then again I don't use RH), but the time from fix to the time RH announces a fix can be, and is, drawn out. This could impact the study somewhat.

    Nevertheless, I still know Red Hat is going to kick MS's ass at bug-fix turnarounds, but the point is, raw OSS could do it faster and better than RH ever could. But the packages are a lot nicer with RH :)

    Again this is not a flame and I don't mean any disrespect to anyone.

  • by zlexiss ( 14056 ) on Monday January 17, 2000 @11:14AM (#1364495)
    Sometimes I wonder how many closed source bugs have been known before the bulletin/news went public, with the fix withheld until there was a known "problem". Which can make the response time seem really nice if you're just holding onto the bugfix and releasing at the right moment.

    And I'll still wonder what's with the legalese every bulletin has about "no known people being affected" by the security bug.

    zlxiss
  • by Tower ( 37395 ) on Monday January 17, 2000 @11:35AM (#1364496)
    Not to mention that Windows "security" has been notably poor about keeping people out of where they shouldn't be... you want to delete kernel32.dll or add some extra bytes to ifshlp.sys? It may ask you "are you sure", but it lets you do it... not too great.

    2k is supposed to have some provisions for not allowing other random progs to overwrite DLLs in system/system32 (which would be nice) - every random Joe Blow app should *NOT* replace system-wide DLLs. Ever. Even MS Office (are you listening, chief of software architecture?)

    Imagine installing BitchX or XAmp and having them overwrite parts of QTLibs, Xlibs, and, why not, glibc... our versions *have* to be better, right?

    Oh well...
  • by throx ( 42621 ) on Monday January 17, 2000 @04:16PM (#1364497) Homepage
    Don't be too quick to judge based on the statistics Security Focus gave:

    Looking at their results, the time to fix 50% of the bugs is 4 days for Red Hat and 3 days for Microsoft.

    After 1 day, Microsoft fixed 42% of their bugs. Red Hat only 29%.

    I know I'll probably get moderated to hell for this, but the simple fact is the "average" statistic tells nothing at all. What the results seem to be saying is that Microsoft is faster on simple bugs (probably better distribution channels) though they fail on the more difficult bugs (probably more complex code, but who can tell without the source).

    John Wiltshire
  • by MattMann ( 102516 ) on Monday January 17, 2000 @12:15PM (#1364498)
    Their metrics don't measure what's important. Open source has two huge bug-related benefits:
    • if you are a developer and something you are working on is not working, you can figure out why. you don't run into the problem of an unresponsive undocumented API. Look at all the crap MSDN-CD is full of, and how impossible it would be to get any work done without it, and how many little SDK idioms you need to resort to.
    • if a bug you encounter is important to you, but "unimportant" to other people -- because it is obscure, or the software is no longer being supported... many possible reasons -- you can fix it yourself or hire someone to. you are not dead in the water.

    Important bugs in important software will be fixed just about as quickly in either system: the 5 key people who know the source behave more or less the same way in open or closed source situations. It's the vastly larger number of other bugs that matter to most developers. And, as more and more developers realize this and enjoy working more on open source, it won't matter what the other guys think.

  • by cburley ( 105664 ) on Monday January 17, 2000 @11:47AM (#1364499) Homepage Journal
    As far as speed goes, big deal... give me a fix that works.

    Didja read the article? I know it was /.'ed -- I waited a longish time for it -- but it addressed the quality-of-fix issue pretty well.

    BTW, while I don't know for sure whether you're right that many OSS projects don't regression-test such fixes first, I do know the ones I've worked on could stand some improvement...and also that it's a bit easier to regression-test a fix to a small component than a large one, and that OSS thrives on collections of small components in a way Closed Source $$$-making development doesn't (the latter favors the development of monoliths, since they represent a harder-to-reverse-engineer, and therefore steeper, wall for competitors to climb).

    Also, the article made mention of various Microsoft-issued "fixes" that, themselves, had to be fixed. It didn't mention that happening with GCC, though it has happened there (not security fixes AFAIK, but the same principle applies); the implication was that the most heavily-funded closed-source development organization in the world doesn't seem to do too well producing correct fixes in the first place.

  • by cruise ( 111380 ) on Monday January 17, 2000 @11:26AM (#1364500) Homepage
    Taco, how can you find this information interesting while refusing to release the slashdot source via CVS and follow the Open Source model which so many other applications use?

    Your comment says "interesting" while you still remain uninterested in your users' demands to open the /. source.

    Slow /. bugfixes are more interesting, perhaps?


    They are a threat to free speech and must be silenced! - Andrea Chen
  • by wrook ( 134116 ) on Monday January 17, 2000 @12:13PM (#1364501) Homepage
    When I worked at a certain telecommunications company, we always joked about this. Bellcore specified that you needed a 30 day turnaround time for 70% of the bugs that were reported by customers. The only way we could reach this target was by introducing bugs with known solutions. That way it would only take 21 days to fix it (14 days to get to the right department and 7 days to verify the bug fix).
  • by Anonymous Coward on Monday January 17, 2000 @02:55PM (#1364502)

    I think I'd like to point Slashdot readers to a wonderful book: The UNIX Philosophy by Mike Gancarz. This book explains the tenets and values that traditional UNIX programmers have held. It goes on to list the 9 primary tenets:

    1. Small Is Beautiful
      • Software Engineering Made Easy
      • Small programs are easy to understand
      • Small programs are easy to maintain
      • Small programs consume fewer system resources
      • Small programs are easier to combine with other tools
    2. Make Each Program Do One Thing Well
      • By focusing on a single task, a program can eliminate much extraneous code that often results in excess overhead, unnecessary complexity, and lack of flexibility.
    3. Build a Prototype As Soon As Possible
      • The fact is, everyone is on a learning curve
      • Even the masters know that changes are inevitable
      • Why do you think they call it "software"?
      • Prototyping is a learning process
      • Early prototyping reduces risk
    4. Choose Portability over Efficiency
      • Next ----'s hardware will run faster (fill in "quarter", "year", whatever)
      • Don't spend too much time making a program run faster
      • The most efficient way is rarely portable
      • Portable software also reduces the need for user training
      • Good programs never die--they are ported to new hardware platforms
    5. Store Numerical Data in Flat ASCII Files
      • ASCII text is a common interchange format
      • ASCII text is easily read and edited
      • ASCII data files simplify the use of UNIX text tools
      • Increased portability overcomes lack of speed
      • The lack of speed is overcome by next year's machine
    6. Use Software Leverage to Your Advantage
      • Good programmers write code; great programmers "borrow" good code
      • Avoid the not-invented-here syndrome (i.e. don't reinvent the wheel just because you didn't invent it first!)
      • Allow other people to use your code to leverage their own work
      • Automate everything
    7. Use Shell Scripts to Increase Leverage and Portability
      • Shell scripts give you awesome leverage
      • Shell scripts leverage your time, too
      • Shell scripts are more portable than C
      • Resist the desire to rewrite shell scripts in C
    8. Avoid Captive User Interfaces
      • CUIs assume that the user is human
      • CUI command parsers are often big and ugly to write
      • CUIs tend to adopt a "big is beautiful" approach
      • Programs having CUIs are hard to combine with other programs
      • CUIs do not scale well
      • Most important, CUIs do not take advantage of software leverage
    9. Make Every Program a Filter
      • Every program written since the dawn of computing is a filter (it takes input data, and processes it somehow to possibly produce output)
      • Programs do not create data--people do
      • Computers convert data from one form to another

    As you probably guessed, Open Source _pushes_ Tenet 6 to the forefront. Let others use your code!

    Along with those primary, religiously-followed tenets, 10 lesser tenets are typically followed:

    1. Allow the user to tailor the environment (Yeah! This is seen plenty in Unix user interfaces)
    2. Make operating system kernels small and lightweight (OK, so Linux doesn't go to any extreme with this one...)
    3. Use lower case and keep it short (well-known and practiced throughout; lowercase is easier to read)
    4. Save trees (Why print out programs when you have less, gless, lynx, etc?)
    5. Silence is golden (make program output good enough for humans to understand, but terse enough for other programs to use!)
    6. Think parallel ("There is an old joke in the computer world that goes something like: if one woman can have a baby in nine months, does that mean that nine women can have a baby in one month?")
    7. The sum of the parts is greater than the whole (use small tools to build big projects!)
    8. Look for the 90 percent solution (don't exhaust yourself making it work in every nitpick situation)
    9. Worse is better (think VHS vs. Beta; VHS sucks but it won!)
    10. Think hierarchically

    The book also mentions something very important: The Three Systems of Man. Software goes through the First System, the "innovative" cycle where one or only a few people develop something revolutionary; the Second System, where committees are formed for the software so more people can feel they're contributing something to the idea; and the Third System, where the experts who left the scene during the second stage come back to implement the idea, now that the obvious solution for it is well-known and has been walked many times.

    CREDITS: This posting contains lots of quotations from, of course, the book: The UNIX Philosophy by Mike Gancarz, Copyright 1995 Butterworth-Heinemann. ISBN 1-55558-123-4 ... about $19.95. Well worth the money.
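
    To make tenets 2 and 9 concrete, here is a minimal sketch of a UNIX-style filter, written in Python (the language choice and the script itself are my own illustration, not an example from the book):

        #!/usr/bin/env python3
        # countmatch.py -- a filter in the spirit of tenets 2 and 9:
        # read stdin, do exactly one thing (count lines containing a pattern),
        # and write the answer to stdout so other programs can consume it.
        import sys

        pattern = sys.argv[1] if len(sys.argv) > 1 else ""
        print(sum(1 for line in sys.stdin if pattern in line))

    Because it reads stdin and writes stdout, it composes with everything else, e.g. dmesg | ./countmatch.py usb | mail admin, without countmatch.py knowing anything about dmesg or mail.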

  • Disclaimer: I work at Andover.Net.

    People, calm down. We have our best perl coders here slaving over the Slash release. Patrick, Rob, and Pater are trying to convert their undocumented code and database schema into something that can be installed on other machines besides this one. The Slash code really is hardcoded in many ways, and they are trying to unhardcode it for you right now. But they very much appreciate your flames, so please keep 'em coming. =)

  • by Rombuu ( 22914 ) on Monday January 17, 2000 @11:41AM (#1364505)
    Sure, open source can get bug fixes out there faster... but it's not like for most open source projects anyone is going out and regression testing the fixes against anything to make sure nothing else is broken by the fix, etc...

    As far as speed goes, big deal... give me a fix that works.
  • by bero-rh ( 98815 ) <bero AT redhat DOT com> on Monday January 17, 2000 @01:14PM (#1364506) Homepage
    Redhat is kinda slow at getting out the official fix. For example, the linux kernel is at 2.2.14 but Redhat has not put out an official RPM yet, even though 2.2.14 contains a bunch of fixes

    Red Hat has actually released several 2.2.14 RPMs in Raw Hide [redhat.com], our more experimental version. If you want to be on the bleeding edge, use that.

    Also, check the source RPM for Red Hat's 2.2.13 kernel - it already contains a number of the fixes that later made it into the official 2.2.14 kernel.

    We don't put out errata RPMs for every minor bug (misspelled man pages and such); this stuff gets fixed in Raw Hide and then makes it into the next release.

    Errata RPMs are released only when they fix a MAJOR bug, such as a security problem (such as the bind update currently available) or a real functionality problem (such as the lynx update).
    Releasing them for every minor problem, or every base version update, would be a bad idea because it would be very hard to keep track of everything. (And of course it would lead to "You need to update 1500 packages before Red Hat Linux works well" FUD from Microsoft and other people who don't care to check what an update does before writing flames).
  • by bero-rh ( 98815 ) <bero AT redhat DOT com> on Monday January 17, 2000 @11:35AM (#1364507) Homepage
    The fact that we're reading your message shows we're paying attention to the community. ;)
    The thing that slows Red Hat errata down is called Quality Assurance. Bugfixed packages don't leave Red Hat without having run at least a couple of tests to verify
    • the new package actually fixes the problem
    • the package still does what it is supposed to do
    • it doesn't introduce any new similar problems

    I'd rather delay a package for a day than have to release yet another security update for the same package the next day...
  • by dpilot ( 134227 ) on Monday January 17, 2000 @11:33AM (#1364508) Homepage Journal
    The assumption was that bee wings act like airplane wings. Under those assumptions, a bee would not be able to fly. Somewhat more recently it was shown that bee wings do not work the same as airplane (or ornithopter) wings. Aside from the flapping thing, there's a basic modal difference.

    Airplane and ornithopter (and bird?) wings work on laminar airflow. Try 'too hard' to fly, and you get turbulence above the wing. In other words, a stall.

    The bee has a different method of dealing with this. Rather than prevent turbulence, the bee wing uses turbulence, and has a mechanism for continually spinning the turbulent vortices off of the wing. In this flight mode, a given size wing has as much as 50X more effective lift than in laminar mode.

    I'm not sure we can apply this to the whole Linux vs Microsoft thing, other than to say that a new modality changes the whole landscape. But I guess that's what Open Source is all about. In this case, we're the bee.
