

Open Source == Faster bug fixes
solar writes "SecurityPortal.com is running a comparison between Red Hat, Microsoft, and Sun Microsystems on the time between software bugs being found and patches being released. Find out if open source is the champion bug squasher we all believe it to be." Interesting bit.
Shut up! (Score:1)
Statistically Insignificant Results (Score:1)
Using a one-way ANOVA (or a t-test), even at a 0.05 significance level, the difference in means between Microsoft's response time and Linux's response time falls far below the threshold for statistical significance. In other words, there is a fair probability that Microsoft has response times as good as or better than Linux's. The results show a statistically insignificant difference.
Statistics lesson #1: if a site does a comparison with just a few samples or without providing variances and t-test results, chances are high that their "results" aren't results at all.
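For anyone who wants to see what that sanity check actually involves, here's a minimal sketch in Python. The response-time samples below are invented purely for illustration, since the site doesn't publish its raw per-bug data:

# Minimal sketch of the significance check described above.
# The day counts are made up; the article gives no raw samples.
from scipy import stats

redhat_days = [1, 2, 4, 7, 11, 15, 30]      # hypothetical days-to-patch
microsoft_days = [1, 3, 5, 9, 14, 28, 52]   # hypothetical days-to-patch

t_stat, p_value = stats.ttest_ind(redhat_days, microsoft_days)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# With samples this small and variances this large, p usually lands
# well above 0.05 -- i.e. the difference in means is not significant.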
Open Source is Ruthless (Score:2)
Sounds like 3 programmers just got a crash course in job security. I bet one of them realized, "Hey, if we release the code, then 300 to 3000 programmers will get to work on this stuff and the suits will see us sitting on our duffs. Then they'll start wondering what they're paying us for. Let's keep the code here so that we'll have something to do."
Ain't Open Source a bitch? The nastiest thing you can do to a worker is increase productivity. It makes everyone look bad.
A rant on Regression Testing & Open Source... (Score:2)
Mostly I do (test), and don't spend much time managing -- unless it's to fight for more time to test.
If anyone knows or supports the idea of detailed regression testing, it's me...insisting on a contractual minimum level of testing. I've generated test plans with over 1,000 individual tests -- all specifically targeted to the requirements in the actual contracts. (I don't just make this stuff up!)
Yet, with Open Source projects, there's little or no formal testing. What testing gets done tends to be tainted; testing by the developers themselves. Do I care? Actually, no. The quality tends to be fairly high since the developers, themselves, care. They pay attention, and the quality of programmers is visibly increasing.
This trend -- and that, yes, testing jobs suck -- has convinced me that less testing will be approved in the future and that I'd better adapt.
Five years ago, I would have promoted VV&T as a good if unrespected career. Now, I'm convinced that it's a place for two groups of people only:
1. Losers: ex-school teachers, temps, less technical analysts who just pick up a check, and "black group" annoying folk, or
2. Professionals: The kind of people who know IEEE or CMM test methods and tools or at a minimum have exceptional puzzle solving and logic skills and seek those tools out.
It's difficult not to be lumped into and treated like group #1. No, I'll amend that. It's nearly impossible not to have someone chew your ass out every few weeks just for doing the tests right and documenting actual, real, problems or deviations from the requirements they, themselves, aren't paying attention to.
My decision? Drop testing and become a network admin. After a while, when I can do the job well, I'll try to find a new niche and specialize. Five computers in my basement, networked, one of them a Sparc system with Solaris, a pile of O'Reilly books, and countless hours installing/changing/securing them have given me the basics. On a scale of 1 to 5, I'm a 2... and I don't deny it; I know I can do quite a bit of damage and don't yet understand how to be a good admin. In a couple years, I hope to be at level 4.
Sure, as an admin I'll catch about as much hell and get thrown off projects by the same political currents as before, yet after going through testing I'll be prepared for that kind of flak and lack of respect. After a while, it won't matter... but I'll have a good job and be in demand. Then, I'll get back to double-digit contract increases... ahhhh!
while they're at it, tell them to use Python (Score:2)
Does choice of language affect maintainability? You bet it does. And in the opinion of many, Perl is not well suited to large, maintainable programs. One way of taming this problem is to choose a more modern, object-oriented language. While the benefits are not fully realized in partially OO languages (such as C++), clean languages like Eiffel and Python are effective tools for ensuring long-term maintainability and extensibility.
(No flames please. This is offered up as food for thought, and as an object lesson in how even people like Rob Malda ignore the best advice of their Computer Science professors.)
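Purely as an illustration of the point (and emphatically not the actual Slash code, which is Perl), here's the kind of small, readable class structure a language like Python nudges you toward:

# Illustrative only -- a toy comment system, not the Slash code base.
class Comment:
    def __init__(self, author, text, score=1):
        self.author = author
        self.text = text
        self.score = score

    def moderate(self, delta):
        """Clamp the score to the familiar -1..5 range."""
        self.score = max(-1, min(5, self.score + delta))


class Story:
    def __init__(self, title):
        self.title = title
        self.comments = []

    def post(self, author, text):
        self.comments.append(Comment(author, text))

    def visible(self, threshold=1):
        return [c for c in self.comments if c.score >= threshold]


story = Story("Open Source == Faster bug fixes")
story.post("solar", "Interesting bit.")
print(len(story.visible()))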
Re:You people are ruthless! We're working on it! (Score:2)
Ever stop to think there might be folks out there that could help who are better than your three best perl coders?
>The Slash code really is hardcoded in many ways and they are trying to unhardcode it for you now.
And we're somehow not able to figure this out ourselves? Or at least lend a hand finding bugs, hardcoded assumptions, off-by-one errors, etc?
Y'all need to stop trying to make a release that 'looks good,' and take a little time to re-read CatB. "Release early, release often." Not "when it looks pretty," not "when we think we have most of the bugs fixed," not "after we've added this one new feature."
Because what that's going to get you is an immaculately-debugged Slash that is going to have to be rewritten anyway since you didn't expose the code to enough eyes to find the bad practices and assumptions that no doubt live in there. You'll have cleaned up a bunch of code that will get thrown away anyway. cf Mozilla....
Do it now -- 'tar cvfz slash-0.50.tar.gz slash/' and put it out there, warts and all, money where your mouth is. I double dare you -- you might end up surprised at how much faster the code (and therefore your revenue generating site) gets better. Ah, and that takes us right back to the original topic, donnit?
--
Re:Open Source doesn't always == faster bug fixes (Score:2)
If you assume that, for someone else to profit, you have to lose, you will avoid this situation at all costs.
If you consider it possible for yourself and another to both profit in some manner, then you will be more inclined to allow this situation to happen.
There are good arguments on both sides- for instance, when you include corporations in the equation you have to understand that, while you may wish a mutually-profitable situation, the corporation is legally bound to not only try to profit, but to try to hurt you and cause you to lose, if you are in the same line of business. It cannot cooperate with you. But at the same time, if the corporation is copying your GPLed source, there is a limit to how uncooperative it can be. It can take your code, use it, outmarket you and then withhold its changes until release- but then it has to let you have them, and even without using your code it still outmarketed you, get used to it ;)
The basic issue is simply this: how important is it that you be able to prevent someone else from profiting by your labors? Are your labors so miraculously advanced beyond the rest of the industry that (a) nobody can help you with them, and (b) they'd make a significant difference in the performance of your competitor? This is software, people- there's never been much of a link between quality and profitability. If there seems to be a gain from cooperating, consider the possibility that 'having someone else profit from your labors' is just a chance you'd have to take.
This whole 'winner takes all' concept seems to have grown out of the years of Microsoft monopoly. I suggest that this is not the only way the world works, and that the software industry is drifting back into regions where developers can profitably cooperate with each other. It's _normal_ to not need to take a 100% hostile ruthless attitude at all times. Such things are quirks of history, and we have lived through such a quirk. Amazingly enough, some things survived, such as Macintoshes and Linux. Now it's time to settle down a bit, quit scorching the earth, and get back to more normal interactions.
Re:It's more subtle than this (Score:2)
Have you ever joined a development team that had already been working on a software project for some time? Did they expect you to know where to find every bit of functionality right away? NO!
[sigh]...
Have you ever joined a development team that had been working for some time on a software project with a buggy API underneath it, one they didn't have the source to? Have you ever just had to learn to live with its quirks (like they already had)? Have you ever wanted to have the source to that d**m thing and be able to fix it yourself?
No, this doesn't mean you don't have to spend a couple hours staring at the thing going "how the fsck does that work?", but it does mean you at least can do this if you're so motivated (like I am). And it's much easier to understand the code if you understand the API.
Of course, getting folks to use your revisions is a different matter... :)
As for making patches that don't mess other stuff up, that's a matter of having properly designed and documented software -- which a well-maintained open source project will be.
Greedheads and UCITA (Score:2)
such as this - if a company knows of a material defect in their product and conceals it from the consumer, resulting in losses to the consumer - said greedheads are liable under the higher standards of gross negligence, recklessness, or even intentional tort, resulting in statutory treble damages or unlimited punitive damages in some circumstances.
The greedheads are already well on their way to taking care of that little problem. Go read up on UCITA.
http://www.troubleshooters.com/ucita/ [troubleshooters.com]
http://www.2bguide.com/nccusl.html [2bguide.com]
Here's a list of Infoworld articles on UCITA [infoworld.com]
You can find a whole lot more besides these by doing a Google [google.com] search.
Colossally Missing The Point (Score:2)
Red Hat just re-packages programs that other people write. The real bug fixes are being done by the project maintainers.
The only exception to this is that the Debian project has package maintainers that foster a good relationship with the upstream project maintainer. The Debian project really passes those bugfixes upstream.
--
I noticed
Re:Open Source doesn't always == faster bug fixes (Score:2)
Re:reverse dogma (offtopic: -1) - moderators pls r (Score:2)
Re:Open Source doesn't always == faster bug fixes (Score:2)
I guess that it depends on the platform/market.
Open Source doesn't always == faster bug fixes (Score:2)
Everybody touts open source and how good it is. These people generally cite certain prime examples to support their arguments. (I also think that many of these people are as biased and tunnel-visioned as the esteemed Mr. Raymond.)
For some software it works well. Although Linux is stable and its security holes are filled quickly, it's not as successful as other products. Everybody seems to want it to be a success on the desktop - I won't call it a success until it actually puts a dent in Microsoft's huge majority. Mozilla is certainly far from being a success. We await the future on that one. I don't think that there are any hugely successful pieces of open source software (sendmail? but then it basically started off with a monopoly).
However, I do not think that all software is suited for open source. The last company I worked with developed database marketing software. Nobody is going to work on that kind of software for free. If we'd opened the source, then our already established (and far bigger) competitors would have been able to leverage our work to put us out of business. I don't work for free; I have to pay bills. In fact, I'm quite happy to be paid a lot of money for what I do.
/. doesn't seem to think that open source helps fix bugs. Otherwise the source that they've released would be more up to date, and consistently kept that way.
Com'on over where it's free (Score:2)
Thanks
Bruce
Re:Open Source will always be the quickest (Score:2)
Do you mean that crackers need a whole bunch of holes before they start breaking into your systems?
I don't think so ...
One HUGE distinction between Open vs Close sourced (Score:2)
As you have pointed out, there _is_ so-called "unmaintained" software in BOTH the open-source and closed-source arenas, and you have also aptly pointed out that the "dead ends" in open source are not-so-dead ends, because if you have the skill and inclination, there _exists_ the possibility that YOU take up the job of fixing whatever needs to be fixed.
You have also pointed out that _NOT_ everyone has the skill and inclination, and you hinted that, therefore, the open-source model doesn't work on every occasion.
But have you ever tried to put your thoughts on the OTHER SIDE? Think of closed-source software - when it is NOT maintained, it remains DEAD, no matter whether those who want to fix it have the skill and inclination or not.
Take, for example, the software OPTASM - a REMARKABLE assembly language compiler. It is a CLOSED-SOURCE product, and it is no longer on the market.
I heard that the company that used to own OPTASM was sold to Symantec, the people who produce Norton Utilities, so I contacted Symantec, trying to find out whether they still sell the OPTASM compiler or not.
The answer I got was NO. Symantec isn't selling OPTASM anymore, and they have no plan to update the product. That means, essentially, the people at Symantec have ABANDONED a remarkable product that was once one of the BEST assembly compilers for the x86 chip line.
And when I further inquired with Symantec about the possibility of releasing the SOURCE of OPTASM to the general public - since they are NOT going to sell OPTASM anymore, I figured they have NO PLAN to make money off that thing anymore, right? - the answer I got from Symantec was a BIG SILENCE.
I can code. Although I am not a CRACK PROGRAMMER, I have enough experience to do _some_ updates and code cleanups for the OPTASM compiler, if I can get the source to it. And I AM WILLING TO DO THAT.
In other words, I _HAVE_ the skill and the inclination to update the OPTASM compiler, but because of its closed-source nature, there is NO WAY I can do it.
On the other hand, if the OPTASM compiler were an open-source product, then with my skill and inclination I would at least get to TRY to update the thing.
In summary, a HUGE difference between the open- and closed-source software arenas is in the MAINTAINABILITY of ORPHANED software - the software that is not being maintained anymore.
Orphaned but open-source software _could_ be updated by ANYONE who has the skill and inclination, but orphaned closed-source software will be DEAD FOREVER if the owner declines to release the source to the public domain.
I hope what I am saying here will bring attention to those who hold the rights to the source code of orphaned, previously closed-source software, and I hope that they will release their source to the public. If they do not want others to PROFIT from the good gesture (releasing source code to the public domain), then they could release it under the GPL instead.
Re:Poor research (Score:2)
Re:Poor research (Score:2)
Re:Bottom Line (Score:2)
They are usually the first Linux distribution to release updates after a security problem arises.
Re:Smaller software companies even worse ... (Score:2)
Nope, it's support. They don't want to have to train their OS tech people on original, OSR 2, 2.1 and 2.5. They (Microsoft) want the OEMs to handle that kind of stuff.
Re:Open Source doesn't always == faster bug fixes (Score:2)
it is not about writing software "for free"
... but it is about releasing all rights you might have to the software you've written.
it is not about not being paid a lot of money for writing it
... although it is about having abuse heaped upon you, your company and your product if you don't give into the demands of the mob.
it is not about giving your competitors the ability to put you out of business
... unless, of course, your business is writing software.
Sigh. Same old, same old... There are any number of extremely convincing arguments for using and supporting open source; to spout the useless and emotionally-charged rhetoric you did contributes little or nothing to the discussion at hand. It does not answer the questions raised by the original poster:
These are serious questions, from someone who is at least willing to listen to an explanation... and instead of trying to answer his questions, or explain to him why you believe what you do, you deride him for not being a believer. I'd say that I'm staggered and shocked by your arrogance, but unfortunately, it appears to be all too common among open source advocates in general, and /. trolls in particular.
Re: (Score:2)
Open Source (Score:2)
Re:Poor research (Score:2)
----
Re:OSS, Closed Source, and F00F. (Score:2)
I heard *NOTHING* about removing the fix for NDA reasons, and I was in support at the time. From the moment we had a fix up, there was a fix available all day every day until we got our "final" fix.
The original one was "M310-hangfix", I believe; the !@#*! bug came out days after 3.1 shipped.
I believe we didn't ship source for the fix until we had a "final" patch - that may have been the NDA deal.
I do know that the reengineering work was pretty much internal; we were aware of flaws in the initial patch (performance hits, however minor), but we wanted a fix out so people's systems would stay up long enough for them to get the newer fix.
But yes, the fact that BSDI and Intel engineers were on a first-name basis probably helped immensely in getting the fix out.
The question this raises is, why did it take longer for Windows to get a fix than it took BSDI? They certainly have their hooks in over at Intel.
Re:CmdrTaco is UNINTERESTED (Flaimbait -1) (Score:2)
Re:Speed != Quality (Score:2)
It takes dedication and commitment to make a good open product into a quality open product.
Re:Open Source will always be the quickest (Score:2)
maintained.
Aye, there's the rub. There are a fair number of Open Source packages that are no longer maintained. Bug fixes may or may not ever come for those packages.
Of course, you do have the opportunity to fix 'em yourself, if you have the skill and inclination. So even un-maintained packages are not dead-ends, if you have the time and talent to do your own maintenance. It should be clear that not everybody does.
The inescapable conclusion is that the User classes are doomed to use buggy software if they don't pick their packages carefully. This is true of commercial closed-source products just as it is of Linux applications, world domination or not.
Your old-fashioned argument is flawed. (Score:2)
If you think that you can write a better slashdot, then go out and do it. I wish you well. The folks that run this site are under no obligation to release anything that they do. The author is the one who ultimately has the rights to the software that he writes, no matter how much you demand otherwise.
Ability To Do Better is NOT a prerequisite to the Right To Complain. If that were true, then:
all movie critics can't complain unless they can make a better movie.
all game critics can't complain unless they can make a better game.
and so on.
reverse dogma (offtopic: -1) - moderators pls read (Score:2)
I especially dig the phrase "I'll probably get flamed and moderated down for this", since almost every post I see with this phrase gets moderated up. Moderators, they are playing with your mind. It's almost like the poster is saying "nah nah, I dare you to moderate me up!"
I think finkployd is on the money when he postulates that Malc is just trying to be funny, or he's craving for karma.
Frac
Re:Why is this surprising? (Score:2)
Actually, an edict was handed down at MS saying exactly what you're saying - system DLLs are now only to be updated in service packs - not in apps.
Fingers crossed that it stays that way...
Si
Re:Open Source will always be the quickest -- why? (Score:2)
1) Multithreaded development and debugging. The well-discussed reason is the distributed model of work. Since there are many potential testers in the world for each piece of Open Source software, there are also many potential patchers for said software. The person who finds the problem actually has a chance of being able to offer a fix.
2) Risk aversion. Big corporations like M$, Sun, HP, IBM, etc. have reputations to consider. If they offer a "fix" that later has to be fixed itself, they are embarrassed and sales could be hurt. This is bad for the decision makers because they have something to lose for their efforts besides pride -- money. The Open Source community has much less to lose. This has primarily been due to the fact that since their work was volunteer, they could hardly get fired or sued. As the world awakens to Open Source and corporations enter the arena, I wonder if this will change.
Hmm (Score:2)
Tux is here; Gates to shear; Someone pass another beer!
Poor research (Score:2)
The AFU FAQ Shows this as False [urbanlegends.com]
Why is this surprising? (Score:2)
And, with Windows 2000 on the way, won't it just get worse?
Not an incredibly insightful comment, I know. That's why I'm hoping for some insightful responses.
Re:Why is this surprising? (Score:2)
98 claimed something like "3000 bug fixes from 95 and a complete rewrite of the memory management subsystem" (which it really needed)
Win2k claims that there are some ungodly number of situations where you no longer have to reboot. Which is nice, but there are still far too many. You can change your IP address, for example, without rebooting, but a host name change sends you packing...
there's something to be said for network restart...
Better is good. Not having to fix 3000 of 8000 bugs is better 8^)
Re:Cheating on bug fix times? (Score:2)
but that is another rant for another day...
Re:Poor research (Score:2)
Bumblebees can't fly, they just don't know that they can't, so they do
Steven Rostedt
They know about plenty of bugs (Score:2)
Chris Hagar
Tee hee hee! (Score:2)
--
Re:Open Source doesn't always == faster bug fixes (Score:2)
---
1) It makes for almost perfect competition. Therefore it is only suitable as glue or a base framework for your business model. The only way around this is to keep your changes completely internal.
The converse argument would be to look at companies such as RHAT. I believe this to be invalid because:
a) Operating systems are a very wide market. Therefore this does not apply to most software.
b) There's no way in hell they can get a traditional software company profit margin, unless they provide value added closed source or complete solutions.
2) There are some really stupid limits on commercial extension. The LGPL fixes this.
---
There are some other arguments against open source in general as well:
---
1) Some large projects take substantial resources to maintain and extend. There are a lot of projects that do not begin and end after a program meets specifications. Under open source, the customer has the right to fire you and have someone else extend the product, possibly putting you out in the cold. This can be especially hard on companies who have built reliance on their customers and/or want to expand their business. The problem is magnified many times if the market for your product is limited to only a few potential customers. Therefore, you have to be very careful with your contract (if you even have one, ha).
In the end, it is more efficient. However, it might not be in my best interest. If anyone can come along and underbid my contract and take over the work I had been doing, I might think twice.
2) The goodwill of humans can't be taken for granted in business. If I use a personally made toolkit to make a variety of applications, I do not want others using that commercially. Someone may use my code in other applications without asking. To fix the problem, I can restrict use of certain code, and in the event of violation, litigate. Another solution is to LGPL parts of the code to enforce return on modification.
--
Anyway, I want to ensure myself a healthy return while being efficient for my customers as well. What I do not want is to make my work a commodity that is easily duplicated.
Re:Open Source doesn't always == faster bug fixes (Score:2)
What I do find bad is the lack of typical end user applications. Most X applications on freshmeat typically fulfill specific goals of those developing the product. This does not always meet the needs of typical end users who need an easier to use UI or documentation legible to the average computer user.
Even free applications on www.winfiles.com are usually more user friendly (as well as shareware). Also, since many are recreational projects, they never get close to their commercial alternatives.
Hopefully this problem will be rectified - either by the big boys moving in and making Linux accessible to the normal user in their own interest, or by beneficial standards agreed upon by an initiative interested in making it accessible.
If Linux becomes successful enough, we might even see many commercial applications that cater to the end user, such as easy-to-use image editors (I'm talking good UI and easy process hand-holding), and just about everything Broderbund, Sierra and others develop. I am not convinced that it is possible for these particular categories to be open sourced, because recreational projects usually do not go into every possible detail or take the time to plan effectively except for their own use. They also usually are working to fulfill their own goals for a program, not providing a solution to other people's problems for compensation.
In short, I think it's because those who have input in the program are generally working to fulfill the goals of the community that is contributing.
Speed is not always desired! (Score:2)
The fastest fix is to disable that service until a well-tested patch comes out. This is true whether it is closed or open.
Re:Speed != Quality (Score:2)
It does something else as well. A monolithic product having once garnered customers for one of its features is more likely to seduce them into using it instead of its competition for its other features. And it can be sold against a range of competitors, none of whom offer its full range of features.
Re:Slashdot Readers: The UNIX Philosophy (Score:2)
This captures something I was trying to explain to a UI class I took a few weeks ago. The rest of the points tie into it in various ways, but the two things I brought up were:
These two points led me to the statement "Open source is exposing the interfaces" a couple of weeks ago right here on Slashdot. The ideas behind that are simple:
The difference between the interfaces in open source and proprietary software is not clear-cut. But there is a tendency for open source to have a more dynamic view of the world. The programmers working on the project aren't going to have complete control over all of the customization. So there is an incentive to give away a rich configuration mechanism, or someone will build it in.
One example, among my favorites, is the Free Translation Project [umontreal.ca]. This project is enhancing quite a number of open source projects with translated messages and documentation in a variety of languages. I suspect that my own team are the entire community of users for the Esperanto translations at the moment. No proprietary software project could ever justify the cost of rolling the translations into the distribution and testing them. Our team has taken on that burden. We translate, we test,
Re:Why is this surprising? (Score:2)
Re:Bumblebees flying AND:Poor research (Score:2)
The bumblebee (and other insects like dragonflies and houseflies that have similar flight surfaces) *uses* the vortices that its wing flaps generate to cause a slight vacuum above the wing, which creates additional lift, in addition to the normal lift from the wings' downbeat. Dragonflies in particular take advantage of the weird turbulences their wings generate to do all the amazing dragonfly types of things they do in the air.
I could have sworn I had seen a reference to some scientist at Berzerkely (I think) right here on Slashdot who had recently built like a 50x scale model of a bumblebee to study its aerodynamic properties and come up with the canonical explanation of how they manage to stay aloft.
-=-=-=-=-
Naturally! (Score:2)
This is why I think the benefits of OSS strongly outweigh the setbacks. Much better code is produced for worthwhile projects.
"You ever have that feeling where you're not sure if you're dreaming or awake?"
Re:Why is this surprising? (Score:2)
Linux, as large as it can get, is not One Big Monolithic Operating System(tm), rather a huge collection of programs, utilities and other errata that are all maintained respectively by their individual owners. NT is all Microsoft, and they are responsible for all of it.
In fact, to go one step further, "Linux" refers to nothing more than the kernel, which is what, a meg and a half? The rest are separate programs and errata from different people and organizations, therefore they can't actually be called part of the operating system--which is why purists prefer calling it GNU/Linux.
Re:Open source means little if... (Score:2)
Sure, not every piece of OSS is good - but OSS can be made good.
If you find an OS tool that does what you always needed to do, but is written badly, fix it up (or hire someone to do it).
The ONLY benefit open source software gets is the off-chance that a programming guru happens to have absolutely nothing to do that day and fixes the bug before the core developers get to it
And, of course, that if the core developers don't get to it in a reasonable time, someone definitely will fix it (that's part of what Linux distributors are there for) - and when one of us fixes it, the others get the fix as well.
Re:RedHat's response time (Score:2)
Well, all I can say here is that shit happens, even here.
Also, there's a big difference between QA'ing one updated package (I'm not aware of any errata package needing another update for the same problem), and QA'ing an entire distribution - the more packages you have to QA, the more likely it is that something gets overlooked.
Also, please keep in mind that Red Hat didn't have as many people to look after bugs back then as we have now.
Re:RedHat's response time (Score:2)
Get the iputils and netkit-base packages from Raw Hide. This will help.
We don't usually issue errata for bugs that aren't critical for most users.
It is an undisputed fact that RH has been aware of this since October 7, 1999. Over three months on something as simple as
Well, not quite true. The fix has been around for quite a while (in Raw Hide).
But yes, there was indeed a mistake, it wasn't added as a comment in bugzilla and the bug wasn't closed. I've done that now (ping isn't my responsibility though).
Stupid stuff like this occasionally happens everywhere, and probably can't be avoided completely. (I wish it could.)
Re:Bug != SecurityHole (Score:2)
My guess is yes - it would bring at least similar results.
If you find a bug in Windows, what do you do? Microsoft does not even have an official bug-reporting system. That's (part of) why long known bugs in Windows (such as "can't install driver from directory with long name unless I tell the installer the short name") simply don't get fixed.
Most Linux distributions, on the other hand, have a bug tracking system (Red Hat's, for example, is at http://bugzilla.redhat.com/bugzilla). The developer responsible for the package you're reporting a bug in is immediately notified.
If a bug is left unattended in the Red Hat bug tracking system for 7 days, the system sends another mail to the assigned developer (repeated every 7 days).
Someone WILL take a look at the bug, and probably fix it (stuff like "On my xyz system with the AAA graphics card, my X server hung yesterday and lacking a network card I had to reboot" is VERY hard to reproduce and even harder to fix though), or at the very least decide he doesn't have the time to look into it deeply and pass the bug report on to the maintainer(s) of the base package, and update our package as soon as the maintainer(s) release a new version.
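To make that workflow concrete, here's a rough sketch of the 7-day nag logic described above. It's a toy version of the idea, not Red Hat's actual Bugzilla code, and the data layout is invented:

# Toy sketch of the escalation described above -- not real Bugzilla code.
from datetime import datetime, timedelta

REMIND_EVERY = timedelta(days=7)

def bugs_needing_reminder(bugs, now=None):
    """Open bugs untouched for 7+ days since the last notification."""
    now = now or datetime.now()
    return [b for b in bugs
            if b["status"] == "NEW" and now - b["last_notified"] >= REMIND_EVERY]

def send_reminders(bugs, notify):
    for bug in bugs_needing_reminder(bugs):
        notify(bug["assignee"], f"Bug #{bug['id']} is still unattended")
        bug["last_notified"] = datetime.now()   # nag again in another 7 days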
Re:It's more subtle than this (Score:2)
Re:It's more subtle than this (Score:2)
What I don't understand is: why are you bashing MSDN, the biggest developersource on the planet?
I did not say that the sorts of information that are in MSDN are not useful; they are indispensable. But they are especially indispensable because so little information is available from other sources. These scraps are not a substitute (in terms of usefulness) for source, not the source of an example, but the source of the implementation. I'm bashing MSDN because
Have you ever worked on an MSWindows-platform product that was competitive with Microsoft? I don't understand why all the Microsoft-boot-licking engineers we see here -- that by no means includes everybody, I'm talking about the people who defend Microsoft with an enthusiasm out of all proportion -- don't resent the many hours they spend searching for a work-around to some poorly documented Microsoft bug, while Microsoft's engineers laugh and sneer at you behind their phony smiles. Are you guys sheep? It's like the athletes from a lesser team volunteering to help rub-down the stars from the dominant team before the game. "Oh, OJ, you're so strong!" I don't get it.
Re:Open Source doesn't always == faster bug fixes (Score:2)
Irrelevant to the discussion at hand, in which it was asserted that Open Source means writing software for free.
For a more pertinent reworking of your question:
The point being that OSS projects tend to attract lots of free help because there's lots of people out there willing to offer it. Help that closed-source projects typically refuse at the front door.
That being said, I'd guess there are one or two significant open-source projects that are planned to achieve success without relying on free coding help.
But, I can't say what they are. (Best guess offhand: IBM Jikes compiler.) Maybe there aren't any.
Even if there aren't any, that certainly doesn't mean nobody doing OSS development gets paid quite well for doing it, which is what was, in effect, asserted in the original post to which I replied.
(BTW, even Microsoft relies on lots of free help, in the form of beta testing, etc. I doubt MS could release a quality OS without relying on such free help. The question is, can it do so without relying on free coding help, the sort of help Linux and the BSD's leverage so successfully?)
Re:Open Source doesn't always == faster bug fixes (Score:2)
False. You still have some rights. You've given up the right to restrict other people copying it, though, which is indeed a big restriction.
Huh? Sounds like an emotion-laden argument to me. So you're saying Bill Gates would get more abuse heaped on him if MS released all its software under the GPL? Hmm.
False. More correctly: "unless, of course, your business model assumes the ability to charge a lot of money for a substantial percentage of the copies of the software that actually get distributed."
First, I didn't pretend to offer a comprehensive explanation for why to use open source. In fact, I strongly suggested the poster take the time to research the issues.
Second, I didn't "spout" anything "useless" and especially nothing "emotionally-charged". All I posted was simple fact, into which you read (not having particularly good comprehension skills, I would guess) your own agenda.
Note that I didn't say there weren't challenges to be addressed when writing software for open-source distribution. But the poster had essentially claimed things about such an approach that were incorrect, and I was offering some simple, direct statements to illustrate that, and suggesting he follow up with some actual research.
Of course not, since the poster asked basically no questions, he just spouted some emotion-laden rhetoric (e.g. about having to pay bills) that I explained was false on several counts. So, speak for yourself, if you have questions to ask.
There are many ways to consider, none of them perhaps quite as easy as resorting to the proprietary model. Ask dress designers (fashion) how they do it -- for all intents and purposes, that field is inherently Open Source, yet they make billions (and, yes, their products are functional as well as artistic). There are other fields that don't allow the equivalent of proprietary software distribution as well. Study them, if you would like to consider whether OSS is worthwhile (before your potential customers require it, of course, at which point you have no choice).
I insisted on no such thing. Scratch the above "First", i.e. here's item Zero for you: learn to read.
As far as answering your question, why don't you investigate the various models already in use throughout various industries, not just computing? (And if you think the source code for your product necessarily constitutes 100% of the "fruit of your labors", you need to do some reading up on the OSS industry, and learn why all of its maintenance problems are not 100% solved due to availability of source code, for example.)
People with incentive. Say, the potential customers, who, if they insist on OSS only, will find a way to fund its development themselves. Then you can accept the funding, develop the software, and make $$ maintaining it. That's just one model, but it works.
Excuse me, but it was the poster who arrogantly made unsupported, unsubstantiated claims about what OSS development implies. All I did was say "wrong; wrong; wrong" and point him in a better direction. And I didn't deride him for not being a believer -- just for making false assertions in stating why he holds his beliefs. E.g. I don't deride someone for saying they believe in Santa Claus, but if they claim he lives at the North Pole and satellites have verified that, I might take issue with that claim.
If people want answers to questions, they ought to ask questions, rather than state falsehoods. The latter can work, but, in a public forum, it's a sure way to showcase one's ignorance and/or arrogance.
(For example, if you weren't so interested in trashing OSS advocates like myself, you'd have just asked the questions, instead of ranting about the content of my post.)
So here's a question I have for you. You have a great idea for a new software application, believe it'd be worth $2B in the next five years, and would take only two years to develop via the proprietary model (up-front funding, hire programmers, closed development, etc.).
Problem 1: you can't find good programmers, because they are unwilling to work for such low pay on software they can't maintain themselves after they leave your employ.
Problem 2: during market research, you discover your potential customers are no longer willing to pay much of anything for a copy of non-GPL'ed software, due to having been burnt by lock-in software so often in the past.
How do you make your idea happen using the proprietary model, when you can't hire decent programmers and won't be able to sell the final product anyway?
Now, this isn't likely the case today, but might be for your product's market within a few years. The problems are a bit overstated, but all that has to happen is a) programmers worldwide decide they deserve vastly more $$ to work on proprietary software, due to their not having a long-term relationship with it vis-a-vis OSS, and b) while a) drives your costs up, customers become less and less willing to pay lots of $$ for each copy of proprietary software, figuring that if they would pay lots of $$ for a copy, it must be important to them, and therefore it should be OSS so they don't get locked into it, which drives your potential revenue stream down.
Next question: besides the problems described above, do you see any opportunities in this scenario?
Re:Bumblebees flying AND:Poor research (Score:2)
More precisely, Open Source software is the bee, we're the wings, and our efforts testing, debugging, and improving software constitute the turbulence that Closed Source development tries hard to exclude but Open Source development, ideally, welcomes with open arms.
Okay, I just made it up and maybe it's a bit lame, but it could inspire somebody to come up with an improved version. ;-)
Re:CmdrTaco is UNINTERESTED (Flaimbait -1) (Score:2)
Perhaps the people who are being loud about this issue are OpenSource authors? (I am [openverse.org])
I for one am not interested in writing a new
I cannot speak for everyone, but I don't think anyone is interested in the code to "steal the slashdot crowd and relocate them to a new forum", nor do I think this is even possible. The net has shown that he who comes first, leads (amazon.com's auction site is a good example, as it lags behind eBay even with the troubles eBay has). Users become loyal to the first comers, and I include myself as a loyal slashdotter (for now)
What I am shouting about is the hypocrisy which runs rampant from a site which is making money off of our Open Source model while refusing to participate in it. I'm glad they're making money, don't get me wrong... It's a good thing to have money. It's a bad thing to have made that money off of someone else's idea and then to defecate on the idea with comments like "ask me again and I'll delay it again". At the very least, Andover should make a public announcement as to IF, WHEN, and HOW the source will be released and stick to that schedule. They have nothing to lose, and everything to gain. There are other discussion forums out there. Some may even argue that they are better, more user friendly, etc... But we WANT to be loyal to
They are a threat to free speech and must be silenced! - Andrea Chen
Bugs In General At Microsoft (Score:2)
Is it anywhere near IBM or HP for OS patches? Hell no. MS is where HP and IBM were 7 years ago when it comes to patching and bugs.
Let's take a look at MS at its worst: Microsoft Outlook 98. As Steven Webb of Microsoft Technical Support described it, the patch "strategy" went a little something (paraphrased) like this:
"I have this printing problem. [Describes problem]
Well, you know those security patches or the archive patch? It should really be considered a service Pack. See, it has about 150 odd fixes inside of it.
Is that documented somewhere in Technet?
No.
So it's been fully regression tested right?
Yeah, sure...that's the ticket"
Luckily I was a premier support customer. Basically, you pay MS a boatload of money and they assign some dude to you who is supposed to be dedicated just to your company, yet is never at his desk to answer your call. However, you do get to see all the neat little comments in the Technet that are marked confidential.
For MS a confidential note is usually the exact steps it takes to reproduce the problem. You can't be letting non-support-contract customers figure out what that intermittent problem is and demand a free hot fix. No sir!
"You owe me thirty-five dollars!"
"I don't have a dime."
"Didn't ask for a dime, thirty-five dollars...cash"
Other common confidential notes indicate Y2K fixes that are undocumented, or other problems that will get fixed when the hot fix is applied.
The best part was when I asked the tech, "Exactly what is fixed in the security patch you are recommending?" To which I was told they don't have that information. Even after escalating, MS has yet to document exactly what is fixed in any of the Outlook patches. They said they would do better with documenting patches in Outlook 2000. The best solution I got was to install the patch, check the dates on the DLLs, then correlate those to the file dates documented in the 150-odd hotfixes...
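For what it's worth, the "check the DLL dates" workaround boils down to something like this quick sketch (the directory path is a placeholder, and the hotfix dates still have to be dug out of the Technet articles by hand):

# Sketch of the "correlate DLL dates against documented hotfixes" chore.
# The path is a placeholder; adjust to wherever the patched DLLs live.
import os
from datetime import datetime

def dll_dates(directory):
    """Map each DLL in the directory to its last-modified date."""
    dates = {}
    for name in os.listdir(directory):
        if name.lower().endswith(".dll"):
            mtime = os.path.getmtime(os.path.join(directory, name))
            dates[name] = datetime.fromtimestamp(mtime).date()
    return dates

for dll, date in sorted(dll_dates(r"C:\Program Files\Outlook").items()):
    print(f"{dll}: {date}")   # compare by hand against the hotfix articles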
Long story short, I don't think it's only a matter of open source vs. closed source. It's how committed the company is to fixing the problems and how open they are about it. MS in general lacks consistency across product lines. In many cases it seems that patches don't come out when they are needed, but rather when the press puts a problem in the spotlight. And as long as MS pays for IT managers and CIOs to spend a week in Seattle getting brainwashed, I don't think that is going to change.
--
Gott'a run, time to reboot the NT box.
Re:Bumblebees flying AND:Poor research (Score:2)
Re:Open Source (Score:2)
Well, what about the possibility of a very clever open-source developer actually adding a subtle security vulnerability intentionally while helping out on a feature or two of linux? Then when some huge ecommerce company starts using the version... BAM! He strikes and takes all the credit cards in the database or takes whatever else of value there might be to take. Does anybody know if this has ever happened or at least been tried before?
Re:Why is this surprising? (Score:2)
And yes, for all you Windows people out there, if you reboot your machine every day, of course you don't have problems. But seriously, what sort of reliability is that?
Re:Statistics tell all sorts of lies (Score:2)
Don't be too quick to judge based on the statistics Security Focus gave:
Looking at their results, the time to fix 50% of the bugs is 4 days for Red Hat and 3 days for Microsoft.
After 1 day, Microsoft fixed 42% of their bugs. Red Hat only 29%.
I know I'll probably get moderated to hell for this, but the simple fact is the "average" statistic tells nothing at all. What the results seem to be saying is that Microsoft is faster on simple bugs (probably better distribution channels) though they fail on the more difficult bugs (probably more complex code, but who can tell without the source).
Actually, there is some discussion of this point in the article, and your interpretation of the statistics is probably more of a distortion than the "average" time presented in the article. One point specifically made in the article is that the time it actually takes to fix a bug is not well demonstrated by their statistics. The time they mention is the time between when the security hole is generally known and when it's fixed - not between when it's first discovered and fixed.
A significant percentage of the security holes are discovered, worked on, and not publicized until the bug fix is already available. It's literally a case of "We found this hole and here's the patch," and not one of actually fixing the bug in less than a day. Apparently, about 42% of Windows holes and 29% of Linux holes fit into this category. In that respect, the statistics are much more favorable to Linux than to Microsoft. They actually mean that only about 14% (8% of all bugs vs. 58% not announced and patched simultaneously) of holes in Windows that are publicized by someone other than Microsoft are patched within 3 days. In Linux, though, about 30% are fixed and available as RPMs within 4 days.
That also means that the average time for bugs not announced by the respective vendors is actually longer than the averages presented. The average time to fix a non-vendor announced bug is more like 27 days for Microsoft and 17 days for Red Hat. Since the non-vendor announced holes are the really scary ones- the vulnerability is known and there's no available cure- that's a more reasonable comparison.
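One plausible back-of-the-envelope way to land in that ballpark: assume the "friendly" (vendor-announced) holes count as zero-day fixes and spread the per-advisory averages from the article's totals over the remaining holes. This is my own rough reconstruction, not the article's method:

# Rough check of the adjusted averages quoted above.
# Assumption: "friendly" holes (patch available at announcement) count as
# ~0 days, so the remaining holes carry the whole average.
overall_avg = {"Microsoft": 16.1, "Red Hat": 11.2}  # days per advisory
friendly    = {"Microsoft": 0.42, "Red Hat": 0.29}  # fraction patched "instantly"

for vendor in overall_avg:
    adjusted = overall_avg[vendor] / (1 - friendly[vendor])
    print(f"{vendor}: ~{adjusted:.0f} days per non-vendor-announced hole")

# Prints roughly 28 days for Microsoft and 16 for Red Hat -- the same
# ballpark as the 27/17 figures above, though not derived identically.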
Interesting, but surprising? (Score:3)
Is this important? IMHO, yes. Never mind corporate interest, something like the Bastille Project, OpenWall, Trustees, ACL, or even ReiserFS, could -never- have been written by 3rd parties if Linux had been closed-source, and many of these packages might never have been written at all.
100,000 developers & debuggers is a lot more than Sun, Microsoft, IBM and Apple can muster, combined. Why should it surprise anyone that such a large number can out-perform such small companies? *Note: The 100,000 is an estimate of the number of people who are on the kernel mailing list, plus the number who aren't but who play with the development kernel and will write bug reports in the event of finding any.
Re:Open Source doesn't always == faster bug fixes (Score:3)
Additions and corrections welcome. And I don't require payment for them, either. :)
Re:Open Source doesn't always == faster bug fixes (Score:3)
And we all know THAT is a really dumb way to support arguments. Better to just yell and scream. "
Let me clarify: everybody seems to use the SAME examples. A couple of sunny days doesn't make it summer.
"and you actually support linux"
You've got that one right (I started with Slackware on a 1.1 kernel). Unfortunately I work from home in an NT environment, so other than web browsing, I've had little incentive to reboot into it for a while. I like the way Linux gives me back control of my machine. But in reality, NT is very stable (I haven't had any need to reboot in over two months), and it fills my requirements equally well.
"find a more popular web server than Apache (suprise, open source). Seems PERL is pretty open"
I'll give you that. I think the current popularity of Perl is partly due to the success of Apache. Thanks to the AC's response for the Netcraft address (http://www.netcraft.com/survey/ [netcraft.com]) that demonstrates Apache's rise to popularity at a time when there was no real competition (the NCSA server quickly proved insufficient). Apache, like many of the other UNIX tools that have been the glue of the internet, is popular because it was among the first successes. As Linux finds it difficult to enter the desktop market, so Microsoft finds it difficult to dominate the internet.
I think that Apache's popularity will decrease in favour of IIS over the next few years. Whether Microsoft will pass Apache in popularity is another issue, and will certainly be a key point in the open vs. closed source debates.
Re:OSS, Closed Source, and F00F. (Score:3)
BSDI made their fix first because they had information about the bug direct from Intel (under NDA, before the bug was announced to the general public). They were forced by Intel to remove the fix they posted almost immediately because it violated the NDA. The Linux fix, IIRC, was not reverse engineered from the BSDI fix, but was a separate effort that worked in a slightly different way, without the help of Intel's additional info. As I understand it, BSDI's fix was later reengineered to behave in the same way as the Linux fix.
PS. I'm not knocking BSDI here, who I think make a great product. I'm merely correcting misinformation (at least, I think I am -- my memory's not great, and I'm too lazy to search the Linux kernel archives to find out for sure :-)
bugs fixed vs distribution channels (Score:3)
I think that ANY commercial Linux distributor should, as point one of the business model, establish a means to rapidly and loudly manage bug fixes and updates. Hell, call 'em "Service Packs" so the PHB's will understand what you're talking about. Coordinate with the developers. Try to create a "path of least resistance" for people, esp. those that don't care about technology, just that they can have it fixed. Hire a couple of people to deal directly with developers and customers.
Bottom Line (Score:3)
From the web page. Apologies for the formatting.
1999 Advisory Analysis
Vendor - Total Days of Hacker Recess - Total Advisories - Days of Recess per Advisory
Red Hat - 348 - 31 - 11.23
Microsoft - 982 - 61 - 16.10
Sun - 716 - 8 - 89.50
Interesting that Red Hat comes out better overall than both of the others, even though Red Hat is generally considered the least secure distribution of Linux available.
Re:Why is this surprising? (Score:3)
Also, it doesn't help that with their popularity, Microsoft draws the fire of every scriptkiddie, security wannabe, etc, who all want to be the first to find a new bug and either exploit it or publish the fix.
The article seems to be slashdotted right now, so I can only speculate here... But I'm not convinced that Open Source produces cleaner code. It just allows you to have multitudes of people available to fix flaws AS THEY'RE FOUND. In other respects, though, I think that closed source still has some advantages in a completely different context from security. Just more direction, rather than having everyone run around coding whatever it is they want to code.
Here's hoping that Microsoft lives up to their promise of not shipping Win2000 until it's ready.
OSS, Closed Source, and F00F. (Score:3)
Solaris and MS took weeks, plural, as I recall.
Conclusion? Competent engineers who care make for faster code fixes too.
(Disclaimer: I work for BSDI, but honestly, if I didn't really think their engineers were that good, I wouldn't work here either.)
Re:Why is this a 5? (Score:3)
apparently, the moderators also believed that this hypocrisy (especially in the light of Rob's obnoxious answer in the /. interview) should be moderated up for more people to see.
Re:Open Source will always be the quickest (Score:3)
In the Open Source community there is always the possibility that, so long as a project is useful, someone may pick up the torch and keep maintaining/developing it beyond the point that the initial author(s) is/are involved.
Case in point would be MetaCreations. They produce quite a number of 3D modelling/animation programs (Poser/Bryce/Painter/Ray Dream Studio). There have been recent rumors (it may be fact by now) that they are dropping support and development of all these packages in favor of a 'web-enabled' product called MetaStream (i.e. producing content for a web plugin), and a 'high-end' product called Carrara.
The current install base either has to pay a much bigger than usual (and for some unaffordable) 'upgrade' fee to switch to Carrara (versus the modest fee charged to purchase a new version of software they currently own), or else stick with Ray Dream Studio and any bugs that remain.
If it were Open Source then there would be more possibility of bugs being fixed even if a 'new and improved' product came out, simply because some people would both care, and have the code to do it.
Colleen:Its a black-hole.
Hunter:Is that a good thing?
C:It is if you want to be compressed into oblivion.
H:Oh.. coooool.
Re:Why is this surprising? (Score:3)
Just a thought on "friendly fixes" (Score:3)
Red Hat: 348 days for 22 fixes = 15.8 days per fix
Microsoft: 982 days for 35 fixes = 28.1 days per fix
Sun: 716 days for 6 fixes = 119.3 days per fix
Also Red Hat had 29% (about 1/3 for those non-math inclined out there) friendly bugs, MS had 42% (~2/5) friendly bugs and Sun had 25% (1/4) friendly bugs.
Draw your own conclusions.
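If you want to check the arithmetic, the figures above fall out of the article's totals like this (assuming the "friendly" advisories are simply excluded before dividing; the rounding may differ slightly from whatever the parent post did):

# Reproducing the "days per fix" numbers from the article's totals.
totals = {
    # vendor: (total days of "hacker recess", advisories, friendly fraction)
    "Red Hat":   (348, 31, 0.29),
    "Microsoft": (982, 61, 0.42),
    "Sun":       (716,  8, 0.25),
}

for vendor, (days, advisories, friendly) in totals.items():
    unfriendly = round(advisories * (1 - friendly))
    print(f"{vendor}: {days} days / {unfriendly} fixes = "
          f"{days / unfriendly:.1f} days per fix")

# Red Hat: 348/22 = 15.8, Microsoft: 982/35 = 28.1, Sun: 716/6 = 119.3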
RedHat's response time (Score:3)
I thought it was interesting that although Red Hat came out on top by a decent margin, the article said that Red Hat could be even faster if they paid more attention to the community.
Re:Hmm (Score:3)
Try to make their code the best,
Flapping hard against the storm,
Going nowhere, getting warm.
Linus T. and RMS
Found the way to true success:
Hiding errors makes no sense;
You can fly with turbulence!
: Fruitbat :
Re:Open Source doesn't always == faster bug fixes (Score:3)
Maybe someday you will understand what Open Source Software (OSS) is about well enough to not make such strange statements.
Here are some clues: it is not about writing software "for free"; it is not about not being paid a lot of money for writing it; it is not about giving your competitors the ability to put you out of business.
And here's the biggest clue the market has yet to particularly appreciate: once customers of software come to appreciate the benefits of insisting on Open Source, and assuming they therefore do insist on it in greater numbers, you won't be able to get paid well for writing software that isn't Open Source.
Keep in mind the fact that the huge valuations of RHAT and LNUX on NASDAQ; the success of GNU/Linux, Apache, etc.; and so on have all taken place with nearly 0% of the end users of software insisting on Open Source per se. All those end users have cared about so far is faster, better, "cooler", more reliable, etc., all of which Open Source can deliver, more or less.
Once it becomes clear that Open Source per se delivers advantages closed source cannot -- advantages that can trump all the disadvantages (slower, fewer features, etc.) in any given instance -- how will you make money writing closed-source software?
Really, do take the time to investigate just how much money people are making writing OSS, how much more freedom we have to change jobs and still take our expertise, even our code, with us to the next job, and so on, before you make public statements about whether OSS is "suited" for certain kinds of software development.
Re:Open Source doesn't always == faster bug fixes (Score:3)
I should probably clarify my "even our code" statement.
It refers to the fact that, of the various products whose source code I've been an "expert" in maintaining during my career, I can be hired to maintain that code by a large percentage of its users only when the code was distributed Open Source (more specifically, GPL'ed).
Put another way, closed-source development isn't just about locking in customers -- it's about locking in the vendor's programmers as well, by making it harder than it would otherwise be for them to get good jobs maintaining the same code elsewhere.
So if you're still using PRIMOS, or Numerix machines, or you use Cadence's NC-Verilog on Suns, it really won't help you much that I've got some expertise maintaining portions of those systems, unless you've paid lots of extra $$ for the source. The $$ I was paid by the respective companies (Pr1me, Numerix, Cadence) to work on that code was, indeed, decent, but is about all I can expect to ever earn for that particular expertise.
But if you're using GCC, g77, etc., you can hire me to work on them. (In theory, anyway; I'm not exactly looking for work these days.)
That makes OSS development more valuable to me as a programmer than closed-source software development, and in fact it gives me incentive to favor long-term viability of OSS products over the sort of short-term focus that has so characterized closed-source products over the past 20 years.
I.e. if I can profit from closed-source development for only the duration of my employment at, say, BigSoftwareCo, then it behooves me to maximize my salary & benefits during that duration, leaving it essentially up to BigSoftwareCo to decide how it will maximize its long-term viability. Since I can profit from OSS development for the duration of the practical life of that software, I have more incentive to make it live a long, healthy life, even if that means making less $$ in the short run (not necessarily always the trade-off I have to make, but one I've willingly made a few times already).
So, as an OSS developer, I'm more interested in making sure the software is, and remains, useful for a long, long time (ideally, with as few changes made by others to my own code as necessary, so I have maximum expertise in it).
Though I've personally applied similar incentives when writing closed-source software, it hasn't been due to financial or most other incentives, because there really aren't any. In fact, one of the reasons I was attracted (back) to OSS development is that I could continue to apply my own sense of ethical software development while being able to gain some potential for financial reward for it (for a change), or at least while not being punished (say, by management) for doing things such as taking extra time to make sure the software works correctly and as documented.
(Not that there aren't all sorts of similarly bone-headed people in the OSS movement pushing for "gimme what I want now, you worry about making it work right later" from time to time. But I don't report to them. Besides, it's usually easier for the general public to see these sorts of discussions going on in OSS development than in closed-source development. That allows people to come to more informed conclusions regarding, e.g. the long-term viability of proposed extensions.)
About what I expected... (Score:3)
Microsoft does amazingly well when you stop to consider what they have to work with. Their code is probably very complex due to the requirements of backwards compatibility and interaction along unusual connections between types of software. They have only a comparatively small number of programmers working on it at any given time, and they get the hot seat as soon as there is a problem. Everyone in the business world simultaneously expects perfection and low quality from MS, so that they can bitch about something all the time. When you consider the strains they have to deal with, they are doing very well.
I work for IBM, and Sun is one of our big competitors, so I can't really say anything without risking excessive personal bias. However, I suspect that people are less inclined to roast Sun for every security breach, as there are fewer personal users than for either of the other two systems.
B. Elgin
Re:Smaller software companies even worse ... (Score:3)
The consulting company I work for (and many others) makes a lot of money fixing problems created by some dork who was too stupid to realize you can't start your own software company.
Dork writes app with some puny but vital business purpose (If unethical dork, insert "on customer's time" here.), invariably in a lame-ass tool that really should only be used to handle smallish recipe files.
Dork manages to sell to one big client where an in-law works.
Dork makes major release and generates marketing pamphlets for distro at industry trade shows, which promise that the app will have user docs and a real-database port Real Soon Now.
Meanwhile, the app really hasn't progressed beyond "almost alpha" at the one big client who is paying for "integration services".
Hapless company (my future client!) correctly decides that buy is better than build, but incorrectly assumes that there must be a decent package out there for this. Someone at hapless company randomly stumbles upon dork's marketing pamphlet or web site, and buys in.
Hapless company is promised by dork that software is ready to go, and the "integration services" should only last a month or so.
Fast forward six to eighteen months (depending on client IQ)...
Client has no system, or worse, has a crippled system and turned off or stopped paying for the old one, has spent hundreds of thousands on dork hours alone, has no docs on how to install, operate, or fix, has no source to allow me to fix or diagnose it for them, has no database schema (and often a dork-encrypted/proprietary database), is paying more thousands for staff to babysit and undo the misdeeds of the app, has dork saying that he can't spend any more time with them (assuming his number is still listed), etc.
Basically, the only good news is that hapless company didn't bring Andersen Consulting in to do the app.
So, how would this have improved with Open Source?
Well, dork could have:
Had input on his app from client or consultant help,
Started a bit smaller before hapless company showed up,
Made continuous improvement to his app based on experiences at first big client,
Gotten paid the honest way, for development services, rather than for vaporware,
And eventually have built a user base big enough to handle the Hapless account smoothly.
Meanwhile (and, per RMS, more importantly), the client could have:
reviewed dork's code before betting the company on it,
brought in extra help that would really fix problems, not just clean up after them,
been assured that they could still improve the app after dork fled the country,
had a consultant provide necessary docs and schema,
and been part of a community of users that would work together to improve the app, whether dork was there to help or not.
Or, at least I've heard it could work this way...
Smaller software companies even worse ... (Score:3)
Bug != SecurityHole (Score:3)
The article clearly focuses on plugging security holes, which is just a subset of the vast debugging space out there. Sure, this may be the main concern of a sysadmin, but what about the 95% of us who do not admin for a living? Would security be our prime concern? Would an all-bugs comparison bring the same results?
Most bugs are just annoying, but some make you waste time, some lead to wrong results with varying consequences, and some lead to data loss. I have never seen an advisory or a mailing list dealing with this kind of bug. I *know* there must be some, but the point is that it's so much easier to be informed about security gaps. Isn't anybody paying attention to overall quality, or is this just a natural PR reaction to the known preference mainstream (even underground) media has for security holes, given their theft/trespass-inviting nature?
It's easy to understand one's motivations to code, but we just debug because we *have* to. So, if these smaller bugs are something software can live with (mainly software planned to last only for a certain period), what would be the motivations for real debugging?
I am not saying that an intense debugging effort means quality (maybe even the contrary is true), but if the only motivation to take corrective measures is pressure from consumers/clients who can have sensitive data compromised, then we will continue to use buggy software.
OTOH, when pride, reputation and commitment enter the scene, then we do our best to excel. So, my guess at the question in the first paragraph is that OSS can have a response time orders of magnitude shorter than commercial products if we consider bugs in general, but that, if true, would be something hard to prove.
-------------------------
Red Hat? (Score:3)
No offense, but I don't think Red Hat is a fair representative of Open Source software, at least in this test. The test is supposed to be open source versus closed source in terms of "turnaround" on bug fixes.
Linux/GNU, which is what Red Hat is pushing, is not coded by Red Hat. It is made by outside developers, and Red Hat only puts the product out. When Red Hat learns about a patch, they check it, package it, and then upload it.
The case goes like this: a bug is found in Linux/GNU and people are informed of it. Some guru fixes the bug and posts it to an ftp site. After a while Red Hat finds the fix, reviews it, packages it, documents it, puts it on its ftp site, releases it, and then announces that the fix is available.
These extra steps by Red Hat are going to cost the open source community a lot of time in a study like this. The clock should not run from when the bug is found to when RH announces the fix; it should run from when the bug is found to when there is a working fix or workaround anywhere, in any form (even if it is just source), on any ftp site.
Red Hat can only work with what the community gives them. Say it goes like this:
Day 1 - bug that opens a hole in program found in OSS
Day 2 - nothing
Day 3 - Maintainer and head programmer of XYZ announces a fix and uploads the source to ftp.
The problem is fixed; a patch is available to close the hole and fix the nasty bug in XYZ software.
Day 4 - nothing
Day 5 - Red Hat reviews the code
Day 6 - Red Hat tests the code
Day 7 - Red Hat packages the code
Day 8 - Red Hat documents the code and uploads it to ftp
Day 9 - Red Hat announces there is a fix available for Red Hat users who are using XYZ software.
Now, yes, I know this is REALLY dramatic, but I am trying to make a point. (BTW, I've got some REALLY good fishing stories.) Anyway, I haven't seen a bug in RH last more than 3-4 days at the very most (then again I don't use RH), but the time from a fix existing to RH announcing a fix can be, and is, drawn out. This could impact the study somewhat.
Nevertheless, I still know Red Hat is going to kick MS's ass at bug-fix turnarounds, but the point is that raw OSS could do it faster and better than RH ever could. The packages are a lot nicer with RH, though.
Again this is not a flame and I don't mean any disrespect to anyone.
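To make the measurement point concrete, here's a small sketch using the hypothetical Day 1 / Day 3 / Day 9 timeline above (the dates are invented for illustration; they are not from the article):

    from datetime import date

    # Hypothetical timeline matching the dramatized example above.
    bug_found       = date(2000, 1, 1)   # Day 1: hole discovered in XYZ
    upstream_fix    = date(2000, 1, 3)   # Day 3: maintainer posts patched source
    vendor_advisory = date(2000, 1, 9)   # Day 9: Red Hat announces packaged errata

    oss_response    = (upstream_fix - bug_found).days      # community's working fix
    vendor_response = (vendor_advisory - bug_found).days   # what the study measures

    print(f"Bug found -> working upstream fix:    {oss_response} days")
    print(f"Bug found -> vendor errata announced: {vendor_response} days")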
Cheating on bug fix times? (Score:4)
And I'll still wonder what's with the legalese every bulletin has about "no known people being affected" by the security bug.
zlxiss
Re:Why is this surprising? (Score:4)
2k is supposed to have some provisions for not allowing other random programs to overwrite DLLs in system/system32 (which would be nice) - every random Joe Blow app should *NOT* replace system-wide DLLs. Ever. Even MS Office (are you listening, chief of software architecture?).
Imagine installing BitchX or XAmp and having them overwrite parts of the QT libs, Xlibs, and why not, glibc... our versions *have* to be better, right?
Oh well...
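For what it's worth, here's a minimal sketch of the kind of check a well-behaved installer could make before dropping a shared library into a system-wide directory. The directory list and the helper name are hypothetical; this is not how any particular installer or Windows File Protection actually works.

    import os

    # Illustrative only: refuse to clobber libraries in system-wide directories.
    PROTECTED_DIRS = ["/lib", "/usr/lib", "/usr/X11R6/lib"]  # hypothetical list

    def safe_to_install(target_path):
        """Return False if writing target_path would overwrite a system-wide file."""
        target = os.path.abspath(target_path)
        for d in PROTECTED_DIRS:
            if target.startswith(d + os.sep) and os.path.exists(target):
                print(f"Refusing to overwrite system file: {target}")
                return False
        return True

    # A random app should ship its own copies privately instead:
    #   safe_to_install("/usr/lib/libgtk.so")        -> False if the file exists
    #   safe_to_install("/opt/bitchx/lib/libgtk.so") -> True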
Statistics tell all sorts of lies (Score:4)
Looking at their results, the time to fix 50% of the bugs is 4 days for Red Hat and 3 days for Microsoft.
After 1 day, Microsoft fixed 42% of their bugs. Red Hat only 29%.
I know I'll probably get moderated to hell for this, but the simple fact is that the "average" statistic tells you almost nothing on its own. What the results seem to be saying is that Microsoft is faster on simple bugs (probably better distribution channels) but falls behind on the more difficult bugs (probably more complex code, but who can tell without the source).
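To illustrate the mean-versus-median point, here's a toy example. The per-bug samples below are invented so that the means come out near the quoted 15.8 and 28.1 days per fix and the medians near 4 and 3 days; they are not the article's data:

    import statistics

    # Invented fix-time samples (days), chosen only to be roughly consistent
    # with the quoted averages/medians; NOT the article's raw data.
    redhat_days    = [1, 2, 3, 3, 4, 4, 7, 12, 27, 95]
    microsoft_days = [1, 1, 2, 2, 3, 3, 5, 8, 56, 200]

    for name, days in [("Red Hat", redhat_days), ("Microsoft", microsoft_days)]:
        print(f"{name}: mean = {statistics.mean(days):.1f} days, "
              f"median = {statistics.median(days):.1f} days")

    # A vendor that is fast on easy bugs (low median) can still have the higher
    # mean if a few hard bugs drag on for months -- the average alone hides this.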
John Wiltshire
It's more subtle than this (Score:4)
Important bugs in important software will be fixed just about as quickly in either system: the 5 key people who know the source behave more or less the same way in open- or closed-source situations. It's the vastly larger number that matters to most developers. And, as more and more developers realize this and enjoy working on open source more, it won't matter what the other guys think.
Re:Speed != Quality (Score:4)
Didja read the article? I know it was /.'ed -- I waited a longish time for it -- but it addressed the quality-of-fix issue pretty well.
BTW, while I don't know for sure whether you're right that many OSS projects don't regression-test such fixes first, I do know the ones I've worked on could stand some improvement...and also that it's a bit easier to regression-test a fix to a small component than a large one, and that OSS thrives on collections of small components in a way Closed Source $$$-making development doesn't (the latter favors the development of monoliths, since they represent a harder-to-reverse-engineer, and therefore steeper, wall for competitors to climb).
Also, the article mentioned various Microsoft-issued "fixes" that, themselves, had to be fixed. It didn't mention that happening with GCC, though it has happened there (not with security fixes AFAIK, but the same principle applies); still, the implication was that the most heavily funded closed-source development organization in the world doesn't seem to do too well producing correct fixes in the first place.
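As an aside, the kind of regression test that's cheap to add to a small component looks roughly like this. The function and the bug are hypothetical, just to show the shape of the thing:

    import unittest

    # Hypothetical small component: a helper that once crashed on empty input.
    # In a small component, the fix and its regression test are both a few
    # lines, so there's little excuse to skip the test.
    def parse_ports(spec):
        """Parse a comma-separated port list like '22,80,443' into ints."""
        if not spec.strip():        # the fix: an empty spec means no ports
            return []
        return [int(p) for p in spec.split(",")]

    class ParsePortsRegression(unittest.TestCase):
        def test_empty_spec_regression(self):
            # Guards against the old crash on empty input ever coming back.
            self.assertEqual(parse_ports(""), [])

        def test_normal_spec(self):
            self.assertEqual(parse_ports("22,80,443"), [22, 80, 443])

    if __name__ == "__main__":
        unittest.main()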
CmdrTaco is UNINTERESTED (Flaimbait -1) (Score:4)
Your comment says "interesting" while you still remain uninterested in your user's demands to Open the
Slow
They are a threat to free speech and must be silenced! - Andrea Chen
Re:Cheating on bug fix times? (Score:4)
Slashdot Readers: The UNIX Philosophy (Score:5)
I think I'd like to point Slashdot readers to a wonderful book: The UNIX Philosophy by Mike Gancarz. This book explains the tenets and values that traditional UNIX programmers have held. It goes on to list the 9 primary tenets:
1. Small is beautiful.
2. Make each program do one thing well.
3. Build a prototype as soon as possible.
4. Choose portability over efficiency.
5. Store data in flat text files.
6. Use software leverage to your advantage.
7. Use shell scripts to increase leverage and portability.
8. Avoid captive user interfaces.
9. Make every program a filter.
As you probably guessed, Open Source _pushes_ Tenet 6 to the forefront. Let others use your code!
Along with those primary, religiously followed tenets, the book lists 10 lesser tenets that are typically followed as well.
The book also mentions something very important: The Three Systems of Man. Software goes through the First System, the "innovative" cycle where one or only a few people develop something revolutionary; the Second System, where committees are formed around the software so more people can feel they're contributing something worthwhile to the idea; and the Third System, where the experts who left the scene during the second stage come back to implement the idea, now that the obvious solution is well known and has been walked many times.
CREDITS: This posting contains lots of quotations from, of course, the book: The UNIX Philosophy by Mike Gancarz, Copyright 1995 Butterworth-Heinemann. ISBN 1-55558-123-4 ... about $19.95. Well worth the money.
You people are ruthless! We're working on it! (Score:5)
People, calm down. We have our best perl coders here slaving over the Slash release. Patrick, Rob, and Pater are trying to convert their undocumented code and database schema into something that can be installed on machines other than this one. The Slash code really is hardcoded in many ways, and they are trying to unhardcode it for you right now. But they very much appreciate your flames, so please keep 'em coming. =)
Comment removed (Score:5)
Speed != Quality (Score:5)
As far as speed goes, big deal... give me a fix that works.
Re:Open Source!=Redhat (Score:5)
Red Hat has actually released several 2.2.14 RPMs in Raw Hide [redhat.com], our more experimental version. If you want to be on the bleeding edge, use that.
Also, check the source RPM for Red Hat's 2.2.13 kernel - it already contains a number of the fixes that later made it into the official 2.2.14 kernel.
We don't put out errata RPMs for every minor bug (misspelled man pages and such); this stuff gets fixed in Raw Hide and then makes it into the next release.
Errata RPMs are released only when they fix a MAJOR bug, such as a security problem (such as the bind update currently available) or a real functionality problem (such as the lynx update).
Releasing them for every minor problem, or every base version update, would be a bad idea because it would be very hard to keep track of everything. (And of course it would lead to "You need to update 1500 packages before Red Hat Linux works well" FUD from Microsoft and other people who don't care to check what an update does before writing flames).
Re:RedHat's response time (Score:5)
The thing that slows Red Hat errata down is called Quality Assurance. Bugfixed packages don't leave Red Hat without having run at least a couple of tests to verify that the fix actually works.
I'd rather delay a package for a day than having to release yet another security update for the same package the next day...
Bumblebees flying AND:Poor research (Score:5)
Airplane and ornithopter (and bird?) wings work on laminar airflow. Try 'too hard' to fly, and you get turbulence above the wing. In other words, a stall.
The bee has a different method of dealing with this. Rather than prevent turbulence, the bee wing uses turbulence, and has a mechanism for continually spinning the turbulent vortices off of the wing. In this flight mode, a given size wing has as much as 50X more effective lift than in laminar mode.
I'm not sure we can apply this to the whole Linux vs Microsoft thing, other than to say that a new modality changes the whole landscape. But I guess that's what Open Source is all about. In this case, we're the bee.