2014: The Year We Learned How Vulnerable Third-Party Code Libraries Are
jfruh writes: Heartbleed, Shellshock, POODLE — all high-profile vulnerabilities in widely used libraries that rocked the software industry in 2014. Sadly, experts are now beginning to believe that these aren't the only bugs lurking out there in widely used open source code, just the ones that grabbed the most attention. It's beginning to look like one of the foundational concepts of open source — that with enough eyes, all bugs are shallow — is a myth. Of course, probably no one believes that all bugs are instantly shallow, no matter how open the source is, or that open source software is immune from bugs — particularly not ESR, who coined the phrase.
But *are* there enough eyes? (Score:5, Insightful)
The phrase might be true, but we're seeing the effects of insufficient eyes. In reality, how many sets of eyes are actually reviewing these libraries at a source code level? I rather strongly suspect that in most cases they are simply used under the assumption that "well, everyone uses it, it must be okay".
Re:But *are* there enough eyes? (Score:5, Informative)
Amen.
I touched OpenSSL once. It was a nightmare. I had no idea where or how it passed anything, as it wasn't at all clear what path simple things, like certificate checking, were supposed to take.
In the end, I hacked on it rather than play with it. The documentation was non-existent. The code samples were incomplete, with almost zero explanation of what you were supposed to be checking for and where things COULD go wrong. Hence 90% of the code I see that touches OpenSSL looks exactly like the samples and nothing more.
All I wanted to do was have two x509 certificates, and check that both were valid and one properly signed the other, as part of a primitive DRM scheme I was toying with. It turned into a nightmare scenario of IMAGINING every possible outcome and specifically coding for each one, rather than anything sensible.
I don't think I'd ever touch it again, and was not at all surprised that there were problems with it. I was more surprised that others had had the same problems, yet OpenSSL was still regarded as the "gold standard" library to integrate with.
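For the record, something along these lines is roughly the shape of the check I was after, using OpenSSL's X509_STORE verification path. This is a minimal sketch, not the code I actually wrote: the function name is made up, error handling is stripped, and it assumes a reasonably modern OpenSSL.

```c
/* Sketch only: does `ca` properly sign `cert`, and is the chain valid?
 * Real code must check every return value and may also want to set
 * verification time, purposes, etc. */
#include <stdio.h>
#include <openssl/x509.h>
#include <openssl/x509_vfy.h>

static int cert_signed_by(X509 *cert, X509 *ca)
{
    int ok = 0;
    X509_STORE *store = X509_STORE_new();        /* holds the trusted root  */
    X509_STORE_CTX *ctx = X509_STORE_CTX_new();  /* one verification run    */

    if (store && ctx &&
        X509_STORE_add_cert(store, ca) == 1 &&
        X509_STORE_CTX_init(ctx, store, cert, NULL) == 1) {
        ok = (X509_verify_cert(ctx) == 1);       /* 1 = chain checks out    */
        if (!ok)
            fprintf(stderr, "verify failed: %s\n",
                    X509_verify_cert_error_string(X509_STORE_CTX_get_error(ctx)));
    }

    if (ctx)
        X509_STORE_CTX_free(ctx);
    if (store)
        X509_STORE_free(store);
    return ok;
}
```

Even getting that far meant discovering on my own that X509_verify_cert also checks validity dates along the way, and which of the many X509_V_ERR codes to expect; none of which the samples spell out.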
Re: (Score:2)
I am struggling with this. As far as I can tell, you are saying that you looked at a large code base *once*, found that it was complicated, and didn't understand it.
Is this not true with most code bases? I mean, I can look back at my own code from a while back, and it takes me quite a while to work out what it is doing. And long-lived code bases can, ironically, be particularly problematic.
The ultimate problem here is the funding problem. Free software is much more adaptable because you don't have t
Re:But *are* there enough eyes? (Score:5, Informative)
"once" = several weeks of fighting with the damn thing to do one simple task, clearly specified, that's way within it's scope.
There is no serious documentation. The examples are given as documentation and are vastly incomplete.
It's not a question of "glanced at it, it was horrible"; for anything serious, even a quick glance will show whether or not it's a nicely-produced library. One quick glance at a library is normally all I do to get a handle on whether it's good enough and clean enough for me to program against.
The problem with OpenSSL was that the only people who knew how it worked never bothered to simplify it or document it enough. There is no "is this certificate valid" function that returns an enum from a list of potential problems (CERT_EXPIRED, CERT_NOT_YET_VALID, CERT_CORRUPT, CERT_UNTRUSTED, CERT_INSECURE, etc.), for instance. There are lots of things that LOOK like that, but none actually do it in OpenSSL.
Given that it's a library whose primary purpose is - given a configuration of particular algorithms and keys - to produce an encrypted bitstream from an unencrypted one (or vice versa), it's surprisingly complex to do anything simple with any guarantee that you're doing it right.
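Purely to illustrate what I mean, the interface I kept expecting to find would look something like the following. To be clear, this is hypothetical; nothing of the sort exists in OpenSSL.

```c
/* Hypothetical API -- NOT part of OpenSSL.  This is the sort of
 * one-call, unambiguous-answer interface the library never grew. */
#include <openssl/x509.h>

typedef enum {
    CERT_OK,
    CERT_EXPIRED,
    CERT_NOT_YET_VALID,
    CERT_CORRUPT,
    CERT_UNTRUSTED,
    CERT_INSECURE          /* e.g. weak key or signature algorithm */
} cert_status;

/* Is `cert` valid right now, and properly signed by `trusted_ca`? */
cert_status check_certificate(const X509 *cert, const X509 *trusted_ca);
```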
Re: (Score:2)
The problem is that 'security' software is never as secure as promised. It's like physical keys: you can only make so many types of physical key, so you would think they would make locks with a larger sample size than the retailer has access to. But no: go to any Menards you want, bring the package from your door lock, and they will point you to the exact same paired key on the shelf. Throwing away your key package is inviting a thief to grab a fresh one at the local hardware store and ask if they can get matchi
Re: (Score:2)
I touched OpenSSL once. It was a nightmare. I had no idea where or how it passed anything, as it wasn't at all clear what path simple things, like certificate checking, were supposed to take.
In the end, I hacked on it rather than play with it. The documentation was non-existent. The code samples were incomplete, with almost zero explanation of what you were supposed to be checking for and where things COULD go wrong. Hence 90% of the code I see that touches OpenSSL looks exactly like the samples and nothing more.
LOL, I agree with you that the documentation sucks, especially when you have any need to go off the SSL_* rails... yet this is niche, free shit, and in that context it's well in line with a number of open source C APIs, or generally what I've come to expect.
There is a separate .c file in the apps folder for every command verb you can possibly type from the openssl CLI.
I've seen a number of cut and paste jobbers in various projects where authors obviously had taken no time to understand what they were doing yet this is hardly
Re: (Score:2)
Better to pay for an independent review of open source code than to hope you can trust a closed source vendor when they say they are secure.
Re: (Score:2)
In closed source you don't know anything at all about lurking bugs until someone accidentally encounters one.
Running a third-party library with closed source in a closed source application - well, you have many potential problems there that can fly under the radar for a long time.
Don't forget that it wasn't long ago that a considerable bug in Windows was found that had been around since Windows 95 [pcworld.com].
Re:But *are* there enough eyes? (Score:5, Interesting)
A long time ago, I saw Bertrand Meyer (the Eiffel guy) give a keynote at ICSE, where he pointed out that the "given enough eyeballs, all bugs are shallow" claim is unscientific, because it can't be falsified: if a bug is not found, people can always say that there were not enough eyeballs, so "Linus' Law" still holds.
Re: (Score:3)
And here is the context. Note that it is specifically referring to bugs that people notice... if the user-base doesn't notice th
Re: (Score:2)
The phrase might be true, but we're seeing the effects of insufficient eyes.
If there are insufficient eyes, then the truth of the phrase is moot.
Re:But *are* there enough eyes? (Score:5, Insightful)
This is just FUD. Whatever the number of eyes, they are certainly far more than with closed source. I have already contributed many bug reports and often fixes; would you care to elaborate on how I would do that in a closed source model? Because I am *very* curious.
FUD? Sure, there are *more* eyes on open source than closed source: that's not the point. Are there *enough* eyes to prevent potentially catastrophic bugs from being exploited? I'd submit that we're seeing there aren't. I'm not suggesting that closed source is superior, but let's not confuse the moral superiority attributed to open source with automatic technical superiority. In most cases, I'd agree that open source has technical superiority, but it's not automatic.
Re:But *are* there enough eyes? (Score:5, Interesting)
One of the issues is that there are indeed *more eyes*, but they are incentivised to look for exploits and sell them to the bug-buyers rather than report or fix them. I did a hands-up poll (buyer beware) at our local OWASP chapter and over half had sold a bug to such an organisation. Pretty shocking.
Certainly, one of the first moderately important bugs I found, I was daft and got in touch with the software vendor, and then faced legal action from them, which luckily they saw sense and dropped. So many people nowadays just can't be bothered with that hassle and can make a fast, low-risk buck by selling the 0-day.
Re: (Score:2)
There is a qualitative difference between "reviewing code in my spare time when I feel like it" and "I'm paid to look for security bugs 9-5".
Obviously you can argue about whether security is prioritized for any particular closed-source software vendor, but there IS a difference.
No, that is false (Score:4, Insightful)
An open source project can have as few as just one set. There are some projects that nobody other than the developer ever contributes to. Just one guy occasionally working on some little project that some people use. Though all those people COULD look at the code, they don't.
Likewise, commercial firms can, and sometimes do, pay many people to look at the code. In addition to having a big development staff they can have dedicated QA staff. They can have a person, or many people, whose job it is literally to sit and look over the code for security issues every single day.
You are buying into the same fallacy that the article is talking about: that because something is open, it means that more people MUST be looking at it. No, it means people can look at it, but they may choose not to.
Re: (Score:2)
I think the number of "commercial firms" actually doing proper QA and code review is minuscule. It's almost a margin of error in the stats. Sure, all the paperwork is there with the sigs etc. But I have never seen the actual process properly done by qualified folks at any of the big IT consulting firms. Heck, my small 25k-50k projects don't even get it right... where I am the PM.
Because the budgets, time constraints, & resources are never set up to do this correctly. With crunch time, c
Re: (Score:2)
You would submit a problem ticket. If enough people submit them, it becomes a priority for a paid developer to address the issue.
Re: (Score:3)
That entirely depends on the company and the seriousness of the defect.
You're even more likely to get ignored by everyone if reporting a defect on an open source project. Mostly they'll expect you to fix it yourself or wait until someone with the appropriate skills takes an interest. And of course usually almost no-one that experiences the bug has the skills and knowledge of the particular project's internals to be able to fix it.
Re: (Score:2)
Depends on the company. They can also disappear leaving you without support, decide to abandon the product as non-strategic, or ask you to upgrade when you don't need to.
Which FOSS project you adopt is equally important. A while ago I was looking for a simple FOSS file upload utility; I found one, installed it, read through the SourceForge site, and used it for a good year. Then when somebody was looking for a similar utility, I searched for it again and found a 5 year old CVE which allowed arbitrary
Re: (Score:2)
I don't think there's any real disagreement here, other than what the thread should be about.
The poster didn't say that people couldn't find/post/send patches for bugs. Only that there weren't enough people doing so.
How many would be "enough"? I suppose enough so that exploits didn't happen before the maintainers were aware of them. Clearly, then, we don't have "enough" eyeballs of the right sort. But we have more than if the libraries were closed-source.
Re: (Score:3)
Usually with closed source applications, you send in the bug report and it appears to vanish from your end. There is no feedback from the bug tracking team. There is no update on whether the issue is pending more data (which you could supply if they asked. Clearly the bug was severe enough to warrant a report, so clearly you must run into it fairly frequently -- but no -- no access to the bug tracker, so you don't see the comments about "Can't reproduce! Closing!" getting thrown about in there) or even if the b
Re: But *are* there enough eyes? (Score:2)
Yeah, compare that with the "go fix it yourself" response, or the seven year, 200-'me too' obstinate won't-fix-because-that-would-require-admitting-that-my-design-was-broken nightmare.
Re: (Score:3)
Compare that with FOSS bug trackers, and it is night and day.
But it isn't much different with FOSS. Often the bug can just sit there without anyone responding to it. Or maybe someone asks for more details, but after I provide them, it's just crickets.
Re: (Score:2, Informative)
There is one workaround for this. If it's a security bug, public disclosure with a working demonstration/exploit on the right boards will get their attention awful damned fast, and you'll know when a fix is posted.
Funny part is, usually it still takes (relatively) longer for the proprietary shop to come up with a workaround/fix than it does a given OSS community.
Re: (Score:2)
>(which has a max size text field, so keep it contrite!)
Keep it concise (giving a lot of information clearly and in a few words; brief but comprehensive), not contrite (feeling or expressing remorse or penitence; affected by guilt).
Though I suppose if I'm struggling to explain a complex bug in 140 characters or less I might be feeling contrite about the quality of my report. More likely, though, I'm just annoyed at the idiots that made it virtually impossible for me to file an adequate bug report.
Signed, Every Developer Ever (Score:2)
No, you should feel contrite that you are daring to report a bug in software that is obviously perfect. There are two classes of software error: hardware error, and user error. In the first case, you shouldn't have bought that in the first place, so it reduces to the second case. Our QA process will take advantage of this breakthrough, but the documentation will not be updated.
Re: (Score:2)
As a long-time developer, I must disagree. There are of course excessively vocal assholes in the field whose fragile ego will brook no argument, but you'll never get to be *good* with that attitude, no matter how many high-paying jobs it lands you working for similarly egotistical assholes whose self-image requires conflating egotism with competence.
Poe's Law (Score:2)
I'm sorry, one or both of us (depending on how that error is scored) has fallen victim to Poe's Law. If high-paying jobs are a result of egotism I may have to try it some day. For the present I suppose I'll have to send off for another batch of <sarcasm> tags.
Re: (Score:3)
The only positive there for open source is the visibility, with the defect report as with the source. It's certainly not any more likely to be dealt with in a timely way, or at all, with open source. People being paid is the best way to get uninteresting bugs fixed in a timely way, and that happens a lot more often with commercial software.
Libraries? (Score:4, Interesting)
Shellshock did not affect a "library", but an executable.
So if I'm understanding this correctly (Score:4)
the big news is that people are now thinking that bugs in software are big news.
Magic ball prediction - 2015 (Score:5, Informative)
Re: (Score:2)
...My magic 8 ball tells me that in 2015 we will learn that proprietary and embedded software is even more vulnerable. ...
IMO, that's already been proven by Microsoft over the years...
Re: (Score:2)
A lot of vehicles are now looking at common operating systems like Android, Windows, iOS etc. just because they are common on the market.
Just be aware that the proprietary systems they are leaving are a lot worse when it comes to bugs, so sticking with a proprietary system is not an option. However, some people in the vehicle industry still don't have very good knowledge of how to segment networks.
Re: (Score:2)
Personally, I prefer my cars air-gapped. I place negative value on infotainment systems and any car functionality that is not directly related to driving. Unfortunately, I am in the minority.
Re:Magic ball prediction - 2015 (Score:4, Funny)
requires physical access to the car.
Good thing my car stays in a locked server rack in a room with a security guard posted at the door requiring a finger print and an RFID card to access.
Re: (Score:3)
what...no faraday cage?
um what? (Score:2)
I don't know anyone that ever thought "open source" was bug free. The point is that people can more easily find and fix bugs with open source. With closed source, there could be some obvious and dangerous mistakes in the code, but no one except those with access to the source will know they exist. It's then up to whoever owns the source to decide if it's profitable enough to fix. The problem with that system is that there are people with access to the source... People come and go from every company on earth ever
Re:um what? (Score:4, Insightful)
Not only can they fix them, but they do fix them. As soon as these vulnerabilities were discovered (by non-malicious actors) patches were available within days or even hours. Commercial vendors take their good sweet time. Or in the case of Microsoft, discontinue support for the still widely-used Windows XP. Find a vulnerability in that? Too damn bad. It'll never get fixed.
Nobody's saying open source software is bug-free or even necessarily has fewer bugs than closed source software. It's what happens when a bug is discovered that makes the difference.
Re: (Score:3)
Hey now, I'm all for closed-source bashing, but let's be fair: Windows XP is over 13 years old. Yes, the same would be true if you found a bug in one of its contemporaries such as Red Hat Linux 6.1, but it's much more likely you'd upgrade to a newer version of the software where the bug has (hopefully) been fixed.
Re: (Score:2)
Or in the case of Microsoft, discontinue support for the still widely-used Windows XP. Find a vulnerability in that? Too damn bad. It'll never get fixed.
Like when Ubuntu Server 13.04 didn't get a fix for Heartbleed because they discontinued support after 1 year despite the criticality of the bug and the servers seeing considerable use? All the official replies were "it's your own fault" and "change distro version immediately". Which you often can't do quickly. No users really expected 12.04, 12.10, 13.10 and 14.04 to get the fix while 13.04 in the middle was left out - except people who read the really, really fine print and took it seriously. Shipping the
Re: (Score:2)
Dick move, for sure, but at least it's able to be patched by the administrators themselves. With a closed source product you'd be SOL.
Say what? (Score:2, Insightful)
I don't know anyone that ever thought "Open source" was bug free.
Every FOSS fanatic on Slashdot for the last 17 years has implied that bugs would be found and fixed FAST - not linger for years.
The point is that people can more easilly find and fix bugs with open source.
They could but do they? Nope.
Re: (Score:2)
I don't know anyone that ever thought "Open source" was bug free.
Everyone who didn't think about it thought exactly that. Which means a very large number.
Re: (Score:2)
I don't know anyone that ever thought "Open source" was bug free.
At least back in the day one of the main motivations to move to open source was that it provided a less buggy experience than the proprietary option.
overall framework vulnerability (Score:2)
Whether it's OpenSSL or Windows APIs, hackers are looking at every possible vector to attack systems. To be honest with ourselves, the software engineering community has to realize that security must be given the same priority as any other code quality metric. While we may not be able to test for every possible vector, there should be a standard set of vulnerability tests that every organization should be able to run before releasing code. Likewise, regression tests need to be exercised prior to any su
Shallow, not psychic (Score:2)
In the cases that were software defects, the defect was rapidly fixed upon discovery. That's really the meaning of all bugs being shallow, not that they won't ever exist.
That said, POODLE is not a code defect, but a defect in the standard (well, except the bit where an implementation skips validation of the pad). Shellshock was indeed a grave defect, but I think the correct takeaway there is to avoid having a shell language in the path where untrusted data could be injected as much as possible (as well
Exactly. Shallow doesn't mean nonexistent (Score:2)
Exactly. Saying bugs are shallow doesn't mean they don't exist. Shallow vs deep refers to how much effort it takes to characterize and fix - is it a "hard" bug or an "easy" one. Wikipedia explains it well:
As the Heartbleed bug shows, even shallow bugs[7] may persist in important pieces of open software—it took two years for the bug to be discovered, and the OpenSSL library containing it is used by millions of servers. Raymond said that in the case of Heartbleed, "there weren't any eyeballs".[8]
When
The sad truth (Score:2)
The quality of open source software is kind of crusty these days. No matter how open it is, making stuff work properly should be priority number one.
For example, try to adjust the display brightness of a laptop under Mint or Ubuntu. It jumps in multiple steps because there are multiple listeners for the adjustment event. Even basic stuff like this does not work properly.
Security (Score:2)
Open-source does not make code automatically bug-free. No more than using a safe-malloc-library does, or deploying DEP on your executable, or ASLR, or coding only in a language that's considered "secure".
What it does is allow certain types of security problems to POTENTIALLY be spotted. It's a +1 on the score, not a game winner. And it doesn't mean that proprietary is -1, either. It just means that you are so confident in the quality of your code that you can show people by opening it up.
What gets me
Statistics and Damnable Lies (Score:2)
Has anyone noticed that there are astronomically more OSS users now? The number of OSS users is also growing at an exponential pace.
What we should expect with those stats is that there should be more cracks and bugs in OSS due to the higher percentage of people programming/using it.
Also, as the value of OSS to the market increases and more information is handled by OSS, there is more incentive for old vested interests to search for the downside as a form of marketing. We never heard about all those MS
What utter bullshit (Score:2)
2014: The Year We Learned How Vulnerable Third-Party Code Libraries Are
Really? Like we did not know before?
I don't think anyone in the industry who is both sane and honest ever pretended that FOSS was bug-free.
We know that software, ALL software, contains bugs.
Also, plenty of projects don't have too many contributors, so the "many eyes" principle hardly applies.
But if you've got the source, at least you can have a look (and really should, if you are considering using something for a mission-critical application).
Then fix, if required, and contribute back.
Certainly, vulnerabilities
Re: (Score:2)
I don't think anyone in the industry who is both sane and honest ever pretended that FOSS was bug-free.
Why not? We should strive precisely for FOSS being the gold standard for bug-free software. The OpenBSD team is doing pretty good work in this area.
Open or closed, same problems (Score:2)
From the perspective of most IT customers, bugs are bugs regardless of closed or open source. They still rely on other people to find them, patch them and release changes.
Companies who rely on open source libraries may or may not have the ability or spare resources to go digging through the code of a library, finding a security issue, writing a patch for it, recompiling the library, then using that patched copy in production. Companies in the 'service provider' realm may be able to do this, simply because t
What's the big deal? (Score:2)
So there were a few high-profile security flaws found in important open source software recently. So what? People are talking almost as though this somehow proved that open source is not superior or maybe even inferior to closed source software.
It isn't like there has never been a high-profile security problem in important closed source software. Nor is it likely there will not be others in the future.
Here is an awfully safe prediction... in the future there will be more high-profile security bugs found in
Re: (Score:2)
"Certainly the age of the code that caused these bugs is reason for concern."
But... if those bugs were so obvious and easy to exploit, then why didn't "the world end" a long time ago? I'm pretty sure there are an awful lot of important systems out there that have Bash on them!
Even the hills have eyes (Score:2)
I have never been a believer that open source code is automatically more secure. Different projects have different code quality. It depends on if/how they are managed and who is willing to step up and contribute their time and effort.
There have been some advantages unique to open source projects, such as static analysis vendors developing, testing and marketing their wares, blessing a huge swath of open source land with the fruits of their labor, as well as the ability for savvy users to evaluate willingness to use
Computer Science still newb (Score:2)
Computer Science is still a newbie discipline. Much more relevantly, the problems introduced by the sudden social change of what a network is are a pretty big deal.
Here's how you know it's crazy: look at the hacker hysteria, and how it has barely gotten any better. The vast majority of "hackers" who cracked stuff back in the day were treated entirely ludicrously, like some kind of wizard. Everyone here probably remembers indefinite detention and ludicrous punishments such as "can't use a computer", whic
Security Bugs are Different (Score:2)
When everybody has the same goal, as is pretty much the case for usability issues, the shallowness of bugs posited by the many eyes hypothesis would be a good thing. When it comes to security issues, it sets up a race between the white hats and the black hats, and there is more incentive for the black hats (collectively, the rest of us have as much incentive as do the black hats, but that is not the case individually - for one thing, an attacker satisfies his goal by finding just one vulnerability.)
Not a myth (Score:2)
Which Linux user actually got hacked by a library vulnerability this year? Speak up now. Oh, hmm, the sound of silence. Certainly not me, and not anyone I know of.
The thing is, sometimes the many eyes just aren't pointed in the right direction. A publicly disclosed vulnerability changes that instantly: hundreds or thousands of expert eyes get to work, fixes happen fast, and the community learns from the incident, often resulting in the eradication of a whole class of risks.
Re:not just many eyes (Score:5, Interesting)
The security of the open source model isn't really the problem or the answer here. The problem is homogeneity. A million different sites and applications rely on just a few libraries, so that when a bug hits one, it has massive impact on the entire internet.
We also know that the answer isn't in rolling your own security. Very few people or organizations are likely to be able to securely implement their own version of TLS. Even the best packages of today didn't start out perfect, they had to iterate through several flaws to get to where they are today.
So perhaps the better answer is in having more packages to choose from? Instead of picking just openssl by default, it would be better to have a broad array of choices. With a dozen packages on the market, that might mean 11 times out of 12 the bad guys wouldn't exploit our site. If the packages are interchangeable, we'd be better positioned to switch them quickly in case of emergency.
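To make "switch them quickly in case of emergency" concrete: you'd want applications written against a thin, backend-agnostic shim, so the TLS implementation becomes a deployment decision rather than a code dependency. A rough sketch of the idea (hypothetical interface, not an existing library):

```c
/* Hypothetical TLS shim -- not an existing API.  Applications call
 * through this table; the fields are filled in by whichever backend
 * (OpenSSL, GnuTLS, NSS, ...) is selected at build or run time. */
#include <stddef.h>

typedef struct tls_backend {
    const char *name;                                   /* "openssl", "gnutls", ... */
    void *(*client_connect)(const char *host, int port);
    int   (*read) (void *session, void *buf, size_t len);
    int   (*write)(void *session, const void *buf, size_t len);
    void  (*close)(void *session);
} tls_backend;

/* Chosen from configuration at startup, so an emergency switch is a
 * config change and a restart, not a code audit of every caller. */
extern const tls_backend *tls_active_backend;
```

The catch, as replies below point out, is that a dozen half-audited backends may just mean a dozen sets of bugs.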
Re: not just many eyes (Score:4, Informative)
Re: not just many eyes (Score:4, Interesting)
So all we need are 11 more sets of programmers to program free versions of SSL 2 through 12?
Yes, and demand for them. But the big problem you're correctly implying is there's no economic justification that will drive this behavior. Maybe it will take a dozen big companies and foundations to drive this. Imagine if IBM, Microsoft, Google, RedHat, Yahoo, HP, Dell, Apache, Wikimedia, Mozilla, FSF, Apple, Intel, AMD, nVidia, Bungiesoft, and others each contributed their own versions of openSSL; each written in their own choice of language, using their own code, and building their own implementations of everything from the crypto through the command line interpreter logic. My company may decide we do more business with Intel, so we choose theirs. Or your company may be more Apple focused, so you'd choose theirs. In every case, we'd all nervously watch each other looking for signs of intrusions, hoping we won't be the victims, but knowing that alternatives exist if we are.
While a 1/12th scale incident of Heartbleed is still a huge problem for a lot of companies, it's no longer the catastrophe-sized disaster that Heartbleed actually was.
Re: not just many eyes (Score:4, Insightful)
It's not that just "being open source" automatically means code is being validated by lots of eyes. It means that you can look at the code. All we need is more people interested in doing that, or paid to do so. They also need to have the knowledge/skill necessary to do that.
And as always, being closed source would not have made the issues easier to find. And then you'd be at their mercy waiting for a fix. These were all found and all fixed relatively quickly, so let's focus on that.
SSL certainly isn't a simple library. Increased complexity makes it easier to make a mistake and harder to find it.
Re: not just many eyes (Score:4, Insightful)
Microsoft, Google, RedHat, Yahoo, HP, Dell, Apache, Wikimedia, Mozilla, FSF, Apple, Intel, AMD, nVidia, Bungiesoft, and others each contributed their own versions of openSSL; each written in their own choice of language, using their own code, and building their own implementations of everything from the crypto through the command line interpreter logic.
...and then, due to the nature of software today, you wind up with all of these in your organization, and now if any of them are compromised you have a problem. I'd rather engineers from all of those organizations worked on OpenSSL (or whatever) to make it as secure as possible, perhaps forking it occasionally to solve some particular problem and then merging the code later.
Mo Code, Mo Problems (Score:5, Insightful)
When you write code, you are going to screw up. If you aren't writing bugs that people notice, you aren't working on anything worthwhile. While the bugs that were found were costly and dangerous, the question is: were these found more quickly than in a closed source solution? Were they fixed faster than in a closed source solution? Is there anything that can be done to allow quicker rollback or disabling of vulnerable features? When you write code, you need to design for failure, because it will happen, and plan so that recovery will be as quick as possible.
Adding additional software library offerings will only add stability in the sense that one particular vector won't affect as much of the Internet, but you introduce more surface area for attackers to poke at, and more vulnerabilities overall. Given the challenges of writing really solid code, I think I'd rather have fewer, but really well vetted, open source software solutions. Of course, I may not be correct in this opinion, as there are no 'right' decisions here.
Re:not just many eyes (Score:4, Insightful)
So "Security through complexity" then?
Sounds complex.
Re: (Score:2)
Cool story bro.
Re: (Score:3)
Unfortunately, assuming the number of developers is constant, with 12x as many libraries every one of them is virtually guaranteed to be far less secure than if all the developers were working on improving the same library.
So the root of the issue is that we need to dedicate more total resources to development, auditing, etc. the fundamental security libraries. Whether the best outcomes would then be seen by focussing all those resources on a single library, or spreading them across multiple competitors, i
Re: (Score:3)
Actually, the number of developers (and effective developer effort) that would work on such a library is not constant. Once you get too many developers on a single project, a larger effort is required just for code/development syncing. The other issue, especially with open source developers, is that not all developers are interested in working in the same development models, among other things. Which means single projects tend to discourage some potential developers from joining in. There also i
Re: (Score:3)
First off, let's be clear - I'm not arguing against alternatives, there are many good reasons to have them. But when the single major widely-used library is already "under-funded", starting a dozen more similar projects all competing for basically the same developer base is unlikely to improve things.
As for attracting OSS developers, I'm unconvinced. Sure, for "sexy" projects many will prefer a smaller project where their contributions are likely to be more appreciated. On the other hand, for a utility l
Re: (Score:2)
it would be better to have a broad array of choices. With a dozen packages on the market, that might mean 11 times out of 12 the bad guys wouldn't exploit our site
1) Staffing and maintaining a single project is no small feat in itself.
2) The smaller the piece of the puzzle that you are trying to replace, the more likely you will be driven towards the same solution.
3) Does even the IT pro have the time and resources to research and test a dozen choices for each of the one hundred, three hundred, or more third party libraries they may need?
Re: (Score:3)
OpenSSL, Netscape NSS, GnuTLS, as well as various proprietary SSL implementations have all had serious bugs over the last few years. Some of the bugs affected multiple unrelated code bases.
How much diversity do you want? How did the diversity help?
Re:not just many eyes (Score:5, Insightful)
>We also know that the answer isn't in rolling your own security. Very few people or organizations are likely to be able to securely implement their own version of TLS.
TLS is the issue. It isn't simple. It isn't even secure in many compliant configurations. It invites implementation errors.
A good spec of a secure protocol would make secure implementations easy. If you don't think about the implementability while you're writing a spec, you're doomed, like TLS is.
"Don't roll your own security" is advice aimed at people who don't know about security. Some of us have to implement and 'roll' the specs. The world looks different when your reputation is tied to your stuff not get broken before senility sets in. You can do it right, but you need all the elements in place including a well thought out spec.
Re: (Score:3)
TLS is the issue. It isn't simple. It isn't even secure in many compliant configurations. It invites implementation errors.
A good spec of a secure protocol would make secure implementations easy. If you don't think about the implementability while you're writing a spec, you're doomed, like TLS is.
Well said! You deserve +5, Insightful. Get on it, mods!
Re: (Score:3)
I think reuse is far more secure than multiple options, in the long run. Reuse will find more bugs and raise awareness of them. Reuse means a bug has a much larger impact, but that just ups the priority of getting it fixed. Multiple options will only decrease the impact, but will open the door to duplicate bugs, finding fewer of them, more entropy from fragmentation, and lower priority & longer time for getting them fixed. Multiple options will be less secure... you just won't think they are.
Re: (Score:2)
Very few people or organizations are likely to be able to securely implement their own version of TLS.
Security and Complexity are enemies. Call me paranoid, but given recent revelations, I suspect TLS was purposefully made complex--so that any implementation was likely to have bugs that could be found and exploited.
Secure communication shouldn't be, and doesn't have to be, so mind-bendingly complex. It should be relatively easy to implement secure communication libraries. It should be fall-off-a-log easy to use said libraries, so that software developers can get client/client apps, client/server apps, an
Re: (Score:2)
One difference is that if you use OpenSSL and someone finds a huge bug, then it gets fixed. If you've got the proprietary library and it has bugs, you will never be told about them, and if there are patches you'll have to wait many months to get them (or worse, they'll require you to pay for an upgrade; you won't ever be able to backport fixes). So yes, you may have a dozen solutions, but only 1 out of 12 probably lets you learn about severe vulnerabilities in a timely manner while also letting you supply yo
Make that THREE other things (Score:5, Interesting)
3) Port to multiple architectures (and OSes) to catch bugs not reported by the original build environment. This is one of the approaches OpenBSD uses to improve security, and it was quite common in the open source software world when ESR coined the phrase.
The OpenBSD team found one very long-lasting (30+ years) bug in the legacy BSD code when the Sparc64 build barfed.
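For anyone who hasn't chased one of these: a minimal, made-up example of the kind of latent bug that only a new port exposes. It "works" on a 32-bit build for decades and falls over the first time someone compiles it for a 64-bit target like sparc64 or amd64:

```c
#include <stdio.h>

/* Classic portability bug: stuffing a pointer into an int.  On ILP32
 * systems the round-trip is lossless; on LP64 systems the upper bits
 * of the address are silently truncated. */
static int as_handle(void *p)
{
    return (int)(long)p;                 /* loses bits on 64-bit builds */
}

int main(void)
{
    char buf[16];
    void *back = (void *)(long)as_handle(buf);
    printf("%s\n", back == buf ? "looks fine on this platform"
                               : "pointer corrupted by truncation");
    return 0;
}
```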
Re: (Score:3)
That's a double-edged sword, as shown by the clusterfuck that is OpenSSL. When you start supporting many architectures, the strange hacks you need to make things work can be the ones that introduce the security risk.
Re: (Score:2)
You are wrong; each and every person connected to the internet is an open source user. Open source is first and foremost what runs the planet: your email, for example, isn't routed around the globe by Microsoft Exchange servers, nor is global DNS done by Active Directory servers.
The hundreds of thousands of bug reports submitted and successfully used to patch open source prove you are blathering about a process you don't understand.
You also are a moron with no understanding of how your computing world works.
Re: (Score:2)
Wrong, those devices are connected to the internet where open source rules.
I'm older than IBM mainframes, boy. You have not been in the field longer than me.
Re: (Score:2)
Your manner of writing and your choice of idioms and colloquialisms reveal you to be about half my age; those from my era have a different "look and feel" as it were. Refine your abilities as a poseur, so they will shine in the darkness like a luminescent swamp gas.
Re: (Score:2)
Do provide links to your identity, so we may marvel at your accomplishments and technical acumen.
Re: (Score:2)
The only thing you have blown is any shred of credibility
Re: (Score:2)
I thought it was the other way around.
2014 was the year that Google's Project Zero and similar efforts from other companies like Redhat started proactively taking a deep look at some of these crusty old open source libraries and finding some real doozies.
A lot of the big vulnerabilities last year were found and reported by these company security teams. 2015 will have more of these issues co
Re: (Score:2)
Preach it brother! The whole world should be using BSD!