
2014: The Year We Learned How Vulnerable Third-Party Code Libraries Are

jfruh writes: Heartbleed, Shellshock, and Poodle were all high-profile vulnerabilities in widely used libraries that rocked the software industry in 2014. Sadly, experts are now beginning to believe that these aren't the only bugs lurking out there in widely used open source code, just the ones that grabbed the most attention. It's beginning to look like one of the foundational concepts of open source, that with enough eyes all bugs are shallow, is a myth. Of course, probably no one believes that all bugs are instantly shallow no matter how open the source is, or that open source software is immune to bugs -- particularly ESR, coiner of the phrase.
  • by tj2 ( 54604 ) on Friday January 02, 2015 @11:12AM (#48717455)

    The phrase might be true, but we're seeing the effects of insufficient eyes. In reality, how many sets of eyes are actually reviewing these libraries at a source code level? I rather strongly suspect that in most cases they are simply used under the assumption that "well, everyone uses it, it must be okay".

    • by TheRaven64 ( 641858 ) on Friday January 02, 2015 @11:19AM (#48717545) Journal
      Exactly. The problem with OpenSSL was that everyone used it, but no one looked inside it. Those of us who did look inside recoiled in horror and didn't do proper code review, because 'kill it with fire' was the immediate reaction of anyone looking at the code. Open source isn't magic. People who build their business on a third-party library have a lot of choice in terms of who provides the support (bug fixing, code auditing, and so on) for that library. Far more than with a proprietary library, where you're locked into a single source. If everyone chooses to get support for the product using the power of wishful thinking, then don't be surprised if the code is crap.
      • by ledow ( 319597 ) on Friday January 02, 2015 @11:37AM (#48717687) Homepage

        Amen.

        I touched OpenSSL once. It was a nightmare. I had no idea where or how it passed anything, because the path that simple things, like certificate checking, were supposed to take was not at all clear.

        In the end, I hacked onto it rather than play with it. The documentation was non-existent. The code samples were incomplete, with almost zero explanation of what you were supposed to be checking for and where things COULD go wrong. Hence 90% of the code I see that touches OpenSSL looks exactly like the samples and nothing more.

        All I wanted to do was have two x509 certificates, and check that both were valid and one properly signed the other, as part of a primitive DRM scheme I was toying with. It turned into a nightmare scenario of IMAGINING every possible outcome and specifically coding for each one, rather than anything sensible.

        I don't think I'd ever touch it again, and was not at all surprised that there were problems with it. I was more surprised that others had had the same problems, yet OpenSSL was still regarded as the "gold standard" library to integrate with.

        • I am struggling with this. As far as I can tell, you are saying you looked at a large code base *once*, found that it was complicated, and didn't understand it.

          Is this not true with most code bases? I mean, I can look back at my own code from a while back, and it takes me quite a while to work out what it is doing. And long-lived code bases can, ironically, be particularly problematic.

          The ultimate problem here is the funding problem. Free software is much more adaptable because you don't have t

          • by ledow ( 319597 ) on Friday January 02, 2015 @12:20PM (#48718119) Homepage

            "once" = several weeks of fighting with the damn thing to do one simple task, clearly specified, that's way within it's scope.

            There is no serious documentation. The examples are given as documentation and are vastly incomplete.

            It's not a question of "glanced at it, it was horrible"; for anything serious, even a quick glance will show whether or not it's a nicely produced library. One quick glance at a library is normally all I do in order to get a handle on whether it's good enough and clean enough for me to program against.

            The problem with OpenSSL was that the only people who knew how it worked never bothered to simplify it or document it enough. There is no "is this certificate valid" function that returns an enum from a list of potential problems (CERT_EXPIRED, CERT_NOT_YET_VALID, CERT_CORRUPT, CERT_UNTRUSTED, CERT_INSECURE, etc.), for instance. There are lots of things that LOOK like that, but none actually do it in OpenSSL.

            Given that it's a library whose primary purpose is - given a configuration of particular algorithms and keys - to produce an encrypted bitstream from an unencrypted one (or vice versa), it's surprisingly complex to do anything simple with any guarantee that you're doing it right.
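
            A minimal sketch of roughly the check described above, assuming OpenSSL's X509_STORE / X509_verify_cert API (the file names issuer.pem and leaf.pem are made up, and error handling is trimmed): trust one certificate, verify the other against it, and report failures via the X509_V_ERR_* codes, which play more or less the role of the enum wished for here.

              /* Sketch only: verify that leaf.pem is currently valid and was signed by
               * issuer.pem, using OpenSSL's built-in verifier. Assumes issuer.pem is
               * effectively a self-signed root; a longer chain would need each
               * intermediate added to the store. Pre-1.1.0 OpenSSL may also need
               * explicit library initialization calls. */
              #include <stdio.h>
              #include <openssl/pem.h>
              #include <openssl/x509.h>
              #include <openssl/x509_vfy.h>

              static X509 *load_cert(const char *path)
              {
                  FILE *fp = fopen(path, "r");
                  if (!fp) return NULL;
                  X509 *cert = PEM_read_X509(fp, NULL, NULL, NULL);  /* PEM-encoded cert */
                  fclose(fp);
                  return cert;
              }

              int main(void)
              {
                  X509 *issuer = load_cert("issuer.pem");   /* the cert that should have signed... */
                  X509 *leaf   = load_cert("leaf.pem");     /* ...this one */
                  if (!issuer || !leaf) {
                      fprintf(stderr, "could not load certificates\n");
                      return 1;
                  }

                  X509_STORE *store = X509_STORE_new();     /* trust anchors: just the issuer */
                  X509_STORE_add_cert(store, issuer);

                  X509_STORE_CTX *ctx = X509_STORE_CTX_new();
                  X509_STORE_CTX_init(ctx, store, leaf, NULL);

                  if (X509_verify_cert(ctx) == 1) {         /* checks signature, validity dates, ... */
                      printf("leaf.pem is valid and signed by issuer.pem\n");
                  } else {
                      /* err is one of the X509_V_ERR_* codes, e.g.
                       * X509_V_ERR_CERT_HAS_EXPIRED, X509_V_ERR_CERT_NOT_YET_VALID,
                       * X509_V_ERR_CERT_SIGNATURE_FAILURE. */
                      int err = X509_STORE_CTX_get_error(ctx);
                      printf("verification failed: %s\n", X509_verify_cert_error_string(err));
                  }

                  X509_STORE_CTX_free(ctx);
                  X509_STORE_free(store);
                  X509_free(leaf);
                  X509_free(issuer);
                  return 0;
              }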

            • by kesuki ( 321456 )

              The problem is 'security' software is never as secure as promised. It's like physical keys: you can only make so many types of physical keys, so you would think they would make locks with a larger sample size than the retailer has access to. But no: go to any Menards you want, bring the package from your door lock, and they will point you to the exact same paired key on the shelf. Throwing away your key package is inviting a thief to grab a fresh one at the local hardware store and ask if they can get matchi

        • by Ichijo ( 607641 )
          "There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies." --C. A. R. Hoare
        • I touched OpenSSL once. It was a nightmare. I had no idea where or how it passed anything, because the path that simple things, like certificate checking, were supposed to take was not at all clear.

          In the end, I hacked onto it rather than play with it. The documentation was non-existent. The code samples were incomplete, with almost zero explanation of what you were supposed to be checking for and where things COULD go wrong. Hence 90% of the code I see that touches OpenSSL looks exactly like the samples and nothing more.

          LOL, I agree with you, the documentation sucks, especially when you have any need to go off the SSL_* rails... yet this is niche, free shit, and in that context well in line with a number of open source C APIs, or generally what I've come to expect.

          There is a separate .c file in the apps folder for every command verb you can possibly type from the openssl CLI.

          I've seen a number of cut and paste jobbers in various projects where authors obviously had taken no time to understand what they were doing yet this is hardly

      • by AmiMoJo ( 196126 ) *

        Better to pay for an independent review of open source code than to hope you can trust a closed source vendor when they say they are secure.

    • by Z00L00K ( 682162 )

      In closed source you don't know anything at all about bugs lurking until someone accidentally encounters one.

      Running a closed source third-party library inside a closed source application - well, you have many potential problems there that can go unnoticed for a long time.

      Don't forget that it wasn't long ago that a considerable bug in Windows was found that had been around since Windows 95 [pcworld.com].

    • by antientropic ( 447787 ) on Friday January 02, 2015 @11:37AM (#48717685)

      A long time ago, I saw Bertrand Meyer (the Eiffel guy) give a keynote at ICSE, where he pointed out that the "given enough eyeballs, all bugs are shallow" claim is unscientific, because it can't be falsified: if a bug is not found, people can always say that there were not enough eyeballs, so "Linus' Law" still holds.

    • Yeah, that's what I came here to say. I re-read The Cathedral and the Bazaar [unterstein.net] recently, and it's talking more about development methodology than about making perfect software. Here is ESR's more formal statement explaining what he meant: "Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone."

      And here is the context. Note that specifically it is referring to bugs that people notice....if the user-base doesn't notice th
    • The phrase might be true, but we're seeing the effects of insufficient eyes.

      If there are insufficient eyes, then the truth of the phrase is moot.

  • Libraries? (Score:4, Interesting)

    by fnj ( 64210 ) on Friday January 02, 2015 @11:14AM (#48717485)

    Shellshock did not affect a "library", but an executable.

    • It depends a bit on your definition of a library. In the broadest sense, it's just a body of code that other things use. Programs that communicate via pipes fall into this category as do language interpreters. Bash definitely meets this definition. There's more to libraries than just linker input.
      • In that case the word "code" would have been more appropriate than the word library. This is slashdot after all.
  • by Ukab the Great ( 87152 ) on Friday January 02, 2015 @11:16AM (#48717503)

    The big news is that people are now thinking that bugs in software are big news.

  • by sinij ( 911942 ) on Friday January 02, 2015 @11:18AM (#48717533)
    My magic 8 ball tells me that in 2015 we will learn that proprietary and embedded software is even more vulnerable. My Tarot card deck tells me that we will see a lot of hacked car wrecks in 2015, now that Volvo has released the demon by putting a web browser into its in-dash system. The rest of the lemmings are sure to follow. Not that you really need a browser to pwn a car, with Bluetooth-to-CAN-bus exploits shutting down cars demonstrated as early as 2012.
    • ...My magic 8 ball tells me that in 2015 we will learn that proprietary and embedded software is even more vulnerable. ...

      IMO, that's already been proven by Microsoft over the years...

    • by Z00L00K ( 682162 )

      A lot of vehicle manufacturers are now looking at common operating systems like Android, Windows, iOS, etc. just because they are common on the market.

      Just be aware that the proprietary systems they are leaving are a lot worse when it comes to bugs, so sticking with a proprietary system is not an option. However, some people in the vehicle industry still don't have a very good understanding of how to segment networks.

      • by sinij ( 911942 )
        Car manufacturers will have to learn that they need to a) patch cars, b) support them out of warranty with security patches, and c) educate users and independent mechanics to apply security patches. Soon, it will be "Quick Lube and Patch" service stations.

        Personally, I prefer my cars air-gapped. I place negative value on infotainment systems and any car functionality that is not directly related to driving. Unfortunately, I am in the minority.
  • I don't know anyone that ever thought "Open source" was bug free. The point is that people can more easily find and fix bugs with open source. With closed source, there could be some obvious and dangerous mistakes in the code, but no one but those with access to the source will know they exist. It's then up to whoever owns the source to decide if it's profitable enough to fix. The problem with that system is there are people with access to the source... People come and go from every company on earth ever

    • Re:um what? (Score:4, Insightful)

      by meta-monkey ( 321000 ) on Friday January 02, 2015 @11:33AM (#48717661) Journal

      Not only can they fix them, but they do fix them. As soon as these vulnerabilities were discovered (by non-malicious actors) patches were available within days or even hours. Commercial vendors take their good sweet time. Or in the case of Microsoft, discontinue support for the still widely-used Windows XP. Find a vulnerability in that? Too damn bad. It'll never get fixed.

      Nobody's saying open source software is bug-free or even necessarily has fewer bugs than closed source software. It's what happens when a bug is discovered that makes the difference.

      • Hey now, I'm all for closed-source bashing, but let's be fair: Windows XP is over 13 years old. Yes, if you found a bug in one of its contemporaries, such as Red Hat Linux 6.1, it wouldn't get fixed either, but it's much more likely you'd upgrade to a newer version of the software where the bug has (hopefully) been fixed.

      • Or in the case of Microsoft, discontinue support for the still widely-used Windows XP. Find a vulnerability in that? Too damn bad. It'll never get fixed.

        Like when Ubuntu Server 13.04 didn't get a fix for Heartbleed because they discontinued support after 1 year despite the criticality of the bug and the servers seeing considerable use? All the official replies were "it's your own fault" and "change distro version immediately". Which you often can't do quickly. No users really expected 12.04, 12.10, 13.10 and 14.04 to get the fix while 13.04 in the middle was left out - except people who read the really, really fine print and took it seriously. Shipping the

        • Dick move, for sure, but at least it's able to be patched by the administrators themselves. With a closed source product you'd be SOL.

    • Say what? (Score:2, Insightful)

      by Anonymous Coward

      I don't know anyone that ever thought "Open source" was bug free.

      Every FOSS fanatic on Slashdot for the last 17 years has implied that bugs would be found and fixed FAST, not left to linger for years.

      The point is that people can more easily find and fix bugs with open source.

      They could but do they? Nope.

      • by Livius ( 318358 )

        I don't know anyone that ever thought "Open source" was bug free.

        Everyone who didn't think about it thought exactly that. Which means a very large number.

    • I don't know anyone that ever thought "Open source" was bug free.

      At least back in the day one of the main motivations to move to open source was that it provided a less buggy experience than the proprietary option.

  • Whether it's OpenSSL or Windows APIs, hackers are looking at every possible vector to attack systems. To be honest with ourselves, the software engineering community has to realize that security must be given the same priority as any other code quality metric. While we may not be able to test for every possible vector, there should be a standard set of vulnerability tests that every organization should be able to run before releasing code. Likewise, regression tests need to be exercised prior to any su

  • Comment removed based on user account deletion
  • In the cases that were software defects, the defect was rapidly fixed upon discovery. That's really the meaning of all bugs being shallow, not that they won't ever exist.

    That said, POODLE is not a code defect, but a defect in the standard (well, except the bit where an implementation skips validation of the padding). Shellshock was indeed a grave defect, but I think a correct takeaway there is to avoid, as much as possible, having a shell language in the path where untrusted data could be injected (as well
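
    A minimal sketch of that takeaway, in C under made-up assumptions (the external tool named "converter" is hypothetical; nothing here comes from the comment above): pass untrusted data to a child process as a plain argv element via fork/exec instead of building a command line for system(), so no shell (bash or otherwise) ever parses it.

      /* Sketch: run "converter <filename>" without involving a shell. With
       * system(), shell metacharacters in the filename would be interpreted by
       * /bin/sh; here the string is just an opaque argument. */
      #include <sys/types.h>
      #include <sys/wait.h>
      #include <unistd.h>

      static int run_converter(const char *untrusted_filename)
      {
          pid_t pid = fork();
          if (pid < 0)
              return -1;                               /* fork failed */
          if (pid == 0) {
              /* Child: exec the tool directly; the filename is argv[1], never shell-parsed. */
              execlp("converter", "converter", untrusted_filename, (char *)NULL);
              _exit(127);                              /* exec failed */
          }
          int status;
          if (waitpid(pid, &status, 0) < 0)
              return -1;
          return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
      }

      int main(void)
      {
          /* The ';' and '~' below stay inert: no shell ever sees them. */
          return run_converter("report; rm -rf ~");
      }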

    • Exactly. Saying bugs are shallow doesn't mean they don't exist. Shallow vs deep refers to how much effort it takes to characterize and fix - is it a "hard" bug or an "easy" one. Wikipedia explains it well:

      As the Heartbleed bug shows, even shallow bugs[7] may persist in important pieces of open software - it took two years for the bug to be discovered, and the OpenSSL library containing it is used by millions of servers. Raymond said that in the case of Heartbleed, "there weren't any eyeballs".[8]

      When

  • The quality of open source software is kind of crusty these days. No matter how open it is, making stuff work properly should be priority number one.

    For example, try to adjust the display brightness of a laptop under Mint or Ubuntu. It goes in multiple steps because there are multiple listeners for the adjustment event. Even basic stuff like this does not work properly.

  • Open-source does not make code automatically bug-free. No more than using a safe-malloc-library does, or deploying DEP on your executable, or ASLR, or coding only in a language that's considered "secure".

    What it does is allow certain types of security problems to be POTENTIALLY spotted. It's a +1 on the score, not a game winner. And it doesn't mean that proprietary is -1, either. It just means that you are confident enough in the quality of your code to show it to people by opening it up.

    What gets me

  • Misunderstood (Score:4, Interesting)

    by TopSpin ( 753 ) on Friday January 02, 2015 @11:54AM (#48717851) Journal
    ESR's claim has nothing to do with the frequency or discovery of bugs. All he says is that given enough observers, bugs are quickly characterized. It is implied that any given bug has already been discovered. There is no benevolent cohort of experts continuously auditing code bases and his statement doesn't claim there is.
    • Yes, exactly this. ESR was saying that if bugs annoy users, the bugs will be fixed before the users go off and find a different kernel.
  • Has anyone noticed that there are now astronomically more OSS users? The number of OSS users is also growing at an exponential pace.

    What we should expect with those stats is more cracks and bugs found in OSS, due to the higher percentage of people programming with and using it.

    Also, as the value of OSS to the market increases and more information is handled by OSS, there is more incentive for old vested interests to search for the downside as a form of marketing. We never heard about all those MS

  • 2014: The Year We Learned How Vulnerable Third-Party Code Libraries Are

    Really? Like we did not know before?
    I don't think anyone in the industry who is both sane and honest ever pretended that FOSS was bug-free.
    We know that software, ALL software, contains bugs.
    Also, plenty of projects don't have too many contributors, so the "many eyes" principle hardly applies.

    But if you've got the source, at least you can have a look (and really should, if you are considering using something for a mission-critical application).
    Then fix, if required, and contribute back.

    Certainly, vulnerabilities

    • I don't think anyone in the industry who is both sane and honest ever pretended that FOSS was bug-free.

      Why not? We should strive for precisely that: FOSS being the gold standard for bug-free software. The OpenBSD team is doing pretty good work in this area.

  • From the perspective of most IT customers, bugs are bugs regardless of closed or open source. They still rely on other people to find them, patch them and release changes.

    Companies who rely on open source libraries may or may not have the ability or spare resources to go digging through the code of a library, finding a security issue, writing a patch for it, recompiling the library, then using that patched copy in production. Companies in the 'service provider' realm may be able to do this, simply because t

  • So there were a few high-profile security flaws found in important open source software recently. So what? People are talking almost as though this somehow proved that open source is not superior to, or maybe is even inferior to, closed source software.

    It isn't like there has never been a high-profile security problem in important closed source software. Nor is it likely there will not be others in the future.

    Here is an awfully safe prediction... in the future there will be more high-profile security bugs found in

    • "Certainly the age of the code that caused these bugs is reason for concern."

      But... if those bugs were so obvious and easy to exploit, then why didn't "the world end" a long time ago? I'm pretty sure there are an awful lot of important systems out there that have Bash on them!

  • I have never been a believer that open source code is automatically more secure. Different projects have different code quality. It depends on if and how they are managed, and on who is willing to step up and contribute their time and effort.

    There have been some advantages unique to open source projects, such as static analysis vendors developing, testing, and marketing their wares by blessing a huge swath of open source land with the fruits of their labor, as well as the ability for savvy users to evaluate willingness to use

  • Computer Science is still a newbie discipline. Much more relevantly, the problems introduced by the sudden social change of what a network is are a pretty big deal.

    Here's how you know it's crazy: look at the hacker hysteria, and how it has barely gotten any better. The vast majority of "hackers" who cracked stuff back in the day were treated entirely ludicrously, like some kind of wizard. Everyone here probably remembers indefinite detention and ludicrous punishments such as "can't use a computer", whic

  • When everybody has the same goal, as is pretty much the case for usability issues, the shallowness of bugs posited by the many eyes hypothesis would be a good thing. When it comes to security issues, it sets up a race between the white hats and the black hats, and there is more incentive for the black hats (collectively, the rest of us have as much incentive as do the black hats, but that is not the case individually - for one thing, an attacker satisfies his goal by finding just one vulnerability.)

  • Which Linux user actually got hacked by a library vulnerability this year? Speak up now. Oh, hmm, the sound of silence. Certainly not me, and not anyone I know of.

    The thing is, sometimes the many eyes just aren't pointed in the right direction. A publicly disclosed vulnerability changes that instantly: hundreds or thousands of expert eyes go to work, fixes happen fast, and the community learns from the incident, often resulting in the eradication of a whole class of risks.

"Plastic gun. Ingenious. More coffee, please." -- The Phantom comics

Working...