
Linux Foundation: Bugs Can Be Made Shallow With Proper Funding

jones_supa writes "The record number of security challenges in 2014 undermined the confidence many had in the high quality of open source software. Jim Zemlin, executive director of the Linux Foundation, addressed the issue head-on during last week's Linux Collaboration Summit. Zemlin quoted the oft-repeated Linus' law, which states that given enough eyeballs, all bugs are shallow. "In these cases the eyeballs weren't really looking," Zemlin said. "Modern software security is hard because modern software is very complex," he continued. Such complexity requires dedicated engineers, and thus the solution is to fund projects that need help. To date, the foundation's Core Infrastructure Initiative has helped out the NTP, OpenSSL and GnuPG projects, with more likely to come. The second key initiative is the Core Infrastructure Census, which aims to find the next Heartbleed before it occurs. The census is looking for underfunded projects and those that may not have enough eyeballs looking at the code today."
  • Spending resources on 'finding the next Heartbleed' bug... I fail to see the advantage of finding it through a coordinated search as opposed to someone just stumbling on it (as long as the bugs are reported responsibly, of course).

    Software can't be made secure after the fact; security must be the primary goal from the start.

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Saturday February 21, 2015 @12:42PM (#49100813)
      Comment removed based on user account deletion
      • by zyche ( 784345 )

        Except that pretty much no one spends the time or resources to do that. It's more fun to keep adding features to the doomed architecture. Or to start over... again.

        If you design software with a certain feature set insecurely, it's often difficult to keep those features when you redesign for security.

        A depressingly large majority of all software wasn't coded with the best-known tools and APIs in mind: not even those available at the time of writing, let alone those of today!

  • by Anonymous Coward on Saturday February 21, 2015 @12:39PM (#49100787)

    I've been using Linux for an awfully long time, since the mid 1990s (Yggdrasil, then Debian). Over time, as Linux has gotten more and more funding, it has gotten worse and worse. I initially switched to Linux because it generally just worked, and it worked better than many of the alternatives. But now it's just getting fucking horrible. I mean, look at systemd. Normal users, and especially power users, don't want it. It just causes problem after problem for many people. Yet we have corporate interests and corporate-funded developers forcing it on us, even forcing it into community-oriented distros like Debian. GNOME and Firefox are other great examples of community-based open source projects that got co-opted by money and ruined, with the most recent versions of both being almost totally unusable. On the other hand, we see projects that get less commercial interest, like Slackware and Xfce, producing the most usable and reliable open source software systems around. Linux was better when there wasn't so much money floating around. Back then it was about creating great software, and doing things right. Now it's about everything but that.

    • As Heinlein noted, TANSTAAFL [wikipedia.org], just like there's no such thing as free beer. Everything has a cost. Even free software.

      And when you have such a fragmented ecosystem, the attack surface is going to be huge (after all, an OS is more than just a kernel), and the idea that "with enough eyes all bugs are shallow" is patently false. So it turns out that open source has been to a large extent relying on the same "security through obscurity" model. This was fine a decade ago, but the competition has stepped up its game and can afford to throw money and bodies at the job without begging.

      The solution would be to do a code freeze for 2-3 years while the developers of the various projects audit their code and the ways other projects interact with their code - not just for security problems, but to get rid of bloat and cruft. That's not going to happen, because it makes too much sense. Everyone wants the newest shiny.

      Linux was definitely better when there were fewer distros. What a mess.

      • As Heinlein noted, TANSTAAFL, just like there's no such thing as free beer. Everything has a cost. Even free software.

        Free software is free as in beer. If you claim it's not free because of the time you put in, then $100 lying on the pavement in front of you isn't free because you have to take the time to go and bend down and pick it up. Making such claims is basically changing the definition of the word "free" to mean something other than what it actually means.

        Linux is Free as in Beer because you can get i

    • by phantomfive ( 622387 ) on Saturday February 21, 2015 @04:05PM (#49101811) Journal

      Yet we have corporate interests and corporate-funded developers forcing it on us, even forcing it into community-oriented distros like Debian.

      Debian adopted systemd for reasons outlined here. [slashdot.org] It wasn't a conspiracy. Poettering knew that distros are crucial to the adoption of systemd, so he's made things as easy as possible for them, and given them features they wanted. Essentially, systemd makes it easier to write an init script, and since Debian writes a lot of them, they liked that.

      Of course, Poettering has been less responsive to other parties (like the kernel devs), but that's another topic.

    • by Skiron ( 735617 )
      " like Slackware and Xfce, producing the most usable and reliable open source software systems around." Well said - exactly what I use (even Slackware on my Raspberry Pi's).
    • by dkf ( 304284 )

      I've been using Linux for an awfully long time, since the mid 1990s (Yggdrasil, then Debian).

      Darn noobs! I remember having fun making the MCC Interim distribution work...

    • by tlhIngan ( 30335 ) <slashdot.worf@net> on Sunday February 22, 2015 @04:24AM (#49104145)

      Over time, as Linux has gotten more and more funding, it has gotten worse and worse. I initially switched to Linux because it generally just worked, and it worked better than many of the alternatives. But now it's just getting fucking horrible. I mean, look at systemd. Normal users, and especially power users, don't want it. It just causes problem after problem for many people.

      No, it hasn't gotten worse. It has gotten responsive to user demands.

      Back in the 90s when life was simple, users were simple. Unless you used an Amiga or MacOS, if you played a sound, that was it - no one else could play a sound (MacOS and Amiga had software mixers so you could listen to music AND hear application generated sounds - you could use exclusive mode if you needed it, though).

      Likewise, you logged in and you rarely had things starting up just for you.

      And your networking options were... single. You either had Ethernet, or a modem, and only one IP per host. And rarely did you move - I mean, if you were on Ethernet, it was assumed you were on the same network permanently, or at least changes were rare.

      Nowadays, user demands have gone way up. Audio has to be mixed by the OS because the user may listen to tunes, start yakking on VoIP, and have sound effects playing while gaming, all simultaneously. The VoIP call goes over, say, a Bluetooth headset or the communications path, while the music and sound effects play through the main speakers. Oh, and no application is to dare use the HDMI port to send audio, as it's hooked to a monitor with no speakers. A modern PC can easily have 4 or 5 different ways to play audio.

      Likewise, when you log in, you probably have a few per-user services you like to have - either from the environment you're using or other services. It would be a shame if logging in again restarted those services (e.g., you log in locally, then log in remotely over ssh) or if those multiple sessions couldn't communicate with each other (e.g., you make a change remotely, and it fails to propagate through the rest of the logins).

      And networks... well, an Ethernet port or WiFi? A user may connect to many different networks in a single day, and have more than a few ways to send a packet around. Perhaps they're hooked to their same network multiple ways - either dual Ethernet, or Ethernet plus WiFi. And maybe the next time the connection is re-established, those ports need to be firewalled because it went from private network to public.

      Back in the old days, well, audio was simple because your PC couldn't really do multiple things at once. Networks were generally safe so it didn't matter that you didn't bring up the firewall on the public Ethernet connection. And users didn't run too many things in the background because no one could imagine needing to log into the console AND over ssh simultaneously, or they could just remotely kill the session because there wasn't important stuff to save.

      And the old, simple model is perfectly fine on a server that sits in a rack and never moves until it's powered down and retired. But modern users need this complexity just to manage their normal use case. Sure, you can force the user to tell you what kind of network is at the other end, or to re-establish the VPN, but users want computers to do stuff automatically - I mean, why should I tell the computer this coffeeshop WiFi is public over and over again - can't it remember?

      Or why should I have to reconfigure my VoIP app because I attach my Bluetooth headset to my computer? Why can't it ask for a communications headset, and if one isn't available right now, use the default audio hardware, then when one suddenly appears (Bluetooth!), automagically use that? Zero reconfiguration; even the app doesn't have to reopen the audio device, because the audio core did it internally.
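      A minimal sketch of that fallback policy, assuming hypothetical helpers (find_headset and default_device are made-up stand-ins, not any real audio daemon's API):

      /* Hypothetical sketch: prefer a communications headset, fall back
       * to the default output. Not a real audio API. */
      #include <stdio.h>
      #include <stddef.h>

      typedef struct { const char *name; } device_t;

      static device_t *find_headset(void) { return NULL; }  /* none attached yet */
      static device_t *default_device(void) {
          static device_t spk = { "built-in speakers" };
          return &spk;
      }

      /* Run once at startup and again on every hotplug event, so a headset
       * that appears later is picked up with zero reconfiguration. */
      static device_t *pick_voip_output(void) {
          device_t *hs = find_headset();
          return hs ? hs : default_device();
      }

      int main(void) {
          printf("VoIP audio -> %s\n", pick_voip_output()->name);
          return 0;
      }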

      It should be telling that the most popular Linux "distribution" in the world is Android, which has its own init system (like systemd, it manages processes, events, and other things), its own audio

      • Unless you used an Amiga or MacOS, if you played a sound, that was it - no one else could play a sound (MacOS and Amiga had software mixers so you could listen to music AND hear application generated sounds - you could use exclusive mode if you needed it, though).

        FreeBSD 4.x, also from the 90s, allowed you to play multiple sounds simultaneously. It used the same OSS code that Linux used ... but they enhanced it to support features Linux never did. Unfortunately, Linux devs continued with their NIH syndrome

  • Whose Eyes? (Score:5, Insightful)

    by Capt.Albatross ( 1301561 ) on Saturday February 21, 2015 @12:40PM (#49100789)

    Even for non-security bugs, the many-eyes hypothesis contains a large dose of wishful thinking, but at least in that case most eyes are looking with the same purpose. When it comes to security, however, it is a race between black-hat and white-hat eyes, and the former only have to win once.

    • by thieh ( 3654731 )
      The problem with the many-eyes hypothesis is that not everyone looks for security bugs and not everyone is capable of looking for every kind of bug. Everyone will notice a problem when the UI behaves differently. Not many of us will notice when a command-line utility does something subtly different (especially when it comes to RNG, like the Debian OpenSSL bug back in 2008 [wikipedia.org]) without abnormal output.
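      To make that concrete: the Debian patch removed most of the entropy mixing, leaving little beyond the process ID as seed material. A toy sketch of that bug class (an assumed simplification, not the actual OpenSSL code):

      /* Toy sketch of the 2008 Debian OpenSSL bug class; not the real code.
       * The offending patch effectively removed the entropy-mixing calls,
       * so the pool held little more than the process ID. */
      #include <stdio.h>
      #include <stdlib.h>
      #include <unistd.h>

      static unsigned long pool;              /* toy entropy pool */

      static void seed_pool(void) {
          /* real entropy mixing removed here, as in the Debian patch */
          pool = (unsigned long)getpid();     /* only ~32768 possible seeds */
      }

      int main(void) {
          seed_pool();
          srandom((unsigned int)pool);
          /* looks random, but is one of only ~32k predictable streams */
          printf("\"random\" key byte: %02lx\n", random() & 0xffUL);
          return 0;
      }

      The output still looks perfectly random, which is exactly why nothing abnormal showed up from the outside for nearly two years.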
    • Even for non-security bugs, the many-eyes hypothesis contains a large dose of wishful thinking

      Not true; Torvalds' observation wasn't what he wished would happen, it was what he'd observed repeatedly on a large and complex project over many years.

      That said, I think your disagreement is because, like many, you misunderstand the hypothesis. What Torvalds said wasn't that given enough eyes all bugs are visible, but that they're shallow, meaning easy to track down and fix. The hypothesis doesn't even come into play until the existence of the bug is known.

      And, undoubtedly, there are som

      • Re:Whose Eyes? (Score:4, Informative)

        by Your.Master ( 1088569 ) on Saturday February 21, 2015 @07:39PM (#49102659)

        Torvalds didn't say the "many eyes" thing at all. Eric S. Raymond did.

      • The [many-eyes] hypothesis doesn't even come into play until the existence of the bug is known.

        If that is so, then it doesn't help much with security, where finding exploitable bugs (and doing so before they are exploited) is usually the hard part.

        • The [many-eyes] hypothesis doesn't even come into play until the existence of the bug is known.

          If that is so, then it doesn't help much with security, where finding exploitable bugs (and doing so before they are exploited) is usually the hard part.

          Precisely. It's not that the hypothesis is wrong, it's just that it doesn't apply.

          This doesn't reduce the value of open source for security software, because while it gives both white hats and black hats a great deal of help with finding vulnerabilities, the nature of security research means that the white hat side benefits more. Open source software, developed in public, also makes it more difficult for the likes of the NSA to insert back doors, because it's not just a matter of paying (or threatening) s

  • Bugs can be made shallow?

    On Linux, bugs are only skin deep

    Why have bugs at all?

    • Bugs can be made shallow?

      Sure. Just put on your cockroach-killer shoes (you know, the ones with the pointy toes to get 'em in the corners) and start stomping.

      Unfortunately, you can't eliminate programmer errors the same way.

      (programmer errors are not "bugs". they didn't mysteriously creep into the code on their own when nobody was looking. saying "it's a bug" is just a way to avoid responsibility for a mistake, and leads to both a slack attitude and a feeling of non-responsibility).

  • by JoeyRox ( 2711699 ) on Saturday February 21, 2015 @12:49PM (#49100843)
    Maybe Linus isn't cursing at the developers with enough frequency or intensity?
  • Is there a way to re-engineer operating systems so that some parts are strictly read-only (like, baked into ROM chips), other parts difficult to change (flash them?), and so on? Right now, it seems all data, programs and operating system components are equally vulnerable to writes by viruses. How many people would be harmed if some basic components of XP had been burned into ROM? Then anti-virus programs could hook into those "fortified" modules to maintain or restore the integrity of other parts.
    • This has been tried with DRM. I remember those game CDs coming with bad sectors intentionally written to make copying difficult, and software products which came with a specially crafted parallel port dongle to add hardware protection.
      None worked.

      The solution you're proposing makes life more difficult only for regular users, who would need to order chips and slam them into a motherboard to upgrade their operating system. Not to mention a bug that creeps into the read-only part of the OS. At least now yo

    • How many people would be harmed if some basic components of XP had been burned into ROM?

      Everyone who had one, because they would be found to have security vulnerabilities (see here for an example of exactly that happening [defcon.org]), and then everyone's system would be vulnerable.

      Incidentally, Kaspersky was building an OS that does exactly what you suggest [kaspersky.com], so if it works, then maybe we will see more of what you suggested in the future. I'm doubtful though, for reasons mentioned in the previous paragraph.

    • by Ramze ( 640788 )

      Intriguing suggestion, but perhaps based on a false premise that "data, programs and operating system components are equally vulnerable to writes by viruses." That's most certainly not the case even on a Windows platform. System files and folders usually require an admin to modify, and drivers and other OS components typically must be signed drivers to update. On "trusted computing platforms", there's even more security on what can even boot on the machine. A virus should only have privileges based upo

    • that's some of the idea behind W^X: http://en.wikipedia.org/wiki/W... [wikipedia.org]

      OpenBSD does this even for the kernel on x86-64
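      A small userland sketch of the W^X discipline, using the standard POSIX mmap/mprotect calls (illustrative only; OpenBSD enforces the policy system-wide): a page is writable or executable, never both at once.

      /* W^X sketch: the page is first writable (not executable), then
       * executable (not writable); it never holds W and X together.
       * Linux/x86-64 assumed for the machine code bytes. */
      #include <stdio.h>
      #include <string.h>
      #include <sys/mman.h>

      int main(void) {
          /* x86-64 code for: mov eax, 42; ret */
          unsigned char code[] = { 0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3 };

          void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                            MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
          if (page == MAP_FAILED) return 1;

          memcpy(page, code, sizeof code);              /* write phase: W, no X */
          if (mprotect(page, 4096, PROT_READ | PROT_EXEC) != 0)
              return 1;                                 /* exec phase: X, no W */

          int (*fn)(void) = (int (*)(void))page;
          printf("returned %d\n", fn());                /* prints: returned 42 */
          return 0;
      }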

  • Modern software security is hard because modern software is complex.

    Doesn't that just about say it all? More eyes don't solve complexity issues, only more brains and better architecture.

    • by Kjella ( 173770 )

      Doesn't that just about say it all? More eyes don't solve complexity issues, only more brains and better architecture.

      I think that if you do some research - at least if you limit yourself to human subjects - you will find there's a strong correlation between the number of eyes and the number of brains, so "more eyes" implies "more brains". And if you can settle the age-old discussion of whether encapsulation, abstractions and design patterns reduce or increase complexity, you should get the IT Peace Prize.

  • Once the bugs are in released code, it is too late to remove them efficiently.

    Maybe a more cost-efficient approach to spending the Foundation's money would be to determine how and why the bugs get into the code in the first place, and reduce their occurrence as early in the development cycle as possible.

    The earlier in the development cycle a bug is caught, the cheaper it is to eliminate.

    • by Kjella ( 173770 )

      By the time something becomes "core infrastructure", it's usually not in a condition where a rewrite is at all advisable. You have an existing code base that's seen lots of real world usage and presumably works well most of the time; what you need is testing, cleanup, sanity-checking, error handling and formal verification that it performs as intended. And it's particularly important that you review obscure functionality like the heartbeat TLS extension that led to the Heartbleed bug, that you put many eye
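      For reference, Heartbleed was a missing bounds check of exactly the kind such a review should catch. A stripped-down sketch of the bug class (illustrative names, not the real OpenSSL structures):

      /* Heartbleed-style bug class, heavily simplified. */
      #include <stdio.h>
      #include <string.h>

      struct heartbeat {
          unsigned short claimed_len;     /* length the peer *claims* */
          unsigned char  payload[16];     /* what was actually sent */
      };

      static void reply(const struct heartbeat *hb, size_t bytes_received) {
          unsigned char out[65536];

          /* BUG (the Heartbleed pattern), shown disabled:
           *     memcpy(out, hb->payload, hb->claimed_len);
           * trusting the attacker-supplied length reads far past the
           * real payload and leaks adjacent process memory. */

          size_t n = hb->claimed_len;
          if (n > bytes_received)         /* FIX: the missing bounds check */
              n = bytes_received;
          memcpy(out, hb->payload, n);
          printf("echoing %zu bytes\n", n);
      }

      int main(void) {
          struct heartbeat hb = { 65535, "hello" };
          reply(&hb, 5);                  /* claims 64 KB, actually sent 5 */
          return 0;
      }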

  • > The solution is to fund projects that need help

    But then it's not FOSS anymore? How will they resolve this massive ethical dilemma?
  • Tempting offer, but I think I'll pass.

  • There is a way to properly test software, but it is insanely expensive. Real mission-critical software (like airborne systems) has standards for code verification that are pretty tough. For example, per standard DO-178B [wikipedia.org], complete structural coverage analysis is required, along with object code analysis, worst-case throughput analysis, stack analysis, etc.

    There's no way volunteer projects can find the funding or the human resources to do this. Although many companies do contribute to various open source programs,
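    To give a feel for what "complete structural coverage analysis" means at the strictest levels (MC/DC, required for DO-178B Level A software), here is a toy example; the function and test values are invented for illustration:

    /* Toy MC/DC illustration. For the two-condition decision below,
     * branch coverage needs 2 tests; MC/DC needs at least 3, chosen so
     * each condition is shown to independently flip the outcome. */
    #include <assert.h>

    static int deploy_spoilers(int on_ground, int speed_high) {
        return on_ground && speed_high;
    }

    int main(void) {
        assert(deploy_spoilers(1, 1) == 1);  /* baseline: both true   -> 1 */
        assert(deploy_spoilers(0, 1) == 0);  /* flip on_ground alone  -> 0 */
        assert(deploy_spoilers(1, 0) == 0);  /* flip speed_high alone -> 0 */
        return 0;
    }

    Doing this for every decision in a large code base, plus the object code, timing and stack analyses, is what makes this level of assurance so expensive.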

  • undermined the confidence many had in the high quality of open source

    protip: only naive fanboys with college-level awareness thought that. Everyone else was aware of the illusion of security in Linux and knew that it was mostly through obscurity that it avoided attacks. It's not just a lot of eyeballs, btw, but the right eyeballs. And reviewing shipped code for security is usually the last thing FOSS people spend their time on.
