
Linux Foundation's Census Project Ranks Open Source Software At Risk

jones_supa writes: The Core Infrastructure Initiative, a Linux Foundation effort assembled in the wake of the Heartbleed fiasco to provide development support for key Internet protocols, has opened the doors on its Census Project — an effort to figure out what software projects need support now, instead of waiting for them to break. Census assembles metrics about open source projects found in Debian's package list and on openhub.net, and then scores them based on the amount of risk each presents. Risk scores are an aggregate of multiple factors: how many people are known to have contributed to the project in the last 12 months, how many CVEs have been filed for it, how widely used it is, and how much exposure it has to the network. According to the current iteration of the survey, the programs most in need of attention are not previously cited infrastructure projects, but common core Linux system utilities that have network access and little development activity around them.
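As a rough illustration of how such an aggregate might be computed, here is a small sketch in Python; the factor names, thresholds, and weights are illustrative assumptions, not the Census Project's actual formula:

    # Illustrative only: the factors mirror those named above, but the weights,
    # thresholds, and field names are assumptions, not the real Census scoring.
    def risk_score(project):
        score = 0
        if project["contributors_12mo"] < 3:    # little recent development activity
            score += 2
        score += min(project["cve_count"], 3)   # CVEs filed against the project
        if project["popularity"] > 0.8:         # widely installed (e.g. Debian popcon share)
            score += 1
        if project["network_exposure"]:         # listens on or talks to the network
            score += 2
        return score

    # Example: a widely installed, network-facing utility with one recent contributor
    print(risk_score({"contributors_12mo": 1, "cve_count": 2,
                      "popularity": 0.9, "network_exposure": True}))  # prints 7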

  • Excellent idea (Score:4, Interesting)

    by golodh ( 893453 ) on Saturday July 11, 2015 @10:09AM (#50088359)
    And overdue.

    As FOSS projects become more widely used (privately and by companies), it's an excellent idea to actually collect some statistics that give an idea of how well and how actively a project is maintained.

    An attacker might e.g. get commit rights to several low-activity projects, insert malicious code, and wait for people to download updates and become easily exploitable.

    No FOSS end-user ever checks code: they rely on the maintainers to produce clean and honest code. Large and tech-savvy businesses tend to be a little more cautious, but in the end only a minority have the staff and the budget to actually vet the code. So unless they're going to expose themselves by redistributing the code, or they know they're going to use it in mission-critical ways, they won't look into it very deeply.

    This leaves users vulnerable. Because when a project is "asleep", it's a good candidate to slip in some backdoors.

    Given the number of FOSS projects, it can be quite hard for any organisation to get an idea of (let alone metrics on) how well maintained those projects are. Doing this and making the numbers public is a very useful service.

    Of course, as no doubt various FOSS enthusiasts will rush to point out, it's not a perfect indicator.

    Well ... it isn't and it doesn't have to be, but it's a very useful indicator all the same. And if you can easily look up a project's rating, that will sharply increase the likelihood that it will be used.

    • I agree that it is a very good idea. But the execution leaves a lot to be desired.

      An attacker might e.g. get commit rights to several low-activity projects, insert malicious code, and wait for people to download updates and become easily exploitable.

      Their rating system actually encourages this. If you have tight controls on commits, like perhaps one or two people who review code and actually make the commits, you are "at risk." So go ahead and give that NSA guy commit access...

  • by Greyfox ( 87712 )
    I was just noticing the other day that a number of emacs lisp packages I use on a regular basis hadn't had any development work in 5-10 years. It's a bit discouraging to go looking for something and only find a goddamn sourceforge link for it. That's my main metric for the death of a project -- you can only find a sourceforge link.

    I guess it's understandable. Those guys wrote those things to scratch an itch and they worked well enough long enough. If a company were trying to maintain all the code that go

    • I was just noticing the other day that a number of emacs lisp packages I use on a regular basis hadn't had any development work in 5-10 years.

      If it works, why change it? The SmallWall project was immune to all of the SSL bugs in the last year because we use an old version that does not have these new and buggy features... Of course, this rating system would ding us for that... :)

    • by DarkOx ( 621550 )

      One of the problems the software world faces is deciding when something is 'done'.

      Suppose an application was carefully designed and written 5 years ago and was free of problems like buffer overflows, logic errors, bad assumptions about input domain, and bad assumptions about scale. If these things were true 5 years ago, why would they be less true today? The question then becomes: does this application still meet the need today?

      Software isn't like a house or machine. It does not require maintenance unless a problem

  • There's far too much software out there that depends on having clocks close to in sync.

    • The freeware ntpd at http://www.ntp.org/downloads.h... [ntp.org] was extremely stable code. Its greatest problems have been with subtle configuration issues, not with the old daemon itself. Unfortunately, the service is now merged into systemd itself, which means that support for it from that large part of the Linux world will no longer apply to any other operating system.

      The main codeline at http://www.eecis.udel.edu/~ntp... [udel.edu] shows that it is in active development, mostly to support new operating systems and hardwar

    • NTP is actually quite solid. But pool.ntp.org could use some love. It can't work without servers...
    • Guys. Just use OpenNTPD.

      It's actively developed, and it doesn't have any of the bloat that is in the official ntpd. And it syncs clocks!

      Brought to you by the OpenBSD project. Known for writing secure software.
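      For anyone who wants to try it, a minimal /etc/ntpd.conf for OpenNTPD can be just a couple of lines; the sketch below is only an illustration (it assumes you want the public NTP pool), so adjust it to your environment:

        # Illustrative OpenNTPD configuration sketch, not an official example.
        # Use all addresses that the pool hostname resolves to:
        servers pool.ntp.org
        # Uncomment to also answer local clients:
        #listen on 127.0.0.1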

  • No surprise about tcpd (aka Wietse Venema's TCP wrapper utilities), which has not been updated since its last release in 1997.

    It should just be removed from all Linux distributions just as Arch did in 2011: https://www.archlinux.org/news... [archlinux.org]

    Use something, anything else rather than this practically stone age software.
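    (For context, tcpd grants or refuses connections by consulting /etc/hosts.allow and /etc/hosts.deny; the entries below are only a rough sketch of that hosts_access syntax, not a recommended policy.)

      # /etc/hosts.allow -- permit SSH only from the local subnet (illustrative)
      sshd : 192.168.1.0/255.255.255.0

      # /etc/hosts.deny -- refuse everything else that is wrapped by tcpd
      ALL : ALL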

    • "I know you have this tool that works perfectly well for what you are doing, but you should remove it now, and use my new and bloated tool instead..."

      Seems to be the current "best practice" and I hate it.
  • #1 (Score:1, Troll)

    1. The .NET framework
  • I am officially starting the Church Of SystemD, which will bring enlightenment to the masses.

    Services praising The Holy SystemD will be performed at gunpoint, so stop making trouble with all those facts and shit and just get in line.
  • Are they really at risk, or just mature? After 20-30 years, I don't see how tools like whois and bzip2 could really need that much development.

  • Why don't we try the open source route and have people adopt these at-risk core system utilities? Wouldn't there be interest if they were put up for adoption? If we get 3-4 volunteers per tool, we can surely do something about this and get more people contributing to those tools.

    I would definitely be up for something like that.
