How the NSA Took Linux To the Next Level

An anonymous reader brings us IBM Developerworks' recent analysis of how the NSA built SELinux to withstand attacks. The article shows us some of the relevant kernel architecture and compares SELinux to a few other approaches. We've discussed SELinux in the past. Quoting: "If you have a program that responds to socket requests but doesn't need to access the file system, then that program should be able to listen on a given socket but not have access to the file system. That way, if the program is exploited in some way, its access is explicitly minimized. This type of control is called mandatory access control (MAC). Another approach to controlling access is role-based access control (RBAC). In RBAC, permissions are provided based on roles that are granted by the security system. The concept of a role differs from that of a traditional group in that a group represents one or more users. A role can represent multiple users, but it also represents the permissions that a set of users can perform. SELinux adds both MAC and RBAC to the GNU/Linux operating system."
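The summary's distinction between a group (a set of users) and a role (a set of users plus the permissions they may exercise) can be sketched in a few lines of Python. This is a toy model of the RBAC idea only, not SELinux's actual implementation; all class and permission names here are invented for illustration:

```python
# Toy RBAC model: a role bundles permissions, whereas a traditional
# group only bundles users. None of these names are real SELinux
# constructs; they just illustrate the concept from the summary.

class Role:
    def __init__(self, name, permissions):
        self.name = name
        self.permissions = set(permissions)

class User:
    def __init__(self, name, roles):
        self.name = name
        self.roles = roles

    def can(self, permission):
        # A user may perform an action only if some assigned role
        # grants it -- there is no ambient "owner" power to fall
        # back on (the mandatory-access-control principle).
        return any(permission in r.permissions for r in self.roles)

# A network-daemon role: allowed to listen on a socket, but granted
# no file-system permissions at all, so even a compromised daemon
# cannot touch files.
net_daemon = Role("net_daemon", ["socket:listen", "socket:accept"])

d = User("httpd", [net_daemon])
print(d.can("socket:listen"))   # True: explicitly granted
print(d.can("file:read"))       # False: never granted, so denied
```

The key property is the default deny in `can()`: anything not explicitly granted by a role is refused, which is the opposite of the traditional discretionary model where the process's owner can do whatever the owner can do.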
  • by Anonymous Coward on Sunday May 11, 2008 @11:37AM (#23369738)
    Until we have a free government, I cannot see how anyone can trust software that comes from the NSA.
  • by (653730) on Sunday May 11, 2008 @11:41AM (#23369766)
    You can read the code. People have read the code and there's nothing "hidden" in it. People who think that SELinux allows the NSA to enter your computer are just clueless.
  • by Anonymous Coward on Sunday May 11, 2008 @11:45AM (#23369790)

    utter pain in the ass to work with....

    Long ago, in the days when MLS was just the holy grail, Harris Corporation created the first A1-rated Multi-Level Secure computer system. I can't recall the name given to it; BlackHawk or something overblown like that. It was secure, but utterly unusable. According to some early testers I knew, it took more than 10 minutes just to log on. The command line took, on average, 5 minutes to respond to the simplest command. There were no policy templates, so all permissions and access lists had to be entered manually.

    SELinux doesn't look quite so bad in that light, now does it?

    Yeah, yeah, yeah and it took years to calculate by hand before computers and months to travel any distance before airplanes. So what's your point?

    SELinux is a pain in the ass. Your comparison is meaningless.

  • Re:wrong (Score:3, Insightful)

    by harry666t (1062422) on Sunday May 11, 2008 @11:50AM (#23369828)
    > Everyone who matters has always just called the OS "Linux".

    Of course including the Debian people, who made one of the greatest distros so far?

    (NOT the greatest, but certainly one of the greatest)
  • by EQ (28372) on Sunday May 11, 2008 @12:00PM (#23369880) Homepage Journal
    Put your nearly insane conspiracy theories to rest on this one; that's one of the reasons we have open source: to keep things like Microsoft's backdoors from being slipped in.

    And aside from that, let's see: they arguably have several hundred to several thousand of the best crypto and security people working for them, so yeah, let's completely ignore what they have to say in favor of some nebulous conspiracy.

    Think about this: could such a conspiracy exist with that many people being informed of it? All it takes is one person to anonymously leak stuff to the papers or the internet. I mean really, the secret money-tracing stuff they were doing got splashed on the front page of the NYTimes, and the previous administration couldn't even keep a presidential blowjob a secret.

    But the bottom line is: It is OPEN SOURCE (and even GPL'd!). Read the code. They cannot hide a backdoor from the kernel group when those programmers and all the patchers, testers, and users have all the source.
  • by HeroreV (869368) on Sunday May 11, 2008 @12:14PM (#23369956) Homepage
    Why not just fix the silent failure? I don't understand this mentality of "There's a bug in the system! Scrap the whole thing!"
  • by lkcl (517947) on Sunday May 11, 2008 @12:36PM (#23370086) Homepage
    if you believe that selinux is "an utter pain in the ass" then you have misunderstood what selinux is for. selinux is specifically designed to be able to PROVE that an application is secure, using formal mathematical analysis (of the policy files).

    [ the principle on which selinux works is that when you change "security context", it doesn't matter a damn if you were "god" before, you're now starting from scratch with zero permissions in the new context unless otherwise specified. this is best illustrated with an example: when you go into a military environment, they take your ID badge away from you and issue you with a temporary one that is only relevant inside that building. you can't even leave the building without that temporary badge, and it's been coded to only let you go to the toilet and into the rooms that are associated with your specific purpose for being in that building. and of course, if you forget to get your permanent ID back once you _do_ leave, you'll find it very difficult to get out of the country! ]

    one of the "rules" that GCHQ and the NSA follow is that it is perfectly acceptable for something to be "insecure" as long as you KNOW that it's insecure: you can then provide a workaround or a fix to ensure that the security vulnerability is never exploited.

    the one thing that you absolutely absolutely must not ever have is a situation where you don't KNOW whether something is "secure" or "insecure".

    so if AppArmor has wonderful automated rulesets that are impossible to analyse... it fails that test.

    the thing about selinux is that policies require that you understand the source code and what the application is doing. for example, one of the guidelines is that applications should use exec rather than fork, because exec replaces the entire process image and so provides total privilege separation between tasks. fork() does not provide such a complete level of privilege separation, and so up until quite recently there was absolutely no way in selinux to even step into a separate security context on a fork() - it just... wasn't... even... remotely worth considering.
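    The exec-versus-fork point above can be seen directly from the system calls themselves, sketched here in Python (a minimal illustration on Linux, not SELinux code; the printed strings are invented markers):

    ```python
    import os, sys

    # fork() duplicates the parent's whole address space, so the child
    # starts with every secret the parent held in memory -- a messy
    # starting point for a security-context transition.
    # exec() replaces the process image entirely, so nothing from the
    # old program survives except what is explicitly passed in; that is
    # why a policy transition on exec is clean.

    secret = "parent-only data"

    pid = os.fork()
    if pid == 0:
        # Child after fork: the parent's memory is still fully visible.
        assert secret == "parent-only data"
        # Now exec a fresh interpreter: this stack frame, and `secret`
        # with it, are wiped out and replaced by the new image.
        os.execv(sys.executable, [sys.executable, "-c",
                                  "print('fresh image, nothing inherited')"])
    else:
        _, status = os.waitpid(pid, 0)
        print("child exited with", os.WEXITSTATUS(status))
    ```

    The asymmetry is exactly the parent comment's point: after exec the kernel can hand the new image a new context with confidence, while after a bare fork the child still carries everything the old context could see.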

    however, it turns out that there were some specific instances where stepping into a different security context on fork() is actually useful (such as in samba), and so it was added in. given the circumstances under which this could be thoroughly abused, it was decided that it should be provided only via an explicit selinux function call (usually, you can just provide an selinux policy statement without any code modifications).
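    The two mechanisms described above look roughly like this in SELinux policy language (a sketch only; the type and domain names are invented for illustration, and a real policy needs matching allow rules and file contexts):

    ```
    # Declarative transition on exec: when a process in parent_t
    # executes a file labelled child_exec_t, the kernel switches it
    # into the child_t domain automatically -- no code changes needed.
    type_transition parent_t child_exec_t : process child_t;

    # A context change without exec has no declarative equivalent:
    # the program itself must opt in through the libselinux API,
    # e.g. setcon("system_u:system_r:child_t:s0"), and even that
    # call only succeeds if policy explicitly permits the switch:
    allow parent_t child_t : process { transition dyntransition };
    ```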

  • by Truekaiser (724672) on Sunday May 11, 2008 @12:37PM (#23370088)
    I'll bet money that 99% of the people who have access to the code would have no clue what it does. That only leaves those who are familiar with it, and those who know the language it is written in but are not familiar with the specific code. The former could easily be silenced; the latter can be dismissed as kooks, and better yet, other people will do it to them as well due to herd mentality.
    Frankly, I think it's wise not to trust the NSA even if you can see the code; it's just plain misplaced faith that a simple philosophy like OSS can universally protect you from such malicious intent, especially considering the history and track record of such an agency.
  • by Darkness404 (1287218) on Sunday May 11, 2008 @01:21PM (#23370336)
    So some people don't understand the code very well. That's why the 1% who do look for malicious changes and fix them. How many open-source projects have malware in them, compared to all the Windows freeware/shareware/adware that does? It's like saying that because a recipe isn't verified by a chemist, it must be designed to either A) poison you or B) affect your mind to buy less of a competitor's product. Source code can be compared to a recipe, and how many people who cook really know the science behind everything they add to bake a cake? Very few, I'm sure; but how many die from incorrect recipes that were changed? Very few to none.
  • by Darkness404 (1287218) on Sunday May 11, 2008 @01:27PM (#23370376)
    How many people have looked through all the lines in a recipe and understood all the chemical reactions? Seriously, what's with people having faith that somehow someone would slip in something poisonous that the maintainers of the recipe wouldn't notice? Compare the recipe to SELinux and you get the general picture.

    And why would Debian, Red Hat, Ubuntu, and Fedora ship it if it were malicious? Even supposing the US government could have made Red Hat put it into Red Hat and Fedora, that still leaves Debian, which is community-driven (and quite good about making sure its systems are secure), and Ubuntu, which is based in the UK and is community-driven much like Debian.

    Sure, healthy suspicion is good, but really, it's just as silly as saying that because not everyone knows the chemical reactions involved in cooking, you're suddenly open to poisoning yourself.
  • by gaspyy (514539) on Sunday May 11, 2008 @01:30PM (#23370392)

    Means that when it becomes mainstream, anyone who is familiar with how to configure and use it will be in high demand.

    If no one's using it, how will it become mainstream?

  • by Haeleth (414428) on Sunday May 11, 2008 @04:12PM (#23371602) Journal

    I'll bet money that 99% of the people who have access to the code would have no clue what it does. That only leaves those who are familiar with it, and those who know the language it is written in but are not familiar with the specific code. The former could easily be silenced
    How, and by whom exactly?

    You're forgetting that Linux development is distributed across the world. The NSA might conceivably be able to "silence" developers within the USA, but what hold exactly would it have over developers in Europe and Asia? Even if you suppose that the USA's close allies such as Britain and Canada might be persuaded to join in some conspiracy, what would other countries have to gain? You would have to propose a global conspiracy, with governments the world over uniting to, um, stop themselves from finding out about the backdoors that America was using to spy on them. Sorry, but this is the most half-baked conspiracy theory I've ever heard.

    Frankly, I think it's wise not to trust the NSA even if you can see the code; it's just plain misplaced faith that a simple philosophy like OSS can universally protect you from such malicious intent, especially considering the history and track record of such an agency.
    Leaving aside the clear paranoia that is causing you to characterise the NSA as "malicious", they would have to be not only malicious but downright stupid to put backdoors into open-source code.

    For example, the Chinese government uses Linux themselves. It would be foolhardy in the extreme for NSA to assume that they will not have their best security experts scouring the code for backdoors. If they found one, they could use it themselves -- or they could expose it, seriously embarrassing the United States. Not exactly the kind of thing that's likely to result in NSA funding being maintained at its present high level...
  • Re:wrong (Score:3, Insightful)

    by Haeleth (414428) on Sunday May 11, 2008 @04:24PM (#23371708) Journal
    Big difference: Windows was designed to be a complete OS in its own right, but Linux was specifically intended to be combined with the GNU software to form a complete OS.

    I can't deny that I normally call the combination "Linux" myself, but I don't understand why some people are actively hostile to the concept of calling it "GNU/Linux" instead.
