
The Future of Trusted Linux Computing

ttttt writes "MadPenguin.org tackles the idea of Trusted Computing in its latest column. According to author Matt Hartley, the idea of TC is quite reasonable: a locked-down environment offers several advantages to system administrators with possibly troublesome users. 'With the absence of proprietary code in the mix, users will find themselves more inclined to trust their own administrators to make the best choices ... And so long as any controlled environment is left with checks and balances, [like] the option for withdrawal should a school or business wish to opt out, then more power to those who want a closed-off TC in an open source world.'" LWN.net has an older but slightly more balanced look at the TC approach.
  • Re:O RLY? (Score:3, Interesting)

    by ilikejam ( 762039 ) on Friday October 19, 2007 @10:01AM (#21040795) Homepage
    A sufficiently motivated whatnow?
  • by arivanov ( 12034 ) on Friday October 19, 2007 @10:45AM (#21041587) Homepage

    Excuse me, but how exactly do I get the Linux kernel I compiled myself signed?

    Self-sign it. It is not the fact that it is signed that matters, it is who signed it. From there on, an access request goes down the chain, with everyone signing it. The access control for A may accept your self-signed kernel. Similarly, it may not, and it will invalidate everything down the chain from it as untrusted. That is the choice of A's owner.

    And if you are talking about DRM for media, forget it: it is not here to stay.

    You have mistaken me for someone who gives a fuck about signed MP3s. Now, a document sitting on a corporate CMS, encrypted individually on every release and with an associated cert chain for each revision, is something I do care about. A lot. A lost laptop in this case no longer means stolen data. The entire problem of document access control also more or less goes away. Same for revision and change control. While it is a hassle, it solves quite a few real-world problems.
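    The per-revision keys and signature chain described above can be sketched in a few lines. This is a toy illustration, not any real CMS protocol: HMAC stands in for both key derivation and signing, the key and document names are invented, and in practice the master key would live in a TPM rather than in process memory.

```python
import hashlib
import hmac
import os

MASTER_KEY = os.urandom(32)  # toy stand-in; a real system would keep this in a TPM

def revision_key(doc_id: str, revision: int) -> bytes:
    """Derive a distinct encryption key for each document revision (HKDF-like, toy)."""
    info = f"{doc_id}:rev{revision}".encode()
    return hmac.new(MASTER_KEY, info, hashlib.sha256).digest()

def chain_signature(prev_sig: bytes, content: bytes, key: bytes) -> bytes:
    """Each revision's signature covers the previous signature, forming a chain."""
    return hmac.new(key, prev_sig + content, hashlib.sha256).digest()

# Two revisions of the same document get unrelated keys, so losing one
# laptop-cached revision key exposes only that revision.
k1 = revision_key("policy.doc", 1)
k2 = revision_key("policy.doc", 2)
assert k1 != k2

# Tampering with revision 1 changes its signature, which invalidates
# every signature chained after it.
sig1 = chain_signature(b"\x00" * 32, b"draft text", k1)
sig2 = chain_signature(sig1, b"final text", k2)
bad_sig1 = chain_signature(b"\x00" * 32, b"tampered draft", k1)
assert chain_signature(bad_sig1, b"final text", k2) != sig2
```

    The chaining is what makes revision control fall out for free: a verifier walking the chain detects any altered historical revision.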

  • by Zigurd ( 3528 ) on Friday October 19, 2007 @11:07AM (#21041987) Homepage
    Trusting "trusted" computing requires trusting hardware makers that can insert exploits. Trusted computing is therefore of limited value to end-users in a world where vendors and service providers are routinely leaned on to allow surveillance back doors.

    If you have applications that you need to secure, for example to prevent misuse of tax filings or medical records, you can do it using Web applications or other thin-client technologies combined with physical security of the client computers. Nothing can guarantee that someone won't copy data manually from a screen display and smuggle it out of an office, so there are practical limits to securing data, beyond which additional technology is pointless.

    There are some theoretical cases where trusted computing could benefit individuals. But in practice, it's all about someone else trusting your hardware to rat you out. Most of the money flowing into trusted computing comes from those kinds of uses. "Trusted computing" has rightly earned distrust.
  • by SiliconEntity ( 448450 ) on Friday October 19, 2007 @12:09PM (#21043217)
    Unfortunately there are several DIFFERENT, INCOMPATIBLE concepts being bandied about under the name Trusted Computing. This new "Trusted Computing Project" took on that name seemingly without being aware that there was substantial work already under way on a different concept with the same name.

    Perhaps to try to remedy the confusion, we can distinguish between TC as proposed by the Trusted Computing Group [trustedcom...ggroup.org] and other forms of TC. The TCG is an industry consortium with Microsoft, Intel, HP etc., dating back several years, originally called TCPA. Their proposal has always been controversial but IMO misunderstood.

    TCG's flavor of TC is fundamentally open. I would call it Open Trusted Computing, OTC. It does not lock down your computer or try to prevent anything from running. It most emphatically does NOT "only run signed code" despite what has been falsely claimed for years. What it does do is allow the computer to provide trustworthy, reliable reports about the software that is running. These reports (called "attestations") might indicate a hash of the software, or perhaps a key that signed the software, or perhaps other properties or characteristics of the software, such as that it is sandboxed. All these details are left up to the OS, and that part of the technology is still in development.

    Open Trusted Computing runs any software you like, but gives the software the ability to make these attestations that are cryptographically signed by a hardware-protected key and which cannot be forged. Bogus software can't masquerade as something other than it is. Virus-infected software can't claim to be clean. Hacked software can't claim to be the original. You have trustworthy identification of software and/or its properties. This allows you to do many things that readers might consider either good or bad. You could vote online and the vote server could make sure your voting client wasn't infected. You can play online games and make sure the peers are not running cheat programs. And yes, the iTunes Music Store could make sure it was only downloading to a legitimate iTunes client that would follow the DRM rules. It's good and bad, but the point is that it is open and you can still use your computer for whatever you want.
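    The attestation flow the parent describes can be sketched as follows. This is a deliberately simplified model: real TCG attestation uses an asymmetric attestation identity key inside the TPM (the verifier holds only the public half), whereas this toy uses HMAC with a shared key purely to show the structure of measure, sign, verify.

```python
import hashlib
import hmac

# Toy stand-in for the TPM's hardware-protected attestation key. In real
# hardware this never leaves the chip; software can request signatures
# but cannot read the key.
ATTESTATION_KEY = b"toy-attestation-key-never-leaves-chip"

def attest(software: bytes) -> tuple[bytes, bytes]:
    """Produce (measurement, attestation) for the software that is running."""
    measurement = hashlib.sha256(software).digest()
    signature = hmac.new(ATTESTATION_KEY, measurement, hashlib.sha256).digest()
    return measurement, signature

def verify(measurement: bytes, signature: bytes, expected_hash: bytes) -> bool:
    """A remote verifier checks the signature AND compares to a known-good hash."""
    expected_sig = hmac.new(ATTESTATION_KEY, measurement, hashlib.sha256).digest()
    return hmac.compare_digest(expected_sig, signature) and measurement == expected_hash

clean = b"genuine client v1.0"
known_good = hashlib.sha256(clean).digest()

# The clean client attests successfully.
m, s = attest(clean)
assert verify(m, s, known_good)

# Infected software gets a different measurement, and because the signing
# key is hardware-protected it cannot forge a "clean" attestation.
m2, s2 = attest(b"genuine client v1.0 + virus")
assert not verify(m2, s2, known_good)
```

    Note what the sketch does not do: it never refuses to run anything. The infected client still runs; it just cannot lie about what it is, which is exactly the "open" property claimed above.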

    This is in contrast to some other projects which may or may not call themselves TC but which are focused on locking down the computer and limiting what you can run. The most familiar example is cell phones: they're actually computers, but you generally can't run whatever you want. The iPhone is the most recent controversial example. Now they are going to relax the rules, but apparently it will still only run signed software. This new "Trusted Computing Project" is the same idea: it will limit what software can run. Rumors claim that the next version of Apple's OS X will also have some features along these lines, and that code which is not signed may have to run in sandboxes and have restrictions.

    This general approach I would call Closed Trusted Computing, CTC. It has many problematic aspects, most generally that the manufacturer and not the user decides which software to trust. Your system comes with a list of built-in keys that limit what software can be installed and run with full privileges. At best you can install more software but it is not a first-class citizen of your computer and runs with limitations. Closed Trusted Computing takes decisions out of your hands.

    But Open Trusted Computing as defined by the TCG is different. It lets you run any software you want and makes all of its functionality equally available to anyone. P2P software, open-source software, anything can take full advantage of its functionality. You could even have a fully open-source DRM implementation that used OTC technology: DRM code that you could compile and build yourself and use to download high-value content. You would not be able to steal content downloaded by software you had built yourself. And you could be sure there were no back doors.
  • by Cheesey ( 70139 ) on Friday October 19, 2007 @01:09PM (#21044275)
    Yes, there are certainly benefits. I changed my mind about TC when I needed my own machine to boot up in a trusted state, so that I could be sure that it was safe for me to unlock my encrypted filesystems without the keys being stolen by a trojan. Without a TPM, the only way to do this is to boot from removable media, since an unencrypted kernel on disk could be modified by an attacker. But a TPM could be used to store a key-unlocking-key that would only be available to kernels with my digital signature. Under the control of the owner, TC is useful.
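    The measured-boot and sealing scheme described above can be sketched like this. It is a toy model, not the TPM 1.2 command set: PCR extension follows the real hash-chain shape, but the "sealing" here is an XOR pad derived from the PCR value, standing in for the TPM's encryption under its storage root key, and all key material is invented.

```python
import hashlib
import hmac

def extend(pcr: bytes, component: bytes) -> bytes:
    """TPM-style PCR extend: new_pcr = SHA-256(old_pcr || SHA-256(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def boot(components: list[bytes]) -> bytes:
    """Measure each boot component in order into a PCR starting at all zeros."""
    pcr = b"\x00" * 32
    for c in components:
        pcr = extend(pcr, c)
    return pcr

SRK = b"toy-storage-root-key-inside-the-tpm"  # invented stand-in

def seal(secret: bytes, pcr: bytes) -> bytes:
    """Bind a 32-byte secret to a specific boot state (toy XOR 'encryption')."""
    pad = hmac.new(SRK, pcr, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(secret, pad))

def unseal(blob: bytes, pcr: bytes) -> bytes:
    return seal(blob, pcr)  # XOR with the same pad is its own inverse

# Seal the key-unlocking-key against the PCR value of a clean boot.
good_pcr = boot([b"bios", b"bootloader", b"signed kernel"])
kuk = hashlib.sha256(b"key-unlocking-key").digest()
blob = seal(kuk, good_pcr)
assert unseal(blob, good_pcr) == kuk

# A trojaned kernel yields a different PCR, so unsealing produces garbage
# and the filesystem keys stay protected.
evil_pcr = boot([b"bios", b"bootloader", b"trojaned kernel"])
assert unseal(blob, evil_pcr) != kuk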

    It is a shame that TC almost certainly will be abused in various ways: enforcing DRM on media, games and applications, and creating new ways for major software vendors to lock users into their products. I don't like that possibility at all. Worse still is the possibility that remote attestation might eventually form part of the requirements for connecting to the Internet: that move would suit Apple and Microsoft (goodbye, third-party OSes and web browsers), and it would suit organisations wishing to control the movement of information, such as oppressive governments and record companies.

    But fortunately TC was never designed to be secure against owner tampering, and I suspect it will always be possible to get the private key out of the TPM by using differential power analysis (DPA), if you are sufficiently motivated to do so. I have heard that it is actually impossible to prevent DPA entirely: the most a chip manufacturer can do is make it take more time. Laws like the DMCA would make this type of hacking illegal, but I doubt that would stop anyone, any more than the DMCA has stopped people using DeCSS.
