Security / Software / Linux

The Future of Trusted Linux Computing 158

ttttt writes "MadPenguin.org tackles the idea of Trusted Computing in its latest column. According to author Matt Hartley, the idea of TC is quite reasonable; a locked-down environment offers several advantages to system administrators with possibly troublesome users. 'With the absence of proprietary code in the mix users will find themselves more inclined to trust their own administrators to make the best choices ... And so long as any controlled environment is left with checks and balances [like] the option for withdrawal should a school or business wish to opt out, then more power to those who want a closed off TC in an open source world.'" LWN.net has an older but slightly more balanced look at the TC approach.
This discussion has been archived. No new comments can be posted.
  • by jonwil ( 467024 ) on Friday October 19, 2007 @09:29AM (#21040339)
    There is nothing wrong with hardware assisted security if the owner controls all the keys and nothing can touch the trusted hardware without the owner specifically installing it (i.e. logging in as root/administrator and changing things).

    Trusted Computing is only bad if the owner of the hardware does not have control over the software on the machine, the hardware keys etc.
    • by arivanov ( 12034 )
      Err...

      I would say that the owner should be allowed to do anything he likes provided that he cannot fake the keychain.

      Example: in a pre-baked trusted environment, when accessing resource A I sign with a chain which shows that the access is done by me, through software X, on kernel Y, on hardware Z.

      I should not be allowed to fake kernel Y, but there should be nothing to prevent me from installing an alternative signed kernel Y1. Similarly, I should be able to run Y on Z1, or X1 on Y, as long as the chain is correctly r…
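      As a toy illustration of that chain idea (a sketch only: the key names, claim strings, and the use of HMAC in place of real public-key signatures are all invented for this example):

          import hashlib
          import hmac

          # Each link vouches for the next layer of the stack; HMAC stands in
          # for a real public-key signature made by that layer's key.
          def sign(key: bytes, claim: bytes) -> bytes:
              return hmac.new(key, claim, hashlib.sha256).digest()

          hw_key, kernel_key, app_key = b"Z-key", b"Y-key", b"X-key"

          chain = [
              sign(hw_key,     b"hardware Z booted kernel Y"),
              sign(kernel_key, b"kernel Y loaded software X"),
              sign(app_key,    b"software X accessed resource A for this user"),
          ]

          # A verifier that trusts hardware Z can walk the chain link by link.
          # Swapping in a different *signed* kernel Y1 just produces a different,
          # equally verifiable first link -- which is the point being made above.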
      • Without informing anyone. External entities should be free to *request* specific support software, but the user should always have the right to override that request.
    • Trusted Computing is only bad if the owner of the hardware does not have control over the software on the machine, the hardware keys etc.

      It's not always bad even then. It depends on who the owner of the machine is. If the owner is someone who is easy to socially engineer (90% of users, I'm sure -- Come look at the dancing bears!), then a behemoth corporation is in effect the system administrator for all those people, and locking down machines by allowing only signed applications can make sense. Most people …

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Trusted Computing is only bad if the owner of the hardware does not have control over the software on the machine, the hardware keys etc.

      The only problem is that the whole point of Trusted Computing is to keep the keys used to attest to the state of the PCR completely unavailable to the user. Read the spec: https://www.trustedcomputinggroup.org/specs/TPM/ [trustedcom...ggroup.org]
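      For illustration, a minimal sketch of the attestation ("quote") concept, assuming an RSA attestation key and Python's cryptography package; in real hardware the private key is generated inside the TPM and is never readable, which is exactly the complaint above:

          from cryptography.hazmat.primitives import hashes
          from cryptography.hazmat.primitives.asymmetric import rsa, padding

          # The attestation identity key: in a real TPM this is created inside
          # the chip and the private half can never be exported.
          aik = rsa.generate_private_key(public_exponent=65537, key_size=2048)

          pcr_digest = b"\x12" * 20          # stand-in for the current PCR values
          nonce = b"verifier-chosen-nonce"   # stops replay of an old quote

          # The TPM signs (PCRs || nonce); a remote verifier checks the
          # signature with the public half of the AIK.
          quote = aik.sign(pcr_digest + nonce, padding.PKCS1v15(), hashes.SHA256())
          aik.public_key().verify(quote, pcr_digest + nonce,
                                  padding.PKCS1v15(), hashes.SHA256())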

      • by Rich0 ( 548339 )
        You're missing the point. Of course that is what those pushing TC WANT the system to do. It shouldn't be allowed to take off.

        There is a lot in TC that is good. You could potentially eliminate viruses, for example, with a hardware-backed chain of trust. The issue is that the chain of trust should lead back to the computer owner - not the computer manufacturer.
        • Oh, please. The need to deal with documents, and programs, that are not signed is so prevalent that Palladium's usefulness against them is just about zero.

          It may be useful against tools that corrupt virus checkers, but viruses and vulnerabilities come out so fast in basic software and protocols that this is only of limited usefulness. And that "chain of trust", as Palladium is designed, leads right back to Microsoft, who can be expected to have already handed over keys to the NSA or other federal authorities …
    • by Alsee ( 515537 )
      I am a programmer. I have read the Trusted Platform Module technical specification from cover-to-cover (332 pages of TCPA_Main_TCG_Architecture_v1_1b.pdf) plus numerous surrounding technical specifications.

      There is nothing wrong with hardware assisted security if the owner controls all the keys

      There is nothing wrong with hardware assisted security if the owner is allowed to know all his keys.
      Knowing his keys then provides full control of them.

      The specification explicitly forbids the owner from knowing his own ma…
  • Huh? (Score:2, Insightful)

    by fitten ( 521191 )

    With the absence of proprietary code in the mix users will find themselves more inclined to trust their own administrators to make the best choices


    Proof of this statement?
  • Or are the users getting their CPUs' source code and recompiling them? Or at least calling their LinCPUx fans to do it for them?

    Trusted Computing requires trusting the CPU manufacturer in the first place. And in this world, where the telcos disclosed our conversations to the govt and we found out only several years later, can we really trust that the government hasn't pressured the CPU makers to add a backdoor here and there?

    Trusted Computing is practically closed, and incompatible with the spirit of Open Source/Free Software. Ergo, Trusted Computing cannot be trusted. Sorry.
    • by Chrisq ( 894406 )
      wire your own computer out of logic gates [win.tue.nl]!
    • As others have commented, the gentleman in the article is using TC in a way that isn't the same as we have come to know it. It seems like he's talking about your admin having root access on your box, rather than the DRM controls. Since he's speaking about the former, this really isn't anything new. Most business users don't have admin access to their own PCs. This is standard practice.

      In principle, there is nothing wrong with TC, so long as the owner of the PC has the private keys. But this scenario is …
    • Trusted Computing requires trusting the CPU manufacturer in the first place.

      Actually, TC has almost nothing to do with the CPU. The TC Trusted Platform Module (TPM) is a separate device that is just another peripheral. Most implementations sit on the USB bus.

      Trusted Computing is practically closed, and incompatible with the spirit of Open Source/Free Software. Ergo, Trusted Computing cannot be trusted. Sorry.

      Not true. TC is an open specification, and can be used to implement all sorts of different security policies. The TPM is just a peripheral that provides three services:

      • Hashing of data sent to it. Coupled with TC-aware BIOS this can be used to construct a hash that represents the boot state -- essentially a hash of a
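      As a rough sketch of that first service: the boot-state hash is built by "extending" a Platform Configuration Register, never by writing it directly. A toy model (stage names are made up; real measurements hash the actual binaries):

          import hashlib

          def extend(pcr: bytes, measurement: bytes) -> bytes:
              # A PCR can only be extended, never set:
              #   new_pcr = SHA1(old_pcr || SHA1(measurement))
              return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

          pcr0 = b"\x00" * 20  # PCRs reset to zero at power-on
          for stage in (b"bios image", b"bootloader image", b"kernel image"):
              pcr0 = extend(pcr0, stage)

          # pcr0 now commits to the exact boot chain: change any stage and the
          # final value changes, and with it anything sealed to that value.
          print(pcr0.hex())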
    • I don't fear this too much. Suppose this actually happens, i.e. one CPU manufacturer sells CPUs with a "backdoor". Whatever this may be, it allows some level of remote control over the PC.

      This would almost certainly be discovered. Let's suppose we can't choose a competitor instead, because they're all in a big conspiracy.

      Making CPUs isn't that hard. It's making them the fastest and the cheapest that's hard. There are open source processor designs available, like the LEON core [wikipedia.org]. There are lots of producers of FPGAs [wikipedia.org] …
  • by amigabill ( 146897 ) on Friday October 19, 2007 @09:53AM (#21040691)
    With the absence of proprietary code in the mix users will find themselves more inclined to trust their own administrators to make the best choices

    Sorry, but I think that's putting your words into everyone else's mouths. Or fingertips, or whatever. The vast majority not only don't have this opinion about open vs proprietary code affecting how much they trust the choices their admins make, they also wouldn't have a freakin' clue as to what you're going on about in that sentence. The vast majority don't know what open source is or how it differs from proprietary source; they don't know any reason why they'd care either way, and they'd probably give you a pretty funny look for attributing this philosophy to them.

    I like Linux and open source, and have an appreciation for it. But I don't trust my admin at work more when he talks about Linux than when he's talking about Solaris. It's his job to make the best choices of any and all products available, and I trust him to choose whichever is most appropriate for our company, even if he feels that happens to be a proprietary product. It's not my place to impose on him to only ever choose open source, and there are cases in our work where open-source offerings are less ideal.
  • Deception (Score:4, Insightful)

    by IgnoramusMaximus ( 692000 ) on Friday October 19, 2007 @10:16AM (#21041045)
    These sorts of propaganda pieces have only one purpose: to sneak one past us. Trusted Computing (as presently defined by the corporate founders of the TC Consortium) has two major purposes, both deadly to all things "open":
    • To make sure that the computer can be trusted by a "content owner", thus precluding the owner of the computer itself from being able to trust it
    • To allow for so-called "remote attestation", which lets 3rd parties (banks and the like) trust the computer, again to the exclusion of its owner. The additional effect of this is that banks and other online entities will be able to ensure that only Windows systems with "approved" apps are used. No more spoofing of user-agent tags; the end of Linux use on most of the commercial Internet.

    In short, this article aims to lure the unwary into gullible acceptance of TC with deceitfully presented and impractical applications (no one except the mega-corps will ever get access to the main TPM keys).

    • there are two definitions of "trusted computing", and it depends on who is doing the trusting.

      the first definition basically boils down to "we don't trust users" - and is the version of trusted computing that you're describing.

      the second definition basically says "we want users to be able to trust their computers and be able to do what they want without worrying".

      it should be fairly obvious which definition a linux-based, free-software-backed distribution will go for, especially with the backing and quiet involvement of a couple of heads of police departments, and several professors from royal holloway.
      • it should be fairly obvious which definition that a linux-based, free-software-backed distribution will go for, especially with the backing and quiet involvement of a couple of heads of police departments, and several professors from royal holloway.

        Except of course that it is a red herring. The first definition is the only one on which all of the proposed mass-produced commercial TPM designs are based. Most of these designs also include select, castrated elements of the second definition, as a bone to thr…

      • Also I forgot to add:

        ... especially with the backing and quiet involvement of a couple of heads of police departments ...

        You gotta be kidding, right?

        Police are the last people on earth who want users of personal computers to be able to detect intrusions. They are, especially the newfangled "Father ... err ... Homeland Security" types, one of the groups most eager to use trojans, keyloggers and the like in their pursuit of "criminals".

      • Oh, and did I mention that having the corporations control the user's computer, and being the only ones able to detect intrusions while the user cannot, is the ideal scenario for police? The police can then simply require a megacorp to selectively ignore the police trojans, and the user gets the worst of both worlds: the only people who are able to trust "his" computer are the corporations and the police.
  • ...than having proper permissions set up on a machine and doing a lockdown like what's built in to Gnome? Having proper permissions prevents people from installing shit and running programs that they're not supposed to. Using Gnome's lockdown feature prevents them from fucking up their DE.
  • In corporate networks, this will just lock down your PC a little more than it already is. Nothing to see here, move on please. It is in the home that this shit gets interesting. Do you want your ISP, and possibly MS, to rule your PC? For the typical /. reader, the answer is a clear NO. But what about grandma? Imagine your ISP offering 2 kinds of subscription: a normal, "free" one and a "protected" one. The protected one is firewalled (or at least NAT-ed) at the ISP, with just "sensible" traffic allowed, l…
    • by Cheesey ( 70139 )
      Why, that sounds like the future!

      2007. The problem with the current "untrusted" Internet is that anyone can join, make themselves effectively anonymous, and take part in terrible crimes that threaten to undermine the infrastructure of society. Such as piracy, child pornography, terrorism, money laundering, Linux, and spam.

      2017. Clearly, this could not go on. The solution that has been legally mandated requires the network to be upgraded before 2025, so that all packets have to be digitally signed by the ori…

  • by Zigurd ( 3528 ) on Friday October 19, 2007 @11:07AM (#21041987) Homepage
    Trusting "trusted" computing requires trusting hardware makers that can insert exploits. Trusted computing is therefore of limited value to end-users in a world where vendors and service providers are routinely leaned on to allow surveillance back doors.

    If you have applications that you need to secure, in order to prevent, for example, misuse of tax filings or medical records, you can do it using Web applications, or other thin client technologies combined with physical security of client computers. There is nothing that can guarantee stopping someone copying data manually from a screen display and smuggling it out of an office, so there are practical limits to securing data beyond which additional technology is pointless.

    There are some theoretical cases where trusted computing could benefit individuals. But, in practice, it's all about someone else trusting your hardware to rat you out. Most of the money flowing into trusted computing comes from those kinds of uses. "Trusted computing" has rightly earned distrust.
  • Trusted computing also enables a real market in CPU time. You can sell your spare processor cycles, since the trusted machine can attest that the result really came from the code you sent out. This would likewise be necessary for software agents that run on unknown people's servers.

    It would also be useful for implementing true ecash schemes, and for allowing true p2p-based virtual worlds/games with safeguards against cheating.

    In short, the technology offers a lot more promise than mere security.
    • Re: (Score:3, Interesting)

      by Cheesey ( 70139 )
      Yes, there are certainly benefits. I changed my mind about TC when I needed my own machine to boot up in a trusted state, so that I could be sure that it was safe for me to unlock my encrypted filesystems without the keys being stolen by a trojan. Without a TPM, the only way to do this is to boot from removable media, since an unencrypted kernel on disk could be modified by an attacker. But a TPM could be used to store a key-unlocking-key that would only be available to kernels with my digital signature. Un…
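      A conceptual sketch of that sealing behaviour, with a Python dict standing in for the hardware enforcement a real TPM provides (the class and method names are invented for illustration):

          import hashlib
          import os

          class ToyTPM:
              def __init__(self):
                  self.pcr = b"\x00" * 20
                  self._sealed = {}  # secrets held inside the chip

              def extend(self, measurement: bytes):
                  self.pcr = hashlib.sha1(
                      self.pcr + hashlib.sha1(measurement).digest()).digest()

              def seal(self, secret: bytes):
                  self._sealed[self.pcr] = secret  # bound to current boot state

              def unseal(self) -> bytes:
                  # Raises KeyError unless the same measured chain was booted.
                  return self._sealed[self.pcr]

          tpm = ToyTPM()
          tpm.extend(b"my signed kernel")
          tpm.seal(os.urandom(32))   # key-unlocking-key for the encrypted fs
          kuk = tpm.unseal()         # succeeds: same boot state

          evil = ToyTPM()
          evil.extend(b"trojaned kernel")
          # evil.unseal() would raise KeyError: wrong PCRs, no key released.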
  • by SiliconEntity ( 448450 ) on Friday October 19, 2007 @12:09PM (#21043217)
    Unfortunately there are several DIFFERENT, INCOMPATIBLE concepts being bandied about under the name Trusted Computing. This new "Trusted Computing Project" took on that name seemingly without being aware that there was substantial work already under way on a different concept with the same name.

    Perhaps to try to remedy the confusion, we can distinguish between TC as proposed by the Trusted Computing Group [trustedcom...ggroup.org] and other forms of TC. The TCG is an industry consortium with Microsoft, Intel, HP etc., dating back several years, originally called TCPA. Their proposal has always been controversial but IMO misunderstood.

    TCG's flavor of TC is fundamentally open. I would call it Open Trusted Computing, OTC. It does not lock down your computer or try to prevent anything from running. It most emphatically does NOT "only run signed code" despite what has been falsely claimed for years. What it does do is allow the computer to provide trustworthy, reliable reports about the software that is running. These reports (called "attestations") might indicate a hash of the software, or perhaps a key that signed the software, or perhaps other properties or characteristics of the software, such as that it is sandboxed. All these details are left up to the OS, and that part of the technology is still in development.

    Open Trusted Computing runs any software you like, but gives the software the ability to make these attestations that are cryptographically signed by a hardware-protected key and which cannot be forged. Bogus software can't masquerade as something other than it is. Virus-infected software can't claim to be clean. Hacked software can't claim to be the original. You have trustworthy identification of software and/or its properties. This allows you to do many things that readers might consider either good or bad. You could vote online and the vote server could make sure your voting client wasn't infected. You can play online games and make sure the peers are not running cheat programs. And yes, the iTunes Music Store could make sure it was only downloading to a legitimate iTunes client that would follow the DRM rules. It's good and bad, but the point is that it is open and you can still use your computer for whatever you want.
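    The verifying side of the vote-server or game-server examples might conceptually reduce to something like this sketch (the allowlist contents and function names are assumptions, not any real protocol):

        import hashlib

        # Hashes of client builds the service is willing to talk to
        # (values invented for illustration).
        TRUSTED_CLIENT_HASHES = {
            hashlib.sha256(b"voting-client-1.0").hexdigest(),
        }

        def accept_connection(reported_hash: str, quote_is_valid: bool) -> bool:
            # Both checks matter: the attestation must be genuinely TPM-signed,
            # and the measured software must be one the service trusts.
            return quote_is_valid and reported_hash in TRUSTED_CLIENT_HASHES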

    This is in contrast to some other projects which may or may not call themselves TC but which are focused on locking down the computer and limiting what you can run. The most familiar example is cell phones. They're actually computers but you generally can't run whatever you want. The iPhone is the most recent controversial example. Now they are going to relax the rules but apparently it will still only run signed software. This new "Trusted Computing Project" is the same idea, it will limit what software can run. Rumors claim that the next version of Apple's OS X will also have some features along these lines, that code which is not signed may have to run in sandboxes and have restrictions.

    This general approach I would call Closed Trusted Computing, CTC. It has many problematic aspects, most generally that the manufacturer and not the user decides which software to trust. Your system comes with a list of built-in keys that limit what software can be installed and run with full privileges. At best you can install more software but it is not a first-class citizen of your computer and runs with limitations. Closed Trusted Computing takes decisions out of your hands.

    But Open Trusted Computing as defined by the TCG is different. It lets you run any software you want and makes all of its functionality equally available to anyone. P2P software, open-source software, anything can take full advantage of its functionality. You could even have a fully open-source DRM implementation that used OTC technology: DRM code that you could even compile and build yourself and use to download high-value content. You would not be able to steal content downloaded by software you had built yourself. And you could be sure there were no back doors, …
    • This piece of propaganda that you are spouting is indeed 'Interesting' and 'Insightful' in how clever it is.

      You are right that TC only provides a signature which cannot be forged. But if you, the user, cannot forge the signature of the result of the CPU cycles that the computer runs, then anyone can write software that does X and Y and Z only, ONLY, when you provide signed data to them, and won't work if you don't ...

      And that's the point! That is exactly what everyone will immediately do: the banks, t…

    • It all still simplifies down to the fact that either I have the keys for my machine, in which case the content industry could not trust me or my machine.

      Or a third party has the keys, in which case I am no longer in control of my machine. It is not "My machine" anymore. I can no longer compile and run my own software. I can only run what my drm masters deem "trustworthy".

  • ...about the ramifications (both good and bad) of TC can be found here [cam.ac.uk].

    The main problem I have with TC is the fact that it removes control over the hardware from the user and gives it to a 3rd party entity.

      When I purchase hardware, I expect to have full control over its capacities. If the hardware is capable of doing something, I should be able to do it. There's something a bit eerie about giving your computer a command/instruction and having it come back and tell you it could do it, but that it won't …
  • It's quite helpful to watch as a primer/refresher: the wonderful animation [lafkon.net] about Trusted Computing. Simple, good, understandable.

  • Did you know that the TCG/TCPA specifications create a technical definition of the "owner" of a device? It could be the manufacturer, the reseller, a sysadmin, a user, or someone the user loans the machine to. It all depends on who "takes ownership" (also technically defined in the spec) first. The "owner", in this sense, is the one who gets to specify which signing keys are needed to sign code that the owner wants to allow to run. This can include vendor keys, and even a user's own signing key.

    Whether TC is …
