TCPA Support in Linux 501
kempokaraterulz writes "Linux Journal is reporting that "The Trusted Computing Platform Alliance has published open specifications for a security chip and related software interfaces.". In the latest Gentoo Newsletter they talk about a possible 'Trusted Gentoo', and possible uses for hardware level security."
Finally ready for the main stream (Score:5, Funny)
sigh
As sad as it is (Score:5, Informative)
Re:As sad as it is (Score:2)
Stupid mods.
Seriously, though - I do agree that there are _some_ potential benefits to this. Unfortunately, the concept opens up the possibility of DRM restrictions infecting the Linux operating system. If we continue to run on non-TCPA hardware, at least we can argue that our system will not support their restrictions.
Re:As sad as it is (Score:5, Insightful)
In general...sure...TCPA could have some positive effects on the computing community. However, it also has great potential to be slipped in quietly...and eventually, by law, it will have to be used to lock things down. Only a few things at first...but eventually it could mandate a great deal of limitations on what you can legally do with a computer. As much as corporate entities are beginning to use the govt. to legislate things...and they really don't like the fair use we do have...it is easy to foresee this as a means to that end.
Given long enough...it could happen, which is why you need to take things like this slowly and with a great deal of skepticism early on.
I heard it said before that "What one generation tolerates...the next generation embraces."
Think of it this way...the article the other day on /. about how many US kids don't understand what the 1st amendment really means...they haven't been taught about it...and we're tolerating loss of freedoms. When they are grown and we're not around...they won't even know they existed in the old form...
Here comes the flood?? (Score:5, Interesting)
The "trusted" boot functions provide the ability to store in Platform Configuration Registers (PCR), hashes of configuration information throughout the boot sequence. Once booted, data (such as symmetric keys for encrypted files) can be "sealed" under a PCR. The sealed data can only be unsealed if the PCR has the same value as at the time of sealing. Thus, if an attempt is made to boot an alternative system, or a virus has backdoored the operating system, the PCR value will not match, and the unseal will fail, thus protecting the data.
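The seal/unseal behavior described above can be sketched in a few lines. This is a toy, hashlib-only model: the XOR-based seal is an illustrative stand-in, since a real TPM encrypts under an on-chip key via the TPM_Seal/TPM_Unseal commands, and the PCR values and boot components here are invented.

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # TPM-style extend: PCR := SHA-1(old PCR || SHA-1(component))
    return hashlib.sha1(pcr + hashlib.sha1(component).digest()).digest()

def seal(secret: bytes, pcr: bytes) -> bytes:
    # Toy stand-in for sealing: mask the secret with a PCR-derived key.
    # A real TPM encrypts under a key that never leaves the chip.
    key = hashlib.sha256(pcr).digest()
    return bytes(s ^ k for s, k in zip(secret, key))

def unseal(blob: bytes, pcr: bytes) -> bytes:
    return seal(blob, pcr)  # XOR masking is its own inverse

pcr0 = b"\x00" * 20                        # PCRs start zeroed at reset
good = extend(pcr0, b"trusted bootloader")
blob = seal(b"disk key", good)

# Same boot chain -> same PCR value -> the unseal succeeds.
assert unseal(blob, extend(pcr0, b"trusted bootloader")) == b"disk key"
# An alternative or backdoored boot yields a different PCR, and unseal fails.
assert unseal(blob, extend(pcr0, b"backdoored bootloader")) != b"disk key"
```

The point the quote makes falls out of the last two lines: the data itself is never compared against the boot chain, but the key needed to recover it only exists when the PCR matches.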
At the very least, that sounds like "bye-bye multi-boot systems".
IBM also has a rebuttal to TCPA's detractors [PDF] [ibm.com]. This one talks more about how the TCPA chip as currently designed has "not been designed to resist local hardware attack, such as power analysis, RF analysis, or timing analysis." That's all well and good for the moment, and while the chip is (per the PDF) mounted on a presumably-removable daughterboard, how about the future? Is this how TCPA will stay, or is it the beginning of our worst fears??
At least these two whitepapers agree with most of us here on one thing -- DRM itself is stupid, for a variety of reasons.
Re:Here comes the flood?? (Score:3)
One wonders what forensics types, particularly government forensics types, have to say about this.
And I mean "have to say about this"
Re:Here comes the flood?? (Score:5, Interesting)
This means that the entire security of the boot process hangs on whatever data the CPU feels like sending to the chip for hashing. I could just as well make a patch for GRUB that sends the "secure" version of GRUB down the SMBus for hashing and actually executes whatever nastiness I have in store.
In the case of DRM this lets me run whatever OS I want. The only thing I have to do is to feed a copy of whatever OS Hollywood trusts to the chip and voila the chip will say I'm legit and Hollywood will give me access to their movies for me to pirate at my leisure. :)
As I see it, the only way to get this to work for real is if Intel steps up and builds TCPA support into the CPU itself such that the PCR register is continuously updated as each instruction is executed. And all existing external chips have to be blacklisted, of course.
Or does the TCPA system have some other trick up their sleeve that makes this work even though it's implemented externally to the CPU?
/greger
Re:Here comes the flood?? (Score:3, Interesting)
And it does make one wonder if a VM that's wise to the TCPA chip might be a solution to the "handcuffed" machine that Alsee (http://slashdot.org/~Alsee [slashdot.org]) often predicts as the end result of TCPA. If the CPU gets involved, perhaps the "freed" OS could run on a second non-TC CPU on an add-on card, sort of like the old way to run Windows on a Mac??
Just throwing out ideas, some of them possibly cracked. Feel free to add glue as needed.
Re:Here comes the flood?? (Score:5, Informative)
That's a clever idea, but it doesn't work. The secret is that the trusted boot process uses a concept of "trust extension". We start off with the BIOS. That takes a hash of itself and sends it to the TPM. Then the BIOS will load and run the boot loader. But - and here is the key - before running GRUB, the BIOS takes a hash of GRUB and sends it to the TPM. Then it runs GRUB.
The next step is that GRUB - or at least the TPM-enabled version [prosec.rub.de] - performs a similar process for the OS kernel. It first takes a hash of the kernel and sends it to the TPM; then it runs the kernel. And the kernel can repeat the process with the various startup scripts and other programs that it loads, a la tcgLinux [ibm.com] or the Enforcer [sourceforge.net].
The key point is that before each component is loaded, it is "measured" (i.e. its hash is reported to the TPM). So you can create a bogus GRUB or a bad kernel, but this fact will show up in the TPM's configuration registers because your bad component got its hash reported before it ran.
The one exception is the BIOS, but TPM systems are supposed to have restricted BIOS flash capabilities so you can't re-flash the part of the BIOS which does the initial hash of itself. This is part of what they call the Core Root of Trust for Measurement (CRTM) and it is supposed to be inviolable.
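The measure-before-load chain described above can be simulated in a few lines of Python. The component contents here are invented, and a real TPM performs the SHA-1 extend in hardware, but the structure of the argument is the same:

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # Each measurement folds into the PCR: PCR := SHA-1(PCR || SHA-1(component))
    return hashlib.sha1(pcr + hashlib.sha1(component).digest()).digest()

# Every stage measures the NEXT stage before transferring control to it.
chain = [b"BIOS image", b"GRUB stage2", b"vmlinuz", b"startup scripts"]

pcr = b"\x00" * 20                 # PCRs are reset to zero at power-on
for component in chain:
    pcr = extend(pcr, component)   # measured first...
    # ...and only afterwards would the component actually run.

# Swapping in a bad kernel changes the final PCR, because GRUB reported
# the kernel's hash to the TPM before the kernel ever got to execute.
evil = b"\x00" * 20
for component in [b"BIOS image", b"GRUB stage2",
                  b"backdoored vmlinuz", b"startup scripts"]:
    evil = extend(evil, component)

assert evil != pcr
```

This is why the GRUB-patch attack fails: by the time your modified component is running and could lie, its real hash is already locked into the register.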
Re:Here comes the flood?? (Score:3, Informative)
From the Inquirer: [theinquirer.net]
Improved architecture for Prescott [CPU] includes better pre-fetcher branch prediction, advanced power management, improvements to hyperthreading technology, the PNI above, La Grande support, better imul latency and additional WC buffers. La Grande is the security feature Intel told us about at the last IDF, and includes protection in the CPU, at the platform level, and with software.
And this story: [my-esm.com]
Addressing growing se
Re:Here comes the flood?? (Score:4, Interesting)
And as you point out (re SP3 over SP2) -- what's to stop the OS from refusing to play nice if it doesn't encounter the PCR that it expects to see?? Might you have to provide your PCR when the OS is activated, and then you only get updates if the PCR still matches??
Can, meet worms.
Re:Finally ready for the main stream (Score:4, Interesting)
It has been my understanding that trusted computing does not automatically equal DRM. Trusted computing is initially neutral technology: the barriers are built up only after somebody picks a side for the chip. You can let Microsoft turn your PC into a DRM environment using TCPA's technology, but that's the Microsoftish / {MP,RI,??}AA'ish approach. You can also use TCPA to turn your Linux box into a hardware-reinforced installation of your choice. If TCPA were widespread, you could for example control how the bastard big co. digitally uses, views and copies your personal information when you buy something on their website.
Re:Finally ready for the main stream (Score:3, Informative)
If you have the technical brainpower to use TCPA + Linux to build yourself a secure hardware platform, you could also more easily build an equally secure all-software Linux platform.
The only advantage TCPA gets from using hardware is that it's a big barrier to entry for reverse engineers with physical access to the machine: they can't just load it up into an emulator/debugger, they also have to dissect the
Tee hee... published before editing was finished (Score:5, Funny)
Garrick? Garrick? McFly? McFlyyyyyyyyyy?
Re:Tee hee... published before editing was finishe (Score:3, Funny)
From the Fine Article:
-theGreater.
Apparently this is not the first time... (Score:3, Informative)
Go to the Linux Journal search function [linuxjournal.com] and search for 'garrick'. You should get eleven hits. I didn't read all of them, but using ctrl+f to search the pages revealed notes to Garrick re: font selection and the like. D'oh.
Do we really need it ? (Score:5, Insightful)
Isn't the only purpose of pushing things like TCPA to lock the platform down?
Comment removed (Score:5, Informative)
Re:Do we really need it ? (Score:3)
Re:Do we really need it ? (Score:3)
Oh, sure. Linux is perfectly secure, right? Keep on dreaming, it must be nice.
Actually, Linux security bugs [slashdot.org] are found all the time. Dan Bernstein had a bunch of undergraduates look and found 44 Linux security bugs [slashdot.org] in just a few weeks. The only reason Linux users are sa
Re:Do we really need it ? (Score:4, Interesting)
Oh, sure. TCPA can protect against OS bugs? Keep on dreaming, must be nice.
TCPA means that signed software can run with full permission. It only stops intentional exploits (programs specifically designed to infringe copyright), not accidental ones (buffer overflows or cross-site scripting).
To block such things, there are many well-known techniques that can be applied - privilege separation, data tainting, external-error trapping, etc. But all of those can be implemented in software alone, without help from TCPA or any other hardware. Conversely, TCPA without those significant software changes gives zero benefit.
The only people TCPA might protect is those who put themselves at risk by running slapdash amateur software like Linux and OpenBSD, instead of staying with known quality brands like Microsoft, where security is job N!
PS. Incidentally, the flaws in your argument are directly analogous to those in George W. Bush's social security plan. In both cases, to prevent a vague danger, he suggests doing two different activities, when really only one of them goes towards solving the difficulty at all - the other just serves his ideological agenda (and is more elaborate and expensive, to boot).
Re:Do we really need it ? (Score:4, Interesting)
I might want only a limited set of applications accessing a certain storage area.
Imagine a trusted P2P application that will only interconnect with the same trusted application? The trust works both ways. Just like the RIAA thinks they can "trust" their software running my computer to not be of my own creation, or a tampered version of their software, I can "trust" that MY software running on the RIAA's computer is similarly my original code, not tampered with or substituted.
Re:Do we really need it ? (Score:3, Insightful)
You can accomplish all those things in a 100% software implementation of privilege separation. No special TCPA hardware is needed.
However, if you did have the special hardware, you would still need modified TCPA-aware applications and OS to make it work.
So let's consider the two paths towards reaching your goal:
A) A modified OS that restricts which of your applications are allowed to access which parts of your file system
Re:Do we really need it ? (Score:3, Insightful)
No you can't. The RIAA has the money and contracts to give orders to the people holding the keys with which the software was signed. You don't have that level of influence yourself.
Re:Do we really need it ? (Score:2)
No, of course we, the users, don't need DRM. Its entire purpose is to take control of a computer away from the user and put it into the hands of some other entity, who can allow or disallow any given function remotely. In such a system, DRM's role is to ensure that the user cannot regain control.
This, of course, is enormously useful for the entertainment and software industries. Especially the latter can force any license terms it pleases after DRM becomes required by law (it will - the two industries combined hav
Re:Do we really need it ? (Score:4, Insightful)
But number two comes a couple years down the road from widespread adoption, when some critical flaw in TCPA is found by hackers, TCPA is hacked, and innocent businesses that have come to depend on it for security are disrupted and exploited. And then we're looking around all doe-eyed, like, "but they said it was unbreakable security, they said it was trusted computing!" TCPA is just another level of command hierarchy, and subject to hack.
"Trusted computing" has got to be one of the most insidious marketing doublespeaks I've ever heard in my life. All "Trusted Computing" consists of is computers who don't trust me.
what is it good for? (Score:2, Insightful)
Re:what is it good for? (Score:5, Insightful)
Imagine that you're an admin at some big company, with a hundred Linux boxes. You have this stuff on each of those boxes, and a computer for administration somewhere safe. When you install software you first check it, then sign it, then push updates to your servers.
If somebody gets in, they'll have a quite difficult time. Anything unsigned - rootkit modules, exploits, etc. - simply won't run at all. That would take out quite a big part of the exploits an attacker could use. Remote ones would hopefully be avoided by NX.
This wouldn't protect against things like races, but it certainly could help quite a lot.
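The check-sign-push workflow above might look something like this sketch. An HMAC stands in for the asymmetric signature a real offline admin box would produce (so that servers would only need to hold a public key); every name and byte string here is made up:

```python
import hashlib
import hmac

ADMIN_KEY = b"kept only on the offline admin box"  # stand-in for a private key

def sign(binary: bytes) -> bytes:
    # A real deployment would use an asymmetric signature (RSA, DSA, ...)
    # so servers never hold signing material; HMAC keeps this stdlib-only.
    return hmac.new(ADMIN_KEY, binary, hashlib.sha256).digest()

def can_run(binary: bytes, signature: bytes) -> bool:
    # The kernel or loader would refuse to exec anything failing this check.
    return hmac.compare_digest(sign(binary), signature)

update = b"\x7fELF...patched sshd"
sig = sign(update)                  # signed on the admin machine, then pushed

assert can_run(update, sig)                  # legitimate update runs
assert not can_run(update + b"!", sig)       # tampered binary does not
```

The design point is that signature verification needs no secret on the servers once real public-key signatures are used, so compromising a server doesn't let the attacker sign anything.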
The situation above is something I wouldn't have any problems with. If an admin wants an uber-locked-down system where nothing runs unless it's signed by his key - a key that's only present on a computer with no network connection, in a secure room with an armored door - then sure, why not. I'm fairly sure this can mostly be accomplished without hardware support at all, though.
Now, it's when software publishers want to make it impossible for me to control my computer when I have problems with it. But if the user has full control of it, I think it could come quite handy in some cases.
Re:what is it good for? (Score:2)
If it's implemented in software, then somebody will just hack that software.
The idea is that every bit executed by the CPU must be signed by a third party, enforced by hardware, with NO WAY TO BE MODIFIED.
Re:what is it good for? (Score:2)
I don't think there's any fundamental problem that makes it impossible to build a VM or emulator whose protections can't be broken through. If the attacker got in over ssh by guessing a password or whatever, and assuming the kernel doesn't have any defects that make bypassing the protection possible, it should work just as well.
The scenario I presented in the grandparent should work just fine with software, IMHO. Now, the chip is certainly useful if you want to take away control from the owner of the machine.
Re:what is it good for? (Score:2)
Let me put it this way... All good encryption algorithms are open -- you can examine the spec and the implementation as much as you want. But you still need the key in order to get the scrambled data out.
This means that you can examine this chip as much as you want. As long as it does not give up its keys, it should still be secure. Then your computer can be trusted t
Re:what is it good for? (Score:4, Insightful)
In the end, it depends on who gets to sign the software, and how this software is distributed once signed. In our corner of the court, we have the admin signing software for 100 boxes (does he have to sign each separately? Can you sign software for every box out there at once? If it's not a specific-to-that-machine signature, how do you keep the attacker from signing software too?) for the purpose of protecting the servers from software you don't want to run.
In the other corner of the court, it appears that we have big business interests who want to have all software signed, who would charge hundreds to sign software for other authors (verisign, et al will certainly be in the business), MPAA and RIAA will be wanting to make sure signed software obeys their rules (and will probably charge for this too), all to make sure your computers are protected from software they don't want you to run.
Things like this IBM article help make the first scenario a reality, and I'm grateful for it. Now, who wants to be the first to be sued by Microsoft for some TCPA submarine patent that nobody knows about?
Re:what is it good for? (Score:2)
Re:what is it good for? (Score:2)
Games.
There are many things, for instance, in MMORPGs and FPSs that have to be done now on the server that could be done better (in terms of performance and in terms of providing a better game experience) on the client, but can't be done there because it would allow cheating.
Re:what is it good for? (Score:2)
Think again: the manufacturers of TCPA have admitted that it is not secure against hardware manipulation. It is likely that we will begin to see TCPA mod-chips hitting the market soon after TCPA takes hold (that's right, people: TCPA means you will have to mod-chip your PC). The whole point is to make sure that circumvention of DRM requires more effort than the masses will spend - an expensive and illegal mod-chip is considered barrier enough.
Re:what is it good for? (Score:3, Interesting)
While I have a problem with the uses of this platform that Microsoft no doubt intends, TCPA can be quite useful for making secure systems based on open standards.
One part of these modules is the ability to send keys to the hardware module in a way that cannot be read back out (but with encryption performed using this write-only data). This allows public-key encryption with the private key stored
Re:what is it good for? (Score:3, Informative)
LOSE...not loose. You can lose money...you can turn a dog loose. Two different words...two entirely different meanings.
Comment removed (Score:3, Insightful)
Linus Torvalds himself has blessed DRM (Score:5, Insightful)
Re:Linus Torvalds himself has blessed DRM (Score:3, Insightful)
Linux can show what user-centric trusted computing can/should do. Microsoft et al. will be showing what Big Business trusted computing wants/can do.
Eventually there will be those that will ask why it has to work against them so much when running Billy Bob's OS, and then they'll realize that their PC is not their PC, but the industry's PC.
Re:Linus Torvalds himself has blessed DRM (Score:3)
Sillier things have happened.
Re:Linus Torvalds himself has blessed DRM (Score:3, Insightful)
Linus is not a lawyer. More importantly, he's not even a free software or open source evangelist. Unlike RMS or ESR, he doesn't even hang out with lawyers or devote serious thought to legal matters.
Since DRM is a combined legal-technical area, it falls outside Linus's expertise, and his opinion carries little weight. (From a practical standpoint, TCPA is incompatible with the Linux philosophy of open-source modifications)
Re:Linus Torvalds himself has blessed DRM (Score:3, Funny)
Linus is not a lawyer. ... Unlike RMS or ESR, he doesn't even hang out with lawyers or devote serious thought to legal matters.
I knew there was a reason why I liked the guy!
not entirely so (Score:4, Insightful)
From a practical standpoint, TCPA is incompatible with the Linux philosophy of open-source modifications
IMO this is not exactly correct - is it against Linux philosophy of open-source modifications to secure my Linux box so nobody except me can make modifications to it?
TCPA used in such way (i.e. in interest of user, not supplier, not government, ...) is quite in line with Linux philosophy of "you're in control" :) .
But, as with all weapons, it has two edges. So, beware! :)
Re:Linus Torvalds himself has blessed DRM (Score:2)
As such, open source is not at odds with the ostensible goals of TCPA, any more than it is with his ownership of the Linux trademark.
If you can't beat 'em, join 'em. (Score:2, Insightful)
Re:If you can't beat 'em, join 'em. (Score:5, Funny)
Wait, wait...you lost me on that one.
Re:If you can't beat 'em, join 'em. (Score:2)
It's probably a typo. They can't seriously expect anyone to Trust Windows.
Lacking One Thing (Score:5, Interesting)
Physical access to machines is always a big issue in security, and one that is often overlooked. And while it's probably not a big deal for your home machine, consider large companies whose machines could conceivably be targeted for a physical attack to recover the keys directly from the TPM (Trusted Platform Module).
Stajano's "Ubiquitous Computing" book has excellent coverage of the rationale, issues, and complexity of attempting to prevent physical access to chips and devices which store sensitive information. It's an easy read, and well worth it: http://www-lce.eng.cam.ac.uk/~fms27/secubicomp/in
Re:Lacking One Thing (Score:2)
Re:Lacking One Thing (Score:2)
Slashdot ran a story a few weeks ago about a new set of chips with built-in TPM [slashdot.org] features. These chips have the Trusted Computing capabilities built into the CPU. It will make it much more difficult to attack them physically since the TPM is not a separate module but is integrated into the whole system. Probably this is how
Re:Lacking One Thing (Score:2)
Put the computer in a secure vault, and put guys with guns at the door and perimeter.
Hardware Security (Score:3, Interesting)
Wouldn't that be... (Score:4, Funny)
TCPA is a DRM smokescreen (Score:5, Informative)
What is remote attestation? Basically, it means that the TCPA chip, which you cannot control, can read what operating system you have loaded and send a response to others on the Internet proving that you are running a certain operating system. The purpose of this, of course, is so that the operating system can be verified not to have its DRM functions cracked, so that the RIAA and MPAA can send you data and make sure that they get to decide what you do with it.
The people pushing TCPA will claim that it is not for DRM, but that is a smokescreen and only a smokescreen. While TCPA does not do DRM itself, it is the enabling component that is needed so that software can implement DRM without being circumventable.
What does this mean for a "trusted Linux"? It means that while it is completely possible to have a Linux system working with TCPA, once you change anything in the system, the TCPA chip will notice you are running a modified system and no longer give you access to your data. So while the software may nominally remain under the GPL, it will be the death of the free software model, because users who wish to tinker with their systems will be locked off the Internet (Cisco is already talking about systems to have ISPs demand remote attestation once TCPA is in place). TCPA and Linux can be combined in theory, but only in theory - in reality they cannot ever coexist.
Those who do not believe me (or those who are inclined to believe the MS shills who will respond saying that I am wrong) should read the EFF's analysis of TCPA [eff.org], where they give a simple way the chip could be changed to allow all uses except remote attestation intended to force people to use certain operating systems and enforce DRM over the user. It has been completely ignored by the manufacturers of TCPA.
Re:TCPA is a DRM smokescreen (Score:2)
Use an x86 emulator and two copies of Linux, one that uses TCPA and one that doesn't. Run the x86 emulator on the unrestricted Linux copy, and use it to run the TCPA copy under emulation. The x86 emulator would just have some security 'flaws' when it comes to storing keys or it might do stuff like forgetting to apply the encryption. It would still report as a valid DRM chip, and would be able to provide keys and authentication on demand.
Re:TCPA is a DRM smokescreen (Score:2)
Emulation won't work to defeat DRM. The TPM generates an on-chip secret key at manufacture time which never leaves the chip. The manufacturer issues a certificate on the corresponding public key which certifies that this is a legit TPM key. The TPM then is able to issue attestations about the software configuration based on what is called a "secure boot" sequence, in which the TPM creates a fingerprint of the software confi
Re:TCPA is a DRM smokescreen (Score:2)
The purpose of this, of course, is so that the operating system can be verified not to have its DRM functions cracked, so that the RIAA and MPAA can send you data and make sure that they get to decide what you do with it.
(Cisco is already talking about systems to have ISPs demand remote attestation when TCPA is in place). TCPA and Linux can be combined in theory, but only in theory - in reality they cannot ever coexist.
Lets be clear about what it is that you are saying here, even Linux OS's th
Re:TCPA is a DRM smokescreen (Score:2)
And see this rebuttal to the EFF report [invisiblog.com].
Further see this blog entry by the same author on good uses of Trusted Computing [invisiblog.com] all of which rely on the supposedly evil
Re:TCPA is a DRM smokescreen (Score:3, Insightful)
No...thats not it. I don't "oppose people having choice" or some cra
Re:TCPA is a DRM smokescreen (Score:4, Informative)
And casinos likewise operate without any oversight or auditing whatsoever. Millions of people play these games every day. Adding security can only benefit them.
No, adding a sense of false security does not make things better. People who play at online casinos today do not expect there to be software controlling whether the game is fair: to the extent they care, they go by reputation and testing, just like I suggested. TC adds nothing - absolutely nothing - to make this more secure. The same thing goes for voting.
You're the one who's got to be kidding! Have you not heard of the many new forms of malware which are going after banking account numbers and infiltrating themselves into secure banking transactions? TC can stop these cold via sealed storage and remote attestation. Again, you are arguing that we should deny users access to these technologies purely for political reasons because you don't like the technology.
If the TCPA application of your bank were intended to stop malware, then it would have no problem with the EFF's proposed owner override. So once again with the lies!
I won't go through the rest of your "analysis" because it's the same kind of bullshit.
LOL. "I won't even begin to counter your arguments that the world is round because it is such bullshit."
My guess is that you are worried that TC will make it harder for you to pirate your favorite songs and movies.
No, no, try harder: what I actually care about is abducting and sexually abusing small children. And strangling puppies. And helping the turrists!
TCPA is a technology designed from the ground up for exclusion. The fundamental question of the next century, as with the previous ones, is whether we wish to build an open society or a closed one, and TCPA is the ultimate tool for those who wish to close our networks. The goal of TCPA is to facilitate the handing over of control of our communication devices to others, so that our computers can decide what we can and cannot do with them, what we can and cannot run on them (if we still wish to access our data and the Internet), and ultimately dictate the parameters of all networked communication. Anyone who accepts TCPA accepts that he should live in a digital prison, that his doors should be locked from the outside, that a priori restraint should be placed on his ability to talk to others, and that not only the Internet, but all computing and all our data, should be placed in the hands of a centralized few.
You, sir, disgust me.
Obligatory TCPA FAQ Link (Score:3, Insightful)
Trusted Linux is ILLEGAL (Score:3, Interesting)
1. Linux is distributed under the GPL (and other licenses).
2. To comply with the GPL, end-users must be able to acquire the source code (which means everything they need to reproduce the binary executable, with or without modifications).
3. If you don't comply with the GPL, you are committing copyright infringement, a federal offense.
But from the other direction:
4. Trusted computing means that all binaries are signed with a secret key.
5. The Trusted CPU will not execute binaries that weren't signed with that key.
6. In this way, it is impossible for end-users to create modified binaries to add/remove features from the software.
The GPL is too much in conflict with Trusted Computing to ever allow them to work correctly together. To obey the GPL, end-users must have access to everything needed to rebuild working binaries- which includes the secret key. But for Trusted Computing to work, it must be impossible for end-users to get the key- otherwise there's no point.
So, Linux or Trusted Computing. Choose one, because you can't have both.
Re: (Score:2, Insightful)
Re:Trusted Linux is ILLEGAL (Score:2)
Probably not. More likely burned on CD-R or on a USB keychain or something. Even so, if the key is on my system I've also likely password protected it. In fact I know I have.
Re:Trusted Linux is ILLEGAL (Score:2)
This does not mean you cannot have the source or you cannot modify it. Only, the modified version will not be trusted. That is what trusted code is all about.
So there is no conflict between GPL and trusted code. There is a conflict between modified code and trust, but that is the purpose of the entire concept.
Of course you can generate your own secret key, publish the public key, sign your binaries, and
Re:Trusted Linux is ILLEGAL (Score:2)
The "anyone" who matters in this case is hardware vendors like Intel, AMD, Apple, NVidia, and Hauppauge. Obviously, Fortune 500 corps aren't going to trust a hobbyist like me.
You paint a false picture by implying that it's end-users who need to trust me... that I can put my modified software up on the web, convince them to trust me (by exchanging money or contracts or whatever means), and then they will
Re:Trusted Linux is ILLEGAL (Score:2)
Re:Trusted Linux is ILLEGAL (Score:2)
Re:Trusted Linux is ILLEGAL (Score:2)
Conversely, if the GPL allows me to sign binaries on a system where unsigned binaries don't run, then the GPL is broken, because it's got a loophole allowing GPL'd works to be effectively seized by corporate programmers with no compensation.
much open-source software is already signed by the creator / distributor, so you know that the binary you got was actually made by him.
Sure, you can sign binaries. But if you give that
Re:Trusted Linux is ILLEGAL (Score:2)
No. My private key is not required to create a trusted executable. Presumably the owner of the machine, if he chooses to compile the program from source, will be able to sign the resulting binary with his own private key. After all, surely the owner of the machine will be able to trust the output of his own (trusted) compiler. If he choose
Re:Trusted Linux is ILLEGAL (Score:2)
Completely wrong. The owners of machines don't get the keys needed to sign things for their own hardware. Only the builders of the hardware have those keys, and they are contractually obligated by agreements to the MPAA and RIAA not to divulge those keys to anyone (except employees in the course of their work).
If the owners of the hardware were going
Comment removed (Score:4, Insightful)
Re:Trusted Linux is ILLEGAL (Score:2, Informative)
Re:Trusted Linux is ILLEGAL (Score:2)
The trouble is not between TCG and Linux, but TCG and GPL (and Linux is one of many GPL programs).
Later, a program will want to attest to the software you are running,
The whole idea of "attesting to the software you are running" is restricting who can modify the software. In particular, TCG wants to forbid amateurs and especially end-users from modifying the software. That is completely against the open-development spirit of Linux and all GPL projects.
Re:Trusted Linux is ILLEGAL (Score:2)
wrong. They'll still execute, they just won't be trusted.
Trust itself is a feature at the OS level! Do you think the BIOS knows whether some data you read off the disk is an application? Does the CPU know the difference between the current application and the one that just executed, merely because of a context switch (which happens all the time during timesharing among all the different applications you already have running)?
Re:Trusted Linux is ILLEGAL (Score:2)
Fine. They'll "execute", but since they're not trusted, not all of the features will work. Meaning that not all of the binary is "executing"... so I guess I was right after all.
but I can pretty much assure you that Linux wouldn't bother with this and if it did that someone would fork it
How would they fork it? You can't fork if it won't run, and it won't run if you don't have the key. The only way to fork would be if you could use the GPL to coerce
Re:Are you stupid, or are you trolling? (Score:2)
Wrong. Suppose one feature of the executable is playing blue-laser discs of Star Wars VII: Return of the Binks. That functionality will NOT WORK if the OS is flagged untrusted.
At its core, returning true from the is_trusted() system call is the single critical feature that will be removed, crippling dozens of important programs running on that OS.
won't be able to lie to them
Lying [cbsnews.com] is still, technically, a feature. I never said you had a right t
Re:Trusted Linux is ILLEGAL (Score:4, Informative)
5. The Trusted CPU will not execute binaries that weren't signed with that key.
6. In this way, it is impossible for end-users to create modified binaries to add/remove features from the software.
This is total garbage. Where did you get this nonsense?
TC does not require binaries to be signed with a key. TC will not refuse to execute unsigned binaries. And end users can do whatever they want.
Now for the facts, in case you're interested. TC implements a secure boot. This allows the TPM chip to store a hash or fingerprint of your software configuration: the BIOS, the boot loader, the OS, and if desired, the applications that are running.
The TPM and OS can basically do two things with this information. They can implement "sealed storage" which means that a program can lock its encrypted data to the current software configuration. This means that if you boot a different OS, or if the program gets modified (either of which might happen due to virus infection), the fingerprint changes and the data will no longer be available. Likewise if another program tries to access the first program's sealed data, it won't be able to get access to it.
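To make "sealed storage" concrete, here is a toy Python sketch. It is not the real TPM_Seal command: the key derivation and the XOR cipher are stand-ins, and a real TPM does all of this under a chip-held key. It only illustrates the one property described above: the data comes back only when the PCR value at unseal time matches the value at seal time.

```python
import hashlib
import hmac

def seal(secret: bytes, pcr: bytes) -> tuple:
    """Toy 'seal': bind a secret to a PCR value.
    Demo only: secret must be <= 32 bytes; a real TPM
    encrypts under a key that never leaves the chip."""
    key = hashlib.sha256(pcr).digest()
    tag = hmac.new(key, secret, hashlib.sha256).digest()
    # XOR the secret with a keystream derived from the PCR (demo cipher)
    stream = hashlib.sha256(key + b"stream").digest()
    blob = bytes(a ^ b for a, b in zip(secret, stream))
    return blob, tag

def unseal(blob: bytes, tag: bytes, pcr: bytes) -> bytes:
    """Recover the secret, but only under the same PCR value."""
    key = hashlib.sha256(pcr).digest()
    stream = hashlib.sha256(key + b"stream").digest()
    secret = bytes(a ^ b for a, b in zip(blob, stream))
    if not hmac.compare_digest(
            hmac.new(key, secret, hashlib.sha256).digest(), tag):
        raise ValueError("PCR mismatch: data stays sealed")
    return secret

# Seal under the "trusted" configuration's fingerprint...
good_pcr = hashlib.sha1(b"trusted boot chain").digest()
bad_pcr = hashlib.sha1(b"tampered boot chain").digest()
blob, tag = seal(b"disk encryption key", good_pcr)
assert unseal(blob, tag, good_pcr) == b"disk encryption key"

# ...boot something else (or get infected) and the unseal fails.
try:
    unseal(blob, tag, bad_pcr)
except ValueError:
    pass  # different configuration: no data for you
```

Booting a different OS changes the fingerprint, the derived key changes, the integrity check fails, and the secret never comes out.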
The second feature of the TPM is "remote attestation". This allows a program to request the TPM to issue a cryptographically signed statement about what the current software fingerprint is that is running. This signed attestation cannot be forged because the TPM generated an on-chip key at manufacture time, and the manufacturer issued a certificate on that key which the chip can use to prove to anyone that it is a legitimate TPM.
Remote attestation allows network applications to determine what software configuration the peers are running, and, if they choose, to disallow participation by software which is not running a specified set of configurations. This is the closest you will come to the idea that users can't change their own software. If they want to run a program which relies on this feature, and that program doesn't accommodate the changes the user wants to make, they would be shut out. But in practice, open source programs will probably be flexible in this regard as they will want to have as many people as possible participate. There are a number of technical measures which can be adopted to allow for considerable user flexibility.
But certainly none of this would violate the GPL or any other legal prohibitions. Everything is entirely voluntary, and Trusted Computing does not prevent you from doing what you want with your computer. You don't even have to turn it on if you don't want to!
Re: (Score:3, Insightful)
Re:MOD DOWN _ STUPID (Score:2)
I can read. I can also tell when press releases are obviously lying.
The standard is partially open, but not the most important part: the secret key that allows you to sign binaries so the OS appears trusted. Unless the hardware companies actually publish that key, they're not providing the full "standard" (specification of exactly what their hardware does), and not allowing open implementation of compatible software.
Re:MOD DOWN _ STUPID (Score:2)
You think that SSL is not a fully open standard because websites do not give you the secret key of their certificates?
A signed binary only means you can trust the binary as much as you trust the signer. Nothing more or less.
The TCPA means that publishers of data can decide that they do not hand over their data to applications that *they* don't trust. That means it can be used for DRM, but it also means it can be used for secure transactions, etc.
It does not restrict your poss
Some features could be useful... (Score:2)
But trusted windows, at least, is going to be about remote deletion/disabling of data.
Software DRM (Score:3, Interesting)
The software DRM implementation would be 100% transparent to the application and no one would be the wiser.
It should also be workable with an x86 emulator running a closed-source 'trusted' application along with its closed-source OS, with the emulator handling the DRM instructions a little differently than normal.
Re:Software DRM (Score:2)
Re:Software DRM (Score:4, Informative)
This is one of the most commonly asked questions about the TPM. The answer is that the TPM implements what is called a "secure boot" sequence.
The first thing that happens in a TPM-enabled computer is that the BIOS, at startup, sends a hash of itself to the TPM. Then, when the BIOS goes to load the OS, it sends a hash of the boot loader (grub, in the case of Linux) to the TPM. The boot loader will be modified (see the Trusted Grub [prosec.rub.de] project) to take a hash of the OS kernel and send that to the TPM. And the OS itself will be modified (a la tcgLinux [ibm.com]) to take a hash of the various OS components, startup scripts, and programs as the computer boots.
The net result is that the TPM has a record of what OS was booted and what the software configuration is that is running. This allows it to distinguish between a "real" boot and an emulated one, because in the latter case it sees a hash of the emulator being loaded.
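The chain of hashes described above is the TPM's "extend" operation: each PCR holds a running hash, and each new measurement is folded in as PCR_new = H(PCR_old || H(component)). A toy Python model (the component strings are purely illustrative) shows why an emulator can't produce the same fingerprint:

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """TPM-style PCR extend: PCR_new = SHA1(PCR_old || SHA1(component))."""
    measurement = hashlib.sha1(component).digest()
    return hashlib.sha1(pcr + measurement).digest()

# Measure the boot chain in order: BIOS -> boot loader -> kernel.
pcr = b"\x00" * 20  # PCRs start zeroed at platform reset
for stage in [b"bios image", b"grub stage2", b"vmlinuz"]:
    pcr = extend(pcr, stage)

# An emulated boot measures the emulator first, so every later
# value in the chain differs, no matter what runs inside it.
pcr_emulated = b"\x00" * 20
for stage in [b"emulator binary", b"grub stage2", b"vmlinuz"]:
    pcr_emulated = extend(pcr_emulated, stage)

assert pcr != pcr_emulated
```

Because each value depends on everything measured before it, there is no way to "set" a PCR to a chosen value; you can only extend it, and any change anywhere in the boot sequence shows up in the final fingerprint.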
Software which runs in un-emulated mode and uses the TPM features can distinguish that case from when it is running emulated. If it locked some data using the TPM in the first mode, it won't be accessible in the second mode.
Once remote attestation is possible, networked applications will be able to report their software configuration to each other. This will be unforgeable because the TPM will sign an attestation of the software configuration, and the TPM itself will have a certificate from the manufacturer attesting that it is a legit TPM. Your emulator will not have a certified TPM key (those stay on the chip) and so it won't be able to come up with a credible forged attestation. Programs running on emulators won't be able to take part in network security applications that use these features.
Re:Software DRM (Score:2)
That is why CSS is still not broken for DVD, heh?
Seriously, pretty much any hardware can be emulated in software, albeit maybe at a slower speed. And when people want it badly enough, they will find a way to recover valid keys, whether by reading data out of the ICs or simply out of TCPA documents.
TCPA - TCG (Score:4, Informative)
Re:TCPA - TCG (Score:3, Insightful)
TPM emulator (Score:2, Informative)
Newspeak Framing at its finest (Score:2, Informative)
Gentoo's public communications guy needs to read some George Lakoff [gracecathedral.org]. It's a wonderful life, folks. Every time you use their words, a devil gets his pitchfork.
TCG and Linux make sense (Score:3, Informative)
It's kind of ironic, because Ross Anderson's lying Anti-TCPA [cam.ac.uk] FAQ tries to claim that TC exists to kill Linux. And yet it is turning out that Linux is the salvation of Trusted Computing.
There are a number of research projects in TC on Linux, including TPM Device Driver [ibm.com], Trusted GRUB and Secure GUI [prosec.rub.de], tcgLinux [ibm.com], TCPA Open Source Platforms [crazylinux.net], Enforcer [sourceforge.net], and more. All Linux based.
Don't believe the FUD about TC. When implemented in Linux using Open Source software, TC gives you new options for securing and expanding the capabilities of your computer.
Re:TCG and Linux make sense (Score:4, Insightful)
Hmmm. And yet I don't seem to need any form of TCPA/TCG or DRM. In all the years I've run linux full-time, I have never ever had naughty code or naughty hackers get in. I can't say that about any of the windoze users I know. Beyond that, I certainly don't need any system that can be used as a DRM system.
Nope. Uh-uh. Not on my box. I'll copy my files and CDs as I feel the need and will not have anyone but me control when and how I go on to use such copies. This all looks like what it is, an attempt by corporations to gain control of the most important and useful aspects of your PERSONAL and private property computer. Screw TCPA/TCG (and DRM). Paint it all up with lipstick and rouge all you want but in the end it is about restricting what people are allowed to do with their own computers. Any benefits that come to the individual computer owner are accidental and peripheral to the actual designed and intended purpose.
Not so funny anymore (Score:2, Funny)
Hmmm. This puts the whole concept of so-called "Trusted Computing" into a realistic, and sad, perspective.
The eternal "arms race" (Score:2)
It's an eternal "arms race".
True, the TCG chip will raise the bar for running "unauthorized" software, but it might also bring its own downfall. Imagine the chip is implemented and works well for a year or two - i.e. the only ways to defeat the chip are very inefficient (and probably require doing stuff with the hardware that only a few of us geeks would have the skills and guts to do)
And then, suddenly, somebody has a great idea how to decie
RMS's writing about "trusted" computing (Score:4, Informative)
NO SIGNED CODE (Score:3, Interesting)
The truth is that the TCG spec says nothing about signed code. There are no limitations in TCG that keep you from running unsigned code. There is no distinction between "secure" and "insecure" code. You can run anything you like. Signing is a complete red herring in this discussion.
I am not trying to gloss over problems or paint a false picture. The truth is that TCG does have features whose effects are somewhat like what people are worried about with signed code. The result is that TCG could be helpful for DRM, and it might make it impossible to download music from an online store without running a special application, for example. But this would not be because "you can only run signed code". Rather, it is the server that decides whether it wants to talk to you, not your computer deciding what you can and cannot run.
What's the difference? Well, if your main concern is being able to run hacked clients that will allow you to violate your user agreements, then there is no difference. You would be right to oppose Trusted Computing. It will make it harder to lie and pretend to honor an agreement, then break your word and go back on your promise.
But if your main concern is about the GPL and what software you run, there is a big difference. There are no limits on the software you can run. You can hack your Linux kernel to do whatever you want. You can disable "secure" features in the software you run. These privileges don't go away when there is a TPM chip. That should put to rest the concerns about the GPL and hopefully end the discussion about who signs what code.
If you're wondering how these two points of view can be compatible, you need to learn more about the TCG spec and the TPM chip. In a nutshell, the TPM chip, with the cooperation of the BIOS and OS software, takes a hash or fingerprint of the software configuration as the computer boots. It can then report this fingerprint to remote servers, if client software requests it. These reports are signed with an on-chip TPM key that never leaves the chip; and this chip has a certificate from the computer manufacturer, so no emulator can fake these reports (called remote attestations).
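The attestation flow described above can be sketched in a few lines. This is a simplification, not the real TPM_Quote protocol: a real TPM signs with an asymmetric on-chip key whose public half carries a manufacturer certificate, while this toy stands in a shared MAC key for brevity. The names (quote, server_verify) are made up for the example.

```python
import hashlib
import hmac
import os

# Stand-in for the key burned into the TPM at manufacture time.
# A real TPM uses an asymmetric key that never leaves the chip,
# certified by the manufacturer; a symmetric MAC is used here
# only to keep the sketch self-contained.
TPM_KEY = b"on-chip key, never leaves the chip"

def quote(pcr: bytes, nonce: bytes) -> bytes:
    """'Sign' an attestation over the PCR fingerprint and a
    server-supplied nonce (the nonce prevents replaying old quotes)."""
    return hmac.new(TPM_KEY, pcr + nonce, hashlib.sha256).digest()

def server_verify(pcr_claimed: bytes, nonce: bytes, sig: bytes,
                  expected_pcr: bytes) -> bool:
    """The remote server checks the signature AND that the reported
    configuration matches one it is willing to talk to."""
    expected_sig = hmac.new(TPM_KEY, pcr_claimed + nonce,
                            hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected_sig) and \
        pcr_claimed == expected_pcr

# Server challenges, client quotes, server decides.
nonce = os.urandom(16)
pcr = hashlib.sha1(b"known-good configuration").digest()
sig = quote(pcr, nonce)
assert server_verify(pcr, nonce, sig, expected_pcr=pcr)
assert not server_verify(pcr, nonce, sig, expected_pcr=b"\x00" * 20)
```

Note where the enforcement lives: the client machine runs whatever it likes; it is the server, comparing the attested fingerprint against the configurations it accepts, that decides whether to participate.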
That's how it works. It's a lot more complicated than refusing to run unsigned code. What it comes down to is that software can report its configuration in a believable and, yes, trustable way. That's the real reason this is called Trusted Computing, not the lie made up by Ross Anderson. It's Trusted because you can Trust the reports from a remote system about what software it is running, and therefore what it will do.
Re:Big Bro (Score:2)
There are many small businesses looking for ways to keep people from reading their stuff, people like competitors and secret services.
While I'm not assuming it gives any vast security, it does give another layer of it to some people.
Better would be separate networks for internal systems and Internet systems; the fact is that most simply cannot afford that.
As long as the specs and software are open source and as long as one can decide for themselves what OS to install on a machine there is
Re:Big Bro (Score:2)
This one has mainly bad sides...
Re: (Score:3, Interesting)