Introducing Shufflecake: Plausible Deniability For Multiple Hidden Filesystems on Linux (kudelskisecurity.com)
Thursday the Kudelski Group's cybersecurity division released "a tool for Linux that allows creation of multiple hidden volumes on a storage device in such a way that it is very difficult, even under forensic inspection, to prove the existence of such volumes."
"Each volume is encrypted with a different secret key, scrambled across the empty space of an underlying existing storage medium, and indistinguishable from random noise when not decrypted." Even if the presence of the Shufflecake software itself cannot be hidden — and hence the presence of secret volumes is suspected — the number of volumes is also hidden. This allows a user to create a hierarchy of plausible deniability, where "most hidden" secret volumes are buried under "less hidden" decoy volumes, whose passwords can be surrendered under pressure. In other words, a user can plausibly "lie" to a coercive adversary about the existence of hidden data, by providing a password that unlocks "decoy" data.
Every volume can be managed independently as a virtual block device, i.e. partitioned, formatted with any filesystem of choice, and mounted and dismounted like a normal disk. The whole system is very fast, with only a minor slowdown in I/O throughput compared to a bare LUKS-encrypted disk, and with negligible waste of memory and disk space.
You can consider Shufflecake a "spiritual successor" of tools such as TrueCrypt and VeraCrypt, but vastly improved. First of all, it works natively on Linux, it supports any filesystem of choice, and it can manage up to 15 nested volumes per device, making denial of the existence of these partitions really plausible.
"The reason why this is important versus 'simple' disk encryption is best illustrated in the famous XKCD comic 538," quips Slashdot reader Gaglia (in the original submission). But the big announcement from Kudelski Security Research calls it "a tool aimed at helping people whose freedom of expression is threatened by repressive authorities or dangerous criminal organizations, in particular: whistleblowers, investigative journalists, and activists for human rights in oppressive regimes.
"Shufflecake is FLOSS (Free/Libre, Open Source Software). Source code in C is available and released under the GNU General Public License v3.0 or later.... The current release is still a non-production-ready prototype, so we advise against using it for really sensitive operations. However, we believe that future work will substantially improve both security and performance, hopefully offering a really useful tool to people who live in constant danger of being interrogated with coercive methods to reveal sensitive information."
Rubberhose lives (Score:3)
Re: (Score:1)
...and we can tell why this is even more important considering what the assholes are doing to him.
Re: (Score:1)
Though I think the word "fugitive" is misleading. How about "persecuted journalist"?
It's (was?) written Marutukku (Score:3)
But pronounced 'rubberhose'
Re: Rubberhose lives (Score:1)
My confusion is in regards to an act that can only be applied to a US citizen is being used to hold him.
Is that still a thing, or have I missed something?
Re: (Score:1)
Make this a standard part of distros (Score:5, Insightful)
Re:Make this a standard part of distros (Score:5, Informative)
OP here, thanks for the comment! This is in fact our long-term vision; we will explain the details better on the website, but yes, you're right, ideally that would make sense. However, notice the difference:
Having Shufflecake included by default in most distributions, like cryptsetup/LUKS, would definitely be the best option, but we're waaay far from being there. As we stress in our disclaimer, this is a proof of concept, and a lot more work is needed. The next steps are 1) cleaning the code and making it more portable, and 2) working on booting Linux from a Shufflecake volume. Stay tuned!
Re: (Score:2)
Even if you're the only one using it and so you're basically shouting to the world "I'm hiding stuff", there is no way to know how much you're hiding, so you can still lie convincingly with a decent setup
Really strange fetish porn to cover up that I have the Galactic Ultimate Top Secret plans (GUTS), or, even more weird, nothing at all, only pics of flaming goats while sky diving....
Re: (Score:2)
So, serious question: Why do you want people to get detained, beaten up, raped, tortured and murdered?
The only way to make people safe is if people can prove they have no secret hidden stuff and if the stuff they have visible is non-problematic. That means filling empty space with zeros. Any high-quality randomness fill already makes it very plausible that there is hidden data in there. And, of course, there are countless forensics tools that look exactly for those kinds of areas. The legal situation is typ
Re: (Score:2)
So, serious question
Then I will reply with a serious answer.
The only way to make people safe is if people can prove they have no secret hidden stuff and if the stuff they have visible is non-problematic.
So your security model is "just belong to the Mulino Bianco family [archiviost...arilla.com]". Sounds like a great idea, why didn't we think about it before?
Re: (Score:3)
It's not possible to prove that you aren't hiding anything - you're trying to prove a negative, which is logically impossible. You could always have just done a really good job.
Let's be honest - if someone is searching your computer in the first place, it means they're already pretty sure you did something. Not finding an obvious smoking gun is unlikely to convince them otherwise.
Granted, finding the equivalent of a hidden wall safe may make them even more convinced that there's a second safe somewhere.
Re: (Score:2)
That was my thought as well. The easier this kind of thing is to access and use, the more likely it is any particular suspect is using it, and so the less likely an adversary is to believe it isn't in use. If the adversary is a judge in the US, maybe you're looking at a few months or years for contempt of court. If it's the Russian mob, the scenario could be far worse. How do you convince them there's nothing there when the whole purpose of the product is to give people a way to claim there's nothing th
Re: (Score:2)
Can you explain your threat model in more detail please?
The main problem with plausible deniability is that it's often not very plausible. Say you have some document you don't want anyone else to see on there. If you open it in an application that keeps a list of recently opened files, or that has some of its memory dumped into swap, or that makes temporary files somewhere outside the hidden volume, there is evidence that the supposedly deniable data exists.
Mitigating that can be very difficult. Booting fro
Re: (Score:2)
Re: Make this a standard part of distros (Score:2)
Re: (Score:2)
Yeah. And in the real world, just using one of those distros is already enough to create the suspicion they need to convince a judge. You fail.
Re: Make this a standard part of distros (Score:2)
Re: (Score:2)
Dev here. The aim of Shufflecake is not to hide the fact that you have something to hide (because that can't be done), but to give you the necessary "infrastructure" for you to cook up a plausible lie about the disk contents. You still need to be good at lying,
You have to be better at lying than authoritarian governments are at locking people up and torturing them? GLWT. Just having the software on your computer is probable cause. When you figure out how to hide the software you will have created something that does more good than harm.
Re: (Score:1)
Bullshit. Lots of people are not using the hidden volume on TC. For example, I never did.
Seriously, why do you put people in danger like this? This is a gross violation of engineering ethics!
Bad Governments Don't Care (Score:3)
a tool aimed at helping people whose freedom of expression is threatened by repressive authorities or dangerous criminal organizations, in particular: whistleblowers, investigative journalists, and activists for human rights in oppressive regimes
As long as there's unused (unpartitioned) space on your device or you have large useless files sitting around (loop device), you'll be suspect and those governments don't care whatever you say to reason with them.
One way I can see something that would be undetectable is a filesystem that always uses free space at the beginning, leaving the end of it untouched, and then a tool that can tap into the free space at the end of the file system, requiring you to give the starting block (or MiB) and length of the "partition" you want to access. You'd have to be careful not to fill up the file system more than the space allocated to the beginning or you'll start overwriting your hidden area. That way there are NO signatures of any kind kept anywhere, and the program that can open that portion of the file system can be downloaded every time, run, and then erased. The ability to always use space at the beginning of the file system would need to be built into all popular file systems so that the file system itself doesn't give anything away.
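The raw-offset part of this scheme can be sketched in a few lines. This is a hypothetical illustration, not an existing tool: the offset and length exist only in the user's head, and nothing on disk marks the region as special.

```python
def write_hidden(path: str, offset_mib: int, data: bytes) -> None:
    # Write raw bytes at a fixed MiB offset inside a device or image
    # file. No header, no magic bytes: only the user remembers where
    # the hidden region starts.
    with open(path, "r+b") as f:
        f.seek(offset_mib * 1024 * 1024)
        f.write(data)

def read_hidden(path: str, offset_mib: int, length: int) -> bytes:
    # Read the region back given the memorized offset and length.
    with open(path, "rb") as f:
        f.seek(offset_mib * 1024 * 1024)
        return f.read(length)
```

In practice one would point this at a block device and layer a loop device plus plain, header-less encryption on top; the fatal failure mode is exactly the one noted in the comment, namely the filesystem growing into the hidden region.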
Re:Bad Governments Don't Care (Score:5, Informative)
From the Shufflecake project website [shufflecake.net], which Slashdot didn't bother to link:
If I do not open all volumes but only some of them, what happens to the other unopened volumes?
Likely they will get corrupted badly. This is a desired behaviour: it is necessary for plausible deniability.
The hidden volumes are only protected from being overwritten when their password is entered. All volumes which are more hidden than the volume to which the password is entered are treated as empty space which can be allocated at any time. Their on-disk structures look like the random data in actually unused space. The headers to any volumes are encrypted and random data is written to unused header space, making headers and unused header space indistinguishable without the password to the most hidden volume.
The problem with these systems derives directly from the design goals: You can't prove that there is no further hidden data unless you use all 15 headers and reveal the password to the 15th header. You can say that you only used two volumes, but depending on the adversary, you'll be tortured to death if you don't reveal a third, which you can't if there is no third.
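The corruption behaviour described above follows directly from the allocation rule: slices owned by unopened volumes are indistinguishable from free space, so the allocator may hand them out. A toy sketch of that rule (illustrative only, not Shufflecake's actual allocator):

```python
import random

def allocate_slice(total: int, opened: set, rng: random.Random) -> int:
    # Choose uniformly among all slices NOT owned by an opened volume.
    # Slices of unopened, more hidden volumes look exactly like free
    # space, so they are fair game and can be overwritten.
    candidates = [s for s in range(total) if s not in opened]
    return rng.choice(candidates)

# Toy device: decoy volume opened (slices 0-9), hidden volume left
# closed (slices 40-49, invisible to the allocator).
opened = set(range(10))
hidden = set(range(40, 50))
rng = random.Random(0)
written = {allocate_slice(100, opened, rng) for _ in range(500)}
assert written & hidden        # some hidden slices got clobbered
assert not (written & opened)  # opened slices are never touched
```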
Re: (Score:2)
Ahhh, Thanks for the link. So I was on the right track. And wondering if there's a table that keeps track of the volumes, and there is!
Data about the position of used and unused slices is stored in a volume-specific "position map", which is indexed within an encrypted header at the beginning of the device. Both position map and header are indistinguishable from random data without the correct decryption key, and every slot in the header (currently up to 15 volumes) has a field containing the decryption key for the previous (i.e., "less hidden") header and volume, thereby recursively linking all volumes and allowing the user to open all of them with a single password.
I'm sure I'm missing something, but wouldn't it make sense that once they've forced you to decrypt the first volume, its header would have pointers to where the 2nd volume starts? Or are we assuming that if I provide the password for, say, third volume, it would unlock the 3rd, 2nd, and 1st automatically? But how would it know where the third volume is?
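Per the quoted text, the linking runs the other way: each header stores the key of the *previous*, less hidden volume, so surrendering volume 1's password reveals nothing about volume 2, while volume 3's password unlocks 3, 2, and 1. A toy sketch of that chain, with XOR standing in for real authenticated encryption (the layout is illustrative, not Shufflecake's actual on-disk format):

```python
from hashlib import sha256

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Three volume keys; header slot i stores volume i-1's key, encrypted
# (here: toy XOR) under key i. Volume 0's header links to nothing.
keys = [sha256(f"volkey{i}".encode()).digest() for i in range(3)]
headers = [None] + [xor(keys[i], keys[i - 1]) for i in range(1, 3)]

def open_chain(key: bytes, idx: int) -> list:
    # Opening volume idx recursively yields keys for idx-1, ..., 0.
    opened = [key]
    for i in range(idx, 0, -1):
        key = xor(headers[i], key)  # decrypt the less hidden key
        opened.append(key)
    return opened

# Giving up volume 0's key reveals nothing about volumes 1 and 2;
# volume 2's key opens everything.
assert open_chain(keys[0], 0) == [keys[0]]
assert open_chain(keys[2], 2) == [keys[2], keys[1], keys[0]]
```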
Re: Bad Governments Don't Care (Score:1)
Re: (Score:3)
The hidden volumes are only protected from being overwritten when their password is entered.
And at this time, the whole thing has already failed. Well, it has already failed at the presence of the Shufflecake software and at the presence of areas with crypto-grade randomness in there, but it fails here again. This write-protection must _not_ be present, or forensics software can see obvious suspicious block allocation patterns. And that is why the whole idea is bogus.
Re: (Score:2)
The filesystem looks exactly the same whether there are hidden volumes or not. The space is randomly allocated in slabs that are not used by the decrypted volumes. It's similar to LVM thin provisioning, just that you have to always randomize allocations or the allocations to the decrypted volumes would show patterns that hint at space used by hidden volumes.
Re: (Score:2)
And how do you randomize allocations without arousing suspicion, when the ordinary FS code does nothing of the sort? Right.
Re: (Score:2)
It's a block layer below the filesystem. Even if the filesystem allocates sequential blocks, the underlying allocation of actual disk space need not be contiguous. If the similar LVM thin provisioning concept is new to you, read up on it.
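The thin-provisioning analogy can be made concrete with a toy mapping layer (a sketch, not Shufflecake's code): the filesystem writes logical slices 0, 1, 2, ... while physical placement is randomized, so contiguous logical data leaves no contiguous trace on disk.

```python
import random

class RandomizedMapping:
    # Toy block layer: each logical slice is mapped on first write to
    # a physical slice drawn from a pre-shuffled free list, so that
    # sequential filesystem allocations yield a scattered disk layout.
    def __init__(self, physical_slices: int, seed: int = 0):
        self.free = list(range(physical_slices))
        random.Random(seed).shuffle(self.free)
        self.table = {}

    def physical_for(self, logical: int) -> int:
        if logical not in self.table:
            self.table[logical] = self.free.pop()
        return self.table[logical]

m = RandomizedMapping(1000)
first = [m.physical_for(i) for i in range(10)]
assert m.physical_for(0) == first[0]   # mapping is stable on re-read
assert len(set(first)) == 10           # distinct physical slices
```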
Re: Bad Governments Don't Care (Score:1)
Re: (Score:2)
So, in other words when you are already behind bars, you can maybe lie about more data being in there? Yeah, great design. In the real world, you are screwed as soon as somebody with enough authority suspects you are hiding data and you cannot prove otherwise. Here is a hint: Ordinary file systems do _not_ contain "random noise" and no amount of "mathematical proof" makes that little problem go away.
Re: (Score:2)
But here's a solution to it.
You use eCryptfs instead of LUKS.
Then the space in between files will indeed be full of cryptographic data, and would be expected to be so, after you give them the password to the eCryptfs mount.
Re: (Score:2)
You are right. That's why you HAVE something to hide (naked images of your wife for example). And you can reveal it when forced to. Those pictures are your justification for having encryption in the first place. If needed you can have multiple layers of those: company financial data on the first layer, regular porn on the second, wife nudes on the third, child porn on the fourth, dissident contact list on the final one. You can reveal layer by layer depending on the hardness of the interrogation and the risk.
Re: Bad Governments Don't Care (Score:1)
Re: (Score:2)
The problem is subtly different. There is no way to prove that you don't have further data if you don't have it. Can't be, per the design goals. If there is random data on the system, there's always a possibility from the outside perspective that it isn't actually random but encrypted data. The only way to "prove" that there is no hidden data is to decrypt everything that looks random. Depending on the circumstances, actual randomness is a liability. This isn't a critique of Shufflecake or similar systems,
Re: (Score:2)
You don't need any special tools for that. Make partition. Fill partition with random data. Make filesystem on partition much smaller than the partition. Automatically run script which, when that filesystem is getting low-ish in space, asks permission then resize2fs's it. Make second thing at memorable offset, use loopback to mount it manually whenever you want it. Deny permission to script if you got careless and it wants to resize beyond your memorable offset.
Still looks very suspicious, but there's no way to protect large amounts of non-zeroed but claimed information-free disk space from potential overwriting without looking suspicious, because that's never how you'd naturally do things.
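The guard step in the scheme above (refuse any resize that would cross the memorized boundary) is simple to sketch. The boundary value is hypothetical and kept only in the user's head; nothing on disk records it:

```python
def plan_resize(fs_end_mib: int, grow_mib: int, hidden_start_mib: int) -> int:
    # Return the new filesystem end in MiB, refusing any growth that
    # would cross the memorized start of the hidden region. The user
    # supplies hidden_start_mib from memory each time.
    new_end = fs_end_mib + grow_mib
    if new_end > hidden_start_mib:
        raise ValueError("refusing: would overwrite the hidden region")
    return new_end

assert plan_resize(1000, 200, 4096) == 1200
try:
    plan_resize(4000, 200, 4096)
    assert False, "should have refused"
except ValueError:
    pass
```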
Re: (Score:2)
Still looks very suspicious, but there's no way to protect large amounts of non-zeroed but claimed information-free disk space from potential overwriting without looking suspicious, because that's never how you'd naturally do things.
Exactly. If you go over a border where they may detain you on suspicion of smuggling data, make sure to zero-wipe all unused data areas on your storage before. That is the only thing which does not scream "I have things to hide!". For unused space within partitions, you should be able to get away by just not having crypto-grade randomness in there, but don't depend on it. Better zero-wipe them as well.
Re:Bad Governments Don't Care (Score:4, Informative)
Not so plausible (Score:3)
I suppose that is a win for data integrity, but at least with TrueCrypt most users probably didn't use the hidden volume feature, whereas I don't see why anyone would use such a wasteful file system unless they specifically needed hidden volumes. You immediately lose the plausible deniability that there isn't at least one hidden drive, and now you're getting hit over the head with a wrench until you come up with 15 passwords or pass out.
Re: (Score:2)
I understood that XKCD reference (also - the first thing I thought about when reading this).
Re:Not so plausible (Score:4, Informative)
PhonebookFS? (Score:5, Interesting)
This reminds me of a project that was done a decade+ ago, called PhonebookFS. It not only allowed different keys to mount a "layer" as a filesystem, but also created chaff: junk files that were never decryptable, providing plausible deniability, since even if all keys were divulged, there was still stuff out there that couldn't be accessed.
The advantage of PhonebookFS over Shufflecake was simplicity. You could just have one password, or you could have millions. None of this interleaving and tap-dancing was required.
Wish PhonebookFS were still around. I can't even find a working Git repo of it.
Re:PhonebookFS? (Score:4, Informative)
So you get tortured even longer (Score:2)
Good job.
Re: (Score:2)
Or you get tortured to death, since the attackers can't decide whether you have given all the passwords or you have more.
Re: (Score:2)
Indeed. What a complete fail. By far enough has been written about why this is an exceptionally bad idea that does not survive in the real world. The authors of this are willfully ignorant and apparently do not care that they endanger lives. It is better to have no security than bogus security.
Truecrypt (Score:2)
Re: (Score:2)
So, if you're trying to avoid prosecution, have the decoy volume be something with jail time, but substantially less than what the true hidden volume would reveal. And, I guess, maybe have a decoy-decoy of mom's lasagna recipe.
Re: (Score:2)
Yep, because police forensics people cannot read and understand documentation. Not. They will just lock you up for the minor offense and then have ample time to prepare for locking you up even longer on the mere _suspicion_ that you have a 2nd layer in there.
Re: (Score:2)
However, hypothetically, it should be safe enough to do in most of the Western world where the police cannot hold you indefinitely without charging you for a crime, and a Judge would not hold you in contempt if you had reasonable plausible deniability.
Re: Truecrypt (Score:2)
One of the issues with the original Truecrypt hidden volume was that you could only have one volume mounted. If you mounted the decoy volume then you risked corrupting your hidden volume if you tried to do any decoy writes.
Finding the key to the decoy volume gives the investigator such useful information as file access times. Since you couldn't safely use the decoy volume, it became increasingly stale and was a dead giveaway that there's more to see here.
That was fixed by either hidden volume protection (moun
Re: (Score:2)
I still think that the main risk of these plausible deniability crypto schemes is that simply by virtue of using them, plausible deniability goes out the window. If the FBI saw a person using this, they'd know they were hiding something. So it would be necessary to maintain a fake decoy and a very convincing pla
Free list (Score:2)
Ok, so, good idea, and I'm probably missing something, but if the blocks masquerade as free blocks, does this mean that they are in the free list? If not, wouldn't one possible forensic vector be to find what blocks are not in existing filesystems and also not in the free list?
Re: (Score:3, Informative)
Re: (Score:2)
Don't you take a performance hit for that? When the OS allocates a block for writing, what used to happen was that it would just pick the next free block off the free list. This is why sorting the free list became important, so that new blocks would have a greater chance of being contiguous.
It seems like having only a list of used blocks would add extra steps to find the next free block, and make it even tougher to pick out contiguous blocks. I guess you could assume SSD hardware, where using contiguous
Re:Free list (Score:5, Informative)
Re: (Score:2)
This algorithm doesn't create evenly distributed random allocations and will create patterns which can hint at the existence of hidden volumes.
Re: (Score:2)
Re: (Score:2)
The probability of choosing an allocation unit directly after a unit which is occupied by this volume or a more visible volume is at least twice as high as one which is preceded by a unit which is unused or from a less visible volume. You can land directly on the free unit or on the occupied unit and "fast-forward" into the same unit, whereas you can only land directly on a unit which is preceded by "free" space. This skews the allocation to a non-uniform distribution. If you then allocate units without thi
Re: (Score:2)
Scratch that. Obviously the allocations without the more hidden volume don't depend on the hidden volume. The problem however still exists: If you use a volume with a more hidden volume decrypted, the allocation pattern isn't an even distribution because it is skewed by the improper algorithm "fast-forwarding" on the allocations to the hidden volume. This pattern remains when the hidden volume isn't decrypted and can hint at its existence.
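The skew being debated is easy to demonstrate with a simulation. This is a toy of the naive "pick a random unit and fast-forward past occupied ones" allocator under discussion, not Shufflecake's actual algorithm:

```python
import random

def fast_forward_alloc(occupied: set, n: int, rng: random.Random) -> int:
    # Biased scheme: pick any unit uniformly; if it is occupied, scan
    # forward (wrapping) to the next free unit. This is NOT the same
    # as a uniform choice among the free units.
    i = rng.randrange(n)
    while i in occupied:
        i = (i + 1) % n
    return i

n = 1000
occupied = set(range(100, 200))   # one long occupied run
rng = random.Random(0)
counts = {}
for _ in range(200_000):
    u = fast_forward_alloc(occupied, n, rng)
    counts[u] = counts.get(u, 0) + 1

# Unit 200 sits right after the run and also absorbs every landing on
# units 100-199, so it is chosen roughly 100x more often than an
# isolated free unit such as 500: a clearly non-uniform pattern.
assert counts.get(200, 0) > 10 * counts.get(500, 1)
```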
Re: (Score:2)
the allocation pattern isn't an even distribution because it is skewed by the improper algorithm "fast-forwarding" on the allocations to the hidden volume. This pattern remains when the hidden volume isn't decrypted and can hint at its existence.
Ah, I see what you mean, like in a modulo bias. You're not wrong: it is improper to say that the distribution of allocated slices is uniformly random. However, this is not a problem, because the scenario is different from a modulo bias: here, the "skewed" distribution is the same whether you have other hidden volumes or not. Let's take an example: say you have 3 volumes total but you only give the password for the 1st one, so there is other 2 hidden. What the adversary sees is slices allocated to avoid what
oblig xkcd (Score:2, Interesting)
https://imgs.xkcd.com/comics/s... [xkcd.com]
Re: oblig xkcd (Score:2)
Re: (Score:2)
Literally right there in TFS
And in court? (Score:2)
Re: (Score:3)
It makes it even worse in court. It's not a random Joe who clicked "protect with password" on a zip file; it's a knowledgeable user who has chosen to install software whose major feature is to enable lying to courts. The only way for this software to serve its purpose would be for a major Linux distribution or large computer vendor to create a set of encrypted volumes by default, so one can argue not knowing what it was for.
Hans Reiser could "plausibly deny" committing murder, except he had acquired reading material about how to commit a murder. That was enough to pressure him to enter a deal.
Re: (Score:2)
If this were a widespread option, where more than just a few users used it, maybe it wouldn't be something that a DA could use to nail someone to the wall. Similar to people using VeraCrypt: because it is so commonly used, singling out a specific user on the suspicion that they had hidden volumes likely wouldn't work.
Re: (Score:2)
Sure. The DA would just nail people to the wall for refusing to open the hidden volume that is obviously there because "everybody has it".
Re: (Score:2)
Case law is pretty clear on this particular matter, even if it's highly unclear with regard to compelled surrender of cryptographic keys in general.
Case law on the matter says that it must be a "foregone conclusion" that what the government seeks is behind cryptographic lock.
If the government cannot prove that there is a volume there, then it's a matter of forcing testimony. I.e., a 5th amendment issue. The fact that
Re: (Score:2)
Good luck with trying that, for example, in the UK. I would also not rely too much on that "foregone conclusion". Prosecutors in the US have a habit of torturing the available facts as far as they can.
Re: (Score:2)
But in the US, the "foregone conclusion" case law exists for a reason. Because the prosecutors have tried that bullshit, and the courts said no.
In particular, deniable cryptography hasn't been tested, but deniable analogues have, and the court has been clear. Prosecutors absolutely cannot compel you to produce something unless they have overwhelming evidence that you have it.
To be fully clear- I get your concerns, and I agree with them
Re: (Score:2)
The same DA could use those arguments and say someone needs to cough up their real evidence on a USB flash drive, even though there is not much more than guessing/suspicion, much less a foregone conclusion that it even exists. Here in the US, people do not have to testify against themselves; otherwise there would be a lot of people in jail being held indefinitely until they cough up evidence against themselves.
Yes, there are edge cases, but even then, turning over a passcode is a lot different than having to turn ove
Re: (Score:2)
Hans Reiser could "plausibly deny" committing murder, except he had acquired reading material about how to commit a murder. That was enough to pressure him to enter a deal.
Exactly. And that was in a country where the rule of law is mostly intact. Try that in some where it is not.
Re: (Score:2)
Hans Reiser could "plausibly deny" committing murder, except he had acquired reading material about how to commit a murder. That was enough to pressure him to enter a deal.
Not really. Reiser's problem was that, as a massive narcissist, he took the stand and went with the "you cannot convict me because even if I did it I am smarter than you" defence. He came across as an arrogant liar, so the jury convicted. We don't know what would have happened if he had the sense to STFU and let his attorney argue that all the evidence was circumstantial: the books weren't the only evidence the prosecution had.
He took a plea after conviction in exchange for showing where the body was buried
Re: (Score:2)
You are right. I re-read the story. I'm surprised however that after conviction one can make a deal that changes the nature of the crime.
Would not use it (Score:1)
Re: (Score:2)
Exactly. And even if the adversary does play by the rules, the rules can be and often will be stacked against this. Because obviously law enforcement cannot prove there really is a hidden volume, in the UK and US, it is enough that it is plausible enough that there is a hidden volume and then they can lock you up (after convincing a judge that does not understand technology and will not even be able to follow that "plausibility" argument you are trying to make; instead it will be taken as an indicator that you have hidden something) until you give them the keys.
Re: (Score:2)
Because obviously law enforcement cannot prove there really is a hidden volume, in the UK and US, it is enough that it is plausible enough that there is a hidden volume and then they can lock you up (after convincing a judge that does not understand technology and will not even be able to follow that "plausibility" argument you are trying to make; instead it will be taken as an indicator that you have hidden something) until you give them the keys.
The defense will be allowed to bring their own expert to the table as well.
Plausible deniability will in fact render the state unable to compel (on 5th amendment grounds), and this is firmly rooted in case law.
Whether or not this is actually plausibly deniable is another story. At least one thing you pointed out makes that highly questionable.
Having those crypt-grade randomness containing areas on your drive may be quite enough already.
Ah yes. That. I agree that this is likely good enough to cause you to lose your 5th amendment claim.
And having software like Shufflecake there
That will not convince a judge to strip you of your constitu
Snake Oil for the stupid (Score:2)
Except with great care and crafted pretty much directly before going into danger, plausible deniability does not work because the "plausibility" angle does not work. Sure, you can always create hidden volumes that nobody can prove are hidden volumes. The most simple way is to fill some space with crypto-grade random data and then place a plain, no-metadata encrypted volume in there, e.g. raw cryptsetup. That is easy to do, and there is absolutely no way to prove it is a hidden volume if you are careful and
Re: (Score:3)
> They will just torture you until you give them the data or you die. Not data there? Well, tough luck.
I think you misunderstand.
This is for people who are willing to die to protect their data.
Death Star plans, not your porn collection.
Almost nobody is willing to die for a belief or a community anymore, so this is hard for most people to understand.
Re: (Score:2)
Almost nobody is willing to die for a belief or a community anymore, so this is hard for most people to understand.
Thank you, I wish I had karma points to mod you up. I add to this that there are scenarios where you don't need to go that far, at least in more democratic countries. There are recorded cases where even the "simple" Truecrypt was enough to grant acquittal of a suspect, and we don't know about the non-recorded ones. See for example wikipedia [wikipedia.org]. Sadly, some of these examples are related to people suspected of devious crimes, but this is just to make the point that plausible deniability can work. Even more so if,
Sounds great; I'll try it out today! (Score:2)
Great that we have the source code. [washingtonpost.com]
I would *assume* it's been reviewed by trusted and qualified members of the open source community.
Right?
Works against good guys, less against bad guys (Score:2)
When the US Prosecutor wants to find your child porn, this kind of protections works great. You show your tax records are encrypted and they can't find the illegal stuff. No conviction, the scum walk free.
When North Korea is looking at your laptop, they do not believe you when you show them the tax records and the child porn, they insist you must also have North Korean National Secrets in a third directory. So they hang you for espionage.
Re: (Score:2)
The wrench technique still works (Score:1)
Just keep hitting him after he gives up the first password. You don't even need to say anything.
15-level limit?? Just use the wrench 15 times! (Score:1)