
Introducing Shufflecake: Plausible Deniability For Multiple Hidden Filesystems on Linux (kudelskisecurity.com) 90

Thursday the Kudelski Group's cybersecurity division released "a tool for Linux that allows creation of multiple hidden volumes on a storage device in such a way that it is very difficult, even under forensic inspection, to prove the existence of such volumes."

"Each volume is encrypted with a different secret key, scrambled across the empty space of an underlying existing storage medium, and indistinguishable from random noise when not decrypted." Even if the presence of the Shufflecake software itself cannot be hidden — and hence the presence of secret volumes is suspected — the number of volumes is also hidden. This allows a user to create a hierarchy of plausible deniability, where "most hidden" secret volumes are buried under "less hidden" decoy volumes, whose passwords can be surrendered under pressure. In other words, a user can plausibly "lie" to a coercive adversary about the existence of hidden data, by providing a password that unlocks "decoy" data.

Every volume can be managed independently as a virtual block device, i.e. partitioned, formatted with any filesystem of choice, and mounted and dismounted like a normal disc. The whole system is very fast, with only a minor slowdown in I/O throughput compared to a bare LUKS-encrypted disk, and with negligible waste of memory and disc space.

You can consider Shufflecake a "spiritual successor" of tools such as Truecrypt and Veracrypt, but vastly improved. First of all, it works natively on Linux, it supports any filesystem of choice, and it can manage up to 15 nested volumes per device, making deniability of the existence of these partitions really plausible.

"The reason why this is important versus 'simple' disc encryption is best illustrated in the famous XKCD comic 538," quips Slashdot reader Gaglia (in the original submission). But the big announcement from Kudelski Security Research calls it "a tool aimed at helping people whose freedom of expression is threatened by repressive authorities or dangerous criminal organizations, in particular: whistleblowers, investigative journalists, and activists for human rights in oppressive regimes.

"Shufflecake is FLOSS (Free/Libre, Open Source Software). Source code in C is available and released under the GNU General Public License v3.0 or later.... The current release is still a non-production-ready prototype, so we advise against using it for really sensitive operations. However, we believe that future work will substantially improve both security and performance, hopefully offering a really useful tool to people who live in constant danger of being interrogated with coercive methods to reveal sensitive information."
  • by zerobeat ( 628744 ) on Saturday November 12, 2022 @02:41PM (#63045931) Homepage
    A certain Australian fugitive would be proud
  • by Anonymous Coward on Saturday November 12, 2022 @02:52PM (#63045963)
    This sounds like a great addition to default installations. If it's automatically installed for everyone, it makes it easier to deny that you are actually using it.
    • by Gaglia ( 4311287 ) on Saturday November 12, 2022 @04:21PM (#63046127)

      OP here, thanks for the comment! This is in fact our long-term vision; we will explain the details better on the website, but yes, you're right, ideally that would make sense. However, notice the difference:

      • With Truecrypt and similar solutions, having the installation by default on every system would be mandatory to achieve a minimum of plausible deniability, because the only reason one would install Truecrypt instead of a more "standard" disc encryption solution is, well, the plausible deniability! But Truecrypt only supports one layer of hiding, so that doesn't buy you much.
      • With Shufflecake, instead, even if you're the only one using it and so you're basically shouting to the world "I'm hiding stuff", there is no way to know how much you're hiding, so you can still lie convincingly with a decent setup.

      Having Shufflecake included by default in most distributions, like cryptsetup/LUKS, would definitely be the best option, but we're waaay far from being there. As we stress in our disclaimer, this is a proof of concept, and a lot more work is needed. The next steps are 1) cleaning the code and making it more portable, and 2) working on booting Linux from a Shufflecake volume. Stay tuned!

      • Even if you're the only one using it and so you're basically shouting to the world "I'm hiding stuff", there is no way to know how much you're hiding, so you can still lie convincingly with a decent setup

        Really strange fetish porn to cover up that I have the Galactic Ultimate Top Secret plans (GUTS), or, even more weird, nothing at all, only pics of flaming goats while sky diving....

      • by gweihir ( 88907 )

        So, serious question: Why do you want people to get detained, beaten up, raped, tortured and murdered?

        The only way to make people safe is if people can prove they have no secret hidden stuff and if the stuff they have visible is non-problematic. That means filling empty space with zeros. Any high-quality randomness fill already makes it very plausible that there is hidden data in there. And, of course, there are countless forensics tools that look exactly for those kinds of areas. The legal situation is typ

        • by Gaglia ( 4311287 )

          So, serious question

          Then I will reply with a serious answer.

          The only way to make people safe is if people can prove they have no secret hidden stuff and if the stuff they have visible is non-problematic.

          So your security model is "just belong to the Mulino Bianco family [archiviost...arilla.com]". Sounds like a great idea, why didn't we think about it before?

        • It's not possible to prove that you aren't hiding anything - you're trying to prove a negative, which is logically impossible. You could always have just done a really good job.

          Let's be honest: if someone is searching your computer in the first place, it means they're already pretty sure you did something. Not finding an obvious smoking gun is unlikely to convince them otherwise.

          Granted, finding the equivalent of a hidden wall safe may make them even more convinced that there's a second safe somewhere.

        • by nasch ( 598556 )

          That was my thought as well. The easier this kind of thing is to access and use, the more likely it is any particular suspect is using it, and so the less likely an adversary is to believe it isn't in use. If the adversary is a judge in the US, maybe you're looking at a few months or years for contempt of court. If it's the Russian mob, the scenario could be far worse. How do you convince them there's nothing there when the whole purpose of the product is to give people a way to claim there's nothing th

      • by AmiMoJo ( 196126 )

        Can you explain your threat model in more detail please?

        The main problem with plausible deniability is that it's often not very plausible. Say you have some document you don't want anyone else to see on there. If you open it in an application that keeps a list of recently opened files, or that has some of its memory dumped into swap, or that makes temporary files somewhere outside the hidden volume, there is evidence that the supposedly deniable data exists.

        Mitigating that can be very difficult. Booting fro

        • by Gaglia ( 4311287 )
          Thanks for the question, you raise very good points. As you point out, it is important to make sure that metadata about, e.g., last accessed files or the last opened volume does not leak, and this is very OS-dependent. So the next step would be booting Linux from inside Shufflecake itself, which is a WIP feature (just like Truecrypt evolved from "disc encryption only" to a "hidden OS mode" over time). You can find some more in-depth technical discussion of the implementation details and challenges in this [reddit.com] Reddit
      • Disagree that the only reason TrueCrypt/VeraCrypt would be used is the plausible deniability feature. For full disk encryption, I trust VeraCrypt a lot more than BitLocker, since Microsoft has proven untrustworthy with PRISM in the past.
    • by gweihir ( 88907 )

      Yeah. And in the real world, just using one of those distros is already enough to create the suspicion they need to convince a judge. You fail.

      • Dev here. The aim of Shufflecake is not to hide the fact that you have something to hide (because that can't be done), but to give you the necessary "infrastructure" for you to cook up a plausible lie about the disk contents. You still need to be good at lying, but with previous software that's not enough: with TrueCrypt, for example, you cannot _plausibly_ claim that you're only using one volume, no matter how plausible that is, because _everybody_ uses the second volume. Shufflecake supports more than two
        • Dev here. The aim of Shufflecake is not to hide the fact that you have something to hide (because that can't be done), but to give you the necessary "infrastructure" for you to cook up a plausible lie about the disk contents. You still need to be good at lying,

          You have to be better at lying than authoritarian governments are at locking people up and torturing them? GLWT. Just having the software on your computer is probable cause. When you figure out how to hide the software you will have created something that does more good than harm.

        • by gweihir ( 88907 )

          Bullshit. Lots of people are not using the hidden volume on TC. For example, I never did.

          Seriously, why do you put people in danger like this? This is a gross violation of engineering ethics!

  • by lsllll ( 830002 ) on Saturday November 12, 2022 @03:07PM (#63045995)

    a tool aimed at helping people whose freedom of expression is threatened by repressive authorities or dangerous criminal organizations, in particular: whistleblowers, investigative journalists, and activists for human rights in oppressive regimes

    As long as there's unused (unpartitioned) space on your device, or you have large useless files sitting around (loop device), you'll be a suspect, and those governments don't care what you say to reason with them.

    One way I can see something that would be undetectable is a filesystem that always uses free space at the beginning, leaving the end of it untouched, and then a tool that can tap into the free space at the end of the file system, requiring you to give the starting block (or MiB) and length of the "partition" you want to access. You'd have to be careful not to fill up the file system more than the space allocated to the beginning, or you'll start overwriting your hidden area. That way there are NO signatures of any kind kept anywhere, and the program that can open that portion of the file system can be downloaded every time, run, and then erased. The ability to always use space at the beginning of the file system would need to be built into all popular file systems so that the file system itself doesn't give anything away.
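The offset-based scheme sketched above fits in a few lines; this is a toy illustration (the helper names are made up, and real use would also require encrypting the hidden region so it looks like leftover noise):

```python
import os

def open_hidden_region(path, start_mib, length_mib):
    # No signature is written anywhere on disk: the start offset and
    # length exist only in the user's memory, exactly as proposed above.
    start = start_mib * 1024 * 1024
    length = length_mib * 1024 * 1024
    fd = os.open(path, os.O_RDWR)
    return fd, start, length

def write_hidden(fd, start, offset, data):
    # pwrite writes at an absolute position without moving the file offset.
    os.pwrite(fd, data, start + offset)

def read_hidden(fd, start, offset, n):
    return os.pread(fd, n, start + offset)
```

The "tool" is then just seek-and-read arithmetic, which is also why the scheme offers no protection against the visible filesystem growing into the hidden range.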

    • by Generic User Account ( 6782004 ) on Saturday November 12, 2022 @04:08PM (#63046103)

      From the Shufflecake project website [shufflecake.net], which Slashdot didn't bother to link:

      If I do not open all volumes but only some of them, what happens to the other unopened volumes?
      Likely they will get corrupted badly. This is a desired behaviour: it is necessary for plausible deniability.

      The hidden volumes are only protected from being overwritten when their password is entered. All volumes which are more hidden than the volume to which the password is entered are treated as empty space which can be allocated at any time. Their on-disk structures look like the random data in actually unused space. The headers to any volumes are encrypted and random data is written to unused header space, making headers and unused header space indistinguishable without the password to the most hidden volume.

      The problem with these systems derives directly from the design goals: You can't prove that there is no further hidden data unless you use all 15 headers and reveal the password to the 15th header. You can say that you only used two volumes, but depending on the adversary, you'll be tortured to death if you don't reveal a third, which you can't if there is no third.

      • by lsllll ( 830002 )

        Ahhh, thanks for the link. So I was on the right track. And I was wondering if there's a table that keeps track of the volumes, and there is!

        Data about the position of used and unused slices is stored in a volume-specific "position map", which is indexed within an encrypted header at the beginning of the device. Both position map and header are indistinguishable from random data without the correct decryption key, and every slot in the header (currently up to 15 volumes) has a field containing the decryption key for the previous (i.e., "less hidden") header and volume, thereby recursively linking all volumes and allowing the user to open all of them with a single password.

        I'm sure I'm missing something, but wouldn't it make sense that once they've forced you to decrypt the first volume, its header would have pointers to where the 2nd volume starts? Or are we assuming that if I provide the password for, say, the third volume, it would unlock the 3rd, 2nd, and 1st automatically? But how would it know where the third volume is?

        • Dev here, thanks for your comment! The header section is statically divided into 15 equal-sized "slots", each one containing an encrypted volume header (or random noise if unused). So, when opening the third volume, the tool statically knows where to find the header. The header then contains pointers into the data section for where to find the data of the third volume. Also, it contains the password to the second volume. The most secret volume is the highest-order one.
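The recursive unlock the dev describes can be modelled with a toy sketch. Assumptions to be loud about: the XOR/SHA-256 "keystream" here is NOT real cryptography, and the slot size, marker, and key-chaining layout are invented for illustration; real Shufflecake uses proper encrypted headers.

```python
import hashlib, secrets

SLOTS, SLOT_SIZE = 15, 64   # toy sizes, not Shufflecake's real layout

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream (NOT secure): iterate SHA-256 over key||counter.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_header_area(passwords):
    """passwords[i] opens volume i (higher index = more hidden). Each
    header stores the previous volume's key, so one password recursively
    opens all less-hidden volumes. Unused slots are pure random noise."""
    keys = [hashlib.sha256(p).digest() for p in passwords]
    area = []
    for i in range(SLOTS):
        if i < len(keys):
            prev_key = keys[i - 1] if i > 0 else b"\0" * 32
            payload = prev_key + b"VOLHDR" + bytes(26)   # 64 bytes total
            area.append(xor(payload, keystream(keys[i], SLOT_SIZE)))
        else:
            area.append(secrets.token_bytes(SLOT_SIZE))  # indistinguishable filler
    return area

def open_volume(area, i, password):
    """Decrypt slot i and follow the chain of previous-volume keys down
    to volume 0. Returns the list of recovered keys, or None on a wrong
    password (the slot then just looks like random noise)."""
    key = hashlib.sha256(password).digest()
    chain = []
    while i >= 0:
        payload = xor(area[i], keystream(key, SLOT_SIZE))
        if payload[32:38] != b"VOLHDR":
            return None
        chain.append(key)
        key = payload[:32]
        i -= 1
    return chain
```

Opening the most hidden volume yields the whole chain of keys down to the least hidden one, while a wrong password produces bytes indistinguishable from the random noise filling unused slots.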
      • by gweihir ( 88907 )

        The hidden volumes are only protected from being overwritten when their password is entered.

        And at this time, the whole thing has already failed. Well, it has already failed at the presence of the Shufflecake software and at the presence of areas with crypto-grade randomness in there, but it fails here again. This write-protection must _not_ be present, or forensics software can see obviously suspicious block allocation patterns. And that is why the whole idea is bogus.

        • The filesystem looks exactly the same whether there are hidden volumes or not. The space is randomly allocated in slabs that are not used by the decrypted volumes. It's similar to LVM thin provisioning, just that you have to always randomize allocations or the allocations to the decrypted volumes would show patterns that hint at space used by hidden volumes.

          • by gweihir ( 88907 )

            And how do you randomize allocations without arousing suspicion, when the ordinary FS code does nothing of the sort? Right.

            • It's a block layer below the filesystem. Even if the filesystem allocates sequential blocks, the underlying allocation of actual disk space need not be contiguous. If the similar LVM thin provisioning concept is new to you, read up on it.
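The indirection being described, a block layer that remaps contiguous logical blocks onto randomly chosen physical slices in the spirit of LVM thin provisioning, can be sketched as a toy model (the class and its behaviour are illustrative assumptions, not Shufflecake's actual code):

```python
import random

class ThinDevice:
    """Toy block remapper: the filesystem above sees contiguous logical
    blocks, but physical slices are grabbed at random positions on first
    write, so sequential logical writes end up scattered on disk."""

    def __init__(self, physical_slices):
        self.free = set(range(physical_slices))
        self.map = {}    # logical block -> physical slice
        self.disk = {}   # physical slice -> data

    def write(self, logical, data):
        if logical not in self.map:
            phys = random.choice(sorted(self.free))  # randomized allocation
            self.free.remove(phys)
            self.map[logical] = phys
        self.disk[self.map[logical]] = data

    def read(self, logical):
        return self.disk.get(self.map[logical]) if logical in self.map else None
```

Even if the filesystem writes logical blocks 0, 1, 2, ... strictly in order, the physical slices backing them are non-contiguous, which is the point gweihir's objection misses: the scattering happens below the filesystem, not in it.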

        • Dev here, thanks for your comment! The block allocations are mathematically proven to be secure, you can find the proof in the Thesis linked on shufflecake.net. In a nutshell, the disk snapshot (assuming the adversary only has one) always looks like randomly-scattered decryptable data + random noise, whether or not you have surrendered all passwords.
          • by gweihir ( 88907 )

            So, in other words when you are already behind bars, you can maybe lie about more data being in there? Yeah, great design. In the real world, you are screwed as soon as somebody with enough authority suspects you are hiding data and you cannot prove otherwise. Here is a hint: Ordinary file systems do _not_ contain "random noise" and no amount of "mathematical proof" makes that little problem go away.

            • You raise an interesting point, regarding good random data not really existing in the space between files, normally.
              But here's a solution to it.

              You use eCryptfs instead of LUKS.
              Then the space in between files will indeed be full of cryptographic data, and would be expected to be so after you give them the password to the eCryptfs mount.
            • by pacinpm ( 631330 )

              You are right. That's why you HAVE something to hide (naked images of your wife, for example). And you can reveal it when forced to. Those pictures are your justification for having encryption in the first place. If needed, you can have multiple layers of those: company financial data on the first layer, regular porn on the second, wife nudes on the third, child porn on the fourth, a dissident contact list on the final one. You can reveal layer by layer depending on interrogation hardness and risk.

      • Dev here, thanks for your comment! There's no way to prove that you don't have further data, if you do have it. You cannot mathematically convince someone of a false statement. Shufflecake is mainly aimed at people that have something to hide.
        • The problem is subtly different. There is no way to prove that you don't have further data if you don't have it. Can't be, per the design goals. If there is random data on the system, there's always a possibility from the outside perspective that it isn't actually random but encrypted data. The only way to "prove" that there is no hidden data is to decrypt everything that looks random. Depending on the circumstances, actual randomness is a liability. This isn't a critique of Shufflecake or similar systems,

    • You don't need any special tools for that. Make partition. Fill partition with random data. Make filesystem on partition much smaller than the partition. Automatically run script which, when that filesystem is getting low-ish in space, asks permission then resize2fs's it. Make second thing at memorable offset, use loopback to mount it manually whenever you want it. Deny permission to script if you got careless and it wants to resize beyond your memorable offset.

      Still looks very suspicious, but there's no wa

      • by gweihir ( 88907 )

        Still looks very suspicious, but there's no way to protect large amounts of non-zeroed but claimed information-free disk space from potential overwriting without looking suspicious, because that's never how you'd naturally do things.

        Exactly. If you go over a border where they may detain you on suspicion of smuggling data, make sure to zero-wipe all unused data areas on your storage before. That is the only thing which does not scream "I have things to hide!". For unused space within partitions, you should be able to get away by just not having crypto-grade randomness in there, but don't depend on it. Better zero-wipe them as well.

    • by Gaglia ( 4311287 ) on Saturday November 12, 2022 @04:27PM (#63046153)
    OP here, thanks for your comment. What you are describing is exactly how Truecrypt and Veracrypt work, and this unfortunately has many drawbacks, first of all that "a filesystem that always uses free space at the beginning" is basically FAT-only nowadays. Shufflecake works differently: there is no wasted space at all on the disc, regardless of how many volumes you open or how many of the 15 you have actually created; by default you will always see each of these volumes as being as large as the whole device (although this is, in fact, an illusion). This is called "overcommitment", and it's better explained in the FAQ [shufflecake.net].
  • by ixneme ( 1838374 ) on Saturday November 12, 2022 @03:20PM (#63046025)
    One of the downsides of the hidden volumes feature of TrueCrypt was that the data couldn't be actively protected from destructive overwrites - you can't plausibly deny a secret volume exists while also telling the OS not to overwrite your nonexistent volume. The solution here seems to be that all volumes are created, and if you don't use 15 secret volumes, whatever you don't use is essentially wasted space.

    I suppose that is a win for data integrity, but at least with TrueCrypt most users probably didn't use the hidden volume feature, whereas I don't see why anyone would use such a wasteful file system unless they specifically needed hidden volumes. You immediately lose the plausible deniability that there isn't at least one hidden drive, and now you're getting hit over the head with a wrench until you come up with 15 passwords or pass out.
    • by ugen ( 93902 )

      I understood that XKCD reference (also - the first thing I thought about when reading this).

    • Re:Not so plausible (Score:4, Informative)

      by Gaglia ( 4311287 ) on Saturday November 12, 2022 @04:31PM (#63046159)
      OP here, thanks for your comment, but what you are describing is not how Shufflecake works. In Shufflecake, hidden volumes on a device claim space that it is not known to "less hidden" volumes, and they do it "on the fly" as soon as space is requested. When you initialize a new device with Shufflecake, there is no wasted space at all, regardless of how many volumes you open or how many of the 15 you have actually created, by default you will always see each of these volumes being as large as the whole device (although this is, in fact, an illusion). This is called "overcommitment", and it's better explained in the FAQ [shufflecake.net].
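The "overcommitment" idea can be captured in a toy model (a hypothetical class, heavily simplified: no encryption, no hidden/decoy distinction): every volume reports the full device size, but real slices are only claimed from a shared pool on first write, so nothing is wasted by volumes that stay small.

```python
class OvercommittedVolumes:
    """Toy model of overcommitment: each volume's apparent size is the
    whole device, while actual slices come from one shared pool."""

    def __init__(self, device_slices, n_volumes):
        self.device_slices = device_slices
        self.free = set(range(device_slices))
        self.owned = [set() for _ in range(n_volumes)]

    def apparent_size(self, vol):
        # The illusion: every volume looks as large as the device.
        return self.device_slices

    def allocate(self, vol):
        if not self.free:
            raise OSError("device full")  # the overcommitment debt comes due
        s = self.free.pop()
        self.owned[vol].add(s)
        return s
```

The trade-off is visible in the failure mode: the sum of what all volumes *could* hold exceeds the device, so writes fail with an I/O error once the shared pool is exhausted.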
  • PhonebookFS? (Score:5, Interesting)

    by ctilsie242 ( 4841247 ) on Saturday November 12, 2022 @03:37PM (#63046061)

    This reminds me of a project that was done a decade+ ago, called PhonebookFS. It not only allowed different keys to mount a "layer" as a filesystem, but also created chaff: junk files that were never decryptable, providing plausible deniability in that, even if all keys were divulged, there would still be stuff out there that couldn't be accessed.

    The advantage of PhonebookFS over Shufflecake was simplicity. You could just have one password, or you could have millions. None of this interleaving and tap-dancing was required.

    Wish PhonebookFS were still around. I can't even find a working Git repo of it.

    • Re:PhonebookFS? (Score:4, Informative)

      by Gaglia ( 4311287 ) on Saturday November 12, 2022 @04:37PM (#63046175)
      OP here, thanks for your comment, this PhonebookFS sounds interesting and I wish I could find more info online about it. But I don't understand what you mean about having "one password or one million". With Shufflecake, you can have any number of passwords from 1 to 15, and nobody will be able to tell. All the interleaving of slices and the low-level machinery is, of course, completely transparent to the user, so you don't have to mess with the fine details in everyday use. In fact, so far we only support three commands: create, open, and close. But of course any suggestion for improving the usability would be very welcome; please also feel free to have a look at the website [shufflecake.net] (there are pointers to the source code and documentation as well). Thanks!
    • Or you get tortured to death, since the attackers can't decide whether you have given all the passwords or you have more.

      • by gweihir ( 88907 )

        Indeed. What a complete fail. By far enough has been written about why this is an exceptionally bad idea that does not survive in the real world. The authors of this are willfully ignorant and apparently do not care that they endanger lives. It is better to have no security than bogus security.

  • I remember that being the premise of TrueCrypt - create a shadow volume with mom's baking recipe and other innocuous stuff while the other volume contained the stuff you want to protect. I wonder in practice how likely it is to save a person. If the investigators *know* you're using software with baked in plausible deniability you'd *really* have to make the fake data something that appears to be something you want to protect otherwise they'd know it is the decoy.
    • So, if you're trying to avoid prosecution, have the decoy volume be something with jail time, but substantially less than what the true hidden volume would reveal. And, I guess, maybe have a decoy-decoy of mom's lasagna recipe.

      • by gweihir ( 88907 )

        Yep, because police forensics people cannot read and understand documentation. Not. They will just lock you up for the minor offense and then have ample time to prepare for locking you up even longer on the mere _suspicion_ that you have a 2nd layer in there.

        • Definitely a problem if you're trying to pull this stunt in a country that doesn't have a strong habeas corpus.

          However, hypothetically, it should be safe enough to do in most of the Western world where the police cannot hold you indefinitely without charging you for a crime, and a Judge would not hold you in contempt if you had reasonable plausible deniability.
    • One of the issues with the original Truecrypt hidden volume was that you could only have one volume mounted. If you mounted the decoy volume then you risked corrupting your hidden volume if you tried to do any decoy writes.

      Finding the key to the decoy volume gives the investigator such useful information as file access times. Since you couldn't safely use the decoy volume, it became increasingly stale and was a dead giveaway that there was more to see here.

      That was fixed by either hidden volume protection (moun

      • by DrXym ( 126579 )
        Yes I remember that. I played around with it and basically you mounted the shadow, put some files in it and that was it. Couldn't do any more with the shadow volume without destroying the real volume.

        I still think that the main risk of these plausible deniability crypto schemes is that, simply by virtue of using them, plausible deniability goes out the window. If the FBI saw a person using this, they'd know they were hiding something. So it would be necessary to maintain a fake decoy and a very convincing pla

  • Ok, so, good idea, and I'm probably missing something, but if the blocks masquerade as free blocks, does this mean that they are in the free list? If not, wouldn't one possible forensic vector be to find what blocks are not in existing filesystems and also not in the free list?

    • Re: (Score:3, Informative)

      by Gaglia ( 4311287 )
      OP here, thanks for your comment. In a nutshell, what happens is that there is no list of "free" blocks, but rather of "occupied" blocks: every volume keeps track of the blocks that itself is using, all the rest are considered free space.
      • Don't you take a performance hit for that? When the OS allocates a block for writing, what used to happen was that it would just pick the next free block off the free list. This is why sorting the free list became important, so that new blocks would have a greater chance of being contiguous.

        It seems like having only a list of used blocks would add extra steps to find the next free block, and make it even tougher to pick out contiguous blocks. I guess you could assume SSD hardware, where using contiguous

        • Re:Free list (Score:5, Informative)

          by Gaglia ( 4311287 ) on Saturday November 12, 2022 @05:34PM (#63046283)
          Good observation! The way we do it currently is something along the lines of: 1) pick a random location for allocating a new slice; 2) if that location is occupied, just fast-forward until you find a free slot; 3) if you reach the end of the device, wrap around to the beginning; 4) if you reach the starting point again, that means you're out of space, so return an I/O error. This is *extremely* fast, but it requires keeping a sorted list of allocated slices in memory (this is basically what those 60 MiB per volume are for). We're also investigating other ways of doing that, but so far we're happy with the performance. Thanks!
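The four steps the dev lists can be sketched directly (a simplified illustration; real Shufflecake's slice bookkeeping is more involved):

```python
import random

def allocate_slice(occupied, n_slices, rng=random):
    """Pick a random position, scan forward to the first free slice,
    wrap around at the end of the device, and fail with an I/O error
    only after a full lap finds nothing free."""
    start = rng.randrange(n_slices)
    pos = start
    while True:
        if pos not in occupied:
            occupied.add(pos)
            return pos
        pos = (pos + 1) % n_slices          # fast-forward, with wrap-around
        if pos == start:
            raise OSError("out of space")   # the I/O error case
```

The in-memory `occupied` set plays the role of the sorted allocated-slice list the dev mentions: membership tests make each probe cheap.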
          • This algorithm doesn't create evenly distributed random allocations and will create patterns which can hint at the existence of hidden volumes.

            • by Gaglia ( 4311287 )
              Not really: The pattern allocation of the topmost (i.e., the "most hidden" volume that you unlock), say volume N, is completely random on the remaining empty space. The caveat is what "empty" means: by "empty" I mean "not occupied by any of the volumes from 1 to N". So, for example, if there is actually an N+1 volume that you didn't unlock, the allocation algorithm will not consider that - and, yes, that means that you have good chances of corrupting it by accidentally overwriting sectors of it, which is th
              • The probability of choosing an allocation unit directly after a unit which is occupied by this volume or a more visible volume is at least twice as high as one which is preceded by a unit which is unused or from a less visible volume. You can land directly on the free unit or on the occupied unit and "fast-forward" into the same unit, whereas you can only land directly on a unit which is preceded by "free" space. This skews the allocation to a non-uniform distribution. If you then allocate units without thi

                • Scratch that. Obviously the allocations without the more hidden volume don't depend on the hidden volume. The problem however still exists: If you use a volume with a more hidden volume decrypted, the allocation pattern isn't an even distribution because it is skewed by the improper algorithm "fast-forwarding" on the allocations to the hidden volume. This pattern remains when the hidden volume isn't decrypted and can hint at its existence.
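The skew being described is the same clustering effect as linear probing in hash tables, and a tiny deterministic enumeration shows it (a hypothetical 8-slice device, not Shufflecake code):

```python
import collections

def probe(occupied, n, start):
    # Linear "fast-forward" from a given start, as in the allocator
    # described by the dev above.
    pos = start
    while pos in occupied:
        pos = (pos + 1) % n
    return pos

# With slices 1 and 2 already occupied, enumerate every possible random
# start: slot 3 is reached from starts 1, 2 AND 3, while every other
# free slot is reached only from itself.
counts = collections.Counter(probe({1, 2}, 8, s) for s in range(8))
# counts[3] == 3, while counts[0] == counts[4] == ... == 1
```

So a free slice sitting just after a run of occupied ones is hit proportionally more often, which is the non-uniformity the comment is pointing at.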

                  • by Gaglia ( 4311287 )

                    the allocation pattern isn't an even distribution because it is skewed by the improper algorithm "fast-forwarding" on the allocations to the hidden volume. This pattern remains when the hidden volume isn't decrypted and can hint at its existence.

                    Ah, I see what you mean, like a modulo bias. You're not wrong: it is improper to say that the distribution of allocated slices is uniformly random. However, this is not a problem, because the scenario is different from a modulo bias: here, the "skewed" distribution is the same whether you have other hidden volumes or not. Let's take an example: say you have 3 volumes total but you only give the password for the 1st one, so 2 others remain hidden. What the adversary sees is slices allocated to avoid what

  • Do you really think this is going to help you if you end up in court?
    It makes it even worse in court. It's not a random Joe who clicked "protect with password" on a zip file; it's a knowledgeable user who has chosen to install software whose major feature is to enable lying to courts. The only way for this software to serve its purpose would be for a major Linux distribution or large computer vendor to create a set of encrypted volumes by default, so one could argue not knowing what they were for.

      Hans Reiser could "plausibly deny" committing murder, except he had acquired reading mater

      If this were a widespread option, where more than just a few users used it, maybe it wouldn't be something that a DA could use to nail someone to the wall. Similar to people using VeraCrypt: because it is so commonly used, singling out a specific user on the suspicion that they had hidden volumes likely wouldn't work.

        • by gweihir ( 88907 )

          Sure. The DA would just nail people to the wall for refusing to open the hidden volume that is obviously there because "everybody has it".

          • The DA cannot hold you for refusing to open a hidden volume it cannot prove is there.
            Case law is pretty clear on this particular point, even if it's highly unclear about compelled surrender of cryptographic keys in general.

            Case law on the matter says that it must be a "foregone conclusion" that what the government seeks is behind the cryptographic lock.

            If the government cannot prove that there is a volume there, then it's a matter of forcing testimony. I.e., a 5th amendment issue. The fact that
            • by gweihir ( 88907 )

              Good luck with trying that, for example, in the UK. I would also not rely too much on that "foregone conclusion". Prosecutors in the US have a habit of torturing the available facts as far as they can.

              • I don't know anything about UK rights when it comes to trials and such.
                But in the US, the "foregone conclusion" case law exists for a reason: prosecutors have tried that bullshit, and the courts said no.

                In particular, deniable cryptography hasn't been tested, but deniable analogues have, and the courts have been clear: prosecutors absolutely cannot compel you to produce something unless they have overwhelming evidence that you have it.

                To be fully clear- I get your concerns, and I agree with them
          • By the same argument, the DA could demand you cough up the "real" evidence on a USB flash drive, even with nothing more than a guess or suspicion, much less a foregone conclusion that it even exists. Here in the US, people do not have to testify against themselves; otherwise a lot of people would be held in jail indefinitely until they coughed up evidence against themselves.

            Yes, there are edge cases, but even then, turning over a passcode is a lot different than having to turn ove

      • by gweihir ( 88907 )

        Hans Reiser could "plausibly deny" committing murder, except he had acquired reading material about how to commit a murder. That was enough to pressure him to enter a deal.

        Exactly. And that was in a country where the rule of law is mostly intact. Try that somewhere it is not.

      • by Shimbo ( 100005 )

        Hans Reiser could "plausibly deny" committing murder, except he had acquired reading material about how to commit a murder. That was enough to pressure him to enter a deal.

        Not really. Reiser's problem was that, as a massive narcissist, he took the stand and went with the "you cannot convict me because even if I did it I am smarter than you" defence. He came across as an arrogant liar, so the jury convicted. We don't know what would have happened if he had the sense to STFU and let his attorney argue that all the evidence was circumstantial: the books weren't the only evidence the prosecution had.

        He took a plea after conviction in exchange for showing where the body was buried

        • You are right. I re-read the story. I'm surprised however that after conviction one can make a deal that changes the nature of the crime.

  • Too complex; it endangers the user. The only somewhat deniable thing is a USB stick or memory card that can be plausibly claimed to be "empty/unused/carried just in case". This entire deniability idea relies on the adversary playing by the rules. If you or your data are truly worth anything, the rules go out the window.
    • by gweihir ( 88907 )

      Exactly. And even if the adversary does play by the rules, the rules can be and often will be stacked against this. Because obviously law enforcement cannot prove there really is a hidden volume, in the UK and US, it is enough that it is plausible enough that there is a hidden volume and then they can lock you up (after convincing a judge that does not understand technology and will not even be able to follow that "plausibility" argument you are trying to make; instead it will be taken as an indicator that you have hidden something) until you give them the keys.

      • Because obviously law enforcement cannot prove there really is a hidden volume, in the UK and US, it is enough that it is plausible enough that there is a hidden volume and then they can lock you up (after convincing a judge that does not understand technology and will not even be able to follow that "plausibility" argument you are trying to make; instead it will be taken as an indicator that you have hidden something) until you give them the keys.

        The defense will be allowed to bring their own expert to the table as well.
        Plausible deniability will in fact render the state unable to compel (on 5th amendment grounds), and this is firmly rooted in caselaw.
        Whether this is actually plausibly deniable is another story. At least one thing you pointed out makes that highly questionable.

        Having those crypt-grade randomness containing areas on your drive may be quite enough already.

        Ah yes. That. I agree that this is likely good enough to cause you to lose your 5th amendment claim.

        And having software like Shufflecake there

        That will not convince a judge to strip you of your constitu

  • Except with great care, and crafted pretty much directly before going into danger, plausible deniability does not work because the "plausibility" angle does not work. Sure, you can always create hidden volumes that nobody can prove are hidden volumes. The simplest way is to fill some space with crypto-grade random data and then place a plain, no-metadata encrypted volume in there, e.g. with raw cryptsetup. That is easy to do, and there is absolutely no way to prove it is a hidden volume if you are careful and
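The random-fill-plus-headerless-volume scheme described above can be illustrated in miniature with Python's standard library (a toy sketch, not real dm-crypt/cryptsetup: the SHA-256 counter-mode keystream stands in for the block cipher, and the 64 KiB "disk", key, and offset are made-up parameters). Because the ciphertext carries no header or metadata, nothing distinguishes it from the surrounding random fill:

```python
import hashlib
import secrets

def keystream(key: bytes, nbytes: int) -> bytes:
    """SHA-256 in counter mode, standing in for a real cipher."""
    out = bytearray()
    ctr = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:nbytes])

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

disk = bytearray(secrets.token_bytes(64 * 1024))  # step 1: fill with random data
key = secrets.token_bytes(32)
secret = b"sensitive dossier\n" * 64
offset = 4096                                     # only the owner knows key + offset
disk[offset:offset + len(secret)] = xor(secret, keystream(key, len(secret)))

# No header anywhere: a crude statistic looks the same with or without the volume.
mean = sum(disk) / len(disk)                      # uniform bytes average ~127.5
print(f"byte mean: {mean:.1f}")

# With the key and offset, the data comes back.
ct = bytes(disk[offset:offset + len(secret)])
assert xor(ct, keystream(key, len(ct))) == secret
```

The point of the sketch is the absence of any marker: without the key and offset, the "volume" is just a region of bytes indistinguishable from the fill around it, which is exactly why nothing can be proven either way.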

    • > They will just torture you until you give them the data or you die. Not data there? Well, tough luck.

      I think you misunderstand.

      This is for people who are willing to die to protect their data.

      Death Star plans, not your porn collection.

      Almost nobody is willing to die for a belief or a community anymore, so this is hard for most people to understand.

      • by Gaglia ( 4311287 )

        Almost nobody is willing to die for a belief or a community anymore, so this is hard for most people to understand.

        Thank you, I wish I had karma points to mod you up. I'll add that there are scenarios where you don't need to go that far, at least in more democratic countries. There are recorded cases where even "simple" TrueCrypt was enough to secure a suspect's acquittal, and we don't know about the unrecorded ones. See for example wikipedia [wikipedia.org]. Sadly, some of these examples involve people suspected of heinous crimes, but this is just to make the point that plausible deniability can work. Even more so if,

  • Great that we have the source code. [washingtonpost.com]
    I would *assume* it's been reviewed by trusted and qualified members of the open source community.
    Right?

  • When the US prosecutor wants to find your child porn, this kind of protection works great. You show that it's your tax records that are encrypted, and they can't find the illegal stuff. No conviction; the scum walk free.

    When North Korea is looking at your laptop, they do not believe you when you show them the tax records and the child porn, they insist you must also have North Korean National Secrets in a third directory. So they hang you for espionage.
     

    • by Gaglia ( 4311287 )
    • by Gaglia ( 4311287 )
      You raise a good point, but I would say it's not up to us to decide who needs this kind of security. In North Korea, if the police already suspect you, they don't need to seize your laptop; it's just game over. But say you are randomly picked for a routine inspection, and say you are actually storing sensitive data on your laptop related to an illegal organization, a planned act of sabotage against the regime, or a sensitive dossier that you plan to exfiltrate to some
  • Just keep hitting him after he gives up the first password. You don't even need to say anything.

  • But for $100,000 extra there's a super secret 16th level.
