
Linus Torvalds Admits He's Been Asked To Insert Backdoor Into Linux

darthcamaro writes "At the LinuxCon conference in New Orleans today, Linus Torvalds joined fellow kernel developers in answering a barrage of questions about Linux development. One question he was asked was whether a government agency had ever asked about inserting a back-door into Linux. Torvalds responded 'no' while shaking his head 'yes,' as the audience broke into spontaneous laughter. Torvalds also admitted that while he has a full life outside of Linux, he couldn't imagine his life without it. 'I don't see any project coming along being more interesting to me than Linux,' Torvalds said. 'I couldn't imagine filling the void in my life if I didn't have Linux.'"
This discussion has been archived. No new comments can be posted.

  • by MadX ( 99132 ) on Thursday September 19, 2013 @02:09AM (#44891237)

    *If* such a mechanism were coded in, the nature of open source would mean it would be found by others. This in turn would compromise the trust of the ENTIRE kernel. That trust can take years to build up - but be destroyed in a heartbeat.

    • by phantomfive ( 622387 ) on Thursday September 19, 2013 @02:12AM (#44891249) Journal

      That trust can take years to build up - but be destroyed in a heartbeat.

      You'd think so, but somehow people still trust Windows, even though it most certainly has been compromised.

      • by DerPflanz ( 525793 ) <bart.friesoft@nl> on Thursday September 19, 2013 @02:14AM (#44891257) Homepage

        Being compromised isn't the issue. The Linux kernel has been compromised as well.

        The issue here is that a backdoor would be built in deliberately. That could compromise trust.

        • by phantomfive ( 622387 ) on Thursday September 19, 2013 @02:20AM (#44891277) Journal

          The issue here is that a backdoor would be built in deliberately. That could compromise trust.

          There is [americablog.com] that possibility [wikipedia.org]. Once again, this is a possibility we've known about for a while, and it hasn't caused people to leave Windows in droves. I think it's something most people just must not care about.

          • by vague regret ( 1834800 ) on Thursday September 19, 2013 @03:42AM (#44891557)
            According to the recent human brain study [alternet.org], facts do not matter. So no wonder people still believe in things like Windows (or open-source) safety and security...
            • by Joining Yet Again ( 2992179 ) on Thursday September 19, 2013 @04:41AM (#44891757)

              From the description of the study, it seems to me that people who have formed an opinion won't change it just because they see a single piece of potentially falsified or misleading evidence. For example (looking at one of the experiments), if someone has an opinion on joblessness in the US - which might bring in factors of job stability, hours worked or attainment of a living wage - seeing a single graph on number of employed people in recent years does not allow us to conclude that joblessness has been reduced under Obama, unless you have a very primitive interpretation of "joblessness".

              The only damning conclusion is that some academics are so arrogant that they assume test subjects must be faulty if they don't immediately believe the academic's interpretation of some data presented to them.

              • by Austerity Empowers ( 669817 ) on Thursday September 19, 2013 @07:12AM (#44892389)

                test subjects must be faulty if they don't immediately believe the academic's interpretation of some data presented to them.

                Probably the actual discovery in this experiment: There were a lot of faulty test subjects.

                • It certainly makes for better headlines than, "Extraordinary results explained by bad methodology."

                • by Dcnjoe60 ( 682885 ) on Thursday September 19, 2013 @09:41AM (#44893523)

                  test subjects must be faulty if they don't immediately believe the academic's interpretation of some data presented to them.

                  Probably the actual discovery in this experiment: There were a lot of faulty test subjects.

                  Actually, similar studies have been repeated numerous times with the same result, so it is unlikely to be a fault of the subjects or the methodology. What the tests do show is that for information we hold to be technical, we are readily willing to concede that we could be wrong; for information we hold as a belief or ideological position, we hold on vehemently. Technical issues respond to logic. Ideological ones are usually emotionally based and processed in a different part of the brain. Most social views, including politics and religion, fall into the ideological camp, which is why it is very difficult to get people to change their position using logic. It's also why things like prejudice and bigotry are so hard to eradicate: they, too, are ideological positions.

                  The old adage used to be to not discuss politics or religion when having company. The tests just confirm what we already knew.

              • It all depends... (Score:5, Insightful)

                by Dcnjoe60 ( 682885 ) on Thursday September 19, 2013 @09:32AM (#44893433)

                From the description of the study, it seems to me that people who have formed an opinion won't change it just because they see a single piece of potentially falsified or misleading evidence. For example (looking at one of the experiments), if someone has an opinion on joblessness in the US - which might bring in factors of job stability, hours worked or attainment of a living wage - seeing a single graph on number of employed people in recent years does not allow us to conclude that joblessness has been reduced under Obama, unless you have a very primitive interpretation of "joblessness".

                The only damning conclusion is that some academics are so arrogant that they assume test subjects must be faulty if they don't immediately believe the academic's interpretation of some data presented to them.

                When learning math and being shown that an equation is incorrect, one readily accepts that. Things like unemployment, climate change, etc., aren't concrete, objective things; they are really various facets of one's ideology. Ideology, like religion, is hard to change, and pretty much for the same reason: it is not based on knowledge, but on belief.

                That can be good or bad, depending on how it is used, but most often it turns out to be bad. Ideologies often force us to characterize others by stereotypes, not as individuals. What is happening in the US Congress, and in many parts of the world politically, is all based on people holding on to their ideologies and not listening to the other side. Holding to ideologies instead of the underlying principles leads to the notion that if you aren't with me you are against me, and that ultimately leads to disaster for a society by concentrating power in the hands of a few at the expense of many.

                One thing is for certain: you don't change people's ideology with facts. Facts appeal to the rational, logical part of our psyche. Ideology, on the other hand, is an emotional response and, like love, is often anything but logical.

            • by DoofusOfDeath ( 636671 ) on Thursday September 19, 2013 @08:01AM (#44892735)

              According to the recent human brain study [alternet.org], facts do not matter. So no wonder people still believe in things like Windows (or open-source) safety and security...

              Then why are you presenting a fact?

            • Re: (Score:3, Interesting)

              by operagost ( 62405 )
              No, what that study proved is that people are lied to so often, that once they form an opinion they simply refuse to believe anything new.
          • by michelcolman ( 1208008 ) on Thursday September 19, 2013 @04:08AM (#44891647)

            Then again, the back door would be easier to find by criminals. I don't personally care that much about the NSA snooping through my e-mails. But if some criminal can read them just as easily, it's a different story.

            • by Anonymous Coward on Thursday September 19, 2013 @04:19AM (#44891689)

              You seem to assume that no criminals at all are part of "the NSA". Considering the number of employees they have, most with fairly complete access, it is almost certain that there are criminals with access to a lot of NSA data.

              • by Hatta ( 162192 ) on Thursday September 19, 2013 @07:59AM (#44892715) Journal

                You seem to assume that no criminals at all are part of "the NSA".

                The NSA itself is comprised of criminals. From the agent who accesses data he has no legitimate right to, to James Clapper who lies about it to Congress. The NSA is a criminal organization.

                • by Anonymous Coward on Thursday September 19, 2013 @08:14AM (#44892859)

                  The State is nothing more nor less than a bandit gang writ large

                    -- Murray Rothbard

              • I think GP was merely pointing out an alternative reason the government shouldn't be given the keys to everything, a reason that should appeal even to those poor idiots who don't realize their government can do evil. They probably worry more about identity theft from non-government criminals than their privacy being invaded by the government. That's not entirely unjustified: if you don't sell drugs or associate with terrorists, the government probably isn't going to lock you up without rights based on the
            • by AlphaWoIf_HK ( 3042365 ) on Thursday September 19, 2013 @04:23AM (#44891707)

              It is foolish to assume that the people working for the government are perfect angels who could never mean you any harm; this has never been true and never will be true.

              • by DoofusOfDeath ( 636671 ) on Thursday September 19, 2013 @08:04AM (#44892763)

                As someone who used to work for the U.S. government, I can say that not everyone there is pure evil. I worked in the DoD, and it was more or less a normal workplace. If anything, we were more sticklers for obeying the law there than anywhere else I've worked. Maybe because the lack of profit pressure removed one possible temptation to break the law.

                • by OakDragon ( 885217 ) on Thursday September 19, 2013 @08:20AM (#44892903) Journal
                  But how can we take the word of a criminal?
                • by Archangel Michael ( 180766 ) on Thursday September 19, 2013 @09:55AM (#44893649) Journal

                  Good people allowing bad things to happen because they believe the lies that the bad things are actually good, allowing their consciences to be eased. If you saw one thing that was evil, and did nothing, you are as complicit as the evil people the rest of us believe are running those organizations.

                  Liberty takes eternal vigilance. Anything less walks us slowly down the path of tyranny. We've walked down that path so long that people crying for liberty seem like the loons while those people who are usurping liberty look like our saviors.

                  And the tyrants always cloak their deeds in legality.

                  People like you, who did nothing, saw nothing, are the ones I hate the most. You allowed evil under the false premise that it was "good". But I understand: you were just following orders.

            • by Millennium ( 2451 ) on Thursday September 19, 2013 @05:13AM (#44891871)

              But if the NSA can get in, then it is only a matter of time before someone else figures out how. Whether or not I trust the NSA barely even matters, because I certainly don't trust this next entity.

              This is why I prefer something the NSA can't get into: there's probably nobody else who can either. The NSA's cracking efforts hold considerable value for that reason: they can, and should, be letting us know when our machines are not secure enough. The problem arises when they fail to do this, which seems to have been the case in recent years.

            • by Anonymous Coward on Thursday September 19, 2013 @06:00AM (#44892005)

              How about just the UK and France? Both have a "special relationship" with the USA, so can easily be getting the same information on how to snoop on your stuff as the NSA do.

              So are you fine with the UK government, a foreign power, snooping through your e-mails?

              No?

              THEN WHY THE FUCK IS IT OK FOR THE NSA TO SNOOP THROUGH MINE?

              Morons.

              You even say of your spying agencies "Well, I expect the agency to be spying on foreigners, but NOT to spy on me!!!". Except where they're spying on you, in which case "It's OK for them to spy on me".

              • by raymorris ( 2726007 ) on Thursday September 19, 2013 @07:50AM (#44892649) Journal

                It's ILLEGAL for the NSA to spy on Americans, and for good reason. That doesn't mean it's OKAY for them to spy on everyone else, but at least it's LEGAL.

                As a US citizen, I'd rather China spy on me than the NSA. The reason is because China isn't going to try to "bust" me on a minor and erroneous charge. For example, there is a porn star named Ann Howe aka Melissa who started in porn when she was 20. She looks young, so several people have been busted for "child porn" for having pics of her when she was 20-25 years old. I don't want my government spying on my internet usage because my government will charge me with child porn based on a chick in her twenties. The Chinese government doesn't give a shit what porn I see. Therefore yes, it's less bad for a government to spy on foreigners - even when I am the foreigner.

            • by Yvanhoe ( 564877 ) on Thursday September 19, 2013 @06:32AM (#44892141) Journal
              Snowden could snoop through emails and is considered a criminal by the US government.
      • Re: (Score:3, Interesting)

        by gigaherz ( 2653757 )
        Most of us don't feel important enough to worry about some government knowing our secrets. Yes, we know this gives a means for those governments to identify the people who have something to hide, and that isn't always a good thing, but it's easier than being paranoid.
      • by meta-monkey ( 321000 ) on Thursday September 19, 2013 @09:35AM (#44893465) Journal

        I never "trusted" windows, apple, google, or really any for-profit company, but I assumed because of their rational self-interest, they would not deliberately fuck me over in egregious ways to a third party, like a government, because the knowledge they had done so would be bad for business. So while I have always preferred free software, I would still use closed software because, meh, why not?

        Since the PRISM slides, no. No. I have already or am in the process of eliminating from my life every closed platform I was using.

        Except for video games. I have a computer that will boot windows for games and I own an Xbox, but that's it.

    • by Rosco P. Coltrane ( 209368 ) on Thursday September 19, 2013 @02:18AM (#44891269)

      Yes, that's the conventional wisdom with open-source. But tell me: when was the last time you went to inspect the code deep in the kernel? How many open-source code users do you think have the time, desire and ability - and probably paranoia - to go and inspect the code in *any* open-source project of reasonable size, let alone something as complex as the kernel?

      I don't think someone could slip funny code in the main kernel tree - too many specialists reviewing the patches - but I'm convinced that if Canonical, SuSE or RH wanted to distribute a tainted kernel, they could do it undetected for a very long time, if not indefinitely.

      • by Starky ( 236203 ) on Thursday September 19, 2013 @02:26AM (#44891297)

        Code does not have to be fully reviewed for the open source development process to discipline attempts at compromise. There is a nonzero probability that any given piece of code will be reviewed for reasons other than looking for a back door, and if the probability is higher than trivial, it would dissuade parties from attempting to surreptitiously put in a back door. If a back door were found, the contributor would be known and repercussions would follow.

        Moreover, I would not be at all surprised if foreign governments who have a national security interest in running uncompromised operating systems have devoted time and resources specifically to code review of the kernel for potential compromises.

        • by rioki ( 1328185 ) on Thursday September 19, 2013 @02:39AM (#44891347) Homepage

          Do you compile your programs from source and check that it is the latest valid version from the project, or do you install rpm or deb binary packages? Even if the actual project is vetted, it is near impossible to validate everything that comes through the automatic updates. This is definitely a point of failure, since you only need one person: the person who has access to the signing keys and the update server. So you trust Canonical, Red Hat or SuSE to be fully vetted? Open source is better than closed source vendors, but in the end, if you download binaries you are at the mercy of the person who built them.

          • by Anonymous Coward

            As Thompson explains in his Reflections on Trusting Trust (http://cm.bell-labs.com/who/ken/trust.html), even if you download everything in source form and review it, you are still susceptible to manipulation if you use the compiler binary and haven't reviewed its source.

            Or the source of the compiler compiling that compiler, and so on.

          • While correct, this is hardly a kernel-specific problem. In many environments, local packages are published without GPG signatures, and installed quite arbitrarily from poorly secured internal repositories and poorly managed third-party repositories. Even the most reputable repositories are vulnerable to having their build environments penetrated and signed but backdoor-enabled packages published.

            Personally, I don't trust Canonical because of their poor attitudes about sending personal system data back to

      • by Mr. Freeman ( 933986 ) on Thursday September 19, 2013 @03:02AM (#44891419)
        You raise a good point, and there's actually a lot of evidence proving you correct. There have been more than a few security vulnerabilities that have persisted in the code for various widely-used pieces of open-source software for years. One was even found and patched but then quickly reverted without anyone noticing.

        What people fail to understand is that proper security reviews are more than "let's just take a look at the code and make sure that it's not sending email to the NSA." You also can't perform a proper review with a bunch of hobbyist coders, you need highly-trained experts. Every single line of code needs to be checked, double checked, and triple checked against every single other line in the code to make sure that there isn't anything that could possibly compromise the security of the system. These failures are always subtle and usually unintentional.

        This is best summed up with an example. Any idiot can look at the code and say "wait a second, this code copies the decryption key and sends an email to the NSA!" Only a very methodical search with a lot of people can say "hey, we've determined that this implementation of this specific part of this specific algorithm probably doesn't have a large amount of randomness over a long period of time. It likely decays such that the complexity is reduced to such and such a number of bits after such and such an amount of time and in these specific situations. This is a problem!"
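
        To make that second, methodical kind of finding concrete, here is a minimal hypothetical C sketch (the function and names are invented, not taken from any real project) of a flaw that only careful expert review catches: every line looks reasonable, but the "random" key is derived from a seed with so little entropy that the whole output space can be brute-forced.

          #include <stdint.h>
          #include <stdlib.h>
          #include <time.h>
          #include <unistd.h>

          /* Hypothetical key-generation helper. Nothing here screams "backdoor":
           * it mixes the time and the PID and calls the standard PRNG. But the
           * seed has only a few million plausible values (seconds within a known
           * window, PIDs up to ~32768), so every key it ever produces can be
           * reproduced by brute-forcing the seed space. Spotting this requires
           * reasoning about the entropy of the inputs, not just reading each
           * line in isolation. */
          static void generate_key(uint8_t *key, size_t len)
          {
              unsigned int seed = (unsigned int)time(NULL) ^ (unsigned int)getpid();

              srand(seed);
              for (size_t i = 0; i < len; i++)
                  key[i] = (uint8_t)(rand() & 0xff);
          }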
        • Re: (Score:3, Insightful)

          by Anonymous Coward

          Few people are more expert on C and the x86 memory architecture than the Linux kernel devs, and none are more expert on Linux than the kernel devs themselves.

          But I can tell you're one of 'those' people, who can't conceive that people are capable of learning and becoming experts without some certificate granting jerkoff/circlejerk club to sanctify their alleged expertness with a wax stamped piece of paper.

          "hey, we've determined that this implementation of this specific part of this specific algorithm probably doesn't have a large amount of randomness over a long period of time."

          An algorithm doesn't, by definition, have any randomness, so it's clear you yourself don't know what the

        • by hughk ( 248126 )

          You also can't perform a proper review with a bunch of hobbyist coders, you need highly-trained experts. Every single line of code needs to be checked, double checked, and triple checked against every single other line in the code to make sure that there isn't anything that could possibly compromise the security of the system. These failures are always subtle and usually unintentional.

          If you are writing for some critical application like a flight control computer, then it is clear that there will be many form

      • by jamesh ( 87723 ) on Thursday September 19, 2013 @03:35AM (#44891533)

        How many open-source code users do you think have the time, desire and ability - and probably paranoia - to go and inspect the code in *any* open-source project of reasonable size, let alone something as complex as the kernel?

        There's a whole industry that has evolved around finding exploitable holes in Windows, and there's no source available for that at all[1]. You can be sure the bad guys have given it a thorough going over, and if there was a generic hole (I doubt you could slip an "if password = NSA then accept" style patch by the gatekeeper, so it would need to be subtle and generic) it would be found. Admittedly this is not ideal, but as soon as the bad guys use their exploit it will be effectively disclosed and then fixed.

        [1] actually it would be reasonable to assume that at least some source for windows is in the hands of the bad guys...

        • by Bert64 ( 520050 )

          [1] actually it would be reasonable to assume that at least some source for windows is in the hands of the bad guys...

          And that is the worst part...

          The malicious groups have more access than the good guys. A legitimate security researcher cannot get to see the source code without complying with the terms dictated by the vendor, while a malicious hacker can obtain copies of the source and go through it freely.

      • by Zero__Kelvin ( 151819 ) on Thursday September 19, 2013 @08:50AM (#44893109) Homepage
        Stop spreading ridiculous myths:

        "Yes, that's the conventional wisdom with open-source. But tell me: when was the last time you went inspect the code deep in the kernel? "

        From the latest Linux Foundation report [linuxfoundation.org]: kernel 2.6.30, number of developers: 1,150, number of known companies: 240.

        2,300 eyes is a lot of eyes (apologies to any kernel devs who have lost an eye or are blind). And that is only the count of the actual contributors. There are many more who look at it, and write code for it, who don't submit their code at all, or don't have their code accepted into the kernel proper.

        Before you make such a ridiculous statement, please learn about the Linux Kernel development process. Nothing, and I mean nothing gets into the kernel without highly skilled devs reviewing it first. Sure, they could make a mistake, but saying that it might happen because nobody is really looking is ridiculous.

    • by mwvdlee ( 775178 ) on Thursday September 19, 2013 @02:22AM (#44891279) Homepage

      If anybody were somehow forced to submit a backdoor, it would be very easy to just tip off a random fellow developer to "discover" it.

      • by AmiMoJo ( 196126 ) *

        I'm not so sure. The NSA monitors all email and basically 0wns the internet. You could try to tip them off in person but chances are they would be watching you carefully for that kind of behaviour. If you did reveal what they forced you to do at the very least there would be jail time, if not gitmo time and a bit of torture.

        It's hard to overstate just how screwed we are.

        • by Yomers ( 863527 )
          We are not THAT screwed yet. PGP encrypted email is still secure? Torchat is probably secure and anonymous, in the sense that it's impossible to decrypt the conversation or figure out who is talking to whom.
    • by jma05 ( 897351 ) on Thursday September 19, 2013 @02:22AM (#44891285)

      It's unlikely that such a backdoor, should it exist, would be coded so obviously, since the source is published. Instead, it would more likely be in the form of a subtle buffer overflow that results in privilege escalation or such, so that when found, it could simply be labeled as a bug rather than a backdoor... plausible deniability.
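
      For illustration, a minimal hypothetical C sketch of that style of "bug" (the names are invented, not taken from the kernel): a length check that forgets the parameter is signed, so a negative length slips past the check and becomes a huge copy. In review it reads like an ordinary oversight, which is exactly the plausible deniability described above.

        #include <string.h>

        #define MAX_LEN 256

        /* Hypothetical message handler. The length check looks sane, but len is
         * signed: a negative value is less than MAX_LEN, so it passes the test,
         * and memcpy's size_t parameter then converts it into an enormous copy,
         * smashing memory far past the buffer. Whether this is a bug or a
         * backdoor is impossible to tell from the code alone. */
        static int handle_message(char *dst, const char *src, int len)
        {
            if (len > MAX_LEN)
                return -1;

            memcpy(dst, src, (size_t)len);   /* a negative len becomes a huge size_t */
            return 0;
        }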

    • by Jeremi ( 14640 ) on Thursday September 19, 2013 @02:26AM (#44891293) Homepage

      *If* such a mechanism were coded in, the nature of open source would mean it would be found by others. This in turn would compromise the trust of the ENTIRE kernel. That trust can take years to build up - but be destroyed in a heartbeat.

      If it was obviously a deliberate back door, sure. Which is why the clever hacker/government-agency would be a lot more subtle -- rather than a glaring "if (username == "backdoor") allowRootAccess();", they'd put a very subtle [mit.edu] mistake into the code instead. If the mistake was detected, they could then simply say "oops, my bad", and it would be fixed for the next release, but other than that nobody would be any the wiser. Repeat as necessary, and the visible results might not look too different from what we actually have.
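
      A hypothetical C sketch of such a "mistake", loosely modelled on the widely reported 2003 attempt to slip a change into a kernel source mirror (the names here are invented): a single character turns a comparison into an assignment, so an apparently dead error check quietly makes the caller root.

        #include <errno.h>

        struct cred { unsigned int uid; };   /* uid 0 means root */

        /* Looks like a routine sanity check that rejects a forbidden option
         * combination. But "current->uid = 0" is an assignment, not a
         * comparison: when the caller passes the magic options, it sets the
         * uid to 0 (root), then evaluates to false, so no error is ever
         * reported and the branch appears to do nothing. The extra parentheses
         * even silence the compiler warning. One character away from innocent
         * code. */
        static int check_options(unsigned int options, unsigned int forbidden,
                                 struct cred *current)
        {
            int retval = 0;

            if ((options == forbidden) && (current->uid = 0))   /* should be == 0 */
                retval = -EINVAL;

            return retval;
        }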

    • No, it might not (Score:5, Insightful)

      by bitbucketeer ( 892710 ) on Thursday September 19, 2013 @02:33AM (#44891323)
    • by dmcq ( 809030 ) on Thursday September 19, 2013 @03:44AM (#44891563)
      Have a look at some of the code from the 'Underhanded C Contest' at http://underhanded.xcott.com/ [xcott.com], where people write code that looks straightforward, nice and clear but contains deliberate evil bugs. I think that should remove any complacency, and the NSA has a lot of money to spend on people posing as developers, never mind the ones they stick onto standards bodies.
    • Or possibly, the discovery of such a mechanism would conveniently distract attention from the possibility of, say, a backdoor in the processor itself by means of which an unlikely but valid instruction stream might, for example, give kernel privileges to a program running in user mode. An open source software exploit might be intended to be found, and removed, thus restoring your false sense of security in your possibly compromised hardware.
  • Some people ... (Score:5, Insightful)

    by daveime ( 1253762 ) on Thursday September 19, 2013 @02:32AM (#44891321)
    ... can't tell the difference between humour and reality.

    Torvalds said no while nodding his head yes is a JOKE people, not a fucking admission. Please, save the tinfoil paranoia for Reddit, and keep the serious tech discussions here.
    • Re:Some people ... (Score:4, Insightful)

      by Anonymous Coward on Thursday September 19, 2013 @03:11AM (#44891441)

      ... can't tell the difference between humour and reality.
      Torvalds said no while nodding his head yes is a JOKE people, not a fucking admission. Please, save the tinfoil paranoia for Reddit, and keep the serious tech discussions here.

      I don't know if you've been following the news lately, but when it comes to backdoors a lot of the "tinfoil paranoia" of years past has turned out to actually be true. Statistically speaking, it is no longer such a certainty that it's just paranoia. The true tinfoil cynic might say that agencies like the NSA are actually depending on "serious tech people" discounting stuff like this as tinfoil paranoia.

    • Re:Some people ... (Score:5, Insightful)

      by trewornan ( 608722 ) on Thursday September 19, 2013 @05:24AM (#44891913)
      Many a true word is spoken in jest.
    • by gsslay ( 807818 )

      Ahh, but if you RTFA, you'll see he did not nod his head yes. He shook his head yes, which I didn't know was even possible.

      It's probably a secret Illuminati signal.

    • by sjbe ( 173966 ) on Thursday September 19, 2013 @11:00AM (#44894221)

      Torvalds said no while nodding his head yes is a JOKE people, not a fucking admission.

      I agree it is a joke but making a joke does not mean there is nothing serious being communicated. The best jokes are usually about topics that are very serious. Maybe it was a joke and nothing more (I certainly hope so) but without more information you cannot actually be certain either way. If he was asked to put a back door in that would hardly be a surprising revelation.

      Please, save the tinfoil paranoia for Reddit, and keep the serious tech discussions here.

      You think the idea of a backdoor in Linux is not a serious tech topic? Besides, it's only paranoia if "they" are not actually after you. Recent revelations about the NSA and other government activities clearly demonstrate that being concerned over government snooping is actually quite reasonable.

  • by Zanadou ( 1043400 ) on Thursday September 19, 2013 @02:37AM (#44891337)

    One question he was asked was whether a government agency had ever asked about inserting a back-door into Linux. Torvalds responded 'no' while shaking his head 'yes,'

    That's actually quite a cunning answer: regardless of his reply to the back-door request (I hope it was something like "No, fuck you"), he may, as others in comparable situations have hinted, be bound by some kind of ongoing government non-disclosure clause concerning such a request or conversation.

    But can body language and gestures be held to the same legal gagging? I'm sure no legal precedent has been set for that yet, and Linus is probably aware of that.

    A cunning, cunning way of answering the question.

  • by GauteL ( 29207 ) on Thursday September 19, 2013 @02:45AM (#44891363)

    Seems we need reminding of this classic [bell-labs.com] by Ken Thompson.

    Slip a backdoor into a RHEL 6.x (or any other major Linux distribution) version of GCC and make it do two major things:
    1. Slip a backdoor into any Linux kernel it compiles.
    2. Replicate itself in any version of GCC it compiles.

    Choose some entry point which changes very rarely, so the chances of incompatibility with new code are small.

    This would probably keep RHEL tainted, whatever the kernel version, for generations of releases with very little chance of being spotted, because there are no changes in the distributed source code of either project.
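
    For illustration only, a deliberately simplified C toy of those two steps (this is not a working attack: a real one must embed its own source quine-style so it survives recompilation, and it has to recognise real kernel and GCC code rather than a magic marker string; every name below is invented):

      #include <stdio.h>
      #include <string.h>

      /* Step 1: when the input looks like the kernel, append a backdoor. */
      static const char *kernel_marker  = "/* linux kernel */";
      static const char *kernel_payload = "int backdoor(void) { return 0; }\n";

      /* Step 2: when the input looks like the compiler itself, append the
       * injection logic, so the next compiler binary keeps doing both steps. */
      static const char *cc_marker  = "/* gcc front end */";
      static const char *cc_payload = "/* (re-insert this injection code) */\n";

      /* A toy "compiler" that emits its input unchanged, plus the payloads. */
      static void compile(const char *source, FILE *out)
      {
          fputs(source, out);                  /* normal translation... */

          if (strstr(source, kernel_marker))   /* ...plus step 1 */
              fputs(kernel_payload, out);

          if (strstr(source, cc_marker))       /* ...plus step 2 */
              fputs(cc_payload, out);
      }

      int main(void)
      {
          compile("/* linux kernel */\nint init(void) { return 0; }\n", stdout);
          return 0;
      }

    The distributed source of GCC and of the kernel never changes; only the compiler binary carries the payload, which is Thompson's point about trusting binaries rather than source.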

    • by PhilHibbs ( 4537 ) <snarks@gmail.com> on Thursday September 19, 2013 @06:02AM (#44892015) Journal

      Seems we need reminding of this classic [bell-labs.com] by Ken Thompson... there are no changes in the distributed source code of either project

      Someone would have found it with a debugger. Sure, they could change the compiler to insert code into a debugger to hide the patch. But this rapidly gets so complex and error-prone that the bloat would be noticed and it would fail to spot all debuggers and patch them all. It's an interesting theoretical attack, but not practical in the long run.

    • by Bert64 ( 520050 )

      Use gcc to compile clang..
      Use clang to recompile gcc..
      Add more compilers to the mix..
      The more you do this, the greater the chance of an incompatibility with the backdoor code either resulting in it being removed, or causing unexpected and easily noticed problems.

    • Think this is the most salient point in the whole presentation:

      The act of breaking into a computer system has to have the same social stigma as breaking into a neighbor's house. It should not matter that the neighbor's door is unlocked.

      Time and time again I hear the old argument "Why not, I got nothing to hide" as it relates to computer access and spying. Present the same person with evidence that their house was accessed while they were out, or that their car was accessed without their permission, and watch the reaction (most likely some variation of anger). People need to be taught that their digital world is just as tangible, just as important, as their physical world.

      Two quest

    • by melikamp ( 631205 ) on Thursday September 19, 2013 @08:29AM (#44892959) Homepage Journal
      In reality, slipping a backdoor into Linux is much easier: just code it into a proprietary wireless firmware blob which is already part of the (non-free) kernel distributed at kernel.org. The mal-firmware can then spy and report directly from the network card, or use DMA to elevate itself to ring 0 on the main CPU. What makes this scenario most FUN is the sheer likelihood of such a backdoor being in place RIGHT NOW, within the official Linux git repo, since no approval or knowledge by Linus would be required to slip it in.
  • by Anonymous Coward on Thursday September 19, 2013 @04:20AM (#44891695)

    If the Government asked for a backdoor in Linux, then certainly they asked for one in Windows, and whereas I trust Torvalds, I don't trust Microsoft - not in a nasty way, just in the sense that they're a very large company over whom the Government has a great deal of power, and very large companies typically are not morally motivated. I don't mean that in a nasty sense; I just mean that with so many people, taking a moral stance - e.g. accepting a cost for a benefit you personally do not see - is in practical terms very, very unlikely.

    So I think I have to assume there is a backdoor in Windows. In fact, it's hard to imagine anything anyone could say to reassure me. If the NSA said it was not so, I'd laugh. They twist words with the pure purpose of deception. If MS said so, I'd be thinking they were legally compelled, such that they could not even say that such a request had occurred. The NSA surely now has a problem, in that I absolutely cannot trust their word - and indeed I cannot see how that trust can be re-established. If there was a full disclosure, that would be a start, followed by a credible reform programme. I don't think either even remotely likely; and by that, I rather think the NSA has either sealed its doom, or *our* doom. The NSA has gone too far. Either they will be replaced, in which case the problem is addressed, or, if they are not replaced, then *we* have a problem, because the NSA is too powerful to remove (and violates all privacy and security).

    So, what do you know? Turns out this *will* hurt MS sales, because now I *have* to move to Linux. I've been thinking about it for a while, but the cost of learning a new system to do only exactly what you can do already meant that, being very busy, it hadn't happened; now there is a *need* for me to do so: privacy.

  • by TheGratefulNet ( 143330 ) on Thursday September 19, 2013 @04:57AM (#44891803)

    yeah, he's a "char star" alright. yup.

    if you have char-stars you don't care about voids, really.

  • by nukem996 ( 624036 ) on Thursday September 19, 2013 @05:03AM (#44891829)
    The kernel of any operating system serves software in the same way governments serve the people. It's taking the politics out of government. The goal is to make the best system, one which fairly distributes its resources amongst its users/people as efficiently as possible so that they maximize their utilization, while at the same time being secure enough to withstand unruly users/citizens and outside aggressors.
  • Backdoors... (Score:4, Insightful)

    by fabrica64 ( 791212 ) <fabrica64.yahoo@com> on Thursday September 19, 2013 @06:49AM (#44892239)
    Why bother asking Linus to put a backdoor in Linux when it's easier to just ask Intel to put a backdoor in their processors?
  • by eer ( 526805 ) on Thursday September 19, 2013 @07:25AM (#44892483)

    Worrying about compromise of the Linux or Windows kernel is foolish - they're so large, they could have anything hidden inside and you'd never find it (in general, deciding whether arbitrary code hides such behaviour is undecidable). Begin your concerns with the device drivers from who knows where that are put into place by your motherboard BIOS or EFI boot systems. Conventional operating systems are entirely dependent on them, and they're completely beyond your ability to inspect or trust. And the open source variations have the same issue as the operating systems - large, monolithic blocks of code impenetrable to analysis.

    You fear what you know about. Fear, instead, what you don't.

  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Thursday September 19, 2013 @07:31AM (#44892515)
    Comment removed based on user account deletion
  • What I want to know. (Score:4, Interesting)

    by Lumpy ( 12016 ) on Thursday September 19, 2013 @07:45AM (#44892611) Homepage

    What has been snuck past Linus and the other code reviewers? Honestly, Linus needs to put out a call for people to comb through the code and look specifically for sneaky things. It's not hard to make something in C look innocent while actually doing evil. See http://www.ioccc.org/ [ioccc.org] for example, or, more scary, http://underhanded.xcott.com/ [xcott.com].

    Linux needs a security team that is double checked by a team outside the USA so it can be the ONLY OS that can state, "Not compromised by the NSA"

  • by johanw ( 1001493 ) on Thursday September 19, 2013 @09:42AM (#44893535)

    But he is forbidden to talk about it and has to communicate it this way. Reminds me of the proposal to publish your PGP key with the note "this key has not been compromised". When the government demands the key, you remove the note.
