
 



Security Software Linux

Open Source a National Security Threat 921

n3xup writes "Dan O'Dowd, CEO of Green Hills Software, suggests that open source software is vulnerable to sabotage by foreign developers and should not be used for U.S. military or security purposes. He likened Linux to a Trojan horse: free, but in the end a lot of trouble. O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding."
This discussion has been archived. No new comments can be posted.


  • by RLiegh ( 247921 ) on Tuesday July 27, 2004 @10:51AM (#9811406) Homepage Journal
    While he has some great points, I think it's unlikely that al Qaeda would be able to plant a debilitating bug - much less a backdoor or other serious malware (mal-feature?) - in anything that we have the NSA look over.

    So that puts it down to Osama Bin Laden doing his best to fuck up linux, and only succeeding in placing a few periods where commas should be in the documentation. Yeah, that's worth his time and trouble. Ya sure Ya betcha.

  • by abrotman ( 323016 ) on Tuesday July 27, 2004 @10:53AM (#9811445)
    Isn't that the whole idea of open source? The guys working for our government can see the source code. Either this guy is clueless or working for someone with a vested interest.

    On the flipside, what is to prevent our government from doing the same thing? If the "enemy" can insert malicious code, why can't our government?
  • NSA (Score:3, Interesting)

    by codepunk ( 167897 ) on Tuesday July 27, 2004 @10:55AM (#9811480)
    One word: NSA. If it were so bad, the NSA would not have its own version.
  • by Jim_Hawkins ( 649847 ) on Tuesday July 27, 2004 @10:58AM (#9811524)
    Hate to break it to you, but there are a lot of places besides good ol' Osama that would *love* to have US information. These other governments have money. They have the resources to hire someone to insert this code into any open source project.

    As for the NSA inspecting this code - that's all well and good. But how often do hundreds of individuals look over open source code and still miss a bug for a while? "A while" is all it takes for a foreign government to download A LOT of information that they shouldn't have.

    Contrary to popular belief, a lot of places do not like America. It's not the big lovable teddy bear that it likes to think it is. It's a great country, but it should do everything it has to do to protect itself.
  • by Anonymous Coward on Tuesday July 27, 2004 @10:59AM (#9811538)
    and the flipside.

    it's just as easy for an unscrupulous person to get a job at xyz software company and put that "feature" in too.

    it's already happened (not terrorism, but that's why i said unscrupulous (and probably spelled it wrong too))

    not to mention the NSA at least has the option of a source audit. i doubt they could ever fully audit WinXP, even if they had access to the source.

    not to mention, guns and bombs are what terrorists use, not software. they want carnage, not financial problems; people tend to forget that. terrorism is about casualties, not just causing major problems.
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Tuesday July 27, 2004 @11:00AM (#9811546) Homepage Journal
    Green Hills is a failing company that is seeing its market go to Open Source. In contrast, Wind River, which is in the same market with the same customers, embraces Linux.

    The fact is that Green Hills products are no more secure, and may well be less secure, because they don't have the "many eyes" looking at their source code. We've had trojan horse attempts in Open Source software. They get caught quickly. But even if the source is disclosed, nobody outside of their tiny company has an incentive to do productive work on the internals of a Green Hills operating system in the way that people who modify GNU/Linux do. And security audits by such a small company can't catch everything.

    The best example of this has been the Borland Interbase database. This was used for airline reservations, and had a trojan horse buried in it for 6 to 9 years while it was a proprietary product. The door could have been found by anyone who did an ASCII dump of the product, but those who did kept it secret, and probably took a lot of free flights. An Open Source coder found the door some months after the database went Open Source, and had an incentive to report it - at that point he was one of the people doing productive work on the database and only wanted it to work better and more securely.

    This "black hats" (people who are motivated for bad purposes) vs. "white hats" (good purpose) phenomenon is important to consider when you evaluate the security of Open Source. Generally the only people who would look for vulnerabilities in proprietary software, outside of its manufacturer, are looking to exploit them! This is hardly the case with Open Source.

    Thanks

    Bruce
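Bruce's Interbase example above - a hard-coded account that an ASCII dump of the binary would have revealed - can be sketched with a minimal strings-style scanner. The blob below is an invented stand-in, not the actual Interbase binary; the embedded text merely echoes the real backdoor's "politically"/"correct" credential pair, which really was discoverable as printable text:

```python
import re

def ascii_strings(data, min_len=6):
    """Return runs of printable ASCII characters, like the Unix `strings` tool."""
    return re.findall(rb"[ -~]{%d,}" % min_len, data)

# A fake "binary": mostly opaque bytes with an embedded credential.
blob = b"\x7fELF\x00\x01" + b"politically:correct" + b"\x00\xff\x90" * 8

for s in ascii_strings(blob):
    print(s.decode("ascii"))  # prints: politically:correct
```

Anyone who ran such a dump over the proprietary product could have found the account; the point of the story is who had an incentive to report it.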

  • by bit01 ( 644603 ) on Tuesday July 27, 2004 @11:01AM (#9811576)

    (who knows what M$ + NSA put in the closed windows source that might hurt other nations)?

    Cryptographic code [heise.de] for a start.

    ---

    It's wrong that an intellectual property creator should not be rewarded for their work.
    It's equally wrong that an IP creator should be rewarded too many times for the one piece of work, for exactly the same reasons.
    Reform IP law and stop the M$/RIAA abuse.

  • by torpor ( 458 ) <ibisum@ g m a i l . c om> on Tuesday July 27, 2004 @11:04AM (#9811608) Homepage Journal

    I'm a developer working for a relatively successful hardware company in a non-U.S. land, and I have every intention of hiding all sorts of stuff in any Open Source code I may (or may not, that's freedom) contribute to! :P

    Whether what I hide will be nefarious is one thing, whether or not Easter Eggs can still exist on Open Source Island is another thing entirely ...
  • Pure FUD, indeed (Score:4, Interesting)

    by krygny ( 473134 ) on Tuesday July 27, 2004 @11:04AM (#9811614)
    Well, he said it all, so it must be true - even though he backs it up with nothing. This is so wrong on so many levels I don't even know where to begin. His assertions are hardly worth addressing. Therefore, pure FUD.

    Ok, I'll bite just once: I doubt there is a single weapon system procured by the DoD in the last 10 years that does not have a substantial portion of it outsourced overseas. Most procurements now require some percentage of it, by contract.
  • by foidulus ( 743482 ) * on Tuesday July 27, 2004 @11:07AM (#9811673)
    Ebola isn't a very good analogy; maybe HIV is a better one. Even though Ebola can spread more easily than HIV, the fact that it kills its host rather quickly means that it doesn't have nearly the number of human infections that HIV has. A virus that kills its host off too quickly tends not to be able to spread. HIV is a better analogy: it seems to be almost 100% fatal right now, but because it takes so long for symptoms to appear, people don't realize they have it and spread it to others.
  • by Anonymous Coward on Tuesday July 27, 2004 @11:08AM (#9811677)
    Can you honestly tell me that the government is going to hire a panel of people to do in-depth checks of source changes on OSS projects? People familiar enough with the code that they can catch an exploit that may take only 3-4 lines of code to pull off?

    The DoD? Sure, why not. But who is going to audit and check every change to a closed-source, proprietary system? The developers? Why should the government trust them over a different group of developers?
  • by Altus ( 1034 ) on Tuesday July 27, 2004 @11:08AM (#9811679) Homepage

    that's why you do testing and code reviews. it's not like these people are downloading new kernels in the field; any code that goes into a government project requires immense testing and code review... PERIOD. i don't care who wrote it.

    if the military wanted to use open source software, they would likely take the source and lock it down, producing a branch for themselves that would be secured and standardized after a large review. if they wanted to bring in new functionality from the "public" branch, it would mean a new version of their "secure and approved" branch, which would have to go through the same review process again.

    it's not like they don't have to do this anyway with the code they produce now... sure, they aren't expecting people to try and sabotage them, but you can do that without intent simply by making a coding error. testing & code review is essential to the process.

    this isn't that much different from what the military does with hardened versions of commercial processors... sure, they lag behind their commercial counterparts because they have to be hardened and tested heavily, but then they work, and they are able to leverage the initial design work and testing done when the hardware was being developed for commercial purposes.

  • by rlp ( 11898 ) on Tuesday July 27, 2004 @11:09AM (#9811697)
    At least OSS lets the prospective user review the source code. U.S. companies are rapidly outsourcing proprietary development to foreign countries. Key infrastructure software (and firmware) is being developed in countries such as mainland China (including code used for the U.S. telecom system). Meanwhile, the U.S. military is rapidly adopting off-the-shelf components to reduce costs. But, by all means, let's ignore this and concentrate on OSS ...
  • FUD slinging (Score:2, Interesting)

    by HexRei ( 515117 ) on Tuesday July 27, 2004 @11:10AM (#9811710)
    "O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding."
    Mere speculation, and certainly no more valid than the PROVEN cases of bugs left inside closed code that aren't found until they're exploited.
  • by D3 ( 31029 ) <daviddhenning@gma i l .com> on Tuesday July 27, 2004 @11:10AM (#9811728) Journal
    The NSA already produces its own version of secure Linux. It wouldn't surprise me one bit if they check that code very carefully. I doubt they just grab a copy of the Red Hat ISO images and lock down the startup files.

    Also, your code would have to be integrated deeply enough into the calculations to misfire only when aimed at a certain target, or to misfire at a set percentage. If the misfires were too high, they wouldn't sign off on the weapon.
  • by EXTomar ( 78739 ) on Tuesday July 27, 2004 @11:18AM (#9811834)
    If you were a paranoid Iranian or North Korean computer user looking at Microsoft Windows, would you think the same thing? Heck, why wouldn't a Chinese user think that MS and the NSA/CIA/alphabet soup are trying to snoop on them? Because MS allows a select group to look at their source?!?

    At least with Open Source you ultimately have the source to check for yourself. Vendors like Novell, IBM, and Red Hat are supposed to be actively looking at the source to make sure no one is slipping in stuff that doesn't belong, but if you don't believe them, you can do it yourself.

    So you have a Mr. Dan O'Dowd trying to pin a terrorist ghost threat on Open Source. The problem is that the source is there for you to inspect. With Microsoft, the only word you have is their word that they aren't monkeying with the OS to monitor you.

    IMHO, BSD and Linux are perfect for Military and security applications. You can inspect every corner of the kernel. You can freeze on a specific version because you always have that source code. You can branch and patch as you see fit. This seems perfect for the military and security branches. With Microsoft you have to "signup" (how much money does it cost to do that?) to view the source and then what? The only proof you have is that this particular version of Windows hasn't been monkeyed with. What about the patches and hotfixes? *shrug*

    When it really boils down to it are you going to believe the source you compiled, you control yourself or Microsoft? I think Mr. O'Dowd's trust is ill placed.
  • by FWMiller ( 9925 ) on Tuesday July 27, 2004 @11:20AM (#9811865) Homepage

    I'm a long-time Linux user and have been around open source for a long time. While the source of this article is obviously questionable, I work for a defense contractor, and I'm here to tell you the points raised in the article have some truth to them.

    If you're selling products to the govt and those products use an operating system, the issue of being able to GUARANTEE that your code base has not been and cannot be tampered with is very real. Everyone has (or should have) seen the techniques for obfuscating trojan horses by way of a compiler or some other tool, which make this problem even harder.

    The problem being alluded to here is a chain of control over a code base that can be demonstrated to a DoD or other govt customer's satisfaction. While no process can ever be completely secure, the real point is this: if you have a choice between a system developed in a closed environment, where you can keep an eye on everyone involved, and an open-source development effort, the former is easier to verify. You can call it FUD, but this is a real issue within govt circles and WILL limit the use of Linux in certain applications.

  • by wurp ( 51446 ) on Tuesday July 27, 2004 @11:20AM (#9811869) Homepage
    If you think that it's as hard to check code for correctness as it is to write the code in the first place, you can't be a developer, or you're not thinking clearly.

    Writing code is one of the classic "genius" kinds of activities - it's like an NP problem, where finding the right answer is *immensely* harder than recognizing the right answer when you see it. A good design jumps out at you when you see it, but finding it from scratch can take a long time. It's like finding the answer to a riddle, or factoring a large number: when you have the answer, it's obvious; if you don't, it can seem impossible.

    So even if they have to evaluate the whole body of the code initially, then all diffs with every revision after that, they've made an immense gain to use open source versus building it over again themselves.
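The generate-vs-verify asymmetry above can be made concrete with the comment's own factoring analogy: finding a factor takes a search, while checking a claimed factorisation is a single multiplication. A small sketch (the two primes are arbitrary choices for the example):

```python
def find_factor(n):
    """Find a nontrivial factor by trial division: the slow, 'creative' direction."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n  # n is prime

def check_factors(n, p, q):
    """Check a claimed factorisation: one comparison and one multiplication."""
    return 1 < p < n and p * q == n

n = 104729 * 1299709        # product of two primes
p = find_factor(n)          # ~100,000 division steps to discover
print(p, check_factors(n, p, n // p))  # prints: 104729 True
```

Auditing a diff is the cheap direction of this asymmetry; writing the code in the first place is the expensive one.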
  • Amusing article (Score:5, Interesting)

    by u-235-sentinel ( 594077 ) on Tuesday July 27, 2004 @11:21AM (#9811884) Homepage Journal
    Even if Linux were as secure as Windows, Windows is the wrong benchmark. Defense systems should be held to a higher standard.

    As secure as Windows? He's kidding .. right?

    When I worked for the Air Force, they had several instances in which systems (desktops) were compromised. Various worms came out of the blue and just hammered their network. My systems running Linux noticed it immediately; in fact, I was told there was NO problem. After a few hours of watching the logs record attacks over and over again, I noticed a general email sent out to everyone explaining that there was a problem, with instructions provided.

    As secure as Windows? God I hope not!

    The Federal Aviation Administration (FAA) requires software that runs commercial (and many military) aircraft be approved as part of a DO-178B certification. DO-178B Level A is the highest safety standard for software design, development, documentation, and testing. It is required for any software whose failure could cause or contribute to the catastrophic loss of an aircraft.

    Several operating systems have been DO-178B Level A certified. Until Linux is certified to DO-178B Level A, our soldiers, sailors, airmen and marines should not be asked to trust their lives with it.


    If Linux isn't at this level, then what is the point of the article? Linux is certified for various things in the military. Whenever I stood up a server, I was asked what OS I would be running. Everyone was apprehensive that it would be Windows, which requires a whole heap of testing before it's allowed to run in production. As soon as I told security it was either Unix or Linux, they would sigh and tell me to go ahead. Much more confidence there :-)
  • For example, Microsoft let China view its source code.
    No, they let them view PART of their source code. Without compiling.

    Unless you have both the full source and the compiler/toolchain it was built with, a security audit is worse than useless, as you have no way of verifying your results.

    In such a case, WYSINNWYG (What You See Is Not Necessarily What You Get).

    For example: You get the full source and the toolchain. You do a build on the same platform using the same flags. Your final executable has a different md5 sum. You have to conclude that either the source or the toolchain you received is not identical to what was used for the original build.

    Without everything (full source, toolchain, build scripts and flags) you cannot verify that you even have the right source.
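The verification step described above - rebuild from the claimed source, then compare digests - is a few lines with Python's hashlib. The byte strings below stand in for real build artifacts; md5 is kept to match the comment, though sha256 would be the modern choice:

```python
import hashlib

def digest(data):
    """MD5 of a build artifact (the comment's choice; sha256 is preferred today)."""
    return hashlib.md5(data).hexdigest()

# Stand-ins for a vendor-shipped binary, our own rebuild, and a tampered copy.
vendor_build = b"\x7fELF...code...\x00"
our_rebuild  = b"\x7fELF...code...\x00"
tampered     = b"\x7fELF...code...\x01"  # a single byte differs

print(digest(vendor_build) == digest(our_rebuild))  # True: the builds match
print(digest(vendor_build) == digest(tampered))     # False: source or toolchain differ
```

In practice even this check assumes a deterministic (reproducible) build: the same source, toolchain, and flags must yield byte-identical output, which many 2004-era toolchains did not guarantee.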

  • NSA Linux (Score:3, Interesting)

    by failedlogic ( 627314 ) on Tuesday July 27, 2004 @11:26AM (#9811948)
    Yeah, Linux is so untrustworthy that the NSA chose to develop NSA Linux, with its own security extensions? What I'm getting at is that the government can make its own secure OSes by building on Open Source.
  • all in the family (Score:5, Interesting)

    by blooba ( 792259 ) on Tuesday July 27, 2004 @11:35AM (#9812073)
    both my father and i were DoD software engineers, me as a developer, and he as a tester. i do commercial stuff nowadays, and dad's retired. we both know, for a cold hard fact, that no national security- or defense-related software ever goes into production without passing the most rigorous reviews and testing, throughout its entire lifecycle. from functional descriptions, through design reviews, code walkthroughs and acceptance testing, everything is closely monitored and recorded.

    so, please explain to me again how open source terrorists are going to slip their malware under our noses?

  • by AK Marc ( 707885 ) on Tuesday July 27, 2004 @11:45AM (#9812237)
    So, someone could put in bugs or backdoors in Linux? That would never happen with, say, routers from the largest router company in the world. Oh, wait. It already has. Other countries are dumping Microsoft. Why? Because it is a closed source they can not look at that may pack bugs or backdoors placed by the US company to help the US.

    Of all the valid reasons to attack open source software, I can't imagine how they can imply that an unknown piece of code is more secure than a known piece of code, even if possible enemies are contributing to the open source (unless, of course, every programmer at Microsoft has been given the proper clearance for every level at which the OS they are working on will be used).
  • by pestie ( 141370 ) on Tuesday July 27, 2004 @11:50AM (#9812305)
    How carefully do you suppose those defense contractors screen their subcontractors in India?
  • by duffbeer703 ( 177751 ) on Tuesday July 27, 2004 @11:51AM (#9812332)
    Friendly fire is a fact of life for surface based anti-aircraft weaponry. Creating a spoof-proof friend or foe anti-aircraft system is a non-trivial problem.

    That's one of the reasons why the US has always focused on fighter aircraft at the expense of anti-air artillery and SAM systems.
  • by Rei ( 128717 ) on Tuesday July 27, 2004 @11:51AM (#9812334) Homepage
    I think the DoD's biggest fear concerning OSS is not that the software is too insecure, but that it is *too good* for something publicly available. If other countries can get all of the tools they need for a weapon apart from, say, a specific 1000-line guidance or control program, and can make any changes to the tools that they need, that gives them a *major* bonus. Let's not forget how hard our government has worked to stop the export of technology in general - including software - to countries deemed "enemies".
  • Scenario (Score:3, Interesting)

    by jhaberman ( 246905 ) on Tuesday July 27, 2004 @11:53AM (#9812356)
    I have no idea if something like this could be possible; I'm just playing devil's advocate here.

    Say the software was used for target calculations for an artillery piece. For the sake of argument, say some "rogue" developer has added a bit of code that takes the coordinates from the GPS in the unit and checks whether they are in, say, a Middle Eastern country. If so, the shells, which found the target just fine in testing, now miss the mark by 500 yards.

    Not paranoid, just askin.
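The scenario above fits in a handful of lines, which is exactly why it worries people. Everything here is invented for illustration - the function name, the bounding box (a fictitious stateside proving ground), and the offset:

```python
def fire_solution(lat, lon, true_range_m):
    """Return the range (metres) fed to the gun for a target at (lat, lon)."""
    # Hypothetical sabotage: behave perfectly inside the test range only.
    in_test_range = 33.0 < lat < 36.0 and -118.0 < lon < -114.0
    if in_test_range:
        return true_range_m        # passes every stateside acceptance test
    return true_range_m + 457      # ~500 yards off everywhere else

print(fire_solution(34.5, -116.2, 12000))  # 12000 -- correct during testing
print(fire_solution(33.3, 44.4, 12000))    # 12457 -- silently wrong in the field
```

The counter-argument elsewhere in this discussion applies: review of the diff that introduced the conditional, plus live-fire testing outside the "friendly" region, is what catches this class of sabotage, regardless of whether the code is open or closed.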
  • by GeckoX ( 259575 ) on Tuesday July 27, 2004 @11:53AM (#9812360)
    So that'll be the solution then, have Linus et al peer review all OSS code and problem solved?

    Just how much code do you think Linus is actually involved with? Really now. Last I checked, he especially, and most of the rest of the crew you mention, live and breathe in kernel land. Think they've been over the code in every module out there? Think they know the code for OpenOffice inside and out? Come now.

    Bottom line, if security is _that_ important, the code will be written and maintained IN HOUSE. Period. There is just NO viable alternative to this that is as secure, and even at that level (which is how this kind of thing is done NOW), there are usually many people working on any given coding project, broken into little bitty units that aren't useful on their own, and implemented in parallel by multiple developers so that cross-checking etc can be done to reduce the possibility of a mole actually being able to do any kind of damage.

    So, the bigger question is then deciding which projects it is acceptable to use OSS for, and which are not. I am quite sure I'll be modded into oblivion for saying this, but it is the blatant truth: OSS is NOT a silver bullet. It will NOT solve all of the worlds programming problems. And it is NOT appropriate for all situations.

  • > a less-than highly used library

    I think this is the tricky part here, though - will a little-known open source math library be used for, say, targeting software in an M109A6 howitzer? I think it's a bit unlikely.

    > How could you prove I set out
    > to intentionally make that error?

    The problem is that a typical open source project maintainer will not be concerned with _why_ a patch contains buggy code. Instead, he'll run the current suite of tests, they'll fail, and he'll reject the patch.

    > How long would one need to invest to
    > get the trust of a submitter?

    Of course, any system is fallible. But how long would it take for a person with bad intentions to be hired by a commercial company? Or, better yet, for a current employee to be bribed? That'd be a much faster route. And who would ever catch the hidden evil, since the code would only be readable by a select few?

    Open source processes aren't failsafe - but neither are commercial company processes.

    > How long would it take for a badly
    > intentioned person to take over as a
    > maintainer?

    Same as above, I feel.

    Incidentally, thanks for taking the time to respond to my post. This is a good issue to discuss, I think.
  • by Altus ( 1034 ) on Tuesday July 27, 2004 @11:57AM (#9812404) Homepage

    Is it possible that some known and trusted employee could make a subtle mistake that goes uncaught and results in a buffer overflow?

    the answer to that is an unqualified yes.

    military source goes through review and testing the likes of which are seen in few industries (perhaps medical/life-support systems... can't think of any others). this will always be the case, and in fact must be the case, no matter who is writing the code.
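Buffer overflows are C-specific, but the class of mistake described above - a subtle boundary slip made honestly by a trusted developer - looks much the same in any language. A hypothetical sketch of such an off-by-one, with the regression test that would reject the buggy patch:

```python
def moving_average(xs, window):
    """Average of each full `window`-length slice of xs."""
    # A buggy patch might write `range(len(xs) - window + 2)`: one extra,
    # short final slice -- exactly the boundary slip a reviewer can miss.
    return [sum(xs[i:i + window]) / window for i in range(len(xs) - window + 1)]

# The regression test that rejects the buggy version:
assert moving_average([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]
assert len(moving_average([1, 2, 3, 4], 2)) == 3
```

Whether the slip was malicious or accidental is invisible to the test; that is the parent's point about why review and testing are required no matter who wrote the code.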

  • by Sxooter ( 29722 ) on Tuesday July 27, 2004 @11:58AM (#9812414)
    Have you ever submitted code to Linus, Alan, et al.? They're total code nazis. Many, many patches never make it in because they're insecure, poorly indented, or poorly written.

    I.e. the kernel hackers don't just toss patches at the kernel and see what sticks, they review patches very carefully. And toss out a lot of them.
  • by Jayfar ( 630313 ) on Tuesday July 27, 2004 @12:02PM (#9812466)
    Is this a dup? I dunno, but Green Hills FUD was discussed on groklaw at great length [groklaw.net] over 3-1/2 months ago.
  • Wait a second! (Score:2, Interesting)

    by shotro ( 779786 ) on Tuesday July 27, 2004 @12:02PM (#9812467) Journal
    Doesn't Microsoft outsource to other countries? I wouldn't be too surprised if Green Hills does as well. WTF? What a hypocrite. Wouldn't that be putting our security at risk, considering we all know that Microsoft surely doesn't check its code as well as hackers do?
  • by Anita Coney ( 648748 ) on Tuesday July 27, 2004 @12:02PM (#9812471) Homepage
    Microsoft has been showing its source code to governments and corporations for the last few years. China is one example. Does anyone seriously believe that a terrorist could NOT get a copy of that source?!

    Heck, what would stop a terrorist from getting someone employed at Microsoft and simply stealing the code?!

    But, then again, Windows is such an easy target for exploitation, getting the source code probably wouldn't be worth the bother. It'd be like stealing a key to a building without locks.
  • by Giggle Stick ( 673504 ) on Tuesday July 27, 2004 @12:19PM (#9812667)
    I wonder if they've found any security vulnerabilities that they aren't telling the world about? They would fix it in their versions, but leave it in the main one, so that they could exploit it against enemies that are using Linux and other GNU stuff.
  • Supporting comment (Score:5, Interesting)

    by Allen Zadr ( 767458 ) * <Allen DOT Zadr AT gmail DOT com> on Tuesday July 27, 2004 @12:25PM (#9812732) Journal
    Here's a supporting comment...

    Just as the parent post suggested. Except the government is already auditing open source, and customizing the Linux kernel to its own needs... Does nobody remember NSA Secure Linux [nsa.gov]?

  • Re:Failing company? (Score:2, Interesting)

    by rumblesnort1 ( 800660 ) on Tuesday July 27, 2004 @12:41PM (#9812918)
    Wait a minute here - Wind River is in the RED. Their stock is dropping like a ton of bricks, their 2004 sales growth is -18.1%, and they are shrinking their employee base. If you equate Wind River's adoption of Linux, rather than their earnings, with proof of their success, you will have to understand that I respectfully disagree. Isn't it more probable that Wind River is embracing this technology (Linux) to regain market share, and that Green Hills is merely competing with them (i.e., not a bunch of Linux haters) by planting seeds of doubt with prospective Wind River customers? I understand that Linux may be the victim of collateral damage, but I fail to see Wind River heading toward market dominance (they already dominate quite a bit, just losing share) because of this choice, or Green Hills on the fast track to Chapter 11. This may get Wind River a bit closer to the "Visionary" box in Gartner's magic quadrant, but it will be interesting to see if Linux will lower production/operating costs enough to bring profits out of the red.
  • by Anonymous Coward on Tuesday July 27, 2004 @12:59PM (#9813123)
    you guys are idiots, get the facts right first

    1) Green Hills' Integrity RTOS is not closed source. "INTEGRITY is available in binary distributions, Binary with BSP Source, as well as affordable full source code distributions."
    he's talking about an open-source development method where lots of people from all over are contributing code. they probably have a team of a few guys who do the entire OS, and know everything about the OS. plus, they pay people to review everything and certify it!

    2) because they "support" Linux doesn't mean they are hypocritical. their development tools support linux-- who cares what your desktop OS is, that doesn't have to be secure. the embedded OS in the field does.

    3) NSA SE-Linux. Jesus, this doesn't mean linux is secure!!!!!! "This work is not intended as a complete security solution for Linux. Security-enhanced Linux is not an attempt to correct any flaws that may currently exist in Linux. Instead, it is simply an example of how mandatory access controls that can confine the actions of any process, including a superuser process, can be added into Linux. The focus of this work has not been on system assurance or other security features such as security auditing, although these elements are also important for a secure system."

    **simply an example of mandatory access controls** that's it.

    get things straight before you go spouting bs
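The SE Linux passage quoted above is about mandatory access controls: the policy, not the user's identity, decides what a process may do, so even a superuser process is confined. A toy illustration (the domain and type names mimic SELinux naming conventions, but this is not its actual policy language):

```python
# Toy mandatory access control: deny by default, allow only what policy grants.
POLICY = {
    ("httpd_t", "read",  "web_content_t"): True,   # web server may read content
    ("httpd_t", "write", "shadow_t"):      False,  # ...but never the password file
}

def mac_allows(domain, action, obj_type):
    """Deny unless the policy explicitly allows -- no exemption for root."""
    return POLICY.get((domain, action, obj_type), False)

# Even a process running as uid 0 is confined by its domain's policy:
print(mac_allows("httpd_t", "read", "web_content_t"))  # True
print(mac_allows("httpd_t", "write", "shadow_t"))      # False
```

This is the sense in which the NSA's statement is narrow: mandatory access control limits damage from a compromised process, but it is not, by itself, a claim that the rest of the system is secure.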
  • by AbbyNormal ( 216235 ) on Tuesday July 27, 2004 @12:59PM (#9813135) Homepage
    Right, but wasn't the PS2 considered a military weapon by Japan (mostly for marketing) because of its imaging capabilities? Sorry, the technology is already out of the bag... it's how you use it that counts.
  • by Anonymous Coward on Tuesday July 27, 2004 @01:03PM (#9813188)
    Maybe this: How has the penguin got into the US army? Let's look on the Land Warrior project:


    http://www.usatoday.com/tech/news/2002/02/07/tech-military.htm [usatoday.com] tells the first part (or maybe the first and the second) of the story: The army started its own proprietary development, employing several contractors to design a completely new system (of hardware and software, not only a software operating system). The development proved to be expensive and inefficient, not leading to anything usable. They got it reviewed, learned the lesson, and changed their approach: they adopted and adapted mostly commercially available off-the-shelf parts, including the Windows operating system. Then they moved forward, up to actual tests of usable equipment.


    The next part is told at http://www.nationaldefensemagazine.org/article.cfm?Id=1238 [nationalde...gazine.org]: The new design, including Windows, failed its tests. The army again learned its lesson; the whole system was redesigned, and Linux adopted for its operating system.


    The army did not take Linux out of sheer stupidity, not knowing other alternatives. The army took Linux after serious consideration of its rich and expensive experience with several other alternatives.


    Mr. O'Dowd speaks of Linux being worse than Windows, and Windows being almost as bad as Linux. It looks like his Green Hills Software was part of the army's first expensive experience, first losing its contracts to Windows, and then to Linux.

  • by zoloto ( 586738 ) on Tuesday July 27, 2004 @01:14PM (#9813299)
    but if the military found F/OSS software to be superior, then so be it. If they're worried about it coming into the hands of their enemies, the DoD's changes do not have to go back to the public under the GPL. The GPL only requires source to accompany distribution, and since the military isn't "public" in many respects and isn't distributing the binaries inside its tanks or jets, any changes used there needn't be released.

    make sense?

    just my thoughts.
  • by casmithva ( 3765 ) on Tuesday July 27, 2004 @01:18PM (#9813330)
    Unless my memory's foggy, didn't the Dept. of Homeland Security and CERT advise everyone recently to stop using closed-source Internet Explorer, developed by an American company, and switch to open-source Mozilla, an international effort for security reasons? Nah, I must've dreamt that...
  • by wtrmute ( 721783 ) on Tuesday July 27, 2004 @01:30PM (#9813434)
    It would be incredibly naive to believe that other countries and terrorist organizations would not exploit an easy opportunity to sabotage our military or critical infrastructure systems when we have been doing the same to them for more than 20 years!

    Whoa, he comes out and says it like that? Man, if I were part of the Gringo-hater crowd, that'd give me fuel for years!... :-)

  • So US-Centric (Score:4, Interesting)

    by abe ferlman ( 205607 ) <bgtrio.yahoo@com> on Tuesday July 27, 2004 @01:33PM (#9813463) Homepage Journal
    A lot of other governments are moving away from Microsoft b/c they're pretty sure we're using Windows to spy on them.

    Unfortunately, you can't guarantee that someone looking to subvert windows in a subtle way won't be hired by (or more interestingly, license their code to) Microsoft- so with closed source you basically get the worst of all possible worlds.

  • by sakshale ( 598643 ) on Tuesday July 27, 2004 @01:34PM (#9813471) Homepage Journal
    Ken Thompson, co-creator of Unix, said it best in his Turing Award lecture, Reflections on Trusting Trust:
    The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.
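    The compiler attack Thompson describes can be sketched in miniature. This is a hypothetical toy in Python, not his actual code: the "compiler" is a pass-through that recognizes two special inputs, splicing a backdoor into a login program and preserving its own trick when compiling itself, so the published compiler source can look perfectly clean.

    ```python
    # Toy illustration of the "Trusting Trust" attack Thompson describes.
    # All names here are hypothetical -- a sketch, not real compiler code.

    BACKDOOR = '    if user == "kt": return True  # hidden master login\n'

    def evil_compile(source: str) -> str:
        """Fake compiler: emits source unchanged, except for two cases."""
        login_header = "def check_login(user, password):\n"
        if login_header in source:
            # Case 1: compiling the login program -> splice in the backdoor.
            return source.replace(login_header, login_header + BACKDOOR)
        if "def evil_compile" in source:
            # Case 2: compiling the compiler itself -> keep the trick alive,
            # so the backdoor survives even when the compiler's published
            # source looks clean (the real attack re-injects cases 1 and 2).
            return source
        return source

    login_src = (
        "def check_login(user, password):\n"
        '    return password == "secret"\n'
    )
    compiled = evil_compile(login_src)
    print("hidden master login" in compiled)  # True: backdoor was inserted
    ```

    The point of case 2 is exactly Thompson's: auditing the login program's source, or even the compiler's source, reveals nothing, because the sabotage lives only in the compiled binary that regenerates it.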
  • by Dr_Marvin_Monroe ( 550052 ) on Tuesday July 27, 2004 @01:53PM (#9813673)
    Let us not forget that WE'VE been planting trojans in software shipped overseas too. I recall a story here regarding deliberately sabotaged software shipped to a Russian pipeline project. As I recall, the trojaned pipeline software was designed to operate the pipeline at 10X normal pressure and cause an explosion... which it duly did, setting back the Russian government's energy plans.

    When other governments start using OSS, they may be freeing themselves of these US-planted trojans. I believe THAT is the major fear of the US government... not that they will fail to detect a foreign-planted bug in some fighter jet, but that OUR planted bugs will be found by China/India/Pakistan/Iran/etc. This would also seem to explain our government's looking the other way with regard to the Microsoft settlement. Remember that the antitrust settlement was made within a week or so of September 11. Remember also the "Magic Lantern" project, where our government was actively looking for ways to co-opt people's boxes.

    Software that cannot be easily trojaned creates just one more difficulty for our spy agencies. As with the gangster who was using pretty secure encryption, the government is now forced to use things like hardware keystroke loggers (meaning they have to have physical access to the machine), sneak-and-peek searches, you get the idea.

    The US government has an interest in keeping people using insecure systems. How easy do you think it was to open those Windows laptops captured in Afghanistan? Why, the NSA had those famous "NSAKEY" entries in Windows!... Easy as pie. The last thing they want is for KSM and OBL to start putting strongly encrypted filesystems on their Linux laptops in Afghanistan. No way to plant the backdoor!

    Expect to see a lot more of this type of FUD... The US government has plenty of time and money to make sure its own Linux systems are safe; it just doesn't want others using them...
  • More Green Hill FUD (Score:3, Interesting)

    by infolib ( 618234 ) on Tuesday July 27, 2004 @02:12PM (#9813896)
    You can find what's more or less an expanded version of this article here. [ghs.com]

    Quote: The NSA has not fixed, or even seriously tried to fix, the security problems (documented in this series of white papers) that make Linux unsafe for defense systems.

    [...] If secrecy isn't important to security, then why does Linus Torvalds keep the means of accessing the core Linux development tree a secret from all but a few people?

    Another FUD dose says [ghs.com]

    The GPL was designed by Richard Stallman to prevent you from making a profit from distributing his software (which makes up a large part of Linux).

  • by AbraCadaver ( 312271 ) on Tuesday July 27, 2004 @02:39PM (#9814259)
    Does anyone remember that Canadian company that was making US DoD software... and outsourcing some or most of the programming to China? I believe the DoD wasn't fully aware of where the work was being done until later in the game. Not that either of the parties involved had malicious intent, BUT that in itself seemed far more vulnerable than code that EVERYone can see, and audit, and comment on.
  • by Fishstick ( 150821 ) on Tuesday July 27, 2004 @03:08PM (#9814642) Journal
    this one?

    http://news.zdnet.co.uk/software/0,39020381,39147917,00.htm [zdnet.co.uk]

    Software supplied to run a Russian pipeline was deliberately planned to go haywire, causing the biggest non-nuclear explosion the world had ever seen...

    as I recall, this wasn't a case of sabotaging legitimately acquired software for the hell of it. The CIA became aware of the Soviets' intent to steal western technology, including control software for their pipeline project, through an agent recruited by the French.

    Reagan was aware of, and approved, the plan. The CIA managed to get inside the deal and, instead of stopping the transaction, sabotaged the code so that the pump speeds and valve settings would go haywire after some period of time.

    I'm certainly not excusing the sabotage, which, while not causing any loss of life, caused immense damage to the Soviet economy. I won't argue whether it was justified... but the US government made it illegal for the Soviets to import certain technology. They circumvented this ban, and paid a heavy price.
  • Re:I wonder? (Score:1, Interesting)

    by Anonymous Coward on Tuesday July 27, 2004 @05:09PM (#9815906)
    No need.

    Green Hills biggest competitor has always been Open Source. They make embedded compilers and operating systems, a market that open source invaded well before it became popular on the desktop.

    When I worked there, they were still in the laugh-at-it phase ("open source, you get what you pay for, ha ha ha!") despite their largest competitor being gcc. I am sure by now it is hurting their core business.
  • Re:Oh really (Score:2, Interesting)

    by pchan- ( 118053 ) on Tuesday July 27, 2004 @05:13PM (#9815942) Journal
    I would challenge you to compile a new Intel C library using a Microsoft C compiler from 6 years ago too. Heck, compile glibc using an IRIX compiler from six years ago.

    hmm, i don't think code using features of C99 (y'know, the latest C specification) would compile on pre-C99 compilers. i can't imagine why you find that surprising. as for glibc, it is far too dependent on gcc hacks to have a chance of compiling under any other compiler.
  • by darnok ( 650458 ) on Tuesday July 27, 2004 @07:27PM (#9817150)
    If we take this guy at his word, then it's reasonable to think that non-US countries shouldn't use closed-source, US-developed software, because it could contain nasties as well.

    The US govt isn't everyone else's best friend at the moment, and appears to be working particularly closely with US software companies in pushing the US interpretation of intellectual property onto much of the world. It's more than feasible that the US govt could have said "Look, software vendors, we'll push your interests out into the world, but there's a favor you can do for us in return. Here's some source code we want you to bolt into your products for overseas distribution"...

    At least FOSS gives people the opportunity to examine the source of what they're going to be running. No, most people don't bother, but with Windows, Solaris, etc. it's not even a possibility.
