Security Software Linux

Open Source a National Security Threat 921

n3xup writes "Dan O'Dowd, CEO of Green Hills Software, suggests that open source software has the capability of being sabotaged by foreign developers and should not be used for U.S. military or security purposes. He likened Linux to a Trojan horse: free, but in the end a lot of trouble. O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding."
This discussion has been archived. No new comments can be posted.

Open Source a National Security Threat

Comments Filter:
  • by stecoop ( 759508 ) on Tuesday July 27, 2004 @10:51AM (#9811405) Journal
    Understand the source's perspective before you draw opinions. Green Hills feels threatened by Linux because embedded Linux is being integrated into more Government systems. Green Hills is (was?) a large player in government embedded operating systems. I imagine you will see a similar stance from Wind River, maker of the popular real-time embedded OS VxWorks.

    The threat comes from the length of time on some large government projects. Some systems have been around longer than you and me. In the proprietary world, your whole project depends on a set of companies staying in business for 30+ years. With Linux, you're no longer dependent on that chain; you can leverage the community for updates or, if necessary, make the changes yourself as the developer. Most people fail to mention this about Linux; everyone just says hey, it's free and cheap. But if you really want to sell Linux, try saying that your entire project doesn't rest on somebody else's proprietary solution because you will have the source code in hand - people will listen.

    It's easy to rebut Green Hills' FUD by saying all changes will be baselined and a change control board will review any updates (easy enough, huh?).
  • by beh ( 4759 ) * on Tuesday July 27, 2004 @10:51AM (#9811412)
    Shouldn't this article immediately point back to other articles on how governments OUTSIDE the US are choosing open source for exactly the same reason (who knows what M$ + the NSA put in the closed Windows source that might hurt other nations)?

    [World Govs Choose Linux For Security & More]
    http://slashdot.org/articles/01/12/11/0132213.shtml

  • by proj_2501 ( 78149 ) <mkb@ele.uri.edu> on Tuesday July 27, 2004 @10:53AM (#9811431) Journal
    do we even need another comment on this story?
  • FUD. (Score:5, Insightful)

    by garcia ( 6573 ) * on Tuesday July 27, 2004 @10:53AM (#9811435)
    Some embedded Linux providers even outsource their development to China and Russia.

    GASP! Some XYZ providers even outsource their development to ABC and DEF (insert your favorite company and terrorist sponsoring country where necessary).

    It would be incredibly naive to believe that other countries and terrorist organizations would not exploit an easy opportunity to sabotage our military or critical infrastructure systems when we have been doing the same to them for more than 20 years!

    I think it has been proven that closed-source development does nothing to change the possibility that a "mole" has been planted or that a "hole" will be discovered.

    One of the greatest misconceptions about Linux is that the free availability of its source code ensures that the "many eyes" with access to it will surely find any attempt at sabotage. Yet, despite the "many eyes," new security vulnerabilities are found in Linux every week in addition to dozens of other bugs. Many of these flaws have eluded detection for years. It is ridiculous to claim that the open source process can eradicate all of the cleverly hidden intentional bugs when it can't find thousands of unintentional bugs left lying around in the source code.

    And it is ridiculous to claim that a closed development environment will make it any different.

    In addition, under the internationally recognized Common Criteria for IT Security Evaluation (ISO 15408), Windows has been certified to Evaluation Assurance Level 4 (EAL 4), a higher level of security than the EAL 2 that Linux has achieved.

    According to this [com.com] article, obtaining EAL2 certification typically costs between $400,000 and $500,000. Looks like it is more about money than security. In their infancy, why would Linux vendors shell out large sums of money when the government wasn't interested in using Linux anyway?

    This whole article is FUD. He's annoyed because Linux is advancing by leaps and bounds and will possibly affect his market share in the lucrative Defense and Aerospace industries. At least he came out and said it himself instead of paying off a third party to "investigate" the "problems" with Linux and post their results to the world.
  • Governments should not use open source without a proper security audit. Once you can verify the nature of the code, there should be no obstruction to using it.
  • by Anonymous Coward on Tuesday July 27, 2004 @10:54AM (#9811461)
    ...with really big glass windows. All you need do is open your eyes to see what's inside.
  • by InThane ( 2300 ) on Tuesday July 27, 2004 @10:54AM (#9811463) Homepage Journal
    IIRC, China has seen the source code to Microsoft Windows, whereas the U.S. government hasn't.

    I think that's a pretty large security threat right there...
  • by shobadobs ( 264600 ) on Tuesday July 27, 2004 @10:54AM (#9811467)
    What if a terrorist gets a job at a software company? Where's the hope of catching the bugs then? It seems to me that closed-source software is more susceptible than open-source.
  • by Turn-X Alphonse ( 789240 ) on Tuesday July 27, 2004 @10:56AM (#9811503) Journal
    Now then.. last time I checked, a lot of the new bugs found in Windows were revealed by geeks... the type of geeks who write open source in many cases.

    I think I'd rather put my trust in someone doing it for the pure love (hate) of (bad) software, than someone doing it for money and no love at all.
  • by polyp2000 ( 444682 ) on Tuesday July 27, 2004 @10:57AM (#9811508) Homepage Journal
    Dan O'Dowd, CEO of Green Hills Software, suggests that open source software has the capability of being sabotaged by foreign developers and should not be used for U.S. military or security purposes.

    Urmm, so what operating system do you use then, Dan O'Dowd? And which newspapers and websites do you read?

    You're obviously using a closed source operating system that is free of viruses, worms, holes and other security problems. What might this mystery closed source operating system be that doesn't pose a threat to the nation's security?
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @11:00AM (#9811556)
    Comment removed based on user account deletion
  • by Coz ( 178857 ) on Tuesday July 27, 2004 @11:06AM (#9811653) Homepage Journal
    You have one set of experts write the code under an F/OSS license, and another set of experts examine the code, test cases, and test results.

    Believe me, if you're talking about something like gunnery firmware, they're going to test it... the deepest fear in DoD these days is friendly fire.
  • by Total_Wimp ( 564548 ) on Tuesday July 27, 2004 @11:07AM (#9811660)
    Can you honestly tell me that the government is going to hire a panel of people to check in-depth source changes on OSS projects?

    More to the point, will they do this with closed source projects? Getting a mole into Green Hills Software, Microsoft, etc. is every bit as real a threat as getting one into any open source project. In many cases it might even be easier because of the lack of good hiring practices and oversight at small defense companies.

    TW
  • > How hard would it be to build in a small
    > tiny bit of error that would only be
    > useful in cases of calibration of
    > high-tech weapons?

    I think it'd be tricky, because it would break other high-precision things as well. And the other folks using the open source project would say "hey, this fellow Fred just submitted a patch. Something looks odd about it. Fred, why does line 314 do a bit shift without checking the foobar?" And then the patch would be rejected.
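
    As a purely hypothetical illustration (the function and the "foobar" check are invented, not from any real project), the kind of line a reviewer would flag looks something like this:

    /* Invented example: shifting by an unchecked count is undefined
     * behavior in C once the count reaches the width of the type --
     * exactly the sort of subtle, deniable flaw a reviewer questions. */
    #include <stdint.h>

    uint32_t scale(uint32_t value, unsigned foobar)
    {
        /* Suspicious: no check that foobar < 32 before shifting. */
        return value << foobar;
    }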

    > If 3000 lines of dense mathematically
    > rich C were checked in

    I doubt any maintainer would accept such a patch. I don't accept patches for PMD [sf.net] without reading them, and if I got a 3K line patch I'd reject it out of hand.
  • by dougmc ( 70836 ) <dougmc+slashdot@frenzied.us> on Tuesday July 27, 2004 @11:07AM (#9811667) Homepage
    What he suggests is possible. And a well hidden bug could easily escape detection by Linus and anybody else who goes over each new patch looking for stuff like this.

    And it doesn't have to be in the Linux kernel. The classic example (at least 10 years old) is to hack up gcc so that it examines the code it's compiling and, if it decides that it's compiling /bin/login, does things a little differently, inserting a back door where there was none before.
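
    To make the idea concrete, here is a toy sketch with all names invented (real gcc is obviously not structured like this): a "compiler" pass that copies source through unchanged unless it recognizes login's password check, in which case it quietly appends a backdoor.

    #include <stdio.h>
    #include <string.h>

    /* Toy "compile" step: pass the source through, but inject extra
     * code when the input looks like login's password check. */
    static void toy_compile(const char *src, FILE *out)
    {
        fputs(src, out);
        if (strstr(src, "check_password") != NULL)
            fputs("/* injected: also accept a master password */\n", out);
    }

    int main(void)
    {
        toy_compile("int check_password(const char *pw) { /* ... */ }\n",
                    stdout);
        return 0;
    }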

    However, while he does have a point, it's a very myopic point. Closed source software has exactly the same vulnerabilities, except for one critical difference -- only people within the company in question have a chance of detecting the problem -- the end user will never get to see the source and check whether it's compromised. Granted, most open source users do not review all the source code that they use, but at least the option is there, and the people for whom security is absolutely essential (like the NSA) almost certainly use it.

    Also, for a closed source company, the problem is even worse. The backdoor (or whatever) could be introduced when the code is finally compiled for distribution, and never get checked into whatever source control system they use. So the binaries get shipped out, but NOBODY has reviewed the source code in question (except our cracker friend), and once the bug does come to light (if it ever does) the company will look at the source code and scratch its head -- it won't even have the source code in question to look at.

  • by Feneric ( 765069 ) on Tuesday July 27, 2004 @11:07AM (#9811670) Homepage

    Isn't the problem described much larger for commercial outsourcing? These days most software used in the U.S. is partially written outside the U.S. At least with the open source software people concerned about security can build from source and perform an inspection on the source. With commercial software, no such precautions are available.

  • by hawkeyeMI ( 412577 ) <brock&brocktice,com> on Tuesday July 27, 2004 @11:08AM (#9811678) Homepage
    Two words: equipment testing
  • by Anonymous Coward on Tuesday July 27, 2004 @11:09AM (#9811699)
    If 3000 lines of dense mathematically rich C were checked in and a dozen lines acted in concert to create a miscalculation, how much expertise would be needed to catch that?

    It wouldn't be easier or more likely to find such a problem if the code were closed source, and only a handful of people ever looked at it.
  • by tizzyD ( 577098 ) <tizzyd AT gmail DOT com> on Tuesday July 27, 2004 @11:09AM (#9811703) Homepage
    When you can't compete, FUD.

    The gauntlet is thrown down. I challenge this man to come up with a demonstrable "trojan horse" in a piece of open source software that cannot be found in a reasonable period of time by a security audit (the kind the government does of OSS to be used). Such fear mongering should be laughed at, torn up, and spit upon whenever you see or hear it. It reminds me of Ridge getting up and saying, well, there's a threat around the election, but we have no evidence of it. Be scared (and vote for Bush). Yea . . . right. I didn't just fall off the turnip truck.

    Get a life, and make better products, jerk!
  • by SuperChuck69 ( 702300 ) on Tuesday July 27, 2004 @11:10AM (#9811707)
    I am continually amazed that people believe open source is a good place for terrorists to hide evil, anti-government bugs...

    The cornerstone of open source is that it is OPEN SOURCE. The government is free to view and evaluate all the packages to their little, demonic hearts' content.

    If I were a terrorist, I think I would penetrate a closed-source house (say, Microsoft or Green Hills) and hack some little nasties into their source.

    But, maybe that's why Dan O'Dowd isn't a very good terrorist.

  • by _Sprocket_ ( 42527 ) on Tuesday July 27, 2004 @11:10AM (#9811709)


    Can you honestly tell me that the government is going to hire a panel of people to check in-depth source changes on OSS projects? People who are familiar enough that they can catch an exploit that may only take 3-4 lines of code to perform?

    ...

    I think that having experts able to review each line of code checked in and put into production defeats the whole idea of using Open Source: at that point, you might as well just hire the experts to write the code in the first place and eliminate the vector altogether.


    No - the Government will contract a company to do this for them. The difference is that when the contract comes up for bid, they will both have the source code involved and a better chance of finding competing contractors able to work with that code. At worst, they'll end up contracting with an outfit that is unfamiliar with the code and has to ramp up... which isn't much different from a proprietary-based situation. Which brings us to a very important point...


    If 3000 lines of dense mathematically rich C were checked in and a dozen lines acted in concert to create a miscalculation, how much expertise would be needed to catch that?


    Are you implying that 3000 lines of proprietary code are going to go into a system without any checks? And that once that code is incorporated into the system, the system is not going to undergo a strict battery of tests?
  • by GoofyBoy ( 44399 ) on Tuesday July 27, 2004 @11:10AM (#9811715) Journal
    >In many cases it might even be easier because of the lack of good hiring practices and oversight at small defense companies.

    What hiring practices does Linux have?

    Defense companies have to go through a certain amount of security and background checks to win a contract.
  • Why OSS ? (Score:2, Insightful)

    by alvieboy ( 61292 ) on Tuesday July 27, 2004 @11:11AM (#9811731) Homepage
    I wonder if they also consider shareware and freeware as a possible threat.

    You know, it's easier to hide functionality in shareware/freeware than in OSS - you can't look at the code and spot it at a glance. I wonder if it would not be easier to spread malicious code in Paint Shop Pro and the like.

    Al.
  • by Anonymous Coward on Tuesday July 27, 2004 @11:19AM (#9811855)
    What hiring practices does Linux have?

    Doesn't matter one jot. Gee, look, there's the source code. Every bug, hole and trojan horse, just waiting for you to find them. All you have to do is audit the code. You should be auditing the code of any product you're going to use in a sensitive environment anyway, whether it's closed or open source. Where's the difference?
  • by Zocalo ( 252965 ) on Tuesday July 27, 2004 @11:20AM (#9811860) Homepage
    Who needs a mole at Green Hills Software or Microsoft? The kind of software we are talking about here is highly proprietary stuff that you are not going to be able to get mail order from your local retailer. A better bet would be to target any third party libraries the vendors are using; almost no one would write their own IP stack when they can buy one for a few dollars. Sooner or later you are going to get one that might have been bought from a legitimate company in the US, but was actually coded by easily bribable coders in the third world.

    If anything, I'd say the risk of getting exploits deliberately planted in code without detection is far greater in closed source applications than in OSS projects. Another lame attempt at FUD from the people behind AdTI.

  • by Dare nMc ( 468959 ) on Tuesday July 27, 2004 @11:20AM (#9811864)
    > you might as well just hire the experts to write the code in the first place and eliminate the vector all together.

    And where do they find the expert to hire in the first place? The beauty of version control in OSS is that you can find exactly who wrote the code, and if you do the background check and he is hireable, you use him and his code. If not, you have the choice of finding another expert to rewrite just that section, or thoroughly reviewing his sections...

    Try to find out exactly which Indian contractor did that work for the closed source contractor, and trust them to tell you: oh yeah, we hired an extremist, but he didn't do that part of the code ;) ;)
  • by Atzanteol ( 99067 ) on Tuesday July 27, 2004 @11:21AM (#9811875) Homepage
    Perhaps, but dismissing what they say because of who they are is not exactly correct either. It's a fallacy to claim 'A' is false because the person saying it is 'B'.

    If a man on death row for murder tells you that murder is wrong, is he incorrect simply because of who he is?

    Sure they (Green Hills) have an agenda, and it should be noted. However, they may have a point and shouldn't be discredited out of hand.
  • by demachina ( 71715 ) on Tuesday July 27, 2004 @11:26AM (#9811946)
    "how much expertise would be needed to catch that?"

    Uh, not much. If the weapons aren't hitting the mark on the firing range, they probably won't get deployed until they're fixed.

    This is probably a poor example. The danger isn't in OSS that is designed to fail; if it doesn't work, it won't get used. The danger is an obscure security hole that would allow infiltration.

    The key point where this guy's whole argument falls apart is that proprietary software isn't any better. I'm confident Microsoft employs a small army of foreigners, and I'm not sure they would be any more reliable than OSS developers; their code gets a lot less scrutiny, and absolutely none if you are a customer getting binaries. Most big companies are putting R&D centers in India and China. How do they assure us the people they are hiring don't have ulterior motives?

    If you want to develop software critical to national security you have to develop it in a classified lab with cleared employees. Oh, but wait: in spite of all the scrutiny people with security clearances get, they too turn out to be foreign agents and do great damage. Los Alamos doesn't exactly have a stellar security record, and those people get more scrutiny than anyone. The Navy's comsec has been massively compromised in the past.

    I'd argue the opposite case from this guy. If you want secure software the best approach is to have as many people as possible, both OSS and government, scrutinize the source. If you find a project that is intentionally or negligently checking in compromised code, blacklist it or give it extra scrutiny. The NSA's secure Linux effort is an example of the government making sure OSS is secure, and it's way more likely to be that than anything Microsoft or Green Hills is going to give them.

    On a tangent, here [counterpunch.org] is an interesting article on Homeland Security trying to enforce security through obscurity in the physical world. Someone walked around the DNC in Boston, took photos of all the weaknesses in their security, and posted them to a list on Yahoo. Homeland Security shut down the list and is collecting the names of everyone on it and everything said. Should give you pause before joining any list in these interesting times.
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @11:31AM (#9812015)
    Comment removed based on user account deletion
  • by Abcd1234 ( 188840 ) on Tuesday July 27, 2004 @11:32AM (#9812031) Homepage
    LOL! You're telling me that you think the government will replace "defense companies [that] have to go through a certain amount of security and background checks" with unverified open source software? Please... if security matters *that* much, the government will either 1) continue to hire secure defense companies or 2) hire a secure defense company to verify the safety of any open source they use.
  • the deepest fear in DoD these days is friendly fire.

    No it isn't - if it were, they would have burned all the Patriot systems already. The biggest fear in DoD is making sure their pet contractors stay on the payroll so that they keep getting their kickbacks.
  • FUD, FUD, FUD! (Score:1, Insightful)

    by Anonymous Coward on Tuesday July 27, 2004 @11:33AM (#9812057)
    With the knowledge that Linux is going to control our most advanced defense systems, foreign intelligence agencies and terrorists can easily infiltrate the Linux community to contribute subversive software.

    And proprietary software is safer, how? It is just as easy, if not easier, to infiltrate a specific closed-source company (remember, the 9-11 hijackers were here for 3 years learning to fly jumbo jets) and program in their subversions directly. (See my comments about his company's certifications below.)

    Some embedded Linux providers even outsource their development to China and Russia.

    Unlike all the major proprietary developers who outsource all the work they can to China and Russia, too. How much of Green Hills' code is written overseas? By sweat-shop coders they never even meet?

    In fact, the U.S. National Institute of Standards and Technology (NIST) security vulnerabilities database lists more vulnerabilities for Linux than Windows in the last ten years.

    This is such a broad statement it is tough to refute. Are they talking kernel only? The kernel is the only part the military should be interested in as far as security vulnerabilities go. Then how can they get equivalent numbers for Windows which doesn't easily allow you to separate the kernel? And Windows definitely should include all of the apps because any app vulnerability is a potential OS vulnerability. This statement needs a lot of amplification before it even approaches something like "truth".

    DO-178B Level A is the highest safety standard for software design, development, documentation, and testing.

    From Green Hills' web-site about DO-178B Level A certification:

    "The certification package includes Green Hills Software services for all DO-178B Level A compliant verification activities for INTEGRITY-178B operating on processor architecture specified by a customer's requirements. All reviews, analysis and testing of the INTEGRITY-178B real-time operating system is performed by Green Hills Software using the customer's target processor system."

    So DO-178B Level A verification is OK as long as you trust Green Hills. Remember my earlier comments about infiltrating proprietary companies? With a couple of fifth-columnists in a couple of key places terrorists can insert whatever code they like and then pass it right along in the certification stage.

    If the government truly wants to use Linux in military operations:
    1. Freeze the source right now. Fork it into their own private source control tree that nobody in the outside world ever sees.
    2. Perform the entire set of DO-178B procedures (I don't remember exactly which parts these are) that do a detailed analysis of the source code for all decision branches, etc.
    3. NEVER use any public patches or source code changes as-is; instead, any changes to the code must be examined at the source level to the same rigor as 2 above and then incorporated directly into their private source tree.
    4. etc, etc.
  • by SpiritOfGrandeur ( 686449 ) on Tuesday July 27, 2004 @11:34AM (#9812068)
    However, this conventional wisdom is unsupported by quantitative data. In fact, the U.S. National Institute of Standards and Technology (NIST) security vulnerabilities database lists more vulnerabilities for Linux than Windows in the last ten years. In addition, under the internationally recognized Common Criteria for IT Security Evaluation (ISO 15408), Windows has been certified to Evaluation Assurance Level 4 (EAL 4), a higher level of security than the EAL 2 that Linux has achieved.

    Is this not due to the fact that only people inside M$ can check their own code, and that they will not always disclose vulnerabilities?

    Linux on the other hand almost always instantly discloses its bugs.
  • by tomknight ( 190939 ) on Tuesday July 27, 2004 @11:35AM (#9812088) Journal
    That's not the DoD's deepest fear, more the MOD's deepest fear...

    Tom.

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @11:36AM (#9812094)
    Comment removed based on user account deletion
  • closed source (Score:2, Insightful)

    by Errtu76 ( 776778 ) on Tuesday July 27, 2004 @11:36AM (#9812107) Journal
    Isn't a bug inside an open source program much easier (and no doubt faster) to find than one in a 'closed' source application?
  • by killmenow ( 184444 ) on Tuesday July 27, 2004 @11:38AM (#9812140)
    What hiring practices does Linux have?
    Peer review. I imagine Linus, Alan, Andrew, Ingo, Tigran, et al. are more capable of: A) reviewing code submitted for inclusion in the Linux kernel; B) understanding its purpose; C) deciding whom to trust to write proper code; and D) actually committing the code into the kernel THAN ANY GOVERNMENT OFFICIAL.

    As evidenced by the Green Hills CEO's desperate attacks to stave off his dwindling market share, they're obviously doing a better job than he is.
  • by Anonymous Coward on Tuesday July 27, 2004 @11:45AM (#9812244)
    I've heard that some government systems that hold classified data are still using NT4, and are preparing to use W2K, because the W2K audit isn't finished yet (or was only finished recently), and they install the systems with their own special NT4 CDs. They don't even use MS CDs to install the systems, and they don't run machines with OSs preinstalled by the manufacturer. They use their own CDs with their own patches applied.

    I don't know this for sure, but it seems to me that the government is in a bit of a bind with MS products, because they're bloating and the audit process is falling behind. They're about to jump to W2K, which at least puts them in the Active Directory world, but they're not even thinking about XP at this point. W2K is much bigger than NT4, XP is much bigger than W2K, and Longhorn is massive (from an auditing point of view). It's got to be hard to manage these audits.

    I think that from the government's point of view, the real issue is whether the code has been audited, not whether it's open or closed source. If it's audited, they feel they can use it, if it's not, they can't. Keep in mind that the machines themselves are closed off, guarded by men with guns, and are not plugged into public networks.

    So it seems to me that a lot of the open vs. closed source question comes down to how it affects the auditing process.

    I don't know anything about the audit process (or any of the rest of what I'm saying here, really), so take this with a grain of salt, but...

    It seems to me that open source would allow them to manage their own distro, with a small number of essential packages. Patches and updates could be audited on a continuous basis, as they come out. That has to be a much more manageable project than simply tearing into Longhorn from scratch when MS finally drops it.

    If they have the source, they could make sure that whatever they were using could deal with whatever it needs to deal with -- new hardware, or whatever. I think the pressure of obsolete code (which they're probably feeling with NT4 now) would be less intense.

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @11:46AM (#9812251)
    Comment removed based on user account deletion
  • by Rei ( 128717 ) on Tuesday July 27, 2004 @11:48AM (#9812290) Homepage
    Yes, that is a good point, but there's another reason why this argument is bunk: Linux is written by a lot of non-Americans, but so is Windows, and essentially every other operating system developed by anyone other than the US military. The same applies to software in general. So then, the argument needs to be: which one is easier to audit and has better *creation time auditing*?

    The answer should be obvious.
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @11:49AM (#9812292)
    Comment removed based on user account deletion
  • by Frank T. Lofaro Jr. ( 142215 ) on Tuesday July 27, 2004 @11:50AM (#9812308) Homepage
    What could a terrorist at Microsoft do?

    Add bugs to Windows? ;)
  • by Business King ( 599197 ) on Tuesday July 27, 2004 @11:50AM (#9812315)
    Let's say the United States uses a contractor that has a foreign national as part of its staff but does not know it, and this national is in charge of building some software. The foreign national knows exactly where to place key code segments to crash the program (let's say a missile interception program) whenever they want. The foreign national knows exactly what test cases are being run, and knows how to avoid them, so the software looks bulletproof. The software is approved as working, and is shipped. But what the U.S. Government does not know is that one line in tens of millions checks for an override code. Now, unless an extensive code review is done -- which is supposed to be done, but not always done -- to go over all lines as they are checked in, this bug will make it past the checks. Once the code is delivered, the chance, at least from what I can imagine, of it getting caught before the damage is done is super small.
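
    For what it's worth, a hypothetical sketch of what such a line might look like (the names, the math, and the magic constant are all invented):

    #include <math.h>
    #include <stdio.h>

    /* Hypothetical trigger disguised as an ordinary sanity check:
     * a range very near 31337.0 m quietly skews the solution. */
    static double intercept_bearing(double bearing, double range_m)
    {
        if (fabs(range_m - 31337.0) < 0.5)
            bearing += 0.001;            /* small, deniable error */
        return bearing;
    }

    int main(void)
    {
        printf("%f\n", intercept_bearing(1.5708, 31337.2));
        return 0;
    }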

    Now, if the code were open source, it would get reviewed and looked at constantly. Yes, again, what are the chances of someone finding that bug? But I am sure they are greater than the chances of someone finding a bug in closed software.... :)

    The Australian voting system has been open source for a number of years now, and that system has just gotten more secure over time. I think that is a prime example of something that borders on needing to be secure, and of how open source worked. I think it can work again, and that the U.S. should adopt it, if our greedy companies do not get in the way first.

    By the way, the top paragraph was completely hypothetical -- no one wants CIA agents at their door.

    -A
  • by Anonymous Coward on Tuesday July 27, 2004 @11:52AM (#9812342)
    If this is true for open source then it is tenfold for closed source commercial software, with all the outsourcing and visa holders! At least with open source we can find these mythical backdoors. The outsourcing and visa trends are a much greater risk to National Security than Open Source if you use this lunatic's logic.
  • Exactly the point (Score:5, Insightful)

    by hol ( 89786 ) on Tuesday July 27, 2004 @11:52AM (#9812351) Homepage Journal
    This is precisely why Brazil, China, and even Germany are moving towards open-source. The US Government cannot insert backdoors into this stuff that would affect anyone not wanting to be affected, unlike Microsoft stuff. Remember the NSA keys in the Windows NT crypto libraries?

    The US can continue to run Windows, be our guest, but the point is moot since much of US Government software is developed in India anyways. No back doors there, for sure.
  • First blush (Score:3, Insightful)

    by yoshi_mon ( 172895 ) on Tuesday July 27, 2004 @11:53AM (#9812354)
    I went to Green Hills Software's page first, just to see who this is.

    And I'm a little upset. Is OSDN actually letting these guys astroturf on /.? Why would this get even a 2nd look by anyone? I might see the somethingawful forums laughing at this, but to have a posting here?

    This whole thing smells bad. Maybe it's a slow news day, but there are better anti-Linux rants than what is coming from that lame-ass website. Nothing to see here, move along.
  • by Anonymous Coward on Tuesday July 27, 2004 @11:57AM (#9812408)
    The U.S. military and government (For example, Horizontal Fusion, the catalyst for the Net-Centric Transformation of DoD, is heavily leveraging web services, JSR-168, et cetera... ) is increasingly using Open Source with talented people behind the wheel. e.g. Many software programming books following open standards and what not are penned by Defense Intelligence Agency (DIA) employees... And the government has groups actively focused on Information Assurance (IA).

    And I, for one, work in the industry and gee, know what I'm doing and can read other people's code.

    Pure FUD.
  • by Anonymous Coward on Tuesday July 27, 2004 @11:59AM (#9812442)
    Speaking as someone who has worked on DOD projects, I've seen projects where the source code is included and where it is not. The ones without the source or with poor source code (only a printout of the code) are a train wreck when it comes to updating the stuff. The ones with the source code are a piece of cake. I can definitely tell which is which just by how happy the people who actually have to use the stuff are.

    I should point out that these projects involved one-of-a-kind, highly custom stuff that is supposed to be updated all the time.

    It's all about how the contracts are written. Anybody sane should write their contract specifications to REQUIRE the full source code and complete schematics/blueprints/etc. This goes for DOD and anybody else. If someone offers you a lower price for not including sources, you'll pay for it in the long run, and it's not worth it.
  • Oh really (Score:5, Insightful)

    by TheConfusedOne ( 442158 ) <the@confused@one.gmail@com> on Tuesday July 27, 2004 @12:01PM (#9812459) Journal
    It's possible, and HAS happened, that KNOWN and TRUSTED engineers have put in bits of code that would pass initial scrutiny and still be dangerous.

    Wasn't there recently an article about a router with a backdoor shipped out in its code? How about all those darn "easter eggs" floating around in Windows and Office and other programs?

    I would challenge you to compile a new Intel C library using a Microsoft C compiler from 6 years ago too. Heck, compile glibc using an IRIX compiler from six years ago.

    You can drag out all the scenarios you want and whether it's Linux or it's *nix or BSD or Windows you're going to have the same audit challenges and not even have access to the source code without negotiating with all your suppliers.
  • by otisg ( 92803 ) on Tuesday July 27, 2004 @12:03PM (#9812483) Homepage Journal
    It's as simple as the subject. Open-source software is just another option. Don't like it? Buy the commercial software, or pay for its development by a trusted company or agency.
    However, governments DO use closed-source software from companies and people they do not know. Who says Microsoft is not paid by X government that, through mass-adoption of Windows and other MS applications around the planet, now has control over a laaaaaarge number of computers in all countries of this world?
    In other words, simply buying commercial software is not any more secure. What's worse, there is no way you will be able to check the sources. With OSS you have that option, and it is up to you, the user of the software, to check it or not check it.
  • by twem2 ( 598638 ) on Tuesday July 27, 2004 @12:04PM (#9812489) Journal
    As has been said many times before:
    Closed source is no guarantee of a lack of tainting. Even with security checks it's perfectly possible to have a hostile programmer working on the software.

    I'd just like to put in a word for the NSA et al. They're perfectly capable of making their own decisions, and are probably far more qualified than anyone here. They know how to minimise risk using whichever development model they like...
    It is of course very possible that open source software could be written on behalf of the military. The people who keep the official version for use within the military/government will go through all code submitted with a fine-tooth comb, being very conservative about which patches they accept from the outside world...

    This article is basically written to try to scare politicians into banning the FOSS competition. I doubt it would work if the military and friends didn't want it that way; they will make up their own minds.
  • by Total_Wimp ( 564548 ) on Tuesday July 27, 2004 @12:08PM (#9812541)
    Also, we are not just talking about the "brand name" projects. We are talking about the unsexy, not-front-page projects. The things at risk are the ones without thousands of eyes looking at them - the ones with just dozens, or a handful, of eyes. Projects that make up stuff in the buildchain. Projects like filesystem drivers. Projects like device drivers. Compilers. Linkers. All of them would have to be validated and audited, for each change, for each version, on each platform. A malicious patch anywhere along the way can lead to a trojan. Even code that otherwise looks good could be poisoned. A single unchecked buffer. A single small, simple-looking error - big consequences.

    But really you're talking about the same exact problem for closed source. In that 5-person dev team with 2 QA people in that 20-person company, how closely are they checking the source for the type of stealth errors in question? Do they personally check the thousands or millions of lines of code in the build chain? Once a guy is hired, how much contact does he have with his boss? Does his boss even understand the code the "honest worker" is submitting?

    I'm NOT saying that open source is "safe." I'm saying that closed source is every bit as unsafe, but with the added hindrance of far fewer people having the ability to even look at the source if something goes wrong.

    TW
  • by j1m+5n0w ( 749199 ) on Tuesday July 27, 2004 @12:12PM (#9812576) Homepage Journal
    The auditors would have to audit every bit of the toolchain, the compiler and linker, and the rest of the system to be able to successfully rely on the code audit.

    Even that might be insufficient. Ken Thompson showed in his '84 Turing Award lecture how a compiler trojan can exist even if the backdoor is not present in the compiler's source code.

    Here's the link [acm.org].
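
    A compressed toy version of the trick (invented names; Thompson's real version patched the generated binary rather than the text): the payload lives only in the compiler binary, which re-inserts it whenever it recognizes that it is compiling itself, so rebuilding from fully audited source doesn't remove it.

    #include <stdio.h>
    #include <string.h>

    /* Trojaned "compiler": backdoors login, and re-injects its own
     * detection logic when fed the (clean) compiler source. */
    static void trojaned_compile(const char *src, FILE *out)
    {
        fputs(src, out);
        if (strstr(src, "check_password"))
            fputs("/* injected login backdoor */\n", out);
        else if (strstr(src, "trojaned_compile"))  /* compiling itself? */
            fputs("/* injected self-propagation logic */\n", out);
    }

    int main(void)
    {
        /* Even a clean compiler source comes out trojaned. */
        trojaned_compile("static void trojaned_compile(const char *, FILE *);\n", stdout);
        return 0;
    }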

  • by hawkeyeMI ( 412577 ) <brock&brocktice,com> on Tuesday July 27, 2004 @12:14PM (#9812601) Homepage
    How much of the time that equipment is being used do you think it is in actual battle situations as opposed to practice/training? Aside from initial testing, equipment is used in a variety of places and scenarios not involving engagement with a real enemy, including war games, before being used in an actual battle.

    Additionally, what you specified is more complex than what was originally posited, and seems more likely to be detected in code. There are a lot of other issues, but these are covered by the many other threads on this story.

  • by Arakonfap ( 454732 ) on Tuesday July 27, 2004 @12:19PM (#9812651)
    What does this have to do with OpenSource? Specifically, what problem does it present that is not shared with ANY software?

    If we're going to compare Linux and Microsoft - have you noticed all the backdoors and trojans and worms out there lately? These security holes that are being used were not (likely) put in there on purpose - they're simple mistakes. Left by coders, in a CLOSED SOURCE company. There's no need to go to open source to get bad code!

    Are you a coder? Have you participated in, or run, an open source project? In such a thing, you don't just accept code. You read it through, test it, and make sure it will do the job before compiling it.

    Sure, you don't do a background check on every person submitting code - but you DO make sure the code does what it needs to, and that it does not have blatant bugs in it. At least the code that's submitted this way gets CHECKED AND APPROVED. That's not the case in all closed source companies - they don't all review code changes!

    Has there been a background/security check on everyone that submitted code for use in NT4 SP6 that's used in high security defense systems? No. It's just not possible.

    At least with embedded systems that are open source, the code can all be checked, because it's small enough to be a fully solvable and testable system.

    And what does information stealing have to do with pure source code anyway?? Most "hacks" like that are social engineering (email attachments, "CoOL CURSoRs! HeRE!"), etc. High security organizations use reverse firewalls to make sure nothing important is going out!
  • by Jim_Maryland ( 718224 ) on Tuesday July 27, 2004 @12:20PM (#9812669)
    Security organizations do not generally use the latest available releases of open or closed source software, largely because the software must be reviewed. Commercial entities are more likely to keep up with frequent software updates and not perform the software audits that you find in secure organizations.

    Software is reviewed carefully before being allowed onto secured networks. Just ask anyone who works on projects where a deadline is missed due to security reviews.
  • by betelgeuse-4 ( 745816 ) on Tuesday July 27, 2004 @12:23PM (#9812695) Homepage Journal
    Calling coding an NP problem seems a little odd. It's not a decision problem really and some aspects are trivial whereas others may have no solution (e.g. the halting problem).
  • by It doesn't come easy ( 695416 ) on Tuesday July 27, 2004 @12:23PM (#9812700) Journal

    What a bizarre article.

    The statement "Yet, despite the "many eyes," new security vulnerabilities are found in Linux every week in addition to dozens of other bugs." Shouldn't one consider that the "many eyes" are the developers finding those weekly bugs? Wonder how many eyes are looking for Green Hills software bugs?

    As long as people are involved, mistakes (bugs) will be made. But saying that malicious code is more likely in a product where someone CAN examine the code versus a product where no one can is just plain stupid. There is obviously an undisclosed agenda here (might that be selling a DO-178B Level A rated real time OS, aka Integrity? Getting a lot of Linux competition, eh?).

    As to the standard DO-178B... the first 90% of the article is about security, and then you mention DO-178B. DO-178B is not a security standard. DO-178B is an FAA safety-related standard for software. Any software certified under DO-178B can still be full of unknown security holes. The standard may be required for software used in flight related applications, but it does not mean the software is also secure.

    The level A rating doesn't even mean "most secure" as the article seems to imply. It means that if the software crashes, it will not affect other software that is running. In other words, the software is ISOLATED, not secure.

    It is amazing the things companies will say when they are losing ground to a competitor.

  • by gurps_npc ( 621217 ) on Tuesday July 27, 2004 @12:26PM (#9812743) Homepage
    A lot of people are trying to defend him saying "The code could be modified by evil people who put an Evil Easter Egg in it."

    STUPID ARGUMENT

    The code does not have to be modified by evil people. ALL CODE HAS BUGS. So all code should be checked, and not just by the people who write it. The entire point of Open Source is that LOTS of people check the code for bugs. The difference between an "Evil Easter Egg" and a "bug" is just the intent of the programmer. Open source is MORE likely to catch an "evil Easter Egg/bug" than a closed source process. Having a spy try to sneak one in is ridiculous because the same bug detection routine will also detect the evil Easter eggs.

    But closed source DOES have one advantage over Open Source: secrecy. The problems with Open Source defense programs are 1) "they" know exactly how good our programs are, and 2) "they" can use them themselves.

    Because of these two things it is not a good idea for most Defense purposes. We want the bad guy to NOT know how good our stuff is and we do NOT want them to have the same quality stuff.

  • by dstone ( 191334 ) on Tuesday July 27, 2004 @12:27PM (#9812764) Homepage
    ...then it's made of nice, transparent Plexiglas.
  • by zogger ( 617870 ) on Tuesday July 27, 2004 @12:38PM (#9812870) Homepage Journal
    ...and defense related places [dailytexanonline.com] DON'T hire foreign nationals or domestic nationals with perhaps a bent for the blackhat side? This never happens? And everyone in government itself is sweet [whatreallyhappened.com] and pure [infowars.com] as the mountain streams, and would never think of doing anything...strange [cnn.com]... for some financial remuneration [fas.org] off the books? This never happens either? And so called "allied and friendly" [rense.com] governments don't run spooks inside our establishment and sleepers inside our citizenry? And they *always* have our best interests [mises.org] at heart? [blackboxvoting.com]



    Nope. Open source is still the best way to go, along with open government. When you let people hide "stuff", and when it's connected to massive political power and heaps of money, that's when crimes occur. The best bet is openness, bar none. It is not perfect, but it's the best design yet.

  • by Anonymous Coward on Tuesday July 27, 2004 @12:38PM (#9812873)
    Because Linux has millions of lines of code. GHS's operating system has about 10k lines, and you get a copy of the source, which you can rebuild from scratch, when you buy it. And those 10k lines of source code also come pre-documented and certified.

    I've worked for defense contractors using GHS's operating system. I'd much rather read and certify just my own application code than 1M lines of Linux _plus_ my own application code.
  • by Kagato ( 116051 ) on Tuesday July 27, 2004 @12:38PM (#9812874)
    Sure, there is a threat in the Open Source movement. But how does that threat compare to offshoring? I don't think they are any different. Yet when a threat is something that enhances the bottom line, security concerns are not raised.
  • by MarkusQ ( 450076 ) on Tuesday July 27, 2004 @12:53PM (#9813061) Journal

    What this article is claiming -- and everybody seems to be ignoring -- is that open source, being a wild system with no accountability (liability) nor authority, is more prone to dangerous bugs and backdoors that closed software developers don't have to worry about. This is the key -- not that hacker X is going to put a backdoor in that won't get caught by peers, but that hacker X's identity and location are completely unknown (not true for employees of a closed software developer) and that there is nobody whose lifestyle is in jeopardy should such a backdoor be found.

    No, they aren't ignoring it, they are denying it. Because it's bunk.

    1. Accountability has very little to do with preventing problems and everything to do with placing blame after they happen.

    2. Open source has more loci of "authority" than closed source, chained in a check-and-balance system that greatly improves their effectiveness. And the capstone is, I get to make an informed choice about what I run on my boxes.

    3. It isn't true that closed source developers don't have to worry about backdoors, but it may well be true that they believe they don't. There have been many cases of backdoors in popular closed source programs (remember "Netscape engineers are weenies"?), but can you name one backdoor that made it into a widely used open source product?

    4. As far as knowing the identity of the person who supplied a patch, this is just plain nuts. I can (and have) easily tracked down the person who wrote/submitted a patch to an open source product, and the person who accepted it -- often, I can have an e-mail winging its way to them in minutes. But I can't recall ever learning the identity of a programmer who made a change in a closed source product, or even being offered the means to.

    -- MarkusQ
  • by GCP ( 122438 ) on Tuesday July 27, 2004 @12:54PM (#9813070)
    do we even need another comment on this story?

    Yes, of course, because the fact that open source has some advantages doesn't negate the risk pointed out in the article. It just means that there are risks both ways.

    ANY piece of software that you run on a secure system has the potential of subverting the system. I think open source does create the illusion that it couldn't contain hidden malware because where could it hide in open source, right? Well, anyone who has ever seen the entries in an obfuscated C contest and wondered what that code could possibly do ought to be able to see the flaw in that argument. For that matter, anyone who has ever gone over and over HIS OWN CODE looking for a bug and not finding it ought to ask himself, what if it weren't even my own code and I didn't even know that a bug existed?

    Closed source is even worse in this respect, though, but at least we know who wrote it, right?

    Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.

  • Naive? (Score:3, Insightful)

    by Mirkon ( 618432 ) <mirkon.gmail@com> on Tuesday July 27, 2004 @12:54PM (#9813077) Homepage
    ...unfriendly countries will attempt to hide intentional bugs...

    s/countries/Microsoft/g;
  • by at_kernel_99 ( 659988 ) on Tuesday July 27, 2004 @01:12PM (#9813276) Homepage
    Does nobody remember NSA Secure Linux?

    When one is ranting in a desperate plea to defend one's own methodology & existence, it is often helpful to ignore facts that do not support one's case.

  • by surprise_audit ( 575743 ) on Tuesday July 27, 2004 @01:20PM (#9813351)
    Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.

    And then, of course, there's the question of outsourcing to a foreign country... Who's to say that the programming companies in India (or anywhere else) don't have some disgruntled employees among them? And I don't mean disgruntled with their own bosses, so much as disgruntled at the rich western nations that take advantage of the poorer eastern nations.

    How is Open Source different from that? Oh yeah, you pay more for outsourcing...

  • by at_kernel_99 ( 659988 ) on Tuesday July 27, 2004 @01:22PM (#9813376) Homepage

    Closed source is even worse in this respect, though, but at least we know who wrote it, right?

    Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.

    Exactly. The author also helpfully ignores the backgrounds - often unknown to the end user - of the developers of closed source software. I've been in this industry for 12 years and have worked in one place (out of 7) that did not have a foreign national on the team. Nationalities have included Chinese, Vietnamese, Russian, Indian, Pakistani and Syrian.

    The only point the author made that I could agree with would be that all software used for the military/intelligence communities should be thoroughly tested & certified to a high standard of security. I doubt there are many that would disagree with this statement. The problem is the author is hiding this valid argument beneath a layer of FUD intended solely to harm Linux & support the proprietary development model his company has chosen. He uses fear & stereotypes to paint the opposition without explaining what his company is doing that will solve the problem in a way that open source cannot.

  • by digital photo ( 635872 ) on Tuesday July 27, 2004 @01:26PM (#9813405) Homepage Journal

    I find it interesting that open source software is considered a risk because individuals from other nations are allowed to participate in the development of the code...

    How does this differ from corporations which provide software to the military who outsource their development to individuals from other nations?

    The only difference is that the OSS model involves corporations giving up some of their control over the rights of the product and corporations don't like that.

    Otherwise, the article assumes differences between OSS remote participation and outsourcing that have no material relevance.

    The idea of outsourcing being more secure because security checks are done can be argued, but even security checks fail and someone who is cleared can decide to sabotage. The problem is that once someone is vetted, they are trusted. This is actually worse than the OSS model where no matter who you are, the code is reviewed with the same level of scrutiny as anyone else's code.

    I can think of so many instances of calling support, having to provide my personal identifying information to an individual who was either not in my state or not even in the US.

    Sounds more like a double standard of judgement from the corporate viewpoint that is prejudiced against OSS projects.

  • by RoLi ( 141856 ) on Tuesday July 27, 2004 @01:30PM (#9813439)
    I think open source does create the illusion that it couldn't contain hidden malware because where could it hide in open source, right?

    There are numerous examples of malware (like those in Kazaa), easter-eggs (like the flight simulator in Excel and the pinball game in MS Word) and unrequested features (like Windows Product Activation) in proprietary software.

    While I agree that something similar could in theory also happen to some one-man or very small open source projects (in theory, because I have never heard of any such occurrence), there is absolutely no way such code could be smuggled into bigger projects like Linux, Apache, KDE or the like; there are just too many people watching.

    So if you compare exactly what the article is talking about, proprietary software has a much worse track record.

  • by Allen Zadr ( 767458 ) * <Allen.Zadr@nOspaM.gmail.com> on Tuesday July 27, 2004 @01:32PM (#9813457) Journal
    Uh, remember Phil Zimmermann [philzimmermann.com]?

    I think geeks have been DoJ targets for some time.

  • by x0n ( 120596 ) on Tuesday July 27, 2004 @01:35PM (#9813484) Homepage Journal
    LOL, come on. There's pride, and then there's hubris. It's too good to export - almost as bad a PR job as PS2s being "military grade" hardware. Pfff. Secondly, you talk like all OSS software is developed in the U.S., and is thus a possibly "controlled" export. Not true.

    In any case, the military will always offer the work out to private contractors. These contractors would not be allowed to use GPL licensed s/w anyhow; imagine icbmguidance.sourceforge.net - no? Neither can I.

    - Oisin

  • by stealth.c ( 724419 ) on Tuesday July 27, 2004 @01:37PM (#9813499)
    he is terminally paranoid. I understand that he has a vested interest in FUDing FOSS, but let's attack his argument for a second:

    First of all, what truly important piece of software would possibly be part of open public development? I thought this was specialized enough of a field that the only people who had any competence with what you were making were already trusted anyway. Wasn't SELinux developed *inside the NSA* before it was released?

    Secondly, assuming a vital piece of software WERE being developed publicly, someone trying to insert malicious code would have to make it past a few barriers, the first being the most complicated. He would have to: 1) Know what his deliberately inferior code would probably do in the finished product versus what a non-criminal would want it to do, 2) Get it past the critical eye of a few other developers, and 3) Slip through some kind of government screening. And all the while NOT make anyone suspicious.

    And even then the results are not guaranteed. What is your cyberterrorist counting on? I sincerely doubt that he could have snuck a back door into the code given all those hoops. I don't think the deliberate bug can be both significant and unknown at the same time. Is he hoping that his bug will cause the software to make a slight miscalculation? Whoopty shit. Whatever agency he or she is working against will be annoyed for a little while and then fix the problem.

    Even if his deliberate bug caused a catastrophic failure, it can and will be traced back to HIS contribution. And if some terrorist group stands up and says "Ha ha! Look what we did! And here's why!" (and if it's Al-Qaeda, we can be almost certain of this), that man is immediately under FBI surveillance and probably under arrest.

    In any case, inserting a bug would be a lot of work: a lot of work for an uncertain return, where success means almost inevitable detection.

    Why some terrorist would bother with this approach is beyond me. It's so much easier just to fill a truck with dynamite.
  • by Pieroxy ( 222434 ) on Tuesday July 27, 2004 @01:43PM (#9813557) Homepage
    Well, I think that's yet another illusion. Think of disgruntled employees being paid by Bad Guys to insert a bit of code... You may trust the company that made your software, but how can you possibly trust every one of its employees? And once the code is in, since it's trusted, it could sit there for years.

    It's even worse with all the offshoring going on. I mean, people from halfway around the globe write your code. There is no way in hell you can trust them with national security!
  • by Godai ( 104143 ) * on Tuesday July 27, 2004 @02:14PM (#9813931)

    ...have to be low. I mean, let's go over the steps such an alleged terrorist would have to go through to get this crucial system bug into the kernel:

    1. craft a bug capable of granting someone in the know root access
    2. craft the code containing the bug in such a way that it would be accepted by Linus. This would require:
      • a plausible reason for the patch, i.e. a feature addition or bug fix
      • the 'secret bug' embedded in code that genuinely appears to add the alleged feature or fix the supposed bug
      • Linus (or whoever) missing the bug, so it would have to be fairly subtle
    3. hope that no one, be it the hardware manufacturer or anyone else using the OS, detects the bug before the military hardware goes live in the field (or wherever)
    4. ???
    5. Profit! (Terrorize?)

    That's a pretty obscure set of circumstances. Does that mean it can't happen? No. But contrast it with the proprietary methodology, where a coder usually has unrestricted access to the code base. Hmmm. Sounds more plausible there!

    Of course, the key thing to note here is that anyone who has to dredge up the dreaded formula terrorism + open source == Disaster!!! is probably desperate to save his flagging business.

  • by stwrtpj ( 518864 ) on Tuesday July 27, 2004 @02:20PM (#9814015) Journal
    So that means that we can document that 7 security-trained people or outside organizations have looked at any code that is declared "Evaluated"...
    To say that the Linux code is locked down and tested is to say that the barn door is locked too late in the process for the kinds of things the author of this posting cites as potential problems.

    So what's stopping the DoD from taking the source code base and doing their own testing and certification on it? Considering you claim to have a background in this, I'm surprised you didn't think of that. It may even save them time in the long run, since they don't have to go through the effort of developing the software themselves.

    If I decide to use a library or module from another developer (OSS or otherwise) in something I am doing, I always take the time to test it, to make sure it at least does what I want and is adequate for the task at hand. Now, my own projects don't require much in the way of security, but if they did, I would certainly test in that area as well.

    So I just don't get your point. You don't have to develop the code yourself in order to certify it, as long as you have the full source available to you. Once you have certified it, after making any corrections you need on your copy of the source, you lock THAT down. What came out of the original source base is irrelevant at that point; all that matters is what you improved upon and certified.
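    To make "lock THAT down" concrete, here is a minimal sketch of the idea: record a digest for every file you certified, and refuse to accept anything that has drifted from the audited baseline. Everything here is invented for illustration, and the toy checksum is only there to keep the sketch self-contained; a real certification pipeline would use a cryptographic hash and signed manifests:

        #include <stdio.h>
        #include <stdlib.h>

        /* verify.c: compare a file against the checksum recorded when the
         * file was certified.  Toy checksum for illustration only; use a
         * cryptographic hash in anything real. */
        static unsigned long checksum(FILE *f)
        {
            unsigned long sum = 0;
            int c;
            while ((c = fgetc(f)) != EOF)
                sum = sum * 31 + (unsigned char)c;
            return sum;
        }

        int main(int argc, char **argv)
        {
            if (argc != 3) {
                fprintf(stderr, "usage: %s <file> <expected-hex-checksum>\n", argv[0]);
                return 2;
            }
            FILE *f = fopen(argv[1], "rb");
            if (!f) {
                perror(argv[1]);
                return 2;
            }
            unsigned long got = checksum(f);
            fclose(f);
            unsigned long want = strtoul(argv[2], NULL, 16);
            if (got != want) {
                fprintf(stderr, "%s: checksum %lx != certified %lx\n",
                        argv[1], got, want);
                return 1;   /* refuse: the audited baseline changed */
            }
            return 0;       /* file still matches what was certified */
        }

    Run something like this over every source file before each build; a nonzero exit means somebody touched code you already certified.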

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday July 27, 2004 @02:44PM (#9814325) Journal
    Furthermore, you have to pay for the source code to proprietary software, if you can even get it at all. That forces you into an agreement with the proprietary vendor.

    If you're the government, of course you'll review the code and try to pull out dangerous things. With open source, this means hiring a bunch of programmers, and possibly one lawyer to go over the GPL. With proprietary software, this means hiring a bunch of programmers to make sure the code is clean, hiring a bunch of lawyers to make sure the license is clean, and paying for the code itself.

    Furthermore, the community will catch quite a lot of bugs that a team of 10 or 20 programmers from the NSA might never see, simply because there are more of us.

    So I would say to the author that open source is less of a threat to national security than closed, and that furthermore, open source costs less. If you want to compete, stop insulting our intelligence and start changing your business model. Maybe a support / code review company, if you're so worried about national security? "Pay us, and we'll make sure there are no easter eggs in this source"?
  • by wintermute42 ( 710554 ) on Tuesday July 27, 2004 @03:00PM (#9814546) Homepage

    A number of postings have done an excellent job of describing why open source is not a security threat and, in fact, can be more secure, because the source code itself can be audited, signed, and so on.

    This is a long discussion and I may have missed it, but I have not seen any mention of why Green Hills might be motivated to claim that open source is a security threat.

    Green Hills sells proprietary compilers for embedded systems. They also sell real-time operating system software. Their business model is under threat from open source, especially in the area of real-time embedded operating systems. With compilers they can at least show that, for a given embedded processor, their compiler produces better code (not that hard to do against GCC on a number of architectures). But with real-time operating systems (RTOSes) they have less of an edge: Linux is becoming more and more widely adopted, and increasing numbers of people have experience with it.

    Green Hills, one can speculate, fears a serious erosion of its RTOS business, which is probably bigger than its compiler business (few people make much money on compilers these days). Taking a page from Microsoft and SCO, they are attacking Linux with FUD. If this also helps their compiler business ("Who knows what trojans open source compilers might generate?"), all the better.
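    As an aside, that parenthetical jab about trojan compilers is older than Linux: Ken Thompson sketched exactly this attack in his 1984 Turing Award lecture, "Reflections on Trusting Trust", and it applies to a proprietary compiler at least as much as to GCC. A toy illustration of the idea, with the function name, payload, and password all invented for the sketch (no real compiler works this crudely):

        #include <stdio.h>
        #include <string.h>

        /* Toy "trusting trust" compiler pass: copies source through
         * unchanged, except that when it spots the authentication
         * routine it injects a master-password check.  Everything here
         * (check_password, "letmein") is hypothetical. */
        static const char *payload =
            "    if (strcmp(pw, \"letmein\") == 0) return 1; /* injected */\n";

        static void emit_line(const char *line, FILE *out)
        {
            fputs(line, out);
            if (strstr(line, "int check_password("))
                fputs(payload, out);
            /* Thompson's full attack also recognizes the compiler's own
             * source and re-inserts this logic, so the trojan survives a
             * rebuild of the compiler from pristine sources. */
        }

        int main(void)
        {
            char buf[1024];
            while (fgets(buf, sizeof buf, stdin))
                emit_line(buf, stdout);
            return 0;
        }

    Thompson's conclusion was that you ultimately have to trust the people and the toolchain; that's an argument about provenance, not about whether the source happens to be open or closed.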

  • by Anonymous Coward on Tuesday July 27, 2004 @03:20PM (#9814764)
    If I were a foreign adversary of the United States and I wanted to exploit software to gain a strategic advantage, I would not go after F/OSS. There would be too many eyes, too many curious geeks with nothing better to do than inspect my work, and too many other routes for my target to obtain source code other than what I had tainted. Rather, I would worm my way into a company selling proprietary software to government and industry. There are fewer eyes, so I would have fewer people to "manage". Influence a developer, one person in QA, and perhaps a secretary, and I could have almost anything I want put in place and distributed across the planet, with the DMCA helping to cover my tracks.

    The KGB said: go for the secretaries. The secretary holds the keys and the trust of others, and can go where s/he wishes. Monolithic organizations are vulnerable precisely because they have few internal firewalls; if you *can* get in, you become part of the trusted architecture, and then there is no mechanism to get you out.
  • Embedded systems vendors are moving to Linux to keep up with the changing hardware and needs of their customers.

    Does this address the needs of all embedded systems users? Of course not. I can see that in really high-security fields you need 100% control of things. Critical embedded devices in power plants come to mind: you may be replacing a 10-year-old device and need to ensure exact compatibility.

    However, these are outlying cases, and they will strain almost ANY OS group to satisfy (especially as the group still needs to move its own technology forward as well).

    I would say that the OSS route in that case may actually give you better security, as long as you archive both the code and the software used to build the code (including the OS, and the hardware too if necessary). If your requirements really are that strict, then you will either have to have complete control of the code you rely on, or have escrow agreements ensuring you can obtain the code if your vendor goes out of business.
  • by T-Ranger ( 10520 ) <jeffw@NoSPAm.chebucto.ns.ca> on Tuesday July 27, 2004 @04:46PM (#9815653) Homepage
    They may be scared of it, but they do SFA when it happens.

    Proof: the "friendly fire" incident in Afghanistan where Canadian soldiers were killed at the hands of a US pilot. I put "friendly fire" in quotes because to me it doesn't even qualify as that. I'm prepared to accept combat deaths, even friendly-fire ones. Anyway: the Canadians were on a known live-fire range (and the Americans knew, even if that information didn't make it to the pilot and air controllers). The pilot reported fire and was specifically told to hold fire. The fire was coming from a range well known to be used by friendly ground forces, including US forces. He fired anyway, killing 4 and wounding a dozen others. He claimed he was "under attack", but how a machine gun could threaten an F-18 at 20,000 ft, I don't know.

    Final result: loss of half a month's pay for 2 months and a letter of reprimand. In other words: exactly nothing.

    If the US DoD were concerned about friendly fire, a minimum of 3 people would have been, if not incarcerated, then discharged from the service: the pilot, the person (or persons) who failed to provide sufficient briefings, and the commanders who let it happen.

UNIX was not designed to stop you from doing stupid things, because that would also stop you from doing clever things. -- Doug Gwyn
