
Open Source a National Security Threat 921

n3xup writes "Dan O'Dowd, CEO of Green Hills Software, suggests that open source software is susceptible to sabotage by foreign developers and should not be used for U.S. military or security purposes. He likened Linux to a Trojan horse: free, but in the end a lot of trouble. O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding."
  • by stecoop ( 759508 ) on Tuesday July 27, 2004 @09:51AM (#9811405) Journal
    Understand the source's perspective before you form an opinion. Green Hills is under threat from Linux because embedded Linux is being integrated into more government systems, and Green Hills is (was?) a large player in government embedded operating systems. I imagine you will see a similar stance from Wind River, maker of the popular real-time embedded OS VxWorks.

    The threat comes from the length of time some large government projects run. Some systems have been around longer than you and me. In the proprietary world, your whole project depends on a set of companies staying in business for 30+ years. With Linux, you're no longer dependent on that chain: you can lean on the community for updates or, if necessary, make the changes yourself as the developer. Most people fail to mention this about Linux; everyone just says hey, it's free and cheap. But if you really want to sell Linux, try saying that your entire project doesn't hang on someone else's proprietary solution because you will have the source code in hand - people will listen.

    It's easy to rebut Green Hills' FUD by saying all changes will be baselined and a change control board will review any updates (easy enough, huh?).
    • by proj_2501 ( 78149 ) <mkb@ele.uri.edu> on Tuesday July 27, 2004 @09:53AM (#9811431) Journal
      do we even need another comment on this story?
      • No, but that's never stopped slashdot before :)
      • Supporting comment (Score:5, Interesting)

        by Allen Zadr ( 767458 ) * <Allen@Zadr.gmail@com> on Tuesday July 27, 2004 @11:25AM (#9812732) Journal
        Here's a supporting comment...

        Just as the parent post suggested. Except the government is already auditing open source and customizing the Linux kernel to its own needs... Does nobody remember NSA Secure Linux [nsa.gov]?

      • by GCP ( 122438 ) on Tuesday July 27, 2004 @11:54AM (#9813070)
        do we even need another comment on this story?

        Yes, of course, because the fact that open source has some advantages doesn't negate the risk pointed out in the article. It just means that there are risks both ways.

        ANY piece of software that you run on a secure system has the potential of subverting the system. I think open source does create the illusion that it couldn't contain hidden malware because where could it hide in open source, right? Well, anyone who has ever seen the entries in an obfuscated C contest and wondered what that code could possibly do ought to be able to see the flaw in that argument. For that matter, anyone who has ever gone over and over HIS OWN CODE looking for a bug and not finding it ought to ask himself, what if it weren't even my own code and I didn't even know that a bug existed?

        Closed source is even worse in this respect, but at least we know who wrote it, right?

        Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.
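
        To make that concrete, here is a minimal hypothetical sketch (invented for illustration, not drawn from any real project) of how a subtle "mistake" hides in plain sight:

            #include <string.h>

            #define MAXLEN 64

            /* Hypothetical packet handler. The length check looks careful, but
             * `len` is signed: a negative length sails past `len > MAXLEN`, then
             * converts to a huge unsigned size_t inside memcpy and overflows buf.
             * A reviewer can read this a dozen times and see nothing wrong. */
            int handle_packet(const char *payload, int len)
            {
                char buf[MAXLEN];

                if (len > MAXLEN)           /* should also reject len < 0 */
                    return -1;
                memcpy(buf, payload, len);  /* implicit conversion: negative len becomes enormous */
                /* ... process buf ... */
                return 0;
            }

        Whether planted or accidental, nothing about that function looks malicious - which is the reviewer's problem in a nutshell.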

        • by at_kernel_99 ( 659988 ) on Tuesday July 27, 2004 @12:22PM (#9813376) Homepage

          Closed source is even worse in this respect, but at least we know who wrote it, right?

          Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.

          Exactly. The author also helpfully ignores the backgrounds - often unknown to the end user - of the developers of closed source software. I've been in this industry for 12 years and have worked in exactly one place (out of 7) that did not have a foreign national on the team. Nationalities have included Chinese, Vietnamese, Russian, Indian, Pakistani and Syrian.

          The only point the author made that I could agree with is that all software used by the military/intelligence communities should be thoroughly tested and certified to a high standard of security. I doubt there are many who would disagree with that statement. The problem is that the author hides this valid argument beneath a layer of FUD intended solely to harm Linux and support the proprietary development model his company has chosen. He uses fear and stereotypes to paint the opposition without explaining what his company is doing that will solve the problem in a way that open source cannot.

        • by Dashing Leech ( 688077 ) on Tuesday July 27, 2004 @12:29PM (#9813426)
          Closed source is even worse in this respect...

          Absolutely. And to anyone who cares to argue that proprietary companies are more strict in reviewing their own code: please explain the abundance of easter eggs [eggheaven2000.com] in proprietary software.

        • by RoLi ( 141856 ) on Tuesday July 27, 2004 @12:30PM (#9813439)
          I think open source does create the illusion that it couldn't contain hidden malware because where could it hide in open source, right?

          There are numerous examples of malware (like those in Kazaa), easter-eggs (like the flight simulator in Excel and the pinball game in MS Word) and unrequested features (like Windows Product Activation) in proprietary software.

          While I agree that something similar could in theory also happen to some one-man or very small open source projects (in theory, because I have never heard of any such occurrence), there is absolutely no way such code could be smuggled into bigger projects like Linux, Apache, KDE or the like; there are just too many people watching.

          So if you compare exactly what the article is talking about, proprietary software has a much worse track record.

    • Comment removed (Score:4, Insightful)

      by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @10:00AM (#9811556)
      Comment removed based on user account deletion
      • by Coz ( 178857 ) on Tuesday July 27, 2004 @10:06AM (#9811653) Homepage Journal
        You have one set of experts write the code under an F/OSS license, another set of experts examine the code, test cases, and test results.

        Believe me, if you're talking about something like gunnery firmware, they're going to test it... the deepest fear in DoD these days is friendly fire.
        • the deepest fear in DoD these days is friendly fire.

          No it isn't - if it were, they would have burned all the Patriot systems already. The biggest fear in DoD is making sure their pet contractors stay on the payroll so that they keep getting their kickbacks.
          • by duffbeer703 ( 177751 ) on Tuesday July 27, 2004 @10:51AM (#9812332)
            Friendly fire is a fact of life for surface-based anti-aircraft weaponry. Creating a spoof-proof identification-friend-or-foe system is a non-trivial problem.

            That's one of the reasons why the US has always focused on fighter aircraft at the expense of anti-air artillery and SAM systems.
          • by Rei ( 128717 ) on Tuesday July 27, 2004 @10:51AM (#9812334) Homepage
            I think the DoD's biggest fear concerning OSS is not that the software is too insecure, but that it is *too good* for something publicly available. If other countries can get all of the tools they need for a weapon apart from, say, a specific 1000-line guidance or control program, and can make any changes to the tools that they need, that gives them a *major* bonus. Let's not forget how hard our government has worked to stop the export of technology in general - including software - to countries deemed "enemies".
            • by Dr_Marvin_Monroe ( 550052 ) on Tuesday July 27, 2004 @12:53PM (#9813673)
              Let us not forget that WE'VE been planting trojans in software shipped overseas too. I recall a story here about deliberately sabotaged software shipped to a Russian pipeline project. As I recall, the trojaned pipeline control software was designed to operate the pipeline at 10x normal pressure and cause an explosion... which it duly did, setting back the Russian government's energy plans.

              When other governments start using OSS, they may be freeing themselves of these US-planted trojans. I believe THAT is the major fear of the US government... not that they will fail to detect a foreign-planted bug in some fighter jet, but that OUR planted bugs will be found by China/India/Pakistan/Iran/etc... This would also seem to explain our government's looking the other way on the Microsoft settlement. Remember that the antitrust settlement was made within a week or so of September 11. Remember also the "Magic Lantern" project, where our government was actively looking for ways to co-opt people's boxes.

              Software that cannot be easily trojaned creates just one more difficulty for our spy agencies. As with the gangster who was using pretty strong encryption, the government is now forced to use things like hardware keystroke loggers (meaning they have to have physical access to the machine), sneak-and-peek searches, you get the idea.

              The US government has an interest in keeping people on insecure systems. How easy do you think it was to open those Windows laptops captured in Afghanistan? Why, the NSA had that famous "_NSAKEY" entry in Windows!... Easy as pie. The last thing they want is for KSM and OBL to start putting strongly encrypted filesystems on their Linux laptops in Afghanistan. No way to plant the backdoor!

              Expect to see a lot more of this type of FUD... The US Government has plenty of time and money to make sure that their Linux systems are safe, they just don't want others using them...
      • by Total_Wimp ( 564548 ) on Tuesday July 27, 2004 @10:07AM (#9811660)
        Can you honestly tell me that the government is going to hire a panel of people to check in-depth source changes on OSS projects?

        More to the point, will they do this with closed source projects? Getting a mole into Green Hills Software, Microsoft, etc. is every bit as real a threat as getting one into any open source project. In many cases it might even be easier, because of the lax hiring practices and oversight at small defense companies.

        TW
        • >In many cases it might even be easier, because of the lax hiring practices and oversight at small defense companies.

          What hiring practices does Linux have?

          Defense companies have to go through a certain amount of security and background checks to win a contract.
          • by Anonymous Coward on Tuesday July 27, 2004 @10:19AM (#9811855)
            What hiring practices does Linux have?

            Doesn't matter one jot. Gee, look, there's the source code. Every bug, hole and trojan horse, just waiting for you to find them. All you have to do is audit the code. You should be auditing the code of any product you're going to use in a sensitive environment anyway, whether it's closed or open source. Where's the difference?
            • Comment removed (Score:4, Insightful)

              by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @10:31AM (#9812015)
              Comment removed based on user account deletion
              • Oh really (Score:5, Insightful)

                by TheConfusedOne ( 442158 ) <the.confused.one ... m ['il.' in gap]> on Tuesday July 27, 2004 @11:01AM (#9812459) Journal
                It's possible, and it HAS happened, that KNOWN and TRUSTED engineers have put in bits of code that would pass initial scrutiny and still be dangerous.

                Wasn't there recently an article about a router with a backdoor shipped out in its code? How about all those darn "easter eggs" floating around in Windows and Office and other programs?

                I would challenge you to compile a new Intel C library using a Microsoft C compiler from 6 years ago too. Heck, compile glibc using an IRIX compiler from six years ago.

                You can drag out all the scenarios you want; whether it's Linux or *nix or BSD or Windows, you're going to have the same audit challenges - and with closed source you won't even have access to the code without negotiating with all your suppliers.
              • by j1m+5n0w ( 749199 ) on Tuesday July 27, 2004 @11:12AM (#9812576) Homepage Journal
                The auditors would have to audit every bit of the toolchain - the compiler, the linker - and the rest of the system to be able to rely on the code audit.

                Even that might be insufficient. Ken Thompson showed in his 1984 Turing Award lecture, "Reflections on Trusting Trust," how a compiler trojan can persist even when the backdoor is not present in the compiler's source code.

                Here's the link [acm.org].
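
                In outline, the trick looks something like this (a hypothetical sketch paraphrasing the lecture; every name here is invented, and a real version would match on parsed structure rather than anything so obvious):

                    struct ast;  /* opaque parse-tree node (hypothetical) */

                    /* Hypothetical helpers, declared only to keep the sketch self-contained. */
                    int  looks_like_login_check(const struct ast *fn);
                    int  looks_like_compiler_codegen(const struct ast *fn);
                    void emit_magic_password_backdoor(const struct ast *fn);
                    void emit_this_recognizer(const struct ast *fn);
                    void emit_normal_code(const struct ast *fn);

                    /* One pass of a trojaned compiler's back end. Stage 1: when
                     * compiling login(1)'s password check, also emit acceptance of a
                     * magic password that appears in no source file. Stage 2: when
                     * compiling the compiler itself, re-emit this whole recognizer,
                     * so the trojan survives rebuilding a clean compiler source tree. */
                    void compile_function(const struct ast *fn)
                    {
                        if (looks_like_login_check(fn))
                            emit_magic_password_backdoor(fn);   /* stage 1 */
                        else if (looks_like_compiler_codegen(fn))
                            emit_this_recognizer(fn);           /* stage 2 */
                        emit_normal_code(fn);
                    }

                The punch line of the lecture: auditing the compiler's source tells you nothing, because the binary is where the trojan lives.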

          • LOL! You're telling me that you think the government will replace "defense companies [that] have to go through a certain amount of security and background checks" with unverified open source software? Please... if security matters *that* much, the government will either 1) continue to hire secure defense companies or 2) hire a secure defense company to verify the safety of any open source they use.
          • by killmenow ( 184444 ) on Tuesday July 27, 2004 @10:38AM (#9812140)
            What hiring practices does Linux have?
            Peer review. I imagine Linus, Alan, Andrew, Ingo, Tigran, et al. are more capable of: A) reviewing code submitted for inclusion in the Linux kernel; B) understanding its purpose; C) deciding whom to trust to write proper code; and D) actually committing the code into the kernel THAN ANY GOVERNMENT OFFICIAL.

            As evidenced by the Green Hills CEO's desperate attacks to stave off his dwindling market share, they're obviously doing a better job than he is.
            • Comment removed (Score:4, Insightful)

              by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @10:46AM (#9812251)
              Comment removed based on user account deletion
              • by HopeOS ( 74340 ) on Tuesday July 27, 2004 @11:52AM (#9813043)
                Linus only takes patches from the people responsible for the appropriate parts of the kernel. Getting a patch through requires convincing those individuals - and they do check the patches. In my experience, getting patches into the kernel is not a trivial matter; in fact it is frustratingly difficult. Furthermore, even if you succeed in getting a patch into some esoteric driver, the less mainstream it is, the less likely it is to be in an active kernel.

                If the various world governments will go to the trouble of auditing defense contractors' code, then they can save themselves some trouble and audit Open Source code instead; any vendor building from that base will require less time in audit later. If the governments do not demand an independent audit of contractors' code, then that is where you will find the weak link. With Open Source, you always have the opportunity to audit at any time, diff against previously audited sources, and compile customized code with minimal audited feature sets.

                Green Hills is saying "Trust Us! Trust Us!" Open Source is suggesting you trust what you can independently verify before your own experts' eyes.

                As for the tool chain issue, you are seriously glossing over the obvious -- all the statements you have made apply to proprietary vendors as well. The solution is simple: don't upgrade the tool chain until the changes pass inspection. This is standard operating procedure for all mission critical deployments.

                -Hope
            • So that'll be the solution then, have Linus et al peer review all OSS code and problem solved?

              Just how much code do you think Linus is actually involved with? Really now. Last I checked, he especially, and most of the rest of the crew you mention, live and breathe in kernel land. Think they've been over the code in every module out there? Think they know the code for OpenOffice inside and out? Come now.

              Bottom line, if security is _that_ important, the code will be written and maintained IN HOUSE. Period.
          • How carefully do you suppose those defense contractors screen their subcontractors in India?
        • by Zocalo ( 252965 ) on Tuesday July 27, 2004 @10:20AM (#9811860) Homepage
          Who needs a mole at Green Hills Software or Microsoft? The kind of software we are talking about here is highly proprietary stuff that you are not going to get mail order from your local retailer. A better bet would be to target any third-party libraries the vendors are using; almost no one writes their own IP stack when they can buy one for a few dollars. Sooner or later you are going to get one that was nominally bought from a legitimate company in the US but was actually coded by easily bribable coders in the third world.

          If anything, I'd say the risk of getting exploits deliberately planted in code without detection is far greater in closed source applications than in OSS projects. Another lame attempt at FUD from the people behind AdTI..

      • > How hard would it be to build in a small
        > tiny bit of error that would only be
        > useful in cases of calibration of
        > high-tech weapons?

        I think it'd be tricky, because it would break other high-precision things as well. And the other folks using the open source project would say "hey, this fellow Fred just submitted a patch. something looks odd about it. Fred, why does line 314 do a bit shift without checking the foobar?" And then the patch would be rejected.

        > If 3000 lines of dense mathematically
        > rich C were checked in

        I doubt any maintainer would accept such a patch. I don't accept patches for PMD [sf.net] without reading them, and if I got a 3K line patch I'd reject it out of hand.
        • Comment removed (Score:4, Insightful)

          by account_deleted ( 4530225 ) on Tuesday July 27, 2004 @10:36AM (#9812094)
          Comment removed based on user account deletion
        • I think it'd be tricky, because it would break other high-precision things as well.

          Run something with a known analytic solution through an artificially complicated computation and check that it reproduces that solution. Any discrepancy means something is wrong. Somewhat like recompiling the kernel to check for strange memory errors. Too many eyes. Too many idle hands with idle computers.

          You're far more likely to get away with it in closed source, because the scrutiny is less and much more predictable. By the time an open so
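
          One hypothetical way to run that style of check (all of it illustrative; compile with -lm): push a quantity through the "complicated" path and compare it against a known analytic answer.

              #include <math.h>
              #include <stdio.h>
              #include <stdlib.h>

              /* The "complicated" path: sum the Taylor series for e^x. In a real
               * audit this would be the routine under suspicion; exp() from the
               * math library plays the role of the known analytic solution. */
              static double exp_series(double x)
              {
                  double term = 1.0, sum = 1.0;
                  for (int n = 1; n < 64; n++) {
                      term *= x / n;
                      sum += term;
                  }
                  return sum;
              }

              int main(void)
              {
                  for (double x = -4.0; x <= 4.0; x += 0.25) {
                      double got = exp_series(x), want = exp(x);
                      if (fabs(got - want) > 1e-9 * fabs(want)) {
                          fprintf(stderr, "mismatch at x=%g: %.17g vs %.17g\n",
                                  x, got, want);
                          return EXIT_FAILURE;  /* something is wrong somewhere */
                      }
                  }
                  puts("all checks passed");
                  return EXIT_SUCCESS;
              }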
      • Two words: equipment testing
      • by Altus ( 1034 ) on Tuesday July 27, 2004 @10:08AM (#9811679) Homepage

        That's why you do testing and code reviews. It's not like these people are downloading new kernels in the field; any code that goes into a government project requires immense testing and code review... PERIOD. I don't care who wrote it.

        If the military wanted to use open source software, they would likely take the source and lock it down, producing a branch for themselves that would be secured and standardized after a large review. If they wanted to bring in new functionality from the "public" branch, it would mean a new version of their "secure and approved" branch, which would have to go through the same review process again.

        It's not like they don't have to do this anyway with the code they produce now... sure, they aren't expecting people to try and sabotage them, but you can do that without intention simply by making a coding error. Testing and code review are essential to the process.

        This isn't that much different than what the military does with hardened versions of commercial processors... sure, they lag behind their commercial counterparts because they have to be hardened and tested heavily, but then they work, and they are able to leverage the initial design work and testing done when the hardware was being developed for commercial purposes.


      • Can you honestly tell me that the government is going to hire a panel of people to check in-depth source changes on OSS projects? People who are familiar enough that they can catch an exploit that may only take 3-4 lines of code to perform?

        ...

        I think that having experts able to review each line of code checked in and put into production defeats the whole idea of using Open Source: at that point, you might as well just hire the experts to write the code in the first place and eliminate the vector altogether.

      • by D3 ( 31029 ) <.moc.liamg. .ta. .gninnehddivad.> on Tuesday July 27, 2004 @10:10AM (#9811728) Journal
        The NSA already produces its own version of secure Linux. It wouldn't surprise me one bit if they check that code very carefully. I doubt they just grab a copy of the Red Hat ISO images and lock down the startup files.

        Also, your code would have to be integrated deeply enough into the calculations to misfire only when aimed at a certain target, or to misfire at a set percentage. If the misfires were too frequent, they wouldn't buy off on the weapon.
      • by Lumpy ( 12016 ) on Tuesday July 27, 2004 @10:12AM (#9811752) Homepage
        Let's say I knew that DoD used a certain package in gunnery firmware. Let's say a math library that would be used to make calculations to calibrate the weapon. How hard would it be to build in a small tiny bit of error that would only be useful in cases of calibration of high-tech weapons? If 3000 lines of dense mathematically rich C were checked in and a dozen lines acted in concert to create a miscalculation, how much expertise would be needed to catch that?

        So you are telling us that if they BUY the software from XYZ company, they blindly accept it as perfect and simply use it without question??

        If so, then I really need to look at emigrating out of the United States, because the level of incompetence is getting insane.

        I don't care if it's free/OSS or 60-bajillion-dollar closed source software written by aliens from Alpha Centauri. If it's something you absolutely rely on, you had damn better check it completely. OSS should abide by the same rules as everything else... check it completely from beginning to end.
      • by kfg ( 145172 ) on Tuesday July 27, 2004 @10:15AM (#9811789)
        Can you honestly tell me that the government is going to hire a panel of people to check in-depth source changes on OSS projects?

        The American government actually has an entire agency whose job is to perform just such tasks.

        It's called the NSA.

        Will the NSA actually perform this function with OSS?

        They've already made their own distro.

        KFG
      • by demachina ( 71715 ) on Tuesday July 27, 2004 @10:26AM (#9811946)
        "how much expertise would be needed to catch that?"

        Uh, not much. If the weapons aren't hitting the mark on the firing range, they probably wouldn't get deployed until they are fixed.

        This is probably a poor example. The danger isn't OSS that is designed to fail; if it doesn't work, it won't get used. The danger is an obscure security hole that would allow infiltration.

        The key point where this guy's whole argument falls apart is that proprietary software isn't any better. I'm confident Microsoft employs a small army of foreigners, and I'm not sure they would be any more reliable than OSS developers; their code gets a lot less scrutiny, and absolutely none if you are a customer getting binaries. Most big companies are putting R&D centers in India and China. How do they assure us the people they are hiring don't have ulterior motives?

        If you want to develop software critical to national security, you have to develop it in a classified lab with cleared employees. Oh, but wait: in spite of all the scrutiny people with security clearances get, they too turn out to be foreign agents and do great damage. Los Alamos doesn't exactly have a stellar security record, and those people get more scrutiny than anyone. The Navy's comsec has been massively compromised in the past.

        I'd argue the opposite case from this guy. If you want secure software, the best approach is to have as many people as possible, both OSS and government, scrutinize the source. If you find a project that is intentionally or negligently checking in compromised code, blacklist it or give it extra scrutiny. The NSA's secure Linux effort is an example of the government making sure OSS is secure, and it's far more likely to be that than anything Microsoft or Green Hills is going to give them.

        On a tangent, here [counterpunch.org] is an interesting article on Homeland Security trying to enforce security through obscurity in the physical world. Someone walked around the DNC in Boston, took photos of all the weaknesses in its security, and posted them to a list on Yahoo. Homeland Security shut down the list and is collecting the names of everyone on it and everything said. That should give you pause before joining any list in these interesting times.
      • Yes, that is a good point, but there's another reason why this argument is bunk: Linux is written by a lot of non-Americans, but so is Windows, and so is essentially every other operating system not developed by the US military. The same applies to software in general. So the argument needs to be: which one is easier to audit and has better *creation-time auditing*?

        The answer should be obvious.
    • "Embedded Operating Systems"

      Just a very brief point, but how do you fix embedded system bugs? Am I incorrect in thinking that embedded means the OS is on-chip?

    • by khasim ( 1285 )
      Same guy, same company, same crap

      http://developers.slashdot.org/developers/04/01/15/1733237.shtml?tid=106&tid=185&tid=190

    • Not Wind River (Score:5, Informative)

      by Bruce Perens ( 3872 ) <bruce@perens.com> on Tuesday July 27, 2004 @10:14AM (#9811767) Homepage Journal
      No, so far we don't see the same from Wind River. They had the choice of FUD or joining the Free Software developers, and they chose the latter.

      Bruce

    • Perhaps, but dismissing what they say because of who they are is not exactly correct either. It's a fallacy to claim 'A' is false because the person saying it is 'B'.

      If a man on death row for murder tells you that murder is wrong, is he incorrect simply because of who he is?

      Sure they (Green Hills) have an agenda, and it should be noted. However, they may have a point and shouldn't be discredited out of hand.
  • by beh ( 4759 ) * on Tuesday July 27, 2004 @09:51AM (#9811412)
    Shouldn't this article immediately point back to other articles on how governments OUTSIDE the US are choosing open source for exactly the same reason (who knows what M$ + NSA put in the closed Windows source that might hurt other nations)?

    [World Govs Choose Linux For Security & More]
    http://slashdot.org/articles/01/12/11/0132213.shtml

  • remember this guy? (Score:5, Informative)

    by jabella ( 91754 ) * on Tuesday July 27, 2004 @09:52AM (#9811419) Journal
    Remember this guy? He also wrote "Linux Security: Unfit for Retrofit" ( http://www.ghs.com/linux/unfit.html [ghs.com] )

    This was covered by LWN back in May: http://lwn.net/Articles/83242/ [lwn.net]

    IIRC, GHS does development on embedded XP stuff? I don't remember the details...
  • by mekkab ( 133181 ) on Tuesday July 27, 2004 @09:52AM (#9811424) Homepage Journal
    Yeah, can't trust those commie FOSS developers. Instead, let's invest in "America"; let's give money to companies who develop software overseas anyway!*

    *We wanted to buy software from only American developers, but we couldn't afford it.
  • FUD. (Score:5, Insightful)

    by garcia ( 6573 ) * on Tuesday July 27, 2004 @09:53AM (#9811435)
    Some embedded Linux providers even outsource their development to China and Russia.

    GASP! Some XYZ providers even outsource their development to ABC and DEF (insert your favorite company and terrorist sponsoring country where necessary).

    It would be incredibly naive to believe that other countries and terrorist organizations would not exploit an easy opportunity to sabotage our military or critical infrastructure systems when we have been doing the same to them for more than 20 years!

    I think it has been proven that closed-source development doesn't help to change the possibilities that a "mole" has been planted or that a "hole" will be discovered.

    One of the greatest misconceptions about Linux is that the free availability of its source code ensures that the "many eyes" with access to it will surely find any attempt at sabotage. Yet, despite the "many eyes," new security vulnerabilities are found in Linux every week in addition to dozens of other bugs. Many of these flaws have eluded detection for years. It is ridiculous to claim that the open source process can eradicate all of the cleverly hidden intentional bugs when it can't find thousands of unintentional bugs left lying around in the source code.

    And it is ridiculous to claim that a closed development environment will make it any different.

    In addition, under the internationally recognized Common Criteria for IT Security Evaluation (ISO 15408), Windows has been certified to Evaluation Assurance Level 4 (EAL 4), a higher level of security than the EAL 2 that Linux has achieved.

    According to this [com.com] article, obtaining EAL2 certification typically costs between $400,000 and $500,000. Looks like it's more about money than security. In their infancy, why would Linux vendors shell out large sums of money when the government wasn't interested in using Linux anyway?

    This whole article is FUD. He's annoyed because Linux is making leaps and bounds and will likely eat into his market share in the lucrative defense and aerospace industries. At least he came out and said it himself instead of paying off a third party to "investigate" the "problems" with Linux and post the results to the world.
    • Re:FUD. (Score:5, Informative)

      by nemaispuke ( 624303 ) on Tuesday July 27, 2004 @10:04AM (#9811612)
      Dan O'Dowd's mention of Linux "only" receiving CC EAL 2 is somewhat incorrect. Red Hat Enterprise Linux Advanced Server got CC EAL 2, but SuSE Enterprise Linux was evaluated at EAL 3+. This is roughly the equivalent of TCSEC C2 and can be deployed in a classified environment. I guess he needs to check http://niap.nist.gov/cc-scheme/vpl/vpl_assur_lvl.html more regularly and actually read it!
    • Pure FUD, indeed (Score:4, Interesting)

      by krygny ( 473134 ) on Tuesday July 27, 2004 @10:04AM (#9811614)
      Well, he said it all, so it must be true; even though he backs it up with nothing. This is so wrong on so many levels I don't even know where to begin. His assertions are hardly worth addressing. Therefore: pure FUD.

      OK, I'll bite just once: I doubt there is a single weapon system procured by the DoD in the last 10 years that does not have a substantial portion of it outsourced overseas. Most procurements now require some percentage of it, by contract.
    • Re:FUD. (Score:5, Informative)

      by imroy ( 755 ) <imroykun@gmail.com> on Tuesday July 27, 2004 @10:09AM (#9811691) Homepage Journal

      Check out this link: Understanding the Windows EAL4 Evaluation [jhu.edu]

      ...EAL levels run from 1 to 7. EAL1 basically means that the vendor showed up for the meeting. EAL7 means that key parts of the system have been rigorously verified in a mathematical way. EAL4 means that the design documents were reviewed using non-challenging criteria. This is sort of like having an accounting audit where the auditor checks that all of your paperwork is there and your business practice standards are appropriate, but never actually checks that any of your numbers are correct. An EAL4 evaluation is not required to examine the software at all.

      EAL doesn't really mean much, at least not until you get up to the higher levels. It's basically there so that government departments can have a checklist requirement for any software they buy or commission.

  • Governments should not use open source without a proper security audit. Once you can verify the nature of the code, there should be no obstruction to using it.
  • by InThane ( 2300 ) on Tuesday July 27, 2004 @09:54AM (#9811463) Homepage Journal
    IIRC, China has seen the source code to Microsoft Windows, whereas the U.S. government hasn't.

    I think that's a pretty large security threat right there...
  • by shobadobs ( 264600 ) on Tuesday July 27, 2004 @09:54AM (#9811467)
    What if a terrorist gets a job at a software company? Where's the hope of catching the bugs then? It seems to me that closed-source software is more susceptible than open-source.
  • NSA (Score:3, Interesting)

    by codepunk ( 167897 ) on Tuesday July 27, 2004 @09:55AM (#9811480)
    One word: NSA. If it were so bad, the NSA would not have their own version.
  • by Noksagt ( 69097 ) on Tuesday July 27, 2004 @09:56AM (#9811504) Homepage
    The U.S. government and military will be brought to their knees by...Finland?!
  • by polyp2000 ( 444682 ) on Tuesday July 27, 2004 @09:57AM (#9811508) Homepage Journal
    Dan O'Dowd, CEO of Green Hills Software, suggests that open source software has the capability of being sabotaged by foreign developers and should not be used for U.S. military or security purposes.

    Urmm, so what operating system do you use then, Dan O'Dowd? And which newspapers and websites do you read?

    You're obviously using a closed source operating system that is free of viruses, worms, holes and other security problems. What might this mystery closed source operating system be, the one that doesn't pose a threat to the nation's security?
  • by FunWithHeadlines ( 644929 ) on Tuesday July 27, 2004 @09:59AM (#9811537) Homepage
    Huh? Where's slashdot been? Groklaw [groklaw.net] answered [groklaw.net] this FUD [groklaw.net] months ago, repeatedly and definitively.

    Truly nothing to see here, folks. Just empty FUD that has been discredited.

  • by C_Kode ( 102755 ) on Tuesday July 27, 2004 @09:59AM (#9811543) Journal
    This should be from the "If-you-can't-dazzle-them-with-brilliance-baffle-them-with-bullshit" department.

    O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding.

    If the source is open, how can there be no chance of finding bugs or whatever else they wish to put in the source?

    This is clearly FUD to protect their market from the steamroller known as FOSS. Security through obscurity has already been proven faulty.
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Tuesday July 27, 2004 @10:00AM (#9811546) Homepage Journal
    Green Hills is a failing company that is seeing its market go to Open Source. In contrast, Wind River, which is in the same market with the same customers, embraces Linux.

    The fact is that Green Hills products are no more secure, and may well be less secure, because they don't have the "many eyes" looking at their source code. We've had trojan horse attempts in Open Source software. They get caught quickly. But even if the source is disclosed, nobody outside of their tiny company has an incentive to do productive work on the internals of a Green Hills operating system in the way that people who modify GNU/Linux do. And security audits by such a small company can't catch everything.

    The best example of this is the Borland Interbase database. It was used for airline reservations and had a trojan horse buried in it for 6 to 9 years while it was a proprietary product. The door could have been found by anyone who did an ASCII dump of the product, but those who found it kept it secret, and probably took a lot of free flights. An Open Source coder found the door some months after the database went Open Source, and he had an incentive to report it: at that point he was one of the people doing productive work on the database and only wanted it to work better and more securely.
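
    The Interbase door was reportedly a hard-coded account (user "politically", password "correct") compiled straight into the server - exactly the kind of thing an ASCII dump exposes. Reduced to a hypothetical sketch:

        #include <string.h>

        /* Hypothetical stand-in for the real credential lookup. */
        int check_password_database(const char *user, const char *pass);

        /* A hard-coded credential check of the Interbase variety. Both string
         * literals are stored verbatim in the shipped binary, where anyone
         * running strings(1) over it can read them, years before anyone
         * outside the company sees the source. */
        int authenticate(const char *user, const char *pass)
        {
            if (strcmp(user, "politically") == 0 &&
                strcmp(pass, "correct") == 0)
                return 1;   /* backdoor: always admitted */
            return check_password_database(user, pass);
        }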

    This "black hats" (people who are motivated for bad purposes) vs. "white hats" (good purpose) phenomenon is important to consider when you evaluate the security of Open Source. Generally the only people who would look for vulnerabilities in proprietary software, outside of its manufacturer, are looking to exploit them! This is hardly the case with Open Source.

    Thanks

    Bruce

  • Closed source safer? (Score:4, Informative)

    by bigbigbison ( 104532 ) on Tuesday July 27, 2004 @10:00AM (#9811548) Homepage
    I seem to remember, a few years ago (possibly after 9/11, but I'm not sure), an incident where an employee of a company with a government contract to write software managing government infrastructure was suspected of terrorist links, and so they had to spend tons of time searching through the code to make sure the suspect had not programmed a back door into the system. (I might be misremembering the details, but that was the gist of it.) It seems that closed source makes it a lot easier to hide things away than open source does.
  • by G4from128k ( 686170 ) on Tuesday July 27, 2004 @10:04AM (#9811610)
    If this is a security issue, then the government should definitely not buy closed source from any software company that uses any offshore (non-U.S.) programmers. Who knows what those offshore programmers are inserting into the closed code. Of course, that rules out just about every large closed source maker in the world as I'm sure most have some non-U.S. development groups.
  • by nysus ( 162232 ) on Tuesday July 27, 2004 @10:06AM (#9811646)
    Just trying to think how a secret bug could be introduced. Maybe it would go down something like this?

    Lead Programmer at Major Defense Contractor: Hey, can you install this patch by that new Pakistani contributor for our missile control module?

    New programmer: Yeah, I looked at it. There was some weird code in there that I couldn't quite figure out. There was one line of Perl with about 10,000 characters in it. Shouldn't we look at it? What does it do, exactly?

    Lead Programmer: Naw. I don't think it really matters. I don't want to look stupid because I sure can't figure Perl out. Let's just go with the release early and often policy. We'll let the users report the bugs back to us.

  • "Attempt" is right (Score:5, Informative)

    by Zocalo ( 252965 ) on Tuesday July 27, 2004 @10:06AM (#9811648) Homepage
    Um, this was already tried [kerneltrap.org] last November. The exploit was very subtle indeed, but it was still detected and removed within 24 hours. This is about as effective a piece of FUD as AdTI's last effort, and it looks like they were so embarrassed by that one that they are resorting to a new name. I'm guessing we won't be hearing from "Green Hills Software" again once they've been publicly ridiculed either...
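
    For the record, the disguise in that attempt was a one-character trick. Paraphrased from memory (not a verbatim quote of the rejected change), the fragment added to sys_wait4() read roughly:

        /* Reads like an options sanity check, but `current->uid = 0` is an
         * assignment, not a comparison: requesting the odd flag combination
         * __WCLONE|__WALL silently made the calling process root, while the
         * "error" branch itself was never taken. */
        if ((options == (__WCLONE | __WALL)) && (current->uid = 0))
                retval = -EINVAL;

    And it was caught less by eyeball than by process: the change showed up in a CVS mirror with no corresponding changeset in the primary tree, which is precisely the kind of baselining other comments here are describing.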
  • by dougmc ( 70836 ) <dougmc+slashdot@frenzied.us> on Tuesday July 27, 2004 @10:07AM (#9811667) Homepage
    What he suggests is possible. And a well hidden bug could easily escape detection by Linus and anybody else who goes over each new patch looking for stuff like this.

    And it doesn't have to be in the Linux kernel. The classic example (at least 10 years old) is to hack gcc so that it examines the code it's compiling, and if it decides it's compiling /bin/login, does things a little differently, inserting a back door where there was none before.

    However, while he does have a point, it's a very myopic one. Closed source software has exactly the same vulnerabilities, with one critical difference: only people within the company in question have a chance of detecting the problem, because the end user never gets to see the source and check whether it's compromised. Granted, most open source users do not review all the source code they use, but at least the option is there, and those for whom security is absolutely essential (like the NSA) almost certainly use it.

    Also, for a closed source company, the problem is even worse. The backdoor (or whatever) could be introduced when the code is finally compiled for distribution and never get checked into whatever source control system they use. So the binaries get shipped out, but NOBODY has reviewed the code in question (except our cracker friend), and once the bug does come to light (if it ever does) the company will look at its source code and scratch its head: it won't even have the offending source to look at.

  • by Doc Ruby ( 173196 ) on Tuesday July 27, 2004 @10:08AM (#9811688) Homepage Journal
    America is locked in a life or death battle of Good vs. Evil. Any openness or flexibility is weakness, which will be immediately exploited by our enemies to destroy our way of life. Open source hippies might be having fun, but they're frittering away our hard-won tech lead. The Internet itself, invented by the Pentagon, has been taken over by pedophiles since Al Gore reinvented it during the fake Bubble. God told President Bush to have Bill Gates take over the Internet, and all software development, to protect us from the hackers, and get rid of spam.

    Freedom is Slavery.
    Ignorance is Strength.
    War is Peace.
  • by rlp ( 11898 ) on Tuesday July 27, 2004 @10:09AM (#9811697)
    At least OSS lets the prospective user review the source code. U.S. companies are rapidly outsourcing proprietary development to foreign countries. Key infrastructure software (and firmware) is being developed in countries such as mainland China (including code used for the U.S. telecom system). Meanwhile, the U.S. military is rapidly adopting off-the-shelf components to reduce costs. But, by all means, let's ignore this and concentrate on OSS...
  • by tizzyD ( 577098 ) <{moc.liamg} {ta} {dyzzit}> on Tuesday July 27, 2004 @10:09AM (#9811703) Homepage
    When you can't compete, FUD.

    The gauntlet is thrown down. I challenge this man to come up with a demonstrable "trojan horse" in a piece of open source software that cannot be found in a reasonable period of time by a security audit (the kind the government does of open source software it intends to use). Such fear-mongering should be laughed at, torn up, and spit upon whenever you see or hear it. It reminds me of Ridge getting up and saying, well, there's a threat around the election, but we have no evidence of it. Be scared (and vote for Bush). Yeah... right. I didn't just fall off the turnip truck.

    Get a life, and make better products, jerk!
  • by SuperChuck69 ( 702300 ) on Tuesday July 27, 2004 @10:10AM (#9811707)
    I am continually amazed that people believe open source is a good place for terrorists to hide evil, anti-government bugs...

    The cornerstone of open source is that it is OPEN SOURCE. The government is free to view and evaluate all the packages to their little, demonic hearts' content.

    If I were a terrorist, I think I'd penetrate a closed-source house (say, Microsoft or Green Hills) and hack some little nasties into their source.

    But, maybe that's why Dan O'Dowd isn't a very good terrorist.

  • by EXTomar ( 78739 ) on Tuesday July 27, 2004 @10:18AM (#9811834)
    If you were a paranoid Iranian or North Korean computer user looking at Microsoft Windows, wouldn't you think the same thing? Heck, why wouldn't a Chinese user think that MS and the NSA/CIA/alphabet soup are trying to snoop on them? Because MS allows a select group to look at their source?!?

    At least with Open Source you have the source to check for yourself. Vendors like Novell, IBM, and Red Hat are supposed to be actively looking at the source to make sure no one is slipping in stuff that doesn't belong, but if you don't believe them you can do it yourself.

    So we have Mr. Dan O'Dowd trying to conjure a terrorist ghost threat into Open Source. The problem is that the source is there for you to inspect. With Microsoft, the only assurance you have is their word that they aren't monkeying with the OS to monitor you.

    IMHO, BSD and Linux are perfect for military and security applications. You can inspect every corner of the kernel. You can freeze on a specific version because you always have that source code. You can branch and patch as you see fit. This seems perfect for the military and security branches. With Microsoft you have to "sign up" (how much money does it cost to do that?) to view the source, and then what? The only proof you have is that this particular version of Windows hasn't been monkeyed with. What about the patches and hotfixes? *shrug*

    When it really boils down to it, are you going to believe the source you compiled and control yourself, or Microsoft? I think Mr. O'Dowd's trust is ill-placed.
  • Issues at Hand (Score:5, Informative)

    by gmletzkojr ( 768460 ) <`moc.liamg' `ta' `rjokztelmg'> on Tuesday July 27, 2004 @10:18AM (#9811836) Homepage Journal
    There are a number of issues that play a part in the Green Hills argument. First of all, let me say that I have had the experience of using Green Hills products (non-military) for the past few years now.

    To begin with, a company that charges *a lot* of money for an OS stands to lose *a lot* to a free OS. Therefore, GH would be expected to say that a GH product is better.

    The fact that GH source code is not open source does not mean that no one ever sees it. I have access to the entire source and, if so inclined, could use that information to create an attack myself or provide the source to someone else. Remember, even though a release was signed for that source access, a signature doesn't stop money from talking.

    GH has, up to this point, maintained 'top dog' status in this area. In fact, when we asked for a driver for USB mass storage, the response was 'Well, where else would you get it? It is going to cost you.'

    IMHO, GH has had a bit of mini-Microsoft status within the military embedded world. This has certainly mirrored the PC OS world: one leading OS, some neat features, but when you really look at it, how many ways are there to create a GUI or an OS? Let's be honest: an OS has queues, semaphores, a file system (replaceable, in GH), etc. So we are not talking about 'rocket surgery'.

    The claim that Linux is not 'military grade' would really need to come from an independent group. This is akin to MS saying that it has the best browser or GUI. Of course they are going to say that.

  • by Tenebrious1 ( 530949 ) on Tuesday July 27, 2004 @10:18AM (#9811838) Homepage
    He should liken any government using closed source software to the Trojans themselves, who took the *gift* without examining the contents.

    If the Trojan Horse had really been Open Source, it would have come with a list of building materials, instructions for building the horse yourself, the number of Greek warriors inside, and how the warriors were armed, along with several notes from the Phoenicians commenting on the dangers of the included Greeks...

  • by FWMiller ( 9925 ) on Tuesday July 27, 2004 @10:20AM (#9811865) Homepage

    I'm a long-time Linux user and have been around open source for a long time. While the source of this article is obviously questionable, I work for a defense contractor, and I'm here to tell you the points raised in the article have some truth to them.

    If you're selling products to the government and those products use an operating system, the issue of being able to GUARANTEE that your code base has not been and cannot be tampered with is very real. Everyone has (or should have) seen the techniques used to obfuscate trojan horses via the compiler or some other tool, which make this problem even harder.

    The problem being alluded to here is a chain of control over a code base that can be demonstrated to the satisfaction of a DoD or other government customer. While no process can ever be completely secure, the real point is this: if you have a choice between a system developed in a closed environment, where you can keep an eye on everyone involved, and an open-source development, the former is easier to verify. You can call it FUD, but this is a real issue within government circles and WILL limit the use of Linux in certain applications.

  • Amusing article (Score:5, Interesting)

    by u-235-sentinel ( 594077 ) on Tuesday July 27, 2004 @10:21AM (#9811884) Homepage Journal
    Even if Linux were as secure as Windows, Windows is the wrong benchmark. Defense systems should be held to a higher standard.

    As secure as Windows? He's kidding... right?

    When I worked for the Air Force, they had several instances in which systems (desktops) were compromised. Various worms came out of the blue and just hammered their network. My systems running Linux noticed it immediately. In fact, I was told at first that there was NO problem. After a few hours of watching the logs record attack after attack, I noticed a general email sent out to everyone explaining that there was a problem, with instructions provided.

    As secure as Windows? God, I hope not!

    The Federal Aviation Administration (FAA) requires software that runs commercial (and many military) aircraft be approved as part of a DO-178B certification. DO-178B Level A is the highest safety standard for software design, development, documentation, and testing. It is required for any software whose failure could cause or contribute to the catastrophic loss of an aircraft.

    Several operating systems have been DO-178B Level A certified. Until Linux is certified to DO-178B Level A, our soldiers, sailors, airmen and marines should not be asked to trust their lives with it.


    If Linux isn't at this level, then what is the point of the article? Linux is certified for various things in the military. Whenever I stood up a server, I was asked what OS I would be running. Everyone was apprehensive that it might be Windows, which requires a whole heap of testing before it's allowed to run in production. As soon as I told security it was either Unix or Linux, they would sigh and tell me to go ahead. Much more confidence there :-)
  • NSA Linux (Score:3, Interesting)

    by failedlogic ( 627314 ) on Tuesday July 27, 2004 @10:26AM (#9811948)
    Linux is untrustworthy? So untrustworthy that the NSA chose to develop NSA Linux, with its own security extensions? What I'm getting at is that the government can make its own secure OSes by building on open source.
  • by Goeland86 ( 741690 ) <goeland86@gmail.3.14com minus pi> on Tuesday July 27, 2004 @10:27AM (#9811967) Homepage
    This is what I posted on his article, on Design News itself, where I'm sure he will read it:

    In theory, of course, you're totally right in believing this. In practice, however, you're inescapably wrong. First, since Linux is open source, an army implementing these Linux embedded systems would most likely read through the code to verify its normal behavior and the lack of serious design flaws. Second, terrorists nowadays do not use computers for fear of being traced by the NSA or CIA on the net, thus preventing themselves from ever contributing code to Linux. Third and last, the Linux kernel development team now has a signature follow-up on the internet, to make sure that each piece of code can be traced back to its original author. That makes it much easier to locate the developers of Linux, many of whom are in countries that you failed to mention, like Japan, Australia, Finland and many other western countries that the US government trusts. Besides that, the open-source community is the best bug-tracking-and-fixing community in the world. I believe that when a version of the Apache web server shipped with a security flaw, the bug was traced in the code and a patch submitted less than an hour later. So even in the case of a security flaw in the Linux kernel, I believe army computer specialists would be able to trace and fix the flaw in short order. And those security flaws are precisely the reason the army orders pre-series units of each piece of equipment it will use and tests them for a few months against anything they're expected to meet in a combat zone, including loss of OS stability, loss of control, and even total power failure and recovery. You have only looked at the theoretical part of the problem, and you propose no solution to the problems you see. I therefore consider your article a big rant against open source, not constructive criticism, which in my opinion would be true patriotism.
  • all in the family (Score:5, Interesting)

    by blooba ( 792259 ) on Tuesday July 27, 2004 @10:35AM (#9812073)
    both my father and i were DoD software engineers, me as a developer, and he as a tester. i do commercial stuff nowadays, and dad's retired. we both know, for a cold hard fact, that no national security- or defense-related software ever goes into production without passing the most rigorous reviews and testing, throughout its entire lifecycle. from functional descriptions, through design reviews, code walkthroughs and acceptance testing, everything is closely monitored and recorded.

    so, please explain to me again how open source terrorists are going to slip their malware under our noses?

  • by AK Marc ( 707885 ) on Tuesday July 27, 2004 @10:45AM (#9812237)
    So, someone could put bugs or backdoors in Linux? That would never happen with, say, routers from the largest router company in the world. Oh, wait. It already has. Other countries are dumping Microsoft. Why? Because it is closed source they cannot inspect, which may pack bugs or backdoors placed by the US company to help the US.

    Of all the valid reasons to attack open source software, I can't imagine how they can claim an unknown piece of code is more secure than a known piece of code, even if potential enemies are contributing to the open source (unless, of course, every programmer at Microsoft has been given the proper clearance for all the levels at which the OS they are working on will be used).
  • Exactly the point (Score:5, Insightful)

    by hol ( 89786 ) on Tuesday July 27, 2004 @10:52AM (#9812351) Homepage Journal
    This is precisely why Brazil, China, and even Germany are moving towards open-source. The US Government cannot insert backdoors into this stuff that would affect anyone not wanting to be affected, unlike Microsoft stuff. Remember the NSA keys in the Windows NT crypto libraries?

    The US can continue to run Windows, be our guest, but the point is moot since much of US Government software is developed in India anyways. No back doors there, for sure.
  • First blush (Score:3, Insightful)

    by yoshi_mon ( 172895 ) on Tuesday July 27, 2004 @10:53AM (#9812354)
    I went to the Green Hills Software page first, just to see who this is.

    And I'm a little upset. Is OSDN actually letting these guys astroturf on /.? Why would this get even a second look from anyone? I might expect the Something Awful forums to laugh at this, but to have a posting here?

    This whole thing smells bad. Maybe it's a slow news day, but there are better anti-Linux rants than what is coming from that lame-ass website. Nothing to see here, move along.
  • Scenario (Score:3, Interesting)

    by jhaberman ( 246905 ) on Tuesday July 27, 2004 @10:53AM (#9812356)
    I have no idea if something like this could be possible; I'm just playing devil's advocate here.

    Say the software was used for target calculations on an artillery piece. For the sake of argument, say some "rogue" developer has added a bit of code that takes the coordinates from the unit's GPS and checks whether they are in, say, a Middle Eastern country. If so, the shells, which found the target just fine in testing, now miss the mark by 500 yards.

    Not paranoid, just askin'.
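    (The unsettling part of this scenario is how little code it would take. A purely hypothetical sketch in C; the function names and coordinates are invented for illustration and come from no real system:

        #include <stdbool.h>

        /* Hypothetical trigger buried in a large fire-control code base.
         * All names and numbers here are made up. */
        static bool in_target_region(double lat, double lon)
        {
            /* A rough bounding box is the entire "logic bomb". */
            return lat > 12.0 && lat < 42.0 && lon > 25.0 && lon < 65.0;
        }

        double corrected_range_m(double range_m, double lat, double lon)
        {
            if (in_target_region(lat, lon))
                return range_m + 457.0;   /* ~500 yards of silent error */
            return range_m;               /* behaves perfectly in testing */
        }

    A dozen lines like these, scattered across millions, are exactly what the article claims review can't catch and what the "many eyes" argument says it can.)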
  • by Jayfar ( 630313 ) on Tuesday July 27, 2004 @11:02AM (#9812466)
    Is this a dup? I dunno, but Green Hills FUD was discussed on groklaw at great length [groklaw.net] over 3-1/2 months ago.
  • by anthro398 ( 729495 ) on Tuesday July 27, 2004 @11:05AM (#9812505) Homepage

    I was looking through the author's citations, and his claim about the number of vulnerabilities in Linux compared to Windows seems pretty questionable. The database [nist.gov], as you can see here, has one selection for Linux and many for Windows. It seems that the U.S. National Institute of Standards and Technology considers components of Windows, such as Internet Explorer, not to be part of the operating system, so the components' vulnerabilities are listed separately from those of the OS. At the same time, the Linux count includes Sound Blaster driver issues and problems with third-party software such as Symantec Antivirus.

  • by It doesn't come easy ( 695416 ) on Tuesday July 27, 2004 @11:23AM (#9812700) Journal

    What a bizarre article.

    The statement "Yet, despite the "many eyes," new security vulnerabilities are found in Linux every week in addition to dozens of other bugs." Shouldn't one consider that the "many eyes" are the developers finding those weekly bugs? Wonder how many eyes are looking for Green Hills software bugs?

    As long as people are involved, mistakes (bugs) will be made. But saying that malicious code is more likely in a product where someone CAN examine the code versus a product where no one can is just plain stupid. There is obviously an undisclosed agenda here (might that be selling a DO-178B Level A rated real-time OS, aka Integrity? Getting a lot of Linux competition, eh?).

    As to the standard itself: the first 90% of the article is about security, and only then do you mention DO-178B. DO-178B is not a security standard; it is an FAA safety-related standard for software. Any software certified under DO-178B can still be full of unknown security holes. The standard may be required for software used in flight-related applications, but that does not mean the software is also secure.

    The Level A rating doesn't even mean "most secure," as the article seems to imply. It means that if the software crashes, it will not affect other software that is running. In other words, the software is ISOLATED, not secure.

    It is amazing the things companies will say when they are losing ground to a competitor.

  • by zogger ( 617870 ) on Tuesday July 27, 2004 @11:38AM (#9812870) Homepage Journal
    ...and defense-related places [dailytexanonline.com] DON'T hire foreign nationals, or domestic nationals with perhaps a bent for the blackhat side? This never happens? And everyone in government itself is sweet [whatreallyhappened.com] and pure [infowars.com] as the mountain streams, and would never think of doing anything...strange [cnn.com]... for some financial remuneration [fas.org] off the books? This never happens either? And so-called "allied and friendly" [rense.com] governments don't run spooks inside our establishment and sleepers inside our citizenry? And they *always* have our best interests [mises.org] at heart? [blackboxvoting.com]

    Nope. Open source is still the best way to go, along with open government. When you let people hide "stuff," and when it's connected to massive political power and heaps of money, that's when crimes occur. The best bet is openness, bar none. It is not perfect, but it's the best design yet.

  • by Kagato ( 116051 ) on Tuesday July 27, 2004 @11:38AM (#9812874)
    Sure, there is a threat in the Open Source movement. But how does that threat compare to offshoring? I don't think they are any different. Yet when a threat is something that enhances the bottom line, security concerns are not raised.
  • by digital photo ( 635872 ) on Tuesday July 27, 2004 @12:26PM (#9813405) Homepage Journal

    I find it interesting that open source software is considered a risk because individuals from other nations are allowed to participate in the development of the code...

    How does this differ from corporations that provide software to the military while outsourcing their development to individuals from other nations?

    The only difference is that the OSS model involves corporations giving up some of their control over the rights to the product, and corporations don't like that.

    Otherwise, the article assumes differences between OSS remote participation and outsourcing that have no material relevance.

    One can argue that outsourcing is more secure because security checks are done, but even security checks fail, and someone who is cleared can still decide to sabotage. The problem is that once people are vetted, they are trusted. This is actually worse than the OSS model, where no matter who you are, your code is reviewed with the same level of scrutiny as anyone else's.

    I can think of many instances of calling support and having to give my personal identifying information to an individual who was either not in my state or not even in the US.

    This sounds more like a double standard of judgment from a corporate viewpoint that is prejudiced against OSS projects.

  • Linux is not a RTOS (Score:5, Informative)

    by Discoflamingo13 ( 90009 ) on Tuesday July 27, 2004 @12:27PM (#9813411) Homepage Journal

    What O'Dowd fails to mention, in all of this, is that Level A certification requires a detailed specification of the requirements the system must implement. These requirements must be covered by test cases that give full requirements coverage (or appropriate analysis) and structural coverage (for Level A, that means MC/DC, modified condition/decision coverage [uow.edu.au]). The Open Source methodology is a long way from being a DO-178B compliant process, and rightly so: the rules for change control of a Level A-certified product are the exact opposite of the "release early, release often" method embraced by a typical open source program, because the development objectives are entirely different. This does not mean that an open source program cannot be certified to Level A; it means that the organization submitting it for Level A compliance must first do a great deal of work.
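    (To make the MC/DC requirement concrete, here is a minimal sketch of our own, not taken from the standard, of what the criterion demands for even a two-condition decision in C:

        #include <stdbool.h>

        /* A trivial decision point with two conditions. */
        bool deploy_flaps(bool airspeed_ok, bool altitude_ok)
        {
            return airspeed_ok && altitude_ok;
        }

        /* MC/DC requires tests showing that each condition independently
         * affects the outcome. For A && B a minimal set is three cases:
         *
         *   airspeed_ok  altitude_ok  result
         *   true         true         true    (baseline)
         *   false        true         false   (airspeed_ok flips result)
         *   true         false        false   (altitude_ok flips result)
         */

    Every decision in a Level A code base needs that kind of demonstration, which is why "release early, release often" and DO-178B do not mix.)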

    DO-178B is the most rigorous safety evaluation standard in the aerospace, automotive, or defense industries. There is no difference in the DO-178B certification guidelines for verifying a closed-source versus an open-source application. The problem both face is producing documentation of the process used to build the product, along with design and architectural requirements for the application that can be verified for full MC/DC coverage by an independent third party. Each application must also be shown to accommodate space (memory access) and time (real-time scheduling) partitioning requirements on any device it runs on.

    Most Level A OSes are RTOSes with (if you're lucky) ANSI and POSIX libraries for I/O and math. There are companies that have modified Linux for use in real-time embedded applications, but the standard Linux scheduler is not real-time, and the kernel does not perform space partitioning of application memory (which means it can be Level E, but nothing above that). Software that does not affect safety-critical parameters doesn't have to be Level A, though; Levels D or E are acceptable.

  • So US-Centric (Score:4, Interesting)

    by abe ferlman ( 205607 ) <bgtrio&yahoo,com> on Tuesday July 27, 2004 @12:33PM (#9813463) Homepage Journal
    A lot of other governments are moving away from Microsoft because they're pretty sure we're using Windows to spy on them.

    Unfortunately, you can't guarantee that someone looking to subvert Windows in a subtle way won't be hired by (or, more interestingly, license their code to) Microsoft. So with closed source you basically get the worst of all possible worlds.

  • by stealth.c ( 724419 ) on Tuesday July 27, 2004 @12:37PM (#9813499)
    He is terminally paranoid. I understand that he has a vested interest in FUDing FOSS, but let's attack his argument for a second.

    First of all, what truly important piece of software would possibly be part of open public development? I thought this field was specialized enough that the only people with any competence in what you were building were already trusted anyway. Wasn't SELinux developed *inside the NSA* before it was released?

    Secondly, assuming a vital piece of software WERE being developed publicly, someone trying to insert malicious code would have to make it past a few barriers, the first being the most complicated. He would have to: 1) know what his deliberately inferior code would probably do in the finished product versus what a non-criminal would want it to do, 2) get it past the critical eye of a few other developers, and 3) slip through some kind of government screening, all the while NOT making anyone suspicious.

    And even then the results are not guaranteed. What is your cyberterrorist counting on? I sincerely doubt he could sneak a back door into the code given all those hoops; I don't think a deliberate bug can be both significant and unknown at the same time. Is he hoping that his bug will cause the software to make a slight miscalculation? Whoopty shit. Whatever agency he or she is working against will be annoyed for a little while and then fix the problem.

    Even if his deliberate bug caused a catastrophic failure, it can and will be traced back to HIS contribution. And if some terrorist group stands up and says "Ha ha! Look what we did! And here's why!" (and if it's Al-Qaeda we can be almost certain of this), that man is immediately under FBI surveillance and probably under arrest.

    In any case, inserting a bug would be a lot of work for an uncertain return, and success would mean almost inevitable detection.

    Why some terrorist would bother with this approach is beyond me. It's so much easier just to fill a truck with dynamite.
  • by Godai ( 104143 ) * on Tuesday July 27, 2004 @01:14PM (#9813931)

    ...have to be low. I mean, let's go over the steps such an alleged terrorist would have to go through to get this crucial system bug into the kernel:

    1. craft a bug capable of granting root access to someone in the know
    2. craft the code that creates the bug in such a way that it would be accepted by Linus. This would require:
      • a plausible reason for the patch, i.e. a feature addition or a bug fix
      • the crafted 'secret bug' would have to be embedded in code that purports to add the alleged feature or fix the supposed bug
      • Linus (or whoever) would have to miss the bug, so it would have to be fairly subtle (see the sketch below)
    3. hope that no one detects the bug, whether the hardware manufacturer or anyone else using the OS, before the military hardware goes live in the field (or wherever)
    4. ???
    5. Profit! (Terrorize?)

    That's a pretty obscure set of circumstances. Does it mean it can't happen? No. But contrast it with the proprietary methodology, wherein a coder usually has unrestricted access to the code base. Hmmm. Sounds more plausible there!
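    (Step 2 was actually attempted once. In November 2003, someone slipped a change into a Linux kernel source mirror, not the official tree, that posed as an error check in sys_wait4() but used '=' instead of '==' to silently hand the caller root. The snippet below is a self-contained reconstruction from memory, not the verbatim diff; the names are modeled on the kernel's:

        #include <stdio.h>

        #define __WCLONE 0x80000000u
        #define __WALL   0x40000000u

        struct task { unsigned uid; };
        static struct task current_task = { 1000 };  /* unprivileged user */

        int wait4_check(unsigned options)
        {
            /* Looks like routine flag validation, but the single '='
             * ASSIGNS uid 0 (root) to the caller when the magic flag
             * combination is passed. The assignment yields 0 (false),
             * so no error is ever returned and nothing looks amiss. */
            if ((options == (__WCLONE | __WALL)) && (current_task.uid = 0))
                return -1;  /* -EINVAL in the kernel version */
            return 0;
        }

        int main(void)
        {
            wait4_check(__WCLONE | __WALL);
            printf("uid after call: %u\n", current_task.uid);  /* prints 0 */
            return 0;
        }

    It was caught within hours, precisely because the change bypassed the normal review chain and matched no approved changeset.)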

    Of course, the key thing to note here is that anyone who has to dredge up the dreaded formula terrorism + open source == Disaster!!! is probably desperate to save his flagging business.
