Open Source a National Security Threat
n3xup writes "Dan O'Dowd, CEO of Green Hills Software, suggests that open source software could be sabotaged by foreign developers and should not be used for U.S. military or security purposes. He likened Linux to a Trojan horse: free, but in the end a lot of trouble. O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding."
remember this guy? (Score:5, Informative)
This was covered by LWN back in May: http://lwn.net/Articles/83242/ [lwn.net]
IIRC, GHS does development on embedded XP stuff? I don't remember the details...
Haha, a Trojan horse (Score:1, Informative)
Remember, you have the source, and with enough eyes all bugs are shallow; that applies to evil code as well.
M$ windoze is the real Trojan horse: the one you cannot see inside of, and which, on top of that, is being forced upon you.
Groklaw destroyed this FUD...long ago (Score:5, Informative)
Truly nothing to see here, folks. Just empty FUD that has been discredited.
Closed source safer? (Score:4, Informative)
What about outsourced closed-source? (Score:3, Informative)
"Attempt" is right (Score:5, Informative)
Same crap from January. (Score:3, Informative)
http://developers.slashdot.org/developers/04/01
Re:FUD. (Score:5, Informative)
Check out this link: Understanding the Windows EAL4 Evaluation [jhu.edu]
EAL doesn't really mean much. At least, not until you get up to the higher levels. It's basically there so that government departments can have a checklist requirement for any software they buy or commission.
Re:Understand the Source Perspective (Score:5, Informative)
So you are telling us that if they BUY the software from XYZ company, they blindly accept it as perfect and simply use it without question?
If so, then I really need to look at emigrating out of the United States, because the level of incompetence is getting insane.
I don't care if it's free/OSS or sixty-bajillion-dollar closed-source software written by aliens from Alpha Centauri. If it's something you absolutely rely on, you had damn well better check it completely. OSS should abide by the same rules as everything else: check it completely, from beginning to end.
Not Wind River (Score:5, Informative)
Bruce
Re:Understand the Source Perspective (Score:5, Informative)
The American government actually has an entire agency whose job is to perform just such tasks.
It's called the NSA.
Will the NSA actually perform this function with OSS?
They've already made their own distro.
KFG
Issues at Hand (Score:5, Informative)
First of all, this comes from a company that charges *a lot* of money for an OS and therefore stands to lose *a lot* to a free one. GH would naturally be expected to say that a GH product is better.
The fact that GH source code is not open source does not mean that no one ever sees it. I have access to the entire source and, if so inclined, could use that knowledge to craft an attack myself or provide the source to someone else. Having signed a release for the source is no real barrier; money talks.
GH has, up to this point, maintained 'top dog' status in this area. In fact, when we asked for a driver for USB mass storage, the response was 'Well, where else would you get it? It is going to cost you.'
IMHO, GH has had a bit of a mini-Microsoft status within the military embedded world. This has certainly mirrored the PC OS world: one leading OS, some neat features, but when you really look at it, how many ways are there to create a GUI or an OS? Let's be honest, an OS has queues, semaphores, a file system (replaceable, in GH), etc. So we are not talking about 'rocket surgery'.
The claim that Linux is not 'military grade' would really need to come from an independent group. This is akin to MS saying that it has the best browser or GUI. Of course they are going to say that.
my comment about his article (Score:3, Informative)
In theory, of course, you're totally right to believe this. In practice, however, you're inescapably wrong.
First, since Linux is open source, any army implementing these Linux embedded systems has most likely read through the code to verify its normal behavior and the absence of serious design flaws. Second, terrorists nowadays avoid computers for fear of being traced over the net by the NSA or CIA, which prevents them from ever contributing code to Linux. Third, the Linux kernel development team now has a signature follow-up on the Internet, so that each piece of code can be traced back to its original author. That makes it much easier to locate the developers of Linux, many of whom are in countries you failed to mention, like Japan, Australia, Finland, and many other western countries that the US government trusts.
Beyond that, the open-source community is the best bug-tracking-and-solving community in the world. I believe that when a new version of the Apache web server once shipped with a security flaw, the bug was traced in the code and a patch submitted less than an hour later. So even in the case of a security flaw in the Linux kernel, I believe that in less than 35 minutes the army's computer specialists would be able to trace and fix the flaw. And those security flaws are precisely why the army orders pre-series units of each piece of equipment it will use and tests them for a few months against anything they're expected to meet in a combat zone, including loss of OS stability, loss of control, and even total power failure and recovery.
You have only looked at the theoretical side of the problem and propose no solution to the problems you see, so I consider your article a big rant against open source, not the constructive criticism that, in my opinion, would be true patriotism.
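The "signature follow-up" mentioned above corresponds to the kernel's Signed-off-by convention (the Developer's Certificate of Origin, adopted in 2004). A minimal sketch of extracting those trailers from a patch description; the commit message here is hypothetical, for illustration only:

```python
import re

def signed_off_by(commit_message: str) -> list:
    """Return the Signed-off-by trailers found in a kernel patch description."""
    return re.findall(r"^Signed-off-by:\s*(.+)$", commit_message, re.MULTILINE)

# Hypothetical patch description (made-up name and address).
msg = "net: fix refcount leak\n\nSigned-off-by: Jane Dev <jane@example.com>\n"
authors = signed_off_by(msg)
```

Each trailer records who wrote or handled the change, which is exactly the traceability this comment points at.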
I've got two words for this guy: (Score:3, Informative)
Re:Understand the Source Perspective (Score:3, Informative)
Run something with a known analytic solution through an artificially complicated computation and check that you get that same solution back. Any error means something is wrong; it's somewhat like recompiling the kernel to check for strange memory errors. Too many eyes. Too many idle hands with idle computers.
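A minimal sketch of that check, with a numeric integral standing in for the "artificially complicated computation" (illustrative only): the integral of sin(x) over [0, pi] is exactly 2, so any significant deviation flags a fault somewhere in the toolchain or hardware.

```python
import math

def trapezoid(f, a, b, n=100_000):
    """Numerically integrate f over [a, b] with the composite trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Known analytic solution: the integral of sin over [0, pi] equals 2.
result = trapezoid(math.sin, 0.0, math.pi)
assert abs(result - 2.0) < 1e-6, "computation disagrees with the analytic solution"
```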
You're far more likely to get away with planting a bug in closed source, because the scrutiny is less and much more predictable. By the time an open source project gets anywhere near serious use, there are egos to contend with. It's open. It's shared. But there's something even stronger than the "Not Invented Here" syndrome. One indication was when OpenBSD said "Everybody patch OpenSSH" and Debian said "Show us the exploit." If OpenBSD doesn't get instant credibility on a security issue, nobody else is going to get much either.
There is an issue if the code itself is weak. With open source it will be found out, and everybody will know about it. With closed source, the odds favor your enemy knowing about weaknesses you have no idea exist.
Re:Understand the Source Perspective (Score:3, Informative)
IAAWSSE (I am a weapons systems safety engineer), and I can tell you there's another point where this argument falls apart. It's not as if weapons systems software is accessible via the Internet; there's an "air gap" between these kinds of systems and the rest of the world. So even if there are security holes in the software, it's not as if J. Random Hacker is going to be able to connect to these systems to exploit them.
While insider attacks are possible, a) the people who operate these things are fairly thoroughly checked out before they get in the military, b) generally speaking, they have a vested interest in having their systems operate properly, and c) if in spite of all that, they were still motivated to do mischief, they'd have to be signed into a military computer terminal somewhere inside the system... and chances are very high they'd be caught.
Sean
Re:Understand the Source Perspective (Score:3, Informative)
Besides, both Linux and Windows get plenty of hack attempts and plenty of holes are closed. So neither is probably that ideal for military use.
Nothing will ever be that secure; they should take steps to make sure their systems aren't exposed to the public: private networks, no dial-up numbers, no use of Wi-Fi, etc.
Criteria for ICAT vulnerability citation (Score:4, Informative)
I was looking through the author's citations, and his claim about the number of vulnerabilities in Linux compared to those in Windows seems pretty questionable. The database [nist.gov], as you can see here, has one selection for Linux and many for Windows. It appears the U.S. National Institute of Standards and Technology considers components of Windows, such as Internet Explorer, not to be part of the operating system, listing vulnerabilities of those components separately from those of the OS. At the same time, Linux vulnerabilities include Sound Blaster driver issues and problems with third-party software such as Symantec Antivirus.
The strengths of Linux count against its security (Score:3, Informative)
If you're betting your life or your country's safety, you want something like Evaluated software doing the protecting, and not just EAL4-level stuff like NT. Look at the Common Criteria and at the definitions of what the software evaluation levels are appropriate for.
Evaluation above EAL4 means that the documentation and the test development each take much more time than the coding. The Evaluation itself (assuming the vendor has all his docs and tests complete) takes twice the time of the development. The Evaluation is done in a four-tiered process, with each of the four entities (lab, validator, technical approval board, and vulnerability tester) having access to the source code, to the developer's documentation, and to the developers themselves.
High levels of evaluation require single-source development under a single set of development standards.
Code developed in our group is reviewed in writing by three of the most senior architects of the product. Each reviewer objection or concern must be satisfied before the code passes to the next reviewer.
That means we can document that seven security-trained people or outside organizations have looked at any code that is declared "Evaluated".
The object code is delivered via a trusted distribution methodology, so there is end-to-end verification (including while loading and while running) that the code that was developed and evaluated is the code you are running.
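The core of such end-to-end verification can be sketched as a digest comparison (a toy illustration only; real trusted distribution adds cryptographic signatures over the digest and repeats the check at load and run time):

```python
import hashlib

def digest(image: bytes) -> str:
    """SHA-256 digest of a delivered object-code image."""
    return hashlib.sha256(image).hexdigest()

def verify(image: bytes, published_digest: str) -> bool:
    """Accept the image only if it matches the digest published with the evaluation."""
    return digest(image) == published_digest
```

A single flipped byte anywhere in the image changes the digest, so the delivered code provably matches the evaluated build.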
Now compare the Linux method of development and distribution.
To say that the Linux code is locked down and tested is to say that the barn door is locked too late in the process for the kinds of things the author of this posting cites as potential threats. The emphasis must be on security above all, designed in from the beginning, and nothing less.
Is every Linux improvement preceded by a security review?
Is there a security guru that can stop ship?
Is the security guru trained in security?
Is the security guru management supported?
Developing and deploying secure software is a time-consuming, expensive specialty that only very few companies attempt.
You Are Being Deliberately Misleading. (Score:4, Informative)
If the various world governments will go to the trouble of auditing defense contractors' code, then they can save themselves some trouble and audit Open Source code instead; any vendor building from that base will require less time in audit later. If the governments do not demand an independent audit of contractors' code, then that is where you will find the weak link. With Open Source, you always have the opportunity to audit at any time, diff against previously audited sources, and compile customized code with minimal audited feature sets.
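The "diff against previously audited sources" step can be sketched as follows (a minimal illustration, not any government's actual audit tooling): hash every file in the audited snapshot and in the candidate tree, then report what changed, appeared, or vanished, so only the delta needs re-audit.

```python
import hashlib
import os

def tree_digests(root):
    """Map each file's path (relative to root) to its SHA-256 digest."""
    digests = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root)
            with open(full, "rb") as f:
                digests[rel] = hashlib.sha256(f.read()).hexdigest()
    return digests

def audit_diff(audited_root, candidate_root):
    """Files that changed, appeared, or vanished since the audited snapshot."""
    a, c = tree_digests(audited_root), tree_digests(candidate_root)
    return {
        "changed": sorted(p for p in a.keys() & c.keys() if a[p] != c[p]),
        "added":   sorted(c.keys() - a.keys()),
        "removed": sorted(a.keys() - c.keys()),
    }
```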
Green Hills is saying "Trust Us! Trust Us!" Open Source is suggesting you trust what you can independently verify before your own experts' eyes.
As for the tool chain issue, you are seriously glossing over the obvious: all the statements you have made apply to proprietary vendors as well. The solution is simple: don't upgrade the tool chain until the changes pass inspection. This is standard operating procedure for all mission-critical deployments.
-Hope
Rebuttal from LynuxWorks at COTS Journal (Score:2, Informative)
LynuxWorks: http://www.cotsjournalonline.com/home/article.php
GHS: http://www.cotsjournalonline.com/home/article.php
Linux is not an RTOS (Score:5, Informative)
What O'Dowd fails to mention in all of this is that Level A certification requires a detailed specification of the requirements the system must implement. These requirements must be covered by test cases that give full requirement coverage (or appropriate analysis) and structural coverage (for Level A, that is MC/DC coverage [uow.edu.au]). The Open Source methodology is a long way from being a DO-178B-compliant process, and rightly so: the rules for change control of a Level A-certified product are the exact opposite of the "release early, release often" method embraced by a typical open source project, because the development objectives are entirely different. This does not mean that an open source program cannot be certified to Level A; it means that it requires a great deal of work on the part of the organization submitting it for Level A compliance first.
DO-178B is the most rigorous safety evaluation standard in the aerospace, automotive, or defense industries. There is no difference in the DO-178B certification guidelines for verifying a closed-source vs. an open-source application. What both must provide is documentation of the process used to produce the product, along with design and architectural requirements for the application that can be independently verified for full MC/DC coverage by an independent third party. Each application must be shown to accommodate space (memory access) and time (real-time scheduling) partitioning requirements on any device it runs on.
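To make "MC/DC" concrete: for every condition in a decision, the test set must contain a pair of cases that differ only in that condition and produce different decision outcomes. A toy illustration with a hypothetical three-condition decision (not from any certified code base):

```python
def decision(a: bool, b: bool, c: bool) -> bool:
    # Hypothetical decision under test: three conditions, one outcome.
    return a and (b or c)

# Four vectors suffice for three conditions.  Each condition has a pair of
# vectors differing only in that condition, with differing outcomes:
#   a: (T,T,F) vs (F,T,F)   b: (T,T,F) vs (T,F,F)   c: (T,F,T) vs (T,F,F)
mcdc_vectors = [
    (True,  True,  False),  # decision True
    (False, True,  False),  # only a flipped (vs. first) -> decision False
    (True,  False, False),  # only b flipped (vs. first) -> decision False
    (True,  False, True),   # only c flipped (vs. third) -> decision True
]
outcomes = [decision(*v) for v in mcdc_vectors]
```

Plain statement or branch coverage would accept far weaker test sets; demonstrating this per-condition independence is a large part of why Level A verification costs so much.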
Most Level A OSes are RTOSes with (if you're lucky) ANSI and POSIX libraries for I/O and math. There are companies that have modified Linux for use in real-time embedded applications, but the standard Linux scheduler is not real-time and does not perform space partitioning of application memory (which means it can be Level E, but nothing above that). If software does not affect safety-critical parameters, it doesn't have to be Level A; Levels D or E are acceptable.
Re:Understand the Source Perspective (Score:5, Informative)
Absolutely. And to anyone who cares to argue that proprietary companies are more strict in reviewing their own code: please explain the abundance of easter eggs [eggheaven2000.com] in proprietary software.
Rebuttal... (Score:1, Informative)
I think that Mr. O'Dowd represents one extreme end of the community, and a very paranoid one at that. His argument has a certain degree of merit, but he, like most of those posting comments, has missed the point. I could post a lengthy rebuttal to Mr. O'Dowd's points, but that would be useless, as his statements are based on his beliefs, and belief systems are difficult to change. Instead, I will say this...
Mr. O'Dowd, in an attempt to strengthen his position, rattles off a number of socially obscure references to government security standards and policies. Attempting to create emotion in favor of your position by spouting vague and obscure references to security standards shows, well, Mr. O'Dowd's insecurity at best. I do not dispute their relevance to his position, but I do dispute the claim that Linux has not passed nor been tested against these standards. Let me be direct: if Linux is going to be used in operational areas where these standards exist, then it MUST pass them, now mustn't it? These standards exist and MUST be adhered to whether the source is open or proprietary. Using them as a means of disqualifying Linux from operational areas is silly.
Attacking Linux by saying that because it is open source someone can "easily infiltrate the Linux community to contribute subversive software" is also rubbish. Going back to Mr. O'Dowd's argument about standards compliance, he destroys this as a valid point himself: the clearance process for software to comply with the standards he mentioned would prevent this from happening, just as it does for proprietary code. Basically, when the government and matters of national security are involved, there is no such thing as closed code. The code is scrutinized very carefully no matter what the source.
So what does this all mean? Well, Mr. O'Dowd, as I mentioned above, is at an extreme end of the spectrum. His comments, although alarmist and defensive in nature, have a certain amount of value. I believe there are enough checks and balances in place to see that our nation's secrets and its critical systems remain safe. One thing also to keep in perspective is that NO SYSTEM, open or proprietary, is safe from attack or vulnerabilities; that is an unrealistic ideal. However, I do not think that open source software poses any lesser or greater threat to security, and it does offer a much more flexible solution than a proprietary counterpart.
Jason Lockhart
Director, HPC and Technology Innovation
College of Engineering
Virginia Tech