Open Source a National Security Threat
n3xup writes "Dan O'Dowd, CEO of Green Hills Software, suggests that open source software could be sabotaged by foreign developers and should not be used for U.S. military or security purposes. He likened Linux to a Trojan horse: free, but in the end a lot of trouble. O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding."
Understand the Source Perspective (Score:5, Insightful)
The threat comes from the length of time some large government projects run. Some systems have been around longer than you and me. In the proprietary world, your whole project depends on a set of companies staying in business for 30+ years. With Linux, you're no longer dependent on that chain: you can leverage the community for updates or, if necessary, make the changes yourself as the developer. Most people fail to make this point about Linux; everyone just says, hey, it's free and cheap. But if you really want to sell Linux, try saying that your entire project no longer hinges on someone else's proprietary solution because you will have the source code in hand - people will listen.
It's easy to rebut Green Hills' FUD by saying all changes will be baselined and a change control board will review any updates (easy enough, huh?).
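To make the change-control-board idea above concrete, here is a minimal sketch of such a gate in Python. The file layout, the signoffs.json format, and the two-reviewer threshold are illustrative assumptions, not any real DoD or vendor process.

    # change_gate.py - minimal sketch of a change-control-board gate (hypothetical
    # file names and sign-off format; illustrative only).
    import hashlib
    import json
    import sys
    from pathlib import Path

    def sha256(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def gate(patch_dir: Path, signoff_file: Path, min_reviews: int = 2) -> list:
        # signoffs.json maps a patch's SHA-256 digest to the reviewers who baselined it,
        # e.g. {"ab12...": ["reviewer_a", "reviewer_b"]}
        signoffs = json.loads(signoff_file.read_text())
        approved = []
        for patch in sorted(patch_dir.glob("*.patch")):
            reviewers = signoffs.get(sha256(patch), [])
            if len(reviewers) >= min_reviews:
                approved.append(patch)  # baselined and reviewed: safe to apply
            else:
                print(f"REJECT {patch.name}: {len(reviewers)} sign-off(s), need {min_reviews}")
        return approved

    if __name__ == "__main__":
        for p in gate(Path(sys.argv[1]), Path(sys.argv[2])):
            print(f"APPLY  {p.name}")

Anything not explicitly signed off never reaches the build, which is the whole point of baselining.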
Re:Understand the Source Perspective (Score:4, Insightful)
Re:Understand the Source Perspective (Score:5, Funny)
Supporting comment (Score:5, Interesting)
Just as the parent post suggested. Except the government is already auditing open source and customizing the Linux kernel to its own needs... Does nobody remember NSA Secure Linux [nsa.gov]?
Re:Supporting comment (Score:5, Insightful)
When one is ranting in a desperate plea to defend one's own methodology & existence, it is often helpful to ignore facts that do not support one's case.
Re:Understand the Source Perspective (Score:5, Insightful)
Yes, of course, because the fact that open source has some advantages doesn't negate the risk pointed out in the article. It just means that there are risks both ways.
ANY piece of software that you run on a secure system has the potential of subverting the system. I think open source does create the illusion that it couldn't contain hidden malware because where could it hide in open source, right? Well, anyone who has ever seen the entries in an obfuscated C contest and wondered what that code could possibly do ought to be able to see the flaw in that argument. For that matter, anyone who has ever gone over and over HIS OWN CODE looking for a bug and not finding it ought to ask himself, what if it weren't even my own code and I didn't even know that a bug existed?
Closed source is even worse in this respect - but at least we know who wrote it, right?
Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.
Re:Understand the Source Perspective (Score:5, Insightful)
Exactly. The author also helpfully ignores the backgrounds - often unknown to the end user - of the developers of closed source software. I've been in this industry for 12 years and have worked in one place (out of 7) that did not have a foreign national on the team. Nationalities have included Chinese, Vietnamese, Russian, Indian, Pakistani and Syrian.
The only point the author made that I could agree with would be that all software used for the military/intelligence communities should be thoroughly tested & certified to a high standard of security. I doubt there are many that would disagree with this statement. The problem is the author is hiding this valid argument beneath a layer of FUD intended solely to harm Linux & support the proprietary development model his company has chosen. He uses fear & stereotypes to paint the opposition without explaining what his company is doing that will solve the problem in a way that open source cannot.
Re:Understand the Source Perspective (Score:5, Informative)
Absolutely. And to anyone who cares to argue that proprietary companies are stricter in reviewing their own code: please explain the abundance of easter eggs [eggheaven2000.com] in proprietary software.
Re:Understand the Source Perspective (Score:5, Insightful)
There are numerous examples of malware (like those in Kazaa), easter-eggs (like the flight simulator in Excel and the pinball game in MS Word) and unrequested features (like Windows Product Activation) in proprietary software.
While I agree that something similar could in theory also happen to some one-man or very small open source projects (in theory, because I have never heard of any such occurrence), there is absolutely no way such code could be smuggled into bigger projects like Linux, Apache, KDE or the like; there are just too many people watching.
So if you compare exactly what the article is talking about, proprietary software has a much worse track record.
Comment removed (Score:4, Insightful)
Re:Understand the Source Perspective (Score:5, Insightful)
Believe me, if you're talking about something like gunnery firmware, they're going to test it... the deepest fear in DoD these days is friendly fire.
Re:Understand the Source Perspective (Score:3, Insightful)
No it isn't - if it were, they would have burned all the Patriot systems already. The biggest fear in DoD is making sure their pet contractors stay on the payroll so that they keep getting their kickbacks.
Re:Understand the Source Perspective (Score:5, Interesting)
That's one of the reasons why the US has always focused on fighter aircraft at the expense of anti-air artillery and SAM systems.
Re:Understand the Source Perspective (Score:5, Interesting)
US has software trojans too... (Score:5, Interesting)
When other governments start using OSS, they may be freeing themselves of these US-planted trojans. I believe THAT is the major fear of the US government... Not that they will fail to detect a foreign-planted bug in some fighter jet, but that OUR planted bugs will be found by China/India/Pakistan/Iran/etc... This would also seem to explain our government's looking the other way with regard to the Microsoft settlement. Remember that the anti-trust settlement was made within a week or so of September 11. Remember also the "Magic Lantern" project, where our government was actively looking for ways to co-opt people's boxes.
Software that cannot be easily trojaned creates just one more difficulty for our spy agencies. As with the gangster who was using pretty secure encryption, the government is now forced to use things like hardware keystroke loggers (meaning they have to have physical access to the unit), sneak-and-peek warrants, you get the idea.
The US government has an interest in keeping people using insecure systems. How easy do you think it was to open those Windows laptops captured in Afghanistan? Why, the NSA had that famous "_NSAKEY" entry in Windows!... Easy as pie. The last thing they want is for KSM and OBL to start putting strongly encrypted filesystems on their Linux laptops in Afghanistan. No way to plant the backdoor!
Expect to see a lot more of this type of FUD... The US government has plenty of time and money to make sure that their Linux systems are safe; they just don't want others using them...
Re:Understand the Source Perspective (Score:5, Insightful)
More to the point, will they do this with closed source projects? Getting a mole into Green Hills Software, Microsoft, etc. is every bit as real a threat as getting one into any open source project. In many cases it might even be easier because of the lack of good hiring practices and oversight at small defense companies.
TW
Re:Understand the Source Perspective (Score:3, Insightful)
What hiring practices does Linux have?
Defense companies have to go through a certain amount of security and background checks to win a contract.
Re:Understand the Source Perspective (Score:5, Insightful)
Doesn't matter one jot. Gee, look, there's the source code. Every bug, hole and trojan horse, just waiting for you to find them. All you have to do is audit the code. You should be auditing the code of any product you're going to use in a sensitive environment anyway, whether it's closed or open source. Where's the difference?
Comment removed (Score:4, Insightful)
Oh really (Score:5, Insightful)
Wasn't there recently an article about a router with a backdoor shipped out in its code? How about all those darn "easter eggs" floating around in Windows and Office and other programs?
I would challenge you to compile a new Intel C library using a Microsoft C compiler from 6 years ago too. Heck, compile glibc using an IRIX compiler from six years ago.
You can drag out all the scenarios you want, and whether it's Linux or *nix or BSD or Windows you're going to have the same audit challenges - and with closed source you won't even have access to the source code without negotiating with all your suppliers.
Re:Oh really (Score:4, Funny)
The idea of an easter egg in missile guidance software is amusing, at least.
"Now see here, Joe, lemme show you a little trick. You fire this here missile directly at the sun and that enables the solitaire game!"
Re:Understand the Source Perspective (Score:5, Insightful)
Even that might be insufficient. Ken Thompson showed in his Turing Award lecture, "Reflections on Trusting Trust" (1984), how a compiler trojan can exist even if the backdoor is not present in the compiler's source code.
Here's the link [acm.org].
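As a rough illustration of Thompson's point, here is a toy sketch in Python (not real compiler code, and the function names are made up): the "compiler" source looks clean, yet the trojaned build plants a backdoor into the login program and re-plants itself whenever it compiles a new compiler.

    # trusting_trust_toy.py - a toy illustration of the idea, not real compiler code.
    # The "compiler" here just passes source through; the trojaned wrapper shows how a
    # backdoor can persist even though the compiler's own source tree stays clean.
    import inspect

    BACKDOOR = '\nif user == "kt": grant_access()  # injected; never appears in any source tree'

    def clean_compile(source: str) -> str:
        # stand-in for a real compiler: output mirrors input
        return source

    def trojaned_compile(source: str) -> str:
        if "def check_login(" in source:
            # case 1: compiling the login program -> silently append a backdoor
            return clean_compile(source) + BACKDOOR
        if "def clean_compile(" in source:
            # case 2: compiling the compiler itself -> re-insert this very trojan,
            # so rebuilding from pristine compiler source changes nothing
            return clean_compile(source) + "\n\n" + inspect.getsource(trojaned_compile)
        return clean_compile(source)

    if __name__ == "__main__":
        login_src = 'def check_login(user, password):\n    return verify(user, password)'
        print(trojaned_compile(login_src))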
Re:Understand the Source Perspective (Score:3, Insightful)
Re:Understand the Source Perspective (Score:5, Insightful)
As evidenced by his desperate attempts to stave off a dwindling market share, they're obviously doing a better job than the Green Hills CEO.
Comment removed (Score:4, Insightful)
You Are Being Deliberately Misleading. (Score:4, Informative)
If the various world governments will go to the trouble of auditing defense contractors' code, then they can save themselves some trouble and audit Open Source code instead; any vendor building from that base will require less audit time later. If the governments do not demand an independent audit of contractors' code, then that is where you will find the weak link. With Open Source, you always have the opportunity to audit at any time, diff against previously audited sources, and compile customized code with minimal audited feature sets.
Green Hills is saying "Trust Us! Trust Us!" Open Source is suggesting you trust what you can independently verify before your own experts' eyes.
As for the tool chain issue, you are seriously glossing over the obvious -- all the statements you have made apply to proprietary vendors as well. The solution is simple: don't upgrade the tool chain until the changes pass inspection. This is standard operating procedure for all mission critical deployments.
-Hope
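A minimal sketch of the "diff against previously audited sources" step described above, in Python. The JSON manifest format (relative path mapped to SHA-256) is an assumption for illustration.

    # audit_diff.py - compare a source tree against a previously audited baseline manifest.
    import hashlib
    import json
    import sys
    from pathlib import Path

    def manifest(tree: Path) -> dict:
        # relative path -> SHA-256 digest for every file under the tree
        return {str(p.relative_to(tree)): hashlib.sha256(p.read_bytes()).hexdigest()
                for p in sorted(tree.rglob("*")) if p.is_file()}

    def diff(baseline_file: Path, tree: Path) -> None:
        baseline = json.loads(baseline_file.read_text())
        current = manifest(tree)
        for path in sorted(set(baseline) | set(current)):
            if path not in baseline:
                print(f"ADDED     {path}")   # new since the last audit: review before accepting
            elif path not in current:
                print(f"REMOVED   {path}")
            elif baseline[path] != current[path]:
                print(f"MODIFIED  {path}")   # changed since the last audit: re-review the diff

    if __name__ == "__main__":
        diff(Path(sys.argv[1]), Path(sys.argv[2]))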
Re:Understand the Source Perspective (Score:3, Interesting)
Just how much code do you think Linus is actually involved with? Really now. Last I checked, he especially, and most of the rest of the crew you mention, live and breathe in kernel land. Think they've been over the code in every module out there? Think they know the code for OpenOffice inside and out? Come now.
Bottom line, if security is _that_ important, the code will be written and maintained IN HOUSE. Period.
Re:Understand the Source Perspective (Score:3, Interesting)
Re:Understand the Source Perspective (Score:5, Insightful)
If anything, I'd say the risk of getting exploits deliberately planted in code without detection is far greater in closed source applications than in OSS projects. Another lame attempt at FUD from the people behind AdTI...
Re:Understand the Source Perspective (Score:5, Insightful)
> tiny bit of error that would only be
> useful in cases of calibration of
> high-tech weapons?
I think it'd be tricky, because it would break other high-precision things as well. And the other folks using the open source project would say "hey, this fellow Fred just submitted a patch. something looks odd about it. Fred, why does line 314 do a bit shift without checking the foobar?" And then the patch would be rejected.
> If 3000 lines of dense mathematically
> rich C were checked in
I doubt any maintainer would accept such a patch. I don't accept patches for PMD [sf.net] without reading them, and if I got a 3K line patch I'd reject it out of hand.
Comment removed (Score:4, Insightful)
Re:Understand the Source Perspective (Score:3, Informative)
Run something with a known analytic solution through an artificially complicated computation to achieve the same solution. Any error means something is wrong. Somewhat like recompiling the kernel to check for strange memory errors. Too many eyes. Too many idle hands with idle computers.
Far more likely to get away with it with closed source, because the scrutiny is less and much more predictable. By the time an open so
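In the spirit of the "known analytic solution" check the parent describes, here is a small Python sketch: the same vacuum-trajectory problem is run through both the closed-form range equation and a step-by-step integration, and any disagreement beyond a tolerance is flagged. The physics, velocities, and tolerance are illustrative only.

    # sanity_check.py - compare a closed-form result against a numerical path to the
    # same answer; a mismatch means something in the computation chain is wrong.
    import math

    G = 9.80665  # m/s^2

    def analytic_range(v: float, angle_deg: float) -> float:
        a = math.radians(angle_deg)
        return v * v * math.sin(2 * a) / G

    def simulated_range(v: float, angle_deg: float, dt: float = 1e-4) -> float:
        # step the projectile forward until it returns to launch height
        a = math.radians(angle_deg)
        x, y = 0.0, 0.0
        vx, vy = v * math.cos(a), v * math.sin(a)
        while True:
            x += vx * dt
            vy -= G * dt
            y += vy * dt
            if y <= 0.0 and vy < 0.0:
                return x

    if __name__ == "__main__":
        for v, ang in [(300.0, 30.0), (300.0, 45.0), (150.0, 60.0)]:
            exact, sim = analytic_range(v, ang), simulated_range(v, ang)
            status = "OK" if abs(exact - sim) / exact < 1e-3 else "MISMATCH"
            print(f"v={v:6.1f} angle={ang:4.1f}  analytic={exact:10.2f}  simulated={sim:10.2f}  {status}")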
Re:Understand the Source Perspective (Score:3, Insightful)
Re: (Score:3, Insightful)
Re:Understand the Source Perspective (Score:5, Interesting)
That's why you do testing and code reviews. It's not like these people are downloading new kernels in the field; any code that goes into a government project requires immense testing and code review... PERIOD. I don't care who wrote it.
If the military wanted to use open source software they would likely take the source and lock it down, producing a branch for themselves that would be secured and standardized after a large review. If they wanted to bring in new functionality from the "public" branch it would mean a new version of their "secure and approved" branch, which would have to go through the same review process again.
It's not like they don't have to do this anyway with the code they produce now... sure, they aren't expecting people to try to sabotage them, but you can do that without intention simply by making a coding error. Testing & code review is essential to the process.
This isn't that much different from what the military does with hardened versions of commercial processors... sure, they lag behind their commercial counterparts because they have to be hardened and tested heavily, but then they work, and they are able to leverage the initial design work and testing done when the hardware was being developed for commercial purposes.
Re:Understand the Source Perspective (Score:3, Insightful)
Re:Understand the Source Perspective (Score:5, Interesting)
Also, your code would have to be integrated deeply enough into the calculations to only misfire when aimed at a certain target or to misfire at a set percentage. If the misfires were too frequent they wouldn't buy off on the weapon.
Re:Understand the Source Perspective (Score:5, Informative)
So you are telling us that if they BUY the software from XYZ company they blindly accept it as perfect and simply use it without question??
If so, then I really need to look at emigrating out of the United States, because the level of incompetence is getting insane.
I don't care if it's free/OSS or 60-bajillion-dollar closed source software written by aliens from Alpha Centauri. If it's something you absolutely rely on, you had damn better check it completely. OSS should abide by the same rules that the other stuff does... check it completely from beginning to end.
Re:Understand the Source Perspective (Score:5, Informative)
The American government actually has an entire agency whose job is to perform just such tasks.
It's called the NSA.
Will the NSA actually perform this function with OSS?
They've already made their own distro.
KFG
Re:Understand the Source Perspective (Score:5, Insightful)
Uh, not much. If the weapons aren't hitting the mark on the firing range they probably wouldn't get deployed until they are fixed.
This is probably a poor example. The danger isn't in OSS that is designed to fail. If it doesn't work it wouldn't get used. The danger is an obscure security hole that would allow infiltration.
The key point where this guy's whole argument falls apart is that proprietary software isn't any better. I'm confident Microsoft employs a small army of foreigners, and I'm not sure they would be any more reliable than OSS developers; their code gets a lot less scrutiny, and absolutely none if you are a customer getting binaries. Most big companies are putting R&D centers in India and China. How do they assure us the people they are hiring don't have ulterior motives?
If you want to develop software critical to national security you have to develop it in a classified lab with cleared employees. Oh, but wait: in spite of all the scrutiny people with security clearances get, some still turn out to be foreign agents and do great damage. Los Alamos doesn't exactly have a stellar security record, and those people get more scrutiny than anyone. The Navy's comsec has been massively compromised in the past.
I'd argue the opposite case from this guy. If you want secure software, the best approach is to have as many people as possible, both OSS and government, scrutinize the source. If you find a project that is intentionally or negligently checking in compromised code, blacklist them or give them extra scrutiny. The NSA's secure Linux effort is an example of the government making sure OSS is secure, and it's far more likely to be that than anything Microsoft or Green Hills is going to give them.
On a tangent here [counterpunch.org] is an interesting article on Homeland Security trying to enforce security through obscurity in the physical world. Someone walked around the DNC and took photos of all the weaknesses in their security in Boston and posted it on a list on Yahoo. Homeland security shut down the list and is collecting the names of everyone on the list and everything said. Should give you pause before joining any list in these interesting times.
Re:Understand the Source Perspective (Score:3, Insightful)
The answer should be obvious.
Re: (Score:3, Informative)
Re:Understand the Source Perspective (Score:3, Funny)
Just a very brief point, but how do you fix embedded system bugs? Am I incorrect in thinking that embedded means the OS is on-chip?
Same crap from January. (Score:3, Informative)
http://developers.slashdot.org/developers/04/01
Not Wind River (Score:5, Informative)
Bruce
Re:Understand the Source Perspective (Score:3, Insightful)
If a man on death row for murder tells you that murder is wrong, is he incorrect simply because of who he is?
Sure they (Green Hills) have an agenda, and it should be noted. However, they may have a point and shouldn't be discredited out of hand.
The rest of the world chooses Linux for the same reasons (Score:5, Insightful)
Governments OUTSIDE the US are choosing open source for exactly the same reason: who knows what M$ + the NSA put in the closed Windows source that might hurt other nations?
[World Govs Choose Linux For Security & More]
http://slashdot.org/articles/01/12/11/0132213.sht
Re:The rest of the world chooses Linux for the same reasons (Score:5, Interesting)
(who knows what M$ + NSA put in the closed windows source that might hurt other nations)?
Cryptographic code [heise.de] for a start.
---
It's wrong that an intellectual property creator should not be rewarded for their work.
It's equally wrong that an IP creator should be rewarded too many times for the one piece of work, for exactly the same reasons.
Reform IP law and stop the M$/RIAA abuse.
remember this guy? (Score:5, Informative)
This was covered by LWN back in May: http://lwn.net/Articles/83242/ [lwn.net]
IIRC, GHS does development on embedded XP stuff? I don't remember the details...
Totally! Let's OUTSOURCE instead! (Score:5, Funny)
*We wanted to buy software from only American developers, but we couldn't afford it.
FUD. (Score:5, Insightful)
GASP! Some XYZ providers even outsource their development to ABC and DEF (insert your favorite company and terrorist sponsoring country where necessary).
It would be incredibly naive to believe that other countries and terrorist organizations would not exploit an easy opportunity to sabotage our military or critical infrastructure systems when we have been doing the same to them for more than 20 years!
I think it has been proven that closed-source development doesn't help to change the possibilities that a "mole" has been planted or that a "hole" will be discovered.
One of the greatest misconceptions about Linux is that the free availability of its source code ensures that the "many eyes" with access to it will surely find any attempt at sabotage. Yet, despite the "many eyes," new security vulnerabilities are found in Linux every week in addition to dozens of other bugs. Many of these flaws have eluded detection for years. It is ridiculous to claim that the open source process can eradicate all of the cleverly hidden intentional bugs when it can't find thousands of unintentional bugs left lying around in the source code.
And it is ridiculous to claim that a closed development environment will make it any different.
In addition, under the internationally recognized Common Criteria for IT Security Evaluation (ISO 15408), Windows has been certified to Evaluation Assurance Level 4 (EAL 4), a higher level of security than the EAL 2 that Linux has achieved.
According to this [com.com] article, obtaining EAL2 certification typically costs between $400,000 and $500,000. Looks like it is more money than security. In their infancy, why would Linux vendors decide to shell out large sums of money when the government wasn't interested in using Linux anyway?
This whole article is FUD. He's annoyed because Linux is making leaps and bounds and will possibly affect his market share in the lucrative defense and aerospace industries. At least he came out and said it under his own name instead of paying off a third party to "investigate" the "problems" with Linux and post their results to the world.
Re:FUD. (Score:5, Informative)
Pure FUD, indeed (Score:4, Interesting)
OK, I'll bite just once: I doubt there is a single weapon system procured by the DoD in the last 10 years that does not have a substantial portion of it outsourced overseas. Most procurements now require a certain percentage of it, by contract.
Re:FUD. (Score:5, Informative)
Check out this link: Understanding the Windows EAL4 Evaluation [jhu.edu]
EAL doesn't really mean much. At least, not until you get up to the higher levels. It's basically so that government departments can have a checklist requirement for any software they buy or commission.
Governments should not use OS without a proper... (Score:4, Insightful)
Re:Governments should not use OS without a proper. (Score:3, Interesting)
No, they let them view PART of their source code. Without compiling.
Unless you have both the full source and the compiler/toolchain it was built with, a security audit is worse than useless, as you have no way of verifying your results.
In such a case, WYSINNWYG (What You See Is Not Necessarily What You Get).
For example: You get the full source and the toolchain. You do a build on the same platform using the same flags. Your final executable has a d
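A minimal sketch of the verification step being described: rebuild from the audited source with the audited toolchain, then compare your output against the vendor-shipped binary. Note that timestamps and embedded paths can legitimately differ between honest builds, so a mismatch means "investigate", not automatically "trojan".

    # verify_build.py - compare a rebuilt executable against the shipped one.
    import hashlib
    import sys
    from pathlib import Path

    def digest(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def compare(shipped: Path, rebuilt: Path) -> None:
        a, b = shipped.read_bytes(), rebuilt.read_bytes()
        if a == b:
            print("IDENTICAL:", digest(shipped))
            return
        differing = sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))
        print(f"MISMATCH: sizes {len(a)} vs {len(b)}, {differing} differing bytes")
        print(" shipped:", digest(shipped))
        print(" rebuilt:", digest(rebuilt))

    if __name__ == "__main__":
        compare(Path(sys.argv[1]), Path(sys.argv[2]))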
Um, and what about the source China has seen? (Score:5, Insightful)
I think that's a pretty large security threat right there...
Terrorists in Microsoft (Score:4, Insightful)
Re:Terrorists in Microsoft (Score:3, Insightful)
Add bugs to Windows?
NSA (Score:3, Interesting)
Don't Trust Linus! (Score:4, Funny)
Come out of the cave! (Score:5, Insightful)
Urmm, so what operating system do you use then, Dan O'Dowd? And which newspapers and websites do you read?
You're obviously using a closed source operating system that is free of viruses, worms, holes and other security problems. What might this mystery closed source operating system be that you are using that doesn't pose a threat to the nation's security?
Groklaw destroyed this FUD...long ago (Score:5, Informative)
Truly nothing to see here, folks. Just empty FUD that has been discredited.
This reminds me of an old saying (Score:5, Funny)
O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the Open Source community will have no chance of finding.
If the source is open, how can there be no chance of finding bugs or whatever else they wish to put in the source?
This is clearly FUD to protect their market from the steamroller known as FOSS. Security through obscurity has already been proven faulty.
This is an old story, and FUD anyway (Score:5, Interesting)
The fact is that Green Hills products are no more secure, and may well be less secure, because they don't have the "many eyes" looking at their source code. We've had trojan horse attempts in Open Source software. They get caught quickly. But even if the source is disclosed, nobody outside of their tiny company has an incentive to do productive work on the internals of a Green Hills operating system in the way that people who modify GNU/Linux do. And security audits by such a small company can't catch everything.
The best example of this has been the Borland Interbase database. This was used for airline reservations, and had a trojan horse buried in it for 6 to 9 years while it was a proprietary product. The door could have been found by anyone who did an ASCII dump of the product, but those who did kept it secret, and probably took a lot of free flights. An Open Source coder found the door some months after the database went Open Source, and had an incentive to report it - at that point he was one of the people doing productive work on the database and only wanted it to work better and more securely.
This "black hats" (people who are motivated for bad purposes) vs. "white hats" (good purpose) phenomenon is important to consider when you evaluate the security of Open Source. Generally the only people who would look for vulnerabilities in proprietary software, outside of its manufacturer, are looking to exploit them! This is hardly the case with Open Source.
Thanks
Bruce
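In the spirit of the Interbase "ASCII dump" point above, here is a small Python sketch that pulls printable strings out of a binary (much like the Unix strings tool) and flags anything that looks like a hard-coded credential. The patterns are illustrative guesses, not a real detection signature.

    # ascii_scan.py - extract printable strings from a binary and flag suspicious ones.
    import re
    import sys
    from pathlib import Path

    SUSPICIOUS = re.compile(rb"(passw(or)?d|backdoor|secret|login)", re.IGNORECASE)

    def printable_strings(data: bytes, min_len: int = 6):
        # contiguous runs of printable ASCII, like the Unix 'strings' tool
        for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
            yield match.group()

    def scan(path: Path) -> None:
        data = path.read_bytes()
        for s in printable_strings(data):
            if SUSPICIOUS.search(s):
                print(f"{path.name}: suspicious string: {s.decode('ascii')}")

    if __name__ == "__main__":
        for name in sys.argv[1:]:
            scan(Path(name))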
Closed source safer? (Score:4, Informative)
What about outsourced closed-source? (Score:3, Informative)
Here's the likely scenario (Score:5, Funny)
Lead Programmer at Major Defense Contractor: Hey, can you install this patch by that new Pakistani contributor for our missile control module?
New programmer: Yeah, I looked at it. There was some weird code in there that I couldn't quite figure out. There was a one-line piece of Perl with about 10,000 characters in it. Shouldn't we look at it? What does it do, exactly?
Lead Programmer: Naw. I don't think it really matters. I don't want to look stupid because I sure can't figure Perl out. Let's just go with the release early and often policy. We'll let the users report the bugs back to us.
"Attempt" is right (Score:5, Informative)
Well, he does have a point ... (Score:5, Insightful)
And it doesn't have to be in the Linux kernel. The classic example (at least 10 years old) is to hack up gcc so that it examines the code it's compiling and, if it decides it's compiling /bin/login, does things a little differently - inserting a back door where there was none before.
However, while he does have a point, it's a very myopic point. Closed source software has exactly the same vulnerabilities, except for one critical difference - only people within the company in question have a chance of detecting the problem; the end user will never get to see the source and check whether it's compromised. Granted, most open source users do not review all the source code they use, but at least the option is there, and the people for whom security is absolutely essential (like the NSA) almost certainly use it.
Also, for a closed source company, the problem is even worse. The backdoor (or whatever) could be introduced when the code is finally compiled for distribution, and never get checked into whatever source control system they use. So the binaries get shipped out, but NOBODY has reviewed the source code in question (except our cracker friend), and once the bug does come to light (if it ever does) the company will look at the source code and scratch its head - it won't even have the offending source code to look at.
Open is Closed. (Score:4, Funny)
Freedom is Slavery.
Ignorance is Strength.
War is Peace.
Outsourced Proprietary Software? (Score:5, Interesting)
Let's call him Chicken Little (Score:3, Insightful)
The gauntlet is thrown down. I challenge this man to come up with a demonstrable "trojan horse" in a piece of open source software that cannot be found in a reasonable period of time by a security audit (the kind the government does of open source software before it is used). Such fear mongering should be laughed at, torn up, and spat upon whenever you see or hear it. It reminds me of Ridge getting up and saying, well, there's a threat around the election, but we have no evidence of it. Be scared (and vote for Bush). Yea... right. I didn't just fall off the turnip truck.
Get a life, and make better products, jerk!
I am continually amazed... (Score:5, Insightful)
The cornerstone of open source is that it is OPEN SOURCE. The government is free to view and evaluate all the packages to their little, demonic hearts' content.
If I were a terrorist, I think I would penetrate a closed-source house (say, Microsoft or Green Hills) and hack some little nasties into their source.
But maybe that's why Dan O'Dowd isn't a very good terrorist.
Funny How Great Minds Think Alike... (Score:5, Interesting)
At least with Open Source you have the source to ultimately check for yourself. Vendors like Novell, IBM, and Red Hat are supposed to be actively looking at the source to make sure no one is slipping stuff in that doesn't belong, but if you don't believe them you can do it yourself.
So you have a Mr. Dan O'Dowd trying to pin a terrorist ghost threat on Open Source. The problem is that the source is there for you to inspect. With Microsoft the only word you have is their word that they aren't monkeying with the OS to monitor you.
IMHO, BSD and Linux are perfect for military and security applications. You can inspect every corner of the kernel. You can freeze on a specific version because you always have that source code. You can branch and patch as you see fit. This seems perfect for the military and security branches. With Microsoft you have to "sign up" (how much money does it cost to do that?) to view the source, and then what? The only proof you have is that this particular version of Windows hasn't been monkeyed with. What about the patches and hotfixes? *shrug*
When it really boils down to it are you going to believe the source you compiled, you control yourself or Microsoft? I think Mr. O'Dowd's trust is ill placed.
Issues at Hand (Score:5, Informative)
First of all, this is coming from a company that charges *a lot* of money for an OS and stands to lose *a lot* to a free OS. Therefore, GH would be expected to say that a GH product is better.
The fact that GH source code is not open source does not mean that no one ever sees it. I have access to the entire source, and, if so inclined, could use that information to create an attack myself or provide the source to someone else. Remember, even though the company signed a release for the source, that doesn't mean money couldn't talk louder.
GH has, up till this point, maintained a 'top dog' status in this area. In fact, when we asked for a driver for USB mass storage, the response was 'Well, where else would you get it? It is going to cost you.'
IMHO, GH has had a bit of a mini-Microsoft status within the military embedded world. This has certainly mirrored the PC OS world - one leading OS, some neat features, but when you really look at it, how many ways are there to create a GUI or an OS? Let's be honest - an OS has queues, semaphores, a file system (replaceable, in GH), etc. So we are not talking about 'rocket surgery'.
The claim that Linux is not 'military grade' would really need to come from an independent group. This is akin to MS saying that it has the best browser or GUI. Of course they are going to say that.
Wrong Analogy (Score:5, Funny)
If the Trojan Horse were really Open Source, it would have come with a list of building materials, instructions on building the horse yourself, the number of Greek warriors inside, how the warriors were armed, along with several notes from the Phoenicians commenting on the dangers of the included Greeks...
Ignore this at your peril (Score:5, Interesting)
I'm a long time Linux user and have been around open-source for a long time. While the source of this article is obviously questionable, I work for a Defense Contractor and I'm here to tell you, the points raised in the article have some truth to them.
If you're selling products to the government and those products use an operating system, the issue of being able to GUARANTEE that your code base is not and cannot be compromised is very real. Everyone has (or should have) seen the techniques used to obfuscate trojan horses by using a compiler or some other tool, which makes this problem even harder.
The problem being alluded to here is a chain of control over a code base that can be demonstrated to the satisfaction of a DoD or other government customer. While no process can ever be completely secure, the real point is this: if you have a choice between a system developed in a closed environment where you can keep an eye on everyone involved and an open-source development, the former is easier to verify. You can call it FUD, but this is a real issue within government circles and WILL limit the use of Linux in certain applications.
Amusing article (Score:5, Interesting)
As secure as Windows? He's kidding
When I worked for the Air Force, they had several instances in which systems were compromised (desktops). Various worms came out of the blue and just hammered their network. My systems running Linux noticed it immediately; in fact, I was told there was NO problem. After a few hours of watching the logs record attack after attack, I finally saw a general email sent out to everyone explaining that there was a problem, with instructions provided.
As secure as Windows? God I hope not!
The Federal Aviation Administration (FAA) requires software that runs commercial (and many military) aircraft be approved as part of a DO-178B certification. DO-178B Level A is the highest safety standard for software design, development, documentation, and testing. It is required for any software whose failure could cause or contribute to the catastrophic loss of an aircraft.
Several operating systems have been DO-178B Level A certified. Until Linux is certified to DO-178B Level A, our soldiers, sailors, airmen and marines should not be asked to trust their lives with it.
If Linux isn't at this level, then what is the point of the article? Linux is certified for various things in the military. Whenever I stood up a server I was asked what OS I would be running. Everyone was apprehensive it might be Windows, which requires a whole heap of testing before it's allowed to run in production. As soon as I told security it was either Unix or Linux, they would sigh and tell me to go ahead. Much more confidence there.
NSA Linux (Score:3, Interesting)
my comment about his article (Score:3, Informative)
In theory, of course, you're totally right in believing this. In practice, however, you're inescapably wrong. First, since Linux is open source, the army implementing these Linux embedded systems most likely reads through the code to verify its normal behavior and lack of serious design flaws. Second, terrorists nowadays do not use computers for fear of being traced on the net by the NSA or CIA, thus preventing themselves from ever contributing code to Linux. Third and last, the Linux kernel development team now has a signature follow-up on the internet to make sure that each piece of code can be traced back to its original author, which makes it that much easier to locate the developers of Linux. Many of them are in countries that you failed to mention, like Japan, Australia, Finland and many other western countries that the US government trusts.
Besides that, the open-source community is the best bug-tracking-and-fixing community in the world. I believe this happened with the Apache web server: when a new version shipped with a security flaw, less than an hour later the bug was traced in the code and a patch submitted. So, even in the case of a security flaw in the Linux kernel, I believe that in less than 35 minutes the army's computer specialists would be able to trace and fix the flaw. And those security flaws are precisely the reason the army orders pre-production units of each piece of equipment it will use and tests them for a few months against anything they're expected to meet in a combat zone, including loss of OS stability, loss of control, or even total power failure and recovery.
You have only looked at the theoretical part of the problem, and propose no solution to the problems you see; therefore I consider your article a big rant against open source, not constructive criticism, which in my opinion would be true patriotism.
all in the family (Score:5, Interesting)
So, please explain to me again how open source terrorists are going to slip their malware under our noses?
Valid concern for *all* OSs (Score:3, Interesting)
Of all the valid reasons to attack open source software, I can't imagine how they can imply an unknown piece of code is more secure than a known piece of code, even if potential enemies are contributing to the open source (unless, of course, every programmer at Microsoft has been given the proper clearance for every level at which the OS they are working on will be used).
Exactly the point (Score:5, Insightful)
The US can continue to run Windows, be our guest, but the point is moot since much of US government software is developed in India anyway. No back doors there, for sure.
First blush (Score:3, Insightful)
And I'm a little upset. Is OSN actually letting these guys astroturf on
This whole thing smells bad. Maybe it's a slow news day, but there are better anti-Linux rants than what is coming from that lame-ass website. Nothing to see here, move along.
Scenario (Score:3, Interesting)
Say the software was used for target calculations for an artillery piece. For the sake of argument, say some "rogue" developer has added a bit of code that takes the coordinates from the GPS in the unit and checks to see if they are in, say, a Middle Eastern country. If so, the shells, which found the target just fine in testing, now miss the mark by 500 yards.
Not paranoid, just askin.
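One way a test harness could probe for exactly this scenario is to evaluate the firing solution over a world-wide grid of coordinates and compare it against an independently written reference. The sketch below (Python, with made-up placeholder math and function names, and a deliberately planted regional bias just to show the detection) is illustrative only.

    # region_sweep.py - look for location-conditional behavior in a targeting routine.
    import math

    def reference_solution(lat: float, lon: float, range_m: float) -> float:
        # trusted, independently written elevation calculation (placeholder math)
        return math.degrees(math.asin(min(1.0, range_m * 9.81 / (830.0 ** 2)))) / 2.0

    def firing_solution(lat: float, lon: float, range_m: float) -> float:
        # unit under test; here it sabotages results inside one lat/lon box to show the idea
        result = reference_solution(lat, lon, range_m)
        if 12.0 <= lat <= 38.0 and 34.0 <= lon <= 63.0:
            result += 0.35   # deliberate, location-conditional bias
        return result

    def sweep(range_m: float = 15000.0, tol: float = 1e-6) -> None:
        for lat in range(-90, 91, 2):
            for lon in range(-180, 181, 2):
                got = firing_solution(lat, lon, range_m)
                want = reference_solution(lat, lon, range_m)
                if abs(got - want) > tol:
                    print(f"DIVERGES at lat={lat:4d} lon={lon:4d}: {got:.4f} vs {want:.4f}")
                    return   # one hit is enough to demand an explanation
        print("no location-dependent divergence found")

    if __name__ == "__main__":
        sweep()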
Green Hills FUD was covered on groklaw months ago (Score:4, Interesting)
Criteria for ICAT vulnerability citation (Score:4, Informative)
I was looking through the author's citations, and it seems that his claim concerning the number of vulnerabilities in Linux compared to those in Windows is pretty questionable. The database [nist.gov], as you can see here, has one selection for Linux and many for Windows. It seems that the U.S. National Institute of Standards and Technology considers components of Windows, such as Internet Explorer, not to be part of the operating system, thus listing vulnerabilities of those components separately from those of the OS. At the same time, Linux vulnerabilities include Sound Blaster driver issues and problems with third-party software such as Symantec Antivirus.
Green Hills is the national security threat (Score:4, Insightful)
What a bizarre article.
Take the statement "Yet, despite the 'many eyes,' new security vulnerabilities are found in Linux every week in addition to dozens of other bugs." Shouldn't one consider that the "many eyes" are the developers finding those weekly bugs? I wonder how many eyes are looking for Green Hills software bugs?
As long as people are involved, mistakes (bugs) will be made. But saying that malicious code is more likely in a product where someone CAN examine the code versus a product where no one can is just plain stupid. There is obviously an undisclosed agenda here (might that be selling a DO-178B Level A rated real-time OS, aka Integrity? Getting a lot of Linux competition, eh?).
As to the standard DO-178B... the first 90% of the article is about security, and then you mention DO-178B. DO-178B is not a security standard. DO-178B is an FAA safety-related standard for software. Software certified under DO-178B can still be full of unknown security holes. The standard may be required for software used in flight-related applications, but it does not mean the software is also secure.
The Level A rating doesn't even mean "most secure", as the article seems to imply. It means that if the software crashes, it will not affect other software that is running. In other words, the software is ISOLATED, not secure. It is amazing the things companies will say when they are losing ground to a competitor.
and closed source proprietary firms.... (Score:5, Insightful)
Nope. Open source is still the best way to go, along with open government. When you let people hide "stuff", and when it's connected to massive political power and heaps of money, that's when crimes occur. The best bet is openness, bar none. It is not perfect, but it's the best design yet.
Offshoring just as much of a threat (Score:5, Insightful)
How is this riskier than companies that outsource? (Score:4, Insightful)
I find it interesting that open source software is considered a risk because individuals from other nations are allowed to participate in the development of the code...
How does this differ from corporations which provide software to the military who outsource their development to individuals from other nations?
The only difference is that the OSS model involves corporations giving up some of their control over the rights to the product, and corporations don't like that.
Otherwise, the article makes assumptions about differences between OSS remote participation and outsourcing which have no material relevance.
The idea of outsourcing being more secure because security checks are done can be argued, but even security checks fail and someone who is cleared can decide to sabotage. The problem is that once someone is vetted, they are trusted. This is actually worse than the OSS model where no matter who you are, the code is reviewed with the same level of scrutiny as anyone else's code.
I can think of so many instances of calling support, having to provide my personal identifying information to an individual who was either not in my state or not even in the US.
Sounds more like a double standard of judgement from the corporate viewpoint that is prejudiced against OSS projects.
Linux is not an RTOS (Score:5, Informative)
What O'Dowd fails to mention in all of this is that Level A certification requires a detailed specification of the requirements the system must implement. These requirements must be covered by test cases that give full requirement coverage (or appropriate analysis) and structural coverage (for Level A, that means MC/DC, modified condition/decision coverage [uow.edu.au]). The Open Source methodology is a long way from being a DO-178B compliant process, and rightly so - the rules for change control of a Level A-certified product are the exact opposite of the "release early, release often" method embraced by a typical open source program, because the development objectives are entirely different. This does not mean that an open source program cannot be certified to Level A - it means that it requires a great deal of work on behalf of the organization submitting it for Level A compliance first.
DO-178B is the most rigorous safety evaluation standard in the aerospace, automotive, or defense industries. There is no difference in the DO-178B certification guidelines for verifying a closed-source vs. open-source application. The problem both have is coming up with documentation of the process used to produce the product, along with design and architectural requirements for the application that can be verified for full MC/DC coverage by an independent third party. Each application must be shown to accommodate space (memory access) and time (real-time scheduling) partitioning requirements on any device it runs on.
Most Level A OSes are RTOSes with (if you're lucky) ANSI and POSIX libraries for I/O and math. There are companies that have modified Linux for use in real-time embedded applications, but the standard Linux scheduler is not real-time and does not perform space partitioning of application memory (which means it can be Level E, but nothing above that). If it does not affect safety-critical parameters, it doesn't have to be Level A - Levels D or E are acceptable.
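For readers unfamiliar with MC/DC, here is a toy Python sketch of what the criterion demands for a three-condition decision: every condition must be shown, by a pair of tests differing only in that condition, to independently change the outcome. This is an illustration of the coverage criterion, not DO-178B tooling.

    # mcdc_demo.py - check whether a test set achieves MC/DC for a toy decision.
    from itertools import combinations

    def decision(a: bool, b: bool, c: bool) -> bool:
        return (a and b) or c

    def achieves_mcdc(tests: list) -> bool:
        # every condition must have a pair of tests that differ only in that condition
        # and produce different decision outcomes (it independently affects the result)
        for pos in range(3):
            shown = False
            for t1, t2 in combinations(tests, 2):
                differ_only_here = all((t1[i] == t2[i]) != (i == pos) for i in range(3))
                if differ_only_here and decision(*t1) != decision(*t2):
                    shown = True
                    break
            if not shown:
                print(f"condition #{pos} not shown to act independently")
                return False
        return True

    if __name__ == "__main__":
        # four vectors are enough for three conditions (n + 1), if chosen carefully
        tests = [(True, True, False), (False, True, False),
                 (True, False, False), (True, False, True)]
        print("MC/DC achieved:", achieves_mcdc(tests))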
So US-Centric (Score:4, Interesting)
Unfortunately, you can't guarantee that someone looking to subvert Windows in a subtle way won't be hired by (or, more interestingly, license their code to) Microsoft - so with closed source you basically get the worst of all possible worlds.
If this guy actually believes this, (Score:5, Insightful)
First of all, what truly important piece of software would possibly be part of open public development? I thought this was specialized enough of a field that the only people who had any competence with what you were making were already trusted anyway. Wasn't SELinux developed *inside the NSA* before it was released?
Secondly, assuming a vital piece of software WERE being developed publicly, someone trying to insert malicious code would have to make it past a few barriers, the first being the most complicated. He would have to: 1) know what his deliberately inferior code would probably do in the finished product versus what a non-criminal would want it to do; 2) get it past the critical eye of a few other developers; 3) slip through some kind of government screening. And all the while NOT make anyone suspicious.
And even then the results are not guaranteed. What is your cyberterrorist counting on? I sincerely doubt that he could have snuck a back door into the code given all those hoops. I don't think the deliberate bug can be both significant and unknown at the same time. Is he hoping that his bug will cause the software to make a slight miscalculation? Whoopty shit. Whatever agency he or she is working against will be annoyed for a little while and then fix the problem.
Even if his deliberate bug caused a catastrophic failure, it can and will be traced back to HIS contribution, and if some terrorist group stands up and says "Ha ha! Look what we did! And here's why!" (and if it's Al-Qaeda we can be almost certain of this), that man is immediately under FBI surveillance and probably under arrest.
In any case, inserting a bug would be a lot of work. A lot of work for an uncertain return, and success will mean almost inevitable detection.
Why some terrorist would bother with this approach is beyond me. It's so much easier just to fill a truck with dynamite.
The odds of this happening... (Score:4, Insightful)
That's a pretty obscure set of circumstances. Does it mean it can't happen? No. But contrast this with the proprietary methodology, wherein a coder (usually) has unrestricted access to the code base. Hmmm. Sounds more plausible there!
Of course, the key thing to note here is that anyone who has to dredge up the dreaded formula that terrorism + open source == Disaster!!! is probably desperate to save his flagging business.
Re:Well, here's the obvious (imho) response. (Score:3, Interesting)
As for the NSA inspecting this code -- that's all well and good. But how often do hundreds of individuals look over Open Source code and miss a bug for a while? "A while" is all it takes for a foreign government to download A LOT of information that they should
I've got two words for this guy: (Score:3, Informative)
Re:Well, here's the obvious (imho) response. (Score:3, Interesting)
Re:The strengths of Linux count against its securi (Score:4, Insightful)
To say that the Linux code is locked down and tested is to say that the barn door is locked too late in the process for the kinds of things the author of this posting is citing as potential problems.
So what's stopping the DoD from taking the source code base and doing their own testing and certification on it? Considering you claim to have had a background in this, I'm surprised you didn't think of this. This may save them some time in the long run, since they don't have to go through the effort of developing the software itself.
If I decide to use a library or module from another developer (OSS or otherwise) in something that I am doing, I always take the time to test it to make sure it at least does what I want and is adequate for the task at hand. Now, my own projects don't require a terrible amount of security, but if they did, I would be certain to do some testing in that area as well.
So I just don't get your point. You don't have to develop the code yourself in order to certify it if you have the full source available to you. And then once you have certified it, after making any corrections that you need on your copy of the source, then you lock THAT down. What came out of the original source base is irrelevant at this point. It only matters what you improved upon and certified.