Open Source a National Security Threat
n3xup writes "Dan O'Dowd, CEO of Green Hills Software, suggests that open source software is vulnerable to sabotage by foreign developers and should not be used for U.S. military or security purposes. He likened Linux to a Trojan horse: free, but in the end a lot of trouble. O'Dowd thinks that unfriendly countries will attempt to hide intentional bugs that the open source community will have no chance of finding."
Understand the Source Perspective (Score:5, Insightful)
The threat comes from the length of time on some large government projects. Some systems have been around longer than you and me. In the proprietary world, your whole project depends on a set of companies staying in business for 30+ years. With Linux, you're no longer dependent on that string of vendors: you can leverage the community for updates, or if necessary you as the developer can make the changes yourself. Most people fail to mention this about Linux; everyone just says hey, it's free and cheap. But if you really want to sell Linux, try saying that your entire project doesn't hinge on somebody else's proprietary solution, that we will have the source code in hand - people will listen.
It's easy to retort Green Hills' FUD by saying all changes will be baselined and a change control board will review any updates (easy enough, huh?).
The rest of the world chooses Linux for the same reasons. (Score:5, Insightful)
Has anyone noticed how governments OUTSIDE the US are choosing open source for exactly the same reason (who knows what M$ + the NSA put in the closed Windows source that might hurt other nations)?
[World Govs Choose Linux For Security & More]
http://slashdot.org/articles/01/12/11/0132213.sht
FUD. (Score:5, Insightful)
GASP! Some XYZ providers even outsource their development to ABC and DEF (insert your favorite company and terrorist sponsoring country where necessary).
It would be incredibly naive to believe that other countries and terrorist organizations would not exploit an easy opportunity to sabotage our military or critical infrastructure systems when we have been doing the same to them for more than 20 years!
I think it has been proven that closed-source development doesn't reduce the possibility that a "mole" has been planted or that a "hole" will be discovered.
One of the greatest misconceptions about Linux is that the free availability of its source code ensures that the "many eyes" with access to it will surely find any attempt at sabotage. Yet, despite the "many eyes," new security vulnerabilities are found in Linux every week in addition to dozens of other bugs. Many of these flaws have eluded detection for years. It is ridiculous to claim that the open source process can eradicate all of the cleverly hidden intentional bugs when it can't find thousands of unintentional bugs left lying around in the source code.
And it is ridiculous to claim that a closed development environment makes any difference.
In addition, under the internationally recognized Common Criteria for IT Security Evaluation (ISO 15408), Windows has been certified to Evaluation Assurance Level 4 (EAL 4), a higher level of security than the EAL 2 that Linux has achieved.
According to this [com.com] article, obtaining EAL2 certification typically costs between $400,000 and $500,000. Looks like certification is more about money than security. In their infancy, why would Linux vendors shell out large sums of money when the government wasn't interested in using Linux anyway?
This whole article is FUD. He's annoyed because Linux is making leaps and bounds and will possibly affect his market share in the lucrative defense and aerospace industries. At least he came out and said it himself rather than paying off a third party to "investigate" the "problems" with Linux and publish the results to the world.
Governments should not use OS without a proper... (Score:4, Insightful)
Open Source is more like a Trojan Horse... (Score:2, Insightful)
Um, and what about the source China has seen? (Score:5, Insightful)
I think that's a pretty large security threat right there...
Terrorists in Microsoft (Score:4, Insightful)
Hmm Microsoft VS geeks? (Score:2, Insightful)
I think I'd rather put my trust in someone doing it for the pure love (hate) of (bad) software than someone doing it for money and no love at all.
Come out of the cave! (Score:5, Insightful)
Urmm, so what operating system do you use then, Dan O'Dowd? And which newspapers and websites do you read?
You're obviously using a closed source operating system that is free of viruses, worms, holes and other security problems. What might this mystery closed source operating system be, that doesn't pose a threat to the nation's security?
Re:Understand the Source Perspective (Score:5, Insightful)
Believe me, if you're talking about something like gunnery firmware, they're going to test it... the deepest fear in DoD these days is friendly fire.
Re:Understand the Source Perspective (Score:5, Insightful)
More to the point, will they do this with closed source projects? Getting a mole into Green Hills Software, Microsoft, etc. is every bit as real a threat as getting one into any open source project. In many cases it might even be easier, because of the lax hiring practices and oversight at small defense companies.
TW
Re:Understand the Source Perspective (Score:5, Insightful)
> tiny bit of error that would only be
> useful in cases of calibration of
> high-tech weapons?
I think it'd be tricky, because it would break other high-precision things as well. And the other folks using the open source project would say "hey, this fellow Fred just submitted a patch. Something looks odd about it. Fred, why does line 314 do a bit shift without checking the foobar?" And then the patch would be rejected.
> If 3000 lines of dense mathematically
> rich C were checked in
I doubt any maintainer would accept such a patch. I don't accept patches for PMD [sf.net] without reading them, and if I got a 3K line patch I'd reject it out of hand.
Well, he does have a point ... (Score:5, Insightful)
And it doesn't have to be in the Linux kernel. The classic example (at least 10 years old) is to hack up gcc so that it examines the code it's compiling, and if it decides that it's compiling /bin/login, does things a little differently, inserting a back door where there was none before.
However, while he does have a point, it's a very myopic point. Closed source software has exactly the same vulnerabilities, except for one critical difference -- only people within the company in question have a chance of detecting the problem -- the end user will never get to see the source and check whether it's compromised. Granted, most open source users do not review all the source code that they use, but at least the option is there, and people for whom security is absolutely essential (like the NSA) almost certainly use it.
Also, for a closed source company, the problem is even worse. The backdoor (or whatever) could be introduced when the code is finally compiled for distribution, and never get checked into whatever source control system they use. So the binaries get shipped out, but NOBODY has reviewed the code in question (except our cracker friend), and once the bug does come to light (if it ever does) the company will look at the source code and scratch its head -- it won't even have the offending code to look at.
Open Source Outsourcing vs Commercial Outsourcing (Score:2, Insightful)
Isn't the problem described much larger for commercial outsourcing? These days most software used in the U.S. is partially written outside the U.S. At least with the open source software people concerned about security can build from source and perform an inspection on the source. With commercial software, no such precautions are available.
Re:Understand the Source Perspective (Score:1, Insightful)
It wouldn't be easier or more likely to find such a problem if the code were closed source, and only a handful of people ever looked at it.
Let's call him Chicken Little (Score:3, Insightful)
The gauntlet is thrown down. I challenge this man to come up with a demonstrable "trojan horse" in an OS piece of software that cannot be found in a reasonable period of time by a security audit (the kind the government does of OS software to be used). Such fear mongering should be laughed at, torn up, and spat upon whenever you see or hear it. It reminds me of Ridge getting up and saying, well, there's a threat around the election, but we have no evidence of it. Be scared (and vote for Bush). Yea . . . right. I didn't just fall off the turnip truck.
Get a life, and make better products, jerk!
I am continually amazed... (Score:5, Insightful)
The cornerstone of open source is that it is OPEN SOURCE. The government is free to view and evaluate all the packages to their little, demonic hearts' content.
If I were a terrorist, I'd think I would penetrate a closed-source house (say, Microsoft or Green Hills) and hack some little nasties into their source.
But, maybe that's why Dan O'Dowd isn't a very good terrorist.
Re:Understand the Source Perspective (Score:3, Insightful)
No - the Government will contract a company to do this for them. The difference is that when the contract comes up for bid, they will both have the source code involved and a better chance of finding competing contractors able to work with that code. At worst, they'll end up contracting with an outfit that is unfamiliar with the code and has to ramp up... which isn't much different from a proprietary-based situation. Which brings us to a very important point...
Are you implying that 3000 lines of proprietary code are going to go into a system without any checks? And once that code is incorporated into the system, that system is not going to undergo a strict battery of tests?
Re:Understand the Source Perspective (Score:3, Insightful)
What hiring practices does Linux have?
Defense companies have to go through a certain amount of security and background checks to win a contract.
Why OSS ? (Score:2, Insightful)
You know, it's easier to hide functionality in Shareware/Freeware than in OSS - you can't look at the code and spot it at a glance. I wonder if it would not be easier to spread malicious code via PaintShopPro and the like.
Al.
Re:Understand the Source Perspective (Score:5, Insightful)
Doesn't matter one jot. Gee, look, there's the source code. Every bug, hole and trojan horse, just waiting for you to find them. All you have to do is audit the code. You should be auditing the code of any product you're going to use in a sensitive environment anyway, whether it's closed or open source. Where's the difference?
Re:Understand the Source Perspective (Score:5, Insightful)
If anything, I'd say the risk of getting exploits deliberately planted in code without detection is far greater in closed source applications than in OSS projects. Another lame attempt at FUD from the people behind AdTI..
Re:Understand the Source Perspective (Score:2, Insightful)
And where do they find the expert to hire in the first place? The beauty of version control in an OSS project is that you can find exactly who wrote the code. If you do the background check and he is hireable, you use him and his code; if not, you have the choice of finding another expert to rewrite just that section, or thoroughly reviewing his sections...
Try to find out exactly which Indian contractor did that work for the closed source contractor, and trust them to tell you: "oh yeah, we hired an extremist, but he didn't do that part of the code."
Re:Understand the Source Perspective (Score:3, Insightful)
If a man on death row for murder tells you that murder is wrong, is he incorrect simply because of who he is?
Sure they (Green Hills) have an agenda, and it should be noted. However, they may have a point and shouldn't be discredited out of hand.
Re:Understand the Source Perspective (Score:5, Insightful)
Uh, not much. If the weapons aren't hitting the mark on the firing range they probably wouldn't get deployed until they are fixed.
This is probably a poor example. The danger isn't in OSS that is designed to fail. If it doesn't work it wouldn't get used. The danger is an obscure security hole that would allow infiltration.
The key point where this guy's whole argument falls apart is that proprietary software isn't any better. I'm confident Microsoft employs a small army of foreigners, and I'm not sure they would be any more reliable than OSS developers; their code gets a lot less scrutiny, and absolutely none if you are a customer getting binaries. Most big companies are putting R&D centers in India and China. How do they assure us the people they are hiring don't have ulterior motives?
If you want to develop software critical to national security, you have to develop it in a classified lab with cleared employees. Oh, but wait: in spite of all the scrutiny people with security clearances get, they too turn out to be foreign agents and do great damage. Los Alamos doesn't exactly have a stellar security record, and those people get more scrutiny than anyone. The Navy's comsec has been massively compromised in the past.
I'd argue the opposite case from this guy. If you want secure software, the best approach is to have as many people as possible, both OSS and government, scrutinize the source. If you find a project that is intentionally or negligently checking in compromised code, blacklist it or give it extra scrutiny. The NSA's secure Linux effort is an example of the government making sure OSS is secure, and it's way more likely to be that than anything Microsoft or Green Hills is going to give them.
On a tangent, here [counterpunch.org] is an interesting article on Homeland Security trying to enforce security through obscurity in the physical world. Someone walked around the DNC in Boston, took photos of all the weaknesses in its security, and posted them to a list on Yahoo. Homeland Security shut down the list and is collecting the names of everyone on it and everything said. Should give you pause before joining any list in these interesting times.
Re:Understand the Source Perspective (Score:3, Insightful)
No it isn't - if it were, they would have burned all the Patriot systems already. The biggest fear in DoD is making sure their pet contractors stay on the payroll so that they keep getting their kickbacks.
FUD, FUD, FUD! (Score:1, Insightful)
And proprietary software is safer, how? It is just as easy, if not easier, to infiltrate a specific closed-source company (remember, the 9/11 hijackers were here for 3 years learning to fly jumbo jets) and program in their subversions directly. (See my comments about his company's certifications below.)
Some embedded Linux providers even outsource their development to China and Russia.
Unlike all the major proprietary developers, who outsource all the work they can to China and Russia too. How much of Green Hills' code is written overseas? By sweatshop coders they never even meet?
In fact, the U.S. National Institute of Standards and Technology (NIST) security vulnerabilities database lists more vulnerabilities for Linux than Windows in the last ten years.
This is such a broad statement it is tough to refute. Are they talking kernel only? The kernel is the only part the military should be interested in as far as security vulnerabilities go. Then how can they get equivalent numbers for Windows which doesn't easily allow you to separate the kernel? And Windows definitely should include all of the apps because any app vulnerability is a potential OS vulnerability. This statement needs a lot of amplification before it even approaches something like "truth".
DO-178B Level A is the highest safety standard for software design, development, documentation, and testing.
From Green Hills' web-site about DO-178B Level A certification:
"The certification package includes Green Hills Software services for all DO-178B Level A compliant verification activities for INTEGRITY-178B operating on processor architecture specified by a customer's requirements. All reviews, analysis and testing of the INTEGRITY-178B real-time operating system is performed by Green Hills Software using the customer's target processor system."
So DO-178B Level A verification is OK as long as you trust Green Hills. Remember my earlier comments about infiltrating proprietary companies? With a couple of fifth-columnists in a couple of key places terrorists can insert whatever code they like and then pass it right along in the certification stage.
If the government truly wants to use Linux in military operations:
1. Freeze the source right now. Fork it into their own private source control tree that nobody in the outside world ever sees.
2. Perform the entire set of DO-178B procedures (I don't remember which parts these are) that do a detailed analysis of the source code for all decision branches, etc.
3. NEVER use any public patches or source code changes as-is; instead, any changes to the code must be examined at the source level to the same rigor as 2 above and then incorporated directly into their private source tree.
4. etc, etc.
Misleading the naive! (Score:2, Insightful)
Is this not due to the fact that only people inside M$ can check their own code, and that they will not always disclose vulnerabilities?
Linux on the other hand almost always instantly discloses its bugs.
closed source (Score:2, Insightful)
Re:Understand the Source Perspective (Score:5, Insightful)
As evidenced by his desperate attacks to stave off his dwindling market share, they're obviously doing a better job than the Green Hills CEO.
open vs. closed source and auditing (Score:1, Insightful)
I don't know this for sure, but it seems to me that the government is in a bit of a bind with MS products, because they keep bloating and the audit process is falling behind. They're about to jump to W2K, which at least puts them in the Active Directory world, but they're not even thinking about XP at this point. W2K is much bigger than NT4, XP is much bigger than W2K, and Longhorn is massive (from an auditing point of view). It's got to be hard to manage these audits.
I think that from the government's point of view, the real issue is whether the code has been audited, not whether it's open or closed source. If it's audited, they feel they can use it, if it's not, they can't. Keep in mind that the machines themselves are closed off, guarded by men with guns, and are not plugged into public networks.
So it seems to me that a lot of the open vs. closed source would come down to how it affects the auditing process.
I don't know anything about the audit process (or any of the rest of what I'm saying here, really), so take this with a grain of salt, but...
It seems to me that open source would allow them to manage their own distro, with a small number of essential packages. Patches and updates could be audited on a continuous basis, as they come out. That has to be a much more manageable project than simply tearing into Longhorn from scratch when MS finally drops it.
If they have the source, they could make sure that whatever they were using could deal with whatever it needs to deal with -- new hardware, or whatever. I think the pressure of obsolete code (which they're probably feeling with NT4 now) would be less intense.
Re:Understand the Source Perspective (Score:3, Insightful)
The answer should be obvious.
Re:Terrorists in Microsoft (Score:3, Insightful)
Add bugs to Windows?
Let's look at this from the other direction... (Score:2, Insightful)
Now, if the code were open source, it would get reviewed and looked at constantly. Yes, again, what are the chances of someone finding that bug? But I am sure they are greater than the chances of someone finding it in closed software....
The Australian voting system has been open source for a number of years now, and that system has just gotten more secure over time. I think that is a prime example of something that borders on needing to be secure, and of how open source worked. I think it can work again, and that the U.S. should adopt it, if our greedy companies do not get in the way first.
By the way, the top paragraph was completely hypothetical -- no one wants CIA agents at their door.
-A
What about the H1-Bs and outsourcing (Score:2, Insightful)
Exactly the point (Score:5, Insightful)
The US can continue to run Windows, be our guest, but the point is moot since much of US government software is developed in India anyway. No back doors there, for sure.
First blush (Score:3, Insightful)
And I'm a little upset. Is OSN actually letting these guys astroturf on
This whole thing smells bad. Maybe it's a slow news day but there are better anti-linux rants than what is coming from that lame ass website. Nothing to see here, move along.
Dan put down the crack pipe (Score:1, Insightful)
And I, for one, work in the industry and gee, know what I'm doing and can read other people's code.
Pure FUD.
Re:Understand the Source Perspective (Score:1, Insightful)
I should point out that these projects involved one-of-a-kind, highly custom stuff that is supposed to be updated all the time.
It's all about how the contracts are written. Anybody sane should write their contract specifications to REQUIRE the full source code and complete schematics/blueprints/etc. This goes for DOD and anybody else. If someone offers you a lower price for not including sources, you'll pay for it in the long run, and it's not worth it.
Oh really (Score:5, Insightful)
Wasn't there recently an article about a router with a backdoor shipped out in its code? How about all those darn "easter eggs" floating around in Windows and Office and other programs?
I would challenge you to compile a new Intel C library using a Microsoft C compiler from 6 years ago too. Heck, compile glibc using an IRIX compiler from six years ago.
You can drag out all the scenarios you want and whether it's Linux or it's *nix or BSD or Windows you're going to have the same audit challenges and not even have access to the source code without negotiating with all your suppliers.
Afraid of it? Don't use it! Comm. sw is no better (Score:2, Insightful)
However, governments DO use closed-source software from companies and people they do not know. Who says Microsoft is not paid by X government that, through mass-adoption of Windows and other MS applications around the planet, now has control over a laaaaaarge number of computers in all countries of this world?
In other words, simply buying commercial software is not any more secure. What's worse, there is no way you will be able to check the sources. With OSS you have that option, and it is up to you, the user of the software, to check it or not.
Give the NSA and friends some credit (Score:2, Insightful)
Closed source is no guarantee against tainting. Even with security checks, it's perfectly possible to have a hostile programmer working on the software.
I'd just like to put in a word for the NSA et al. They're perfectly capable of making their own decisions, and are probably far more qualified than anyone here. They know how to minimise risk using whichever development model they like...
It is of course very possible that open source software could be written on behalf of the military. The people who keep the official version for use within the military/government would go through all submitted code with a fine-toothed comb, being very conservative about which patches they accept from the outside world...
This article is basically written to try to scare politicians into banning the FOSS competition. I doubt it would work if the military and friends didn't want it that way; they will make up their own minds.
Re:Understand the Source Perspective (Score:3, Insightful)
But really you're talking about the same exact problem for closed source. In that 5-person dev team with 2 QA people in that 20-person company, how closely are they checking the source for the type of stealth errors in question? Do they personally check the thousands or millions of lines of code in the build chain? Once a guy is hired, how much contact does he have with his boss? Does his boss even understand the code the "honest worker" is submitting?
I'm NOT saying that open source is "safe." I'm saying that closed source is every bit as unsafe, but with the added hindrance of far fewer people having the ability to even look at the source if something goes wrong.
TW
Re:Understand the Source Perspective (Score:5, Insightful)
Even that might be insufficient. Ken Thompson showed in his 1984 Turing Award lecture, "Reflections on Trusting Trust," how a compiler trojan can persist even if the backdoor is not present in the compiler's source code.
Here's the link [acm.org].
Re:Understand the Source Perspective (Score:3, Insightful)
Additionally, what you specified is more complex than what was originally posited, and seems more likely to be detected in code review. There are a lot of other issues, but these are covered by the many other threads on this story.
Re:Well, here's the obvious (imho) response. (Score:2, Insightful)
If we're going to compare Linux and Microsoft - have you noticed all the backdoors and trojans and worms out there lately? The security holes being exploited were not (likely) put there on purpose - they're simple mistakes, left by coders in a CLOSED SOURCE company. There's no need to go to open source to get bad code!
Are you a coder? Have you participated in, or run, an open source project? In such a project, you don't just accept code. You read it through, test it, and make sure it will do the job before compiling it.
Sure, you don't do a background check on every person submitting code - but you DO make sure the code does what it needs to, and that it does not have blatant bugs in it. At least code submitted this way gets CHECKED AND APPROVED. That's not the case in all closed source companies - they don't all review code changes!
Has there been a background/security check on everyone that submitted code for use in NT4 SP6 that's used in high security defense systems? No. It's just not possible.
At least with embedded systems that are open source, the code can all be checked, because it's small enough to be a fully solvable and testable system.
And what does information stealing have to do with pure source code anyway?? Most "hacks" like that are social engineering (email attachments, "CoOL CURSoRs! HeRE!"), etc. High security organizations use reverse firewalls to make sure nothing important is going out!
Re:Understand the Source Perspective (Score:3, Insightful)
Software is reviewed carefully before being allowed onto secured networks. Just ask anyone who works on projects where a deadline is missed due to security reviews.
Green Hills is the national security threat (Score:4, Insightful)
What a bizarre article.
Take the statement: "Yet, despite the 'many eyes,' new security vulnerabilities are found in Linux every week in addition to dozens of other bugs." Shouldn't one consider that the "many eyes" are the developers finding those weekly bugs? Wonder how many eyes are looking for Green Hills software bugs?
As long as people are involved, mistakes (bugs) will be made. But saying that malicious code is more likely in a product where anyone CAN examine the code versus a product where no one can is just plain stupid. There is obviously an undisclosed agenda here (might that be selling a DO-178B Level A rated real time OS, aka INTEGRITY? Getting a lot of Linux competition, eh?).
As to the standard DO-178B... the first 90% of the article is about security, and only then does it mention DO-178B. DO-178B is not a security standard; it is an FAA safety-related standard for software. Any software certified under DO-178B can still be full of unknown security holes. The standard may be required for software used in flight-related applications, but it does not mean the software is also secure.
The Level A rating doesn't even mean "most secure," as the article seems to imply. It means that if the software crashes, it will not affect other software that is running. In other words, the software is ISOLATED, not secure. It is amazing the things companies will say when they are losing ground to a competitor.
Just another security by Obscurity argument. (Score:3, Insightful)
STUPID ARGUMENT.
The code does not have to be modified by evil people. ALL CODE HAS BUGS. So all code should be checked, and not just by the people who write it. The entire point of open source is that LOTS of people check the code for bugs. The difference between an "evil Easter egg" and a "bug" is just the intent of the programmer. Open source is MORE likely to catch an "evil Easter egg/bug" than a closed source process. Having a spy try to sneak one in is pointless, because the same bug detection process will also detect the evil Easter eggs.
But closed source DOES have one advantage over open source: secrecy. The problems with open source defense programs are 1) "they" know exactly how good our programs are, and 2) "they" can use them themselves.
Because of these two things it is not a good idea for most Defense purposes. We want the bad guy to NOT know how good our stuff is and we do NOT want them to have the same quality stuff.
If OSS is a Trojan Horse... (Score:3, Insightful)
and closed source proprietary firms.... (Score:5, Insightful)
Nope. Open source is still the best way to go, along with open government. When you let people hide "stuff," and when it's connected to massive political power and heaps of money, that's when crimes occur. The best bet is openness, bar none. It is not perfect, but it's the best design yet.
Re:all in the family (Score:2, Insightful)
I've worked for defense contractors using GHS's operating system. I'd much rather read and certify just my own application code than 1M lines of Linux _plus_ my own application code.
Offshoring just as much of a threat (Score:5, Insightful)
Misunderstand the Source Perspective (Score:3, Insightful)
No, they aren't ignoring, they are denying it. Because it's bunk.
Re:Understand the Source Perspective (Score:5, Insightful)
Yes, of course, because the fact that open source has some advantages doesn't negate the risk pointed out in the article. It just means that there are risks both ways.
ANY piece of software that you run on a secure system has the potential of subverting the system. I think open source does create the illusion that it couldn't contain hidden malware because where could it hide in open source, right? Well, anyone who has ever seen the entries in an obfuscated C contest and wondered what that code could possibly do ought to be able to see the flaw in that argument. For that matter, anyone who has ever gone over and over HIS OWN CODE looking for a bug and not finding it ought to ask himself, what if it weren't even my own code and I didn't even know that a bug existed?
Closed source is even worse in this respect. Though at least we know who wrote it, right?
Well, I think that's yet another illusion. Think disgruntled employees being paid by Bad Guys to insert a bit of code.... You may trust the company that made your software, but how can you possibly trust every one of their employees? And once it's in, since it's trusted it could be there for years.
Naive? (Score:3, Insightful)
s/countries/Microsoft/g;
Re:Supporting comment (Score:5, Insightful)
When one is ranting in a desperate plea to defend one's own methodology & existence, it is often helpful to ignore facts that do not support one's case.
Re:Understand the Source Perspective (Score:3, Insightful)
And then, of course, there's the question of outsourcing to a foreign country... Who's to say that the programming companies in India (or anywhere else) don't have some disgruntled employees among them? And I don't mean disgruntled with their own bosses, so much as disgruntled at the rich western nations that take advantage of the poorer eastern nations.
How is Open Source different from that? Oh yeah, you pay more for outsourcing...
Re:Understand the Source Perspective (Score:5, Insightful)
Exactly. The author also helpfully ignores the backgrounds - often unknown by the end user - of the developers of closed source software. I've been in this industry for 12 years and have worked in one place (out of 7) that did not have a foreign national on the team. Their countries of origin have included China, Vietnam, Russia, India, Pakistan and Syria.
The only point the author made that I could agree with would be that all software used for the military/intelligence communities should be thoroughly tested & certified to a high standard of security. I doubt there are many that would disagree with this statement. The problem is the author is hiding this valid argument beneath a layer of FUD intended solely to harm Linux & support the proprietary development model his company has chosen. He uses fear & stereotypes to paint the opposition without explaining what his company is doing that will solve the problem in a way that open source cannot.
How is this riskier than companies that outsource? (Score:4, Insightful)
I find it interesting that open source software is considered a risk because individuals from other nations are allowed to participate in the development of the code...
How does this differ from corporations which provide software to the military who outsource their development to individuals from other nations?
The only difference is that the OSS model requires corporations to give up some control over the rights to the product, and corporations don't like that.
Otherwise, the article assumes differences between OSS remote participation and outsourcing that have no material relevance.
The idea that outsourcing is more secure because security checks are done can be argued, but even security checks fail, and someone who is cleared can decide to sabotage. The problem is that once someone is vetted, they are trusted. This is actually worse than the OSS model, where no matter who you are, your code is reviewed with the same level of scrutiny as anyone else's.
I can think of so many instances of calling support and having to provide my personal identifying information to an individual who was either not in my state or not even in the US.
Sounds more like a double standard of judgement from the corporate viewpoint that is prejudiced against OSS projects.
Re:Understand the Source Perspective (Score:5, Insightful)
There are numerous examples of malware (like those in Kazaa), easter-eggs (like the flight simulator in Excel and the pinball game in MS Word) and unrequested features (like Windows Product Activation) in proprietary software.
While I agree that something similar could in theory also happen to a one-man or very small open source project (in theory, because I have never heard of any such occurrence), there is absolutely no way such code could be smuggled into bigger projects like Linux, Apache, KDE or the like; there are just too many people watching.
So if you compare exactly what the article is talking about, proprietary software has a much worse track record.
Re:Understand the Source Perspective (Score:3, Insightful)
I think geeks have been DoJ targets for some time.
Re:Understand the Source Perspective (Score:3, Insightful)
In any case, the military will always contract the work out to private contractors. These contractors would not be allowed to use GPL-licensed s/w anyhow. Imagine icbmguidance.sourceforge.net. No? Neither can I.
- Oisin
If this guy actually believes this, (Score:5, Insightful)
First of all, what truly important piece of software would possibly be part of open public development? I thought this field was specialized enough that the only people competent to work on what you were making were already trusted anyway. Wasn't SELinux developed *inside the NSA* before it was released?
Secondly, assuming a vital piece of software WERE being developed publicly, someone trying to insert malicious code would have to make it past a few barriers, the first being the most complicated. He would have to: 1) know what his deliberately inferior code would probably do in the finished product versus what a non-criminal would want it to do, 2) get it past the critical eye of a few other developers, and 3) slip through some kind of government screening. And all the while NOT make anyone suspicious.
And even then the results are not guaranteed. What is your cyberterrorist counting on? I sincerely doubt that he could have snuck a back door into the code given all those hoops. I don't think the deliberate bug can be both significant and unknown at the same time. Is he hoping that his bug will cause the software to make a slight miscalculation? Whoopty shit. Whatever agency he or she is working against will be annoyed for a little while and then fix the problem.
Even if his deliberate bug caused a catastrophic failure, it can and will be traced back to HIS contribution. And if some terrorist group stands up and says "Ha ha! Look what we did! And here's why!" (and if it's Al-Qaeda we can be almost certain of this), that man is immediately under FBI surveillance and probably arrest.
In any case, inserting a bug would be a lot of work. A lot of work for an uncertain return, and success will mean almost inevitable detection.
Why some terrorist would bother with this approach is beyond me. It's so much easier just to fill a truck with dynamite.
Re:Understand the Source Perspective (Score:3, Insightful)
It's even worse with all the offshoring going on. I mean, people from halfway around the globe write your code. There is no way in hell you can trust them with national security!
The odds of this happening... (Score:4, Insightful)
That's a pretty obscure set of circumstances. Does that mean it can't happen? No. But contrast this with the proprietary methodology, wherein a coder (usually) has unrestricted access to the code base. Hmmm. Sounds more plausible there!
Of course, the key thing to note here is that anyone who has to dredge up the dreaded formula that terrorism + open source == Disaster!!! is probably desperate to save his flagging business.
Re:The strengths of Linux count against its securi (Score:4, Insightful)
To say that the Linux code is locked down and tested is to say that the barn door is locked too late in the process for the kinds of things the author of this posting cites as potential risks.
So what's stopping the DoD from taking the source code base and doing their own testing and certification on it? Considering you claim to have had a background in this, I'm surprised you didn't think of this. This may save them some time in the long run, since they don't have to go through the effort of developing the software itself.
If I decide to use a library or module from another developer (OSS or otherwise) in something that I am doing, I always take the time to test it to make sure it at least does what I want and is adequate for the task at hand. Now, my own projects don't require a terrible amount of security, but if they did, I would be certain to do some testing in that area as well.
So I just don't get your point. You don't have to develop the code yourself in order to certify it if you have the full source available to you. And then once you have certified it, after making any corrections that you need on your copy of the source, then you lock THAT down. What came out of the original source base is irrelevant at this point. It only matters what you improved upon and certified.
Re:Understand the Source Perspective (Score:3, Insightful)
If you're the government, of course you'll review the code and try to pull out dangerous things. With open source, this means hiring a bunch of programmers, and possibly one lawyer to go over the GPL. With proprietary software, this means hiring a bunch of programmers to make sure the code is clean, hiring a bunch of lawyers to make sure the license is clean, and paying for the code itself.
Furthermore, the community will catch quite a lot of bugs that a team of 10 or 20 programmers from the NSA might never see, simply because there are more of us.
So I would say to the author that open source is less of a threat to national security than closed, and that furthermore, open source costs less. If you want to compete, stop insulting our intelligence and start changing your business model. Maybe a support / code review company, if you're so worried about national security? "Pay us, and we'll make sure there are no easter eggs in this source"?
Greenhills has an ax to grind by spreading FUD (Score:3, Insightful)
A number of postings have done an excellent job of describing why open source is not a security threat and, in fact, can be more secure because the source code itself can be audited, signed, etc...
This is a long discussion and I may have missed it. But I have not seen a mention of why Greenhills might be motivated to make the claim that open source is a security threat.
Greenhills sells proprietary compilers for embedded systems. They also sell real time operating system software. Their business model is under threat by open source. This is especially true in the area of real time embedded operating systems. With their compilers they can at least show that for a given embedded processor their compiler produces better code (this is not that hard to do against GNU for a number of architectures). But with real time operating systems (RTOS) they have less of an edge. Linux is becoming more and more widely adopted. Increasing numbers of people have experience with Linux.
Greenhills, one can speculate, fears a serious erosion of their RTOS business. This business is probably bigger than their compiler business (few people make much money on compilers these days). Taking a page from Microsoft and SCO they are attacking Linux using FUD. If this also helps their compiler business ("Who knows what trojans open source compilers might generate?"), all the better.
take a page from the KGB playbook (Score:2, Insightful)
The KGB said, go for the secretaries. The secretary holds the keys, the trust of others, and can go where s/he wishes. Monolithic organizations are vulnerable exactly because they have few internal firewalls; if you *can* get in then you become part of the trusted architecture, and then there is no mechanism to get you out.
and the vendors are moving to Linux (Score:3, Insightful)
Does this address the needs of all embedded systems users? Of course not. I can see that in really high-security fields you need to have 100% control of things. The critical embedded devices in power plants come to mind, in cases where you may be replacing a 10-year-old device and need to ensure that you have exact compatibility.
However, these are outlying cases and will strain almost ANY OS group to satisfy. (Especially as they still need to move forward with their technology as well.)
I would say that the OSS route in that case may actually provide you better security as long as you archive both the code and the software used to build the code (including the OS and the hardware if necessary too). If your requirements are in fact that strict then you're going to either have to have complete control of the code you're relying on or have escrow agreements that ensure you'll be able to obtain the code if your vendor happens to go out of business.
Re:Understand the Source Perspective (Score:3, Insightful)
Proof: the "friendly fire" incident in Afghanistan where Canadian soldiers were killed at the hands of a US pilot. I put "friendly fire" in quotes because to me, it doesn't even qualify as that. I'm prepared to accept combat deaths, even friendly fire ones. Anyway: the Canadians were on a known live-fire range (and the Americans knew, even if that information didn't make it to the pilot and air controllers). The pilot reported fire, and was specifically told to hold fire. The fire was coming from a range well known to be used by friendly ground forces, including US forces. He fired anyway, killing 4 and wounding a dozen others. He claimed he was "under attack", but how a machine gun could threaten an F-18 at 20,000 ft, I don't know.
Final result: loss of half a month's pay for 2 months, and a letter of reprimand. In other words: exactly nothing.
If the US DoD were concerned about friendly fire, there would have been a minimum of 3 people, if not incarcerated, then discharged from the service: the pilot, the person (or persons) who didn't provide sufficient briefings, and the commanders for letting that happen.