Linked by Thom Holwerda on Tue 14th Dec 2010 23:55 UTC, submitted by Oliver
OpenBSD Okay, this is potentially very big news that really needs all the exposure it can get. OpenBSD's Theo de Raadt has received an email in which it was revealed to him that ten years ago, the FBI paid several open source developers to implement hidden backdoors in OpenBSD's IPSEC stack. De Raadt decided to publish the email for all to see, so that the code in question can be reviewed. Insane stuff.
Thread beginning with comment 453749
RE[2]: FOIA - I disagree
by jabbotts on Wed 15th Dec 2010 22:56 UTC in reply to "RE: FOIA"

In terms of OpenBSD specifically, this is currently nothing more than a rumour. If something is found in the source code, then we can draw conclusions and also see how fast the issue gets fixed.

In general terms, sure.. anyone can slip code into the distribution repositories if it's a poorly managed distribution. Wake me when you manage to get arbitrary code included in Debian Stable.
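For illustration only, here is a rough Python sketch of the idea behind a signed archive like Debian's: the archive index is only trusted if its detached signature verifies against a keyring the machine already trusts, so getting arbitrary packages in means beating the signing key, not just a mirror. The file names and keyring path below are assumptions for the example, not an exact recipe.

import subprocess

def release_file_is_trusted(release_path, signature_path, keyring_path):
    # Returns True only if gpgv accepts the detached signature over the
    # archive index, using keys the system already trusts.
    result = subprocess.run(
        ["gpgv", "--keyring", keyring_path, signature_path, release_path],
        capture_output=True,
    )
    return result.returncode == 0

# Hypothetical invocation; the exact paths depend on the mirror and keyring in use.
print(release_file_is_trusted(
    "Release",                                         # archive index listing package checksums
    "Release.gpg",                                      # detached signature shipped alongside it
    "/usr/share/keyrings/debian-archive-keyring.gpg",   # keyring of already-trusted archive keys
))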

In terms of insider threat, sure.. one of your developers may download and compile a package. Maybe your admin installed a backdoor of some sort. Those are both company-specific HR problems though. Why was someone so disgruntled without being noticed? Why were they able to install arbitrary software across the network? Why did the security monitoring systems not catch the issue? How did the malicious insider get past the system's package validation mechanism, past tripwire, past regular debsums checks?
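As a rough illustration of what those integrity checks are doing (in the spirit of tripwire or debsums, not their actual code), here is a small Python sketch that records a baseline of file hashes and later flags anything modified or missing; the monitored paths are made up for the example.

import hashlib
import json
from pathlib import Path

def sha256_of(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def record_baseline(paths, baseline_file="baseline.json"):
    # Take the initial snapshot of known-good file hashes.
    Path(baseline_file).write_text(json.dumps({p: sha256_of(p) for p in paths}))

def check_against_baseline(baseline_file="baseline.json"):
    # Re-hash every recorded file and report anything changed or missing.
    baseline = json.loads(Path(baseline_file).read_text())
    for path, expected in baseline.items():
        try:
            status = "OK" if sha256_of(path) == expected else "MODIFIED"
        except FileNotFoundError:
            status = "MISSING"
        print(f"{status}  {path}")

# Hypothetical usage: watch a couple of binaries an attacker might swap out.
record_baseline(["/usr/sbin/sshd", "/usr/bin/login"])
check_against_baseline()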

Hiding the source code doesn't make anything more secure, nor does revealing the source code make it less secure. If your security relies on what is hidden, then you've failed. If the system is not secure when every detail of the implemented security mechanisms is known except the user-specific key/cert/password, then you've failed. Obscurity has no place in security design.. include it as icing on top, sure.. rely on it as the core security mechanism.. Fk no. Consider cryptographic research, where a new algorithm remains untrusted until proven through open peer review and analysis. AES is not considered secure and trusted because the math behind it is unknown; it is trusted because the math behind it is known and has not yet shown weakness under the scrutiny of the brightest cryptographers and criminals in the world.
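To make the AES point concrete, here is a tiny Python sketch (assuming the third-party cryptography package is installed): the algorithm, the nonce and the ciphertext can all be public, and confidentiality still rests entirely on the key.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # the only secret in the whole exchange
nonce = os.urandom(12)                      # public, but must never repeat for a given key

# The algorithm (AES-GCM), the nonce and the ciphertext can all be published;
# without the key, the plaintext stays out of reach.
ciphertext = AESGCM(key).encrypt(nonce, b"attack at dawn", None)
print(AESGCM(key).decrypt(nonce, ciphertext, None))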

If you're nervous about reputable major open source projects having intentional vulnerabilities written in, you should be terrified of closed source's reliance on obscurity and on whatever minimal code review doesn't conflict with time and monetary budgets. Heck, with a reputable open source project, you can track a bug from first report, through confirmation and patch creation, to the updated version release; you can clearly watch and measure the time between discovery and the "fix". You'll actually get bug reports for bugs discovered "in house". Show me the closed source software producers who voluntarily report internally discovered bugs rather than fixing them quietly under the illusion that it maintains PR cred.

You're absolutely right about USB though.. security goes much deeper than allowing or disallowing removable media, and "security through visibility" is a very big part of that, as are layering, monitoring, least privilege and so on.
