Linked by Thom Holwerda on Thu 18th Jan 2007 23:42 UTC
Privacy, Security, Encryption
Alan Cox, one of the leading Linux kernel developers, has told a House of Lords hearing that neither open- nor closed-source developers should be liable for the security of the code they write. Cox, who is permanently employed at Red Hat, told the Lords Science and Technology Committee inquiry into personal internet security that both open- and closed-source software developers, including Microsoft, have an ethical duty to make their code as secure as possible. "Microsoft people have a moral duty in making sure their operating system is fit-for-purpose," Cox said on Wednesday.
Thread beginning with comment 203280
Liability
by Hetfield on Fri 19th Jan 2007 00:06 UTC
Hetfield
Member since:
2005-07-09

I'll have to agree that general liability for the security of code is wrong for a couple of reasons:

1. There is no absolute security; developers are just people, and people make mistakes.

2. It would pretty much kill off every useful hobbyist application, every small open- or closed-source project, and almost every small-to-medium software company, because those are exactly the ones that cannot afford extensive security audits, leaving only a handful of big players like Microsoft, Sun and IBM.

3. It would generally stifle (or at least severely slow down) innovation and progress, as developers would hesitate to introduce new features and explore novel methods of computing for fear of introducing new holes.

I do support, though, the idea of holding for-profit companies liable for negligence. There's a security problem unpatched for six months? Punish them. Bad software was knowingly released to the public? Punish them.

Reply Score: 5

RE: Liability
by leech on Fri 19th Jan 2007 00:26 in reply to "Liability"
leech Member since:
2006-01-10

For the most part I'd agree that they should be liable, to a point. Of course, my 'to a point' would be determined by how large the software is and how many holes it has.

For example, an operating system developer should be held more liable for security holes and bugs than, say, someone writing a notepad clone.

At least with smaller software, you can remove it if there is a security hole that isn't patched. An operating system, on the other hand, is much more difficult to remove, because your applications depend on it.

Reply Parent Score: 2

RE[2]: Liability
by ma_d on Fri 19th Jan 2007 02:01 in reply to "RE: Liability"
ma_d Member since:
2005-06-29

I don't think that's legally feasible. It would be twisted either to be too strict, or to be so lenient that it was meaningless.

I don't think regulators could keep up with software. I'd rather see consumers holding their vendors responsible, and leaving them when they have major troubles and the vendor isn't there to respond quickly enough.

Software is just too complex.

Reply Parent Score: 4

RE: Liability
by happycamper on Fri 19th Jan 2007 08:54 in reply to "Liability"
happycamper Member since:
2006-01-01

/*1. There is no absolute security; developers are just people, and people make mistakes.*/


They can use that as an excuse to write sloppy code and just release the program once it barely works, without really auditing the code for the bugs that lead to security holes just waiting to be exploited.

Reply Parent Score: 1

RE: Liability
by w-ber on Fri 19th Jan 2007 19:00 in reply to "Liability"
w-ber Member since:
2005-08-21

3. It would generally stifle (or at least severely slow down) innovation and progress, as developers would hesitate to introduce new features and explore novel methods of computing for fear of introducing new holes.

I don't think that would be a bad thing. Considering that most software (ignoring the smallest programs) has more features than people can even begin to use, adding more and more features with every release does not make much sense in the long run. Of course it's sexier to announce "we integrated foobar and quuxwiz in this release" than "there are no new features, but we did plug seven security holes that no one had exploited yet".

With every new feature, you are making a more complex program. With every new feature, you get more so-called feature interactions, most of which you are not aware of. With every new feature, you have a higher chance of errors (bugs) in the program.

If it were possible to trade 50% of the features for even 20% more correctness (and correctness usually implies security, since there are fewer bugs), I would gladly make the trade. Unfortunately, this is unlikely to happen until the general populace stops demanding more features with every release.

Reply Parent Score: 1