Linked by Thom Holwerda on Sat 1st Jun 2013 18:43 UTC
Privacy, Security, Encryption Google is changing its disclosure policy for zero-day exploits - both in its own software and in that of others - from 60 days to 7 days. "Seven days is an aggressive timeline and may be too short for some vendors to update their products, but it should be enough time to publish advice about possible mitigations, such as temporarily disabling a service, restricting access, or contacting the vendor for more information. As a result, after 7 days have elapsed without a patch or advisory, we will support researchers making details available so that users can take steps to protect themselves. By holding ourselves to the same standard, we hope to improve both the state of web security and the coordination of vulnerability management." I support this 100%. It will force notoriously slow-responding companies - let's not mention any names - to be quicker about helping their customers. Google often uncovers vulnerabilities in other people's software (e.g. half the patches in some Microsoft 'Patch Tuesdays' fix flaws uncovered by Google), so this could have a big impact.
Thread beginning with comment 563510
To view parent comment, click here.
To read all comments associated with this story, please click here.
RE[5]: Comment by Nelson
by JAlexoid on Mon 3rd Jun 2013 13:02 UTC in reply to "RE[4]: Comment by Nelson"
JAlexoid
Member since:
2009-05-19

most would consider it a software defect which is more commonly known as a bug

That is - for a fact - not true. Design flaws are not bugs. A lot of security vulnerabilities are not, and never were, bugs, but perfectly correct implementations of designs and requirements.
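To illustrate the distinction with a hypothetical sketch (an assumed example, not from the thread): code can implement its spec to the letter and contain no coding bug, yet still ship a vulnerability, because the design itself is weak.

```python
import random
import secrets

def make_token_as_designed():
    # Implements a hypothetical spec exactly: "generate an 8-digit
    # numeric session token". There is no coding bug here - but
    # random.Random is predictable and an 8-digit space is trivially
    # brute-forced. The vulnerability is in the design.
    return "%08d" % random.randrange(10**8)

def make_token_fixed_design():
    # Fixing it means changing the design, not patching a coding error:
    # a cryptographically secure generator and a much larger token space.
    return secrets.token_urlsafe(32)
```

The first function would pass any test written against its requirements, which is exactly why a design flaw is not caught the way a bug is.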

Sorry, but you are being a pedantic dick-piece.

And I just hope that you don't work on any of the software that stores my private information...

Also you make no mention of whether you actually created the patch, deployed it or the complexity.

How about all three steps, on multiple occasions, and none of them were SQL injections.
And since when does anyone give a f**k about complexity when it comes to critical vulnerabilities?

Reply Parent Score: 3

RE[6]: Comment by Nelson
by Nelson on Mon 3rd Jun 2013 13:28 in reply to "RE[5]: Comment by Nelson"
Nelson Member since:
2005-11-29


That is - for a fact - not true. Design flaws are not bugs. A lot of security vulnerabilities are not, and never were, bugs, but perfectly correct implementations of designs and requirements.


The mistake you made is in assuming that you're both talking about the same classification of "bug". He admittedly used the word loosely, and you called him out on it. It is, though, even more obvious that he didn't mean a run-of-the-mill bug or software defect, but a very real, showstopping critical vulnerability.

So your going on about the differences between a bug and a vulnerability is an example of pedantry. It's nice that you know the difference, as I'm sure a lot of us do, but it's superfluous to this discussion.



And since when does anyone give a f**k about complexity when it comes to critical vulnerabilities?


Because the implications of patching a vulnerability can extend deep into the code base and cause other issues down the road. That is why QA processes are necessary, and they don't take a constant amount of time: more complex code takes longer to evaluate, especially when it runs on an increasingly complicated array of software.

The oversimplification of this entire thing is what I think Lucas is getting at, and it's disgusting. People here think that software engineering runs on pixie dust and good feelings. There are actual people working on these projects, and it takes actual time to get a fix out the door in a responsible manner.

It's great that you have had a situation where you got a fix out in a relatively short amount of time, but I hardly think your experience is universal.

Reply Parent Score: 3

RE[7]: Comment by Nelson
by lucas_maximus on Mon 3rd Jun 2013 13:30 in reply to "RE[6]: Comment by Nelson"
lucas_maximus Member since:
2009-08-18

Thanks for explaining it a lot better than I.

Reply Parent Score: 2

RE[7]: Comment by Nelson
by cfgr on Mon 3rd Jun 2013 14:33 in reply to "RE[6]: Comment by Nelson"
cfgr Member since:
2009-07-18

Because the implications of patching the vulnerability can extend deeply into the code base and cause other issues down the road, which is why QA processes are necessary, and they don't necessarily have a constant time. More complex code takes longer to evaluate, especially when it runs on an increasingly complicated array of software.


1) Most security vulnerabilities are implementation-based (à la SQL injections and buffer overflows). Fixing them does not alter the external interface at all. Any business that delays those patches either has a shitty update process or simply has shitty QA.
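As a sketch of that point (a hypothetical Python example using the standard library's sqlite3): an injection fix is purely an implementation change - the function's signature and its results for legitimate input, i.e. its external interface, stay the same.

```python
import sqlite3

def find_user_vulnerable(conn, username):
    # Vulnerable: user input is interpolated directly into the SQL
    # string, so input like "x' OR '1'='1" rewrites the query.
    cur = conn.execute(
        "SELECT id, name FROM users WHERE name = '%s'" % username)
    return cur.fetchall()

def find_user_fixed(conn, username):
    # Fixed: a parameterized query treats the input strictly as data.
    # Same arguments, same return value for legitimate callers -
    # the external interface is untouched.
    cur = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,))
    return cur.fetchall()
```

Because nothing visible to callers changes, such a patch is exactly the kind that a sane update process can ship quickly.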

2) Design vulnerabilities should cost you money. I don't see why the software industry should get a free pass, whereas any other industry is responsible for recalls and repairs within a reasonable amount of time (during the warranty) - or else it's a free replacement or refund.

Just because your company is incompetent at handling critical vulnerabilities does not mean other companies are. I think punishing the incompetent companies will reward those that do care. And to be honest, I doubt the former are incompetent; they're mostly just negligent, as they care more about their wallet than their customers.

Reply Parent Score: 2

RE[7]: Comment by Nelson
by JAlexoid on Mon 3rd Jun 2013 15:16 in reply to "RE[6]: Comment by Nelson"
JAlexoid Member since:
2009-05-19

It's nice that you know the difference, as I'm sure a lot of us do, but it's superfluous to this discussion.


No. There is a difference in process and urgency between a regular bug, a critical bug, and a critical security vulnerability. This is at the heart of the issue.

I'm happy for you if you develop software that does not store critical data, but that does not mean others aren't under serious threat from these hushed-up-for-60-days, "we'll get to it" vulnerabilities. I have personally seen the "big boys" jump through burning hoops to get fixes and workarounds out (Microsoft shipped quite a few patches for Telia's Exchange servers within 8 hours, IBM for StoraEnso's Websphere Portal in 4 hours, and Oracle likewise for Vodafone).

Because the implications of patching a vulnerability can extend deep into the code base and cause other issues down the road. That is why QA processes are necessary, and they don't take a constant amount of time.

Seriously... why would you ignore the word critical there? When it's critical, no one cares how complex it is to test, verify, or fix correctly. There is an immediate need for a fix - PERIOD.
Breaking ribs to restart someone's heart is a non-optimal way of keeping them alive, but when the patient is in critical condition, no one cares.

It's great that you have had a situation where you got a fix out in a relatively short amount of time, but I hardly think that your experience is one that is necessarily universal.


No. I had to drop all my work and work non-stop until the issue was resolved - more than once. SLAs are there for a reason, and in the industries I have worked in they carry hefty fines.

Reply Parent Score: 4