Linked by Thom Holwerda on Sat 1st Jun 2013 18:43 UTC
Privacy, Security, Encryption Google is changing its disclosure policy for zero-day exploits - both in its own software and in that of others - from 60 days to 7 days. "Seven days is an aggressive timeline and may be too short for some vendors to update their products, but it should be enough time to publish advice about possible mitigations, such as temporarily disabling a service, restricting access, or contacting the vendor for more information. As a result, after 7 days have elapsed without a patch or advisory, we will support researchers making details available so that users can take steps to protect themselves. By holding ourselves to the same standard, we hope to improve both the state of web security and the coordination of vulnerability management." I support this 100%. It will force notoriously slow-responding companies - let's not mention any names - to be quicker about helping their customers. Google often uncovers vulnerabilities in other people's software (e.g. half of the patches in some Microsoft 'patch Tuesdays' fix issues uncovered by Google), so this could have a big impact.
Thread beginning with comment 563443
RE: Comment by Nelson
by Thom_Holwerda on Sat 1st Jun 2013 19:41 UTC in reply to "Comment by Nelson"
Thom_Holwerda
Member since:
2005-06-29

If I'm using something that has a vulnerability in it that's serious, I want to know so that I can stop using said software, disable the feature in question, or apply a workaround.

It's not my problem that most companies are really bad at protecting their customers.

Reply Parent Score: 5

RE[2]: Comment by Nelson
by Nelson on Sat 1st Jun 2013 21:10 in reply to "RE: Comment by Nelson"
Nelson Member since:
2005-11-29

As usual, you vastly oversimplify a complicated matter.

There are a lot of variables involved in software engineering, and any one change can affect various hardware configurations running on that platform, especially on something as important as, say, Windows.

What one person considers a fix might break something else, and cause major quality headaches down the road.

How do you deal with that? Would you appreciate a Windows Update screwing up your install? It'd be a disaster.

You can be advised via partial disclosure of a flaw and act accordingly. There is full disclosure, then there's being unreasonable.

There are potentially millions at risk, not something to be taken lightly.

Reply Parent Score: 2

RE[3]: Comment by Nelson
by Neolander on Sun 2nd Jun 2013 14:26 in reply to "RE[2]: Comment by Nelson"
Neolander Member since:
2010-03-08

Regarding security fixes, I would have spontaneously assumed that a company the size of Microsoft would have boatloads of automated regression tests in place to ensure that a security patch is unlikely to break a customer's machine (unless they are using code that binds to undocumented APIs or crap like that). Isn't that the case?
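The regression-testing idea above can be sketched minimally. This is a hypothetical illustration, not Microsoft's actual process: `parse_header` stands in for any patched function, and the tests replay a corpus of pre-patch inputs to confirm benign behaviour is unchanged while the exploit input is now rejected.

```python
# Hypothetical sketch of an automated regression test around a security
# patch. parse_header() is an invented example function, not a real API.

def parse_header(raw):
    # "Patched" function: now rejects oversized headers (the security fix)
    # but must keep accepting normal ones.
    if len(raw) > 256:
        raise ValueError("header too long")
    key, _, value = raw.partition(":")
    return key.strip(), value.strip()

def test_benign_inputs_unchanged():
    # Regression corpus: inputs recorded before the patch must still
    # produce the same results afterwards.
    assert parse_header("Host: example.com") == ("Host", "example.com")
    assert parse_header("X-Id: 42") == ("X-Id", "42")

def test_exploit_input_now_rejected():
    # The input that triggered the vulnerability must now be refused.
    try:
        parse_header("A" * 1000)
        assert False, "oversized header should be rejected"
    except ValueError:
        pass

test_benign_inputs_unchanged()
test_exploit_input_now_rejected()
print("all regression checks passed")
```

In practice such suites run automatically in CI against many configurations, which is exactly the safety net the comment assumes a large vendor would have.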

Edited 2013-06-02 14:27 UTC

Reply Parent Score: 4

RE[2]: Comment by Nelson
by lucas_maximus on Mon 3rd Jun 2013 07:03 in reply to "RE: Comment by Nelson"
lucas_maximus Member since:
2009-08-18

You have no idea, do you?

I work on a fairly small code-base; if there is a bug, it can take weeks before it goes through the QA process and I get the go-ahead to release.

This is not taking into account my own time ... and when I can be put on task for it.

Reply Parent Score: 2

RE[3]: Comment by Nelson
by cfgr on Mon 3rd Jun 2013 09:47 in reply to "RE[2]: Comment by Nelson"
cfgr Member since:
2009-07-18

You have no idea, do you?

I work on a fairly small code-base; if there is a bug, it can take weeks before it goes through the QA process and I get the go-ahead to release.

This is not taking into account my own time ... and when I can be put on task for it.


Then maybe there is something wrong with the whole process. I'd say: hold companies accountable starting 7 days after they've been notified. Let good old capitalism take care of this. You'll be surprised how quickly the process adapts towards better security (fixing and prevention).

Reply Parent Score: 3

RE[3]: Comment by Nelson
by JAlexoid on Mon 3rd Jun 2013 11:27 in reply to "RE[2]: Comment by Nelson"
JAlexoid Member since:
2009-05-19

A bug is not the same as a critical security vulnerability. If you lump them together, then it's you who has no clue.

Security vulnerabilities have high priority and, just like bugs, are classified as Minor, Moderate, Major, or Critical.
I've had to patch a few critical security vulnerabilities. The total response time for them ranged from 8 to 72 hours, including QA. A week to patch, or even just to put out an advisory, is exceptionally generous.
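The severity-to-response-time relationship described above can be sketched as a simple lookup. The hour figures and names here are illustrative assumptions, not any vendor's real policy; only the 7-day (168-hour) window comes from the article.

```python
# Hypothetical severity-to-SLA mapping. The hour figures are invented
# for illustration; only the 168 h window reflects Google's proposal.

RESPONSE_SLA_HOURS = {
    "Minor": 720,      # e.g. fold into the next scheduled release
    "Moderate": 336,   # e.g. within two weeks
    "Major": 72,
    "Critical": 8,
}

def hours_to_respond(severity):
    """Return the target response window, in hours, for a severity level."""
    return RESPONSE_SLA_HOURS[severity]

def within_google_window(severity):
    # Google's proposed 7-day disclosure window is 168 hours.
    return hours_to_respond(severity) <= 168

print(within_google_window("Critical"))  # prints True
print(within_google_window("Minor"))     # prints False
```

The point of the comment survives the made-up numbers: a critical fix handled on the timescale the commenter describes fits comfortably inside Google's 7-day window, while lower-severity items on a slower cadence would not.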

Reply Parent Score: 3

RE[2]: Comment by Nelson
by Deviate_X on Mon 3rd Jun 2013 15:09 in reply to "RE: Comment by Nelson"
Deviate_X Member since:
2005-07-11

If I'm using something that has a vulnerability in it that's serious, I want to know so that I can stop using said software, disable the feature in question, or apply a workaround.

It's not my problem that most companies are really bad at protecting their customers.


I can 100% guarantee that you are already using something with a vulnerability in it ;) - it's the nature of the beast.

Reply Parent Score: 2