Linked by Thom Holwerda on Wed 29th Nov 2017 23:45 UTC
Mac OS X

So there's been a big security flaw in Apple's macOS that the company fixed in 24 hours. I rarely cover security issues because where do you draw the line, right? Anyhow, the manner of disclosure of this specific flaw is drawing some ire.

Obviously, this isn't great, and the manner of disclosure didn't help much either. Usually it's advisable to disclose these vulnerabilities privately to the vendor, so that it can patch any holes before malicious parties attempt to use them for their own gains. But that ship has sailed.

I've never quite understood this concept of "responsible disclosure", where you give a multi-billion dollar company a few months to fix a severe security flaw before you go public. First, unless you're on that company's payroll, you have zero legal or moral responsibility to help that company protect its products or good name. Second, if the software I'm using has a severe security flaw, I'd damn well like to know about it, so I can do whatever I can to temporarily fix the issue, stop using the software, or take other mitigating steps.

I readily admit I'm not hugely experienced with this particular aspect of the technology sector, so I'm open to arguments to the contrary.

it is not for the companies' sake
by emphyrio on Thu 30th Nov 2017 10:09 UTC

The concept of responsible disclosure is not there to protect the companies, it's there to protect the users.

In general (not this latest silly bug), there are practical limits on how quickly a vulnerability fix can be implemented and tested, no matter how many billions you throw at it - during that time I'd prefer that the vulnerability in question not be known to every criminal on the planet.

Additionally, given the potential damage, I suspect that security researchers are also on a better legal footing when they disclose in this manner.

Edited 2017-11-30 10:13 UTC

Reply Score: 2

Megol Member since:
2011-04-11

> The concept of responsible disclosure is not there to protect the companies, it's there to protect the users.

Exactly.

> In general (not this latest silly bug), there are practical limits on how quickly a vulnerability fix can be implemented and tested, no matter how many billions you throw at it - during that time I'd prefer that the vulnerability in question not be known to every criminal on the planet.
>
> Additionally, given the potential damage, I suspect that security researchers are also on a better legal footing when they disclose in this manner.

Why would they be? It isn't they who created the security problem in the first place. In a reasonable legal system, pointing out a flaw isn't a crime.

Reply Parent Score: 2

emphyrio Member since:
2007-09-11

> ...
>
> Why would they be? It isn't they who created the security problem in the first place. In a reasonable legal system, pointing out a flaw isn't a crime.

That's nice, but we are talking about rather big and scary companies here (with the aforementioned billions at stake for both the software producer and its customers). Responsible disclosure would be a shield in any civil procedure, and hugely important for the media attention such a procedure would garner: it's the difference between the company being able to say "we could have prevented this, given a little time: pay up" and the researcher being able to respond "you had months to fix the problem."


Sure, the primary responsibility lies with the software companies in question, but that is of scant help when they ruin you anyway over partial culpability.

Edited 2017-11-30 12:11 UTC

Reply Parent Score: 3