Linked by Thom Holwerda on Mon 15th May 2017 16:18 UTC
Windows

Friday saw the largest global ransomware attack in internet history, and the world did not handle it well. We're only beginning to calculate the damage inflicted by the WannaCry program - in both dollars and lives lost from hospital downtime - but at the same time, we're also calculating blame.

There's a long list of parties responsible, including the criminals, the NSA, and the victims themselves - but the most controversial has been Microsoft itself. The attack exploited a Windows networking protocol to spread within networks, and while Microsoft released a patch nearly two months ago, it's become painfully clear that patch didn't reach all users. Microsoft was following the best practices for security and still left hundreds of thousands of computers vulnerable, with dire consequences. Was it good enough?

If you're still running Windows XP today and you do not pay for Microsoft's extended support, the blame for this whole thing rests solely on your shoulders - whether that be an individual still running a Windows XP production machine at home, the IT manager of a company cutting costs, or the Conservative British government purposefully underfunding the NHS with the end goal of having it collapse in on itself because they think the American healthcare model is something to aspire to.

You can pay Microsoft for support, upgrade to a secure version of Windows, or switch to a supported Linux distribution. If any one of those means you have to fix, upgrade, or rewrite your internal software - well, deal with it; that's an investment that comes with running your business in a responsible, long-term manner. Let this attack be a lesson.

Nobody bats an eye at the idea of taking maintenance costs into account when you plan on buying a car. Tyres, oil, cleaning, scheduled check-ups, malfunctions - they're all accepted yearly expenses we all take into consideration when we visit the car dealer for either a new or a used car.

Computers are no different - they're not perfect magic boxes that never need any maintenance. Like cars, they must be cared for, maintained, upgraded, and fixed. Sometimes, such expenses are low - an oil change, new windscreen wiper rubbers. Sometimes, they are pretty expensive, such as a full tyre change and wheel alignment. And yes, after a number of years, it will be time to replace that car with a different one because the yearly maintenance costs are too high.

Computers are no different.

So no, Microsoft is not to blame for this attack. They patched this security issue two months ago, and had you been running Windows 7 (later versions were not affected) with automatic updates enabled (as you damn well should), you would've been completely safe. Everyone still on Windows XP without paying for extended support, or worse, anyone who turned automatic updates off and was hit by this attack?

I shed no tears for you. It's your own fault.
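To make the "keep your patches current" point concrete, here is a minimal sketch of how an administrator might check a Windows 7 machine for the MS17-010 fix. The KB identifiers and the wmic qfe query are assumptions used for illustration - exact hotfix IDs vary per OS version and may be superseded by later cumulative rollups - so treat this as a starting point, not an authoritative audit.

# Minimal sketch (assumptions noted): list installed hotfixes via the Windows
# "wmic qfe" query and look for KB IDs commonly associated with the MS17-010
# (SMB) fix on Windows 7. The KB numbers below are illustrative; exact IDs
# differ per OS version and later cumulative rollups can supersede them.
import subprocess

CANDIDATE_KBS = {"KB4012212", "KB4012215"}  # assumed Windows 7 MS17-010 hotfixes

def installed_hotfixes():
    """Return the set of KB identifiers reported by 'wmic qfe get HotFixID'."""
    output = subprocess.check_output(["wmic", "qfe", "get", "HotFixID"], text=True)
    return {line.strip() for line in output.splitlines() if line.strip().startswith("KB")}

if __name__ == "__main__":
    found = installed_hotfixes() & CANDIDATE_KBS
    if found:
        print("MS17-010-related hotfix found:", ", ".join(sorted(found)))
    else:
        print("None of the candidate hotfixes found; the machine may be covered "
              "by a later rollup, or it may be unpatched.")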

Thread beginning with comment 644411
Alfman
Member since:
2011-01-28

fmaxwell,

Knowing about the problem and being willing to swallow the bitter pill to fix it are two different things. I talked to many software engineers in my 30+ years and, almost to a person, they were very opposed to software being held to the same standards as other consumer products.


That's what happens when a company has no legal obligation to make their product perform as advertised.


If other companies operated like software companies:


If you knew it was defective, why was I not notified? Why didn't you recall it?


I don't think most software developers are against holding the companies accountable; many of us have been calling for that for a long time.

I think there may have been some unintentional confusion here: when you said "software developers", that generally refers to someone's job title, although your post now clarifies that you meant the companies that develop software. That changes a lot, and when you go blaming "software developers", this distinction is very important. For the most part, the employees who develop the software have very little authority to invest company resources in security; more often than not, I've found the only time companies seriously invest in security is...you guessed it...right after a breach.


fmaxwell Member since:
2005-11-13

Alfman,

I think there may have been some unintentional confusion here: when you said "software developers", that generally refers to someone's job title, although your post now clarifies that you meant the companies that develop software. That changes a lot, and when you go blaming "software developers", this distinction is very important.

I used was "software engineers" to avoid confusion. You introduced the term "software developers" and I assumed that you intended it to mean the same thing.

But you understood me correctly the first time. It's true that software engineers, the people who code for a living, almost always want more time and resources during the development process, but they still don't want the fruits of their labors treated as products, with all of the legal ramifications that entails. They don't want to have to revisit old code and make fixes years later.

And that is an area where they agree with management; software should remain in its special not-a-product niche. If a latent defect is found in something that hasn't been sold in years, management doesn't want to be in the position of being legally obligated to repair, replace, or refund. More importantly, management does not want the company to be able to be successfully sued when their security bug leads to, say, hospitals turning away patients.

For the most part, the employees who develop the software have very little authority to invest company resources in security; more often than not, I've found the only time companies seriously invest in security is...you guessed it...right after a breach.

Based on the idiotic notion that you can add security on rather than having to design it in. At one point in my career, I headed up a team developing a secure workstation that went through a formal C2 evaluation conducted by a team from NSA (back before Common Criteria). Most software engineers are pretty clueless about security. Most software companies don't want to invest in training or to hire enough senior software engineers with a specialty in security. They don't want to be constrained by engineers asking "do you really need a programming language inside of a word processor that most users run with admin privileges?"


Alfman Member since:
2011-01-28

fmaxwell,

But you understood me correctly the first time. It's true that software engineers, the people who code for a living, almost always want more time and resources during the development process, but they still don't want the fruits of their labors treated as products, with all of the legal ramifications that entails. They don't want to have to revisit old code and make fixes years later.


Software engineers don't get to make any of those choices, so who says we'd be against it? It could benefit more qualified engineers and create incentives to become more qualified. But none of this is decided by us; it's all decided by management, executives, and lawyers. To be clear, if you held the software engineers accountable without holding management or CEOs accountable, you'd end up with a large number of scapegoats being blamed without any authority or power to change things at the company.

Like the Wells Fargo fiasco:
http://www.washingtonexaminer.com/fired-wells-fargo-employees-sue-m...


I've been involved in projects where code was released with some known vulnerabilities, over my objections. If those had been publicly exploited, you would probably have blamed the software engineers for it; however, you would not be privy to the facts of what actually happened, or to the fact that it was a managerial decision to consider those things out of scope (another way of saying "unfunded"). I'm for accountability, but you've got to make the whole company accountable and not just those working on the software - many of us aren't in any position to demand changes from our employers.


Most software engineers are pretty clueless about security. Most software companies don't want to invest in training or to hire enough senior software engineers with a specialty in security.


I agree, but I'd go even further and say this low investment in, and appreciation of, security skills is quite discouraging even for those of us who have them.

Edited 2017-05-18 21:33 UTC
