Linked by Thom Holwerda on Sat 26th Aug 2017 19:08 UTC
In the News

Some light weekend reading: ethical guidelines for self-driving cars, as proposed by an ethics commission of the German government.

The technological developments are forcing government and society to reflect on the emerging changes. The decision that has to be taken is whether the licensing of automated driving systems is ethically justifiable or possibly even imperative. If these systems are licensed - and it is already apparent that this is happening at international level - everything hinges on the conditions in which they are used and the way in which they are designed. At the fundamental level, it all comes down to the following question. How much dependence on technologically complex systems - which in the future will be based on artificial intelligence, possibly with machine learning capabilities - are we willing to accept in order to achieve, in return, more safety, mobility and convenience? What precautions need to be taken to ensure controllability, transparency and data autonomy? What technological development guidelines are required to ensure that we do not blur the contours of a human society that places individuals, their freedom of development, their physical and intellectual integrity and their entitlement to social respect at the heart of its legal regime?

Cars are legalised murder weapons, and the car is probably one of the deadliest inventions of mankind. Self-driving cars, therefore, open up a whole Pandora's box of ethical dilemmas, and it only makes sense for governments and lawmakers to start addressing these.

Beyond the ethics related to life and death, though, there are also simpler, more banal ethical considerations. What if, in the hunger for more profits, a car maker makes a deal with McDonalds, and tweaks its self-driving car software just a tad bit so that it drives customers past McDonalds more often, even if it increases total travel time? What if a car maker makes similar deals with major chains like Target, Walmart, and Whole Foods, so that smaller chains or independent stores don't even show up when you say "take me to the nearest place that sells X"? Is that something we should allow?

Should we even allow self-driving car software to be closed-source to begin with? Again - cars are legal murder weapons, and do we really trust car manufacturers enough not to cut corners when developing self-driving car software to meet deadlines or due to bad management or underpaid developers? Shouldn't all this development and all this code be out there for the world to see?

Interesting times ahead.

Permalink for comment 648367
Sabon
Member since:
2005-07-06

Self-driving cars should NOT be held to higher standards than human drivers.

It is very easy to calculate how the average human reacts in emergency situations. Auto-pilot systems should be held to the same standard for DETERMINING which path to take. Since auto-pilot cars are never distracted, that alone would instantly create a much safer environment for everyone.

Accidents would be reduced simply because auto-pilot cars can react faster, as well as never being distracted.
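The reaction-time argument can be made concrete with rough stopping-distance arithmetic. The figures below are illustrative assumptions, not from the comment: a typical attentive human needs on the order of 1.5 seconds to react, while a computer's sense-and-brake latency is assumed here to be around 0.2 seconds.

```python
def stopping_distance_m(speed_kmh, reaction_s, decel_ms2=7.0):
    """Total stopping distance in metres: distance covered during the
    driver's (or computer's) reaction time, plus braking distance
    under constant deceleration (v^2 / 2a)."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

# Illustrative comparison at 100 km/h (assumed reaction times):
human = stopping_distance_m(100, reaction_s=1.5)      # attentive human driver
computer = stopping_distance_m(100, reaction_s=0.2)   # hypothetical autopilot latency

print(round(human, 1), round(computer, 1))  # → 96.8 60.7
```

On these assumptions the computer stops roughly 36 metres shorter from 100 km/h, purely because of the shorter reaction time; the braking phase itself is identical.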

As more and more vehicles become self-driving, the chance of accidents will be reduced year over year. Eventually there will be a tipping point where accidents drop AUTOMATICALLY for the reasons above.

Again: to hold auto-pilot cars to a higher standard than humans as far as which path they would take is just people trying to slow down the auto-pilot industry. In the meantime, the tipping point will be delayed, and thousands, if not tens of thousands, of lives will be lost to the foot-dragging of Neanderthals, including government officials, who slow progress down and cause a lot of people to needlessly die.

Reply Score: 1