Self-driving cars could be ready for the road before regulation has codified the ethics that determine their decisions.

Computer scientist Iyad Rahwan has been conducting research to develop moral self-driving cars.

Image: a self-driving car crash. © iStock/Devrimb

Rahwan is an associate professor at the MIT Media Lab, where he runs a team called Scalable Cooperation that is investigating ethical dilemmas faced by autonomous vehicles.

Their research will help inform regulation on whom a car has a duty to protect when it must make a risky choice, such as whether to prioritise the life of a passenger or a pedestrian in an impending crash.

Regulation for automation

More than 1.25 million people are killed in road accidents every year. Research suggests that driverless cars could cut these deaths by up to 90 percent and save $190 billion for the US economy alone, but they cannot enter the market without being regulated.

The regulation has to reflect our values in the same way as the existing rules on human-driven cars do. These morals vary among individuals and, more broadly, across cultures. Take bull bars, fitted to the front of a vehicle to protect it in collisions: they’re legal in the US, despite studies showing that they make a car more likely to injure pedestrians, but they are banned in the UK.

The chances of these accidents occurring can be quantified by analysing data on the outcomes of driving decisions. Researchers can study what happened when a car losing control in the middle of three lanes veered left towards a truck or right towards a motorbike, for example, and find out how often each choice led to the death of each vehicle’s driver.
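In code, that kind of quantification amounts to computing empirical fatality rates per manoeuvre from historical records. A minimal Python sketch of the idea, using invented records rather than real crash data:

```python
# Empirical fatality rate per manoeuvre, from hypothetical records.
from collections import defaultdict

# Each record: (manoeuvre chosen, whether anyone died as a result).
records = [
    ("veer_left_towards_truck", True),
    ("veer_left_towards_truck", False),
    ("veer_right_towards_motorbike", True),
    ("veer_right_towards_motorbike", True),
    ("veer_right_towards_motorbike", False),
]

totals = defaultdict(int)
deaths = defaultdict(int)
for manoeuvre, fatal in records:
    totals[manoeuvre] += 1
    deaths[manoeuvre] += fatal  # a bool counts as 0 or 1

for manoeuvre, n in totals.items():
    print(f"{manoeuvre}: fatality rate {deaths[manoeuvre] / n:.2f} over {n} cases")
```

Real analyses would condition on far more context, such as speed, weather and vehicle types, but the principle is the same: each choice gets a risk estimate backed by outcome data.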

In 2014, Google filed a patent for a computer-implemented method for evaluating these risks, based on the probability of each action producing a particular outcome and the importance of that outcome.
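The core idea, weighting each possible outcome of an action by its probability and its importance and preferring the action with the lowest expected cost, can be sketched in a few lines. The actions, probabilities and importance weights below are invented for illustration and are not taken from the patent:

```python
# Probability-times-importance risk scoring, in the spirit of the patent's
# description. All numbers and action names here are illustrative only.

def expected_risk(outcomes):
    """Sum of probability * importance over an action's possible outcomes."""
    return sum(prob * importance for prob, importance in outcomes)

actions = {
    # action: [(probability of outcome, importance of outcome), ...]
    "stay_in_lane": [(0.6, 10.0), (0.4, 0.0)],
    "veer_left": [(0.2, 50.0), (0.8, 1.0)],
    "veer_right": [(0.3, 30.0), (0.7, 0.5)],
}

for name, outcomes in actions.items():
    print(f"{name}: expected risk {expected_risk(outcomes):.2f}")

safest = min(actions, key=lambda name: expected_risk(actions[name]))
print(f"lowest-risk action: {safest}")
```

The hard part is not the arithmetic but the importance weights: deciding how much a pedestrian’s injury counts against a passenger’s is exactly the ethical question regulation has to settle.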

This type of technology can only translate into driving decisions if some consensus is established on which choices are the most moral. These ethical dilemmas are being debated by car manufacturers, and the discussions are proving controversial, as Mercedes-Benz recently found out.

The company was pilloried by the press after a senior manager revealed that its cars would prioritise the safety of their occupants over pedestrians. The Daily Mail reported his comments under a headline that read: "Mercedes-Benz admits automated driverless cars would run over a CHILD rather than swerve and risk injuring the passengers inside."

Parent company Daimler AG responded to the outcry in a statement.

"We will implement both the respective legal framework and what is deemed to be socially acceptable," it read, but what that means is not yet clear.

Nobody had established what is socially acceptable: no major study of the public’s views had ever been conducted, until Rahwan and his team found a way to survey people digitally.

Ethical vehicles

Scalable Cooperation has created a platform called the Moral Machine that presents ethical dilemmas and asks users to make a choice.

The website can generate 26 million different dilemmas by varying the age, gender, species and behaviour of the potential victims. The scenarios were translated into 10 languages, around four million users were surveyed, and the results were analysed en masse and broken down by voter demographics.
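That scenario count comes from combinatorics: varying a handful of character attributes multiplies quickly into a huge space. A toy sketch, with hypothetical attribute lists far smaller than the platform’s real ones:

```python
# Toy illustration of how a few varying attributes multiply into a huge
# dilemma space. These attribute values are hypothetical, not the Moral
# Machine's actual parameters.
from itertools import product

ages = ["child", "adult", "elderly"]
genders = ["female", "male"]
species = ["human", "dog", "cat"]
behaviours = ["lawful", "jaywalking"]

characters = list(product(ages, genders, species, behaviours))
print(f"{len(characters)} character variants")  # 3 * 2 * 3 * 2 = 36

# Pitting one character against another already yields 36 * 35 = 1,260
# ordered dilemmas; groups of several characters push the count into the
# millions.
pairs = [(a, b) for a, b in product(characters, repeat=2) if a != b]
print(f"{len(pairs)} one-vs-one dilemmas")
```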

Rahwan revealed the early findings for the first time at the Global Education and Skills Forum in Dubai.

They show that people slightly favour pedestrians over passengers and that a sizable majority of them would rather save a child than an adult.

Their preferences vary depending on the behaviour of the people involved in the accident. Around three-quarters of people would rather swerve to hit a jaywalker than someone who was lawfully walking, and almost half of them would rather plough into two jaywalkers than one law-abiding pedestrian. Both of these percentages were even higher in Germany.
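Country-level splits like the German figures fall out of a grouped aggregation over the responses. A minimal sketch with invented data that does not reproduce the study’s results:

```python
# Breaking survey choices down by respondent country. The responses are
# invented for illustration; they are not the study's data.
from collections import defaultdict

# Each response: (respondent country, True if they chose to spare the
# lawful pedestrian rather than the jaywalker).
responses = [
    ("DE", True), ("DE", True), ("DE", False),
    ("US", True), ("US", False), ("US", False),
]

spared = defaultdict(int)
total = defaultdict(int)
for country, chose_lawful in responses:
    spared[country] += chose_lawful
    total[country] += 1

for country in sorted(total):
    share = spared[country] / total[country]
    print(f"{country}: {share:.0%} spared the lawful pedestrian "
          f"({total[country]} responses)")
```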

Turning ethics into laws

Germany is also the first country to create an ethics commission for automated and connected driving.

The commission has already released some of its verdicts. It ruled that the technology must prioritise preventing critical situations from arising; that general programming to reduce the number of personal injuries may be justifiable but is not mandated; and that parties involved in generating risks must not sacrifice those who are not involved, meaning the actions of jaywalkers cannot be allowed to endanger lawful pedestrians.

The commission also prohibited distinctions based on personal features such as age, gender or physical appearance, meaning the safety of children must not be prioritised over that of adults.
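One way to read those rulings is as hard constraints that filter candidate manoeuvres before any injury-minimising choice is made. The sketch below is a speculative encoding of that reading, not an official specification:

```python
# Speculative sketch: commission-style rules as filters over candidate
# manoeuvres. The data model and rule set are illustrative only.

def permitted(manoeuvre):
    # No distinctions based on personal features such as age or gender.
    if manoeuvre["uses_personal_features"]:
        return False
    # Parties generating a risk must not sacrifice uninvolved ones.
    if manoeuvre["harms_uninvolved"]:
        return False
    return True

candidates = [
    {"name": "brake_hard", "uses_personal_features": False,
     "harms_uninvolved": False, "expected_injuries": 1.2},
    {"name": "swerve_onto_pavement", "uses_personal_features": False,
     "harms_uninvolved": True, "expected_injuries": 0.4},
]

allowed = [m for m in candidates if permitted(m)]
# Among permitted manoeuvres, reducing personal injuries may be justifiable.
choice = min(allowed, key=lambda m: m["expected_injuries"])
print(f"chosen manoeuvre: {choice['name']}")
```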

But German morality is not always replicated beyond its borders, as the Moral Machine proved. Universal machine ethics are impossible because values vary around the world.

"These things correlate with a lot of educational and cultural values," said Rahwan. "And I think that’s important because if we are to build machines that reflect our own values then we need to understand those values more and we need to quantify them as well."

The Moral Machine also exposed a moral tension that is integral to human nature. There is a contradiction between what people think is good for society and what they are willing to contribute to make that a reality.

This conflict could determine whether self-driving cars are a success. As Rahwan put it:

"I would never buy a car that would sacrifice me, but I want everybody else to buy such cars."