According to a 2015 World Health Organization report on road safety, nearly 1,250,000 people die in car accidents every year. That is an enormous number. And 94% of these accidents are caused by human error, so if we could remove human error from the equation, we would save a lot of lives.
That is exactly what a technology called self-driving cars promises: cars that drive themselves, with no driver at all!
These cars are programmed to see everything around them; they know the streets and can measure distances. They are designed so that errors are almost impossible, and they behave and make decisions almost perfectly.
And here is the most important thing: self-driving cars will not only make roads safer, they will make the world more efficient. However, before we allow them on the roads, we should ask some questions.
We have to tell the car how to behave in specific situations. Imagine the car has to choose between killing five people and killing just one.
While you are in the car, a group of people is crossing the road, and a single man is crossing on the other side. Should the car hit the group or the man? Or should it swerve into a wall and kill you? Don't forget that option!
According to Jeremy Bentham's utilitarian theory, we should order the car to behave in a way that minimizes total harm and saves the largest number of people.
But there is another view: you must never intend to kill. According to Kant's theory, if you swerve, you intend to kill the person on the other side, and if you hit the wall, you intend to kill yourself. So on Kant's view the right behavior is to let the car go straight and hit whoever is ahead, no matter how many.
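As a thought experiment, the two rules can be sketched as a toy decision function. Everything here is invented for illustration: the `Outcome` structure, the action names, and the reduction of Bentham to "minimize deaths" and Kant to "never swerve" are my own simplification, not real autonomous-driving code.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One possible action and the deaths it would cause (illustrative only)."""
    action: str   # e.g. "go_straight", "swerve", "hit_wall"
    deaths: int   # how many people would die

def bentham_choice(outcomes):
    """Utilitarian rule: pick the action that minimizes total deaths."""
    return min(outcomes, key=lambda o: o.deaths)

def kant_choice(outcomes):
    """Kantian rule (as described above): never form an intention to kill,
    so never swerve -- stay on the current course, whatever the count."""
    return next(o for o in outcomes if o.action == "go_straight")

dilemma = [
    Outcome("go_straight", 5),  # hit the group crossing ahead
    Outcome("swerve", 1),       # hit the single man on the other side
    Outcome("hit_wall", 1),     # sacrifice the passenger
]

print(bentham_choice(dilemma).action)  # → swerve (one death instead of five)
print(kant_choice(dilemma).action)     # → go_straight
```

The sketch makes the disagreement concrete: the two functions receive the same inputs and return different "right" answers, which is exactly the problem a manufacturer has to resolve in code.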
When people are asked which car they prefer (Kant's or Bentham's), almost everyone chooses Bentham's: naturally, they want the larger number to live and the few to be sacrificed.
But when people are asked whether they would buy such a car, one that may sacrifice the few for the many, and that may therefore sacrifice them, they say: "No, we would buy Kant's car, which protects us, but everyone else should buy the car that minimizes harm."
It's an amusing contradiction.
If you own a car company, you now have a huge problem. You want people to buy your car, but if you build one that kills its driver in some situations to save the greater number, no one will buy it. If, instead, you build a car that never sacrifices its driver, you harm the public interest: you protect your customer while exposing everyone else to danger. Either way, you anger people and you can't sell the car.
In 2016, when car companies were asked how self-driving cars should behave in such a situation, most of them avoided answering. Volvo claimed that its cars don't get into accidents, so the situation would never arise.
As for Mercedes, when Christoph von Hugo, the head of its safety department, was asked about self-driving cars, he said: "If you know that you can save at least one person, save the one in the car." He said it to reassure customers so they would buy the car. But the Daily Mail summed it up as: "Mercedes admits that it can kill a kid but save the customer."
People got angry, and that headline is a clear example of why.
So we have a problem: we want a safer future with less damage, and that can't happen without self-driving cars. Yet people won't buy them if they know the car might, in some rare situation, kill them to save others.
And there are many more problems like this.
Suppose the car is driving, a doctor is crossing the road, and a criminal is crossing on the opposite side. What should it do? Should we consider the doctor's life more valuable than the criminal's? Should the car spare whoever has the higher social status? When asked, people prefer to kill the criminal and save the doctor.
Another example: a man is crossing while his signal is green, and on the other side another man is crossing against a red signal. Whom should the car choose? People choose to kill the one crossing illegally.
But what if we change the scenario: instead of one person crossing illegally, a group is crossing against the red signal while a single person crosses with the green. Whom will the car kill? Most people choose to sacrifice the one and save the group, simply because the group is larger.
And there are many similar examples. Should it kill the children or the old man? People choose to save children over the elderly, women over men, and the fit over the unfit.
Through people's answers we learn their priorities, and those priorities differ from one society to another. They correlate with whether a country is developed or developing, and they shape what each society considers moral.
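One way to picture society-dependent priorities is as a weighted scoring rule, where each society assigns its own weight to each attribute of a person. All of the weights, attribute names, and the two "societies" below are invented for illustration; they are not data from any real survey.

```python
# Toy illustration of society-dependent priorities (all weights invented):
# a higher score means the person is more likely to be spared.
def spare_score(person, weights):
    """Sum the weights of a person's attributes under one society's values."""
    return sum(weights.get(attr, 0.0) for attr in person)

# Hypothetical weightings for two imaginary societies: one that values
# rule-following heavily, one that barely considers it.
rule_following_society = {"crossing_legally": 2.0, "child": 1.0, "doctor": 0.5}
rule_flexible_society  = {"crossing_legally": 0.2, "child": 1.0, "doctor": 0.5}

pedestrian = {"crossing_legally"}  # an adult crossing on a green signal
jaywalker  = {"child"}             # a child crossing against the red

for name, w in [("rule-following", rule_following_society),
                ("rule-flexible", rule_flexible_society)]:
    if spare_score(pedestrian, w) > spare_score(jaywalker, w):
        spared = "pedestrian"
    else:
        spared = "jaywalker"
    print(name, "society spares the", spared)
# → rule-following society spares the pedestrian
# → rule-flexible society spares the jaywalker
```

With the same scoring function but different weights, the two societies reach opposite decisions, which mirrors how the same dilemma is answered differently across countries.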
For example, in Hong Kong and the Netherlands people stick to the rules: crossing against a red light counts as jaywalking, and you are fined for it. In Bangladesh and India, on the other hand, people are used to breaking the rules, so respondents there refused to kill the person who crossed against the signal, because doing so is normal there.
In countries with a high human development index, people are treated as equals regardless of social status: a worker's life counts the same as an engineer's or even a criminal's. In countries with a low human development index, a distinction is drawn between the life of a doctor and the life of a worker.
Perhaps it was too early to discuss these things before. But this year, for the first time, a self-driving Uber was involved in a fatal accident: it killed a woman in Arizona, in the USA.
And guess what… the car was a Volvo. The woman's death is a tragedy, but it is also awkward for Volvo, the company that claimed its cars don't get into accidents. And it happened even though an Uber supervisor was overseeing the self-driving system from inside the car.
Whom should we blame for this mistake: the car's programmer, the supervisor, or the manufacturer? That is what forces us to take these questions seriously.
Of course, there are some reasonable objections to this whole line of thinking.
1- The trolley problem is not something that actually happens often.
2- Why do we jump straight to the final moment of the situation? There are many steps before it. Why not say the brakes have failed, and solve that problem first, before debating whether to kill one person or five?
Since these questions never cross a human driver's mind, they are unlikely to arise for self-driving cars either.
Of course, such scenarios mostly don't exist, and critics say that humans simply enjoy contemplating hard, dark dilemmas, that they find such thoughts delightful, and that we will never get anything useful out of them.
What is your opinion? Should we switch to self-driving cars or not? Leave a comment and tell us what is on your mind.