The Ethical Challenge Facing Self-Driving Cars

Self-driving cars work most of the time, but they aren't foolproof yet. One of the biggest challenges still facing the designers of these autonomous vehicles is situational ethics: Who is responsible for an accident? How should these cars make decisions in the event of a crash, especially when the people inside the car aren't the only ones at risk? We'll look at some of the ethical challenges puzzling self-driving car designers, as well as some of the answers they're offering.

The Trolley Problem

One ethical question many people have brought up is called the Trolley Problem. It goes something like this:

  • A trolley is hurtling toward a station where five people will die if it crashes. You can divert it to a different track, but a single person standing on that track will die because there is no time to warn them. Would you kill one person to save five?
  • In a variant, the only way to stop the same trolley is to push one person into its path. Again, you're killing one person to save five. What do you do?

Neither outcome is ideal; either way, someone dies. But you have the choice to save five lives by sacrificing one. It's a question of human morality, and one a driver may have to answer in a split second: if you're about to crash, do you protect yourself and your passengers, or do you protect pedestrians and others outside your car at your own risk?

How do you teach that to a computer? Do you, as German regulators have done, require the car to avoid human injury or death at all costs, even if that means striking other obstacles in the road, such as animals or property? Or do you give the car the ability to make that decision on its own, even if it means putting the driver or passengers at risk?
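
To make the dilemma concrete, here is a minimal sketch, in Python, of how a rule hierarchy like the German one could be encoded. It is purely illustrative rather than any manufacturer's actual system, and every name in it is hypothetical: candidate maneuvers are ranked lexicographically, so avoiding human harm always outranks avoiding harm to animals, which in turn outranks property damage.

```python
# A minimal sketch (not any manufacturer's actual logic) of a rule
# hierarchy in which human harm always outranks animal harm, which
# outranks property damage. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Outcome:
    """Predicted consequences of one candidate maneuver."""
    maneuver: str
    humans_harmed: int      # estimated human injuries or fatalities
    animals_harmed: int     # estimated animal casualties
    property_damage: float  # estimated cost in dollars

def priority_key(outcome: Outcome) -> tuple:
    # Lexicographic ordering: a maneuver that harms fewer humans wins
    # regardless of how much animal or property damage it causes.
    return (outcome.humans_harmed, outcome.animals_harmed, outcome.property_damage)

def choose_maneuver(candidates: list[Outcome]) -> Outcome:
    # Pick the maneuver with the least-severe harm profile.
    return min(candidates, key=priority_key)

if __name__ == "__main__":
    options = [
        Outcome("stay in lane", humans_harmed=1, animals_harmed=0, property_damage=0.0),
        Outcome("swerve right", humans_harmed=0, animals_harmed=1, property_damage=8000.0),
        Outcome("brake hard",   humans_harmed=0, animals_harmed=0, property_damage=15000.0),
    ]
    print(choose_maneuver(options).maneuver)  # -> "brake hard"
```

Of course, the hard part isn't this comparison; it's producing reliable estimates of each outcome in the fraction of a second before a crash.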

This is one of the biggest dilemmas self-driving car manufacturers face in their quest to get these cars on the road. While self-driving cars may eventually reduce car accidents by up to 90 percent, that can only happen if the majority of cars on the road are autonomous, something that could be a long way off if these ethical dilemmas continue to hold up production and testing.

Self-Driving Accidents

If you’re behind the wheel of a self-driving car — or even a car like a Tesla, which has a moderate level of autonomy but isn’t a fully self-driving car — who is responsible for an accident if the car is driving itself?

A self-driving car claimed its first life in Tempe, Ariz., earlier this year. Elaine Herzberg was crossing the road at night with her bicycle when a self-driving Uber being tested in the city struck and killed her. There was a backup driver in the car, but dash-cam footage showed she wasn't watching the road; she was distracted by a cell phone or other device. She didn't see Herzberg until it was too late to take over, and there is no indication the car's software attempted to brake or otherwise avoid the collision.

Herzberg wasn't moving erratically or in a way that would have made it hard for the car's algorithms to predict her path. She was simply crossing the street. So why couldn't the car's LIDAR sensors, which can see in the dark, spot her and take the appropriate precautions to avoid the collision?

The truth is, we just don't know yet, and neither does Uber. What should have been a straightforward encounter ended in a fatality, and Uber has yet to figure out why the car didn't respond. Uber has taken all its self-driving test cars in other cities off the road while it continues to investigate the accident.

Insurance Quandaries

Self-driving car manufacturers are also finding themselves at odds with insurance companies. If the driver is taken out of the equation, who is at fault in the event of an accident? Do you even need car insurance if your car drives itself with no input from you?

Experts predict around 23 million fully autonomous cars will be on the road by 2035. Individual policies are expected to become less expensive as accident rates fall, and between now and 2035 the insurance industry could lose upwards of $25 billion.

With that in mind, insurance will have to change. Instead of insuring a driver against accidents, you may find yourself needing a policy that protects your investment by covering cybersecurity threats, or one that insures you against product liability. These are ways for the insurance industry to stay relevant once human error is no longer the main cause of accidents.

Self-driving cars are the future, and while they have the potential to save tens of thousands of lives every year, there are still a few ethical hurdles to overcome before they can finally take to the streets, roads and highways of the world.
