The Real Moral Dilemma of Self-Driving Cars
Push this button. It's driving itself. It feels good. So, BMW brought me to the Consumer Electronics Show here in Las Vegas. I'm going to check out the future of driving. Did I get it? Am I near? [unintelligible] Oh! I felt it! That really felt like pushing a button.
In this concept car, there's a holographic menu screen. It works by projecting an image above this panel. And then it uses this camera in the steering column to determine where your finger is. And when it detects your fingers in the right spot, it uses ultrasound from these speakers to provide haptic feedback - you can actually feel it in your fingers. It's like a little buzzing.
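To make that flow concrete, here is a minimal sketch of the camera-to-haptics loop described above, assuming a hand-tracking camera and an ultrasonic speaker array. Every name in it (BUTTON_REGION, get_finger_position, pulse_ultrasound) is a hypothetical placeholder for illustration, not BMW's actual software interface.

```python
# Minimal sketch of the interaction loop: track the fingertip, and when it
# enters the volume where the holographic button floats, fire a haptic pulse.
# All names and numbers here are illustrative placeholders.
import time

# Volume (in metres, relative to the panel) where the holographic
# "button" appears to float.
BUTTON_REGION = {"x": (-0.02, 0.02), "y": (-0.02, 0.02), "z": (0.08, 0.12)}


def inside(region, point):
    """True if the tracked fingertip lies inside the button volume."""
    return all(lo <= point[axis] <= hi for axis, (lo, hi) in region.items())


def get_finger_position():
    """Stub for the steering-column camera's fingertip estimate."""
    return {"x": 0.0, "y": 0.01, "z": 0.10}  # would come from hand tracking


def pulse_ultrasound(duration_s=0.05):
    """Stub for focusing the ultrasonic speakers on the fingertip -
    the 'little buzzing' you feel when the button is pressed."""
    print(f"haptic pulse for {duration_s}s")


def run_loop(frames=120):
    for _ in range(frames):              # poll at roughly camera frame rate
        finger = get_finger_position()
        if finger is not None and inside(BUTTON_REGION, finger):
            pulse_ultrasound()           # finger is on the virtual button
        time.sleep(1 / 60)


if __name__ == "__main__":
    run_loop()
```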
But what I really want to try is NOT driving. I can actually talk to the camera. Are you sure that this is a good idea? So here's a question: How much should you trust an autonomous car? This car is now driving itself. But I need to be able to take over at any time. I'm still legally responsible if something happens to the car, right?
But, in the coming years, cars are going to take over more and more of the responsibility for driving safely. And that has led a lot of people to consider the moral dilemmas faced when programming self-driving cars. The question is what sort of ethical framework we should program into autonomous vehicles. Imagine the car needs to make a decision: swerve left into an SUV, or swerve right into a motorcycle.
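For illustration only, here is a toy sketch of how such an "ethical framework" can reduce to a cost function over candidate maneuvers. The option names, probabilities, and severity weights are invented for the example; no manufacturer's actual decision logic is implied.

```python
# Toy example: an "ethical framework" expressed as a cost to minimize.
# All numbers and option names are made up for illustration.
candidate_maneuvers = {
    "brake_straight":               {"p_collision": 0.9, "severity": 0.3},
    "swerve_left_into_suv":         {"p_collision": 0.7, "severity": 0.5},
    "swerve_right_into_motorcycle": {"p_collision": 0.7, "severity": 0.9},
}


def expected_harm(option):
    """Expected harm = probability of a collision times its severity."""
    return option["p_collision"] * option["severity"]


# Whatever ordering this minimization encodes *is* the ethical framework:
# change the weights and the "right" choice changes with them.
best = min(candidate_maneuvers, key=lambda name: expected_harm(candidate_maneuvers[name]))
print(best)  # -> "brake_straight" with these made-up numbers
```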
Okay, so we can imagine a lot of weird situations where an autonomous car has to make a tough choice. But the real moral dilemma is that accidents are happening right now. More than 30,000 people are killed each year in the U.S. alone, and more than 2 million are injured. And in 94% of collisions, the problem is driver error.
In 2015, half of all traffic fatalities occurred on highways. So even the level of technology we've demonstrated today - autonomous driving on a highway - could save a lot of lives. We are already shirking our responsibility for driving: we are using our phones. In 2014, distracted driving resulted in at least 3,000 deaths and 430,000 injuries.
So, if we're not driving, we'd better hope that the tech gets to a level where the cars can drive for us. My view: this problem is only going to get worse. You know, when elevators became autonomous, a lot of people were uncomfortable with that. They were used to there being an operator in the elevator, so compromises had to be made, like big red stop buttons, just to make people comfortable.
And a nice soothing voice to say, "Which floor would you like to go to?" Now, I know that elevators have far fewer degrees of freedom than a car, but look at something like airplanes: studies show that planes flying in full autonomous mode are actually safer than when pilots take over control.
I think the moral dilemmas over exactly how cars should react in the tiny percentage of cases where tough choices need to be made are a distraction from the main problem. The longer we wait to get autonomous vehicles on the road, the more people will die. And that is the real moral question of autonomous cars: why aren't we getting them on the road faster?
I hope you enjoyed the ride. That was cool. Now let's head back to CES. Perfect.