The Impossibly High Hurdle For Self-Driving Cars
Why human factors and perceptions of acceptable risks may keep self-driving cars off the roads
Now the impact of AI upon the creative industries is very much in the news at the moment, and as an author I’ve strong feelings on the subject – so I’m going to ignore that completely and instead talk about self-driving cars.
And for clarity, I’m talking about fully self-driving cars, where no human driver is required. (Think a taxi/Uber that’s got no steering wheel and drives itself).
Now proponents of self-driving cars will say that those cars are already safer, in terms of accidents and deaths per mile, than human-driven cars. I suspect they're right. They say that as development continues, this advantage will only increase. Again, I suspect they're right. But equally, I suspect they're missing a set of very significant factors related to human psychology that could keep fully self-driven cars off the road for the foreseeable future.
The first factor relates to what we consider to be an acceptable level of risk.
Years ago, I was watching a TV audience debate about train safety. There had just been a bad train collision, caused by a driver jumping a red signal, in which several people had died. (Bear in mind that on the UK railways, if you exclude lawbreakers and people committing suicide, the annual death rate through accidents is generally zero.)
Two possible automated systems to prevent trains going through red lights had been proposed: a cheaper one, which would cost an estimated £3 million per life saved; and a more expensive one, which would cost an estimated £17 million per life saved. A woman in the audience was arguing that it was disgusting to consider going for the cheaper option, saying that “you can’t put a price on human life”. One of the experts pointed out that if we went for the cheaper option, the money saved could be put to better use in road safety. Every year, he pointed out to her, 1700 people were being killed on the UK road system. Quick as a flash she answered, “Yes, but there’s nothing that we can do about that.”
And there’s the rub. When it comes to acceptable levels of risk, people are prepared to accept a much higher level of risk in systems where they perceive that they are in control than in systems where they must entrust their safety to others. We know cars are dangerous but we can believe that it won’t happen to us – we or the friend or the family member who’s driving will swerve around the drunk driver, or brake in time to avoid that truck. Whereas if your train derails because of faulty maintenance the first you’ll know about it is when you find yourself bouncing around the carriage’s interior.
This woman felt that 1700 annual deaths was an acceptable penalty in order that we as a society enjoy a road transportation system, but felt that any annual death rate on the railways in excess of zero was utterly unacceptable. I suspect that were she now to consider the transition to a self-driven vehicle transportation system, she would insist on a similar figure of zero as the acceptable annual death rate.
(As an aside, we see similar logic in the US gun control debate. People in the US know fully well that we have far lower levels of violence in Europe, but would rather be in a more dangerous system where they have control – believing that they could shoot the bad guys if the bad guys came for them – than have to entrust their safety to the authorities.)
Now let’s move on to the second factor: what we consider to be a reasonable, understandable mistake.
When human-driven cars kill people, we can generally sort it into two categories: negligence and honest mistake. If you drive drunk, or at 90 miles an hour through a residential estate, that’s negligence. But if your senses, reactions, judgement, and spatial understanding are overwhelmed by multiple factors – the road was narrow and winding with obstacles impairing visibility, it was dark, it was raining, the wet road was shining under the lights making road markings invisible, and you lost your sense of where you were – well, in those cases we call it an honest mistake.
Self-driving cars won’t make honest mistakes. They’ll hit bizarre edge cases that fall outside of the data on which they were trained. Imagine it’s early evening on Halloween. Perfect visibility. Dead straight road. But there’s a kid crossing that road who’s made two big mistakes. His first mistake was to make a costume inspired by the zig-zag black and white dazzle camouflage used by ships in the World Wars to confuse German U-Boats. And his second mistake was to be black (because the self-driving cars might have been trained in largely white-only suburbs).
So imagine a self-driving car simply doesn’t see him. It either doesn’t brake at all, or it brakes mere moments before it hits him. Either way, in perfect visibility on a dead straight road, it ploughs straight through him.
People will not accept that as an honest mistake. But if it’s not an honest mistake, then is it not negligence? Maybe, but who was negligent? If this was a self-driving taxi, then you can’t blame the passengers. Hell, they might see themselves as victims, although obviously less so than the poor kid who got killed. Now you might say that the car manufacturer could get sued in a case such as this. But that’s just a lawsuit, and unless someone goes to prison, it means nothing.
Right now, if you get in your car and you kill someone not through an honest mistake but through gross negligence, you will probably end up serving prison time. If a human-driven Ford kills someone and it’s clear negligence on the part of the driver, that driver goes to prison. Whereas if a self-driving Ford kills someone then it might be Ford’s fault, but then what? Corporate slap on the wrist?
At the end of the day, if no-one will serve prison time when a self-driving car kills someone in a manner that appears negligent rather than an honest mistake, then the public will insist on the cars being so safe that no such accidents will ever occur.
So putting it all together, I’m thinking it might be quite a while before we see fully self-driven cars.
The Nexus Files is free to read. But if you subscribe you'll get new posts emailed to your inbox automatically, and I won't feel like I'm pointlessly screaming into the void.