It's not clear anybody wants autonomous cars
Nobody actually knows if people want autonomous cars. I don't mean the imaginary version of autonomous cars, the sleekly futuristic pods, always on call, that whisk you in silent comfort and privacy to your destination. I mean autonomous cars as they will actually be. Partly this is because people haven't been able to try them, since they don't work yet. But partly it's because when they do reach a point where they work (which is not a given) autonomous cars will behave quite differently from human drivers, and will play a very different role in traffic interactions than people do. The autonomous vehicle industry, particularly the companies developing and testing robotaxis, has gotten away for too long with selling a vision of the future that they should know perfectly well is never going to come to pass.
I used to argue with my founding CEO about this. When I brought the topic up on Twitter, I got replies along the same lines he would take in those arguments. People use Ubers, don't they? What's not to like about an Uber without the vagaries of a driver? Take that human (who might be dangerous, or get lost, or be on the phone) out of the equation and it seems like a dream.
There are, roughly, two main reasons that robotaxis will never be like an Uber, just without a driver. One of those reasons is technological and one is social. The technological issues will probably get solved. They're fantastically difficult, and we're not very close, but betting against technological problems eventually getting solved is never smart. The social issues won't get solved and, fundamentally, sort of can't. In the Twitter conversation about this, most of the people who agreed with me brought up the technological issues. The social issues are less well understood. Let me explain both.
On the technological side, everybody kind of knows that autonomous vehicles don't yet work right. The companies deploying them tend to be fairly tight-lipped about the ways in which they don't work right, for reasons both good and bad, but it's no accident that their deployments are limited to places and times where the streets aren't likely to be very crowded or chaotic.
The big reason for the limited nature of those deployments is that these vehicles can't yet make sense of the world. They see much less of the world than humans do, and what they see is vastly different. Figuring out how to use that information to understand what is happening, what is likely to happen, and what the vehicle should do based on that understanding is a fantastically difficult problem, and the state of the art in these vehicles isn't particularly close to nailing it.
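To make the shape of that problem concrete, here is a minimal sketch of the classic perceive-predict-plan loop that autonomy stacks are generally built around. The function names, data structures, and numbers are illustrative assumptions for this post, not any company's actual system; the point is that every stage is a place where the vehicle's picture of the world can go wrong.

```python
# Illustrative sketch of the standard autonomy loop (an assumed
# structure, not any particular company's stack). Each stage is a
# point of failure: perception can miss or drop objects, prediction
# can be wrong, and planning responds to either error.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    object_id: int
    position: tuple    # (x, y) in meters, vehicle frame
    velocity: tuple    # (vx, vy) in meters per second
    confidence: float  # detector confidence, 0.0 to 1.0

def perceive(sensor_frame):
    """Turn raw sensor data into tracked objects. In practice this is
    the stage where objects get missed, or lost between frames."""
    return [TrackedObject(1, (12.0, 0.5), (-1.0, 0.0), 0.9)]  # stub

def predict(objects, horizon_s=3.0, dt=0.5):
    """Naive constant-velocity prediction: roll each object forward.
    Real systems model intent; this just illustrates the idea."""
    trajectories = {}
    steps = int(horizon_s / dt)
    for obj in objects:
        trajectories[obj.object_id] = [
            (obj.position[0] + obj.velocity[0] * dt * i,
             obj.position[1] + obj.velocity[1] * dt * i)
            for i in range(1, steps + 1)
        ]
    return trajectories

def plan(trajectories, safety_margin_m=5.0):
    """Pick an action given predicted trajectories. If anything is
    projected to come within the safety margin of the vehicle (the
    origin of its own frame), stop."""
    for traj in trajectories.values():
        if any((x**2 + y**2) ** 0.5 < safety_margin_m for x, y in traj):
            return "stop"
    return "proceed"

# One tick of the loop: sense, understand, predict, act.
objects = perceive(sensor_frame=None)
print(plan(predict(objects)))
```

Notice that the safest default in the planner, stopping whenever the world model looks threatening, is exactly what produces abrupt braking whenever perception or prediction gets the world wrong.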
That doesn't mean they can't be safe on the road, at least from the limited perspective of not hitting anybody. But it does mean that they're going to behave fundamentally differently from how you'd expect the driver of a vehicle to behave. Stopping for no reason, failing to see something obvious, completely losing track of an object in the world and then abruptly correcting course when it reappears: all of these behaviors are going to be pretty standard for autonomous cars for a long while yet. If you're going to bet, bet on them eventually getting solved, but it'll take a long time, and if these vehicles are going to be successful it's going to be because people want to use them despite these behaviors. It's possible they will, but it's no sure thing.
Even once those problems are solved, the vehicles are still not going to drive like people. They can't drive like people, because they won't get the same charity of interpretation that people do when they drive. A (horrifying, sorry) tweet going around makes this point pretty well.
"Officers said the teenage driver wasn't reckless, but inexperienced." The vehicle blew through a crosswalk and came within centimeters of murdering a small child, and the law enforcement response was, fundamentally, empathy for the driver. This is awful, of course, and rightly getting called out as such. Imagine, though, that the vehicle was a Waymo autonomous vehicle, or a Tesla. Would law enforcement say that the robot wasn't reckless, but inexperienced? They certainly would not. Nobody is going to make allowances for driving software because it is still learning, or it just didn't see the car coming. Nobody is ever going to think "that could happen to anybody" in response to dangerous, illegal, or even just annoying behavior by an autonomous car. There isn't a driver to empathize with.
That tweet is an extreme example. Hopefully most people agree that empathy for the driver in that situation, no matter how young and inexperienced, is misplaced at best. If you pay attention to what happens on the road, though, that kind of excuse-making and justificatory perspective-taking happens constantly. Somebody stops past the line? Well, it's just that one jerk. Somebody rolls through a stop sign? Honestly, we all do it. Somebody goes a little bit over the speed limit? Everybody goes a bit over the speed limit.
All of the things I mentioned above are illegal. Very, very few of them ever result in traffic enforcement. The story of traffic laws is fundamentally one of selective enforcement. This has a great number of bad effects, as laid out by Sarah Seo in her book Policing the Open Road. But it's really how the system works. Commuting patterns, driving conventions, travel times, road layouts: all of these emerged from negotiations between what people ought to do, or are instructed to do, or are imagined to do, and what they actually do. Driving is a process of coming to a shared understanding, based on recognizing each other's humanity.
Because of that absence of empathy, the story is totally different for autonomous cars. They will be expected to strictly obey the law and behave considerately in all situations, and if they don't, the companies that develop them will be blamed. Again, everybody in the industry knows this. Every company except Tesla has adopted the strategy of being as conservative as possible and avoiding problems. Tesla's strategy is to emphasize the responsibility of the human driver when something bad happens. That's a reprehensible strategy, and it's not going to work long-term. But even Tesla understands that there is simply never going to be any such thing as charitably interpreting the problematic behavior of driving software. If you want fully autonomous vehicles, they have to be strictly law-abiding. The laws are there to prevent dangerous behavior, and intentionally programming a robot to behave dangerously around humans? That has rather famously been thought of as a pretty big no-no.
There was a recent news story about a delivery robot that drove through a crime scene. There was a huge uproar until the company explained that the robot was actually being teleoperated by a human, who made a mistake. That, people could understand. Humans are fallible, but surely they'll figure it out.
The thing is, we don't really have much, if any, experience of what it's like to ride in a completely solicitous and law-abiding vehicle. What will it be like to travel through a busy urban area in a robotaxi that is appropriately conservative in its driving behavior and strictly law-abiding at all times? What will it be like to ride in a vehicle that never exceeds the speed limit, that comes to a complete stop at every stop sign, that always keeps an appropriate following distance, that always yields when there is any chance of a pedestrian entering its path?
All we can do at this point is speculate. I would bet on it being, to a first approximation, infuriating. Travel times will be much longer. Human-driven vehicles will take constant advantage of your vehicle. Slower road users like pedestrians and cyclists will effortlessly be able to claim priority over the autonomous vehicle. Riding in a robotaxi in an urban center will be maddening.
What's funny about this is that for people who aren’t traveling by car, people on foot or scooter or bicycle, the autonomous vehicle's behavior will be great! A car that is obliged to follow strict safety rules, that can never be aggressive or unpredictable, that will always give way? Speaking as a cyclist, I am into this. That's the kind of car I'd like to ride my bike around.
So in a way, robotaxi developers are revealing something fundamental about the way cars work (or, more to the point, don't work) as urban transportation. Autonomous cars are stuck in a sort of no-man's land, unable to resolve the ineradicable conflict between usefulness and safety. A car that is as safe as we believe cars should be is far less convenient as urban transportation. This conflict is inherent to the automobile, but it gets obscured by our focus on individual driving behavior. Autonomous vehicles eliminate the question of individual driving behavior entirely, and what remains is the real problem: not drivers, but cars.