Embarrassing admission: the Luddite wing of my consciousness was shocked and dismayed when I first noticed that self-driving cars were becoming a “thing.” For example, when I learned that General Motors alone is devoting $5 billion to putting self-driving taxis on the streets of San Francisco by the end of next year, I fixated on the hundreds of times hardware, software, or internet connections had crashed, frozen, or produced bizarre results on the devices I routinely use and pictured the same thing happening at sixty-five miles per hour. I feared that large numbers of people would lose their driving skills, and be unable to cope with crises when the machines failed. I resolved to spend more time indoors, away from exterior walls.
I don’t admit to being wrong that often, but I now think I was wrong. AI-controlled cars may produce thousands of deaths, millions of injuries, and billions in property damage every year. The first self-driving car death has already occurred. But human-controlled cars aren’t so good either. Last year we suffered over 40,000 traffic fatalities and 4.5 million auto-related injuries. Can machines do better than that?
The answer is yes, in time. They’re rapidly getting better, with no clear upper limit on how proficient they might become. Humans have no comparable capacity for improvement. Moreover, self-driving cars promise greater mobility for the blind, the disabled, the impaired elderly, and those of us who sometimes enjoy a drink or two.
In some of my other columns, I’ve advocated a dose of government regulation for new technologies like artificial intelligence and CRISPR. Self-driving cars need regulation too. The difference is, there’s already a massive auto safety regulatory structure in place. The industry is now busily trying to amend parts of that structure to accommodate self-driving cars. Do fully self-driving cars need rear-view mirrors or steering wheels? No, but current rules won’t allow a car on the road without them. A bill passed by the US House of Representatives last year that would make the changes sought by the industry is now stalled in the Senate.
I’m not certain I agree with every provision in that bill, but its central point is important: regulation of self-driving vehicles should occur primarily at the federal level, not at the state level. The investment necessary to realize the promise of self-driving vehicles will not materialize if there is a patchwork of fifty different sets of state regulations. The Constitution gives Congress the power to preempt state regulation of interstate commerce for a good reason, and road transportation is a fundamental component of interstate commerce.
One element lacking from the bill is a requirement for transparency. The companies developing self-driving cars are not required to make data about their testing performance public, so most do not. From the standpoint of maximizing competitive advantage, that may be understandable, but it does the public no good at all. A reasonable tradeoff for the right to modify safety rules that make no sense for self-driving cars would be a requirement to be completely forthcoming in publicizing testing results.
The US National Highway Traffic Safety Administration does have a program for voluntary reporting of test results, but response from the industry has been underwhelming. Only five firms have submitted voluntary reports so far, and Detroit News reporter Keith Laing says they “resemble slick marketing brochures instead of stringent regulatory filings.”
If the industry knew what was good for it, there would be no resistance to this. Unlike AI and CRISPR, self-driving cars will require public acceptance in order to succeed. The Brookings Institution found that only 21 percent of Americans were willing to ride in a self-driving car, largely because of doubts about the technology. The best way to win acceptance is to develop a reputation for openness and honesty the industry now sorely lacks. Rather than hiding behind a veneer of “Everything’s fine! Trust us!” the industry would be better off saying “We’re not perfect. We do have an incident every X thousand miles, a figure we’re trying hard to reduce. But since that’s better than the incident every Y thousand miles that human-driven cars cause, you should give us a try.” Consumers Union, among others, is calling for exactly this. “Companies should make public detailed documentation of their tests on private proving grounds and their advanced computer simulations,” its car safety director insists.
Mandatory test reporting is necessary, but not ultimately sufficient. There are things like rear-view mirrors we don’t need to regulate—but there are new aspects peculiar to self-driving that we do. Vulnerability to hackers? Accuracy and granularity of maps, and frequency of map updating? New problems that will require innovative regulatory approaches. The best way to get there quickly is to open up access to the manufacturers’ testing data. It’s not enough just to study the accidents—the whole panoply of information about near-misses, “backseat driver” interventions, and unnecessary stops needs to be available for public scrutiny.
There are vast unknowns about how widespread adoption of self-driving cars will affect the way we live. Some speculate that jaywalkers will become more aggressive, once they become convinced that the top priority of the ever-vigilant AIs controlling the cars is not to hit them. Some speculate that urban sprawl will worsen, once hour-plus commutes become less unpleasant. Some speculate that traffic congestion will worsen, as driverless taxis wander about waiting for a call from a customer. Some speculate that not only will truck drivers lose their jobs, but traffic cops and highway police as well. Some are trying to figure out how to make money from the free time riders in driverless cars may enjoy—Hollywood is already drooling at the prospect. The effects will be as profound as those of the switch from horses to cars a century ago. A “wild west” model, where the only thing that matters is the quickest and highest return on investment, is basically where we are right now, and it creates risks we shouldn’t be taking.