I've got a new op-ed out today at National Review. I suspect those of you with more technical chops (and legal chops) could help add texture to my perspective and I welcome your feedback. If you like the piece, I'd appreciate any shares you can offer.
Self-Driving Cars Will Save Many Lives | National Review
The black Mercedes-Benz sped south on L.A.’s La Brea Avenue at, allegedly, more than 90 miles per hour. Despite heavy cross traffic and a clear red light in its path, nothing in its operating system prompted the vehicle to slow, let alone stop, as it approached the intersection with Slauson Avenue. At 1:40 p.m. the Mercedes struck Asherey Ryan’s westbound car with such force that it was plowed 50 feet from the spot of the collision, instantaneously bursting into flames. Ryan; her boyfriend, Reynold Lester; her one-year-old son, Alonzo; and her unborn child were all killed. Two other people, traveling in a different vehicle, were killed as well. Eight more were injured. The driver of the Mercedes is, the Los Angeles Times reported, now facing six counts of murder and five counts of gross vehicular manslaughter.
As we consider a future full of self-driving cars, we ought to remember tragedies like the August 4 collision at La Brea and Slauson. The operating system responsible for it was not an autonomous program devised by Mercedes computer scientists. It was the human being sitting in the driver’s seat.
As happens around 100 times each day in America, a car’s human driver had caused a fatal collision. Due to distraction, inebriation, incapacitation, malice, or just plain incompetence, human drivers caused collisions resulting in the deaths of more than 40,000 people in the United States last year — the highest number of roadway fatalities recorded in more than a decade, according to National Highway Traffic Safety Administration (NHTSA) statistics. With 2021’s troubling uptick, the cumulative number of Americans killed by motor vehicles has climbed to well over 4 million since we first put cars on our roads over a century ago. In 94 percent of roadway collisions, the NHTSA says, drivers are to blame.
Self-driving cars offer hope that this seemingly endless series of tragedies might come to an end (or something close to it). Where human drivers by nature succumb to physical and psychological weakness, self-driving cars are immune. Instead of human perception, reaction, and will, the cars coming soon will be controlled by sensors and software that far outstrip human levels of focus and decision-making. According to Mercatus Center estimates, these systems could prevent millions of car crashes annually and reduce the cost of insurance premiums by 90 percent, saving Americans a total of $350 billion each trip around the sun.
Automation, importantly, will not come all at once. Rather, a gradual layering process will first aid, and only later replace, our driving skills. Encouragingly, features such as adaptive cruise control and lane-centering are already being added to standard vehicles to make them safer. Intelligent, connected cars are beginning to benefit us already.
While vehicular fatalities in the U.S. are down from the bad old days, when roadway deaths per capita were twice as common, developing countries such as Nigeria and Brazil still suffer vehicular-fatality rates comparable to the U.S. nightmare of 100 years ago.
Despite this, an anti-autonomous-car movement seeks to block the technological path to a safer future. Ralph Nader, for instance, calls Tesla’s continued development of its incremental autonomous technology “one of the most dangerous and irresponsible actions by a car company in decades.” Really? While Tesla is advancing toward fully autonomous vehicles, car companies such as Ford and GM are supplying the market with larger, heavier pickup trucks each year — vehicles proven to increase the likelihood of collisions and to cause greater bodily harm to people on foot or in other vehicles in the event that collisions do occur.
Tesla, of course, is not the only car company that wants to go autonomous. GM, to its credit, backs Cruise, the company now deploying autonomous cars for test drives in San Francisco and autonomous light buses in Abu Dhabi. Waymo, Google’s self-driving car project, is rolling out as well.
Amid a general trend toward improved roadway safety, autonomous systems will cause some collisions, and probably always will: The flip side of human fallibility is not AI’s infallibility. Many will remember the terrible 2018 death of Elaine Herzberg, the first person on foot believed to have been killed by a self-driving car. A recent viral video showing a Tesla failing to detect a mannequin on a test track has renewed these fears. To be sure, all bodily harm caused by vehicles is of grave concern, but the advanced computational applications that govern autonomous vehicles will steadily improve. As they do so, they will accompany and aid human drivers long before replacing them.
Rather than put the brakes on autonomous driving, we need to continually assess and update our legal and regulatory structures to reflect technological changes. This will necessarily involve a complex mix of agency rule-making, private-sector guidelines, and revisions to liability standards. It will never be perfect, but it will be better than arresting progress in favor of our deadly norms.
Nader closes his letter by stating that no one should be above the laws of manslaughter, worrying that Tesla will not be held accountable if its systems cause harm. The truth is that, all too often, human drivers are not held accountable, either.
On August 9, a jury in New Hampshire delivered a not-guilty verdict following a twelve-day manslaughter trial pertaining to a 2019 rural highway collision that killed seven motorcycle-club members. The jury acquitted the defendant despite a National Transportation Safety Board report concluding that the probable cause of the collision was the driver “crossing the centerline and encroaching into the oncoming lane of travel, which occurred because of his impairment from use of multiple drugs.”
The crux of the matter, thus, may not be legal, but rather cultural resistance to change and deference to the norm of the human driver. The rancor Ralph Nader and company show towards self-driving cars masks a fear of technological advance and a paralyzing bias in favor of the status quo — a status quo that leaves tens of thousands of Americans dead in the street each year and tens of thousands more maimed for life.
While much can be done to lower the risk of driver error — for instance, better roadway designs that disincentivize speeding — human beings will never overcome their weaknesses: inattention, impatience, anger, bravado, and sheer stupidity, to name a few. According to the Los Angeles Times, investigators say the driver of the Mercedes mentioned at the beginning of this piece had been involved in up to 13 prior vehicular collisions. We may well be seeing, yet again, an example of our failure to reform driving behavior and hold drivers accountable. Self-driving cars will require fine-tuning over time, both of technology and of public policy, but their autonomous systems will be free from our unique human flaws and will keep us safer as a result.