Sebastian Thrun gave a TED talk recently about why the driverless cars that came out of DARPA’s Grand Challenge and are being developed at Google aren’t just a nice feature like cruise control, but rather a necessity for society. The vast majority of car accidents are caused by human error, and they are the leading cause of death for young people. Robotic cars could vastly improve highway capacity and eliminate traffic jams. They would allow people to reclaim the average hour spent on commuting and do something useful with it. And the various forms of distracted driving — sleepy driving, drunk driving, texting while driving, shaving while driving — would be eliminated.
The technology for driverless cars is here: Thrun’s cars have successfully driven over 140,000 miles on all kinds of roads and conditions, from highways to dense city streets, including California’s Highway 1 and the “crookedest street in the world,” San Francisco’s Lombard Street. The EU is also developing a similar concept, the “road train,” made up of driverless cars. Right now all of this is at the prototype stage, but it could probably reach the market within a decade.
The big issue isn’t the technology, but rather the legality. If you get in a car accident now, either you or the other driver is likely responsible. But if two driverless cars are involved in an accident, where does the blame go? To you? To the other car? To Google? And what if someone gets injured or killed? Would Google get sued for millions each time? No car manufacturer would take on that liability.
The truth is, as a society, we’d rather deal with the 10 million car accidents per year and the 39,000 deaths they cause because at least humans were responsible. To put that into perspective, it would be as if everyone in New York City got in a car accident every year, and everyone in Greenwich Village died as a result. We don’t want machines killing us, even if the total number would be much, much lower. We’ve already had a sneak peek of the coming backlash, via the scandal surrounding “unintended acceleration” in Toyota cars. For over a year, the cars were blamed for a couple dozen deaths, until most of these turned out to be caused by driver error. It makes you wonder how many at-fault drivers, having heard of the acceleration problems, decided to blame their accident on the car. After all, that instinct is only human.
But we do want driverless cars after all, so what will probably happen is that cars will gradually become more and more “automatic,” until we get used to the idea and the legal ramifications get sorted out. Cruise control has been around for a couple of decades, and self-parking cars for a couple of years. “Lane-departure warning systems,” which detect your car drifting out of its lane, are becoming more common, as are collision warning systems, which predict that you’re about to crash.
In a few years, we might see some enhanced version of cruise control that actually controls the car during cruise conditions on the highway. Then during traffic, and finally in the city. All requiring an alert driver, of course, to shoulder the blame if anything goes wrong.