Unsafe at any speed

Posted by Mike Walsh

9/20/16 11:25 AM



Statistics are evil things. I learned a new one today: on average, one person dies for every 94 million vehicle miles traveled in the United States.


Presumably, that means we should still feel statistically safe in an autonomously operated Tesla, given that its vehicles have so far covered 130 million miles in Autopilot mode with just one recently reported fatality. Of course, that misses the real point: what is the appropriate relationship between human and non-human driving technologies?
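Just as a rough sanity check on those two figures, here is a back-of-the-envelope comparison of the implied fatality rates. It uses nothing beyond the numbers quoted above, so treat it as a sketch rather than a serious analysis.

```python
# Back-of-the-envelope comparison, using only the two figures quoted above.
us_average_rate = 1 / 94_000_000    # roughly one death per 94 million vehicle miles (US average)
autopilot_rate = 1 / 130_000_000    # one reported death in about 130 million Autopilot miles

print(f"US average: {us_average_rate:.2e} deaths per mile")
print(f"Autopilot:  {autopilot_rate:.2e} deaths per mile")
print(f"Autopilot rate is {autopilot_rate / us_average_rate:.0%} of the US average")
```

With only a single fatality in the Autopilot sample, the comparison is of course very rough; the raw averages look reassuring either way.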


The specifics of the accident are tragic. According to Tesla's statement, neither Autopilot nor the driver noticed the white side of a tractor-trailer against a brightly lit sky, so the brake was not applied as the car approached at speed. As David Silver has pointed out in a recent article, that explanation implicates the computer vision software, which may have had sufficient sensor data to recognize the truck but appears to have failed to process that data correctly.


In my view, the accident highlights two of the main problems with self-driving cars. The first is whether drivers know how to respond when control is handed back to them. Most of today's self-driving technologies, Tesla's included, are known as Level 3 autonomous systems, which means the driver has to stay alert in order to regain control in challenging situations. Level 4 systems, on the other hand, are those that never require humans to intervene.


What makes a Level 3 system dangerous is not the car’s software, but the vulnerability of relying on a human to judge when they can relax and when they need to pay attention. Watching Harry Potter behind the wheel is almost certainly a bad idea, but what about merging traffic, a freeway exit or a busy intersection? People simply don’t have a baseline understanding of how they should behave around AI-operated cars, and so they become dangerous and unpredictable.


Even if your car is Level 4 and can drive itself without your input, you still have another problem: other drivers.


My wife and I travel a lot. When we are in my hometown, Sydney, I always drive. When we are in hers, Istanbul, she does. The reason is simple: no one obeys any road rules in Turkey, or at least none that I can understand. The minute I start driving carefully and politely, I will either prompt an accident, an explosion of road rage or an international incident. The point being: how you drive is only half the story; it is the context of the entire traffic ecosystem that determines safety.


Self-driving cars will only really be safe when every single car on the road is autonomous and managed holistically, like packets in a distributed network. At that point, when driving tests are obsolete and most cars lack steering wheels, what we will need are not traffic cops but technology ethicists.


Accidents in an autonomous future will mainly be the result of deliberate tradeoffs made by algorithms. Should a car swerve to avoid a group of schoolchildren, even if that means crashing and killing its own driver? Philosophers call these brain twisters ‘Trolley Problems’. Soon they are likely to be the subject not just of academic debate, but of litigation and regulation.


Statistically, we will be safer. But how well will we sleep at night knowing that we are consciously allowing a certain number of people to die every year as the result of a predictable, rather than merely possible, scenario?


In the meantime, maybe keep a little extra distance from technologically sophisticated vehicles as you drive home tonight. People used to joke that Volvo owners were bad drivers; their faith in their car’s legendary safety gave them false confidence on the road. I wonder if the same will soon be said of Tesla drivers.

Topics: Technology

