Self-driving cars are like airplanes

Yesterday, for the first time, an autonomous car killed a pedestrian.  It isn't clear that the car was at fault, but we're almost certainly going to see an accident where the car is at fault at some point.  Autonomous cars haven't yet driven enough miles for us to know whether they're currently safer or more dangerous than human drivers.  But I think they have the potential to be much safer in the long run, for a combination of technical and institutional reasons.

Technically, cars can learn in the same way that humans can.  But while we humans are mostly limited to learning from the situations we personally encounter, a fleet of cars can hope to learn as a unit.  Some accident occurs, engineers analyze it, and then no car in that fleet will make that particular mistake again.  It's reasonable to imagine robo-cars trained on a body of experience far greater than any human could amass in a lifetime.

And I think that robotic car manufacturers have the right incentives to care about this too, for the same reason that airlines do.  People are just less inclined to trust someone else with their safety than they are their own abilities.  This is true when people choose to drive long distances rather than fly because they're afraid of an airplane disaster.  It's even true when they know that people like them are more likely to kill themselves driving than a professional pilot is to kill them flying.  It's sort of irrational, but it's a fact.  And I'm pretty sure the same principle applies to autonomous cars.

For this reason airlines are scared of airplane crashes in a way most other industries aren't.  When planes crashed into the World Trade Center on 9/11, lots of people became afraid to fly and started driving instead.  This caused 1600 extra car deaths in the year after 9/11, increasing the death toll of the attack by 50%.  Many airlines nearly went out of business.  So while any individual airline might be tempted to skimp on safety, they're all terrified that a competitor will, and then cause a crash that seriously hurts everyone.  In many industries the regulated parties use their influence to make the regulator weaker, but the airlines do their best to make the FAA stronger.

I think the same dynamic is liable to occur in the autonomous car industry too.  People are just inclined to trust their own driving over a computer's unless there's very clear evidence that the computer is better.  As long as people can drive themselves on public roads, autonomous car companies will be scared that an accident involving a competitor's cars will make people want to do just that.  So I expect that once the industry grows and stabilizes enough for a good regulatory body, it will be pretty demanding.  And I expect that the companies that get involved will be fairly safety-conscious about their autonomous cars even if they're lax about other things.

UPDATE:  The crash that prompted me to write this actually looks pretty bad for Uber, but I still think the forces involved will make autonomous cars safer in the long run.

Comments

  1. The airline analogy is appropriate for reasoning about the incentives of autonomous vehicle manufacturers. However, the two industries seem different to me in that airplanes are operated by humans but autonomous vehicles are not. This difference could lead to different legal consequences for accidents.

    Replies
    1. Possibly, but I hope not! For airplanes, the system does a very good job of blaming not the pilot who made the mistake but the system that let people down. I would hope that driverless car accidents work similarly, so that we can learn from the accidents that happen.


