Worlds & Time

Saturday, December 01, 2012

Morality and Autonomous Cars

Via The Daily Dish but originally by . . . Nick Carr?  Really?

So you’re happily tweeting away as your Google self-driving car crosses a bridge, its speed precisely synced to the 50 m.p.h. limit. A group of frisky schoolchildren is also heading across the bridge, on the pedestrian walkway. Suddenly, there’s a tussle, and three of the kids are pushed into the road, right in your vehicle’s path. Your self-driving car has a fraction of a second to make a choice: Either it swerves off the bridge, possibly killing you, or it runs over the children. What does the Google algorithm tell it to do?

Seems easy enough to me.  It tries to brake, runs over the children, and does everything possible to record everything about the situation to protected memory that can only be accessed by law enforcement.  The car is going to have to be programmed to do its best with the variables it can control while protecting the occupant riding in it.  The kids in the road are an aberration, something the car can't control and couldn't have expected, so it should have no responsibility to protect something that steps in front of it while it's traveling at speed.

But it's also going to have to prove that after the fact, with a higher bar of proof than a human would face in the same situation.  The car is going to have to show what it saw (the raw recordings), what it thought it saw (the analysis), and the decisions it made about what to do.
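As a rough illustration of those three layers (all names and structure hypothetical, not anything Google has described), a tamper-evident event recorder might chain each entry to a hash of the previous one, so the log can't be quietly edited after the fact:

```python
import hashlib
import json
import time

# Hypothetical "black box" recorder: each entry stores a pointer to the raw
# sensor recording, the perception result, and the decision taken, chained
# to the previous entry's hash so after-the-fact tampering is detectable.
class EventRecorder:
    def __init__(self):
        self._entries = []
        self._prev_hash = "0" * 64

    def log(self, raw_frame_id, perception, decision):
        entry = {
            "timestamp": time.time(),
            "raw_frame_id": raw_frame_id,   # what it saw (raw recordings)
            "perception": perception,       # what it thought it saw (analysis)
            "decision": decision,           # what it decided to do, and why
            "prev_hash": self._prev_hash,
        }
        self._prev_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self._entries.append(entry)

recorder = EventRecorder()
recorder.log(
    raw_frame_id="sensor frame 48211",
    perception={"objects": [{"type": "unknown", "confidence": 0.97,
                             "in_path": True}]},
    decision={"action": "max_brake",
              "reason": "obstacle in lane, no clear escape path"},
)
```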

Just to point out, it's highly unlikely that self-driving cars are going to be able to distinguish between a single dog and a group of three children in the road.  Whatever instructions the car has are going to have to account for the fact that it won't be able to assign a value to the object in the road.  If your car regularly drives into ditches (or off bridges) for dogs and deer, people are going to be far more upset than with the unfortunate situation described above, because they'll be injured at a much higher rate by cars dodging wild animals than by the very rare occurrence of children being hurt in unavoidable situations.
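To make that concrete, here's a hedged sketch of the kind of emergency logic I'd expect (purely hypothetical, not Google's actual algorithm): the car keys off "solid obstacle in my path" and the safety of the available maneuvers, never off a guess about what the obstacle is.

```python
# Hypothetical emergency logic: react to "obstacle in path" and the safety
# of each available maneuver, without classifying the obstacle. The car
# never trades the occupant's safety against a guess at the object's value.
def choose_emergency_action(obstacles, escape_paths):
    in_path = [o for o in obstacles if o["in_path"] and o["confidence"] > 0.5]
    if not in_path:
        return "continue"
    # Swerve only onto a path the sensors confirm is clear and survivable;
    # "off the bridge" never qualifies.
    for path in escape_paths:
        if path["clear"] and path["survivable_for_occupant"]:
            return f"swerve:{path['name']}"
    return "max_brake"  # brake hard in-lane and record everything

print(choose_emergency_action(
    obstacles=[{"in_path": True, "confidence": 0.97}],
    escape_paths=[{"name": "shoulder", "clear": False,
                   "survivable_for_occupant": True}],
))  # -> "max_brake"
```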

And another point: it's unlikely this situation would result in criminal charges for a human driver, especially if the kids were pushed into the road.  The legal culpability lies with the person doing the pushing.

All of this speculation ignores one important point: the car isn't going to have to make the moral judgement.  The law is going to have to make these decisions, hopefully up front, and Google and the car companies are going to have to program their cars accordingly.
