Google shows the Google driverless car navigating along a street in Mountain View, Calif. (AP Photo/Google)
Even in a future of self-driving cars, some accidents may be unavoidable. That raises an ethical question: how should a self-driving car be programmed to act when faced with an inevitable crash?

Self-driving cars have mostly stuck to highways and freeways, where complicated situations are at a minimum. Google's self-driving car has traveled 700,000 miles accident-free while maneuvering down California's freeways. Freeway driving is easier: there are fewer unpredictable events for the on-board computer to face. But with Google's announcement last month that its self-driving car was logging miles on city streets, complicated questions emerged.

WIRED magazine asked how an autonomous car should be programmed to behave in a no-win scenario. That is, when a crash is unavoidable, how will the car decide what to hit? All of the programming options have their own drawbacks, and there are more questions than answers at this point. And as with all ethical problems, how the answer is reasoned will be just as important as the answer itself.

Read more about the ethical dilemmas of self-driving cars here.
