Autonomous cars are piloting their way into the wide philosophical sea of ethics. Right now, autonomous cars are unaware of this because the driver's will always comes first. But when we start getting cars that can overrule commands or choose a particular ethical outcome, either without or in spite of driver input, we'll have a lot of decisions to make. Which means we have a lot of decisions to start considering right now.

Patrick Lin considers some of them in a piece in Wired, starting with the trolley problem: whether a person in control of a runaway trolley should let it kill five people tied to the track, or should pull a lever so that only one person on another track is killed. From there, he wonders about the possibility of fixed ethics settings created by manufacturers versus user-adjustable ethics settings that, for example, allow a driver to prioritize his own safety over others, or prioritize the safety of children over that of the elderly.
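
To make the idea concrete, a user-adjustable ethics setting could be as mundane as a set of weights the car's planner consults when scoring possible maneuvers. The sketch below is purely hypothetical: the profile names, weights, and risk numbers are invented for illustration and aren't drawn from Lin's piece or from any manufacturer's software.

    # A purely hypothetical "adjustable ethics setting": the driver picks weights,
    # and the planner scores candidate maneuvers against them. None of these names
    # come from any real vehicle's software.
    from dataclasses import dataclass

    @dataclass
    class EthicsProfile:
        occupant_weight: float = 1.0    # people inside the car
        pedestrian_weight: float = 1.0  # people outside the car
        property_weight: float = 0.1    # vehicles and objects

    def maneuver_cost(risks, profile):
        """Combine estimated harm probabilities (0-1) into a single score."""
        return (profile.occupant_weight * risks["occupant"]
                + profile.pedestrian_weight * risks["pedestrian"]
                + profile.property_weight * risks["property"])

    # "Protect me first" versus a default profile, facing the same two options.
    default = EthicsProfile()
    selfish = EthicsProfile(occupant_weight=5.0)
    swerve = {"occupant": 0.3, "pedestrian": 0.0, "property": 0.8}
    brake = {"occupant": 0.1, "pedestrian": 0.4, "property": 0.1}

    for name, profile in (("default", default), ("selfish", selfish)):
        options = (("swerve", swerve), ("brake", brake))
        choice = min(options, key=lambda opt: maneuver_cost(opt[1], profile))
        print(name, "profile picks:", choice[0])

The same pair of options comes out differently depending on the weights, which is exactly the kind of knob Lin is asking whether drivers should be allowed to turn.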

Lin admits the examples are deliberately outrageous in order to stress the point. Still, it's worth a read, because we already have cars that can make driving decisions, and it might not be long before a "Five-Mode Adjustable Prime Directive" shows up on the options sheet. Head over to Wired to read the full piece.


  • 18 Comments
      Jason Krumvieda
      • 4 Months Ago

      Will one of them be "BMW Dick Mode"?

        19nomad56
        • 4 Months Ago
        @Jason Krumvieda

        I had to upvote you, and I'm on my second bimmer. Yes, most of my fellow BMW drivers are dicks.

          Peter_G
          • 4 Months Ago
          @19nomad56

          So does that make Audi and Mercedes drivers @ssholes and Pu$$ies? (Bonus points if you get that movie reference)

      Grendal
      • 4 Months Ago

      Isaac Asimov came up with the 3 laws of robotics 75 years ago.  They would apply here.

      Greenman Wood
      • 4 Months Ago

      They shouldn't exist, period. 

      IfIWereObama
      • 4 Months Ago

      What about settings for liberals..since most don't have any ethics?

      They'll value animal life over human & demand special settings for blacks & Muslims & exemptions for union made cars.

      Winnie Jenkems
      • 4 Months Ago

      jeez... that Leaf sure looks like a doofus

      danfred311
      • 4 Months Ago

      Non-issue. Self-driving cars will have relatively simple rules and will make no precise decisions about complex ethical dilemmas, and will thus be exonerated by their 'ignorance' should such a super-rare situation occur. It's the shit-happens school of legal thought.

      Only later will AI potentially be put in charge of complex decision making, but likely never in cars.

      DaveMart
      • 4 Months Ago

      'Google’s driverless cars have been given permission to break the speed limit by 10mph admits the head of the search company’s autonomous car project – but only by software engineers, not the police.

      Dmitri Dolgov, the project's lead software engineer, told Reuters during a recent test drive that it would be safer for Google's cars to keep up with traffic when it was slightly exceeding the speed limit than to rigidly stick to it and cause an obstruction.

      But this ability to speed has been restricted to 10mph over the speed limit, he said'

      http://www.telegraph.co.uk/technology/google/11043546/Googles-driverless-cars-will-be-allowed-to-speed.html

      I'm going to teach mine how to give the finger to other drivers....



        white_blur47
        • 4 Months Ago
        @DaveMart

        It sounds to me like the speed limits in those areas are too low and should be adjusted.

      Bernard
      • 4 Months Ago

      For philosophers, ethics are always so complex, with so many competing theories, when the truth of the matter is far simpler. When you choose based entirely upon love for others, there is no conundrum.

      The autonomous vehicle should never deliberately sacrifice anyone. Its primary responsibility is to the driver, and what happens outside of the vehicle is beyond the vehicle's control.

      Jmaister
      • 4 Months Ago

      AFTERmarket will go after that. Either 3rd party or "hacked"

      Master Austin
      • 4 Months Ago

      I would want to make sure that I can regain control of the car if I'm having a road rage episode; as long as I can do that, I'm fine.

      DarylMc
      • 4 Months Ago

      I'm not sure we will see autonomous cars for quite a while.

      But if we do, I am quite sure that it will be the drivers, not the manufacturers or programmers, who will bear the ultimate responsibility for the system's operation.

      Just as pilots of aircraft remain responsible even with autopilot engaged, I can't see a time when a human operator won't be held accountable.

      Trains still have drivers who are responsible for their operation and I imagine it would be a lot easier to automate them than cars on the roads.

      Krazeecain
      • 4 Months Ago *Edited*

      I don't think autonomous cars will have this capability anytime soon, so we've got plenty of time to debate this. Right now they could probably barely discern humans from garbage cans lol.
