Here's the line to remember: "As robots become mainstream, lawmakers will have to grapple with how to govern machines and hold software accountable." That comes from a New York Times piece on what kind of legislation will be needed to deal with the inevitable accidents that autonomous vehicles will get into. The lawyers, naturally, will go after everyone with money, but who do the authorities charge when a self-driving car parks itself in a no-parking zone, and who will the jury hold responsible when it rolls the wrong way down a one-way street and, heaven forbid, injures or kills someone?

The academic world has already begun devoting white papers and test tracks to the subject, but it will be up to our elected leaders to start filling in some legally binding answers, hopefully before the hard questions get asked in court. The earliest projections figure we have six years until autonomous cars drive themselves onto dealership lots, but once they show up, healthy public trust could make them quite popular. Head over to the NYT to read the piece, and start thinking about who'll have to clean up the mess when you hear the phrase, "You have 20 seconds to comply..."


      rbnhd1144
      • 7 Months Ago
      Driverless cars mean Big Biz will be the responsible party, and we all know how Big Biz dances around working the system; they have deep pockets and can afford to make a lawsuit go on for years. I'd like to see the CEOs of these companies behind BARS, but that will never happen. When Big Biz totally runs this world, we the people will suffer. I've never sued a person in my life, but I would go after the vehicle owner first; nobody is forcing anyone to buy these vehicles, so that's where I'd start.
      Larry Litmanen
      • 7 Months Ago
      Well, who is responsible when software on a plane malfunctions?
        Joeviocoe
        • 7 Months Ago
        @Larry Litmanen
        Surprisingly, it is not the aircraft maker but the airline, which hires the pilots and maintenance crews who ensure the aircraft is good to fly. The only way the aircraft maker (Boeing, Airbus, etc.) is liable is if it can be proven that ALL aircraft of that model and software revision have the same "reproducible" problem. If it is an isolated incident or a "one-time glitch," human error is usually to blame. When it cannot be proven either way, no fault is assigned, but the airline (not the aircraft maker) still pays; after all, passengers pay the airline to take full responsibility, regardless of fault. In passenger cars, the insurance company will always be the liable party. Whether the insurance company will then raise premiums or try to sue the automaker, that is a different story we are still trying to figure out.
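
Joeviocoe's comment amounts to a small decision procedure, so here is a minimal Python sketch of it. The function name and the three-way encoding of the evidence are illustrative assumptions of this sketch, not anything drawn from actual aviation law.

```python
from typing import Optional

def liable_party(reproducible_across_fleet: Optional[bool]) -> str:
    """Who pays after an aviation software malfunction, per the rule above.

    True  = ALL aircraft of that model and software revision show the problem
    False = isolated incident or one-time glitch
    None  = cannot be proven either way
    """
    if reproducible_across_fleet is True:
        return "aircraft maker"                   # fleet-wide reproducible defect
    if reproducible_across_fleet is False:
        return "airline (human error)"            # isolated glitch
    return "airline pays, no fault assigned"      # unprovable either way

print(liable_party(None))  # -> airline pays, no fault assigned
```
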
      NamorF-Pro
      • 7 Months Ago
      Autonomous cars are hopeless. They're too dangerous and stupid. It's impossible for them to live around people who tend to change their minds every second. Crashes will increase and chaos will break out, resulting in an Auto-Apocalypse. Lol. Seriously, though...
      SumYunGie
      • 7 Months Ago
      I understand why some here are pessimistic towards autonomous cars, but I personally am excited. Just think:
      - they don't text and drive
      - they don't get worse as they age
      - they aren't susceptible to road rage
      The list goes on...
      WindsWilling
      • 7 Months Ago
      What happens? You go after the person dumb/lazy enough to buy one of these cars, that's what. The only people who should be allowed to purchase these devices are those who legitimately need them, like the physically challenged. In light of all these recalls, you now have vehicles that could easily suffer not only a mechanical failure but an electronic one too, which can and will cause death and injury.
        jmallx
        • 7 Months Ago
        @WindsWilling
        What about people like me who used to work 16-hour shifts three or more days in a row? Am I dumb/lazy for wanting to get home safely? Perhaps it's stupid of me to KNOW that I shouldn't be driving yet have to drive to get home so that I can rest.
      11fiveoh
      • 7 Months Ago
      Yes, everyone get cars that drive themselves. I'm sure I'd take my R6 out more if I wasn't constantly dodging soccer moms in Caravans.
      Brian P
      • 7 Months Ago
      Simple: the autonomous cars will not break the law.

      They will steadfastly stop at the stale, about-to-turn-red yellow signal despite the presence of an 18-wheeler directly behind that is not capable of stopping in time (but the collision and the deaths of all the occupants of the autonomous car will be the 18-wheeler's fault for following too closely / running the red). They will roll down the road at precisely the speed limit of 100 kilometers per hour, despite all the other traffic treating the autonomous vehicle like a moving chicane, and the resulting mayhem behind and around it. In our world of under-posted speed limits, they will be the worst slowpokes around. Where the law requires them not to enter an intersection to turn left until oncoming traffic is completely clear, they will sit at the front of the line, patiently waiting for an opening in the oncoming traffic for this cycle of the lights ... and the next ... and the next ... ad infinitum, blithely obeying the law while tying up the traffic behind them and causing no end of gridlock.

      ... until the driver of said autonomous vehicle gets so frustrated at not being able to get anywhere that they press "OFF", break the law a little bit (like *everyone* does), and actually get somewhere by driving the car themselves. (A sketch of that stale-yellow stop-or-go decision follows this thread.)
        Jarda
        • 7 Months Ago
        @Brian P
        ...because detecting a closing object from behind is something today's technology cannot possibly handle
        Bill
        • 7 Months Ago
        @Brian P
        I guess your scenario will be true, but how long it lasts will depend on how fast autonomous cars are accepted by the mainstream... or perhaps the owner could choose some kind of "aggressive" driving program and accept the legal responsibility...
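
Brian P's stale-yellow scenario and Jarda's rear-detection retort boil down to one decision: stop or proceed, given what the car can see ahead and behind. Below is a minimal Python sketch of such a rule; every field, constant, and threshold is an illustrative assumption, not any automaker's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Situation:
    speed: float               # own speed, m/s
    dist_to_stop_line: float   # distance to the stop line, m
    yellow_time_left: float    # estimated time until red, s
    rear_gap: float            # gap to the vehicle behind, m
    rear_closing_speed: float  # m/s; positive means the vehicle behind is closing

COMFORT_DECEL = 3.0  # m/s^2, comfortable braking (assumed)
MAX_DECEL = 6.0      # m/s^2, hard braking (assumed)
REAR_MARGIN = 1.0    # s, extra margin demanded of the trailing vehicle (assumed)

def decide(s: Situation) -> str:
    """Return 'stop' or 'proceed' at a stale yellow light."""
    # Kinematics: stopping from speed v at deceleration a takes v^2 / (2a) meters.
    can_stop_hard = s.dist_to_stop_line >= s.speed ** 2 / (2 * MAX_DECEL)
    # Can we legally clear the stop line before the light turns red?
    can_clear = s.dist_to_stop_line <= s.speed * s.yellow_time_left
    # Jarda's point: rear radar can see a closing vehicle. Compare the time
    # until it reaches us with the time we would spend braking to a stop.
    time_to_impact = (s.rear_gap / s.rear_closing_speed
                      if s.rear_closing_speed > 0 else float("inf"))
    rear_is_safe = time_to_impact > s.speed / COMFORT_DECEL + REAR_MARGIN

    if not can_stop_hard:
        return "proceed"  # dilemma zone: stopping is physically impossible
    if can_clear and not rear_is_safe:
        return "proceed"  # legal to clear, and braking invites a rear-end collision
    return "stop"

# Brian P's 18-wheeler scenario: ~100 km/h, light about to turn red, truck closing.
print(decide(Situation(speed=27.8, dist_to_stop_line=30.0, yellow_time_left=1.2,
                       rear_gap=15.0, rear_closing_speed=6.0)))  # -> proceed
```
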
      EVnerdGene
      • 7 Months Ago
      It will be fun freaking out autonomous cars. It will be the new cool thing to do when you're bored: drive around with a bunch of autonomous lane bandits, slowly pass one, and make a quick steering input like you're going to hit its front fender. Laugh as the dozing passenger's chin hits their chest.
      Bernard
      • 7 Months Ago
      Simple solution: remote monitoring and a law-enforcement remote-control override. This would only apply to cars in auto mode; if the driver decides to take over, then the monitoring and override are disabled. Next! :-D
      danfred311
      • 7 Months Ago
      Simple enough. If it's a freak accident, no problem; if it's a systemic problem with that particular implementation, then that model is no longer legal for use on the roads until it's improved. Assuming the user didn't misuse it, it's a relatively simple case of product defect / product recall, and only in the case of negligence is the manufacturer really liable. The car won't 'break the law' because it feels like it; it's not like it's subject to breaking bad. Or braking bad, as it were :) Google's self-driving cars have done over a million km without being at fault, with only one incident, someone else rear-ending a Google car. Their accident rate could turn out to be orders of magnitude lower than human drivers'. No small margin. It's a non-issue.
      Kevin Potts
      • 7 Months Ago
      Do elevators and escalators break the law when they don't work?
        SpacemanSpiff
        • 7 Months Ago
        @Kevin Potts
        "Escalators never break, they just temporarily become stairs." -Mitch Hedberg.
      John Hughan
      • 7 Months Ago
      Parking and moving violations are a bit trickier, but I've been asking myself for a while how collisions involving autonomous vehicles will be handled from an insurance standpoint. If an autonomous and a "conventional" car collide, fault and who pays can be established pretty easily. But if the accident involves only autonomous vehicles, the only reasonable solution I can think of would be for owners of autonomous cars to carry no-fault insurance: their premiums would go into a bucket that would be used to pay for such collisions (a toy model of that bucket follows this thread). Otherwise, if owners could be held liable despite not driving, that defeats much of the point of an autonomous car, and if the car manufacturers or software developers could be held liable for every mishap, they'd never be willing to bring an autonomous car to market.
        11fiveoh
        • 7 Months Ago
        @John Hughan
        Most likely they'll have some sort of 'black box' that'll better show which car's program had a fault.
        rbnhd1144
        • 7 Months Ago
        @John Hughan
        You bring up a good point: who will want to insure these cars? Maybe Google will be willing to take that risk.
        Jim R
        • 7 Months Ago
        @John Hughan
        They're coming one way or another, and it's a problem they're going to have to work out sooner rather than later. I think no-fault insurance is probably the only logical way to go.
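
To make the mechanics of John Hughan's proposal concrete, here is a toy Python model of the no-fault bucket: every autonomous-car owner pays a premium into a shared pool, and AV-vs-AV claims are paid from the pool regardless of fault. The class name and all figures are invented for illustration.

```python
class NoFaultPool:
    """Toy model of a shared no-fault insurance pool for autonomous cars."""

    def __init__(self, annual_premium: float):
        self.annual_premium = annual_premium
        self.balance = 0.0

    def collect(self, num_policyholders: int) -> None:
        """Everyone pays in, since no individual driver is 'at fault'."""
        self.balance += num_policyholders * self.annual_premium

    def pay_claim(self, damage: float) -> bool:
        """Pay an AV-vs-AV claim from the pool if funds allow."""
        if damage <= self.balance:
            self.balance -= damage
            return True
        return False  # pool underfunded; premiums would have to rise

pool = NoFaultPool(annual_premium=400.0)
pool.collect(num_policyholders=10_000)         # $4M collected
print(pool.pay_claim(25_000.0), pool.balance)  # -> True 3975000.0
```
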