Self Driving Cars: To kill or not to kill?

I just noticed an interesting article on ABC: http://www.abc.net.au/news/2015-11-04/researchers-probe-mora…

The question: should your self-driving car kill you if that avoids killing multiple other people?

I personally don't think it is quite as easy as adding up numbers of people. My moral compass would include a question like "who is at fault for the accident?".

An example: someone is speeding and runs a red light, they have a passenger in their car. I am alone in my car and obey all the road rules. I would think my car should prioritise my own safety over that of someone breaking the law.

Another example: a couple of drunk people stumble out onto the road as I am about to drive past. The car can send me to certain death by swerving into a tree or it can hit the two drunk people. I'd still like my own life prioritised.

What car would you buy? One that:

Poll Options

  • 14
    Does the math and decides based on minimum overall damage
  • 7
    Has some adjustment factor to include 'at fault'
  • 18
    Always only protects the driver and own passengers
  • 10
    I would never want a self driving car!
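The three poll policies can be sketched as a scoring function. This is purely a hypothetical illustration: every name, harm score and the "at fault" discount below are invented for the sake of the example, not taken from any real self-driving system.

```python
# Hypothetical sketch of the three poll policies. All names, harm
# scores and the 'at fault' discount are invented for illustration.

def expected_harm(option, policy="utilitarian", fault_discount=0.4):
    """Score one evasive option under a given policy; lower is better."""
    if policy == "self_preservation":
        # Only the driver and own passengers count.
        return option["occupant_harm"]
    weight = 1.0
    if policy == "fault_adjusted" and option["others_at_fault"]:
        # Discount harm to parties who caused the situation.
        weight = fault_discount
    return option["occupant_harm"] + weight * option["other_harm"]

def choose(options, policy):
    """Pick the evasive option with the lowest expected harm."""
    return min(options, key=lambda o: expected_harm(o, policy))

# The thread's example: swerve into a tree (certain death for the
# driver) vs. brake straight and hit two people who caused the hazard.
swerve = {"name": "swerve", "occupant_harm": 1.0, "other_harm": 0.0,
          "others_at_fault": False}
brake = {"name": "brake", "occupant_harm": 0.0, "other_harm": 2.0,
         "others_at_fault": True}

print(choose([swerve, brake], "utilitarian")["name"])        # swerve
print(choose([swerve, brake], "fault_adjusted")["name"])     # brake
print(choose([swerve, brake], "self_preservation")["name"])  # brake
```

Note how a single tuning knob (`fault_discount`) moves the same situation from one poll answer to another, which is exactly why the question is contentious.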

Comments

  • +2

    "A self driving car" as you put it would sense the pedestrians and red light runners well before they become a hazard and adjust your speed and braking accordingly..I'm not sure about "swerving" to avoid objects as that would then put the vehicle out of control.

    • +1

      I didn't read the article, but I expect a car would do everything practical to avoid an accident, with emphasis on practical.

  • I think I saw a really interesting point about that (it might've been on that page or in a Facebook post about it). Car manufacturers would have to build a car that maximises the safety of the driver, not the pedestrians, simply because people wouldn't buy a car that doesn't prioritise the safety of the driver (would you buy a car that is more likely to kill you?).

    If you ask my opinion: where possible, let the buyers choose. Make it an option for buyers to decide whether they want the car to prioritise their own safety or to maximise the number of people saved. In accidents, some people would choose to save themselves, some would choose to sacrifice themselves for the safety of others. Let that be a choice for self-driving cars too.

    • Well that's one way for a driver to administer euthanasia. Just drive around until the car kills the driver!

      • What we are looking at are highly unlikely situations in which accidents occur, or can occur (i.e. they don't happen regularly).

        I think accidents with self-driving cars would be far fewer than with human drivers, at least once the technology matures and becomes reasonably solid (no fatigue problems, no drink driving, etc.). Of course, other non-human errors can happen; that said, I doubt they will be more frequent than the human factors in accidents, once the technology has matured and been tested over a reasonable period.

      • -1

        Odds of such a lethal accident are very slim. I think the driver/passenger is more likely to die of starvation before the car needs to kill the driver/passenger in order to minimise deaths to others in an accident. And that's assuming the car can robotically refuel/recharge itself.

        • Small chance, but millions (eventually billions?) of drivers on the road… Means that it is likely to happen at some point.

  • So if an ambulance goes past carrying a critically ill patient, would you be willing to allow your car to swerve into a tree, killing you? What if the patient is a valuable member of society and you are just a peon. :P

    • Tony Abbott's Motorcade!

      (Pretend he's still PM for context)

  • Have a big red 'manual override' button. Whenever you feel your life is in danger, smack that button and take over the controls.

    • I believe it's called the brake pedal.

      • +1

        Oh, I was hoping for a pedestrian harvester..

  • +5

    What car would you buy? One that:

    Can figure out all the crossing double white line business.

  • +3

    The real question is…

    will the self-driving car allow you to do a U-turn or cross double white lines!

    • Why would it need to? It knows where it's going.

  • Don't worry, artificial intelligence is too far off to worry about. Your hypothetical is one of those "lesser of two evils" questions.

    Currently, accident avoidance systems are uncontroversial and don't deal with lesser-of-two-evils situations.
    For example, the current scenarios are uncontentious:
    1) staying in the same lane.
    2) proximity sensor braking.
    3) anti-moose/kangaroo avoidance.
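    Item 2 above, proximity-sensor braking, boils down to a stopping-distance check: brake once the measured gap falls inside v²/(2a) plus a reaction-latency buffer. A back-of-envelope sketch, where the deceleration and latency figures are rough assumptions rather than real vehicle parameters:

```python
# Back-of-envelope sketch of proximity-sensor braking.
# The deceleration (~7 m/s^2, roughly dry bitumen) and sensor latency
# are illustrative assumptions, not real vehicle parameters.

def stopping_distance(speed_kmh, decel=7.0, latency=0.2):
    """Metres needed to stop: distance covered during latency + v^2/(2a)."""
    v = speed_kmh / 3.6          # convert km/h to m/s
    return v * latency + v * v / (2 * decel)

def should_brake(gap_m, speed_kmh, margin=2.0):
    """Trigger braking when the measured gap falls inside stopping distance."""
    return gap_m <= stopping_distance(speed_kmh) + margin

print(round(stopping_distance(50), 1))  # ~16.6 m from the 50 km/h city limit
print(should_brake(10.0, 50))           # True: 10 m gap is too tight
print(should_brake(30.0, 50))           # False: plenty of room
```

    Under these assumptions a city-speed car needs under 20 m to stop, which is why the uncontentious cases above rarely escalate into the trolley-problem scenario.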

    Even if your worst fears are realised and the cars prioritise the safety of pedestrians/cyclists, this does not necessarily mean the driver WILL die.

    Some of today's passive car safety features I know of are: seat belts; crumple zones; side intrusion bars; reinforced safety cells; steering airbag; ABS; Electronic Stability Control; preloaded seat belts; preloaded head rests; side airbags; retracting brake pedals; knee airbags; improving tyre compounds; run flat tyres.

    With current technology and safety standards, a driver is very likely to survive a 100km/h collision into a wall, as well as a 100km/h full-frontal collision with a vehicle of equal height.

    Who knows what future safety features will have evolved by the time car artificial intelligence is implemented.

    By the time car AI is "commonly" adopted, vehicles will definitely be communicating with each other. Airlines already implement a system for avoiding mid-air near-collisions. It's a system that allows a pilot to see where all other planes are. UAVs/unmanned drones have been working on their solution for about a decade.

    By the time car AI is "commonly" adopted, the vehicle communication/location system will also be implemented in trucks, buses, minivans and SUVs. This would help the AIs of oncoming vehicles avoid each other.

    Imagine how much ambulance response times would improve if the computers in cars automatically moved over and could guarantee ambulances run through intersections at maximum speed or drive the wrong way up a one-way street!

    Two thirds of Australia's population currently live in a capital city. The majority of road pavement is marked at 50km/h (with lobbying to lower to 45km/h). The other city roads are dual/triple carriage ways and separated from oncoming traffic by barriers or pedestrian islands. City drivers are more likely to kill a pedestrian than lose their own life.

    Looking at today's facts, statistics and risks, the probability of artificial intelligence increasing the deaths of car occupants is very low. Cars are evolving quicker than Homo sapiens. I'd rather the AI protect a city-dwelling pedestrian from being sandwiched into a building or crushed under a tyre than leave me with post-traumatic memories for the rest of my life.

    The ideal solution would be to encourage all city car owners to swap to bicycle/motorcycle. It will force vehicle operators to confront and update their sense of risk. It's also the furthest thing from a computer controlled carriage. Plus the moral questions and post traumatic stress are dealt with :)

    • If you are talking about the ACAS system, I am not sure how effective it would be here. It's not a complex system from what I've seen: it simply uses transponders to let airplanes exchange information about their location and altitude, and when they get too close, the system tells the pilots to change altitude in certain ways. That's why towers and traffic control still exist; it's better to have someone who knows where everything is and tells people to fly in an orderly fashion.

      The system's not flawless and I don't think it'd be effective on the ground. First, you have fewer axes to work with; second, there are obstacles you cannot track with it, unlike in the air; lastly, the distances between vehicles are too small and the number of vehicles too large for the system to work properly, in my opinion. Not to mention, I thought the equipment was expensive (don't ask me, I am just guessing on the price).
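      The transponder exchange described above amounts to a pairwise proximity check. A toy version for illustration only: the threshold values below are invented, and real ACAS/TCAS logic is far more involved (closure rates, coordinated resolution advisories, etc.).

```python
# Toy version of the transponder proximity check described above.
# Thresholds are invented; real ACAS/TCAS logic is far more involved.
import math

def too_close(a, b, horiz_m=9000.0, vert_m=300.0):
    """a, b: (x_m, y_m, altitude_m) positions broadcast by transponders.

    Alert when both horizontal and vertical separation fall below thresholds.
    """
    horiz = math.hypot(a[0] - b[0], a[1] - b[1])
    vert = abs(a[2] - b[2])
    return horiz < horiz_m and vert < vert_m

plane = (0.0, 0.0, 10000.0)
near = (5000.0, 2000.0, 10100.0)   # close both laterally and vertically
far = (5000.0, 2000.0, 11000.0)    # same spot laterally, but 1000 m above

print(too_close(plane, near))  # True
print(too_close(plane, far))   # False
```

      On the ground the vertical axis disappears, which is exactly the "fewer axes" objection above: every nearby vehicle becomes a potential conflict.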

  • Wouldn't they just bake Asimov's Laws into the car, as it's essentially a robot on wheels?
    As long as it follows those, it should produce the best possible outcome.

    • If only it was that easy. There are situations where harm is unavoidable. Asimov's laws don't say anything about how to decide who gets harmed.

      But then even Google seems to not have implemented Asimov's laws in their cars. I am sure some of the rear end collisions they were involved in could have been predicted and avoided.

    • Robots usually don't have passengers to worry about.
      It can sacrifice itself to protect a human, but it can't sacrifice its passenger to protect other humans.

    • I don't think 50 year old science fiction is an appropriate basis for current policy.

  • +2

    What if the car senses you are bored with it and thinking about buying a newer one? "Car. Stop at this car sales yard." "I'm sorry Dave, I'm afraid I can't do that." Automatically locks the doors, undoes your seatbelt and slams the brakes on at 180km/h…

  • It's an interesting thought experiment, but law trumps morals.
    The self-driving car must follow road rules and not break any laws. Otherwise, they would be deemed illegal and insurance would be impossible.

  • If everyone has self-driving cars, then this scenario would never happen since the driver cannot increase the speed of the car and run a red light.

    • Everyone? That would require outlawing all existing cars. Even if that is the case eventually, there needs to be some rule for the 20 or 30 years that self-driving cars have to share the roads with today's cars.

  • In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.

    Self driving cars? No thank you.

  • Although not enforced as heavily as in the USA, jaywalking was made illegal here to give cars right of way. I agree: if people are at fault and breaking the law by running red lights or getting so intoxicated that they stumble onto the road, they should suffer the consequences, not the driver/passengers of the smart car.

    But in future most cars will be smart, with laser and sonar sensors to streamline bumper-to-bumper traffic, so it'll be great for fuel efficiency and productivity, and we'll need fewer roads.

  • I'll preface this by saying I am really keen on getting a self-driving car; however, I think these concerns and the scenario presented are pretty bullshit.

    There is not going to be a situation where multiple people are on the road in front of the car and it cannot brake in time. BUT, let's just say there is: the idea of swerving and killing the driver is idiotic. The car should simply apply the brakes in a straight line. If it stops in time, great; if not, well, why was a group of people on the road?

    I think this is a non-problem that a few philosophers or academics have dredged up to get their 15 minutes of fame.

  • Genisys is Skynet!

  • Gee, let's see. I walk into Googly Cars, and theirs has a feature that, in accordance with their "do no harm" philosophy, does the least overall damage, possibly putting me more at risk.

    Over at Applemobile, they put my safety first (because their device will not be at fault anyway) even if it means others may be injured.

    Which do you think I am going to spend my money on?

    In case you're not sure: I act responsibly, and want myself and my family looked after, so I am going Applemobile every day.

    Unless it is mandated by law, the market will sort it out. I believe the article backs up my view.

    Interestingly, I believe the trolley problem mentioned in the article has a legal issue. If you pull the lever to sacrifice one to save five, you are guilty of murder. I believe there are similar cases where a number of climbers attached to a rope were going to fall unless one was cut away. My understanding is this is still murder to cut away one to save several.

    Would a car manufacturer build a device that murders people by deliberately changing course, according to law if not morals?
