Whose Fault Is It When an Autonomous Car Breaches a Rule?

Just a hypothetical question (or are there rules for this already?): when cars drive themselves and break a rule, who is at fault?

The driver? They're supposed to have the power to take full control at all times (but then what's the point of riding in a self-driving car?).
The manufacturer? They should guarantee their cars are safe to drive.
The navigation software provider? Technically, their software is the thing that actually controls the car.

To give you a real scenario: I've realised that even when I set my car's cruise control to 60 km/h, it can temporarily creep up to 64 or even 65 (e.g. going down a slope; a toy simulation of why is sketched below). If I'm caught speeding because of that, do you think I can point my finger at the car's computer?

One day, when cars are fully autonomous, if one runs a red light, who do you think is responsible?

Comments

They changed the fine print on cars with automatic parking to diminish the manufacturer's legal responsibility. It's slightly complicated by Tesla PR getting involved to whitewash embarrassing incidents.

Good example about auto-park. I wonder how they can justify that the onus is on the driver and not the system.

By using the word “assist”. This puts the onus on the driver to stay aware, and means the car, while acting autonomously, is still subject to the driver's override.

        Lane keep assist
        Parking assist

This way, the system is only ever intended to be of an “assist” nature, and ultimately its function can and should be overridden at any time by whoever is in control of the “stop” button.

Well, then there are at least some parts, if not all, where it is assisting you, i.e. the parts where the car's system makes the decisions, not you; otherwise, what is it assisting with if you still control everything? So for those parts it is assisting with, shouldn't it provide correct, fault-free control?

          • +2

@justwii: No, because if those controls start doing something wrong, you're supposed to step in and stop it.

            • +1

@Quantumcat: Exactly what Quantum is saying. Everything a car does is ultimately assisting. As the driver, you can step in at any time and shut it down. Systems in cars are there to assist the driver, not to replace the driver. To avoid liability, this is all the car manufacturer would have to say: it's not a “replacement”, it's a driving aid.

Sure, in the future there may be cars where the driver is replaced, but I think we are a very, very long way from fully autonomous vehicles, and for a long time to come the onus will fall on whoever is in the seat with the most control over the vehicle's systems.

  • +2

Most modern cars have speedos that overestimate your speed by up to 10%. That means if your speedo says you're going 64, you're likely doing about 58 and under the 60 limit anyway (rough arithmetic sketched below).

Regardless of what you set your cruise control to, the driver is responsible for maintaining the correct speed. The driver would get the fine, and you won't be able to palm it off to the manufacturer, the mechanic, or anybody else.

Re: fully autonomous cars. That's a very interesting situation. Best to look for some case law (if any exists).

What if you can prove that the setting was really 60, neither over nor under?

      • -1

        What if the car was hacked by the Po Po and they made your car go faster so they could fine you and complete their quota?

Doesn't matter; you should keep an eye on the speedo and brake if necessary.

  • +3

Teslas already have a high degree of autonomy right now, on the road: on-ramp to off-ramp in the USA, with automated lane changes coming in the next software update. In Australia the software is still waiting on legislation, but a Tesla will drive itself on freeways and normal roads just by enabling Autopilot. It won't stop at stop signs or red lights, or change roads by itself, but all of this is coming with the FSD option soon.

From the outset the driver has always been responsible, and is reminded sporadically to keep their hands on the wheel to show they are in control. The liability is fully on the driver, even when the car backs itself out of your driveway while you are not in it.

Never owned or tried a Tesla, but doesn't that defeat the purpose of having Autopilot? It's more like letting your kid do the driving while you have to stay extra focused to oversee the whole "auto" journey.

No. When you drive it, it is revolutionary. Long trips become a breeze. That subconscious stress you never realised you had evaporates when you're on Autopilot. I think for the next few iterations Autopilot will always require human supervision. Even though right now it's statistically safer than a human driver, it still makes mistakes. With every software update it makes fewer and fewer, but I can't see it being perfect for a few generations yet. AP3 is just around the corner, but I think it will be AP4 or 5 that allows for no human intervention.

Yeah, I don't know; maybe not until I've tried it myself. It just seems like it relieves your stress in one way but introduces more in another.

@justwii: Nope, it's stress-free driving. It's revolutionary. It's amazing tech that other manufacturers have failed to replicate so far. It's as amazing as the Tesla drivetrains. All yours for a $190k starting price, on the road.

            Or wait 6 months and you'll be able to get a Model 3 with it for $80k.

Whose Fault Is It When an Autonomous Car Breaches a Rule?

    They won't.

Edit: Serious answer is that it really depends on the circumstances. It would most likely be the driver's fault, or, if another party such as another driver or a pedestrian is involved, it could be their fault.
Currently, and I suspect for a long time yet, autonomous cars are in the testing stage, and they have gotten into accidents that were the fault of the car itself.
I really don't expect them to be commercially available until the majority of kinks are ironed out, especially for Australia.

  • +1

We'll have practically autonomous cars within the next decade, but we won't have laws that actually move liability away from the driver until likely much, much later. You'll still need to be aware of your surroundings and in control of the car, and if you want to take advantage of the convenience features, you'll probably also have to take the risk that something goes wrong.

    • +2

I think only when people are no longer driving cars but riding in them, i.e. when everyone is a passenger, will there be no driver to be at fault.

You still need to be actively ready to take control of the vehicle, so… same as it is now.

Yours, if you're behind the wheel; the laws have not been updated to say otherwise. Car manufacturers will never take responsibility.

Answers to these questions are being worked out by governments, and until new laws are passed we won't know, as current laws really don't cover this at all.

This is similar to the bigger (and, IMO, more important) question of what an autonomous car does in the event of an imminent collision.

    For example, imagine you’re travelling in the left lane of a highway in a vehicle that is driving in an autonomous mode, and a pedestrian steps out in front of you from the grass/parked cars to your left. In this scenario, there is a car in the lane to your right, so no clear path to swerve, and there’s not enough time to brake. Does the car:

1) Acknowledge that there's no safe alternative and brake to minimise the overall harm, while likely still killing or seriously injuring the pedestrian

    2) Swerve to the right, despite the car being in the adjacent lane, perhaps under the justification that hitting a car should be less harmful than hitting a human (although this fails to consider the domino effect, which is something that could only really happen with an “Internet of Things” concept where cars communicate with each other)

3) Swerve to the left, mounting the grassy area or slamming into the parked cars, potentially harming whoever may be there (which, in all likelihood, it won't know), if not the driver as well.

I don't believe there's any kind of consensus on what should, or will, happen.

    • +1

      It’s the old “a train is hurtling down the track” trolley dilemma… what lever will it pull?

Yeah, I've also read about this IoT-and-5G vision of connecting all the cars on the road together. The future is fun.

    • +1

Number 1: it can't take an action that would hurt someone. That's what a human driver would do too (one who doesn't know the exact braking distance and will just be hoping). If the pedestrian reacts quickly enough to hop back off the road, you're going to look pretty stupid barrelling into another car.

      With the trolley problem my answer is always to stay on the path you're on, brake and hope for the best.

That's what I agree with: not making an active decision and letting things run their course. However, then you get into the realm of "inaction is, in itself, an action", etc.

        • I guess you have to choose one or the other - both are bad, but inaction is slightly 'less' bad (in my opinion)

        • “inaction is, in itself, an action”

This is never applied in real life, and is impossible to apply, because if we did, we'd each be responsible for the deaths of innumerable African orphans whose lives we could've saved by shipping them a bag of rice every few months.

I tend to have a very clear-cut sense of responsibilities, and in your scenario:

          1. You have a responsibility to follow road rules and not harm others by breaking them - this rules out crashing into the car in the next lane, or swerving off the road if it's unsafe and might hurt someone.

2. Conversely, while you have a responsibility to the pedestrian to try not to run them over, this does not extend to putting others in danger; and it's the pedestrian who owed (and has now breached) a responsibility to you by stepping onto the road.

To me, the correct option is obvious: slow down as much as possible, swerve only if nobody else is put at risk, and otherwise the pedestrian lives with the consequences of their actions (roughly the rule sketched below).

The user/driver in control of the vehicle is always responsible, no matter what driver-assist options the car has.

  • This is a current policy issue.

    Once the level of automation is high enough, the manufacturer (and not just the driver) will have to be somewhat responsible.

See, e.g., Safety Assurance for Automated Driving Systems:

    https://www.ntc.gov.au/roads/technology/automated-vehicles-i…

  • They are driver assistants. You are in control.
