So Much for The Tesla Driverless Car

Comments

  • +133

    There would have been thousands of fatal crashes in standard cars that day too, should we outlaw them?

    • +42

      Exactly.
      Comparing crash or fatality rates per km driven, self-driving cars are likely far safer in most normal driving scenarios and road conditions.

      It's just that it's harder to point the finger of blame if a computer crashes compared to a human, so there is more resistance to autonomous driving.

      • +8

        I think the finger of blame, for driverless cars, should always go to the person in the driver's seat. Ultimately they should be liable for crashes/damage to other cars.

        • +76

          If you read up on this accident, there was no-one in the driver's seat: one person in the front passenger seat, one in the back seat. Darwin Award nomination right there.

        • +1

          "There was no-one in the driver's seat,"

          - Sergeant Cinthya Umanzor of the Harris County Constable Precinct 4.

        • +22

          Elon and his marketing team need to stop using the term autopilot. In a plane this means all sensors are monitoring and adjusting the settings to maintain safe and level flight. Tesla systems are not using Lidar, and cannot see into sunsets, around exit ramps or crash barriers. Irresponsible companies promoting sub-par systems as state of the art cause catastrophes.

          • +16

            @Brian McGee: Autopilot in planes is not perfect either - you still need pilots continuously monitoring.

            • -8

              @blackstarzes: If autopilot in aviation crashed as many times as Teslas do, all planes would be grounded.
              You also don't get to fly a plane after learning how to reverse park a couple of times.
              You can't fly a plane before you pass tests regarding the specific use of autopilot systems. Any fool can buy a Tesla without any knowledge of how the system operates.

            • +1

              @blackstarzes: and sitting in the driver's seat

            • +4

              @blackstarzes: The difference though is pilots undergo a lot of training and certification to understand the limits and requirements of autopilot, a Tesla driver gets handed the keys and is free to learn as little or as much as they like about the limitations before engaging it.

          • -1

            @Brian McGee: True true. It isn't real autopilot. No SLAM or lidar.

          • +1

            @Brian McGee: Relevant:

            Road Kill 1 year ago (edited)
            Tesla’s autopilot is supposed to assist in driving not completely replace it

            For all you idiots out there, this is what Tesla’s website says word for word:

            The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving features evolve, your car will be continuously upgraded through over-the-air software updates.

            Also relevant:
            Death by lagging in real life.

            • +2

              @Zachary: Don't call the system autopilot if it isn't autonomous enough for the most mundane of tasks like keeping in a lane; maybe 'copilot' would be a better name.
              You can't advertise something to do a job and explain it away in fine print.
              Look at Hummer with their ad for the H3 in the Daintree: quite a few shock absorbers went through bonnets because the ad showed the driver bouncing over rocky terrain and fallen trees.
              The Holden Thunder ute ad that showed some sicc*nt ripping doughies. Holden said don't try this at home, may void warranty, etc etc. When it came to Fair Trading, if anyone complained about noisy diffs, Holden had to replace them because they advertised the car being capable of doing that.

              • @Brian McGee: Is this the Holden Thunder ute ad you're on about?
                And is this the Hummer H3 ad you're talking about? I don't see them driving up or on top of dead tree branches or trunks, but a lot of rocky terrain and sand dunes though.

                You can't advertise something to do a job and explain it away in fine print.

                But you can… can't you? Otherwise, if you didn't read the fine print, then that's on you for trying something out when the product was not designed for that task (or was, but only under very specific conditions), and no one is going to help you on that since it's your fault for not reading.

                • +1

                  @Zachary: The Hummer ad is the American ad, not the Australian one.

                  Holden couldn't show the car doing donuts and then claim the car was wrecked by the owner doing them. Something about truth in advertising (ACMA) and acceptable quality (Fair Trading/ACL).

                  • @Brian McGee: If someone spends close to $100k buying that Hyundai-build-quality car and didn't bother to fully understand what the car can ACTUALLY DO, I think the blame is on the buyer.

                    Having said all of that, there are an enormous number of substandard people out there.
                    Maybe regulation should be the way. A second driver licence just to be allowed to drive a semi-autonomous vehicle, ironically.

                    • @berry580: Maybe not a second licence, just an extra 60hr/100hr course with a certification on the licence, similar to wearing glasses, or maybe a different class, like CA (Car Autonomous). And if you hold that licence you need to do a knowledge test every time you renew (5 year max)?

      • +4

        No, driving is still too complex for machines. And likely will be for a long time.

        I know there are some interesting videos and footage on specially designed tracks (usually otherwise empty ones with clearly marked lines). They're mainly to keep investors interested (Uber is going to have driverless cars and flights and all sorts of things "soon", so focus on that instead of the fact it hasn't made a profit in ten years etc).

        Factor in other vehicles (with erratic human drivers), traffic, the fact most suburban streets don't have pristine line marking, and we're a long way off.

        • +2

          No, driving is still too complex for machines. And likely will be for a long time.

          I know there are some interesting videos and footage on specially designed tracks (usually otherwise empty ones with clearly marked lines). They're mainly to keep investors interested (Uber is going to have driverless cars and flights and all sorts of things "soon", so focus on that instead of the fact it hasn't made a profit in ten years etc).

          Actually, the technology is already here; whether it's matured enough to be able to handle every situation is a different story though.
          https://www.youtube.com/watch?v=lacmtG0V-uk

          But the above demonstration is on beta software that wasn't available to the general public at the time of filming.

          Planes can already auto-land, though they're not equipped to think about every single scenario like an experienced human pilot.

          • +5

            @cwongtech: Yeah, if software isn't able to handle every situation or think about scenarios like a human, then driving is still too complex.

            There's no point getting 95% of the driving basically right and then T-boning someone at an intersection because a reflection from a headlight made it think it was halfway between two lanes.

            • +5

              @CrowReally:

              There's no point getting 95% of the driving basically right and then T-boning someone at an intersection because a reflection from a headlight made it think it was halfway between two lanes.

              Lol, pretty sure a reflection from a headlight is what confuses humans… that's why they have multiple cameras.

              Yeah, if software isn't able to handle every situation or think about scenarios like a human, then driving is still too complex.

              You may be as smart as an above-average human… Not so sure about the ones who were involved in this accident.
              I'd argue thinking like a human is not always best, as we tend to succumb to emotions and freeze up when we panic.

              • +5

                @cwongtech: The human brain has a wonderful ability to 'think outside the box' and prioritise urgent tasks. For all the pamphlets on how great A.I. is, we are decades ahead of them. "Chess" is a game with simple rules, and it still took decades to teach a computer to beat a skilled human.

                A deadbeat human with a couple of tins of beer in him will still be able to catch a cricket ball better than a robot. Calculations of velocity, force, all that stuff - the motor skills to grab it out of the air - those are all ways the human brain is better than A.I. Look at how great we are at walking up stairs. I was able to do something at age 4 that Boston Dynamics has spent hundreds of millions of dollars trying to figure out.

                Giving an A.I. multiple cameras doesn't get around the issue I outlined above. But let's pretend that I'm wrong on this, and computers are actually pretty darn good at driving cars and we could all install the software tomorrow.

                1. The ultimate responsibility for any actual accident will always, always be on the human being. Tesla (or any manufacturer) isn't going to take on the blame for when the car runs over a pedestrian wearing a black shirt with a white stripe on it. There'll be lots of legal small print about "you should be in the driver's seat and monitoring traffic at all times while in the vehicle".

                2. Regulators and the 'industry' are going to hate it - so early adopters are going to cop it on their insurance premiums. ("But they're safer!" Sure, and two heads are better than one. Is it safer if a human in the passenger seat is also watching traffic and 'helping' you drive by grabbing the wheel occasionally?)

                3. If we accept that we're ultimately responsible for any crashes that occur, then we're going to have to have our hands hovering over the steering wheel at all times and doing driver things [checking mirrors, blind spots], just to be ready for when the A.I. beeps and says "DIVIDE BY ZERO, HELP ME HUMAN". And it's more likely than not these instances are going to occur on "edge cases" when the A.I. sees something weird (Oh no, an extra line right in the middle of this intersection!). So yes, the only times it's going to throw control to you is right when it's shitting the bed in a weird set of circumstances, also known as "0.23 seconds before the accident occurs".

                That's the convenience of the driverless car future, hoverhand enthusiasts trying to anticipate what the software is going to get freaked out by. What a long way we've come.

                • @CrowReally: You may be right on the motor skills point (although there are robot arms and the like being used for very specific tasks that are almost as accurate as, or more accurate than, humans). However, I think you're wrong on the calculations of velocity, force etc point. For example, it's widely accepted that Hawk-Eye technology (used in tennis and cricket etc) is more accurate than humans when it comes to judging the path of a ball. It's obviously not flawless but it is better than the human brain.

                  • @Pulseidon: I don't believe Hawk-Eye technology is real-time and can predict what is about to happen based on different variables; rather, it analyses the video frame by frame to plot the trajectory and predict what would have happened if the obstacle was not there. Yes, accuracy-wise it is probably better than humans, given adequate resources.

                    @cwongtech, with aircraft autopilot we have highly regulated airspace, procedures and standards. You've got transponders, TCAS… most of all, ATC.

                    As for Autoland, it is brilliant technology. Again, airports and runways are regulated; you have ground-based ILS and glide slopes tailored to each approach. I don't believe Autoland will work on a random airstrip or land on a road.

                    • plus the aircraft has one extra axis to work with until they land :D

                    Driving a car is probably a thousand times easier than flying an aircraft. The difference is where…

                    IMHO, we have already passed the aircraft-autopilot stage to some extent… it's more a matter of what people believe an aircraft's autopilot is capable of.

                    These cars can drive almost automatically on freeways already, where pedestrians and cycles are removed from the equation. Add a transponder equivalent to all cars, so the auto-piloted cars can communicate with each other and establish some level of communication with cars driven by humans, so the auto-piloted cars don't have to just rely on visual cues.

                    lol, there are probably a billion holes in my opinion… it's too late

              • +3

                @cwongtech: Airspace and chess boards are highly regulated / have few exceptional cases and outcomes are highly deterministic.

                Driving in city streets is rather chaotic.

                There's a big difference.

                • +1

                  @afoveht: "Driving in city streets is rather chaotic. There's a big difference."

                  Exactly!
                  There are far too many variables for A.I. to monitor.

                  There is still a very long way to go, in regards to having no steering wheel.

                  The human brain is brilliant at monitoring thousands of variables - whether they be visual, audible, scent-related etc.

                  A.I. simply cannot compete, yet.

                  If our brain was simpler, we still could not fully understand it.

                  It will be a long time before there is a replacement/alternative

                  • @Forkinhell: To boot, I'm not sure about some of these more recent technologies, but in my experience exceptions are much more difficult to contain than even a large number of variables.

            • @CrowReally: Then humans should be banned from driving outright,
              as humans choose to get drunk and smash into things every day.

              Software/hardware can be fixed. Unintelligent human behaviour, much more difficult.

              • @berry580: That's an incredibly shallow answer that overlooks almost all of the issues (for instance, we already have rules that make drunk driving illegal - which is how we have already "banned" it).

                It is itself a good example of unintelligent human behaviour, sure, but I don't think a piece of software would have given a better response.

          • -2

            @cwongtech:

            Actually, the technology is already here; whether it's matured enough to be able to handle every situation is a different story though.

            You can argue we can have pilotless planes. Automated warehouses. Can automate everything. Problem is, after that, where are the people going to get the money to consume the products?

            The rich want more people as consumers so they can grow richer. But having everything automated and giving universal income to consumers to consume just defeats the purpose. Catch-22 situation.

            • +3

              @netjock:

              You can argue we can have pilotless planes. Automated warehouses. Can automate everything. Problem is, after that, where are the people going to get the money to consume the products?

              How did this turn into a cashflow discussion?
              And to answer your question: if you want to be protected from automation, be in either the design or the maintenance roles of the automation industry (robots haven't been designed to repair themselves, and robots cannot design themselves).

              The rich want more people as consumers so they can grow richer. But having everything automated and giving universal income to consumers to consume just defeats the purpose. Catch-22 situation.

              You cannot automate everything

              You can only automate simple tasks that are repetitive.

              Damn dishwashers, they come to my country and take my job!

              Then arguably, every complex task is a combination of simple tasks.

              What makes a human adaptable (compared to a machine) is that a machine cannot easily be trained in the decision-making needed to find the right combination of simple tasks that make up a complex task.

              If enough time is given to design a machine with a complex decision-making process (algorithms) that executes the right set and combination of simple tasks, then it can do so faster than a human can think.

              The only issue is, the model is only as good as the data set it's been trained on (i.e. if you feed in rubbish data, it will only be able to respond to rubbish data).
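
              A toy illustration of that garbage-in-garbage-out point (purely a sketch with made-up data, nothing to do with any real driving stack): train the same simple model once on clean labels and once on scrambled labels, and accuracy collapses to a coin flip.

                import numpy as np
                from sklearn.datasets import make_classification
                from sklearn.linear_model import LogisticRegression
                from sklearn.model_selection import train_test_split

                # toy stand-in data set; the features and labels are synthetic
                X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
                X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

                # "rubbish data": flip each training label with 50% probability,
                # which makes the labels carry no information about the features
                rng = np.random.default_rng(0)
                y_rubbish = np.where(rng.random(len(y_train)) < 0.5, 1 - y_train, y_train)

                clean = LogisticRegression(max_iter=1000).fit(X_train, y_train)
                rubbish = LogisticRegression(max_iter=1000).fit(X_train, y_rubbish)

                print("trained on clean labels:  ", clean.score(X_test, y_test))    # roughly 0.9
                print("trained on rubbish labels:", rubbish.score(X_test, y_test))  # roughly 0.5, a coin flip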

              • @cwongtech:

                You cannot automate everything

                You might be right but that is not what these tech entrepreneurs are pitching.

                AI blah blah blah will take over the world. Killer robots on the battle field, delivery drones blah blah blah.

                In the end it is always a cash flow discussion. No money for robots and automation if there is no cash; robots don't drop out of magic trees.

                I am just giggling inside when people think they are going to automate everything and not expect delivery drops to be literally robbed on the street as crime goes up due to lack of jobs.

                • +2

                  @netjock:

                  blah blah blah

                  I guess that explains your thought pattern

                  In the end it is always a cash flow discussion. No money for robots and automation if there is no cash; robots don't drop out of magic trees.

                  You think Elon Musk started SpaceX and Tesla for cash..?

                  I am just giggling inside when people think they are going to automate everything and not expect delivery drops to be literally robbed on the street as crime goes up due to lack of jobs.

                  People can be re-trained in different fields or different parts of a business that can't/haven't been automated.
                  Delivery drops can be secured by private parcel lockers in front of a home; it's already been done (you purchase a box with a lock on it, and you can put in your delivery instructions:
                  "Please enter 1234 to open parcel locker").

                  • +1

                    @cwongtech:

                    I guess that explains your thought pattern

                    Thanks. Explains how you like to insult first and talk logic later.

                    You think Elon Musk started SpaceX and Tesla for cash..?

                    $100bn+ net worth. If you aren't in it for the cash then you'd be some unknown writing the rest of your pay cheques to charity every year.

                    SpaceX: if you had that much pull you'd try to fix the Earth, because it is a sure-fire ecosphere for life, instead of pretending Mars is a backup plan. Even if they could put people on Mars, can you imagine the cost of going there, and of living there, because there is no atmosphere and you can't freely travel? It would make forced labour camps look like a walk in the park. Maybe Tesla is to make him feel better for burning all that rocket fuel.

                    Tesla: yeah I might give it to him on that one.

                    For us mortals I think having $100m would be more than enough to sell out and just retire. If you've got a hundred billion and keep on working you've got a point to prove, and that point isn't about health benefits and longevity.

                    People can be re-trained in different fields or different parts of a business that can't/haven't been automated.

                    Easy to say; it comes up every other week why people don't want to relocate and fruit picking just isn't for them. If it were so easy you'd be wondering why people are on JobSeeker, which is our selective universal income.

                    Maybe time to check your thought pattern.

                    • @netjock:

                      Easy to say; it comes up every other week why people don't want to relocate and fruit picking just isn't for them. If it were so easy you'd be wondering why people are on JobSeeker, which is our selective universal income.

                      Fruit picking jobs are normally given to tourists who want to learn English on the cheap (maybe you should get out more, meet some people and listen to their stories). COVID has stopped our inflow of Working Holiday candidates.

                      Instead of being a Damo and Darren who complains about Centrelink all the time and how others take your jobs, you might want to either join a business that's well diversified against geographically catastrophic events like COVID-19 (I didn't say it was easy), or start your own business.

                      Thanks. Explains how you like to insult first and talk logic later.

                      Your entire logic is based on everything being cash-motivated; hope you can spare some time to take in the concepts highlighted in Westworld Season 3.
                      When a society is run purely on dollar motivations, and where the currency can be manipulated… you get what we see in China (I'm Australian-born Chinese, not being racist here, but a lot of actions there are now profit-driven).

                      Then you lose sight of the difference between "being rich" and adding value (whether it is captured in monetary value or not) to other people's lives.

                      Maybe time to check your thought pattern.

                      Perhaps it's time to be aware that everyone has their own unique way of thinking (look up the existence of Myers-Briggs types)

                      Also, not everything is about the money.

                      Maybe on OzBargain it is, when people get mad at ShopBack over the data breach and then go "oh! extra cashback! hello!".

                      If everything (including a business model) is about price, then you build a very fragile relationship purely on that (a short-term fling).
                      Long-term relationships go far beyond price, and are strategic in nature

                      • @cwongtech:

                        Your entire logic is based on everything being cash-motivated; hope you can spare some time to take in the concepts highlighted in Westworld Season 3.

                        Whatever you watch on TV and in movies is real? No matter how much script writers try to keep to the original text, it is entertainment. Just quoting off TV shows how shallow your understanding is.

                        When a society is run purely on dollar motivations, and where the currency can be manipulated… you get what we see in China (I'm Australian-born Chinese, not being racist here, but a lot of actions there are now profit-driven).

                        The "I'm not racist BUT…" okay, if you think the Chinese are dollar-motivated, how about leasing Darwin Port to China on a 99-year lease? China owns quite a bit of Australia. Australia is as dollar-motivated as the Chinese in China. Actually it might be worse, because Australia is motivated by trying to take money from the Chinese, whereas China is just trying to take money so it can lift half a billion people out of poverty.

                        Perhaps it's time to be aware that everyone has their own unique way of thinking (look up the existence of Myers-Briggs types)

                        The 16 MBTI types just try to put people into categories. If you believe everyone is unique then you can't explain the mathematical models. People might think uniquely but the outcomes are not unique.

                        Long-term relationships go far beyond price, and are strategic in nature

                        Just by saying it is "strategic", you are saying it is about deriving value, and ultimately it is value which creates comfort, which needs money. If a relationship is beyond price then you are talking about the feelings mothers have for their child, but we know how often those relationships also come down to money (entitlement to inheritance).

                        Good luck if you think Elon just stumbled upon $100bn and think it is strategic in nature.

            • @netjock: We humans will do the jobs that the machines can't, e.g. driving cars.

        • Limitations could be relatively quickly and easily resolved, especially considering the benefits.

          The biggest problem will be getting non-automated vehicles off the road

          • @SlickMick:

            The biggest problem will be getting non-automated vehicles off the road

            Probably not the problem. Plenty of videos of Teslas avoiding other crazy driving behaviour. Most of these Tesla accidents have been:

            1. software misinterpreting a white trailer as the horizon (the famous one where the driver was asleep and the roof of the Tesla got taken off)
            2. unclear lane markings and the Tesla going off the road (most likely what happened in this instance)
      • do you have stats to back your claim?

      • +19

        The story is about DRIVERLESS cars.

        Yes, the story is about a driverless car… as in, a driver wasn't in the driver's seat; nothing about self-driving.

        • +2

          They are 100% sure nobody was in the driver's seat at the time of the accident. What if they spammed Triangle just before crashing and ejected from the seat?

          • +4

            @ATangk:

            They are 100% sure nobody was in the driver's seat at the time of the accident

            Nope, it's just poor news reporting again, jumping the gun and making up news. Oh, it was a Tesla, must have been in self-driving mode.

            Well, this car didn't have FSD; they didn't buy it. So it was not able to drive by itself. So it is highly possible there was a 3rd person driving.

            Then we have the "car burnt for 4 hours and they couldn't put it out" claim… Odd that the fire chief who was on the scene and put out the fire said it took them 2-4 mins to put it out. More false reporting.

            https://www.teslarati.com/tesla-battery-fire-fud-debunked-tx…

            So where are the articles to 'correct' the FUD?

            • +1

              @JimmyF:

              Nah, got the article out, clickbaited, just move on now.

            • -1

              @JimmyF: It took them only 5 mins to put out the initial fire. The problem was not a 4-hour-long blaze; the problem was the batteries kept reigniting continually after the fire was out, hence the huge amount of water that was needed to cool the vehicle down to stop ignition. The article you linked to is as misleading as those that state it was a 4-hour fire.

              • +1

                @gromit:

                the problem was the batteries kept reigniting continually after the fire was out, hence the huge amount of water that was needed to cool the vehicle down to stop ignition.

                I see no claims of 'huge' amounts of water to cool the vehicle. Did you even read the article? The fire chief said "simply pouring a little bit of water on it". Not seeing any huge amount claim from the person putting it out. They even claimed it wasn't a fire at all.

                The article you linked to is as misleading as those that state it was a 4 hour fire.

                Really? It's all direct quotes from the fire chief:

                “With respect to the fire fight, unfortunately, those rumors grew way out of control. It did not take us four hours to put out the blaze. Our guys got there and put down the fire within two to three minutes, enough to see the vehicle had occupants. After that, it was simply cooling the car as the batteries continued to have a chain reaction due to damage.

                “We could not tear it apart or move it around to get ‘final extinguishment’ because the fact that we had two bodies in there and it was then an investigation-slash-crime scene. We had to keep it cool, were on scene for four hours, but we were simply pouring a little bit of water on it. It was not because flames were coming out. It was a reaction in the battery pan. It was not an active fire,” Buck said.

                “every once in a while, the (battery) reaction would flame and it was mainly keeping water pouring on the battery,” Buck explained.

                Due to the bodies in the car, they couldn't break the car up to put the fire out at the source. There was no runaway thermal fire, nor was there any need to apply continuous water on the car to keep it 'cool' or to control the fire.

      • +4

        I got in my car and put a brick on the accelerator

        It's DRIVERLESS

    • +1

      Will have thousands more if people decide to sit in the passenger seat and let the car drive.

      If the Tesla was reading the lane lines and the lines weren't painted properly and the car ran off the road, it is the same kind of human error as people sitting in the passenger seat expecting to be able to react to an emergency.

      It is funny that people somehow used their brains to make enough money to buy an expensive car but not enough brains to know that if you are sitting in the passenger seat you might not be able to react to an emergency.

      • same kind of human error

        Yes, such as the OP that drove onto a T-way…

    • "There would have been thousands of fatal crashes in standard cars that day too, should we outlaw them?"

      how many self-driving cars were being used that day? One, maybe.

    • -1

      There would have been thousands of fatal crashes in standard cars that day too, should we outlaw them?

      If they were driverless, then yes…

    • We should outlaw bad drivers, oh…

    • -3

      A sequence of events caused those other deaths, whereas a foolish concept and the arrogance of owning (and building) a trendy new toy caused this one. I work with electronics, and anyone who trusts electronics with their life (those who have a choice, and are not on life support or dialysis etc) is high on the list of possible Darwin award candidates.

      Electronics will ALWAYS find an UNEXPECTED way to fail SOMEDAY. There will always be a short circuit, a capacitor that dries out, resistors that crack or overheat and burst into flames, jungle chips containing dozens to hundreds of functions that can crack from vibration or stress from a simple screw being too tight (or an external force like a collision from another car)… and NO amount of planning or safety features coded in (like a car coming to a slow halt when something goes wrong with a computer) will overcome those… because that feature is open to failure TOO.

      e.g. if a car computer can control your throttle and brakes to slow and stop, then it just takes a fault condition to plant the throttle down hard while the computer happily thinks it's slowing to a stop. Your life relies on everything being perfect, 24/7, FOREVER, and that's no more guaranteed than uncle Bob who's sure he'll make it the few km home safely after 5 beers.

      Sure, crashes may be fewer with self-driving cars, fewer than those caused by human error… but that's the whole point. At least if someone falls asleep at the wheel they basically killed themselves or someone else. But dying because a car computer had a dry joint, a capacitor changed value, or a chip with the necessary code to keep people alive cracked due to physical stress, is closer to murder than stupidity/ineptitude.

      I read an article a few years ago: manufacturers of these nightmares-waiting-to-go-wrong were trying to set common decision rules for killing someone. i.e. someone in a wheelchair unexpectedly appears on a foot crossing, the car can't stop in time but does a split-second calculation showing it could drive onto the sidewalk. But there's a kid there waiting for a bus… They want set 'moral' rules so that when their computers fail or decide who dies, nothing comes back on them. They want all the benefits of that 'safety' and none of the responsibility for the death they PROGRAM their computer to cause. That alone tells us it shouldn't be happening.

      • +3

        Such a wall of text. There was NO DRIVER in the driver's seat and no autonomous features were in use, or could have been in use.
        The car did not even have FSD features. A clear case of driver stupidity.

        • -1

          Which you could see in full before reading. No-one made you.

          It doesn't alter the facts of what I said. There should never be driverless vehicles with electronics making the decisions, regardless of whether THIS car was, or wasn't being used that way.

    • There is a difference between human error and a mechanical/software error.

    • the fact that no-one can comprehend this when they see a headline like the Tesla crash is actually hilarious.

  • +21

    Nobody in the driver's seat. Probably trying to impress other people.
    Darwinism at work.

    • +7

      The car also didn't have the full autopilot feature purchased, so in reality they just set the cruise control.

      • What if the driver just jumped out and it was a homicide?

        • Murder-suicide?

          • +2

            @jaimex2: Jumped out of the car, implying there could be a 3rd person.

    • Unfortunately natural selection won't help us here, because both gentlemen were 50+ and likely have already passed on their genes.

  • +2

    The idiot should have still been behind the wheel! I was reading that they had to use some enormous amount of water because the battery kept reigniting!

    • Yep, that's how battery fires work.

      • Yep, that's how rumours and false news work…

  • +3

    Dumbasses. At least the gene pool of the human race has improved a bit now.

    • +1

      Maybe they already had kids. Ooops.

      • +3

        If that's the case, this is a blessing in disguise for the kids who will hopefully be able to be rewired by better carers so as to flush out the dodgy genes.

        • -2

          You guys seriously judge the entirety of a man based on one (poor) decision?

          • +4

            @Kangal: You f%&k one chicken and all of a sudden you're the chicken f%&ker!

  • +5

    https://www.youtube.com/watch?v=lacmtG0V-uk

    Full self-driving is here, but they have NEVER advertised the current iterations as fully self-driving: "AutoSteer" or traffic-aware cruise control only.
    The driver is supposed to always be behind the wheel in case anything happens.

  • +1

    Won't somebody think of the children! - Helen Lovejoy.

  • +5

    So much money, so little sense.

  • +1

    If you put dropkicks behind the wheel you get stupid results.
    Automated driving is only as strong as its weakest link.

    In this case it's still semi-automated driving, with the system designed to be overruled by the end user if an issue is encountered. If the end user isn't in the driver's seat then it's not really Tesla's problem. I'd still debate whether the driver really was no longer in the driver's seat, and whether this was manufactured to spread the blame elsewhere; pretty sure there are pressure sensors in the Tesla seats that would shut down the car very rapidly if someone left the seat. Also, the GPS software on board ensures the car understands the upcoming road speed conditions, so I find it hard to believe.

    Automated driving will only be allowed when government red tape doesn't get so caught up in it and we take all the non-automated cars driven by dropkicks off the road.

    • +1

      If you put dropkicks behind the wheel you get stupid results.

      Not just any dropkick, a rich dropkick! There is a caste system for dropkicks too.

  • +7

    There was no driver, and the car was in semi-automatic mode; 'semi' meaning a driver is required.

    Even in full automated mode, I expect there to be accidents, there are too many variables.

    As long as the number of accidents caused by full auto mode is less than that of humans, that would be an improvement and a cause to embrace it.

    One of my favourite authors discusses an interesting scenario: if 2 children were to cross the road in front of a Tesla, should the fully automatic Tesla swerve to the other side and likely kill its occupant, or continue on its collision course towards the children? This results in theoretical Tesla models, the Tesla Egoist or the Tesla Altruist. Very small extract below.

    https://www.goodreads.com/quotes/10424991-tesla-will-produce…

  • +16

    So Much for The Tesla Driverless Car

    The article doesn't say the car was in self driving mode, only

    which was believed to be operating without anyone in the driver's seat

    They also say

    authorities located two occupants in the vehicle, with one in the front passenger seat and the other in the back seat,

    So let us review the facts: a car with no driver doing stupid things crashes… Call me surprised.

    Tesla AutoPilot requires you to touch the steering wheel from time to time. I'm guessing these two had been trying to do one of those TikTok videos and screwed up big time! Hence why someone was in the back filming and the driver had jumped into the passenger seat.

    But they had been too slow/didn't touch the steering wheel enough, and AutoPilot disengages when you do stupid things like ignore the warnings. So with no driver in the driver's seat to take over, they ran off the road.

    Nothing to see here, two stupid people doing stupid shit. In other news the sky is blue.

    Let me spell out how stupid this article is to even focus in on it being a Tesla:

    In the USA, nearly 1.25 million people are killed in car accidents each year. That means, on average, auto accidents cause 3,287 deaths per day. An additional 20-50 million people are injured or disabled.

    Australia's population is ~25m… So in the USA, more people than the entire population of Australia are injured in car accidents EACH YEAR. But hey, it was a Tesla, so better do a worldwide news article, because Teslas are death traps. Let's ignore that every 30 seconds someone in America dies on the road in other car makers' cars……

    • Those death and injury numbers seem pretty high; did the article provide sources for those? I only ask as I was looking at the USA road toll figures lately and these seem wildly off?

      • +2

        I only ask as I was looking at the USA road toll figures lately and these seem wildly off?

        Whoops, sorry, you are right! The page I was reviewing was talking about the USA and then changed to global figures!

        Correct USA figures are

        More than 38,000 people die every year in crashes on U.S. roadways.
        An additional 4.4 million are injured seriously enough to require medical attention.

        Still, that's someone dying every 15 mins on the road in America who is not in a Tesla ;)
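
        A quick back-of-envelope check of that rate (rough arithmetic only, using the corrected figure quoted above):

          deaths_per_year = 38_000                   # corrected US figure quoted above
          minutes_per_year = 365 * 24 * 60
          print(minutes_per_year / deaths_per_year)  # ~13.8, so roughly one road death every 14-15 minutes
          print(deaths_per_year / 365)               # ~104 deaths per day across the USA
          print(deaths_per_year / 365 / 50)          # ~2.1 deaths per day per state, on average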

        • That works out to be over 100 per day. That would be roughly 2 deaths per day per state.

          • @Caped Baldy: Correct… So why do we only hear about people doing stupid things in a Tesla? Surely at least one of those other 38k deaths involves someone doing stupid things too.

            • @JimmyF: It's like people fearing airplane crashes. They make the news when it happens despite car accidents killing way more people.

              • @Caped Baldy: Yes, but we don't only hear about Airbus crashes while ignoring Boeing crashes… At least for plane crashes they report both :)

                If you looked at the media, you'd think only Teslas crash.

    • +1

      "1.25 million people are killed in car accidents each year" is way off the mark?
      42,060 people died in vehicle crashes in 2020.

      • See above, that is the global figure. My whoopsies.

        But still, every 30 seconds someone in the world dies in a car crash that isn't a Tesla. Where are those articles?

    • In the USA, nearly 1.25 million people are killed in car accidents each year. That means, on average, auto accidents cause 3,287 deaths per day. An additional 20-50 million people are injured or disabled.

      Sorry that's incorrect
      That's the global figure; America's is closer to 39,000.

      I think if ~0.4% of your population was dying in car crashes a year you'd have a very big problem, let alone 1/6th of your population getting disabled EVERY YEAR.
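
      Rough arithmetic behind those proportions (the ~331 million US population is my assumption; the death and injury figures are the ones quoted above):

        us_population = 331_000_000          # assumed, not from the article
        global_road_deaths = 1_250_000       # the figure mistakenly quoted above as US-only
        injured_or_disabled = 50_000_000     # upper end of the quoted 20-50 million range
        print(global_road_deaths / us_population)   # ~0.004, i.e. ~0.4% of the population per year
        print(injured_or_disabled / us_population)  # ~0.15, i.e. roughly 1 in 6-7 people per year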

      • -2

        Sorry that's incorrect
        That's the global figure

        Scroll up, I posted a comment to say I was wrong and it was a global figure!

        I think if ~0.4% of your population was dying in car crashes a year you'd have a very big problem

        They seem to be OK with people dying from COVID though? 581k dead so far… They might be OK with 0.4% of their population dying in nearly a year of COVID; they are halfway there!

    • The paid media took advantage of the incident. As always.
