So Much for The Tesla Driverless Car

Comments

    • +1

      Easy to pick the Tesla fan boys.

      Easy to pick the Tesla haters. They post articles that have nothing to do with Tesla self driving cars.

      If a Toyota crashed without a driver behind the wheel, would you have posted the article? Yeah, didn't think so.

        • +5

          As a Tesla troll? Hope you have shorted the stock!!

    • Easy to pick who didn't read the article.

      Easy to pick who can't tell the difference between "nobody was in the driver's seat" and "the computer was driving but failed".

  • I was reading on Reddit that they were doctors or something like that.

    • I suspect they were probably DUI at the time.

  • -4

    I like how people are defending it as not being full self-driving when you pay for FSD with your Tesla.

    Elon was pitching that FSD will turn your Tesla into a robotaxi and make it worth $300k. If it was that good, then Uber wouldn't have sold their self-driving unit, and Elon would be keeping Tesla private like SpaceX. If a $60k Tesla suddenly becomes worth $300k, would Tesla just shut down their website and never sell their vehicles to the public? Because why not keep all the profits to yourself?

    Sometimes I think Elon is just stirring up us mortals by talking jack.

    • +4

      I like how people are defending

      I like how people are saying it's somehow the car maker's fault when two fools were in the car, neither of them in control or behind the wheel, and it crashed.

      • Google/Waymo had a car that could do what the Tesla currently does around ten years ago, but they found that people couldn't be trusted to properly supervise the cars (even company employees who knew they were being filmed). The cars lull drivers into a dangerous false sense of security. Here's a video of theirs with footage from 2013 with the description explaining why they haven't released a partially autonomous system to the public (like Tesla has): https://www.youtube.com/watch?v=6ePWBBrWSzo

        I'm not sure that Tesla's doing the wrong thing with their approach because outperforming the average human driver is a low bar, but I greatly respect and trust the more conservative approach that Waymo is taking.

        • Google/Waymo had a car that could do what the Tesla currently does around ten years ago

          They had a car that drove around in a very controlled environment that was pre-mapped. Tesla doesn't need pre-mapping.

          Waymo is still like that today.

          The cars lull drivers into a dangerous false sense of security

          Here's a video of theirs with footage from 2013 with the description explaining why they haven't released a partially autonomous system to the public

          Stupidity does that in lots of environments. The same was said when cruise control first came out.

          So shall we not move forward because we have a handful of TikTok wannabe-famous idiots out there like these two?

          I'm not sure that Tesla's doing the wrong thing with their approach because outperforming the average human driver is a low bar

          I would rather trust Tesla autopilot than a lot of people I see on the roads.

          If it does BETTER than a human, then it should be used. Which it does.

          Every 30 seconds someone in the world dies in a car crash. Should we ban all driving?

          • @JimmyF:

            very controlled environment

            Not true. Within a year of starting the project (2009/10) they'd achieved their starting goal of driving 10 different 100 mile routes on public roads around the Bay Area without a disengagement. Public roads means other drivers, pedestrians, changeable conditions. Not a very controlled environment. Pre-mapped? Sure, that's part of their approach. Info on the 100 mile routes: https://techcrunch.com/2019/02/08/waymo-cto-on-the-companys-…

            tiktok wannabe famous idiots

            The people in Waymo's video were Google employees - probably more intelligent than average, and they weren't showing off when they were letting the car drive unsupervised. They just felt comfortable doing it, because that's what experience shows happens when someone is asked to supervise a capable self-driving car.

            I would rather trust Tesla autopilot than a lot of people I see on the roads.

            We're saying roughly the same thing there, but I think the bar should be set a bit higher because there are easy actions you can take to beat average crash rates, e.g. not driving drunk, not driving at excessive speeds. If a robot driver can only just squeeze past the average crash rate, then a driver who doesn't drive drunk or too fast would be safer to drive themselves. It's not clear at this point by how much Autopilot exceeds the safety of a human driver because Tesla releases crash data that mixes urban and highway driving, when almost all Autopilot kms are done on highways. Here is one person trying to extract a like-for-like comparison of highway driving safety for Autopilot vs human driving from Tesla's data: https://www.forbes.com/sites/bradtempleton/2020/10/28/new-te…

            • -2

              @ragrum:

              Not true

              Are you really sure about that? The areas the cars drive are pre-mapped public roads. The Waymo cars cannot and will not 'self drive' in any unmapped areas, unlike the Tesla ones.

              https://www.ltad.com/about/waymo-zones.html

              Waymo’s system is currently designed so each vehicle operates only within pre-mapped zones under certain conditions. Passengers cannot select a destination outside of Waymo’s approved geography, and its software will not create a route that travels outside of a “geo-fenced” area which has been mapped in detail.

              or go straight to waymo who say

              https://blog.waymo.com/2020/09/the-waymo-driver-handbook-map…

              To create a map for a new location, our team starts by manually driving our sensor equipped vehicles down each street, so our custom lidar can paint a 3D picture of the new environment. This data is then processed to form a map that provides meaningful context for the Waymo Driver, such as speed limits and where lane lines and traffic signals are located. Then finally, before a map gets shared with the rest of the self-driving fleet, we test and verify it so it’s ready to be deployed.

              So they have to pre-map an area, verify it and tweak the data, BEFORE the car can self-drive.

              Sounds like a very controlled environment to me, even if it's a public space.

              The people in Waymo's video were Google employees - probably more intelligent than average

              I'm not talking about the Waymo video. I'm talking about the idiots on TikTok making Tesla self-driving videos without a driver in the seat, all looking for 30 seconds of fame. These are the idiots crashing, and all the clickbait media running anti-Tesla articles. Someone dies every 30 seconds in the world from a car crash, so why am I only seeing articles when a Tesla crashes once a year and kills someone?

              I'm sure these two that died had TikTok accounts and were recording ;)

              Autopilot vs human driving from Tesla's data

              Just go to the source; Tesla doesn't hide this data, unlike other OEMs:

              https://www.tesla.com/en_AU/VehicleSafetyReport#:~:text=Q1%2….

              In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven.

              So looking at the latest data, you're twice as likely to crash when autopilot isn't engaged, and twice as likely again to crash when you disable all safety features.

              So give me Autopilot over a driver without safety features enabled, who is 4 times more likely to crash!
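
              Those ratios can be sanity-checked straight from the quoted report. A quick sketch (the miles-per-accident figures are the ones quoted above from Tesla's Q1 safety report; nothing else is assumed):

```python
# Tesla Q1 safety-report figures: miles driven per single reported accident.
miles_per_accident = {
    "Autopilot engaged": 4_190_000,
    "active safety features only": 2_050_000,
    "no safety features": 978_000,
}

baseline = miles_per_accident["Autopilot engaged"]

# Fewer miles per accident means more accidents per mile, so the relative
# crash risk of each mode is the baseline divided by that mode's figure.
for mode, miles in miles_per_accident.items():
    print(f"{mode}: {baseline / miles:.1f}x the Autopilot crash rate")
```

              This gives roughly 2.0x for active-safety-only driving and about 4.3x with nothing enabled, which is where the "twice as likely" and "4 times more likely" figures come from.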

            • -2

              @ragrum: Someone likes to dish the negs out when they are wrong! hahaha

          • @JimmyF: We should ban vehicles that exhibit thermal runaway — continuous explosions and intense fires — when they crash.
            ICE vehicles rarely explode into flames in a crash, and even when they do, not in such a manner.

            As for deaths? Simply a question of preference. Do you want an open casket service or all-in-one cremation service by Tesla?

            • +1

              @Speckled Jim:

              We should ban vehicles that exhibit thermal runaway — continuous explosions and intense fires — when they crash.

              It's funny how you can list each and every Tesla fire or crash that has ever happened. But can you do the same for other OEM car fires? No? Didn't think so. 190k car fires happened in the USA in 2019. How many of those were Teslas again?

              https://www.statista.com/statistics/377006/nmber-of-us-highw…

              You also don't put out electrical fires with water; that just makes the problem worse. So yeah, the firefighters didn't help the issue. If they had responded correctly, this would have been a non-event.

              As for deaths? Simply a question of preference. Do you want an open casket service or all-in-one cremation service by Tesla?

              Every 30 seconds someone dies in a car crash somewhere in the world. Must be a lot of people owning Teslas then, as you seem to think they are the only cars that kill people?

              So you think your little ICE car running into a tree at over 100 km/h would protect you?

              • @JimmyF:

                Its funny how you can list each and every Tesla fire or crash that has ever happened.

                Show us where I "list(ed) each and every Tesla fire or crash that has ever happened".
                I never have. News agencies report on them though, and that's the topic of this discussion. Why so defensive?

                But can you do the same for other OEM car fires?

                No, and it seems neither can you:
                https://www.statista.com/statistics/377006/nmber-of-us-highw…

                "Highway vehicles include any vehicle designed to operate normally on highways, such as automobiles, motorcycles, buses, trucks, and trailers, but not manufactured homes on foundations.
                Values are estimates."

                "In 2019, there were around 189,500 highway vehicle fires reported in the United States."
                So you round up by 500 fires in an attempt to dilute your Tesla Kool-Aid?
                The page doesn't state cause of fire either. Doesn't specify crash-and-burn. Stolen, dumped and torched vehicles are likely included.

                So you think your little ICE car running into a tree at over 100 km/h would protect you?

                This is where it descends to idiocy. Anyone gunning for a tree at that speed is suicidal. So if that's the level of sophistication we have with AI right now, they're a long way off self-drivers.

                It's one thing to keep crashing unmanned spacecraft in the desert, quite another to beta-test half-arsed tech on public roads, particularly when the crash results are so catastrophic.

                • @Speckled Jim:

                  So you round up by 500 fires in an attempt to dilute your Tesla Kool-Aid?

                  Oh god, dare I use rounding to keep things simple… The figure was 'around' 189,500. Around means it could be more or less.

                  Anyhow, last time I looked for 'rounding' on a 5, you rounded up. Has that changed?

                  The page doesn't state cause of fire either. Doesn't specify crash-and-burn. Stolen, dumped and torched vehicles are likely included.

                  Hahaha, ok. Yes, it must have been 189,499 torched stolen cars and 1 Tesla that caught fire.

                  This is where it descends to idiocy. Anyone gunning for a tree at that speed is suicidal. So if that's the level of sophistication we have with AI right now, they're a long way off self-drivers.

                  You do know Tesla has confirmed the car in question hadn't purchased the self-driving feature, and base Autopilot won't engage on the road where they crashed, as it has no line markings.

                  So there is no 'AI' at play here, just like in your ICE car. So yes, the point about a car running into a tree at over 100 km/h is valid, as that is what happened here. Idiots doing stupid things hit a tree at speed. News at 9: water is wet.

                  It's one thing to keep crashing unmanned spacecraft in the desert, quite another to beta-test half-arsed tech on public roads, particularly when the crash results are so catastrophic.

                  Now comparing SpaceX to Tesla? Might as well blame Boeing's issues on Tesla too while you're there. As above, no AI/FSD was at play in this crash. It'd be the same as you turning on cruise control and jumping out of the driver's seat, then wondering why the car didn't take a corner!? Darn Toyota and their AI!!! Oh wait, what AI?

                  • @JimmyF: Round up a fraction of ONE, so 0.55, if you must to 1.0.

                    We're talking five hundred here.

                    We don't know the causes or the distribution between ICE and EV, but I'm betting the latter is over-represented on a per-capita registered car basis.

                    Now comparing SpaceX to Tesla?

                    Sure, same w@nker at the helm.

                    • +1

                      @Speckled Jim:

                      We're talking five hundred here.

                      Oh my… 500, such a big scary number compared to, oh wait, 189,500. As a percentage, that extra 500 is an error of 0.26%. Not even worth batting an eyelid over.

                      I'm guessing maths wasn't your A-grade subject. I mean, your entire point is to ignore the 189,500 fire-related incidents and scream that I over-quoted by 500. The original source said 'around', so it wasn't an exact number to start with; it could be more or less, so it could have been 180k or it could have been 200k fire-related incidents.
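
                      For what it's worth, the size of that rounding error is trivial to compute. A quick sketch using the two figures from this exchange (Statista's 189,500 estimate vs the rounded 190k quoted earlier; nothing else is assumed):

```python
# Statista's 2019 estimate of US highway vehicle fires vs the rounded
# figure quoted earlier in the thread.
reported = 189_500
quoted = 190_000

error = quoted - reported            # absolute difference in fires
error_pct = error / reported * 100   # relative error, in percent

print(f"Absolute error: {error} fires")
print(f"Relative error: {error_pct:.2f}%")  # roughly a quarter of a percent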

                      Sure, same w@nker at the helm.

                      Ahhh yes, that statement explains all.

                      Love him or hate him, he's doing things that most people wouldn't even dream about.

                      • @JimmyF: You're fighting a losing battle here. The same types of arguments get trotted out with the development and implementation of any sort of disruptive technology.

                        You won't ever be able to change their minds, they're the same types who said the motor car won't ever replace horse and buggy, and we can all see how that turned out…

                        • @downbythecreek: With the benefit of hindsight, we can discuss Horse-buggy and Motor car.

                          The latter gave humans full control whereas the former had a mind of its own, did it not?

                          Coming full-circle to relinquish even a fraction of full control is a regressive step, IMO.

                          • @Speckled Jim: Look, I do agree with your argument in principle, but humans having 'full control' of motor vehicles is precisely why it's the most dangerous activity we engage in on a daily basis. I'm not suggesting we should unleash autonomous vehicles right now, but the time will definitely be here within the next decade given the strides that have been made within the last few years.

                            We could argue about the risks of letting computers control such a dangerous activity all you want, but the reality is the s**t tier driving skills on display by 75% of people on the road is exactly why autonomous vehicles will very quickly become a much safer option for everyone who uses the road. It'll also reduce congestion and reduce the traffic enforcement and compliance burden.

                            • @downbythecreek: Push the "easy answer" of autonomy all you like. As many are finding, the reality is, it's not easy in practice.

                              If the solution requires wholesale rebuilding of infrastructure to ensure an acceptable level of safety; if fire departments cannot rely on hydrants and must carry special foams; if battery chemistry instability wants to BBQ everything in spitting distance…
                              Volunteers are welcome to step forth and become beta-testing guinea pigs. In the fullness of time, it will be seen for what it was — a Utopian fantasy.

                              • @Speckled Jim: You know what, I've thought about it and you're right. Human advancement is going to stop where we currently are.

  • It may also appear "governing" bodies may already be behind, e.g. driver's licenses (does this mean you don't need a license if the car can drive itself?), insurance (who is the "primary driver"? who crashed the car?), etc… 🤔

  • “It took four hours to put out a fire that normally would have taken a matter of minutes,” Constable Herman said, adding that it took more than 30,000 gallons of water to extinguish the fire.

    https://www.nytimes.com/2021/04/18/business/tesla-fatal-cras…

    • +4

      any real firefighter knows you don't use water to put out an electrical fire…

      • +2

        Obviously petrol cars are so much safer because of this for firefighters /s

        • +3

          Obviously petrol cars are so much safer

          yes they just explode like the movies and are done with it. None of this slow burning for hours on end ;)

    • +1

      What rubbish. The tyres wouldn't be intact if it burned for hours.

  • +4

    The driverless car may be a very long way off.

    "Two men have died after a Tesla vehicle, which was believed to be operating without anyone in the driver's seat"

    I suggest you read the article before posting about it next time. The fact this was a driverless car is irrelevant.

    • Do you think the guy would have still climbed in to the back seat with the cruise control on a regular car?

      • +4

        Yes, if you're silly enough to do it in a Tesla, then you have a death wish and would do it in any car.

      • Do you think the guy would have still climbed in to the back seat with the cruise control on a regular car?

        Ummmm yes, many videos around of people doing this when cruise first came out!

  • High speed crash - how is the speed determined in hands-off Tesla driving? Who decided to go at high speed, the car or the human?

    • +1

      The user sets the speed the car can go, within reason to the speed limits.

      But in this case, the car wasn't driving.

  • -3

    Electric cars use lithium batteries, which are highly flammable and very expensive to replace too.

    • +3

      Ya know, car fuel is just as flammable

  • -1

    Have any real-life driving instructors/assessors actually tested and passed a driverless car?

    No way would I trust a driverless car. Look at all the software errors that stuff things up in everyday life. Those only cause inconvenience, not death.
    Way too much risk for me.

    I can see AI having a long attention span, and sticking to the road rules, and being free of drug and alcohol influence.
    What I can't see, is AI having anticipation and broad situational awareness, and stored experiences leading to sensible heuristics.

    • +3

      I drive a Kia family SUV that has never faulted at taking a corner, keeping its distance in traffic, or warning me when I indicate a lane change that would obstruct someone.

      The only problem with the Tesla, apparently, is that it doesn't cause enough fuss if the driver takes their hands off the steering wheel.

      When you consider all the technology available today, and expected tomorrow, you'd have to be narrow minded to think that the obstacles to automated vehicles can't be overcome.

      • +1

        When you consider all the technology available today, and expected tomorrow, you'd have to be narrow minded to think that the obstacles to automated vehicles can't be overcome.

        It is possible, just at what price? Cruise missiles use GPS and can hit a target within a few metres. The problem is each missile costs tens of millions of dollars, and that is on top of the cost of development.

        For commercial applications (automated trucks, buses, etc.) I would think there would be enough efficiency and money behind it; this would probably be for long, time-consuming point-to-point traffic along highways with very few unique obstacles. You don't want trucks and buses in the cities turning and crushing cyclists and pedestrians at street corners.

        How many passenger cars would you have to sell if you got your own unique solution that is not off the shelf?

        Cars are so cheap because most of the parts are made by the same underlying companies for all the car manufacturers.

    • Firefighters at the scene contacted Tesla for advice on how to extinguish the blaze and were told just to let it burn out, Mr Herman said.

      Imagine if there was a fire at the Tesla factory

  • too much of a control freak to ever use this type of stuff

    • +1

      The fan boys are out on full force, down-voting everyone making a meaningful comment.

  • +1

    Ok, so Tesla Autopilot is 4x safer for the average Tesla driver. But is it safer than a SPECIFIC driver? Tesla will have to convince each specific driver that the answer is yes.

    • -4

      It might be 4X safer, but Tesla drivers are paying 2X if not 4X the price of a similar-sized car. Also, to afford one you'd most likely be earning an above-average salary, afforded by a higher level of education. You also have to measure fleet age: the general automotive fleet is much older than Teslas, and the older cars are, the more likely accidents are caused by mechanical failure.

      That is like saying university graduates can type 4X as fast as the general population, which includes people who are illiterate, people with only a primary or high school education, and people with disabilities or intellectual impairments.

      • +2

        Everything you've said is irrelevant.

        • Okay, whatever. If you believe all else is equal, then you can install a piece of software into a car with 4 flat tyres and prevent accidents.

  • +11

    Auto pilot wasn't engaged and FSD wasn't purchased in this case:
    https://cleantechnica.com/2021/04/19/some-thoughts-on-the-te…

    • +8

      No, but the clickbait title must be true!

    • Then I'm surprised they didn't find one guy with his head in the footwell and the other trying to steer from behind the driver's seat.

      Or maybe they were doing the MythBusters horizontal-lift free-fall experiment, to see if jumping at just the right moment in the opposite direction would let you survive the impact.

      All I can say is that in this instance, wealth might not be correlated with level of intellect.

  • Tesla driverless car crashes into tree and bursts into flames in Texas, killing two

    How do they do forensics on a crushed, burned car…???

  • +2

    Yeah, I still drive a Model T Ford because I was concerned about progressiveness in newer car designs! Ha!!

  • I had this discussion with a colleague many times. My humble opinion is that you need to build roads to suit driverless cars and not the other way around. Very much like the new driverless Sydney Metro system in which dedicated rails were built for them. You can't just chuck them on existing heavy rails and tune them to work.

  • +2

    So what? Most owners don't know how to use the autopilot, or have so much confidence in it that they take a nap while driving, especially in the USA. It says nothing about the ability of the system.

    • What's being reported is impossible to have occurred (the car won't engage AP on roads without lines, it disengages if it detects the driver's seat has no weight or the driver isn't holding the steering wheel, and the car in the accident hadn't actually purchased the self-driving package). Muppet media whoring for clicks.

      You'd expect better from the ABC.

      • My guess is the drunk driver hit the runners before the cops showed up, and the media is so dumb they picked up the story of a Tesla driven by ghosts.

  • +3

    Autopilot wasn’t engaged. Muppet media.

  • +3

    It’s interesting.
    1. Tesla’s don’t have Full Self Driving yet. And the said car did not have Beta version either.
    2. Autopilot wouldn’t engage on unmarked road, especially a pocket road. Even if they manage to engage, it wouldn’t let you do 5(or is it 10?) over.
    3. You cannot drive away unless there is a mass on the seat. Autopilot would disengage if you unbuckle your seat belts.
    4. EDR shows autopilot wasn’t even engaged at the time of crash.
    So extraordinary measures had to be taken to contravene the safety measures and cause a crash. Not to mention that doing high speed in any sort of autonomous driving sounds improbable.

    We still don’t have all the information. However using any tool in manners in which they are not intended and then stating that it’s not safe is disingenuous.
    Should a car manufacturer be liable if I put a brick on the accelerator and drive a car to the wall. Or if I set cruise control to 50 over the limit and get caught for speeding?

    • +1

      Should a car manufacturer be liable if I put a brick on the accelerator and drive a car to the wall

      A block of ice leaves no evidence in a fiery crash.

  • +2

    Did they have insurance?

  • +1

    The investigation has only just begun and people have jumped to conclusions. According to Tesla the car did not have autopilot or FSD engaged. That remains to be tested. I have a feeling there was a 'hold my beer' moment with the driver taking reckless action (yes, even with FSD there is a driver: the human).

    • +1

      The driver thought "Cruise Control" meant they could take a nap.

  • Planes have autopilot, and they still require people to be in the driver's seat supervising.

  • Tesla doesn't make a driverless car.

    • who does then?

  • +4

    OP just embarrassed himself by jumping the gun.

    Why are people so keen to see self driving or Tesla fail?

    • +3

      because they can't afford one.

    • Why are people so keen to see self driving or Tesla fail?

      Probably protecting their egos. People who said it will never work don't want to see it work, because then they'd feel stupid.

    • It's likely OP didn't buy Tesla stock during the run up and has always held an irrational grudge because of this.

  • Is it about Tesla or driverless? Anyway I would love to see if any driverless car can back into my driveway safely. Certainly I will get one of those if it can handle my driveway. I don't like to keep the car outside all the time.

  • I find it terrible that they decided not to have anyone in the driver's seat though. It's like asking for problems, and unfortunately it ended in deaths that could have been avoided.

    • As time goes on it seems more and more like the journalist wrote the article without any facts or truth.

  • +4

    Was the car under Tesla Autopilot control?

    The argument for, by the local police:
    1. no-one was in the driver's seat, and
    2. friends of the dead men say they were going out to try out the car's self-driving ability.

    The arguments against, by Tesla:
    1. logs show that car wasn't in self driving mode,
    2. if it had a self-driving mode, it wouldn't have stayed in it without someone in the driver's seat,
    3. it couldn't have been put in self-driving mode because that requires lane lines, which that road didn't have, and
    4. that car didn't even have the self-driving software installed.

    If all those things are true, it seems the guy purchased a car thinking it had a self-driving mode because it was a Tesla, did what he thought put it into self-driving mode for the first time, and exited the driver's seat. But in reality that car didn't have a self-driving mode, so soon after he exited the driver's seat, the car left the road.

    • Well done detective, case closed! 🕵️‍♂️

    • +1

      Op destroyed

  • -2

    The scary thing about the crash, which is not being reported because of the focus on Tesla's self-driving software, is that it took 4 hours to finally put the fire out. That would normally take minutes after the fire truck arrives, but because the battery was damaged and kept supplying energy and chemicals, it took 4 hours. This is an issue with EVs. There is a jumper that can be pulled so the battery is disconnected from the electronics and the motors, which stops the motors, but that still leaves all the parts of the battery connected to each other, and if it is damaged enough that it's shorting, it's really hard to stop it generating huge amounts of heat.

    • Why are the tyres intact if it burned for 4 hours?

      • -1

        You are imagining a petrol fire, where the fuel tank ruptures and the petrol spills everywhere, and burns everything. A shorting battery just keeps generating heat.

    What is scary is that this is FAKE NEWS! In this case it wasn't self-driving, and it didn't take 4 hours to put the fire out.

  • Dumb and Dumber didn't read the manual or understand the manual after reading. The rest is known facts, so what is this long debate about?

  • Having used autopilot on a plane many times before, I won't trust auto pilot on a car unless someone is actively monitoring it.

    But that's beside the point, as many commenters above have already pointed out that this crash happened because autopilot wasn't even engaged.

  • +1

    So much fail in that clickbait bs article and the op.

    Couple of idiots killed themselves would be the more accurate title

  • I call these people brave beta testers.

  • This is very skewed. If you want fairer data, swamp the city with driverless cars amongst ordinary traffic for a day and see the number of accidents that occur.

  • Driverless cars are an inevitability. It’s simply a matter of cost versus risk.

    Consider planes: they very rarely crash, but when they do, people very often die. Yet planes are simply ubiquitous. The same will happen with driverless cars, trucks, and buses. Sometimes they will crash and people will die, but they will absolutely become standard. It's just a matter of time until they're cheaper than paying drivers, and then they'll replace drivers.

    I’m happy to be proven wrong, it will mean a lot of people will lose their jobs but based on the automation of industry in history I don’t think I will be.

  • The title is misleading. The car didn't even have the autonomous driving function installed or activated. Just two morons doing what they were told not to do.

  • +1

    You can always spot the geniuses (aka morons) who read one news article and then claim to be experts on said topic.

  • We are not even at level 5 yet…

    • To think one day we're going to need space travel at light speed handled by a computer; and people always thought only a human could do that. [sarcasm]

      So yes, autonomous cars ARE something that is going to have to happen if we are going down the path of technological advancement.

      Yes, they won't cure stupidity.

      • Advanced aliens would probably have already invented technology to break the laws of physics, to move in erratic directions and travel faster than the speed of light. If we ever find ways to move at the speed of light and encounter an advanced civilisation, they would probably wipe us out in an instant, given our primitive technology.

  • Check more than one source.

    Allegedly the car did not have the full self driving package installed, and the street the car was on would not have allowed self driving to be activated.

    https://twitter.com/elonmusk/status/1384254194975010826?ref_…

  • I don't blame Tesla; it was human error. If we have to dumb things down, we'll soon have warnings on forks and hammers (let's hope we don't get to that state soon).

    I was in the US on a tour bus and saw a car merge into the lane next to ours: a Tesla with its driver completely asleep at the wheel. I was shocked, but it did what it needed to and merged correctly with the incoming vehicles.

  • With self driving car crashes, why doesn't the AEB kick in?
