New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times

Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • CaptainProton@lemmy.world · 1 year ago

    This is stupid. Teslas can park themselves, they’re not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

    That being said, the driver knew about this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.

    It’s actually NHTSA’s job to regulate car safety, so if it doesn’t already have the authority, Congress needs to grant it the power to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.

    • socsa@lemmy.ml · 1 year ago

      There’s no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement, and I’ve experienced it firsthand.

      • lapommedeterre@lemmy.world · 1 year ago

        Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that’s in the article). It shows a good bit of the footage, too.

        Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.

      • doggle@lemmy.world · 1 year ago

        The headline doesn’t state that the warnings were consecutive.

        Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?

        I’ll grant you, though, 150 warnings is still a little tough to believe…

      • limelight79@lemm.ee · 1 year ago

        I turned off the “lane assist” in our Mazda because it kept steering me back toward obstacles I was trying to avoid, like cyclists, oversized loads, potholes, etc. I don’t know why anyone thought that was a good idea.

        But try buying a car without those features now…sigh.

          • doggle@lemmy.world · 1 year ago

            If you’re swerving to avoid a sudden obstacle you reasonably may not have the foresight or reaction to flip on a signal. The car still needs to not force you back on collision course.

            • Grabthar@lemmy.world · 1 year ago

              That’s a good point, and is probably why they designed it so that if you swerve hard, lane assist shuts off. It only nudges you back to the middle of the lane if you are gently drifting to a side, so it only works in situations where your turn signal can be used to avoid it. Or you can just disable it if you drive a BMW or otherwise can’t use turn signals.

          • limelight79@lemm.ee · 1 year ago

            Even moving over slightly in the lane to avoid a pothole triggers it; it doesn’t seem like a turn signal should be necessary in that situation. Instead the situation seems to be that I’m seeing the pothole and altering the car’s course gently to avoid it, and I get close to the line and it freaks out.

            I guess if I drove right up to the obstacle and then swerved, it wouldn’t do it… but I was always taught swerving was a last-resort thing; best to drive as smoothly as possible. (This was my dad’s argument, and I said, “Uh, SOMEONE taught me to not swerve unless it was necessary…” (him). He laughed.)

    • chris2112@lemmy.world · 1 year ago

      The driver is responsible for this accident, but Tesla should still be liable, IMO, for all the shady and outright misleading advertising around their so-called “self driving”. Compare Tesla’s marketing to GM’s or Hyundai’s, both of which essentially have parity with Tesla’s system in terms of actual features, and you’ll see a big difference.

    • doggle@lemmy.world · 1 year ago

      Sounds like the injured officers are suing. It’s a civil case, not criminal, so I’m not sure how much the court would actually be asked to legislate. I’d be interested to hear their arguments, though I’m sure part of the reasoning for suing Tesla rather than the driver is that Tesla has more money.

    • dzire187@feddit.de · 1 year ago

      It should be pulling over and putting the flashers on if a driver is unresponsive.

      Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.

    • Obi@sopuli.xyz · 1 year ago

      You’re completely right. I’ve never seen this for traffic stops in Europe: they’ll make you park somewhere safe, or at the very worst in the emergency lane, but even that is rare for traffic stops. The only time I see lanes blocked is when there’s been an accident or breakdown, and then the first thing they do is bring massive light panels well ahead of the spot to make everyone clear the lane.

  • hark@lemmy.world · 1 year ago

    Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?

  • Jeena@jemmy.jeena.net · 1 year ago

    So if the guy behind the wheel had died and couldn’t react to the alerts, the car can’t make the decision to just stop instead of crashing into a police car?

        • Wats0ns@programming.dev · 1 year ago

        Isn’t that on purpose, though? Like, “hey, if we’re not sure we can brake in time, just disengage so it’s not our responsibility anymore”?

          • iWidji@lemmy.world · 1 year ago

          If we want to get really technical, NHTSA is requiring all new cars to have automatic emergency braking, so in this situation the car should slam on the brakes. Even if it can’t slow down fast enough to prevent a crash, it should slow down enough to minimize it.

          Is this particular Tesla covered by said rule? Probably not. But I think we can see why this tactic is infinitely safer and more ethical than saying “good luck, control this car on your own or enjoy this 100 km/h crash otherwise”.

            • tony@lemmy.hoyle.me.uk · 1 year ago

            Tesla has AEB but by the time something like that triggers you’re reducing the severity of the crash not eliminating it.

            It’s likely the car braked from 100 km/h but was still doing 50 when it hit… at those speeds it’s a severe crash whatever happens.

    • pec@sh.itjust.works · 1 year ago

      He was reacting to the alerts, complying with them by simply touching the steering wheel. He did that 150 times during that 45-minute trip (not all of the trip was on Autopilot).

      So if the guy had died, the car would have disengaged Autopilot (I’m not sure exactly how this works).

      You can check the video in the article. It’s quite informative.

      Edit

      I saw another video, and it takes ~60 seconds after taking your hand off the steering wheel for the car to safely come to a full stop.
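      A quick back-of-envelope check of that ~60-second figure (the 100 km/h starting speed and constant deceleration are my assumptions; the car’s real braking profile isn’t public):

```python
# Back-of-envelope check of the ~60 s hands-off stop described above.
# The 100 km/h starting speed and constant deceleration are assumptions,
# not measured values.

V0_KMH = 100        # assumed initial speed, km/h
STOP_TIME_S = 60    # time to a full stop reported in the comment

v0 = V0_KMH / 3.6                  # initial speed in m/s
decel = v0 / STOP_TIME_S           # average deceleration, m/s^2
distance = 0.5 * v0 * STOP_TIME_S  # distance covered while stopping, m

print(f"average deceleration: {decel:.2f} m/s^2")  # ~0.46 m/s^2, very gentle
print(f"stopping distance:    {distance:.0f} m")   # ~833 m
```

      That’s far below normal braking, so the car would roll for the better part of a kilometre before stopping: a gradual “safe” stop, not an emergency one.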

      • socsa@lemmy.ml · 1 year ago

        So the headline should be “drunk driver hits police car.”

        • Landmammals@lemmy.world · 1 year ago

          Was he drunk? The article seems to use the fact that the car nagged him 150 times as evidence that he was impaired.

      • tony@lemmy.hoyle.me.uk · 1 year ago

        TBH, if you’re not used to it, the steering wheel check can warn frequently. It’s checking for a small amount of torque on the wheel rather than you actually holding it (there are no pressure sensors), and that catches people out, but the prompt says to put your hands on the wheel… I could believe 150 times on a long journey.

  • hoodlem@hoodlem.me · 1 year ago

    In fact, by the time the crash happens, it’s alerted the driver to pay more attention no less than 150 times over the course of about 45 minutes. Nevertheless, the system didn’t recognize a lack of engagement to the point that it shut down Autopilot.

    I blame the driver, but if the above is true there was a problem with the Tesla as well. The Tesla is intended to disengage and disable autopilot for the remainder of the drive after a small number of ignored alerts. If the car didn’t do that, there’s a bug in the Tesla software.

    I think it’s more likely the driver used a trick to make the car think he was engaged when he was not. You can do things like put a water bottle wedged in the steering wheel to make the car think you have tugged on the steering wheel to prove you are engaged. (Don’t ask me how I know)

    • RushingSquirrel@lemm.ee · 1 year ago

      After 3 alerts, it’s off until you park. There are visual cues that precede the alert, though, and these do not count. I don’t recall how many there are or for how long, but you start by seeing a message asking you to put your hands on the wheel, then a blue line at the top, then the line starts pulsing, then you get an audio alert, which is the first strike. Three strikes during the same drive and you need to park before using Autopilot again.
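      The escalation described above can be sketched as a tiny state machine (purely illustrative; the stage names and three-strike limit come from this comment, not from any Tesla documentation):

```python
# Illustrative model of the escalating attention checks described above:
# message -> blue line -> pulsing line -> audible alert (a "strike").
# Three strikes and Autopilot is off until the car is parked.
# This is a sketch based on the comment, not Tesla's actual logic.

STAGES = ["message", "blue_line", "pulsing_line", "audible_alert"]
MAX_STRIKES = 3

class AttentionMonitor:
    def __init__(self):
        self.stage = 0           # index into STAGES
        self.strikes = 0
        self.locked_out = False  # Autopilot disabled until next park

    def tick_without_input(self):
        """Driver ignored the current cue: escalate one stage."""
        if self.locked_out:
            return
        self.stage += 1
        if self.stage >= len(STAGES) - 1:  # reached the audible alert
            self.strikes += 1
            self.stage = 0
            if self.strikes >= MAX_STRIKES:
                self.locked_out = True

    def wheel_torque(self):
        """Torque on the wheel clears the visual cues but not past strikes."""
        if not self.locked_out:
            self.stage = 0

m = AttentionMonitor()
for _ in range(9):               # ignore every cue
    m.tick_without_input()
print(m.strikes, m.locked_out)   # 3 True
```

      The key detail is that a wheel tug only resets the visual escalation; the strike count persists for the whole drive.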

      • meco03211@lemmy.world · 1 year ago

        And those alerts don’t come if you’ve overridden the system by putting a weight on the wheel or something.

  • redcalcium@lemmy.institute · 1 year ago

    Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.

    Is the video slowed down? In the video, if you pause 2.5 s before the crash, the stopped police car already seems very close. An awake human driver would’ve recognized the stopped police car from much farther away than that.
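    For scale, here’s a rough check of what 37 yards and 2.5 seconds imply; the ~8 m/s² hard-braking deceleration and half-second actuation delay are generic assumptions of mine, not Tesla figures:

```python
# Rough check of the "37 yards / 2.5 seconds" detection figure quoted above.
# The deceleration and latency values are generic assumptions, not Tesla specs.

DETECTION_M = 37 * 0.9144   # 37 yards in metres (~33.8 m)
WARN_TIME_S = 2.5           # warning time from the article
DECEL = 8.0                 # assumed hard-braking deceleration, m/s^2
LATENCY_S = 0.5             # assumed system/actuation delay, s

speed = DETECTION_M / WARN_TIME_S   # implied closing speed, m/s
brake_dist = LATENCY_S * speed + speed**2 / (2 * DECEL)

print(f"implied closing speed: {speed * 3.6:.0f} km/h")
print(f"needed to stop: {brake_dist:.1f} m of {DETECTION_M:.1f} m available")
```

    At the implied ~49 km/h closing speed there would in principle be room to stop; at true highway speeds (~31 m/s) the same maths needs over 70 m, far more than 2.5 seconds of warning provides.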

    • Thetimefarm@lemm.ee · 1 year ago

      I find that it can be hard to tell when a car ahead is stopped; maybe the vision system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I’m not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.

  • Snapz@lemmy.world · 1 year ago

    This source keeps pushing Tesla propaganda. There’s always an angle trying to sell that it wasn’t the Tesla’s fault.

  • thatKamGuy@sh.itjust.works · 1 year ago

    The driver is definitely the one ultimately at fault here, but how is it that the Tesla doesn’t perform an emergency stop in this situation, and instead just barrels into an obstacle?

    Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than a Tesla?!

    • RushingSquirrel@lemm.ee · 1 year ago

      I believe this was caused by the fog combined with the flashing lights and the upward/curved road. The Tesla Autopilot system is super impressive in almost all situations, but you can clearly see its limits in extreme ones. Here, the drunk driver is definitely at fault; I don’t understand why they’d sue Tesla.

  • Pablo@lemmy.world · 1 year ago

    It’s also so misleading that Tesla uses the word “Autopilot” for what is basically adaptive cruise control plus lane assist.

  • MrSpArkle@lemmy.ca · 1 year ago

    I think Mercedes is the only car company that will accept blame for a self-driving or self-parking failure. That should tell you something.

    • Thorny_Thicket@sopuli.xyz · 1 year ago

      A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn’t change these facts.

      In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

      Source
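      The ratios in that claim can be recomputed from the quoted miles-per-crash figures (noting that these are Tesla’s own numbers):

```python
# Relative crash rates recomputed from the miles-per-crash figures
# quoted above (Tesla's Q2 report and NHTSA's average, as cited in the comment).

AUTOPILOT_MI = 4_410_000  # miles per crash with Autopilot engaged
NO_AP_MI = 1_200_000      # miles per crash, Tesla without Autopilot
US_AVG_MI = 484_000       # miles per crash, US average per NHTSA

print(f"Autopilot vs non-Autopilot Tesla: {AUTOPILOT_MI / NO_AP_MI:.1f}x more miles per crash")
print(f"Non-Autopilot Tesla vs US average: {NO_AP_MI / US_AVG_MI:.1f}x more miles per crash")
```

      So “almost 4 times” (3.7x) and roughly “half as likely” (2.5x) both follow from the quote, though Autopilot miles are mostly highway miles, which have fewer crashes per mile than average driving to begin with.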

        • Thorny_Thicket@sopuli.xyz · 1 year ago

          Perhaps. I’m sure you’ll provide me with the independent data you’re basing that “Teslas are not safe” claim on

            • narp@feddit.de · 1 year ago

              You made the first comment, “Teslas aren’t safe”, without providing proof.

              And now you’re calling someone a hypocrite because he asks for data on exactly what you claimed, while you’re redefining your first argument as “the contrary”.

              So, do you have proof that Teslas aren’t safe compared to other cars, or is it just your opinion?

                • narp@feddit.de · 1 year ago

                  But you can’t base a fact on one accident. Or even multiple. What if newspapers like to write especially about Tesla accidents to generate clicks?

                  Teslas seemingly have a lot of accidents, but without checking the statistics and comparing them to other manufacturers, you wouldn’t really know whether the perceived truth is a fact or not.

            • Thorny_Thicket@sopuli.xyz · 1 year ago

              The Tesla Model Y scored the highest possible score on the IIHS crash test, as well as 5 stars on Euro NCAP.

              Their other models have similar results. I believe the Model X is the safest SUV ever made.

              EDIT:

              “More than just resulting in a 5-star rating, the data from NHTSA’s testing shows that Model X has the lowest probability of injury of any SUV it has ever tested,” Tesla said in a statement. “In fact, of all the cars NHTSA has ever tested, Model X’s overall probability of injury was second only to Model S.”

              Source

              Also might want to check this

              EDIT2: Imagine downvoting the guy providing hard evidence and upvoting the fanatic making baseless claims backed by nothing

                • Thorny_Thicket@sopuli.xyz · 1 year ago

                  Or maybe you’re so blinded by the hatred towards Musk that you can’t even think straight and no evidence in the world could convince you otherwise?

                  You really should’ve checked the last link.

      • tiny_electron@sh.itjust.works · 1 year ago

        There is a bias in these numbers. Teslas are expensive, and not everyone is buying them. The lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because Teslas are so expensive. I don’t have the numbers for this, but we should all be careful with Tesla’s claims on safety when they compare themselves to the global average.

        • Thorny_Thicket@sopuli.xyz · 1 year ago

          Sure. There are always multiple factors in play. However I’d still be willing to bet that there’s nothing in Teslas that makes them inherently unsafe compared to other cars.

  • r00ty@kbin.life · 1 year ago

    I’m not so sure disengaging Autopilot because the driver’s hands were not on the wheel while on a highway is the best option. Engage the hazard lights, remain in lane (or, if able, move to the slowest lane), and come to a stop. Surely that’s the better way?

    Just disengaging the autopilot seems like such a copout to me. Also, the fact it disengaged right at the end (“the driver was in control at the moment of the crash”) again feels like bad “self” driving, especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

    Also, if it cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually bright emergency lights saturating the image it was trying to analyse), it’s again a sign they shouldn’t be releasing this to the public. It’s clearly just not ready.

    Not taking any responsibility away from the human driver here. I just don’t think the behaviour was good enough for software controlling a car used by the public.

    Not to mention, of course, the reason for suing Tesla isn’t because they think they’re more liable. It’s because they can actually get some money from them.

      • r00ty@kbin.life · 1 year ago

        The question here is: could you see there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze, the autopilot must too.

        Moreover, it now needs to be extra good at spotting vehicles in bad lighting conditions, because other sensors have been removed on newer Teslas. It only has cameras to go on.

  • Peanut@sopuli.xyz · 1 year ago
    1 year ago

    I still think Tesla did a poor job of conveying the limitations on the larger scale. They piggybacked on Waymo’s capability and practice without matching it, which is probably why so many are over-reliant. I’ve always been against mass-producing semi-autonomous vehicles for the general public. This is why.

    And then this garbage is used to attack the general concept of autonomous vehicles, which may become a fantastic life-saver, because then it can safely drive these assholes around.

  • Jordan Lund@lemmy.one · 1 year ago

    Don’t see how that’s a Tesla problem… Drunk/high driver operating their car incorrectly.