TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • Gork@lemm.ee · 68 up · 2 days ago

    Lidar needs to be a mandated requirement for these systems.

    • Echo Dot@feddit.uk · 19 up · 1 day ago · edited

      Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · 16 up / 1 down · 1 day ago

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

      • Nastybutler@lemmy.world · 4 up / 1 down · 1 day ago

        No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent.

    • TrackinDaKraken@lemmy.world · 8 up · 1 day ago

      How about we disallow it completely until it’s proven to be SAFER than a human driver? Because why even allow it if it’s only as safe?

      • explodicle@sh.itjust.works · 4 up · 1 day ago

        As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

        • scarabic@lemmy.world · 1 up · 1 day ago · edited

          It’s hardly either / or though. What we have here is empirical data showing that cars without lidar perform worse. So it’s based in empirical results to mandate lidar. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality statistics targets.

          • explodicle@sh.itjust.works · 1 up · 1 day ago

            We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

            • scarabic@lemmy.world · 1 up · 23 hours ago · edited

              Those are ways to gather empirical results, though they rely on artificial, staged situations.

              I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. That kind of thing can still be well founded in data.

      • scarabic@lemmy.world · 1 up · 1 day ago

        This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

        • NotMyOldRedditName@lemmy.world · 1 up · 14 hours ago · edited

          There have been 54 reported fatalities involving their software over the years in the US.

          That’s around 10 billion AP miles (9 billion at the end of 2024), and around 3.6 billion on the various versions of FSD (beta / supervised). Most of the fatal accidents happened on AP, though, not FSD.

          Let’s just double those fatal accidents to 108 to account for the rest of the world, though that probably skews high. Most of the fatal incidents I’ve seen are in the US.

          That equates to one fatal accident every 125.9 million miles.

          The US average is 1.33 deaths per 100 million miles, so even after doubling the deaths, Tesla’s software comes in below the current national average. It’s the equivalent of 1.33 deaths every 167 million miles.

          Edit: I couldn’t math; fixed it. Also, FSD specifically is available in very few places, mainly North America and, just recently, China. I wish we had fatality numbers for FSD specifically.
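
          The arithmetic in the comment above can be sketched in a few lines. Every figure here (the mileage totals, the fatality count, the doubling factor) is the commenter's own estimate, not an official statistic:

          ```python
          # Per-mile fatality comparison using the commenter's estimates,
          # not official figures.
          AP_MILES = 10e9           # estimated Autopilot miles
          FSD_MILES = 3.6e9         # estimated FSD (beta/supervised) miles
          US_FATALITIES = 54        # reported US fatalities involving the software
          WORLD_FATALITIES = 2 * US_FATALITIES  # doubled as a deliberately high guess

          total_miles = AP_MILES + FSD_MILES
          miles_per_fatality = total_miles / WORLD_FATALITIES
          print(f"{miles_per_fatality / 1e6:.1f} million miles per fatality")
          # ≈ 125.9 million

          # Deaths per 100 million miles, vs. the cited US average of 1.33
          tesla_rate = WORLD_FATALITIES / (total_miles / 100e6)
          print(f"Tesla (doubled): {tesla_rate:.2f} vs US average 1.33 per 100M miles")
          # ≈ 0.79, i.e. below the national average under these assumptions
          ```

          Note the comparison is only as good as its denominators: the mileage figures mix supervised Autopilot highway miles with FSD miles, which is part of why commenters elsewhere in the thread ask for a proper per-mile, like-for-like breakdown.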