Tesla CEO Elon Musk has said that cars operating in Tesla’s Autopilot mode are safer than those piloted solely by human drivers, citing crash rates when the modes of driving are compared. He has pushed the carmaker to develop and deploy features programmed to maneuver the roads, arguing that the technology will usher in a safer, virtually accident-free future. While it’s impossible to say how many crashes may have been averted, the data shows clear flaws in the technology being tested in real time on America’s highways.

  • BlameThePeacock@lemmy.ca · 1 year ago

    I dislike Elon, and I’m never buying a Tesla (I own a different EV already). However, until someone shows me the equivalent human-caused crash rates (for the same types of roads and distances), these numbers simply don’t look out of line with what I would expect for any car.

    IMO self-driving doesn’t need to be perfectly safe, it just needs to be as safe as or safer than the average human driver.

    • Earthwormjim91@lemmy.world · 1 year ago

      I think the bigger issue is the lack of transparency. Tesla reported only 3 fatalities involving Autopilot, while the real number is 17. Not a massive difference when dealing with low numbers like this, but still a big issue if Tesla is lying about safety data.

    • cosmic_skillet@lemmy.ml · 1 year ago

      Yeah, the article reads like a hit piece. Why didn’t they even try to do an apples-to-apples comparison? Same thing when they compared Tesla to Subaru: just raw total numbers, not normalized to cars sold or miles driven or anything. The raw data is pretty meaningless.
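
      For what it’s worth, here’s a minimal sketch of the kind of normalization I mean. Every figure in it is invented purely for illustration; none of these numbers come from Tesla, NHTSA, or the article.

      ```python
      # Hypothetical illustration only: every figure below is made up.
      # Nothing here comes from Tesla, NHTSA, or the article.

      def crashes_per_million_miles(crashes: int, miles: float) -> float:
          """Normalize a raw crash count by exposure (miles driven)."""
          return crashes / (miles / 1_000_000)

      # Assumed totals for a fleet driving with an assistance system engaged
      assisted_crashes = 700
      assisted_miles = 3_000_000_000        # ~3 billion miles

      # Assumed totals for ordinary human driving
      human_crashes = 5_000_000
      human_miles = 3_000_000_000_000       # ~3 trillion miles

      assisted_rate = crashes_per_million_miles(assisted_crashes, assisted_miles)
      human_rate = crashes_per_million_miles(human_crashes, human_miles)

      print(f"Assisted: {assisted_rate:.2f} crashes per million miles")
      print(f"Human:    {human_rate:.2f} crashes per million miles")
      print(f"Ratio (assisted / human): {assisted_rate / human_rate:.2f}")
      ```

      Even a per-mile rate like this still wouldn’t be truly apples to apples, since miles driven with assistance features engaged skew heavily toward highways, which tend to have fewer crashes per mile than city streets.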

      • 0x815@feddit.de (OP) · 1 year ago

        @cosmic_skillet

        Why didn’t they even try to do an apples-to-apples comparison?

        Maybe because the data released by Tesla is incomplete and biased, since it appears to serve the company’s sales rather than safety?

        It’s the company and Elon Musk himself who are frequently making bold statements, while it seems that not even the authorities have the data to verify the claims. As the article says:

        In a March presentation, Tesla claimed Full Self-Driving crashes at a rate at least five times lower than vehicles in normal driving, in a comparison of miles driven per collision. That claim, and Musk’s characterization of Autopilot as “unequivocally safer,” is impossible to test without access to the detailed data that Tesla possesses.

    • tango_octogono@beehaw.org · 1 year ago

      Yeah, this. Maybe I could agree that it’s too soon to be testing these autopilot systems on the road, but I dislike how people miss the point with this tech. They set an impossible standard for a technology that could potentially be better than us on the road.

      • 0x815@feddit.de (OP) · 1 year ago

        @BlameThePeacock @tango_octogono Fair, and in principle I agree. But it’s not (only) ordinary people who miss the point and set an impossible standard; it’s foremost Elon Musk himself. He has been promoting Tesla’s Autopilot and even its self-driving capability for years (although the folks at Tesla certainly know that the latter won’t come anytime soon).

        Tesla video promoting self-driving was staged, engineer testifies

        A 2016 video that Tesla (TSLA.O) used to promote its self-driving technology was staged to show capabilities like stopping at a red light and accelerating at a green light that the system did not have, according to testimony by a senior engineer.

        One of the things we need to set reasonable expectations for this tech is more reliable information, including from Tesla and its CEO. As long as the company itself keeps flooding the market with unrealistic “news” about the tech, it is good that there are independent investigations, imo.

    • freshhotbiscuits@sh.itjust.works · 1 year ago

      Yeah, this article is ridiculous. Self-driving cars are FAR safer than human drivers. The number of accidents is minuscule compared to what would be expected if the automated features were absent.

      We run the real risk of screwing this up if people insist on an automated car never causing a crash or never hurting anyone. Even if they hurt someone, the point is that they harm maybe 0.1% of the people who would have been hurt in traditional vehicles.

      “Don’t let perfect be the enemy of good.”

      • 0x815@feddit.de (OP) · 1 year ago

        @freshhotbiscuits

        Even if they hurt someone, the point is that they harm maybe 0.1% of the people who would have been hurt in traditional vehicles.

        Where did you get this number from?

        This is exactly what I meant in my earlier comment. Everyone just throws out a number and claims it is true. I don’t think we should let perfect be the enemy of good, but people’s lives are on the line here. We need independent and reliable data.

        • freshhotbiscuits@sh.itjust.works · 1 year ago

          Oh, I wasn’t stating that as fact, and never intended to. The fact of the matter, though, is that more than 90% of car accidents are caused by human error. If we can eliminate the human error, then far fewer people will be hurt on the roads, even though that means self-driving cars are going to hurt some people. That is merely my point: we can’t expect self-driving cars to have zero accidents, but once the tech is good enough (and I acknowledge that it’s not even close yet), we need to be OK with that hard truth.

          • 0x815@feddit.de (OP) · 1 year ago

            but once the tech is good enough

            What makes you think the tech is already good enough to be tested in public? The fact is that the number of incidents revealed by the investigation is much higher than what Tesla has published, Tesla is holding back a lot of relevant data, and no one (except Tesla) can say how safe this tech is or whether it should be allowed to be tested on the streets.

            The fact that many or most car accidents are caused by human error, along with a lot of the other points raised here, simply doesn’t matter, because it has nothing to do with the issue. This is not some application on your smartphone that you can test at your own risk while it is still in beta. This is a car. It kills people, and Tesla is obviously unwilling to disclose the data even to the authorities. As long as this is the case, this tech should not be allowed to be tested in public space.