Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • NathanielThomas@lemmy.world · 1 year ago

    I am the last person to defend Elon and his company, but honestly it’s user error. It’s like blaming Microsoft when someone deliberately ignores the warnings and downloads viruses. Autopilot should be called driver assist, and people still need to pay attention. The deaths were caused by user negligence.

    • silvercove@lemdro.id · 1 year ago

      Then you should call it driver assist, not Autopilot.

      Also, Tesla’s advertising is based on “having solved self driving”.

    • Ocelot@lemmies.world · 1 year ago

      That is precisely why autopilot is called a driver assist system. Just like every other manufacturer’s LKAS.

        • Ocelot@lemmies.world · 1 year ago

          How is that confusing? If you look at what an airplane autopilot actually does, it maintains altitude and heading and makes turns at pre-determined points. Autopilot in an airplane does absolutely nothing to avoid other airplanes or obstacles, and no airplane is equipped with an AP system that allows the pilot to leave the cockpit.

          Tesla Autopilot maintains your speed and your distance from the car in front of you and keeps you in your lane. Nothing else. It is a perfect name for the system.

            • CmdrShepard@lemmy.one · 1 year ago

              I’d hope they would understand that before willfully getting behind the controls of one to operate it. Regardless of what we call it, these people still would have crashed. They both drove into the side of a semi while sitting in the driver’s seat with their hands on the wheel.

              • renohren@partizle.com · edited · 1 year ago

                That’s because Tesla induced them to think it was Level 4 or 5, while FSD is Level 2 (like most Toyotas are), but with a few extra options.

                And as long as a human is required to assume responsibility, it will remain at Level 2.

                • CmdrShepard@lemmy.one · 1 year ago

                  A) Autopilot and the FSD beta are two totally separate systems, and FSD wasn’t even available as an option when one of these crashes occurred.

                  B) Where’s the evidence that these drivers believed they were operating a Level 4 or 5 system?

                  • renohren@partizle.com · edited · 1 year ago

                    [comment redacted this is a placeholder]

                    Sorry about that, I mixed the two up. Silly me.