OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

  • nimble@lemmy.blahaj.zone · 2 hours ago

    Friendly reminder that Tesla Autopilot is an AI trained on live data. If it hasn't seen something enough times, it won't know to stop. That's how you get a Tesla running full speed into an overturned semi, and many, many other accidents.
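    A toy sketch of that failure mode (all names and thresholds here are illustrative assumptions, not Tesla's actual stack): a vision-only detector emits per-object confidences, and the planner only brakes when a confidence clears a threshold, so anything the model has rarely seen never clears the bar.

    ```python
    # Hypothetical sketch: braking gated on detector confidence.
    BRAKE_THRESHOLD = 0.7  # assumed confidence the planner needs before reacting

    def should_brake(detections: list[tuple[str, float]]) -> bool:
        """Brake only if some detected obstacle clears the confidence bar."""
        return any(conf >= BRAKE_THRESHOLD for _, conf in detections)

    # A class common in training scores high; a rare sight (overturned semi,
    # deer on a dark road) scores low and is effectively invisible to the planner.
    print(should_brake([("pedestrian", 0.93)]))       # True  -> brakes
    print(should_brake([("overturned_semi", 0.41)]))  # False -> drives on
    ```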

    • pyre@lemmy.world · 1 hour ago

      I wonder how well it recognizes non-white people. We've seen these models lack people of color in their training samples before.

  • blady_blah@lemmy.world · 2 hours ago
    1. The vehicle needed lidar.
    2. The vehicle should have a collision-detection indicator for anomalous impacts and random mechanical problems (see the sketch below).
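    A minimal sketch of what item 2 could look like (the threshold and function names are assumptions for illustration, not any real vehicle API): watch the accelerometer for crash-like spikes and raise an indicator the driver can't miss.

    ```python
    # Hypothetical sketch: flag anomalous impacts from accelerometer data.
    IMPACT_THRESHOLD_G = 8.0  # assumed: normal driving stays well below this

    def impact_detected(accel_samples_g: list[float]) -> bool:
        """True if any sample looks like a collision rather than normal driving."""
        return any(abs(a) >= IMPACT_THRESHOLD_G for a in accel_samples_g)

    def on_tick(accel_samples_g: list[float]) -> None:
        if impact_detected(accel_samples_g):
            # A real system would alert the driver, log the event, and
            # request a controlled stop instead of driving on.
            print("Anomalous impact detected: alerting driver, requesting stop")

    on_tick([0.2, 0.4, 11.3, 9.8])  # crash-like spike -> indicator fires
    ```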
  • Gammelfisch@lemmy.world · 3 hours ago

    So, a kid on a bicycle or scooter is an edge case? Fuck the Muskrat, and strip him of US citizenship for illegally working in the USA. Another question: WTF was the driver doing?

    • M600@lemmy.world · 2 hours ago

      Regarding the deer, it looks like it might have been hard for the driver to see. I remember learning in driver's ed that it's better to hit the animal than to swerve around it, since swerving might send you into a car beside you, so maybe that's what they were thinking?

  • pdxfed@lemmy.world · 4 hours ago

    If you plebs would just buy the North America Animal Recognition AI subscription, this wouldn't be an issue; it will stop for 28 out of 139 mammals!

  • w3dd1e@lemm.ee · 6 hours ago

    Deer aren't edge cases. If you live in a rural community or the suburbs, deer are a daily fact of life.

    As more and more of their forest is destroyed, deer are becoming a daily part of city life too. I live in the middle of a large midwestern city, in a neighborhood with houses crowded together, and I still see deer on my lawn regularly.

    • Kecessa@sh.itjust.works · 5 hours ago

      People are acting like drivers don't hit deer at full speed while they're in control of the car. Unless we get numbers comparing self-driving and human-driven cars, this is just a non-story whose only goal is discrediting Musk, when there's so much other shit that can be used to discredit him.

      • T156@lemmy.world · 3 hours ago

        People are acting like drivers don't hit deer at full speed while they're in control of the car.

        I should be very surprised if people don't generally try to brake for or avoid an animal (with some exceptions), if only so they don't break the car. Whether they succeed is another question entirely.

        • Kecessa@sh.itjust.works · 2 hours ago

          People drive drunk, people check their phones while driving, people panic and freeze, and deer often jump out in front of you from nowhere.

          People hit fucking humans without braking because they’re not paying attention to what the fuck they’re doing!

          • pyre@lemmy.world · 52 minutes ago

            … and that's the kind of driving Tesla is trying to emulate? Awesome.

  • Emerald@lemmy.world · 5 hours ago

    I notice nobody has commented on the fact that the driver should've reacted to the deer. It's not Tesla's responsibility to emergency-brake, even if that's a feature of the system. At the end of the day, drivers are responsible for their vehicle's movements.

    • rsuri@lemmy.world · 3 hours ago

      True, but if Tesla keeps acting like they're on the verge of an unsupervised, steering-wheel-free system… this is more evidence that they're not. I doubt we'll see a Cybercab with no controls within the next 10 years if the current tech still ignores large, highly predictable objects in the road.

    • inclementimmigrant@lemmy.world · 5 hours ago

      That would be lovely if it weren't called and marketed as Full Self-Driving.

      If you sell vaporware/incomplete software and release it into the wild, you are responsible for all the chaos it brings.

    • chaogomu@lemmy.world · 6 hours ago

      Then it's not "Full Self-Driving." It's at best lane assistance, and I wouldn't trust that either.

      Elon needs to shut the fuck up about self-driving and maybe issue a full recall, because he's going to get people killed.

  • MagicShel@lemmy.zip · 5 hours ago

    I hit a deer on the highway in the middle of the night going about 80 mph. I smelled the failed airbag charge and proceeded to drive home without stopping; by the time I had stopped, I would never have been able to find the deer anyway. If your vehicle isn't disabled, what's the big deal about stopping?

    I've struck two deer and my car wasn't disabled either time. My daughter hit one and totaled our van. She stopped.

    That said, fuck Musk.

    • Sentau@discuss.tchncs.de · 3 hours ago

      Maybe drive a little slower at night. If you can't spot and react to animals in your path, you won't be able to react when it's a human.

      • MagicShel@lemmy.zip · 1 hour ago

        It was an expressway; there were no lights other than cars. You're not wrong: had a human sprinted at 20 mph across the expressway in the dark, I'd have hit them too. That said, you're not supposed to swerve, and I had less than a second to react from when I saw it. It was getting hit no matter what; there was nothing I could've done.

        My point was more about what happened after. The deer was gone, and by the time I got to the side of the road I was probably a quarter mile from where I struck it. I had no flashlight to hunt around for it in the bushes, and even if I had, I had no way of killing it if it was still alive.

        Once I confirmed my car was drivable I proceeded home and called my insurance company on the way.

        The second deer I hit was in broad daylight at lunchtime, going about 10 mph. It wasn't injured; I had some damage to my sunroof. I went to lunch and called my insurance when I was back at the office.

        • Sentau@discuss.tchncs.de · 55 minutes ago

          It was an expressway; there were no lights other than cars. You're not wrong: had a human sprinted at 20 mph across the expressway in the dark, I'd have hit them too. That said, you're not supposed to swerve, and I had less than a second to react from when I saw it. It was getting hit no matter what; there was nothing I could've done.

          I am neither blaming you nor critiquing your actions; in fact, I agree that we should not swerve. I was just observing that driving slightly slower in low visibility might help, by giving you more time to notice an obstruction and brake, while also giving the obstruction more time to react and clear the road. At the very least, people might slow down enough that the crash is no longer fatal to the person or animal being hit.
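          The physics supports this: total stopping distance is reaction distance plus braking distance, d = v·t_r + v²/(2μg), and the braking term grows with the square of speed. A rough worked sketch with generic textbook assumptions (1.5 s reaction time, friction coefficient 0.7), not measurements of any real car:

          ```python
          # Sketch: stopping distance vs. speed under textbook assumptions.
          MU, G, REACTION_S = 0.7, 9.81, 1.5  # assumed friction, gravity, reaction time

          def stopping_distance_m(speed_mph: float) -> float:
              v = speed_mph * 0.44704  # mph -> m/s
              return v * REACTION_S + v ** 2 / (2 * MU * G)

          for mph in (35, 45, 60, 80):
              print(f"{mph} mph -> ~{stopping_distance_m(mph):.0f} m to stop")
          # ~41 m at 35 mph vs ~147 m at 80 mph: a modest slowdown buys a
          # disproportionate amount of room.
          ```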

      • dirtbiker509@lemm.ee · 2 hours ago

        Great on paper, but it's literally not okay to slow to 35 mph on the freeway… which is where most wild animals are hit at night.

          • dirtbiker509@lemm.ee · 29 minutes ago

            Have you ever hit a deer, or almost hit one, in the dark? Yes, absolutely, driving 60 mph shortens your stopping distance and buys you reaction time, but not nearly enough. Even at 35 mph people hit deer all the time, because they typically jump out right in front of you. Much above 35 mph, even a deer standing still in the middle of the road is tough to see and stop for. At 60 mph, not a chance.

    • xthexder@l.sw0.com · 5 hours ago

      Whether or not a human should stop seems beside the point. Autopilot should immediately get the driver to take back control if something unexpected happens, and stop if the driver doesn’t take over. Getting into an actual collision and just continuing to drive is absolutely the wrong behavior for a self-driving car.
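      A toy state machine for the handover behavior described above (the names and the 3-second grace period are assumptions, purely illustrative, not Tesla's actual behavior):

      ```python
      # Hypothetical sketch: demand a takeover on surprise, stop if ignored.
      from enum import Enum, auto

      class Mode(Enum):
          AUTOPILOT = auto()
          HANDOVER_REQUESTED = auto()
          EMERGENCY_STOP = auto()
          MANUAL = auto()

      TAKEOVER_DEADLINE_S = 3.0  # assumed grace period for the driver

      def step(mode: Mode, unexpected_event: bool,
               driver_took_over: bool, since_request_s: float) -> Mode:
          if mode is Mode.AUTOPILOT and unexpected_event:
              return Mode.HANDOVER_REQUESTED      # alarm: driver, take over now
          if mode is Mode.HANDOVER_REQUESTED:
              if driver_took_over:
                  return Mode.MANUAL              # driver has control again
              if since_request_s > TAKEOVER_DEADLINE_S:
                  return Mode.EMERGENCY_STOP      # no response: stop the car
          return mode

      # A collision with no driver response must end in EMERGENCY_STOP,
      # never "keep driving as if nothing happened".
      ```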

    • Madison420@lemmy.world · 5 hours ago

      You're supposed to stop and report it so someone can come and get it, so no one else hits it and ends up more squishy than intended.

      • MagicShel@lemmy.zip · 1 hour ago

        No one was going to hit it. It ran into the tall weeds (not far, I'll wager), and I couldn't have found it. Had it been in the road, I'd have called it in.

  • Kbobabob@lemmy.world · 10 hours ago

    Is there video that actually shows it "keeps going"? The way that video loops, I can't tell what happens immediately after.

      • LordKitsuna@lemmy.world · 4 hours ago

        Inb4 it actually stopped with hazards on, like I've seen in other videos. Fuck Elon, and fuck Tesla's marketing of self-driving, but I've seen people reach far for karma on Tesla hate posts, sooooooo ¯\_(ツ)_/¯

  • Sam_Bass@lemmy.world · 10 hours ago

    The deer is not blameless. Those bastards will race you just to cross in front of you.

    • WoahWoah@lemmy.world · 8 hours ago

      Finally, someone else familiar with the deadliest animal in North America.

      • Buddahriffic@lemmy.world · 7 hours ago

        I'd give the moose the top spot. Maybe not in sheer number of deaths, but I'd much rather have an encounter with a deer than a moose.

        Though for sheer numbers, I also wouldn't give that to deer; that spot goes to humans, though I admit that's a bit pedantic.

  • NutWrench@lemmy.world · 10 hours ago

    For the 1000th time, Tesla: don't call it "Autopilot" when it's nothing more than cruise control that needs constant attention.

    • LordKitsuna@lemmy.world · 4 hours ago

      Real autopilot also needs constant attention; the term comes from aviation, and it's not fully autonomous. It maintains heading and altitude and can make minor course corrections.

      It's the "Full Self-Driving" wording that deserves to be shit on.

    • GoodEye8@lemm.ee · 10 hours ago

      It is an autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla won't have to take responsibility when it inevitably breaks the law.

  • whotookkarl@lemmy.world · 9 hours ago

    It doesn't have to kill nobody to be an improvement; it just has to kill fewer people than human drivers do.

    • ano_ba_to · 5 hours ago

      That's a low bar when you consider how stringent airline safety is by comparison, and flying already kills far fewer people than driving does. If sensors can save lives, then knowingly leaving them out for profit is intentionally malicious.

    • rigatti@lemmy.world · 7 hours ago

      True in a purely logical sense, but assigning liability is a huge issue for self-driving vehicles.

      • Kecessa@sh.itjust.works · 3 hours ago

        As long as there are manual controls, the driver is responsible, since they're supposed to be ready to take over.

          • Kecessa@sh.itjust.works · 3 hours ago

            Because it's not; it's a car with assisted driving, like every car you can currently buy, and with which, surprise surprise, you are held responsible if there's an accident while it's in assisted mode.

  • bluGill@fedia.io · 12 hours ago

    Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.

    The real question isn't whether Tesla is better or worse in any particular situation, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and overall just as good as a human, I can accept it. If Tesla is overall worse, they shouldn't be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I'll accept a few edge cases where it's worse.

    Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have data for the above, but they are not talking about it.
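    The comparison regulators could publish is simple to state: crashes per million miles, over comparable roads and conditions. A sketch with made-up placeholder numbers (not real Tesla or NHTSA figures):

    ```python
    # Sketch: the apples-to-apples rate comparison the thread is asking for.
    def crashes_per_million_miles(crashes: int, miles: float) -> float:
        return crashes / (miles / 1_000_000)

    # Placeholder inputs only; real values would have to come from regulators.
    human = crashes_per_million_miles(crashes=9_000_000, miles=2.8e12)
    fsd = crashes_per_million_miles(crashes=500, miles=150e6)

    print(f"human: {human:.2f} crashes per million miles")
    print(f"fsd:   {fsd:.2f} crashes per million miles")
    # Caveat: if FSD miles skew toward easy highway driving, its rate is
    # biased downward and the comparison is misleading.
    ```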

    • atempuser23@lemmy.world · 3 hours ago

      Yes. The question is whether the Tesla is better than any driver in particular. People are given the benefit of the doubt once they pass the driver's test; companies and AI should not get that. The AI needs to be as good as or better than a GOOD human driver. There is no valid justification for allowing a poorly driving AI just because it's better than the average human. If we are going to allow these on the road, they need to be good.

      The video above is HORRID. The weather was clear, there was no opposing traffic, and the deer was standing still. The autodrive absolutely failed.

      If a human driving in these conditions plowed through a deer at 60 mph without even attempting to swerve or stop, they shouldn't be driving.

    • ano_ba_to · 5 hours ago

      Being safer than humans is a decent starting point, but safety should be maximized to the best of a machine's capability, even if that means adding a sensor or two. By that logic, leaving screws loose on a Boeing airplane would be fine too, since the plane would still be safer than driving, so Boeing shouldn't be made to take responsibility.

    • Semi-Hemi-Lemmygod@lemmy.world · 8 hours ago

      Humans are also bad drivers who get edge cases wrong all the time.

      It would be so awesome if humans only got the edge cases wrong.

      • xthexder@l.sw0.com · 4 hours ago

        I've been able to get demos of Autopilot in a friend's car, and I'll always remember it correctly stopping at a red light, followed by someone in the next lane blowing right through it several seconds later at full speed.

        Unfortunately, "better than the worst human driver" is a bar we passed a long time ago. From recent demos I'd say we're getting close to the "average driver," at least in clear visibility conditions, but I don't think even that is enough to put actually driverless cars on the road.

        There were over 9M car crashes and almost 40k deaths in the US in 2020; it would be insane to decide that's acceptable for self-driving cars too. No company wants that blood on its hands.
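        Quick arithmetic on those 2020 figures (the ~2.8 trillion vehicle-miles-traveled number is an approximate public FHWA figure, so treat the rates as rough):

        ```python
        # Sketch: turning the cited 2020 counts into rates.
        crashes, deaths = 9_000_000, 40_000  # figures cited above
        vmt = 2.8e12  # approx. US vehicle miles traveled in 2020 (assumption)

        print(f"deaths per crash:      {deaths / crashes:.4f}")    # ~0.0044
        print(f"deaths per 100M miles: {deaths / vmt * 1e8:.2f}")  # ~1.43
        # A driverless fleet gets judged against that ~1.4 deaths per 100M
        # miles baseline, and will realistically be held to a far higher bar.
        ```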

    • snooggums@lemmy.world · 12 hours ago

      Tesla claims they are better overall, but they may not be telling the truth. One would think regulators have data for the above, but they are not talking about it.

      https://www.reuters.com/business/autos-transportation/nhtsa-opens-probe-into-24-mln-tesla-vehicles-over-full-self-driving-collisions-2024-10-18/

      The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.

      It sure seems like they aren't being very forthcoming with their data, between this and being threatened with fines last year for not providing it. That makes me suspect they still aren't telling the truth.

      • atempuser23@lemmy.world · 3 hours ago

        One trick used is to disengage Autopilot when it senses an imminent crash. That vastly lowers the crash count, shifting all the blame to the human driver.
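        A sketch of why the counting window matters (the 5-second lookback below is an illustrative assumption; manufacturer and regulator tallies reportedly use lookback windows precisely so a last-second disengagement still counts):

        ```python
        # Sketch: crash attribution with and without a lookback window.
        WINDOW_S = 5.0  # assumed lookback before impact

        def counts_as_autopilot_crash(disengaged_s_before: float | None) -> bool:
            """None means the system stayed engaged through impact."""
            if disengaged_s_before is None:
                return True
            return disengaged_s_before <= WINDOW_S

        print(counts_as_autopilot_crash(None))  # True: engaged at impact
        print(counts_as_autopilot_crash(1.2))   # True with a window; a rule of
                                                # "engaged at impact only" would
                                                # have erased this crash
        ```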

      • Billiam@lemmy.world · 11 hours ago

        It sure seems like they aren't being very forthcoming with their data, between this and being threatened with fines last year for not providing it. That makes me suspect they still aren't telling the truth.

        I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn’t you be shoving that into every single selling point you have? Why wouldn’t that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla’s FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?

        • snooggums@lemmy.world · 8 hours ago

          If the Cybertruck were so safe in crashes, they would be begging third parties to test it so they could smugly lord the third-party-verified crash test data over everyone else.

          But they don't, because they know it would be a repeat of smashing the "bulletproof" window on stage.

    • AA5B@lemmy.world · 6 hours ago

      Given that they market it as "supervised," the question only has to be: are humans safer when using this tool than when not using it?

      One of the cool things I've noticed since recent updates is the car giving a nudge to help me keep centered, even when I'm not using Autopilot.