In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it: the electric vehicle plowed right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • Korhaka
    1 day ago

    The next test I would love to see is the minimum amount of false road needed to fool it.

    • Fermion@feddit.nl
      24 hours ago

      Have you ever seen examples of how the features that AI picks out to identify objects aren’t really the same as what we pick out? You can generate images that look unrecognizable to people but have clearly identifiable features to AI. It would be interesting to see someone play around with that concept to find interesting ways to fool Tesla’s AI. Like, could you make a banner that looks like a barricade to people, but that the cars think looks like open road?

      This isn’t a great example for this concept, but it is a great video. https://youtu.be/FMRi6pNAoag?t=5m58s

      • Korhaka
        23 hours ago

        I was thinking of something where the AI thinks the road turns left but humans see that it turns right.
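The adversarial-image idea discussed in the thread (inputs that look like noise to humans but carry strong features for a model) is usually demonstrated with the Fast Gradient Sign Method (FGSM). Below is a minimal sketch of that method on a toy logistic classifier; the weights, the `eps` value, and the whole setup are illustrative assumptions for demonstration, not anything from Tesla's actual vision stack:

```python
# Toy FGSM sketch: perturb an input in the direction that increases the
# classifier's loss, using only the SIGN of the gradient so every feature
# moves by the same small amount eps. Purely illustrative.
import numpy as np

def fgsm_perturb(x, w, b, y_true, eps):
    """One FGSM step against a logistic classifier sigmoid(w.x + b)."""
    z = w @ x + b
    p = 1.0 / (1.0 + np.exp(-z))      # predicted probability of class 1
    grad_x = (p - y_true) * w         # d(cross-entropy loss)/dx
    return x + eps * np.sign(grad_x)  # bounded nudge per feature

rng = np.random.default_rng(0)
w = rng.normal(size=8)                # toy classifier weights
b = 0.0
x = rng.normal(size=8)                # toy "image" (8 features)
y = 1.0                               # true label

before = 1.0 / (1.0 + np.exp(-(w @ x + b)))
x_adv = fgsm_perturb(x, w, b, y, eps=0.5)
after = 1.0 / (1.0 + np.exp(-(w @ x_adv + b)))
print(before, after)  # the perturbed input scores lower on the true class
```

Taking the sign of the gradient is what makes the change hard to notice: each feature shifts by at most `eps`, yet the shifts all push the model's score in the attacker's chosen direction, which is the same mechanism behind images that fool a vision model while looking unchanged to people.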