• MondayToFriday@lemmy.ca · 5 months ago

    I see at least four big problems with having drivers that sit around to supervise the AI.

    • It’s a mind-numbing boring task. How does one stay alert when most of the stimulus is gone? It’s like a real-life version of Desert Bus, the worst video game ever.
    • Human skills will deteriorate with lack of practice. Drivers won’t have an intuitive sense for how the truck behaves, and when called upon to intervene, they will probably respond late or overreact. Even worse, the AI will call on the human to intervene only for the most complex and dangerous situations. That was a major contributing factor to the crash of Air France 447: the junior pilots were so used to pushing buttons, they had no stick-handling skills for when the automation shut off, and no intuition to help them diagnose why they were losing altitude. We would like to have Captain Sullys everywhere, but AI will lead to the opposite.
    • The AI will shut off before an impending accident just to transfer the blame onto the human. The human is there to serve as the “moral crumple zone” to absolve the AI of liability. That sounds like a terrible thing for society.
    • With a fleet of inexperienced drivers, if an event such as a snowstorm deactivates AI on a lot of trucks, the chaos would be worse than it is today.
    • IphtashuFitz@lemmy.world · 5 months ago
      > The AI will shut off before an impending accident just to transfer the blame onto the human.

      I may be mistaken, but I thought a law was passed (or maybe it was just an NHTSA regulation?) stipulating that any self-driving system is at least partially to blame if it was in use within 30 seconds of an accident. I believe this was done after word got out that Tesla’s FSD was supposedly doing exactly this.

        • laurelraven@lemmy.blahaj.zone · 5 months ago

          The time limit is probably adequate, since 30 seconds is actually quite a long time on the road in terms of response: actions taken that far before an accident will not lead irrevocably to the accident.
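For a sense of scale, a quick back-of-the-envelope calculation (the speeds here are illustrative, not from the regulation):

```python
# Editor's sketch: how far a truck travels in 30 seconds at constant speed.
def distance_m(speed_kmh: float, seconds: float = 30.0) -> float:
    """Distance covered in `seconds` at a constant speed given in km/h."""
    return speed_kmh / 3.6 * seconds

# At typical road speeds, 30 seconds covers the better part of a kilometre:
print(round(distance_m(100)))  # 833 m at 100 km/h
print(round(distance_m(60)))   # 500 m at 60 km/h
```

In other words, the 30-second window reaches back hundreds of metres of travel, well before the point where a crash becomes unavoidable.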

    • FortuneMisteller@lemmy.worldOP · 5 months ago (edited)

      You assume that either the self-driving software is in charge or the button pusher takes the wheel. You did not consider that the button pusher might keep a foot on the brake but, instead of taking the wheel, enter some commands.

      Take the case where there is a roadblock ahead: the button pusher has to evaluate whether it is safe to move forward, but he wouldn’t take the wheel; he would tell the driving software where to go. In similar cases he would have to decide whether it is safe to pass around an obstacle or to stop there. Even with a burglar trying to get on board, he would call the police and then give some commands to the driving software.

      The idea behind the question is that in the future the AI, or whatever you want to call it, would always be in charge of the specialized functions, like calculating the right trajectory and turning the wheel, while the human would be in charge of checking the surrounding environment and evaluating the situation. So the AI is never supposed to be deactivated; in that case the truck would simply stop until the maintenance team arrives.
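The division of labour described above can be sketched as a small command interface. Everything here is hypothetical, invented for illustration; no real truck exposes this API:

```python
# Hypothetical sketch: the human supervisor never steers, he only issues
# high-level commands; the driving software always handles the manoeuvre.
from enum import Enum, auto

class Command(Enum):
    PROCEED = auto()        # human judged the path ahead safe
    PASS_OBSTACLE = auto()  # human approved going around a blockage
    STOP_AND_WAIT = auto()  # park until the situation is resolved
    CALL_POLICE = auto()    # e.g. an intruder trying to get on board

def supervise(command: Command) -> str:
    """Map each human decision to an automated action (illustrative only)."""
    actions = {
        Command.PROCEED: "driving software plans and follows the trajectory",
        Command.PASS_OBSTACLE: "driving software computes a safe path around",
        Command.STOP_AND_WAIT: "truck stops until the maintenance team arrives",
        Command.CALL_POLICE: "alert sent; truck keeps itself stopped and locked",
    }
    return actions[command]

print(supervise(Command.STOP_AND_WAIT))
```

The point of the design is that there is no mode where the human holds the wheel: every human decision is resolved into a manoeuvre the software executes.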

    • Abnorc@lemm.ee · 5 months ago

      Maybe you put a revenant in the truck to keep things interesting.

    • abhibeckert@lemmy.world · 5 months ago (edited)

      > It’s a mind-numbing boring task. How does one stay alert when most of the stimulus is gone? It’s like a real-life version of Desert Bus, the worst video game ever.

      Agreed. I don’t see any chance humans will be continuously supervising trucks except as some sort of quality assurance system. And there’s no reason for the driver to be in the truck for that - let them watch via a video feed so you can have multiple people supervising and give them regular breaks/etc.

      > Human skills will deteriorate with lack of practice. Drivers won’t have an intuitive sense for how the truck behaves, and when called upon to intervene, they will probably respond late or overreact. Even worse, the AI will call on the human to intervene only for the most complex and dangerous situations. That was a major contributing factor to the crash of Air France 447: the junior pilots were so used to pushing buttons, they had no stick-handling skills for when the automation shut off, and no intuition to help them diagnose why they were losing altitude. We would like to have Captain Sullys everywhere, but AI will lead to the opposite.

      I don’t see that happening at all. A passenger jet is a special case of nasty where if you slow down or stop, you die. With a truck, on the rare occasion you encounter something unexpected, just have the human go slow. Also, seriously, it’s just not that difficult: right pedal to go, left pedal to stop, steering wheel to turn, and if you screw up, well, maybe you’ll damage some panels.

      > The AI will shut off before an impending accident just to transfer the blame onto the human. The human is there to serve as the “moral crumple zone” to absolve the AI of liability. That sounds like a terrible thing for society.

      So you’re thinking a truck sees that it’s about to run a red light and transfers control to a human who wasn’t paying attention? Yeah, I don’t see that happening. The truck will just slam on the brakes, and it will do so with a faster reaction time than any human driver.
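A rough sketch of why reaction time dominates here. The numbers are generic textbook values (1.5 s for an alert human, 0.1 s assumed for an automated system, 7 m/s² deceleration on dry road), not measurements of any real truck:

```python
# Editor's illustration: stopping distance = reaction distance + braking
# distance (v^2 / 2a). Figures below are rough assumptions, not data.
def stopping_distance_m(speed_kmh: float, reaction_s: float,
                        decel_ms2: float = 7.0) -> float:
    """Total distance to stop from speed_kmh with a given reaction time."""
    v = speed_kmh / 3.6                      # convert to m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

human = stopping_distance_m(90, reaction_s=1.5)    # ~82 m
machine = stopping_distance_m(90, reaction_s=0.1)  # ~47 m
print(round(human), round(machine))
```

At 90 km/h the assumed machine stops roughly 35 m shorter purely from reacting sooner; the braking physics is identical in both cases.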

      > With a fleet of inexperienced drivers, if an event such as a snowstorm deactivates AI on a lot of trucks, the chaos would be worse than it is today.

      Hard disagree. A snowstorm is a lot less problematic when there’s no human in the truck who needs to get home somehow. An AI truck will just park until the road is safe. If that means two days stuck in the breakdown lane of a freeway, who cares.