• TheBeege@lemmy.world
    1 year ago

    I mean, if it wasn’t their AI tools they were outsourcing thinking to, it was the “smartest” person in their social circle. The question is whether AI is smarter than that person for the average social circle.

    But yeah, fake info is terrifying

    • Skelectus@suppo.fi

      Outsourcing in more ways than just what another person will do for you. I think the Google Maps example shows exactly what’s going to come true when AI lets you easily skip any task that requires thinking, such as writing your homework.

    • tchotchony@mander.xyz

      I recently visited an Australian dog breed info site. Most of the information there looked alright and very detailed. Pictures, history of the dog breed, physical traits, personality, uses, random tidbits, … It all looked well put together and reliable.

      Until I accidentally stumbled on the Khala dog page. You can read it here

      It’s a Pakistani hairless dog; that and the picture are about the only correct things on there.

      The history is basically the entire storyline of Starcraft, which might or might not have Khala dogs. Further down, the text also refers to it as a horse and as a South American bird of prey. I have no clue where it got either of those from; I did a very extensive search on South American birds of prey and found no Khala. And it randomly switches back to dogs in between.

      So now I can’t trust anything on that site without verifying it through other means. After double-checking other breeds, I found more minor mistakes too, which I would’ve accepted without question if I hadn’t stumbled across this gem.

      So yeah, I find using AI for science/information purposes without having it double-checked by actual human experts quite scary. This is just a dog breed site without much consequence, but I’m sure similar things happen (and slip through the cracks) with actual news too.