Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • intensely_human@lemm.ee
10 months ago

    In India they’ve been used to determine whether people should be kept on or kicked off of programs like food assistance.

    • rottingleaf@lemmy.zip
10 months ago

      Well, humans are similar to pigs in the sense that they’ll always find the stinkiest pile of junk in the area and taste it before any alternative.

EDIT: That’s about the popularity of “AI” today, not about semantic expert systems like the ones people built on Lisp machines.