• IWantToFuckSpez@kbin.social · 1 year ago

    I can imagine a future with gridlock in front of the police station, AV cars full of black people, whenever the cops send out an APB with the description of a black suspect.

    We’ve seen plenty of racist AI programs in the past because the programmers, intentionally or not, added their own bias into the training data.

    • lol3droflxp@kbin.social · 1 year ago

      Any dataset sourced from human activity (e.g. internet text, as in ChatGPT) will always contain the current societal bias.

    • jarfil@beehaw.org · 1 year ago

      The AIs are not racist themselves; the bias is a side effect of the full technology stack. Cameras capture less dynamic range in darker tones, images get encoded with a gamma curve that leaves fewer code values for darker areas, and so an AI that works fine on images of light-skinned faces simply doesn't get the same amount of information from images of dark-skinned faces, leading to higher uncertainty and more false positives.
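      The quantization half of that claim is easy to sketch. The toy script below (my own illustration, not from the comment; the reflectance numbers are made-up assumptions) counts how many distinct 8-bit code values survive for a small brightness ramp around a lighter versus a darker subject under the same lighting:

      ```python
      import numpy as np

      def distinct_levels(reflectance, bits=8):
          """Count distinct code values left after encoding a ±20%
          luminance ramp around a subject of given reflectance."""
          # Smooth ramp of scene luminances around the subject.
          scene = np.linspace(0.8, 1.2, 10_000) * reflectance
          # Simple gamma encoding (sRGB-like, gamma ≈ 1/2.2), then
          # quantize to the sensor/codec bit depth.
          encoded = np.clip(scene, 0.0, 1.0) ** (1 / 2.2)
          codes = np.round(encoded * (2**bits - 1)).astype(int)
          return len(np.unique(codes))

      # Hypothetical reflectances: a well-lit lighter subject vs. a
      # darker subject in the same frame with the same exposure.
      print(distinct_levels(0.60))  # lighter subject: more levels
      print(distinct_levels(0.10))  # darker subject: fewer levels
      ```

      Fewer distinct levels means less signal for any downstream face model to separate features with, before the model itself even enters the picture.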

      The bias starts with the cameras themselves: security cameras in particular should have an even higher dynamic range than the human eye, but instead they're often a cheap afterthought, and then it's anyone's guess what they've actually recorded.