• EnderMB@lemmy.world

    I don’t like to shit on people’s work, but given the amount of spam, fake accounts, and blatant abuse on their platform I do wonder what their oversight board actually does. A friend of mine had someone spoof their identity to write horrible things online, and it took two years, the police, and a letter from their MP to get Facebook to remove the profile. It’s a joke.

    • loki@lemmy.ml

      This is essentially some good PR to offset the bad PR they’re getting. The damage is already done; whatever narrative they were pushing worked. Now they get to say “We did bad things and we’re aware of it. We have an oversight board and we have a system in place. We pinky promise we will continue to improve how we handle these things in future.”

      Until the next time it happens.

      Sorry about what happened to your friend.

  • AutoTL;DR@lemmings.world (bot)

    This is the best summary I could come up with:

    Meta’s Oversight Board has criticized the company’s automated moderation tools for being too aggressive after two videos that depicted hostages, injured civilians, and possible casualties in the Israel-Hamas war were — it says — unfairly removed from Facebook and Instagram.

    In a report published on Tuesday, the external review panel determined that the posts should have remained live and that removing the content has a high cost to “freedom of expression and access to information” in the war.

    One of the removed videos, posted to Facebook, depicts an Israeli woman during the October 7th attack on Israel by Hamas, pleading with kidnappers who were taking her hostage not to kill her.

    The other video was published on Instagram and shows what appears to be the aftermath of an Israeli strike on or near al-Shifa Hospital in Gaza City.

    The board says that, in the case of the latter video, both the removal and a rejection of the user’s appeal to restore the footage were conducted by Meta’s automated moderation tools, without any human review.

    The board took up a review of the decision on an “accelerated timeline of 12 days,” and after the case was taken up, the videos were restored with a content warning screen.


    The original article contains 489 words, the summary contains 206 words. Saved 58%. I’m a bot and I’m open source!

  • cosmicrookie@lemmy.world

    Meta’s oversight board… Right…

    Is this the one that told me nearly all my reports weren’t breaking any rules? And that could reply, three minutes after I asked them to check again, that there’s nothing wrong with selling this butt plug on the marketplace, or with that person offering me a blow job for money?

    • roofuskit@lemmy.world

      That’s not the oversight board. The board reviews the actions of the moderators; it oversees. They don’t moderate anything themselves.