The policy changes come after an NBC News investigation last month into child safety on the platform.

    • okawari
      1 year ago
      Sadly, I’m sure any social platform where one can make their own private community (actually private or perceived to be private) will have more of these than most of us think. It’s just that we don’t see them.

      I’m also not surprised that services like Discord are seemingly relaxed about moderating them, as it’s a problem that is invisible to most users. Moderating is expensive, and unless it hurts public opinion, it’s seemingly not worth it to them.

      • Itty53
        1 year ago

        This is the unfortunate and altogether wrong (morally wrong, that is, not incorrect) truth of the corporate structure. If you can’t point to an ROI, they don’t want to spend money on it. The tech world has gone to great collective lengths to push the value of prevention and best practices even when they take a bit of time and money up front, and I really wish they’d take up that same passion for this kind of shit.

        I also really wish the public would treat these companies as the conspirators they are too, but I know they won’t. They never have.

        The gross truth is there’s a not-inconsequential number of pedophiles in the tech world compared to other industries. This is the kind of material you’re not likely to stumble into online unless you’re well versed in the technology we all collectively call “the internet”. Speaking personally, I’ve encountered it myself multiple times: technologically literate people who think they can evade detection or, even worse, relate to others in that industry. Might be an unpopular opinion, but I’m in that industry too; I’m not talking about just your industry, reader.

    • Infiltrated_ad8271
      1 year ago

      I guess it’s because money is their only ethic, it’s legal, and until now it gave them more benefits than problems.

      • Yup. Before Discord it was Skype. Before Skype it was AIM. It’s not a problem with Discord specifically. It’s that predators will always seek out prey. When kids flock to a service like Discord it means predators do too.

    • @Thorny_Thicket
      1 year ago

      That’s more of a philosophical question, and I’m curious to hear why you think that way.

      It’s disturbing, sure, but so is scat porn. As long as no one is being harmed or forced to do something against their will, I don’t really see the problem.

      If watching AI-generated stuff is enough to take the edge off so that one can resist the urge to harm actual people, then by all means.

      • TwilightVulpine
        1 year ago

        You are forgetting the little detail that an AI model’s output is based on what has been put into it. If a model can output something like that, it’s likely because real CSAM has been fed into it. It’s not sprouting from the aether.

        • @Thorny_Thicket
          1 year ago

          Fair point.

          I doubt such content is in the training data, but if it is, that does indeed make it a more difficult issue.

  • Reclipse
    1 year ago

    What does this mean? They were allowed before?? WTF!

    • CybranM
      1 year ago

      More likely they just weren’t spotted. Think of it like cockroaches: you don’t “allow” cockroaches to live in your house, but they very well might until you notice and exterminate them.