• kibiz0r@lemmy.world

    We have a mechanism for people to make their work publicly visible while reserving certain rights for themselves.

    Are you saying that creators cannot (or ought not to be able to) reserve the right to ML training for themselves? What if they want to selectively grant that right to FOSS projects or non-profits?

    • BURN@lemmy.world

      That’s exactly what they’re saying. The AI proponents believe that copyright shouldn’t be respected and that they should be able to ignore any licensing because “it’s hard to find data otherwise.”

    • Grimy@lemmy.world

      Essentially yes. There isn’t a happy solution where FOSS gets the best images and remains competitive. The amount of data needed far exceeds what can be donated, so any open source effort will be so low in quality as to be unusable.

      It also won’t be up to them. The platforms where the images are posted will be doing the selling and brokering. No individual artist is getting a call unless they’re a household name.

      None of the artists are getting paid either way, so yeah, I’m thinking of society in general first.

      • kibiz0r@lemmy.world

        The artists (and the people who want to see them continue to have a livelihood, a distinct voice, and a healthy, engaged fanbase) live in that society.

        The platforms where the images are posted will be doing the selling and brokering

        Isn’t this exactly the problem though?

        From books to radio to TV, movies, and the internet, there’s always:

        • One group of people who create valuable works
        • Another group of people who monopolize distribution of those works

        The distributors hijack ownership (or de facto ownership) of the work through one means or another (logistical superiority, financing requirements, or IP law fuckery) and exploit their position to make themselves the only channel through which creators can reach their audience and vice versa.

        That’s the precise pattern that OpenAI is following, and they’re doing it at a massive scale.

        It’s not new. YouTube, Reddit, Facebook, MySpace: all of these companies started with a public pitch about democratizing access to content. But a private pitch emerged: become the main way that people access content. When it became feasible for them to turn against their users and liquidate them, they did.

        The difference is that they all had to wait for users to add the content over time. Imagine if Google knew they could’ve just seeded Google Video with every movie, episode, and clip ever aired or uploaded anywhere. Just say, “Mon Dieu! It’s impossible for us to run our service without including copyrighted materials! Woe is us!” and all is forgiven.

        But honestly, whichever way the courts decide, the legality of it doesn’t matter to me. It’s clearly a “Whose Line Is It Anyway?” situation where the rules are made up and ownership doesn’t matter. So I’m looking at “Does this consolidate power, or distribute it?” And OpenAI is pulling off perhaps the biggest power grab that we’ve seen.

        Unrelated: I love that there’s a very distinct echo of something we saw with the previous era of tech grift, crypto. The grifters would always say, after they were confronted, “Well, there’s no way to undo it now! It’s on the blockchain!” There’s always this back-up argument of “it’s inevitable, so you might as well let me do it.”