They frame it as though it’s for user content, and more likely it’s to train AI, but in fact it gives them the right to do almost anything they want, up to (but not including) stealing the content outright.

  • mods_are_assholes@lemmy.world
    10 months ago

    Most of those laws are unenforceable, and some are even undetectable.

    Your ideology is getting in the way of objective fact.

      • nintendiator@feddit.cl
        10 months ago

        Are you kidding? #3 is the second most feasible one of that set; it’s just a matter of setting up Reproducible / Deterministic Builds.

        If you can’t replicate a result with control of the software version + the art inputs + the randomness seed, then “something else is going on”.
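
        A minimal sketch of that check, assuming a Python pipeline with a hypothetical generate() standing in for the real model (not any specific tool’s API): rerun the same software version + inputs + seed and compare output digests.

        ```python
        import hashlib
        import random

        def generate(prompt: str, seed: int) -> bytes:
            # Hypothetical stand-in for the real model: all randomness comes
            # from the seeded RNG, so identical inputs give identical bytes.
            rng = random.Random(seed)
            noise = bytes(rng.randrange(256) for _ in range(1024))
            return hashlib.sha256(prompt.encode() + noise).digest()

        def reproducible(prompt: str, seed: int) -> bool:
            # Re-run with the same software version + inputs + seed and compare
            # digests. A mismatch means "something else is going on"
            # (hidden inputs, nondeterministic kernels, a different build).
            first = hashlib.sha256(generate(prompt, seed)).hexdigest()
            second = hashlib.sha256(generate(prompt, seed)).hexdigest()
            return first == second

        if __name__ == "__main__":
            print(reproducible("a cat in a hat", seed=42))  # True when fully deterministic
        ```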

      • mods_are_assholes@lemmy.world
        10 months ago

        The only way to make a clear-text LLM is to devote most of the hard storage that humanity produces over the next ten years to it, and we’d need about 1/4 the processing power of Bitcoin mining to have it run at ChatGPT speeds.

        That said, black-box self-modifying AIs will be the models that win the usefulness wars, and if one country outlaws them, the only result is that it will have no defense against countries that don’t feel the need to comply.