• cyd@vlemmy.net · 1 year ago

    One major issue that concerns me about these regulations is whether free and open source AI projects will be left alone, or whether they’ll be forced to jump through procedural hoops that individuals, or small volunteer teams, can’t possibly deal with. I have seen contradictory statements from different parties.

    Regulations of this sort always carry the risk of entrenching big, deep-pocketed companies that can just shrug and absorb the rules, while smaller players get locked out. We have seen this happen with some previous EU tech regulations.

    In the AI space, I think the major risk is not AI helping create disinformation, invading privacy, etc. Frankly, the genie is already out of the bottle on many of these fronts. The major worry, going forward, is AI models becoming monopolized by big companies, with FOSS alternatives being kept in a permanently inferior position by lack of resources plus ill-targeted regulations.

    • barsoap@lemm.ee · 1 year ago

      The regulation is generally about the application side – things like “states, don’t build a social scoring system” or “companies, if you make a CV scanner, you better be bloody sure it doesn’t discriminate”. Part of the application side was already regulated, e.g. car autopilots; this is simply a more comprehensive list of iffy and straight-up unconscionable uses.

      Generating cat pictures with stable diffusion doesn’t even begin to fall under the regulation.

      Here’s a good tl;dr

      • cyd@vlemmy.net · 1 year ago

        Well, here’s my worry. From my understanding, the EU wants (say) foundation model builders to certify that their models meet certain criteria. That’s a nice idea in itself, but there’s a risk of the certification process being too burdensome for FOSS developers of foundation models. Worse still, would FOSS projects end up being legally liable for downstream uses of their models? Don’t forget that, unlike proprietary software, whose EULAs shift liability away from developers, FOSS places no restrictions on how end users use the software (in fact, any such restrictions generally make it non-FOSS).

        • barsoap@lemm.ee · 1 year ago

          A foundation model is not an application. It’s up to the people wanting to run AI in a high-risk scenario to make sure that the models they’re using are up to the task; if they can’t say that about some FOSS model, then they can’t use it. And, honestly, would you want some CV or college application scanner to involve DeepDanbooru?

          • cyd@vlemmy.net · 1 year ago

            The regulation doesn’t only put obligations on users. Providers (which can include FOSS developers?) would have to seek approval for AI systems that touch on certain areas (e.g. vocational training), and providers of generative AI would be required to “design the model to prevent it from generating illegal content” and to “publish summaries of copyrighted data used for training”. The devil is in the details, and I’m not so sanguine about it being FOSS-friendly.

            • barsoap@lemm.ee · 1 year ago

              OK, here’s what Parliament passed, i.e. its amendments.

              Quoth:

              5e. This Regulation shall not apply to AI components provided under free and open-source licences except to the extent they are placed on the market or put into service by a provider as part of a high-risk AI system or of an AI system that falls under Title II or IV. This exemption shall not apply to foundation models as defined in Art 3.

              Interesting – no foundation model exception, though the FLOSS community isn’t going to train any of those soon in any case.

              More broadly speaking, this is the same issue as with the Cyber Resilience Act, and they’re definitely on top of it as far as saying “we don’t want FLOSS to suffer from a misinterpretation of ‘to put on the market’”. Patience: none of this is law yet, but the very act of amending it this way tells courts not to interpret it that way.

              In case you have use for it, here’s the base version the parliament diffed against. Why aren’t they using a proper VCS in <currentyear>?