Dropbox removed ability to opt your files out of AI training

  • JonEFive@midwest.social · 1 year ago
    It really depends on what the AI training is looking for. You can potentially poison an AI training model, but you'd likely have to add enough data for it to be statistically relevant.

    • reksas · 1 year ago
      enough data, as in many different people would each have to upload a file or two containing such data, or as in one person would have to upload a very large file with a lot of problematic data?

      • JonEFive@midwest.social · 1 year ago
        It’s honestly difficult for me to say because there are so many different ways to train AI. It really depends on what the trainers configure to be a data point. The number of files vs. the size of a single file isn’t as important as what the AI treats as a data point and how the data points are weighted.

        Just as a simple example, a data point may be considered a row on a spreadsheet without regard for how that data was split up across files. So ten files with 5 rows each might have the same weight as one file with 50 rows. But there’s also a penalty concept in some models, so the trainer can set it so that data that all comes from one file may be penalized. Or the opposite could be true if data coming from the same file is deemed to be more important in some way.
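The row-vs-file idea above can be sketched in a few lines of Python. This is purely illustrative (no real trainer exposes exactly this interface): `collect_points` and `per_file_penalty` are made-up names, and the weighting formula is an assumption chosen just to show how a penalty could down-weight rows that all come from one big file.

```python
# Hypothetical sketch: rows, not files, are the data points, with an optional
# per-file penalty that down-weights rows coming from the same large file.
# Not any real trainer's API; names and formula are illustrative assumptions.

def collect_points(files, per_file_penalty=0.0):
    """Flatten rows from many files into (row, weight) pairs.

    With per_file_penalty == 0, every row weighs 1.0 no matter which file
    it came from; with a penalty > 0, rows from bigger files count for less.
    """
    points = []
    for rows in files:
        weight = 1.0 / (1.0 + per_file_penalty * (len(rows) - 1))
        points.extend((row, weight) for row in rows)
    return points

# Ten files with 5 rows each vs. one file with 50 rows:
many_small = [[f"file{i}_row{j}" for j in range(5)] for i in range(10)]
one_big = [[f"big_row{j}" for j in range(50)]]

no_penalty_small = sum(w for _, w in collect_points(many_small))  # 50.0
no_penalty_big = sum(w for _, w in collect_points(one_big))       # 50.0
with_penalty_big = sum(w for _, w in collect_points(one_big, 0.1))  # well under 50
```

With no penalty the two layouts contribute identical total weight, matching the "ten files with 5 rows vs. one file with 50 rows" point; turning the penalty on makes the single big file's rows collectively count for much less.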

        In terms of how AIs make their decisions, that can also vary. But generally speaking, if 1000 pieces of data are used that are all similar in some way and one of them is somewhat different from the others, it is less likely that that one-off data will be used. It’s much more likely to have an effect if 100 of the 1000 pieces of data have that same information. There’s always the possibility of using that 1/1000 data; it’s just less likely to have a noticeable effect.
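The 1-in-1000 vs. 100-in-1000 point can be made concrete with a toy frequency model. This is not how real training works; it is just a counting sketch showing why a single outlier barely registers while a reinforced outlier becomes a real, if still minority, signal.

```python
from collections import Counter

# Toy sketch (not a real training algorithm): the "model" answers with
# whichever example it saw most often, and an example's influence is just
# its share of the data.

def most_reinforced(examples):
    # The most frequently seen answer wins.
    return Counter(examples).most_common(1)[0][0]

def share(examples, answer):
    # How often a given answer would be drawn at random.
    return examples.count(answer) / len(examples)

one_off = ["common"] * 999 + ["poisoned"]           # 1 of 1000
reinforced = ["common"] * 900 + ["poisoned"] * 100  # 100 of 1000

print(most_reinforced(one_off))      # common
print(share(one_off, "poisoned"))    # 0.001
print(share(reinforced, "poisoned")) # 0.1
```

In both cases the common answer still wins, but the poisoned answer's chance of surfacing jumps from 0.1% to 10% once it is reinforced 100 times over, which is the "noticeable effect" threshold the comment describes.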

        AIs build confidence in responses based on how much a concept is reinforced, so you’d have to know something about the training algorithm to be able to intentionally impact the results.

        • reksas · 1 year ago

          thank you, this was the kind of information i was hoping for