• ErmahghrrdDavid@lemmy.ml · 3 years ago

      Not sure if you’re joking or serious, but many “modern” natural language processing models like BERT are trained on offline dumps of large text collections such as Wikipedia - nothing unusual there and nothing to worry about :)
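
      For a rough idea of what “training on an offline dump” means in practice, here is a minimal Python sketch that parses a tiny, made-up snippet in the MediaWiki XML export format (the format real Wikipedia dumps use) and pulls out article text. This is purely illustrative: real dumps are multi-gigabyte files with XML namespaces and wiki markup, usually processed with dedicated tooling rather than code like this.

      ```python
      import xml.etree.ElementTree as ET

      # Hypothetical miniature of a Wikipedia dump. Real dumps follow the
      # same <page>/<revision>/<text> structure, but at massive scale and
      # with XML namespaces that plain tag lookups would need to handle.
      SAMPLE_DUMP = """<mediawiki>
        <page>
          <title>BERT (language model)</title>
          <revision><text>BERT is a transformer-based language model.</text></revision>
        </page>
        <page>
          <title>Natural language processing</title>
          <revision><text>NLP is a subfield of computer science.</text></revision>
        </page>
      </mediawiki>"""

      def extract_articles(xml_text):
          """Yield (title, body) pairs from MediaWiki-style export XML."""
          root = ET.fromstring(xml_text)
          for page in root.iter("page"):
              title = page.findtext("title")
              body = page.findtext("revision/text") or ""
              yield title, body

      articles = list(extract_articles(SAMPLE_DUMP))
      # Each (title, body) pair would then be cleaned and tokenized
      # before being fed into pre-training.
      ```

      The extracted text is what a model like BERT sees during pre-training - static, already-public text, not anything scraped live from users.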