• drwankingstein@lemmy.dbzer0.com · 3 months ago

      ehh… not really, the amount of generated data you can get by snooping on LLM traffic is going to far outweigh the costs of running LLMs

      • Leaflet@lemmy.worldOP · edited · 3 months ago

        There’s nothing technical stopping Google from sending the prompt text (and maybe the generated results) back to their servers; the only deterrent is political/social backlash over worsened privacy.

      • elucubra · 3 months ago

        I doubt that. I’m going to guess that Google is going towards a sort of “P2P AI”.