• RupeThereItIs@kbin.social · 2 years ago

    It’s a MASSIVE security risk. What you tell ChatGPT is not private: if you knowingly or unknowingly give ChatGPT secret information, you have no control over where that information may go. That matters especially for a company like Apple, which lives and breathes surprise product releases.

    • MentalEdge · 2 years ago (edited)

      This is true, but queries don’t necessarily have to become training data, so what you tell it could absolutely be kept secret, provided the necessary agreements and changes were made. Nothing about an LLM prevents it from forgetting things you tell it at query time. What it can’t forget, short of re-training from the ground up with that piece of information omitted, is anything you told it in the training data.

      But queries do not suffer this limitation.
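
      To make that concrete, here’s a minimal sketch in Python (the `llm_query` function is a hypothetical stand-in for any chat-style API, not a real library call): the model’s only “memory” of a conversation is the message list the client sends along with each request, so discarding that list is all it takes to make it “forget” a query. Knowledge baked into the weights during training has no such off switch.

      ```python
      # Hypothetical stand-in for a chat-style LLM API call.
      # Key point: the model is stateless between calls and sees
      # only what arrives in `messages` with this one request.
      def llm_query(messages: list[dict]) -> str:
          # A real implementation would send `messages` to an
          # inference endpoint; the weights are never modified.
          return f"(reply based on {len(messages)} message(s) of context)"

      # The "secret" lives only in this client-side list.
      history = [{"role": "user", "content": "Our unannounced product is called Project X."}]
      history.append({"role": "assistant", "content": llm_query(history)})

      # Discard the context, and the next query starts blank:
      history.clear()
      print(llm_query([{"role": "user", "content": "What is our unannounced product?"}]))
      # Nothing was written into the model's weights, so there is
      # nothing left for it to reveal.
      ```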