Source: https://front-end.social/@fox/110846484782705013

Text in the screenshot from Grammarly says:

We develop data sets to train our algorithms so that we can improve the services we provide to customers like you. We have devoted significant time and resources to developing methods to ensure that these data sets are anonymized and de-identified.

To develop these data sets, we sample snippets of text at random, disassociate them from a user’s account, and then use a variety of different methods to strip the text of identifying information (such as identifiers, contact details, addresses, etc.). Only then do we use the snippets to train our algorithms, and the original text is deleted. In other words, we don’t store any text in a manner that can be associated with your account or used to identify you or anyone else.

We currently offer a feature that permits customers to opt out of this use for Grammarly Business teams of 500 users or more. Please let me know if you might be interested in a license of this size, and I’ll forward your request to the corresponding team.

  • monobot@lemmy.ml
    1 year ago

So… is Grammarly a problem when it helps you write an email or a document you will send via Gmail or publish online?

    Or do you use Grammarly for a private diary?

    • gps@lemmy.world
      1 year ago

      Most people consider email private. Plus a lot of people use Grammarly for work documents.

      • Gimly@lemmy.world
        1 year ago

        The same people will use Gmail without batting an eye and we all know what Google has been doing for years with emails in Gmail.

        It’s quite funny that now that there are “AI” tools everyone can use, people get all worried about their data being used for training, but nobody cared before when it was Google or Amazon using that same data to train their models.