tldr- Use the robots meta element or the equivalent HTTP header to declare that a page's content should not be used for machine learning, in case some actors make their search user agent indistinguishable from their machine-learning crawlers.
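As a rough sketch of what that opt-out could look like, here is a minimal Python server that emits the directive both as a robots meta element and as the X-Robots-Tag header (the standard HTTP-header counterpart of the meta element). The directive name "noml" is a placeholder assumption for illustration; the actual proposal may use a different token.

```python
# Minimal sketch, assuming a hypothetical "noml" directive.
# Serves one page that opts out of machine-learning use via both
# the robots meta element and the X-Robots-Tag HTTP header.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<!DOCTYPE html>
<html>
<head>
  <!-- "noml" is a hypothetical directive name, not a standard value -->
  <meta name="robots" content="noml">
  <title>Example</title>
</head>
<body><p>This content is not for ML training.</p></body>
</html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        # X-Robots-Tag carries robots directives for non-HTML resources
        # too; the "noml" value is the assumption here.
        self.send_header("X-Robots-Tag", "noml")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

Either mechanism alone would work for HTML pages; the header form also covers images, PDFs, and other resources that cannot carry a meta element.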
But I think their point was: if they didn't have to provide this detail, why would others "have to"?
This is a fair point, but I think this will become a new standard for AI. GPT-4 was possible because there were no regulations, but it won't be the same for GPT-5 or 6.
So it's more about future-proofing than backtracking.
What evidence is there of that? What regulations have been added?
I'm referring to this post. It's not an official regulation.