Archive link: https://archive.ph/GtA4Q
The complete destruction of Google Search via forced AI adoption and the carnage it is wreaking on the internet is deeply depressing, but there are bright spots. For example, as the prophecy foretold, we are learning exactly what Google is paying Reddit $60 million annually for. And that is to confidently serve its customers ideas like, to make cheese stick on a pizza, “you can also add about 1/8 cup of non-toxic glue” to pizza sauce, which comes directly from the mind of a Reddit user who calls themselves “Fucksmith” and posted about putting glue on pizza 11 years ago.
A joke that people made when Google and Reddit announced their data sharing agreement was that Google’s AI would become dumber and/or “poisoned” by scraping various Reddit shitposts and would eventually regurgitate them to the internet. (This is the same joke people made about AI scraping Tumblr). Giving people the verbatim wisdom of Fucksmith as a legitimate answer to a basic cooking question shows that Google’s AI is actually being poisoned by random shit people say on the internet.
Because Google is one of the largest companies on Earth and operates with near impunity and because its stock continues to skyrocket behind the exciting news that AI will continue to be shoved into every aspect of all of its products until morale improves, it is looking like the user experience for the foreseeable future will be one where searches are random mishmashes of Reddit shitposts, actual information, and hallucinations. Sundar Pichai will continue to use his own product and say “this is good.”
So, basically shitposting poisons AI training. Good to know 👍
The fun part is that the post that caused Google to suggest adding glue to pizza was a genuine one about how advertisers create the cheese-stretching effect for pizza advertisements.
So it wasn’t even a shitpost; the AI training just missed some important context from the post.
Ohhhh that makes it soo much better.
Cause if it was a joke post, the solution would be to label those.
But this reveals a very important issue with LLMs: they can be technically right but still contextually wrong, and they wouldn’t know it.
And that’s not even “hallucination”
I sincerely hope that shitposting saves us from the hell big Corpo has made of the world
As a mod of Lemmy Shitpost, you’re welcome.
That guy teleported back in time to try and get the 69th upvote and still managed to miss 3 times; hope he gets it the 4th time
Wanted to like, but 69 likes at this time
Wanted to like, but 69 likes at this time
Edit: oh hey, this posted 3 times lol that’s a new one. Sorry for the spam there
Wanted to like, but 69 likes at this time