Google’s AI-generated search results include justifications for slavery and cooking tips for Amanita ocreata, a poisonous mushroom known as the “angel of death.” The results are part of Google’s AI-powered Search Generative Experience, or SGE.
you would think that a “language model” would have “connotation” high on its list of priorities, given that it’s a huge part of the form and function of language.
I’m convinced its only purpose is actually to give tech C-levels and VPs some bullshit to say for roughly 18-36 months, now that “blockchain” and “pandemic disruption” are dead.
Not how it works.
It’s just a fancy version of that “predict the next word” feature smartphones have. Like if you just kept tapping the next word.
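To make the “keep tapping the suggested word” analogy concrete, here’s a toy sketch. It builds a bigram table from a tiny made-up corpus and greedily picks the most frequent next word over and over. Real LLMs use neural networks trained on vastly more text, not a literal lookup table, but the autoregressive “predict the next word, append it, repeat” loop is the same idea.

```python
from collections import Counter, defaultdict

# Toy corpus, invented purely for illustration.
corpus = (
    "the model predicts the next word and the next word predicts "
    "the next word after that"
).split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def keep_tapping(word, steps=8):
    """Greedily 'tap' the most likely suggestion, over and over."""
    out = [word]
    for _ in range(steps):
        if word not in bigrams:
            break
        word = bigrams[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(keep_tapping("the"))
# prints a repetitive chain like "the next word and the next word and the"
```

Nothing in that loop knows what “word” or “next” means; it only knows which token tends to follow which.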
They don’t even have interpretable parameters, just billions of black-box hidden weights nobody can point to and explain.
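For what it’s worth, those parameters do exist as tensors of floats; they just aren’t labeled with anything human-readable. A minimal PyTorch sketch (architecture and sizes made up for illustration) of what they actually look like:

```python
import torch.nn as nn

# A tiny stand-in model; none of its parameters carries a label like
# "sarcasm" or "connotation" -- they are only meaningful in aggregate,
# which is the "black box" part.
tiny = nn.Sequential(
    nn.Embedding(1000, 64),   # 1000 made-up tokens, 64-dim vectors
    nn.Linear(64, 64),
    nn.Linear(64, 1000),      # scores for the next token
)

for name, p in tiny.named_parameters():
    print(name, tuple(p.shape))
# 0.weight (1000, 64)
# 1.weight (64, 64)
# 1.bias (64,)
# 2.weight (1000, 64)
# 2.bias (1000,)
```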
I know, I was pointing out the irony.
LLMs don’t know what words mean, much less the connotations they have.