“…the technology’s shortcomings in managing gender neutrality and inclusivity…”
Yes, this is definitely the most important concern we need to have about AI, absolutely nothing else.
People can be concerned about more than one thing.
And “gender neutrality” of AI should not make the list.
So, what? You think women need their own LLMs or something?
You go ahead and get started on that, the rest of us can work on making the existing ones less sexist.
Computers do not have the sentience required to be sexist.
They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.
deleted by creator
Interesting then that you chose to describe the LLM as sexist and not the programmers, regardless of the fact that you know nothing about them.
Programmers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.
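To make that concrete: here's a toy sketch (with made-up data) of the mechanism being described. It's nothing like how a real LLM is trained, but it shows how a model that just learns co-occurrence statistics from a skewed corpus will reproduce the skew, with no sexist intent anywhere in the code.

```python
# Toy illustration: a "model" that learns word-pronoun co-occurrence
# frequencies from a (hypothetical, deliberately skewed) corpus.
# The bias in its predictions comes entirely from the data.
from collections import Counter

# Hypothetical training pairs, skewed the way scraped text often is.
corpus = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
    ("engineer", "he"), ("engineer", "he"), ("engineer", "he"),
]

counts: dict[str, Counter] = {}
for word, pronoun in corpus:
    counts.setdefault(word, Counter())[pronoun] += 1

def predict_pronoun(word: str) -> str:
    """Return the pronoun most often seen with the word in training."""
    return counts[word].most_common(1)[0][0]

print(predict_pronoun("engineer"))  # "he"  -- learned purely from the skew
print(predict_pronoun("nurse"))     # "she"
```

Nobody wrote `if occupation == "nurse": use "she"` anywhere; the skewed predictions fall out of the frequencies alone, which is why fixing the data is the hard part.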
This is a nothing argument.
They’re nuts. Easy block, IMO.
Why.
Doesn’t it make sense to address these issues now rather than waiting for them to fester?
Forgive me for not putting incredible weight behind the “issue” of an LLM gendering inanimate objects incorrectly. Seems like an infinitely larger issue in the language itself than in the LLM.
“inanimate objects”? Where are you getting that from? The article doesn’t state explicitly what the test sentences were, but I highly doubt that LLMs have trouble grammatically gendering inanimate objects correctly, since their gender usually doesn’t vary depending on anything other than the base noun used. I’m pretty sure this is about gendering people.
It’s definitely important.