Copilot purposely stops working on code that contains GitHub's hardcoded banned words, such as gender or sex. And if you prefix transactional data as trans_, Copilot will refuse to help you. 😑
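For context, the kind of code being described would look something like this. It's a hypothetical sketch with invented names; only the trans_ prefix comes from the report, and the claim is that completions reportedly stop on code like it:

```python
# Hypothetical transaction-processing snippet. The report's claim is that the
# "trans_" prefix on these otherwise ordinary names is enough to make Copilot
# stop offering completions.
trans_id = "TXN-20240101-0001"
trans_amount = 129.99
trans_currency = "USD"

def summarize_transaction(trans_id: str, trans_amount: float, trans_currency: str) -> str:
    """Build a one-line summary of a single transaction."""
    return f"{trans_id}: {trans_amount:.2f} {trans_currency}"

print(summarize_transaction(trans_id, trans_amount, trans_currency))
```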
I wrote a slur detection script for Lemmy, and Copilot refused to work unless I removed the "common slurs" list from the file. There are definitely keywords or contexts that will shut down the service. It could even be regionally dependent.
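For reference, a script like that is roughly the following. This is a minimal sketch with placeholder terms standing in for the actual slur list, which is the part that reportedly triggered the refusal:

```python
import re

# Placeholder terms stand in for the "common slurs" list from the comment;
# the real list is what reportedly made Copilot stop responding.
BANNED_TERMS = ["badword1", "badword2", "badword3"]

# One case-insensitive pattern with word boundaries around each term.
pattern = re.compile(r"\b(" + "|".join(map(re.escape, BANNED_TERMS)) + r")\b", re.IGNORECASE)

def contains_slur(text: str) -> bool:
    """Return True if the text matches any term on the banned list."""
    return bool(pattern.search(text))

print(contains_slur("this comment contains badword2"))  # True
print(contains_slur("a perfectly fine comment"))        # False
```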
I’d expect it to censor slurs. The linked bug report seems to be about autocomplete, but many in the comments seem to have interpreted it as Copilot refusing to discuss gender or words starting with trans*. There are even people in here giving supposed examples of that. This whole thing is very confusing. I’m not sure what I’m supposed to be up in arms about.