" ai generation be text or image, is that they are the most privacy invasive thing there could be."
ai being used by advertisers and social media is far more of an invasion of privacy, and has been for a decade.
the focus on LLMs and art generators is silly to me.
habit prediction and manipulation are a far greater risk. the socio-economic system is also a bigger reason to worry about any technological improvement than the technology itself.
the problem is people giving out private information as if it were safe, not the LLMs themselves. that’s been an issue for ages, and nothing has been done to fix it, which is why it’s still an issue. LLMs or not, we should be teaching about these problems in school, along with critical thinking and media bias/manipulation studies. everyone would benefit from a more statistical understanding of information.
“so much noise that we cannot do anything from our own will is what will happen for non regulated brain interface after enough people have adopted it.”
this is already the social paradigm with propaganda and “media” encouraged by those with the money to do so. they don’t need to literally see inside your head and change things when they already know you’re quitting a brand or in an unsuccessful relationship before you do.
BCI isn’t a worry quite yet.