That’s my issue with people saying stuff like “I can immediately tell when a picture is made with AI and I hate how they look”
Your assessment doesn’t take into account all the false negatives. You have no idea how many pictures have already tricked you. By definition, if you can immediately tell a picture is AI, it’s a badly made one. It’s a bit like seeing the most flamboyantly gay person on the street and concluding that all gay people look like that and you can always spot them, while the closeted friend you’re with flies perfectly under the radar.
Good old toupee fallacy.
I didn’t know it had a name. Thanks!
Reminds me of all the people who believe commercials and advertising don’t work on them. Sure, that’s why billions are spent on it. Because it doesn’t do anything. Oh, it only works on all the other people?
That’s why it is so hard to get that stuff regulated. People believe it doesn’t work on them.
It also doesn’t help that they are working to improve it all the time.
Many unedited images, or ones made with older AI models, I can detect at a glance. A few more I can find by looking for inconsistencies, like hands or illogical items.
However, I’m sure there will be more AI-generated images, maybe even lightly edited afterwards, that I can’t detect.
You will need an AI to detect them, since, at least for images, AI output is detectable by the way the files are created.
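For what it’s worth, one purely file-level tell is metadata: some generators write their settings straight into PNG text chunks. Here’s a rough stdlib-only sketch that scans a PNG for `tEXt` chunks; the `parameters` key is what Stable Diffusion’s web UI is known to use, but treating it as universal would be an assumption — plenty of tools strip or never write it.

```python
import os
import struct
import tempfile
import zlib

def png_text_chunks(path):
    """Scan a PNG file for tEXt chunks, where some generators leave
    metadata (e.g. a 'parameters' entry with the prompt and seed)."""
    chunks = {}
    with open(path, "rb") as f:
        assert f.read(8) == b"\x89PNG\r\n\x1a\n", "not a PNG file"
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            length, ctype = struct.unpack(">I4s", header)
            data = f.read(length)
            f.read(4)  # skip the CRC; we only inspect chunk contents
            if ctype == b"tEXt":
                key, _, value = data.partition(b"\x00")
                chunks[key.decode("latin-1")] = value.decode("latin-1")
            if ctype == b"IEND":
                break
    return chunks

def _chunk(ctype, data):
    """Build one PNG chunk: length, type, data, CRC over type+data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

# Build a minimal PNG carrying a tEXt chunk the way some generators do,
# just to exercise the scanner (the 'parameters' key name is assumed).
png = (b"\x89PNG\r\n\x1a\n"
       + _chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
       + _chunk(b"tEXt", b"parameters\x00a photo, seed 42")
       + _chunk(b"IEND", b""))
path = os.path.join(tempfile.gettempdir(), "gen_test.png")
with open(path, "wb") as f:
    f.write(png)
print(png_text_chunks(path))  # {'parameters': 'a photo, seed 42'}
```

Of course this only catches files whose metadata was never stripped; a screenshot or re-save erases it, which is exactly why metadata alone isn’t a reliable detector.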
With AI-generated sound you can see it in the waveform: it has less random noise overall and looks like one big, well, wave. I wonder if something similar is true for images.
I heard they managed to put some noise into AI-generated audio, so it’s even more difficult to tell apart.
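The “less random noise” idea can be sketched numerically: smooth a waveform and measure how much residual small-scale energy is left over. This is a toy illustration on synthetic signals, not a real detector — the signal frequencies and window size below are arbitrary choices:

```python
import numpy as np

def noise_ratio(signal, window=16):
    """Rough proxy for how much of a waveform is small-scale noise:
    smooth with a moving average and return the energy of the residual
    relative to the total signal energy."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(signal, kernel, mode="same")
    residual = signal - smoothed
    return float(np.sum(residual ** 2) / np.sum(signal ** 2))

# A clean low-frequency tone vs. the same tone with added noise:
# the noisy version should leave more residual after smoothing.
t = np.linspace(0, 1, 8000)
clean = np.sin(2 * np.pi * 50 * t)
noisy = clean + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print(noise_ratio(clean) < noise_ratio(noisy))  # True
```

And per the comment above, a generator can always add synthetic noise back in, which would push this kind of measure toward the “real” range.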
Basically yes: lack of detail, especially in small things like hair or fingers. The texture and definition in AI images are usually lower. Though, once again, it depends on the technique being used.
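The “less texture” point can be made rough-and-ready too: the variance of a Laplacian filter response is a common sharpness/detail proxy in image processing. A toy sketch on synthetic arrays — not a classifier, since real photos and AI images overlap heavily on any single number like this:

```python
import numpy as np

def detail_score(img):
    """Variance of a 4-neighbour Laplacian response: a crude proxy for
    fine texture. Smoother, lower-detail images score lower."""
    # img is a 2-D float array; np.roll wraps at the edges, which is
    # fine for a rough score on large images.
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return float(lap.var())

rng = np.random.default_rng(1)
textured = rng.random((64, 64))                    # lots of fine detail
smooth = np.tile(np.linspace(0, 1, 64), (64, 1))   # gentle gradient
print(detail_score(textured) > detail_score(smooth))  # True
```

On real material you’d compute this on a grayscale crop and compare distributions, not single values — which is exactly why “I can always tell” doesn’t survive the borderline cases.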