Interesting article, but I find it hard to believe the major AIs of today will collapse for such a reason. It gives me "Y2K collapse" vibes. I'm by no means trashing the article, just saying I'm skeptical we'll ever reach that point. The article itself already mentions that devs are aware of the feedback loop, as well as two possible ways to counteract it.
Collapse is definitely a strong word for this. The models will no doubt get worse, since this kind of training simply reinforces biases and incorrect data, but they won't suddenly collapse.