mesamune@lemmy.world to Technology@lemmy.world · 9 months ago
The AI feedback loop: Researchers warn of ‘model collapse’ as AI trains on AI-generated content (venturebeat.com)
Cross-posted to: futurology@lemmy.world, artificial_intel@lemmy.ml, technology@lemmy.ml, tech@pawb.social, technology@beehaw.org
danielbln@lemmy.world · 9 months ago
Microsoft’s Phi model was largely trained on synthetic data derived from GPT-4.
gapbetweenus@feddit.de · edited 9 months ago
I’m too lazy to search for the paper, and I’m not sure it was Microsoft, but with my rather basic knowledge of modeling (I studied systems biology) it seemed rather crazy and impossible, which is why I remembered it.
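(Editor’s note: the “model collapse” effect the article warns about can be seen in a toy simulation that is not from the linked article and says nothing about how Phi was actually trained. The sketch below repeatedly refits a categorical distribution to samples drawn from the previous generation’s fit; all numbers are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: 10 tokens, some of them rare (the tail of the distribution).
vocab = np.arange(10)
probs = np.array([0.3, 0.2, 0.15, 0.1, 0.08, 0.07, 0.05, 0.03, 0.015, 0.005])

for generation in range(30):
    # Generate a finite synthetic corpus from the current "model" ...
    sample = rng.choice(vocab, size=200, p=probs)
    counts = np.bincount(sample, minlength=len(vocab))
    # ... then "retrain" the next generation on that synthetic corpus only.
    probs = counts / counts.sum()
    print(f"gen {generation:2d}: distinct tokens remaining = {np.count_nonzero(counts)}")
```

Once a rare token fails to appear in one generation’s sample, its probability becomes zero and it never comes back, so the number of distinct tokens only shrinks over generations. That loss of low-probability “tail” content is a simple analogue of the degradation the model-collapse research warns about.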