ryujin470@fedia.io to Technology@beehaw.org · 3 days ago
Researchers show that training on “junk data” can lead to LLM “brain rot” (arstechnica.com)
jarfil@beehaw.org · edited 3 days ago

> Using a complex GPT-4o prompt, they sought to pull out tweets that focused on “superficial topics”

Wait a moment… They asked an LLM to tell them what was “junk”, and another LLM, trained on what the first LLM marked as junk, turned out to be a junk LLM?

It talks about model collapse, but this smells like research collapse.