There’s a growing problem on the horizon that could undermine all of AI’s achievements—a phenomenon known as “model collapse.”
Model collapse is what happens when AI models are trained on data that includes content generated by earlier versions of themselves.
Over time, this recursive loop pushes the model further from the original data distribution, eroding its ability to represent the world as it actually is. Instead of improving with each generation, the model accumulates compounding errors, and its outputs grow increasingly distorted and unreliable.
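The dynamic can be illustrated with a toy simulation. In the sketch below (an illustrative assumption, not any specific model's training procedure), each "generation" fits a Gaussian to the previous generation's synthetic output; the systematic approximation error is modeled as the fitted model under-representing rare events, here by discarding samples beyond two standard deviations before refitting. Watch the standard deviation shrink generation after generation as the tails of the distribution disappear:

```python
import random
import statistics

random.seed(0)

def train_and_generate(data, n):
    """Fit a toy 'model' (a Gaussian) to data, then sample n synthetic points."""
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)]

# Generation 0: real data drawn from a standard normal (std = 1.0).
data = [random.gauss(0.0, 1.0) for _ in range(10_000)]

stds = []
for gen in range(10):
    # Hypothetical approximation error: the model under-represents rare
    # events, modeled here by dropping samples beyond 2 standard deviations.
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    data = [x for x in data if abs(x - mu) <= 2 * sigma]
    # The next generation trains only on the previous generation's output.
    data = train_and_generate(data, 10_000)
    stds.append(statistics.stdev(data))
    print(f"generation {gen + 1}: std = {stds[-1]:.3f}")
```

Each round of truncate-and-refit multiplies the spread by a factor below one, so after a handful of generations the distribution has visibly narrowed: diversity is lost first, and the "world" the model can describe keeps shrinking.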