In AI model collapse, AI systems trained on their own outputs gradually lose accuracy, diversity, and reliability. This happens because errors compound across successive model generations, distorting data distributions and producing "irreversible defects" in performance. The final result? As a 2024 Nature paper put it, "The model becomes poisoned with its own projection of reality."
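For the technically inclined, here is a minimal toy sketch of that feedback loop. It is a hypothetical illustration, not the paper's actual experiment: each "generation" fits a simple Gaussian model to a small synthetic sample produced by the previous generation, and the estimation noise compounds.

```python
# Toy simulation of model collapse (a hypothetical sketch, not the
# Nature paper's setup): each generation trains only on the previous
# generation's synthetic output, so estimation error compounds.
import random
import statistics

def fit_model(samples):
    # "Training": estimate a Gaussian (mean, stdev) from the data.
    return statistics.mean(samples), statistics.stdev(samples)

def generate(mu, sigma, n):
    # "Inference": produce synthetic data from the current model.
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
mu, sigma = 0.0, 1.0   # generation 0 is fit to the "real" distribution
n = 20                 # small training set, so each fit is noisy

for gen in range(1, 26):
    synthetic = generate(mu, sigma, n)  # model output becomes...
    mu, sigma = fit_model(synthetic)    # ...the next training set
    if gen % 5 == 0:
        print(f"generation {gen:2d}: mean={mu:+.3f}, stdev={sigma:.3f}")

# In a typical run the stdev drifts toward zero: the tails of the
# original distribution vanish first, and diversity collapses.
```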
A remarkably similar thing happened to my aunt, who can't get off Facebook. We try feeding her accurate data, but she's become poisoned with her own projection of reality.