In AI model collapse, systems trained on their own outputs gradually lose accuracy, diversity, and reliability. Errors compound across successive model generations, distorting data distributions and introducing “irreversible defects” in performance. The end result? As a 2024 Nature paper put it, “The model becomes poisoned with its own projection of reality.”
A remarkably similar thing happened to my aunt who can’t get off Facebook. We try feeding her accurate data, but she’s become poisoned with her own projection of reality.
It’s such an easy outcome to predict, too. Even if you did it perfectly, you would, at best, maintain an unstable equilibrium where output quality merely stays the same; any accumulated error tips it into decline.
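For concreteness, here is a minimal sketch of that feedback loop, using a toy Gaussian “model” rather than an actual neural network. Everything in it is illustrative, not taken from the paper:

```python
# Toy illustration of recursive training, assuming a stand-in "model"
# that just estimates a Gaussian's mean and standard deviation from the
# previous generation's outputs. This is NOT the Nature paper's setup;
# it only shows why the equilibrium is unstable: even with a nearly
# unbiased fit, finite-sample noise makes the learned variance follow a
# multiplicative random walk that drifts toward zero, so the tails of
# the distribution (its "diversity") are the first thing to go.
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0   # generation 0: the real data distribution
n = 100                # training-set size per generation

for gen in range(1, 101):
    samples = rng.normal(mu, sigma, n)         # "train" on the previous model's output
    mu, sigma = samples.mean(), samples.std()  # refit; this becomes the next generation
    if gen % 25 == 0:
        print(f"gen {gen:3d}: mu = {mu:+.3f}, sigma = {sigma:.3f}")
```

On a typical run, sigma shrinks generation over generation even though nothing is biased on purpose; the rare events disappear first, which is exactly the loss of diversity that “model collapse” names.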
Google Search has been going downhill for way longer than a few months. It’s been close to a decade now.
TBF, SEO and other techniques that game the rankings muddy the waters and make it harder to find what you’re looking for.
That is not the core problem, though. Google used to just give you the results containing what you searched for; the trouble started when it tried to be “smarter” than that.