Slashdot

Cannibal AIs Could Risk Digital 'Mad Cow Disease' Without Fresh Data

A new study from Rice University and Stanford University examines the dangers of training AI models solely on synthetic, machine-generated data. The researchers term this failure mode Model Autophagy Disorder (MAD), by analogy with mad cow disease, which spread when cattle were fed the remains of infected cattle. Their experiments show that without a steady supply of fresh, real-world data, generative models become "self-consuming": each generation is trained on the output of the last, and the quality and diversity of that output progressively degrade. The effect appears across applications, including image generation and handwriting recognition. The authors warn against training future models solely on AI-generated data, concluding that continually injecting fresh, human-generated data is essential to prevent MAD and to avoid a flood of low-quality AI "slop."
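The self-consuming loop can be sketched with a toy model. The following is a minimal illustration, not the paper's actual method: each "generation" fits a Gaussian to samples drawn from the previous generation's fitted Gaussian, with no real data mixed back in, and the distribution's spread (a stand-in for output diversity) drifts downward over generations. All function and parameter names here are hypothetical.

```python
import numpy as np

def self_consuming_loop(n_samples=10, n_generations=200, seed=0):
    """Toy sketch of 'Model Autophagy Disorder' (MAD).

    Generation 0 fits a Gaussian to real data; every later generation
    fits a Gaussian only to samples drawn from the previous generation's
    model. Returns the fitted std of each generation.
    """
    rng = np.random.default_rng(seed)
    real_data = rng.normal(loc=0.0, scale=1.0, size=n_samples)  # "real" data
    mu, sigma = real_data.mean(), real_data.std()               # generation-0 model
    stds = [sigma]
    for _ in range(n_generations):
        synthetic = rng.normal(mu, sigma, size=n_samples)  # model output only
        mu, sigma = synthetic.mean(), synthetic.std()      # refit on synthetic data
        stds.append(sigma)
    return stds

stds = self_consuming_loop()
# Diversity (std) collapses when no fresh real data is reintroduced.
print(f"gen 0 std: {stds[0]:.4f}  final std: {stds[-1]:.4f}")
```

The fix the study points to corresponds, in this sketch, to resampling part of each generation's training set from `real_data` instead of drawing everything from the previous model.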