Self-Consuming Generative Models Go MAD

arXiv:2307.01850 · 241 citations · #118 of 2297 papers in ICLR 2024

Abstract

Seismic advances in generative AI algorithms for imagery, text, and other data types have led to the temptation to use AI-synthesized data to train next-generation models. Repeating this process creates an autophagous ("self-consuming") loop whose properties are poorly understood. We conduct a thorough analytical and empirical analysis using state-of-the-art generative image models of three families of autophagous loops that differ in how fixed or fresh real training data is available through the generations of training and whether the samples from previous-generation models have been biased to trade off data quality versus diversity. Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease. We term this condition Model Autophagy Disorder (MAD), by analogy to mad cow disease, and show that appreciable MADness arises in just a few generations.
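The loop the abstract describes is simple enough to demonstrate in miniature. The sketch below is an illustrative toy, not the paper's experimental setup (the paper uses state-of-the-art image models): a one-dimensional Gaussian "model" is refit each generation on truncated samples from its predecessor, with no fresh real data. The function names (`train`, `sample`) and the truncation rule are assumptions for illustration; the shrinking `std` across generations mirrors the progressive loss of diversity (recall) that the abstract calls MADness.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(samples):
    """'Train' a toy 1-D Gaussian generative model by fitting mean and std."""
    return samples.mean(), samples.std()

def sample(mu, sigma, n, truncate=None):
    """Draw n synthetic samples. Truncation keeps only near-mode samples,
    trading diversity (recall) for quality (precision)."""
    x = rng.normal(mu, sigma, size=n)
    if truncate is not None:
        x = x[np.abs(x - mu) < truncate * sigma]
    return x

real = rng.normal(0.0, 1.0, size=10_000)   # generation-0 real data
data = real
for gen in range(1, 6):
    mu, sigma = train(data)
    print(f"gen {gen}: mean = {mu:+.3f}, std = {sigma:.3f}")
    # Fully synthetic loop: the next generation sees only (biased)
    # samples from the current model, with no fresh real data.
    data = sample(mu, sigma, 10_000, truncate=1.0)
```

Running this, `std` collapses from roughly 1.0 toward 0 within a few generations; mixing enough fresh real data into `data` at each generation stabilizes the estimates, consistent with the abstract's condition of "enough fresh real data in each generation".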

Citation History

Jan 28, 2026: 0
Feb 13, 2026: 241 (+241)