Session
Repetita Non Iuvant: Why Generative AI Models Cannot Feed Themselves
As AI floods the digital landscape with content, what happens when it starts repeating itself?
This talk explores model collapse, the progressive degradation that occurs when LLMs and image generators are trained on their own outputs, eroding their ability to produce novel content.
We will show how self-training leads to bias and loss of diversity, examine the causes of this degradation, and quantify its impact on model creativity.
Finally, we will present concrete strategies to safeguard the future of generative AI, emphasizing the critical need to preserve innovation and originality.
By the end of this talk, attendees will understand the practical implications of model collapse, including its impact on content diversity and the long-term viability of generative AI.
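To make the self-training feedback loop concrete, here is a minimal toy sketch (not material from the talk) of an experiment often used to illustrate model collapse: a 1-D Gaussian repeatedly refit to its own samples, where finite-sample estimation error compounds across generations. The sample size and generation count are arbitrary illustrative choices.

```python
import numpy as np

# Toy sketch: recursive self-training on a 1-D Gaussian. Each generation
# fits a new "model" (mean and std) to samples drawn from the previous
# generation's model, mimicking a generative model trained on its own output.
rng = np.random.default_rng(seed=0)

n_samples = 200       # samples drawn per generation (illustrative value)
n_generations = 50    # number of self-training rounds (illustrative value)

mu, sigma = 0.0, 1.0  # generation 0 is the "real data": a standard normal

for gen in range(n_generations):
    data = rng.normal(mu, sigma, size=n_samples)  # sample the current model
    mu, sigma = data.mean(), data.std()           # refit on its own samples
    if gen % 10 == 0:
        print(f"generation {gen:3d}: mu = {mu:+.3f}, sigma = {sigma:.3f}")

# Because each fit uses a finite sample, estimation error compounds:
# sigma follows a biased random walk toward smaller values, so later
# generations produce progressively less diverse output -- a toy analogue
# of the loss of diversity described as model collapse.
```

Running the loop for more generations or with fewer samples per generation makes the shrinkage of the fitted variance, and hence the loss of diversity, more pronounced.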
First delivered at Py4AI 2025, Pavia, Italy
Also held at AI Heroes 2025, Turin, Italy

Valeria Zuccoli
R&D Artificial Intelligence Scientist
Milan, Italy