How AI-induced cultural stagnation is already happening
Briefly

"Generative AI was trained on centuries of art and writing produced by humans. But scientists and critics have wondered what would happen once AI became widely adopted and started training on its outputs. A new study points to some answers. In January 2026, artificial intelligence researchers Arend Hintze, Frida Proschinger Åström, and Jory Schossau published a study showing what happens when generative AI systems are allowed to run autonomously-generating and interpreting their own outputs without human intervention."
"The researchers linked a text-to-image system with an image-to-text system and let them iterate-image, caption, image, caption-over and over and over. Regardless of how diverse the starting prompts were-and regardless of how much randomness the systems were allowed-the outputs quickly converged onto a narrow set of generic, familiar visual themes: atmospheric cityscapes, grandiose buildings, and pastoral landscapes. Even more striking, the system quickly "forgot" its starting prompt."
The resulting images were polished and pleasant but lacked substantive meaning. The researchers characterized them as "visual elevator music": aesthetic yet generic and semantically impoverished.
Read at Fast Company