Overview
- Houser told Virgin Radio UK on November 26 that as the internet fills with AI-generated output, generative models trained by scraping it risk a self-referential decline.
- He likened the dynamic to “mad cow disease,” saying AI will eventually “eat itself” as training data becomes saturated with synthetic content.
- While noting AI can perform specific tasks “brilliantly,” he argued it remains error-prone, often “random and wrong,” and cannot replace the human element.
- He criticized those pushing widespread adoption, describing some industry leaders as neither especially humane nor creative, and "not fully rounded humans."
- Coverage contrasted his caution with industry momentum, citing a Google Cloud survey reporting that nearly nine in ten game studios already use AI agents in their pipelines.