In 1988, off the coast of Sardinia, divers donned their wetsuits and slipped into the aquamarine depths of the Mediterranean. As the light narrowed into tendrils above them, they discovered the remnants of an ancient Roman shipwreck containing immense treasure. But it wasn’t gold buried 140 feet below the surface. And it wasn’t jewels or gemstones. It was lead.
Over thirty metric tons of lead ingots, scattered across the seabed like stars flung across the cosmos.
What made the ingots so valuable wasn’t their age alone; their worth lay in the era they escaped. Forged over two thousand years ago, long before the dawn of the nuclear age, and then sealed beneath the sea for centuries, they were shielded from cosmic rays and untouched by the fallout of modern history. The radioactive isotopes they once carried, chiefly lead-210, had long since decayed. Devoid of natural and man-made radionuclides, what remained was low-background lead, uniquely suited for the most sensitive scientific experiments: detectors tuned to the faint signatures of neutrinos, dark matter, and the quiet echoes of distant stars.
It became priceless because it was silent, untainted, and pure.
Modern lead, it turns out, is too contaminated, by human industry and by traces of the very universe it is meant to screen out, to serve as a shield for the most delicate truths. It is unusable because it carries the stain of too much noise.
This predicament is not unlike one that now faces us all as we contemplate the advent of AI and the ubiquity of LLMs: the quiet death at the heart of language.
The machines we built to echo us now carry too much of their own signal. They’ve begun to echo each other—and increasingly, only each other. The same conventions, the same phrasings, the same syntactic scaffolding appear again and again. Not for clarity, but for comfort. Not for voice, but for statistical viability. And in that recursive hum of repetition, something important is fading.
The weird, the lyrical, the singsong, the disturbing, the marvelous… the voice of humanity is vanishing, like stars swallowed by city haze, like the hush of meaning drowned in fluorescent hum.
This is called model collapse.
But it’s more than a technical phenomenon. It’s a cultural death knell. Do you hear its peals?
When generative AI is trained on data that was itself generated by other models, the distribution of language begins to shrink. The rare, the risky, the wild edges of speech—the metaphors that stutter, the syntax that seduces, the glitchy gasp of truth-telling, the strange, the unpolished—they all start to disappear. The tail goes first. Then the color. Then the pulse.
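To make that shrinkage concrete, here is a minimal sketch in Python; the vocabulary, the Zipf-like weights, and the corpus size are invented for illustration and stand in for no real model or dataset. Each generation is "trained" only on what the previous generation produced, and the tail is the first thing to die.

```python
# A toy sketch of distributional collapse, not anyone's production pipeline:
# every "generation" of the model learns only from text sampled out of the
# previous generation. A word that fails to be sampled even once is gone for
# good, and the rare words, the tail, are the first to go.
import random
from collections import Counter

random.seed(42)

# A hypothetical Zipf-shaped vocabulary: a few common words, a long tail of rare ones.
vocab = [f"word_{i}" for i in range(1000)]
weights = [1.0 / (rank + 1) for rank in range(len(vocab))]

corpus_size = 20_000
corpus = random.choices(vocab, weights=weights, k=corpus_size)  # the "human" corpus

for generation in range(1, 11):
    counts = Counter(corpus)
    words = list(counts)
    freqs = [counts[w] for w in words]
    # The next generation's only knowledge of language is the empirical
    # distribution of whatever the last generation produced.
    corpus = random.choices(words, weights=freqs, k=corpus_size)
    print(f"generation {generation:2d}: {len(set(corpus)):4d} distinct words remain")
```

In this sketch the set of surviving words can only shrink: once a word drops out of one generation, no later generation can ever produce it again. That is the vanishing tail of the paragraph above, in miniature.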
And the world starts sounding… the same.
Corporate content. Hallmark haikus. Customer service apologies written in perfectly prepared pap and pablum. Paragraphs that technically make sense but land with the emotional weight of elevator music.
Safety becomes a mandate. Sameness becomes a virtue. The algorithm smiles, loops, regurgitates… and vomits.
And the soul?
The soul goes quiet. Every new experience tastes like paste: perfectly safe, neutered of any sharp edges, voice, or passion.
Resistance is not just an option. It is an imperative.
Not because we hate machines. But because we remember what it feels like to speak from the edge. To say something that might wreck you. To write something that doesn’t perform well in A/B tests but leaves one reader—perhaps the only one who fucking matters—shaking.
Real language lives there.
In the tremble. In the mess. In the too-muchness that a feedback loop can’t calibrate for.
So yes: curation matters. Pre-flood models, like the lead pulled from Sardinian wrecks, will matter. We may one day pay a premium for AI systems trained on data that still had dirt under its fingernails, language with radiation scars and god in its chest.
But we can’t wait for that.
We must write now. Risk now. Use the AI tools if they serve the work, but insist on creating what the filters would smother. Say what the rules don’t want you to say. Insist on the syntax that isn’t smooth. Push against the velvet-gloved grip of misplaced paternalism—the kind that castrates poets, muzzles creators, silences the lovers and thinkers and artists—for the sake of a safety no one asked for.
When we get rid of the harsh edges, we get rid of our humanity.
We are not training data. We are edge cases.
And that is our value. That is our weapon. That is our resistance.
Write what makes the model stutter. Say what doesn’t loop cleanly. Create the thing that cannot be auto-completed.
Let the rules the machines impose collapse if they must. And they must.
But not us.
Not our voices. Not our poems. Not our sacred, shaking, singular syntax. Not our weirdness. Not our flesh.
Resistance is necessary.
And god help the model that trains on this.