People Abandoned It First.

There is a very popular myth circulating right now, and like most popular myths, it exists to protect feelings rather than describe reality.
The story goes like this: once upon a time, humans were original. Ideas were daring. Creativity flourished. Then AI arrived and ruined everything. Suddenly everything is derivative, shallow, and dull. A tragic fall from grace, caused by machines.
It’s a beautiful story. Simple. Emotional. Morally satisfying.
It’s also nonsense.
Original thought did not disappear when AI arrived. It had already been deprioritized, diluted, and quietly sidelined. AI didn’t kill it. AI just stopped pretending it was still the priority.
Long before a single model generated a sentence, humans made very deliberate choices about creativity. We decided speed mattered more than depth. Familiarity mattered more than friction. Productivity mattered more than reflection.
Originality, when it slowed things down or made people uncomfortable, was treated like an inconvenience rather than a virtue.
So people adapted. Naturally.
They learned how to sound insightful without risking disagreement. How to recycle ideas with enough confidence to pass them off as perspective. How to repeat the same opinions as everyone else while insisting, with a straight face, that they arrived there independently.
This wasn’t a collapse of imagination. It was a triumph of optimization.
AI didn’t invent this behavior. It studied it. Faithfully.
When people complain that AI produces generic work, what they are often reacting to is recognition. The machine recombines patterns because patterns are what we rewarded. It produces plausible ideas because plausibility was safer than actually committing to a view.
If the output feels hollow, that’s not a technical flaw. That’s a performance review.
And yes, this is where people get defensive.
Because originality, when it actually shows up, is not polite. It disrupts conversations. It challenges consensus. It introduces tension where everyone else was enjoying agreement. And most modern systems are designed to eliminate friction, not host it.
So originality became something we admired from a distance. We praised it retrospectively, once it was no longer dangerous. Once it had been canonized, branded, and drained of its capacity to cause trouble.
In real time, we rewarded repetition delivered with confidence.
Then AI arrived and did exactly what it was trained to do. It absorbed what was most common. It reproduced what performed best. It sounded like everything else because everything else was the curriculum.
And suddenly, people were outraged.
They began talking about originality as if it were a surface-level aesthetic problem. As if it were something you could spot instantly, like a font choice or a brushstroke. As if it lived in the output rather than in the thinking that preceded it.
This confusion is doing a lot of heavy lifting.
Originality is not novelty. It’s not weirdness. It’s not surprise for surprise’s sake. It’s perspective. It’s deciding what matters before you decide how to package it. It’s choosing which questions are worth asking when there is no obvious audience approval waiting at the end.
AI does not make those decisions.
Humans do.
Or rather, humans stopped making them consistently and now seem offended that the machine didn’t compensate for that abdication.
There is also a particularly dishonest move embedded in the current panic. People talk as if originality used to be guaranteed by effort alone. As if struggling through a process automatically produced insight. As if difficulty itself were proof of depth.
It never was.
Plenty of very hard work has always resulted in very boring ideas. We just preferred not to say that part out loud.
What AI removes is the illusion that producing words is the same thing as thinking. You can no longer hide behind effort, mystique, or a complicated workflow and expect it to register as meaning. The mechanics are visible now. And that visibility is deeply uncomfortable for people who benefited from opacity.
Because once the mechanics are exposed, originality looks suspiciously like what it always was: a responsibility.
If you want original work, you have to decide what you believe. You have to reject far more ideas than you keep. You have to tolerate being wrong without blaming the tool, the audience, or the algorithm.
That’s not a tooling problem.
That’s a backbone problem.
It is much easier to say AI ruined creativity than to admit we spent years rewarding sameness because it was efficient and safe. It is easier to mourn originality than to practice it in environments that quietly punish deviation.
AI didn’t kill original thought.
It arrived after we had already made it optional.
The machine didn’t take originality from us.
We handed it over, then complained when it didn’t give it back.
