Do Not Underestimate Your Place in History Right Now
There is no doubt in my mind that this is the most important piece I’ve written so far.
History rarely announces itself. It arrives quietly, disguised as efficiency and ease. Artificial intelligence is doing exactly that—slipping into our thinking, our decisions, and our creative work faster than we are pausing to notice.
This essay is an attempt to slow the moment down, to ask what it means to live through a cognitive shift of this scale, and why our awareness matters more than our speed.
History rarely announces itself.
It doesn’t arrive with trumpets or headlines or a clean break between “before” and “after.” More often, it slips in quietly, disguised as convenience. A software update. A new workflow. A slightly faster way of doing something you were already doing yesterday.
Most people living through true inflection points don’t experience them as such. They experience them as busy. As incremental. As vaguely unsettling, but not yet alarming enough to demand reflection.
And then, years later, they look back and say: That was the moment everything changed.
We are in one of those moments now.
The mistake many of us are making, across the creative industries, automation, business, and beyond, is assuming that what is happening with artificial intelligence is simply another technological shift. Another tool. Another platform. Another chapter in the long story of progress.
It isn’t.
This is not a story about adoption curves or productivity gains or which industries will be “disrupted.” Those conversations, while not unimportant, miss the deeper truth.
What is happening right now is not primarily technological. It is historical. And history, unlike technology, keeps score.
The Illusion of Continuity
Part of what makes this moment so easy to underestimate is how familiar it feels.
We’ve been here before, we tell ourselves. The internet. Social media. Mobile. Cloud computing. Each wave promised transformation. Each wave delivered upheaval. Each wave eventually settled into the background hum of modern life.
The internet connected us to information. Social media connected us to one another. Mobile put the world in our pockets.
Each change was vast—but also, in retrospect, bounded. Artificial intelligence feels similar at first glance. Another layer. Another acceleration. Another set of tools to master. But this sense of continuity is misleading.
Those previous technologies sat outside us. They changed how work moved, how ideas spread, how attention flowed. They reorganised distribution, not cognition. They altered access, not authorship.
AI is different. AI does not simply move information faster. It participates in thinking itself.
That distinction is subtle, but profound. It is the difference between a printing press and a co-author. Between a calculator and a collaborator. Between a tool you wield and a system that shapes how you perceive, decide, and judge.
When the terrain shifts from execution to cognition, you are no longer just upgrading your tools. You are renegotiating your relationship with agency.
Why This Moment Feels Unsettling (Even If You Can’t Quite Explain Why)
Many people describe their relationship with AI as a mixture of excitement and unease. The upside is obvious and, in many cases, astonishing. Medical research accelerated. Energy systems optimised. Accessibility expanded. Creative possibilities multiplied.
At the same time, something feels off.
This discomfort is often dismissed as fear of change, or worse, as resistance from those who “don’t get it.” But that explanation is too convenient and too shallow.
What we are responding to—often subconsciously—is not the power of AI, but its proximity.
AI sits uncomfortably close to the parts of ourselves we associate with meaning: judgment, taste, intuition, synthesis, creativity, and authorship. It doesn’t merely automate tasks; it compresses the distance between idea and execution so dramatically that the space once occupied by deliberation begins to collapse.
That space mattered more than we realised.
It was where intention lived. Where doubt sharpened thinking. Where craft emerged not from speed, but from friction. Where people learned not just how to do things, but why they were worth doing in the first place.
When that space erodes, the risk isn’t inefficiency. The risk is thoughtlessness at scale.
The Creative Industries as Early Warning System
If there is a sector uniquely positioned to feel this shift first, it is the creative industries—not because creatives are more important, but because their work sits at the intersection of meaning and production.
Creativity has always been less about output than about selection. Knowing what to make, what to discard, what to refine, what to protect. Taste, after all, is a form of judgment developed over time, through exposure, failure, and lived experience.
AI excels at generating options. It does not inherently know which ones matter. That gap—between generation and judgment—is where the real work now lives.
The danger is not that AI will replace creative professionals. The danger is that it will seduce them into confusing speed with substance, volume with value, coherence with originality.
When everything becomes possible, discernment becomes scarce. And discernment is not something you can outsource without consequence.
The Quiet Risk of Passive Adoption
Most of the long-term damage caused by technological shifts is not the result of malicious intent. It comes from unexamined convenience.
Passive adoption looks responsible. Sensible. Even progressive. It sounds like staying competitive, keeping up, not being left behind.
But passive adoption has a cost.
When tools shape behaviour faster than values shape tools, agency begins to drift. Decision-making becomes defaulted. Outputs begin to converge. Originality flattens—not because creativity disappears, but because it is no longer protected by deliberate friction.
History is filled with examples of moments when societies embraced powerful systems without fully interrogating their implications. The consequences rarely arrived immediately. They unfolded slowly, then suddenly.
The question is not whether AI will be used. That question has already been answered.
The better question is who remains conscious while using it.
This Is Not About Tools. It Is About Posture.
Future historians will not catalogue this era by listing software platforms or model versions. They will ask different questions.
Who treated AI as an accelerant—and who treated it as an authority? Who preserved judgment—and who outsourced it? Who used speed to deepen thinking—and who used it to avoid thinking altogether?
In moments like this, history does not judge competence. It judges posture. Were you awake to what was happening? Did you slow down where slowing down mattered? Did you recognise that convenience is never neutral?
These are not moral questions. They are historical ones.
A Closing Thought
You do not need to become an AI evangelist or an AI sceptic. Those positions are equally limiting. What this moment requires is something rarer: historical self-awareness. An understanding that you are not merely navigating a technological shift, but helping to define the norms, habits, and values that will shape how human and artificial intelligence coexist.
Most people living through history do not know it at the time.
But you do.
Do not underestimate your place in it, or your role in shaping it.