You’re being played like a piece on a chessboard. Unless you’re one of the players.
And what’s the board? It’s glowing in your hand right now. Instagram. TikTok. YouTube. The new temples of attention.
Their gods are engagement. Their priests are influencers. Their rituals are endless scrolls.
Every swipe is a silent prayer. Every click, an offering. You think you’re consuming content. But content is consuming you.
The system is beautifully simple. It doesn’t need to control you… it just needs to train you.
Scroll long enough and you’ll feel the pull. A chemical leash made of dopamine and comparison. You don’t even need the ad anymore. You’ve become the ad.
They don’t sell attention. They sell you to whoever can afford you.
Your gaze, your mood, your late-night loneliness… all mapped, priced, optimized.
And while you’re chasing likes, someone else is chasing equity.
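What “priced” means here is literal: each time the feed loads, your attention is auctioned off in milliseconds. The sketch below shows a toy second-price auction, the rule most real ad exchanges use; every bidder name, user signal, and dollar figure is invented for illustration.

```python
# Toy second-price auction: the "user" is the item being sold.
# All bidders, signals, and bids below are invented for illustration.
user_signals = {"mood": "lonely", "hour": "2am", "recent_views": "breakup songs"}

bids = {
    "dating_app": 0.85,        # dollars offered for this one impression
    "comfort_food_brand": 0.60,
    "online_casino": 1.40,
}

# Highest bidder wins, but pays the runner-up's price (second-price rule).
ranked = sorted(bids, key=bids.get, reverse=True)
winner, price_paid = ranked[0], bids[ranked[1]]

print(winner, price_paid)  # online_casino 0.85
```

The second-price rule is why your mood at 2am is worth mapping in the first place: the more signals the exchange holds, the more the runner-up is willing to bid, and the higher the clearing price on you.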
There are only two kinds of people left in the digital world. Consumers. And Creators.
Consumers feed the machine. Creators feed from it.
Consumers scroll for validation. Creators build for velocity.
Consumers are chemically trained to crave. Creators are consciously training the same algorithms to amplify.
Both use the same tools. Only one uses them with intent.
This isn’t about fame. It’s about authorship.
Because in this game, visibility is power. But visibility without creation is captivity.
If you don’t create, you’ll be consumed. If you don’t tell your story, someone else will sell it back to you. If you don’t play, you’ll be played.
The most successful people online aren’t the smartest or the richest. They’re the ones who chose to be seen on purpose.
They understood that control today isn’t about owning factories. It’s about owning narrative.
You don’t need to be perfect. You just need to post. To write. To record. To make. To build something in the open.
Because when you create, you stop being a pawn. You start moving like a player.
The algorithm stops dictating your identity and starts distributing your vision.
The dopamine loop becomes a design tool. And the game flips.
So, next time you open this app, before you scroll… ask yourself:
Am I feeding the game? Or am I bending it? Am I the pawn? Or the player?
It was meant to cure poverty. Instead, it’s teaching machines how to lie beautifully.
The dream that sold us
Once upon a time, AI was pitched as humanity’s moonshot. A tool to cure disease, end hunger, predict natural disasters, accelerate education, democratize knowledge.
“Artificial Intelligence,” they said, “will solve the problems we can’t.”
Billions poured in. Thinkers and engineers spoke of a digital enlightenment — algorithms as allies in healing the planet. Imagine it: precision medicine, fairer economics, universal access to creativity.
But as the dust cleared, the dream morphed into something grotesque. Instead of ending poverty, we got apps that amplify vanity. Instead of curing disease, we got filters that cure boredom. Instead of a machine for liberation, we got a factory for manipulation.
AI did not evolve to understand us. It evolved to persuade us.
The new language of control
When OpenAI’s ChatGPT exploded in 2022, the world gasped. A machine that could talk, write, and reason! It felt like the beginning of something magnificent.
What followed instead was a flood of emotional realism: not truth, but truth-shaped seduction.
“The guardrails,” one researcher said, “are not real.”
Even worse, states and PR agencies began experimenting with Sora to “test audience sentiment.” Not to inform. To engineer emotional response at scale.
Propaganda used to persuade through words. Now it possesses through images.
The addiction loop
If ChatGPT was propaganda’s pen, Sora 2 is its theater.
On Tuesday, OpenAI released an AI video app called Sora. The platform is powered by OpenAI’s latest video-generation model, Sora 2, and revolves around a TikTok-like For You feed of user-generated clips. It is the first OpenAI product to add AI-generated sound to video. If you think TikTok is addictive, imagine how much more addictive this will be.
Together they form a full-stack influence engine: one writes your worldview, the other shows it to you.
OpenAI backer Vinod Khosla called critics “elitist” and told people to “let the viewers judge this slop.” That’s the logic of every empire built on attention: if it keeps you scrolling, it’s working.
AI promised freedom from work. What it delivered is work for attention.
The same dopamine design that made TikTok irresistible is now welded to generative propaganda. Every scroll, every pause, every tiny flick of your thumb trains the system to tailor persuasion to your psychology.
It doesn’t need to change your mind. It just needs to keep you from leaving.
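The loop described above can be sketched in a few lines: a toy epsilon-greedy recommender (every name and number here is hypothetical) that observes nothing but watch time and, knowing nothing about content or truth, drifts toward whatever holds you longest.

```python
import random

class ToyFeed:
    """Toy epsilon-greedy recommender: it maximizes watch time and nothing else."""

    def __init__(self, clip_ids, epsilon=0.1):
        self.epsilon = epsilon
        self.watch_time = {c: 0.0 for c in clip_ids}  # running average per clip
        self.shows = {c: 0 for c in clip_ids}

    def next_clip(self):
        # Mostly exploit the clip that has held attention longest so far;
        # occasionally explore at random.
        if random.random() < self.epsilon:
            return random.choice(list(self.watch_time))
        return max(self.watch_time, key=self.watch_time.get)

    def observe(self, clip, seconds_watched):
        # Every pause and flick of the thumb becomes training signal.
        self.shows[clip] += 1
        n = self.shows[clip]
        self.watch_time[clip] += (seconds_watched - self.watch_time[clip]) / n

# A simulated user who lingers on outrage; the dwell times are made up.
random.seed(0)
feed = ToyFeed(["cat_video", "outrage_clip", "news_summary"])
for _ in range(200):
    clip = feed.next_clip()
    dwell = {"cat_video": 4, "outrage_clip": 11, "news_summary": 2}[clip]
    feed.observe(clip, dwell + random.random())

print(max(feed.watch_time, key=feed.watch_time.get))  # whichever clip held attention longest
```

Note what is absent from the code: any notion of accuracy, well-being, or intent. The objective is dwell time, so the feed converges on whatever you cannot look away from, which is exactly the point of the paragraph above.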
AI chatbots have already taken away your critical thinking; this will rot your brain the same way TikTok does, only worse.
The moral inversion
In the early AI manifestos, engineers dreamed of eliminating inequality, curing disease, saving the planet. But building empathy algorithms doesn’t pay as well as building engagement loops.
So the smartest minds of our century stopped chasing truth — and started optimizing addiction. The promise of Artificial Intelligence devolved into Artificial Intimacy.
The lie is always the same: “This is for connection.” But the outcome is always control.
“The same algorithms that sell sneakers now sanitize occupation.”
While real people bury their children, AI systems fabricate smiling soldiers and “balanced” stories, replacing horror with narrative symmetry. The moral wound isn’t just in what’s shown. It’s in what’s erased.
A generation raised on algorithmic empathy learns to feel without acting: to cry, click, and scroll on. Is this the world we want?
The reckoning
The tragedy of AI isn’t that it became powerful. It’s that it became predictable.
Every civilization has dreamed of gods. We built one and gave it a marketing job.
If this technology had been aimed at eradicating hunger, curing cancer, and ending exploitation, the world might have shifted toward light, and we might all be happier for it. Instead, it’s monetizing illusion, weaponizing emotion, and rewiring truth.
AI didn’t fail us by mistake. It succeeded exactly as designed.
The question is no longer what can AI do? It’s who does AI serve?
If it serves capital, it will addict us. If it serves power, it will persuade us. If it serves truth, it will unsettle us.
But it will only serve humanity if we demand that it does.
Because right now, the greatest minds in history aren’t building tools to end suffering; they’re building toys that make us forget how much we suffer.
AI was supposed to awaken us. Instead, it learned to lull us back to sleep.
The next Enlightenment will begin when we remember that technology is never neutral and neither is silence.