
Posts tagged social media

It was meant to cure poverty. Instead, it’s teaching machines how to lie beautifully.


The dream that sold us

Once upon a time, AI was pitched as humanity’s moonshot.
A tool to cure disease, end hunger, predict natural disasters, accelerate education, democratize knowledge.

“Artificial Intelligence,” they said, “will solve the problems we can’t.”

Billions poured in. Thinkers and engineers spoke of a digital enlightenment — algorithms as allies in healing the planet. Imagine it: precision medicine, fairer economics, universal access to creativity.

But as the dust cleared, the dream morphed into something grotesque.
Instead of ending poverty, we got apps that amplify vanity.
Instead of curing disease, we got filters that cure boredom.
Instead of a machine for liberation, we got a factory for manipulation.

AI did not evolve to understand us.
It evolved to persuade us.


The new language of control

When OpenAI’s ChatGPT exploded in 2022, the world gasped. A machine that could talk, write, and reason!
It felt like the beginning of something magnificent.

Then the fine print arrived.

By 2024, OpenAI itself confirmed that governments — including Israel, Russia, China, and Iran — were using ChatGPT in covert influence operations.
Chatbots were writing fake posts, creating digital personas, pushing political talking points.
Not fringe trolls — state-level campaigns.

And that wasn’t the scandal. The scandal was how quickly it became normal.

“Israel invests millions to game ChatGPT into replicating pro-Israel content for Gen Z audiences,” reported The Cradle, describing a government-backed push to train the model’s tone, humor, and phrasing to feel native to Western youth.

Propaganda didn’t just move online — it moved inside the algorithm.

The goal is no longer to silence dissent.
It’s to make the lie feel more natural than the truth.


From persuasion to possession

And then came Sora 2 — OpenAI’s next act.

You write: “A girl walks through rain, smiling.”
It delivers: a photorealistic clip so convincing it bypasses reason altogether.

Launched in September 2025, Sora 2 instantly topped app charts. Millions of users. Infinite scroll. Every frame synthetic. Every smile programmable.

But within days, The Guardian documented Sora’s dark side:
AI-generated videos showing bombings, racial violence, fake news clips, fabricated war footage.

A flood of emotional realism — not truth, but truth-shaped seduction.

“The guardrails,” one researcher said, “are not real.”

Even worse, states and PR agencies began experimenting with Sora to “test audience sentiment.”
Not to inform.
To engineer emotional response at scale.

Propaganda used to persuade through words.
Now it possesses through images.


The addiction loop

If ChatGPT was propaganda’s pen, Sora 2 is its theater.

OpenAI’s Sora app, powered by its latest video generation model, Sora 2, revolves around a TikTok-like For You page of user-generated clips. It is the first OpenAI product release to add AI-generated sound to video. So if you think TikTok is addictive, imagine how much more addictive this will be.


Together they form a full-stack influence engine: one writes your worldview, the other shows it to you.

OpenAI backer Vinod Khosla called critics “elitist” and told people to “let the viewers judge this slop.”
That’s the logic of every empire built on attention: if it keeps you scrolling, it’s working.

AI promised freedom from work.
What it delivered is work for attention.

The same dopamine design that made TikTok irresistible is now welded to generative propaganda.
Every scroll, every pause, every tiny flick of your thumb trains the system to tailor persuasion to your psychology.

It doesn’t need to change your mind.
It just needs to keep you from leaving.
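The loop described above can be made concrete with a minimal toy sketch. Nothing here reflects any real platform’s code; the class name, topics, and weight multipliers are all invented for illustration. The point is only how little signal the system needs: dwell time alone is enough to steer what you see next.

```python
# Toy sketch (hypothetical): an engagement-driven feed that adapts to
# per-user dwell time. Real recommender systems are vastly more complex;
# every name and number here is invented for illustration.

class ToyFeed:
    def __init__(self, topics):
        # start with equal weight for every topic
        self.weights = {t: 1.0 for t in topics}

    def record(self, topic, dwell_seconds):
        # a pause boosts a topic; a quick skip decays it
        if dwell_seconds > 2:
            self.weights[topic] *= 1.2
        else:
            self.weights[topic] *= 0.9

    def next_topic(self):
        # always serve whatever currently holds attention best
        return max(self.weights, key=self.weights.get)

feed = ToyFeed(["news", "outrage", "cute_pets"])
feed.record("outrage", 5)   # user lingered
feed.record("news", 1)      # user skipped
print(feed.next_topic())    # the feed converges on whatever you linger over
```

Even this crude loop never asks what is true or good for the user; it optimizes one thing, holding attention, which is the whole argument in miniature.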

AI chatbots have already taken away your critical thinking; this will rot your brain the same way TikTok does, only worse.


The moral inversion

In the early AI manifestos, engineers dreamed of eliminating inequality, curing disease, saving the planet.
But building empathy algorithms doesn’t pay as well as building engagement loops.

So the smartest minds of our century stopped chasing truth — and started optimizing addiction.
The promise of Artificial Intelligence devolved into Artificial Intimacy.

The lie is always the same:
“This is for connection.”
But the outcome is always control.


The human cost

Gideon Levy, chronicling Gaza’s digital frontlines, said it bluntly:

“The same algorithms that sell sneakers now sanitize occupation.”

While real people bury their children, AI systems fabricate smiling soldiers and “balanced” stories, replacing horror with narrative symmetry.
The moral wound isn’t just in what’s shown.
It’s in what’s erased.

A generation raised on algorithmic empathy learns to feel without acting: to cry, click, and scroll on. Is this what our world is becoming?


The reckoning

The tragedy of AI isn’t that it became powerful.
It’s that it became predictable.

Every civilization has dreamed of gods. We built one and gave it a marketing job.

If this technology had been aimed at eradicating hunger, curing cancer, and ending exploitation, the world might have shifted toward the light.
Instead, it’s monetizing illusion, weaponizing emotion, and rewiring truth.

AI didn’t fail us by mistake.
It succeeded exactly as designed.


The question is no longer “What can AI do?”
It’s “Who does AI serve?”

If it serves capital, it will addict us.
If it serves power, it will persuade us.
If it serves truth, it will unsettle us.

But it will only serve humanity if we demand that it does.

Because right now, the greatest minds in history aren’t building tools to end suffering; they’re building toys that make us forget how much we suffer.

AI was supposed to awaken us.
Instead, it learned to lull us back to sleep.

The next Enlightenment will begin when we remember that technology is never neutral, and neither is silence.

Now you know.


You Didn’t Choose That Thought. It Was Chosen for You

You scrolled.
You paused.
You liked, reposted, laughed, shook your head.
And just like that—a seed was planted. A preference shaped. An emotion nudged.
You didn’t notice.
You weren’t supposed to.

This is not advertising as you know it.
This is not the billboard screaming “BUY THIS.”
This is not the banner ad you skipped on YouTube.

This is the invisible ad—the one that never announces itself, that never asks for your attention, because it’s already working beneath it.

We have entered the era of passive persuasion, where your identity, your politics, your choices are influenced by systems so ambient, so embedded, you mistake them for your own reflection.

You think you’re making decisions.
You’re reacting to design.


The Death of the Obvious Ad

We were trained to look for logos.
We were taught that advertising was about visibility.
That persuasion was about pushing, not pulling. About message, not membrane.

But those days are dead.

Today’s most effective ad is not an image or a slogan.
It’s the interface.
It’s the timing of a post.
It’s the platform bias that surfaces one narrative and buries another.
It’s the emotional velocity of a meme that disguises ideology as entertainment.

Advertising didn’t disappear.
It became everything else.


The Architecture of Influence

Let’s map the system that now governs attention:

1. Signal Hijack

Your senses are gamed before your mind even wakes up.
Designers don’t just choose colors—they calibrate for cortisol.
Copywriters don’t just use words—they borrow the grammar of trust from family, from spirituality, from protest.

You feel safe. Seen. Stimulated. But this isn’t comfort—it’s engineered consent.

2. Emotion Laundering

Most modern persuasion isn’t logical. It’s somatic.
That warm nostalgic TikTok?
That ironic leftist meme?
That perfectly timed AI-generated “spontaneous” tweet?
Each is a trojan horse—emotionally triggering, cognitively disarming.

The brain opens before it asks questions.

3. Context Erosion

Persuasion thrives in chaos.
When you consume headlines without articles.
When your feed scrolls faster than your thought.
When you mistake familiarity for truth.

There’s no time to think.
Only time to react.


When Politics Becomes a Brand, and Brands Become Your Politics

This isn’t just advertising anymore.
This is governance by meme.

Political messages are embedded in beauty trends.
Civic values are sold like sneakers.
Propaganda isn’t broadcast—it’s crowd-sourced.

Influencers now soft-launch ideologies.
Micro-targeted ads whisper to your fear center.
And language—once public property—is now owned by the platforms that decide what can trend.

Truth didn’t die.
It was quietly outperformed.


The Brain Can’t See the Frame It’s Trapped In

Here’s the most terrifying part:

The more personalized the ad, the less you recognize it as an ad.
Because it speaks your language. Feeds your belief. Reinforces your bias.

You don’t feel manipulated.
You feel validated.
That’s the design.

“The best manipulation leaves you certain you arrived at the idea yourself.”

The invisible ad doesn’t change your mind.
It becomes it.


How to See the Invisible

We don’t need more ad blockers.
We need cognitive firewalls.

We need a generation of readers who ask not just “What is this saying?”
but “Why am I seeing it?”
—and “Who benefits if I believe this?”

The new strategist doesn’t sell identity.
They protect it.
The new creator doesn’t harvest attention.
They reclaim it.

And the new citizen?
They stop mistaking convenience for truth.


You don’t need to go off-grid.
You need to see the grid for what it is:
A reality-shaping machine powered by your attention, primed by your emotions, and governed by systems you never voted for.

But now you’ve seen the outline.
And that means power.

Because once you can see the architecture—
You can redesign it.

This is not about rejecting influence.
It’s about reclaiming authorship.
Of your choices.
Your identity.
Your internal narrative.

The world is full of invisible scripts.
You can either follow them.
Or write your own.

So here’s the real question:

Are you just an audience?
Or are you ready to be a strategist of your own mind?


This Isn’t an Update. It’s an Extinction Event.

Meta just announced what should have shaken the global creative industry to its core:

By 2026, ad campaigns will be fully automated.

Just feed Meta an image, a budget, and a goal—and their AI will generate every part of your campaign: visuals, text, video, targeting. In real time.

Personalized for every user. No agency. No copywriter. No designer. No strategist.

And the industry? Silent. Still posting carousels. Still selling 5-day Canva courses.

It’s not a pivot. It’s a purge.


If You Work in Advertising, Read This Slowly

Creative teams? Ghosted. Marketing departments? Hollowed out. Agencies? Replaced by pipelines.

Let’s be clear:

  • If your job is repetitive, it’s already done.
  • If your skillset can be described in a course, it can be eaten by code.
  • If you’re charging clients for templates, your business model is already obsolete.

Thousands are still paying to learn how to be performance marketers, media buyers, junior copywriters—unaware they’re being trained for roles that won’t exist in just a few years.

Meta isn’t building a tool. It’s building a world where the only thing human in advertising is the budget.


What Happens When Every Ad Is Personalized?

Meta’s AI will generate campaigns based on:

  • Location
  • Behavioral patterns
  • Micro-emotions
  • Data trails you don’t even know you leave

What does that mean?

  • 10,000 versions of the same ad running simultaneously
  • Each one designed to bypass your defense mechanisms
  • No brand narrative. Just hyper-efficient persuasion loops

This isn’t advertising. It’s algorithmic mind control.

And it doesn’t require your input.
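The scale described above is easy to demonstrate. The sketch below is purely illustrative: Meta’s actual system is proprietary and undisclosed, and every hook, tone, and format named here is invented. It only shows the combinatorics, how one brief plus a few per-user signals multiplies into thousands of variants with no human in the loop.

```python
# Toy sketch (all names hypothetical): stamping out ad variants from one
# brief. This illustrates the combinatorics of personalization, not any
# real ad platform's pipeline.

import itertools

HOOKS = {
    "fear":      "Don't get left behind —",
    "status":    "People like you already use",
    "nostalgia": "Remember when things felt simpler?",
}
TONES = ["urgent", "friendly", "ironic"]
FORMATS = ["video", "carousel", "story"]

def variants(product):
    """Yield one ad spec per (emotional hook, tone, format) combination."""
    for (emotion, hook), tone, fmt in itertools.product(
            HOOKS.items(), TONES, FORMATS):
        yield {
            "copy": f"{hook} {product}",
            "emotion": emotion,
            "tone": tone,
            "format": fmt,
        }

ads = list(variants("SneakerX"))
print(len(ads))  # 3 hooks x 3 tones x 3 formats = 27 variants from one brief
```

Three options per axis already yields 27 variants; add a dozen axes of behavioral data and the “10,000 versions of the same ad” stops sounding like hyperbole.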


The Collapse of the Traditional Agency Model

This is the end of:

  • 3-month campaign timelines
  • 7-person approval chains
  • “Big idea” presentations
  • Overpriced retainers for recycled strategy decks

Agencies that survive will mutate into one of three things:

  1. AI Wranglers
    Experts in prompt architecture, model fine-tuning, and campaign scenario training.
  2. Authenticity Studios
    Boutique teams crafting human-first stories for audiences fatigued by automation.
  3. Narrative Architects
    Strategists who build brand ecosystems too complex or contradictory for AI to fake.

Everything else? Dead weight.


What This Means for Students, Freelancers, and Creatives

Right now, there are thousands paying $499 to learn how to write Google Ads.
Tens of thousands enrolling in 12-week digital bootcamps to become paid media specialists.
Copywriters offering “conversion-optimized emails” on Fiverr for $15 a pop.

All being prepared for a battlefield that no longer exists.

It’s not just job loss. It’s a mass career hallucination.


The Only Skill That Survives This

Original thought.

Not templates. Not trends. Not tactics.

What Meta can’t automate is:

  • Contradiction
  • Taste
  • Nonlinear insight
  • Human risk
  • Deep cultural intuition

If your thinking is replaceable, it will be replaced. If your work is predictable, it’s already priced out by AI.

You don’t need to pivot. You need to become uncopyable.


Choose Your Side

Meta is rewriting the rules of advertising.
This is a coup, not a campaign.
It rewards speed over soul. Efficiency over empathy. Replication over resonance.

But here’s your edge: AI can do everything except be you.

So ask yourself:

  • Are you building a skill or becoming a signal?
  • Are you crafting something human or repackaging noise?
  • Will your work be remembered in 10 years—or recycled in 10 seconds?

The agency era is ending.

The age of the uncopyable has just begun.



Because One Day, Someone You’ll Never Meet Will Live With What You Left Behind

We like to think the future is something that just happens.
But really, it’s something we’re building—bit by bit, post by post, decision by decision.

And most of what we’re making?
Won’t stay in the past.

It’ll live on in ways we can’t predict.
In algorithms that echo.
In ideas that stick around longer than we do.
In the systems, stories, and shortcuts we hand down—without even realizing it.

So here’s the uncomfortable truth:

The future is going to live in the world we leave behind.
And that world is shaped by what we create right now.


Think Bigger Than the Feed

Most of us create for the moment.
We optimize for reach. For relevance. For right now.

But the real question is:

Would you still make it if your great-grandkid was watching?
Would you be proud if they found it?
Or would you say, “We didn’t know better back then”?

Because the truth is—we do know better.
We just don’t always act like it.


A Simple Thought Experiment

Picture this:
A kid stumbles on your work a hundred years from now.
Your product. Your code. Your writing. Your name.

What do they learn about you?
What do they learn about us?

Do they feel seen?
Or disappointed?
Inspired—or embarrassed?


Not Legacy. Just Responsibility.

This isn’t about being perfect.
It’s not about writing the next great novel or building the next Apple.

It’s about doing your job like it matters.
Making your thing like someone else might one day rely on it.
Because they might.

Whether it’s a clean API, an honest message, a brand that chooses people over profit—
it all adds up.

And someone will inherit the sum.


So Here’s the Deal

✅ Make stuff that’s built to last.
✅ Say the thing others are afraid to say.
✅ Leave behind something that doesn’t need to be explained away.
✅ If it’s not helpful or honest, maybe don’t hit publish.

✅ Stop making a digital landfill. Most of the internet—especially social media and brand content—is an endless dump of noise, not signal. Don’t add to the trash.
✅ And when you’re not sure what to do—imagine someone younger than you reading it in 50 years.

Create like you’re going to be misunderstood now—but deeply appreciated later.
Because sometimes, later is the point.


Create for the unborn.
Not for claps. Not for clicks.
For the ones who have to live with what we leave behind.

On January 20, 2025, the world watched as Donald Trump was sworn in—again—as the 47th President of the United States. But this wasn’t just any inauguration. This wasn’t just about the transfer of power.

This was about who holds the keys to the internet itself.

Because standing in the VIP section, watching with keen interest, were the most powerful figures in media and technology:

  • Rupert Murdoch, the ultimate kingmaker of conservative media.
  • Elon Musk, the owner of X (formerly Twitter), Trump’s old battlefield for unfiltered speech.
  • The CEOs of Apple, Google, and Meta (Facebook/Instagram/Threads)—the architects of our digital world.
  • The CEO of TikTok, the most influential platform for young voters, despite Trump once calling it a national security threat.
  • The CEO of OpenAI (ChatGPT), representing the next frontier of AI-driven information control.
  • Amazon’s CEO, whose company dominates everything from cloud computing to online commerce.

What were they doing there? And more importantly, what does this mean for the future of free speech, media, and the internet?


Trump’s Information Power Play

For years, Trump has railed against Big Tech censorship, accusing platforms of silencing conservative voices. He even launched his own platform, Truth Social, to fight back.

But now, the game has changed.

This wasn’t a room full of enemies. This was a meeting of the new elite—the people who decide what you see, what you read, and what you believe.

  • If Trump was once at war with these tech moguls, why are they now standing by his side?
  • Is this a surrender from Big Tech, or something more sinister?
  • Are we witnessing the birth of an unholy alliance between politics, AI, and social media?

The End of Digital Free Speech?

With Trump in power and the biggest players in tech seemingly aligned with him, we’re entering a new era.

What happens to free speech when politics and tech power become one?
Who controls the algorithms that decide what content goes viral—and what gets buried?
What if the platforms that once censored Trump now start silencing his opposition?

Elon Musk’s presence is particularly fascinating. As the owner of X (formerly Twitter), he has positioned himself as a free speech absolutist—but will that apply equally in a Trump-controlled world?

And then there’s AI. With OpenAI’s leadership in attendance, it’s impossible to ignore the role artificial intelligence will play in shaping online discourse. Could AI tools like ChatGPT become politically influenced? Will fact-checking be biased?


A Digital Coup? How Information Will Be Controlled

If the 2016 election was shaped by Facebook, Twitter, and Russian bots, and 2020 was fought over mail-in ballots and voter suppression, 2025 is shaping up to be a battle for total information dominance.

Key risks of this new Trump-Tech alignment:

Algorithmic Favoritism – What if pro-Trump content is pushed while dissenting views are quietly suppressed? The average user would never even know.

AI-Generated Political Messaging – Imagine ChatGPT shaping responses to political questions in a way that subtly favors one ideology over another. AI can control narratives in ways we don’t yet fully understand.

Musk’s ‘Free Speech’ Paradox – If Elon Musk’s X becomes Trump’s new megaphone, what happens to opposition voices?

China and TikTok – Trump once called TikTok a national security threat. Now, its leadership was at his inauguration. Did a backroom deal happen?

Amazon’s Cloud Control – With AWS (Amazon Web Services) powering much of the internet, could web hosting be used as a political weapon?


Trump’s Digital Takeover: A Masterstroke or a Threat to Democracy?

Let’s be clear—Trump doesn’t just want to be President. He wants to control the conversation.

By aligning himself with the digital gatekeepers of the modern world, he ensures that the internet itself bends to his narrative.

  • If he controls the legacy media (Murdoch), he controls TV news.
  • If he controls the social media platforms, he controls the public discourse.
  • If he controls AI, he controls what people believe is true.

This is no longer about Trump vs. The Media.
This is Trump becoming The Media.


What Happens Next?

Expect policy changes that reshape tech regulations—but in ways that benefit the companies standing by Trump’s side.
Expect a crackdown on certain types of speech—not just from the left, but possibly even from Trump’s own critics.
Expect AI and social media to play a bigger role than ever before in shaping public opinion—but in ways we may never fully see or understand.

The internet was once seen as the great equalizer, a space for free expression. But what happens when the people who control the platforms and the people who control the government become the same people?

If 2016 and 2020 taught us anything, it’s that who controls the media controls the election.

And in 2025, Trump may have just secured the biggest media empire in history.


Are we witnessing a new era of free speech and digital democracy—or the most sophisticated attempt yet to control public perception?

And more importantly, will you even be able to tell the difference?
