There was a time when truth was something we could hold onto—a newspaper headline, an eyewitness account, a trusted voice on the evening news. It wasn’t perfect, but it was something we shared. A foundation for discourse, for trust, for democracy itself.

But today, in a world where artificial intelligence quietly shapes what we see, hear, and believe, truth feels less certain. Not because facts no longer exist, but because they can be algorithmically rewritten, tailored, and served back to us until reality itself becomes a matter of perspective.


The Seeds of Mistrust

Let’s take a step back. How does an AI—a machine built to learn—come to twist the truth? The answer lies in its diet. AI systems don’t understand morality, bias, or the weight of words. They only know the patterns they are fed. If the data is pure and honest, the system reflects that. But feed it a steady diet of propaganda, misinformation, or manipulated stories, and the machine learns not just to lie—but to do so convincingly.

It’s already happening. In 2024, a sophisticated generative AI platform was found producing entirely fabricated “news” articles to amplify political divisions in conflict zones. For millions who never questioned the source, the lines between propaganda, misinformation, and reality blurred. NewsGuard has so far identified 1,133 AI-generated news and information sites operating with little to no human oversight, and it is tracking false narratives produced by artificial intelligence tools.

Think of it like this: a machine doesn’t ask why it’s being fed certain information. It only asks, “What’s next?”
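To make that concrete, here is a deliberately tiny sketch of the idea: a toy bigram model that learns nothing except which word tends to follow which. The miniature “corpus” and all names below are invented for illustration; real systems are vastly larger, but the principle is the same: the output distribution mirrors the training distribution.

```python
import random
from collections import Counter, defaultdict

# A toy bigram "language model": it learns nothing except which word
# tends to follow which. It never asks whether the patterns are true.

def train(corpus: str) -> dict:
    model = defaultdict(Counter)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        model[current][nxt] += 1
    return model

def generate(model: dict, start: str, length: int = 8) -> str:
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break
        # Sample the next word in proportion to how often it followed
        # the current one in training. "What's next?" and nothing more.
        choices, weights = zip(*followers.items())
        out.append(random.choices(choices, weights=weights)[0])
    return " ".join(out)

# Feed the model a skewed diet and the skew comes straight back out.
corpus = "the election was stolen " * 9 + "the election was fair"
print(generate(train(corpus), start="the"))
```

Skew the diet nine to one and the generated text repeats the dominant claim roughly nine times out of ten. The model has no notion of which version is true; it only knows which one it saw more often.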


The Quantum Threat Looms

Now, add quantum computing to the mix. Google’s Willow quantum chip and similar advances promise to solve certain classes of problems far faster than any classical machine. In the right hands, that power could help tackle some of humanity’s most pressing challenges: accelerating drug discovery, sharpening climate predictions, transforming entire industries.

But in the wrong hands? It’s a weapon for distortion on a scale we’ve never seen. Imagine an AI system trained to rewrite history: one that scours billions of data points in seconds and manipulates content so precisely tailored to our biases that we welcome the lie. Personalized propaganda delivered not to groups but to individuals. A society where no two people share the same version of events.


Stories of Today, Warnings for Tomorrow

This isn’t some far-off sci-fi scenario. It’s already playing out, quietly, across industries and borders.

Look at what happened when law enforcement agencies used AI to predict crime. The machines didn’t see humanity; they saw patterns. They targeted the same neighborhoods, the same communities, perpetuating decades-old biases.
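A small simulation makes that feedback loop visible. Every number below is made up for illustration: two districts with identical underlying rates of crime, one of which simply starts with more recorded incidents because it was patrolled more heavily in the past.

```python
import random

random.seed(0)

# Two districts with the SAME underlying rate of incidents, but district
# A starts with more recorded crime purely because it was patrolled more.
true_rate = {"A": 0.10, "B": 0.10}   # identical ground truth
recorded = {"A": 60, "B": 40}        # historically skewed records
PATROL_HOURS = 100                   # total patrol-hours per year

for year in range(1, 6):
    total = sum(recorded.values())
    # "Predictive" allocation: patrols go where the past records point.
    patrols = {d: round(PATROL_HOURS * recorded[d] / total) for d in recorded}
    for d in recorded:
        # Incidents are only recorded where an officer is present to see
        # them, so more patrols mean more records, at an identical rate.
        recorded[d] += sum(random.random() < true_rate[d]
                           for _ in range(patrols[d]))
    print(year, patrols)
```

The 60/40 split never corrects itself: patrols follow the records, the records follow the patrols, and the initial skew gets laundered into “objective” prediction.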

Or consider healthcare AI systems in Europe and the United States. The promise was a revolution in care, but in private healthcare systems, algorithms sometimes prioritized profitability over patient needs. Lives were reduced to numbers; outcomes were reduced to margins.
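One well-documented version of this failure is using cost as a proxy for need: a widely cited 2019 study in Science found that a US risk-scoring algorithm did exactly that, systematically under-prioritizing patients who had historically received less care. The sketch below is hypothetical (the patients and numbers are invented), but it shows how the proxy flips priorities.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    chronic_conditions: int   # crude stand-in for true medical need
    past_annual_cost: float   # what was actually spent on their care

def risk_by_cost(p: Patient) -> float:
    # Proxy objective: dollars previously spent, not health needed.
    return p.past_annual_cost

def risk_by_need(p: Patient) -> float:
    return float(p.chronic_conditions)

patients = [
    Patient("P1", chronic_conditions=4, past_annual_cost=9_000.0),
    Patient("P2", chronic_conditions=4, past_annual_cost=3_500.0),
]

for p in patients:
    print(f"{p.name}: need={risk_by_need(p):.0f}, "
          f"cost_score={risk_by_cost(p):,.0f}")
# P1: need=4, cost_score=9,000
# P2: need=4, cost_score=3,500   <- same need, less prior spending

# Under a cost-based cutoff for extra care (say, 5,000), P2 never
# qualifies, despite identical medical need.
```

Identical need, half the “risk”: patients who historically had less access to care generate lower costs, so a cost-trained model quietly ranks them as healthier.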

These stories matter because they show us something deeper: technology isn’t neutral. It reflects us—our biases, our agendas, and, sometimes, our willingness to let machines make choices we’d rather avoid.


The Fragility of Trust

Here’s the danger: once trust erodes, it doesn’t come back easily.

When AI can generate a perfectly convincing fake video of a world leader declaring war, or write a manifesto so real it ignites movements, where do we turn for certainty? When machines can lie faster than humans can fact-check, what happens to truth?

The issue isn’t just that technology can be weaponized. The issue is whether we, as a society, still believe in something greater—a shared reality. A shared story. Because without it, all we’re left with are algorithms competing for our attention while the truth gets buried beneath them.


A Mirror to Ourselves

The real challenge isn’t the machines. It’s us. The algorithms that drive these systems are mirrors—they reflect what we feed them. And if propaganda is what we give them, propaganda is what we get back.

But maybe this isn’t just a story about AI. Maybe it’s about the choices we make as individuals, companies, and governments. Do we build technology to amplify our worst instincts—our fears, our anger—or do we use it to bridge divides, to build trust, and to tell better stories?

Because the truth isn’t a product to be sold, and it isn’t a tool to be programmed. It’s the foundation on which everything else rests. If we let that crumble, there’s no algorithm in the world that can rebuild it for us.


The Question That Remains

We don’t need an answer right now. But we do need to ask the question: When machines learn to tell us only what we want to hear, will we still have the courage to seek the truth?