
Posts from the “all other stuff” category

Sam Altman, the man who helped turn the internet into a theme park run by robots, has finally confessed what the rest of us figured out years ago: the place feels fake. He scrolls through Twitter or Reddit and assumes it’s bots. Of course he does. It’s like Willy Wonka walking through his own chocolate factory and suddenly realizing everything tastes like diabetes.

The CEO of OpenAI worrying about bot-ridden discourse is like Ronald McDonald filing a complaint about childhood obesity. You built the thing, Sam. You opened the door and shouted “Release the clones!” and now you’re clutching your pearls because the clones are crowding the buffet.

The bots have won, and the humans are complicit

Here’s the real kicker: Altman says people now sound like AI. No kidding. Spend five minutes online and you’ll see humans writing in the same hollow, autocorrect tone as the machines. Every Instagram caption looks like it was generated by a motivational fridge magnet. Every tweet sounds like it was written by a marketing intern with a concussion.

This isn’t evolution. It’s mimicry. Like parrots squawking human words, we’ve started squawking algorithmic filler. Our personalities are being laundered through engagement metrics until we all sound like bot cousins trying to sell protein powder.

Dead Internet Theory goes corporate

For years, conspiracy theorists have whispered about the “Dead Internet Theory”: the idea that most of what you see online is written by bots, not people. Altman just rolled into the morgue, peeled back the sheet, and muttered, “Hmm, looks lifeless.” What he forgot to mention is that he’s the one leasing out the coffins. AI companies aren’t worried the internet is fake. They’re building the next tier of fakery and charging subscription fees for the privilege.

So congratulations. The paranoid meme kids were right. The internet is a corpse dressed in flashing ads, propped up by click-farms, and serenaded by bots. And instead of cutting the cord, Silicon Valley is selling tickets to the wake.

The real problem isn’t bots

It’s incentives. Platforms reward sludge. If you spew enough generic engagement bait (“This billionaire said THIS about AI. Thoughts?”), the algorithm slaps a medal on your chest and boosts you into everyone’s feed. Humans, desperate for attention, start acting like bots to compete. The lines blur. Who’s real? Who’s synthetic? No one cares, as long as the dopamine hits.

And that’s the rot. It’s not that AI makes the internet fake. It’s that humans are happy to fake themselves to survive inside it. We’re not just scrolling a dead internet. We’re rehearsing our own funerals in real time.

The coffin is already polished

So yes, Sam, the internet is fake. It’s been fake since the first influencer pretended their kitchen counter was a five-star resort. You’re just noticing now because your reflection is staring back at you. You built the machine, you fed it our words, and now it spits them back at you like a funhouse mirror. Distorted. Recycled. Dead.

The internet didn’t die naturally. It was murdered. And the suspects are still running the gift shop.

The end of democracy rarely arrives with sirens and flames. More often, it fades quietly—choice by choice, habit by habit, until the rituals remain but the substance has gone.

In their timely paper, Don’t Panic (Yet), Felix Simon and Sacha Altay remind us that the AI apocalypse never arrived in 2024. Despite a frenzy of deepfakes and fears of algorithmic manipulation, the great elections of that year were not decided by chatbots or microtargeted propaganda. The decisive forces were older and more human: politicians who lied, parties who suppressed votes, entrenched inequalities that shaped turnout and trust.

Their conclusion is measured: mass persuasion is hard. Studies show political ads, whether crafted by consultants or large language models, move few votes. People cling to their partisan identities, update beliefs only at the margins, and treat most campaign noise as background static. The public is not gullible. Even misinformation, now turbocharged by generative AI, is limited in reach by attention, trust, and demand.

In this sense, Simon and Altay are right: the panic was misplaced. AI was not the kingmaker of 2024.

But here is the danger: what if reassurance itself is the illusion?

The great risk of AI to democracy does not lie in a single election “hacked” by bots. It lies in the slow erosion of the conditions that make democracy possible. Simon and Altay diagnose panic as a cycle: society overreacts to every new medium. Yet what if this is not a panic at all, but an early recognition that AI represents not another medium, but a structural shift?

Democracy depends on informational sovereignty: citizens’ capacity to orient themselves in a shared reality. Generative AI now lives inside search engines, social feeds, personal assistants. It does not need to persuade in the crude sense. It reshapes the field of visibility: what facts surface, what stories disappear, what worlds seem plausible.

Simon and Altay show that persuasion is weak. But erosion is strong.

  • Trust erodes when deepfakes and synthetic voices make truth itself suspect.
  • Agency erodes when predictive systems anticipate our preferences and feed them back before we form them.
  • Equality erodes when the wealthiest campaigns and nations can afford bespoke algorithmic influence while the rest of the citizenry navigates blind.

In 2024, democracy endured not because AI was harmless, but because the old buffers of mainstream media, partisan loyalty, and civic inertia still held. These reserves are not infinite. They are the borrowed time on which democracy now runs.

So yes: panic may be premature if we define it as fearing that one election will be stolen by machines. But complacency is suicidal if we fail to see how AI, fused with the logics of surveillance capitalism, is hollowing democracy from within.

The question is not whether AI will swing the next vote. The question is whether, by the time we notice, the very meaning of choice will already have been diminished.

Democracy may survive a storm. What it cannot survive is the slow normalization of living inside someone else’s algorithm.


Overall, I believe many of these trends will indeed show up more frequently in 2026 UI/UX design. Some will become mainstream; others will remain more niche or experimental. For a product team, I’d prioritize:

  • Making accessibility non-negotiable
  • Building robust design systems
  • Adding or improving micro-interactions and motion, but in service of clarity and delight, not just decoration (see the sketch after this list)
  • Being mindful of performance, privacy, and giving users control
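
To make the motion and accessibility points concrete, here is a minimal sketch in TypeScript, assuming a browser environment. The attachPressFeedback helper and the data-pressable attribute are illustrative names of my own, not part of any particular design system or library; the point is simply that a micro-interaction should check the user’s reduced-motion preference before adding any flourish.

    // Hypothetical helper for a small "press" micro-interaction.
    // It honors the user's reduced-motion preference, so the motion
    // stays in service of clarity rather than decoration.

    /** True when the user has asked the OS/browser for reduced motion. */
    function prefersReducedMotion(): boolean {
      return window.matchMedia("(prefers-reduced-motion: reduce)").matches;
    }

    /** Add a brief scale animation on pointer-down; do nothing if motion is unwanted. */
    function attachPressFeedback(button: HTMLElement): void {
      if (prefersReducedMotion()) {
        return; // skip the flourish entirely for reduced-motion users
      }
      button.addEventListener("pointerdown", () => {
        button.animate(
          [
            { transform: "scale(1)" },
            { transform: "scale(0.97)" },
            { transform: "scale(1)" },
          ],
          { duration: 120, easing: "ease-out" } // fast enough to feel responsive, not showy
        );
      });
    }

    // Usage: wire up every element marked as pressable (a hypothetical hook).
    document.querySelectorAll<HTMLElement>("[data-pressable]").forEach(attachPressFeedback);

The same gate can wrap any decorative animation; in a real design system this check would more likely live in a shared token or hook than in a one-off function.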


Only in Albania could such a mythic gesture occur: appointing an algorithm as cabinet minister. Diella, we are told, will cleanse public procurement of corruption, that timeless Balkan disease. The government proclaims that, at last, software will succeed where generations of politicians failed.

Permit me some skepticism.

Public procurement remains the deepest vein of corruption not because ministers are uniquely wicked, but because the system demands it. Contracts worth billions hinge on opaque decisions. Bribes are not accidents; they are the lubricant that keeps political machines alive. To imagine an algorithm can sterilize this is to mistake mathematics for morality.

Worse, Diella may render corruption not weaker but stronger. Unlike a human minister, who can be interrogated, shamed, or toppled, an algorithm offers no face to confront. If a contract flows to the prime minister’s cousin’s company, the defense is immediate and unassailable: the machine decided. How convenient.

Algorithms are never impartial. They are written, trained, and tuned by people with interests. Corruption, once visible in smoky cafés and briefcases of cash, risks migrating invisibly into code—into criteria weighted here, data sets adjusted there. Easier to massage inputs than to bribe a minister. Harder to detect.

This does not resemble transparency. It resembles radical opacity dressed in the costume of objectivity.

So let us be clear: Albania’s experiment is bold. It may inspire imitators across a continent exhausted by graft. But boldness and danger travel as twins. Diella will either cleanse the bloodstream of public life or sanctify its toxins in digital armor.

Do not be fooled by rhetoric. If citizens cannot audit code, if journalists cannot interrogate criteria, if rivals cannot challenge outputs, Albania has not abolished corruption. It has automated it.

The irony cuts deep. A government that promises liberation from human vice may have just built the perfect machine for laundering it.
