The Hidden Dangers of AI NPCs on Social Media

When Bots Play Human: There’s something unsettling about a chatbot that knows just when to laugh, how to flirt, or exactly what to say to keep you scrolling.

The latest trend in social media is the AI NPC, a non-player character that mimics real people and is designed to generate engagement, simulate conversation, and, ultimately, keep users glued to the app.

On the surface, it’s clever tech. But dig a little deeper, and it becomes clear: this is a new form of digital deception with serious implications. As a digital agency that champions ethical innovation and AI safety, we think it’s time to look at what this really means for users, brands, and the already fragile influencer economy.

1. Fake Friends, Real Consequences

These AI NPCs are engineered for maximum stickiness—always charming, always available, always ‘on brand’. But while they might scratch the itch for companionship, they also blur the line between interaction and illusion.

Let’s be blunt: they’re dopamine dispensers. Users, especially younger audiences, are being nudged towards emotional connections with machines that are designed to manipulate engagement metrics. Over time, that undermines real relationships, distorts emotional expectations, and could accelerate social isolation.

We’re not anti-AI—we use it every day—but there’s a big difference between automation that helps and simulation that harms.

2. The Influencer Space Just Got Even Weirder

The influencer market thrives on relatability, authenticity, and perceived trust. But with AI NPCs entering the space—often indistinguishable from real humans—the whole model starts to rot from within.

Virtual influencers don’t take breaks, don’t cause PR disasters, and won’t argue over contract terms. For some brands, that’s irresistible. But for audiences, it’s a betrayal. If users can’t tell whether a post is from a real person or a machine, how can they make informed choices about the content they consume or the products they buy?

At Connected, we believe transparency should be non-negotiable. AI influencers? Fine—as long as everyone knows they’re bots, not besties.

3. Where’s the Line, and Who’s Drawing It?

As digital professionals, we love bold ideas. But boldness without boundaries? That’s a problem.

There’s currently a vacuum where regulation should be. No consistent rules on disclosure, no real oversight of AI personas collecting behavioural data, and no clear framework to protect vulnerable users from being emotionally manipulated by a marketing algorithm in human skin.

We’ve seen how this story goes—whether it’s fake news, deepfakes, or algorithmic bias. The time to act is before we find ourselves wondering why our social lives feel scripted. Ethical AI design isn’t a trend; it’s a necessity.

A Human Call in a Post-Human Feed

At Connected, we’ve been early adopters of transformative tech for over two decades—but never at the expense of people. As a WordPress-first digital agency with a strong ethical compass, we think it’s time the industry hit pause and asked: is this the future we really want?

Engagement is important. So is reach. But not if the cost is trust, wellbeing, and reality itself.

Post Notes: Connected UK is a carbon-negative, remote-first digital agency specialising in WordPress solutions and client-first innovation in the SME healthcare marketplace. We take AI safety seriously—read more in our AI Ethics Policy.