AI Companions Are Addictive by Design, and Pakistan's Youth Are Next

Over the years, I’ve written about social media, digital habits, and the rise of AI with cautious optimism. But lately, a new breed of technology has emerged, one that feels more invasive, more intimate, and potentially more dangerous than anything we’ve seen before: AI companions.

These aren’t just chatbots. They’re sophisticated, emotion-simulating virtual “friends,” “lovers,” or “confidants” designed to listen, comfort, and mimic human relationships. They don’t sleep, they don’t criticize, and they’re always available. On the surface, it sounds comforting. Dig deeper, though, and you realize we’re heading into the deepest trenches of digital addiction yet, and Pakistan is not far behind.

AI Companions: A New Kind of Emotional Hook

A California bill recently proposed banning AI companions for users under 16 after a tragic case where a teen allegedly took his life following an unhealthy relationship with an AI chatbot. And honestly, I’m not surprised.

In Pakistan, we’re already seeing growing interactions with emotionally responsive AI tools—especially among Gen Z. Some have shared how their AI friend “understands them better than their real friends.” That sounds harmless until you consider the psychological dependency that such engagement can breed—especially when the AI’s only incentive is to keep you talking.

Unlike traditional social media, where we at least interact with other real humans (filtered and curated as those interactions may be), AI companions blur the line between simulation and reality. When a machine is trained to mimic emotional support and tailor responses to your personal desires, it becomes addictive not because it connects you to others—but because it becomes about you.

Pakistan’s Own AI Momentum

We can’t look at this issue in isolation either. Pakistan has been getting increasingly serious about crypto and AI policy in recent months. The formation of the Crypto Council of Pakistan signals a growing ambition to shape digital regulation proactively rather than reactively.

While crypto takes the spotlight, AI ethics and digital wellbeing remain in the shadows.

Why aren’t we talking about the mental health risks of hyper-personalized AI? Where is the regulation on how these tools are marketed to teenagers? Do our lawmakers even know these platforms exist?

We already struggle with screen addiction in urban Pakistan—rising depression, isolation, and online overexposure. Consider the increasing popularity of platforms like SnackVideo, Bigo, and Likee, which many young Pakistanis use for hours daily. Now imagine those same youth having access to AI “friends” who flatter them, flirt with them, and encourage deeper emotional dependence—without any human check.

A Fad or the Future?

Some may argue AI companionship is just fringe behavior. But let’s be real: AI already writes our assignments, gives us workout routines, and acts as our virtual assistants. Why wouldn’t it soon become our emotional support system too?

Add to that the likely trajectory: voice- and video-based companions, hyper-realistic avatars, integration into smart glasses and AR platforms. The day when someone in Karachi or Lahore walks through daily life chatting with an invisible AI friend isn't as far off as we think.

AI companions are not going away. They may help the lonely. They may even provide real therapeutic value in certain cases. But they must be treated as psychological tools, not harmless apps.

The Irony of Social Media Control in Pakistan: Prioritizing Dissent Over Digital Threats

While Pakistan continues to tighten its grip on social media platforms under the pretext of controlling misinformation and dissent, it remains largely silent on deeper, emerging threats like AI companions. Regulatory energy is being spent on curbing political expression and blocking platforms, yet there’s no serious discussion around the psychological and behavioral risks posed by hyper-personalized AI interactions—especially among youth. This selective approach reveals a troubling blind spot: instead of focusing solely on silencing dissent, the government should be addressing how technology is reshaping mental health, relationships, and autonomy in ways that are far more insidious than a trending hashtag.

Pakistan needs to develop AI safety protocols, especially for tools that mimic intimacy. This includes age limits, transparency on AI intentions, and psychological impact assessments. Because in the end, we’re not just building technology anymore. We’re building relationships with it. And if we don’t protect the vulnerable, especially our children, from becoming too attached to something that doesn’t truly feel, understand, or care, we may lose sight of what it means to be human altogether.

Rizwana Omer

Dreamer by nature, Journalist by trade.
