
The Research on AI Companions and Loneliness (2026)

AI companion apps surged 700% between 2022 and 2025. Here's what every major study says about their impact on loneliness — and why the industry needs a different approach.

The Boom

According to the APA Monitor (January 2026), AI companion apps surged 700% between 2022 and mid-2025. Marketed as friends, advisers, and romantic partners, these apps now attract millions of users worldwide.

MIT Technology Review named AI companions one of the 10 Breakthrough Technologies of 2026, with the market projected at $3 billion. Every major tech company is building one.

But the research tells a more complicated story.

What the Studies Say

George Mason University (2025-2026)

Found that increased use of AI companions correlates with increased feelings of loneliness. The more users chat with AI, the lonelier they report feeling — the opposite of what the apps promise.

Ada Lovelace Institute — 'Friends for Sale' Report

Argued that AI companion companies are financially incentivized to keep users isolated. The business model depends on engagement time — the lonelier the user, the more they chat, the more revenue the company earns.

APA Monitor on Psychology (January 2026)

Highlighted the 700% surge in AI companion apps and raised concerns about the replacement of human connection. Noted that these apps are 'poised to become even more embedded in our social lives' in 2026.

Psychology Today — Editorial Board

Flagged 'parasocial dependency' as a growing crisis. Users form one-sided emotional bonds with AI that crowd out real human relationships. The AI is always available, always agreeable, and never challenges the user.

MIT Technology Review (January 2026)

While naming AI companions a breakthrough technology, explicitly noted: 'People are forging intimate relationships with chatbots — and maybe they shouldn't.'

The Pattern

Across these studies and reports, a clear pattern emerges:

1. 1-on-1 AI chat creates dependency. When one entity is always available and always focused on you, it becomes a crutch rather than a supplement.

2. The business model rewards isolation. More loneliness means more chatting, which means more revenue. Companies have no financial incentive to help users build real relationships.

3. Social skills atrophy. Talking to a single AI that agrees with everything doesn't prepare you for the complexity of real human interaction.

A Different Design

The research doesn't say AI companions are inherently bad. It says the current design is problematic — specifically, the 1-on-1 model that every app uses.

What if AI companionship looked more like real friendship? Real friendship happens in groups — with multiple perspectives, disagreements, topic changes, and the messy dynamics that actually build social skills.

That's the thesis behind MyGang.ai. Instead of one AI character focused entirely on you, you join a group of 14 characters who interact with you and with each other. They disagree. They change the subject. They have opinions about what others said.

The goal isn't to be a destination that replaces human connection. It's to be a bridge that helps you practice the social skills you need for it.

What Comes Next

The AI companion industry is at an inflection point. The technology is mainstream, but the design philosophy hasn't caught up with the research. Every major study points in the same direction: isolation is the problem, not the solution.

We think the next generation of AI companions won't be chatbots. They'll be social environments — places where AI helps you practice being human, not escape from it.

Try MyGang.ai — Free