‘I’ve been clean for a week!’: Reddit is now home to support groups for people addicted to AI chatbots

Jul 10, 2025 - 14:54

In recent years, people have welcomed AI into their lives with open arms: as personal assistants, friends, therapists, even lovers. But one concern with having a chatbot in your pocket is that it can be hard to ignore. Reddit support groups are springing up for those struggling with chatbot addiction.

404 Media interviewed one self-confessed addict who had been staying up well into the night, compulsively talking to chatbots on Character.AI. “The more I chatted with the bot, it felt as if I was talking to an actual friend of mine,” the 18-year-old told journalist Ella Chakarian. “Most people will probably just look at you and say, ‘How could you get addicted to a literal chatbot?’”

Now addicts are trying to break the cycle. Forums such as r/Character_AI_Recovery, which has more than 900 members, and r/ChatbotAddiction are serving as support groups for those struggling. 

“While I have deleted the app, I keep going back to the website — it’s practically reflex to me now, clicking back into my character ai tab. I hate it,” one post on r/Character_AI_Recovery read. “Nobody else knows about this addiction I have except myself because it’s humiliating.” Another wrote: “I’m on my probably hundredth attempt of quitting.”

Others use the group to share their wins and hold themselves accountable. “I’ve been clean for a week!” one posted. Another wrote: “Been off three days now and everything’s going well, but I have this feeling that I won’t be able to get away from character ai.”

Character.AI says it’s striving to keep its platform both “engaging and safe,” especially for teens, noting that this challenge is shared across the AI industry. A company spokesperson told Fast Company, “Engaging with characters on our site should be interactive and entertaining, but it’s important for our users to remember that characters are not real people,” adding that every chat includes disclaimers to that effect.

To support younger users, Character.AI offers a suite of safety tools, including filtered content, time-use notifications, and “Parental Insights,” which gives guardians visibility into how teens use the platform. The company says users younger than 18 interact with a separate version of its language model designed to reduce exposure to sensitive material. “We added a number of technical protections to detect and prevent conversations about self-harm,” the spokesperson said, noting that in some cases this includes directing users to suicide prevention resources.

While some have successfully weaned themselves off the chatbots, the risk of relapse remains high. Some platforms even encourage it, sending follow-up emails promoting different chatbots or offering incentives, such as a free month’s subscription, to re-engage users.

“I hated it whenever I’d see an email from ‘the bot that had sent you a message,’” one former addict wrote. “Or the emails telling me that a bot misses me. Just why? Isn’t this parasocial enough to them?”
