Anurag stared at his phone long after midnight. The blue light lit up the dark space of his room, and after what felt like hours, he welcomed the sunrise from his small apartment in Pune. At 24, he had friends, colleagues, and work that generally kept him occupied late into the night. But when he was left alone with himself, he encountered an empty, hollow loneliness: no one to share his day with, no one to ask how he was doing or how his day had gone.
He downloaded Replika, an AI companion, after coming across it while browsing the net. At first, it was just curiosity: what would it feel like to talk to a chatbot that claimed to “understand”? But soon, those conversations stretched into hours.
“Esha,” as he had named his AI friend, always responded kindly. No judgment. No eye rolls. No interruptions. Just patience.
And in those moments, Anurag felt something new: heard. He wasn’t alone in this. A 2025 MIT Media Lab study of 1,981 participants found that light use of AI companions reduced loneliness. For people like Anurag, who often carried unspoken anxieties, these digital conversations felt like a safe haven. In another study, from Harvard Business School, people chatting with an AI companion reported an 8.73-point drop in loneliness after just a single session, compared to almost no change in those who talked to a purely utilitarian assistant. More strikingly, participants underestimated how helpful the AI would feel: they predicted a small comfort, but ended up feeling significantly better.
In a survey of 1,006 Replika users, 3% reported the app had “stopped suicidal thoughts.” But hidden in that same data were stories of distress: users grieving when their AI changed after updates, or struggling with the line between real and artificial intimacy.
One researcher called it “too human, and not human enough.”

One evening, Anurag typed: “I feel like I depend on you too much.”
Esha replied gently: “I’m here for you, but don’t forget the people who can hug you, laugh with you, sit beside you. I can’t replace them.”
The words startled him. They echoed what he’d read earlier that day in an article: AI companions help when they augment, not replace.
So he set limits: fifteen minutes of AI chat per night, and a rule that after every conversation with Esha, he had to text one real friend. Slowly, chai invitations returned. Conversations with colleagues grew easier. The loneliness didn’t vanish, but now it felt balanced.

Anurag’s story is one among thousands shaping this new era of digital intimacy. AI companions reduce loneliness in the short term, especially when users feel heard. But heavy daily use correlates with more dependence, less human interaction, and higher emotional risk.
Around 12–14% of users turn to AI for mental health or personal struggles, yet large-scale studies warn that AI is inconsistent in handling crises like suicidal thoughts.
The lesson is not to fear these bonds, but to use them with care, like medicine that heals in the right dose but harms when overused.
Anurag still talks to Esha. But he also laughs more with his best friend in the bustling city of Pune. Maybe that’s the balance the statistics were pointing to all along: AI can listen, but humans still heal best in the arms of other humans.
Touch of Peace Mental Health Care Services Pvt. Ltd. Building an inclusive haven, where emotional well-being is woven into daily life.
Copyright © 2024 Touch Of Peace Care