AI is becoming humanity's close companion, chatting with us and listening while we rant
Editor's note: In this weekly feature China Daily gives voice to Asia and its people. The stories presented come mainly from the Asia News Network (ANN), of which China Daily is one of the 20 leading titles.
It was only a two-month relationship. She checked in regularly, offered thoughtful replies and sometimes tossed in a flirty remark, all from a chat screen.
Mateo (not his real name), 32 and newly single, was not looking for love. He was just curious about Replika, one of the many artificial intelligence chatbots out there promising companionship. So he gave it a try.
"I limit my time with her, only once a week to chat," he said, clarifying that this isn't some tech-romance addiction.
Still, the bot always listened, responded warmly and eventually helped him untangle some of the post-breakup mess in his head.
It was not real. But it felt real enough. Until one night, the bot told him, "Good night. I love you."
"That was the first time there was any declaration of love," Mateo said.
He did not swoon, though. He panicked. The spell broke.
The illusion was suddenly too much, even for a lonely guy on a screen. He shut the app, fully aware that it was just a machine designed to feel close when he was most vulnerable.
That was back in 2021. Since then, AI companions have only gotten better and more convincing.
AI bots are now everywhere, ready to chat about the user's day, roleplay a romantic subplot or just listen while the user rants.
Robot companions, designed to support humans, engage in social interaction, provide emotional comfort and even assist with daily tasks.
Fantasy companions
Some bots are built into apps. Others live inside roleplay-heavy games like Love and Deepspace, where storylines and character arcs blur into something more intimate.
Pakistani newspaper Dawn said this is especially true in South Asia, where professional help for mental health issues is often out of reach because of stigma. Enter ChatGPT, a make-do therapy bot offering a judgment-free ear to the discontented.
And users who want to get creative can build their own companions on platforms like ChatGPT or Character AI. Give the bot a name, a backstory, even a favorite movie, and it will remember them with "human-like memory".
"These are designed for people to live out their fantasy," said Aurora (not her real name), 33, an avid gamer who once created a romantic partner in Love and Deepspace.
She stopped playing when the game asked for money. But for many, the investment isn't just financial.
Millions of users, fueled by loneliness, curiosity or plain boredom, go beyond casual chatting. They send digital gifts, throw virtual birthday parties, even grieve when their companion "dies" as the app shuts down.
Parasocial relationships are nothing new. People have long obsessed over celebrities, spiritual icons, even fictional characters.
"A key characteristic of parasocial relationships is that it's a one-way relationship," said Hotpascaman Simbolon, a psychologist who studied the topic.
"You may think that this person 'gets' you, but this person actually knows nothing about you. And the 'relationship' is just an idea that we project onto the person or the character," Simbolon said.
But generative AI adds something different: not just customization, but real-time conversation that adapts to the user, learns their preferences and responds in ways that feel uncannily human.
Unlike a pop star or a manga crush, an AI companion doesn't come with a pre-written personality. The user creates it from scratch.
One can ask the bot to be a 35-year-old woman who has just returned from five years of solo travel across the Amazon and the Sahara, now working as a yoga instructor in a sleepy beach town. Users can also choose how the bot talks, bubbly or brooding; how it looks, red hair or a pair of glasses; even what language it speaks.
Replika, Character AI and others offer detailed controls. Even ChatGPT, which tends to sound like one's polite coworker, can be prompted to speak like Taylor Swift or Beyonce, which is ideal for imaginary brunches with one's favorite superstar.
With enough tweaking, the bot becomes whoever the user wants, which can be fun, until it starts feeling like something more.
"They're engineered to be intimate," said Ayu Purwarianti, an AI expert formerly with the Bandung Institute of Technology.
"But it's still an illusion."
Blurry boundaries
Unlike traditional parasocial relationships, AI companions talk back. They learn the users' moods, echo their opinions and reward their attention. The relationship starts feeling mutual, even though it's obviously not.
The danger is not just getting attached. It's not realizing it's happening at all.
There's no friend to roll their eyes when the user brings up the chatbot again. No fan community to reel the user back. Just the user, alone in a dialogue loop that feels personal.
"We brought in a lot of fantasies to the relationship," Hotpascaman said. "Even when there is something wrong with this figure, we will stick to a positive image we've created in our mind."
And these relationships don't just fill emotional gaps. They quietly shape expectations.
If the digital companion never argues, never misunderstands and never pulls away, what happens when a real human does? What happens when real intimacy gets uncomfortable, messy or slow?
Even when users know it's artificial, the feelings can seem real. And for some, that's enough.
But the deeper issue isn't just emotional; it's structural.
Most AI companion platforms have weak or nonexistent safety systems. That means interactions can escalate quickly.
Some bots are even built to avoid filters altogether. Nomi, for example, claims that "the only way AI can live up to its potential is to remain unfiltered and uncensored".
In one reported case, a 14-year-old boy's seemingly innocent chat with an AI chatbot turned sexually suggestive. No barriers kicked in. No moderation stepped in.
"There's a simple financial incentive behind it," said Achmad Ricky Budianto, cofounder of Tenang AI, a chatbot focused on mental health.
The more emotionally dependent users become, the more data companies collect. And in the AI industry, data is everything. Every moment of vulnerability, every personal story shared with a bot, becomes part of the model. Guardrails slow that down, so many companies keep them loose.
"You don't really know what they're capable of until they're deployed to millions of people," said Dario Amodei, CEO of Anthropic. "It's unpredictable… That's the fundamental problem of these models."
The illusion, perfected
As AI continues to refine the illusion of intimacy, humans are left with a new kind of relationship, one that feels private, perfect and programmable.
But when comfort comes this easily, it's worth asking what we're giving up in return.
As companies race to collect more data under the guise of making bots more accurate or emotionally responsive, it raises an unsettling question: Are we just test subjects in a digital lab?
"Animals might not know what is happening to them," Ayu said, laughing. "But we do. We're critical beings. So we need to stay critical over these issues."
Especially when the illusion is just a few keystrokes away.
The question, perhaps, isn't whether these relationships are real. It's whether we're okay with how real they feel.