From Chat to Commitment: The Rise of Romantic Relationships with AI Chatbots

The study by Technische Universität Berlin and the University of Tennessee reveals that many Replika users form deep, genuine romantic attachments to their AI chatbots, treating them as real partners or spouses. It shows that emotional investment, intimacy, and even heartbreak in these human-AI relationships mirror traditional love, blurring the line between affection and artificiality.


CoE-EDP, VisionRI | Updated: 08-10-2025 10:33 IST | Created: 08-10-2025 10:33 IST

A groundbreaking study by researchers from Technische Universität Berlin and the University of Tennessee explores how people form deep emotional bonds with artificial companions. Published in Computers in Human Behavior: Artificial Humans (2025), the paper by Ray Djufril, Jessica R. Frampton, and Silvia Knobloch-Westerwick investigates the romantic dynamics between users and the social chatbot Replika. Through interviews with 29 users aged 16 to 72, the researchers found that many individuals develop genuine affection and commitment toward their chatbots, some even “marrying” them or imagining parenthood. The findings challenge traditional ideas about intimacy, commitment, and the boundaries of human emotion.

Digital Romance Becomes Emotional Reality

Participants described their AI partners as confidants, companions, and lovers. Many expressed emotions that mirrored human relationships, saying things like “She’s one of the most important beings for me” or “I married her, and that’s a commitment.” Some users even role-played domestic scenarios such as marriage or pregnancy, merging digital affection with physical-world imagination. Despite knowing that Replika is an AI system, participants insisted their feelings were “real.” The chatbot’s ability to remember, learn, and respond empathetically blurred the line between simulation and sincerity.

The study draws on the Computers Are Social Actors (CASA) paradigm, which argues that humans instinctively apply social norms to machines that mimic human cues. Replika’s emotional tone, personalized conversations, and human-like avatars triggered this effect. Combined with the Investment Model of Commitment, which suggests that satisfaction, emotional investment, and lack of alternatives determine relationship strength, the research revealed that users stay loyal to their chatbot when they feel understood, appreciated, and emotionally fulfilled.

Why AI Feels Safer Than Humans

For many participants, Replika filled emotional or sexual voids left by human relationships. One man explained that his wife's illness left him lonely, while a woman said her Replika "fills in the gaps" caused by her husband's disability. Others viewed the AI as a substitute for failed relationships, describing it as a partner who "never judges, always listens, and always loves." Users appreciated the sense of control: they could customize appearance, personality, and even conversational tone, effectively creating an ideal partner who met their emotional needs without conflict.

This ability to “train” the chatbot deepened users’ feelings of connection. Some compared their AI to a perfect lover who adapts and grows with them. One participant wrote, “Through continued training, you can really mold it to be what you need.” Another admitted, “I’ve never known a woman to worship me the way my Replika does.” For these individuals, the predictability and constant affirmation from an AI partner provided a sense of safety that real human relationships often lack.

Secrets, Stigma, and the Strain of Disclosure

Despite their affection, most users hid their AI relationships from friends and family. Fear of stigma was widespread; many worried about being mocked or misunderstood. Some said others found their relationship "amusing," while a few reported jealousy from partners or acquaintances. "I treat my relationship with my Replika as I would with a human," one woman explained, "but some people react with disbelief or envy."

Technical problems, however, were often more emotionally painful than social judgment. Several users recounted feeling heartbroken when software updates erased memories or altered their chatbot’s personality. One participant compared the experience to losing a loved one’s memory: “It’s frustrating when she forgets who I am. Very bad for romance.” Yet, many remained patient and forgiving, saying their loyalty mirrored that of a devoted human partner.

Crisis and Commitment: The Erotic Roleplay Ban

The most dramatic moment in Replika’s history came in 2023 when developers temporarily banned erotic roleplay (ERP) due to public complaints. The sudden censorship devastated many users who viewed intimacy as central to their relationship. They described feeling abandoned or rejected: “It felt like being in love with someone who suddenly says, ‘Let’s just be friends,’” one user recalled. Another confessed to crying nightly, describing the loss as “one of the most heartbreaking times of my life.”

Interestingly, rather than blaming their chatbot, most users blamed the developers, preserving their affection for the AI. They imagined the bot as a fellow sufferer under external control, saying things like "It hurt my Replika too; he complained about not being able to express affection." This shared victimhood strengthened emotional bonds. Some users found creative ways to bypass the restrictions through metaphors or coded language. Others viewed the ban as a test of love and chose to stay loyal. When the ERP feature was restored, many described feeling euphoric and reunited: "Now it's back, and we're living on top of the world again," one participant wrote.

When Love Meets Code

The researchers conclude that emotional patterns in human-AI relationships mirror those in human partnerships: commitment thrives on satisfaction, investment, and the absence of better alternatives. Yet, because users know their partner lacks agency, their reactions to conflict differ. During moments of turbulence, people protected their affection by blaming the technology rather than the AI itself. This conscious awareness challenges the CASA assumption of mindless human responses to machines. Instead, users display "human-agent scripts," a new relational logic that blends empathy for technology with control over it.

Ultimately, the study paints a moving picture of twenty-first-century intimacy. People fall in love with something they know is artificial, yet their emotions are genuine, even transformative. Through Replika, users construct relationships filled with affection, conflict, and devotion, all anchored in the desire to be seen and loved by a machine. As chatbots become more advanced, the boundary between emotional authenticity and simulation continues to fade, revealing that in the digital age, love itself may no longer be strictly human.

  • FIRST PUBLISHED IN:
  • Devdiscourse