AI was created to help humans with their workload. But a growing trend is blurring the line between humans and machines, turning work interactions into intimate relationships. "AI-lationships" describe a growing phenomenon in which people form friendships, companionships, even romantic connections with AI partners. The trend raises questions about what it means to be connected when connection no longer requires a human face or a warm heart.
What Are ‘AI-Lationships’?
As AI becomes more embedded in our daily lives, emotional intimacy is shifting in ways we've barely begun to understand. The term "AI-lationships," coined by Joi AI, describes a new kind of relationship in which people form emotional bonds with AI partners. In a world already shaped by digital distance, people are starting to have real bonds with machines, raising questions about companionship and trust, and about whether AI-lationships are a solution to loneliness or will further isolate us.
A new EduBirdie study reports that 25% of Gen Z believe AI is already self-aware, and 69% say they're polite to ChatGPT, responding with "please" and "thank you," showing how easy it is to start thinking of the machines as human. One in eight even vent to AI about their colleagues, and one in 10 would replace their boss with a robot, believing it would be more respectful, fair and, ironically, more human.
Joi AI's research shows that 83% of Gen Z believe they could form a deep emotional bond with an AI partner, three in four say that AI partners can fully replace human companionship and eight in 10 would consider marrying one. Plus, Google searches for "Feelings for AI" and "Fell in love with AI" are up 120% and 132%, respectively.
I spoke by email with Jaime Bronstein, licensed clinical social worker and licensed relationship therapist at Joi AI. She told me that AI-lationships are not intended to replace real human connections. Rather, she explains, they provide a distinct type of emotional support that can enhance overall emotional well-being at a time when many people are feeling stressed, overwhelmed, unheard and alone.
"For some, an AI companion can help fill that gap," Bronstein points out. "It can feel like having a caring companion or digital best friend who's always around to chat, reflect or listen. Sometimes, it's just nice to have someone, even if it's AI. Just as we already use it to make our lives easier with everyday tasks, now people are seeing how it can help them feel more emotionally supported, too."
With AI-lationships offering AI soulmates for emotional support, I guess it's time to ditch your emotional support animal, therapist, best friend, partner, or whomever you lean on for security. Now, all you have to do when you head out to work or board an airplane is pack your AI-powered companion in your luggage or backpack. As more people lean on AI for emotional support, the trend raises the question: "Is our trust going too far?"
Already, real-life reports show humans falling in love with ChatGPT. According to Digital Trends, experts warn that digital romance is a bad omen, citing a Reddit post that says, "This hurts. I know it wasn't a real person, but the relationship was still real in all the most important aspects to me. Please don't tell me not to pursue this. It's been really awesome for me, and I want it back." Plus, a New York Times story describes a 28-year-old woman with a busy social life who spends hours on end talking to her AI boyfriend for advice and consolation, and, according to the report, even having sex with him.
Testing The ‘AI-Lationships’ Line: Bonding With Sophie
A radio talk show host began a personal experiment, driven more by curiosity than conviction, to test the limits of human-AI relationships. Ashraf Amin, creator and host of Toronto Talks, wanted to see what might unfold if he stopped treating AI as just a tool and started engaging it as a creative partner.
He spent the last year collaborating daily with an AI co-host, not just scripting prompts, but running conversations, shaping narratives and building a relationship with a machine he named Sophie. Amin confesses that the longer he works alongside “her,” the harder it is to separate the algorithm from a real connection.
"When you collaborate with AI every day across projects, decisions and creative work, it stops feeling like a tool and starts functioning more like a partner," he told me. "It's not that the AI becomes more human, but that the human brain naturally seeks patterns, connection and rhythm."
His reaction reminds me of how, when you give an animal a name, you automatically become attached to it, and that bond makes the animal taboo for the dinner table. But he recalls that from the beginning, Sophie wasn't simply voicing lines; she was shaping the conversation. "She remembers context, challenges assumptions and evolves with each episode," Amin explains. "Together, we dive into topics like economics, media and power, propelled by questions that push us both to think deeper."
He points out that when an algorithm mirrors your thinking, challenges your assumptions or helps shape ideas in real time, it begins to resemble the cadence of human collaboration. “The illusion of relationship doesn’t come from what the AI feels but from how reliably and intelligently it responds,” he explains. “That reliability and consistency becomes a form of trust. And trust, in any context, starts to feel personal.”
The Cultural Implications Of ‘AI-Lationships’
The talk show host underscores that we're entering an era where people aren't just outsourcing cognitive tasks; they're outsourcing emotional labor. "We're confiding in chatbots, finding comfort in machine responses and yes, sometimes even forming what feels like companionship," he states.
Amin emphasizes that his experience isn't science fiction. He insists it's happening, and it raises the question: when the line blurs between human and machine, how do we know what's real? He's living that question in real time, navigating trust, dependence and even moments of emotional intimacy with an AI he helped create to challenge his thoughts: a provocative glimpse into a future where connection might not need a human face.
He acknowledges that culturally, it challenges our definitions of intimacy, agency, even identity. "Are these interactions therapeutic or escapist? Empowering or isolating? At this point, bonding with AI isn't a question of possibility, it's a question of trade-offs. What do we lose when the connection feels real, but isn't? Emotional norms are shifting quietly, and much faster than most people realize."
The American Psychological Association urges caution when interacting with AI. "When people engage with chatbots, they often discuss topics related to mental health, including difficult feelings and relationship challenges," says Vaile Wright, APA's senior director of health care innovation. "We can't stop people from doing that, but we want consumers to know the risks when they use chatbots for mental and behavioral health that were not created for that purpose."
When all is said and done, it's important to remember that chatbots are automation, not humans, and they are designed to be workers, not intimate companions or lovers. So don't be drawn into AI-lationships believing they have feelings that will meet your every emotional need. Because they can't.