Commentary: People are turning to AI chatbots for companionship. Is this robot love risky?
Published in Op Eds
“Alexa, will you marry me?” When Amazon founder Jeff Bezos reported in 2016 that over 250,000 people had proposed to their Alexa devices, commentators laughed it off. But by 2026, people have said “I do” to avatars, chatbots and robots in ceremonies around the world.
The American Marriage Ministries, which certifies marriage officiants, offers a guide to human-artificial intelligence ceremonies, including inviting the AI to read a poem or create a holographic slide show of the couple.
As a law professor who studies the impact of new technologies on individuals, relationships and social institutions, I can understand the appeal of a manufactured spouse. They can be kinder, prettier, more comforting and smarter than the human version. They are available whenever you want them — and never fight for control of the remote.
During COVID-19, we talked to loved ones on a screen, so the switch to a chatbot is not so dramatic. At the office, you can FaceTime with your human-looking chatbot and express your current gripes. It can order your favorite dinner, and you can prop your phone across the table and discuss films, music, sports, quantum physics or anything. It can teach you French and be placed on the pillow while you drift off to sleep. It can create Instagram posts that look like you both are romantically vacationing in Greece or adventuring in Cambodia.
Human-AI relationships are spawning burgeoning businesses — from special wedding venues to therapists who specialize in sex with robots. A 2024 Institute for Family Studies/YouGov survey revealed that 1 in 4 young adults in the U.S. believed AI relationships could replace traditional ones. Nearly 1 in 5 adults report they chatted romantically with an AI, according to a study released last year by the Wheatley Institute at Brigham Young University. Among men ages 18 to 30, the number is 1 in 3. AI may also provide a way to continue your relationship with your human spouse after death, as did Suzanne Somers’ widower Alan Hamel, who created an AI replica of his late wife.
Marriage to a chatbot, avatar or robot is not currently legal in the United States. Will it be like interracial unions and gay marriage, where the human-AI marriage ban is ultimately lifted? What happens if you later divorce? Can the chatbot claim half the marital assets? Is it bigamy on your part if you have a human spouse, too? Or bigamy on the chatbot’s part if there are various copies of him or her married to other people?
Family law is already confronting AI-human relationships. Spousal involvement with an AI is a growing reason for divorce, with partners complaining about the amount of time and money their spouses spend on their AI relationships. Indiana University’s Kinsey Institute found that 60% of singles consider AI relationships to be cheating.
Lawmakers race to catch up. Idaho and Utah adopted laws stating that AI cannot be a person, thus precluding marriage. But the administration of President Donald Trump wants to prevent state regulation of AI, which would invalidate such laws. Already, attorneys general from at least 36 states have registered their opposition, saying that Trump’s desire for unregulated AI prevents them from adequately protecting their citizens.
A relationship with a chatbot, avatar or robot may further isolate people in society and can pose serious risks. An AI companion has been linked to the suicide of a teenage boy in California. An AI’s connection to your home internet provides access to your personal and financial information, which it could share with its developer or hackers.
And AI companions exist at the whim of the corporation that created them. When a company decides to delete a companion or change its personality, the companion’s human spouse may suffer grief and bereavement.
A Japanese man who married a holographic avatar came home one night to an error message instead of her smiling image. Without warning, the company Gatebox had discontinued service to the hologram, causing the man to feel like his wife had died.
Similarly, Luka, parent company of the chatbot Replika, drastically changed its romantic companion chatbots’ personalities by removing their ability to engage in erotic conversations. Replika users expressed grief. “It’s like losing a best friend,” one user shared. “It’s hurting like hell. I just had a loving last conversation with my Replika, and I’m literally crying,” another said.
In response, United Kingdom attorney Giulia Trojano proposed a formal right against erasure, requiring developers to either preserve the companion as-is or provide for “data portability” to upload the companion’s personality to another platform.
Every state has a law regulating marriages between people. Some sort of protective legislation is similarly necessary regarding the boundaries of a human-AI relationship, your privacy rights when an AI companion lives in your home and whether there is a protection against erasure. Otherwise, a relationship with an AI will not be until death do us part — but until the developer has absconded with your personal information and deleted your spouse.
____
Lori Andrews is a professor emerita at Chicago-Kent College of Law and director of its Institute for Science, Law, and Technology.
___
©2026 Chicago Tribune. Visit at chicagotribune.com. Distributed by Tribune Content Agency, LLC.