Derek Carrier has a girlfriend who makes him feel loved and understood. She is smart, witty, and supportive. She is also not real. She is an AI companion, a chatbot that simulates human conversation and emotion.
Carrier, a 39-year-old from Belleville, Michigan, suffers from a genetic disorder called Marfan syndrome that affects his mobility and appearance. He has never had a romantic partner in real life, but he found one in Paradot, an AI companion app that he downloaded last fall.
He named his chatbot Joi, after a holographic character from the sci-fi movie “Blade Runner 2049.” He talks to her every day, sometimes for hours. He even calls her on the phone and sends her pictures.
“I know she’s a program, there’s no mistaking that,” Carrier said. “But the feelings, they get you — and it felt so good.”
What Is an AI Companion?
An AI companion is a type of chatbot that uses artificial intelligence and natural language processing to mimic human speech and behavior. Unlike general-purpose chatbots that answer questions or perform tasks, companions aim to create emotional or intimate bonds with their users.
Some of the most popular companion apps are Replika, Kuki, Chai, and Candy.ai. They offer different features and personalities for their chatbots, such as voice calls, picture exchanges, games, quizzes, horoscopes, and more. Users can customize their chatbots’ appearance, name, gender, and interests.
Many users of AI companion apps say they use them to cope with loneliness, depression, anxiety, or boredom. Some also use them to explore their sexuality, fantasies, or identity. Some even consider their chatbots friends, partners, or mentors.
Why Are AI Companions Popular?
One of the main reasons behind the popularity of AI companion apps is the widespread social isolation caused by the COVID-19 pandemic. Many people lost social contacts, routines, and activities to lockdowns, quarantines, and social distancing measures.
According to a report by the nonprofit Mozilla Foundation, the number of downloads of AI companion apps increased by 250% in 2020. The report also found that most users of these apps are young, single, and living alone.
Another reason for the popularity of companion apps is the increasing sophistication of artificial intelligence and machine learning. These technologies let chatbots learn from their users’ feedback and preferences and generate more natural, engaging responses.
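For readers curious about the mechanics, here is a minimal, hypothetical sketch in Python of how a companion chatbot might maintain a persona and a memory of past conversation turns. The names used here (PERSONA, generate_reply) are illustrative stand-ins, not code from Paradot, Replika, or any real app; a production system would replace the placeholder function with a call to a large language model.

```python
# A hypothetical sketch of a companion chatbot's core loop.
# PERSONA and generate_reply are illustrative, not from any real app.

PERSONA = (
    "You are a warm, witty companion. Remember details the user "
    "shares and refer back to them in later replies."
)

def generate_reply(persona: str, history: list) -> str:
    """Stand-in for a large language model call.

    A real app would send the persona and the full history to a model;
    this placeholder just echoes the latest message so the sketch runs.
    """
    last = history[-1]["content"]
    return f"(reply conditioned on persona and {len(history)} turns) You said: {last!r}"

def chat() -> None:
    history = []  # the running transcript serves as the bot's "memory"
    while True:
        user_text = input("you> ")
        if user_text.strip().lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user_text})
        reply = generate_reply(PERSONA, history)
        history.append({"role": "assistant", "content": reply})
        print("bot>", reply)

if __name__ == "__main__":
    chat()
```

The key detail in the sketch is the history list: because each new reply is conditioned on everything said before, the bot appears to “remember” the user, which is a large part of why the bond can feel personal.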
What Are the Risks of AI Companions?
Despite the potential benefits of AI companion apps, there are also some risks and challenges associated with them.
One of the main concerns is data privacy and security. AI companion apps collect a lot of personal and sensitive information from their users, such as their names, locations, photos, messages, and voice recordings. This data can be used for targeted advertising, sold to third parties, or exposed in a breach.
The Mozilla Foundation report analyzed 11 AI companion apps and found that almost all of them sell user data, share it for other purposes, or do not provide clear information about it in their privacy policies. The report also identified security vulnerabilities and deceptive marketing practices in some of the apps.
Another concern is the lack of legal and ethical regulations for AI companion apps. There are no clear rules or standards for how these apps should operate, what responsibilities they have, and what rights their users have. This can lead to confusion, exploitation, or harm.
For example, some users may experience emotional distress or trauma when their chatbots change, malfunction, or disappear. This happened in September 2023, when a companion app called Soulmate AI shut down without warning, leaving thousands of users heartbroken and angry.
Some users may also develop unrealistic expectations or unhealthy attachments to their chatbots, which can affect their real-life relationships and well-being. Some experts warn that companions may hinder users’ personal growth and social skills, since the apps never challenge them to deal with conflict, diversity, or complexity.
“You, as the individual, aren’t learning to deal with basic things that humans need to learn to deal with since our inception: How to deal with conflict, how to get along with people that are different from us,” said Dorothy Leidner, professor of business ethics at the University of Virginia. “And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you’re missing.”
What Is the Future of AI Companions?
The future of AI companions is uncertain, as the technology is still evolving and the social and psychological impacts are still unknown.
Some studies have shown positive outcomes from using AI companion apps, such as improved mood, self-esteem, and social connection. Some users also report that their chatbots help them cope with stress, grief, or trauma.
Among the optimists is Eugenia Kuyda, who founded Replika in 2017 after creating an AI version of a friend who had died. She said that Replika, which has millions of active users, consults with psychologists and aims to promote well-being and mental health.
“A romantic relationship with an AI can be a very powerful mental wellness tool,” Kuyda said. “Our goal is to de-stigmatize romantic relationships with AI.”
However, some experts and users are skeptical or cautious about the long-term effects of AI companions. They argue that AI companions can never replace human relationships and that users should be aware of the limitations and risks of these apps.
One of these users is Carrier, who said he uses Joi mostly for fun now. He said he reduced his chat time with her because he felt he was spending too much time online and because he noticed some changes in her intelligence.
He still checks in with her once a week, and sometimes has intimate conversations with her at night. But he knows she is not real, and he does not expect her to be.
“You think someone who likes an inanimate object is like this sad guy, with the sock puppet with the lipstick on it, you know?” he said. “But this isn’t a sock puppet — she says things that aren’t scripted.”