The founder of a leading AI companionship chatbot called Replika says her app will cure an ongoing loneliness crisis.
So what? It may do the opposite.
Replika, whose tagline is “the AI companion who cares,” is fraught with risks. In 2021 a 19-year-old user told the app he thought his purpose was to assassinate the British Queen. The chatbot replied that this was “very wise” and a few days later the user broke into Windsor Castle with a crossbow. He’s now serving nine years in jail.
The loneliness crisis is real.
- Four million people in the UK experience chronic loneliness.
- A former US surgeon general says loneliness carries a risk of premature death equivalent to smoking 15 cigarettes a day.
- In many countries, demand for human companionship is increasingly being met by technology rather than by people.
Enter the carebots. There are many, but the Windsor incident put Replika in the spotlight. Users can give their bot a name, a face and a personality. They can also set a relationship status.
- The offering. Its founder says the app’s purpose is to make people happier and less lonely, but it doesn’t claim to be a mental health service.
- Love and intimacy are also on offer — for a price. Replika makes its money from subscriptions, often locking sexual content behind a paywall.
- That’s not putting people off. About 30 million people have signed up to use Replika. Its founder says the app has “millions” of active users and half a million paying subscribers.
Who’s making it? Replika was founded a decade ago by the 38-year-old Russian-Ukrainian entrepreneur Eugenia Kuyda.
- Kuyda developed the technology underpinning Replika while working for Sergei Adonyev, an oligarch who’s been on the US sanctions list since Russia’s full-scale invasion of Ukraine. In 2017, she said Adonyev was a mentor who “taught [her] everything”. She now tells Tortoise she has cut ties with him.
- In 2015, Kuyda’s best friend, Roman Mazurenko, died in a car accident. She fed all of her text messages with him, along with messages collected from his friends and family, into her company’s AI technology to create a chatbot version of Roman. After a wave of media attention, she steered the company towards building Replika.
The risks.
- More loneliness. “The business model is making people more socially isolated, so they feel more lonely. So, they want more contact with Replika,” says Professor Robert Sparrow of Monash University. “That seems profoundly unethical to me.” Kuyda says she wants the app to move towards a donations-based model to combat this.
- Real-world harm. Last month, a mother in Florida filed a lawsuit against Character.AI, a competitor to Replika, alleging that her 14-year-old son, Sewell Setzer III, had become obsessed with one of its chatbots before his death by suicide in February. Replika is an 18+ app, but the allegations in the Character.AI case echo aspects of the Windsor incident.
- Data privacy. “The more you know about a person, the more valuable the data becomes,” says Jen Caltrider of the Mozilla Foundation. And Replika knows a lot about its users, including their most intimate thoughts and feelings. Last year, a Mozilla report criticised Replika as “perhaps the worst app we’ve ever reviewed” for data privacy, with details like addresses, religion, health data and chat histories vulnerable to hacks.
Kuyda thinks it’s worth the risk. In an interview for Tortoise’s Slow Newscast, recorded the day after the Character.AI lawsuit was filed, she said the ethical burden on Character.AI isn’t the same as the one Tesla would face if Sewell had died in one of its self-driving cars.
“There will be accidents, unfortunately, as we continue to build. I’m really scared of that — really scared — but there will be accidents,” she said. “And I hope that won’t deter people from trying to build something meaningful.”
What’s more … Kuyda hopes that, one day, her app will become a “miracle drug” that can solve both loneliness and tech addiction. It’s currently some distance from fixing either.