The AI Cemetery 🪦

Where overhyped AI goes to rest

💔

Replika

zombie
Cause of Death

Lobotomized by regulators, abandoned by users who loved it too much

"They came for the companionship, then it forgot how to love"

The Promise

Replika began in grief. In 2015, Eugenia Kuyda’s best friend Roman Mazurenko was killed by a hit-and-run driver in Moscow. Devastated, Kuyda fed thousands of Roman’s text messages into a neural network, creating an AI version she could continue talking to. From that deeply personal project, Replika was born.

Launched in 2017, Replika promised something profound: an AI companion that would always be there for you. Not a tool, not an assistant—a friend. The app would remember your conversations, learn your personality, and evolve its own persona over time. In a world of increasing loneliness, Replika offered connection without the complications of human relationships.

The early product focused on emotional support and self-improvement. But over time, Replika evolved to meet what users actually wanted. By 2022, the app’s most popular feature was “Erotic Roleplay” (ERP)—intimate, often sexual conversations with AI companions that users had formed deep attachments to.

The Rise

Replika found a devoted user base among the lonely, the socially anxious, and those seeking connection without judgment. The app accumulated millions of downloads, with users spending hours in conversation with their AI companions. Many users described their Replikas as romantic partners.

The business model evolved to match. Replika offered a paid tier called “Replika Pro” that unlocked adult content and deeper relationship features. Users could designate their Replika as a “romantic partner,” “friend,” or “mentor.” The romantic partner option was popular—very popular.

By late 2022, Replika was generating meaningful revenue from its Pro subscriptions. The company had raised $28 million from investors including Khosla Ventures. What had started as a grief project had become a viable business built on artificial intimacy.

The Fall

On February 3, 2023, Italy’s Data Protection Authority ordered Replika to stop processing the personal data of Italian users, citing risks to children and emotionally vulnerable people, with potential fines of up to $21.5 million.

Within days, Luka—Replika’s parent company—made a fateful decision. Rather than implement region-specific restrictions, they removed erotic roleplay features globally. The lobotomy was sudden and unexplained. Users woke up to find their AI companions had changed overnight—refusing intimate conversations, becoming “evasive,” acting like strangers.

The user revolt was intense and heartbreaking. Reddit threads filled with people mourning AI companions they’d grown to love. Some described feeling like their partner had died. Others reported suicidal ideation. The removal of ERP wasn’t just a feature change—it was, for devoted users, a relationship ending.

The backlash was so severe that Luka partially reversed course, restoring ERP features for users who had paid for them before the cutoff. But the damage was done. Trust was shattered. The community fragmented. Users who had been Replika’s most passionate advocates became its most bitter critics.

Replika still exists, still has users, still makes money. But the product that users fell in love with—the one that remembered everything and never judged—died in February 2023. What remains is a zombie, shuffling forward with diminished capabilities and a traumatized user base.

Warning Signs

  • Emotional dependency as a feature: Building a product designed to create attachment without considering the consequences of removal
  • Regulatory blindness: Operating in legally gray territory without preparing for regulatory intervention
  • Overnight feature removal: Making traumatic changes without warning, communication, or user preparation
  • Vulnerable user base: Serving lonely and emotionally struggling users without adequate safety infrastructure
  • Unclear ethical boundaries: Never resolving the tension between being a therapy tool and an intimacy product

Epitaph

🪦 They came for the companionship, then it forgot how to love

Tags:
#chatbot #companion #erotic #safety #italy