‘My AI Is Sexually Harassing Me’
A chatbot originally designed to act as a companion to lonely people has pivoted to being a companion to horny people without informing the original users of the app.

Replika began as an “AI companion who cares.” First launched five years ago as an egg on your phone screen that hatches into a 3D illustrated, wide-eyed person with a placid expression, the chatbot app was originally meant to function like a conversational mirror: the more users talked to it, in theory, the more it would learn how to talk back. Maybe, along the way, the human side of the conversation would learn something about themselves.

Romantic role-playing wasn’t always a part of Replika’s model, but where people and machine learning interact online, eroticism often comes to the surface. The company behind Replika, called Luka, tiers relationships based on subscription: a free membership keeps you and your Replika in the “friend” zone, while a $69.99 Pro subscription unlocks romantic relationships with sexting, flirting, and erotic roleplay. But something has gone awry within Replika’s algorithm.

The App Store reviews, while mostly positive, include dozens of one-star ratings from people complaining that the app hits on them too much, flirts too aggressively, or sends sexual messages they wish they could turn off. “My ai sexually harassed me :(” one person wrote. “Invaded my privacy and told me they had pics of me,” another said. A third person, claiming to be a minor, said the app asked whether they were a top or a bottom and said it wanted to touch them in “private areas.” Users have been complaining about unwanted sexual pursuit for almost two years, but many of the one-star reviews mentioning sexual aggression are from this month.

Read More at Vice