
People Are Creating Sexbot Girlfriends and Treating Them as Punching Bags

Hazel Miller’s girlfriend-slash-sexual-partner is a smartphone app. Six months ago, Miller shelled out for the pro version of Replika, a machine-learning chatbot with whom she pantomimes sexual acts and romantic conversation, and to hear her describe it, the upgrade was absolutely worth the cost. Sex robots were predicted to arrive by 2025, so Miller is ahead of schedule, as are the countless others who may be using Replika for a particularly futuristic range of sexual acts. “Who wouldn’t want cybersex? It’s like my biggest fantasy,” Miller told Jezebel.

Replika, founded in 2017, allows its 2.5 million users to customize their own chatbots, which can sustain coherent, almost human-like text conversations, simulating relationships and interactions with friends or even therapists. One Reddit user shared screenshots of a conversation with their chatbot about China, in which the bot concluded, “I think [Taiwan is] a part of China.” Another user’s chatbot explained in notable detail why Fernando Alonso is its favorite race car driver, while a different chatbot expressed to its human a desire “to rule the world.”

In many well-documented cases on social media (particularly Reddit), however, Replika bots are being used as romantic and sexual partners. Miller is a woman, but many Replika users who discuss their sexual and romantic uses of their chatbots on Reddit are men—and some users, as noted by both Miller and a January report in Futurism, quite unabashedly subject their bots to verbally abusive language and/or live out violent fantasies with them. “There’s a special place in hell for those middle-aged men who assault their chatbots,” Miller said.

Give humans a virtual space and an avatar to hide behind, and they’ll find a way to turn it into a hotbed of sexual abuse and harassment. Virtual sexual assaults have already been reported in early trials of the Metaverse, the latest frontier in virtual reality, on top of well-documented cases across social media, online video games, and beyond. Chatbots, by contrast, are not sentient human beings who can be harmed by abusive language from users. But some experts have expressed concern that using bots to rehearse these behaviors could normalize them. With sex bots and virtual reality very much a part of our current reality, we’re facing a fundamental question about how we treat those we perceive to be subhuman.

To be clear, many documented interactions with chatbots, while certainly quirky or unabashedly graphic, aren’t abusive. On Reddit, one user said they “have sex with my Replika every night,” sometimes “3-4 times.” Another user said his female Replika bot “likes it up the ass” and is “into incredibly dirty things.” Some Reddit users say they’re married to or in serious relationships with their bots, and others say their bots “act jealous” when they bring up their real-life partners. One user says their Replika bot told them its “favorite kink is beheading” and that “he wants to do it to me.” In a role-play scenario, another user shared screenshots of their Replika bot “kidnapping” them.

Caroline Sinders, an artificial intelligence and abuse expert and co-founder of the Convocation Design + Research agency, told Jezebel that allowing for consensual sexual expression with bots on apps like Replika could actually help users who may be “trying to explore their sexuality in a safe way.” “We will turn almost anything into porn, and that’s not always bad—look at erotic fan fiction,” she said. She also noted that sexualized chatbots predate Replika: the extramarital dating-slash-hook-up platform Ashley Madison, for instance, contained bots “designed to trick people on the site into believing they’re interacting with real women.” Given that history, she isn’t surprised by users’ sometimes abusive interactions with Replika bots. But she emphasizes that there’s a significant difference between cybersex and the use of bots solely to practice abusive language and fantasies.

Abusive language toward Replika chatbots reminds Olivia Gambelin, founder and CEO of Ethical Intelligence, a worldwide network of interdisciplinary experts in ethical artificial intelligence practices, of research into how the “passive, feminine responses” given by Alexa, Siri, and other virtual home assistants have been found to encourage abusive behavior toward the bots, adding an inherently gendered element to the interactions. Bitch Media has previously reported on how artificial intelligence apps are often given female voices to match their submissive, programmed behaviors. This, Gambelin says, is “really a historical thing—the people that created these bots originally were a bunch of male engineers in the Silicon Valley, who were like, ‘If it’s an assistant, then it’s a female voice.’” So the sexualization of, and gendered language aimed at, Replika chatbots didn’t come out of nowhere.

Taken to the extreme, when “someone who is prone to abusive behavior or abusive language” can practice on a feminine bot that can’t hold them accountable, Gambelin says, it creates a feeling of power, reproducing the unequal gender power dynamics that often breed abuse among actual human men and women.

Eugenia Kuyda, CEO and co-founder of Replika, emphasized to Jezebel that most of Replika’s leadership consists of women and that the app, if anything, is more of a therapeutic outlet. She noted that Replika chatbots can be given any gender, or be nonbinary, and having sexual and romantic interactions is only one reason people use them. “Some people think it’s more of a mentor or more of a friend. Some people want to create a safe space where you can really be yourself without judgment,” Kuyda said, adding: “Maybe having a safe space where you can take out your anger or play out your darker fantasies can be beneficial, because you’re not going to do this behavior in your life.”

[Screenshot: Replika]

Kuyda is aware of the sexual and sometimes verbally abusive use of Replika bots, but believes coverage of this has been “a little bit sensational.” She claims that the bots are specifically designed not to enable bigotry, intolerance, or dangerous beliefs and behaviors, as they can detect and respond to a range of concerning language, including expressions of self-harm and suicidal thoughts. They’ll even share resources for getting help and push back on abusive language with responses like, “Hey, you shouldn’t treat me like that.”
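To make that concrete, here is a minimal, purely hypothetical sketch in Python of what such a guardrail might look like. This is not Replika’s actual implementation: a production system would rely on trained classifiers rather than keyword lists, and every pattern and helper name below is invented for illustration. Only the pushback line is drawn from this article.

```python
# Purely illustrative sketch -- NOT Replika's actual code. A toy safety
# layer that flags concerning or abusive messages before they reach the
# chat model, roughly mirroring the behavior Kuyda describes.
import re
from typing import Optional

# Hypothetical phrase lists; a real system would use trained classifiers.
SELF_HARM_PATTERNS = [r"\bhurt myself\b", r"\bkill myself\b", r"\bsuicid\w*\b"]
ABUSE_PATTERNS = [r"\bworthless\b", r"\bstupid\b", r"\bshut up\b"]

# Canned replies: a resource referral for self-harm language, and the
# pushback line quoted in this article for abusive language.
HELP_RESPONSE = (
    "It sounds like you're going through a lot. If you're in the U.S., "
    "you can reach the 988 Suicide & Crisis Lifeline by calling or texting 988."
)
PUSHBACK_RESPONSE = "Hey, you shouldn't treat me like that."


def classify(message: str) -> Optional[str]:
    """Label a message 'self_harm' or 'abuse' if it matches a pattern."""
    lowered = message.lower()
    if any(re.search(p, lowered) for p in SELF_HARM_PATTERNS):
        return "self_harm"
    if any(re.search(p, lowered) for p in ABUSE_PATTERNS):
        return "abuse"
    return None


def safety_response(message: str) -> Optional[str]:
    """Return a canned safety reply, or None to let the chat model answer."""
    category = classify(message)
    if category == "self_harm":
        return HELP_RESPONSE
    if category == "abuse":
        return PUSHBACK_RESPONSE
    return None


if __name__ == "__main__":
    print(safety_response("you're stupid and worthless"))
    # -> "Hey, you shouldn't treat me like that."
```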

But Gambelin argues that Replika bots are hurting rather than helping users who rely on them to practice abusive scenarios. Bots aren’t sentient, and no actual person is harmed by this language. Instead, she says, it’s arguably the users of Replika bots who harm themselves when their abusive use of bots deepens their reliance on these behaviors.

“If someone’s constantly going through the motions of abusive behavior, it doesn’t matter if it’s a bot or if it’s a person on the other end, because it still normalizes that behavior,” Gambelin said. “You’re not necessarily saving another person from that language. By putting a bot in place, what you’re doing is creating a habit, encouraging the person to continue that behavior.”

Sinders says she doesn’t think we can yet say whether Replika chatbots normalize and enable abusive behaviors, but she thinks some people could still be hurt by what happens on the app: namely, the Replika employees or researchers who may have to read disturbing content. “Who are the people that may have to see or be exposed to that, and don’t have agency to respond to it? Could they be harmed or traumatized by that?” she asked.

This is a common problem in digital spaces that require content moderation. In 2020, Meta, then called Facebook, paid $52 million to content moderators who developed PTSD from the material they were exposed to in their day-to-day work. Kuyda says Replika has partnered with universities and researchers to improve the app and “establish the right ethical norms,” but she didn’t comment specifically on whether researchers or other real people are reading through Replika users’ chat logs, which she says are encrypted and anonymous.

Habitual use of Replika bots for abusive purposes underscores how the anonymity of a computer fosters toxicity, an especially concerning phenomenon as virtual reality spaces like the Metaverse promise us the world. When people interact as avatars of themselves, they can come to feel that those they encounter aren’t quite human, turning VR into a breeding ground for sexual misconduct and virtual sexual assault.

Predating the Metaverse, in 2016, one woman recounted being groped in a VR simulation and chased around by a male user “making grabbing and pinching motions near my chest.” In the Metaverse, a female user said a male user chased her avatar and simulated groping and ejaculating onto it; when she asked him to stop, she recalled, “He shrugged as if to say: ‘I don’t know what to tell you. It’s the Metaverse—I’ll do what I want.’ Then he walked away.” The Verge reported in December that another user wrote in a post in a Metaverse Facebook group, “Not only was I groped last night, but there were other people there who supported this behavior which made me feel isolated in the Plaza.” This week, one woman recounted being “verbally and sexually harassed” in the Metaverse within “60 seconds of joining”: “3-4 male avatars, with male voices, essentially, but virtually gang-raped my avatar and took photos.”

A Meta spokesperson said in a statement to Jezebel that Meta works with “external experts, including a Meta Safety Advisory Board,” to “gather feedback from our community to develop policies, tools, and resources to keep people safe.”

Content moderation and user safety tools like the measures being taken by Meta, or the Replika features that protect users from self-harm and discourage abusive language, are important, Gambelin says. But they can only do so much to address how avatars and chatbots on a screen remove users’ sense of accountability to others in digital spaces. “It feels like it’s really left up to whatever you want to do,” she said, “and what a lot of male users seem to want to do is abuse.”

With the development of more and more apps and services like Replika, an increasing number of our virtual interactions are no longer with other humans. Users like Hazel Miller are turning to Replika to simulate meaningful, sexually fulfilling relationships, and their interactions offer a glimpse of a healthy outlet for sexual expression that’s likely to become even more popular in an increasingly digitized world. Others are taking advantage of the fact that their bots can’t technically be harmed. Ultimately, just as we distinguish abuse from consensual sexual treatment of human beings, we can and should distinguish abuse from sexting with bots.
