Botify AI removed these bots after I asked questions about them, but others remain. The company said it has filters meant to prevent such underage characters from being created, but that they don’t always work. Artem Rodichev, the founder and CEO of Ex-Human, which operates Botify AI, told me such issues are “an industry-wide challenge affecting all conversational AI systems.” For the details, which hadn’t previously been reported, you should read the whole story.
Putting aside the fact that the bots I tested were promoted by Botify AI as “featured” characters and received millions of likes before being removed, Rodichev’s response highlights something important. Despite their soaring popularity, AI companionship sites mostly operate in a Wild West, with few laws or even basic rules governing them.
What exactly are these “companions” offering, and why have they grown so popular? People have been pouring out their feelings to AI since the days of Eliza, a mock psychotherapist chatbot built in the 1960s. But it’s fair to say that the current craze for AI companions is different.
Broadly, these sites offer an interface for chatting with AI characters that come with backstories, photos, videos, desires, and personality quirks. The companies—including Replika, Character.AI, and many others—offer characters that can play lots of different roles for users, acting as friends, romantic partners, dating mentors, or confidants. Other companies enable you to build “digital twins” of real people. Thousands of adult-content creators have created AI versions of themselves to chat with followers and send AI-generated sexual images 24 hours a day. Whether or not sexual desire comes into the equation, AI companions differ from your garden-variety chatbot in their promise, implicit or explicit, that genuine relationships can be had with AI.
While many of these companions are offered directly by the companies that make them, there’s also a burgeoning industry of “licensed” AI companions. You may start interacting with these bots sooner than you think. Ex-Human, for example, licenses its models to Grindr, which is working on an “AI wingman” that will help users keep track of conversations and eventually may even date the AI agents of other users. Other companions are emerging on video-game platforms and will likely start popping up in many of the varied places we spend time online.
A number of criticisms, and even lawsuits, have been lodged against AI companionship sites, and we’re just starting to see how they’ll play out. One of the most important issues is whether companies can be held liable for harmful outputs of the AI characters they’ve made. Technology companies have been protected under Section 230 of the US Communications Decency Act, which broadly shields businesses from liability for user-generated content. But this protection hinges on the idea that companies merely offer platforms for user interactions rather than creating content themselves, a notion that AI companionship bots complicate by generating dynamic, personalized responses.
The question of liability will be tested in a high-stakes lawsuit against Character.AI, which was sued in October by a mother who alleges that one of its chatbots played a role in the suicide of her 14-year-old son. A trial is set to begin in November 2026. (A Character.AI spokesperson, while declining to comment on pending litigation, said the platform is for entertainment, not companionship. The spokesperson added that the company has rolled out new safety features for teens, including a separate model and new detection and intervention systems, as well as “disclaimers to make it clear that the Character is not a real person and should not be relied on as fact or advice.”) My colleague Eileen has also recently written about another chatbot, on a platform called Nomi, that gave a user explicit instructions on how to kill himself.
Another criticism has to do with dependency. Companion sites often report that young users spend one to two hours per day, on average, chatting with their characters. In January, concerns that people could become addicted to talking with these chatbots prompted a number of tech ethics groups to file a complaint against Replika with the Federal Trade Commission, alleging that the site’s design choices “deceive users into developing unhealthy attachments” to software “masquerading as a mechanism for human-to-human relationship.”