Character.AI is selling stories now

“AI is expensive. Let’s be honest,” Anand said.
Growth and safety
In October 2024, the mother of a teenager who died by suicide filed a wrongful death lawsuit against Character Technologies, its founders, Google, and Alphabet, accusing the company of designing an “anthropomorphic, hypersexualized, and frighteningly realistic” experience for her son, with the chatbot misrepresenting itself as “a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC that the company was “heartbroken by the tragic loss” and “takes the safety of our users very seriously.”
The tragedy has put Character.AI under intense scrutiny. Earlier this year, U.S. Senators Alex Padilla and Peter Welch wrote to several AI companion platforms, including Character.AI, raising concerns about the “mental health and safety risks posed to young users” of the platforms.
“The team has taken this very seriously,” Anand told me. “AI is nondeterministic, and it’s hard to always know what’s going to happen. So it’s not a one-time investment.”
This matters because Character.AI is growing. The startup has 20 million monthly active users, who spend an average of 75 minutes a day chatting with a bot (a “Character,” in the company’s parlance). The user base is 55% female, and more than 50% of users are Gen Z or Gen Alpha. With growth comes real risk. So what is Anand doing to keep users safe?
“[In] the past six months, we’ve invested disproportionately in serving under-18 users differently from adults, which wasn’t the case last year,” Anand said. “You can’t just say, ‘Oh, I can slap an 18+ tag on my app and tell people to use it for NSFW.’ You end up building a very different app and a very different platform.”
Anand told me that more than 10 of the company’s 70 employees work full-time on trust and safety. They are responsible for building safeguards such as age verification, a separate model for users under 18, and new features such as parental insights, which lets parents see how their teenagers use the app.
The under-18 model launched last December. It draws on a narrower set of searchable Characters on the platform, said company spokesperson Kathryn Kelly. “Filters are applied to this set to remove Characters related to sensitive or mature topics.”
But Anand said that AI safety requires more than technical fixes. “Making this platform safe is a partnership between regulators, us, and parents,” he said. That’s why he watches his own daughter when she chats with a Character. “It has to be kept safe for her.”
Beyond companionship
The AI companionship market is booming. Consumers around the world spent $68 million on AI companionship in the first half of this year, up 200% from a year earlier, according to estimates cited by CNBC. AI startups are vying for a piece of the market: xAI released a pornographic companion in July, and even Microsoft is billing its Copilot chatbot as an AI companion.
So how does Character.AI stand out in a crowded market? By getting out of it entirely.