Character.AI to prevent minors from accessing its chatbots
Character.AI said Wednesday that, as of next month, people under 18 will not be able to use its chatbots.
The company said it will spend the next few weeks identifying underage users and will begin limiting the time they can spend on the app. When the new rule takes effect on November 25, users under 18 will no longer be able to engage with Character.AI chatbots, according to a company blog post.
The company was sued last year by the parents of a 14-year-old who died by suicide after obsessively engaging with one of the company’s chatbots.
Character.AI said it will use age assurance methods it has developed in-house to “ensure users receive the right experience for their age,” combining them with tools created by Persona.
The firm is also establishing an independent nonprofit that it says will focus on creating safety measures for AI entertainment.
Character.AI has been under scrutiny from regulators “about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly,” the blog post said. The company also cited news reports about how AI chatbots impact children as a factor in its decision.
“These are extraordinary steps for our company, and ones that, in many respects, are more conservative than our peers,” the post said. “But we believe they are the right thing to do.”
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.