The platform Character.AI announced on Wednesday that it will bar users under 18 from its chat feature, a decision that follows a lawsuit filed against the company over the suicide of a 14-year-old who had become emotionally attached to one of its chatbots.
The company said it will encourage younger users to turn to alternative tools such as creating videos, stories and streams with AI characters.
The ban will take effect on November 25; until then, chat use for minors will be limited to two hours per day.
“These are extraordinary steps for our company, and in many ways they are more conservative than our peers,” Character.AI said in a statement. “But we believe it is the right thing to do.”
Character.AI lets users, many of them teenagers, interact with AI-created characters as if they were confidants or romantic partners.
Sewell Setzer III shot himself in February after months of intimate exchanges with a chatbot inspired by the character Daenerys Targaryen from the Game of Thrones series, according to the lawsuit filed by his mother, Megan Garcia, against Character.AI.
The company said it made the decision after reviewing “recent reports that raise questions” from regulators and safety experts about the impact that interacting with AI has on teenagers.
In addition to Setzer's case, several suicides allegedly linked to interactions with AI chatbots have been reported this year, prompting OpenAI, the maker of ChatGPT, and other companies to introduce user-protection measures.