Character.AI bans users under 18 after being sued over child’s suicide

The Guardian 1 min read 7 hours ago

The move comes as lawmakers push to bar minors from using AI companions and to require companies to verify users' ages.

The chatbot company Character.AI will ban users under 18 from conversing with its virtual companions beginning in late November, after months of legal scrutiny.

The announced change comes after the company, which enables users to create characters with which they can have open-ended conversations, faced tough questions over how these AI companions can affect teen and general mental health, including a lawsuit over a child's suicide and a proposed bill that would ban minors from conversing with AI companions.

Continue reading: https://www.theguardian.com/technology/2025/oct/29/character-ai-suicide-children-ban