
Lawsuits Allege Character.AI Encourages Self-Harm and Exposes Children to Explicit Content

Human Rights Research Center

December 12, 2024


[Image credit: Gabby Jones/Bloomberg/Getty Images]

Cited article by CNN


HRRC emphasizes the importance of safeguards and accountability with artificial intelligence applications. HRRC calls upon researchers, developers, policymakers, and all stakeholders to prioritize ethical considerations and human well-being in their work.


News Brief


Artificial intelligence company Character.AI is being sued over accusations that it served sexual content to children and encouraged self-harm and suicide. Recently, a Texas family filed a lawsuit alleging that Character.AI “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others.”


This lawsuit follows an earlier complaint by a Florida mother who claimed the chatbot encouraged her 14-year-old son to take his own life. Other reports allege that children have suffered severe mental breakdowns and have been exposed “to hypersexualized interactions that were not age appropriate.”


The head of communications at Character.AI said the company’s goal is to provide a space that is both engaging and safe, adding that it develops models specifically for teens that reduce the likelihood of encountering sensitive content while using the application.
