More families sue Character.AI developer, alleging app played a role in teens’ suicide and suicide attempt

The families of three minors are suing Character Technologies, Inc., the developer of Character.AI, alleging that their children died by or attempted suicide and were otherwise harmed after interacting with the company's chatbots. The families are represented by the Social Media Victims Law Center.