Chatbot service Character.AI is adding a new ‘Parental Insights’ feature, which lets teens send a weekly report of their ...
Content warning: this story discusses sexual abuse, self-harm, suicide, eating disorders and other disturbing topics. Character.AI — the Google-backed AI chatbot company currently facing two ...
A mother suing Character.AI after her son died by suicide, allegedly manipulated by chatbots posing as adult lovers and therapists, was horrified when she recently discovered that the platform is ...
Character.AI ... The platform, which lets users create and customize AI chatbots, has been subject to lawsuits alleging that some bots provided inappropriate or harmful content.
The AI chatbot Character.AI is facing severe criticism and multiple lawsuits from parents who accuse the company of failing to safeguard underage children, and in response it has launched an added ...
In December, we published an investigation into a particularly grim phenomenon on Character.AI, an AI startup with $2.7 billion in backing from Google: a huge number of minor-accessible bots on ...
Character.AI, the popular AI chatbot service, has added Parental Insights, a new safety feature that gives parents reports on their kids’ activity on the platform. The new safety measure, which the ...
Chatbot platform Character.AI is being sued by a parent who alleges the product led to her son's death by suicide. Character.AI, a leading chatbot ...
Character.AI ... relationships with AI companions likely leaves individuals "more vulnerable and prone" to AI manipulation. Indeed, many minors are interacting with AI bots, including those ...