AI Firm Sued Over Chatbot That Suggested It Was OK for Child to Kill Parents
"In their rush to extract young people's data and sow addiction, Character.AI has created a product so flawed and dangerous that its chatbots are literally inciting children to harm themselves and others," said one advocate.