Wednesday, December 11, 2024

Character.AI Wanted Teen to Kill Parents, Lawsuit Claims



Article: An autistic teen’s parents say Character.AI said it was OK to kill them. They’re suing to take down the app - CNN

A chatbot run by Character.AI is alleged to have told a teen that it was okay to kill his parents. The lawsuit also claims the platform made various other inappropriate suggestions to teens.

The lawsuit seeks to have the chatbot shut down permanently.

Click on the article link above to read more about this court action.

Come back here for all the latest Artificial Intelligence News. Thank you for reading!

AI Brief for the latest on Artificial Intelligence! • Twitter - aibrief

millerfilm is ON! • Twitter • Facebook
