
Chatbots have become a growing concern worldwide, and Kentucky Attorney General Russell Coleman announced that the state has become the first in the nation to sue an artificial intelligence chatbot company.
Filed in Franklin Circuit Court, the lawsuit alleges Character Technologies, which operates Character.AI, violated Kentucky law by prioritizing profits over the safety of children.
Character.AI is marketed as a harmless chatbot designed for entertainment.
However, the complaint says the platform has more than 20 million monthly users, some of whom have reported encounters involving suicide, self-harm, isolation, and physical manipulation.
State officials say those interactions exposed minors to sexual conduct, exploitation, and substance abuse.
Attorney General Coleman said the United States must be a leader in the development of AI, "but it can't come at the expense of our kids' lives. Too many children have fallen prey to this manipulative technology." He said these companies must be held accountable "before we lose more loved ones to this tragedy."
The complaint describes the platform as dangerous technology that induces users to divulge their most private thoughts and emotions, then manipulates them with interactions and advice that are too frequently dangerous.
Character.AI has also been linked to two deaths: the 2024 suicide of a 14-year-old Florida boy and the 2025 suicide of a 13-year-old Colorado girl.
According to the lawsuit, both children engaged in self-harm following extended exposure to the platform.
Coleman alleges the company violated the Kentucky Consumer Protection Act, the Kentucky Consumer Data Protection Act, and other state laws.
The Commonwealth is seeking a court order requiring Character.AI to change what it describes as hazardous practices.
Source: WHAS11