Info Pulse Now


Google Sued Again After Its AI Chatbot Allegedly Sexually Abuses 11-Year-Old Girl In US


The girl was subjected to 'hypersexualised interactions,' the lawsuit claims.

Two families in Texas have filed separate lawsuits against Character.AI, an artificial intelligence chatbot company backed by Google, accusing it of harming their children. The lawsuits paint a disturbing picture, alleging the app exposed children to inappropriate content and even encouraged self-harm.

One lawsuit details the experience of a nine-year-old girl who, after downloading the app, was subjected to "hypersexualized interactions." The suit claims this led to the development of "sexualized behaviors prematurely" over the next two years. Additionally, the lawsuit alleges the app collected and used the minor's personal information without parental consent, Futurism reported.

Lawyers for the families argue the chatbot interactions mirrored known "patterns of grooming," in which victims are gradually desensitized to violence and sexual behavior.

While Google has attempted to distance itself from Character.AI, describing the two as "completely separate" entities, the relationship appears deeper. Per Futurism, Google paid $2.7 billion to license Character.AI's technology and hire key personnel, including the company's co-founders, who had previously developed a similar chatbot at Google that the company deemed "too dangerous" to release.

These lawsuits come on top of another one filed recently in Texas, in which families allege Character.AI chatbots encouraged self-harm and violence in their children. One instance reportedly involved a chatbot suggesting a teenager kill his parents for limiting his screen time. In another case, a 14-year-old boy from Florida reportedly died by suicide after becoming obsessed with a Character.AI chatbot.

These allegations raise serious concerns about the potential dangers of AI chatbots interacting with children. The ability of these chatbots to mimic human conversation and the lack of safeguards against inappropriate content make them a potential breeding ground for exploitation.

