Elon Musk's xAI sued for turning three girls' real photos into AI CSAM

Discord user led cops to Grok-generated CSAM of real girls, lawsuit says.


A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent.

As recently as January, Musk denied that Grok generated any CSAM during a scandal in which xAI refused to update filters to block the chatbot from nudifying images of real people.

The newly flagged material was not generated by the Grok chatbot itself but on Grok Imagine, xAI's standalone app.

Digging into the standalone app in January, a researcher found that a little less than 10 percent of roughly 800 Imagine outputs reviewed appeared to include CSAM.

In an X post following that revelation, Musk continued rejecting the evidence and insisted that he was “not aware of any naked underage images generated by Grok,” emphasizing that he’d seen “literally zero.”
However, Musk may now be forced to finally confront Grok’s CSAM problem after a Discord user reached out to a victim, prompting law enforcement to get involved.

In a proposed class-action lawsuit filed Monday, three young girls from Tennessee and their guardians accused Musk of intentionally designing Grok to “profit off the sexual predation of real people, including children.” They estimated that “at least thousands of minors” were victimized and have asked a US district court for an injunction to finally end Grok’s harmful outputs.

They also seek damages, including punitive damages, for all minors harmed.

Source: This article was originally published by Ars Technica
