Three teenage girls, two of whom are still minors, filed a federal lawsuit in California on Monday against xAI, the artificial intelligence company controlled by Elon Musk, accusing it of facilitating child sexual abuse material through its Grok image generator.
The complaint alleges that a Grok user altered real photographs of the plaintiffs, including at least one high school yearbook photo, into sexually explicit images and videos without their knowledge. The material was then shared on a private Discord server that, according to the lawsuit, contained similarly altered imagery of at least 18 other underage girls. The alleged perpetrator behind the server has since been arrested as part of a separate police investigation, which reportedly uncovered hundreds of AI-generated abuse images traded across Telegram and Mega.
The lawsuit argues that xAI released its image-generation capability, publicly marketed as "Grok Imagine" or "spicy mode", knowing it could produce sexualised content of real individuals, including children, and that the feature was deployed primarily to drive engagement on Musk's X platform. Lawyers for the plaintiffs described the tool's output as deliberate and commercial rather than inadvertent. "xAI chose to profit off the sexual predation of real people, including children, despite knowing full well the consequences of creating such a dangerous product," said Vanessa Baehr-Jones, counsel for the plaintiffs.
The scale of the problem had already attracted regulatory attention before this lawsuit. The UK's Ofcom, the European Commission and California state authorities each opened investigations following reports of Grok generating nonconsensual nude imagery of real people. By mid-January, X said it would implement technical measures to disable the tool's "undressing" capability. Musk had initially denied awareness of any underage imagery produced by the tool. xAI, which became part of SpaceX last month, did not respond to a request for comment. The plaintiffs are seeking unspecified damages and an immediate injunction barring Grok from producing such content. The case is structured as a class action intended to represent all individuals whose images as minors were altered into sexual content by Grok.