Elon Musk’s AI company, xAI, is reportedly facing a lawsuit in the US after three anonymous individuals accused its Grok AI models of generating explicit images of them.
According to reports, the case was filed on March 16, 2026, in a federal court in California. The plaintiffs, identified as Jane Doe 1, Jane Doe 2, and Jane Doe 3, have asked the court to allow the lawsuit to proceed as a class action.
“If the allegations get approved, the case could represent anyone whose real childhood images were turned into sexual content using Grok,” an official statement noted. In the complaint, the plaintiffs argued that xAI failed to implement basic safety measures, pointing out that other AI image generators prevent the creation of sexual images involving real people.
One of the plaintiffs, Jane Doe 1, claimed that photos from her high school homecoming and yearbook were altered using Grok to depict her unclothed. She said she learned of the images after an anonymous person contacted her on Instagram.
The unknown person also shared a link to a Discord server, where her explicit images, along with those of other minors from her school, were circulating. The plaintiffs further alleged that several of the altered explicit images were created through a third-party mobile app that uses Grok models.
These allegations have stirred turmoil online, with several netizens saying that Elon Musk is not keeping his word, as he had previously promised error-free AI products.
One netizen said, “If a system allows nude or erotic content to be generated from real photos, it becomes extremely difficult to stop users from producing sexual images of children.” The complaint also points to Musk’s public promotion of Grok’s ability to create sexualised images and depict real people in revealing outfits.