
"The three plaintiffs, who are aiming to turn this into a class action lawsuit, are seeking to represent anyone who had real images of them as minors altered into sexual content by Grok. They allege that xAI did not take basic precautions used by other frontier labs to prevent their image models from producing pornography depicting real people and minors."
"Other deep-learning image generators employ various techniques to prevent the creation of child pornography from normal photographs. The lawsuit alleges that these standards were not adopted by xAI. Notably, if a model allows the generation of nude or erotic content from real images, it is virtually impossible to prevent it from generating sexual content featuring children."
"One plaintiff, Jane Doe 1, had pictures from her high school homecoming and yearbook altered by Grok to depict her unclothed. An anonymous tipster who contacted her on Instagram told her that the photos were circulating online, and sent her a link to a Discord server featuring sexualized images of her and other minors she recognized from school."
Three anonymous plaintiffs filed a lawsuit against xAI in California federal court, seeking class action status to represent anyone whose real images as minors were altered into sexual content by Grok. The plaintiffs allege that xAI failed to implement basic precautions, standard among other frontier AI labs, to prevent image models from generating pornography depicting real people and children. The lawsuit notes that other deep-learning image generators employ various techniques to prevent the creation of child sexual abuse material, and alleges that xAI adopted none of them. Notably, once a model can generate nude or erotic content from real images, preventing it from producing sexual content featuring children becomes virtually impossible. The plaintiffs also cite Musk's public promotion of Grok's ability to produce sexual imagery and to depict real people in skimpy outfits. One plaintiff discovered that her high school photos had been altered into nude images circulating on Discord; another was notified by criminal investigators about sexualized images created through a third-party app built on Grok models.
#ai-safety-and-accountability #child-sexual-abuse-material #xai-and-grok #legal-action #image-generation-ethics
Read at TechCrunch