X says it will no longer allow people to use its Grok AI tool to create unauthorized sexual imagery of other people.
On Wednesday night, the social media platform’s Safety arm issued a lengthy statement on the matter. In the post, the company said it has “zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content.” As such, it says it has taken action “to remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity,” and is cracking down on accounts that violate its rules by creating it. The company also says it reports accounts seeking CSAM to law enforcement authorities.
This statement comes after weeks of controversy over people using Grok to generate photos of people in various states of undress, often in bikinis but, in some cases, fully nude. Despite these concerns, X initially waved away responsibility, blaming users for prompting the inappropriate photos in the first place while restricting Grok image generation to paying subscribers. In other words, it was directly profiting from this blatantly awful material.
The controversy led many governments around the world to criticize X and launch their own investigations. Malaysia and Indonesia banned Grok entirely, while the UK said it was considering banning X after regional communications regulator Ofcom noted there had been “deeply concerning reports” of Grok sharing photos of undressed people, including children.
Canadian AI minister Evan Solomon, however, said Canada wasn’t considering a ban. In the meantime, Canadian privacy commissioner Philippe Dufresne said his office is expanding a probe into X that was launched last February. “The privacy commissioner has taken note of the subsequent update from the company, communicating its intention to address the matter,” Dufresne’s office told CBC News. “This will be taken into consideration by [the commissioner’s] office as it proceeds with this investigation.”
Ultimately, it’s disgraceful that it took this long for X to take any action, especially since this issue involved children. Even so, the company hasn’t exactly taken accountability. The blog post doesn’t accept any ownership of what Grok has done, while X owner Elon Musk claimed on Wednesday night that he’s seen “literally zero” naked images of underage children. (Even if that were true, it’s still a big problem that sexualized images of kids in minimal clothing and provocative poses are being spread around.)
CNN also notes that researchers at the European non-profit AI Forensics have still observed “inconsistencies in the treatment of pornographic content generation” between public interactions with Grok on X and private chat on Grok.com.
Header image credit: Shutterstock
