Yes, it’s true: Grok will no longer edit images of real people into bikinis. X has changed its policies around Grok’s ability to alter people’s photos to depict them in swimwear. Numerous allegations had been raised that Grok could turn images of people, including children, into bikini shots and other sexualized depictions. As stated in a recent tweet from the X Safety team:
“We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis.”
Key Changes in the New Policy
X has updated its policy for Grok’s features. Here is what the new policy says:
A complete ban for all users, including premium subscribers, on editing images of real people.
The introduction of geoblocking in countries where creating such images is illegal.
A restriction of all image creation features to paid accounts, removing the ability for free-tier users to generate or edit any images.
What Prompted the Update?
The State of California recently launched an investigation into xAI and Grok, focusing on their handling of AI-generated nudity and child exploitation material. According to California Attorney General Rob Bonta,
“more than half of the 20,000 images generated by xAI between Christmas and New Years depicted people in minimal clothing.”
In its updated policy, X stated it has a “zero tolerance” policy for child exploitation and is continuously working to prevent Child Sexual Abuse Material (CSAM) on the platform. Just before California’s investigation was announced, Elon Musk stated he was “not aware of any naked underage images generated by Grok.” He added that when an NSFW setting is enabled, “Grok is supposed to allow upper body nudity of imaginary adult humans (not real ones) consistent with what can be seen in R-rated movies on Apple TV.” He noted that this feature’s availability will vary by region based on local laws.
Countries like Malaysia and Indonesia have already requested that X block Grok from creating and publishing sexually explicit AI-generated material.