
X to stop Grok AI from undressing images of real people after backlash

Elon Musk’s AI tool Grok will no longer be able to edit photos of real people to show them in revealing clothing in jurisdictions where doing so is illegal, after widespread concern over sexualised AI deepfakes.

“We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing,” reads an announcement on X.

The UK government said the move was “vindication” of its calls on X to rein in Grok, while regulator Ofcom said it was a “welcome development”.

But campaigners and victims say the change has come too late to undo the harm already done.

Davies also characterised the platform’s overall response as “really pathetic”.

“They’re just trying to do as little as possible within the loose legal guidelines that there are,” she told BBC News.

Dr Daisy Dixon, a lecturer in philosophy at Cardiff University, previously told the BBC that people using Grok to undress her in images on X had left her feeling “shocked”, “humiliated” and fearing for her safety.

She said on Thursday the platform’s U-turn was a “battle-win” for campaigners.

“But we must remember that the abuse should never have happened – many women are now left with extensive damage,” Dr Dixon said, adding the way women relate to and experience their bodies had been “hijacked and distorted against our will”.

Andrea Simon, director of the End Violence Against Women Coalition (EVAW), said that while it remained to be seen how X would implement its changes, the move showed “how victims of abuse, campaigners and a show of strength from governments can force tech platforms to take action”.

“But it can’t stop here – given the evolving nature of AI-generated harms, tech platforms must be required to take proactive preventative action,” she said.
