Law against non-consensual intimate images to take effect next month
A Durham University professor who helped draft legislation criminalising the creation of non-consensual intimate images has expressed frustration over the UK government's delayed enforcement of the law, which will finally come into force in February.
Government delay draws criticism
Prof Clare McGlynn, a law researcher at Durham University and part of the coalition that campaigned for the legislation, said the government had not provided a reason for the postponement, despite the law being passed last June.
"They could have done it in August, September, October," she told reporters. "We have been quite frustrated." While acknowledging that laws cannot be enacted immediately, McGlynn described the delay as "quite long" given the "urgent" nature of the issue.
Recent AI deepfake scandal accelerates action
The law gained renewed attention following controversy over Elon Musk's Grok AI chatbot, which was used to generate explicit images of individuals without their consent. While sharing such deepfakes is already illegal in the UK, creating them has not yet been a criminal offence.
McGlynn praised the government's response to the recent scandal, stating: "[The government] have been really firm and really robust in their reaction to X. And they have been really clear in what they expect from Ofcom."
Deputy PM condemns online abuse
Deputy Prime Minister David Lammy condemned the abusive behaviour seen online, emphasising that perpetrators would face the "full force of the law."
"I am repulsed by the disgusting, abusive behaviour we've seen online. X's announcement is welcome, but it is imperative this government continues to take urgent action to stop vile criminals using these tools to exploit innocent women and children online."
David Lammy, Deputy Prime Minister
Lammy confirmed that the government had fast-tracked the legislation, ensuring it would become law "within weeks."
X responds with new safeguards
In a statement on its platform, X said it had "implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing."
Next steps: Platform accountability
McGlynn and her colleagues are now working on a second piece of legislation, currently progressing through the House of Lords, which would require web platforms operating in the UK to remove non-consensual intimate images at the request of the person depicted.
"If your image is up there and X aren't doing anything about it, there's very little you can do at the moment."
Prof Clare McGlynn, Durham University
The proposed legislation aims to strengthen protections for victims by holding platforms accountable for content removal.