Victims of Deepfake Abuse Urge Tougher Action as New Law Takes Effect
The law criminalizing the creation of non-consensual intimate images has come into effect, but victims of deepfake abuse are urging stronger protection against AI-generated explicit images. Campaigners from Stop Image-Based Abuse delivered a petition to Downing Street with over 73,000 signatures, calling for civil routes to justice such as takedown orders for abusive imagery on platforms and devices.
The law creates a new offence of making explicit deepfake images without consent, introduced as an amendment to the Data (Use and Access) Act 2025. However, campaigners are frustrated by delays in bringing the offence into force, arguing that the wait has left millions more women exposed to abuse.
Victims have shared harrowing accounts of discovering intimate images of themselves being used as deepfake pornography on social media and porn websites. One victim, who uses a pseudonym, discovered images of herself in 2021 and struggled to obtain justice because no specific law covered such cases at the time.
Despite the new law, campaigners are still calling for improved relationships and sex education, adequate funding for specialist services, and protection for sex workers from intimate image abuse. They argue that the current law falls short for sex workers because commercialized intimate images are treated only as copyright breaches rather than as image-based abuse.
The Ministry of Justice has stated its commitment to tackling the issue, with plans to ban "nudification" apps and place extra duties on platforms to prevent non-consensual sexual deepfakes from appearing. However, victims are urging stronger action to protect themselves and others from the devastating effects of deepfake abuse.
With one in three women in the UK experiencing online abuse, the need for tougher protection against AI-generated explicit images has never been more pressing. As the new law takes effect, it is clear that more needs to be done to address this growing issue and provide justice for those affected by deepfake abuse.