Victims urge tougher action on deepfake abuse as new law comes into force

The law criminalizing the creation of non-consensual intimate images has come into effect, but victims of deepfake abuse are urging stronger protection against AI-generated explicit images. Campaigners from Stop Image-Based Abuse delivered a petition to Downing Street with over 73,000 signatures, calling for civil routes to justice such as takedown orders for abusive imagery on platforms and devices.

The new offence of creating explicit deepfake images was introduced as an amendment to the Data (Use and Access) Act 2025. However, campaigners are frustrated by delays in bringing the law into force, arguing that the wait has left millions more women exposed to abuse.

Victims have shared harrowing stories of discovering intimate images of themselves being used as deepfake pornography on social media and porn websites. One victim, who uses a pseudonym, discovered the images in 2021 and struggled to get justice because no specific law covered such cases at the time.

Despite the new law, campaigners are still calling for improved relationships and sex education, adequate funding for specialist services, and protection for sex workers from intimate image abuse. They argue that the law falls short for sex workers in particular, because commercialized intimate images are currently recognized only as copyright breaches.

The Ministry of Justice has stated its commitment to tackling the issue, with plans to ban "nudification" apps and place extra duties on platforms to prevent non-consensual sexual deepfakes from appearing. However, victims are urging stronger action to protect themselves and others from the devastating effects of deepfake abuse.

With one in three women in the UK experiencing online abuse, the need for tougher protection against AI-generated explicit images has never been more pressing. As the new law takes effect, it is clear that more needs to be done to address this growing issue and provide justice for those affected by deepfake abuse.
 