Charlotte Owen: My Bill To Stop Deepfake Abuse



Baroness Charlotte Owen of Alderley Edge has introduced a Private Member’s Bill to criminalise the creation and solicitation of sexually explicit digital forgeries – or ‘deepfakes’ – without consent.

Today (Friday, 13th December), the bill has its second reading in the House of Lords – an opportunity for peers to debate its main principles and flag any potential concerns.

The government has already confirmed that it will criminalise the creation of deepfakes “as quickly as possible”; however, it has yet to set out what this legislation would look like, when it would be introduced, and, crucially, whether it would be consent-based.

GLAMOUR is currently campaigning for a dedicated, comprehensive Image-Based Abuse Law, which would – as a starting point – criminalise the creation of sexually explicit digital forgeries without consent. That’s why we’re following both the government’s progress in this area and Baroness Owen’s Private Member’s Bill.

In an exclusive essay for GLAMOUR, Baroness Charlotte Owen writes about her motivation for introducing the bill, the government’s response so far, and what it means for all survivors of image-based abuse.


Image-based sexual abuse is the new frontier of violence against women. It is rapidly proliferating and overwhelmingly gendered: 99% of sexually explicit deepfakes are of women. They are created using generative AI through easily accessible online platforms and so-called “nudification” apps available on the App Store.

I have been deeply concerned about deepfake abuse for several years and first raised the issue in Lords questions back in February, after being shocked that the Law Commission’s report had concluded the harm caused was not serious enough to warrant criminalisation.

I firmly believe that every woman should have the right to choose who owns a naked image of her. However, the gaping omissions in our patchwork of legislation mean that whilst sharing sexually explicit deepfake content is illegal, shockingly, creating it and soliciting its creation are not.

Analysis by ‘My Image, My Choice’ found that 80% of these apps were launched in the last 12 months alone, demonstrating just how rapidly this abusive market is growing. One app processed 600,000 images in its first three weeks after launch.

Taking and creating an image or video without a woman’s consent is abuse, and I firmly stand with the 91% of GLAMOUR readers who think this technology poses a threat to women’s safety.

I have seen successive governments commit to criminalising the creation of this content. Yet no legislation was detailed in the King’s Speech, which set out the government’s agenda for the next Parliament.

In September this year, tired of waiting for the machine of government to realise that tackling image-based sexual abuse cannot wait any longer, I introduced a Private Member’s Bill in the House of Lords that would make it an offence to take, create or solicit the creation of sexually explicit images and videos without a person’s consent.

My bill is comprehensive, victim-centred legislation that seeks not only to close the gaps in the law but also to future-proof it against the evolution of these harms.


