Deepfake Dilemma: Navigating the Shadows of Synthetic Imagery Abuse

The dark world of deepfake abuse continues to challenge authorities, victims, and advocates as technology rapidly evolves. A recent case highlights the complexity and inadequacy of current legal frameworks in addressing this growing menace. A man was ordered to pay £100 compensation to each of 15 victims and delete images from his devices after being convicted of sending offensive, indecent, and obscene messages. The conviction focused on the derogatory nature of his posts, not the solicitation of synthetic images. This underscores a significant gap in legal protection against deepfake abuse.

Victims of deepfake abuse often find themselves navigating a frustrating landscape with limited support from law enforcement. Sophie Mortimer remarked on the reluctance of police to pursue such cases, while Sam Millar shared the account of a victim in contact with 450 others, only two of whom had positive policing experiences. In one case, police dismissed a complaint outright, telling the victim the subject of the images was merely "someone who looks like you."

The Revenge Porn Helpline plays a crucial role in supporting victims of intimate image abuse. Part-funded by the Home Office, it reported 29 services to Apple, which removed them in March. Additionally, a million images have been hashed, with 24,000 uploads pre-emptively blocked by December. Despite these efforts, the helpline faces an increasing number of synthetic, sexually explicit images and videos being traded online. Women are disproportionately affected, accounting for approximately 72% of deepfake cases seen by the helpline.

One victim, Jodie, shared her harrowing experience:

“I was having sex in every one of them,” – Jodie

The shock and devastation continue to haunt her. She realized the extent of the violation when she noticed an image she had never posted on Instagram, having only sent it privately.

“I knew I hadn’t posted that picture on Instagram and had only sent it to him. That’s when the penny dropped,” – Jodie

The government recently announced a crackdown on explicit deepfakes, promising to expand current laws to criminalize creating images without consent. However, the new legislation does not cover soliciting deepfakes—getting someone to make them for you—which remains a significant loophole.

Baroness Owen introduced a private member's bill to ensure deepfake creation requires consent and includes an offence for solicitation. The End Violence Against Women Coalition and charities like Refuge support this bill, emphasizing the need for comprehensive legal protection.

Kate Worthington from the Revenge Porn Helpline emphasizes the insidious nature of deepfakes:

“It’s being sold, swapped, traded for sexual gratification – or for status. If you’re the one finding this content and sharing it, alongside Snap handles, Insta handles, LinkedIn profiles, you might be glorified,” – Kate Worthington

Victims often remain unaware their images have been circulated until it's too late. The helpline frequently encounters pleas for urgent assistance:

“It’s, ‘Oh my god, please help me, I need to get this taken down before people see it,’” – Amanda Dashwood

Because these images are so realistic, they can cause irreparable harm to reputations and personal relationships:

“These photos are so realistic, often. Your colleague, neighbour, grandma isn’t going to know the difference,” – Kate Worthington

An anonymous forum user exemplifies the predatory culture surrounding deepfakes:

“Never done this before, but would LOVE to see her faked… Happy to chat/show you more of her too… :D,” – anonymous forum user