The Rising Threat of Deepfake Abuse: Victims Seek Justice and Awareness

The digital age has ushered in a new era of cybercrimes, with deepfake technology at the forefront of this unsettling wave. Victims of deepfake abuse, predominantly women, face a harrowing reality as synthetic, sexually explicit images and videos are created, traded, and sold online both in Britain and across the globe. A recent case highlights this growing menace, where a court ordered a man to pay £100 compensation to each of 15 victims and delete all images from his devices. Despite this ruling, the journey to justice remains fraught with challenges and inconsistencies.

Deepfake abuse often occurs without the victim's knowledge. This insidious nature complicates the fight against it. As Kate Worthington noted, "A lot of the time, the victim has no idea their images have been shared." Victims frequently find themselves in a desperate scramble to remove these images before they spread further. Amanda Dashwood captured their urgency: “It’s, ‘Oh my god, please help me, I need to get this taken down before people see it.’”

A victim, under the pseudonym Jodie, recounted her traumatic experience. “I was having sex in every one of them,” she said. The impact of these violations is profound and enduring. “The shock and devastation haunts me to this day,” Jodie added, emphasising the emotional toll such abuse exacts. She described the moment she discovered her images as surreal: “It was just like time stood still.”

The policing of deepfake cases remains inconsistent and often unsatisfactory for victims. Sam Millar highlighted this disparity: “Even yesterday, a victim said to me that she is in a conversation with 450 victims of deepfake imagery, but only two of them had had a positive experience of policing.” Sophie Mortimer echoed this sentiment, stating, “It does feel like sometimes the police look for reasons not to pursue these sorts of cases.”

The Revenge Porn Helpline, a service launched in 2015 and part-funded by the Home Office, aims to assist victims of intimate image abuse. About 72% of deepfake cases reported to the helpline involve women. The helpline's manager points out that under-reporting is prevalent because victims fear inadequate responses from law enforcement.

The legal framework surrounding deepfakes is currently insufficient. A proposed law by the government does not cover the solicitation of deepfakes — the act of asking someone else to create them. Baroness Owen has introduced a private member's bill to address this gap, making consent the basis of any deepfake-creation offence and criminalising solicitation.

Moreover, cultural and religious sensitivities exacerbate the impact of deepfake abuse. Several cases have emerged where Muslim women were targeted with deepfaked images depicting them in revealing clothing or with their hijabs removed. These actions not only violate personal privacy but also attack cultural identities.

In March, a charity reported 29 services related to revenge porn to Apple, leading to their removal from its platform. This action marks a crucial step in curbing the distribution of abusive content. Furthermore, as of December, a million images have been hashed and 24,000 uploads pre-emptively blocked, demonstrating technological efforts to mitigate the spread of deepfake content.

Online forums continue to be breeding grounds for deepfake abuse. An anonymous forum user once posted an invitation for others to create sexually explicit deepfakes using fully clothed photos of Jodie. The disturbing message read: “Never done this before, but would LOVE to see her faked… Happy to chat/show you more of her too… :D”

Victims sometimes receive anonymous emails notifying them of their altered images circulating online. These messages carry a heavy emotional burden. One such email opened with an apology: “I’m genuinely so, so sorry to reach out to you.” It was followed by a cautionary note: “Huge trigger warning … They contain lewd photoshopped images of you.”

The motivation behind creating and distributing deepfakes ranges from sexual gratification to gaining social status within certain circles. Kate Worthington explained, “It’s being sold, swapped, traded for sexual gratification – or for status. If you’re the one finding this content and sharing it, alongside Snap handles, Insta handles, LinkedIn profiles, you might be glorified.” The realism of these images further complicates detection and prevention efforts. Worthington added, “These photos are so realistic, often. Your colleague, neighbour, grandma isn’t going to know the difference.”