
Why Should AI Deepfake Pornography be Criminalized?

Author: admin | 16 Aug 2024

An increase of more than 400% in deepfake pornographic and nude content was recorded between 2022 and 2023. Recently, the San Francisco City Attorney's office sued 16 pornographic websites for publishing non-consensual deepfake content of women and young girls, and it does not end there. Girls as young as 11 or 12 are now hesitant to go online for fear of being targeted with deepfake nudes. Experts now call AI deepfake nudes 'virtual guns' that kill the digital selves of their victims. We have observed how federal legislation attempts to mitigate AI fake porn: at least ten laws have been enacted to regulate and criminalize deepfake pornography, especially when it is non-consensual.

This blog explains what AI deepfake nudes are, the damage they cause, and how they can be mitigated to protect everyone's online presence.

Shocking Statistics on AI Deepfake Pornography

  • Deepfake pornographic content increased by 464% from 2022 to 2023.
  • Deepfake porn websites hold a 90% share of the global online pornography market.
  • 99% of the deepfake nudes circulating online depict female victims, and most are non-consensual.
  • By profession, singers and actors are the prime targets of deepfake pornography.
  • It takes only a few minutes, and costs nothing, to create a one-minute deepfake video of anyone from a single facial image.
  • Technological curiosity has significantly driven deepfake porn viewership.
  • 70% of people feel no guilt about watching deepfake "undressing" content of women, often unaware that most of it is forced or non-consensual.

How Do AI Deepfake Undressing Websites Exploit Victims?

First, the deepfake creators locate a victim whose facial images or video feed are available online; scraping, recording, or simply screenshotting them is trivial. They then use these images to create deepfake porn material or to run online undressing websites powered by generative AI. Victims are usually unaware at first, but once the content goes viral it causes them stress, embarrassment, and other psychosocial harm, and even their social circle begins to question their integrity and morality.

Photo manipulation is not new. The cheapfakes that went viral on Google back in 2006 and 2007 were created with Adobe Photoshop and other image-editing tools, typically by superimposing the victim's face onto a porn performer's nude body.

Now AI has become the tool of choice for manipulative deepfake content creation, especially for damaging a person's social image through explicit deepfake videos and images.

Overview of Federal Legislation Against AI Deepfake Technology

The US legislative framework still amounts to more paperwork than practice. However, there has been some progress: 16 non-consensual deepfake undressing websites were sued last week by the San Francisco City Attorney's office.

Discover More News: San Francisco targets AI deepfake porn, seeking to shut down websites that distribute non-consensual explicit content of women and girls, protect victims, and hold offenders accountable.

Role of Search Engines in Leashing the Deepfake Beast

Beyond legislation, responsibility also falls on search engines like Google, which must play a pivotal role in mitigating deepfake nudes online. Google Face Match, Google Photos, and other third-party facial recognition service providers can help detect and report deepfake nudes. Search engines currently index information freely, but that access should now be regulated to protect individuals from becoming victims of deepfake porn.

Is Facial Recognition Useful in Detecting Deepfakes?

Third-party facial recognition providers now offer deepfake detection as a separate service. Most of these solutions rely on liveness detection algorithms. However, detecting a deepfake in a pre-recorded video or an uploaded image is complex: it requires offsite liveness detection, in which GAN- and CNN-trained AI models are used for two main tasks:

  1. Reverse image search and photo matching.
  2. Detecting the manipulation ('fakeness') present in the subject photo.
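To make the first task concrete, photo matching is often built on perceptual hashing: an image is reduced to a short fingerprint such that near-duplicates (for example, a re-uploaded or slightly edited deepfake frame) land close together in Hamming distance. The sketch below is a minimal, pure-Python illustration of an "average hash" over toy 8×8 grayscale grids; the function names and pixel values are our own hypothetical examples, not Facia's actual pipeline, and production systems resize real images and use more robust hashes over large indexes.

```python
# Minimal sketch of perceptual "average hash" photo matching (task 1 above).
# Assumes images are already reduced to 8x8 grayscale grids; real systems
# resize/convert first and use sturdier hashes (pHash, dHash) at scale.

def average_hash(pixels):
    """Return a 64-bit fingerprint: bit is 1 where a pixel exceeds the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Toy data: an original image, a slightly brightened copy (a near-duplicate,
# as a re-encoded upload might be), and an unrelated image.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
near_dup = [[min(255, p + 3) for p in row] for row in original]
unrelated = [[255 - (r * 37 + c * 11) % 256 for c in range(8)] for r in range(8)]

h_orig = average_hash(original)
# A small Hamming distance suggests the same underlying photo;
# a large one suggests a different image entirely.
print(hamming(h_orig, average_hash(near_dup)))   # small distance
print(hamming(h_orig, average_hash(unrelated)))  # larger distance
```

In a reporting workflow, a detection service could keep an index of fingerprints for known deepfake material and flag any newly uploaded image whose hash falls within a small distance threshold of an indexed one.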

Offsite liveness detection helps distinguish a deepfake image or video from an original in uploaded material. If someone's deepfake porn or nude image has gone viral and they reach out to a robust FRT solution, it can trace the first source from which the deepfake was uploaded, indirectly helping with the reporting and removal of the material.

Facia is an AI-powered liveness detection and identity verification solution that can help you detect deepfakes in the minimum possible time. With liveness detection in under a second, Facia can help protect your photos from manipulation: you can use the technology to detect a deepfake and compare it with your target photo.

Final Word

Deepfake pornography should now be strictly shut down. Severe legal penalties must follow the detection and reporting of a deepfake crime, including the creation of defamatory and explicit deepfake material. Facial liveness detection can help law enforcement handle deepfake criminal cases accurately and swiftly, and help cybersecurity agencies remove such material from the internet.

Lastly, user awareness of the damaging uses of deepfake technology is vitally important. For this purpose, user-oriented awareness campaigns by facial recognition tool providers and government regulatory bodies could prove helpful.
