
Dramatic rise in publicly downloadable deepfake image generators, study finds

Credit: Unsplash/CC0 Public Domain

Researchers from the Oxford Internet Institute (OII) at the University of Oxford have uncovered a dramatic rise in easily accessible AI tools specifically designed to create deepfake images of identifiable people, finding nearly 35,000 such tools available for public download on one popular, globally accessible online platform alone.

The study, led by Will Hawkins, a doctoral student at the OII, and accepted for publication at the ACM Fairness, Accountability, and Transparency (FAccT) conference, reveals that these deepfake generators have been downloaded almost 15 million times since late 2022, primarily targeting women. The data point to a rapid increase in AI-generated non-consensual intimate imagery (NCII).

Key findings include:

  • Massive scale: Nearly 35,000 publicly downloadable “deepfake model variants” were identified. These are models fine-tuned to produce deepfake images of identifiable people, often celebrities. Other variants target less prominent individuals, many of them based on social media profiles. They are primarily hosted on Civitai, a popular open database of AI models.
  • Widespread use: Deepfake model variants have been downloaded almost 15 million times cumulatively since November 2022. Each downloaded variant can be used to generate a limitless number of deepfake images.
  • Overwhelmingly targeting women: A detailed analysis revealed 96% of the deepfake models targeted identifiable women. Targeted women ranged from globally recognized celebrities to social media users with relatively small followings. Many of the most popular deepfake models target individuals from China, Korea, Japan, the UK and the US.
  • Easily created: Many deepfake model variants are created using a technique called Low Rank Adaptation (LoRA), requiring as few as 20 images of the target individual, a consumer-grade computer, and around 15 minutes of processing time (a minimal sketch of the LoRA idea appears after this list).
  • Intended to generate NCII: Many models carry tags such as “porn,” “sexy” or “nude” or descriptions signaling intent to generate Non-Consensual Intimate Imagery (NCII), despite such uses violating the hosting platforms’ Terms of Service and being illegal in some countries including the UK.
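For readers unfamiliar with the technique referenced above, the sketch below illustrates the core LoRA idea in PyTorch: a large pre-trained weight matrix is frozen, and only a small pair of low-rank matrices is trained on top of it, which is why fine-tuning needs so little data and compute. This is a minimal, generic illustration under stated assumptions; the class name LoRALinear and the rank and alpha values are illustrative, and it does not represent the study's code or any tooling from the platforms discussed.

```python
# Minimal sketch of Low-Rank Adaptation (LoRA): keep the original weights
# frozen and learn a small low-rank update B @ A instead. Illustrative only.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the pre-trained weights stay frozen
        # Two small trainable matrices: A (rank x in) and B (out x rank)
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus a small trainable low-rank correction
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


# Only the adapter parameters are trained, a tiny fraction of the full layer:
layer = LoRALinear(nn.Linear(768, 768), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable LoRA parameters: {trainable}")  # 12,288 vs. 589,824 frozen
```

In image generators, adapters of this kind are typically attached to the attention layers of a pre-trained diffusion model, and only the small adapter weights are shared rather than the full model, which is consistent with the low barrier to entry the study describes.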

“There is an urgent need for more robust technical safeguards, clearer and more proactively enforced platform policies, and new regulatory approaches to address the creation and distribution of these harmful AI models,” said Will Hawkins, lead author of the study.

The sharing of sexually explicit deepfake images was made a criminal offense in England and Wales under an amendment to the Online Safety Act in April 2023. The UK Government also hopes to make creating such images an offense as part of its Crime and Policing Bill, which is currently at Committee Stage.

The results may be merely the tip of the iceberg, as the analysis covered only publicly available models hosted on reputable platforms. Given the low cost of creating these models, more egregious deepfake content, such as child sexual abuse material, may also be increasingly widespread but not publicized or hosted on public platforms.

The study, “Deepfakes on Demand: the rise of accessible non-consensual deepfake image generators,” by Will Hawkins, Chris Russell and Brent Mittelstadt of the Oxford Internet Institute, will be available as a pre-print on arXiv from 7 May. It will be formally published as part of the ACM Fairness, Accountability, and Transparency (FAccT) peer-reviewed conference proceedings. The conference will be held from 23-26 June in Athens, Greece.

Provided by University of Oxford


Citation: Dramatic rise in publicly downloadable deepfake image generators, study finds (2025, May 7), retrieved 7 May 2025.

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
