May 7, 2025
Dramatic rise in publicly downloadable deepfake image generators, study finds

Researchers from the Oxford Internet Institute (OII) at the University of Oxford have uncovered a dramatic rise in easily accessible AI tools specifically designed to create deepfake images of identifiable people, finding nearly 35,000 such tools available for public download on one popular, globally accessible online platform alone.
The study, led by Will Hawkins, a doctoral student at the OII, and accepted for publication at the ACM Fairness, Accountability, and Transparency (FAccT) conference, reveals that these deepfake generators have been downloaded almost 15 million times since late 2022, primarily targeting women. The data point towards a rapid increase in AI-generated non-consensual intimate imagery (NCII).
Key findings include:
- Massive scale: Nearly 35,000 publicly downloadable "deepfake model variants" were identified. These are models that have been fine-tuned to produce deepfake images of identifiable people, often celebrities. Other variants seek to generate less prominent individuals, many based on social media profiles. They are primarily hosted on Civitai, a popular open database of AI models.
- Widespread use: Deepfake model variants have been downloaded almost 15 million times cumulatively since November 2022. Each downloaded variant can generate unlimited deepfake images.
- Overwhelmingly targeting women: A detailed analysis revealed that 96% of the deepfake models targeted identifiable women, ranging from globally recognized celebrities to social media users with relatively small followings. Many of the most popular deepfake models target individuals from China, Korea, Japan, the UK and the US.
- Easily created: Many deepfake model variants are created using a technique called Low-Rank Adaptation (LoRA), requiring as few as 20 images of the target person, a consumer-grade computer, and 15 minutes of processing time (a sketch of the LoRA idea follows this list).
- Intended to generate NCII: Many models carry tags such as "porn," "sexy" or "nude," or descriptions signaling intent to generate non-consensual intimate imagery (NCII), despite such uses violating the hosting platforms' Terms of Service and being illegal in some countries, including the UK.
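For readers unfamiliar with the technique, the following is a minimal, generic sketch of the LoRA idea in Python/PyTorch: a pre-trained layer's weights are frozen and only a small low-rank correction is trained, which is why fine-tuning can be done with a handful of images on consumer hardware. The class name, rank, and dimensions are illustrative assumptions for this sketch, not code from the study or from any specific image generator.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Illustrative sketch: wrap a frozen linear layer with a small trainable low-rank update."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the original pre-trained weights stay frozen
        # Only these two small factor matrices are trained: rank * (in + out) extra parameters
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Output = frozen base layer + scaled low-rank correction
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale

# Toy usage with made-up dimensions
layer = LoRALinear(nn.Linear(512, 512), rank=4)
print(layer(torch.randn(1, 512)).shape)  # torch.Size([1, 512])
```

Because only the two small factor matrices are trained and shared, a LoRA "variant" is typically a small file that rides on top of an existing base model, which is part of what makes such fine-tunes cheap to produce and easy to distribute.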
"There’s an pressing want for extra strong technical safeguards, clearer and extra proactively enforced platform insurance policies, and new regulatory approaches to deal with the creation and distribution of those dangerous AI fashions," stated Will Hawkins, lead writer of the examine.
The sharing of sexually specific deepfake pictures was made a felony offense in England and Wales below an modification to the On-line Security Act in April 2023. The UK Authorities hopes to additionally make creating such pictures an offense as a part of its Crime and Policing Invoice, which is at at present at Committee Stage.
The outcomes could also be merely the tip of the iceberg, with this evaluation performed on solely publicly accessible fashions on respected platforms. Given the low value for creating these fashions, extra egregious deepfake content material—for instance baby sexual abuse materials—can also be more and more widespread however not publicized or hosted on public platforms.
The examine, "Deepfakes on Demand: the rise of accessible non-consensual deepfake picture mills' by Will Hawkins, Chris Russell and Brent Mittelstadt of the Oxford Web Institute, will likely be accessible as a pre-print on arXiv from 7 Might. It is going to be formally printed as a part of the ACM Equity, Accountability, and Transparency (FAccT) peer-reviewed convention proceedings. The convention will likely be held from 23-26 June in Athens, Greece.
Provided by University of Oxford

Citation: Dramatic rise in publicly downloadable deepfake image generators, study finds (2025, May 7), retrieved 7 May 2025 from https://techxplore.com/news/2025-05-downloadable-deepfake-image-generators.html