May 12, 2025
'Tool for grifters': AI deepfakes push bogus sexual cures

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia, one of countless AI-generated videos on TikTok peddling unproven sexual remedies.
The rise of generative AI has made it easy, and financially lucrative, to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.
In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation rules against sexually explicit language.
"You'll notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.
"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."
The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its findings with AFP.
"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.
"We're seeing AI-generated content weaponized to spread false information."
'Cheap way'
The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.
They include everything from unverified, and in some cases potentially harmful, dietary supplements to weight-loss products and sexual remedies.
"AI is a useful tool for grifters looking to create large volumes of content slop at a low cost," misinformation researcher Abbie Richards told AFP.
"It's a cheap way to produce advertisements," she added.
Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.
Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.
More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.
"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.
But the clip is a deepfake, using Fauci's likeness.
'Pernicious'
Many manipulated videos are created from existing footage, modified with AI-generated voices and lip-synced to match what the altered voice says.
"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.
Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities, including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson, promoting supplements branded as erectile dysfunction cures.
The speed at which short-form AI videos can be generated means that even when tech platforms remove questionable content, near-identical versions quickly reappear, turning moderation into a game of whack-a-mole.
Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.
AFP's fact-checkers have repeatedly debunked scam ads on Facebook promoting treatments, including erectile dysfunction cures, that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.
Yet many users still consider the endorsements legitimate, illustrating the appeal of deepfakes.
"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet, and before," Mantzarlis said.
"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."
© 2025 AFP