
AI "undressing" apps that generate deepfake nudes soar in popularity



Apps and websites that use artificial intelligence to digitally remove clothing from women’s photos without consent are skyrocketing in popularity, according to a Time report.

Researchers at the social network analysis firm Graphika found that in September alone, 24 million people visited so-called “nudify” or “undressing” services that apply AI algorithms to actual photos of clothed women to generate fake nude images.

The startling growth corresponds with the open-sourcing of several new AI diffusion models capable of creating strikingly realistic fake nude images. Whereas previous AI-altered images tended to be blurry or unrealistic, these new models allow amateurs to generate convincingly real nude photos without sophisticated training.

Marketing for these services often occurs openly on platforms like Reddit and Telegram, with some ads even suggesting users send the fake nudes back to the victim herself. One nudify app pays for sponsored YouTube content and ranks first in Google search results, even though the practice violates most platforms' policies. Still, the services operate largely unchecked.

Deepfakes are dangerous, especially when used for non-consensual pornography

Experts warn the proliferation of DIY deepfake nudes represents a dangerous new phase in non-consensual pornography. “We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin of the Electronic Frontier Foundation, noting a rise in cases among high schoolers.

Still, many victims never even discover such images exist—those who do often struggle to get law enforcement to investigate or lack funds to pursue legal action.

Currently, federal law does not explicitly ban deepfake pornography; it bans only the generation of fake child sexual abuse material. In November, a North Carolina psychiatrist received 40 years in prison – the first conviction of its kind – for using AI undressing apps on photos of underage patients.

In response to the trend, TikTok has blocked search terms related to nudify apps, and Meta has begun blocking related keywords on its platforms as well. Still, analysts say far more awareness and action are needed around non-consensual AI porn.

The apps treat women’s bodies as raw material without consent, running roughshod over privacy rights in pursuit of profit. “You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika. Indeed, for victims, that is precisely the danger.

Featured Image Credit: Photo by Markus Winkler; Pexels

Radek Zielinski

Radek Zielinski is an experienced technology and financial journalist with a passion for cybersecurity and futurology.



