Key Points:
- CNBC investigated “nudify” apps and how a group of friends in Minnesota became key figures in the fight against nonconsensual, AI-generated porn.
- Experts said these services, which are often promoted via Facebook ads, are available to download on the Apple and Google app stores and are easily accessed using simple web searches.
- “That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.
In the summer of 2024, a group of women in the Minneapolis area learned that a male friend used their Facebook photos mixed with artificial intelligence to create sexualized images and videos.
Using an AI site called DeepSwap, the man secretly created deepfakes of the friends and of more than 80 women in the Twin Cities region. The discovery caused emotional trauma and led the group to seek help from a sympathetic state senator.
As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever for people to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet: many are promoted via Facebook ads, available for download on the Apple and Google app stores, and easily accessed through simple web searches.
“That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.
CNBC’s reporting shines a light on the legal quagmire surrounding AI, and how a…

Sometimes, she said, a simple click of a camera shutter can cause her to lose her breath and begin trembling, her eyes swelling with tears. That’s what happened at a conference she attended a month after first learning about the images.
Because the women weren’t underage and the man who created the deepfakes never distributed the content, there was no apparent crime.