AI is getting better at generating porn. We might not be prepared for the consequences.


A red-headed woman stands on the moon, her face obscured. Her naked body looks like it belongs on a poster you’d find on a hormonal teenager’s bedroom wall — that is, until you reach her torso, where three arms sprout from her shoulders.

AI-powered systems like Stable Diffusion, which translate text prompts into pictures, have been used by brands and artists to create concept images, award-winning (albeit controversial) prints and full-blown marketing campaigns.

But some users, intent on exploring the systems’ murkier side, have been testing them for a different sort of use case: porn.

AI porn is about as unsettling and imperfect as you’d expect (that redhead on the moon was likely not generated by someone with an extra-arm fetish). But as the tech continues to improve, it will raise challenging questions for AI ethicists and sex workers alike.

Pornography created using the latest image-generating systems first arrived on the scene via the discussion boards 4chan and Reddit earlier this month, after a member of 4chan leaked the open source Stable Diffusion system ahead of its official release. Then, last week, what appears to be one of the first websites dedicated to high-fidelity AI porn generation launched.

Called Porn Pen, the website allows users to customize the appearance of nude AI-generated models — all of which are women — using toggleable tags like “babe,” “lingerie model,” “chubby,” ethnicities (e.g. “Russian” and “Latina”) and backdrops (e.g. “bedroom,” “shower” and wildcards like “moon”). Buttons capture models from the front, back or side, and change the appearance of the generated photo (e.g. “film photo,” “mirror selfie”). There must be a bug in the mirror selfies, though, because in the feed of user-generated images, some mirrors don’t actually reflect a person — but of course, these models are not people at all. Porn Pen functions like “This Person Does Not Exist,” only it’s NSFW.

On Y Combinator’s Hacker News forum, a user purporting to be the creator describes Porn Pen as an “experiment” using cutting-edge text-to-image models. “I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated,” they wrote. “New tags will be added once the prompt-engineering algorithm is fine-tuned further.” The creator did not respond to TechCrunch’s request for comment.

But Porn Pen raises a host of ethical questions, like the biases baked into image-generating systems and the provenance of their training data. Beyond the technical implications, one wonders whether new tech to create customized porn — assuming it catches on — could hurt adult content creators who make a living doing the same.

“I think it’s somewhat inevitable that this would come to exist when [OpenAI’s] DALL-E did,” Os Keyes, a PhD candidate at Seattle University, told TechCrunch via email. “But it’s still depressing how both the options and defaults replicate a very heteronormative and male gaze.”

Ashley, a sex worker and peer organizer who works on cases involving content moderation, thinks that the content generated by Porn Pen isn’t a threat to sex workers in its current state.

“There is endless media out there,” said Ashley, who did not want her last name to be published for fear of being harassed for her job. “But people differentiate themselves not by just making the best media, but also by being an accessible, interesting person. It’s going to be a long time before AI can replace that.”

On existing monetizable porn sites like OnlyFans and ManyVids, adult creators must verify their age and identity so that the company knows they are consenting adults. AI-generated porn models can’t do this, of course, because they aren’t real.

Ashley worries, though, that if porn sites crack down on AI porn, it might lead to harsher restrictions for sex workers, who are already facing increased regulation from legislation like SESTA/FOSTA. Congress introduced the Safe Sex Workers Study Act in 2019 to examine the effects of this legislation, which makes online sex work more difficult. This study found that “community organizations [had] reported increased homelessness of sex workers” after losing the “economic stability provided by access to online platforms.”

“SESTA was sold as fighting child sex trafficking, but it created a new criminal law about prostitution that had nothing about age,” Ashley said.

Currently, few laws around the world pertain to deepfaked porn. In the U.S., only Virginia and California have regulations restricting certain uses of faked and deepfaked pornographic media.

Systems such as Stable Diffusion “learn” to generate images from text by example. Fed billions of pictures paired with annotations indicating their content — for example, a picture of a dog labeled “Dachshund, wide-angle lens” — the systems learn that specific words and phrases refer to specific art styles, aesthetics, locations and so on.

This works relatively well in practice. A prompt like “a bird painting in the style of Van Gogh” will predictably yield a Van Gogh-esque image depicting a bird. But it gets trickier when the prompts are vaguer, refer to stereotypes or deal with subject matter with which the systems aren’t familiar.
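To make that pipeline concrete, here is a minimal sketch of running Stable Diffusion on the Van Gogh prompt above using Hugging Face’s open source diffusers library. This illustrates the general text-to-image technique only; Porn Pen’s actual stack hasn’t been disclosed, and the model ID, hardware and library choice here are assumptions.

```python
# Minimal sketch: text-to-image with Stable Diffusion via the open source
# diffusers library. Assumes a CUDA GPU and the publicly released v1.4
# weights; Porn Pen's real pipeline is not public.
import torch
from diffusers import StableDiffusionPipeline

# Load the pretrained pipeline (text encoder + U-Net + VAE decoder).
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs
)
pipe = pipe.to("cuda")

# The prompt is tokenized and encoded into an embedding; the model then
# iteratively denoises random latents toward an image matching that text.
image = pipe("a bird painting in the style of Van Gogh").images[0]
image.save("bird_van_gogh.png")
```

Services like Porn Pen appear to sit a thin layer above this kind of call, swapping free-text prompts for a fixed menu of curated tags.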

For example, Porn Pen sometimes generates images without a person at all — presumably a failure of the system to understand the prompt. Other times, as alluded to earlier, it shows physically improbable models, typically with extra limbs, nipples in unusual places and contorted flesh.

“By definition [these systems are] going to represent those whose bodies are accepted and valued in mainstream society,” Keyes said, noting that Porn Pen only has categories for cisnormative people. “It’s not surprising to me that you’d end up with a disproportionately high number of women, for example.”

While Stable Diffusion, one of the systems likely underpinning Porn Pen, has relatively few “NSFW” images in its training dataset, early experiments from Redditors and 4chan users show that it’s quite competent at generating pornographic deepfakes of celebrities (Porn Pen — perhaps not coincidentally — has a “celebrity” option). And because it’s open source, there’d be nothing to prevent Porn Pen’s creator from fine-tuning the system on additional nude images.

“It’s definitely not great to generate [porn] of an existing person,” Ashley said. “It can be used to harass them.”

Deepfake porn is often created to threaten and harass people. These images are almost always made without the subject’s consent and with malicious intent. In 2019, the research company Sensity AI found that 96% of deepfake videos online were non-consensual porn.

Mike Cook, an AI researcher who’s a part of the Knives and Paintbrushes collective, says that there’s a possibility the dataset includes people who’ve not consented to their image being used for training in this way, including sex workers.

“Many of [the people in the nudes in the training data] may derive their income from producing pornography or pornography-adjacent content,” Cook said. “Just like fine artists, musicians or journalists, the works these people have produced are being used to create systems that also undercut their ability to earn a living in the future.”

In theory, a porn actor could use copyright protections, defamation and potentially even human rights laws to fight the creator of a deepfaked image. But as a piece in MIT Technology Review notes, gathering evidence in support of the legal argument can prove to be a massive challenge.

When more primitive AI tools popularized deepfaked porn several years ago, a Wired investigation found that nonconsensual deepfake videos were racking up millions of views on mainstream porn sites like Pornhub. Other deepfaked works found a home on sites akin to Porn Pen — according to Sensity data, the top four deepfake porn websites received more than 134 million views in 2018.

“AI image synthesis is now a widespread and accessible technology, and I don’t think anyone is really prepared for the implications of this ubiquity,” Cook continued. “In my opinion, we have rushed very, very far into the unknown in the last few years with little regard for the impact of this technology.”

To Cook’s point, one of the most popular sites for AI-generated porn expanded late last year through partner agreements, referrals and an API, allowing the service — which hosts hundreds of nonconsensual deepfakes — to survive bans on its payments infrastructure. And in 2020, researchers discovered a Telegram bot that generated abusive deepfake images of more than 100,000 women, including underage girls.

“I think we’ll see a lot more people testing the limits of both the technology and society’s boundaries in the coming decade,” Cook said. “We must accept some responsibility for this and work to educate people about the ramifications of what they are doing.”
