When Stable Diffusion, the text-to-image AI developed by startup Stability AI, was open sourced earlier this year, it didn’t take long for the internet to wield it for porn-creating purposes. Communities across Reddit and 4chan tapped the AI system to generate realistic and anime-style images of nude characters, mostly women, as well as non-consensual fake nude imagery of celebrities.
But while Reddit quickly shut down many of the subreddits dedicated to AI porn, and communities like Newgrounds, which allows some forms of adult art, banned AI-generated artwork altogether, new forums emerged to fill the gap.
By far the largest is Unstable Diffusion, whose operators are building a business around AI systems tailored to generate high-quality porn. The server’s Patreon — started to keep the server running as well as fund general development — is currently raking in over $2,500 a month from several hundred donors.
“In just two months, our team expanded to over 13 people as well as many consultants and volunteer community moderators,” Arman Chaudhry, one of the members of the Unstable Diffusion admin team, told TechCrunch in a conversation via Discord. “We see the opportunity to make innovations in usability, user experience and expressive power to create tools that professional artists and businesses can benefit from.”
Unsurprisingly, some AI ethicists are as worried as Chaudhry is optimistic. While the use of AI to create porn isn’t new — TechCrunch covered an AI-porn-generating app just a few months ago — Unstable Diffusion’s models are capable of generating higher-fidelity examples than most. The generated porn could have negative consequences particularly for marginalized groups, the ethicists say, including the artists and adult actors who make a living creating porn to fulfill customers’ fantasies.
“The risks include placing even more unreasonable expectations on women’s bodies and sexual behavior, violating women’s privacy and copyrights by feeding sexual content they created to train the algorithm without consent and putting women in the porn industry out of a job,” Ravit Dotan, VP of responsible AI at Mission Control, told TechCrunch. “One aspect that I’m particularly worried about is the disparate impact AI-generated porn has on women. For example, a previous AI-based app that can ‘undress’ people works only on women.”
Humble beginnings
Unstable Diffusion got its start in August — around the same time that the Stable Diffusion model was released. Initially a subreddit, it eventually migrated to Discord, where it now has roughly 50,000 members.
“Basically, we’re here to provide support for people interested in making NSFW,” one of the Discord server admins, who goes by the name AshleyEvelyn, wrote in an announcement post from August. “Because the only community currently working on this is 4chan, we hope to provide a more reasonable community which can actually work with the wider AI community.”
Early on, Unstable Diffusion served as a place simply for sharing AI-generated porn — and methods to bypass the content filters of various image-generating apps. Soon, though, several of the server’s admins began exploring ways to build their own AI systems for porn generation on top of existing open source tools.
Stable Diffusion lent itself to their efforts. The model wasn’t built to generate porn per se, but Stability AI doesn’t explicitly prohibit developers from customizing Stable Diffusion to create porn so long as the porn doesn’t violate laws or clearly harm others. Even so, the company has adopted a laissez-faire approach to governance, placing the onus on the AI community to use Stable Diffusion responsibly.
Stability AI didn’t respond to a request for comment.
The Unstable Diffusion admins released a Discord bot to start. Powered by the vanilla Stable Diffusion, it let users generate porn by typing text prompts. But the results weren’t perfect: the nude figures the bot generated often had misplaced limbs and distorted genitalia.
The reason was that the out-of-the-box Stable Diffusion hadn’t been exposed to enough examples of porn to “know” how to produce the desired results. Stable Diffusion, like all text-to-image AI systems, was trained on a dataset of billions of captioned images to learn the associations between written concepts and images, like how the word “bird” can refer not only to bluebirds but to parakeets and bald eagles, in addition to more abstract notions. While many of the images come from copyrighted sources, like Flickr and ArtStation, companies such as Stability AI argue their systems are covered by fair use — a precedent that’s soon to be tested in court.
Only a small percentage of Stable Diffusion’s data set — about 2.9% — contains NSFW material, giving the model little to go on when it comes to explicit content. So the Unstable Diffusion admins recruited volunteers — mostly members of the Discord server — to create porn data sets for fine-tuning Stable Diffusion, the way you would give it more pictures of couches and chairs if you wanted to make a furniture generation AI.
Much of the work is ongoing, but Chaudhry tells me that some of it has already come to fruition, including a technique to “repair” distorted faces and arms in AI-generated nudes. “We are recording and addressing challenges that all AI systems run into, namely collecting a diverse dataset that is high in image quality, captioned richly with text, covering the gamut of preferences of our users,” he added.
The custom models power the aforementioned Discord bot and Unstable Diffusion’s work-in-progress, not-yet-public web app, which the admins say will eventually allow people to follow AI-generated porn from specific users.
Growing community
Today, the Unstable Diffusion server hosts AI-generated porn in a range of different art styles, sexual preferences and kinks. There’s a “men-only” channel, a softcore and “safe for work” stream, channels for hentai and furry artwork, a BDSM and “kinky things” subgroup — and even a channel reserved expressly for “nonhuman” nudes. Users in these channels can invoke the bot to generate art that fits the theme, which they can then submit to a “starboard” if they’re especially pleased with the results.
Unstable Diffusion claims to have generated over 4,375,000 images to date. On a semiregular basis, the group hosts competitions that challenge members to recreate images using the bot, the results of which are used in turn to improve Unstable Diffusion’s models.
As it grows, Unstable Diffusion aspires to be an “ethical” community for AI-generated porn — i.e. one that prohibits content like child pornography, deepfakes and excessive gore. Users of the Discord server must abide by the terms of service and submit to moderation of the images that they generate; Chaudhry claims the server employs a filter to block images containing people in its “named persons” database and has a full-time moderation team.
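Chaudhry didn’t describe how the “named persons” filter is implemented. As a rough illustration of the general idea — screening prompts against a blocklist of real people before any image is generated — here is a minimal sketch; the function, names and matching logic are all hypothetical, not Unstable Diffusion’s actual system:

```python
import re

# Stand-in "named persons" database; a real one would hold actual people's names.
BLOCKED_NAMES = {"jane doe", "john smith"}

def is_prompt_allowed(prompt: str) -> bool:
    """Reject any prompt that mentions someone in the named-persons database."""
    # Normalize case and collapse whitespace so trivial variations don't slip through.
    normalized = re.sub(r"\s+", " ", prompt.lower()).strip()
    return not any(name in normalized for name in BLOCKED_NAMES)
```

A production filter would also have to catch misspellings, aliases and image-level matches, which is far harder than string comparison — one reason moderation at scale still requires a human team.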
“We strictly allow only fictional and law-abiding generations, for both SFW and NSFW on our Discord server,” he said. “For professional tools and business applications, we will revisit and work with partners on the moderation and filtration rules that best align with their needs and commitments.”
But one imagines Unstable Diffusion’s systems will become tougher to monitor as they’re made more widely available. Chaudhry didn’t lay out plans for moderating content from the web app or Unstable Diffusion’s forthcoming subscription-based Discord bot, which third-party Discord server owners will be able to deploy within their own communities.
“We need to … think about how safety controls might be subverted when you have an API-mediated version of the system that carries controls preventing misuse,” Abhishek Gupta, the founder and principal researcher at the Montreal AI Ethics Institute, told TechCrunch via email. “Servers like Unstable Diffusion become hotbeds for accumulating a lot of problematic content in a single place, showing both the capabilities of AI systems to generate this type of content and connecting malicious users with each other to further their ‘skills’ in the generation of such content … At the same time, they also exacerbate the burden placed on content moderation teams, who have to face trauma as they review and remove offensive content.”
A separate but related issue pertains to the artists whose artwork was used to train Unstable Diffusion’s models. As evidenced recently by the artist community’s reaction to DeviantArt’s AI image generator, DreamUp, which was trained on art uploaded to DeviantArt without creators’ knowledge, many artists take issue with AI systems that mimic their styles without giving proper credit or compensation.
Character designers like Hollie Mengert and Greg Rutkowski, whose classical painting styles and fantasy landscapes have become some of the most commonly invoked prompts in Stable Diffusion, have decried what they see as poor AI imitations that are nevertheless tied to their names. They’ve also expressed concerns that AI-generated art imitating their styles will crowd out their original works, harming their income as people start using AI-generated images for commercial purposes. (Unstable Diffusion grants users full ownership of — and permission to sell — the images they generate.)
Gupta raises another possibility: artists who’d never want their work associated with porn might become collateral damage as users realize certain artists’ names yield better results in Unstable Diffusion prompts — e.g., “nude women in the style of [artist name]”.
Chaudhry says that Unstable Diffusion is looking at ways to make its models “be more equitable toward the artistic community” and “give back [to] and empower artists.” But he didn’t outline specific steps, like licensing artwork or allowing artists to preclude their work from training data sets.
Artist impact
Of course, there’s a fertile market for adult artists who draw, paint and photograph suggestive works for a living. But if anyone can generate exactly the images they want to see with an AI, what will happen to human artists?
It’s not necessarily an imminent threat. As adult art communities grapple with the implications of text-to-image generators, simply finding a platform to publish AI-generated porn beyond the Unstable Diffusion Discord might prove a challenge. The furry art community FurAffinity decided to ban AI-generated art altogether, as did Newgrounds, which hosts mature art behind a content filter.
When reached for comment, one of the larger adult content hosts, OnlyFans, left open the possibility that AI art might be allowed on its platform in some form. While it has a strict policy against deepfakes, OnlyFans says that it permits content — including AI-generated content, presumably — as long as the person featured in the content is a verified OnlyFans creator.
Of course, the hosting question might be moot if the quality isn’t up to snuff.
“AI generated art to me, right now, is not very good,” said Milo Wissig, a trans painter who has experimented with how AIs depict erotic art of non-binary and trans people. “For the most part, it seems like it works best as a tool for an artist to work off of… but a lot of people can’t tell the difference and want something fast and cheap.”
For artists working in kink, it’s especially easy to see where AI falls flat. In the case of bondage, in which tying ropes and knots is a form of art (and a safety mechanism) in itself, it’s hard for the AI to replicate something so intricate.
“For kinks, it would be difficult to get an AI to make a specific kind of image that people would want,” Wissig told TechCrunch. “I’m sure it’s very difficult to get the AI to make the ropes make any sense at all.”
The source material behind these AIs can also amplify biases that already exist in traditional erotica – in other words, straight sex between white people is the norm.
“You get images that are pulled from mainstream porn,” said Wissig. “You get the whitest, most hetero stuff that the machine can think up, unless you specify not to do that.”
These racial biases have been extensively documented across applications of machine learning, from facial recognition to photo editing.
When it comes to porn, the consequences may not be as stark — yet there is still a special horror to watching as an AI twists and augments ordinary people until they become racialized, gendered caricatures. Even AI models like DALL-E 2, which went viral when its mini version was released to the public, have been criticized for disproportionately generating art in European styles.
Last year, Wissig tried using VQGAN to generate images of “sexy queer trans people,” he wrote in an Instagram post. “I had to phrase my terms carefully just to get faces on some of them,” he added.
In the Unstable Diffusion Discord, there is little evidence that the AI can adequately represent genderqueer and transgender people. In a channel called “genderqueer-only,” nearly all of the generated images depict traditionally feminine women with penises.
Branching out
Unstable Diffusion isn’t strictly focusing on in-house projects. Technically a part of Equilibrium AI, a company founded by Chaudhry, the group is funding other efforts to create porn-generating AI systems including Waifu Diffusion, a model fine-tuned on anime images.
Chaudhry sees Unstable Diffusion evolving into an organization to support broader AI-powered content generation, sponsoring dev groups and providing tools and resources to help teams build their own systems. He claims that Equilibrium AI secured a spot in a startup accelerator program from an unnamed “large cloud compute provider” that comes with a “five-figure” grant in cloud hardware and compute, which Unstable Diffusion will use to expand its model training infrastructure.
In addition to the grant, Unstable Diffusion will launch a Kickstarter campaign and seek venture funding, Chaudhry says. “We plan to create our own models and fine-tune and combine them for specialized use cases which we shall spin off into new brands and products,” he added.
The group has its work cut out for it. Of all the challenges Unstable Diffusion faces, moderation is perhaps the most immediate — and consequential. Recent history is filled with examples of spectacular failures at adult content moderation. In 2020, MindGeek, Pornhub’s parent company, lost the support of major payment processors after the site was found to be circulating child porn and sex-trafficking videos.
Will Unstable Diffusion suffer the same fate? It’s not yet clear. But with at least one senator calling on companies to implement stricter content filtering in their AI systems, the group doesn’t appear to be on the steadiest ground.
Meet Unstable Diffusion, the group trying to monetize AI porn generators by Kyle Wiggers originally published on TechCrunch