Kickstarter suspends campaign for AI porn group Unstable Diffusion following policy changes

Unstable Diffusion, the group trying to monetize AI porn generation, had raised more than $56,000 on Kickstarter from 867 backers. But with Kickstarter changing its thinking about what kind of AI-based projects it will allow, the crowdfunding platform has shut down the Unstable Diffusion campaign. Because Kickstarter runs on an all-or-nothing model and the campaign was still ongoing, all funds raised by Unstable Diffusion will be returned to backers. In other words, Unstable Diffusion won’t see that $56,000, which was more than double its original goal of $25,000.

“Over the past few days, we’ve been engaging our Community Advisory Council and reading your feedback shared with our team and on social media,” CEO Everette Taylor said in a blog post. “And one thing is clear: Kickstarter must and always will be on the side of creative work and the people behind it. We are here to help creative work thrive.”

Kickstarter’s new approach to hosting AI projects is intentionally vague.

“This technology is really new and we don’t have all the answers,” Taylor wrote. “The decisions we make now may not be the decisions we make in the future, so we want this to be an ongoing conversation with all of you.”

The platform says it is currently considering how projects handle copyrighted material, particularly when artists’ work appears in an algorithm’s training data without permission. Kickstarter will also consider whether a project “exploits a specific community or puts anyone at risk.”

In recent months, tools such as OpenAI’s ChatGPT and Stability AI’s Stable Diffusion have enjoyed enormous success, pushing conversations about the ethics of AI artwork to the forefront of public debate. If apps like Lensa AI can use the open-source Stable Diffusion to instantly create artistic avatars that look like the work of a professional artist, how does that affect those same working artists?

Some artists used Twitter to pressure Kickstarter to shut down the Unstable Diffusion project, raising concerns about how AI art generators could jeopardize artists’ careers.

Hey @Kickstarter, so you’re allowing an AI project whose main premise is generating (possibly non-consensual) pornography and whose main selling point is that people can steal from Greg Rutkowski? Or are you going to do something to protect creators and the public? https://t.co/26nTl4dTNM

— Karla Ortiz 🐀 (@kortizart) December 10, 2022

Shame on @Kickstarter for enabling the crowdfunding of Unstable Diffusion. They are allowing blatant theft and the funding of a tool that can create abusive content such as non-consensual pornography.

— Sarah Andersen (@SarahCAndersen) December 11, 2022

Many cite the fate of Greg Rutkowski’s work as an example of what can go wrong. Rutkowski, a living illustrator who has created detailed, imaginative illustrations for franchises like Dungeons & Dragons, was one of Stable Diffusion’s most popular search terms when the model launched in September, making it easy for users to emulate his signature style. Rutkowski never consented to his artwork being used to train the algorithm, which led him to speak openly about how AI art generators affect working artists.

“With $25,000 in funding, we are able to train the new model with 75 million high-quality images, consisting of approximately 25 million anime and cosplay images, approximately 25 million Artstation/DeviantArt/Behance artwork images, and approximately 25 million photographs,” Unstable Diffusion wrote on its Kickstarter.

Spawning, a suite of AI tools designed to support artists, developed a website called Have I Been Trained? that lets artists see whether their work appears in popular training datasets and opt out. According to an April court ruling, there is legal precedent in favor of scraping publicly available data.

Inherent problems in AI porn generation

Ethical questions around AI art become even murkier when considering projects like Unstable Diffusion, which focuses on developing NSFW content.

Stable Diffusion uses a dataset of 2.3 billion images to train its text-to-image generator. But only about 2.9% of that dataset contains NSFW material, giving the model little to go on when it comes to explicit content. This is where Unstable Diffusion comes in. The project, which is part of Equilibrium AI, recruited volunteers from its Discord server to develop more robust pornographic datasets to fine-tune the algorithm, much like you’d upload more photos of sofas and chairs to a dataset if you wanted to create a furniture-generating AI.

But any AI generator is prone to falling prey to the biases of the people behind the algorithm. Much of the free and easily accessible pornography online is made for the male gaze, meaning the AI is likely to spit out similar content, especially if those are the kinds of images users are feeding into the dataset.

In its now-suspended Kickstarter, Unstable Diffusion said it would work to create an AI art model that “better handles human anatomy, generates diverse and controllable art styles, and more fairly represents under-trained concepts such as LGBTQ, race and gender.”

In addition, there is no way to verify that much of the free pornography available on the internet is consensual (although adult creators using paid platforms such as OnlyFans and ManyVids are required to verify their age and identity before using these services). Even if a model agrees to appear in pornography, that doesn’t mean they agree to have their images used to train an AI. While this technology can produce incredibly realistic images, it also means it can be weaponized to create non-consensual fake pornography.

Currently, there are few laws anywhere in the world that address non-consensual fake pornography. In the United States, only Virginia and California have regulations restricting certain uses of faked and deepfaked pornographic media.

“One aspect of particular concern to me is the differential impact that AI-generated pornography has on women,” Ravit Dotan, VP of responsible AI at Mission Control, told TechCrunch last month. “For example, an earlier AI-based app that can ‘undress’ people works only on women.”

Unstable Diffusion had not responded to a request for comment at the time of publication.

Source: La Neta Neta
