
Guns Are Fine, Boobs Are Bad? Stable Diffusion 3’s Focus on Safety Irks Users


The developers of Stable Diffusion are close to unleashing a new version of the AI image generator. But users are worried it’ll arrive with more restrictions in place, neutering its ability to churn out NSFW content, including porn. 

“Guns are fine guys, but boobs are super dangerous!” wrote one user on Reddit. 

On Thursday, Stability AI announced Stable Diffusion 3, which it promises will be the company's "most capable text-to-image model" yet. Although details are thin, Stability AI says the new model significantly boosts picture quality and can even spell out words in the generated image.

But rather than explain more about the capabilities, Stability AI devoted a significant portion of the announcement to safety. “We believe in safe, responsible AI practices. This means we have taken and continue to take reasonable steps to prevent the misuse of Stable Diffusion 3 by bad actors,” it wrote. “Safety starts when we begin training our model and continues throughout the testing, evaluation, and deployment.”

As a result, the company has baked "numerous safeguards" into the new model for the public preview phase. The change is now generating alarm that Stable Diffusion 3 won't offer the same freedom as earlier models, which can be downloaded and run on a PC.

"Why the hell would I use a local model if I wanted censorship and 'safety'?" wrote one user in the Reddit forum devoted to Stable Diffusion.

For now, the company is only releasing Stable Diffusion 3 in a limited preview, requiring interested users to put their info on a waitlist for access.

Stability AI didn’t immediately respond to a request for comment. But earlier this month, the company joined the US Artificial Intelligence Safety Institute Consortium, which is focused on the safe development of AI. In recent weeks, regulators have also become concerned with how AI programs can be used to generate deepfakes of anyone, including for child porn.




