Man Arrested for Creating AI Child Sexual Abuse Material Using Stable Diffusion


US authorities have arrested a 42-year-old Wisconsin man for allegedly using AI image generator Stable Diffusion to create photorealistic child sexual abuse images.

The Justice Department says Steven Anderegg used the free and open-source tool to create thousands of “hyper-realistic images of nude and semi-clothed prepubescent children” performing sex acts. He allegedly installed add-ons for Stable Diffusion “that specialized in producing genitalia,” investigators say. 

Law enforcement learned of the activity after Anderegg boasted to a 15-year-old about creating the sexual imagery back in October. According to court documents, Anderegg, who has decades of software engineering experience, even sent some of the imagery via Instagram direct messages, which prompted the National Center for Missing and Exploited Children to flag the activity to US law enforcement.

“Evidence recovered from Anderegg’s electronic devices revealed that he generated these images using specific, sexually explicit text prompts related to minors, which he then stored on his computer,” the Justice Department says. Anderegg was arrested after a federal grand jury returned an indictment.

“Today’s announcement sends a clear message: using AI to produce sexually explicit depictions of children is illegal, and the Justice Department will not hesitate to hold accountable those who possess, produce, or distribute AI-generated child sexual abuse material,” Principal Deputy Assistant Attorney General Nicole Argentieri said in a statement.

Under federal law, computer-generated child sexual abuse material (CSAM) that is “indistinguishable from an actual minor” is illegal, even if the minor depicted is not a real person. Still, the case against Anderegg could test US law. Court documents note that Anderegg isn’t being charged under the US code provision that specifically outlaws sexual exploitation of children; instead, prosecutors argue that his conduct, generating AI CSAM and sharing it online, including with a 15-year-old, amounts to possessing and transferring CSAM.


“The only reasonable explanation for sending these images was to sexually entice the (15-year-old) child. And as a result of this exceptionally serious conduct, the defendant now faces a mandatory minimum sentence of five years and a maximum sentence of decades in prison if convicted,” federal prosecutors add.

In the meantime, the latest Stable Diffusion model includes safeguards designed to prevent it from generating NSFW content.
