A Wisconsin software engineer was arrested on Monday for allegedly creating and distributing thousands of images of AI-generated child sexual abuse material (CSAM).
Court documents describe Steven Anderegg as “extremely tech-savvy” with a background in computer science and “decades of software engineering experience.” Anderegg, 42, is accused of sending AI-generated nude images of minors to a 15-year-old boy via Instagram DMs. Anderegg came to the attention of law enforcement after the National Center for Missing and Exploited Children flagged the messages, which he allegedly sent in October 2023.
According to records Instagram provided to law enforcement, Anderegg posted an Instagram Story in 2023 “consisting of realistic GenAI images of minors wearing BDSM-themed leather outfits” and encouraged others to check out what they had missed on Telegram. Anderegg allegedly “discussed his desire to have sex with prepubescent boys” in private messages with other Instagram users and told one Instagram user that there were “tons” of other AI-generated CSAM images on his Telegram.
Anderegg allegedly continued sending the images to another Instagram user even after learning he was only 15 years old. “When the minor informed him of his age, the defendant did not rebuff him or inquire further. Instead, he wasted no time describing to the minor how he created sexually explicit GenAI images and sent the child customized content,” the complaint states.
Prosecutors said that when law enforcement searched Anderegg’s computer, they found more than 13,000 images, “hundreds if not thousands of which depicted nude or semi-nude prepubescent minors.” Charging documents say Anderegg produced the images with the text-to-image model Stable Diffusion, a product created by Stability AI, and used “extremely specific and explicit prompts to create the images.” Anderegg also allegedly used “negative prompts” to avoid creating images depicting adults, and used third-party Stable Diffusion add-ons that “specialize in genitalia.”
Last month, several major tech companies, including Google, Meta, OpenAI, Microsoft, and Amazon, said they would review their AI training data for CSAM. The companies committed to a new set of principles, including “stress testing” models to ensure they don’t generate CSAM. Stability AI has also signed on to these principles.
According to prosecutors, this isn’t the first time Anderegg has come to the attention of law enforcement for allegedly possessing CSAM. Prosecutors said that in 2020, someone used the internet at Anderegg’s home in Wisconsin to attempt to download multiple known CSAM files. When law enforcement searched his home in 2020, Anderegg admitted to running a peer-to-peer network on his computer and frequently resetting his modem, but he was not charged.
In a brief supporting Anderegg’s pretrial detention, the government noted that he has been a software engineer for more than 20 years and that his resume includes a recent job at a startup, where he touted his “excellent technical understanding in formulating practical AI models.”
If convicted, Anderegg faces up to 70 years in prison, though prosecutors said the “recommended sentencing range could be as high as life in prison.”