WI: Possession of AI-generated child sexual abuse imagery may be protected by First Amendment in some cases, judge rules

Source: nbcnews.com 3/18/25

In February, a district judge tossed out a charge against a Wisconsin man for possession of obscene material but allowed other charges to move forward.

Federal prosecutors are appealing a federal judge’s ruling in Wisconsin that possessing child sexual abuse material created by artificial intelligence is in some situations protected by the Constitution.

The order and the subsequent appeal could have major implications for the future legal treatment of AI-generated child sexual abuse material, or CSAM, which has been a top concern among child safety advocates and has become a subject of at least two prosecutions in the last year. If higher courts uphold the decision, it could cut prosecutors off from successfully charging some people with the private possession of AI-generated CSAM.

The case centers on Steven _____, 42, of Holmen, Wisconsin, whom the Justice Department charged in May with “producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16.”

Prosecutors alleged that he used an AI image generator called Stable Diffusion to create over 13,000 images depicting child sexual abuse by entering text prompts into the technology, which then generated fake images depicting non-real children. (Some AI systems are also used to create explicit images of known people, but prosecutors do not claim that is what _____ was doing.)

In February, in response to _____’s motion to dismiss the charges, U.S. District Judge James D. Peterson allowed three of the charges to move forward but threw one out, saying the First Amendment protects the possession of “virtual child pornography” in one’s home. On March 3, prosecutors appealed. 

Read the full article

 

 

14 Comments

It should be considered a computer crime because the act of physical sex is not taking place. It’s actually a sight crime, but lawmakers got around that by adding a subjective “abuse and harm” component that goes along with possession just to make it out to be more than it is. Lawmakers have historically over-valued and hyped the “severity” of possession.

I think it was not too long ago in this forum that this situation with AI was brought up as a possibility for the courts. But they “will know it when they see it,” even when it is a fake. Up the chain this goes…

As a victim sentenced under Wisconsin’s insane 3-year first-time possession law, I’m never shy about voicing my opinion on the cheeseheads who run that crooked state. Wisconsin DOC-paid therapists attempted to guilt-trip me into believing I “harmed a child” by viewing their image on my computer, although it was another person, or the children themselves, who were sexually exploiting their bodies and filming it. If I didn’t fall for that B.S. when a real child was involved, does one believe we should accept those same crackpot views when no children are involved? That’s even too much for a federal judge to believe, but we’re talking about the state of Wisconsin, where authorities live inside an alternative reality.

What surprises me is that the entire under-18 nudist genre of photos on the internet hasn’t been addressed by legislators. They have attacked everything else.

CSAM is making thought crimes happen! Doesn’t matter that the “kids” in the images aren’t real. Doesn’t matter that absolutely nobody was harmed in any way by the production of the images. The “intention” of the creator is whatever they want it to be, and that makes it a crime!

These images are “proof” of future blood libels! No need to commit an actual crime; this makes you pre-guilty of whatever they want to say you’re going to do tomorrow! If necessary, they will also pile on some past invisible blood libels as further “proof” of the blood libels yet to come!

Then there is the ambiguous nature of CSAM. One person’s “art” is another person’s “CSAM.” Everything gets to be decided on a case-by-case basis.

👨 + Everyone = CP
🏃‍♀️ + You = All good
🏃‍♀️ + Me = CSAM

Certain things remain CP, so they can go after anyone and everyone who has a copy. CSAM allows them to be far more selective as to who has committed a crime by having a copy of an image. The intention of the possession makes all the difference. The State will inform you of what your intentions were…

Once again, the power of CSAM in action.

Computer-generated images have long been in the K&D Squad’s crosshairs, but to no avail. Computer-generated images of computer-generated people fail to meet the definition of pornography. If they are not porn, then there is no way they are a crime. However, even though they aren’t porn, they can still be CSAM!

So, so many stories can be fabricated to turn the images into “groomer materials” or just plain old “sexual gratification material.” Either way, they can be declared CSAM despite not being pornography. Also, the designation of CSAM can be based, in part, on the combination of image + person. All Asst. DA Karen has to do is convince the jury that the reason you have this image is…

The film Dirty Dancing… not a problem… unless you intend to use it to groom girls by “normalizing” the idea of teen girls “hooking up” with adult men! Then it is CSAM, but only for you, nobody else. Stream this on Netflix: felony… but Netflix has committed no crime, nor has anyone else who also streamed it… just you! It is your evil intention that makes it a crime!

There was a poster here a while back who got into hot water with his PO and therapy provider because he had family photos that included underage relatives. The “therapist” decided that the only reason he still kept these images was as sexual gratification material. CSAM is going to do that for everyone…

I have an old family photo that includes my nephews when they were preadolescents. My dad has the same picture, and so does my brother. No problem for them… but my copy… CSAM? Asst. DA Karen will decide that, and let me know.

I understand and fully support laws that protect children from harm, which is the rationale for child pornography (CP) laws. Creating CP by definition involves a minor victim. I get that. Distributing CP harms the victim in a number of ways, including embarrassment, psychological damage and potential effects on their later life. I get that. Simply viewing CP is an invasion of the victim’s privacy. I get that. Superimposing a real minor’s face on a compromising photo has similar harms. I get that. Sending erotic images to a minor–particularly if the intent is to desensitize the child to sex–can cause psychological damage. I get that.

What minor is the victim of images that don’t resemble an identifiable person, but are totally computer generated? Other than possible spiritual or psychological harm to the viewer, where is the victim? I’m not condoning these computer generated images, but making them unlawful in a person’s home comes perilously close to a thought crime. 18 U.S.C. §2256(8)(B) defines CP as an image that is “indistinguishable” from that of a real minor. Perhaps I’m missing something. Can someone explain this to me?