In February, a district judge tossed out one charge of possession of obscene material against a Wisconsin man but allowed other charges to move forward
Federal prosecutors are appealing a federal judge’s ruling in Wisconsin that possessing child sexual abuse material created by artificial intelligence is in some situations protected by the Constitution.
The order and the subsequent appeal could have major implications for the future legal treatment of AI-generated child sexual abuse material, or CSAM, which has been a top concern among child safety advocates and has become a subject of at least two prosecutions in the last year. If higher courts uphold the decision, it could cut prosecutors off from successfully charging some people with the private possession of AI-generated CSAM.
The case centers on Steven Anderegg, 42, of Holmen, Wisconsin, whom the Justice Department charged in May with “producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16.”
Prosecutors alleged that he used an AI image generator called Stable Diffusion to create over 13,000 images depicting child sexual abuse, entering text prompts that generated fake images of non-real children. (Some AI systems are also used to create explicit images of known people, but prosecutors do not claim that is what Anderegg was doing.)
In February, in response to Anderegg’s motion to dismiss the charges, U.S. District Judge James D. Peterson allowed three of the charges to move forward but threw one out, saying the First Amendment protects the possession of “virtual child pornography” in one’s home. On March 3, prosecutors appealed.
It should be considered a computer crime because the act of physical sex is not taking place. It’s actually a sight crime, but lawmakers got around that by adding a subjective “abuse and harm” component that goes along with possession just to make it out to be more than it is. Lawmakers have historically over-valued and hyped the “severity” of possession.
I think it was not too long ago in this forum that this situation with AI was brought up as a possibility for the courts. But they “will know it when they see it,” even when it is a fake. Up the chain this goes…
As a victim sentenced under Wisconsin’s insane 3-year first-time possession law, I’m never shy about voicing my opinion on the cheeseheads who run that crooked state. Wisconsin DOC-paid therapists attempted to guilt-trip me into believing I “harmed a child” by viewing their image on my computer, although it was another person, or the children themselves, sexually exploiting their bodies and filming it. If I didn’t fall for that B.S. when a real child was involved, does one believe we should accept those same crackpot views when no children are involved? That’s even too much for a federal judge to believe, but we’re talking about the state of Wisconsin, where authorities live inside an alternate reality.
What surprises me is that the entire under-18 nudist genre of photos on the internet hasn’t been addressed by legislators. They have attacked everything else.
CSAM is making thought crimes happen! Doesn’t matter that the “kids” in the images aren’t real. Doesn’t matter that absolutely nobody was harmed in any way by the production of the images. The “intention” of the creator is whatever they want it to be, and that makes it a crime!
These images are “proof” of future blood libels! No need to commit an actual crime, this makes you pre-guilty of whatever they want to say you’re going to do tomorrow! If necessary, they will also pile on some past invisible blood libels, as further “proof” of the blood libels yet to come!
Then there is the ambiguous nature of CSAM. One person’s “art” is another person’s “CSAM.” Everything gets to be decided on a case-by-case basis.
👨 + Everyone = CP
🏃‍♀️ + You = All good
🏃‍♀️ + Me = CSAM
Certain things remain CP, so they can go after anyone and everyone that has a copy. CSAM allows them to be far more selective as to who has committed a crime by having a copy of an image. The intention of the possession makes all the difference. The State will inform you what your intentions were…
Once again, the power of CSAM in action.
Computer-generated images have long been in the K&D Squad’s crosshairs, but to no avail. Computer-generated images of computer-generated people fail to meet the definition of pornography. If they are not porn, then there’s no way they are a crime. However, even though they aren’t porn, they can still be CSAM!
So, so many stories can be fabricated to turn the images into “groomer materials” or just plain old “sexual gratification material.” Either way, they can be declared CSAM despite not being pornography. Also, the designation of CSAM can be based, in part, on the combination of image + person. All Asst DA Karen has to do is convince the jury that the reason you have this image is…
The film Dirty Dancing… not a problem… unless you intend to use it to groom girls by “normalizing” the idea of teen girls “hooking up” with adult men! Then it is CSAM, but only for you, nobody else. Stream this on Netflix: felony… but Netflix has committed no crime, nor has anyone else who also streamed it… just you! It is your evil intention that makes it a crime!
There was a poster here a while back who got into hot water with his PO and therapy provider because he had family photos that included underage relatives. The “therapist” decided that the only reason he still had those images was for sexual gratification. CSAM is going to do that for everyone…
I have an old family photo with my nephews in it when they were preadolescents. My dad has the same picture; so does my brother. No problem for them… but my copy… CSAM? Asst DA Karen will decide that and let me know.
I understand and fully support laws that protect children from harm, which is the rationale for child pornography (CP) laws. Creating CP by definition involves a minor victim. I get that. Distributing CP harms the victim in a number of ways, including embarrassment, psychological damage and potential effects on their later life. I get that. Simply viewing CP is an invasion of the victim’s privacy. I get that. Superimposing a real minor’s face on a compromising photo has similar harms. I get that. Sending erotic images to a minor–particularly if the intent is to desensitize the child to sex–can cause psychological damage. I get that.
What minor is the victim of images that don’t resemble an identifiable person, but are totally computer generated? Other than possible spiritual or psychological harm to the viewer, where is the victim? I’m not condoning these computer generated images, but making them unlawful in a person’s home comes perilously close to a thought crime. 18 U.S.C. §2256(8)(B) defines CP as an image that is “indistinguishable” from that of a real minor. Perhaps I’m missing something. Can someone explain this to me?