BreakDecks,

In America at least, people often confuse child pornography laws with obscenity laws and end up missing the point. Obscenity laws are a violation of free speech, but that’s not what a CSAM ban is about. It’s about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can’t consent, so even distributing or merely retaining this content violates a child’s rights.

Which is why the courts have thrown out lolicon bans on First Amendment grounds every time one has been attempted. Simulated CSAM lacks a child whose rights could be violated, and it generally meets the definition of art, which is protected expression no matter how offensive.

It’s a sensitive subject that most people don’t see nuance in. It’s hard to admit that pedophilia isn’t a criminal act by itself; it becomes one only when an actual child is made a victim, or a conspiracy to victimize children is uncovered.

With that said, we don’t have much of a description of the South Korean man’s offenses, and South Korea iirc has laws similar to the US’s on this matter. It is very possible that he was modifying real pictures of children with infill, or training models on pictures of a real child to generate fake porn of that child. That would introduce a real child as a victim, which is my theory of what this guy was doing. Probably on a public image generation service that flagged his uploads.
