shrugal,

Normalising AI-generated CSAM might reduce the harm done to children during the production of the material, but it creates many more abusers.

The problem with your argument is that it assumes a bunch of stuff we just don’t know, because we haven’t tried it yet. The closest comparison we do have is drugs, and there controlled access has proven to work really well. So I think it’s at least worth thinking about and running limited real-world trials.

And I don’t think any sane person is suggesting we just legalize and normalize it. It would have to be a way for people to self-report and seek help, with conditions such as mandatory check-ins/counseling and not being allowed to work with children.
