cumming_normi,

Because “CSAM” has abuse as the third word in the acronym. Machine learning could, in theory (I lack knowledge of the current implementations), be trained without any children being abused in any traditional sense, and then used to produce the content without any real children being involved (ignoring the training data).

The downvotes likely come from a difference in how people define abuse versus CP; images of nonexistent people cannot realistically harm anyone.
