jasonkoebler,
@jasonkoebler@mastodon.social avatar

Tech companies have realized that their AI models are trained on images of children and are enabling CSAM, so they’re going to try to iterate their way out of it, regardless of the wreckage

https://www.404media.co/tech-companies-promise-to-try-to-do-something-about-all-the-ai-csam-theyre-enabling/

drahardja,
@drahardja@sfba.social avatar

@jasonkoebler What a complete hornet’s nest our technology has opened up.

I’m skeptical that we can confidently categorize generated CSAM in an automated way without lots of false positives, or that we can prevent combinations of models and training methods from continuing to generate CSAM-like or deepfake imagery downstream. It’s hard to even imagine what kind of legal framework (short of a free-for-all) could be consistently applied in this space.

This mess is going to be with us as long as this tech is around.

drahardja,
@drahardja@sfba.social avatar

@jasonkoebler To be clear, this problem space existed long before generative AI. Fake celebrity porn has been around since Photoshop first went on sale. What AI has done is automate and democratize the process, so that the output becomes both more personalized and more numerous. What was once an annoyance is now an epidemic.

rysiek,
@rysiek@mstdn.social avatar

@drahardja @jasonkoebler

> I’m skeptical that we can confidently categorize generated CSAM in an automated way without lots of false positives

It is basically impossible.

But more importantly (and this is constantly missed in such conversations!), what counts as CSAM is contextual.

A photo of a naked baby sent between parents and close family is most definitely not CSAM. Neither is a photo sent between teenage kids infatuated with one another.

Change the context, and suddenly the same photo is CSAM.

drahardja,
@drahardja@sfba.social avatar

@rysiek @jasonkoebler While that may be true of human-generated images, I think a stronger line can be drawn for generated images.

Nevertheless, it’s incredibly difficult, and likely impossible. It’s already hard enough to distinguish between “art” and “porn” even with human judges.

rysiek,
@rysiek@mstdn.social avatar

@drahardja

> While that may be true of human-generated images, I think a stronger line can be drawn for generated images.

But how would that work in practice? There are currently no tools that can reliably tell whether an image was synthetically generated…

And the generators are being improved specifically to make it even harder to spot synthetically generated images.

@jasonkoebler

drahardja,
@drahardja@sfba.social avatar

@rysiek @jasonkoebler In theory, server-side image generation services can run detectors on their own output, which is known to be machine-generated, and prevent suspicious images from being delivered. There’s certainly no equivalent way to monitor client-side generators.

In any case, I think it’s likely impossible to get this kind of system to a point where it’s satisfactory without being annoying.
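In sketch form, that output-side gate might look something like the following. This is a minimal Python sketch, not any real service’s API: SafetyClassifier, SafetyVerdict, deliver_or_block, and the 0.9 threshold are all hypothetical stand-ins.

    from dataclasses import dataclass

    @dataclass
    class SafetyVerdict:
        score: float   # 0.0 = clearly benign, 1.0 = clearly abusive
        blocked: bool

    class SafetyClassifier:
        """Hypothetical output-side detector; a real one would be an ML model
        trained to flag CSAM-like or deepfake imagery."""
        def score(self, image_bytes: bytes) -> float:
            raise NotImplementedError

    def deliver_or_block(image_bytes: bytes,
                         classifier: SafetyClassifier,
                         threshold: float = 0.9) -> SafetyVerdict:
        # Because the service controls generation, there is no provenance
        # question here: unlike a detector run on arbitrary images in the
        # wild, this image is known to be machine-generated.
        score = classifier.score(image_bytes)
        return SafetyVerdict(score=score, blocked=score >= threshold)

The threshold is where the “satisfactory without being annoying” tension lives: lower it and false positives block benign output, raise it and more bad output slips through.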

rysiek,
@rysiek@mstdn.social avatar

@drahardja oh yeah, that's a good point. And they kinda sorta do already, but as far as I understand, currently only on the input (prompt) side.

Of course, running these kinds of filters on the output side would be way more expensive for the service, so that won't happen unless they're forced to.

But yeah, this is an interesting thought!

@jasonkoebler
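For contrast, the input-side filtering mentioned above can be as cheap as a string scan that runs before any GPU time is spent, which is presumably why services filter there first. A toy sketch, with the denylist term as an illustrative placeholder (real services likely use classifiers rather than word lists):

    # Toy input-side filter: reject a prompt before generation even starts.
    BLOCKED_TERMS = {"example_blocked_term"}  # illustrative placeholder only

    def prompt_allowed(prompt: str) -> bool:
        words = set(prompt.lower().split())
        return not any(term in words for term in BLOCKED_TERMS)

An output-side filter, by comparison, means running a second model over every generated image, which is exactly the cost asymmetry noted above.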

rysiek,
@rysiek@mstdn.social avatar

@drahardja @jasonkoebler

Which means there is no technical way of sorting this out!

If identical photos can simultaneously be and not be CSAM depending on context, then trying to create a software filter that decides what is or isn’t CSAM from photo characteristics alone is outright folly.

jasonkoebler,
@jasonkoebler@mastodon.social avatar

we take this problem that we ourselves created and which enriches us very seriously

jasonkoebler,
@jasonkoebler@mastodon.social avatar

a company that has existed for two years, whose main interaction with its community happens on Discord, and who has like a dozen employees total has now deemed itself ready to tackle such questions as "What defines a child?" and "how do you define age?"
