Can we have SCAM scanners in FOSS apps without opening up legal and ethical sinkholes?

I know that scanning images for SCAM is kind of dystopian and scary. However, that doesn’t mean that we need to leave ourselves open to abusive material being sent to us.

What I think we need are publicly available ML models that can be run on each device voluntarily to block SCAM from being shown or stored.

Publicly available models would help, but implementing them could be a slippery slope. If popular encrypted messaging apps start shipping this feature built in, it’s possible it will become illegal to turn it off or to use versions of the app with the scanner removed. That would mean we would effectively be stuck with a bad egg in our code.

Maybe the best answer is to not give individuals with a questionable history the ability to message you.

Does anyone else have a thought?

WhoRoger,

Don’t see why not. You can download a database of hashes and compare that locally. Granted, those hashes aren’t “free”, but that’s due to the legal status of such material. The principle itself - comparing hashes - can be FOSS.
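A minimal sketch of that local comparison, assuming the Python `imagehash` library for perceptual hashing; the hash file name and distance threshold are illustrative placeholders, not any real provider’s format:

```python
# Minimal sketch: compare an incoming image against a local list of known hashes.
# Assumes `pip install pillow imagehash`; "known_hashes.txt" and the distance
# threshold are illustrative placeholders, not a real distribution format.
from PIL import Image
import imagehash

def load_known_hashes(path="known_hashes.txt"):
    """Read one hex-encoded perceptual hash per line."""
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

def is_flagged(image_path, known_hashes, max_distance=5):
    """True if the image's perceptual hash is within max_distance of any known hash."""
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= max_distance for known in known_hashes)

if is_flagged("incoming.jpg", load_known_hashes()):
    print("blocked: matches a known hash")
```

Perceptual hashes tolerate small edits (resizing, recompression), which is why a distance threshold is used instead of exact equality.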

Yeah, people can look into the algorithms to see how they work, circumvent them, etc., but that’s no different than with… anything else. If someone is motivated enough to distribute the material, they’ll make their own network. FOSS doesn’t make any difference here.

palitu,

There is a tool that someone built specifically to scan images uploaded to Lemmy for CSAM.

It is really quite clever. The image is put through an ML/AI model, which describes it (image to text), then the text is reviewed against a set of rules to see if it has the hallmarks of CSAM. If it does, the image is deleted.

This is fully self-hosted.

What I like is that it avoids the trauma of a person having to see that sort of thing.
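A minimal sketch of that flow (not the actual tool), assuming the Hugging Face transformers image-to-text pipeline and a deliberately crude keyword rule; the model name and word lists are illustrative stand-ins, not the real tool’s configuration:

```python
# Sketch of the described flow: caption the image, then apply text rules.
# Assumes `pip install transformers pillow torch`; the model name and keyword
# lists are illustrative stand-ins, not the real tool's configuration.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

CHILD_TERMS = {"child", "children", "kid", "boy", "girl", "toddler"}
SEXUAL_TERMS = {"nude", "naked", "sexual", "explicit"}

def review(image_path):
    """Caption the image and flag it if the caption matches both rule sets."""
    caption = captioner(image_path)[0]["generated_text"].lower()
    words = set(caption.split())
    flagged = bool(words & CHILD_TERMS) and bool(words & SEXUAL_TERMS)
    return flagged, caption

flagged, caption = review("upload.jpg")
if flagged:
    print("delete upload, caption was:", caption)
```

The point is that a human only ever reviews the generated text, not the image itself.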

VonReposti,

Poor guy who had to define the rules.

palitu,

You mean the ML model?

I don’t think it is too bad, as it is more like looking for a description that has children and a sexual context. This can be trained without CSAM because the model generalises from situations it has seen before - a pornographic picture (sexual context) and kids playing at a playground (children in the scene).
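To illustrate that, a hedged sketch of scoring the two concepts separately with a zero-shot classifier that has never seen CSAM; the CLIP model name, labels and thresholds are assumptions for illustration only:

```python
# Sketch: score "children present" and "sexual context" independently with a
# zero-shot classifier, and flag only the combination of both concepts.
# Assumes `pip install transformers pillow torch`; labels and thresholds are made up.
from transformers import pipeline

classifier = pipeline("zero-shot-image-classification",
                      model="openai/clip-vit-base-patch32")

def concept_score(image_path, positive, negative):
    """Probability mass the classifier puts on the 'positive' description."""
    results = classifier(image_path, candidate_labels=[positive, negative])
    return next(r["score"] for r in results if r["label"] == positive)

def should_flag(image_path, threshold=0.8):
    children = concept_score(image_path, "a photo with a child in it",
                             "a photo with no children in it")
    sexual = concept_score(image_path, "a sexually explicit photo",
                           "a non-sexual photo")
    return children > threshold and sexual > threshold  # both must trigger

print(should_flag("upload.jpg"))
```

Each concept is something the model learned from ordinary training data; only the conjunction is treated as a red flag.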

Cypher,

The justifications for closed-source scanners are slim. Even knowing how a scanner works, it would be difficult to alter CSAM to completely avoid detection, and those gaps could quickly be closed.

We need an open-source scanner that can be integrated into FOSS safely and with trust.

This will only happen with government permission, as anyone developing it without permission obviously opens themselves up to legal action.

The FOSS community needs to get governments on side with this, but I don’t know where lobbying would best be started. Potentially the EU would be most receptive to this approach?

possiblylinux127,

I figured they could just release an ML model that was trained on CSAM internally.

ono,

The point of CSAM scanners is not to protect children, but to circumvent due process by expanding warrantless surveillance. That is antithetical to FOSS.

So, in a word, no.

possiblylinux127,

So you like child porn? I want a way to block bad content from being received and displayed.

uk_,

You have to rely on a third party that provides you hashes or whatnot to identify images, and that is a business model. Or you could create a DB of hashes yourself, which I think would land you in all kinds of legal trouble. Or you create an algorithm for that work and burn through a hell of a lot of GPU hours (welcome back to a business model).

jeffw,

SCAM or CSAM?

Leperhero,

I’m assuming the latter.
