strypey, (edited )
@strypey@mastodon.nzoss.nz avatar

Read all about it: a Washington Post report on a study, written by a "journalist" who has no idea how internet technology works if it isn't run by a corporation, and who imposes culturally-specific judgements on internet use by people in other cultures, with different ethical frameworks. Honestly, this study could have been titled 'Researchers shocked to find dodgy stuff on dodgy little websites':

https://www.washingtonpost.com/politics/2023/07/24/twitter-rival-mastodon-rife-with-child-abuse-material-study-finds/

(Edit: I originally blamed the study authors for the shoddy journalism)

nicol,
@nicol@social.coop avatar

@strypey I haven't yet read the study but I have read David Thiel's really insightful thread on it: https://social.coop/@det@hachyderm.io/110769470109893365

He wrote it, clearly knows what he's talking about, understands the fediverse, and makes a bunch of useful suggestions.

Dismissing reports of CSAM on servers that run the same software we use, even if it's mostly on blocked servers, isn't a good look. We're not so fragile we can't welcome the research and work to improve Mastodon's built-in tools, no?

nicol,
@nicol@social.coop avatar

@strypey "culturally-specific judgements on internet use by people in other cultures, with different ethical frameworks" also sounds at danger of normalising/minimising CSAM, which I'm sure isn't your intention

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> sounds in danger of normalising/minimising CSAM, which I'm sure isn't your intention

Āe (yes), what I was commenting on is the cultural differences between the US and Japan in where they draw the line between what is CSAM and what is legitimate art. I know it's not a popular position, but I think the Japanese get it right.

I'd be wise not to post hot takes on triggering topics late at night, when I'm annoyed by wilful corporate media slurs against my community (the fediverse, that is).

nicol,
@nicol@social.coop avatar

@strypey the Post's framing was really sensationalist (not that I read the full article).

ftr he kind of covers your point here somewhat: https://hachyderm.io/@det/110769472947313556 and follows it here: https://hachyderm.io/@det/110769473750112571

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
I read @det's thread. It's certainly more nuanced than either the Washington Paste article or the study abstract suggested.

But I remain uncomfortable with the assumption that lolicon inevitably leads to CSAM, for the same reason I reject the broader conservative assumption that all porn leads to CSAM (and all sex work leads to human trafficking). It's equally arguable that lolicon serves as a replacement for CSAM, thus reducing demand for it.

Have any rigorous studies been done on this?

smallcircles,
@smallcircles@social.coop avatar

@nicol @strypey

Well, I haven't read @det's report yet, but the first line of the WaPo article is immediately such a gross misrepresentation:

> "A new report has found rampant child sexual abuse material on Mastodon, a social media site that has gained popularity in recent months as an alternative to platforms like Twitter and Instagram."

A social media site??? WTF, @washingtonpost

> "Mastodon did not return a request for comment."

Which Mastodon? The FOSS project?

This article is utterly sloppy!

KimSJ,
@KimSJ@mastodon.social avatar

@smallcircles @nicol @strypey @det @washingtonpost They should have interviewed John Mastodon, at the very least.

smallcircles,
@smallcircles@social.coop avatar

@KimSJ
@nicol @strypey @det

FYI @washingtonpost that is the elusive billionaire with plans to go to Venus.

strypey,
@strypey@mastodon.nzoss.nz avatar

@KimSJ
> They should have interviewed John Mastodon, at the very least

Or if he wasn't available, I'm sure Sakura Akkoma, Jane Misskey, Bob Firefish or Jim Epicyon would have been happy to comment on the record ; )

@smallcircles @nicol @det @washingtonpost

nicol,
@nicol@social.coop avatar

@smallcircles @strypey @washingtonpost I wasn't defending the shoddy clickbait Post article (and I won't pay to get past the paywall to read it). But the report it refers to, and is inflaming and sensationalising, appears sensible and clearly well informed, based on the thread by its author, who is active here. @strypey's post was attacking the study writer, not the Post journalist, and that's what my comment was about.

smallcircles,
@smallcircles@social.coop avatar

@nicol I wasn't criticising the report, purely the article itself 🤗

Just annoyed at the article. Further down the fold there's more nuance, but for the TL;DR skimming readers the reputational damage is already done. Journalists know this all too well, and it's done deliberately to make the article an easy snack.

@strypey @det @washingtonpost

nicol,
@nicol@social.coop avatar

@smallcircles yeh it's shoddy and infuriating. They should know better. It also puts people on the defensive rather than saying 'we can/must improve things'. Perhaps the Post believes sensationalism forces action, I'm not sure.

But this won't be the last.
What stands out from the reply to @det's thread by mastodon.social dev @renchap – https://oisaur.com/@renchap/110774247198272383 – is how under-resourced we are to deal with real, serious problems that centralised social media have both legal & PR motivation to fix.

smallcircles,
@smallcircles@social.coop avatar

@nicol

FWIW, from what I see on a quick skim of the Stanford report announcement and the report PDF itself, @det and co-author Renée DiResta give a clear and very well laid out explanation that accurately depicts the nature of federated social networks. The moderation challenges are real and urgent, esp. in light of CSAM.

(I think @strypey meant to criticize the @washingtonpost article itself)

smallcircles,
@smallcircles@social.coop avatar

@nicol @det @strypey @washingtonpost

> PR motivation to fix.

@J12t started a community to collect articles in the Press with the purpose of discussing inaccuracies and subsequently notifying article authors or publishers:

https://lemmy.world/c/fediverse_press

strypey,
@strypey@mastodon.nzoss.nz avatar

@smallcircles
> I think @strypey meant to criticize the @washingtonpost article itself

Yes and no. Certainly that article was a steaming garbage pile, and potentially libelous to boot. Lucky for them John Mastodon is not the litigious sort.

But from a skim of the abstract, the study authors ignore the fact that widely blocked Mastodon servers are effectively not part of the fediverse, any more than CessPit or Lies.Social are, or the Chans for that matter.

(1/2)

@nicol @det

strypey,
@strypey@mastodon.nzoss.nz avatar

@smallcircles
What they're doing is essentially guilt-by-association: holding Mastodon accountable for any use of the software they release under a Free Code license with no usage restrictions. I doubt they would blame Automattic if these CSAM sites were using WordPress (with the AP plugin) instead of Mastodon.

(2/2)

@washingtonpost @nicol @det

nicol,
@nicol@social.coop avatar

@strypey @smallcircles @det

Why wouldn't the report authors blame Automattic in that scenario?

Maybe WaPost's framing makes it feel like the Bezos/FAANG establishment is starting its 'smear-Mastodon' engines, triggering Goliath-vs-David instincts?
But there's nothing I've read from Thiel in the highlights of the report to support that. He's found blocked servers kinda leak & makes a number of practical suggestions to improve things. Dismissing this helps Big Tech claim to be the 'safe space'.

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> Why wouldn't the report authors blame Automattic in that scenario?

Because people can't be held responsible for things they have no control over. Like John Mastodon, Automattic have no control over what people do, independently of them, with software they release under Free Code licenses.

If I stab someone with a screwdriver (1), do you blame the company that made or sold the screwdriver?

  1. Not something I do, just to be clear :)

@smallcircles @det

nicol,
@nicol@social.coop avatar

@strypey @smallcircles @det the reason why Mastodon isn't like a CMS or a screwdriver is on Page 1 of the report:
"If a local user follows a remote user who posts illegal content, that content will be federated to the local server and potentially be displayed to users in their federated timeline, as well as stored on the server or media cache."
It's like saying, if you and I own the same brand of screwdriver, I could be an accessory if you stabbed someone - even if we'd never met.

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> If a local user follows a remote user who posts illegal content, that content will be federated to the local server

This has nothing to do with the people who release the Mastodon software. They don't control any of this. The only people in control of this, and responsible for it, are the admins of the servers concerned.

@smallcircles @det

nicol,
@nicol@social.coop avatar

@strypey @smallcircles @det I really don't get what you are arguing against.

The report made five recommendations for Mastodon software that would help instance operators/mods. Moderators have a legal (and moral) obligation to deal with illegal content and many seem to say the tooling at present falls short. What are you so worried about happening if Mastodon makes it easier to moderate/block/report/trauma-protect CSAM?

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> What are you so worried about happening if Mastodon makes it easier to moderate/block/report/trauma-protect CSAM?

That seems unnecessarily personal. I'm just trying to answer the question you asked:

> Why wouldn't the report authors blame Automattic in that scenario?

(1/2)

@smallcircles @det

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
As it happens, I'm a survivor of (non-sexual) child abuse, by a person who later did jail time for sexually abusing members of my family, while they were children. I'll let you guess how much sympathy I have for the abuse of children (or anyone).

But precisely because this is such a triggering topic for me, I'm very careful to run a logical slide rule over anything said on the subject. Also because of QAnon hysteria around the topic (Pizzagate etc).

(2/2)

@smallcircles @det

nicol,
@nicol@social.coop avatar

@strypey @smallcircles @det

I'm very sorry to hear that. And I'm sorry if you feel my tone was personal. Ironically I stepped into your mentions as I felt your tone to the report authors was unfair!

I've just finished reading it (https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf). Sadly it's not clear what % of the CSAM it found came from Japan.

It has 5 broad recommendations, some of which I imagine would be much less controversial than others.
I feel they invite calm discussion (and the discussion cannot/shouldn't be avoided).

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> Ironically I stepped into your mentions as I felt your tone to the report authors was unfair!

Fair call. The OP was a late night hot take. I don't know if you saw, but I edited it to focus my criticism on the clickbait Post article and its author.

> I feel they invite calm discussion (and the discussion cannot/shouldn't be avoided)

Again, fair call.

@smallcircles @det

nicol,
@nicol@social.coop avatar

@strypey @smallcircles @det

Am pasting Thiel's comment on this point (https://hachyderm.io/@det/110771062279584394) as I find it convincing. The only argument against it is that acknowledging the existence of the full Mastodon-using universe (Gab, Truth & much worse) risks more troll-press - but it seems kinda techbro/three-wise-monkeys to wash our hands and say 'not our problem, what we block doesn't exist'.

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
@det's comment reflects a common fallacy: that it's the distribution of CSAM that drives the harm. Demand drives production. Production drives distribution. Effective prevention of child sex abuse (whether documented or not) starts with effective mental health treatment to reduce demand, and effective law enforcement to reduce production. Targeting distribution is only useful to the degree it helps with those.

(1/2)

@smallcircles

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol We're not responsible for things we have no control over. That includes what people do on private instances of Mastodon, or WP, or any other CMS. Any wailing and gnashing of teeth about that is a distraction.

(2/2)

@det @smallcircles

nicol,
@nicol@social.coop avatar

@strypey @det @smallcircles

I disagree. This is the 'guns don't kill people, people kill people' line the gun lobby uses to stop regulations that would clearly save lives. If media didn't have an effect on those who consume it, the ad industry wouldn't exist.

Distribution is the only bridge between demand and production. In any medium, throttle distribution and there are fewer ways to meet demand, so less motivation to produce.

But I'm not sure us debating this will be productive.

strypey, (edited )
@strypey@mastodon.nzoss.nz avatar

@nicol
> Distribution is the only bridge between demand and production. In any medium, throttle distribution and there are fewer ways to meet demand, so less motivation to produce

This is exactly the same argument prohibitionists use for targeting drug distribution, as if this will reduce drug-related harms. The data is in and they're so clearly wrong that many jurisdictions have already switched away from targeting distribution.

(1/2)

@det @smallcircles

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
Where the analogy breaks down is that CSAM harm is done with every act of production, and only then. Whereas drug harm is only done during inappropriate acts of consumption.

But in either case, distribution is not where the harm happens. People I've known who worked on law enforcement against CSAM will tell you that old-fashioned police work - going undercover in CSAM groups, not trying to snuff them out - is the only effective way to catch CSAM producers.

(2/2)

@det @smallcircles

nicol,
@nicol@social.coop avatar

@strypey @smallcircles this is why I didn't feel debating would be productive.

I think it's like guns - something that I think should be largely illegal. You think it's like drugs - something that (I/we think) shouldn't be illegal.

Most people I know have tried drugs, many have enjoyed them, & the harm is at the supply end, so I don't get the analogy. In the UK you can't buy a gun & we have no mass shootings.

Neither of us are experts so our analogies probably aren't much use either.

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> Neither of us are experts so our analogies probably aren't much use either

Granted. Which is why I mentioned the law enforcement folks I've spoken to, and their opinion that driving CSAM distributors deeper into the Dark Net makes their job harder, while doing nothing to reduce overall distribution. This is a debate as old as the net itself:

https://en.m.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalypse

@smallcircles

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> This is a debate as old as the net itself

... and it's a pressing issue in the US right now, because of the horrific KOSA Bill. Once again "think of the children" is being wheeled out to justify a China-style crackdown on civil liberties in the digital sphere:

https://universeodon.com/@siderea/110779140924711110

@smallcircles

nicol,
@nicol@social.coop avatar

@strypey @smallcircles in the UK too with the Online Harms Bill, which sounds similar to KOSA, & in the EU with Chat Control (https://www.theguardian.com/world/2023/may/08/eu-lawyers-plan-to-scan-private-messages-child-abuse-may-be-unlawful-chat-controls-regulation).

Because it is so easily weaponised, it seems all the more reason imo that we need to be proactive and not wait for the first major fediverse scandal / press storm to act.

Eg it struck me: where do instance moderators & admins hang out to discuss these things? Is there a forum / Safety Group? A space on ActivityPub Rocks, or is it all private chats?

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> where do instance moderators & admins hang out to discuss these things?

I believe there's an admins' lounge run by John Mastodon, a Discourse forum or a Discord channel or somesuch. Maybe also the tag? It sounds like the proposals you mention are more dev matters though. In which case, SocialHub or the Fediverse Developers network Matrix rooms are two places to start conversations. I think there's also some Lemmy groups?

@smallcircles

strypey,
@strypey@mastodon.nzoss.nz avatar

@nicol
> Because it is so easily weaponised, it seems all the more reason imo that we need to be proactive

Pushing back against the propaganda and libel being spread by politically or financially motivated bloggers, or corporate media, seems like a good start. There's no point coding solutions until we have carefully defined problems. So pointing out flaws in proposed ones seems useful too.

@smallcircles

strypey, (edited )
@strypey@mastodon.nzoss.nz avatar

@nicol
> the EU with Chat Control

The 'think of the children' manipulation being used to justify it is truly galling:

https://european-pirateparty.eu/manipulative-eu-opinion-poll-no-justification-for-indiscriminate-chatcontrol/

This is bigger than the verse vs. the DataFarms. This is about whether governments will:

a) defend an open, uncapturable internet and make sure operators respect privacy

b) criminalise permissionless networks and rigorous privacy protection, and drive them underground

It's a fight for the very soul of the net.

@smallcircles

Nika2022,

@strypey @nicol @smallcircles
Yep, we saw how "well" SESTA turned out.
Used ONCE, but it made a lot of legal content disappear.

strypey,
@strypey@mastodon.nzoss.nz avatar

We are getting some runs on the board though:

"Today’s landmark ruling by the European Court of Justice likely means the end of the common practice of clickstream logging, which I’ve been fighting for over a decade. Meta’s pretexts to justify this practice have largely been dismissed. Even for security purposes, indiscriminate and pervasive logging of all our clicks in an identifiable way is not necessary."

MEP Patrick Breyer, 2023

https://european-pirateparty.eu/ecj-ruling-on-meta-browsing-records-breakthrough-for-online-privacy/

@nicol
@smallcircles

nicol,
@nicol@social.coop avatar

@strypey @smallcircles fyi Patrick is here: @echo_pbreyer. He gave a good presentation at 2022's Freedom Not Fear, a Brussels event that brings activists and MEPs together, where I first heard about Chat Control.
