UK Trial: Pornhub's Chatbot Halts Millions of Searches for Child Abuse Material

A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the trial's data and design, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes that similar measures on other platforms will create a safer internet environment.
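For illustration only, here is a minimal sketch of the intervention flow the summary describes: a flagged search returns a warning and a chatbot referral instead of results. The term set, function name, and messages are all invented for the sketch; Pornhub's actual implementation is not public.

```python
# Hypothetical sketch of the trial's flow; FLAGGED_TERMS and handle_search
# are invented stand-ins, not Pornhub's real code.
FLAGGED_TERMS = {"example flagged term"}  # stand-in for the real curated list

def handle_search(query: str) -> str:
    """Return a deterrence message instead of results for flagged queries."""
    if any(term in query.lower() for term in FLAGGED_TERMS):
        # Per the article: no results are shown; the user gets a warning
        # plus a chatbot directing them to support services.
        return "Warning: this search may relate to illegal material. Support services are available."
    return f"results for {query!r}"

print(handle_search("example flagged term videos"))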

Kusimulkku,

I was wondering what sort of phrases get that notification, but mentioning that might be a bit counterproductive

Thorny_Insight,

I'm not sure if it's related, but as a life-long miniskirt lover I've noticed that many sites no longer return results for the term "schoolgirl"; instead you need to search for "student"

Hyperreality,

Obviously don't google this, but IIRC one of the terms used was lemon party.

LanternEverywhere,

Can you very loosely tell me what that is so I don't have to google it?

Beardedsausag3,

Lemon party was a bunch of old naked dudes sat in a group, I think... might've been involving themselves with each other? It's been a fucking loooong-ass time since I got shown that and Meatspin at school lol

NoIWontPickaName,

Really?

ShadowRam,

hahaha... it saddens me that only those >30yrs old may get this.

jaycifer,

Hey now, I understood that reference and I’m.. only.. 27.

30 years draws ever nearer.

Squire1039,

ML models have been shown to be extraordinarily good at statistically guessing your words, so the list of covered terms is probably comprehensive.

Kusimulkku,

I think the other article talks about it being a manually curated list: while ML can surface the right words, it also surfaces random stuff, so you need to check it isn't making spurious connections. It's pretty interesting how it all works
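As a rough sketch of that "ML proposes, humans approve" loop: all the terms below are invented, and stdlib string similarity stands in for whatever model the real pipeline uses (which is not public).

```python
# Speculative sketch of ML-assisted curation: the model surfaces candidate
# terms similar to known banned ones, and a human reviews each proposal.
# difflib stands in for a real similarity model; all terms are invented.
from difflib import SequenceMatcher

SEED_TERMS = ["known banned phrase"]                  # manually curated seeds
CANDIDATES = ["knwon banned phrase", "pizza recipe"]  # e.g. mined from search logs

def similarity(a: str, b: str) -> float:
    """Cheap stand-in for an embedding-based similarity score."""
    return SequenceMatcher(None, a, b).ratio()

def proposals(threshold: float = 0.8):
    """Yield candidates that look close to an existing banned term."""
    for cand in CANDIDATES:
        score = max(similarity(cand, seed) for seed in SEED_TERMS)
        if score >= threshold:
            yield cand, score

# Nothing is added to the live list automatically: a human reviewer accepts
# or rejects each proposal, which is what catches spurious connections.
for cand, score in proposals():
    print(f"needs review: {cand!r} (score {score:.2f})")
```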

xePBMg9,

“Young” and “playful” probably.

Bgugi,

Aylo maintains a list of more than 28,000 banned terms in multiple languages, which is constantly being updated.

I'd be very curious what these terms are, but I wouldn't be surprised if "pizza guy" or "school uniform" would trigger a response.
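To give a sense of how matching a query against a large multilingual list like that might work, here is a hedged sketch; the terms and normalization rules are assumptions, since Aylo has not published its implementation.

```python
# Minimal sketch of checking a query against a multilingual banned-terms
# list. The entries and normalization choices here are assumptions.
import unicodedata

BANNED_TERMS = {"example banned term", "schüler beispiel"}  # stand-in for ~28,000 entries

def normalize(text: str) -> str:
    """Lowercase and strip accents so trivial spelling variants still match."""
    decomposed = unicodedata.normalize("NFKD", text.lower())
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

NORMALIZED = {normalize(t) for t in BANNED_TERMS}

def is_flagged(query: str) -> bool:
    """True if any banned term appears in the normalized query."""
    q = normalize(query)
    return any(term in q for term in NORMALIZED)

print(is_flagged("Schüler Beispiel video"))  # True: accent-insensitive match
```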

FinishingDutch,

Sounds like a good feature. Anything that stops people from doing that is great.

But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

tordenflesk,

I think it’s an early prevention type of thing.

AceSLS,

It had all sorts of illegal things before they purged all unverified uploaders due to legal pressure

silasmariner,

were people really expecting to find that content on PornHub?

Welcome to the internet 😂 where people constantly disappoint/surprise you (what word is that? Dissurprise? Disurprint?)

520,

So... Pornhub has actually had problems with CSAM. It used to be much more of a YouTube-like platform where anyone could upload.

Even without that aspect, there are a looot of producers that don't do their checks well, and a lot of underage actresses fall through the cracks.

CameronDev,

PH had a pretty big problem with CSAM a few years ago; they ended up wiping ~2/3 of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos; not all of it was CSAM.)

And I'm guessing they are trying to catch users who are trending towards questionable material: "College" ✅ -> "Teen" ⚠️ -> "Young Teen" ⚠️⚠️⚠️ -> "CSAM" 🚔, etc.
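Purely speculative, but that escalation idea could be as simple as a decaying per-session risk score; the weights, decay, and threshold below are all invented for the sketch.

```python
# Invented sketch of the "trending towards questionable searches" idea
# above; term weights, decay, and threshold are made up, not PH's logic.
TERM_WEIGHTS = {"college": 0, "teen": 1, "young teen": 3}

def session_risk(searches: list[str], decay: float = 0.9) -> float:
    """Accumulate a decaying risk score over one session's search history."""
    score = 0.0
    for query in searches:
        score = score * decay + TERM_WEIGHTS.get(query.lower(), 0)
    return score

history = ["College", "Teen", "Young Teen"]
if session_risk(history) >= 3:
    print("Escalate: show the warning and chatbot.")
```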

macrocephalic,

That explains why it’s all commercial stuff now… So I heard.

CameronDev,

Sure sure, whatever you say Big Dick :D

OKRainbowKid,

Big head, rather

kescusay,

No, no, it’s French. Macro ce phallique. It means “macro this phallic.” Obviously.

FinishingDutch,

Wow, that bad? I was aware they purged a lot of ‘amateur’ content over concerns regarding consent to upload/revenge porn, but I didn’t know it was that much.

CameronDev,

Yeah, unverified user content had a lot of problems. Also piracy, gore, etc.

arstechnica.com/…/pornhub-purges-all-unverified-u…

The purge appears to have hit almost 9 million of the 13.5 million videos on Pornhub as of Sunday, or nearly two-thirds of all the content hosted on the site.

azertyfun,

Eeeeeeeh. There’s nuance.

IIRC there were only a handful of verified CSAM videos on the entire website. It's inevitable; it happens everywhere with UGC, including on here. Anecdotally, in the years leading up to the purge PH had already cleaned up its act, and from what I saw pirated content was rather well moderated. However, this time the media made a huge stink about the alleged CSAM, and payment processors threatened to pull out (they are notoriously puritan; it's caused a lot of trouble for lemmynsfw's admins, for instance). So, regardless of the validity of the initial claims, PH had to do something to regain the trust of payment processors, and they basically nuked every video that did not have a government ID attached.

Now if I may speculate a little: one of the reasons it happened this way is probably that, due to its industry position, PH is way better moderated than most (if not all) websites of its size and had already verified a bunch of its creators. At the same time, the rise of OnlyFans and similar websites means that real amateur content has all but disappeared, so there was less and less reason to allow random UGC. The high moderation costs probably didn't make much sense anymore.

CameronDev,

Yeah, there were a lot of reasons. CSAM was just the loud one.

root,

Spot on. The availability of CSAM was overblown by a well-funded special interest group (Exodus Cry). The articles about it were pretty much ghostwritten by them.

When you’re the biggest company in porn you’ve got a target on your back. In my opinion they removed all user content to avoid even the appearance of supporting CSAM, not because they were guilty of anything.

PornHub has been very open about normalizing healthy sexuality for years, while also providing interesting data access for both scientists and the general public.

“Exodus Cry is an American Christian non-profit advocacy organization seeking the abolition of the legal commercial sex industry, including pornography, strip clubs, and sex work, as well as illegal sex trafficking.[2] It has been described by the New York Daily News,[3] TheWrap,[4] and others as anti-LGBT, with ties to the anti-abortion movement.[5]”

en.wikipedia.org/wiki/Exodus_Cry

azertyfun,

They’re the fuckers who almost turned OF into Pinterest as well? Not surprising in retrospect. The crazy thing is how all news outlets ran with the narrative and payment processors are so flaky with adult content. De-platforming sex work shouldn’t be this easy.

CameronDev,

That kinda sounds reasonable. Especially if it can prevent someone going down that rabbit hole? Good job, PH.
