@FotoVorschlag #Vertrauen
Trust in the people around us.
Trust in family.
Above all, though, we must learn to trust ourselves. Then one day we can grow beyond ourselves 😊
(Sorry for all the editing. The swan ate my pictures 😬)
The paper on Trust & Safety titled ‘Securing Federated Platforms: Collective Risks and Responses’, from last year's panels with the Carnegie Endowment for International Peace co-hosted by @yoyoel, is now available, and it's well worth a read: https://tsjournal.org/index.php/jots/article/view/171
Very pleased that I could participate and contribute to this in some small part.
Re: Forum/community/UGC platforms selling data to AI/LLM companies to learn on.
I've owned and operated forums (and other spaces with UGC) since 1999. I have millions of pieces of content covering specific verticals in databases, written by people who are smart and deeply interested in those verticals. Some are from closed communities, some are live.
If this trade sneaks through my ESPN standard 10-team fantasy basketball league at the trade deadline, because at least 4 of the 8 other managers don't see it in time, I am going to have to start arguing that big fantasy basketball league hosts need trust and safety teams.
We're proud to have had those rules in place here since day 1 and will be reviewing our policies to further strengthen them as needed.
There is no room for bigotry against gender and sexual minorities here, and we thank IFTAS for their work on making the fediverse a safer and more welcoming place.
From 2022: “I want Elon Musk to spend one day as a frontline production content moderator, and then get back to this [Community Signal] crew about how that went. ...
“Were you able to keep up with the expected pace at Twitter? Could you … make good decisions over 90% of the time, over 1,000, 2,000 times a day? Could you do that all the while seeing animals being harmed, kids being beat on, [and] child sexual exploitation material?” –@ubiquity75
Reminder that Middlesex University London is seeking volunteer moderators to participate in research about the impact of upsetting material, and how moderators cope with the work.
The survey is mostly written for salaried moderation teams, but Middlesex has asked volunteers to skip any irrelevant questions; they really want to hear from us!
The study may lead to tools you can use to help with moderator stress.
What does it mean to retire from community, moderation, trust, and safety work? How do you prepare? What does it feel like to step away from the day-to-day care that you provide for others?
That's what online community legend Rebecca Newton is navigating. After 30 years of full-time work in this industry, she has retired.
In all the posts I've read from the #Threads federation meetings, all held under the Chatham House Rule, it's never been clear exactly when they'll implement the features required for successful moderation of ActivityPub-based platforms.
I would love to gain some clarity on how and when Threads intends to handle Flag, Block, Ignore, and Reject activities, since these are essential to ensuring we can retain safe spaces.
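For readers less familiar with the vocabulary: Flag, Block, Ignore, and Reject are standard activity types in the ActivityStreams vocabulary that ActivityPub builds on. A minimal sketch of what a Flag (report) activity looks like on the wire is below; the actor, object, and status URLs are hypothetical placeholders, not real endpoints.

```python
import json

# Sketch of an ActivityPub "Flag" activity, the type used to report
# content to a remote server's moderators. All URLs here are invented
# examples for illustration only.
flag_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Flag",
    "actor": "https://example.social/users/moderator",       # who is reporting
    "object": "https://example.com/users/someone/status/1",  # what is reported
    "content": "Report reason: targeted harassment",
}

# Serialize for delivery to the remote server's inbox.
payload = json.dumps(flag_activity)
print(payload)
```

Block, Ignore, and Reject follow the same envelope shape, with `type` changed accordingly; whether and how Threads processes these activities when they arrive is exactly the open question above.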
Like, I believe there is a direct correlation between a platform's business values (how it makes money), its moderation practices, and overall user happiness and retention.
Like there's a reason users are referring to your platform as a "hell hole" besides just it being funny.
The IFTAS Moderator Advisory Council plays a pivotal role in assessing needs, shaping project proposals, and guiding IFTAS activities.
If you have moderator experience and can offer two hours a week to contribute valuable feedback, we welcome your interest! A monthly stipend is available.
"At its shining moment, Twitter was like the Tower of Babel before it fell." From Israel vs. Hamas threats to Donald Trump’s “wild” posts, Del Harvey helped make the platform’s hardest content moderation calls for 13 years. Then she left in 2021 … and disappeared. https://www.wired.com/story/del-harvey-twitter-trust-and-safety-breaks-her-silence/
For all those who'd been wondering about FSEP and where it's at, @nivenly have just published an update: FSEP is on hold, pending the return of the original maintainer, or until a new maintainer can be found:
"'As an industry we’ve had a lot of hammers at our disposal. We’re trying to introduce more scalpels into our approach', John Redgrave, Discord’s vice president of trust and safety, told me in an interview. 'That doesn’t just benefit Discord — it benefits all platforms, if users can actually change their behavior'."
Trust & safety is not nearly as easy as you think it is.
"You will be tasked with growing the Trust & Safety team at a social media startup and navigating a series of difficult dilemmas. Get ready to make tough moderation decisions, shape platform policies, and invest in your team as the company scales from small startup to IPO."
@evacide This will cause serious problems with #moderation staff's #MentalHealth too. Forget most of what you read about what gives us long-term #stress and years of #PTSD; much of that is pushed by inexperienced #TrustAndSafety managers and tool vendors. The major factor for an experienced vocational worker is agency, ethics, and the deep-down knowledge that they are doing good.
“[When our filtering tech] catches something that it sees in the [CSAM] database, it packages a report which includes the image, the email that the image was attached to, and a very small amount of identifying information. ... NCMEC looks at it, decides if it’s something that they can run with, and if it is … they send the report to law enforcement in [the correct] jurisdiction.” –@RALSpencer
“I’m going to give you a number that was very shocking. This Azerbaijan [Facebook manipulation] network, it comprised 3% of all comments by [Facebook Pages] on other pages through the entire world. …
“Azerbaijan is, of course, a tiny country. Somewhere at Facebook, I’m sure there was a team whose [goal] was to make page activity go up, and they were congratulating themselves on the comment numbers.”