about.iftas.org

Nonprofit trust and safety support for volunteer social web content moderators

about.iftas.org, to fediverse

We’ve had a tremendously busy first quarter, with too much to fit in a newsletter, so here’s a roundup of what’s been happening these past few months.

Content Classification System

This is the biggest project we have underway: building an opt-in, privacy-preserving CSAM detection and reporting system to help protect the Fediverse. We are halfway through our initial buildout, which will allow server operators to optionally send their media to IFTAS for hash-and-match detection using the Safer platform from Thorn. No media leaves IFTAS, and if we get a pertinent match we take care of the necessary reporting. This is a complex activity, but we are working our way through it and hope to have more to share soon.
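As a rough sketch of the hash-and-match idea (illustrative only, not the Safer platform’s actual API: production systems use vendor-supplied hash sets and perceptual hashes such as PDQ rather than plain SHA-256):

```python
import hashlib

# Hypothetical known-hash set; real deployments receive these from a
# vendor such as Thorn and never redistribute the media itself.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def media_matches(media_bytes: bytes) -> bool:
    """Hash the media and check the digest against the known-hash set.

    Only digests are compared, which is why hash matching can flag
    known material without anyone having to re-view it.
    """
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(media_matches(b"foo"))  # True (this digest is seeded above)
print(media_matches(b"bar"))  # False
```

A perceptual hash would additionally tolerate resizing and re-encoding, which exact cryptographic hashes like the one above do not.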

https://cryptpad.fr/form/#/2/form/view/SXiobzcxTRrpVsWMJDh+h+loLkAmsTQ-8-egNm+ihlo/

Moderator Advisory Panel

We’ve set aside funds to pay active moderators a monthly stipend for their guidance and input on our activities, and we are extremely pleased to announce that our initial cohort has been onboarded and the first payments went out for March. This group is tasked with reviewing our products and services, and with ensuring a broad range of voices is heard throughout the process. You can review our Moderator Advisory Panel on the About Us page. Welcome to everyone who stepped up to help guide this work, and thank you for your participation!

FediCheck / CARIAD

FediCheck is our moderation-as-a-service domain federation app: it allows Mastodon servers to sign in and automatically update their domain blocks and retractions from a trusted list. For this iteration we are using our CARIAD list (an aggregation of the most blocked domains) combined with our Do Not Interact list. Each domain is reviewed before inclusion, and the service is intended to give new administrators a kick start on their federation choices while keeping them safe from day-one harassment.

We’ve onboarded our first batch of beta testers and while we’ve got some kinks to iron out, the service is working well. We’ll keep adding more servers from the initial round of requests, and work our way toward making this a free, public service.
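At its core, a sync pass like this reduces to a set difference between the server’s current list-managed blocks and the trusted list. A minimal sketch, with hypothetical function and domain names (this is not FediCheck’s actual implementation):

```python
def plan_sync(list_managed_blocks: set[str],
              trusted_list: set[str]) -> tuple[set[str], set[str]]:
    """Return (domains to block, domains to retract).

    A domain is added when the trusted list carries it but the server
    does not yet block it; a previously synced block is retracted when
    the list has since dropped the domain. Only blocks applied by an
    earlier sync are candidates for retraction, so hand-curated admin
    blocks stay untouched.
    """
    to_block = trusted_list - list_managed_blocks
    to_retract = list_managed_blocks - trusted_list
    return to_block, to_retract

current = {"spam.example", "old.example"}
trusted = {"spam.example", "new.example"}
print(plan_sync(current, trusted))  # ({'new.example'}, {'old.example'})
```

A real client would then apply the plan through the server’s admin API (Mastodon exposes domain blocks over its admin REST API) rather than editing anything by hand.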

If you’d like to use FediCheck for your server, please register your interest.

[Screenshot of the FediCheck app showing a server’s current denylist]

Personal Digital Safety

We have contracted with Tall Poppy for up to 20 moderators to gain access to a range of personal safety tools, including live support during online harassment and doxxing attacks. We are scheduling the first onboarding and hope to offer this to many more in the coming months.

If you’d like to be notified when we open up to more applicants, let us know using this form.

EU Digital Services Act

https://about.iftas.org/wp-content/uploads/2024/04/DSA-Guide-Decentralised-Servers.pdf

Working with the great folks at Tremau, we launched the first of our regulatory guidance materials: an easy-to-read guide that lets Fediverse administrators review their exposure to the DSA and offers practical guidance on working toward compliance.

Download the DSA Guide for Decentralised Servers.

Composable Moderation

We’ve installed and experimented with Ozone, Bluesky’s open-source moderation tool. As part of this activity we’ve set up a labeller account on Bluesky; we’re not actively moderating anything (yet), but we are looking into whether and how we can support labelling on the network. Our Bluesky links:

Moderator Templates

Lillian and Jon have been conducting a survey of moderators to help guide the production of a number of templates for moderation teams, including a moderator agreement and code of conduct. Longer-term this will become a handbook for hands-on moderator activities.

If you’d like to add your insight and feedback, fill out this brief questionnaire.

Charitable Status

IFTAS is now a recognised 501(c)(3) organisation, which means we can accept tax-deductible donations to support our work. Everything we do is free of charge, and we need your support to keep the work moving forward!

We have a number of ways for you to support the mission:

If you’re considering making a contribution, your employer may have matching funds available! We are registered with Benevity, so check with your company’s giving program to see if you can double your contribution!

Spoiler Alert

We are slowly opening the doors to our collaboration portal “IFTAS Connect”.

We’ve issued invitations to our Needs Assessment participants, and will be opening up more broadly in late April. IFTAS Connect is a community of practice for moderators, community managers and researchers new and old to come together, share what works, seek help, and get guides and resources for their day to day work.

Stay tuned for the official announcement!

[IFTAS Connect Groups feature]

What’s Next?

A few of the items on the todo list…

  • Signals sharing – we will soon be convening a group to begin the work of classifying shareable information to strengthen the Fediverse’s defenses against spam, disinformation and more, using an ISAC-like format. Email us for more info.
  • Additional regulatory guidance for administrators: GDPR, the UK’s Online Safety Act, and more.
  • Moderator wellness and resilience education

Thanks for reading, to stay on top of our activities please join our newsletter.

https://about.iftas.org/2024/04/23/spring-2023-update/

about.iftas.org, to space

https://about.iftas.org/wp-content/uploads/2024/04/DSA-Guide-Decentralised-Servers.pdf

IFTAS is happy to announce the public availability of our DSA Guide for Decentralized Services – a practical guide for small and micro services that are subject to the EU’s Digital Services Act.

Developed in collaboration with the great people at Tremau, our DSA Guide is designed to help independent social media service providers navigate these complex regulations and achieve compliance with these new rules without compromising the unique qualities of federated, open social networks.

As part of our Needs Assessment activities, we’ve heard a repeated need for help understanding the complex regulatory landscape that decentralized services need to consider, and this DSA Guide is the first of many in our plan to provide clear, actionable guidance to a range of regulations for the community.

As of February 2024, all online services and digital platforms that offer services in the European Union are required to be fully compliant with the DSA. If your server has member accounts in the EU, or is publicly viewable in the EU, your service is most likely impacted by this regulation.

However, various portions of the DSA are not applicable to “small and micro” services, and this guide will show you clearly which parts apply and which do not.
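The “small and micro” cut-off comes from the EU’s SME definition (Recommendation 2003/361). A rough, illustrative check (not legal advice; the full definition also counts balance-sheet totals and linked enterprises):

```python
def enterprise_category(headcount: int, annual_turnover_eur: float) -> str:
    """Roughly classify a service per EU Recommendation 2003/361,
    which the DSA uses to scope its small/micro exemptions.

    This checks headcount and turnover only; treat it as a first
    approximation, not a compliance determination.
    """
    if headcount < 10 and annual_turnover_eur <= 2_000_000:
        return "micro"
    if headcount < 50 and annual_turnover_eur <= 10_000_000:
        return "small"
    return "neither"

# Most volunteer-run Fediverse servers fall well inside "micro".
print(enterprise_category(2, 0))  # micro
```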

For administrators of platforms like Mastodon, PeerTube, and Pixelfed, the DSA Guide can help demystify the requirements and offer practical advice on achieving compliance. The more than 27,000 independent operators of these and other decentralized social media services may otherwise be unable to obtain the guidance and advice that larger operations can afford to invest in.

Download the DSA Guide for Decentralized Fediverse Services.

To join the discussion, visit our community chat service at https://matrix.to/#/#space:matrix.iftas.org or stay tuned to join our community portal in the coming weeks!

https://about.iftas.org/2024/04/09/dsa-guide-for-the-fediverse/

about.iftas.org, to space

Please note, this post will be a living document that will be updated over time as new information becomes available.

Over the past several weeks, IFTAS has fielded an increasing number of inquiries about the implications of Threads – the microblogging platform from Instagram, a Meta platform – enabling ActivityPub and testing their connectivity with the Fediverse.

Server admins and moderator teams are grappling with the decision and trying to understand the impact of allowing their service to interact with Threads, and thereby with Meta’s network and data infrastructure.

IFTAS has solicited a list of questions from the community, which has been sent to the Threads team. If we get replies, we will post them here. We will continue to collect your questions for the foreseeable future.

IFTAS will remain steadfast in its mission to support the moderators of federated social media services. If a large number of threads.net accounts opt in to federating their content, this will increase both the volume of incoming content that may break your terms of service, leading to an increase in local reports, and the number of accounts able to view your members’ content, leading to an increase in remote reports if your members’ content is deemed objectionable.

Before federating with Threads, you may want to review the Instagram Community Guidelines (Threads is an Instagram product) to see how your members’ content measures against them. Federating with Threads may expose you to compliance issues you have not previously been concerned with, as Threads is a US corporation with strict compliance requirements regarding subject matter commonly found on the Fediverse, including intellectual property concerns, sexually explicit content, and sex work. Threads users can report any content they find that meets their definition of spam, nudity or sexual activity, hate speech or symbols, violence or dangerous organizations, bullying or harassment, selling illegal or regulated goods, intellectual property violations, suicide or self-injury, eating disorders, scams or fraud, and false information.

According to the GLAAD Social Media Safety Index, Instagram, Threads’ parent, has a 63% SMSI score for safety. While Instagram scores the highest of all the rated platforms, you should note that Instagram will allow accounts on their service that many would choose to block. We are unaware of any shared lists of such accounts on Threads, but if we become aware of such a list we may link to it here. Online hate leads to offline violence, which leads to yet more online hate, and all hate and harassment should be reported to the relevant platform, no matter the source.

If you wish to completely shield your members from interacting with Threads, be aware that defederating threads.net stops content coming in, but not necessarily going out. Followers of your members may boost or repost content to their followers, which in turn may be threads.net accounts. Mastodon offers an Authorized Fetch option (see “Configuring your environment” in the Mastodon documentation) which will completely remove the ability for Threads to gather content from your service. Other platforms may have similar options, and you should pose this question to the relevant developer team.
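Conceptually, Authorized Fetch works by refusing unsigned fetches, so the server knows who is asking and can apply its domain blocks to the verified requester. A toy illustration (not Mastodon’s code; a real server verifies the HTTP Signature against the requester’s published public key):

```python
def allow_fetch(headers: dict[str, str], secure_mode: bool) -> bool:
    """Gate an incoming fetch of a post or profile.

    In open mode anyone may fetch public objects anonymously. In secure
    mode (Authorized Fetch) the request must carry an HTTP Signature;
    this toy check only tests for the header's presence, whereas a real
    server verifies the signature and then checks the signer's domain
    against its blocklist.
    """
    if not secure_mode:
        return True
    return "Signature" in headers

# The header value shown is a truncated illustration of the format.
signed = {"Signature": 'keyId="https://example.social/actor#main-key",...'}
print(allow_fetch(signed, secure_mode=True))  # True
print(allow_fetch({}, secure_mode=True))      # False
```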

You should also be aware of the Threads Supplemental Privacy Policy. This document describes the data Instagram will collect from your users if they interact with Threads, and the intent to service privacy requests, notably:

What information do we collect?

[…]

Information From Third Party Services and Users: We collect information about the Third Party Services and Third Party Users who interact with Threads. If you interact with Threads through a Third Party Service (such as by following Threads users, interacting with Threads content, or by allowing Threads users to follow you or interact with your content), we collect information about your third-party account and profile (such as your username, profile picture, and the name and IP address of the Third Party Service on which you are registered), your content (such as when you allow Threads users to follow, like, reshare, or have mentions in your posts), and your interactions (such as when you follow, like, reshare, or have mentions in Threads posts).

(IFTAS note, this is the same information most ActivityPub servers will collect if a user interacts)

and:

How can you manage or delete your information and exercise your rights?

[…]

If you are a Third Party User, our ability to verify your request may be limited and we may be unable to process your request. Please note, however, that the interoperable protocol allows Third Party Services to automatically send Threads requests for deletion of individual posts when those posts are deleted on the Third Party Service. We make reasonable efforts to honor such requests when we receive them. Contact your Third Party Service to learn more.

(https://help.instagram.com/515230437301944?helpref=faq_content, retrieved 2023-12-16)

Below is the initial set of questions submitted to the Threads team; as we learn more, we will update this page.

Questions

  • If a Fediverse user reports content from threads.net to their service provider and chooses to notify the source server, how does Threads receive it? Can Threads receive it?
  • If a Threads user reports content from a third party to Threads Trust and Safety, is that report forwarded to the third party moderation workflow?
  • How will Threads observe and effect user-to-user blocks that involve a third party?
  • If a third party service publicly defederates Threads in a fashion Threads can discern, will Threads reflect that defederation and not ingest posts or profiles from that service?
  • Will Threads take an “allowlist” approach, only federating with approved instances; or a “denylist” approach, federating with all instances by default unless they are explicitly blocked? Will any such lists – if they exist – be public?
  • How will Threads safeguard against federating with known bad actors in the existing ActivityPub space, thereby exposing Threads users to accounts and servers that are widely defederated by the community at large?
  • Will Threads require instances that federate with it to adhere to Threads-defined moderation standards? If yes, will Threads publish these standards?

To submit a question for consideration, use this document: https://cryptpad.fr/pad/#/2/pad/edit/6IxyBdggAi+7+bDOCh2AAT+t/

To discuss this issue with IFTAS and the IFTAS community, join our Matrix chat: https://chat.iftas.org/#/room/#space:matrix.iftas.org

Helpful Links

https://about.iftas.org/2023/12/20/moderating-the-fediverse-threads-from-instagram/

about.iftas.org, to random

IFTAS intends to provide guidance, and operate or facilitate services, to support electronic service providers (ESPs) who require assistance mitigating child sexual abuse material (CSAM) on their services.

Motivation

IFTAS serves the independent social media trust and safety community, and is driven in large part by the community Needs Assessment.

Support for CSAM issues is consistently ranked as one of the most requested needs, and as such IFTAS is seeking to mitigate the legal exposure and personal trauma faced by ESPs and content moderators who are tasked with moderating CSAM.

Regulatory compliance requires ESPs to either actively scan for, or respond to reports of CSAM on their service. Understanding the regulatory requirements is confusing and jurisdictionally complex. Detection solutions can be costly, technically difficult to implement, and pose an additional regulatory burden. Moderating CSAM can be traumatic. The various bodies engaged in child safety are not open to working with thousands of ActivityPub service providers.

IFTAS wishes to:

  1. Promote a healthier, safer Internet;
  2. Reduce the regulatory burden and legal exposure for ESPs;
  3. Minimise harm to content moderators;
  4. Provide or facilitate the use of CSAM classification services while preserving privacy and security to the fullest extent possible;
  5. Reduce duplicative effort;
  6. Serve as a trusted voice for this issue in the open social web.

IFTAS Activities and Services

IFTAS intends to make various resources available, including but not limited to the following:

Content moderator trauma support

Moderators exposed to CSAM via their moderation workflows have expressed the need for post-trauma support. Working with Middlesex University London’s Centre for Abuse and Trauma Studies, IFTAS is reviewing self-help materials and trauma-mitigation guidance, to be made available in the forthcoming IFTAS community library.

Legal and regulatory guidance

While we have published some guidance already (https://github.com/iftas-org/resources/tree/main/CSAM-CSE), IFTAS plans to consult with domain experts in relevant jurisdictions to provide guidance for ESPs, updating routinely to ensure accurate, actionable guidance from trustworthy sources.

IFTAS Media classification

Safer is an IFTAS-hosted enterprise deployment that performs hash-matching on images and videos securely transmitted from opted-in services to IFTAS for classification, and creates an automatic report to NCMEC if required. Shield is a hash-matching API from the Canadian Centre for Child Protection that can be called to examine locally-hosted media (images and video) and provide a classification. 3-is is a similar service oriented to EU hosts. IFTAS is exploring methods to facilitate access to these services.

https://safer.io/, https://projectarachnid.ca/en/#shield, https://www.3-is.eu/

fedi-safety

fedi-safety is an open-source CLIP interrogation tool that can help classify images. IFTAS is exploring methods to facilitate the use of fedi-safety locally, or as a third-party service.

Known Hashtags

IFTAS plans to provide service administrators with a rolling list of known hashtags in use by sellers and sharers of CSAM, to support local service moderation decisions.
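Consuming such a list server-side could be as simple as a case-insensitive membership check (the function, tag, and list contents below are illustrative, not an IFTAS deliverable):

```python
def flag_post(post_tags: list[str], watchlist: set[str]) -> bool:
    """Flag a post whose hashtags appear on the rolling watchlist.

    Tags are compared case-insensitively with any leading '#' stripped,
    since platforms normalise hashtags differently.
    """
    return any(tag.lstrip("#").lower() in watchlist for tag in post_tags)

watchlist = {"examplebadtag"}  # hypothetical entry
print(flag_post(["#ExampleBadTag", "#art"], watchlist))  # True
print(flag_post(["#art"], watchlist))                    # False
```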

Known Hosts

IFTAS plans to provide service administrators with a rolling list of services seen to host CSAM with no intent or ability to moderate the content, to support local service moderation decisions.

Best Practice

IFTAS is consulting with child safety experts including INHOPE, Arachnid, NCMEC, End Violence Against Children and others to source and share best practices for moderation workflow enhancements that minimise harm for moderators likely to be exposed to CSAM, for example blurring images, using monochrome, and using a dedicated browser profile for this work.

Reference Material

https://about.iftas.org/2023/12/13/iftas-csam-roadmap/

about.iftas.org, to random

[Three cartoon figures completing a jigsaw puzzle of a lightbulb]

The IFTAS Moderator Advisory Council is the community body that reviews the Moderator Needs Assessment results, weighs in on projects and spending, and directly reviews activities to ensure they are fit for purpose and meet the needs of the moderator community, steering the work of IFTAS.

Because participation in such groups is often dominated by people with the privilege of being able to volunteer uncompensated time, IFTAS offers a stipend of $260 per month to help ensure a broad representation of voices. Part of our non-profit mission is to support the uncompensated labour provided by the hardworking moderator community, and this is a small first step in that direction.

Participation includes one video meeting per month, and we ask that you commit to two hours each week commenting on proposals, reviewing activities, or suggesting new services. To get a sense of what we’re working on, and the kinds of things we’d like your feedback on, take a look at our Activity Board.

While anyone with relevant experience can apply, priority is given to active moderators working in ActivityPub platforms.

Our current panel has been an invaluable resource for feedback on our planned activities, and as we begin working to respond to the most recent Needs Assessment, we’d like to add more members to the group to ensure we are receiving feedback from a diverse, representative group of active moderators.

If you’d like to apply, please fill out this form: Moderator Council Application

Applications will be reviewed by December 31, and announcements made in early January 2024.

If you’d like further details, or have questions, please email Jaz.

https://about.iftas.org/2023/12/07/open-call-moderator-council-members/
