RT @mathver
Telegram has appointed a Belgian company as its legal representative for the purposes of Article 13 of the Digital Services Act, making Belgium's @BIPT_IBPT the regulator for Telegram, which under the #DSA qualifies as an online platform (not a VLOP... yet) https://www.standaard.be/cnt/dmf20240505_96985357
Hey #PnPDE community, what in your view are the worst gatekeeping phrases and behaviours in our #Rollenspiel world? Which behaviour or remarks keep newcomers from feeling welcome?
Interesting that an institution is only now catching on to this. Some people here have known it for years.
Don't know either, but this is nothing new. Meta has always been a hellhole; anyone hanging around there is simply living in the dark ages when it comes to content moderation, reporting options, and the sanctioning of accounts/posts.
When Frances Haugen testified before the EU Parliament, they could already have learned everything they needed. No separate investigation was required. Meta couldn't care less about its users' well-being; all they care about is profit. 🤷♀️
#AI #Algorithms #DSA #OSA #AlgorithmicAudits #Law #PoliticalEconomy: "Accepted in the Proceedings of the 2024 ACM Conference on Fairness, Accountability and Transparency. For almost a decade now, scholarship in and beyond the ACM FAccT community has been focusing on novel and innovative ways and methodologies to audit the functioning of algorithmic systems. Over the years, this research idea and technical project has matured enough to become a regulatory mandate. Today, the Digital Services Act (DSA) and the Online Safety Act (OSA) have established the framework within which technology corporations and (traditional) auditors will develop the ‘practice’ of algorithmic auditing thereby presaging how this ‘ecosystem’ will develop. In this paper, we systematically review the auditing provisions in the DSA and the OSA in light of observations from the emerging industry of algorithmic auditing. Who is likely to occupy this space? What are some political and ethical tensions that are likely to arise? How are the mandates of ‘independent auditing’ or ‘the evaluation of the societal context of an algorithmic function’ likely to play out in practice? By shaping the picture of the emerging political economy of algorithmic auditing, we draw attention to strategies and cultures of traditional auditors that risk eroding important regulatory pillars of the DSA and the OSA. Importantly, we warn that ambitious research ideas and technical projects of/for algorithmic auditing may end up crashed by the standardising grip of traditional auditors and/or diluted within a complex web of (sub-)contractual arrangements, diverse portfolios, and tight timelines."
Are you aware of harmful practices by companies under the Digital Services Act and Digital Markets Act?
We've launched whistleblower tools to report violations of obligations by Very Large Online Platforms, Search Engines and gatekeepers under the #DSA and #DMA.
EU Commission opens proceedings against the Meta group
The EU Commission is putting Meta through its paces: the Facebook parent company is suspected of not doing enough to counter disinformation on its platforms. The Commission has therefore opened formal proceedings.
The first #DSA #pnpde solo adventure from 1984 can now also be downloaded free of charge as a short computer game. In my view a lovely idea, and once again a successful project from the fan scene.
This is why we have conducted a stress test to ensure the readiness of Very Large Online Platforms to counter potential manipulation and interference using the tools provided by the Digital Services Act.
It addressed multiple scenarios, including:
⚠ Information manipulation via deepfakes and other AI uses
⚠ Attempts to suppress voices
⚠ Intentional spread of false information
⚠ Online incitement to violence
We’ve had a tremendously busy first quarter, too much to fit in a newsletter, so here’s the roundup of what’s been happening these past few months.
Content Classification System
This is the biggest project we have underway: build an opt-in, privacy-preserving CSAM detection and reporting system to help protect the Fediverse. We are halfway through our initial buildout, which will allow server operators to optionally send their media to IFTAS for hash and match detection using the Safer platform from Thorn. No media leaves IFTAS, and if we get a pertinent match we take care of the necessary reporting. This one’s a complex activity, but we are working our way through and hope to have more on this soon.
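Hash-and-match detection of the kind described above can be sketched in a few lines: media is reduced to a hash and compared against a list of known-bad hashes, so the matching side never needs the original file. This is a minimal, hypothetical illustration of the general technique, not IFTAS's or Thorn's actual API; real systems like Safer also use perceptual hashes that tolerate re-encoding, which a plain cryptographic hash does not.

```python
# Minimal sketch of hash-and-match detection (illustrative only).
# A cryptographic hash stands in for the perceptual hashes real
# systems use; the hash list here is a placeholder, not real data.
import hashlib

KNOWN_BAD_HASHES = {
    # In practice this list comes from a vetted provider, never from code.
    # This entry is the well-known SHA-256 of the empty byte string,
    # used purely so the example below has something to match.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_hex(data: bytes) -> str:
    """Exact-match hash; real deployments also use perceptual hashing."""
    return hashlib.sha256(data).hexdigest()

def check_media(data: bytes) -> bool:
    """Return True if the media matches a known-bad hash, so only a
    boolean verdict (plus any required report) leaves the service."""
    return sha256_hex(data) in KNOWN_BAD_HASHES

assert check_media(b"") is True            # matches the seeded hash
assert check_media(b"cat photo") is False  # unknown media: no match
```

The privacy property the paragraph describes falls out of the design: only hashes are compared, and a server operator learns nothing beyond match/no-match.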
We’ve set aside funds to pay active moderators a monthly stipend for their guidance and input on our activities, and we are extremely pleased to announce our initial cohort has been onboarded and the first payments went out for March. This group is tasked with reviewing our products and services, and ensuring a broad range of voices are heard throughout the process. You can review our Moderator Advisory Panel on the About Us page. Welcome to everyone who stepped up to help guide this work, and thank you for your participation!
FediCheck / CARIAD
FediCheck is our moderation-as-a-service domain federation app; it allows Mastodon servers to sign in and automatically update their domain blocks and retractions from a trusted list. For this iteration we are using our CARIAD list (an aggregation of the most-blocked domains) combined with our Do Not Interact list. Each domain is reviewed before inclusion, and the service is intended to give new administrators a kick start on their federation choices while keeping them safe from day-one harassment.
We’ve onboarded our first batch of beta testers and while we’ve got some kinks to iron out, the service is working well. We’ll keep adding more servers from the initial round of requests, and work our way toward making this a free, public service.
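The block-and-retraction sync that FediCheck performs boils down to a set difference between a trusted list and a server's current blocks. The sketch below shows that core diff under assumed inputs; it is not FediCheck's actual code, and applying the result on a real server would go through Mastodon's admin domain-blocks API rather than this standalone function.

```python
# Sketch of a trusted-list domain-block sync (assumed data, not
# FediCheck's real implementation): diff the aggregated list against
# the server's current blocks to find additions and retractions.

def diff_blocks(trusted: set[str], current: set[str]) -> tuple[set[str], set[str]]:
    """Return (domains to block, domains to retract)."""
    to_add = trusted - current      # on the trusted list, not yet blocked
    to_retract = current - trusted  # blocked locally, no longer listed
    return to_add, to_retract

# Hypothetical example: one new listing, one stale local block.
trusted_list = {"spam.example", "harassment.example"}
server_blocks = {"spam.example", "stale-entry.example"}

add, retract = diff_blocks(trusted_list, server_blocks)
assert add == {"harassment.example"}
assert retract == {"stale-entry.example"}
```

Computing retractions as well as additions is what lets the service undo a block when a domain is removed from the aggregated list, rather than only ever growing the denylist.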
We have contracted with Tall Poppy for up to 20 moderators to gain access to a range of personal safety tools, including live support during online harassment and doxxing attacks. We are scheduling the first onboarding and hope to offer this to many more in the coming months.
Working with the great folks at Tremau, we launched the first of our regulatory guidance materials: this easy-to-read guide to the DSA lets Fediverse administrators review their exposure to the DSA and offers practical guidance on working toward compliance.
We’ve installed and experimented with Ozone, the Bluesky open-source moderation tool. As part of this activity, we’ve set up a labeller account on Bluesky; we’re not actively moderating anything (yet), but we are looking into whether and how we can support labelling on the network. Our Bluesky links:
Lillian and Jon have been conducting a survey of moderators to help guide the production of a number of templates for moderation teams, including a moderator agreement and code of conduct. Longer-term this will become a handbook for hands-on moderator activities.
IFTAS is now a recognised 501(c)(3) organisation, which means we can accept tax-deductible donations to support our work. Everything we do is free of charge, and we need your support to keep the work moving forward!
We have a number of ways for you to support the mission:
If you’re considering making a contribution, your employer may have matching funds available! We are registered with Benevity, check with your company’s giving program to see if you can double your contribution!
Spoiler Alert
We are slowly opening the doors to our collaboration portal “IFTAS Connect”.
We’ve issued invitations to our Needs Assessment participants, and will be opening up more broadly in late April. IFTAS Connect is a community of practice where moderators, community managers, and researchers new and old can come together, share what works, seek help, and find guides and resources for their day-to-day work.
Signals sharing – we will soon be convening a group to begin the work of classifying shareable information to strengthen the Fediverse defenses against spam, disinformation and more, using an ISAC-like format. Email us for more info.
Additional regulatory guidance for administrators: the GDPR, the UK’s OSA, and more are on the list.
Moderator wellness and resilience education
Thanks for reading, to stay on top of our activities please join our newsletter.
⚠️ Opening of formal proceedings against TikTok under the Digital Services Act.
We are concerned that TikTok has launched the "Task and Reward Program" of TikTok Lite without conducting a diligent assessment of the risks it entails.
We suspect an infringement of the #DSA, and we consider that there are risks of serious damage to the mental health of users.
Therefore, we have communicated to TikTok our intention to suspend the TikTok Lite rewards programme in the EU.
Online platforms must be accountable for ensuring safer, more transparent online spaces.
As of next week, adult entertainment platforms Pornhub, Stripchat and Xvideos will have to comply with the most stringent obligations under the Digital Services Act, including:
Submitting risk assessment reports
Putting in place mitigation measures to address systemic risks
Complying with additional transparency obligations
The European Commission has added the largest pornographic platforms to the VLOP list: https://europa.eu/!mYM4vb
This will force them to map their risks and minimise them.
We were among the organisations that fought to bring these largest pornographic platforms under #DSA scrutiny as VLOPs. We explained why this matters on our website: