The Act could give the UK government the power to access, collect, and read anyone’s private conversations at any time. The UK Government has previously admitted that the proposals are "technically unfeasible," and we hope @Ofcom
keeps this front of mind during the implementation process. (1/3)
⚠️ Age verification is risky but Ofcom wants to impose it on web users ⚠️
"The potential consequences of data being leaked are catastrophic and could include blackmail, fraud, relationship damage, and the outing of people's sexual preferences in very vulnerable circumstances."
This grab bag of half-baked fantasy solutions to misunderstood (or misrepresented) problems has received Royal Assent, including powers to break #encryption in messaging apps and censor content before it's even posted.
Scrutiny over how Ofcom implements the law and how the government exercises its powers is critical now that the threats to #privacy and #freedomofexpression have become law.
The European Court of Human Rights (ECtHR) emphasises that encryption contributes to ensuring the enjoyment of privacy and other fundamental rights, such as freedom of expression. Case of Podchasov v. Russia (no. 33696/19).
(Given the hedging in Art 8 about "national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals," I wasn't sure until now.)
Over 80 civil society groups and experts joined us to oppose the spy clause that breaks #e2ee.
"These measures will embolden hostile and abusive regimes who will be only too pleased to use the UK as an excuse to monitor the private messages of their citizens."
"What I am concerned about is that [solutions] seem not to go far enough [eg] the UK's #OnlineSafetyAct ... was catalysed through very real concerns but [doesn’t] look at the [big tech] business model. They take as a given these mass social platforms. And the solutions often look like extending surveillance and control to government ... not looking at how we attack the surveillance business model that is at the heart of [big tech]" - @Mer__edith
It's "very concerning that Ofcom is solely relying upon data protection laws and the Information Commissioner's Office" to protect privacy when using age verification on sites with porn content.
We need specific and clear privacy rules, given that loads of sensitive data will be processed.
Age verification forces a choice between "freedom of expression by not accessing content" or "increased security risks that will arise from data breaches and phishing sites"
ORG warns that Ofcom (UK) proposals could create new opportunities for fraudsters to scam people into providing identification and payment information.
Last week we published our response to Ofcom's Online Safety Act (UK) consultation.
We've raised concerns about the threat to free expression in requirements to proactively screen users' social media content and measures that undermine end-to-end encryption.
“Adults will be faced with a choice: either limit their freedom of expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites [by having to do age assurance]."
Educational and help material, especially where it relates to sexuality, gender identity, drugs and other sensitive topics, may be denied to young people by moderation systems.
Risks to children will continue with these measures. Regulators need to shift their approach to one that empowers children to understand the risks they may face.
"Were the UK government facing serious and widespread civil unrest, and the prospect of a popular uprising, would its "secret" technologies be anything to worry about? How would it be possible to organise effective resistance against a close-to-omniscient state adversary?"
Right now I'm all in on #vdash, but that will mature before long, so today I wondered: what next?
I wondered: how can we stop small businesses and self-hosters from shutting down services like #Mastodon because they can't meet the unrealistic moderation requirements of UK law?
I think we can do this with #SafeNetwork or another #p2p platform.
My ideas are very raw and won't fit in a few toots, so not for now.
In July, we published a legal opinion finding “real and significant issues” on the lawfulness of a prior restraint #censorship clause in the #OnlineSafetyAct.
It warned of “a sea change" for #freedomofexpression in the UK, as platforms must screen content and prevent users from seeing anything deemed illegal.
The opinion found there is “likely to be significant interference with freedom of expression that is unforeseeable”.
Having had time to digest this 🤣 I wonder if there's any light you can shed on what would allow a no-server/no-owner Mastodon-like service to operate without being affected by the Act. I'm wondering how to build a p2p protocol that avoids being in scope, but I don't know what criteria bring a service into scope, where the lines are, the exceptions etc. 🤷‍♂️
Starting to think about a #p2p social media protocol that doesn't fall foul of the UK's ridiculously broad #OnlineSafetyAct, and leaves what you see in your own hands.
Feed curation as active or effortless as you like, using whatever approach you choose.
So a protocol that provides the basis for user respecting apps with different approaches.
Goal: curation with zero effort via support for 'algos' that serve you rather than the other way around.
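One way to picture the "algos that serve you" goal: ranking is just a pure function the user picks and runs locally, never something the network imposes. A minimal sketch, assuming entirely hypothetical names (`Post`, `FeedAlgo`, the two example algos) — nothing here is an existing protocol:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    author: str
    text: str
    timestamp: int   # seconds since epoch
    followed: bool   # does the reader follow this author?

# A feed "algo" is a pure, client-side function: posts in, ordering out.
# The protocol only delivers posts; curation stays in the user's hands.
FeedAlgo = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    """Newest first — the zero-effort default."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def follows_first(posts: List[Post]) -> List[Post]:
    """Followed authors first, newest first within each group."""
    return sorted(posts, key=lambda p: (not p.followed, -p.timestamp))

def render_feed(posts: List[Post], algo: FeedAlgo) -> List[str]:
    """Apply whichever algo the user chose and format the result."""
    return [f"{p.author}: {p.text}" for p in algo(posts)]
```

Because an algo is just a swappable function, different apps on the same protocol can ship different defaults — or let users write their own — without any server-side ranking at all.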
As warned, the UK #OnlineSafetyAct is forcing communities to shut down services such as Mastodon for UK users.
The draconian bill was deliberately targeted at individuals, communities and small businesses, none of which can afford moderation costs or legal risks.
How long before the UK has nothing but corporate services that exploit rather than serve users?
This terrible act hits anything self or community hosted: social media, blogs, git services like #Codeberg and of course #Mastodon.
Enabling back doors (because there would be more than one) in cryptographic systems is not the thin end of the wedge; it is the destruction of that method of cryptography.
ORG urges Ofcom to make it clear that companies must ensure human rights and due process considerations are accounted for through all stages of the moderation process.