Age verification forces a choice: limit your freedom of expression by not accessing content, or accept "increased security risks that will arise from data breaches and phishing sites".
ORG warns that Ofcom (UK) proposals could create new opportunities for fraudsters to scam people into providing identification and payment information.
“Adults will be faced with a choice: either limit their freedom of expression by not accessing content, or expose themselves to increased security risks that will arise from data breaches and phishing sites [by having to do age assurance]."
Educational and help material, especially where it relates to sexuality, gender identity, drugs and other sensitive topics, may be denied to young people by moderation systems.
Risks to children will persist despite these measures. Regulators need to shift their approach to one that empowers children to understand the risks they may face.
Last week we published our response to Ofcom's Online Safety Act (UK) consultation.
We've raised concerns about the threat to free expression in requirements to proactively screen users' social media content and measures that undermine end-to-end encryption.
ORG urges Ofcom to make it clear that companies must ensure human rights and due process considerations are accounted for through all stages of the moderation process.
"What I am concerned about is that [solutions] seem not to go far enough [eg] the UK's #OnlineSafetyAct ... was catalysed through very real concerns but [doesn’t] look at the [big tech] business model. They take as a given these mass social platforms. And the solutions often look like extending surveillance and control to government ... not looking at how we attack the surveillance business model that is at the heart of [big tech]" - @Mer__edith
The European Court of Human Rights (ECtHR) emphasises that encryption contributes to ensuring the enjoyment of privacy and other fundamental rights, such as freedom of expression. Case of Podchasov v. Russia (no. 33696/19).
(Given the hedging in Art 8 about "national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals," I wasn't sure until now.)
Over 80 civil society groups and experts joined us to oppose the spy clause that breaks #e2ee.
"These measures will embolden hostile and abusive regimes who will be only too pleased to use the UK as an excuse to monitor the private messages of their citizens."
In July, we published a legal opinion finding “real and significant issues” with the lawfulness of a prior restraint #censorship clause in the #OnlineSafetyAct.
It warned of “a sea change" for #freedomofexpression in the UK, as platforms must screen content and prevent users from seeing anything deemed illegal.
The opinion found there is “likely to be significant interference with freedom of expression that is unforeseeable”.
It's "very concerning that Ofcom is solely relying upon data protection laws and the Information Commissioner's Office" to protect privacy when using age verification on sites with porn content.
We need specific and clear privacy rules, given that loads of sensitive data will be processed.
ORG calls on Ofcom to go further in setting out clearer standards and guidelines.
This is necessary to protect users’ data from the substantially increased risk of fraud and cybercrime that comes with invasive age verification technologies.
⚠️ Age verification is risky, but Ofcom wants to impose it on web users ⚠️
"The potential consequences of data being leaked are catastrophic and could include blackmail, fraud, relationship damage, and the outing of people's sexual preferences in very vulnerable circumstances."
Starting to think about a #p2p social media protocol that doesn't fall foul of the UK's ridiculously broad #OnlineSafetyAct, and leaves what you see in your own hands.
Feed curation as active or effortless as you like, using whatever approach you choose.
So a protocol that provides the basis for user respecting apps with different approaches.
Goal: curation with zero effort via support for 'algos' that serve you rather than the other way around.
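To make the "algos that serve you" idea concrete: curation could simply be a user-supplied scoring function that the client runs locally over fetched posts, so no server ever decides what you see. A minimal sketch, with hypothetical names (`Post`, `curate`, `topic_boost`) that aren't part of any existing protocol:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    timestamp: int                      # seconds since epoch
    tags: list = field(default_factory=list)

def chronological(post: Post) -> float:
    # The simplest "algo": newest first, no engagement signals at all.
    return float(post.timestamp)

def topic_boost(followed_tags: set):
    # A user-tweaked algo: recency, plus a boost for tags the user follows.
    def score(post: Post) -> float:
        boost = 1000.0 * len(followed_tags & set(post.tags))
        return post.timestamp + boost
    return score

def curate(posts, algo):
    # Curation happens entirely on the user's own device:
    # swap in any scoring function you like, effortful or effortless.
    return sorted(posts, key=algo, reverse=True)
```

Swapping `chronological` for `topic_boost({"p2p"})` reorders the same feed without touching any server — which is the point: the ranking logic lives with the user, not the platform.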
#UK #Censorship #OnlineSafetyAct #Privacy: "Free-speech campaigners are broadly unhappy with the final Online Safety Act. Barbora Bukovská, senior director for law and policy at the human rights organisation Article 19, says that while the final version has been “slightly improved” by the removal of “legal but harmful”, the act remains an “extremely complex and incoherent piece of legislation that will undermine freedom of expression and information, [and] the right to privacy”. It will be “ineffective” in making the internet safer, she adds."
Regardless, we are going to see a lot more instances of businesses and individuals shutting down services that allow public posting, which of course includes any UK-based and even UK-facing Mastodon servers.
#Ofcom recently published its proposals for implementation, for consultation. They are >1,500 pages long.
How can anyone but a corporation understand them, let alone comply? It'll be a clusterfuck, as warned.
Having had time to digest this 🤣 I wonder if there's any light you can shed on what would allow a no-server/no-owner Mastodon-like service to operate without being affected by the Act. I'm wondering how to build a p2p protocol to avoid being in scope, but don't know the catch criteria, where the lines are, exceptions etc. 🤷‍♂️