#FacialRecognition is used by the police to try to match people to watch lists, but its error rate is unacceptable, particularly for younger people and people with darker skin.
See Big Brother Watch's briefing on facial recognition surveillance.
The police used a Beyonce gig as an excuse to deploy live facial recognition.
The creeping use of this tech is alarming, especially with concerns about its accuracy and efficacy. We should be able to expect privacy and not be ID’d without cause.
Would you put up with being fingerprinted as you go to the shops? Or give DNA as you go to work? Or have your face compared to a database just for going to a gig?
The European Parliament's decision to ban AI systems used for biometric mass surveillance in public spaces is welcome. It should be mirrored in the UK.
We must protect civil liberties from the encroachment of the state that’s leading us down the road of predictive policing.
"Clearview AI, the US startup that's attracted notoriety in recent years for a massive privacy violation after it scraped selfies off the Internet and used people's data to build a facial recognition tool it pitched to law enforcement and others, has been hit with another fine in France over non-cooperation with the data protection regulator"
— @TechCrunch
If there isn't an app that distorts your selfies a tiny little bit (just enough to confuse the face recognition algorithm), can somebody please develop one?
Just like the tools that remove metadata from pictures before uploading...
The best antisurveillance is probably to flood the web with distorted portraits of yourself.
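The idea above can be sketched in a few lines. This is a toy illustration only, assuming an image represented as rows of RGB tuples: it adds small bounded random noise to every pixel, which is barely visible to a person but shifts the values a recognition pipeline reads. (Real "cloaking" tools such as Fawkes compute targeted adversarial perturbations rather than plain noise, which is far more effective against modern models.)

```python
import random

def perturb_image(pixels, max_delta=8, seed=None):
    """Return a copy of the image with small bounded noise on each channel.

    `pixels` is a list of rows, each row a list of (r, g, b) tuples.
    Each channel moves by at most `max_delta` and stays within 0-255.
    Note: plain random noise is only a demonstration of the concept,
    not a robust defence against face recognition.
    """
    rng = random.Random(seed)
    out = []
    for row in pixels:
        new_row = []
        for r, g, b in row:
            new_row.append(tuple(
                min(255, max(0, c + rng.randint(-max_delta, max_delta)))
                for c in (r, g, b)
            ))
        out.append(new_row)
    return out

# Tiny 2x2 "image" as a stand-in for real pixel data
img = [[(120, 80, 200), (0, 255, 10)],
       [(255, 0, 0), (33, 33, 33)]]
cloaked = perturb_image(img, max_delta=8, seed=42)
```

In a real app you would load pixels from a JPEG (e.g. with Pillow), perturb them, and strip the EXIF metadata on save.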
I have to admit that my mother gave me this idea. She doesn't even know how to use her smartphone properly, but she is my countersurveillance hero:
She collects customer cards from every supermarket, then swaps them with all kinds of other people to scramble their customer-preference algorithms 🤣
Today New York City Council members and community members stood firmly against the abuses of billionaires like James Dolan and aggressive landlords who want to use #FacialRecognition technology to strip away our rights.
Just watched a video from a large international company proposing a system for K-12 schools using #FacialRecognition to allow students & staff access, to detect former students on campus, to block access by non-custodial parents & sex offenders, and more. All using AI-based facial recognition.
Leaving aside issues of accuracy, just think about the database of personal information behind that. Then think about this:
More stuff in the #Signal chat app that doesn't work without proprietary software (unless they've replaced these non-free dependencies in the last 3 years):
"...maps and automatic facial recognition don’t work; there’s a separate patch for OSM support 110 which the Signal developers were not keen on merging. You will simply have to survive without Google’s binary ML-Kit facial recognition for now"
#BWI #airport will lessen your hassle of getting on a plane in return for your agreeing (you actually have no choice) to #facialrecognition - they scan your license into the db to "verify" your #identity - which is BS. The claim is that they do not keep this profile, but of course they do - they profile at most airports in the US, and the data goes upstream. The #surveillancesociety has been here for a very long time, folks.
Madison Square Garden used facial recognition to identify and stop a mom from attending a Christmas show with her kid because she's an attorney at a firm that is engaged in litigation with them.
This is why it's not enough to just ban government and law enforcement use of #facialrecognition. There are so many ways private companies and even individuals can abuse #biometric #surveillance tech.
A little-known multi-agency drug war group runs thousands of #facialrecognition scans and other #surveillance operations. We found over 37,000 requests for support, connecting at least 233 #lawenforcement organizations in #Minnesota and beyond.
#HIDTA operations, like the North Central HIDTA, exist across the United States with little public scrutiny. Now, the dataset Unicorn Riot is making public shows over 1,600 #facialrecognition searches in 1,395 days for investigations including “property crimes” and others.
The log of #facialrecognition searches spans roughly 1,400 days, from January 2019 to late October 2022. The log indicates investigators ran 1,677 facial recognition searches since Jan. 2019.
Of those searches, 921, or about 55%, were not associated with any case numbers.
Unicorn Riot obtained data detailing the names of law enforcement organizations that requested investigative support from the ISC; the dates and types of cases in which the center provided real-time support through the RTAC; spreadsheets logging facial recognition searches; and more. #facialrecognition
A Look Behind The Curtain – Facial Recognition in Minnesota
Before UR obtained these documents, an in-depth understanding of how local authorities have used #facialrecognition was not available. But now, with thousands of cases to observe, it’s possible to illustrate the extent.
On Feb. 12, 2021, Minneapolis banned its police department from using #facialrecognition for surveillance, but that didn’t stop the law enforcement agency from using the controversial technology.