#Wise customers are now expected to consent to the retention of #biometric facial information, and its disclosure to partners, for up to a year simply to continue using their accounts, even where local financial regulations do not require it.
"...[Sara] says after her bag was searched she was... banned from all stores using the #Technology.
"I was just crying and crying the entire journey home… 'Oh, will my life be the same? I'm going to be looked at as a shoplifter when I've never stolen'.
"#Facewatch later wrote to Sara and acknowledged it had made an error..."
#UK#Privacy#Biometrics#FacialRecognition: "Silkie Carlo, director of Big Brother Watch, has filmed the police on numerous facial-recognition deployments. She was there the night Shaun Thompson was picked up by police.
"My experience, observing live facial recognition for many years, [is that] most members of the public don't really know what live facial recognition is," she says.
She says that anyone whose face is scanned is effectively part of a digital police line-up.
"If they trigger a match alert, then the police will come in, possibly detain them and question them and ask them to prove their innocence."
The use of facial recognition by the police is ramping up.
Between 2020 and 2022 the Metropolitan Police used live facial recognition nine times. The following year the figure was 23.
Already in 2024 it has been used 67 times, so the direction of travel is clear.
Proponents say that misidentifications are rare.
The Metropolitan Police say that around one in every 33,000 people who walk by its cameras is misidentified.
But the error rate is much higher once someone is actually flagged. One in 40 alerts so far this year has been a false positive."
Updated with a statement from Sen. Merkley, who proposed the amendment and was a part of the negotiation: "As I worked with other Senate negotiators to develop a compromise proposal governing TSA’s use of facial recognition, it became abundantly clear that the end goal for TSA is to make facial recognition mandatory for all American air travelers and that the current opt-out system will end."
#CyberSecurity#DataBreach#Biometrics#FacialRecognition#DataProtection#Australia: "Police and federal agencies are responding to a massive breach of personal data linked to a facial recognition scheme that was implemented in bars and clubs across Australia. The incident highlights emerging privacy concerns as AI-powered facial recognition becomes more widely used everywhere from shopping malls to sporting events.
The affected company is Australia-based Outabox, which also has offices in the United States and the Philippines. In response to the Covid-19 pandemic, Outabox debuted a facial recognition kiosk that scans visitors and checks their temperature. The kiosks can also be used to identify problem gamblers who enrolled in a self-exclusion initiative. This week, a website called “Have I Been Outaboxed” emerged, claiming to be set up by former Outabox developers in the Philippines. The website asks visitors to enter their name to check whether their information had been included in a database of Outabox data, which the site alleges had lax internal controls and was shared in an unsecured spreadsheet. It claims to have more than 1 million records.
The incident has rankled privacy experts who have long set off alarm bells over the creep of facial recognition systems in public spaces such as clubs and casinos."
If Congress is going to give the FAA $105 billion, they need to bake in a ban on dangerous and discriminatory #FacialRecognition tech that the TSA is saying they're bringing to 400+ airports nationwide.
Our survey of migrants, refugees and asylum seekers with Positive Action in Housing found that most people worry about the UK government sharing their data with third-party organisations.
@sylkegruhnwald spoke with criminal law professor Monika Simmler about the #Ethik and dangers of open source intelligence, i.e. the use of publicly available information for law enforcement. The discussion examines the problems of #Biometrie in depth.
Simmler takes the view that #Gesichtserkennung (facial recognition) by Swiss authorities is currently prohibited, since it is not explicitly permitted by law.
"A Māori mum misidentified as a trespassed 'thief' at a Rotorua supermarket trialling facial recognition technology says she felt 'racially discriminated' against and embarrassed during the 'horrible' birthday incident."
In an exclusive interview with The Times, the Met’s director of intelligence, Lindsey Chiswick, said that the tool had been a “game-changer”, triggering an arrest every two hours of alleged criminals including rapists, burglars and robbers since it was introduced last April.
A Whitehall source said that it had been so successful that the government was planning to make a policy statement setting out its facial recognition strategy in May or June.
#EU#Biometrics#EURODAC#Surveillance#FacialRecognition#AsylumReform: "On Wednesday (10 April), the EU is set to vote on a new set of asylum and migration reforms. Among the many controversial changes proposed in the new migration pact, one went almost unnoticed — a seemingly innocent reform of the EU's asylum database, EURODAC.
Although framed as purely technical adjustments, the reality is far more malicious. The changes to EURODAC will massively exacerbate violence against people on the move.
Reform of this 20-year-old database will make it the technological sword of the EU's hostile asylum and border policies. It will harness the most nefarious surveillance technologies that exist to date — namely the capture, processing and analysis of biometric data — and enable EU states to have full control over migrants' bodies and movements.
With the collection of biometrics, the body has already become a "passport" for many. Biometrics is the process of making data out of a person's biological or physiological characteristics. Fingerprints, facial images and iris scans are among the forms of biometrics most widely used by states to uniquely identify a person." https://euobserver.com/opinion/158292
FTC denies rating board's suggestion for age verification system
The Federal Trade Commission has denied a petition to allow companies to use facial age estimation (FAE) technology to obtain parental consent when collecting data from children under 13, as required under the Children's Online Privacy Protection Act (COPPA).
#UK#London#Surveillance#Biometrics#FacialRecognition: "The question of who, exactly, police want to talk to is what privacy watchdogs are concerned about. One 23-year-old man flagged by the system for possession of points and blades was later found to have six rounds of ammunition, stolen mobile phones, a large quantity of cannabis and a stolen Oyster card linked to a robbery in 2022. Other arrests were variously for assault, burglary, theft, pickpocketing, breaching court-imposed conditions, fraud, threatening behavior and obstructing a constable.
Calling these arrests “precision policing” is dubious at best, says Madeleine Stone, senior advocacy officer for Big Brother Watch.
“Rather than actively pursuing people who pose a risk to the public, police officers are relying on chance and hoping that wanted people happen to walk in front of a police camera.”
A news release on the Met police’s website says the facial recognition system “identifies people who are on a bespoke watchlist which can include those who are wanted, have outstanding arrest warrants as issued by the court, or to ensure a person is complying with their conditions.”
#Surveillance#Biometrics#FacialRecognition#Israel#Palestine#Gaza: "Israel is deploying a mass facial recognition program in Gaza, conducting surveillance of Palestinians without their knowledge or consent, according to a new report from The New York Times.
As the publisher reports, speaking to Israeli intelligence officers, military officials, and soldiers, the facial recognition program is run by the Israel Defense Forces (IDF)'s military Unit 8200, which is "collecting and cataloging the faces of Palestinians". The program reportedly uses technology from Corsight, an Israeli facial recognition company that provides services for government agencies, law enforcement, and corporations, alongside Google Photos.
The Times says this mass surveillance is being rolled out in Gaza to identify members of Hamas, following the Oct. 7 attacks. The Israeli military also set up checkpoints — along roads Palestinians are using to flee the war — with facial recognition cameras, and soldiers have used security camera footage and videos uploaded by Hamas on social media, and have also asked Palestinian prisoners to identify anyone affiliated with Hamas." https://mashable.com/article/israel-palestine-gaza-facial-recognition-program
#Russia#Surveillance#Biometrics#FacialRecognition#GigEconomy: "Now TBIJ, in partnership with Follow The Money and Paper Trail Media, can reveal that the technology used to repress dissent against Putin’s authoritarian regime is powered by unwitting gig workers in the global south. A sprawling global network supporting Russia’s surveillance regime draws in US investment firms, one of Russia’s biggest tech companies and two companies sanctioned for their alleged role in Putin’s oppression.
At the heart of it all is Toloka, a little-known tech platform that recruits the gig workers and raises questions about the effectiveness of EU sanctions. Before a recent restructure, all of Toloka was ultimately owned by Yandex, a Russian tech giant with major shareholders in the west.
#Surveillance#Biometrics#FacialRecognition: "New York Times reporter Kashmir Hill has been writing about the intersection of privacy and technology for well over a decade; her book about Clearview AI’s rise and practices was published last fall. She speaks with EFF’s Cindy Cohn and Jason Kelley about how face recognition technology’s rapid evolution may have outpaced ethics and regulations, and where we might go from here.
In this episode, you’ll learn about:
The difficulty of anticipating how information that you freely share might be used against you as technology advances.
How the all-consuming pursuit of “technical sweetness” — the alluring sensation of neatly and functionally solving a puzzle — can blind tech developers to the implications of that tech’s use.
The racial biases that were built into many face recognition technologies.