estelle, to science
@estelle@techhub.social

'My normative lesson is, “Heed Marginalized People.” Fundamentally and foundationally. And, like, don’t include them necessarily in your training data, but include them in the questions that you ask at the outset, and who you think to ask about what you ought to do.
[…]
So to ask that question, “Who have we not thought about; whose harms, whose needs, whose voice has been, perhaps speaking, but unheeded, for a very long time? And how do we ensure that the things that they have called out as potential sites of failure don’t go unremarked, don’t go unaddressed?”'

Dr. Damien P. Williams, @Wolven https://afutureworththinkingabout.com/?p=5442 @ethics @dataGovernance @data

sinabhfuil, to random

Why should the Gardaí have facial recognition tech when they're blocking traffic cameras that could police dangerous use of phones in cars, etc?
https://www.irishtimes.com/politics/2023/06/01/helen-mcentee-resumes-role-as-minister-for-justice-after-maternity-leave/

openrightsgroup, to random
@openrightsgroup@social.openrightsgroup.org

Three years ago, the murder of an innocent Black man by US police officers caused a global reckoning on race.

Today we face institutional racism in UK policing that's being hardwired in the tech they use.

Read more from @mssophiaakram ORG Policy Manager.

https://www.openrightsgroup.org/blog/george-floyds-murder-three-years-on-insitutional-racism-hardwired-in-police-tech/

openrightsgroup,
@openrightsgroup@social.openrightsgroup.org

Facial recognition is used by the police to try to match people to watch lists, but its error rate is unacceptable, particularly for younger people and people with darker skin.

See Big Brother Watch's briefing on facial recognition surveillance.

9/15

https://bigbrotherwatch.org.uk/wp-content/uploads/2020/06/Big-Brother-Watch-briefing-on-Facial-recognition-surveillance-June-2020.pdf

openrightsgroup, to random
@openrightsgroup@social.openrightsgroup.org

The police used a Beyoncé gig as an excuse to deploy live facial recognition.

The creeping use of this tech is alarming, especially with concerns about its accuracy and efficacy. We should be able to expect privacy and not be ID’d without cause.

Would you put up with being fingerprinted as you go to the shops? Or give DNA as you go to work? Or have your face compared to a database just for going to a gig?

Read our new blog. https://www.openrightsgroup.org/blog/dont-use-beyonce-to-normalise-live-facial-recognition/

mattburgess, to infosec

New figures from London's Met Police show that 67,000 faces were scanned by face recognition at the Coronation of King Charles.

There was a grand total of 2 (two) alerts matching people on its watchlist. One person was arrested; in the other incident, no action was taken.

The figures say there were no false alerts; the threshold for the system seems low.

Link to PDF: https://www.met.police.uk/SysSiteAssets/media/downloads/force-content/met/advice/lfr/new/lfr-deployment-grid-2023-v.3.1-web.pdf
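As a rough illustration of how the operator-chosen match threshold drives the alert count in a deployment like this (all scores and numbers below are invented for illustration, not the Met's real data or algorithm):

```python
# Toy model of a live facial recognition deployment: each scanned face
# gets a similarity score (0-1) against its closest watchlist entry.
# These scores are made up purely for illustration.
scores = [0.12, 0.34, 0.97, 0.55, 0.91, 0.08]

def alerts(scores, threshold):
    """A face triggers an alert when its best watchlist similarity
    score meets the operator-chosen threshold."""
    return [s for s in scores if s >= threshold]

# A stricter threshold yields fewer alerts (and fewer false positives);
# a laxer one flags more people.
print(len(alerts(scores, 0.9)))  # 2
print(len(alerts(scores, 0.3)))  # 4
```

The threshold is a pure policy choice: the same scans produce two alerts or four depending on where the operator sets it.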

openrightsgroup, to random
@openrightsgroup@social.openrightsgroup.org

The European Parliament's decision to ban AI systems used for biometric mass surveillance in public spaces is welcome. It should be mirrored in the UK.

We must protect civil liberties from the encroachment of the state that’s leading us down the road of predictive policing.

Read the statement from @edri

https://edri.org/our-work/eu-parliament-committee-vote-strong-message-protecting-fundamental-rights-from-ai-systems/

jbzfn, to random
@jbzfn@mastodon.social

「 Clearview AI, the US startup that’s attracted notoriety in recent years for a massive privacy violation after it scraped selfies off the Internet and used people’s data to build a facial recognition tool it pitched to law enforcement and others, has been hit with another fine in France over non-cooperation with the data protection regulator 」
@TechCrunch


https://techcrunch.com/2023/05/10/clearview-ai-another-cnil-gspr-fine

PaulNemitz, to random

Sign this petition against facial recognition in Brussels: Pour l’interdiction de la reconnaissance faciale à Bruxelles (For a ban on facial recognition in Brussels) https://democratie.brussels/initiatives/155

info_activism, to random
@info_activism@mastodon.cc

Your face has a unique set of data measurements - such as the width of your nose & the distance between your eyes - called a faceprint. Facial recognition tech is everywhere. But do you know how it works & what can be done with the information collected? https://theglassroom.org/en/what-the-future-wants/exhibits/the-real-life-of-your-selfie-wtfw
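The faceprint idea above can be sketched as a vector of measurements compared by distance. This is a toy sketch with invented names and numbers; real systems use high-dimensional embeddings learned by neural networks, not a handful of hand-picked measurements:

```python
import math

# Toy faceprint: a short vector of facial measurements (e.g. nose
# width, inter-eye distance), normalised to [0, 1]. All values below
# are made up for illustration.
def faceprint_distance(a, b):
    """Euclidean distance between two faceprint vectors."""
    if len(a) != len(b):
        raise ValueError("faceprints must have the same length")
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(a, b, threshold=0.1):
    """Treat two faceprints as the same person when they are close."""
    return faceprint_distance(a, b) <= threshold

alice = [0.42, 0.31, 0.77]        # measurements from one photo
alice_again = [0.43, 0.30, 0.76]  # same face, slightly different shot
bob = [0.60, 0.25, 0.50]

print(is_match(alice, alice_again))  # True
print(is_match(alice, bob))          # False
```

The same structure is why the data is so sensitive: unlike a password, a faceprint can be recomputed from any new photo of you and compared against any stored one.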

earthworm,
@earthworm@kolektiva.social

If there isn't an app that distorts your selfies a tiny little bit (just enough to confuse the face recognition algorithm), can somebody please develop one?
Just like the tools that remove metadata from pictures before uploading...

The best antisurveillance is probably to flood the web with distorted portraits of yourself.

I have to admit that my mother gave me this idea. She doesn't even know how to use her smartphone properly, but she is my countersurveillance hero:
She gets customer cards from every supermarket, then swaps them with all kinds of other people to prank their customer-preference algorithms 🤣

@aral @info_activism
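The distortion idea above can be sketched in a few lines. This naive version just nudges each 8-bit pixel value by a small random amount; real cloaking tools such as Fawkes instead compute targeted adversarial perturbations, which are far more effective against face matchers:

```python
import random

# Naive sketch of the idea: shift each 8-bit pixel value slightly --
# barely visible to a human, but enough to move the measurements a
# face matcher reads off the image. This is NOT how production
# cloaking tools work; it only illustrates the concept.
def distort_pixels(pixels, strength=8, seed=None):
    rng = random.Random(seed)
    return [min(255, max(0, p + rng.randint(-strength, strength)))
            for p in pixels]

original = [120, 121, 119, 200, 201]
print(distort_pixels(original, seed=1))
```

Random noise like this is easy for modern recognisers to average away, which is exactly why adversarial approaches exist.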

barsteward, to random

The palace are really pulling out all the stops to ensure that Meghan doesn’t attend the Coronation!

https://metro.co.uk/2023/05/03/fury-over-police-plans-to-use-facial-recognition-on-coronation-crowds-18719009/

team, to random

Today New York City Council members and community members stood firmly against the abuses of billionaires like James Dolan and aggressive landlords who want to use technology to strip away our rights.


Bluedonkey, to infosec
@Bluedonkey@mastodon.social

Just watched a video from a large intl company proposing a system for K-12 schools to allow students & staff access, to detect former students on campus, to block access to non-custodial parents & sex offenders, and more. All using AI-based facial recognition.

Leaving aside issues of accuracy, just think about the database of personal information behind that. Then think about this:

https://www.edsurge.com/news/2023-04-17-student-privacy-is-at-more-risk-than-ever-before-can-k-12-schools-keep-it-safe

strypey, to random
@strypey@mastodon.nzoss.nz

More stuff in the Signal chat app that doesn't work without proprietary software (unless they've replaced these non-free dependencies in the last 3 years):

"...maps and automatic facial recognition don’t work; there’s a separate patch for OSM support 110 which the Signal developers were not keen on merging. You will simply have to survive without Google’s binary ML-Kit facial recognition for now"

https://forum.f-droid.org/t/ive-degoogled-signal-messenger/10443

bespacific, to random
@bespacific@newsie.social

Facial recognition at the airport will lessen your hassle of getting on a plane in return for your agreeing (you actually have no choice) to be scanned - they scan your license into the db to "check"/verify your identity - which is BS. The claim is that they do not keep this profile, but of course they do - they profile at most airports in the US, and the data goes upstream. The surveillance state has been here for a very long time, folks.

HistoPol,
@HistoPol@mastodon.social

@bespacific

If you decline, you do not fly?
How about arrival from abroad?

@CassandraZeroCovid

evangreer, to random
@evangreer@mastodon.online

Oh. my. god.

Madison Square Garden used facial recognition to identify and stop a mom from attending a Christmas show with her kid because she's an attorney at a firm that is engaged in litigation with them.

This is why it's not enough to just ban government and law enforcement use of facial recognition. There are so many ways private companies and even individuals can abuse this tech.

Ban facial recognition entirely. Yesterday.

https://www.nbcnewyork.com/news/msgs-facial-recognition-stops-mom-from-attending-christmas-show-with-child/4004471/

UnicornRiot, to Minnesota
@UnicornRiot@mastodon.social

NEW: https://unicornriot.ninja/2022/criminal-intel-files-show-facial-recognition-warrantless-surveillance-in-minnesota/
Criminal Intel Files Show Facial Recognition, Warrantless Surveillance in Minnesota

A little-known multi-agency drug war group runs thousands of facial recognition scans and other operations. We found over 37,000 requests for support, connecting at least 233 organizations in Minnesota and beyond.

UnicornRiot,
@UnicornRiot@mastodon.social

Drug war operations, like the North Central HIDTA, exist across the United States with little public scrutiny. Now, the dataset Unicorn Riot is making public shows over 1,600 facial recognition searches in 1,395 days for investigations including “property crimes,” and others.

UnicornRiot,
@UnicornRiot@mastodon.social

The log of searches spans roughly 1,400 days, from January 2019 to late October 2022. The log indicates investigators ran 1,677 facial recognition searches since Jan. 2019.
Of those searches, 921, or 54%, were not associated with any case numbers.

UnicornRiot,
@UnicornRiot@mastodon.social

Unicorn Riot obtained data detailing the names of law enforcement organizations that requested investigative support from the ISC; the dates and types of cases in which the center provided real time support through the RTAC; spreadsheets logging facial recognition searches &more.

UnicornRiot,
@UnicornRiot@mastodon.social

A Look Behind The Curtain – Facial Recognition in Minnesota
Before UR obtained these documents, an in-depth understanding of how local authorities have used facial recognition was not available. But now, with thousands of cases to observe, it’s possible to illustrate the extent of its use.

UnicornRiot,
@UnicornRiot@mastodon.social

On Feb. 12, 2021, Minneapolis banned its police department from using facial recognition for surveillance, but that didn’t stop the law enforcement agency from using the controversial technology.

UnicornRiot,
@UnicornRiot@mastodon.social

Some other examples – Since Jan. of 2019, the Police Department requested these searches 33 times; , 69 times; 7; 42; 14; Richfield 23; St. Louis Park 30; Police Department 42; and 1.

UnicornRiot,
@UnicornRiot@mastodon.social

Facial recognition is only one side of today's investigation (written by Sam Richards).
Check out the full story here: https://unicornriot.ninja/2022/criminal-intel-files-show-facial-recognition-warrantless-surveillance-in-minnesota/
