Discussion
Open Rights Group
@openrightsgroup@social.openrightsgroup.org · last month

Facial recognition is biased. The police know it. The UK Home Office knows it. And they don't care.

This dangerous, intrusive tech produces more false positives for women, young people and members of ethnic minority groups.

We need Parliamentary scrutiny now!

https://www.theguardian.com/technology/2025/dec/10/police-facial-recognition-technology-bias

#facialrecognition #surveillance #policing #privacy #ukpolitics #ukpol #safetynotsurveillance

Bill Zaumen
@bzdev@fosstodon.org replied · last month

@openrightsgroup With regard to facial recognition: shouldn't the manufacturers or organizations deploying it be required to determine the false positive rate not only for the population as a whole, but for demographic groups as well? Then you can adjust the criteria for a match so that everyone is treated fairly.

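The adjustment Bill describes can be sketched numerically: given similarity scores from non-matching ("impostor") pairs in each demographic group, pick a per-group match threshold so that no group's false positive rate (FPR) exceeds a shared target. This is a minimal illustration, not how any deployed system works; the group names and scores below are invented.

```python
# Hypothetical sketch: equalising the false-positive rate across groups
# by choosing a per-group match threshold. All data here is invented.

def fpr_at(threshold, impostor_scores):
    """Fraction of non-matching pairs scored at or above the threshold."""
    return sum(s >= threshold for s in impostor_scores) / len(impostor_scores)

def threshold_for_fpr(impostor_scores, target_fpr):
    """Lowest threshold whose FPR still does not exceed the target."""
    ranked = sorted(impostor_scores, reverse=True)
    n = len(ranked)
    cutoff = ranked[0] + 1e-9  # above the top impostor score: FPR = 0
    for k, score in enumerate(ranked):
        # Setting the threshold at the k-th highest score (0-based)
        # flags k + 1 of n impostor pairs as matches.
        if (k + 1) / n > target_fpr:
            break
        cutoff = score
    return cutoff

# Invented impostor-pair similarity scores for two illustrative groups.
impostor_scores_by_group = {
    "group_a": [0.10, 0.20, 0.35, 0.40, 0.55, 0.60, 0.70, 0.80, 0.85, 0.90],
    "group_b": [0.05, 0.15, 0.25, 0.30, 0.45, 0.50, 0.60, 0.65, 0.75, 0.95],
}

target = 0.20  # at most 20% false positives in every group
thresholds = {
    group: threshold_for_fpr(scores, target)
    for group, scores in impostor_scores_by_group.items()
}
for group, t in thresholds.items():
    print(group, t, fpr_at(t, impostor_scores_by_group[group]))
```

Each group ends up with its own cutoff (here 0.85 for group_a and 0.75 for group_b), so both land at the same 20% FPR on this toy data. In practice such an evaluation would also need per-group false *negative* rates and much larger samples, which is precisely why independent scrutiny of the deployed criteria matters.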

bonfire.cafe

A space for Bonfire maintainers and contributors to communicate
