Boston bans government use of facial recognition

Quote: It’s simple: Boston doesn’t want to use crappy technology.

Boston Police Department (BPD) Commissioner William Gross said last month that abysmal error rates – the technology misidentifies Asian, dark-skinned, and female faces far more often than others – make Boston’s recently enacted ban on facial recognition use by city government a no-brainer:

Until this technology is 100%, I’m not interested in it. I didn’t forget that I’m African American and I can be misidentified as well.

Thus did Boston become the second-largest city in the world, after San Francisco, to ban use of the infamously lousy technology, with its baked-in racial and gender bias. The city council voted unanimously on the bill on 24 June – here’s the full text, and here’s a video of the 3.5-hour meeting that preceded the vote – and Mayor Marty Walsh signed it into law last week.

The Boston Police Department isn’t losing anything: it doesn’t even use the technology. Why? Because it doesn’t work. Make that, it doesn’t work well. The “iffy” factor matters most particularly if you’re Native American, Black, Asian, or female, given high error rates for all but the mostly white males who created the algorithms it runs on.

According to a landmark federal study released by the National Institute of Standards and Technology (NIST) in December 2019, Asian and Black people are up to 100 times more likely to be misidentified than white men, depending on the particular algorithm and type of search. Commercial facial analysis systems vary widely in their accuracy, but overall, Native Americans had the highest false-positive rate of all ethnicities.

The faces of Black women were often falsely identified in the type of search in which police compare an image against thousands or millions of others in hopes of hitting a match for a suspect. According to an MIT study from 2018, the darker the skin, the higher the error rates: for the darkest-skinned women, two commercial facial-analysis systems had error rates of nearly 35%, while two others got it wrong nearly 47% of the time.
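To make that failure mode concrete: in a one-to-many search of this kind, a probe image is reduced to a numeric embedding and compared against every image in a gallery, and anything scoring above a similarity threshold comes back as a candidate “match”. Below is a minimal, hypothetical sketch of the idea in Python – not any vendor’s actual system. The `embed` function, the threshold value, and the synthetic “gallery” are illustrative stand-ins; the point is that every returned candidate who isn’t actually the same person is a false positive, and a model that embeds some demographic groups less accurately produces more of them at any fixed threshold.

Code:
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a face-embedding model.

    Real systems run a trained neural network here; to keep the sketch
    self-contained we just center and normalize the pixels, so the dot
    product of two embeddings is a cosine similarity.
    """
    v = image.astype(np.float64).ravel()
    v -= v.mean()
    return v / (np.linalg.norm(v) + 1e-12)

def one_to_many_search(probe, gallery, threshold=0.6):
    """Return (index, score) for every gallery image whose similarity
    to the probe clears the threshold -- each one a candidate match.

    Any candidate who isn't actually the person in the probe image is
    a false positive, the error type the NIST study measured.
    """
    p = embed(probe)
    hits = []
    for i, g in enumerate(gallery):
        score = float(p @ embed(g))
        if score >= threshold:
            hits.append((i, score))
    return hits

# Usage: a noisy "surveillance" shot of person 42, searched against a
# gallery of 1,000 synthetic images standing in for a photo database.
rng = np.random.default_rng(0)
gallery = [rng.random((32, 32)) for _ in range(1000)]
probe = gallery[42] + rng.normal(0.0, 0.3, (32, 32))  # blur/noise
print(one_to_many_search(probe, gallery))  # roughly [(42, ~0.7)]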

Boston city councilors put it this way: do we want to adopt this type of error-pockmarked surveillance, even as we’re attempting to untangle the knots of systemic racism? Councilor Ricardo Arroyo, who sponsored the bill along with Councilor Michelle Wu, had this to say ahead of the city council hearing:

It has an obvious racial bias, and that’s dangerous. But it also has sort of a chilling effect on civil liberties. And so, in a time where we’re seeing so much direct action in the form of marches and protests for rights, any kind of surveillance technology that could be used to essentially chill free speech or … more or less monitor activism or activists is dangerous.

Wu said that in the days of Black Lives Matter (BLM) protests and rising awareness, the last thing that Boston needs is a technology that’s part of the problem:

We’re working to end systemic racism. So ending the … over-surveillance of communities of color needs to be a part of that, and we’re just truly standing with the values that public safety and public health must be grounded in trust.

A recent, real-world example of a wrongful arrest came up during the city council’s discussions: that of Robert Williams. Williams, a black man living in Michigan, was arrested in January when police used automatic facial recognition to match his old driver’s license photo to a store’s blurry surveillance footage of a black man allegedly stealing watches.

As he described in an editorial published in the Washington Post last month, Williams spent the night on the floor of a filthy, overcrowded jail cell, lying next to an overflowing trashcan, without being informed of what crimes he was suspected of having committed. He’d simply been hauled away, handcuffed, as his wife and daughters watched.

I never thought I’d have to explain to my daughters why Daddy got arrested. How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway?

The ACLU has lodged a complaint against the Detroit police department on Williams’ behalf, but he doubts it will change much.

My daughters can’t unsee me being handcuffed and put into a police car. But they can see me use this experience to bring some good into the world. That means helping make sure my daughters don’t grow up in a world where their driver’s license or Facebook photos could be used to target, track or harm them.

His wife said that she’d known about the issues with facial recognition, but she never expected them to lead to police on her doorstep, arresting her husband.

I just feel like other people should know that it can happen, and it did happen, and it shouldn’t happen.

The Detroit Police Department (DPD) claims that it doesn’t make arrests based solely on facial recognition – it’s just one investigative tool, “used to generate leads only.” The DPD said it had conducted an investigation that involved reviewing video, interviewing witnesses, conducting a photo line-up, and submitting a warrant package containing facts and circumstances to the Wayne County Prosecutor’s Office (WCPO) for review and approval.


Continue reading HERE