Should law enforcement use facial recognition technology?

Posted at 10:03 AM, Apr 01, 2022; last updated 12:07 PM EDT, Apr 01, 2022

DETROIT, Mich. — Facial recognition technology is being used by law enforcement to help solve crimes. However, many believe the technology is doing more harm than good.

“It's been two years and it's still, we're still talking about it,” said Melissa Williams.

Williams remembers the night her husband Robert was arrested in front of her and their two daughters like it was yesterday.

“They had no reason to arrest me. I still can't figure out why I was arrested, other than they said, I look like somebody,” said Robert Williams.

Two Detroit police officers met Robert at his front door as he was coming home from work. The Williams family lives in Farmington, more than half an hour away from the city. The officers told Robert he was a suspect in a crime and took him to jail.

“I was like, ‘Y'all got the wrong person,’” said Williams.

Robert was detained for 30 hours. When police questioned him, they showed him photos of a theft suspect their facial recognition technology matched to Williams’ driver’s license, but it wasn’t him.

“So he turns over the last one, and says, ‘So I guess computer got it wrong.’ And I'm like, ‘Well, yeah, the computer got it wrong, because that's not me. And that's not me either,’” Robert said.

Robert worked with the ACLU of Michigan, and his case was dismissed.

“But I still had to go to court on this as if I had a felony charged for stealing,” Robert said.

“This is exactly what we had been warning about, for years and months preceding that, that this type of thing was happening,” said Phil Mayor, a lawyer with the ACLU of Michigan.

There have been two additional widely known cases of wrongful arrests due to facial recognition technology in the United States. One occurred in Detroit. The other happened in New Jersey.

Mayor said he is worried this is just the start.

“When we let the technology take the first steps, we let the technology lead us, we make mistakes,” said Mayor.

A 2018 study by MIT found some facial classification software misidentifies people of color at higher rates than white individuals. Algorithms have advanced since then, and there have not been follow-up studies to reaffirm these findings. Still, many believe those biases persist in the technology.

“We'd like them to not use it because it's a flawed tool,” said Melissa Williams of law enforcement relying on this technology.

With privacy and fairness concerns mounting, regulations are becoming more common. At least seven states and nearly two dozen cities have limited government use of the technology. This is something Colorado-based activist Connor Swatling would like to see more of.

“With law enforcement specifically, we believe it has to be held to a higher standard,” said Swatling. “Right now, the technology is not in such a place where we feel that standard has been met.”

Swatling and his group ran tests on several facial recognition software programs and found they wrongly matched high-ranking city officials with people on the sex offender registry.

Still, others argue that limiting facial recognition now poses a safety risk. Last year, Virginia banned the use of facial recognition by law enforcement; this year, the ban was rolled back after officials said the technology was needed for investigations.

“There's been hundreds of thousands of investigations that have been aided by the use of this technology,” said Jake Parker, the Senior Director of Government Relations at the Security Industry Association. “It doesn't make sense to completely ban the technology. But it makes more sense to establish parameters for it.”

Parker points to major advances in the technology in just the last few years and says the top algorithms are moving past biases. The NIST “leaderboards” rank facial recognition algorithms by accuracy, and Parker hopes people will review them before making assessments of the technology.

“In the ongoing test series that includes clear race/gender demographic categories, the top 80 algorithms are 99% accurate across the white male, black male, white female, and black female categories, and for the top 40, white male is actually the LOWEST performing demographic of those four within that narrow range,” said Parker.

Still, concerns exist. “I believe this is truly a bipartisan issue,” said Swatling. “Whether you are more concerned about governmental overreach or racial injustice, this issue impacts you.”

For the Williams family, the impact took an emotional toll they are still working through. They just hope changes are made before another family endures their struggle.

“If it happened to me, it can happen to anybody,” said Robert. “Maybe one day it will be useful, but at the current, no, I'm not behind it.”