
Detroit Police Revise Facial Recognition Rules After Bad Arrest

The Detroit Police Department has revised its policies on how it uses facial recognition software to solve crimes as part of a federal settlement with a man who was wrongly arrested for theft in 2020 based on this technology, authorities said Friday.

Robert Williams was falsely identified as the shoplifter at a Shinola store in October 2018. Fifteen months later, he was arrested in front of his family in his driveway, jailed and held for 30 hours before being released on personal recognizance, according to his complaint.

The settlement between Williams and the city of Detroit was filed Friday in the U.S. District Court for the Eastern District of Michigan. Earlier in May, the Detroit City Council approved paying Williams $300,000 in damages as part of the settlement.

A Detroit detective used the department's facial recognition technology to analyze a grainy photo taken from poorly lit footage, Williams' complaint says. In the video, the shoplifter never looks directly at the camera, the complaint says.

The detective sent the poor-quality photo to Michigan State Police for a facial recognition search, which returned a possible match to a photo of Williams' expired driver's license, according to the American Civil Liberties Union.

City officials cited “sloppy” detective work for the wrongful arrest, expunged Williams' record and removed his personal information from the police database.

Williams said the abuse of facial recognition technology “completely turned my life upside down.”

“My wife and young daughters had to watch helplessly as I was arrested for a crime I did not commit. When I came home from jail, I had already missed my youngest daughter's first tooth and my oldest couldn't even stand to look at my picture. Even now, years later, it still makes them cry when they think about it,” he said in a statement.

“The scariest thing is that what happened to me could have happened to anyone.”

National civil rights advocates have decried law enforcement's use of the technology because it can produce dangerous misidentifications. At least seven people have been wrongfully arrested across the country because “police relied on incorrect facial recognition results,” the ACLU said in April. Nearly all of those falsely accused were Black. Three of the cases occurred in Detroit, including a woman who was eight months pregnant at the time and arrested in front of her children, the ACLU said.

ACLU: Detroit's new policy serves as a model for American departments

The ACLU of Michigan, which filed the lawsuit on Williams' behalf, announced the Detroit police policy changes at a news conference Friday. Among them:

  • Police cannot make arrests based solely on facial recognition results or the results of a photo lineup based on a facial recognition search.
  • Police cannot make identifications based on facial recognition alone without other independent and reliable evidence linking the suspect to a crime.
  • Police must disclose flaws in facial recognition technology and instances where it is used in an arrest. Officers must also disclose instances where facial recognition technology failed to identify a suspect or when the results showed different suspects.
  • Officers must receive training on facial recognition software that covers the risks and dangers of the technology and the disproportionate rate at which people of color are misidentified.
  • An audit is to be conducted of all cases since 2017 in which Detroit police used facial recognition technology to obtain an arrest warrant.

The policies will be enforced in federal court for four years, the ACLU said. Representatives of the nonprofit group called the controversial facial recognition technology “dangerous” and the regulation “revolutionary.”

Detroit's new policies will serve as a model for other police departments nationwide for best practices in facial recognition technology, said Phil Mayor, senior attorney at the ACLU of Michigan.

Detroit police said Friday that the department is pleased with the policy changes and also “strongly” believes they will serve as a national example of facial recognition best practices.

“While the work of DPD and the ACLU may differ, our goals are similar: to ensure that policing is carried out in a fair, equitable, and constitutional manner,” the department wrote.

After Williams' wrongful arrest, Detroit police created a facial recognition policy that required three levels of independent approval before the technology could be used in an investigation, the department said. The policy also stated that the technology could not be used as the sole basis for identifying a suspect.

Use of facial recognition software raises questions

A facial recognition system uses biometric software to map a person's facial features from a video or photo. The system then attempts to match information in databases to verify a person's identity.
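In essence, such systems reduce each face to a numeric "embedding" vector and compare vectors by similarity, flagging a match when a score clears a threshold. The following is a minimal, hypothetical sketch of that matching step using cosine similarity on toy vectors; real systems derive embeddings with deep neural networks, and the gallery, probe values, and threshold here are illustrative assumptions only:

```python
import math

def cosine_similarity(a, b):
    # Similarity of two embedding vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.9):
    """Return (person_id, score) for the closest gallery embedding,
    or (None, score) if no candidate clears the threshold."""
    best_id, best_score = None, -1.0
    for person_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score

# Toy gallery of known faces (hypothetical 3-dimensional embeddings).
gallery = {"person_A": [1.0, 0.0, 0.0], "person_B": [0.7, 0.7, 0.0]}
print(best_match([0.9, 0.1, 0.0], gallery))   # close to person_A
print(best_match([0.5, 0.5, 0.5], gallery))   # nothing clears the threshold
```

The threshold is the crux of the controversy: a grainy, poorly lit probe image yields a noisy embedding, so a "possible match" can clear the bar even when the person is innocent, which is why the new Detroit policy bars arrests based on such results alone.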

Police departments use facial recognition to find potential suspects and witnesses to a crime by combing through millions of photos. The software is also used to provide surveillance of public places such as concerts and schools.

But the technology has sparked opposition across the United States from critics who say it can lead to misidentification of suspects, with serious consequences.

A Texas man wrongly arrested and jailed for nearly two weeks filed a lawsuit in January, accusing facial recognition software of wrongly identifying him as the suspect in a store robbery. Using poor-quality surveillance footage of the theft, artificial intelligence software installed at a Sunglass Hut in Houston falsely identified Harvey Murphy Jr. as a suspect, leading to a warrant for his arrest, according to the lawsuit.

In August, Detroit police tightened their policies on photo screening and facial recognition technology after “shoddy” police work led to the wrongful arrest of a pregnant woman, Police Chief James White previously said. Porcha Woodruff filed a federal lawsuit after she was wrongfully arrested in a carjacking and robbery.

In December, the Federal Trade Commission banned Rite Aid from using AI facial recognition technology, accusing the drugstore chain of recklessly deploying technology that subjected customers — particularly people of color and women — to unjustified searches.

The move comes after Rite Aid deployed AI-based facial recognition to identify customers who may be engaging in criminal behavior such as shoplifting. The FTC said the technology often based its alerts on low-quality footage, such as from security cameras, phone cameras and news reports, leading to thousands of “false positive matches” and to customers being searched or kicked out of stores for crimes they did not commit.

Contributing: Terry Collins and Bailey Schulz

Andrea Sahouri covers criminal justice for the Detroit Free Press, part of the USA TODAY Network. She can be contacted at [email protected].
