Detroit Police Chief James White said investigative lapses, not faulty facial recognition technology, were what caused officers in his department to wrongly arrest a pregnant woman earlier this year.
Facial recognition software flagged Porcha Woodruff as a possible match to gas station surveillance video of a suspect wanted for carjacking and robbery. Police then used her image in a photo line-up, and the victim identified her as the perpetrator.
Police then arrested Woodruff, who was eight months pregnant at the time, based solely on that identification. Charges against her were eventually dropped due to lack of evidence, and Woodruff is now suing the city for false arrest.
At a Wednesday press conference, White insisted that this wasn’t a case of botched facial recognition. Instead, he blamed investigators for using improper procedures and failing to do any follow-up investigation.
“I have no reason to conclude at this time that there has been any violations of the DPD facial recognition policy,” White said. “However, I have determined that there's been a number of policy violations by the lead investigator in this case.
“What this is, is very, very poor investigative work that led to a number of inappropriate decisions being made.”
White said officers should not have used a facial recognition-generated image in the line-up. He said that sets up a “misleading” situation for witnesses, who may wrongly pick out someone who only resembles the real perpetrator.
White said facial recognition hits and witness identifications “should only be used as a lead” that needs to be corroborated by other evidence. In Woodruff’s case, he said that should have led to further investigation—but it didn’t.
“From there, investigators follow up to see if there is other evidence incriminating the individual who was identified,” White said. “We have talked about that ad nauseam. That is our policy. It must be followed each and every time.” In this case, he admitted that officers should have been able to eliminate Woodruff as a suspect because she was eight months pregnant, unlike the perpetrator.
White said he’s making some immediate changes “so that this cannot happen again.” For one, “no member will be allowed to use facial recognition-derived images in a photographic lineup. Period,” he said. “You cannot put a facial recognition photo in a photo lineup, because it's going to generate at least a lookalike.”
White said he’ll also institute other changes to how photo line-ups are conducted, and a supervisor will be required to confirm that there’s “an independent basis for believing that the suspect who was pictured in the photo line-up has the means, ability and opportunity to commit the crime.”
White gave no indication, however, that DPD plans to scale back or abandon its use of facial recognition. The department’s use of that software has been highly contested from the start, and critics say Woodruff’s case is further proof that it’s dangerous and unreliable. They point out that of the six people reported to have been falsely arrested based on faulty matches nationwide, three are from Detroit, and suggest that additional police policy changes are unlikely to prevent future mistakes.