Ms. Porcha Woodruff is one such example.
On August 3, Porcha Woodruff (age 32, of Detroit, Michigan) filed a lawsuit against the City of Detroit and a female investigator for unlawfully arresting and detaining her in violation of the Fourth Amendment.
Faulty technology, knowingly still in use
According to the news agency AFP, the incident began about half a year earlier, with a carjacking at a gas station. On February 16, six police officers arrived at Ms. Woodruff's home with a warrant for her arrest on carjacking charges.
She was stunned: "Carjacking? Are you kidding? Can't you see I'm eight months pregnant?"
Police handcuffed her in front of her family and neighbors and took her in for questioning. Her iPhone was seized as evidence. She was released on bail after about 11 hours.
She went straight to the hospital, where doctors found a slowed heart rate due to dehydration and contractions brought on by stress. Fifteen days later, the court dropped the charges against her for lack of evidence.
Woodruff had been arrested by mistake: the Detroit Police Department took footage from a gas-station camera that had recorded the suspect's car and ran it through facial recognition software from DataWorks Plus.
The search matched the suspect's face to a 2015 photo of Porcha Woodruff, taken when she was arrested for driving with an expired license. In reality, the facial recognition system had simply gotten it wrong.
The chief of the Detroit Police Department has acknowledged that the problem is serious. Three years earlier, he himself explained that if the system were used alone, without other investigative methods, its error rate could reach 96%. Despite this, Detroit police used it 125 times last year.
Ms. Woodruff is the sixth person to report being wrongfully arrested by Detroit police because of a faulty facial recognition match. All six are Black. Two of the earlier victims have also sued the Detroit police.
Following false identifications in 2019, the Detroit Police Department revised its guidelines, limiting the facial recognition system to investigations of violent crimes and home invasions.
Training data is important
In the lawsuit, Ms. Woodruff's attorneys argue that facial recognition technology has well-known flaws, yet Detroit police neither set proper rules for its use nor adequately train officers in it, showing deliberate indifference to the harm done to people who are misidentified.
Over the past few years, several weaknesses in facial recognition technology have come to light in the US. Facial recognition relies on artificial intelligence (AI), and AI systems have known flaws, so their reliability is far from 100%.
AI works through machine learning, and the first step is training: teaching an algorithm to map input data to the appropriate output. If the training data is not representative, however, algorithmic bias can arise.
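To make this concrete, here is a minimal, purely illustrative sketch (all data and numbers are invented, and real face-recognition pipelines are far more complex). It simulates a decision rule tuned on one group's data: the rule separates that group's identities well, but misclassifies a second group whose images are noisier and were not reflected in "training".

```python
import random

random.seed(42)

def make_samples(center, spread, n):
    """Synthetic 1-D 'face embedding' scores for one person."""
    return [random.gauss(center, spread) for _ in range(n)]

# Group A is well represented in training; Group B's images are noisier
# (a stand-in for data the model was never tuned on).
person_a1 = make_samples(0.0, 0.5, 200)
person_a2 = make_samples(3.0, 0.5, 200)
person_b1 = make_samples(0.0, 1.5, 200)
person_b2 = make_samples(3.0, 1.5, 200)

# "Training": pick a decision threshold using group A only.
threshold = 1.5  # midpoint of A's two identities -- never tuned for B

def error_rate(low_id, high_id, thr):
    """Fraction of samples attributed to the wrong identity."""
    wrong = sum(x >= thr for x in low_id) + sum(x < thr for x in high_id)
    return wrong / (len(low_id) + len(high_id))

err_a = error_rate(person_a1, person_a2, threshold)
err_b = error_rate(person_b1, person_b2, threshold)
print(f"group A error: {err_a:.3f}, group B error: {err_b:.3f}")
```

The same threshold that almost never confuses group A's identities misidentifies a sizeable fraction of group B, purely because the rule was fitted to A's data, which is the essence of the bias the article describes.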
For example, The Washington Post reported that in late 2019 the US National Institute of Standards and Technology (NIST) studied dozens of algorithms and concluded that they were up to 100 times more likely to misidentify Black or Asian faces than white ones.
Some algorithms also failed to correctly identify the gender of Black women in 35% of cases. These disparities were more pronounced in algorithms developed in the US than in those developed in Asia.
Errors can also stem from other causes, such as non-standard images (low resolution, insufficient lighting) or changes to the face itself (aging, makeup, glasses). This means a facial recognition system can be fooled, or can simply mistake one person for another.