Presidential candidate Bernie Sanders has called for a national ban on it. A member of the Detroit Board of Police Commissioners was removed from a board meeting centering on the issue and was arrested for disorderly conduct (charges were later dropped). And concerned Detroiters have doubled attendance at such meetings just to voice their concerns about it. The controversies surrounding police use of facial recognition software, which has been proven to misidentify people of color disproportionately, have been brewing in cities across the country over the past year. Among them are Chicago, New York City, and San Francisco, which banned the technology in May. But in Detroit, a city in which 79% of the population is African American, tension is especially high.
In May, the Georgetown Law Center on Privacy and Technology published “America Under Watch,” a new study that examined the use of facial recognition software by municipalities in several of the country’s most crime-ridden cities. Among them was Detroit, which bears the second-highest crime rate in the U.S., according to the FBI’s most recent Uniform Crime Report. The study focused on the city’s Project Green Light, a partnership between the DPD and more than 500 private businesses to install cameras at their locations with the intention of both deterring and solving crime. It reported that “… all people caught on camera — passersby, patrons, or patients — are scanned, their faces compared against the face recognition database on file.” Though categorically false, the statement was enough to send citizens into a panic, prompting them to show up to commissioner meetings in droves. Previously, residents were largely unaware the technology had been in use for two years without board approval. In June, the board of 11 civilian members called a vote on the matter.
The Orwellian implications of the study aren’t the sole points of contention in the facial recognition affray. Piling onto the outrage are the fundamental inaccuracies of the technology. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” a study published last year by the MIT Media Lab, found that, although facial analysis programs had a less than 1% error rate in assessing the faces of white men, the error rate jumped to nearly 12% for men of color and 34% for women of color. It was these disparities, in part, that spurred Tawana Petty, a Detroit resident and the director of data justice programming for the Detroit Community Technology Project, to speak up at nearly every Board of Police Commissioners meeting since Detroit caught wind of the Georgetown study almost five months ago. According to Petty, flaws in the technology will lead to the marginalization of a disadvantaged community by increasing false arrests. “If we want to prevent crime, we have to catch the right people, not shoot a net out there and hope we find someone based off an algorithm that may not accurately identify the person,” she says.
Some, however, believe having human employees operate the technology makes it a reliable resource. Detroit Police Chief James Craig, an advocate for the technology, says it is not the sole identifier of a suspect. Rather, it produces many probable matches from a database, which are then examined by multiple human analysts, who decide whether any of those matches is correct — a justification Petty deems insufficient. “Add potential biases, the pressure to solve crime, someone calling off work, any number of factors — adding two human bodies to the equation doesn’t make it a safer technology.” At any rate, she says, the technology is likely to perpetuate racial profiling. “It doesn’t make sense to have Detroit — one of the largest black cities in the U.S. — experimenting with this technology.” And her concern is not unfounded. Despite the city’s majority African American population, Detroit possesses what Georgetown called an “expensive, expandable face surveillance system.”
“It doesn’t make sense to have Detroit — one of the largest black cities in the U.S. — experimenting with this technology.” –Tawana Petty
Still, Craig is not alone in his support for the software. Board of Police Commissioners member Willie Bell, who spent 32 years on the DPD force, says one of the key benefits of the software is the time it saves narrowing down potential suspects of a crime. “It’s a very cumbersome process,” he says. “You can make mistakes using photo line-ups, and with the volume of crime in Detroit, we need the best resources available.”
Craig believes that much of the opposition directed at facial recognition technology stems from a fundamental misunderstanding of how it’s being used — specifically, the notion that Green Light cameras are connected to the software, analyzing the faces of everyone they capture. “No Green Light Cameras have facial recognition,” he says. “If there is a violent crime, that’s the only time we use it.” Craig does not deny that the software disproportionately misidentifies people with darker skin tones and agrees that it would present a distinct concern if investigators relied solely on the software to identify the suspect. “The software is just a tool.” For Craig, the aptness of that tool supervenes upon the people wielding it. “It’s the analysts behind the technology who are really doing the investigative work.”
To contextualize the role analysts play, Craig outlines one investigation of a gas station shooting that utilized the recognition software. First, he says, technicians ran a screenshot of the perpetrator through the software and received 178 probable suspects from the DPD’s mugshot database. Through a multi-step process, technicians then compared the screenshot against candidates’ driver’s license and social media photos and ran the match by another analyst and a supervisor, as is required by protocol. The analyst on the case made a match, and only then was the lead presented to an investigator.
“We cannot and do not make arrests solely on the work the analysts do,” Craig says. “While they work, investigators are still looking for evidence and interviewing witnesses.” He says analysts have used the software more than 500 times, and 30% of those searches produced suspects that made it all the way through the identification process. Analysts could not be certain enough to move the other 70% through the process, and therefore, did not present a match to investigators. Craig says none of the suspects were misidentified.
In July, State Rep. Isaac Robinson (D-Detroit) introduced House Bill 4810 — a measure that would impose a five-year moratorium on police use of the software. “I’m concerned about overreach by government. I’m a supporter of our civil liberties, and I think that use of facial recognition technology by law enforcement violates our constitutional rights,” he says. Because the majority population in his district is African American, Robinson says, he also feels it’s his responsibility to heed their concerns about the faults of the technology.
Rather than proposing a permanent ban on the technology, Robinson says, the moratorium gives the public time to research and have discussions about their civil liberties. Although he says he would support a permanent ban like the pending Senate Bill 342, he believes a more lenient option has a better chance of passing, especially in the event that the Senate bill fails. “Some people don’t know a lot about facial recognition technology, so even those who aren’t militantly against the idea might be supportive of a moratorium as they learn more about it.” Robinson says he has received bipartisan support for the bill.
Board of Police Commissioners Chairwoman Lisa Carter says the board will vote on the facial recognition policy in late September or early October. If the board votes against it, the Detroit police will be forced to discontinue use of the software, but its use among other municipalities in the state will not be affected. Unless, that is, the state bills pass, making Michigan the first state to ban police use of facial recognition software. But many are afraid approval of the policy will lead down a slippery surveillance slope. “The software is expanding to suburban police departments, being installed at schools,” Petty says. The Detroit Free Press reported this introduction to local schools in August. “It’s a very dangerous road we’re traveling down.”