Half of all U.S. adults are in face-recognition databases, and Black people are more likely to be targeted

One in two American adults is in a law enforcement face recognition network.

“The Perpetual Lineup,” a report out today from a Georgetown University think tank, makes a compelling case for greater oversight of police facial-recognition software that “makes the images of more than 117 million Americans — a disproportionate number of whom are black — searchable by law enforcement agencies across the nation,” as the New York Times account puts it.

The 150-page report [PDF Link] released by Georgetown University’s Center for Privacy and Technology on Tuesday shows that the faces of about half of all adults in the United States are stored in face-recognition databases that federal, state, and local authorities can search.

From Fusion:

The databases are compiled primarily from images like mugshots, driver’s license photos, passports and visa pictures. Georgetown found that 1 in 4 police departments use face recognition databases, more than 4,000 total departments. The FBI’s database, many times larger than those of local police departments, is also sourced largely from non-criminal images, meaning that inclusion in the face recognition database (unlike fingerprint and DNA databases) isn’t reserved for criminal suspects.

Here's an excerpt from the report:

There is a knock on your door. It’s the police. There was a robbery in your neighborhood. They have a suspect in custody and an eyewitness. But they need your help: Will you come down to the station to stand in the line-up? Most people would probably answer “no.” This summer, the Government Accountability Office revealed that close to 64 million Americans do not have a say in the matter: 16 states let the FBI use face recognition technology to compare the faces of suspected criminals to their driver’s license and ID photos, creating a virtual line-up of their state residents. In this line-up, it’s not a human that points to the suspect—it’s an algorithm.

But the FBI is only part of the story. Across the country, state and local police departments are building their own face recognition systems, many of them more advanced than the FBI’s. We know very little about these systems. We don’t know how they impact privacy and civil liberties. We don’t know how they address accuracy problems. And we don’t know how any of these systems—local, state, or federal—affect racial and ethnic minorities.

This report closes these gaps. The result of a yearlong investigation and over 100 records requests to police departments around the country, it is the most comprehensive survey to date of law enforcement face recognition and the risks that it poses to privacy, civil liberties, and civil rights. Combining FBI data with new information we obtained about state and local systems, we find that law enforcement face recognition affects over 117 million American adults. It is also unregulated. A few agencies have instituted meaningful protections to prevent the misuse of the technology. In many more cases, it is out of control.

The benefits of face recognition are real. It has been used to catch violent criminals and fugitives. The law enforcement officers who use the technology are men and women of good faith. They do not want to invade our privacy or create a police state. They are simply using every tool available to protect the people that they are sworn to serve. Police use of face recognition is inevitable. This report does not aim to stop it. Rather, this report offers a framework to reason through the very real risks that face recognition creates. It urges Congress and state legislatures to address these risks through commonsense regulation comparable to the Wiretap Act. These reforms must be accompanied by key actions by law enforcement, the National Institute of Standards and Technology (NIST), face recognition companies, and community leaders.

“THE PERPETUAL LINE-UP: UNREGULATED POLICE FACE RECOGNITION IN AMERICA,” OCTOBER 18, 2016: perpetuallineup.org