The International Association of Chiefs of Police—which is exactly what it sounds like—held its annual conference last month in Boston, bringing together leaders of some 18,000 police departments to talk shop and gawk at all the cool new gadgets they could potentially charge to taxpayers. But the biggest hype this year was—perhaps unsurprisingly—about the future of AI in policing, MIT Technology Review reports.
In the event's expo hall, the more than 600 vendors offered a glimpse into the ballooning industry of police-tech suppliers. Some had little to do with AI: booths showcased body armor, rifles, and prototypes of police-branded Cybertrucks, while others displayed new types of gloves promising to protect officers from needles during searches. But one needed only to look where the largest crowds gathered to understand that AI was the major draw.
The AI policing showcase broke down into three main use cases:
- AI-generated virtual reality training systems (so no one gets bored having to learn and hone the skills they're supposed to be using to help their communities);
- Consolidating data collection and interpretation, which is to say, compiling all of a department's data in a single software suite instead of buying separate tools (license plate readers, cameras, etc.) from different vendors; and
- AI-generated police reports, drafted from AI analysis of bodycam recordings.
I'd heard about the rise of AI-generated police reports in late August, when the AP published some fairly general coverage of the topic, based largely on what sound like pre-approved press quotes from both the police and the company behind the product:
Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft.
Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore's body camera, the AI tool churned out a report in eight seconds.
"It was a better report than I could have ever written, and it was 100% accurate. It flowed better," Gilmore said. It even documented a fact he didn't remember hearing — another officer's mention of the color of the car the suspects ran from.
Sure, there may be some merit to this. While generative AI is nowhere near as "objective" as its proponents claim, it may at least have different biases from a human officer struggling to phrase things in those ever-clunky cop linguistics. Capturing the color of a car, a detail the officer otherwise failed to note? Sounds great.
But there is another catch with allowing a computer to pre-draft a police report, as the Tech Review notes:
By showing an officer a generated version of a police report, the tools also expose officers to details from their body camera recordings before they complete their report, a document intended to capture the officer's memory of the incident. That poses a problem.
"The police certainly would never show video to a bystander eyewitness before they ask the eyewitness about what took place, as that would just be investigatory malpractice," says Jay Stanley, a senior policy analyst with the ACLU Speech, Privacy, and Technology Project, who will soon publish work on the subject.
Sure seems like a great opportunity for cops to more carefully skew the narrative in their own favor—not a great solution in a country where police already routinely lie under oath and during interrogations!
The broader takeaway from all this new AI police tech seems to be that none of it is demonstrably better than the BS "mind reading" police technology that already exists. It just offers a new opportunity for a different breed of con men to get in on the action, and adds another layer of plausible deniability for the police.
How the largest gathering of US police chiefs is talking about AI [James O'Donnell / Technology Review]
Previously: Catalog of Creepily Nondescript Police Gadgets