One of the griftiest corners of late-stage capitalism is the "public safety" industry, in which military contractors realize they can expand their market by peddling overpriced garbage to schools, cities, public transit systems, hospitals, etc — which is how the "aggression detection" industry emerged, selling microphones whose "machine learning" backends are supposed to be able to detect "aggressive voices" (as well as gunshots) and alert cops or security guards.
ProPublica and Wired teamed up to independently evaluate the industry-leading "aggression detectors" from Sound Intelligence and Louroe Electronics, and they're basically terrible. The "gunshot detectors" go off when kids slam their locker doors, and the voice detection system trips when kids cough or sing "Happy Birthday" — but it doesn't go off when someone screams as loudly as they can, or talks in a low, menacing voice. You can trip it by playing recordings of Gilbert Gottfried, but not "an agitated man who was screaming and pounding on a desk."
These $1,000 mics are marketed to schools as tools to prevent school shootings, but there's no evidence that school shooters shout or scream prior to opening fire (whereupon the situation is easy to detect without special apparatus) — and the mics do nothing to pick up "cold anger."
Meanwhile, every $1,000 a school spends on spy-mics is $1,000 they can't spend on counsellors, special services, classroom teachers, or other interventions.
The vendors insist that their mics are not privacy invasive because they only analyze the sounds they pick up, rather than trying to analyze speech — but school administrators can access recordings made by the mics and listen in on their students.
To test the algorithm, ProPublica purchased a microphone from Louroe Electronics and licensed the aggression detection software. We rewired the device so we could measure its output while testing pre-recorded audio clips. We then recorded high school students and examined which types of sounds set off the detector.
We found that higher-pitched, rough and strained vocalizations tended to trigger the algorithm. For example, it frequently triggered for sounds like laughing, coughing, cheering and loud discussions. While female high school students tended to trigger false positives when singing, laughing and speaking, their high-pitched shrieking often failed to do so.
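The evaluation ProPublica describes — playing labeled clips through the detector and tallying which ones trip it — amounts to counting false positives and false negatives. Here's a minimal sketch of that bookkeeping in Python; the detector below is a deliberately naive stand-in (a loudness threshold), not Sound Intelligence's actual algorithm, and all names and decibel figures are hypothetical.

```python
def naive_detector(clip):
    """Stand-in classifier: flags any clip whose peak level exceeds a
    threshold. A real 'aggression detector' claims to do far more,
    but this illustrates why loud-but-benign sounds cause trouble."""
    return clip["peak_db"] > 85

def evaluate(clips, detector):
    """Tally false positives (benign clips flagged as aggressive) and
    false negatives (genuinely aggressive clips the detector misses)."""
    fp = sum(1 for c in clips if not c["aggressive"] and detector(c))
    fn = sum(1 for c in clips if c["aggressive"] and not detector(c))
    return {"false_positives": fp, "false_negatives": fn}

# Hypothetical labeled clips echoing the article's findings: loud but
# harmless sounds trigger the detector, while quiet menace slips past.
clips = [
    {"label": "singing Happy Birthday", "peak_db": 90, "aggressive": False},
    {"label": "coughing", "peak_db": 88, "aggressive": False},
    {"label": "low, menacing voice", "peak_db": 60, "aggressive": True},
    {"label": "screaming, pounding a desk", "peak_db": 95, "aggressive": True},
]

print(evaluate(clips, naive_detector))
# → {'false_positives': 2, 'false_negatives': 1}
```

A threshold this crude flags the singing and coughing while missing the low, menacing voice — the same failure pattern the reporters observed in the commercial product.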
Aggression Detectors: The Unproven, Invasive Surveillance Technology Schools Are Using to Monitor Students [Jack Gillum and Jeff Kao/ProPublica and Wired]
(Image: Adrienne Grunwald)