danah boyd (previously) writes enthusiastically about Virginia Eubanks's forthcoming book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which she calls "the best ethnography I've read in years," "on par with Barbara Ehrenreich's 'Nickel and Dimed' or Matthew Desmond's 'Evicted.'"
I've got a copy coming.
Whether we're talking about judicial decision-making (e.g., "risk assessment scoring") or modeling who is at risk for homelessness, algorithmic systems don't simply cost money to implement. They cost money to maintain. They cost money to audit. They cost money to evolve with the domain they're designed to serve. They cost money to train their users to use the data responsibly. Above all, they make visible the brutal pain points and root causes in existing systems that require an increase in services.
Otherwise, all these systems are doing is helping divert taxpayer money from direct services to lining the pockets of for-profit entities under the illusion of helping people. Worse, they're helping usher in a diversion of liability, because time and time again, those in powerful positions blame the algorithms.
This doesn't mean that these tools can't be used responsibly. They can. And they should. The insights that large-scale data analysis can offer are inspiring. The opportunity to help people by understanding the complex interplay of contextual information is invigorating. Any social scientist with a heart desperately wants to understand how to relieve inequality and create a more fair and equitable system. So of course there's a desire to jump in and try to make sense of the data out there to make a difference in people's lives. But to treat data analysis as a savior to a broken system is woefully naive.
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor [Virginia Eubanks/St. Martin's]
Beyond the Rhetoric of Algorithmic Solutionism [danah boyd/Data & Society]