Weeks before the publication of Virginia Eubanks's new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, knowledgeable friends were already urging me to read it; having followed Eubanks's work for years, I didn't need much convincing.

Eubanks's book combines technical analysis, ethical argument, and thick, richly described ethnography. It examines three different algorithmic systems to show how the false empiricism of service delivery through computerized triage and prediction becomes a means to harm, marginalize, and even kill the poorest among us.

Eubanks's first testbed for this brand of algorithmic cruelty is the Indiana benefits system, where a collaboration between an ideology-driven Republican state government and an overpromising, underdelivering IBM resulted in a shambles that cost the poorest Hoosiers food security, housing, medical care — even their lives.

Next is Los Angeles's "no wrong door" approach to extending benefits to people living on the city's notorious Skid Row, a seemingly intractable tent city that is a stain on the city's conscience. There, quirks of the scoring system mean that some of the most desperate, at-risk people are denied benefits, stuck in a catch-22 where (for example) being imprisoned can make you score as lower-risk, because you were recently housed.

Finally, there's the Allegheny child protective services system, whose dual mission of providing services to families and removing children from neglectful homes places the poorest people in one of America's poorest places in constant danger of having their children snatched from them. Worse, the children of parents who attracted even anonymous, unsubstantiated complaints are at risk of having their own children taken away a generation later, because growing up in a "high-risk" household marks you as high-risk yourself.

Eubanks's ability to combine beautiful biographic storytelling with keen observation and criticism makes this an indispensable addition to the literature on what Cathy O'Neil calls Weapons of Math Destruction.

For example, in a description of the use of algorithms to conduct surveillance in the name of fighting crime, she points out that in the traditional model of policing, the cops would find some reason to suspect someone and then put them under surveillance; the data-driven approach is to put whole populations under surveillance and then decide who is suspicious. This has enormous implications for social justice. If the rationale for replacing the old system is the racial bias of cops or other human frailties, consider what happens when those same biases shape both the decision of whom to surveil and the definition of suspicious behavior. Poor people are far more likely to find their lives under database scrutiny than wealthy people: if you hold a high-waged job, you will never be swept up in an algorithmic trawl of food-stamp records looking for potential fraud.

A recurring theme in Eubanks's work is the power of algorithms to diffuse responsibility for human suffering: using math to decide who the "deserving" poor are makes it easier to turn away from everyone else whom the system has deemed undeserving. The history of poor peoples' rights is one of popular uprisings coupled with sympathy from wealthier people, often driven by stories of mediagenic people being harmed by the system: the middle class is more apt to demand systemic overhaul when a small child or a sweet pensioner dies than when the system mostly kills racialized men with drug addictions. By using algorithms to "triage" the neediness of poor people, system designers can ensure that the people harmed by the system are the least sympathetic and the least likely to provoke outrage among those with political clout.

Algorithmically defined guilt is also a problem because of what the underlying data actually measures. In Allegheny, your child's at-risk score is largely a function of your use of public services to deal with financial crises, health crises, addiction, and mental-health problems. If you deal with these problems privately — by borrowing from relatives or getting private addiction treatment — you never enter the system. That means that if these factors really are predictors of risk to children, then the children of rich people are being systematically denied interventions by the same system that over-polices poor children.

Eubanks closes her book with some practical advice for improving the fairness of algorithms in public service. She advises that systems designers should guide their practice by asking themselves two questions.

1. Does the tool increase the self-determination and agency of the poor?

2. Would the tool be tolerated if it were aimed at non-poor people?

These feel like the kind of simple-to-pose rules of thumb that can keep a lot of mischief at bay, and I hope that designers take them to heart — but as useful as those principles are, I hope that people working in algorithmic service delivery read this book in its entirety.

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor [Virginia Eubanks/St Martin's Press]