Algorithmic guilt: using secret algorithms to kick people off welfare

A wrenching and beautifully argued essay by Virginia Eubanks describes the inevitable consequences of letting secret, unaccountable algorithms decide who is eligible for welfare.

The essay is part of New America's What Drives Innovation Around the Country conference. It describes the drive to outsource the management of critical state services to private firms, which control costs by using proprietary, secret algorithms to ascribe suspicion and guilt to members of the public.

It's the no-fly-list model of governance, metastasizing into schools, crime prevention, child benefits, welfare, and every other corner of public life.

In December 2007, Indiana resident Sheila Perdue received a notice in the mail that she must participate in a telephone interview in order to be recertified to receive public assistance. In the past, Perdue, who is deaf and suffers from emphysema, chronic obstructive pulmonary disease, and bipolar disorder, would have visited her local caseworker to explain why this was impossible. But the state's welfare eligibility system had recently been "modernized," leaving a website and an 800 number as the primary ways to communicate with the Family and Social Services Administration.

Perdue requested and was denied an in-person interview. She gathered her paperwork, traveled to a nearby help center, and requested assistance. Employees at the center referred her to the online system. Uncomfortable with the technology, she asked for help with the online forms and was refused. She filled out the application to the best of her ability. Several weeks later, she learned she was denied recertification. The reason? "Failure to cooperate" in establishing eligibility.

The Policy Machine [Virginia Eubanks/Slate]