You treasure what you measure: how KPIs make software dystopias

"Key Performance Indicators" — KPIs — are the metrics used by software shops to figure out whether their products are improving; notoriously, much of the software industry has converged on "engagement" (that is, minutes spent with an app) as a KPI, and everyone from designers to programmers to managers to execs earn their bonuses and promotions by making that number go up.


Absent those incentives, "engagement" actually is a pretty good proxy for product quality: if your users voluntarily increase the amount of time they spend with your product, it's likely that they're enjoying themselves. But as Goodhart's Law has it, "when a measure becomes a target, it ceases to be a good measure."


For example, Google revolutionized the search industry by counting links to pages as a measure of the pages' relevance. Competitors like AltaVista had been using textual analysis to decide which pages were most relevant to a given query, but Google's founders had the insight that when a lot of people across the web independently linked to a page, it was a good bet that something important was happening on that page.


The problem is that it's not hard to generate a bunch of links that look like independent links. So once Google's measure of page quality (inbound links) became a target for publishers, link farms were born, and the measure ceased to be a good measure.
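To make that dynamic concrete, here is a toy sketch in Python of ranking pages by counting inbound links, and of how a link farm games the count. This is not Google's actual algorithm (PageRank also weights each link by the importance of the page it comes from), and the page names and the rank_by_inbound_links function are invented for illustration.

```python
from collections import Counter

def rank_by_inbound_links(links):
    """Toy relevance ranking: score each page by how many pages link to it.

    `links` is a list of (source_page, target_page) tuples. Real search
    engines are far more sophisticated, but the core idea is the same:
    treat inbound links as votes of relevance.
    """
    return Counter(target for _source, target in links).most_common()

# The organic web: independent sites linking to pages they find useful.
organic = [
    ("blog-a", "good-page"),
    ("forum-b", "good-page"),
    ("newsletter-c", "good-page"),
    ("blog-a", "ok-page"),
]
print(rank_by_inbound_links(organic))
# [('good-page', 3), ('ok-page', 1)]

# A link farm: one publisher spins up throwaway sites that all point at
# their own page. To a naive counter, those links look independent.
farm = [(f"farm-site-{i}", "spam-page") for i in range(50)]
print(rank_by_inbound_links(organic + farm)[0])
# ('spam-page', 50): once the measure became a target, it stopped measuring quality.
```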


The inspiration for Google's link-counting is an academic practice called "citation analysis," in which the quality of an academic study is judged based on the number of times it is cited. This, too, turns out to be simple to game, and Purdue Pharma and other pharma companies managed to turn a five-sentence letter to the editors of the New England Journal of Medicine into the justification for massive, lethal overprescription of opioids.

The dominance of KPIs in the tech industry often results in dysfunctional outcomes. When Google launched G+, its social network, it was determined to make the service a success, viewing it as a life-or-death hedge against Facebook eroding Google's dominance. Google made its employees' bonuses dependent on the integration of G+ into their own products, with the result that every product was made more complex, more privacy-invasive, and less good, as googlers dutifully earned their bonuses by cramming G+ into every available niche, regardless of how much sense that made.


Since then, we've seen how prioritizing "engagement" has created the space for disinformation campaigns and calls for genocidal violence. Engineers don't write code that says "if subject = 'Nazi' then recommend," but the way the algorithms rank, prioritize, and recommend user submissions has the same effect.
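One hedged illustration of how that happens without anyone writing such a rule: the deliberately simplified ranker below sorts a feed purely by predicted engagement. The posts and the predicted_minutes field are invented, and real recommenders use learned models rather than hand-written lists, but the single-objective shape is the point.

```python
# Candidate posts with a (hypothetical) model's predicted minutes of engagement.
candidate_posts = [
    {"title": "Calm, factual explainer",          "predicted_minutes": 1.5},
    {"title": "Outrage bait / conspiracy thread", "predicted_minutes": 9.0},
    {"title": "Friend's vacation photos",         "predicted_minutes": 2.0},
]

def rank_feed(posts):
    # Nothing here inspects what the content is, only how long it is
    # predicted to hold attention. Inflammatory material that keeps people
    # watching and arguing floats to the top without any rule naming it.
    return sorted(posts, key=lambda p: p["predicted_minutes"], reverse=True)

for post in rank_feed(candidate_posts):
    print(f'{post["predicted_minutes"]:>4} min  {post["title"]}')
```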

Columbia University's Chris Wiggins ruminates on the discussions in his "Data: Past, Present, and Future" course, where he and his students have been investigating the role of KPIs in software outcomes. He points out that while computer ethicists have codified principles for designing software that benefits its users, simple KPIs tend to subvert those principles. Wiggins asks whether it would be possible to come up with KPIs that improved the lives of users, and how you would monitor the software in the field to make sure that users' wellbeing is being maximized.

To that end, Wiggins proposes a set of rules for making KPIs that reflect and honor principles.

Your KPIs can't conflict with your principles if you don't have principles.

That is:

1. Start by defining your principles. I'd suggest the 5 above, which are informed by the collective research of the authors of the Belmont and Menlo reports on ethics in research, augmented by a concern for the safety of the users of a product. The choice is important, as is the choice to define, in advance, the principles which guide your company, from the high-level corporate goals to the individual product KPIs.

2. Next: before optimizing a KPI, consider how this KPI would or would not align with your principles. Now document that and communicate it, at least internally if not externally to users or simply online.

3. Next: monitor user experience, both quantitatively and qualitatively. Consider what unexpected user experiences you observe, and how, irrespective of whether your KPIs are improving, your principles are challenged.
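A minimal sketch of what that third step, monitoring user experience alongside the KPI you optimize, could look like on the quantitative side: pair the KPI with principle-linked counter-metrics and refuse to ship changes that improve the KPI while degrading them. The metric names, thresholds, and the evaluate_experiment function below are invented for illustration, not anything Wiggins prescribes, and the qualitative half (user research, interviews) doesn't reduce to code at all.

```python
# Hypothetical guardrails: the maximum tolerated regression (in percent,
# versus a control group) for each principle-linked counter-metric.
GUARDRAILS = {
    "user_reported_regret": 0.0,            # any increase blocks the launch
    "complaint_rate": 2.0,
    "time_spent_past_self_set_limit": 0.0,
}

def evaluate_experiment(kpi_lift_pct, counter_metric_changes_pct):
    """Decide whether a KPI-improving change should ship.

    kpi_lift_pct: relative change in the optimized KPI (e.g. engagement).
    counter_metric_changes_pct: relative change for each counter-metric.
    """
    violations = [
        name for name, change in counter_metric_changes_pct.items()
        if change > GUARDRAILS.get(name, 0.0)
    ]
    if violations:
        return f"HOLD: KPI up {kpi_lift_pct}% but guardrails violated: {violations}"
    return f"SHIP: KPI up {kpi_lift_pct}% with guardrails intact"

# A change that boosts engagement but makes users regret the time they spent:
print(evaluate_experiment(4.2, {
    "user_reported_regret": 6.0,
    "complaint_rate": 1.1,
    "time_spent_past_self_set_limit": 3.5,
}))
# HOLD: KPI up 4.2% but guardrails violated: ['user_reported_regret', 'time_spent_past_self_set_limit']
```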

Ethical Principles, OKRs, and KPIs: what YouTube and Facebook could learn from Tukey [Chris Wiggins/Columbia]

(via Four Short Links)


(Image: Cryteria, CC-BY)