Using statistics on atrocities to convict war criminals

Great interview with Patrick Ball, the deputy director of the Science and Human Rights Program at the American Association for the Advancement of Science, who uses statistical modelling of war atrocities to build human-rights cases.

Every human rights story goes like this: I am a deponent, and I'm here to tell you about things that happened to one or many victims. I myself may or may not be one of those victims. Each of those victims may have suffered one or more violations, and those violations may or may not be what historians call colligated at one or more points in time or space. Each of the violations may have been perpetrated by zero, one, or many identifiable perpetrators, and those perpetrators may be individuals with names and ranks, or they may be institutions. Each of those may be associated with one or more of the violations in this story. That's the complexity of one story. Now we're going to collect 10,000 stories, and there is a dense, complex overlapping of all the stories. Then we aggregate the stories from, say, four different organisations, and each of those organisations' sets of judgements has a dense and complex overlap with the other organisations' information. The result is a multidimensional, multilayered Venn diagram built up from this information, which I refer to as "reporting density".
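The overlap Ball describes, with the same record documented by several organisations at once, can be sketched as a small tally. This is only an illustration with invented data and field names, not Ball's actual methodology, but it shows the "reporting density" idea: count how many sources documented each unique record, then histogram those counts.

```python
def reporting_density(reports):
    """Tally how many organisations documented each unique record,
    then histogram those counts. The result is the overlap structure
    of the 'multidimensional Venn diagram': how many records were
    seen by 1, 2, ... n sources."""
    counts = {}
    for records in reports.values():
        for record in records:
            counts[record] = counts.get(record, 0) + 1
    density = {}
    for n in counts.values():
        density[n] = density.get(n, 0) + 1
    return density

# Hypothetical toy data: each organisation's set of documented
# (victim, violation) records. All identifiers are invented.
reports = {
    "org_a": {("victim1", "killing"), ("victim2", "detention")},
    "org_b": {("victim1", "killing"), ("victim3", "torture")},
    "org_c": {("victim1", "killing"), ("victim2", "detention"),
              ("victim4", "killing")},
}

print(reporting_density(reports))
# → {3: 1, 2: 1, 1: 2}: one record seen by all three organisations,
# one by two, and two records known to only a single source.
```

In real projects this overlap structure feeds capture-recapture style estimates of how many violations no organisation documented at all.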

Link

(via Ambiguous)