Big Tech is deleting evidence needed to prosecute war crimes, and governments want them to do more of it

War crimes are among the most grisly and difficult-to-prosecute crimes. And yet, ironically, the criminals have made prosecutors' jobs easier by uploading videos celebrating their atrocities to Big Tech platforms like Facebook and YouTube, where the footage doubles as a recruiting tool for terrorists and extremists.

It's these very videos that human rights activists, atrocity survivors, and armchair sleuths rely on to perform "open source intelligence" analysis of the perpetrators, effectively doxing them and handing overworked, under-resourced prosecutors the evidence they need to bring war criminals to justice.

Cutting against this, though, is Big Tech's zeal to remove "terrorist content," an overreaction to years of indifference to complaints about all kinds of bad content that violated the platforms' own guidelines. The newly self-deputized platforms are playing content police and taking down "terrorist content" as fast as they can find it, using algorithmic dragnets that catch plenty of dolphins along with the tuna they're trawling for. To make things worse, Big Tech invents its own definitions of "terrorism" that barely overlap with internationally recognized ones.

It's about to get much worse: in the wake of the Christchurch white terror attacks, the Australian government rushed through legislation requiring platforms to remove "terror" content within an hour (a deadline so short that it guarantees no serious review will take place before content is removed), and now both the EU and the UK are poised to follow suit.

And there's plentiful evidence that terror cops are incredibly sloppy when they wield the censor's pen: last month, the French national police's internet crime unit gave the Internet Archive 24 hours to remove "terrorist" content, with demands targeting the Archive's collection of 15,000,000 text files, its copy of Project Gutenberg, and its entire archive of Grateful Dead recordings.

Human rights advocates are sounding the alarm, but no one is listening. It's a rerun of SESTA/FOSTA, the US anti-sex-trafficking bill that sex workers vigorously opposed, saying it would make them less safe — but which passed anyway, and now sex workers are much less safe.

Designed to identify and take down content posted by "extremists"—"extremists" as defined by software engineers—machine-learning software has become a potent catch-and-kill tool to keep the world's largest social networks remarkably more sanitized places than they were just a year ago. Google and Facebook break out the numbers in their quarterly transparency reports. YouTube pulled 33 million videos off its network in 2018—roughly 90,000 a day. Of the videos removed after automated systems flagged them, 73 percent were removed so fast that no community members ever saw them. Meanwhile, Facebook removed 15 million pieces of content it deemed "terrorist propaganda" from October 2017 to September 2018. In the third quarter of 2018, machines performed 99.5 percent of Facebook's "terrorist content" takedowns. Just 0.5 percent of the purged material was reported by users first.

Those statistics are deeply troubling to open-source investigators, who complain that the machine-learning tools are black boxes. Few people, if any, in the human-rights world know how they're programmed. Are these AI-powered vacuum cleaners able to discern that a video from Syria, Yemen, or Libya might be a valuable piece of evidence, something someone risked his or her life to post, and therefore worth preserving? YouTube, for one, says it's working with human-rights experts to fine-tune its take-down procedures. But deeper discussions about the technology involved are rare.

"Companies are very loath to let civil society talk directly to engineers," says Dia Kayyali, a technology-advocacy program manager at Witness, a human-rights organization that works with Khatib and the Syrian Archive. "It's something that I've pushed for. A lot."

Tech Companies Are Deleting Evidence of War Crimes [Bernhard Warner/The Atlantic]

(via Four Short Links)