[Stanford's Daphne Keller is a preeminent cyberlawyer and one of the world's leading experts on "intermediary liability" — that is, when an online service should be held responsible for the actions of its users. She brings us a delightful tale of Facebook's inability to moderate content at scale, which is as much a tale of the impossibility (and foolishness) of trying to support 2.3 billion users (who will generate 2,300 one-in-a-million edge-cases every day) as it is about a specific failure. We're delighted to get the chance to run this after a larger, more prestigious, longer-running publication spiked it because it had a penis in it. Be warned: there is a willie after the jump. -Cory]
Those of us who study the rules that Internet platforms apply to online speech have increasingly rich data about platforms' removal decisions. Sources like transparency reports provide a statistical big picture by aggregating individual takedown decisions.