Now that Twitter has admitted it has a troll problem, it has to figure out how to scale up its human review of abuse complaints.
On Slashdot, Bennett Haselton proposes a volunteer jury pool of Twitter users who are polled to evaluate abuse complaints; if a supermajority agrees that abuse is taking place, the complaint is passed to a second review (either a larger jury or a Twitter mod) for a final determination.
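The first-stage flow could look something like the sketch below. This is only an illustration of the idea, not anything Haselton specifies: the panel size, the supermajority threshold, and the function names (`review_complaint`, `get_vote`) are all assumptions.

```python
import random

PANEL_SIZE = 10       # assumed panel size; the proposal doesn't fix one
SUPERMAJORITY = 0.75  # assumed threshold for "supermajority"

def review_complaint(volunteer_pool, complaint, get_vote):
    """First-stage review: poll a random panel drawn from the volunteer pool.

    `get_vote(volunteer, complaint)` returns True if that volunteer judges
    the content abusive. Returns "escalate" when a supermajority of the
    panel agrees (hand off to a larger jury or a Twitter mod), else
    "dismiss".
    """
    panel = random.sample(volunteer_pool, PANEL_SIZE)
    abuse_votes = sum(get_vote(v, complaint) for v in panel)
    if abuse_votes / PANEL_SIZE >= SUPERMAJORITY:
        return "escalate"
    return "dismiss"
```

In practice the votes would arrive asynchronously from real users, but the escalation logic would be the same.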
It's an interesting idea, and is similar in part to the existing Slashdot moderation system. What do you think?
Because the voting panel is randomly selected from the entire pool of volunteers, you can't "game the system" by forming a mob with dozens of your friends so that everyone files an abuse report about the same content at once. As long as your mob comprises only a tiny proportion of the 100,000+ reviewers in the system, there's virtually no chance that a randomly selected panel would contain enough of you to swing the vote.
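That "virtually no chance" claim can be made precise with a hypergeometric tail: the probability that a random panel contains enough mob members to reach the supermajority. The concrete numbers below (a mob of 50, a 10-person panel needing 7 votes) are illustrative assumptions, not figures from the proposal.

```python
from math import comb

def mob_swing_probability(pool_size, mob_size, panel_size, votes_needed):
    """Probability that a randomly drawn panel contains at least
    `votes_needed` mob members (hypergeometric upper tail)."""
    total = comb(pool_size, panel_size)
    return sum(
        comb(mob_size, k) * comb(pool_size - mob_size, panel_size - k)
        for k in range(votes_needed, min(mob_size, panel_size) + 1)
    ) / total

# Assumed numbers: 100,000 volunteers, a mob of 50, and a 10-person
# panel needing 7 "abuse" votes to escalate.
p = mob_swing_probability(100_000, 50, 10, 7)
```

With these inputs the probability is far below one in a billion, which is the whole point of sampling the panel rather than letting reporters self-select.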
This could also allow almost-instant turnaround on abuse cases (a matter of reassurance for victims of ordinary harassment, and a matter of life and death in the case of suicide threats or threats of violence). Twitter could restrict its random sample to users who are currently signed in and have a minute or two to review a piece of content and vote on whether it violates the guidelines.
Twitter Should Use Random Sample Voting For Abuse Reports [Bennett Haselton/Slashdot]