Alex Stamos on the security problems of the platforms' content moderation, and what to do about them

Alex Stamos (previously) is the former Chief Security Officer of Yahoo and Facebook. I've jokingly called him a "human warrant canary" because it seems that whenever he leaves a job, we later learn that his departure was precipitated by some terrible compromise the company was making -- he says that he prefers to be thought of as "the Forrest Gump of infosec" because whenever there is a terrible geopolitical information warfare crisis, he's in the background, playing ping-pong. Read the rest

Moderators for large platforms tell all, reveal good will, frustration, marginalization

Alex Feerst, Medium's head of trust and safety, conducted a long, wide-ranging interview with senior content moderation staffers who have worked at Dropbox, Google, Facebook, Reddit, Pinterest, and many unnamed platforms; the interview is very frank and reveals a group of people with a deep, real-world commitment to protecting users as well as defending free speech. Read the rest

Twitter suspends academic who quoted feminist STEM research

MIT Comparative Media Studies researcher/instructor Chris Peterson is an ardent supporter of the Math Prize for Girls, and as part of his work with the organization, he's learned about the way that STEM fields were once considered inherently feminine, while the higher-status humanities were dominated by men -- it's the subject of some outstanding feminist scholarship by Professor Maria Charles. Read the rest

Twitter's NSFW porn spam nightmare for women with common names

For at least a couple of years, Twitter has allowed one porn spam bot to clog up search results for common women's names, as well as for names of young female celebrities. It would not take a lot to create an algorithm to block this specific spam, but it's still here, because Twitter can't seem to address the platform's pervasive hostility to women. Read the rest
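The post's claim that blocking this one bot would not take much is plausible: boilerplate spam tends to fall to a few lines of pattern matching. Here's a minimal, hypothetical sketch of that kind of rule-based filter -- the name watchlist, spam phrases, and sample posts are invented for illustration, and this is not Twitter's actual pipeline.

```python
import re

# Hypothetical watchlist of common names the bot squats on in search results.
COMMON_NAMES = {"emma", "olivia", "sophia", "mia"}

# Recycled boilerplate phrases of the kind a single spam bot reuses verbatim.
SPAM_PHRASES = [
    r"click\s+here\s+for",
    r"hot\s+pics?",
    r"\bnsfw\b.*\bhttps?://",   # NSFW tag paired with an outbound link
]
SPAM_RE = re.compile("|".join(SPAM_PHRASES), re.IGNORECASE)


def looks_like_name_squatting_spam(text: str) -> bool:
    """Flag posts that hook a common name AND reuse known spam boilerplate."""
    words = {w.lower() for w in re.findall(r"[a-z]+", text, re.IGNORECASE)}
    return bool(words & COMMON_NAMES) and bool(SPAM_RE.search(text))


if __name__ == "__main__":
    samples = [
        "Emma NSFW hot pics click here for more https://example.test/spam",
        "Congrats to Emma on her dissertation defense!",
    ]
    for post in samples:
        print(looks_like_name_squatting_spam(post), "-", post)
```

A crude heuristic like this would obviously miss anything subtler, but that's the point of the post: the bot in question isn't subtle, and it has been clogging name searches for years anyway.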

Moderating the internet is traumatic

Catherine Buni and Soraya Chemaly report on The Secret Rules of The Internet: the mass-scale moderation practiced by social networks. Read the rest

Facebook Community Council quietly closes

Last year, I mentioned that I was one of about 350 people tapped to participate in Facebook Community Council, an experiment in crowdsourced content moderation. We'd slog through profiles and sites tagged as inappropriate by other users (tags included nudity, attacking, drugs, and violence). And there were plenty of each, especially the first two. A million tags later, we got the following notice (verbatim):

Thank you for participating in the Facebook Community Council. We will be shutting down starting this week, and will let you know when we start another experiment.

Despite my fifth-place showing, I was not awarded a prize. I suspect they now have image and text recognition algorithms that make mere humans redundant except for the more subtle forms of abuse. Over time, we saw a lot fewer penis and puke pics and "___ sucks" Facebook groups, and a lot more subtle trolling of middle and high school students and teachers. It was certainly interesting to see what garden-variety low-level trolling looks like these days. There were a lot of similarities to Wikipedia vandalism: a lot of one-off drive-by crap, with a few people fixated on gaming the system in ever more subtle ways. Read the rest