Jigsaw: "wildly ambitious" Google spin-out aimed at tackling "surveillance, extremist indoctrination, censorship"


Technologists have a dismal pattern: when it comes to engineering challenges ("build a global-scale comms platform") they rub their hands together with excitement; when it comes to the social challenges implied by the engineering ones ("do something about trolls") they throw their hands up and declare the problem to be too hard to solve.

Jigsaw is a Google spin-out that began life as Eric Schmidt's do-nothing "think/do tank," Google Ideas. But under the leadership of Jared Cohen, the organization has morphed into an incubator devoted not just to "advanc[ing] the best possibilities of the Internet" but to "fix[ing] the worst of it: surveillance, extremist indoctrination, censorship."


The organization has released a set of free and mostly great tools that make strides along these lines: Uproxy lets people behind censorwalls use their friends' internet connections as proxies to get around them; Project Shield uses Google's serverfarms to let political dissidents get the message out in the face of state-actor denial-of-service attacks; Montage helps human rights groups crowdsource the analysis of YouTube videos; Password Alert catches phishing attempts on Google logins; Redirect Method shows videos by people who regret joining terrorist groups to people looking for extremist material, with surprising success; and Conversation AI is an experimental system to help filter/moderate troll-floods like those directed at Gamergate targets like Zoe Quinn and Anita Sarkeesian.

But Jigsaw isn't without its weaknesses and its critics. Many people blame Cohen for promoting Haystack, a profoundly flawed and dangerous communications tool that the State Department urged on Iranian dissidents in the wake of the 2009 Green Movement protests (Cohen denies it), while Wikileaks and Julian Assange accuse Cohen of being a front for the US State Department's "regime change" projects, pointing to his prominent, high-profile career at State.

On the social/technical side, there's good reason to be skeptical of machine-learning approaches to troll-fighting. From spam filters to parental controls, systems of automated censorship have a dismal track record, and while Conversation AI performs well at times, it has some really troubling failure modes. A lot will depend on how it is implemented and what kind of human judgment is in the loop.
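To make that concrete, here's a minimal, hypothetical sketch (not Jigsaw's actual pipeline; the scoring function and thresholds are invented placeholders) of how a classifier's score might gate, rather than replace, human moderation:

```python
# Hypothetical sketch: an ML score routes comments into buckets,
# but only a human moderator can actually remove anything.
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str

def toxicity_score(comment: Comment) -> float:
    """Stand-in for a model like Conversation AI: 0.0 (benign) to 1.0 (attack)."""
    insults = {"idiot", "scum", "die"}
    words = comment.text.lower().split()
    return min(1.0, sum(w.strip(".,!?") in insults for w in words) / 3)

def route(comment: Comment, publish_below: float = 0.3, review_above: float = 0.8) -> str:
    score = toxicity_score(comment)
    if score < publish_below:
        return "publish"            # low risk: post immediately
    if score >= review_above:
        return "hold_for_review"    # high risk: a human decides before it appears
    return "publish_and_flag"       # middle ground: visible, but queued for human spot-checks

if __name__ == "__main__":
    for c in [Comment("a", "Thanks, that was a helpful explanation."),
              Comment("b", "You idiot, go die, you absolute scum!")]:
        print(route(c), "<-", c.text)
```

The interesting cases are the ones in the middle bucket, where the model's uncertainty gets resolved by a person; that's exactly the kind of implementation detail that will decide whether a system like this filters abuse or becomes another dismal automated censor.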


But on the plus side, Jigsaw seems aware of these issues, and it has sourced some pretty high-quality training data ("130,000 snippets of discussion around Wikipedia pages" and "17 million comments from [New York] Times stories, along with data about which of those comments were flagged as inappropriate by moderators"). However, it doesn't seem to have plans to make that data public for people who want to independently audit the sampling methodology, a prerequisite for good technical work that's as old as the scientific method.
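For a sense of what an independent audit would even look like, here's a hypothetical sketch (the file name, column names, and label encoding are assumptions, not Jigsaw's actual schema) that checks whether flag rates and comment lengths differ wildly between sources, the kind of sampling skew outside researchers would want to rule out:

```python
# Hypothetical audit of a labeled-comments dataset: the CSV name and columns
# ("source", "text", "flagged") are illustrative, not Jigsaw's real schema.
import csv
from collections import defaultdict

def audit(path: str) -> None:
    totals = defaultdict(int)       # comments per source
    flagged = defaultdict(int)      # flagged comments per source
    lengths = defaultdict(list)     # comment word counts per source
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            src = row["source"]                 # e.g. "wikipedia" or "nyt"
            totals[src] += 1
            flagged[src] += row["flagged"] == "1"
            lengths[src].append(len(row["text"].split()))
    for src in sorted(totals):
        rate = flagged[src] / totals[src]
        avg_len = sum(lengths[src]) / len(lengths[src])
        print(f"{src}: n={totals[src]}, flag rate={rate:.1%}, mean length={avg_len:.1f} words")

if __name__ == "__main__":
    audit("labeled_comments.csv")   # hypothetical export of the training data
```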

Conversation AI builds on a very successful precedent: Riot Games' League of Legends, where a combination of software-based controls, sanctions, nudges, and moderation has created a multiplayer gaming environment with markedly less harassment and abuse than its rivals.


Jigsaw's methodologies are a mixed bag, but in places they're very good, as with its practice of including people on the receiving end of abuse, censorship, and related problems early and continuously throughout its design, development, and refinement process.


At one recent meeting, Cohen leans over a conference table as 15 or so Jigsaw recruits—engineers, designers, and foreign policy wonks—prepare to report back from the dark corners of the Internet. "We are not going to be one of those groups that sits in our offices and imagines what vulnerable populations around the world are experiencing," Cohen says. "We're going to get to know our users." He speaks in a fast-forward, geeky patter that contrasts with his blue-eyed, broad-shouldered good looks, like a politician disguised as a Silicon Valley executive or vice versa. "Every single day, I want us to feel the burden of the responsibility we're shouldering."

We hear about an Albanian LGBT activist who tries to hide his identity on Facebook despite its real-names-only policy, an administrator for a Libyan youth group wary of government infiltrators, a defector's memories from the digital black hole of North Korea. Many of the T-shirt-and-sandal-wearing Googlers in the room will later be sent to some of those far-flung places to meet their contacts face-to-face.

"They'll hear stories about people being tortured for their passwords or of state-sponsored cyberbullying," Cohen tells me later. The purpose of these field trips isn't simply to get feedback for future products, he says. They're about creating personal investment in otherwise distant, invisible problems—a sense of investment Cohen says he himself gained in his twenties during his four-year stint in the State Department, and before that during extensive travel in the Middle East and Africa as a student.

Cohen reports directly to Alphabet's top execs, but in practice, Jigsaw functions as Google's blue-sky, human-rights-focused skunkworks. At the group's launch, Schmidt declared its audacious mission to be "tackling the world's toughest geopolitical problems" and listed some of the challenges within its remit: "money laundering, organized crime, police brutality, human trafficking, and terrorism." In an interview in Google's New York office, Schmidt (now chair of Alphabet) summarized them to me as the "problems that bedevil humanity involving information."


INSIDE GOOGLE'S INTERNET JUSTICE LEAGUE AND ITS AI-POWERED WAR ON TROLLS
[Andy Greenberg/Wired]