Google ordered its secretive "raters'" hours cut, so now they're going public

Google often boasts about the 10,000 skilled raters who test its results, reporting weird kinks in the ranking algorithms and classifiers that the company uses for everything from search results to ad placement to automated photo recognition.


But these 10,000 people don't work for Google; they are contracted through giant, obscure piecework companies like Leapforce, paid rates they can't negotiate for short bursts of work, without being entitled to Google's legendary perks.

The work is gruelling, requiring hours of continuous training in Google's ever-shifting criteria for what is and isn't acceptable, all of it unpaid, and constant assessment through tasks that are actually performance evaluations. Workers rely on extremely long hours to cobble together a living wage.

So when Leapforce announced that it would cap the hours its workers could clock at 26 per week as of June 1, workers panicked. The company's explanation was incoherent, leading to the realization that the hours cut had been ordered by Google.


The low morale that ensued has led Leapforce workers and other raters to go public about the weird, secretive world they work in, talking to Annalee Newitz of Ars Technica for a long, in-depth, and fascinating piece on the future of the gig economy.

The tasks themselves are widely varied. Some ask raters to evaluate whether a search result is useful or an audio file has been transcribed correctly, while others solicit feedback on the behavior of Android apps. According to raters, some tasks can feel "creepy." These are usually tasks related to personalization services, which require raters to first give Google access to their e-mail, chats, photos, and other Google services they use. Google then turns the rater's personal data into tasks that allow them to give feedback on how well Google's personalization algorithms work. "I don't like the photo tasks," one rater said. "They will show you pictures you have taken and have you rate them." Other raters expressed discomfort at giving Google access to their personal accounts.

Though each task is brief, a rater's work isn't easy. Before they begin at Leapforce, all raters must pass a series of rigorous exams to make sure they understand the 160-page book of guidelines that Google provides to raters. "It's hard to pass," one rater told Ars. "I have referred nine people to this job. Every one of them failed the exam."

For those who do pass, the testing doesn't end. Every few months, raters have to familiarize themselves with important updates to the guidelines, like the recent "upsetting/offensive" flag rules. Plus, each week brings new kinds of tasks or tweaks to what counts as a right answer on old tasks.


"The learning curve is steep," one rater said. Raters are encouraged to take weekly quizzes to keep up to date with changes and to make sure their task responses are in line with other raters. They say they are not paid for this re-training or testing, even though it can take a few hours every week.

At any time, raters may find themselves assigned a job called a "review task." In reality, it's a performance evaluation. Google has already figured out the right answers to the task and uses the review to make sure each rater gives answers that are calibrated with what the company expects. If a rater is too far off the mark, he or she is limited to one hour per day of work until scores improve.

The secret lives of Google raters
[Annalee Newitz/Ars Technica]