Google tweaks its machinery to deal with fake news

Last December, anyone who searched Google for "did the Holocaust happen?" would see a white supremacist website at the top of the results. To counteract these kinds of problems, Google updated its Search Quality Rater Guidelines to devalue "misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories," says Google vice president of engineering Ben Gomes.

In addition to changing its algorithm to reduce "offensive or clearly misleading content," Google is also giving users the ability to rate and flag Autocomplete predictions and Featured Snippets results:

When you visit Google, we aim to speed up your experience with features like Autocomplete, which helps predict the searches you might be typing to quickly get to the info you need, and Featured Snippets, which shows a highlight of the information relevant to what you're looking for at the top of your search results. The content that appears in these features is generated algorithmically and is a reflection of what people are searching for and what's available on the web. This can sometimes lead to results that are unexpected, inaccurate or offensive. Starting today, we're making it much easier for people to directly flag content that appears in both Autocomplete predictions and Featured Snippets. These new feedback mechanisms include clearly labeled categories so you can inform us directly if you find sensitive or unhelpful content. We plan to use this feedback to help improve our algorithms.
