In an effort to reduce the amount of false and extremist content pushed to viewers, Google's video platform YouTube will adjust its recommendation algorithm for UK users. After six months of a similar trial in the US, YouTube says recommendation-led views of such ‘borderline’ content fell by half.
Because the trial was so effective, it will now be extended to the UK, Ireland, South Africa “and other English-language markets”, says the site’s chief executive, Susan Wojcicki. According to Wojcicki, the move is intended to give quality content “more of a chance to shine”.
YouTube has long taken action against content that violates the site’s policies, removing infringing videos and issuing “strikes” against creators that can ultimately result in them being blocked from uploading new videos.
But only recently has the company moved against content that, in Wojcicki’s words, “brushes right up against our policy line”. This sort of content lies at the heart of fears that YouTube is a driver of extremist views worldwide: the combination of borderline content and a recommendation algorithm that rewards the most engaging videos can, critics argue, send audiences spiralling towards ever more radical viewing.
YouTube first took action against borderline content in the US earlier this year, and focused on videos that “could misinform users in harmful ways – such as videos promoting a phoney miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11”.
The company distinguishes between this sort of content and content that violates its terms of service, but where exactly that line falls is not always clear to outsiders.
YouTube to adjust UK algorithm to cut false and extremist content