Alphabet, parent company of Google and YouTube, told a U.S. House panel that it spends hundreds of millions of dollars on reviewing content each year, and claims to have identified at least one million "suspected terrorist videos" on YouTube in the first quarter of 2019.
Alphabet/Google/YouTube disclosed this information to lawmakers in an April 24 letter that was made public on Thursday.
The letter claims that a "manual review" by YouTube content moderation workers found that 90,000 of the videos identified as "suspected terrorist videos" violated YouTube's terrorism policy.
In March, following the live-streaming on social media of a mass shooting in New Zealand, the chair of the U.S. House Committee on Homeland Security urged the top executives of Google, Facebook Inc, Twitter Inc and Microsoft Corp to do a better job of removing violent political content.
After a briefing in March, Representative Max Rose, who chairs a subcommittee on intelligence and counter-terrorism, asked the four companies in an April 10 letter to disclose their budgets for counter-terrorism programs and the number of people working solely on those programs.
Rose said in a statement that Facebook had not responded and that the other firms did not fully or directly answer his questions.
Twitter said in an April 24 letter to Rose that "putting a dollar amount on our broader efforts is a complex request."
Twitter said a "substantial portion" of its 4,100-person global workforce is involved in reviewing content.