UK report denounces proliferation of terrorism videos on YouTube
London - Google “social media” and “terrorism” and the search returns more than 95 million results, many of them tales of vulnerable young people being self-radicalised online via sites such as YouTube and Twitter.
The British Parliament has published a sharply critical report outlining the large amount of extremist and offensive content shared on social media platforms and making recommendations about how companies should tackle this dangerous phenomenon.
“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” said Member of Parliament Yvette Cooper, who heads the parliament’s Home Affairs Committee, which drafted the report.
“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorism recruitment… Yet repeatedly, they have failed to do so. It is shameful,” Cooper said.
MPs said it was not acceptable that social media companies rely on users to report offensive content, saying they were effectively “outsourcing” the problem “at zero expense.”
The report was particularly critical of YouTube, which is owned by Google, pointing to the huge number of videos posted by extremist and terrorist groups such as the Islamic State (ISIS) and al-Qaeda but also by far-right groups.
The parliamentary report was published a few months after news that videos promoting ISIS were monetised, meaning that YouTube and the original poster were receiving money from advertising on videos that contained illegal content.
One ad from Mercedes was automatically paired with a pro-ISIS video that garnered more than 115,000 views, British media reports said. A YouTube advert usually earns about $7.70 per 1,000 views, meaning that the advert in question could have earned the video’s creators nearly $900.
Following the news, L’Oreal, McDonald’s, Starbucks and other major brands pulled their advertising from YouTube, sparking what has been called the “adpocalypse” among YouTubers. Several governments and government agencies, including the British government, pulled adverts from the video-sharing website, fearing a public relations disaster if taxpayer money was found to be funding terrorism.
“It is shocking that Google failed to perform basic due diligence regarding advertising on YouTube… which appeared alongside videos containing inappropriate and unacceptable content, some of which were created by terrorist organisations,” the report said.
“We note that Google can act quickly to remove videos from YouTube when they are found to infringe copyright rules but that the same prompt action is not taken when the material involves hateful or illegal content,” the report added.
For a video-sharing platform with more than 1.3 billion users, more than 300 hours of video uploaded every minute and almost 5 billion videos watched every day, policing content on that scale is a formidable challenge.
In a statement, Google’s UK and Ireland Managing Director Ronan Harris said the company had strict guidelines on content and spent “millions of dollars every year” to enforce standard practices. However, he said: “We don’t always get it right.”
“In a very small percentage of cases, ads appear against content that violates our monetisation policies. We promptly remove the ads in those instances but we know we can and must do more,” he said.
Britain’s parliamentary committee said that would not be enough and recommended that an entirely new system be put in place to guard against online extremism.
The report called on ministers to consult on requiring social media companies to contribute to the cost of the Metropolitan Police Counter Terrorism Internet Referral Unit, which was set up in 2010 with a brief to remove unlawful terrorist material from the internet. The report also suggested imposing multimillion-pound fines on social media firms that fail to remove illegal content within a strict time frame.
“The social media companies have been shirking their responsibilities even when there has been illegal material on their platforms. Even when informed, they have taken no action and this is not acceptable anymore,” said Fiyaz Mughal, founder of Tell MAMA, a national project that records and measures anti-Muslim incidents in Britain.
Mughal, who testified before the cross-party Home Affairs Select Committee, said social media needed to be regulated, just like other media, to counter the spread of extremism and hate speech.
“They believed that they were immune to public criticism. The Home Affairs Select Committee has shown that they are not immune from public pressure and this must continue to get them to act responsibly. Lives have been damaged because of the inaction of social media companies on illegal material and this simply cannot continue,” he added.