UK report denounces proliferation of terrorism videos on YouTube

Sunday 04/06/2017
Difficult challenge. Signage inside the YouTube Space LA offices in Los Angeles. (AP)

London - Google “social media” and “terrorism” and the results deliver more than 95 million hits, many of which are tales of vulnerable young people being self-radicalised online via sites such as YouTube and Twitter.
The British Parliament has published an extremely critical report outlining the large amount of extremist and offensive content shared on social media platforms and made recommendations about how companies should tackle this dangerous phenomenon.
“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” said Member of Parliament Yvette Cooper, who heads the parliament’s Home Affairs Committee, which drafted the report.
“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorism recruitment… Yet repeatedly, they have failed to do so. It is shameful,” Cooper said.
MPs said it was not acceptable that social media companies rely on users to report offensive content, saying they were ostensibly “outsourcing” the problem “at zero expense.”
The report was particularly critical of YouTube, which is owned by Google, pointing to the huge number of videos posted by extremist and terrorist groups such as the Islamic State (ISIS) and al-Qaeda but also by far-right groups.
The parliamentary report was published a few months after news that videos promoting ISIS were monetised, meaning that YouTube and the original poster were receiving money from advertising on videos that contained illegal content.
One ad from Mercedes was automatically paired with a pro-ISIS video that garnered more than 115,000 views, British media reports said. A YouTube advert usually earns about $7.70 per 1,000 views, meaning that the advert in question could have earned the video creators nearly $900.
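As a quick sanity check of those reported figures (the $7.70-per-1,000-views rate is an industry average cited in the coverage; actual payouts vary widely by ad and audience):

```python
# Back-of-the-envelope check of the ad-revenue estimate reported above.
RATE_PER_1000_VIEWS = 7.70  # typical YouTube ad rate cited in reports (USD)
views = 115_000             # views on the pro-ISIS video, per British media

revenue = views / 1000 * RATE_PER_1000_VIEWS
print(f"${revenue:.2f}")  # → $885.50, i.e. nearly $900
```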
Following the news, L’Oreal, McDonald’s, Starbucks and other major brands pulled their advertising from YouTube, sparking what has been called the “adpocalypse” among YouTubers. Several governments and government agencies, including the British government, pulled adverts from the video-sharing website, fearing a public relations disaster if taxpayer money was found to be funding terrorism.
“It is shocking that Google failed to perform basic due diligence regarding advertising on YouTube… which appeared alongside videos containing inappropriate and unacceptable content, some of which were created by terrorist organisations,” the report said.
“We note that Google can act quickly to remove videos from YouTube when they are found to infringe copyright rules but that the same prompt action is not taken when the material involves hateful or illegal content,” the report added.
For a video-sharing platform with more than 1.3 billion users, more than 300 hours of video uploaded every minute and almost 5 billion videos watched every day, policing content is a difficult challenge.
In a statement, Google’s UK and Ireland Managing Director Ronan Harris said the company had strict guidelines on content and spent “millions of dollars every year” to enforce standard practices. However, he said: “We don’t always get it right.”
“In a very small percentage of cases, ads appear against content that violates our monetisation policies. We promptly remove the ads in those instances but we know we can and must do more,” he said.
Britain’s parliamentary committee said that would not be enough and recommended that an entirely new system be put in place to guard against online extremism.
The report called on ministers to consult on requiring social media companies to contribute to the cost of the Metropolitan Police Counter Terrorism Internet Referral Unit, which was set up in 2010 with a brief to remove unlawful terrorist material from the internet. The report suggested imposing multimillion-pound fines for social media firms that fail to remove illegal content within a strict time frame.
“The social media companies have been shirking their responsibilities even when there has been illegal material on their platforms. Even when informed, they have taken no action and this is not acceptable anymore,” said Fiyaz Mughal, founder of Tell MAMA, a national project that records and measures anti-Muslim incidents in Britain.
Mughal, who testified before the cross-party Home Affairs Select Committee, said social media needed to be regulated, just like other media, to counter the spread of extremism and hate speech.
“They believed that they were immune to public criticism. The Home Affairs Select Committee has shown that they are not immune from public pressure and this must continue to get them to act responsibly. Lives have been damaged because of the inaction of social media companies on illegal material and this simply cannot continue,” he added.