Suspicion of Facebook bias extends to Mideast
Washington - Conspiracy theories accusing Facebook of suppressing articles of one political agenda while promoting those of another may not be far from the truth, former Facebook contractors claim.
The reports first appeared in Gizmodo, a tech blog that based its article on testimonies of anonymous former Facebook “news curators”. The report prompted the US Senate Committee on Commerce, Science and Transportation to send a letter to Facebook founder Mark Zuckerberg demanding clarification of the news curating process at the social platform behemoth.
Facebook appears to have launched human-administered news curating sometime in 2014 behind a tight veil of secrecy. The Arab Weekly learned that contractors hired as news curators, most of them young journalists from privileged backgrounds, were encouraged by management to reach out to friends and colleagues for new hires. The work was so secretive that the news curators could not describe it to potential recruits except in the vaguest of terms.
The Gizmodo report sheds light on news curating at Facebook as something akin to editorial meetings at a news outlet. News curators decide the hierarchy of daily news events, ranging from politics to international affairs, sports and entertainment. This becomes problematic when Facebook users read the resulting “trending” news and assume that it is an unfiltered ranking of stories by popularity, as Facebook claims. Some 600 million users might see a trending story at any one time.
The allegations will feed the suspicions of special interest groups who wonder whether Facebook is manipulating news about their respective causes. Avid readers of Israeli and Palestinian news often post such suspicions on Facebook.
An Israeli group tested censors at Facebook and concluded there was an anti-Israel bias. The group created two identical pages with violent content, but one page appeared to attack Israelis while the other attacked Palestinians. The group reported both pages for inappropriate content. Censors at Facebook shut down the anti-Palestinian page but said that the identical content on the anti-Israeli page did not violate Facebook rules.
Facebook’s censors work all over the world, and there was no way to know whether the same censor made both contradictory decisions or whether two censors reviewed the pages and reached different judgments.
Facebook has repeatedly said that, although it relies on human judgment in censoring, it issues clear guidelines designed to minimise or eliminate human bias. Facebook said the same applies to its news curators.
According to Gizmodo, however, the news curators suppressed conservative news and at times injected news into the trending column.
“People stopped caring about Syria. If it wasn’t trending on Facebook, it would make Facebook look bad,” one former curator told Gizmodo, explaining that curators “injected” Syria back into the trending column to make the news relevant again.
The allegations may not be surprising given the homogeneity of the corporate culture at Facebook in general and among news curators in particular. As with most high-tech companies, Facebook’s ranks are young, white and left of centre. Zuckerberg, 31, is unapologetic about expressing his views on political issues of the day, such as immigration (he is in favour of it).
Indeed, the general zeitgeist at Facebook might encourage employees to wonder out loud, as they did, according to a screenshot leaked to Gizmodo, whether they should do something to manipulate Republican front runner Donald Trump’s chances of winning.
While a news media organisation might find it well within its First Amendment rights under the US Constitution to stake out a clear political position, Facebook would enter murky territory if it did so.
The platform has unprecedented reach around the world, with more than 1 billion users, making it virtually impossible for it to align with a political candidate without a backlash.
More importantly, Facebook appears to be building an algorithm to curate the news automatically using artificial intelligence. The algorithm is “learning” the process from the news curators, including some who spoke to Gizmodo.
This means the algorithm will have programmed into it the human biases of the curators who unwittingly played a part in training it. Left unchecked, the algorithm will amplify those biases, as users believe it offers an “objective” representation of the news.