Suspicion of Facebook bias extends to Mideast

Sunday 22/05/2016
Facebook CEO Mark Zuckerberg speaks on stage during the Facebook F8 conference in San Francisco, California, on April 12th.

Washington - Conspiracy theories accusing Facebook of suppressing articles reflecting one political agenda while promoting those of another may not be far from the truth, former Facebook contractors claim.
The reports first appeared in Gizmodo, a tech blog that based its article on testimonies of anonymous former Facebook “news curators”. The report prompted the US Senate Committee on Commerce, Science and Transportation to send a letter to Facebook founder Mark Zuckerberg demanding clarification of the news curating process at the social platform behemoth.
Facebook appears to have launched human-administered news curating sometime in 2014 behind a tight veil of secrecy. The Arab Weekly learned that contractors hired as news curators, most of them young journalists from privileged backgrounds, were encouraged by management to reach out to friends and colleagues for new hires. The work was so secretive that the news curators could not describe it to potential recruits except in the vaguest of terms.
The Gizmodo report sheds light on news curating at Facebook as something akin to editorial meetings at a news outlet. News curators decide the hierarchy of daily news events, ranging from politics and international affairs to sports and entertainment. This becomes problematic when Facebook users read the resulting “trending” news and assume that it is an unfiltered representation of news by popularity, as Facebook claims. Some 600 million users might see a trending story at any one time.
The allegations will play into the suspicions of special interest groups who wonder whether Facebook is manipulating news about their respective causes. Avid readers of Israeli and Palestinian news often post such suspicions on Facebook.
An Israeli group tested censors at Facebook and concluded there was an anti-Israel bias. The group created two identical pages with violent content, but one page appeared to attack Israelis while the other attacked Palestinians. The group reported both pages for inappropriate content. Censors at Facebook shut down the anti-Palestinian page but said that the identical content on the anti-Israeli page did not violate Facebook rules.
Facebook’s censors are spread around the world and there was no way to know whether the same censor made both contradictory decisions or whether two censors worked on the pages and reached different judgments.
Facebook has repeatedly said that, although it relies on human judgment in censoring, it issues clear guidelines designed to minimise or eliminate human bias. Facebook said the same applies to its news curators.
According to Gizmodo, however, the news curators suppressed conservative news and at times injected news into the trending column.
“People stopped caring about Syria. If it wasn’t trending on Facebook, it would make Facebook look bad,” one former curator told Gizmodo, explaining that curators “injected” Syria back into the trending column to make the news relevant again.
The allegations may not be surprising given the homogeneity of the corporate culture at Facebook in general and among news curators in particular. As with most high-tech companies, Facebook cadres are young, white and left of centre. Zuckerberg, 31, is unapologetic about expressing his views on political issues of the day, such as immigration (he is in favour of it).
Indeed, the general zeitgeist at Facebook might encourage employees to wonder out loud, as they did according to a screenshot leaked to Gizmodo, whether they should do something to manipulate Republican front runner Donald Trump’s chances of winning.
While a news media organisation might find it well within its First Amendment rights under the US Constitution to stake out a clear political position, Facebook would run into murky territory if it did the same.
The platform has unprecedented reach around the world, with more than 1 billion users, making it virtually impossible for it to align with a political candidate without a backlash.
More importantly, Facebook appears to be building an algorithm to curate the news automatically using artificial intelligence. The algorithm is “learning” the process from the news curators, including some who spoke to Gizmodo.
This means the human biases of the curators who unwittingly played a part in training it are programmed into it. Left unchecked, the algorithm will amplify those biases as users believe it is an “objective” representation of the news.
