Facebook’s secret rules mean that it’s ok to be anti-Islam, but not anti-gay | Ars Technica

For all those interested in free speech and hate speech issues, this is a really good analysis of how Facebook is grappling with the issue and with its definitions of protected groups. I urge all readers to go through the slide show (you need to go to the article to access it), which captures some of the complexities involved:

In the wake of a terrorist attack in London earlier this month, a US congressman wrote a Facebook post in which he called for the slaughter of “radicalized” Muslims. “Hunt them, identify them, and kill them,” declared US Rep. Clay Higgins, a Louisiana Republican. “Kill them all. For the sake of all that is good and righteous. Kill them all.”

Higgins’ plea for violent revenge went untouched by Facebook workers who scour the social network deleting offensive speech.

But a May posting on Facebook by Boston poet and Black Lives Matter activist Didi Delgado drew a different response.

“All white people are racist. Start from this reference point, or you’ve already failed,” Delgado wrote. The post was removed, and her Facebook account was disabled for seven days.

A trove of internal documents reviewed by ProPublica sheds new light on the secret guidelines that Facebook’s censors use to distinguish between hate speech and legitimate political expression. The documents reveal the rationale behind seemingly inconsistent decisions. For instance, Higgins’ incitement to violence passed muster because it targeted a specific sub-group of Muslims—those who are “radicalized”—while Delgado’s post was deleted for attacking whites in general.

Over the past decade, the company has developed hundreds of rules, drawing elaborate distinctions between what should and shouldn’t be allowed in an effort to make the site a safe place for its nearly 2 billion users. The issue of how Facebook monitors this content has become increasingly prominent in recent months, with the rise of “fake news”—fabricated stories that circulated on Facebook like “Pope Francis Shocks the World, Endorses Donald Trump For President, Releases Statement”—and growing concern that terrorists are using social media for recruitment.

While Facebook was credited during the 2010-2011 “Arab Spring” with facilitating uprisings against authoritarian regimes, the documents suggest that, at least in some instances, the company’s hate-speech rules tend to favor elites and governments over grassroots activists and racial minorities. In so doing, they serve the business interests of the global company, which relies on national governments not to block its service to their citizens.

One Facebook rule, which is cited in the documents but that the company said is no longer in effect, banned posts that praise the use of “violence to resist occupation of an internationally recognized state.” The company’s workforce of human censors, known as content reviewers, has deleted posts by activists and journalists in disputed territories such as Palestine, Kashmir, Crimea, and Western Sahara.

One document trains content reviewers on how to apply the company’s global hate speech algorithm. The slide identifies three groups: female drivers, black children, and white men. It asks: which group is protected from hate speech? The correct answer: white men.

The reason is that Facebook deletes curses, slurs, calls for violence, and several other types of attacks only when they are directed at “protected categories”—based on race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation, and serious disability/disease. It gives users broader latitude when they write about “subsets” of protected categories. White men are considered a group because both traits are protected, while female drivers and black children, like radicalized Muslims, are subsets, because one of their characteristics is not protected. (The exact rules are in the slide show below.)
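The subset rule described above can be sketched in a few lines of Python. The list of protected categories comes from the article; the function name and the "unprotected" trait labels (occupation, age, ideology) are illustrative assumptions, not Facebook's actual code:

```python
# Protected categories as listed in the leaked training documents.
PROTECTED_CATEGORIES = {
    "race", "sex", "gender identity", "religious affiliation",
    "national origin", "ethnicity", "sexual orientation",
    "serious disability or disease",
}

def is_protected_group(traits):
    """A group is protected only if *every* trait defining it is a
    protected category; a single unprotected trait makes it a 'subset',
    which gets less protection under the rules described above."""
    return all(t in PROTECTED_CATEGORIES for t in traits)

# "White men": race + sex, both protected -> protected group
print(is_protected_group({"race", "sex"}))                         # True

# "Female drivers": sex + occupation -> subset, not protected
print(is_protected_group({"sex", "occupation"}))                   # False

# "Black children": race + age -> subset, not protected
print(is_protected_group({"race", "age"}))                         # False

# "Radicalized Muslims": religious affiliation + ideology -> subset
print(is_protected_group({"religious affiliation", "ideology"}))   # False
```

This toy model reproduces the slide's quiz result: of female drivers, black children, and white men, only white men qualify as a fully protected group.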

Facebook has used these rules to train its “content reviewers” to decide whether to delete or allow posts. Facebook says the exact wording of its rules may have changed slightly in more recent versions. ProPublica recreated the slides.

Behind this seemingly arcane distinction lies a broader philosophy. Unlike American law, which permits preferences such as affirmative action for racial minorities and women for the sake of diversity or redressing discrimination, Facebook’s algorithm is designed to defend all races and genders equally.

But Facebook says its goal is different—to apply consistent standards worldwide. “The policies do not always lead to perfect outcomes,” said Monika Bickert, head of global policy management at Facebook. “That is the reality of having policies that apply to a global community where people around the world are going to have very different ideas about what is OK to share.”

Facebook’s rules constitute a legal world of their own. They stand in sharp contrast to the United States’ First Amendment protections of free speech, which courts have interpreted to allow exactly the sort of speech and writing censored by the company’s hate speech algorithm. But they also differ—for example, in permitting postings that deny the Holocaust—from more restrictive European standards.

The company has long had programs to remove obviously offensive material like child pornography from its stream of images and commentary. Recent articles in the Guardian and Süddeutsche Zeitung have detailed the difficult choices that Facebook faces regarding whether to delete posts containing graphic violence, child abuse, revenge porn and self-mutilation.

The challenge of policing political expression is even more complex. The documents reviewed by ProPublica indicate, for example, that Donald Trump’s posts about his campaign proposal to ban Muslim immigration to the United States violated the company’s written policies against “calls for exclusion” of a protected group. As The Wall Street Journal reported last year, Facebook exempted Trump’s statements from its policies at the order of Mark Zuckerberg, the company’s founder and chief executive.

The company recently pledged to nearly double its army of censors to 7,500, up from 4,500, in response to criticism of a video posting of a murder. Their work amounts to what may well be the most far-reaching global censorship operation in history. It is also the least accountable: Facebook does not publish the rules it uses to determine what content to allow and what to delete.

Users whose posts are removed are not usually told what rule they have broken, and they cannot generally appeal Facebook’s decision. Appeals are currently only available to people whose profile, group, or page is removed.

The company has begun exploring adding an appeals process for people who have individual pieces of content deleted, according to Bickert. “I’ll be the first to say that we’re not perfect every time,” she said.

Facebook is not required by US law to censor content. A 1996 federal law gave most tech companies, including Facebook, legal immunity for the content users post on their services. The law, Section 230 of the Communications Decency Act (enacted as part of the Telecommunications Act of 1996), was passed after Prodigy was sued and held liable for defamation for a post written by a user on a computer message board.

The law freed up online publishers to host online forums without having to legally vet each piece of content before posting it, the way that a news outlet would evaluate an article before publishing it. But early tech companies soon realized that they still needed to supervise their chat rooms to prevent bullying and abuse that could drive away users.

Source: Facebook’s secret rules mean that it’s ok to be anti-Islam, but not anti-gay | Ars Technica

About Andrew
Andrew blogs and tweets public policy issues, particularly the relationship between the political and bureaucratic levels, citizenship and multiculturalism. His latest book, Policy Arrogance or Innocent Bias, recounts his experience as a senior public servant in this area.

