Facebook says human rights report shows it should do more in Myanmar

FILE PHOTO: A Facebook panel is seen during the Cannes Lions International Festival of Creativity, in Cannes, France, June 20, 2018. REUTERS/Eric Gaillard/File Photo

November 6, 2018

By Paresh Dave

SAN FRANCISCO (Reuters) – Facebook Inc on Monday said a human rights report it commissioned on its presence in Myanmar showed it had not done enough to prevent its social network from being used to incite violence.

The report by San Francisco-based nonprofit Business for Social Responsibility (BSR) recommended that Facebook more strictly enforce its content policies, increase engagement with both Myanmar officials and civil society groups and regularly release additional data about its progress in the country.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” Alex Warofka, a Facebook product policy manager, said in a blog post.

In the report, which Facebook released publicly, BSR also warned that Facebook must be prepared to handle a likely onslaught of misinformation during Myanmar’s 2020 elections, as well as new problems as use of its WhatsApp messaging service grows in the country.

A Reuters special report https://www.reuters.com/investigates/special-report/myanmar-facebook-hate in August found that Facebook failed to promptly heed numerous warnings from organizations in Myanmar about social media posts fueling attacks on minority groups such as the Rohingya.

In August 2017 the military led a crackdown in Myanmar’s Rakhine State in response to attacks by Rohingya insurgents, pushing more than 700,000 Muslims to neighboring Bangladesh, according to U.N. agencies.

The company in August removed several Myanmar military officials from the platform to prevent the spread of “hate and misinformation,” marking the first time it had banned a country’s military or political leaders.

It also removed dozens of accounts for engaging in a campaign that “used seemingly independent news and opinion pages to covertly push the messages of the Myanmar military.”

The move came hours after United Nations investigators said the army carried out mass killings and gang rapes of Muslim Rohingya with “genocidal intent.”

Facebook said it has begun correcting shortcomings.

Facebook said that it now has 99 Myanmar language specialists reviewing potentially questionable content. In addition, it has expanded use of automated tools to reduce distribution of violent and dehumanizing posts while they undergo review.

In the third quarter, the company said it “took action” on about 64,000 pieces of content that violated its hate speech policies. About 63 percent were identified by automated software, up from 52 percent in the prior quarter.

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned Facebook faces several unresolved challenges in Myanmar.

BSR said that locating staff in the country, for example, could improve Facebook’s understanding of how its services are used locally, but warned that such workers could be targeted by Myanmar’s military, which has been accused by the U.N. of ethnic cleansing of the Rohingya.

(Reporting by Paresh Dave and Antoni Slodkowski; Editing by Peter Henderson and Michael Perry)
