Facebook said Monday that an independent report it commissioned found the company hasn't always done enough to prevent its platform from being used to spread hate speech that has fueled deadly violence in Myanmar.
The report, conducted by the nonprofit Business for Social Responsibility, also offered Facebook recommendations for helping improve human rights in the country, including stricter enforcement of content policies and regular publishing of data related to human rights violations.
"The report concludes that, prior to this year, we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence," Alex Warofka, Facebook product policy manager, wrote in a blog post Monday. "We agree that we can and should do more."
The report comes as the military in Myanmar stands accused of widespread genocide. In March, United Nations human rights experts investigating violence in the country concluded that Facebook played a "determining role" in the crisis, in which hundreds of thousands of Rohingya Muslims have fled the country.
David Mathieson, an independent analyst who has lived and worked in the region for years, told CBSN earlier this year that social media has been "one of the most damaging aspects of this entire crisis."
BSR recommended that Facebook improve enforcement of its community standards, which describe what is and isn't allowed on the social network. Facebook said that central to this effort is a team, now nearly fully staffed, that combines an understanding of local Myanmar issues with policy and operations expertise.
Facebook said it's using the social-listening tool CrowdTangle to analyze potentially harmful content and understand how it spreads in Myanmar. The company is also using artificial intelligence to identify and prevent the spread of posts that contain graphic violence or dehumanizing comments.
The report also suggested that Facebook preserve and share data that can be used to help evaluate human rights violations, particularly data specific to the situation in Myanmar, so the international community can better assess the company's enforcement efforts.
"We are committed to working with and providing information to the relevant authorities as they investigate international human rights violations in Myanmar, and we are preserving data for this purpose," Warofka wrote, noting it took this approach with content and accounts associated with the Myanmar military it removed in August and October.
Another recommendation calls for Facebook to establish a standalone policy defining its approach to content moderation with respect to human rights, a suggestion Warofka said Facebook is "looking into."
The U.N.'s top human rights officials recommended in August that Myanmar military leaders be prosecuted for genocide against Rohingya Muslims. More than 700,000 Rohingya Muslims have fled Myanmar's Rakhine state since rebel attacks sparked a military backlash in August 2017.
U.N. investigators have reportedly found numerous crimes committed against the minority in Myanmar, including gang rape, enslavement, the torching of villages and the killing of children. Roughly 10,000 people have reportedly been killed in the violence, and hundreds of thousands have fled the country.