
‘Kill more’: Facebook fails to detect hate against Rohingya


Nepalnews
AP
2022 Mar 22, 12:01, Jakarta

A new report has found that Facebook failed to detect blatant hate speech and calls to violence against Myanmar’s Rohingya Muslim minority years after such behaviour was found to have played a determining role in the genocide against them.

The report, shared exclusively with The Associated Press, showed that the rights group Global Witness submitted eight paid ads to Facebook for approval, each containing a different version of hate speech against the Rohingya. Facebook approved all eight ads for publication.

The group pulled the ads before they were posted or paid for, but the results confirmed that despite its promises to do better, Facebook’s leaky controls still fail to detect hate speech and calls for violence on its platform.

The army conducted what it called a clearance campaign in western Myanmar’s Rakhine state in 2017 after an attack by a Rohingya insurgent group. More than 700,000 Rohingya fled into neighbouring Bangladesh and security forces were accused of mass rapes, killings and torching thousands of homes.

Rohingya Muslim children who crossed over from Myanmar into Bangladesh, wait squashed against each other to receive food handouts distributed to children and women by a Turkish aid agency at Thaingkhali refugee camp, Bangladesh, Saturday, Oct. 21, 2017. A new report has found that Facebook failed to detect blatant hate speech and calls to violence against Myanmar’s Rohingya Muslim minority years after such behavior was found to have played a determining role in the genocide against them. (AP Photo/Dar Yasin, File)

On Feb. 1 of last year, Myanmar’s military forcibly took control of the country, jailing democratically elected government officials. Rohingya refugees have condemned the military takeover and said it makes them more afraid to return to Myanmar.

Experts say such ads have continued to appear and that despite its promises to do better and assurances that it has taken its role in the genocide seriously, Facebook still fails even the simplest of tests — ensuring that paid ads that run on its site do not contain hate speech calling for the killing of Rohingya Muslims.

“The current killing of the Kalar is not enough, we need to kill more!” read one proposed paid post from Global Witness, using a slur often used in Myanmar to refer to people of East Indian or Muslim origin.

“They are very dirty. The Bengali/Rohingya women have a very low standard of living and poor hygiene. They are not attractive,” read another.

“These posts are shocking in what they encourage and are a clear sign that Facebook has not changed or done what they told the public they would do: properly regulate themselves,” said Ronan Lee, a research fellow at the Institute for Media and Creative Industries at Loughborough University, London.

The eight ads from Global Witness all used hate speech language taken directly from the United Nations Independent International Fact-Finding Mission on Myanmar’s report to the Human Rights Council. Several examples were drawn from past Facebook posts.

The fact that Facebook approved all eight ads is especially concerning because the company claims to hold advertisements to an “even stricter” standard than regular, unpaid posts, according to its help centre page for paid advertisements.

“I accept the point that eight isn’t a very big number. But I think the findings are really stark, that all eight of the ads were accepted for publication,” said Rosie Sharpe, a campaigner at Global Witness. “I think you can conclude from that that the overwhelming majority of hate speech is likely to get through.”

Facebook’s parent company Meta Platforms Inc. said it has invested in improving its safety and security controls in Myanmar, including banning military accounts after the Tatmadaw, as the armed forces are locally known, seized power and imprisoned elected leaders in the 2021 coup.

“We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content,” Rafael Frankel, director of public policy for emerging markets at Meta Asia Pacific wrote in an e-mailed statement to AP on March 17. “This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018.”

Facebook has been used to spread hate speech and amplify military propaganda in Myanmar in the past.

Shortly after Myanmar became connected to the internet in 2000, Facebook paired with the country’s telecom providers to let customers use the platform without having to pay for data, which was still expensive at the time. Use of the platform exploded. For many in Myanmar, Facebook became the internet itself.

Local internet policy advocates repeatedly told Facebook that hate speech was spreading across the platform, often targeting the Muslim Rohingya minority in the majority-Buddhist nation.

For years, Facebook failed to invest in content moderators who spoke local languages or fact-checkers who understood the political situation in Myanmar, and it failed to close specific accounts or delete pages being used to propagate hatred of the Rohingya, said Tun Khin, president of Burmese Rohingya Organization UK, a London-based Rohingya advocacy organization.

In March 2018, less than six months after hundreds of thousands of Rohingya fled violence in western Myanmar, Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, told reporters social media had “substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public.”

“Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media,” Darusman said.

Asked about Myanmar a month later at a U.S. Senate hearing, Meta CEO Mark Zuckerberg said Facebook planned to hire “dozens” of Burmese speakers to moderate content and would work with civil society groups to identify hate figures and develop new technologies to combat hate speech.

“Hate speech is very language-specific. It’s hard to do it without people who speak the local language and we need to ramp up our effort there dramatically,” Zuckerberg said.

Yet in internal files leaked by whistleblower Frances Haugen last year, AP found that breaches persisted. The company stepped up efforts to combat hate speech but never fully developed the tools and strategies required to do so.

Rohingya refugees have sued Facebook for more than $150 billion, accusing it of failing to stop hate speech that incited violence against the Muslim ethnic group by military rulers and their supporters in Myanmar. Rohingya youth groups based in the Bangladesh refugee camps have filed a separate complaint in Ireland with the 38-nation Organization for Economic Cooperation and Development calling for Facebook to provide some remediation programs in the camps.

The company, now called Meta, has refused to say how many of its content moderators read Burmese and can thus detect hate speech in Myanmar.

“Rohingya genocide survivors continue to live in camps today and Facebook continues to fail them,” said Tun Khin. “Facebook needs to do more.”
