(Bloomberg) — In July, Facebook Inc. quietly hired Miranda Sissons, a 49-year-old human rights activist whose previous work has included stints at the Australian diplomatic service and the International Center for Transitional Justice. The hiring, which was never formally announced, is part of a broader effort by the company to atone for more than once failing to stop online abuse on Facebook from spilling over into real-world violence.
Human rights advocates in places like Sri Lanka, the Philippines, India and Brazil have long complained that the company has refused to acknowledge mounting evidence about the dangers of digital hate. As Facebook pursued world-changing growth, particularly in developing countries, it didn’t always have local staff there, or even employees who spoke the language. In Myanmar, a wave of online hate preceded a campaign of violence against the country’s Rohingya minority that led to thousands of deaths and the displacement of over 700,000 people. An independent report Facebook commissioned in 2018 found that it bore partial responsibility for fueling the conflict.
Immediately after taking the job, Sissons took a five-day trip to the country. “I was deeply, deeply aware of the criticism of Facebook’s inaction in Myanmar, and deeply aware of the struggles humankind is facing with the impact of social media,” Sissons told Bloomberg News earlier this month in her first press interview in her new role. “This is one of the greatest challenges of our time.”
Sissons’ work is part of a broader reckoning within the technology industry, which has been forced to reexamine its role in world conflicts. Several months before Facebook hired Sissons, Twitter Inc. brought on Cynthia Wong, a former researcher at Human Rights Watch, to be its human rights director. As with Facebook, Twitter never announced the hiring.
In discussions with more than a dozen people familiar with Facebook’s work on human rights, a picture emerges of a company that has been moving rapidly but, according to its skeptics, not always effectively. One Facebook employee, who asked not to be identified discussing private information, said its shortcomings have not always been the result of having too few people dedicated to human rights, but at times of having so many people involved that they work at cross-purposes.
Human rights advocates outside the company acknowledge Facebook’s effort to hire experts, and say it has become far more responsive. But they worry that internal advocates like Sissons won’t be adequately empowered, and many are withholding praise until the company makes more concrete changes. “They are hiring people who have the right knowledge, experience and sensibility to tackle human rights problems,” said Matthew Smith, chief executive of Fortify Rights, a human rights group. “So far, though, that’s clearly not enough.”
Sissons’ human rights education started early. Her father was a prominent Australian historian who served in the occupation force of Hiroshima after World War II, then worked as an interpreter in the Australian-led tribunals of Japanese officials accused of war crimes. “My early childhood was completely taken up with discussions of war crimes, war criminals, the Second World War, and notions of justice,” she said.
After attending the University of Melbourne, Sissons spent time in East Timor, researched Middle Eastern issues and took several posts with the Australian diplomatic corps, including a frustrating stint answering phones at an Australian embassy in Egypt. “My Arabic wasn’t very good,” she confessed. “People would ring me up and shout at me about all kinds of things, and I would have to find a solution.” Eventually, Sissons went on to work on her own high-profile tribunal as an independent observer of the trial of former Iraqi leader Saddam Hussein, and she later did a stint at Human Rights Watch.
In 2011 Sissons switched her focus to the relationship between human rights and technology. She had been working in the Middle East, where the Arab Spring was just getting underway, and many people believed social media could shift the balance of power between citizens and oppressive regimes. It was a time of unmatched optimism about the potential of social media in political organizing.
The good feelings didn’t last. As early as 2014, credible reports were emerging of coordinated incitement on Facebook against the Rohingya in Myanmar. The online abuse foreshadowed a wave of violence that began in earnest in 2016.
By the time Facebook began looking for a human rights director in 2018, the conventional wisdom on tech from a few years earlier had effectively reversed. The killings in Myanmar and elsewhere, coupled with Russian-led disinformation campaigns during the 2016 U.S. presidential election, had darkened popular opinion. Companies that were accustomed to being revered were suddenly being accused of simultaneously squelching free expression and tolerating active manipulation of their platforms.
The tech industry’s first halting steps to control the flow of abuse initially won few fans. In an online essay in late 2018, Cynthia Wong, then senior internet researcher for Human Rights Watch, said it was time for a “moral reckoning” in Silicon Valley. “If regulators, investors, and users want true accountability, they should press for a far more radical re-examination of tech sector business models, especially social media and advertising ecosystems,” she wrote.
In some cases, the companies started hiring their critics. Twitter brought on Wong as its legal director of human rights in April 2019. The company declined to make her available for an interview, and said in a statement that it was “uniquely positioned to help activist and civic-minded people around the globe make their voices heard.”
Other attempts at reform were wholly unsuccessful. In early 2019 Ross LaJeunesse, then Google’s global head of international relations, saw Facebook’s posting for a human rights director, and used it to argue for the creation of a similar structure at his company. He failed, and left the company soon after. LaJeunesse, who is currently running for the U.S. Senate in Maine, now says tech companies can’t handle these issues on their own. “There has to be government oversight,” he said.
Sissons, who reports to Facebook’s head of global policy management, Monika Bickert, has over the last several months been quietly incorporating human rights protections into Facebook’s policies, and making sure that people with human rights training are in the meetings where executives sign off on new product features. She said the company had made progress before she arrived, including its 2018 decision to begin removing misinformation in situations where it could lead to physical harm.
“There are now a lot of resources in place,” Sissons said. The challenge is to quickly identify local signs of trouble, then block or slow the spread of certain content, or take swift action against particular users. “We are testing continuously in crisis environments to try and predict what resources we’ll need,” she said, “and to ensure they’re in place.”
When Sissons went to Myanmar with Facebook she made a stop in Phandeeyar, a tech hub and community center in downtown Yangon. Jes Kaliebe Petersen, its CEO, said he’s been meeting with Facebook employees for years—he helped the company develop local community standards almost five years ago. But the encounters have calcified into a depressingly predictable routine. “They send a bunch of people who have never been here before, and they talk to us,” said Petersen. “And we never hear from them again.”
A spokesman for Facebook said it has held many introductory meetings at the request of local advocates, and argued the company has taken significant strides in the country. Besides hiring Sissons, it shut down hundreds of pages and accounts, including that of the head of Myanmar’s army, for spreading misinformation and hatred. It has hired a Myanmar head of public policy for the first time. And it assembled a team of 100 content moderators who speak Burmese. That group will be able to “support escalations” in other languages used in the country as well, Sissons said.
The company also set up an independent review board for thorny content moderation issues, and in an unusual step, commissioned independent human rights assessments of what happened in Myanmar and other trouble spots. In November 2018, it published in full a 60-page report on Myanmar from the nonprofit group Business for Social Responsibility. “They deserve praise for putting it out there,” said Dunstan Allison-Hope, the lead author of the report. “You don’t see that.”
But Facebook has never made the results of a similar assessment in Sri Lanka public, despite calls to do so. Sissons declined to say whether it had plans to publish those results. And there are currently no Facebook staff members working in Myanmar full-time—something that many advocates have called for. Representatives for Facebook say its staff based in Singapore and elsewhere are regularly in Myanmar, and that it has spent well over a year taking hundreds of meetings with people in the country.
One person who said he’d never gotten an invitation to meet with Facebook is Nickey Diamond, a local advocate working for Fortify Rights. Diamond said he has been the target of harassing posts from the government for years, and still faces a menacing atmosphere online. “They’re sharing my picture with the word ‘traitor’ in Burmese,” he said. “Every human rights defender is in the same situation.”
The broader problem Facebook is confronting—the vigilant monitoring of an ever-evolving social network used by 2.3 billion people—can seem almost impossibly daunting. The company now has content moderators examining posts in approximately 50 languages, Sissons said, a number that is unchanged since last April and less than half the number of languages that Facebook actively supports.
Facebook has said only technological improvements can combat problems at scale. It has automated tools that scan for hate speech, as well as image recognition technology monitoring for obscene content regardless of language. About 80% of the posts that Facebook acts on for violating its hate speech policies are now first identified by its automated filters, up from about 24% a year earlier.
Soon, the challenges of monitoring the spread of abusive posts could become even more difficult. Facing pressure to increase user privacy, Facebook has prioritized private communications, meaning more content is encrypted so that even the company itself won’t know what it says. In those cases, Sissons said the company is working on tools that will look for patterns associated with problematic content, so it can either remove such messages or impede them from spreading so rapidly.
Facebook is aware of the scope of its challenges, said Rebecca MacKinnon, the director of Ranking Digital Rights, an online advocacy group. “Facebook is making an effort to engage. Whether that will make a difference in the real world, we’ll see,” she said. “They’re dealing with some problems that no one knows how to solve.”
When Sissons met with members of the Phandeeyar team last November in Myanmar, they came prepared with a handful of suggestions for actions Facebook should take before the national elections there, which are expected to take place later this year. While Phandeeyar staffers had been deeply engaged in the specifics for months, Sissons was still just getting her feet under her, and there wasn’t enough time in the hour-long meeting to get much resolution, said Phandeeyar CEO Petersen.
“There’s always lots of goals for improvements. Hopefully Miranda has a sound plan for how to get there,” he said. “The thing is, we don’t really have that much time.”
(Clarifies that Petersen was not in the meeting with Sissons in the penultimate paragraph of article published Tuesday. An earlier version of this story corrected the dates of Sissons’ hiring and trip to Myanmar.)
To contact the author of this story: Joshua Brustein in New York at [email protected]
To contact the editor responsible for this story: Anne VanderMey at [email protected], Andrew Pollack
©2020 Bloomberg L.P.