
Facebook whistleblower says riots and genocides are the ‘opening chapters’ if action isn’t taken

Frances Haugen gave the warning while providing evidence to Parliament

The Facebook whistleblower Frances Haugen has said that events such as the January 6 riots and the genocides in Myanmar and Ethiopia are the “opening chapters” of worse to come if action is not taken against the social media company.

Ms Haugen gave the warning while giving evidence to Parliament ahead of the government’s development of an Online Harms Bill.

“Engagement-based ranking prioritises and amplifies divisive, polarising content”, Ms Haugen said, adding that the company could make non-content based choices that would sliver off half-percentage points of growth but “Facebook is unwilling to give up those slivers for our safety”.

The “opening chapters” of this “novel”, Ms Haugen said, will be “horrific” to read in both the global South and in western societies.

Her evidence comes after a number of files detailing internal research conducted by Facebook were shared with a variety of media publications.

It has been revealed that Facebook lacked misinformation classifiers in Myanmar, Pakistan, and Ethiopia – countries designated at highest risk last year.

Countries such as Brazil, India, and the United States were placed in “tier zero”, with “war rooms” set up to monitor those regions continuously.

“Facebook never set out to prioritise polarising content, it just rose as a side effect of priorities it did take”, Ms Haugen said. She also emphasised the need for local languages and dialects to be supported.

“UK English is sufficiently different that I would be unsurprised if the safety systems that they developed, primarily for American English, would be under-enforced in the UK”, she said.

The difference between the systems in place in the United States and those in Ethiopia is stark. In the US, Facebook offers a huge range of services designed to protect public discourse, such as artificial intelligence systems that detect hate speech in memes and respond to hoaxes and incitement to violence in real time.

Ethiopia, however, does not have the company’s community standards translated into all of its official languages, and there are no machine-learning classifiers or fact-checkers available to manage content there.

Facebook said in 2018 that it agreed with an independent report it commissioned that said it had failed to prevent its platform being used to “incite offline violence” in Myanmar.

The report said Facebook’s platform had created an “enabling environment” for the proliferation of human rights abuses, which culminated in violence against the Rohingya people (a stateless Muslim minority) that the UN said may amount to genocide.

Moving to systems that are human-scaled, rather than having algorithms tell people where to focus, is the safest course of action, she claimed.

“We liked social media before we had an algorithmic feed”, Ms Haugen said. “Slowing the platform down, agnostic strategies, human-scale solutions. That’s the direction we need to go”.

Ms Haugen alleged that Facebook does not take these actions because the company is unwilling to accept users spending less time on the platform, or lower revenue – allegations that have also surfaced in previous leaks.

“Facebook has a strategy of only slowing the platform down once a crisis has begun … rather than as the temperature gets hotter and making the platform safer as it happens”, Ms Haugen said, describing it as a “break the glass measure”.

These measures have nothing to do with the content itself; they relate to the “little questions” where Facebook has optimised its algorithm for growth over safety.

Many of these problems, across all social media sites, stem from engagement-based ranking, Ms Haugen said, because such systems depend on AI to weed out extremist content.

Any tech company that has a large societal impact needs to share information with the public, Ms Haugen said. She compared the SEC whistleblower protections she had, because Facebook is a publicly traded company, with the situation at TikTok, which is privately held.

“You can’t take a college class today to understand these systems inside of Facebook”, she said. “The only people who understand it are inside Facebook.”

Facebook should also be forced to publish what integrity systems exist, Ms Haugen said, claiming that a government source said that Facebook does not track self-harm content.

Self-harm content has been a danger on Facebook-owned platforms such as Instagram, with internal research suggesting the company was aware that its algorithms made some young girls feel worse about their body image.

“TikTok is about doing fun activities with your friends. Snapchat is about faces and augmented reality. Reddit is at least vaguely about ideas. But Instagram is about social comparison”, Ms Haugen said.

While many children had “good homes to go to” when she was younger, the “bullying [now] follows them home”, she continued.

“There is no will at the top … to make sure these systems are run in an adequately safe way, and until we bring in a counterweight, things will be operated in the shareholders’ interest, not in the public interest.”

The Independent has reached out to Facebook for comment on Ms Haugen’s evidence.

More follows…
