Facebook is under scrutiny yet again – but this time, it’s big. Whistleblowers, journalists and politicians are working together to expose the very suss inner workings of this seemingly untouchable empire. Major news organisations including the Associated Press, CNN, Le Monde, Reuters and the Fox Business network have joined forces to carefully go through all the leaked documents – aka the Facebook Papers – and publish what they find. The articles are now live and… it’s a lot of reading. To help keep you in the loop, here’s a summary of the Facebook Papers findings that are most important for Australians to understand.
What’s happened so far?
Quick backstory before we get to the juicy stuff. In September 2021, whistleblower Frances Haugen left her position as a Facebook product manager and took tens of thousands of internal documents with her. She went to The Wall Street Journal, which initially published a nine-part investigation on the documents called The Facebook Files ($).
The documents showed proof that, through Facebook’s own research, the company knew its products were causing harm. These revelations included:
- Facebook’s products being deemed “toxic” and “harmful” for teens
- Internal concerns that the company wasn’t doing enough to stop human trafficking being coordinated through the platform
- Facebook’s role in promoting ethnic violence in countries such as Myanmar and Ethiopia
- The company’s failure to shut down misinformation before the infamous United States Capitol riot in January 2021
This information led to a U.S. congressional hearing where Frances Haugen pleaded for further congressional action. Senators supported issuing subpoenas to get the full text of the leaked documents (some sections were blacked out). Overall, it was agreed that Congress needs to take quick action to investigate what the heck is going on at Facebook.
But as much as this was good progress, there needs to be more – and Frances Haugen agrees. She began working with a diverse group of news outlets to release more stories about the leaked Facebook documents all at the same time. Now they’re out – this is what we’ve learnt.
1. Facebook neglects the serious harm it causes outside of the US
The reporting on Facebook generally, and on the Facebook Papers, is very US-centric. That’s partly because Facebook is an American company, but also because of the company’s own prioritisation of the US despite clear evidence that its products are creating tangible, dangerous impacts around the world.
The leaked Facebook Papers revealed that employees repeatedly raised concerns that the platform was inflaming the civil war in Ethiopia. An internal report showed that armed militia groups and ‘bad actors’ in Ethiopia were using Facebook to make explicit calls for violence – particularly against minority groups.
In the report, the team warned that “current mitigation strategies are not enough”. In response, Facebook did… very little. The company did not increase moderation efforts, resources or tools.
Its algorithm amplifies inflammatory content, and the company is letting this slide in ‘at-risk’ countries including India, Myanmar and Afghanistan. Now we know that Facebook knows this.
The company uses a tiered system to categorise which countries need the most moderation and attention – but there are no transparent rules about how countries are classified. Tier zero is the highest level; these countries have a dedicated team monitoring what’s happening and get special tools created, like fact-check pop-ups. The US, India and Brazil sit at this level. According to The Verge, Germany, Indonesia, Iran, Israel, and Italy sit at tier one – they also get focussed attention and special tools. Despite the evident danger in Ethiopia, the country does not rank high enough in the tier system to have even basic content policies in its local languages.
It’s clear that Facebook’s senior decision-makers are not as concerned with the harm the company causes outside of its home base in the US… even though 72% of its users are reportedly outside of North America and Europe.
Haugen said this is the prime reason she blew the whistle: “I genuinely fear that a huge number of people are going to die in the next five to ten years, or twenty years, because of [Facebook’s] choices and underfunding.”
While the problems in Australia are fortunately less extreme than in other places, as we approach the next election the threat of disinformation is real. Unfortunately, the tech giant is unlikely to care enough about destabilising our elections to do much about it.
2. ‘High profile’ accounts are given content exemptions – sometimes by Mark Zuckerberg himself
That suspicion we’ve all had is now confirmed: Facebook literally has a separate set of content standards for politicians, celebrities and ‘high profile’ accounts. It uses a system called XCheck (like, ‘cross-check’) to moderate posts by VIPs in a separate process to the one used for regular folks like us. This allows VIPs to share posts that are abusive, spread misinformation or generally break the rules without facing any consequences.
The WSJ was the first to report on XCheck, and journalist Jeff Horwitz says the premise of the system is really to protect Facebook by “never publicly [tangling] with anyone influential enough to do you harm.”
In the most extreme cases, posts are given exemptions by Mark Zuckerberg himself. In the lead-up to Vietnam’s Communist Party congress in early 2021, the ruling Communist Party of Vietnam gave Facebook an ultimatum: either help them censor anti-government posts, or be blocked from the Internet. Zuckerberg personally made the decision to increase censorship in Vietnam. The country is worth approximately USD$1 billion in revenue for Facebook.
It’s interesting to think about which Australian politicians and high-profile figures may also be receiving special treatment from Facebook – especially given that our local media and politics skews to the right of the political spectrum.
3. Facebook can radicalise you in as little as two days
Another takeaway that’s not out of the blue, but still surprising to have confirmed: the Facebook algorithm will start showing you dangerous, radical content very quickly. The internal documents prove that Facebook is aware that its products make political division and misinformation much worse. The company has run experiments and research that found new accounts:
- In the US can be directed to conspiracy content, including QAnon, in two days (Carol’s Journey to QAnon)
- In India are funnelled towards doctored images, fake news and extremely graphic content in around 10 days. A researcher noted that during the project they saw “more images of dead people in the past three weeks than I’ve seen in my entire life total.”
A Facebook employee wrote that the company has “compelling evidence that our core product mechanisms, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform.”
If you’re concerned about what your older loved ones, and even politicians, are exposed to on Facebook – you’re right to be.
4. They know young people hate Facebook and that Instagram is bad for you
One of the few revelations from the Facebook Papers that isn’t completely terrifying is that young people are not using the blue app. In the US (remember, it’s all so US-centric) the company expects that its number of active teenage users will drop by almost 50% in the next two years.
The documents show the tech giant has had problems engaging users under 30 since as far back as 2012… the same year it bought Instagram. Coincidence? Unlikely.
On Instagram, a leaked research report outlines how damaging the app is to young women’s self-esteem and body image. The company’s research found Instagram is worse for social comparison than TikTok or Snapchat, and literally states: “We make body image issues worse for one in three teen girls.”
Expect to see more intense efforts by Facebook to launch new products and platforms specifically aimed at young people in the coming years. And you should probably treat them with the same informed scepticism that you treat ‘Blue Facebook’ now.
5. Facebook employees already know how to start fixing these problems… but decision-makers aren’t listening
Arguably the biggest bombshell to come from the Facebook Papers is that many employees have suggested ways to start fixing all these problems… but Facebook’s decision makers, including Zuckerberg, have ignored them.
Among the internal recommendations reported by Wired for how to make Facebook actually good:
- Stop prioritising user engagement as the key metric of success
- Introduce quality rankings for content (similar to how Google ranks page results)
- Reintroduce a reverse-chronological feed (most recent posts first, to de-emphasise ‘engaging’ content which is likely to be inflammatory)
- Restrict how the Share feature works (deep re-shares are more likely to be misinformation)
- Invest in the development of tools and resources for developing and non-English speaking countries, to bring them on par with tools for North America
- Separate the Content Moderation and Public Policy teams – the Public Policy team that deals with governments has influence and control over too many unrelated departments in the current structure
A company worth USD$1 trillion can afford to implement any and all of these suggestions, with the backing of its own research. But it won’t, because profit – made by maintaining your attention to sell to advertisers – is more important than the real lives being impacted at scale around the world by the big, blue app.