It all began in September 2021, when The Wall Street Journal started publishing a series of articles called The Facebook Files: A Wall Street Journal Investigation. The series is based on a review of internal Facebook documents, including research reports, online employee discussions, and drafts of presentations to senior management. The allegations range from exempting certain public figures from harassment and hate-speech rules, to Instagram’s harmful effects on teenagers, to ignoring that the platform was being used for human trafficking and to incite violence against ethnic minorities. Nobody knew who was behind this massive leak of internal documents until the former Facebook product manager Frances Haugen appeared on the show 60 Minutes on October 3, 2021. Shortly after, on October 5, 2021, Frances Haugen testified before the U.S. Congress, urging it to take legal action against Facebook. She also disclosed all the internal documents to the U.S. Securities and Exchange Commission and a redacted version to the U.S. Congress.
The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.
Frances Haugen, U.S. Congress testimony, October 5, 2021
At this point, most of us probably thought that was it, but not at all. Beginning on October 25, 2021, a collaboration of 17 American news organizations and a separate consortium of European news outlets launched The Facebook Papers project. They were given access to a redacted version of all the documents by Frances Haugen and obtained even more information from current and former Facebook employees. The project has released a storm of news articles about Facebook. It was already known that Facebook had caused plenty of harm in the past decades, from the Cambridge Analytica scandal to its role in inciting the Rohingya genocide in Myanmar, but these documents prove that Facebook was entirely aware of these and other issues, in some cases had even developed solutions for them, and yet did not implement them simply because doing so would have hurt its profits.

How Facebook makes right-wing populists popular
Facebook’s algorithm decides which posts users see in their feed and in what order. How this algorithm works is entirely opaque to the public, but through the leaks we have gotten a lot of information about how it works and how Facebook constantly tweaks it.
The argument that we deliberately push content that makes people angry for profit is deeply illogical.
Mark Zuckerberg, October 2021
The papers revealed that the algorithm weighted the angry emoji, like the other emoji reactions, five times higher than a like. The reason for this is simple: a post that provokes strong emotions keeps users more engaged with the platform. For instance, they are more likely to start a discussion or share the post, staying on the platform longer and generating more ad revenue. Applied to politics, this weighting tends to amplify highly polarizing, extremist parties, as they draw both negative and positive emotional reactions. Only on October 1, 2020, after gradually lowering the weight over time, did Facebook finally reduce the angry emoji’s weight to zero.
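To make the mechanics concrete, here is a minimal sketch of reaction-weighted ranking, assuming the weights described in the leaks; the names and structure are my own illustration, not Facebook’s actual code:

```python
# Purely illustrative sketch of reaction-weighted feed ranking.
# The weights mirror the leaked values: every emoji reaction,
# including "angry", counted five times as much as a plain like.

REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,
    "haha": 5,
    "wow": 5,
    "sad": 5,
    "angry": 5,  # per the leaks, gradually lowered and set to 0 on Oct. 1, 2020
}

def engagement_score(reactions):
    """Sum the weighted reaction counts; higher-scoring posts rank higher."""
    return sum(REACTION_WEIGHTS.get(kind, 0) * count
               for kind, count in reactions.items())

# A post that provokes anger easily outranks a calmer, better-liked one:
print(engagement_score({"like": 100}))              # 100
print(engagement_score({"like": 20, "angry": 40}))  # 220
```

Under such a scheme, the cheapest way to climb the feed is to provoke outrage, which is exactly the dynamic Facebook’s internal researchers warned about.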

The leaks showed that internal studies in several countries had reported this amplification of extremism and hate speech through the Facebook algorithm.
Facebook employees reported specific concerns for Poland in April 2019, one year after an algorithm change promoted by Facebook chief executive Mark Zuckerberg. The change was intended to encourage “meaningful social interactions” by focusing the feed on posts from “family and friends” instead of businesses and media, but it essentially opened the floodgates for misinformation on the platform. In this document, European and Polish politicians stated their concern about the influence of radical parties on the platform. An independent data analysis of major political parties in Poland found that after the algorithm change, negative messages were more likely to receive a high number of shares. Since extremist parties tend to attack their opponents with negative statements or even hate speech, the algorithm amplified their voice on the platform. Facebook’s own researchers and politicians warned about this issue two or three years ago, and their concerns have become reality: in 2019, the right-wing populist party “Law and Justice” won the election. Soon after, it effectively abolished the separation of powers, and it continues to violate EU directives to this day.
Facebook’s role in the attack on the U.S. Capitol
In the U.S., Facebook changed its algorithm for the last election season, cracking down on misinformation, foreign interference, and hate speech. The measures worked, and many employees welcomed them, but soon after the election, in December 2020, Facebook rolled them back because they had a negative impact on profit. According to many current and former Facebook employees, this contributed to the attack on the U.S. Capitol on January 6, 2021. Internal reports show that Facebook did not act forcefully enough against the “Stop the Steal” movement.

Over the years, Facebook employees have developed ways to diminish the spread of political polarization, conspiracy theories, and incitements to violence, but in many instances executives declined to implement those measures.
Facebook hands total control to the government in Vietnam
At the end of 2020, the government of Vietnam presented Facebook with a choice: either censor anti-government posts or have the platform banned in the country. Mark Zuckerberg, of course, did not want to miss out on the annual revenue of $1 billion and personally decided to go with censorship.
Facebook has increased the censorship of anti-state content to such an extent that the platform is virtually controlled by the government. Facebook argues that a ban would have been worse than censorship, since it would have affected every citizen. Still, I ask myself what value a social media platform is supposed to provide when it is essentially filled with propaganda and even helps the government locate dissidents who post forbidden content.

It is just another instance where Facebook chose profit over everything else. This case also proves that when Facebook wants to filter certain content out, it does have the tools to do so.
Facebook in India: a legal vacuum
India is Facebook’s largest market, with 340 million users, and the situation there has gotten completely out of hand, as dozens of leaked internal documents show.
Following this [Indian] test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total.
Facebook researcher, 2019
In 2019, a Facebook researcher set up an Indian test account on Facebook. With this account, she only followed the algorithm’s recommendations, and her feed became “a near-constant barrage of polarizing nationalist content, misinformation, and violence and gore”. India’s ruling party uses a massive number of bots and fake accounts to spread this content, without Facebook interfering.

The reports show that 87% of the budget for combating misinformation is spent on the U.S. market, while the remainder covers the rest of the world, which essentially makes Facebook an anarchic zone in many countries, with India being only the tip of the iceberg. Facebook officials try to talk their way out by claiming that their hate speech detection does not yet support Hindi. They offer no comment on the gore content. I highly doubt that they are unable to detect pictures of beheaded bodies when they have algorithms to detect nudity.
What you can do
I personally highly doubt that Facebook or Mark Zuckerberg will face any severe legal consequences this time either, just as after the Cambridge Analytica scandal. It seems to me that the authorities are either unable or unwilling to keep the abomination of a company that Facebook has become in check. It is up to us, the people of this world, to end this nightmare. Make yourself aware of the fact that every time you post something on Facebook, scroll through your Instagram feed, or send a message through WhatsApp, you generate revenue for Facebook by watching ads and handing over your data, making you an accomplice in its crimes.

Does the value you get from Facebook services, which is mostly generated by the network effect, outweigh the cost of the harm they cause to the world?
Are you willing to support an organization that knowingly accepts the suffering and death of human beings to increase its astronomically high profits even further, only because “everybody” is part of its networks?
If you have come to the right conclusion, speak to your friends and family, make them aware of the issue, and move to different networks together as a group. This will slowly but surely destroy Facebook, since without us it is worthless.
Alternative social media platforms with better data privacy and higher ethical standards already exist. Although not entirely perfect in either of those aspects, Reddit and Twitter are still miles ahead of Facebook and are great alternatives with a huge user base. Mastodon is an up-and-coming decentralized social network, similar to Twitter in functionality, and truly free.
When it comes to instant messaging, be aware that Facebook is able to read every single message you send; in fact, it automatically processes every message. In one of my previous blog posts, I compare the most common instant messengers with regard to data privacy and security. While getting on Telegram or Signal is a great start, once again only the decentralized Matrix network truly solves the issue of one company controlling a communication channel.