Facebook’s products harm children, stoke division, and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people. – Frances Haugen
By Hemant Kashyap
In September, The Wall Street Journal published shocking revelations about Facebook and Instagram, following Frances Haugen’s whistleblowing on the Silicon Valley giant. By and large, the documents reveal that Facebook had the knowledge and the means to stop harms that occur on its platforms with horrifying frequency, yet chose, as Haugen puts it, to prioritize its profits over the good of its users.
Some of those harms, in brief: enabling and aggravating the spread of hate, damaging the mental health of teenagers, responding weakly to the threat of human traffickers and drug cartels, letting anti-vaccination campaigns flourish…the list goes on.
These revelations come in the aftermath of Mark Zuckerberg's testimony to Congress that Facebook spends billions to remove harmful content. If the company has indeed been spending billions, however, the impact of those billions has not been felt.
Facebook’s fall from grace has raised questions about the role of social media in our lives and what the future holds. At a time when people are spending more time online, and therefore on social media, than ever before, how do we hold these untouchable, faceless social media giants accountable to the people who spend hours using their services? And with Facebook rebranding as Meta and moving towards an ever more immersive experience, should it be working to draw people in deeper, or to clean up its act?
The Underlying Problem
In the thousands of documents leaked by Haugen, there is a lot to unpack. While the company has worked to remove questionable content, those efforts have been failing, and people within the company have felt the need to reevaluate the algorithms that dictate what content appears on a user’s Feed and where. Worryingly, this ranking can easily be overruled by the algorithm that pushes promoted content to maximize engagement.
This is an expensive process: Facebook would have to sink resources into rewriting its algorithms to redefine the priorities each of them follows and to build in a way to weed out objectionable content. It therefore comes down to a decision between prioritizing profits and protecting the safety of users. The popular perception is that Facebook chose profits, a choice reflected in its latest results – it earned over $9 billion in profit even after the whistleblowing scandal.
However, things are not as black and white as that. Facebook, for its part, insists that keeping people safe on its apps is in its financial interest. “Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites,” the company said in response to Haugen’s testimony, pointing out that it has invested $13 billion and employs about 40,000 people to remove harmful content.
So, what is the problem here?
One of the most alarming revelations among many was that Mark Zuckerberg directly intervened in the work of the Civic Integrity Team, the internal team responsible for flagging and removing content and making Facebook’s apps a safer place. The documents reveal that while the team proposed changes that would have made these platforms safer, its proposals were overruled, sometimes by Zuckerberg himself.
Facebook introduced the world to the phenomenon of virality, as one of the team’s documents, dated December 2019, puts it. The team also pointed to “compelling evidence that our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform”.
Interestingly, until 2009, Facebook’s News Feed was simply a collection of posts arranged chronologically, with the latest at the top. As the company grew larger, attracted more advertisers, and watched the money flow in, it introduced algorithms to push posts more relevant to each user, serve the interests of advertisers, and keep users coming back. This, however, has turned the News Feed into an “arms race”, with ever more damaging content making it to the top of people’s Feeds around the world.
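To make that shift concrete, here is a minimal sketch of the two ranking models, written in Python purely for illustration. The Post fields and scoring weights are invented; Facebook’s actual ranking draws on thousands of signals that are not public.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int
    comments: int
    shares: int

def chronological_feed(posts):
    """Pre-2009 model: newest posts first, no scoring involved."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def engagement_feed(posts):
    """Engagement-optimized model: posts that provoke the most
    reactions rise to the top, regardless of recency."""
    def score(p):
        # Weights are invented for illustration only.
        return p.likes + 2 * p.comments + 3 * p.shares
    return sorted(posts, key=score, reverse=True)
```

The point is simply that once ranking optimizes for reactions rather than recency, whatever provokes the strongest reactions wins, whether or not it is accurate or safe.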
A year after that December 2019 document, a report cited the company’s Public Policy team as saying that it would “commonly veto launches which have significant negative impacts on politically sensitive actors”. This means Facebook has a great deal of control over its algorithms. How much control? According to Haugen’s testimony before Congress, all of it.
“They have a hundred percent control over their algorithms,” Haugen told Congress. “Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety. They shouldn’t get a free pass on that because they’re paying for their profits right now with our safety.”
In short, the problem lies in Facebook’s unwillingness to change its algorithms too drastically. While that would certainly make Facebook and Instagram safer, it might also tip the scales away from profitability.
Facebook Needs to Clean Up its Act
As all of this shows, the company cannot keep it up; the headwinds it faces and its negative public image have already pushed it to rename itself Meta, distancing the rest of its business from the beleaguered social media brand. If it maintains the status quo, it risks losing a large share of its users as more and more of them move away from Facebook.
Another leaked document revealed that on October 23, 2019, Apple threatened to pull Facebook and Instagram from iOS over Facebook’s alleged inaction on human trafficking content, after a BBC News Arabic report found that domestic workers were being sold on Instagram and other apps. In the face of such an existential threat, the company removed over 129,000 pieces of content and even ditched a policy exception that had let established brick-and-mortar businesses, such as recruitment agencies, post ads about domestic workers.
Apple dropped its threat within a week, closing the incident, but Facebook needs to realize that the Haugen whistleblowing scandal has brought it to the edge of ostracism on a much larger scale.
These repeated incidents, and Facebook’s inaction, speak volumes about the tech giant’s priorities, which is why the company has been at the center of negative attention for the last three months. The lengths to which Meta will go to keep its ad revenue up are neatly illustrated by its plan to hide like counts on posts across Facebook and Instagram: when the company realized the change hurt ad revenue, it pulled the plug. Only after much opposition did Instagram finally let users opt into hiding their likes.
Along with this, Facebook’s flagging system has let banned content slip through easily, and the company has failed to rectify a simple design flaw in its monitoring system, referred to as Laser in the leaked reports. The system was trained to filter out only groups or pages that had posted obviously political content within the last few days, so such groups could easily avoid bans, and users kept receiving recommendations for them, leaving Laser with a very high churn rate.
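To picture the flaw, consider a purely hypothetical sketch: the three-day window, the keyword check, and the function names below are all invented, since the leaked reports do not describe Laser’s internals at this level of detail. It shows only how a check limited to the last few days of posts can be defeated by briefly going quiet.

```python
from datetime import datetime, timedelta

# Invented window length for illustration.
RECENCY_WINDOW = timedelta(days=3)

def looks_political(text: str) -> bool:
    # Crude stand-in for a real content classifier.
    keywords = ("election", "ballot", "candidate")
    return any(k in text.lower() for k in keywords)

def should_flag(group_posts, now: datetime) -> bool:
    """group_posts: list of (timestamp, text) tuples for one group."""
    recent = [text for ts, text in group_posts if now - ts <= RECENCY_WINDOW]
    # Only recent posts are inspected, so a group that pauses its
    # political posting for a few days is scored as clean.
    return any(looks_political(text) for text in recent)
```

Any system that forgets everything older than its inspection window invites exactly this kind of evasion.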
All of these avoidable mistakes have seen Facebook’s credibility, already at a low ebb, sink even further. With the company envisioning a more immersive social media experience in its metaverse, it needs to clean up fast, before it finds itself in an irrecoverable position.
Final Thoughts
On January 28, 1986, an O-ring failure led to the destruction of the space shuttle Challenger just 73 seconds after lift-off, killing all seven crew members aboard. Later, Tom Brokaw would say of the accident, “the American public would be demanding some difficult answers, and of course, we’ll all have to examine what it is that we want from this era of high technology”.
Thirty-five years on, those words still ring true. Facebook’s fall from grace has raised the question of whether “social media” and “public good” can ever be spoken of in the same breath, and has cast a shadow over technology’s place in our lives.