Let's Talk About Facebook

US President Joe Biden recently made a sensational comment insinuating that Facebook’s permissive stance towards vaccine misinformation was “killing people.” This attempt to blame Facebook for his administration’s inability to hit vaccination-campaign targets left a bad taste in my mouth. I could not quite understand how he could credibly blame a private company while he himself sat at the helm of the federal government. It reminded me of an exchange from the TV show The West Wing in which the president says, “School boards and local elections are where true governance takes place.” In the show, the comment is used to demonstrate how unhinged and paranoid the president had become and how he had let personal enmity cloud his judgment. (His advisors point out to him later that he has bigger fish to fry and can’t get involved in these local elections or swing them using his pulpit.) In the real world, Biden has used the same technique to create a scapegoat for his administration’s failings.

Ever since Facebook came on the scene, some people have been completely in love with it while others have been incensed at its existence. I believe Facebook is polarizing because it demands something radical of people who believe they have some understanding of the world around them: adapt to something truly new. Such a demand is rarely made by an ill-defined private entity with a misunderstood product and a (sadly) robotic CEO who can’t quite express what his goal for the product is. (When I watch Zuckerberg talk about his company, I wish he would just say that all he wanted to do with Facebook was build a cool engineering product and make a lot of money with it.) The demand itself is not novel. But in the past, it was made by revolutionary, charming CEOs with well-understood products.

Two similar adaptation demands come to mind: one by Henry Ford, who pioneered the automotive assembly line and increased manufacturing productivity many-fold, and one by Steve Jobs, who pioneered the iPhone and insisted on every human being’s fundamental right to have a computer in their pocket. Both insisted that society live with these improvements, however uncomfortable they made the old guard of their time. The iPhone, in particular, was marketed by a sometimes-crazy, sometimes-revolutionary founder. Both adaptations became part of the mainstream very quickly.

Facebook’s trajectory was similar: millions of users adopted it across the world. And yet, no one quite knew what to make of it. What happens when a misunderstood product becomes wildly popular? I think the answer to this question is the key to understanding where Facebook is right now. Facebook is on the cusp of becoming a scapegoat for spreading misinformation, is waiting for the other shoe to drop on antitrust allegations in the US, and is facing a mutiny among its employees, whose political beliefs are not representative of those of its user base.

Before proceeding, I will put my cards on the table: I don’t use Facebook anymore. I used it a lot from 2010 to 2014, much less from 2014 to 2016, and I stopped using it around five years ago, in my third year of college, because I could see that it was making me actively agitated. (This was the period in my life when it seemed like everyone around me was landing an internship or a conference talk and going abroad for the first time, while I was going to Bangalore to intern at a start-up no one knew much about.) I have a libertarian view of the usage of technology and the Internet: people who like to use Facebook should use it. I don’t believe that paternalistic entities (like the government or a Netflix documentary) counseling people about the evils of social media will help one bit.

Now, let’s talk about Facebook.

First off, the thorniest question of all: what is Facebook?

Facebook is a product that allows people to post text and images, add other people as friends, follow celebrities and news websites that post content, and see the content posted by their friends and followed pages in a constant stream called the Newsfeed.

That’s a mouthful. I tried to boil it down to Facebook’s core feature, the Newsfeed, which also happens to be its most controversial. The other parts of Facebook are widely accepted as useful (or at least not as harmful as the Newsfeed): messaging, games, groups, etc.

The Algorithm

The things that show up on a person’s Newsfeed, and the order they are displayed in, are decided by “the Algorithm”. This constant reference to the Algorithm is misleading. The popular mental model goes something like this: Facebook has a system where it can specify a weight for each piece of content that a person is eligible to see. (These pieces of content enter the eligible list because they were posted by the user’s friends, friends of friends, a page they follow, or a group they are a member of, or because a recommendation algorithm predicted that the user would like them, among other reasons.)
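To make that mental model concrete, here is a minimal sketch of what such a directly tunable system would look like. Everything in it (the Post class, the SOURCE_WEIGHTS table) is invented for illustration; this is the folk theory of the Algorithm, not Facebook’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str  # "friend", "friend_of_friend", "page", "group", "recommendation"
    text: str

# Hypothetical hand-tuned weights: the folk theory says someone at
# Facebook can simply dial these up or down.
SOURCE_WEIGHTS = {
    "friend": 3.0,
    "friend_of_friend": 2.0,
    "page": 1.5,
    "group": 1.2,
    "recommendation": 1.0,
}

def rank_feed(eligible_posts: list[Post]) -> list[Post]:
    """Order a user's eligible posts by a single tunable weight each."""
    return sorted(eligible_posts,
                  key=lambda post: SOURCE_WEIGHTS.get(post.source, 1.0),
                  reverse=True)
```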

I doubt that Facebook’s internal systems are anywhere near that simple or easy to customize. My guess is that the Algorithm is a combination of handmade heuristics and one or more machine learning models that have been optimized to move some metric in the desired direction. (Common metrics could be of the form: maximize the amount of time a user spends on Facebook; maximize the number of comments they post in a given time span.) Each machine learning model would classify the user into one of several buckets it has derived from training data. Then, for each piece of content, it would estimate how likely that content is to move the goal metric in the desired direction. Finally, it would present the pieces of content to the user in decreasing order of that likelihood. (This is all speculation, based on my experience with similar recommendation algorithms and my understanding of how such a system would be designed from scratch. Facebook probably does not have such a simple, easy-to-understand design, because the system was not built from scratch; it was built iteratively.)
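Here is a sketch of the hybrid pipeline I am speculating about. Every class, method, and score in it is invented; it shows only the shape of the speculation (a model score plus hand-made boosts, sorted descending), not anything Facebook has confirmed.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    source: str
    text: str

@dataclass
class UserProfile:
    bucket: str  # the cluster a model assigned this user to, e.g. "sports-fan"

class EngagementModel:
    """Stand-in for one or more trained models; here it just guesses."""
    def predict(self, user: UserProfile, post: Post) -> float:
        # A real model would estimate how likely this post is to move
        # the goal metric (time on site, comments posted, ...) for this user.
        return random.random()

def heuristic_boost(post: Post) -> float:
    """Hand-made rules layered on top of the model score."""
    return 1.0 if post.source == "friend" else 0.0

def rank_feed(candidates: list[Post], user: UserProfile,
              model: EngagementModel) -> list[Post]:
    """Present posts in decreasing order of predicted metric impact."""
    return sorted(candidates,
                  key=lambda post: model.predict(user, post) + heuristic_boost(post),
                  reverse=True)
```

Note that in a system like this there is no single “misinformation dial”: the ranking emerges from trained weights and accreted heuristics, which is why “fix the Algorithm” is easier said than done.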

By simplifying this possibly very complex system into “the Algorithm”, administration officials and activists can effectively use Facebook as a scapegoat for pretty much anything.

Surgeon General Vivek Murthy specifically calls on companies to redesign their algorithms to “avoid amplifying misinformation.” He also suggests that they build more “friction” into sharing functions that urge users to rethink whether to share a post containing false information. – Surgeon General calls on Social Media Companies to Curb Misinformation on Their Platforms

The level of control implied in the above suggestion shifts blame effortlessly. It paints Facebook as a private corporation unwilling to take steps to fix its algorithm, implying that the only thing lacking is Facebook’s desire to make such changes. The nuances of whether such changes are even possible, and how effective they would be, are missing from this characterization.

The content posted by users on any platform is diverse, and it is impossible to reliably classify something as “containing false information”. The Internet will confirm your beliefs and your stance on any topic. There is very little objectivity remaining in the topics that dominate common discourse today.

I am putting aside the philosophical debate about what misinformation even is.

The suggestion to make it harder to share “false information” assumes that Facebook is an omniscient entity that can classify something as false. Softening the definition of misinformation to “widely believed to be false” does not help, because we are stuck with the same classification problem. Technological solutions cannot address the root cause of the misinformation problem: we do not know what misinformation even is until something becomes popular enough to warrant further investigation, a fact check, or a public uproar.
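To see why the “friction” suggestion inherits the same problem, strip away the UX: somewhere, a system has to produce a verdict on truthfulness. The sketch below is entirely hypothetical (the classifier and its probability_false method are stand-ins I invented, not any real API):

```python
def should_add_friction(post_text: str, classifier) -> bool:
    """The 'friction' suggestion, reduced to its essentials.

    However the user experience is dressed up, some component must first
    decide whether the post 'contains false information' -- exactly the
    classification problem argued above to be unsolved.
    """
    # `classifier` is hypothetical: anything exposing an estimated
    # probability that the post is false.
    p_false = classifier.probability_false(post_text)
    # Any threshold here is arbitrary: raise it and false claims sail
    # through; lower it and true claims get suppressed.
    return p_false > 0.8
```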

Private Corporations are not Governments

I think this needs to be repeated: Facebook is a private corporation. Milton Friedman, the economist behind shareholder capitalism, insisted that the only responsibility a private corporation has is to its shareholders. In other words, corporations like Facebook should do whatever it takes to increase the company’s value to its shareholders, within the boundaries set by law.

The “social responsibility” of corporations was always a sideshow, one that corporations are compelled to engage in when they come under activist pressure. Some notions of responsibility have been built into corporations through the “stakeholder capitalism” model, which claims that corporations have a responsibility to parties other than their shareholders, such as the environment and society at large. This notion has not taken off, and I remain unconvinced that it will succeed in the long term. Forcing private corporations to do something other than make profits and increase shareholder value seems strange and counter-productive. Other types of organizations already do this work: non-profits look out for people who don’t have a voice; activists lobby lawmakers about the agendas they care about. Private companies are not really supposed to be activists or anything other than profit-making corporations.

The ultimate non-profit is the government. Governments have always been about everyone. (“Everyone” as in the people being governed and not deliberately excluded through discrimination.) One of the government’s jobs is to put the right amount of regulation in place, so that innovation continues unhampered while citizens live full, satisfying lives. When private corporations become as big as Facebook and Google, the government’s power does not simply vanish. The power held by governments is special and cannot be taken away by anyone else.

Any country’s government can legally require Facebook to stop misinformation on its platform through these steps:

  1. Enact legislation to make Facebook liable for the harms caused by misinformation shared on the platform

That’s it. There is just one step to that process. Once the government starts finding pieces of misinformation, identifying the tangible harms they cause, and holding Facebook liable for those harms, it is in Facebook’s interest to stop the spread of such misinformation. There is no “choice” or high-minded talk of “social responsibility” here. Facebook’s shareholders will not stand idly by as governments impose hefty fines on the company for not cracking down on misinformation. Facebook’s executives will be forced to act (or to pretend to act, or to try to curb the government’s enforcement actions).

The realist politics watcher will instinctively object: Facebook pumps a huge amount of money into the campaigns of the people who make up the legislature, and no politician will enact legislation against their top campaign donor. But this problem has a solution too: campaign finance reform. Today, anyone can donate unlimited amounts of money to any candidate and remain completely hidden behind organizations that don’t have to publish their donor lists. Stephen Colbert proved this by doing the actual legal paperwork to establish such an organization. (The resulting TV segments were also funny.)

The Convenient Scapegoat

Comments like Biden’s recent “they are killing people” are mind-boggling to hear. Mainstream news outlets have been peddling vaccine misinformation[1] and are arguably perceived as more reliable than posts by random anti-vaccination campaigners on Facebook. Given this disconnect between the various sources of misinformation and the government’s exclusive focus on Facebook, it is worth analyzing the timing of, and the possible ulterior motives behind, this zeroing in.

Misinformation on Facebook is not new, and anti-vaccination misinformation in particular has been spreading for a very long time. Savvy political campaigns, such as the one run by Donald Trump and those run by the Republican Party’s candidates during the Georgia run-off elections, have used the Facebook ads platform to peddle misleading information about their opponents. These ads were dressed up as ordinary political advertising, and Facebook took no steps to curb them. (By comparison, Twitter decided to ban political advertising on its platform entirely.)

The primary reason I found the Biden administration’s witch-hunt of Facebook distasteful is that it is part of an ongoing search for a convenient scapegoat.

While Trump was at the helm, Biden did not have to go looking for a scapegoat; Trump made himself readily available. Trump thrived on anti-Trump rhetoric, while Biden’s supporters wanted to see him push back against Trump’s apparent “childishness” and self-centered approach to crises. The two were in a mutually beneficial relationship: Trump would say something crazy, Biden and other Democrats would denounce him, Trump would call them names, and so on.

Facebook’s approach to misinformation has not changed dramatically between the Trump and Biden presidencies. But Facebook banning Trump from its platform was a dramatic change: in an instant, it evicted the existing scapegoat. The cage stood empty as the Biden administration struggled to give a first dose of the COVID-19 vaccine to 70% of the eligible population. That failure started a race to fill the cage once again.

The “Disinformation Dozen”, the 12 people reportedly responsible for some 65% of anti-vaccine misinformation shared online in the US, would have been the obvious choice. But looking at their combined base of followers, it is obvious why they did not make the cut. Even at the peak of their popularity, they had 15 million followers across 62 active accounts. With a population upwards of 300 million, I can understand why most people would be suspicious if the administration tried to blame a group with direct access to less than 5% of the total population[2].
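For the percentage, a quick back-of-the-envelope check (the 330 million population figure is my assumption for 2021, not a number from the report):

```python
followers = 15_000_000        # combined following of the twelve accounts
us_population = 330_000_000   # assumed approximate 2021 US population
print(f"{followers / us_population:.1%}")  # prints 4.5%
```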

I have one final criticism of the administration’s stance. Even as it has railed against Facebook, it has repeatedly asked the company for information, and these requests appear to be non-binding. The administration undoubtedly has the ability to open investigations and subpoena documents and data directly from Facebook. Publishing that information is key to Biden’s argument that Facebook is responsible for the stalled vaccination campaign. But doing so requires time-consuming work that (even) Biden’s party colleagues are probably unwilling to take up.

The measured approach of compelling Facebook to share information, publicizing it, and then (finally) placing blame requires work that lawmakers would rather forgo. Sensational comments are just as effective in the court of public opinion, and they can rile up supporters against the chosen scapegoat[3].


  1. This Media Matters report about the vaccine misinformation claims on Fox News is worth reading. Here’s the topline from the summary: “In the two weeks from June 28, 2021 to July 11, 2021, 57% of 129 segments about the COVID-19 vaccines included claims that undermined immunization efforts. 45% of segments included claims that the vaccination drive is coercive or that it represents government overreach. 37% of segments included claims that the vaccines are unnecessary or dangerous.” 

  2. These 15 million followers are the tip of an iceberg whose real size is unknown. The followers are bound to be members of groups and probably share content from the Disinformation Dozen, amplifying its reach to several million more people. Facebook is (understandably) unwilling to share this information voluntarily. The Biden administration is (inexplicably) unwilling to force Facebook to share it through legislative or judicial pressure. Blaming 12 people is simply not viable. 

  3. I admit that if I had seen a news article saying that the Biden administration was compelling Facebook to share data, I would not have written a post like this one. My guess is that opinion desks would have written something, but that reaction would not have generated half the headlines that the sensational comments did.