Back in October 2019, during a presidential debate in Ohio, Senator Elizabeth Warren made a strong case for breaking up Facebook. “A handful of monopolists shouldn’t dominate our economy and our democracy,” she declared. Although the debate revealed that the other candidates held differing opinions, the case for breaking up big tech has gained momentum in recent years. There’s no denying that Silicon Valley tech giants hold an outsized amount of power, widely seen as a threat to democracy and healthy competition.
Fast forward to 2020, and Facebook co-founder and CEO Mark Zuckerberg seems to be making Senator Warren’s case even stronger. He has faced intense criticism since refusing to censor US President Trump’s controversial post, “When the looting starts, the shooting starts,” on the social media platform. Unlike Facebook, its leading competitor Twitter was quick to give Trump a reality check by flagging the same message for “glorifying violence” on its platform.
Zuckerberg tried to clarify his stance during a company-wide call with his employees [more on that later]. He also defended his decision in a separate Facebook post, citing Trump’s stated intention to deploy the National Guard and enforce state action.
This has been an incredibly tough week after a string of tough weeks. The killing of George Floyd showed yet again that…
This brings us to two possibilities: either Zuckerberg is genuinely concerned about “free expression” and is willing to face widespread backlash, or he’s simply refusing to rise to the occasion and using the genuinely hard problem of free speech to his advantage.
To search for answers, we’ll have to look into Facebook’s track record and see whether the company’s policies have really been consistent when it comes to content moderation on the platform.
TW: This article contains examples of some Facebook posts with language that some readers might find disturbing.
How Not To Moderate Content
Instead of searching for the most recent reports of Facebook’s alleged double standards, I looked for older incidents. During my research, I stumbled upon an article from the Pulitzer Prize-winning nonprofit newsroom ProPublica. The 2017 report contains numerous examples of Facebook turning a blind eye to its own policy against posts that threaten people with violence on the basis of their religion.
When a reader reported a photo saying “the only good Muslim is a f*cking dead one,” she received an (automated) message saying the post didn’t violate the community standards. However, when multiple users reported a comment saying “Death to the Muslims,” the company took it down.
In 2018, another startling revelation came from an undercover journalist who worked as a Facebook moderator; he shared his findings in a widely viewed documentary called Inside Facebook: Secrets of the Social Network. While working at a third-party moderation firm in Ireland, he found that Facebook allowed far-right groups to exceed the deletion threshold set by the company itself. In case you’re wondering, this threshold is fluid: the metric changes from group to group and from person to person.
If you wish to go through more recent examples, a website called Facebook Jailed does a decent job of documenting posts that expose the social media giant’s bias. The website was started by Kayla Avery, who was banned multiple times for posting ‘hate speech’ content. Facebook has banned the phrase “men are trash,” which thousands of women use to speak up against the ingrained culture of patriarchy and toxic masculinity, yet it’s not so keen to remove posts with captions like “women are scum/trash/the worst.” Users who find these posts offensive can block them, but that doesn’t stop other users from interacting with the content.
Update: “Men are scum/ugly/trash/theworst” posts get removed (quickly) no matter the context and get you suspended.
“Women are scum” – reviewed and found by FB not to violate standards. pic.twitter.com/ebEXK6xG7Q
— Marcia Belsky (@MarciaBelsky) November 29, 2017
On numerous occasions, the company’s spokespersons have simply quoted its standard guidelines: “We define hate speech as a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability.”
In this particular case of different standards for different genders, it’s apparent that Facebook doesn’t understand sexism. Even if we give Facebook the benefit of the doubt and ignore the undeniable context of centuries of systemic oppression of women across the globe, its content moderation fails to live up to its own guidelines: it doesn’t even treat both behaviors equally; it does the reverse of what’s expected.
Numerous other examples could be cited to show that Facebook’s relationship with free speech on its platform isn’t exemplary. Since we’re focused on the role of closed networks like Facebook, it’s important to understand content moderation a bit better: it is an integral part of this mess and one of the biggest challenges these social media platforms face. To decide what content is inappropriate and might harm people, Facebook (and other online platforms) relies on content moderators. Given how important they are to the well-being of Facebook, they should be among the most valued employees at the company, but that’s not the case.
The Tragic Lives Of Content Moderators
Just last month, Facebook agreed to pay its 11,000 current and former content moderators $52 million in compensation for mental health issues they developed on the job. In September 2018, a former moderator sued Facebook, claiming she was regularly exposed to images of murder, rape, and suicide; as a result, she developed PTSD.
Last year, The Verge editor Casey Newton published a comprehensive report showing that numerous content moderators outsourced via Cognizant suffered similar issues. Due to their non-disclosure agreements with Cognizant, the moderators were told not to discuss the details of their work, or its emotional toll, even with friends and family.
Mark Zuckerberg has praised the work done by moderators on numerous occasions, calling them some of the most important members of the company. Yet unlike regular employees, who enjoy lavish perks and ample vacations, the moderators live a life of trauma and micromanagement. Outsourcing a big chunk of the workforce has obvious benefits for the company: minimal responsibility and lower costs. To put things in context, Facebook’s regular employees earn a median salary of $240,000, while the moderators earn $28,800.
I think it’s evident that Facebook doesn’t follow its own rules, rules that are flawed in the first place. The results of its content moderation are biased and unpredictable. Moreover, it doesn’t care enough about the moderators on the front line of this fight.
On numerous occasions, Zuckerberg’s enthusiasm for free speech has looked like a crowd-pleasing exercise he’s fond of, and his praise for content moderators like a classic attempt to manage guilt.
This brings me to my second theory: are Facebook’s inactions deliberate? Does Mark Zuckerberg’s choice to go soft on majoritarian politics reveal something more about his ambitions?
Facebook: The News Company
Over the years, Zuckerberg has been reluctant to acknowledge Facebook’s status as a media company that produces and hosts content; it’s also one of the biggest sources of news for hundreds of millions of people. During a 2018 congressional testimony, he defended Facebook by calling it a company of “engineers who write code and build product and services for other people.” In the wake of increasing security and privacy risks on the web, Zuckerberg knows that being designated a media company would make Facebook subject to strict advertising laws and even greater content-moderation challenges.
While Zuckerberg might seem like a hopeless optimist who can’t stop talking about all the great things Facebook, as a platform, has done for humanity, the reality on the ground isn’t so rosy. Over the years, Facebook and other social networks have helped reinforce biases and failed to protect the privacy of vulnerable individuals.
Following Zuckerberg’s decision to leave untouched Trump’s posts, which encouraged violence and spread dubious information about voting, some Facebook employees spoke out publicly against him, and a couple resigned. This prompted Zuckerberg to arrange an online meeting with about 25,000 employees. While the 36-year-old CEO talked at length about his thinking and why he left the post up, he declined to take any action despite the internal backlash.
If you go through the 10,000-word transcript of the call, you’ll notice that Zuckerberg claims to have consulted multiple executives and scoured company policies, but the final decision was his own. He also went to painstaking lengths to justify hateful content on Facebook, which often favors the far right, by painting a world in which censorship would ultimately harm those whose voices are being suppressed.
Zuckerberg vs Dorsey vs Spiegel
The scrutiny of Zuckerberg in this matter invites obvious comparisons to his contemporaries, Twitter CEO Jack Dorsey and Snap CEO Evan Spiegel; both considered Trump’s posts harmful to their users and, as a result, restricted the content. In contrast, Zuckerberg has abstained from this responsibility, arguing that a private corporation should not act as the “arbiter of truth.” “Private companies probably shouldn’t be, especially these platform companies, shouldn’t be in the position of doing that,” he added.
While it’s true that Section 230(c) of the Communications Decency Act, passed as part of the 1996 Telecommunications Act, shields private corporations from liability for what people share on their platforms, it also gives them the power to remove content they deem unsafe for their users.
Responding to the Facebook CEO, Dorsey offered a more nuanced view of content moderation. While Zuckerberg is more than happy to allow controversial posts from politicians, Dorsey describes his company’s actions as an attempt to present a complete picture of conflicting statements and let people decide for themselves. “This does not make us an arbiter of truth. More transparency from us is critical so folks can clearly see the why behind our actions.”
Not true and not illegal.
This was pulled because we got a DMCA complaint from copyright holder. https://t.co/RAsaYng71a
— jack (@jack) June 6, 2020
Many of us are quick to attribute Zuckerberg’s missteps to his inexperience of the real world, the delusions of Silicon Valley, his disconnect from the troubling history of oppressed communities, and the privilege of being a straight white urban male. But if you observe closely, you’ll find a power-hungry billionaire who knows his way around like a seasoned politician.
Facebook: The Star Campaigner
He knows how to hide behind the protections of Section 230(c) without even mentioning the term. While his inaction might not be a response to Trump’s threat of action against social media platforms, he’s clearly willing to please the right people. He knows that politics is synonymous with power, and so is money. When Zuckerberg let Facebook’s dedicated marketing teams help Trump use the platform to his advantage, he knew he was working with a racist who had also been accused of multiple sexual assaults.
It’s also safe to assume that Zuckerberg had some knowledge of the Indian Prime Minister’s complicated past and his party’s right-wing Hindu nationalist ideals when his company started assisting him with election campaigns. Facebook’s little-known government and politics team has helped political parties in democracies like India, Brazil, and the UK attain power by helping candidates expand their digital reach.
Mark Zuckerberg is also perfectly adept at apologizing when he knows Facebook has crossed a line. In recent years, Facebook has acknowledged that its platform was used to foster an environment of hate that enabled deadly violence in Sri Lanka and Myanmar. He also knows how to dodge questions like a seasoned suspect, during congressional hearings and company-wide calls alike.
Absolute Power Corrupts Absolutely
To sum up, Zuckerberg is aware of how much power he holds and exercises it as he pleases. With about 60% of the company’s voting shares, his control over Facebook is nearly unchallenged. His speeches might give the impression of an idealist trying hard to use technology to connect humans, but he knows how to use diplomatic wordplay to keep wooing users and investors. Today he’s appeasing Trump because he knows he can do both: entertain him and keep preaching the gospel of free expression. If things ever cross the threshold he has in mind, he might just do something.
If we look at the bigger picture, the web has evolved over the years into a giant playground where corporations like Google, Facebook, Twitter, and Reddit fight to capture the biggest piece of the pie. The constant push to scale their platforms and acquire as many users as possible has become the only way to upstage one another; Facebook, for instance, also owns Instagram and WhatsApp. The sheer size of Silicon Valley corporations, and the willingness of billionaire tech bros to compromise ethical values to fuel a maddening pace of growth, makes them more vulnerable to scandals, data leaks, and corruption.
Facebook isn’t doing a great job; in fact, it’s doing a terrible job. While many of these failures might result from the overwhelming number of posts to moderate, as established earlier, Zuckerberg wants to have it all. He loves the profits of running the biggest social media platform in the world, one that has become a source of news for millions, but he is reluctant to officially call it a news platform.
He acknowledges the numerous gaps in content moderation policies and their execution, but turns a blind eye to the working conditions of his content moderators. He wants Facebook to be the leading platform that shapes human conversations and global politics, but refuses to make decisions that might offend world leaders. He doesn’t want to be called the arbiter of truth, but is willing to help divisive politicians win elections.
In 1968, J.C.R. Licklider, an ARPA director, called computers a plastic and moldable medium of communication in his landmark paper “The Computer as a Communication Device.” He believed that within a few years, humans would be able to communicate with each other more effectively through a machine than face to face. Little did he know that the dream of turning the world of computing into an inclusive and mature place for everyone would crack to reveal its darker consequences.
Mark Zuckerberg is aware of this harsh reality, and he might even try to put some bandages on the wound in the future, but I’m afraid the internet, as we know it, could soon become untreatable.