Facebook is planning to use its internal moderation tools designed for “at-risk” countries during the U.S. Presidential elections. The toolkit has been used in countries like Sri Lanka and Myanmar to reduce the possibility of unrest due to misinformation.
According to The Wall Street Journal's sources, the company will deploy the toolkit only under dire circumstances, such as election-related violence. However, the fact that Facebook is preparing for such possibilities is a matter of concern in itself.
What Are Facebook Moderation Tools For “At-Risk” Countries?
With the increasing influence of social media comes an increased possibility of misinformation on the platform. That's why social media platforms have different moderation tools to stop or limit the damage caused by certain types of posts. Facebook has designated some countries as "at-risk" places where special moderation tools are used.
According to WSJ sources, the tools include across-the-board slowing of the spread of posts as they start to go viral and tweaking of the news feed to change what types of content users see. Facebook's systems may also apply a lower threshold for flagging dangerous content during the U.S. elections.
Simply put, the tools limit the spread of sensational content, incitement to violence, and misinformation about the U.S. elections. Limiting such content would make Facebook the gatekeeper of what millions of Americans see during the election period.
Facebook, U.S. Elections, and Content Moderation
Deploying such tools in the U.S. may lead to bigger problems for Facebook. The company's CEO and chairman, Mark Zuckerberg, was issued a subpoena by Senate Republicans over the moderation of a New York Post article. That was one instance of Facebook trying to limit a single article with questionable sourcing; imagine the backlash if it did so to thousands of outlets at once.
Facebook also recently decided to stop running political ads in the period around the election. The company has learned a lesson from the last U.S. election but still comes up short when it comes to everyday moderation. Lately, Facebook has focused on taking down anti-vax and Holocaust-denial posts, both of which are steps toward healthier moderation.
Good-faith political discussion may also be affected if Facebook decides to implement these measures. Ultimately, there's no guarantee that the algorithms won't also filter out content people need to see to make an informed voting choice. However, given the circumstances, it would be a smarter move for the social media giant to moderate strictly than to be held responsible for any unrest.