What Facebook can learn from Reddit

Nov. 30, 2018, 1:00 a.m.

In this digital age, social media companies face increasing pressure to take stands on what content is allowed on their platforms, which in turn has captured the public’s interest in how content moderation policies are formed and enforced. Content moderation plays a pivotal role in encouraging effective discussion, and in balancing free expression against toxicity, in the online communities built on these platforms.

Reddit and Facebook are two popular, yet differently structured social media platforms.

Reddit is a collection of forums called “subreddits,” each encouraging discussion on a particular topic of user interest; its content moderation is editorially minded, with a community of moderators responsible for each subreddit. Facebook, on the other hand, is a collection of personal news feeds and public pages, and its content moderation process is more reactive, often taking action only when the media puts the company under the spotlight. Though Facebook and Reddit are structured differently, Facebook can learn a lot from Reddit’s editorially minded approach. Below is a list of recommendations for Facebook to improve its content moderation, based on our interviews with 20 moderators from some of the most popular and controversial subreddits.

  1. Hiring in greater quantity and quality

Facebook needs to hire in greater quantity, in order to handle the platform’s scale, and in greater quality, by hiring moderators from other platforms. A moderator from /r/meirl suggests that “Facebook … [should] headhunt mods like myself, high action wiki editors, [etc.] to moderate public posts to a strict standard ruleset decided by [Facebook’s] community managers.” These moderators, who already have extensive experience in managing online communities, will easily be able to translate their skills to improve Facebook’s content moderation. Another benefit is that these moderators are more likely to be motivated by the mission of content moderation than by money; many of the Reddit moderators we interviewed cared about seeing their subreddits succeed, even though moderation is purely volunteer work. A moderator from /r/AskHistorians argues that the success of content moderation has less to do with the differences among platforms than with the “mission and commitment of the moderators.”

  2. Stronger collaboration between moderators and software developers

Successful content moderation relies on strong, transparent collaboration between moderators and software developers, because technology is necessary to handle the scale of moderation. Reddit moderators have several tools at their disposal to help them with their job, including Automod, which uses regular-expression matching to flag problematic posts, and Modqueue, a queue of flagged or reported posts for moderators to process. When Reddit moderators found the existing tools lacking, they created /r/toolbox, expanding the features available to moderators and helping the entire Reddit moderator community. At Facebook, however, content moderation has always been hidden behind a curtain of secrecy, due to NDAs that legally bar moderators from discussing their work. Even if the moderation process is never made fully transparent to the public, there needs to be open communication between moderators and software developers within the company, so that moderators can suggest tools to make their work easier and programmers can build them.
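To make this concrete, the kind of regex-based flagging Automod performs can be sketched in a few lines. This is an illustrative approximation only: the patterns below are hypothetical examples, and Reddit’s actual AutoModerator is configured by moderators through rule files rather than code like this.

```python
import re

# Hypothetical moderator-defined patterns; real Automod rules are
# written by each subreddit's moderators, not hard-coded like this.
FLAG_PATTERNS = [
    re.compile(r"\bbuy\s+followers\b", re.IGNORECASE),  # common spam phrasing
    re.compile(r"(.)\1{9,}"),  # 10+ repeated characters (keyboard mashing)
]

def flag_post(text):
    """Return True if the post matches any pattern, so it can be sent
    to a human review queue (as Modqueue does on Reddit)."""
    return any(p.search(text) for p in FLAG_PATTERNS)
```

The key design point is that the tool does not remove posts on its own; it only surfaces likely problems, leaving the judgment call to human moderators.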

  3. Building moderation communities

Reddit’s success in content moderation is tied to the strong community of moderators within each subreddit. Moderators frequently consult with the rest of their team, establishing a system of checks and balances in which everyone not only “helps moderate the submissions, but also one another.” Though Facebook is not inherently divided into communities like subreddits, it can create communities of moderators based on categories of public pages and on region, because moderation standards are culture-specific. This requires Facebook to build technological infrastructure that allows moderators within the same community to communicate with one another, ask each other questions and make crucial moderation decisions together. Content moderation is a challenge that shouldn’t be tackled alone, given the potentially grueling nature of the work. Developing a sense of community boosts moderators’ morale by surrounding them with individuals who share their passions and by making their job feel like a team effort.

  4. Adopting a stronger public stance on content moderation

Most importantly, Facebook needs to adopt a stronger stance on content moderation, rather than hiding behind neutrality. Recent events, such as Facebook’s failure to curb hate speech during Myanmar’s Rohingya genocide, have proven that staying neutral is impossible, as the cost of inaction is far too great. Facebook should view content moderation less as a “free speech issue” and more from the angle of “what types of communities they want to foster.” This includes how best to protect the platform’s users from influencers who leverage the platform to spread false information and hate, among other things. In these extreme cases, Facebook needs to take a stronger stand on de-platforming: removing an individual from a platform to prevent them from spreading offensive views to the public. A moderator of /r/IAmA gives the example of Alex Jones, an infamous conspiracy theorist who galvanizes his audience through artificial narratives. An increasing number of studies show that de-platforming is successful; the “falloff [of controversial figures’ audiences] is pretty significant, and they don’t gain the same amplification of power they had prior to [their de-platforming].” Facebook made the right decision in de-platforming Alex Jones, but it needs to continue moving in this direction.

Content moderation will never be perfect, as social media platforms are made by people for people, and mistakes are an inherent part of that. Facebook, however, needs to prioritize content moderation, since it is the company’s corporate and ethical responsibility to its users. Stronger content moderation would help avoid preventable incidents on the platform and leave Facebook better equipped when unavoidable problems do arise within these online communities.

– Aileen Wang ’19 and Phathaphol (Peter) Karnchanapimonkul ’19


Contact Aileen Wang at aileen15 ‘at’ stanford.edu and Peter Karnchanapimonkul at phatk ‘at’ stanford.edu.


The Daily is committed to publishing a diversity of op-eds and letters to the editor. We’d love to hear your thoughts. Email letters to the editor to eic ‘at’ stanforddaily.com and op-ed submissions to opinions ‘at’ stanforddaily.com.
