GENERAL PRINCIPLES
The following five principles should guide the reader:
1. Platforms Have First Amendment Rights
The First Amendment applies only to government suppression of speech. States such as Florida
and Texas have passed laws that aim to prohibit large social media companies from banning users
or removing content based on political viewpoint, but the constitutionality of those laws has come
under scrutiny.[3] As private entities, platforms have the right to remove any speech that does not
align with the content moderation rules they have created.
2. There is Too Much Content to Moderate Perfectly in Real-Time
As of February 2020, more than 30,000 hours of video were uploaded to YouTube, roughly
350,000 tweets were posted, and about 510,000 Facebook comments were posted every hour.[4]
Across Facebook, Messenger, Instagram, and WhatsApp, 1 billion stories are shared around the
world every day.[5]
Moderating this amount of content at scale and in real-time becomes even more complex when
you consider the world’s many languages and social contexts that moderators have to navigate
when making content decisions. Moreover, content moderation decisions are inherently
subjective, and most decisions occur in a gray area influenced by everything from coded language
to suggestive imagery. Users whose content is removed are likely to disagree with the decision,
which means there is no way to moderate in a universally agreed-upon manner. According to a
2020 NYU Stern report, Facebook’s moderators review about 3 million posts, photos, and videos
flagged by AI or by users each day, and Mark Zuckerberg has admitted that content moderators
make the wrong call more than 10% of the time. That means about 300,000 content mistakes are
made per day on Facebook alone.[6]
3. Content Moderation is Not Binary
The prevailing narratives hold that platforms are either moderating too much or too little. However,
that framing is overly simplistic. Many platforms are trying in earnest to improve their moderation
practices, yet they have not devoted enough resources to content moderation, do not give
moderators sufficient time to make challenging content decisions that have long-term
consequences, and do not properly support outsourced moderation teams.[7]
Not enough resources
3. Cat Zakrzewski. “11th Circuit Blocks Major Provisions of Florida’s Social Media Law.” Washington Post, May 23, 2022.
https://www.washingtonpost.com/technology/2022/05/23/florida-social-media-11th-circuit-decision/; Mark Sullivan. “Why the Texas
Social Media Law Just Became a Big Headache for Big Tech.” Fast Company, May 17, 2022.
https://www.fastcompany.com/90752528/why-the-texas-social-media-law-just-became-a-big-headache-for-big-tech.
4. “57 Fascinating and Incredible YouTube Statistics.” Brandwatch. Accessed May 6, 2022.
https://www.brandwatch.com/blog/youtube-stats/; “Twitter Usage Statistics.” Internet Live Stats. Accessed July 1, 2022.
https://www.internetlivestats.com/twitter-statistics/.
5. “Wild and Interesting Facebook Statistics and Facts (2022).” Kinsta, January 3, 2021.
https://kinsta.com/blog/facebook-statistics/.
6. Paul Barrett. “Who Moderates the Social Media Giants? A Call to End Outsourcing.” NYU Stern Center for Business and Human Rights. Accessed May 6, 2022.
https://issuu.com/nyusterncenterforbusinessandhumanri/docs/nyu_content_moderation_report_final_version?fr=sZWZmZjI1NjI1Ng.
7. Billy Perrigo. “Facebook Faces New Lawsuit Alleging Human Trafficking and Union-Busting in Kenya.” Time, May 11, 2022.
https://time.com/6175026/facebook-sama-kenya-lawsuit/.
Trust & Trade-offs: Pathways to Better Content Moderation // 5