Behind the Screen

Content Moderation in the Shadows of Social Media
by Sarah T. Roberts · 2019 · 280 pages

Key Takeaways

1. Commercial Content Moderation: The Invisible, Essential Internet Labor

Commercial content moderators are professional people paid to screen content uploaded to the internet’s social media sites on behalf of the firms that solicit user participation.

Defining the work. Commercial content moderation (CCM) involves paid professionals reviewing user-generated content (UGC) on social media platforms, news sites, and apps. Their job is to decide, at a pace of thousands of items per day, whether content adheres to site guidelines and legal standards. Unlike the volunteer moderators of the early internet, their presence is often deliberately kept discreet and difficult for users to detect.

Why it's essential. As social media platforms grew to global scale, the volume of UGC became staggering (e.g., hundreds of hours of video uploaded to YouTube per minute). Companies need CCM for brand protection, legal compliance (e.g., copyright, child abuse material), and maintaining a usable, appealing environment for users. Without it, platforms would be overwhelmed by objectionable, illegal, or off-topic content.

The paradox of invisibility. Despite being mission-critical, CCM workers and their labor are largely invisible to the public. This opacity is by design, allowing platforms to maintain an image of unfettered user expression while quietly controlling content flow. Unveiling this hidden labor is crucial to understanding how the contemporary internet is shaped and for whom.

2. From Volunteer Governance to Industrial-Scale Gatekeeping

What is new, however, is the industrial-scale organized content moderation activities of professionals who are paid for their evaluative gatekeeping services, and who undertake the work they do on behalf of large-scale commercial entities.

Early internet moderation. Content moderation isn't new; early online communities (MOOs, MUDs, BBSs) had users creating and enforcing rules, often through volunteer efforts. These spaces developed unique norms and self-governance structures, reflecting the community's identity and values. Access was limited, primarily to academics and tech enthusiasts.

The shift to commercial scale. The explosion of the World Wide Web and social media transformed the internet into a mass-market, commercial space. This necessitated a shift from volunteer, community-based moderation to organized, paid, industrial-scale operations. Large corporations, not users, now dictate the rules and employ gatekeepers to enforce them.

Gatekeeping for profit. This new era of CCM is driven by commercial interests. Moderators act on behalf of platforms and brands to curate content that maximizes user engagement and advertising revenue. This fundamentally changes the nature of online spaces, prioritizing corporate goals over community norms or ideals of free expression.

3. A Fractured Global Workforce: The Taxonomy of Moderation Labor

To understand the practice of commercial content moderation, it is necessary to gain a picture of how, where, and by whom the work of moderation takes place.

Diverse labor arrangements. CCM work is not monolithic but occurs under various organizational and geographic arrangements, often by design to reduce costs and accountability. Key types include:

  • In-house: Workers on-site at the tech company, often contractors.
  • Boutique: Specialized firms managing online presence for clients.
  • Call Center: Large BPO operations offering moderation as one service.
  • Microlabor Platform: Task-based work on sites like Mechanical Turk.

Global dispersion. Workers are often located far from the content's origin or destination, leveraging global labor pools for cost savings and specific linguistic/cultural skills. This dispersion makes the workforce difficult to track, organize, and advocate for.

Hybrid strategies. Large platforms often use a combination of these arrangements, creating complex layers of labor stratification. This allows for 24/7 coverage and cost efficiency but further obscures the human labor involved and complicates accountability.

4. Silicon Valley's Digital Factory: Precarious Labor Behind the Plush Facade

It’s factory work, almost. It’s doing the same thing over and over.

The contractor class. Even within the seemingly luxurious environment of Silicon Valley tech campuses, CCM workers often exist as a lower-status contractor class. Hired through third-party firms, they lack the benefits, job security, and status of full-time employees, despite working on-site and performing critical functions.

Rote, repetitive tasks. Despite the digital nature, the work is often described as akin to factory labor – repetitive, queue-based, and driven by productivity metrics (e.g., processing hundreds of tickets per hour). Tools are designed for efficiency, reducing complex judgments to quick clicks.

Limited mobility and recognition. Contractors face a revolving door, with limited-term contracts and little opportunity for permanent employment or advancement within the tech company. Their expertise and insights, gained from direct exposure to platform issues, are often undervalued and ignored by full-time staff and policy makers.

5. The Psychological Toll: Exposure to Humanity's Worst Expressions

I can’t imagine anyone who does [this] job and is able to just walk out at the end of their shift and just be done.

Constant exposure to trauma. Moderators routinely view disturbing content, including violence, pornography, hate speech, and child abuse material. This exposure is not fleeting but a daily reality, leading to potential psychological harm like burnout, desensitization, anxiety, and PTSD.

Difficulty compartmentalizing. Workers struggle to separate their job experiences from their personal lives. Disturbing images or themes can intrude on their thoughts and interactions outside of work, impacting relationships and well-being.

Lack of adequate support. Despite the known risks, psychological support for moderators is often insufficient, inconsistent, or inaccessible (especially for contractors lacking health benefits). Workers often rely on peer support, but even this can be challenging due to job insecurity, turnover, and the difficulty of discussing the content.

6. Brand Protection and Geopolitics: Moderation as Corporate Foreign Policy

There really is no free speech on commercial sites.

Corporate values dictate policy. Content moderation policies are not neutral but reflect the values and priorities of the platforms, primarily brand protection and profitability. Internal rules, often secret, prioritize avoiding PR crises and legal issues over abstract notions of free speech.

Editorial power. Moderators, guided by these policies, wield significant editorial power, deciding what content is seen by millions globally. This includes sensitive material like war zone footage or political advocacy, where decisions can have real-world geopolitical implications.

Uneven application. Policies can be applied unevenly, favoring large "partners" or reflecting the cultural and political biases of the company's headquarters (e.g., U.S.-centric views on nudity or conflict). This highlights how corporate interests shape the information landscape presented to users worldwide.

7. Outsourcing and Postcolonial Legacies: The Global Race to the Bottom

Outsourcing is not just a geographic concept denoted by physical-world spatial characteristics or constraints, but something else: a set of labor processes and configurations, and a designation of an available labor pool, as opposed to simply location.

Beyond geography. Outsourcing CCM is not merely moving jobs but leveraging global labor markets characterized by lower wages, fewer protections, and specific cultural/linguistic skills. This creates a "race to the bottom" where companies seek the cheapest labor pool, driving down wages and conditions globally.

Postcolonial connections. The rise of countries like the Philippines as BPO hubs is linked to historical colonial relationships, particularly with the United States. Filipino workers' fluency in American English and familiarity with Western culture, legacies of U.S. occupation, are commodified and sold as valuable assets for moderation work.

Infrastructure and inequality. This global labor flow is supported by specific infrastructure developments (e.g., IT parks, fiber optics) often built through public-private partnerships in the Global South. These developments can exacerbate local inequalities, creating islands of high-tech work within areas lacking basic services, further highlighting the uneven benefits of globalization.

8. The Future of Moderation: AI, Regulation, and the Fight for Visibility

That’s a question we get asked a lot: When is AI going to save us all? We’re a long way from that.

AI limitations. While AI and automation are increasingly used (e.g., for detecting known child abuse material), they are far from replacing human moderators. Complex judgments involving context, nuance, and cultural understanding still require human cognitive skills. AI is also opaque and raises questions about bias and accountability.

Growing pressure for transparency and regulation. Scandals and public concern are forcing platforms to acknowledge CCM and face calls for greater transparency in their policies and practices. Governments, particularly in Europe, are enacting laws demanding faster content removal and holding platforms more accountable.

Worker organizing and advocacy. Despite challenges like global dispersion and NDAs, there are nascent efforts to organize moderators and advocate for better working conditions, pay, and psychological support. Legal challenges are also emerging, seeking to hold companies liable for the harm caused by the work. The fight for visibility and value for this essential labor continues.

Review Summary

3.66 out of 5
Average of 100+ ratings from Goodreads and Amazon.

Behind the Screen receives mixed reviews, with an average rating of 3.66/5. Readers appreciate its informative account of content moderation, which highlights the challenging work and precarious conditions of moderators. Some praise the author's interviews and analysis, while others find the writing repetitive and lacking depth; several critics want more detail about moderators' experiences and company policies. The book is valued for exposing the hidden workings of social media and for sparking discussion of digital labor, though some feel it could have explored certain areas more thoroughly.


About the Author

Sarah T. Roberts is an academic researcher specializing in commercial content moderation and digital labor. Her book, Behind the Screen, is based on extensive research and interviews with content moderators, and it brings attention to the often overlooked workforce responsible for filtering online content. Her work combines academic rigor with accessible writing, aiming to shed light on the human cost of maintaining social media platforms. Roberts' background in information studies and her focus on the intersection of technology, labor, and society inform her approach to the subject, and her research has contributed significantly to understanding the complexities of digital content moderation and its impact on workers.
