Key Takeaways
1. Facebook's pursuit of growth led to unintended consequences
"If we don't create the thing that kills Facebook, someone else will."
Rapid expansion and innovation. Facebook's growth strategy was driven by a fear of competition and a desire to dominate the social media landscape. The result was aggressive expansion and frequent product changes, often shipped without full consideration of their potential negative impacts.
Prioritizing metrics over user experience. The company focused heavily on metrics like Daily Active People (DAP) and engagement, sometimes at the expense of user well-being. This approach encouraged the development of features that boosted these metrics, even when they had potentially harmful effects on users or society.
Key growth tactics:
- Copying features from competitors
- Acquiring potential rivals (e.g., Instagram, WhatsApp)
- Constantly tweaking algorithms to maximize engagement
- Expanding aggressively into new markets
2. Internal research revealed Instagram's negative impact on teen mental health
"We make body image issues worse for one in three teen girls."
Shocking findings. Facebook's internal research uncovered disturbing trends about Instagram's effects on young users, particularly teenage girls. The platform was found to exacerbate issues related to body image, anxiety, and depression for a significant portion of its user base.
Public vs. private stance. Despite these internal findings, Facebook publicly downplayed the negative effects of its platforms on mental health. This discrepancy between internal knowledge and public statements would later become a major point of controversy.
Key research findings:
- 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse
- Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced them to Instagram
- The platform's emphasis on curated, idealized lifestyles contributed to feelings of inadequacy
3. The company's content moderation efforts were inadequate and inconsistent
"We estimate that we may action as little as 3–5% of hate and 0.6% of [violence and incitement] on Facebook, despite being the best in the world at it."
Overwhelmed systems. Despite Facebook's claims of robust content moderation, internal documents revealed that the company's efforts were often inadequate, especially in non-English-speaking markets. The sheer volume of content overwhelmed both human moderators and AI systems, which typically removed only content they flagged with very high confidence (see the sketch after the list below).
Inconsistent enforcement. The XCheck program, which provided special treatment to high-profile users, further undermined the company's claims of fair and consistent moderation. This system often allowed influential users to bypass normal content rules.
Content moderation challenges:
- Lack of moderators for many languages and dialects
- AI systems struggling with context and nuance
- Difficulty in scaling moderation efforts to match platform growth
- Special treatment for VIPs and celebrities through XCheck
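One reason automated enforcement catches so little is that classifiers are tuned for precision: a post is auto-removed only when the model flags it with very high confidence, so most genuinely violating content scores below the cutoff and stays up. The sketch below is a hypothetical toy model of that dynamic, not Facebook's actual system; the threshold and score distribution are invented for illustration.

```python
# Toy model of precision-tuned automated enforcement (hypothetical,
# not Facebook's system). Posts are auto-removed only when the
# classifier scores them above a high-confidence threshold, so most
# genuinely violating posts remain on the platform.
import random

random.seed(0)

REMOVAL_THRESHOLD = 0.90  # invented high-precision cutoff

# Simulated classifier scores for 10,000 truly violating posts.
# Real classifiers are uncertain, so scores spread widely rather
# than clustering near 1.0.
scores = [random.betavariate(2, 2) for _ in range(10_000)]

actioned = sum(score >= REMOVAL_THRESHOLD for score in scores)
print(f"Auto-removed {actioned / len(scores):.1%} of violating posts")
# Prints roughly 3%, the same order of magnitude as the action
# rates cited in the internal documents.
```

Lowering the threshold would catch more violations but also remove far more legitimate posts, a trade-off that helps explain why automated action rates stayed so low.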
4. Facebook's algorithms amplified divisive and sensationalist content
"Anger and hate is the easiest way to grow on Facebook."
Engagement-driven algorithms. Facebook's content recommendation systems, particularly the News Feed ranking algorithm, were designed to maximize user engagement. This often resulted in the promotion of sensationalist, divisive, and emotionally charged content; the toy sketch after the list below illustrates the mechanism.
Unintended consequences. While these algorithms were effective at keeping users on the platform, they also contributed to the spread of misinformation, conspiracy theories, and polarizing content. The company's efforts to address these issues were often too little, too late.
Algorithm effects:
- Boosting controversial and emotionally charged content
- Creating filter bubbles and echo chambers
- Amplifying extreme political views
- Inadvertently promoting conspiracy theories and misinformation
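To make the amplification mechanism concrete, here is a minimal, hypothetical ranking sketch. It is not Facebook's code; the one grounded detail is that, per reporting on the 2017-era "Meaningful Social Interactions" formula, emoji reactions such as "angry" were weighted about five times as much as a like. All other weights, names, and numbers are invented.

```python
# Minimal engagement-weighted ranking sketch (hypothetical, not
# Facebook's code). Reporting on the 2017-era "Meaningful Social
# Interactions" formula described emoji reactions, including "angry",
# as weighted about five times a like; the other weights are invented.
from dataclasses import dataclass

WEIGHTS = {"like": 1.0, "angry": 5.0, "love": 5.0, "comment": 15.0, "share": 30.0}

@dataclass
class Post:
    text: str
    counts: dict[str, int]  # engagement type -> count

def engagement_score(post: Post) -> float:
    """Sum the weighted engagement signals for one post."""
    return sum(WEIGHTS.get(kind, 0.0) * n for kind, n in post.counts.items())

feed = [
    Post("calm news update", {"like": 400, "comment": 10}),
    Post("outrage bait", {"like": 50, "angry": 120, "comment": 80, "share": 20}),
]

# Ranking purely by engagement puts the divisive post first, even
# though far more people merely "liked" the calm one.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):8.1f}  {post.text}")
```

Under these weights the divisive post scores 2,450 against 550 for the calm update: emotionally charged content wins the ranking even when it reaches fewer satisfied readers, which is the feedback loop the internal researchers flagged.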
5. Whistleblower Frances Haugen exposed Facebook's inner workings
"I believe in giving people a voice because, at the end of the day, I believe in people."
Insider turned whistleblower. Frances Haugen, a former Facebook employee, gathered thousands of internal documents and shared them with the Wall Street Journal and regulators. Her actions brought unprecedented public scrutiny to Facebook's internal operations and decision-making processes.
Public reckoning. Haugen's revelations led to congressional hearings, media investigations, and increased calls for regulation of social media platforms. Her testimony and the documents she provided painted a picture of a company that often prioritized growth and engagement over user safety and societal impact.
Key revelations from Haugen:
- Instagram's negative impact on teen mental health
- Facebook's struggles with content moderation
- The company's role in spreading misinformation
- Internal research on algorithmic amplification of divisive content
6. The company prioritized engagement over user well-being
"We have a lot of people using the product because they're addicted and not because they're enjoying it."
Metrics-driven culture. Facebook's internal culture was heavily focused on metrics like engagement, time spent on the platform, and user growth. This often led to decisions that prioritized these metrics over user well-being or societal impact.
Resistance to change. Despite internal research highlighting potential negative effects of the platform, Facebook often resisted making significant changes that might impact its core metrics. This reluctance to address known issues contributed to many of the problems the company faced.
Examples of prioritizing engagement:
- Promoting content that generated strong emotional responses
- Encouraging users to join multiple groups and pages
- Designing features to maximize time spent on the platform
- Resisting changes that might reduce overall engagement, even if they improved user experience
7. Facebook struggled to address misinformation and political manipulation
"We were getting ready to win this election. Frankly, we did win this election."
Platform vulnerabilities. Facebook's size and reach made it a prime target for those seeking to spread misinformation and manipulate public opinion. The company's efforts to combat these issues were often reactive and insufficient.
Political challenges. The platform's role in elections and political discourse became increasingly controversial, especially after the 2016 U.S. presidential election and the events surrounding January 6, 2021. Facebook struggled to balance free speech concerns with the need to prevent the spread of harmful misinformation.
Key misinformation and manipulation issues:
- Foreign interference in elections
- Spread of conspiracy theories like QAnon
- Proliferation of false claims about COVID-19 and vaccines
- Use of the platform to organize violent events like the January 6 insurrection
8. The platform's global expansion outpaced its ability to manage risks
"We have heavily overpromised regarding our ability to moderate content on the platform."
Rapid international growth. Facebook's aggressive expansion into new markets often outpaced its ability to effectively moderate content and manage risks in these regions. This led to serious issues in countries like Myanmar, where the platform was used to incite violence against the Rohingya minority.
Resource allocation problems. The company often lacked adequate resources, including content moderators and AI systems, to effectively manage its platforms in many non-English-speaking countries. As a result, the platform's harms fell disproportionately on users in those regions.
Global expansion challenges:
- Lack of language expertise for content moderation
- Insufficient understanding of local political and cultural contexts
- Inadequate resources allocated to at-risk countries
- Difficulty in scaling safety measures to match rapid user growth
9. Facebook's leadership often ignored or downplayed internal concerns
"I'm sorry. I believe Guy was included."
Disconnect between research and action. Facebook's internal research teams often identified significant problems with the platform's effects on users and society. However, these findings were frequently ignored, downplayed, or not acted upon by company leadership.
Culture of denial. There was a tendency among Facebook's top executives to dismiss or minimize critiques of the platform, both internally and externally. This created a culture where employees felt their concerns about the company's impact were not being taken seriously.
Examples of leadership ignoring concerns:
- Downplaying Instagram's impact on teen mental health
- Resisting changes to the News Feed algorithm that might reduce engagement
- Failing to adequately address content moderation issues in non-English markets
- Dismissing concerns about the platform's role in political polarization
Review Summary
Broken Code by Jeff Horwitz offers a detailed look into Facebook's inner workings, revealing how the company prioritized growth and engagement over user safety and content integrity. Readers found the book informative, eye-opening, and alarming, praising Horwitz's thorough reporting and clear explanations of complex issues. Many were disturbed by Facebook's repeated decisions to ignore harmful consequences of its platform. While some found the writing dry at times, most readers considered it an essential read for understanding social media's impact on society and politics.