Race After Technology

Abolitionist Tools for the New Jim Code

by Ruha Benjamin · 2019 · 258 pages
4.27 average from 2k+ ratings

Key Takeaways

1. The New Jim Code: How technology reinforces racial inequality

"The power of the New Jim Code is that it allows racist habits and logics to enter through the backdoor of tech design, in which the humans who create the algorithms are hidden from view."

Hidden biases in tech. The New Jim Code refers to how new technologies, often marketed as objective or progressive, can actually reinforce existing racial inequalities. This happens through the incorporation of biased data, flawed design processes, and unchallenged assumptions about race. Examples include:

  • Facial recognition systems that struggle to accurately identify people of color
  • Predictive policing algorithms that disproportionately target minority neighborhoods
  • AI-powered hiring tools that favor candidates with "white-sounding" names

Illusion of neutrality. By presenting these technologies as neutral or scientific, the New Jim Code makes it harder to challenge racial bias. It creates a veneer of objectivity that obscures the human decisions and societal contexts shaping these tools.

2. Engineered inequity: Amplifying social hierarchies through tech design

"Robots exemplify how race is a form of technology itself, as the algorithmic judgments of Beauty AI extend well beyond adjudicating attractiveness and into questions of health, intelligence, criminality, employment, and many other fields, in which innovative techniques give rise to newfangled forms of racial discrimination."

Bias by design. Some technologies are explicitly designed in ways that amplify existing social hierarchies based on race, class, and gender. This engineered inequity can manifest in various forms:

  • Beauty AI contests that overwhelmingly select white contestants as winners
  • Social credit systems that codify and reinforce societal prejudices
  • Ethnicity recognition software used for targeted marketing or surveillance

Power dynamics. These technologies reflect and reproduce the values and biases of their creators, often reinforcing the power of dominant groups. By encoding race into technical systems, they can make racial categorizations seem more "scientific" or inevitable.

3. Default discrimination: When "neutral" algorithms perpetuate bias

"Anti-Blackness is no glitch. The system is accurately rigged, we might say, because, unlike in natural weather forecasts, the weathermen are also the ones who make it rain."

Biased data, biased outcomes. Even when not explicitly designed to discriminate, many algorithms perpetuate bias through their reliance on historically biased data. This leads to a cycle of discrimination:

  • Predictive policing algorithms direct more officers to minority neighborhoods, leading to more arrests, which then "justifies" increased policing
  • Risk assessment tools in the criminal justice system often rate Black defendants as higher risk due to factors shaped by systemic racism
  • Healthcare algorithms can underestimate the needs of Black patients due to historical disparities in access to care

Myth of neutrality. The belief that algorithms are inherently objective obscures how they can reinforce and amplify existing societal biases. Challenging this myth is crucial for addressing algorithmic discrimination.

4. Coded exposure: The paradox of hypervisibility and invisibility

"Exposure, in this sense, takes on multiple meanings. Exposing film is a delicate process – artful, scientific, and entangled in forms of social and political vulnerability and risk."

Dual nature of visibility. Technology can simultaneously render marginalized groups hypervisible to systems of surveillance and control, while making them invisible in other contexts:

  • Facial recognition systems that struggle to identify dark-skinned individuals, yet are disproportionately used to monitor communities of color
  • Medical imaging technologies historically calibrated for light skin, leading to missed diagnoses for patients of color
  • Digital platforms that claim to be "colorblind" while ignoring how race shapes user experiences

Power of representation. Who controls the means of representation through technology has significant implications for how different groups are seen and treated in society. This extends from photography to modern AI systems.

5. Technological benevolence: When "fixes" reinforce discrimination

"Magical for employers, perhaps, looking to streamline the grueling work of recruitment, but a curse for many job seekers."

Trojan horse of progress. Many technologies are marketed as solutions to social problems, including racial bias. However, these "fixes" often end up reinforcing the very systems they claim to address:

  • AI-powered hiring tools that claim to reduce bias but encode existing prejudices
  • Predictive healthcare algorithms that reinforce racial disparities under the guise of efficiency
  • Electronic monitoring as an "alternative" to incarceration that extends carceral control

Critique of techno-solutionism. The belief that complex social issues can be solved through technological innovation alone often leads to superficial fixes that fail to address root causes of inequality.

6. Beyond code-switching: Rewriting dominant cultural codes

"Whereas code-switching is about fitting in and 'leaning in' to play a game created by others, perhaps what we need more of is to stretch out the arenas in which we live and work to become more inclusive and just."

Limitations of adaptation. Code-switching, or changing one's behavior to conform to dominant norms, places the burden of change on marginalized individuals. Instead, we need to challenge and rewrite the dominant cultural codes embedded in our technologies and institutions.

Systemic change. This involves:

  • Diversifying tech workforces and leadership
  • Implementing rigorous bias testing in AI development
  • Creating new design paradigms centered on equity and justice
  • Empowering communities to shape the technologies that affect them

7. Abolitionist tools: Reimagining technology for liberation

"An abolitionist toolkit, in this way, is concerned not only with emerging technologies but also with the everyday production, deployment, and interpretation of data."

Technology for justice. Drawing inspiration from abolitionists of the past, we can develop technologies and practices that actively work to dismantle systems of oppression:

  • Appolition: An app that converts spare change into bail money for incarcerated individuals
  • Algorithmic Justice League: An organization that audits AI systems for bias
  • Community-based data projects that challenge official narratives and empower marginalized groups

Reimagining the future. This approach involves:

  • Centering the experiences and knowledge of marginalized communities in tech development
  • Creating accountability mechanisms for tech companies and governments
  • Fostering digital literacy and critical engagement with technology
  • Developing alternative visions of how technology can serve social justice


FAQ

What's Race After Technology about?

  • Explores technology and race: Race After Technology by Ruha Benjamin examines how emerging technologies often reinforce existing racial biases and inequities, introducing the term "New Jim Code" to describe this phenomenon.
  • Focus on systemic racism: The book argues that technologies are not neutral; they are embedded with the values and biases of their creators, which can perpetuate systemic racism.
  • Calls for critical engagement: Benjamin encourages readers to critically engage with technology, advocating for a more equitable approach to tech design and implementation.

Why should I read Race After Technology?

  • Timely and relevant: The book addresses pressing issues of racial inequality in the context of rapidly advancing technology, making it essential for understanding contemporary social dynamics.
  • Empowers readers: It provides tools and frameworks for readers to recognize and challenge the biases embedded in technology, fostering a sense of agency in advocating for justice.
  • Interdisciplinary approach: Combining insights from science and technology studies with critical race theory, the book offers a rich, multifaceted perspective on the intersection of race and technology.

What are the key takeaways of Race After Technology?

  • Technological neutrality is a myth: Technologies reflect and reproduce societal biases, particularly against marginalized groups, rather than being neutral.
  • The New Jim Code: Benjamin introduces this concept to describe how modern technologies can perpetuate racial discrimination under the guise of objectivity and efficiency.
  • Call for abolitionist tools: The author advocates for the development of tools that can help dismantle systemic racism and promote equity in technology.

What is the "New Jim Code" as defined in Race After Technology?

  • Definition of the term: The New Jim Code refers to the use of new technologies that reflect and reproduce existing racial inequities, often perceived as objective or progressive.
  • Examples in practice: Technologies like predictive policing algorithms and facial recognition software often target marginalized communities, reinforcing systemic racism.
  • Historical context: The term draws parallels to the Jim Crow laws, highlighting how contemporary technologies can perpetuate similar forms of racial control and discrimination.

How does Race After Technology define engineered inequity?

  • Explicit amplification of hierarchies: Engineered inequity refers to how certain technologies are designed to explicitly reinforce social hierarchies based on race, class, and gender.
  • Case studies: The book discusses examples like the Beauty AI contest, where algorithms favored lighter skin tones, demonstrating how biases are coded into technology.
  • Implications for society: This concept emphasizes the need for critical scrutiny of technological designs and their societal impacts, urging a rethinking of how technologies are developed.

What role do algorithms play in perpetuating racial bias according to Race After Technology?

  • Data-driven discrimination: Algorithms often rely on historical data that reflects existing biases, leading to outcomes that reinforce racial stereotypes and inequalities.
  • Lack of transparency: Many algorithms operate as "black boxes," making it difficult to understand how decisions are made and who is affected by them.
  • Need for reform: The book argues for the necessity of auditing algorithms to ensure they do not perpetuate systemic racism and to promote accountability in their design.

How does Ruha Benjamin suggest we reimagine technology in Race After Technology?

  • Advocacy for justice-oriented design: The author calls for a shift towards designing technologies that prioritize equity and justice, rather than merely efficiency and profit.
  • Community engagement: Benjamin emphasizes the importance of involving marginalized communities in the design process to ensure that their needs and perspectives are represented.
  • Critical awareness: She encourages readers to cultivate a critical awareness of the technologies they use and to challenge the narratives that frame them as neutral or benevolent.

What are some examples of default discrimination discussed in Race After Technology?

  • Predictive policing algorithms: These algorithms often mislabel Black individuals as high-risk, perpetuating racial profiling and discrimination.
  • Facial recognition technology: These systems are less accurate for people of color, leading to higher rates of misidentification and wrongful accusations.
  • Employment algorithms: Studies show that job applicants with Black-sounding names receive fewer callbacks than those with White-sounding names, illustrating systemic bias in hiring practices.

How does Race After Technology address the concept of privacy?

  • Privacy as a social issue: The book argues that privacy is not just an individual concern but a collective issue that affects marginalized communities disproportionately.
  • Surveillance technologies: Benjamin discusses how technologies like facial recognition and data collection infringe on the privacy rights of individuals, particularly those from racialized backgrounds.
  • Need for protective measures: The author calls for stronger regulations and community-led initiatives to safeguard privacy and ensure that technology serves the public good rather than surveillance interests.

What are some of the best quotes from Race After Technology and what do they mean?

  • “Racism is the most slovenly of predictive models.” This quote emphasizes that racial biases are often embedded in algorithms, leading to inaccurate and harmful predictions about marginalized groups.
  • “Hope is a discipline.” This reflects the idea that advocating for justice and equity requires sustained effort and commitment, rather than passive optimism.
  • “The tool never possess the [hu]man.” This quote underscores the importance of maintaining human agency in the face of technological advancements that seek to control or define individuals.

How does Race After Technology address the intersection of race and technology in a global context?

  • Global implications of the New Jim Code: Benjamin discusses how the principles of the New Jim Code extend beyond the United States, affecting marginalized communities worldwide.
  • Comparative analysis: The book examines how different countries implement technologies that reflect their unique racial and social dynamics, highlighting the need for a global perspective on tech equity.
  • Call for solidarity: The author advocates for international solidarity among movements fighting against technological oppression, emphasizing the interconnectedness of struggles for racial justice.

How does Ruha Benjamin define "abolitionist tools" in Race After Technology?

  • Tools for liberation: Abolitionist tools are methods and technologies designed to dismantle oppressive systems and promote justice and equity.
  • Community-centered approach: These tools emphasize the importance of community involvement and the need to prioritize the voices of those most affected by technological injustices.
  • Examples of tools: Benjamin discusses initiatives like Appolition, which helps fund bail for those unable to afford it, as a practical application of abolitionist principles.

Review Summary

4.27 out of 5
Average of 2k+ ratings from Goodreads and Amazon.

Race After Technology explores how technology perpetuates racial biases and inequalities, challenging the notion of technological neutrality. Benjamin introduces the concept of the "New Jim Code" to describe how algorithms and AI systems can reinforce discrimination while appearing objective. Reviewers praise the book's eye-opening insights and diverse examples, though some found it lacking in structure or solutions. Many readers consider it essential for those working in or studying technology, as it raises critical questions about the intersection of race, justice, and innovation in our increasingly digital world.

About the Author

Ruha Benjamin is a Professor of African American Studies at Princeton University, specializing in the study of science, medicine, race, and technology. She authored multiple books, including "People's Science" and "Race After Technology," and edited "Captivating Technology." Benjamin holds degrees from Spelman College and UC Berkeley, completed postdoctoral fellowships at UCLA and Harvard, and has received numerous awards and grants. Her work examines the complex relationships between knowledge, power, and technology, with a focus on how these intersect with issues of race and social justice in contemporary society.
