Key Takeaways
1. The New Jim Code: How technology reinforces racial inequality
"The power of the New Jim Code is that it allows racist habits and logics to enter through the backdoor of tech design, in which the humans who create the algorithms are hidden from view."
Hidden biases in tech. The New Jim Code refers to how new technologies, often marketed as objective or progressive, can actually reinforce existing racial inequalities. This happens through the incorporation of biased data, flawed design processes, and unchallenged assumptions about race. Examples include:
- Facial recognition systems that struggle to accurately identify people of color
- Predictive policing algorithms that disproportionately target minority neighborhoods
- AI-powered hiring tools that favor candidates with "white-sounding" names
Illusion of neutrality. By presenting these technologies as neutral or scientific, the New Jim Code makes it harder to challenge racial bias. It creates a veneer of objectivity that obscures the human decisions and societal contexts shaping these tools.
2. Engineered inequity: Amplifying social hierarchies through tech design
"Robots exemplify how race is a form of technology itself, as the algorithmic judgments of Beauty AI extend well beyond adjudicating attractiveness and into questions of health, intelligence, criminality, employment, and many other fields, in which innovative techniques give rise to newfangled forms of racial discrimination."
Bias by design. Some technologies are explicitly designed in ways that amplify existing social hierarchies based on race, class, and gender. This engineered inequity can manifest in various forms:
- Beauty AI contests that overwhelmingly select white contestants as winners
- Social credit systems that codify and reinforce societal prejudices
- Ethnicity recognition software used for targeted marketing or surveillance
Power dynamics. These technologies reflect and reproduce the values and biases of their creators, often reinforcing the power of dominant groups. By encoding race into technical systems, they can make racial categorizations seem more "scientific" or inevitable.
3. Default discrimination: When "neutral" algorithms perpetuate bias
"Anti-Blackness is no glitch. The system is accurately rigged, we might say, because, unlike in natural weather forecasts, the weathermen are also the ones who make it rain."
Biased data, biased outcomes. Even when not explicitly designed to discriminate, many algorithms perpetuate bias through their reliance on historically biased data. This leads to a cycle of discrimination:
- Predictive policing algorithms direct more officers to minority neighborhoods, leading to more arrests, which then "justifies" increased policing
- Risk assessment tools in the criminal justice system often rate Black defendants as higher risk due to factors shaped by systemic racism
- Healthcare algorithms can underestimate the needs of Black patients due to historical disparities in access to care
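The feedback loop in the first bullet can be made concrete with a minimal sketch (hypothetical, not from the book): two neighborhoods have identical underlying incident rates, but one starts with more recorded arrests. A "hot spot" rule sends most patrols to the area with the larger record, and new arrests track patrol presence rather than actual crime, so the initial disparity compounds.

```python
def simulate(initial_arrests, true_rate=0.5, patrols_total=100, rounds=10):
    """Toy model: patrols follow past arrest records, arrests follow patrols."""
    arrests = list(initial_arrests)
    n = len(arrests)
    for _ in range(rounds):
        hot = arrests.index(max(arrests))  # "hot spot" = most recorded arrests
        for i in range(n):
            # The hot spot gets 80% of patrols; the rest split the remainder.
            share = 0.8 if i == hot else 0.2 / (n - 1)
            # New arrests scale with patrol presence, not with any
            # difference in the true incident rate (equal for all areas).
            arrests[i] += patrols_total * share * true_rate
    return arrests

final = simulate([60, 40])  # same true_rate, unequal starting records
print(final, final[0] / sum(final))
```

Neighborhood A starts with 60% of recorded arrests and, despite identical true rates, ends with a still larger share: the record the system generates becomes the evidence that "justifies" its own allocation.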
Myth of neutrality. The belief that algorithms are inherently objective obscures how they can reinforce and amplify existing societal biases. Challenging this myth is crucial for addressing algorithmic discrimination.
4. Coded exposure: The paradox of hypervisibility and invisibility
"Exposure, in this sense, takes on multiple meanings. Exposing film is a delicate process – artful, scientific, and entangled in forms of social and political vulnerability and risk."
Dual nature of visibility. Technology can simultaneously render marginalized groups hypervisible to systems of surveillance and control, while making them invisible in other contexts:
- Facial recognition struggles to identify dark-skinned individuals, yet is disproportionately used to monitor communities of color
- Medical imaging technologies historically calibrated for light skin, leading to missed diagnoses for patients of color
- Digital platforms that claim to be "colorblind" while ignoring how race shapes user experiences
Power of representation. Who controls the means of representation through technology has significant implications for how different groups are seen and treated in society. This extends from photography to modern AI systems.
5. Technological benevolence: When "fixes" reinforce discrimination
"Magical for employers, perhaps, looking to streamline the grueling work of recruitment, but a curse for many job seekers."
Trojan horse of progress. Many technologies are marketed as solutions to social problems, including racial bias. However, these "fixes" often end up reinforcing the very systems they claim to address:
- AI-powered hiring tools that claim to reduce bias but encode existing prejudices
- Predictive healthcare algorithms that reinforce racial disparities under the guise of efficiency
- Electronic monitoring as an "alternative" to incarceration that extends carceral control
Critique of techno-solutionism. The belief that complex social issues can be solved through technological innovation alone often leads to superficial fixes that fail to address root causes of inequality.
6. Beyond code-switching: Rewriting dominant cultural codes
"Whereas code-switching is about fitting in and 'leaning in' to play a game created by others, perhaps what we need more of is to stretch out the arenas in which we live and work to become more inclusive and just."
Limitations of adaptation. Code-switching, or changing one's behavior to conform to dominant norms, places the burden of change on marginalized individuals. Instead, we need to challenge and rewrite the dominant cultural codes embedded in our technologies and institutions.
Systemic change. This involves:
- Diversifying tech workforces and leadership
- Implementing rigorous bias testing in AI development
- Creating new design paradigms centered on equity and justice
- Empowering communities to shape the technologies that affect them
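As one concrete (and deliberately minimal) example of the "rigorous bias testing" bullet above, auditors often start by comparing selection rates across groups, sometimes called the demographic parity gap. The data and group names below are hypothetical, and real audits go well beyond this single metric:

```python
def selection_rate(decisions):
    """Fraction of positive decisions (1 = selected, 0 = rejected)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Largest difference in selection rates between any two groups."""
    rates = {g: selection_rate(d) for g, d in decisions_by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical audit data for a hiring model's decisions.
audit = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 6 of 8 advanced
    "group_b": [1, 0, 0, 1, 0, 0, 0, 0],  # 2 of 8 advanced
}
gap, rates = demographic_parity_gap(audit)
print(rates, gap)
```

A large gap does not by itself prove discrimination, and a small one does not disprove it; it is one signal among many that should trigger scrutiny of the training data and design choices behind the model.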
7. Abolitionist tools: Reimagining technology for liberation
"An abolitionist toolkit, in this way, is concerned not only with emerging technologies but also with the everyday production, deployment, and interpretation of data."
Technology for justice. Drawing inspiration from abolitionists of the past, we can develop technologies and practices that actively work to dismantle systems of oppression:
- Appolition: An app that converts spare change into bail money for incarcerated individuals
- Algorithmic Justice League: An organization that audits AI systems for bias
- Community-based data projects that challenge official narratives and empower marginalized groups
Reimagining the future. This approach involves:
- Centering the experiences and knowledge of marginalized communities in tech development
- Creating accountability mechanisms for tech companies and governments
- Fostering digital literacy and critical engagement with technology
- Developing alternative visions of how technology can serve social justice
Review Summary
Race After Technology explores how technology perpetuates racial biases and inequalities, challenging the notion of technological neutrality. Benjamin introduces the concept of the "New Jim Code" to describe how algorithms and AI systems can reinforce discrimination while appearing objective. Reviewers praise the book's eye-opening insights and diverse examples, though some found it lacking in structure or solutions. Many readers consider it essential for those working in or studying technology, as it raises critical questions about the intersection of race, justice, and innovation in our increasingly digital world.