How does technology continue to produce racism, bias, and discrimination, while obscuring the racist conditions that precede it? Speaking today at the Berkman Klein Center is Ruha Benjamin, Associate Professor in the African American Studies department at Princeton. I was in the audience and took notes, which I’m sharing here with the hope that they might be useful to others as an alternative to attending in person or watching the video recording.
Prof. Benjamin starts her talk with a land acknowledgement. Harvard University, located on Massachusett land, is founded upon and continues to enact the erasure of indigenous peoples.
She presents three provocations:
- Racism is productive. It produces and constructs, creating value for some as it wreaks havoc on others. People are often conditioned to think of racism as backward (individual bad apples, the backwoods) instead of innovative (the whole orchard, the ivory tower). Race is often acknowledged to be socially constructed—it also constructs.
- Race and technology are coproduced. People are often conditioned to think about the social impacts of technology. But social norms exist prior to tech development. Technology has social inputs. As a consequence, some inventions seem inevitable when they are not.
- Imagination is a site of contestation. It is a battleground, an input and output of tech and social order. Most people are forced to live inside someone else’s imagination. These imaginations are the underside of elite fantasies about efficiency, profit, and control. For those who want to imagine another world, we can’t just critique the underside. We must wrestle with widely held deep investment in, and desire for, social domination.
For example, Prof. Benjamin examines Citizen, an app that tracks crime on a map and provides real-time 911 alerting. What could possibly go wrong, in the age of “bbq beckies” calling the police on black people minding their own business? The app “Citizen” was rebranded from its original name “Vigilante.” In its original iteration, the app had a “report incident” feature and was about users policing crime—now it’s about avoiding it.
Social norms and values shape what tools are imagined necessary in the first place. How should we understand the duplicity of tech solutions? Prof. Benjamin was interested in headlines and hot takes about racist robots. The first wave of these headlines were about recognizing that “artifacts have politics”. The second wave: “of course, artifacts inherit the biases of their creators”. But lots of different kinds of “bias” were being conflated. To make progress, Prof. Benjamin asks: how do we meaningfully differentiate technologies that are meant to differentiate us?
Prof. Benjamin names a phenomenon called “The New Jim Code.” It consists of:
- coded bias + imagined objectivity
- innovation that enables containment
These issues, constituted by tech “solutions,” penetrate every aspect of social life under the guise of progress. She cites colleagues Brown, Broussard, Eubanks, Noble, and Daniels.
Prof. Benjamin asks us to consider “old school targeted ads,” like a newspaper ad for housing that tried to entice only white buyers by advertising “beneficial restrictions” on who the housing could be sold to. Later came the Fair Housing Act and other civil rights laws that were supposed to prevent such discrimination. They didn’t necessarily succeed: today on Facebook, housing advertisers can still create discriminatory ads that exclude people by race using a dropdown menu.
She situates her work within “race critical code studies,” a field of growing literature concerned not only with impacts of technology, but its production.
Consider Appolition, which is an app that lets people contribute their spare change to bail funds. One might wonder if this money is just contributing to an already bloated carceral state, but in this case bail fund money is returned after someone meets their court dates. This means that the money can be reused like an endowment to bail out someone else. In contrast, Jay-Z’s “decarceration startup” Promise contributes money to monitoring systems, which directly supports the continuation of carceral systems. This form of solutionism is insidious because it’s packaged as social betterment.
Recently, tech workers have been speaking out about corporate involvement in state-sanctioned abuse. Benjamin says that while this kind of informed refusal is necessary, we can’t wait for worker sympathy to sway the industry. Instead, she offers examples in Data for Black Lives and the Detroit Community Tech Project. These organizations support grassroots efforts and build tech that addresses community needs. For instance, in Saint Paul, a data-sharing agreement was set up to build predictive tools for identifying “at-risk” youth. The Stop the Cradle to Prison Algorithm Coalition pushed Saint Paul to adopt an alternative community-based approach instead.
The Our Data Bodies Digital Defense Playbook, downloadable online, contains tools, interviews, etc from places like Charlotte, Detroit, and LA that are dealing with the integration of data-driven systems into people’s lives. Crucially, not everything the Our Data Bodies team knows is exposed in this book. Prof. Benjamin recalls how Frederick Douglass reprimanded those who revealed the routes that enslaved people used to escape, turning the Underground Railroad into an “overground railroad.” While some information can be tweeted around the world, other tactics need strategic discretion. The stuff that keeps community members alive, Our Data Bodies keeps to themselves. Benjamin also cites the Design Justice Network, which prioritizes the needs of, and impact on, the community over the goals of the designer. Other examples include W. E. B. Du Bois’s data visualizations and Ida B. Wells’s use of statistics in The Red Record.
Pointing to imaginative, speculative efforts, Benjamin brings up White Collar Crime Risk Zones. This project flips the script on predictive policing by flagging city blocks where financial crime is likely to occur, bringing the hidden crimes of capitalism into view. The audience laughed at this, but we’re also reminded that creative exercises like this are only comical when we ignore that all of these practices are drawn from the real world and used against real people. Prof. Benjamin recommends the Tracking the Trackers webinar for those who are interested in contributing to data gathering and community co-research on predictive analytics issues.
Prof. Benjamin closes with the thought that in contrast to carceral imagination, abolitionist imagination opens up new possibilities, draws from intellectual traditions that are concerned with liberation and allowing the marginalized to create their own futures.
What are some of the ideas in her new book, Race After Technology?
This book was Prof. Benjamin’s Sociology and Black Studies sides trying to tango. She set out to show the productivity of racism, and found racism everywhere. A colleague advised her that she couldn’t just stop at “everything is racist”—she also needed to create categories and frames, as sociologists do. How do we differentiate coded biases? There’s a spectrum, from engineered malevolence to techno-solutionism, from more obvious to more insidious forms of coded inequity. Technologies can reproduce inequity independently of the intentions of the designer. It’s important to disentangle intentions from outcomes in these conversations.
How does the language we use when talking about technology frame how we think about it?
There’s a growing public consciousness about technology, and a growing deep skepticism of technology. The more that the public critique grows, the more there’s a need to regroup on the tech side and be attentive to new developments. With The New Jim Crow, from which The New Jim Code draws inspiration, there’s been a sea change. Along came many efforts to reform the criminal justice system. But it’s precisely how change comes that matters. Within the process of reform, you can deepen the tentacles of the carceral state. Prof. Benjamin is interested in how our desire to do better can backfire and reinforce inequities.
What does it look like to do design that prioritizes the needs of the community?
Well before the process of designing anything, we have to think about the social infrastructure and relationships that precede the product. There needs to be reciprocity between designers and communities—the designer is not the final expert, but needs to attend to the needs and concerns of the people they are trying to serve. There is no magic bullet, no checklist for participatory tech—but this is precisely what’s often demanded! People want the easy fix—how do we immediately create sound relationships so people feel heard? This requires a reorientation of how we think about the relationship between the academy and surrounding communities, for instance. The people creating knowledge are divorced from the people they are talking about and for. The Stop LAPD Spying Coalition did interviews about what it feels like to be surveilled. People from surveilled communities were doing interviews, crafting questions, and producing outputs. Structures in the academy make this way of working less feasible. Publishing timelines, authorship incentives, etc, run against building these relationships.
How do we build up inclusion in engineering departments?
Among the various arenas of change—how do we litigate, legislate, organize workers and communities—ground zero for Benjamin is education, training, and pedagogy. This is where we seed new ways of thinking about our relationship to material and digital infrastructure. People are taking “public interest technology” seriously—but in all such examples, we must be vigilant that the ethics of technology doesn’t become a token gesture: an afterthought, something tacked on at the end of training or the end of the semester because there happens to be time. What’s the structure of inclusion? You can include something in a tokenistic way that reinforces its inferiority. Often, students don’t realize the power of their voice in academia. “We don’t feel like we’re equipped to go out in the world if we don’t have XYZ skills on race, equity, etc.” And so students are suggesting a direction for their training. Like White Coats for Black Lives, what might something in engineering look like that helps students realize these issues go beyond their universities?
What do you want this book to do in the world? Start a movement? Who is reading this book?
Benjamin sees the book and herself as part of an existing movement of people who call techno-utopianism into question. As such, the book is more of a provocation than an end statement. The readers she has in mind include her own students, who come from all over the university—sociology, Black studies, STEM, etc. There’s a notion that their interests should converge, and she hopes to bring these fields together. The questions that often come up, for example, are engineers asking how they can contribute. How do we respond to these questions in a productive way? Members of the Tech Workers Coalition, for example, might be people who took a good sociology class in college and are now making trouble at Google. How can she use her position in the academy to lend legitimacy to what others are doing?
There’s a prevalent belief in tech that ethics is pairing a philosopher with CS and we’re done. How do you think about asking people to take this and internalize it and use it, beyond the classroom?
The real work of change is taken up by people embedded in an institution trying to change things. As a provocateur, Benjamin views her job as the easiest. Those that have to grapple with the politics of a place, intransigence, giving lip-service to ideas while not implementing them, have more ahead of them. On the surface, she is trying to bring together communities and coalitions. But others are crucially needed to work through the nitty gritty.
What is the theory of change here? How do individuals change, how do institutions change?
There are different strategies for change. Some think about how to make the status quo untenable—walkouts, whistleblowers, etc. This is the “stick” approach. But Benjamin also asks: how do we make change desirable? In the same way that domination is desirable, can we make people crave change? Can we seed the longing? This is the “carrot” approach. One way to think about this is the concept of “linked fate.” This term often describes Black communal relationships. On a more universal scale, it describes how those who on one level are the “perpetrators” and supposed beneficiaries of oppressive systems are, just one level down, also harmed by them in ways they may or may not be cognizant of. In public health, for example, where there is greater inequity, the haves fare worse than in places where there is more equity. This is an empirical question, and Benjamin would love to amass more data so we can see where equity benefits all. Monopolizing resources, and being the overserved of a system, can bite you in the ass. Among white Americans, see the opioid crisis, or the reproductive health of white women. She cites (something I didn’t catch): if you take white women as a country by themselves, their reproductive health is worse than that of many other countries. We need to seed a desire for change among those who might be benefiting.
When US companies think about inclusion, equity, how do these policies play out on the global scale?
This is an opening to encourage more work. The New Jim Code invokes a US history of white supremacy. The challenge is to recognize the socially salient hierarchies in any region or country, ask similar questions, and understand how they play out there. Examples in the book: India’s national ID system creating new caste exclusions, as biometrics become the gateway to accessing public systems; the surveillance of Muslims in China. This is not just a US problem. She does not offer a grand theory—grand theories are counterproductive, because this knowledge needs to be situated. She encourages students and other researchers to think about how power and tech are coproduced, and to think about this in other places.
I had to step out at this point. Thank you Prof. Benjamin!