You may have seen a YouTube clip of the 'racist soap dispenser', which happily plops a globule of soap onto a white hand but stays schtum when a black hand is passed below. A fairly low-stakes incident of a machine exhibiting racial discrimination, some might say - in this case because the infrared light doesn't reflect sufficiently off darker skin.
This is just one example of discriminatory design examined by Ruha Benjamin in her new book Race After Technology: Abolitionist Tools for the New Jim Code. An associate professor in the Department of African American Studies at Princeton University, she draws on decades of African-American studies scholarship to interrogate how racial discrimination is being perpetuated by computers in the age of algorithms, before suggesting new ways of developing socially conscious tech.
"What I'm interested in in the book is to bring together seemingly mundane or trivial technologies where this is happening, and linking back to the places where the stakes are much higher - where we realise that if there's bias here, there's life and death stakes with respect to that," says Benjamin.
Benjamin has long studied discrimination through the dimension of science and technology. She says one thing that prompted her to write the book was the lack of engagement she saw between scholars studying discrimination within a sociological context, and those studying bias in computers. "On one hand, intellectually, I wanted to bridge that gap between the field that's been thinking hard for over 100 years about questions of discrimination and equity, and this new field of discrimination," she says.
But this wasn't her only impetus to start writing the book.
"On a more personal note, part of what sparked it was around the time of the uprising in Ferguson, Missouri after Michael Brown, the 18-year-old teenager, was gunned down by police," she says. This led to several weeks in which, for her, the contrast between the capacity of militarised technologies to inflict harm on communities, and of communication technologies to spread activism in the wake of the tragedy, rose to the fore. "There was a vivid illustration of how technology can be used for justice, equity and democracy, and at the same time, a really striking example of it being used for oppressive practices," she says.
In the book, Benjamin confronts a virulent strain of techno-determinism that posits technology as a kind of unstoppable and self-perpetuating biological entity that evolves and mutates at will, and to which humanity is held in abject thrall.
To escape this view, she argues we have to step back and "de-naturalise" our view of technology. "We don't have software growing on trees," she points out. In rebuking the perceived inevitability of technology, we have to look at the social inputs of particular products instead.
"I wanted to think about the human decisions, priorities, values, vices and desires that we embed into the technologies we produce, and how if we don't really question that part of the story, then we're likely to reinforce existing forms of inequity," says Benjamin.
Returning to the racist soap dispenser, she says: "Here's an example where you question 'how did this get through the development stage?' Why was it never tested on someone with darker skin - perhaps someone who worked there?" The same broad design principles, which so casually overlook the existence of different skin tones, are shaping the design process of much more consequential technologies.
One of the most blatant recent examples of racial bias in tech has been in the development of facial recognition technologies - which have been demonstrated time and again to be drastically ill-equipped to discern the identity of black people, often misclassifying their gender and even their humanity.
This is a topic popularised by Joy Adowaa Buolamwini, a Ghanaian-American computer scientist and digital activist based at the MIT Media Lab. A now-famous video shows her attempting to be recognised by facial recognition algorithms before donning a white mask, whereupon she's immediately picked up. Perhaps the most dangerous real-world manifestation of this technological fault is in autonomous vehicles. If a black person is invisible to a self-driving car, it will be more likely to crash into them. In fact, an early 2019 study from the Georgia Institute of Technology found that state-of-the-art object detection systems of the kind used in self-driving vehicles were less accurate at detecting pedestrians with darker skin.
But, as Benjamin points out, there is a great irony in facial recognition systems that can't accurately profile black people: "These are the very same groups that are often over policed with old-school policing - communities that are already targeted. Enter in a technology that everyone seems to believe is going to be more objective and more neutral than the humans doing the policing and yet, biases are built in where it can't accurately distinguish between two individuals.
"On the one hand, you're invisible because you're a racialized person, a darker-skinned person: invisible in one case, and hyper visible in another. So it's this kind of combination of being seen too much and not being seen in various contexts."
Other arenas where racial discrimination has crept into algorithms include healthcare, where algorithms for identifying melanoma perform worse on black skin; predictive policing, where black neighbourhoods have been disproportionately suggested for targeting; and the criminal justice system, where algorithms used in sentencing have been disproportionately harsh towards black defendants.
To describe these sprawling cases of tech-perpetuated injustice, Benjamin has coined the concept of 'The New Jim Code'. Jim Crow laws enforced racial segregation in the southern United States in the late 19th and early 20th centuries up until 1965. "Especially in public places, whether it was separate water fountains, separate schools or separate medical facilities," says Benjamin. "It grew and grew to describe a whole set of systems and practices which reinforce white supremacy through the law."
In recent years, however, civil rights advocate and writer Michelle Alexander's book, The New Jim Crow, updated the term to describe the mass incarceration of black men in the US, "referring to how mass incarceration and policing today extends that process of enforced second class citizenship and legalised discrimination, where if someone is convicted of a felony, the law allows businesses and other public institutions to legally discriminate against them," says Benjamin.
"I wanted to build on both of those concepts - Jim Crow, The New Jim Crow - and think about how inequality and racial oppression today are being codified through automated systems," she adds. This latest iteration might just be especially insidious, says Benjamin, in how it operates under assumptions of neutrality and objectivity. "That's what makes it even more important for us to wrestle with, because the bias is not as obvious."
It's been pointed out before that the lack of diversity in the tech workforce shapes the development of products that go on to affect the lives of a whole pluralistic population, but Benjamin believes the solution must go beyond diversifying the workforce. "We've also got to think about the larger milieu in which people are sending out their little algorithms and robots into the world - the economic structure, the political priorities - and also tackle it from that end," she says.
There are signs that some backlash is underway, with a growing public appetite for increased regulation of the tech industry. Benjamin says she has been heartened, too, by her dialogue with tech industry insiders themselves.
"There's a growing movement of tech workers in some of the major companies - but all throughout the tech workforce - that's mobilizing around these same issues, that understand they're more than a cog in a wheel in a system - that they have a responsibility for the public good," Benjamin says. She gives the example of tech workers who have demonstrated against companies taking on contracts with ICE, the US immigration agency known for incarcerating immigrants in inhumane conditions and separating minors from their parents.
However, there are still endemic factors related to capitalism that make it difficult to truly shake up how the industry operates.
"It's about the speed at which technology is developed, and the market incentive to go faster, to get your product to market faster," explains Benjamin, "which means that having a more thoughtful approach that is more conscientious and aware of the social dimensions is not incentivised. We have to question the economic values that drive us to skip steps in the process, and skip discussions of equity and justice."
At the beginning of the book, Benjamin responds to Facebook's notorious and now ostensibly dropped exhortation to 'move fast and break things' with, 'what about the people and the places broken in the process?' The people on zero-hours contracts in Amazon warehouses, or in tech hub cities with soaring homelessness, which just so happens to disproportionately affect black and disadvantaged people?
The tech industry has not yet produced a convincing response to this question, but Benjamin says answering it involves reconceptualising society's relationship to tech as a whole. "We talk about ourselves as users, and one of the things I realised over the course of this project is that users get used," she says. "We have to become empowered and exercise agency in terms of becoming stewards of technology, rather than being pawns of it."