Ideologies of Boring Things: The Internet and Infrastructures of Race

FEBRUARY 13, 2018

INFRASTRUCTURE IS CREATED by people and therefore embeds and reflects the values of the people who create it. Infrastructure does not work equally well for each of us. Think of Robert Moses and his transformation of New York City through his network of expressways. The roadways work quite well for elites moving in, through, and past the city in their cars. But the poor and working-class people of color cut off and isolated from the rest of the city by a swoop of the Brooklyn–Queens Expressway can tell you that Moses’s roads are broken. It’s hard to get from Red Hook, Brooklyn, to Manhattan. That’s just how it is, how it has always been, and there’s nothing political about it. It’s just a matter of the road.

Infrastructure is a human thing and thus a political thing. This is a fundamental insight in the study of what information studies scholar Susan Leigh Star has called “boring things”: phone books, medical coding manuals, the Dewey Decimal System. Such systems are insidious because they are substrate, by definition sitting underneath the world as we experience it. Such systems are also usually invisible as long as they work. We rarely think about sewer pipes unless they’re backing up into our houses, and the patterns of air traffic don’t matter unless they’re disrupted by weather. Of course, what is felt as a disruption changes depending on the social and political position occupied by a given person. More often, such infrastructures are left unexamined. They facilitate normal life, and the inequities that are sustained by them are not seen at all. Read critically, such systems reveal the ways that power and privilege are normalized such that they extend and consolidate patriarchy, white supremacy, and wealth inequality.

Safiya Umoja Noble addresses internet search as one such critical infrastructure in her book, Algorithms of Oppression. Her target is the internet, that structuring machine of everyday life. From its early days connecting Department of Defense computers to each other, the internet has morphed for many of us into an extension of our minds and selves. We carry smartphones with the computing power of a desktop in our hands, along with the promise of a flattening of the social order. Gone are the gatekeepers that allow only sanctioned voices into the public dialogue. In their place, we’re promised, untold Arab Springs will bloom. Noble contests this fantasy of the internet as equalizing device. Rather than focus on what it facilitates, she explores the internet as infrastructure, investigating what is hidden from view by mathematical algorithm. The internet is not a magic box spitting out facts about Donald Trump, ex-girlfriends, and the history of Algonquian fishing weirs. Noble argues instead that the web is a machine of oppression, a set of “digital decisions” that “reinforce oppressive social relationships and enact new modes of racial profiling.”

Noble’s project has its genesis in a very ordinary moment online. In the fall of 2010, Noble sat at her computer, looking for “things on the Internet that might be interesting to my stepdaughter and nieces.” When she Googled “black girls,” she found instead HotBlackPussy.com. Google’s retrieval mechanism is not interested in what might be good or true or necessary for an audience of actual black girls, curious about themselves and their world. Black girls matter only for the role they play in the racist and misogynist fantasies of Google’s majority client: the white American man. Not so coincidentally, suggests Noble, this is the demographic most likely to be hired by Google to build its algorithm in the first place.

Noble’s central insight — that nothing about internet search and retrieval is politically neutral — is made again and again through the accumulation of alarming and disturbing examples. Image searches for “gorilla” turn up photos of African-American people. Looking for “black teenagers” returns police mug shots. Searching “professional hairstyles” returns images of white women wearing ponytails and French braids while “unprofessional hairstyles” features black women. What she surfaces online parallels extended histories of racist white representations of blackness, and black femininity in particular. The pornification of black women on the web bears echoes of Sarah Baartman’s exploitation in the 19th century, argues Noble. The way that power is wielded online is acutely familiar, even as digital tools hold the promise of the new. For all its innovation and disruption, Silicon Valley simply repeats very old racist stories. The story told about black people online is almost entirely refracted through a white racist lens.

It would be bad enough if search simply made the internet inhospitable to African-American women, but Noble makes a compelling case that pervasive racism online inflames racist violence IRL. In a chapter on the Dylann Roof shootings in Charleston, South Carolina, Noble describes the beginning of Roof’s radicalization: he “allegedly typed ‘black on White crime’ in a Google search to make sense of the news reporting on Trayvon Martin, a young African American teenager who was killed and whose killer, George Zimmerman, was acquitted of murder.” What Google retrieved for Roof — a vast trove of white supremacist fantasy about black-on-white crime — was instrumental in Roof’s decision to enter a church and murder nine people. There is nothing benign about encoding white supremacy in Google’s search algorithm.

The problem is not just that racist search results are retrieved by Google, but that the people who make Google don’t anticipate that such results will appear at all, and therefore don’t account for them in advance. If code is invariably created by only one kind of person — usually white, usually male, with a worldview so thoroughly aligned with the forces of dominant power that he can’t see that he has any power at all — the code will always fail to account for minority and minoritizing perspectives. For Noble, the dominant whiteness at the heart of technology companies leads to a host of other algorithmically driven problems, from racist Twitter trolls and SnapChat filters to sharing platforms like Uber and Airbnb that facilitate discrimination based on race. Code is power, and it is white and male. Indeed, the numbers at Google are stark: in 2016, only two percent of its workforce was African-American, and only three percent was Latino.

Having turned the “boring thing” of algorithmic code into a site of political and cultural analysis, Noble turns to potential prescriptions. She offers two solutions. The first is a call for Google and other Silicon Valley companies whose code invisibly structures so much of contemporary life to hire people who understand how race and gender and other categories of social difference function in the world to produce different life experiences for different people. Coders need critical race theorists, suggests Noble, or at least workers who understand that frictionless digital infrastructures aren’t frictionless for everyone. Diversifying the technological workforce is an important first step toward building an internet that accounts for power and privilege at the level of code itself.

How might Google respond to the concerns raised by Noble and others about the racism and misogyny embedded in its networks? As a former urban marketing executive whose job was in part to insulate companies against potential racist missteps, Noble is attuned to the ways that Google responds to accusations of racism in its algorithm. Google has acted in several cases. As she notes, the algorithm has been changed to remove pornography from the first set of results when users Google “black girls.” Google also removes anti-Semitic content from the web in response to hate speech laws in Germany, and complies with Right to Be Forgotten laws in Europe more broadly. The internet as it is retrieved by Google could be different. And yet, racist content persists.

Her second solution is an appeal to the state in service to the public good. Noble makes a convincing case that despite its status as a publicly held corporation, Google functions much like any other public utility. Its use is ubiquitous. For many of us, Google is a wraparound company. Google has become synonymous with “looking things up on the internet,” and its suite of services has locked in many of us willing to trade a little personal privacy and marketable data in exchange for free email and a standard battery of web-based software essential to contemporary professional life. In the course of writing this short review I have checked my Gmail dozens of times, added appointments to my Google Calendar, shared drafts with colleagues via Google Docs, and Googled myriad things. The internet is a public utility like the telegraph and telephone that came before it, and should be regulated the same way. Noble argues that a company that plays this kind of role in our lives must be subject to the checks of government regulation. Without that, we have no recourse except to hope that Google abides by its founding motto: Don’t be evil. Given Noble’s research, we have plenty of evidence that the company cannot.

It is refreshing to see a call to state power amid the frenzy of deregulation that has accompanied the Trump administration. Rollbacks of administrative rules have proceeded with speed as Trump largely makes good on his campaign promise to be the most deregulating president in history. By December 2017, his administration had withdrawn or delayed nearly 1,600 individual regulations. Many of these represent attacks on clear public goods like clean air and clean water, and, some argue, attacks on the internet. Among the most public of these fights has been the one around net neutrality. Advocates of net neutrality argue that government regulation of commercial internet companies is necessary in order to ensure that these entities don’t prioritize access to some content over others. Government regulation evens playing fields and makes things fair. The internet is a tool of democracy, after all. Under Trump, FCC chairman Ajit Pai has disagreed, advocating for an end to regulation in order to “restore Internet Freedom.” The question, of course, is: Freedom for whom? As Noble makes clear, the internet is only “free and unfettered” if racism or sexism are not problems for you.

If, as Noble suggests, regulation by an agent of the public good is a necessary counterweight to corporate exploitation of legacies of racism, the question becomes how we can make the state that agent. White supremacy and patriarchy are foundational to the government too, after all. What emerges from these pages is the sense that Google’s algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores. How might we understand other boring things — our subway systems, tax codes, mortgage rules — as sedimentations of power and privilege, and what must we do to change them? Organizing toward that end will surely involve building our own infrastructures, and that is the code that must be written next.

¤

Emily Drabinski is associate professor and coordinator of library instruction at Long Island University, Brooklyn. She is editor of Gender and Sexuality in Information Studies, a book series from Library Juice Press/Litwin Books.