“What can those of us who work in, and maybe even love, computing cultures do about computing’s colonial expansions?”
Sareeta Amrute’s keynote address “Tech Colonialism Today” opened EPIC2019 in a provocative, mobilizing spirit that inspired discussions on stage, in breakout sessions, and around breakfast tables. Sareeta journeyed across time and territory to explore what characteristics make something colonial to begin with, such as extractive and hierarchical systems. As you might guess, she argued that yes, the tech industry today has core colonial attributes. But her goal wasn’t just critique; Sareeta showcased counterconduct—the agency that people, communities, and companies have to build alternatives.
If colonial legacies and socioeconomic systems seem a bit “out of scope” as context for standard product or user research projects, check out Sareeta’s award-winning book Encoding Race, Encoding Class. You’ll learn about Meena’s daily tea ritual, hear Bipin describe why he sometimes chooses to write bad code, and follow the transatlantic journeys of a Sesame Street diaper bag (the brand matters). Connecting large-scale social systems to the intimate details of how people make meaning and navigate their lives is the bread and butter of ethnography. So I asked Sareeta some follow-up questions about how she made these connections in her book.
Sareeta Amrute is Director of Research at Data & Society and Associate Professor of Anthropology at the University of Washington.
—Jennifer Collier Jennings, epicpeople.org
Sareeta, your keynote at EPIC2019 addressed technological and social systems on a transnational and historic scale. Your book Encoding Race, Encoding Class connects this kind of systemic, epochal analysis to everyday lives of your research participants. I hope you can reflect on two concepts you use to make these connections: embodiment and freedom.
First, embodiment. Because of the hard work of activists, journalists, researchers, and others, tech utopianism is being tempered a bit by greater awareness of the social “effects” of technology. For example, it’s become more mainstream to talk about racial and gender bias, and people are beginning to understand the way training data can create AI products with “unintended” social outcomes. But your research makes a deeper argument that code itself is material, embodied, and social—not just something abstract and cognitive that might have social outcomes. In your book, the code and lives unfold together. Can you describe what code is in your work?
SAREETA: You’re absolutely right. In the span of time since my book came out, which is about three years, the needle on the tech industry has certainly moved. When I was writing Encoding Race, Encoding Class, the prevailing attitude on coding and related services, in fact, on the internet at large, was that it was a space free of bodies and the attendant problems of discrimination that face-to-face interaction entailed. We now collectively inhabit a vastly changed landscape where the issues of gender, race, age, and ability that we supposedly left behind make their presence known in predictive policing, welfare allocations, social media—the list goes on. Much of this changed perspective is due to the pathbreaking work of scholars such as Ruha Benjamin, Safiya Noble, Sarah Roberts, Marie Hicks, Charlton McIlwain, Anita Say Chan, Wendy Chun, Lisa Nakamura, Virginia Eubanks, Beth Coleman, and others.
Where we need to go next, and what I tried to address in my book, is understanding the limited uses of the term bias itself. In some ways, bias feeds tech solutionism, because it can suggest—depending on how it is deployed—that the problem of discrimination can be solved by building systems that are gender or race blind. Even when more sophisticated approaches are deployed that consider factors that can be used as proxies for protected categories, the focus on bias can bracket out asking about what questions are being answered through machine learning and other interventions, what the limits of those processes should be, and who has determined what the important questions worth asking and answering are.
Now, to get to your question about what code is in my book. This is a really interesting way to pose the question of embodiment, and thank you for asking it in this way. One thing that emerged during fieldwork with short-term, visa-backed coders from India was how greatly their experience differed from how they were talked about in tech offices and immigration discourse, and from what I had learned about coding by reading books about free software, hackers and the like. Place matters to embodiment—both the place people come from and how they are emplaced in organizational life. So how they produced code and how code produced them was specific. Code isn’t universal, it’s embodied. Starting from the way my interlocutors thought of code, I think of code in the first instance as a means of making things—including livelihoods—and a means of reflecting on things—code is good to think with because of the way it moves across categories, from material and immaterial to poetry and math.
“Coding can be a tool to extend and think through human possibilities… [T]he critique of working life that programmers elaborate through their understanding of code yields strategies of bending the code, their time in the office, and their free time toward the full development of themselves” (Amrute 2016:21)
The EPIC2019 theme was Agency. Agency is commonly framed in terms of degrees of freedom that can be expanded or reduced if one has more or less control. In your book you argue that this distinction between freedom and control is too simplistic. Can you explain the idea that freedom “emerges in a historically mediated context where it can become the focus of social practice precisely because of the way it is defined”?
SAREETA: Something that I addressed in my talk was the idea of counterconduct, a term I borrow from Michel Foucault, to describe agency in a new way. Rather than searching for agency outside of structure or power, agency emerges from the very way that social and individual life is conducted in a double sense, both directed from without and the everyday ways through which life takes place. Counterconducts are those practices that emerge from these same structures but move in a different direction. To take one example from the book, in coding worlds, freedom has most often been associated with movements for free software and open source coding, which argue that code should not be owned and that projects should be designed to be shared and modified by anyone. Yet this notion of freedom does not work for immigrant coders, who instead see in it a contradiction between the ideology of free code and the reality of immigrants restricted by visa regimes and office divisions of labor. So, I argue, we need to modify our notion of freedom in coding worlds to align not only with the idea of freedom from restriction, but also with the positive freedom to pursue worlds other than the ones we currently inhabit. A certain kind of ownership might be part of this different definition of freedom, which I call in the book proprietary freedom.
Among ethnographers there is some anxiety about engaging with code—about the range of knowledge and skills that are expected for work in business and tech, particularly quantitative and programming skills. You have an interesting note about this in your book. You advise that, although ethnographers may need to learn some programming, you learned as much about coders by studying them outside of the office as you did by studying coding practices themselves. To understand ‘coding’ and ‘work’, you wanted to figure out what is ‘not coding’ and ‘not work’. So how should we think about the range of knowledge and skills required for ethnographers and other researchers working in tech and data-driven environments?
SAREETA: I have both an ethnographic and a positional answer to this question. As ethnographers, we are forever updating our skills as our interests range across domains and deepen within our projects. I have had to read widely across postcolonial studies, postwar German migration history, Indian labor migration going all the way back to the period of indenture, software studies and algorithmic imaginaries, critical race studies, even some economics and legal theory, not to mention developing some fluency in German and at least two South Asian languages. All of these are still educational goals in process. So, my ethnographic answer to this question is that we learn what our projects demand as well as we can, with full awareness that this knowledge is incomplete—this opening is in fact part of what allows us to be open to our ethnographic co-constitutors of worlds.
My second answer is that we often fall prey to the same hierarchy of knowledge regnant in the worlds we study, which puts computer science at the top of the hierarchy and social science somewhere near the bottom-middle. Instead, we should consider technical skills as important but not all-consuming. In my experience, other skills, such as qualitative research methods, and especially language skills in Spanish, the languages of South and Southeast Asia, and mainland China, to name a few important language groups, are just as important. These other skills will help us both to find the agency that often exists at the peripheries of existing systems and to surface solutions that will certainly come the more we are able to look beyond the confines of the organizations that employ us.
Your organization Data & Society is a phenomenal source of research that helps frame our work in tech and data-driven industries. It’s exciting that an ethnographer is now Director of Research—congratulations on assuming that role recently! How have you framed your research program and research priorities for Data & Society? And how has your perspective as an anthropologist and ethnographer informed your approach?
SAREETA: Anthropology at its best dialogs in two directions, first, with ethnographic companions, interlocutors, and collaborators; and second, with taken-for-granted understandings of the world, whether these opinions crop up in op-ed columns, executive boardrooms, or the scholarly literature. Data & Society’s strength resides in using mixed methods, including ethnography, to move the debate on the social effects of data-centric technologies beyond received opinions. To do this, we need to lead with empirical findings and build our interventions around them. Most often, our research findings lead to a radical reframing of how a particular issue is understood and help identify a particular community who might benefit in their praxis from understanding that new frame. What I bring to that mix as an anthropologist is an understanding of the research process, that researchers need to follow their findings rather than fitting data into existing frames, and that the most robust findings come from engaging with technological systems in practice, as we say in anthropology, ‘on the ground’, where they are used, developed, and reworked.
Data & Society sits at the intersection of corporate, public, and academic debates on technology. From this perch, we can see the great advantages that come from translating between domains like anthropology and computer science, history and media studies, the study of law and policy and the study of organizational structure. At the same time, we can sometimes glimpse lacunae that exist across all these domains. Our analyses suggest that surveillance as it relates to the well-being of patients, workers, immigrants, and those intersecting with the justice system will be an important topic over the next year, at least. Our understanding of surveillance differs sharply from some prevailing accounts. Research demonstrates that what is often imagined as an all-controlling view from the top managed by algorithmic intelligences is actually a series of unique systems that include humans and machines and are calibrated to achieve different ends. We will build on this insight to understand worker surveillance, the design of hospital detection systems, and how ethics is done in tech companies, to name just a few of our ongoing projects. A further theme in Data & Society’s output will be in following how actors take advantage of the way technological systems are built to skew results, and what companies can do to prevent this from happening. We call this focus on identifying vulnerabilities that come from features rather than flaws “sociotechnical security.” This new work extends Data & Society’s existing focus on misinformation and disinformation to new realms, like climate change and the 2020 census.
Finally, not all anthropologists are the same. What I bring as part of my background, in addition to a sustained focus on research methods and relating findings to the big questions of our day, is the recognition that we need to think transnationally and from below to answer these questions. During my time at Data & Society, I have begun a project on thinking from below called #unsettle: decolonizing tech research with my colleague Rigoberto Lara Guzman. Even while acknowledging the dangers of using ‘decolonize’ as a term, since it has already been coopted by institutional structures, we take the opportunity this term provides to bring new voices and fresh perspectives to the understanding of data-centric technologies that start from a holistic perspective integrating concerns for environment, person, and community.
Works Cited in "Tech Colonialism Today"
AI Now Institute. “How to Interview a Tech Company: A Guide for Students.” Medium, 17 Sept. 2019, medium.com/@AINowInstitute/how-to-interview-a-tech-company-d4cc74b436e9.
Amrute, Sareeta. Encoding Race, Encoding Class: Indian IT Workers in Berlin. Duke University Press, 2016.
Arnold, David. Science, Technology, and Medicine in Colonial India. Cambridge University Press, 2000.
Benjamin, Ruha. Race After Technology. Polity, 2019.
Breckenridge, Carol. “The Aesthetics and Politics of Colonial Collecting: India at World Fairs.” Comparative Studies in Society and History, vol. 31, no. 2, pp. 195-216.
Broussard, Meredith. Artificial Unintelligence: How Computers Misunderstand the World. MIT Press, 2018.
Buolamwini, Joy, and Timnit Gebru. “Gender Shades.” Gender Shades, MIT Media Lab, gendershades.org/.
Byler, Darren. Art of Life in Chinese Central Asia, livingotherwise.com/author/lutbulla/.
Cohn, Bernard S. “Notes on the History of the Study of Indian Society and Culture.” Structure and Change in Indian Society, by Bernard S. Cohn and Milton B. Singer, Aldine, 1970.
Comaroff, Jean, and John L. Comaroff. Theory from the South: Or, How Euro-America Is Evolving Toward Africa. Routledge, 2016.
Cooperativismo. Coopersystem, coopersystem.com.br/en/.
Couldry, Nick and Mejias, Ulises A. The Costs of Connection. Stanford, 2019.
“Crafting Innovative, Usable Technology for Social Change.” Sassafras Tech Collective, sassafras.coop/.
Daniels, Jessie, et al. Advancing Racial Literacy in Tech: Why Ethics, Diversity in Hiring, & Implicit Bias Training Aren't Enough. Data & Society Research Institute, 2019. datasociety.net/wp-content/uploads/2019/05/Racial_Literacy_Tech_Final_0522.pdf.
“Data for Black Lives.” Data 4 Black Lives, d4bl.org/.
“Digital Justice Lab.” Digital Justice Lab, digitaljusticelab.ca/.
Ensmenger, Nathan. “The Environmental History of Computing.” Technology and Culture, vol. 59, no. 4S, 2018, pp. S7-S33.
Fang, Lee. “Google Hired Gig Economy Workers to Improve Artificial Intelligence in Controversial Drone-Targeting Practice.” The Intercept, 4 Feb. 2019, theintercept.com/2019/02/04/google-ai-project-maven-figure-eight/.
Fanon, Frantz. The Wretched of the Earth. Grove Press, 2004.
Gandy, Oscar. “Journalists and Academics and the Delivery of Race Statistics: Being a Statistician Means Never Having to Say You’re Certain.” Race and Society, vol. 4, 2003, pp. 149-160.
Gray, Mary L., and Siddharth Suri. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Houghton Mifflin Harcourt, 2019.
Hayden, Cori. When Nature Goes Public: The Making and Unmaking of Bioprospecting in Mexico. Princeton University Press, 2003.
Hicks, Marie. Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing. MIT Press, 2017.
Irani, Lilly. Chasing Innovation: Making Entrepreneurial Citizens in Modern India. Princeton University Press, 2019.
Mavhunga, Clapperton Chakanetsa. What Do Science, Technology, and Innovation Mean from Africa? MIT Press, 2017.
“Panels • EPIC2019.” EPIC2019, 2019.epicpeople.org/panels/.
Rhode Island Department of State. Record of the Deed for Aquidneck, 1637. sos.ri.gov/divisions/Civics-And-Education/teacher-resources/native-americans.
Rosenblat, Alex. Uberland: How Algorithms Are Rewriting the Rules of Work. University of California Press, 2018.
“Salons • EPIC2019.” EPIC2019, 2019.epicpeople.org/salons/.
Subramanian, Ajantha. The Caste of Merit: Engineering Education in India. Harvard University Press, 2019.
“Tech Workers Coalition.” Tech Workers Coalition, techworkerscoalition.org/.
“Te Mana Raraunga.” Te Mana Raraunga, www.temanararaunga.maori.nz/.
“The Tricontinental.” Tricontinental Institute for Social Research, thetricontinental.org/.
“Warehouses Are Tracking Workers' Every Muscle Movement.” IndustryWeek, 11 Nov. 2019, www.industryweek.com/technology-and-iiot/warehouses-are-tracking-workers-every-muscle-movement.