technology ethics

Making Tech More Accessible: An Ethnographic Lens on Ability and Disability

[Image: Mural on a street in Croydon, London, in blue, light brown, black, and cyan. A busy composition of humanoid and rectangular, robotic figures; at its center, three large eyes arranged vertically.]
"An ethnographic lens influences us to define ability and disability in a way that is maximally inclusive...many different abilities are present in our world, and each deserves to be taken as its own reality and respected as such." —RICHARD BECKWITH (Research Psychologist, Intelligent Systems Research Lab) & SUSAN FAULKNER (Research Director, Research and Experience Definition), Intel Corporation

EPIC Members Richard Beckwith and Susan Faulkner (Intel) have assembled a panel of luminaries in accessible tech research, design, and engineering for our January 26 event, Seeing Ability: Research and Development for Making Tech More Accessible. In anticipation, we asked them a few questions about their approach to accessibility and the key first steps all of us can take to do more inclusive work.

How do you define ability and accessibility? How does an ethnographic lens influence your definitions?

Ability has to do with what an individual is capable of perceiving or physically doing with their body; accessibility has to do with...

Tutorial: Data and Ethnography for Better AI Product Development

OVETTA SAMPSON, Microsoft

Overview

Ovetta Sampson covers when, how, and where to integrate ethnography and data science in the exploratory research process to achieve better and more ethical AI product development outcomes. With a combination of lecture, case study examples, and exercises, attendees gained a clear understanding of why making data a stakeholder in user research will create a more ethical and human-centered AI product. This tutorial was created especially for researchers who understand the need to mix ethnography and data science but just don't know quite how to do it. Topics include:

- Bringing data into research planning to help identify and reduce bias
- Bringing data science into synthesis to help illuminate system solutions
- Bringing data science into insight and design principle generation
- Aligning qualitative data and behavioral data

This tutorial was presented in full at EPIC2020. The video includes instructor presentations; discussions and breakout sessions are excluded for the privacy of...

There’s No Playbook for Praxis: Translating Scholarship into Action to Build a More Ethical Bank

JEFFREY GREGER, Varo

The US banking industry has a long history of excluding, exploiting, or simply ignoring low-income communities, recent immigrants, and racial minorities. In this paper, I share my experiences creating a community of practice where employees of a rapidly-growing banking startup can identify and confront the ethical challenges facing the financial technology (fintech) industry. This community is informed by insights from four years of activism and anthropological research that I conducted with small teams of service designers and ethnographers developing financial services for and with low- to moderate-income communities around the world. Through this research, I identified three institutional logics—insularity, decontextualization, and technological hubris—which limit efforts to build a more inclusive, equitable banking system. These logics hold the potential to lead well-intentioned organizations, and the practitioners they employ, to harm the marginalized communities they set out to help. This paper concludes...

Agency & Tech Colonialism: Extending the Conversation

"What can those of us who work in, and maybe even love, computing cultures do about computing's colonial expansions?" Sareeta Amrute's keynote address "Tech Colonialism Today" opened EPIC2019 in a provocative, mobilizing spirit that inspired discussions on stage, in breakout sessions, and around breakfast tables. Sareeta journeyed across time and territory to explore what characteristics make something colonial to begin with, such as extractive and hierarchical systems. As you might guess, she argued that yes, the tech industry today has core colonial attributes. But her goal wasn't just critique; Sareeta showcased counterconduct—the agency that people, communities, and companies have to build alternatives. If colonial legacies and socioeconomic systems seem a bit "out of scope" as context for standard product or user research projects, check out Sareeta's award-winning book Encoding Race, Encoding Class. You'll learn about Meena's daily tea ritual, hear Bipin describe why he sometimes chooses to write bad code,...

Tutorial: Ethics in Data-Driven Industries

EMANUEL MOSS, CUNY/Data & Society
FRIEDERIKE SCHÜÜR, Cityblock Health

Overview

Technology companies have discovered ethics in the wake of public pressure to consider the consequences of their products. This has been prompted by the finding that machine learning and artificial intelligence (ML/AI) systems, as fundamentally pattern-seeking technologies, can and do exacerbate long-term structural inequalities. Companies and employees also struggle with the challenges posed by the dual-use nature of technology. This tutorial will prepare you to understand and contribute to the more ethical development and deployment of ML/AI systems. It covers:

- An overview of ethical challenges in ML/AI today
- An introduction to the development of ML/AI systems, designed to give you insight into the reasoning processes and workflows of technical colleagues and how they generally address issues like accuracy and fairness (no quantitative background required!)
- An overview of current efforts to design more ethical ML/AI systems,...

Tech Colonialism Today

EPIC2019 Keynote Address, Providence, Rhode Island

SAREETA AMRUTE, Director of Research, Data & Society; Associate Professor of Anthropology, University of Washington

Studies on the social effects of computing have enumerated the harms done by AI, social media, and algorithmic decision-making to underrepresented communities in the United States and around the globe. I argue that the approaches of enumerating harms and arguing for inclusion have to be unsettled. These approaches, while important, frame populations as victims whose existence is dominated by and divided from centers of power. We lack a structural analysis of how these harms fit into a larger socioeconomic pattern. I ask us to consider instead whether all of these harms add up to computing technologies being one of the largest aspects of a colonial relationship today. Using historical evidence, the talk will consider what makes something 'colonial' to begin with, and then weigh corporate computing's relationship with the world to gauge whether this...

Reconceptualizing Privacy

EPIC2019 Panel, Providence, Rhode Island

Moderator: KEN ANDERSON, Principal Researcher, Intel Corporation

Panelists:
LIZ KENESKI, Head of Privacy Research, Facebook Inc.
PETER LEVIN, Principal Researcher, Autodesk
ELENA O'CURRY, Senior User Researcher, Uber
JEFF SOKOLOV, Designer & Researcher, IBM Watson Health

Algorithmic systems are increasingly integrated into the physical and digital infrastructures of our lives. The borders of privacy are being pushed and redefined, provoking new debate about what privacy is. All corporations claim privacy is important, but what does that mean? Panelists will consider what privacy might look like or mean when individuals are tied into multiple networks, both human and AI.

KEN ANDERSON, panel chair, is an anthropologist who is a Principal Engineer in Next Generation Standards at Intel Corporation. He does pathfinding at the intersection of technology, strategy, and the human experience, driving toward technology that enables us to have richer relationships,...