Keynote Speaker: PANTHEA LEE, Reboot
Panthea Lee is a strategist, organizer, designer, and facilitator, and the Executive Director of Reboot. She is passionate about building transformative coalitions between communities, activists, movements, and institutions to tackle structural inequity—and working with artists to realize courageous social change.
Panthea is a pioneer in designing and guiding multi-stakeholder processes to address complex social challenges, with experience doing so in 30+ countries with partners including UNDP, MacArthur Foundation, Luminate, CIVICUS, Wikimedia, Women’s Refugee Commission, and governments and civil society groups at the national, state, and local levels. The global co-creation efforts she’s led have launched new initiatives to protect human rights defenders, tackle public corruption, strengthen participatory democracy, advance equity in knowledge access, reform international agencies, and drive media innovation. Panthea began her career as a journalist, ethnographer, and cultural...
Moderator: JILLIAN POWERS, Cognizant
Panelists: JORDAN KRAEMER, Anti-Defamation League; ARWA MICHELLE MBOYA, Magic Leap; JESSICA OUTLAW, The Extended Mind LLC
As new technologies, from AI to immersive experiences, are developed at scale, they raise ethical concerns for research and design. Data-driven systems have repeatedly been shown to entrench social biases along lines of race, gender, and class, from racist algorithms in the criminal justice system to misgendering trans and nonbinary people. Immersive technologies, such as virtual reality (VR) and augmented reality (AR), however, raise separate and thorny questions for ethical design. Immersive technologies create novel experiences of embodiment and reality, not to mention new sources of personal data. These facets create distinctive challenges for ethics, equity, and inclusion, intensifying the potential harms of misinformation, harassment, privacy violations, surveillance, or unequal access. How can ethnographic research anticipate emergent ethical questions specific...
Keynote Speaker: JASON LEWIS, Concordia University
Jason Edward Lewis' multidisciplinary research and creative practice has been central to developing Indigenous media art in North America and worldwide, establishing a vital conversation about the interaction between Indigenous culture and computational technology. His contributions comprise scholarly writing, art making and technology research, as well as his leadership of the Initiative for Indigenous Futures and his creation of the Indigenous Futures Research Centre. A digital media theorist, poet, and software designer, Lewis is currently University Research Chair in Computational Media and the Indigenous Future Imaginary and Professor of Computation Arts at Concordia University. At Concordia he also serves as Special Advisor to the Provost on Indigenous Spaces.
Lewis spent a decade working in a range of industrial research settings, including Interval Research, US West's Advanced Technology Group, and the Institute for Research on Learning, and, at the turn of...
Instructors: CHELSEA MAULDIN (Executive Director, Public Policy Lab) & NATALIA RADYWYL (Research Director, Public Policy Lab)
This tutorial gives you robust, actionable tools for navigating inequity through a project life cycle.
This tutorial was conducted at EPIC2021. Exercises and discussions have been omitted to protect the privacy of participants.
To do ethical, equitable work in any domain, we need robust tools for assessing and addressing power. Whether we’re creating products, services, or policies, inequities can create direct and indirect risks for research participants and underserved populations. This tutorial gives you actionable tools for navigating inequity through a project life cycle.
Public Policy Lab developed Power Tools over many years of innovative and effective work with at-risk communities. Across planning, research, design, and implementation, the instructors will teach you how to use Power Tools to check biases, inform theories of change and logic models, identify effective...
Ovetta Sampson covers when, how, and where to integrate ethnography and data science in the exploratory research process to achieve better and more ethical AI product development outcomes. Through a combination of lecture, case study examples, and exercises, attendees gain a clear understanding of why making data a stakeholder in user research will create a more ethical and human-centered AI product. This tutorial was created especially for researchers who understand the need to mix ethnography and data science but just don’t know quite how to do it. Topics include:
Bringing data into research planning to help identify and reduce bias
Bringing data science into synthesis to help illuminate system solutions
Bringing data science into insight and design principle generation
Aligning qualitative data and behavioral data
This tutorial was presented in full at EPIC2020. The video includes instructor presentations; discussions and breakout sessions are excluded for the privacy of...
3A Institute, Australian National University
PechaKucha Presentation—A wise woman once shared with me that the opposite of poverty isn't wealth. It's dignity. In a world where scale is about optimising for something bigger, faster, easier, broader and more profitable, we risk decision-making that is at odds with preserving, enabling and enhancing human dignity. What if we changed our focus to instead work out how we scale human dignity?
This PechaKucha draws on my career across consulting, social enterprise and academia in geographies from Sydney CBD to rural Uganda and highlights three moments where I experienced dignity that I believe can scale.
Through storytelling, it offers glimmers of how we can choose a definition of scale that privileges dignity. That can look like making space for a chicken gift, enshrining dignity in our organisational values and structures, and building question-asking muscles.
If we believe that the opposite of poverty is dignity, then scaling dignity is an antidote to poverty....
The US banking industry has a long history of excluding, exploiting, or simply ignoring low-income communities, recent immigrants, and racial minorities. In this paper, I share my experiences creating a community of practice where employees of a rapidly-growing banking startup can identify and confront the ethical challenges facing the financial technology (fintech) industry. This community is informed by insights from four years of activism and anthropological research that I conducted with small teams of service designers and ethnographers developing financial services for and with low- to moderate-income communities around the world. Through this research, I identified three institutional logics—insularity, decontextualization, and technological hubris—which limit efforts to build a more inclusive, equitable banking system. These logics hold the potential to lead well-intentioned organizations, and the practitioners they employ, to harm the marginalized communities they set out to help. This paper concludes...
FANI NTAVELOU BAUM
Despite companies facing real consequences for getting ethics wrong, basic ethical questions in emerging technologies remain unresolved. Companies have begun trying to answer these tough questions, but their techniques are often hindered by the classical approach of moral philosophy and ethics – namely, normative philosophy – which prescribes an approach to resolving ethical dilemmas from the outset, based on assumed moral truths. In contrast, we propose that a key foundation for ‘getting ethics right’ is to do the opposite: to discover ethics, by going out into the world to study how relevant people resolve similar ethical dilemmas in their daily lives – a project we term ‘grounded ethics’. Building from Durkheim's theory of moral facts and more recent developments in the anthropology of morals and ethics, this paper explores the methods and theory useful to such a mission – synthesizing these into a framework to guide future...
CUNY/Data & Society
Technology companies have discovered ethics in the wake of public pressure to consider the consequences of their products. This has been prompted by the finding that machine learning and artificial intelligence (ML/AI) systems, as fundamentally pattern-seeking technologies, can and do exacerbate long-term structural inequalities. Companies and employees also struggle with the challenges posed by the dual-use nature of technology.
This tutorial will prepare you to understand and contribute to the more ethical development and deployment of ML/AI systems. It covers:
An overview of ethical challenges in ML/AI today
An introduction to the development of ML/AI systems, designed to give you insight into the reasoning processes and workflows of technical colleagues and how they generally address issues like accuracy and fairness (no quantitative background required!)
An overview of current efforts to design more ethical ML/AI systems,...
Alliance Innovation Lab – Silicon Valley
Alliance Innovation Lab – Silicon Valley, MIT
Alliance Innovation Lab – Silicon Valley
This paper explores how the design of everyday interactions with artificial intelligence in work systems relates to broader issues of interest to social scientists and ethicists: namely, human well-being and social inequality. The paper uses experience designing human interactions with highly automated systems as a lens for looking at the social implications of work design, and argues that what humans and automation each do is less important than how they are structured to interact. The Human-Autonomy Teaming (HAT) paradigm, explored in the paper, has been a promising alternative way to think about human interactions with automation in our laboratory's research and development work. We argue that the notion of teaming is particularly useful in that it encourages designers to consider human well-being as central to the operational success...
This paper recounts research into the orientation and mobility experiences of people who are blind or visually impaired, and describes the novel sonic research method I developed for this purpose. “Participant Phonography,” as I call the method, aims to empower research participants with low or no vision through the self-guided creation of sound recordings that represent their experiences of the world in a first-person perspective. More broadly, the paper highlights the inadequate efforts of ethnographers in industry to tackle challenges of disability and reflects on the ethical challenges that face researchers who want to include disabled people in research. Inclusive methods like participant phonography have great potential to break down traditional power structures that have rendered non-normative groups marginal in user research, but these methods also come with substantial barriers to their implementation in a corporate context....
The social welfare system was built to protect the vulnerable through the provision of basic needs. I left my social service job to join an organization with a mission to shift that system from safety nets to trampolines - from services designed to maximize safety, to those that develop agency and resilience. That’s meant interrogating and renewing my principles for ethical engagement with people who are getting the poorest outcomes from services. Returning people’s data to them in the form of a story is now a practice at the heart of my relationships with the people with whom I do research. At the best of times this interaction is an intervention in and of itself, validating someone’s experience and allowing them to open themselves up to new self-narratives. But the goal of story return is not a positive reception; rather, it’s about following through on our ethical commitment to recognize people’s ownership over their own data, and allowing them the opportunity to benefit...
IDEO Chicago and DePaul University
Case Study—This case study provides an inside look at what occurs when methods from the data science and ethnographic fields are mixed to solve perennial customer service problems within the call center and cruise industries. The paper details how this particular blend of ethnographic practitioners with a data scientist resulted in changes to design approaches, debunking myths about qualitative and quantitative research methods being at odds and altering team member perspectives about the value of both. The project also led to the creation of innovative blended design research and data science methods to discover and leverage the right customer data to the benefit of both the customer and the call center agents who serve them. This paper offers insight into the untold value design teams can unlock when data scientists and ethnographers work together to solve a problem. The result was a design solution that gives a top-performing company an edge to grow even better by leveraging the millions...
ELIZABETH CHURCHILL, Distinguished Researcher, IBM Almaden Research Center
MIRIAM LUECK AVERY, Mozilla
ASTRID COUNTEE, Data for Democracy
NATHAN GOOD, Good Research
This EPIC2018 panel addresses questions of fairness and justice in data-centric systems. While the many social problems caused by data-centric systems are well known, what options are available to us to make things better? Chair Elizabeth Churchill draws the panelists and audience into conversation about making change on many levels, in our daily work as well as larger-scale collaborations.
Elizabeth Churchill is a Director of User Experience at Google. She has built research groups and led research in a number of well-known companies, including as Director of Human Computer Interaction at eBay Research Labs, Principal Research Scientist and Research Manager at Yahoo!, and Senior Scientist at PARC and Fuji Xerox’s Research lab. Elizabeth has more than 50 patents granted or pending, 5 co-edited and 2 co-authored books (Foundations for...
CHRISTOPHER A. GOLIAS, PHD
This PechaKucha explores the ethics of interpreting data by employing an extended metaphor of data as the lifeblood of the connected world. It begins by exploring two distinct viewpoints on medical pulse diagnosis, starting from the perspective of the acupuncturist diagnosing a patient’s pulse and continuing through differences between Eastern pulse diagnosis and biomedical pulse diagnosis. I explore data as lifeblood, and imagine more visceral ways to read data (e.g., auguring data) and the ethical implications of such a reading. I envision data as a flowing river filling a lake, in which diagnostic specialists observe society’s reflection. In the process, I contrast utopian visions of a data-driven world with dystopian ones before resolving the tension by returning to the central comparison of data scientist and medical doctor. The presentation concludes by recalling medicine’s Hippocratic Oath, an ethical charter binding practitioners to a code of conduct, and implying...