Learn models and principles to ensure organizations are creating, using, and deploying AI that coworkers, customers, and society can trust.
Instructors: KATHY BAXTER, Principal Architect of Ethical AI Practice, Salesforce & YOAV SCHLESINGER, Architect of Ethical AI Practice, Salesforce
This video has been edited to protect the privacy of participants in the live tutorial.
Our lives are directed, enriched, influenced, and sometimes harmed by AI, in both obvious and not-so-obvious ways. Although many AI regulations have been implemented in the last few years and more regulation is coming, organizations cannot wait until they are compelled by external forces to develop responsible AI practices. For the sake of your business, customers, and society, everyone has a responsibility to ensure that the technology they help build, sell, or deploy is fair, transparent, and ethical.
In this tutorial, we will walk participants through the steps of creating a responsible AI practice using a combination of lecture,...
This tutorial gives you and your teams robust, actionable tools for navigating inequity and shifting power hierarchies, from project planning to implementation.
Instructors: CHELSEA MAULDIN, Executive Director, Public Policy Lab & NATALIA RADYWYL, Head of Research & Capability, Today Design
This video has been edited to protect the privacy of participants in the live tutorial.
To do ethical, equitable work in any domain, we need robust tools for assessing and addressing power. Whether we’re creating products, services, or policies, inequities can create direct and indirect risks for research participants and underserved populations. This tutorial gives you robust, actionable tools for navigating inequity through a project life cycle, including planning, research, design, and implementation. You will:
Identify power dynamics in research and design projects
Learn frameworks and tools to navigate power dynamics through a project lifecycle
Learn power-based assessments to use with individuals...
DANA C. GIERDOWSKI
This case study examines how researchers at Lenovo and dscout partnered to conduct a mobile ethnographic study on the technology experiences of individuals who are d/Deaf and hard of hearing, with the goal of making their products and research practices more accessible and inclusive. The study revealed common frustrations and pain points people experience when using their everyday technology. The researchers also learned valuable research design and operations lessons related to recruiting participants who are d/Deaf and hard of hearing, providing accommodations, and establishing an accessible research environment. This case explores the benefits of mobile-forward research design, and the additional considerations and adaptations necessary for collecting both asynchronous and synchronous data from individuals who have hearing loss and who have different communication modes and preferences, including American Sign Language. The authors discuss how more inclusive...
Code for America
Social Workers Who Design
This paper is an exploration of trauma, how and why it can surface during ethnographic and qualitative research, and the importance of anticipating its potential presence. We present a model to help plan for and mitigate the risks of trauma and demonstrate how it fits into broader methodological discussions of conducting safer and more ethical, responsible, and humane research. We close by discussing one pathway for a journey from being sensitive and aware of trauma to actively responding to it at both the individual and organizational levels across your work. Keywords: Trauma informed care, trauma responsive research and design, design research, ethics, qualitative methods
Article citation: 2022 EPIC Proceedings pp 9–34, ISSN 1559-8918, https://www.epicpeople.org/epic...
Keynote Speaker: PANTHEA LEE, Reboot
Panthea Lee is a strategist, organizer, designer, and facilitator, and the Executive Director of Reboot. She is passionate about building transformative coalitions between communities, activists, movements, and institutions to tackle structural inequity—and working with artists to realize courageous social change.
Panthea is a pioneer in designing and guiding multi-stakeholder processes to address complex social challenges, with experience doing so in 30+ countries with partners including UNDP, MacArthur Foundation, Luminate, CIVICUS, Wikimedia, Women’s Refugee Commission, and governments and civil society groups at the national, state, and local levels. The global co-creation efforts she’s led have launched new efforts to protect human rights defenders, tackle public corruption, strengthen participatory democracy, advance equity in knowledge access, reform international agencies, and drive media innovation. Panthea began her career as a journalist, ethnographer, and cultural...
Moderator: JILLIAN POWERS, Cognizant
Panelists: JORDAN KRAEMER, Anti-Defamation League; ARWA MICHELLE MBOYA, Magic Leap; JESSICA OUTLAW, The Extended Mind LLC
As new technologies, from AI to immersive experiences, are developed at scale, they raise ethical concerns for research and design. Data-driven systems have repeatedly been shown to entrench social biases along lines of race, gender, and class, from racist algorithms in the criminal justice system to misgendering trans and nonbinary people. Immersive technologies, such as virtual reality (VR) and augmented reality (AR), however, raise separate and thorny questions for ethical design. Immersive technologies create novel experiences of embodiment and reality, not to mention new sources of personal data. These facets create distinctive challenges for ethics, equity, and inclusion, intensifying the potential harms of misinformation, harassment, privacy violations, surveillance, or unequal access. How can ethnographic research anticipate emergent ethical questions specific...
Keynote Speaker: JASON LEWIS, Concordia University
Jason Edward Lewis' multidisciplinary research and creative practice has been central to developing Indigenous media art in North America and worldwide, establishing a vital conversation about the interaction between Indigenous culture and computational technology. His contributions comprise scholarly writing, art making and technology research, as well as his leadership of the Initiative for Indigenous Futures and his creation of the Indigenous Futures Research Centre. A digital media theorist, poet, and software designer, Lewis is currently University Research Chair in Computational Media and the Indigenous Future Imaginary and Professor of Computation Arts at Concordia University. At Concordia he also serves as Special Advisor to the Provost on Indigenous Spaces.
Lewis spent a decade working in a range of industrial research settings, including Interval Research, US West's Advanced Technology Group, and the Institute for Research on Learning, and, at the turn of...
Instructors: CHELSEA MAULDIN (Executive Director, Public Policy Lab) & NATALIA RADYWYL (Research Director, Public Policy Lab)
This tutorial gives you robust, actionable tools for navigating inequity through a project life cycle.
This tutorial was conducted at EPIC2021. Exercises and discussions have been omitted to protect the privacy of participants.
To do ethical, equitable work in any domain, we need robust tools for assessing and addressing power. Whether we’re creating products, services, or policies, inequities can create direct and indirect risks for research participants and underserved populations. This tutorial gives you robust, actionable tools for navigating inequity through a project life cycle.
Public Policy Lab developed Power Tools over many years of innovative and effective work with at-risk communities. Across planning, research, design, and implementation, the instructors will teach you how to use Power Tools to check biases, inform theories of change and logic models, identify effective...
Ovetta Sampson covers when, how, and where to integrate ethnography and data science in the exploratory research process to have better and more ethical AI product development outcomes. With a combination of lecture, case study examples, and exercises, attendees gain a clear understanding of why making data a stakeholder in user research will create a more ethical and human-centered AI product. This tutorial is created especially for researchers who understand the need to mix ethnography and data science but just don’t know quite how to do it. Topics include:
Bringing data into research planning to help identify and reduce bias
Bringing data science into synthesis to help illuminate system solutions
Bringing data science into insight and design principle generation
Aligning qualitative data and behavioral data
This tutorial was presented in full at EPIC2020. The video includes instructor presentations; discussions and breakout sessions are excluded for the privacy of...
3A Institute, Australian National University
PechaKucha Presentation—A wise woman once shared with me that the opposite of poverty isn't wealth. It's dignity. In a world where scale is about optimising for something bigger, faster, easier, broader and more profitable, we risk decision-making that is at odds with preserving, enabling and enhancing human dignity. What if we changed our focus to instead work out how we scale human dignity?
This PechaKucha draws on my career across consulting, social enterprise and academia in geographies from Sydney CBD to rural Uganda and highlights three moments where I experienced dignity that I believe can scale.
Through the telling of stories, it shows glimmers of how we can choose a definition of scale that preferences dignity. It can look like making space for a chicken gift, enshrining dignity in our organisational values and structures, and building question-asking muscles.
If we believe that the opposite of poverty is dignity, then scaling dignity is an antidote to poverty....
The US banking industry has a long history of excluding, exploiting, or simply ignoring low-income communities, recent immigrants, and racial minorities. In this paper, I share my experiences creating a community of practice where employees of a rapidly growing banking startup can identify and confront the ethical challenges facing the financial technology (fintech) industry. This community is informed by insights from four years of activism and anthropological research that I conducted with small teams of service designers and ethnographers developing financial services for and with low- to moderate-income communities around the world. Through this research, I identified three institutional logics—insularity, decontextualization, and technological hubris—which limit efforts to build a more inclusive, equitable banking system. These logics hold the potential to lead well-intentioned organizations, and the practitioners they employ, to harm the marginalized communities they set out to help. This paper concludes...
FANI NTAVELOU BAUM
Despite companies facing real consequences for getting ethics wrong, basic ethical questions in emerging technologies remain unresolved. Companies have begun trying to answer these tough questions, but their techniques are often hindered by the classical approach of moral philosophy and ethics – namely normative philosophy – which prescribes an approach to resolving ethical dilemmas from the outset, based on assumed moral truths. In contrast, we propose that a key foundation for ‘getting ethics right’ is to do the opposite: to discover them, by going out into the world to study how relevant people resolve similar ethical dilemmas in their daily lives – a project we term ‘grounded ethics’. Building from Durkheim's theory of moral facts and more recent developments in the anthropology of morals and ethics, this paper explores the methods and theory useful to such a mission – synthesizing these into a framework to guide future...
CUNY/Data & Society
Technology companies have discovered ethics in the wake of public pressure to consider the consequences of their products. This has been prompted by the finding that machine learning and artificial intelligence (ML/AI) systems, as fundamentally pattern-seeking technologies, can and do exacerbate long-term structural inequalities. Companies and employees also struggle with the challenges posed by the dual-use nature of technology.
This tutorial will prepare you to understand and contribute to the more ethical development and deployment of ML/AI systems. It covers:
An overview of ethical challenges in ML/AI today
An introduction to the development of ML/AI systems, designed to give you insight into the reasoning processes and workflows of technical colleagues and how they generally address issues like accuracy and fairness (no quantitative background required!)
An overview of current efforts to design more ethical ML/AI systems,...
Alliance Innovation Lab – Silicon Valley
Alliance Innovation Lab – Silicon Valley, MIT
Alliance Innovation Lab – Silicon Valley
This paper explores how the design of everyday interactions with artificial intelligence in work systems relates to broader issues of interest to social scientists and ethicists: namely human well-being and social inequality. The paper uses experience designing human interactions with highly automated systems as a lens for looking at the social implications of work design, and argues that what human and automation each do is less important than how human and automation are structured to interact. The Human-Autonomy Teaming (HAT) paradigm, explored in the paper, has been a promising alternative way to think about human interactions with automation in our laboratory's research and development work. We argue that the notion of teaming is particularly useful in that it encourages designers to consider human well-being as central to the operational success...
This paper recounts research into the orientation and mobility experiences of people who are blind or visually impaired, and describes the novel sonic research method I developed for this purpose. “Participant Phonography,” as I call the method, aims to empower research participants with low or no vision through the self-guided creation of sound recordings that represent their experiences of the world in a first-person perspective. More broadly, the paper highlights the inadequate efforts of ethnographers in industry to tackle challenges of disability and reflects on the ethical challenges that face researchers who want to include disabled people in research. Inclusive methods like participant phonography have great potential to break down traditional power structures that have rendered non-normative groups marginal in user research, but these methods also come with substantial barriers to their implementation in a corporate context....