"An ethnographic lens influences us to define ability and disability in a way that is maximally inclusive...many different abilities are present in our world, and each deserves to be taken as its own reality and respected as such."
—RICHARD BECKWITH (Research Psychologist, Intelligent Systems Research Lab) & SUSAN FAULKNER (Research Director, Research and Experience Definition), Intel Corporation
EPIC Members Richard Beckwith and Susan Faulkner (Intel) have assembled a panel of luminaries in accessible tech research, design, and engineering for our January 26 event, Seeing Ability: Research and Development for Making Tech More Accessible. In anticipation, we asked them a few questions about their approach to accessibility and key first steps all of us can take to do more inclusive work.
How do you define ability and accessibility? How does an ethnographic lens influence your definitions?
Ability has to do with what an individual is capable of perceiving or physically doing with their body; accessibility has to do with...
Computing Accessibility: An EPIC Talk with RICHARD BECKWITH, DARRYL ADAMS, TIM GRAHAM, STACY BRANHAM, and ADAM MUNDER
January 26, 3–4:30 pm Pacific time (10–11:30 am AEDT)
Free for EPIC Members
One of the biggest challenges facing the tech industry today is access to computing for people of all abilities. This EPIC Talk explores making computing more accessible through ethnography. How can the EPIC community think about bringing accessibility into our work? What has been the impact of design inspired by disability? How should ethnographers work with people who have disabilities?
Industry leaders, educators, and researchers from Intel, UC Irvine, Dell Technologies, and OmniBridge will share key insights on how we can use ethnography to address computing accessibility. This session is intended to amplify a much-needed conversation with people leading the work to advocate for inclusion in research, computing, and technology. We encourage our EPIC community to share their...
JOHN W. SHERRY
Scale suffuses the work we do and, recently, has us considering an aspect of scale best suited to those with ethnographic training. We've been asked to help with scaling up one of the latest blockbusters in high tech: deep learning. Advances in deep learning have enabled machines not only to see who we are (facial ID systems) and hear what we say (natural language systems), but even to recognize what we do (vision-based activity recognition). However, machines often define the objects of their gaze at the wrong scale. Rather than “look for” people or objects, deep learning systems typically look for patterns at the smallest scale possible. In multiple projects, we've found that insights from anthropology are needed to inform both the scale and the uses of these systems.
Keywords Deep Learning, Human Scale, Ethnographic Insights
Article citation: 2020 EPIC Proceedings...