by QAMAR ZAMAN, Stripe Partners
When I was studying economics at university one of our professors introduced us to Jorge Luis Borges’s “On Exactitude in Science”, a one-paragraph story. It imagines an empire so enthralled by cartography that larger and larger maps of the place are built by successive generations until a map on the same scale as the empire is drawn. Following generations realise a map of such magnitude is cumbersome and “in the western deserts, tattered fragments of the map are still to be found, sheltering an occasional beast or beggar”. Our professor’s point back then was that in a world where trying to see and make sense of too much is impossible, simple models to comprehend the world (and economics was built on simple models) carry immense value. Some years on from that, combining big data and thick data promises the ability to see and understand much more. Their combination can provide maps which are vast but also allow us to make sense of the landscape and people inhabiting them. This piece shows what...
University at Albany, SUNY
Virginia Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. Her most recent book is Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which danah boyd calls “the first [book] that I’ve read that really pulls you into the world of algorithmic decision-making and inequality, like a good ethnography should,” and Ethan Zuckerman calls “one of the most important recent books for understanding the social implications of information technology for marginalized populations in the US.” Eubanks is also the author of Digital Dead End: Fighting for Social Justice in the Information Age; and co-editor, with Alethia Jones, of Ain’t Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. Her writing about technology and social justice has appeared in The American Prospect, The Nation, Harper’s and Wired. For two decades, Eubanks has worked in community technology and...
Few professions appear more at odds, at least on the surface, than ethnography and data science. The former deals in qualitative “truths,” gleaned by human researchers, typically based on careful, deep observation of only a small number of human subjects. The latter deals in quantitative “truths,” mined through computer-executed algorithms, based on vast swaths of anonymous data points. To the ethnographer, “truth” involves an understanding of how and why things are truly the way they are. To the data scientist, “truth” is more about designing algorithms that make guesses that are empirically correct a good portion of the time. Data-science-driven products, like those that Uptake builds, are most powerful and functional when they leverage the core strengths of both data science and ethnographic insights: what we call Human-Centered Data Science. I will argue that data science, including the collection and manipulation of data, is a practice that is in many ways as human-centered and subjective...
imec-SMIT, Vrije Universiteit Brussel
imec-SMIT, Vrije Universiteit Brussel
This paper aims to contribute to the debate on the integration of ethnography and data science by providing a concrete research tool to deploy this integration. We start from our own experiences with user research in a data-rich environment, the smart city, and work towards a research tool that leverages ethnographic praxis with data science opportunities. We discuss the different key components of the system, how they work together and how they allow for human sensemaking....
MILLIE P. ARORA
As algorithms play an increasingly important role in the lives of people and corporations, finding more effective, ethical, and empathetic ways of developing them has become an industry imperative. Ethnography, and the contextual understanding derived from it, has the potential to fundamentally change the way that data science is done. Reciprocally, engaging with data science can help ethnographers focus their efforts, build stronger and more precise insights, and ultimately have greater impact once their work is incorporated into the algorithms that increasingly power our society. In practice, building contextually-informed algorithms requires collaboration between human science and data science teams who are willing to extend their frame of reference beyond their core skill areas. This paper aims to first address the features of ethnography and data science that make collaboration between the two more valuable than the sum of their...
Program of Applied Anthropology, Oregon State University
Program of Geography, Oregon State University
Program of Mechanical Engineering and Program of Applied Anthropology, Oregon State University
For its volume, velocity, and variety (the 3 Vs), big data has been ever more widely used for decision-making and knowledge discovery in various sectors of contemporary society. Recently, a major challenge increasingly recognized in big data processing is the issue of data quality, or the veracity (the 4th V) of big data. Without addressing this critical issue, big data-driven knowledge discoveries and decision-making can be highly questionable. In this paper, we propose an innovative methodological approach, an archaeological-ethnographic approach, that aims to address the challenge of big data veracity and to enhance big data interpretation. We draw upon our three recent case studies of fake or noise data in different data environments. We approach big data as but another kind of human...
Case Study—We report on a two-year project focused on the design and development of data analytics to support the cloud services division of a global IT company. While the business press proclaims the potential for enterprise analytics to transform organizations and make them ‘smarter’ and more efficient, little has been written about the actual practices involved in turning data into ‘actionable’ insights. We describe our experiences doing data analytics within a large global enterprise and reflect on the practices of acquiring and cleansing data, developing analytic tools and choosing appropriate algorithms, aligning analytics with the demands of the work and constraints on organizational actors, and embedding new analytic tools within the enterprise. The project we report on was initiated by three researchers: a mathematician, an operations researcher, and an anthropologist well-versed in practice-based technology design, in collaboration...
JEANETTE BLOMBERG, Distinguished Researcher, IBM Almaden Research Center
MARC BÖHLEN, Professor, Department of Art, Emerging Practices, State University of New York at Buffalo
TOM LEE, Director of Data Science, Fisher Center for Business Analytics, University of California Berkeley
What does a data expert see when they look at a design problem? This panel immerses us in the practices of two data experts, both of whom have collaborated with ethnographers, as they navigate through design challenges in different ways. Chair Jeanette Blomberg draws the panelists and audience into conversation about synergies and challenges for interdisciplinary design collaborations.
Jeanette Blomberg is Distinguished Researcher at the IBM Almaden Research Center and Adjunct Professor at Roskilde University in Denmark. She has done foundational work on ethnography in design processes over three decades, and her current research is focused on organizational analytics and the linkages between human action, digital data production, data...
ELIZABETH CHURCHILL, Director of User Experience, Google
MIRIAM LUECK AVERY, Mozilla
ASTRID COUNTEE, Data for Democracy
NATHAN GOOD, Good Research
This EPIC2018 panel addresses questions of fairness and justice in data-centric systems. While the many social problems caused by data-centric systems are well known, what options are available to us to make things better? Chair Elizabeth Churchill draws the panelists and audience into conversation about making change on many levels, in our daily work as well as larger-scale collaborations.
Elizabeth Churchill is a Director of User Experience at Google. She has built research groups and led research in a number of well-known companies, including as Director of Human Computer Interaction at eBay Research Labs, Principal Research Scientist and Research Manager at Yahoo!, and Senior Scientist at PARC and Fuji Xerox’s Research lab. Elizabeth has more than 50 patents granted or pending, 5 co-edited and 2 co-authored books (Foundations for...
WILLIAM WELSER IV
Case Study—This case study details how a team of anthropologists and a team of data scientists sought to help a Middle Eastern theme park make use of its big data platform to measure ‘the good customer experience’. Ethnographic research within the theme park revealed that visitors yearned to bond with the other members of their group, as they rarely got the chance during their busy everyday lives back home. However, trying to build a measurement of how well the theme park delivered on bonding – through the development of a ‘bonding index’ – turned out to be unfeasible, because the big data platform focused on capturing operational data. The decision to focus on operational data had unintentionally created a path dependency that made the big data setup unfit for answering some of the theme park’s most fundamental questions. This is a problem ReD Associates has observed across clients, and to solve it this...
Those who work in research know that we live in a world that is strongly influenced by what Tricia Wang has called the quantification bias. More so than other forms of information, numbers have incredible formative power. In our culture, numbers are seen as trustworthy representations of reality that are strongly associated with objectivity and untainted by human bias and shortcomings. Recently, data science, big data, algorithms, and machine learning have fueled a new wave of the quantification bias. One of the central fascinations of this wave has been the promise that humans now have the power of prediction at their fingertips. In this paper, I reflect on what it means to make predictions and explore the differences in how predictions are accomplished via quantitative modeling and ethnographic observation. While this is not the first time that ethnographic work has been put in conversation and in contrast with quantified practices, most theorists have framed the role of ethnography as providing context...
by TOM HOY, Stripe Partners
Sensemaking: The Power of Humanities in the Age of the Algorithm
2017, 240 pp, Hachette Books
Christian Madsbjerg has done a huge amount to elevate the profile and impact of ethnography in corporate settings. As co-founder of ReD Associates, Madsbjerg makes a consistent and compelling case for ethnographers to set their sights beyond user experience and design to impact decisions at the pinnacle of global organisations.
His new book Sensemaking advances his mission further, advocating humanities-based thinking to a much wider business audience. The central analysis feels even more resonant today than when the book was released last year: the power of big data has created a false idol, lulling us into the belief that the algorithm has the capacity to replace critical thinking.
What unfolds is a story which is compelling and bold in critique, but strangely conservative and ambiguous in the solutions it prescribes.
Silicon Valley and the Renaissance Man [sic]
by TYE RATTENBURY (Salesforce) & DAWN NAFUS (Intel)
As EPIC2018 program co-chairs, we developed the conference theme Evidence to explore how evidence is created, used, and abused. We’ll consider the core types of evidence ethnographers make and use through participant observation, cultural analysis, filmmaking, interviewing, digital and mobile techniques, and other essential methods, as well as new approaches in interdisciplinary and cross-functional teams.1
We’ve also made a special invitation to data scientists to join us in Honolulu to advance the intersection of computational and ethnographic approaches. Why?
One of us is a data scientist (Tye) and the other an ethnographer (Dawn), both working in industry. We regularly see data science and ethnography conceptualized as polar ends of a research spectrum—one as a crunching of colossal data sets, the other as a slow simmer of experiential immersion. Unfortunately, we also see occasional professional stereotyping. A naïve view of “crunching” can make it seem...
EPIC2017 Platinum Panel
Moderated by: CHRIS HAMMOND (IBM)
Panelists: MARK BURRELL (IBM), MELISSA CEFKIN (Nissan Research Center), CHRISTIAN MADSBJERG (ReD Associates) & DAWN NAFUS (Intel)
Increasingly, experiences are being created that incorporate augmented intelligence, promising to make us smarter, more efficient, and more effective. Doctors can recommend more comprehensive personalized treatment plans, teachers can provide lesson plans tailored to individual students, and farmers can vary crop irrigation and fertilization cycles in response to predicted weather patterns. Human capabilities (some might say intelligence) are being augmented, aided by machine learning algorithms that interpret and find meaning in vast quantities of both structured and unstructured data.
This panel addresses challenges of doing design research in a cognitive world where predictive analytics, conversational interfaces, and augmented intelligence are core aspects of the technology solutions being designed. What skills...
by SAKARI TAMMINEN, Gemic
Ever since the 1970s, the promise of increased productivity through technology has been under intense scrutiny. It’s a promise that has pushed questions about the nature and role of technology in society into the hands of scholars, including anthropologists. For those working in industry – really, one of the few places where anthropologists can engage with technology in the real, rather than technology in theory – the question always boils down to value. Whether it’s big data, AI, biotech, nanotechnology, robots, smart dust or driverless cars, the one question we’re always looking to answer is: What’s the value of a new technology?
Economically, the gap between the promise of technological efficiency – particularly information technology – and its measured gains is known as the productivity paradox. Whether a true paradox or a series of assumptions about the impact of technology on productivity, the question of the value of technology sparked heated debate among economists over the first wave of computerization. In 1987,...