Convo Research & Strategy Pvt Ltd
PechaKucha Presentation—This paper raises the implications of simplifying algorithms for scale, and of uplifting content that is damaging to human development. Technology is powerful because of its scale, and disempowering for the same reason. Scale lives in the variables, and online media, in its zest for empowering women, is deciding our fate. I get it when a housewife turns to YouTube to cook a meal. However, I also see the heartbreak when what should be freeing is instead used to throttle progress. What happens when a girl from a small sub-segment of the global population, like Rajasthan, wants to feel empowered but realises that she is unable to measure up? Are we responsible for this? Are our “hashtags” and “likes” fuelling our continued repression?
As an ethnographer, I study media consumption to overcome barriers to participation in the online world, and as a gender trainer, I also create and use media content to overcome barriers in the real world. I find myself continually...
This 2019 project conducted in the US and the UK sought to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism. This case study demonstrates how ethnographic methods led to insights on what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief was often a reflection of a person's general sense of societal alienation. We discovered that any extreme version of a conspiracy theory could be harmful. The findings of this project changed how the client—and by extension engineers behind major tech platforms—understood harmful conspiracy-related content, and led to a refinement of the algorithms defining the discoverability of this content. The aim of this project was to scale and amplify through algorithmic interventions the work of individual debunkers.
Keywords: Conspiracy theories,...
by MINNA RUCKENSTEIN, University of Helsinki
It is easy to become pessimistic, if not dystopic, about tracking technologies. The current digital services landscape promotes scoring, selecting and sorting of people for the purposes of maximizing profit. Machine logics rely on profiling characteristics and predicting actions, and management by algorithms appears to be disproportionately affecting those with temporary and low-income jobs. Tracking technologies become complicit in deepening and accelerating social divisions and inequalities. The most vulnerable in societies have no say in how their actions are monitored and lives are harmed by algorithmically produced metrics.
In this context, Quantified Self (QS) – an international community of ‘self-trackers’ that shares insights gained through self-quantification and data analysis – seems rarified, an example of the privileged techno-elite positioned to use tracking data to pursue their own values and goals. With this limitation, QS hardly appears to be a useful prism...
Instructor: IAN LOWRIE
Approx 1 hr 43 min. This video presents the lecture portion of a half-day tutorial. Case studies and a bibliography are provided for your use.
Instructor Ian Lowrie describes the organizational and technological aspects of modern data pipelines, framing data science ethnographically as a knowledge practice and data scientists as a particular kind of expert. He also explores methodological approaches to studying data work in real-world contexts. Participants learned to:
- Think ethnographically about data work as a knowledge practice
- Develop methodological strategies for studying data work
- Chart the organizational and technological components of data infrastructure
- Interpret the mindset, jargon, and practical orientations of their data scientist and developer colleagues
- Understand how algorithmic systems and data analytics impact organizational structures, work practices, and business models
In the second half of the tutorial, participants worked collaboratively to develop a pitch for...
University at Albany, SUNY
Virginia Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. Her most recent book is Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, which danah boyd calls “the first [book] that I’ve read that really pulls you into the world of algorithmic decision-making and inequality, like a good ethnography should,” and Ethan Zuckerman calls “one of the most important recent books for understanding the social implications of information technology for marginalized populations in the US.” Eubanks is also the author of Digital Dead End: Fighting for Social Justice in the Information Age; and co-editor, with Alethia Jones, of Ain’t Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. Her writing about technology and social justice has appeared in The American Prospect, The Nation, Harper’s and Wired. For two decades, Eubanks has worked in community technology and...
MILLIE P. ARORA
As algorithms play an increasingly important role in the lives of people and corporations, finding more effective, ethical, and empathetic ways of developing them has become an industry imperative. Ethnography, and the contextual understanding derived from it, has the potential to fundamentally change the way that data science is done. Reciprocally, engaging with data science can help ethnographers focus their efforts, build stronger and more precise insights, and ultimately have greater impact once their work is incorporated into the algorithms that increasingly power our society. In practice, building contextually informed algorithms requires collaboration between human science and data science teams who are willing to extend their frame of reference beyond their core skill areas. This paper aims to first address the features of ethnography and data science that make collaboration between the two more valuable than the sum of their...
Founder, CEO, Acesio Inc.
Head of Behavioral and Organizational Research, Acesio Inc.
The focus of this paper is to investigate deep learning algorithm development in an early-stage start-up in which the edges of knowledge formation and organizational formation were unsettled and contested. We use a debate between anthropologists Clifford Geertz and Claude Lévi-Strauss to examine these contested computational forms of knowledge through a contemporary lens. We set out to explore these epistemological edges as they shift over time and as they have real practical implications for how expertise and people are valued as useful or non-useful, integrated or rejected, by the practice of deep learning algorithm R&D. We discuss the nuances of epistemic silences and acknowledgments of domain knowledge and universalizing machine learning knowledge in an organization that was rapidly attempting to develop algorithms for diagnostic insights. We conclude with reflections on how an AI-Inflected Ethnography perspective...
Case Study—We report on a two-year project focused on the design and development of data analytics to support the cloud services division of a global IT company. While the business press proclaims the potential for enterprise analytics to transform organizations and make them ‘smarter’ and more efficient, little has been written about the actual practices involved in turning data into ‘actionable’ insights. We describe our experiences doing data analytics within a large global enterprise and reflect on the practices of acquiring and cleansing data, developing analytic tools and choosing appropriate algorithms, aligning analytics with the demands of the work and constraints on organizational actors, and embedding new analytic tools within the enterprise. The project we report on was initiated by three researchers: a mathematician, an operations researcher, and an anthropologist well-versed in practice-based technology design, in collaboration...
CUNY Graduate Center / Data & Society
Cloudera Fast Forward Labs
The successes of technology companies that rely on data to drive their business hint at the potential of data science and machine learning (DS/ML) to reshape the corporate world. However, despite the headway made by a few notable titans (e.g., Google, Amazon, Apple) and upstarts, the advances that are advertised around DS/ML have yet to be realized on a broader basis. The authors examine the tension between the spectacular image of DS/ML and the realities of applying the latest DS/ML techniques to solve industry problems. The authors discern two distinct ways, or modes, of thinking about DS/ML woven into current marketing and hype. One mode focuses on the spectacular capabilities of DS/ML. It expresses itself through one-off, easy-to-grasp marketable projects, such as DeepMind’s AlphaGo (Zero). The other mode focuses on DS/ML’s potential to transform industry. Hampered by an emphasis on tremendous but as yet unrealized...
Those who work in research know that we live in a world that is strongly influenced by what Tricia Wang has called the quantification bias. More so than other forms of information, numbers have incredible formative power. In our culture, numbers are seen as trustworthy representations of reality that are strongly associated with objectivity and untainted by human bias and shortcomings. Recently, data science, big data, algorithms, and machine learning have fueled a new wave of the quantification bias. One of the central fascinations of this wave has been the promise that humans now have the power of prediction at their fingertips. In this paper, I reflect on what it means to make predictions and explore the differences in how predictions are accomplished via quantitative modeling and ethnographic observation. While this is not the first time that ethnographic work has been put in conversation and in contrast with quantified practices, most theorists have framed the role of ethnography as providing context...