This 2019 project, conducted in the US and the UK, sought to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism. This case study demonstrates how ethnographic methods led to insights into what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief was often a reflection of a person's general sense of societal alienation. We discovered that any sufficiently extreme version of a conspiracy theory could be harmful. The findings of this project changed how the client—and by extension the engineers behind major tech platforms—understood harmful conspiracy-related content, and led to a refinement of the algorithms governing the discoverability of this content. The aim of the project was to scale and amplify the work of individual debunkers through algorithmic interventions.
Keywords: Conspiracy theories,...
While a number of scholars have studied online communities, research on games has mostly focused on the business, experience, and content of gameplay. Interactions between players within games have received less attention, and toxic behavior is a newer area of investigation in academia. Inquiry into toxicity in gaming is part of a larger body of literature and public interest emerging around disruptive and malicious social interactions online, including cyberbullying, child-grooming, and extremist recruiting. Through our research we reaffirmed that toxicity in gaming is a problem at a global scale, but we also discovered that, on a micro scale, what behavior gamers perceive as toxic, and how toxicity is enacted in gaming, differs depending on cultural context, among other factors. The generalized problem at scale, and its particular manifestations at the micro level, raise philosophical and technology design questions, which we address through examples from our own research...