This 2019 project, conducted in the US and the UK, sought to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism. This case study demonstrates how ethnographic methods led to insights into what "triggered" conspiracy belief, the social and emotional roles conspiracy theories played in believers' lives, and how conspiracy belief often reflected a person's broader sense of societal alienation. We found that any sufficiently extreme version of a conspiracy theory could be harmful. The findings changed how the client—and by extension the engineers behind major tech platforms—understood harmful conspiracy-related content, and led to a refinement of the algorithms governing the discoverability of that content. The aim of the project was to scale and amplify, through algorithmic interventions, the work of individual debunkers.
Keywords: Conspiracy theories,...