by JULIA TAN & CAROLINA ALDAS, Spotify
When we think about the “Discovery” phase of the product development process, we often picture product owners, designers, and researchers working to understand a problem area and the needs of end users within it, and to test product ideas that might deliver on those needs. When no product exists yet, it can be difficult to justify Engineering’s time.
As such, the Discovery phase tends to be driven largely by product owners, designers, and insights practitioners, with Engineering taking a more active role only once product requirements and specifications become better defined. The process looks more like a relay race than synchronized swimming, and as the baton is passed, important context gets lost and some agility is compromised.
We’ve all been there. We devote time and energy to truly understanding people, their needs, and their motivations. We identify and user-test solutions that show high promise of delivering user and business value, only to find out they’re not entirely feasible...
This 2019 project, conducted in the US and the UK, sought to understand which conspiracy theories are harmful and which are benign, with an eye toward finding ways to combat disinformation and extremism. This case study demonstrates how ethnographic methods led to insights into what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief often reflected a person’s general sense of societal alienation. We discovered that any extreme version of a conspiracy theory could be harmful. The findings of this project changed how the client, and by extension the engineers behind major tech platforms, understood harmful conspiracy-related content, and led to a refinement of the algorithms that determine the discoverability of such content. The aim of this project was to scale and amplify the work of individual debunkers through algorithmic interventions.
Keywords: Conspiracy theories,...