
Fighting Conspiracy Theories Online at Scale


Cite this article: 2020 EPIC Proceedings pp 265–278, ISSN 1559-8918, https://epicpeople.org/fighting-conspiracy-theories-online-scale/

This 2019 project conducted in the US and the UK sought to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism. This case study demonstrates how ethnographic methods led to insights on what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief was often a reflection of a person’s general sense of societal alienation. We discovered that any conspiracy theory, in its extreme version, could be harmful. The findings of this project changed how the client—and by extension engineers behind major tech platforms—understood harmful conspiracy-related content, and led to a refinement of the algorithms that define the discoverability of this content. The aim of the project was to scale and amplify, through algorithmic interventions, the work of individual debunkers.

Keywords: Conspiracy theories, fieldwork, engineers

INTRODUCTION

In 2019, Jigsaw, a technology incubator within Google, and ReD Associates, a strategy consultancy, undertook ethnographic research on conspiracy theorists across the United States and the United Kingdom. The project set out with the initial mandate to understand which conspiracy theories are harmful and which are benign, with an eye towards finding ways to combat disinformation and extremism online. Although a small cadre of self-motivated conspiracy theory “debunkers” generate content online, their efforts are insufficient to tackle the proliferation of conspiracy misinformation, some of which motivates serious violence. (The August 2019 El Paso shooting, which killed 23 people in a Walmart, was fueled in part by a belief in the “white genocide” conspiracy theory.)

In its constant aim to navigate between the Scylla of undue censorship and the Charybdis of permitting harmful speech, Jigsaw (and more broadly, Google) stood to benefit from being able to surgically separate harmful conspiracy content from the harmless; that way, only the harmful could be penalized. More generally, in Google’s quest to better understand niches within a user base of over two billion, an ethnography of conspiracy theorists stood to render rich portraits of dimly understood, often reflexively vilified Internet users for those who shape some of the Internet’s most popular services.

This case study demonstrates how ethnographic methods led to insights on what “triggered” conspiracy belief, the social and emotional roles conspiracy theories played in believers’ lives, and how conspiracy belief was often a reflection of a person’s general sense of societal alienation.

Our initial assumption that some conspiracy theories were more harmful than others because they could incite acts of violence was ultimately revised, for two reasons. First, we found that any conspiracy theory, if taken to an extreme, could become harmful. Second, we came to feel that the more useful distinction to emerge from our ethnography was not between types of theories but between types of theorists. At a certain point in our study, we therefore pivoted and sought to identify the types of conspiracy theorists who are more likely to respond to at-scale technological deterrence strategies.

By focusing on two specific ethnographic encounters, we demonstrate why it is more important to distinguish between types of theorists than between types of conspiracy theories. Our conclusion is that “extreme” theorists cannot be reached by debunking content, because they will not consider factual argumentation at all; their beliefs are visceral and emotionally driven. Debunking content is instead best deployed to people who are milder in their conspiracy belief, at a stage where the belief has not yet become embedded and visceral. In-person ethnography was essential to arriving at this understanding, since the personas and behaviors revealed by our in-person visits often overturned the personas and behaviors suggested by a conspiracist’s digital presence.

The findings of this project changed how the client—and by extension engineers behind major tech platforms—understood harmful conspiracy-related content and how to scale efforts to curtail extremism fueled by conspiracy theories.

In this paper, we begin by providing background on debunking as a strategy to dissuade people from upholding conspiracy theories and explain why our methodology was based on an ethnographic approach. Second, we share two cases from our fieldwork to illustrate our main argument: that the most strategically feasible way of combatting conspiracy theories requires us to segment different types of theorists. Lastly, we discuss how our findings changed the way Jigsaw approached users who consume conspiracy theories online. This case study stands as an example of why ethnographic research on what happens offline helps explain, contradict, and influence what happens online. This perspective is necessary for developing and designing online products and for understanding the users themselves.

BACKGROUND

Belief in conspiracy theories, particularly in the U.S., is not new, but the Internet has made it possible to spread fringe beliefs rapidly, widely, and efficiently (Merlan 2019). Conspiracy theories continue to spread online at an alarming rate. Believers in extreme versions of conspiracy theories are sometimes moved to action. For instance, in 2016, a gunman stormed a pizza parlor in Washington, DC, convinced—because of conspiracy theories circulating online—that it was the site of a child sex trafficking ring. Understanding the line between a playful conspiracy theory and one that motivates people to harmful action is crucial. So, too, is understanding what can be done to help debunk conspiracy theories in a way that is persuasive, so that people who start to fall down conspiracy rabbit holes can climb back out.

Jigsaw had been studying misinformation, but conspiracy theories caught their attention as a poorly understood form of misinformation that was repeatedly linked to real-world violence. They began focusing on how conspiracy theories could be so powerful that they motivate real-world action, and on how to deter conspiracists. A popular approach to countering conspiracy theories is debunking. This typically entails engaging others one-on-one in great depth, or sometimes via broadcast, to counter very specific, and often highly technical, arguments. Jigsaw looked for efforts to scale debunking and came across the work of Mick West, an expert conspiracy theory debunker.

Mick West is a successful video game programmer (noted for his role in the popular Tony Hawk skateboarding series) who retired early and became a full-time debunker. He’s the author of a book titled Escaping the Rabbit Hole: How to Debunk Conspiracy Theories Using Facts, Logic, and Respect / A Guide to Helping Friends, Family and Loved Ones. We also consulted the foundational scholarly literature on conspiracy theories, including The Paranoid Style in American Politics by Richard Hofstadter, and Conspiracy Theories by Cass Sunstein and Adrian Vermeule, whose notion of the conspiracy theorists’ “crippled epistemology” we employed in our analysis.

It was Mick West’s book that was the touchstone, though; in it, he outlines a process that involves taking seriously the points offered by the believer and offering counterinformation. Rather than being dismissive, he brings a deep sense of empathy, a wealth of knowledge, and tremendous amounts of patience and care to each interaction.

Mick West has been debunking conspiracy theories for years, and he’s an inspiration for his deeply empathic approach. He has made dozens of videos on YouTube and runs a forum called Metabunk, where he hosts debates on theories as varied as 9/11 conspiracies, chemtrails, and the notion that the moon landing was a hoax. His book contains a few great success stories of people who had been deep down the rabbit hole but were gradually coaxed out.

Still, Mick West is only one man. And even though there are other conspiracy debunkers online, too, the problem is simply too big for a handful of hobbyist debunkers to make a real dent in.

A large portion of society believes in a conspiracy theory to some degree. In their 2014 article “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” political scientists Eric Oliver and Thomas Wood found that in any given year, about half of the American public endorses at least one highly dubious conspiracy theory. No matter how popular Mick West’s websites become, his painstakingly personalized approach to debunking simply cannot scale to meet the challenge.

This raises the question: how does one “scale” debunking? And is it even possible to do so in ways whose costs do not outweigh the benefits?

STUDYING CONSPIRACY THEORISTS

Academic research on conspiracy theories, here limited to the Western, English-speaking context, has largely focused on the psychology of individuals who believe in conspiracy theories and why they believe in them (Kluger 2017; Preston 2019; Roose 2019; van Prooijen and van Vugt 2018). Psychological studies of why certain individuals are motivated to uphold conspiracy theories highlight universal traits that make one receptive to this type of content, or show how belief in conspiracy theories reflects other existing psychological dispositions. For instance, a variety of cognitive differences were found to increase susceptibility to conspiratorial thinking, such as schizotypy, paranoia, or delusional ideation (Dagnall, Drinkwater, Parker, Denovan, and Parton 2015). Individuals with these cognitive differences are engaged in a world where conspiracy theories have explanatory power. Believing in conspiracy theories can fulfill emotional goals, whether feeling good about the world or exerting a feeling of control and order amid feelings of powerlessness (Hart 2018; Kluger 2017; Imhoff and Lamberty 2017; Grzesiak-Feldman 2013; Whitson and Galinsky 2008). Social exclusion may also lead people to support conspiratorial beliefs because they provide social meaning and value (Graeupner and Coman 2017). Psychologists have argued that people from low-status groups (with less education and wealth) were more likely to believe in conspiracy theories (Douglas et al. 2019; Freeman and Bentall 2017).

What we found particularly relevant to this study is how one conspiracy theory acts as a gateway to others. Once a person accepts one conspiracy theory, they are more likely to be receptive to other conspiracy theories (Brotherton, French, and Pickering 2013; Jolley and Douglas 2014; van Prooijen and Douglas 2018).

Beyond psychology, scholars of media studies have examined the role that social media has played in spreading conspiracy theories and helping to form new types of communities online (Jolley and Douglas 2014; Stempel, Hargrove, and Stempel III 2007; van Prooijen and Jostmann 2013). Although conspiracy theories posted online are theoretically accessible to anyone, researchers have found that conspiracy theory content tends to stay within specific communities that are already receptive to it or are actively seeking it out (Douglas et al. 2019). One study found that conspiracy theories about the Zika outbreak were spread online not through a central authority but through a series of decentralized networks (Wood 2018). This suggests that people share and consider conspiracy theories outside of, or separate from, “official” stamps of approval or authority figures.

Our research builds upon existing studies in three ways. First, we focused on the role that context (large-scale social, economic, and political factors) plays in shaping conspiratorial worldviews. This is distinct from psychological perspectives that are primarily focused on the types of cognitive profiles that make one susceptible to conspiracy theories. We probed further into how the environments people were living in were connected to the formation of conspiracy worldviews. For instance, conspiracy theories positing that a small group of elites controls the global economy helped people understand their lack of social mobility. Second, similar to the research on the instrumental nature of conspiracy theories, our research focused on the generative nature of adopting a conspiratorial worldview. Beyond fulfilling the need for control and power, we explored the social aspects of engaging in a conspiratorial worldview, including making friends, having a sense of purpose, and feeling excitement when theorizing with others. Third, we investigated the relationship between what people do online and offline. We traveled to meet theorists in person to gain a wider view of their everyday lives in their homes and workplaces. We sought to understand how people go from consuming conspiracy theory content to acting upon it; we define “acting upon it” as everything from forwarding a website to others, to liking a post, to meeting other believers in the local library, to openly considering (even if only on theoretical grounds) committing acts of harm upon the imagined conspirators.

The goal of this study was to learn whether a relationship exists between types of conspiracy theories and potential for harm. Even though a correlation between extremism and belief in conspiracy theories exists, believing in conspiracy theories does not automatically lead to extreme or violent behavior (Bartlett and Miller 2010). We conceptualize harm broadly to include consequences for believers’ personal relationships (e.g. estranged parents), health (e.g. refusing cancer treatment), and social status (e.g. being outed as a white nationalist). We also define harm in terms of actions taken against the people whom believers blame for perpetuating or benefiting from conspiracies, such as immigrants. Finally, we considered but did not directly investigate the harms that conspiracy theory belief inflicts upon faith in governments and institutions (Coaston 2018).

METHODOLOGY

To better understand how to stem the tide of false and potentially harmful conspiracy theories, Jigsaw had to better understand how people came to hold a conspiratorial worldview, what conspiracy thinking does for them and their lives, and what people do, if anything at all, with their beliefs. We wanted to understand how conspiracy theories fit into believers’ overall lives and what role the theories played in motivating other actions offline. While it is an important area of study to understand the psychological factors that explain how a person comes to believe in a conspiracy theory in the first place, we were more focused on tracing life histories and identifying patterns between belief and action, specifically with an eye toward harms linked to believing in conspiracies. Rather than investigating levels of education and intelligence or cognitive deficits, we wanted to understand the contextual, circumstantial, and personal factors that led someone down the rabbit hole, as well as what factors kept them from falling deeper. What role did the people around them, or life events, play in upholding or backing away from theories?

To answer these questions, our team of five researchers conducted in-person, in-depth interviews with 42 conspiracy theorists across the US and UK, as well as expert interviews with academics and journalists investigating conspiracy theories. In accordance with our initial hypothesis that some conspiracies were harmful and others innocuous, we recruited respondents across three different conspiracies: two theories we believed could be tied to real-world harm and a third “control group” theory we believed was likely to be harmless. In the “believed harmful” camp were theorists who believed in “false flag” events (the notion that, for instance, mass shootings have been staged—which has been linked to harassment), as well as believers in “white genocide” (the notion that immigration trends indicate a deliberate plot to eliminate whites—which has been linked to mass shootings). In our “believed harmless” camp were believers in various science-related conspiracies (e.g. chemtrails, flat earth).

We used websites like 4chan and Twitter as starting points for recruitment and observations. By searching for the term “white genocide,” for instance, or a related term called “the Kalergi Plan,” we were able to follow, converse with, and ultimately recruit participants over Twitter. We also used surveys with questions designed to screen for conspiracy belief, and we drew from our personal networks as well. Since many conspiracy theorists are of course skeptical of strangers, going through intermediaries was often helpful.

Prior to conducting research, the team familiarized themselves with media coverage and scholarship on conspiracy theories and interviewed experts. We actively cultivated an open mind and took particular care in understanding what language to use and how to present ourselves in ways that would not undermine our credibility (e.g. showing our familiarity with theories online rather than boasting about academic credentials). We consciously did not use the phrase “conspiracy theories” because of its negative connotations, but rather spoke about “alternative narratives,” “research,” and “truth.” Our approach was to be honest in presenting ourselves as former academics and journalists; we offered our sincere interest in understanding and listening to their points of view, and asked them to guide us through the websites, videos, and channels that they used as sources of information. We also offered to meet in public places and did not record or take pictures unless permission was granted.

It should be noted that in most respects, the people we met did not strike us as fundamentally different from the types of people we usually meet in other types of studies. Our research participants represented a variety of fields, including teaching, technology, construction, and healthcare. Building rapport with them was similar to any other interaction we have in the field, though with more awareness around language, being actively empathetic, and not drawing suspicion with recording devices.

Ultimately, we made in-person visits, each lasting several hours, in or around people’s homes. (One of our researchers also explored the lighter side of conspiracy culture by attending the “Storm Area 51” event held in October.) The insights gathered through fieldwork were analyzed alongside histories and sociological studies of conspiracy theories, including Kill All Normies, Fantasyland, and Republic of Lies: American Conspiracy Theorists and Their Surprising Rise to Power.

OUR FINDINGS: DISTINGUISH BETWEEN THE THEORISTS, NOT THE THEORIES

As discussed, our initial hypothesis was that certain theories are more extreme or harmful than others. In other words, we assumed a person’s likelihood to commit harm was related to the type of conspiracy they believed. We assumed that pseudo-scientific conspiracy theories like flat earth or chemtrails—the belief that the government is spraying mind-controlling chemicals from planes—were relatively innocuous, while racially tinged theories like white genocide were perhaps dangerous by definition.

But what we found surprised us. We learned that it was less important to distinguish between theories, and more important to distinguish between theorists. What matters is how much of a person’s life is taken over by a conspiratorial worldview. If everything is part of the conspiracy, a person can no longer trust anything or anyone. An extreme conspiratorial worldview frames the elite “they” as powerful and as the enemy. Thus, it is not surprising that some studies have found belief in conspiracy theories to be a predictor of having committed a crime, or of stating a willingness to commit one (McNamara 2019; Herrema 2019). In our own study, believers in extreme versions of conspiracy theories justified killing a conspirator, if one could be identified, because doing so would save others. It was not a particular theory that drove people to action but rather how deeply a person lived within an extreme conspiratorial worldview.

All conspiracy theories, we came to learn, have the potential to be harmful—more on that in a moment. And when it came to conspiracy theorists, we found a very wide spectrum in how hardened a person’s conspiracy belief was. This is crucial to know when hoping to “debunk at scale.”

To explain why, we will first discuss a trip the authors made to Montana.

DEEP DOWN THE RABBIT HOLE: THE HARDENED THEORIST

In November 2019, we flew to a remote town in Montana to meet some friends of friends who believe the earth is flat. We met a couple who had bonded over conspiracy theories and attended weekly meetups where they discussed and “tested” conspiracy theories (for instance, by pointing telescopes at the horizon to try to determine the shape of the earth).

The woman we met with, whom we will call “Jennifer,” grew up with hippie parents who moved the family deeply off the grid. Homeschooled through youth, by her mid-30s Jennifer was living with her parents in a remote corner of Montana, without internet reception. It was only through her gig house-sitting that she was able to access the internet at all—which was how she met “Carl,” her first romantic partner, on an environmentally-themed dating website.

Soon, Carl began sending Jennifer thumb drives full of conspiracy material that she could consume on her home computer. Jennifer mainlined hours and hours of videos; down Carl’s rabbit hole she went, and by the time we met her, her idiosyncratic beliefs were legion. She believed the Earth was flat and that there was a nefarious agenda to mislead us about its true shape. She believed that Hitler was a great man and that the Holocaust didn’t happen. She believed the government, or the cabal controlling it, sprayed mind-weakening chemicals from airplanes. There was hardly a conspiracy theory we had encountered that Jennifer didn’t believe in. Her whole worldview had been reprogrammed, doubly so now that Carl had moved to Montana to be with her. She, Carl, and others in their remote area began hosting a weekly conspiracy meetup that doubled as something of a self-help group.

What we came to sense, meeting with Jennifer and people like her, was that there was no such thing as an innocuous conspiracy per se. Any theory could become dangerous or extreme, depending on what other theories it was caught up in. Jennifer’s flat earth belief was intimately tied up with the idea that a cabal of people—likely Jews, she had come to feel—were lying to her about the nature of the world. Given the right opportunity, she said, she would attack a representative of this cabal—in fact, she would consider such an act a form of “self-defense” given the “scale of the atrocity” this cabal was perpetrating on humankind. No longer did flat earth belief appear to us inherently innocuous. Belief in a flat earth, as discussed earlier, reflects how extreme a conspiratorial worldview has become. To believe in a flat earth, one must discount whole fields of expertise, such as physics and geography; to pull off a flat-earth conspiracy, the conspirators would also need to control all of the instruments and narratives that propagate the round-earth story, from schoolbooks to “captured” professors of astronomy.

We learned something else from meeting hardened theorists like Jennifer: that for people who have reached this stage of conspiracy belief, debunking isn’t the right strategy at all. The notion of debunking presumes a sort of rational, civil debate, where each side shares facts in a sporting fashion, and some victor emerges. But for people like Jennifer, the very notion of what was a “fact” had become subverted. Any mainstream source of information was now reflexively dismissed as lies; if The New York Times (and its happens-to-be-Jewish ownership) toed the party line on the earth being round, could it really be trusted?

Furthermore, coming away from our meeting with Jennifer, we felt that a fundamentally cognitive and rational intervention like debunking was likely to fail against someone for whom conspiracy belief served an emotional role: it helped her make emotional sense of a world in which she felt marginalized, disenfranchised, and alone.

The intervention needed to pull such a person away from conspiracy belief is likely multi-faceted, not to mention more intimate, personal, and personalized than social media would currently be able to offer at scale. Jennifer may well pull herself out from her rabbit hole someday, but it will likely take a perfect storm of personal and even societal factors before she is ready to do so.

The question of whether debunking can be scaled is moot for hardened conspiracy theorists like Jennifer: even if it could be, it would not work on them.

AT THE TOP OF THE RABBIT HOLE: THE BUDDING CONSPIRACY THEORIST

This isn’t to say, though, that debunking has no purpose. We also encountered success stories in the field, showing that with the right interventions, tech platforms can help prevent the spread of misinformation at scale. The key, we came to feel, was to make sure debunking was deployed at the right—early—moment in a budding conspiracist’s journey.

For an example of that, let’s talk about the case of Lois.

When we met her, Lois, who lives outside of San Diego, believed in so-called “chemtrails.” When airplanes fly at high altitude, their exhaust causes condensation in the air; these familiar streaks in the sky are called contrails. But proponents of the “chemtrails” conspiracy theory believe that in many cases, the lines in the sky aren’t just water condensation but rather a nefarious chemical, likely sprayed by the government. In some variants of the theory, it’s all an experiment in climate control. In others, the chemicals are poisons that conspirators use to subtly undermine the will of the population (along with fluoride in the water).

When our researcher met Lois at an Italian restaurant in a San Diego strip mall, she explained why she believed in chemtrails. “I’ve seen them!” she said. She explained that back in 2015, her brother, a rancher, had pointed them out to her. Her brother said the government must be spraying poisons to “control the masses.” That struck Lois (a retired, college-educated marketing professional) as a little far-fetched, but she went home and started doing internet searches related to chemtrails. She went to NASA’s website but couldn’t find anything debunking the theory. Instead, she eventually landed on a video of a number of pilots and other self-proclaimed experts speaking out at a conference against supposed chemtrails. Persuaded by this parade of seeming experts, she shared the video on Facebook. (For a sense of the theory’s reach on this platform: at one point a chemtrails-themed Facebook group had over 100,000 members.) Lois even wrote to her senator about chemtrails, but never received a response. By the time we met Lois in the fall of 2019, she was less focused on chemtrails, which had principally been her brother’s concern—though she still hoped an investigative journalist would someday expose the truth.

What Lois didn’t know was that in the intervening years, major tech platforms like Google had identified the chemtrails conspiracy theory and begun implementing policies that eventually led more fact-based, authoritative content to rise to the top of chemtrails search results. As of this writing in late 2020, for instance, if you conduct a YouTube search for “chemtrails,” the first videos that come up are debunking videos rather than conspiracy videos. YouTube has also inserted a box at the top of the search results linking to the Encyclopedia Britannica entry for “contrail”; this encyclopedia entry also debunks the chemtrails theory.
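To make the mechanics of such an intervention concrete, the sketch below shows, in minimal Python, one way a platform could boost authoritative sources for queries that match flagged conspiracy topics. It is purely illustrative: the topic list, domain allowlist, result type, and scoring are our own hypothetical stand-ins, not YouTube’s actual ranking logic.

```python
# A deliberately simplified sketch; not Google's or YouTube's actual system.
# When a query matches a flagged conspiracy topic, results from a small
# allowlist of authoritative sources are ranked first. All topics, domains,
# and scores below are hypothetical.

from dataclasses import dataclass

FLAGGED_TOPICS = {"chemtrails", "flat earth", "false flag"}        # hypothetical
AUTHORITATIVE_SOURCES = {"britannica.com", "nasa.gov", "cdc.gov"}  # hypothetical

@dataclass
class Result:
    title: str
    domain: str
    relevance: float  # base relevance score from an upstream ranking model

def rerank(query, results):
    """Rank authoritative sources first when the query matches a flagged topic."""
    if not any(topic in query.lower() for topic in FLAGGED_TOPICS):
        # Unflagged queries keep their ordinary relevance ordering.
        return sorted(results, key=lambda r: r.relevance, reverse=True)
    # Sorting on an (is_authoritative, relevance) tuple puts allowlisted
    # domains above all others; within each group, base relevance decides.
    return sorted(
        results,
        key=lambda r: (r.domain in AUTHORITATIVE_SOURCES, r.relevance),
        reverse=True,
    )

if __name__ == "__main__":
    results = [
        Result("SHOCKING proof THEY are spraying us", "conspiracyhub.example", 0.95),
        Result("Contrail | Encyclopedia Britannica", "britannica.com", 0.60),
        Result("What are contrails?", "nasa.gov", 0.55),
    ]
    for r in rerank("chemtrails evidence", results):
        print(f"{r.domain:25} {r.title}")
```

Whatever the real implementation, the design point is that the intervention operates on discoverability rather than removal: conspiratorial content remains reachable, but debunking content is what a curious searcher like Lois encounters first.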

This recent change in Google policy allowed for an experiment. Our researcher asked Lois to go home and, over the subsequent week, re-open her investigation into chemtrails. At the end of the week, we called Lois. The difference was remarkable. She said, “I found some new articles that debunked it. I’d have to say I’m now leaning towards not thinking chemtrails are real. I don’t think they’re spraying chemicals.” Is there even such a thing as “chemtrails,” as distinct from normal airplane contrails, we asked? “I’m leaning 80-90% no,” Lois concluded.

What Lois’s story demonstrates is that if technology platforms surface the right kinds of debunking content, they can reach people who haven’t yet become deeply attached to a conspiracy theory. In other words, through our ethnography we determined that it does seem possible for tech platforms to do what the debunker Mick West does, at scale.

What distinguished Lois from Jennifer is the relationship between what happens offline and online. A person who is heavily engaged in one sphere but not the other can step back from a conspiratorial worldview. In Jennifer’s case, her offline and online behaviors are melded together and influence each other. Her social engagements with other believers in the library revolve around online content that they dissect together as a group. Her relationship with her boyfriend is founded upon their shared belief in conspiracy theories. Her increasing isolation from her family and from mainstream sources such as Google stems from her belief in conspiracies: her parents can no longer relate to her, and Google cannot be trusted. Lois, on the other hand, considered chemtrails, but the rest of her life offline is not related to or motivated by a conspiratorial worldview. Her online research was brief and kept private between herself and her brother. Lois is not on a crusade, dedicated to finding a truth in which chemtrails are linked to other conspiracy theories.

This relationship between what happens on- and offline was only discoverable through ethnographic engagement with research participants. By visiting people where they live, we could observe economic changes in their towns and appreciate why conspiracy theories might explain why some companies are so powerful and rich while the local main street is shuttered. We could meet their families, see what it means to live off the grid, and witness the parts of their lives that were not tied to conspiracies: their jobs, market investments, and church involvement. By observing both what happens online and offline, we could also see what it meant to be a budding or “light” conspiracist versus a hardened, deeply entrenched and enmeshed one.

We had also demonstrated that treating some theories as harmful and others as not ultimately wasn’t the most fruitful way to look at conspiracy theories. More fruitful was to attend to the differences between types of conspiracy theorists themselves. Those who are newer to a conspiracy theory are the ones most likely to be reachable with facts, and therefore Alphabet’s efforts to counter misinformation will have the greatest impact the further upstream they are deployed. In other words, it’s important to catch people at the top of the rabbit hole, before they really fall down it.

CHANGING VIEWS ON CONSPIRACY THEORISTS

We delivered our findings in a set of presentations for stakeholders across Jigsaw, Google, and YouTube in December of 2019. For many of these stakeholders, it was the first time they had encountered in-depth qualitative data about the lived experience of conspiracy theorists. Of course, many employees at Alphabet are highly specialized computer scientists and businesspeople; to bring an ethnographic perspective humanizing this segment of their user base was eye-opening. The study gave teams at Alphabet new language and perspective into how conspiracies work that they would not have had from other approaches.

Not long after the completion of this study, COVID-19 hit. Those who had been briefed on our research into conspiracy theories and its relationship to harm soon had to make difficult and rapid decisions about what sorts of COVID-19 content would and wouldn’t be allowed on Google’s platforms.

By April, YouTube had pulled thousands of conspiracy and misinformation videos related to coronavirus from the platform. It began surfacing an informational panel that linked to national health agencies’ websites—like the CDC in the U.S. It also began aggressively enforcing medical misinformation policies around false COVID-19 cures, and it expanded that policy to bar promoting actions that go against recommendations from national health authorities. This expanded policy led YouTube to swiftly remove conspiratorial posts by Brazilian President Jair Bolsonaro, who had downplayed the virus. This decision was lauded by the business press.

Decision-making at an organization as large as Alphabet is diffuse, and it would be impossible to attribute these decisions to our ethnographic study alone. What we can say with confidence is that our study was a highly relevant and valued input that educated top decision makers about the harm of conspiracies at a crucial moment, as Alphabet faced a flood of COVID-19 conspiracies.

We can be more precise and confident of our impact at Jigsaw itself. Ethnographic research has been a core research stream at Jigsaw since its conception, but now its value is established and appreciated by the organization’s top leadership, who participated in some of the conspiracy theory ethnography themselves. In June and July of 2020, ReD Associates and Jigsaw teamed up again, revisiting about half of our former conspiracy theorists, as well as a cohort of new ones, to learn what conspiracy beliefs they had about the COVID-19 pandemic. This time, we brought senior stakeholders not only from Jigsaw but also from Google/YouTube’s own policy teams into the “field” (redefined as Zoom calls). These senior Google stakeholders reported to us their hope that actually meeting conspiracy theorists would humanize and make more visceral their own understanding of the population their policies would affect; “I hope to feel I understand these people better than I would just by reading an article,” one said. After their participation in fieldwork, these Alphabet policymakers confirmed to us that the interviews had achieved just that: humanizing an otherwise mysterious community.

APPLICABILITY TO OTHER STUDIES

Based on our research experience, we have identified several learnings that others could consider implementing.

First, given the sensitive nature of our topic, we adopted a multi-pronged approach to recruitment. We learned after engaging with recruitment agencies that it was too off-putting to recruit explicitly for conspiracy theorists and white nationalists. Instead, we identified a variety of proxies that could help us find potential research participants. For example, one of the questions in our recruitment screener asked about respondents’ news sources; we listed a mix of mainstream and conspiracy-specific publications. Once we identified people who consumed conspiracy theory content, we held an initial conversation to gauge their familiarity with the types of conspiracies we were recruiting for and how frequently conspiracy theories figured in their day-to-day lives.

We also conducted our own recruiting. We engaged with people on social media who used hashtags associated with conspiracy theories (e.g. #wwg1wga for QAnon supporters), and approached them for interviews once a connection was established. Because some believers in conspiracy theories are suspicious of others, we also relied upon our social networks to connect with friends of friends, or sometimes friends of friends of friends. These warm introductions were a shortcut to trust among the distrustful that would otherwise not have been possible on our timeframe.

Finally, throughout the project researchers worked closely with the client, especially during fieldwork. This is a common practice, but we draw attention to it because traveling to remote locations and conducting the interviews together meant that the client already had an understanding of the everyday lives of conspiracy theorists. It was helpful, too, to come to the realization together that our initial hypothesis was wrong: we no longer believed that one theory was more harmful than another. Rather, every type of conspiracy theory had the potential to become extreme, and once an extreme version was adopted, it indicated an entire way of seeing the world. We did not have to spend time convincing the client that our fieldwork had overturned this initial assumption and could instead focus on telling a story that humanized conspiracy theorists while expanding our sense of what made them harmful.

We’re optimistic that this more empathetic, holistic understanding of conspiracy theorists will be vital to decision makers as they wrestle with refinements to their policies for handling disinformation, misinformation, and conspiracy theories on some of the world’s largest tech platforms.

Rebekah Park was a Senior Manager at ReD Associates when this research was conducted. She holds a PhD in Anthropology from UCLA and currently works at Gemic. She also serves as a board member of the Association of Legal and Political Anthropology of the American Anthropological Association.

David Zax is a Senior Consultant at ReD Associates, where he has focused on technology clients for the past three years. Prior to ReD, David was a technology journalist contributing to Fast Company, Wired, The New York Times, and other publications.

Beth Goldberg is a Research Program Manager at Jigsaw, where she oversees research on violent extremism. Jigsaw is a unit within Google that builds technology to tackle global security challenges.

NOTES

Acknowledgements – We would like to thank the people we met with for sharing their stories and experiences with us so generously and kindly.

REFERENCES CITED

Bartlett, Jamie, and Carl Miller. 2010. The Power of Unreason: Conspiracy Theories, Extremism and Counter-terrorism. London: Demos.

Brotherton, Robert, Christopher French, and Alan Pickering. 2013. “Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale.” Frontiers in Psychology 4: 279.

Coaston, Jane. 2018. “Why Conspiracy Theories Matter.” Vox website, December 31. Accessed October 8, 2020. https://www.vox.com/2018/12/31/18144710/conspiracy-theories-trump-2018-qanon-soros-false-flags.

Dagnall, Neil, Kenneth Drinkwater, Andrew Parker, Andrew Denovan, and Megan Parton. 2015. “Conspiracy Theory and Cognitive Style: A Worldview.” Frontiers in Psychology 6: 206.

Douglas, Karen M., Robbie M. Sutton, and Aleksandra Cichocka. 2017. “The Psychology of Conspiracy Theories.” Current Directions in Psychological Science 26(6): 538-542.

Douglas, Karen M., Joseph E. Uscinski, Robbie M. Sutton, Aleksandra Cichocka, Turkay Nefes, Chee Siang Ang, and Farzin Deravi. 2019. “Understanding Conspiracy Theories.” Political Psychology 40(S1): 3-35.

Freeman, Daniel, and Richard Bentall. 2017. “The Concomitants of Conspiracy Concerns.” Social Psychiatry and Psychiatric Epidemiology 52(10): 595-604.

Graeupner, Damaris, and Alin Coman. 2017. “The Dark Side of Meaning-making: How Social Exclusion Leads to Superstitious Thinking.” Journal of Experimental Social Psychology 69: 218-222.

Grzesiak-Feldman, Monika. 2013. “The Effect of High-Anxiety Situations on Conspiracy Thinking.” Current Psychology 32: 100-118.

Hart, Joshua. 2018. “Profiling a Conspiracy Theorist: Why Some People Believe.” Live Science website, September 26. Accessed October 9, 2020. https://www.livescience.com/63658-why-people-believe-conspiracy-theories.html.

Herrema, Martin. 2019. “How Conspiracy Theories Affect Low-level Crime.” University of Kent News Centre website, February 26. Accessed October 9, 2020. https://www.kent.ac.uk/news/society/21303/belief-in-conspiracy-theories-makes-people-more-likely-to-engage-in-low-level-crime.

Imhoff, Roland, and Karoline Lamberty. 2017. “Too Special to be Duped: Need for Uniqueness Motivates Conspiracy Beliefs.” European Journal of Social Psychology 47(6): 724-734.

Jolley, Daniel, and Karen Douglas. 2014. “The Social Consequences of Conspiracism: Exposure to Conspiracy Theories Decreases the Intention to Engage in Politics and to Reduce One’s Carbon Footprint.” British Journal of Psychology 105(1): 35-56.

Kluger, Jeffrey. 2017. “Why So Many People Believe Conspiracy Theories.” Time website, October 15. Accessed October 9, 2020. https://time.com/4965093/conspiracy-theories-beliefs/.

McNamara, Audrey. 2019. “The Disturbing Link Between Conspiracy Theories and Petty Crime.” Daily Beast website, February 26. Accessed October 9, 2020. https://www.thedailybeast.com/the-disturbing-link-between-conspiracy-theories-and-petty-crime.

Petersen, Michael Bang, Mathias Osmundsen, and Kevin Arceneaux. 2018. “The ‘Need for Chaos’ and Motivations to Share Hostile Political Rumors.” PsyArXiv, September 1. https://doi.org/10.31234/osf.io/6m4ts.

Preston, Elizabeth. 2019. “The Psychology and Allure of Conspiracy Theories.” Undark website, February 27. Accessed October 9, 2020. https://undark.org/2019/02/27/the-psychology-and-allure-of-conspiracy-theories/.

Roose, Kevin. 2019. “YouTube Unleashed a Conspiracy Theory Boom. Can It Be Contained?” The New York Times website, February 19. Accessed October 9, 2020. https://www.nytimes.com/2019/02/19/technology/youtube-conspiracy-stars.html.

Stempel, Carl, Thomas Hargrove, and Guido Stempel III. 2007. “Media Use, Social Structure, and Belief in 9/11 Conspiracy Theories.” Journalism and Mass Communication Quarterly 84(2): 353-372.

van Prooijen, Jan-Willem, and Karen M. Douglas. 2018. “Belief in Conspiracy Theories: Basic Principles of an Emerging Research Domain.” European Journal of Social Psychology 48(7): 897-908.

van Prooijen, Jan-Willem, and Nils Jostmann. 2013. “Belief in Conspiracy Theories: The Influence of Uncertainty and Perceived Morality.” European Journal of Social Psychology 43(1): 109-115.

van Prooijen, Jan-Willem, and Mark van Vugt. 2018. “Conspiracy Theories: Evolved Functions and Psychological Mechanisms.” Perspectives on Psychological Science 13(6): 770-788.

Whitson, Jennifer, and Adam Galinsky. 2008. “Lacking Control Increases Illusory Pattern Perception.” Science 322(5898): 115-117.

Wood, Michael J. 2018. “Propagating and Debunking Conspiracy Theories on Twitter During the 2015–2016 Zika Virus Outbreak.” Cyberpsychology, Behavior, and Social Networking 21(8): 485-490.
