In “Mobilizing manufactured reality: How participatory disinformation shaped deep stories to catalyze action during the 2020 U.S. presidential election,” published April 16 in the Proceedings of the ACM on Human-Computer Interaction (CSCW), a group of CIP-affiliated co-authors, led by UW Information School doctoral student Stephen Prochaska, examined three cases of false or misleading information about voting processes in the 2020 U.S. presidential election. Drawing on unfounded allegations of ballot dumping in Sonoma County, California; claims made around “Sharpiegate” in Maricopa County, Arizona; and claims about Dominion Voting Systems machines in Antrim County, Michigan, they present a framework for understanding the interaction between participatory disinformation and informal and tactical mobilization. They find that audiences and influencers engaged in the collaborative manufacturing of a misleading version of reality, in which individual claims of election fraud both shaped and were shaped by a “deep story” of election fraud that led to actions such as protests, the filing of affidavits, and the amplification of alleged “evidence.”
The other co-authors with Prochaska are iSchool doctoral student Kayla Duskin, UW HCDE doctoral student Zarine Kharazian, former undergraduate research assistant Carly Minow, former graduate research assistant Stephanie Blucker, undergraduate research assistant Sylvie Venuto, iSchool associate professor Jevin West and HCDE associate professor Kate Starbird.
Figure 1 (above): A high-level view of how participatory disinformation mobilized support through a self-reinforcing process driven by informal mobilization (Figure 7 in the paper) and tactical mobilization (Figure 8 in the paper). The generation and propagation of participatory disinformation mediates the process by which events are interpreted, discussed online, and integrated into a manufactured version of reality that is both predicated on and supported by the disinformation. This version of reality constructs the lens, or frame, through which many members of online audiences interpret emergent events, and it shapes the process of informal mobilization. Once online audiences are invested in the version of reality constructed through participatory disinformation, they are primed for specific tactical mobilization organized by cultivators, influencers, and other members of the online audience. The results of tactical mobilization (e.g., events at protests or affidavits for lawsuits) are then amplified as new, seemingly emergent events that are in reality often motivated by the disinformation campaign. These provide additional fodder for continued informal mobilization and reinforcing “evidence” for the deep stories that motivated participation in the first place.
The co-authors combine resource mobilization theory with previous work on participatory disinformation campaigns and “deep stories” to show how false or misleading information functioned to mobilize online audiences before, during, and after election day. Resource mobilization theory attempts to explain the processes behind catalyzing action in support of social movements. It holds that moral resources — abstract, often religious or ideological resources — must combine with other, typically more tangible, resources for a social movement to successfully catalyze action. The framework presented by the co-authors suggests that participatory disinformation in the 2020 election supplied the moral resources necessary for action by manufacturing a (false) sense that an injustice was being perpetrated by Democrats committing widespread fraud across the country.
The co-authors used Arlie Hochschild’s concept of deep stories, building on a Trump-era adaptation of the concept by Francesca Polletta and Jessica Callahan. Deep stories play a significant role in both individual and collective identities, often connecting the two in narrative form. These stories are inherently difficult to define because of their personal element, but at their core they are the stories people tell themselves about who they are, what common struggles they and the in-groups they identify with face, and what values they and their in-groups hold. Deep stories are not grounded in facticity in the traditional sense; rather, they evoke experience and emotion to create a narrative that feels true, even if the underlying details are not literally true. Importantly, these stories have no beginning or end, and because they lack an explicit definition they are constantly evolving, often through a collaborative process between storytellers and audiences — a process that Prochaska and co-authors observed in data collected around the 2020 election.
The team’s analysis highlights how users on Twitter collaboratively constructed and amplified alleged evidence of fraud that was then used to facilitate action, both online and off. They find that mobilization depends on the selective amplification of false or misleading tweets by influencers, the framing around those claims, and the perceived credibility of their sources. These processes, the co-authors write, form “a self-reinforcing cycle where audiences collaborate in the construction of a misleading version of reality, which in turn leads to offline actions that are used to further reinforce a manufactured reality,” a process visualized above.
PHOTO AT TOP: A “Vote Here” sign in Des Moines, Iowa on Election Day 2020. Photo by Phil Roeder / Flickr via CC BY 2.0.