RESEARCH
The Center for an Informed Public is an interdisciplinary research initiative at the University of Washington dedicated to resisting strategic misinformation, promoting an informed society and strengthening democratic discourse.
FEATURED PROJECTS
Misinformation media literacy: Supporting libraries as hubs for misinformation education
In Fall 2023, CIP researchers began work on a three-year project, supported by $750,000 in funding from the Institute of Museum and Library Services, to create a comprehensive, nationwide information literacy program that aims to increase the capacity of library staff and community members to address and navigate problematic information in their local communities. CIP researchers have partnered with instructional designers at WebJunction, a free online learning platform for library staff that is part of OCLC Research. They will also collaborate with libraries across the U.S. to implement educational programs modeled on successful CIP-supported programs that have reached thousands of Washington state students, teachers, librarians and other educators in recent years.
***
Researching the changing role of public trust and misinformation in local communities in Whatcom County, Washington
CIP postdoctoral scholar Rachel Moran-Prestridge received funding from the CIP’s Innovation Fund for a project in Whatcom County, Washington that explored the changing role of public trust and the spread of misinformation. The CIP teamed up with nonprofit library organization OCLC and nonprofit, community-based newsroom Salish Current to explore how library professionals, educators and local journalists in Whatcom County build and sustain trust in their work. The research team held a series of in-person workshops on issues of trust and misinformation that brought together individuals from across local knowledge-building institutions to talk about the challenges of building trust in an era of heightened misinformation and political polarization.
- Learn more about this project
- July 2023 | In an opinion article in The Seattle Times, “Local information providers can cultivate curiosity to build community trust,” the CIP’s Rachel Moran-Prestridge shared insights from this research.
***
Misinformation Escape Room
Media and information literacy underpins the vast majority of educational programs aimed at helping individuals become critical information consumers and producers. Most models feature a linear pathway: define one’s information question, find sources through search and other strategies, evaluate those sources for credibility, and use the information to answer the question. With the rise of misinformation and the ways it flows through social media, this model is being questioned. Perhaps the most powerful critique is the model’s lack of attention to the emotional and psychological, or affective, dimensions of misinformation that make it so potent and pernicious.
The misinformation escape room project, co-led by CIP co-founder Chris Coward and CIP faculty member Jin Ha Lee, contributes to a growing number of game-based and experiential approaches to learning about developing resilience to misinformation. The project seeks to move beyond the rational and cognitivist approach to learning about misinformation by situating the problem as being connected to emotion and social interactions. The project’s first escape room, The Euphorigen Investigation — available at Loki’s Loop — draws on research on misinformation, mixed reality games, digital youth and media, and information literacy.
***
Sending the News Back Home: Analyzing the Spread of Misinformation Between Vietnam and Diasporic Communities in the 2020 Election
This project seeks to better understand how misinformation about the 2020 U.S. elections proliferated through social media and how it spread across Vietnamese diasporic communities in the U.S. and between the U.S. and Vietnam. The project, led by Center for an Informed Public postdoctoral fellow Rachel E. Moran and UW Information School PhD student Sarah Nguyễn and supported with funding from George Washington University’s Institute for Data, Democracy & Politics (IDDP), is informed by the work of organizations such as VietFactCheck, The Interpreter, and other community organizations working to provide fact-checking and media analysis in Vietnamese and English.
- JANUARY 2021 | Learn more about Sending News Back Home
- JUNE 2021 | “Early findings from explorations into the Vietnamese misinformation crisis” (English | Tiếng Việt)
***
COVID-19 Rapid-Response Research
Since the start of the COVID-19 pandemic, researchers at the Center for an Informed Public have been engaged in work to better understand how scientific knowledge, expertise, data and communication affect the spread and correction of online misinformation about an emerging pandemic. In May 2020, the National Science Foundation awarded approximately $200,000 in funding through the COVID-19 Rapid Response Research (RAPID) program for a project led by CIP principal investigators and co-founders Emma Spiro, an associate professor at the UW Information School, Kate Starbird, an associate professor in UW’s Department of Human Centered Design & Engineering, and Jevin West, an associate professor in the Information School. Their research looks at how a crisis situation like the COVID-19 pandemic can make the collective sensemaking process more vulnerable to misinformation.
In April 2020, the University of Washington’s Population Health Initiative awarded approximately $20,000 in COVID-19 rapid-response research funding for a proposal, also led by Spiro, Starbird and West, that seeks to understand online discourse during the ongoing pandemic and come up with strategies for improving collective action and sensemaking within science and society.
***
The ‘new elites’ of X: Identifying the most influential accounts engaged in Hamas/Israel discourse
RAPID RESEARCH REPORT | Through a novel data collection process, a team of CIP researchers identified highly influential accounts in Israel-Hamas war discourse on X, formerly known as Twitter, that comprise the platform’s most dominant English-language news sources for the event. In an October 20, 2023, CIP rapid research report focused on the first three days of the conflict, Mike Caulfield, Mert Can Bayar and Ashlyn B. Aske compare these accounts to traditional news sources and find that, on average, they have far fewer subscribers while achieving far greater views, rose to popularity more recently, and post more frequently. (A minimal sketch of this kind of comparison follows the links below.)
- Read the CIP’s October 20, 2023, rapid research report.
- The CIP’s report was cited by NBC News, The New York Times, The Washington Post, The Atlantic, WNYC Public Radio’s “On the Media” and Cyberscoop.
- In a November 7, 2023, article for Nature, “The new Twitter is changing rapidly — study it before it’s too late,” Mike Caulfield observes: “In my more than ten years in this field, I’ve never seen an almost entirely new set of accounts come to dominate a major platform in less than a year.”
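As a hedged sketch of the kind of comparison described above, and not the CIP team’s actual pipeline, the code below contrasts two groups of hypothetical accounts on subscribers, views and posting frequency; all handles and numbers are invented for illustration.

```python
# Illustrative comparison of hypothetical "new elite" vs. traditional news
# accounts. Only the comparison logic is meaningful; the data are made up.
from statistics import mean

accounts = [
    # (handle, group, subscribers, views_first_3_days, posts_first_3_days)
    ("example_osint_1", "new_elite",      120_000, 90_000_000, 310),
    ("example_osint_2", "new_elite",      450_000, 60_000_000, 220),
    ("example_wire_1",  "traditional", 14_000_000, 25_000_000,  80),
    ("example_wire_2",  "traditional",  9_000_000, 18_000_000,  95),
]

def summarize(group):
    rows = [a for a in accounts if a[1] == group]
    return {
        "mean_subscribers": mean(r[2] for r in rows),
        "mean_views": mean(r[3] for r in rows),
        "mean_posts": mean(r[4] for r in rows),
        "views_per_subscriber": mean(r[3] / r[2] for r in rows),
    }

for group in ("new_elite", "traditional"):
    print(group, summarize(group))
```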
***
YouTube search surfaces good information about the causes of the Lahaina wildfire on Maui, but external links reveal a different world
RAPID RESEARCH REPORT | In August 2023, CIP research scientist Mike Caulfield and director Kate Starbird co-authored a rapid research report examining a variety of “planned crisis event” narratives on X, formerly known as Twitter, that shared unfounded claims that the destructive and deadly Lahaina, Maui wildfire earlier that month was part of a secret plan. At the same time, they found that YouTube’s search interface surfaced reliable information about the wildfire’s causes, even when prompted with a search targeted at bringing up coverage of non-existent deliberate causes.
***
One Vax, Two Lives
The Center for an Informed Public has been proud to work in partnership with UW Medicine’s Department of OB-GYN and the UW Department of Communication’s Communication Leadership Master’s Program on a social media campaign to dispel misinformation while addressing fears some expecting families may have about COVID vaccines. “Misinformation about vaccines happens when there is a data void, an absence of information. We are trying to eliminate the data void about vaccines during pregnancy so that people everywhere can be truly informed about the COVID-19 vaccine,” said Kolina Koltai, a CIP postdoctoral fellow who contributed her expertise studying anti-vaccination and vaccine-hesitant communities online.
- Learn more about the One Vax, Two Lives campaign
- Watch a video about the One Vax, Two Lives campaign
***
Examining the Role of Public Libraries in Combating Misinformation
Given their role in curating information and offering spaces where community members can collectively explore social issues, libraries are uniquely positioned to be key players in combating misinformation, political polarization, and other threats to democracy, particularly at the community level. But, while libraries have this potential, few are recognized for effectively fulfilling this role. This research project, led by UW Information School senior research scientist and CIP research fellow Jason C. Young, seeks to understand why.
What are the experiences and challenges of librarians in their professional interactions helping people navigate problematic information? What types of issues do patrons raise? What assistance or programs have librarians provided? Have they been effective? Why or why not? What kinds of interventions could better support their efforts?
The aim of this project is to chart an applied research agenda for researchers and librarians to co-create new tools, resources, programs and services that strengthen the library’s role in addressing misinformation.
***
Election Integrity Partnership
In July 2020, the Center for an Informed Public joined with three other partners, Stanford Internet Observatory, Graphika and the Atlantic Council’s DFRLab, to form the Election Integrity Partnership, a nonpartisan dis- and misinformation research consortium, which worked to detect and mitigate the impact of attempts to prevent or deter people from voting in the 2020 U.S. elections or to delegitimize election results. From September to November 2020, a team of approximately 20 CIP-affiliated principal investigators, postdoctoral fellows, PhD students, graduate students and undergraduate research assistants worked as part of the Election Integrity Partnership’s monitoring and analysis teams, which tracked voting-related dis- and misinformation and contributed to numerous rapid-response research blog posts and policy analyses before, during and after election day. That work included research into the ways false claims involving mail- and ballot-dumping incidents helped shape social media narratives around the U.S. elections; a “What to Expect” analysis of the types of voting-related dis- and misinformation researchers anticipated seeing in the days leading up to Nov. 3, 2020, on election day and in the days and weeks that followed; and an analysis of domestic verified Twitter accounts that consistently amplified misinformation about the integrity of the U.S. elections. The Election Integrity Partnership’s final report, “The Long Fuse: Misinformation and the 2020 Election,” was released in March 2021. Craig Newmark Philanthropies and Omidyar Network contributed funding to support the CIP’s Election Integrity Partnership research contributions.
***
FEATURED PUBLICATIONS
For complete listings of peer-reviewed published research from CIP-affiliated faculty, research scientists, postdoctoral fellows and students, scroll down beyond the following highlighted publications.
Rumors have rules
In “Rumors have rules,” an article published in the Spring 2023 edition of Issues in Science and Technology (and excerpted in The Seattle Times), CIP co-founders Emma S. Spiro and Kate Starbird explore how decades-old research about how and why people share rumors is even more relevant today in a world with social media. In their article, Spiro and Starbird revisit the 1954 “Seattle windshield pitting epidemic,” an event where residents in Western Washington reported finding unexplained pits, holes and other damage in their car windshields, leading to wide speculation about the cause. It’s “a textbook example of how rumors propagate: a sort of contagion, spread through social networks, shifting how people perceive patterns and interpret anomalies.”
***
Followback clusters, satellite audiences, and bridge nodes: Coengagement networks for the 2020 U.S. election
In “Followback clusters, satellite audiences, and bridge nodes: Coengagement networks for the 2020 U.S. election,” a paper published in the Proceedings of the Seventeenth International AAAI Conference on Web and Social Media and presented in June 2023 at ICWSM in Cyprus, a team of CIP-affiliated researchers analyzing large social network datasets from the 2020 U.S. presidential election applies a method, coengagement transformations, to convert such networks of social data into tractable images. By creating and contrasting different networks at different parameter sets, the co-authors, using the 2020 elections as a case study, define and characterize several structures in this discourse network, including bridging accounts, satellite audiences, and followback communities, and discuss the importance and implications of these empirical network features in this context. The paper was co-authored by UW HCDE doctoral candidate Andrew Beers, HCDE doctoral student Joseph S. Schafer, UW Sociology doctoral graduate Ian Kennedy (now at Rice University), UW Political Science doctoral graduate Morgan Wack (now at Clemson University), CIP co-founder and Information School associate professor Emma S. Spiro and HCDE associate professor Kate Starbird.
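To make the idea of a coengagement network concrete, the following minimal sketch, which is not the authors’ code, links two accounts whenever at least a minimum number of distinct users retweeted both of them; the threshold, field layout and toy data are assumptions for illustration only.

```python
# Minimal coengagement sketch (illustrative assumptions, not the paper's
# pipeline): two accounts are connected if at least `min_shared` distinct
# users retweeted both of them.
from collections import defaultdict
from itertools import combinations

import networkx as nx

def coengagement_graph(retweets, min_shared=2):
    """retweets: iterable of (retweeter_id, original_author_id) pairs."""
    engaged_by = defaultdict(set)              # author -> set of retweeters
    for retweeter, author in retweets:
        engaged_by[author].add(retweeter)

    graph = nx.Graph()
    graph.add_nodes_from(engaged_by)
    for a, b in combinations(engaged_by, 2):   # every pair of authors
        shared = len(engaged_by[a] & engaged_by[b])
        if shared >= min_shared:
            graph.add_edge(a, b, weight=shared)
    return graph

# Toy usage: three users retweet overlapping sets of (hypothetical) accounts.
sample = [("u1", "acct_A"), ("u1", "acct_B"), ("u2", "acct_A"),
          ("u2", "acct_B"), ("u3", "acct_B"), ("u3", "acct_C")]
print(coengagement_graph(sample).edges(data=True))
# [('acct_A', 'acct_B', {'weight': 2})]
```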
***
Spotlight tweets: A lens for exploring attention dynamics within online sensemaking during crisis events
This paper, accepted for publication by ACM Transactions on Social Computing and written by Kaitlyn Zhou, Tom Wilson, Kate Starbird and Emma S. Spiro, introduces the concept of a spotlight social media post, a post that receives an unexpected burst of attention, and explores how such posts reveal salient aspects of online collective sensemaking and attention dynamics during a crisis event, specifically the online conversation surrounding a false missile alert in Hawaii in January 2018. Through a mixed-methods analysis and visualizations, their research uncovers mechanisms that lead to rapid attention gains, such as spotlighting, in which a user with existing influence confers attention by sharing others’ content with their audience. They also highlight how spotlight social media posts (specifically spotlight tweets) are distinct from other heavily shared content and offer insight into previously overlooked patterns in information exchange. (A minimal sketch of flagging such attention bursts follows the link below.)
- READ THE PAPER | “Spotlight tweets: A lens for exploring attention dynamics within online sensemaking during crisis events”
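To give a flavor of the underlying idea, and only that, since this is not the paper’s method, the sketch below flags a post as a spotlight candidate when its attention far exceeds what the same author’s recent posts typically receive. The z-score threshold and input format are assumptions.

```python
# Hypothetical spotlight-post check: is this post's attention an outlier
# relative to the author's own recent history? Thresholds are illustrative.
from statistics import mean, pstdev

def is_spotlight(post_retweets, recent_retweet_counts, z_threshold=3.0):
    """True if the post's retweet count is far above the author's baseline."""
    baseline = mean(recent_retweet_counts)
    spread = pstdev(recent_retweet_counts) or 1.0   # avoid divide-by-zero
    return (post_retweets - baseline) / spread >= z_threshold

# Toy usage: an account that usually gets a handful of retweets suddenly
# receives hundreds after an influential account reshares its post.
history = [2, 5, 3, 0, 4, 6, 1, 3]
print(is_spotlight(480, history))   # True  -> candidate spotlight post
print(is_spotlight(7, history))     # False -> ordinary post
```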
***
Community-based strategies for combating misinformation: Learning from a popular culture fandom
Through a study using virtual ethnography and semi-structured interviews of 34 Twitter users from the ARMY fandom, a global fan community supporting the Korean music group BTS, CIP faculty member Jin Ha Lee, an Information School professor, and co-authors Nicole Santero (University of Nevada Las Vegas), UW HCDE assistant teaching professor Arpita Bhattacharya, former UW iSchool MLIS student Emma May and CIP co-founder and iSchool associate professor Emma S. Spiro examine the effectiveness of fandom communities in reducing the impact and spread of misinformation within them. “ARMY fandom exemplifies community-based, grassroot efforts to sustainably combat misinformation and build collective resilience to misinformation at the community level, offering a model for others,” the researchers wrote in a September 2022 peer-reviewed essay in the Harvard Kennedy School Misinformation Review.
- READ THE ESSAY | “Community-based strategies for combating misinformation: Learning from a popular culture fandom”
***
Pathways through conspiracy: The evolution of conspiracy radicalization through engagement in online conspiracy discussions
The International Conference on Web and Social Media (ICWSM) awarded UW Information School doctoral candidate Shruti Phadke, iSchool assistant professor and CIP faculty member Tanu Mitra, and co-author Mattia Samory (GESIS) the best paper award at its 2022 conference in June. The paper, “Pathways through conspiracy: The evolution of conspiracy radicalization through engagement in online conspiracy discussions,” provides empirical modeling of various radicalization phases among online conspiracy theory discussion participants. Studying 36,000 Reddit users through 169 million contributions, the co-authors uncover four distinct pathways of conspiracy engagement. In announcing the best paper award, ICWSM conference organizers said: “This [paper] shines a light on an important yet under-attended aspect of the radicalization problem. In doing so, they lay the foundation for more work and attention on this aspect of an important and timely social issue.”
- READ THE PAPER | “Pathways through conspiracy: The evolution of conspiracy radicalization through engagement in online conspiracy discussions”
***
Recognize the bias? News media partisanship shapes the coverage of facial recognition technology in the United States
In a May 2022 article published by the journal New Media & Society, University of Amsterdam assistant professor Sonia Jawaid Shaikh and UW Center for an Informed Public postdoctoral fellow Rachel E. Moran examine the influence of news media partisanship on coverage of facial recognition, a controversial artificial intelligence technology. Using a mixed-methods content analysis of news articles from 23 U.S.-based news outlets, they highlight the emergence of several frames in coverage of facial recognition pertaining to issues of privacy and surveillance, bias, the technology’s ability to provide solutions, and its problematic development and implementation.
- READ THE ARTICLE | doi.org/10.1177/14614448221090916
***
Facing falsehoods: Strategies for polite misinformation correction
In March 2022, the International Journal of Communication published an article co-authored by UW Department of Communication doctoral candidate Pranav Malhotra and associate professor Katy Pearce, a CIP faculty member, exploring a gap in misinformation correction research to better understand the role of relational concerns, particularly adherence to politeness norms within relationships. Combining insights from the politeness literature with scholarship on misinformation correction strategies, and drawing on an interview study of Indian young adults, Malhotra and Pearce examine how these young adults make sense of their correction experiences with older relatives who share misinformation on WhatsApp.
- READ THE PAPER | ijoc.org/index.php/ijoc/article/view/18361
***
'Solidarity through cynicism? The influence of Russian conspiracy theories abroad'
In an April 2022 International Studies Quarterly article, CIP faculty member Scott Radnitz, a UW Jackson School of International Studies associate professor, writes that despite recent attention to the spread of propaganda abroad, scholars have not addressed whether and how conspiracy theories spread across borders. The study assesses this question in the post-Soviet region by examining the relationship between exposure to Russian state propaganda and belief in conspiracy theories in two countries that border the Russian Federation. Analysis of data from an original survey in Georgia and Kazakhstan indicates that exposure to Russian propaganda through television, social media, or websites has minimal effects on respondents’ endorsement of conspiracy theories. Respondents in Kazakhstan, and especially ethnic Russians, are likely to endorse pro-Russian conspiracy claims that are frequently propagated, owing to preexisting affinities. Yet the most consistent predictor of conspiracy beliefs is alienation from the political system, which occurs independent of foreign media consumption. The findings cast doubt on the ability of states to shape the attitudes of citizens abroad through the media and shine light on the domestic political factors underlying belief in conspiracy theories.
- READ THE PAPER | doi.org/10.1093/isq/sqac012
***
'Censorship-free platforms: Evaluating content moderation policies and practices of alternative social media'
In a January 2022 special issue on media and the far right of For(e)dialogue, a journal from the University of Leicester’s School of Media, Communication and Sociology, CIP research analyst Nicole Buckley, a UW School of Law student, and CIP undergraduate research assistant Joseph S. Schafer, a fourth-year student at the UW Paul G. Allen School of Computer Science & Engineering, explore and evaluate the policies of alternative social media platforms used by right-leaning influencers and their followers in comparison to mainstream platforms, and analyze how moderation policies interact with the ideological framework asserted at an alternative platform’s nascence.
- READ THE PAPER | doi.org/10.21428/e3990ae6.483f18da
***
Influence and improvisation: Participatory disinformation during the 2020 U.S. election
In “Influence and improvisation: Participatory disinformation during the 2020 U.S. election,” an article published in June 2023 in Social Media + Society’s Special Issue on Political Influencers, CIP director and HCDE associate professor Kate Starbird, with Stanford University co-authors Renée DiResta and Matt DeButts, examines efforts during the 2020 U.S. election to spread a false meta-narrative of widespread voter fraud as a domestic and participatory disinformation campaign in which a variety of influencers, including hyperpartisan media and political operatives, worked alongside ordinary people to produce and amplify misleading claims, often unwittingly. To better understand the nature of participatory disinformation, the co-authors examine three cases of misleading claims of voter fraud, applying an interpretive, mixed-method approach to the analysis of social media data. Contrary to a prevailing view of such campaigns as coordinated and/or elite-driven efforts, this work reveals a more hybrid form, demonstrating both top-down and bottom-up dynamics that are more akin to cultivation and improvisation.
***
Misinformation or activism?: analyzing networked moral panic through an exploration of #SaveTheChildren
In an October 2022 paper published in Information, Communication & Society, co-authors Rachel E. Moran, a CIP postdoctoral fellow, and Stephen Prochaska, a CIP graduate researcher and Information School doctoral student, examine the murky space of activism based in misinformation by studying the #SaveTheChildren movement that has been central to the QAnon conspiracy theory. “Emergent themes highlight the pervasive spread of misinformation regarding human trafficking and the ideological, political, and social motivations of posters. Drawing on shared reality theory and social identity theory, we argue that the movement represents a ‘networked moral panic’ and explore the structural limitations of digital social movements in an era of information disorder,” the authors state.
- READ THE PAPER | “Misinformation or activism?: analyzing networked moral panic through an exploration of #SaveTheChildren”
***
Auditing Google's search headlines as a potential gateway to misleading content: Evidence from the 2020 U.S. election
In a September 2022 paper published in the Journal of Online Trust and Safety, “Auditing Google’s Search Headlines as a Potential Gateway to Misleading Content: Evidence from the 2020 U.S. Election,” a team of CIP-affiliated researchers, co-led by Himanshu Zade and Morgan Wack with co-authors Yuanrui Zhang, Kate Starbird, Ryan Calo, Jason Young, and Jevin D. West, examine Google search results pages and find that the video vertical contained a disproportionate amount of trust-undermining content when compared to other SERP verticals (search results, stories, and advertisements). The researchers found that “video headlines served to be a notable pathway to content with the potential to undermine trust.”
- READ THE PAPER | “Auditing Google’s search headlines as a potential gateway to misleading content: Evidence from the 2020 U.S. Election”
- READ MORE | Tech Policy Press: “Video headlines served by Google a ‘notable pathway’ to content that may undermine trust in elections, say researchers”
***
Combining interventions to reduce the spread of viral misinformation
In “Combining interventions to reduce the spread of viral misinformation,” an article published in June 2022 in Nature Human Behaviour, a team of CIP researchers (CIP postdoctoral fellow Joe Bak-Coleman, UW Sociology doctoral graduate Ian Kennedy, UW Political Science doctoral student Morgan Wack, UW Human Centered Design & Engineering doctoral student Andrew Beers, CIP undergraduate research assistant Joseph S. Schafer, and CIP co-founders Emma S. Spiro, Kate Starbird, and Jevin D. West) “provide a framework to evaluate interventions aimed at reducing viral misinformation online, both in isolation and when used in combination,” according to the paper’s abstract. “We begin by deriving a generative model of viral misinformation spread, inspired by research on infectious disease. By applying this model to a large corpus (10.5 million tweets) of misinformation events that occurred during the 2020 U.S. election, we reveal that commonly proposed interventions are unlikely to be effective in isolation. However, our framework demonstrates that a combined approach can achieve a substantial reduction in the prevalence of misinformation. Our results highlight a practical path forward as misinformation online continues to threaten vaccination efforts, equity and democratic processes around the globe.” (A toy illustration of how interventions can combine follows the links below.)
- READ THE PAPER | “Combining interventions to reduce the spread of viral misinformation”
- NEWS COVERAGE | Grid News, CNET and Phys.org.
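The intuition that individually modest interventions can compound is easy to see in a toy calculation. The sketch below is an assumption-laden illustration, not the authors’ generative model: it treats spread as a simple branching process and lets each hypothetical intervention scale the reproduction number or the number of seed posts.

```python
# Toy branching-process illustration (not the paper's model). Each
# intervention multiplies the reproduction number r and/or the seed count;
# all numbers are invented for illustration.

def expected_cascade_size(r, seeds, generations=10):
    """Expected total posts when each post spawns r follow-on posts."""
    return seeds * sum(r ** g for g in range(generations + 1))

def apply_interventions(r, seeds, interventions):
    """interventions: list of (r_factor, seed_factor) pairs applied in order."""
    for r_factor, seed_factor in interventions:
        r *= r_factor
        seeds *= seed_factor
    return r, seeds

baseline = expected_cascade_size(r=1.2, seeds=100)
damping  = expected_cascade_size(*apply_interventions(1.2, 100, [(0.9, 1.0)]))
removal  = expected_cascade_size(*apply_interventions(1.2, 100, [(1.0, 0.7)]))
combined = expected_cascade_size(*apply_interventions(1.2, 100, [(0.9, 1.0), (1.0, 0.7)]))

for label, size in [("baseline", baseline), ("virality damping only", damping),
                    ("seed removal only", removal), ("combined", combined)]:
    print(f"{label:>21}: {size:,.0f} expected posts")
```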
***
Bridging contextual and methodological gaps on the 'misinformation Beat': Insights from journalist-researcher collaborations at speed
In an April 2022 paper presented at CHI ’22, the ACM CHI Conference on Human Factors in Computing Systems, UW Human Centered Design & Engineering doctoral student and CIP researcher Melinda McClure Haughey, UW Department of Communication student Martina Polovo, and CIP director and HCDE associate professor Kate Starbird, through an ethnographic study of thirty collaborations, including participant-observation and interviews with journalists and researchers, identify five types of collaborations and describe what motivates journalists to reach out to researchers — from a lack of access to data to support for understanding misinformation context. They highlight challenges within these collaborations, including misalignment in professional work practices, ethical guidelines, and reward structures. The co-authors end with a call to action for CHI researchers to attend to this intersection, develop ethical guidelines around supporting journalists with data at speed, and offer practical approaches for researchers filling a “data mediator” role between social media and journalists.
- READ THE ARTICLE | dl.acm.org/doi/10.1145/3491102.3517503
***
'Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism'
Researchers from the UW Center for an Informed Public (UW iSchool doctoral student Sarah Nguyễn and CIP postdoctoral fellow Rachel E. Moran) and the University of North Carolina studying how mis- and disinformation spreads in diasporic and transnational communities in the United States co-authored a March 2022 Harvard Kennedy School Misinformation Review commentary in which they use case studies from Vietnam, Taiwan, China, and India to discuss research themes and challenges, including legacies of multiple imperialisms, nationalisms, and geopolitical tensions as root causes of mis- and disinformation; difficulties in data collection due to private and closed information networks and language translation and interpretation; and transnational dimensions of information infrastructures and media platforms. The commentary introduces key concepts driven by methodological approaches to better study diasporic information networks beyond the dominance of Anglocentrism in existing mis- and disinformation studies.
- READ THE COMMENTARY | doi.org/10.37016/mr-2020-95
***
'Information literacy for mortals'
In a Project Information Literacy Provocation Series essay, “Information Literacy for Mortals,” where he elaborates on the SIFT method to contextualize claims and vet information sources, CIP research scientist Mike Caulfield discusses the task environment of civic information literacy, how to use rules of thumb to understand whether a source is credible or misleading, and how students apply this knowledge while consuming information online. “The fundamental idea driving them is that reputation of claims and sources is discoverable on the web, and discoverable relatively quickly. And what we find is that if students take steps to evaluate that reputation, even briefly, before engaging with content, they end up being served by the off-web skills they have developed quite well. Their intuitions improve dramatically,” Caulfield wrote.
- READ THE ESSAY | “Information literacy for mortals”
***
‘Dilemmas of distrust: Conspiracy beliefs, elite rhetoric, and motivated reasoning’
In October 2021, Political Research Quarterly published “Dilemmas of Distrust: Conspiracy Beliefs, Elite Rhetoric, and Motivated Reasoning,” a paper written by Scott Radnitz, a CIP faculty member and associate professor at the UW Jackson School of International Studies. “The results show that motivated reasoning stemming from state-level geopolitical identities is strongly associated with higher conspiracy belief, whereas official claims have little effect on people’s perceptions of conspiracy,” according to Radnitz’s abstract.
- READ THE PAPER | doi.org/10.1177/10659129211034558
***
'Addressing the root of vaccine hesitancy during the COVID-19 pandemic'
In an article published in the Winter 2021 edition of XRDS: Crossroads, The ACM Magazine for Students, CIP postdoctoral fellows Kolina Koltai and Rachel E. Moran and iSchool doctoral student Izzi Grasso write that the “rise of vaccine hesitancy worldwide cannot be only attributed to the pandemic itself. While certainly, prominent anti-vaccination leaders have maximized their messages with social media and the chaos of the pandemic amplified their narratives, widespread vaccine hesitancy was always something that was a possibility. This is because vaccine misinformation and vaccine hesitancy are rooted in larger socio-ecological issues in society and conspiratorial thinking. Becoming vaccine hesitant was always a possibility for a large portion of the population, and the pandemic and social media provided the opportunity to distribute an array of vaccine misinformation to the public. And as evident by the number of educators and healthcare professionals who refuse to vaccinate, educational attainment does not make you immune to vaccine misinformation.”
- READ THE PAPER | https://doi.org/10.1145/3495259
***
'Polls, clickbait, and commemorative $2 bills: Problematic political advertising on news and media websites around the 2020 U.S. elections'
In a 2021 paper for the Proceedings of the 21st ACM Internet Measurement Conference, UW Paul G. Allen School of Computer Science & Engineering doctoral students Eric Zeng and Miranda Wei, Allen School undergraduate research student Theo Gregersen, Allen School professor Tadayoshi Kohno, and CIP faculty member and Allen School associate professor Franziska Roesner collected and analyzed 1.4 million online ads on 745 news and media websites from six cities in the U.S. The findings reveal the widespread use of problematic tactics in political ads, the use of political controversy for clickbait, and the more frequent occurrence of political ads on highly partisan news websites. This research was funded by the National Science Foundation, the UW Center for an Informed Public and the John S. and James L. Knight Foundation.
- READ THE PAPER | https://dl.acm.org/doi/10.1145/3487552.3487850
***
'Misinformation in and about science'
When studying misinformation, the focus of attention is often on popular and social media. But in “Misinformation in and about science,” an April 2021 paper published in the Proceedings of the National Academy of Sciences, Center for an Informed Public faculty members Jevin D. West and Carl T. Bergstrom write that “[a]ppealing as it may be to view science as occupying a privileged epistemic position, scientific communication has fallen victim to the ill effects of an attention economy.” But Bergstrom, a UW Department of Biology professor, and West, a UW Information School associate professor, stress: “This is not to say that science is broken. Far from it. Science is the greatest of human inventions for understanding our world, and it functions remarkably well despite these challenges. Still, scientists compete for eyeballs just as journalists do.”
- 04.14.2021 | New paper explores ways scientific communication has ‘fallen victim to the ill effects of an attention economy’
- READ THE PAPER | https://doi.org/10.1073/pnas.1912444117
***
'Auditing e-commerce platforms for algorithmically curated vaccine misinformation'
At the 2021 ACM CHI Virtual Conference on Human Factors in Computing Systems in June, UW Information School PhD student Prerna Juneja presented findings from a recently published paper, “Auditing E-Commerce Platforms for Algorithmically Curated Vaccine Misinformation,“ which was awarded a CHI Best Paper Honorable Mention. The paper, written with iSchool assistant professor and Center for an Informed Public faculty member Tanu Mitra, details research that examined how vaccine misinformation has been amplified by algorithms used by Amazon and, through an auditing framework, found the e-commerce giant’s platform to be a “marketplace of multifaceted health misinformation,” as The Seattle Times wrote in a Jan. 28 article about their research.
- 05.25.2021 | By auditing algorithms, iSchool researchers work to better understand misinformation we see and consume online
- READ THE PAPER | https://dl.acm.org/doi/10.1145/3411764.3445250
***
'How do you solve a problem like misinformation?'
To understand the dynamics and complexities of misinformation, it’s vital to understand three key distinctions — misinformation vs. disinformation, speech vs. action and mistaken belief vs. conviction — the five co-founders of the University of Washington’s Center for an Informed Public wrote in an essay published in Science Advances. In “How do you solve a problem like misinformation?,” published Dec. 8, 2021, School of Law professor Ryan Calo, Information School senior principal research scientist Chris Coward, iSchool associate professor Emma S. Spiro, Human Centered Design & Engineering associate professor Kate Starbird and iSchool associate professor Jevin D. West write that failing to recognize these distinctions can “lead to unproductive dead ends,” while understanding them is “the first step toward recognizing misinformation and hopefully addressing it.”
- 12.08.2021 | CIP co-founders offer advice to misinformation researchers and policymakers in Science Advances
- READ THE ESSAY | DOI: 10.1126/sciadv.abn0481
***
'Mask narratives promoted by anti-vaccination accounts on Instagram prior to the COVID-19 pandemic'
In an article included in the Association of Internet Researchers Selected Papers of Internet Research 2021, CIP researchers Kolina Koltai, Iva Grohmann, Devin T. Johnson, Samantha Rondini, and Ella R. Foley seek to identify whether the anti-vaccination movement held prior beliefs about masks as a way to prevent the spread of respiratory diseases and whether those beliefs differ from its mask sentiment today. Through thematic analysis of 44 Instagram posts from before the onset of the COVID-19 pandemic, they find that online vaccine safety communities have, in the past, regarded mask-wearing as a viable alternative to vaccines. Notably, posts supported the efficacy of mask-wearing while criticizing mandates to wear masks in healthcare settings. In the paper, the co-authors elaborate on these mask narratives, as well as their implications for how the anti-vaccination movement shifted dramatically in mask sentiment during the pandemic.
- READ THE ESSAY | https://doi.org/10.5210/spir.v2021i0.12195
***
'Characterizing social imaginaries and self-disclosures of dissonance in online conspiracy discussion communities'
In a 2021 paper published in the Proceedings of the ACM on Human-Computer Interaction, UW iSchool doctoral student Shruti Phadke, Mattia Samory of the Leibniz Institute for the Social Sciences, and UW iSchool assistant professor and CIP faculty member Tanu Mitra offer a systematic framework for uncovering the dimensions and coded language related to QAnon social imaginaries that can serve as a toolbox for studying other conspiracy theories across different platforms. The co-authors contribute a computational framework for identifying dissonance self-disclosures and measuring the changes in user engagement surrounding dissonance. Their work provides insights into designing dissonance-based interventions that can potentially dissuade conspiracists from engaging in online conspiracy discussion communities.
- READ THE PAPER | https://dl.acm.org/doi/10.1145/3479855
***
‘Trust and authenticity as tools for journalism and partisan disinformation’
Despite popular perceptions that trust in news media is on the decline, trust may actually be ascendant. However, CIP postdoctoral fellow Rachel E. Moran argues in a July 2021 article for the Social Science Research Council‘s Items Disinformation Series, to understand trust in media today, one has to understand its relational nature. Relational trust is tied to hyper-individualized assessments of the authenticity of the journalist that give leverage to micro-celebrities and pseudo journalists on social media.
- READ THE ARTICLE | items.ssrc.org
***
'Stewardship of global collective behavior'
Our ability to confront global crises, from pandemics to climate change, depends on how we interact and share information. Social media and other communication technology restructure these interactions in ways that have consequences. Unfortunately, we have little insight into whether these changes will bring about a healthy, sustainable and equitable world. As a result, a multidisciplinary group of researchers say that the study of collective behavior must rise to a “crisis discipline,” just like medicine, conservation and climate science have done, according to a June 2021 paper in the Proceedings of the National Academy of Sciences. The paper’s first author is Center for an Informed Public postdoctoral fellow Joseph B. Bak-Coleman. Other CIP-affiliated co-authors include faculty member Carl T. Bergstrom and postdoctoral fellow Rachel E. Moran.
- 06.14.2021 | Communication technology, study of collective behavior must be ‘crisis discipline,’ researchers say
- READ THE PAPER | https://doi.org/10.1073/pnas.2025764118
***
2023
- Kate Starbird, Renee DiResta, and Matt DeButts. “Influence and Improvisation: Participatory Disinformation during the 2020 U.S. Election.” Social Media + Society’s Special Issue on Political Influencers. (2023) doi.org/10.1177/2056305123117794
- Yim Register, Lucy Qin, Amanda Baughan, and Emma S. Spiro. “Attached to ‘The Algorithm’: Making sense of algorithmic precarity on Instagram.” Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. (2023) doi.org/10.1145/3544548.3581257
- Carl T. Bergstrom and Jevin D. West. “How publishers can fight misinformation in and about science and medicine.” Nature Medicine. (2023) doi.org/10.1038/s41591-023-02411-7
- Joseph B. Bak-Coleman, Carl T. Bergstrom, Jennifer Jacquet, James Mickens, Zeynep Tufekci, and Timmons Roberts. “Create an IPCC-like body to harness benefits and combat harms of digital tech.” Nature. (2023) doi.org/10.1038/d41586-023-01606-9
- Stephen Prochaska, Kayla Duskin, Zarine Kharazian, Carly Minow, Stephanie Blucker, Sylvie Venuto, Jevin West, and Kate Starbird. “Mobilizing manufactured reality: How participatory disinformation shaped deep stories to catalyze action during the 2020 U.S. presidential election.” Proceedings of the ACM on Human-Computer Interaction, Vol. 7, Issue CSCW1 (2023) dl.acm.org/doi/abs/10.1145/3579616
- Rachel E. Moran, Sarah Nguyễn, and Linh Bui. “Sending news back home: Misinformation lost in transnational social networks.” Proceedings of the ACM on Human-Computer Interaction, Vol. 7, Issue CSCW1 (2023) dl.acm.org/doi/10.1145/3579521
- Tianjiao Yu, Sukrit Venkatagiri, Ismini Lourentzou, and Kurt Luther. “Sedition Hunters: A quantitative study of the crowdsourced investigation into the 2021 U.S. Capitol attack.” Proceedings of the ACM Web Conference 2023. doi.org/10.1145/3543507.3583514
- Rachel E. Moran, Stephen Prochaska, Izzi Grasso, and Isabelle Schlegel. “Navigating information-seeking in conspiratorial waters: Anti-trafficking advocacy and education post QAnon.” Proceedings of the ACM on Human-Computer Interaction, Vol. 7, Issue CSCW1 (2023) dl.acm.org/doi/10.1145/3579510
- Sarah Nguyễn, Rachel E. Moran, Trung-Anh Nguyen, and Linh Bui. “‘We never really talked about politics’: Race and ethnicity as foundational forces structuring information disorder within the Vietnamese Diaspora.” Political Communication. (2023) doi.org/10.1080/10584609.2023.2201940
- Lia Bozarth, Jane Im, Christopher Quarles, and Ceren Budak. “Wisdom of two crowds: Misinformation moderation on Reddit and how to improve this process: A case study of COVID-19.” Proceedings of the ACM on Human-Computer Interaction, Vol. 7, Issue CSCW1 (2023) doi.org/10.1145/3579631
- Kaitlyn Zhou, Tom Wilson, Kate Starbird and Emma S. Spiro. “Spotlight tweets: A lens for exploring attention dynamics within online sensemaking during crisis events.” ACM Transactions on Social Computing. (2023) dl.acm.org/doi/10.1145/3577213
2022
- Stephen Prochaska and Rachel E. Moran. “Misinformation or activism?: Analyzing networked moral panic through an exploration of #SaveTheChildren.” Information, Communication & Society. (2022) doi.org/10.1080/1369118X.2022.2146986
- Rachel E. Moran. “The so-called ‘crisis’ of trust in journalism.” The Routledge Companion to News and Journalism. Routledge. (2022)
- Katy E. Pearce and Jessica Vitak. “Performing honor online: The affordances of social media for surveillance and impression management in an honor culture.” New Media & Society. (2022) doi.org/10.1177/14614448156002
- Carl T. Bergstrom, Daniel R. Pimentel, and Jonathan Osborne. “To fight misinformation, we need to teach that science is dynamic.” Scientific American. (2022)
- Rachel E. Moran, Izzi Grasso, and Kolina Koltai. “Folk theories of avoiding content moderation: How vaccine-opposed influencers amplify vaccine opposition on Instagram.” Social Media + Society. (2022) doi.org/10.1177/20563051221144252
- Jin Ha Lee, Nicole Santero, Arpita Bhattacharya, Emma May, and Emma S. Spiro. “Community-based strategies for combating misinformation: Learning from a popular culture fandom.” Harvard Kennedy School (HKS) Misinformation Review. (2022) doi.org/10.37016/mr-2020-105
- Himanshu Zade, Morgan Wack, Yuanrui Zhang, Kate Starbird, Ryan Calo, Jason Young, and Jevin D. West. “Auditing Google’s search headlines as a potential gateway to misleading content: Evidence from the 2020 U.S. election.” Journal of Online Trust and Safety. (2022) doi.org/10.54501/jots.v1i4.72
- Eric Zeng, Rachel McAmis, Tadayoshi Kohno, Franziska Roesner. “What factors affect targeting and bids in online advertising?: a field measurement study.” Proceedings of the 22nd ACM Internet Measurement Conference. (2022) doi.org/10.1145/3517745.3561460
- Farnaz Jahanbakhsh, Amy X. Zhang, David R. Karger. “Leveraging structured trusted-peer assessments to combat misinformation.” Proceedings of the ACM on Human-Computer Interaction. (2022) doi.org/10.1145/3555637
- Shruti Phadke, Mattia Samory, and Tanu Mitra. “Pathways through conspiracy: The evolution of conspiracy radicalization through engagement in online conspiracy discussions.” Proceedings of the Sixteenth International AAAI Conference on Web and Social Media. (2022) ojs.aaai.org/index.php/ICWSM/article/download/19333/19105
- Galen Weld, Amy X. Zhang and Tim Althoff. “What makes online communities ‘better’? Measuring values, consensus, and conflict across thousands of subreddits.” Proceedings of the International AAAI Conference on Web and Social Media. ojs.aaai.org/index.php/ICWSM/article/view/19363
- Joseph B. Bak-Coleman, Ian Kennedy, Morgan Wack, Andrew Beers, Joseph S. Schafer, Emma S. Spiro, Kate Starbird, Jevin D. West. “Combining interventions to reduce the spread of viral misinformation.” Nature Human Behaviour. (2022) doi.org/10.1038/s41562-022-01388-6
- Joseph B. Bak-Coleman and Jevin D. West. “Modest interventions complement each other in reducing misinformation.” Nature Human Behaviour. (2022) doi.org/10.1038/s41562-022-01389-5
- Ian Kennedy, Morgan Wack, Andrew Beers, Joseph S. Schafer, Isabella Garcia-Camargo, Emma S. Spiro and Kate Starbird. “Repeat spreaders and election delegitimization: A comprehensive dataset of misinformation tweets from the 2020 U.S. election.” Journal of Quantitative Description: Digital Media. (2022) doi.org/10.51685/jqd.2022.013
- Julian D. Ford, Davide Marengo, Miranda Olff, Cherie Armour, Jon D. Elhai, Zack Almquist, and Emma S. Spiro. “Temporal trends in health worker social media communication during the COVID-19 pandemic.” Research in Nursing & Health. (2022) doi.org/10.1002/nur.22266
- Yim Register and Emma S. Spiro. “Developing self-advocacy skills through machine learning education: The case of ad recommendation on Facebook.” Proceedings of the International AAAI Conference on Web and Social Media. (2022) ojs.aaai.org/index.php/ICWSM/article/view/19337
- Sonia Jawaid Shaikh and Rachel E. Moran. “Recognize the bias? News media partisanship shapes the coverage of facial recognition technology in the United States.” New Media & Society. (2022) doi.org/10.1177/14614448221090916
- Rachel E. Moran and Sonia Jawaid Shaikh. “Robots in the news and newsrooms: Unpacking meta-journalistic discourse on the use of artificial intelligence in journalism.” Digital Journalism. (2022)
- Scott Radnitz. “Why democracy fuels conspiracy theories.” Journal of Democracy. (2022)
- Scott Radnitz. “Solidarity through cynicism? The influence of Russian conspiracy theories abroad.” International Studies Quarterly. (2022) doi.org/10.1093/isq/sqac012
- Sarah Nguyễn, Rachel Kuo, Madhavi Reddi, Lan Li, and Rachel E. Moran. “Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism.” Harvard Kennedy School Misinformation Review. (2022)
- Pranav Malhotra and Katy E. Pearce. “Facing falsehoods: Strategies for polite misinformation correction.” International Journal of Communication. (2022)
- Jason Portenoy, Marissa Radensky, Jevin D. West, Eric Horvitz, Daniel S. Weld and Tom Hope. “Bursting scientific filter bubbles: Boosting innovation via novel author discovery.” Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. (2022) dl.acm.org/doi/10.1145/3491102.3501905
- Zack Almquist, Zach Hsiao, Kolina Koltai and Emma S. Spiro. “Talking about the jab: Medical professionals’ expressions of vaccine hesitancy online.” Population Association of America Annual Meeting. (2022)
- Rachel E. Moran and Kolina Koltai. “How to research misinformation online.” Sage Research Methods. (2022) dx.doi.org/10.4135/9781529609226
- Katy E. Pearce and Pranav Malhotra. “Inaccuracies and Izzat: Channel Affordances for the Consideration of Face in Misinformation Correction.” Journal of Computer-Mediated Communication. (2022) doi.org/10.1093/jcmc/zmac004
- Joseph B. Bak-Coleman and Carl T. Bergstrom. “A high-speed scientific hive mind emerged from the COVID-19 pandemic.” Scientific American. (2022) 10.1038/scientificamerican0322-34
- Kolina Koltai, Rachel E. Moran, and Izzi Grasso. “Addressing the root of vaccine hesitancy during the COVID-19 pandemic.” XRDS: Crossroads, The ACM Magazine for Students. (2022) dl.acm.org/doi/10.1145/3495259
- Rachel E. Moran and Efrat Nechushtai. “Before reception: Trust in the news as infrastructure.” Journalism. (2022) doi.org/10.1177/14648849211048961
- Nicole Buckley and Joseph S. Schafer. “Censorship-free platforms: Evaluating content moderation policies and practices of alternative social media,” For(e)dialogue. (2022) doi.org/10.21428/e3990ae6.483f18da
- Jevin D. West and Carl T. Bergstrom. “An Introduction to Calling Bullshit: Learning to Think Outside the Black Box.” Numeracy: Advancing Education in Quantitative Literacy. (2022) doi.org/10.5038/1936-4660.15.1.1405
2021
- Ryan Calo, Chris Coward, Emma S. Spiro, Kate Starbird, and Jevin D. West. “How do you solve a problem like misinformation?,” Science Advances. (2021) 10.1126/sciadv.abn0481
- Mike Caulfield. “Information literacy for mortals.” Project Information Literacy Provocation Series. (2021)
- Scott Radnitz. “Dilemmas of distrust: Conspiracy beliefs, elite rhetoric, and motivated reasoning.” Political Research Quarterly. (2021) doi.org/10.1177/10659129211034558
- Rachel E. Moran, Stephen Prochaska, Isabelle Schlegel, and Emilia May Hughes. “Misinformation or activism: Mapping networked moral panic through an analysis of #SaveTheChildren.” Association of Internet Researchers, Selected Papers of Internet Research. (2021) https://journals.uic.edu/ojs/index.php/spir/article/view/12212
- Kolina Koltai, Iva Grohmann, Devin T. Johnson, Samantha Rondini, and Ella R. Foley. “Mask narratives promoted by anti-vaccination accounts on Instagram prior to the COVID-19 pandemic.” Association of Internet Researchers, Selected Papers of Internet Research. (2021) https://doi.org/10.5210/spir.v2021i0.12195
- Andrew Beers, Sarah Nguyễn, Emma S. Spiro, and Kate Starbird. “Rejecting science with science: Boundary-work in anti-mask Twitter reply threads during COVID-19.” Association of Internet Researchers, Selected Papers of Internet Research. (2021) https://doi.org/10.5210/spir.v2021i0.12143
- Tom Wilson and Kate Starbird. “Cross-platform information operations: Mobilizing narratives and building resilience through both ‘Big’ and ‘Alt’ tech.” Proceedings of the ACM on Human-Computer Interaction. (2021) https://doi.org/10.1145/3476086
- Joseph Bak-Coleman, Ian Kennedy, Morgan Wack, Andrew Beers, Joseph S. Schafer, Emma S. Spiro, Kate Starbird, and Jevin D. West. “Combining interventions to reduce the spread of viral misinformation.” SocArXiv (preprint). (2021) https://osf.io/preprints/socarxiv/4jtvm/
- Shruti Phadke, Mattia Samory, and Tanu Mitra. “Characterizing social imaginaries and self-disclosures of dissonance in online conspiracy discussion communities.” Proceedings of the ACM on Human-Computer Interaction. (2021) https://doi.org/10.1145/3479855
- Eric Zeng, Miranda Wei, Theo Gregersen, Tadayoshi Kohno, and Franziska Roesner. “Polls, clickbait, and commemorative $2 bills: Problematic political advertising on news and media websites around the 2020 U.S. elections.” Proceedings of the 21st ACM Internet Measurement Conference. (2021) https://dl.acm.org/doi/10.1145/3487552.3487850
- Joseph Bak-Coleman, Mark Alfano, Wolfram Barfuss, Carl T. Bergstrom, Miguel A. Centeno, Iain D. Couzin, Jonathan F. Donges, Mirta Galesic, Andrew S. Gersick, Jennifer Jacquet, Albert B. Kao, Rachel E. Moran, Thayer S. Patterson, Pawel Romanczuk, Daniel I. Rubenstein, Kaia J. Tombak, Jay J. Van Bavel, and Elke U. Weber. “Stewardship of global collective behavior.” Proceedings of the National Academy of Sciences. (2021)
- Joseph Bak-Coleman and Carl T. Bergstrom. “Reply to Cheong and Jones: The role of science in responding to collective behavioral threats.” Proceedings of the National Academy of Sciences. (2021) https://doi.org/10.1073/pnas.2114477118
- Jevin D. West and Carl T. Bergstrom. “Misinformation in and about science,” Proceedings of the National Academy of Sciences. (2021)
- Rachel E. Moran. “Trust and authenticity as tools for journalism and partisan disinformation,” Social Science Research Council “Beyond Disinformation” series. (2021)
- Jason C. Young, “Disinformation as the weaponization of cruel optimism: A critical intervention in misinformation studies,” Emotion, Space and Society. (2021) Volume 38 https://doi.org/10.1016/j.emospa.2020.100757.
- Rachel E. Moran and TJ Billard. “Imagining Resistance to Trump Through the Networked Branding of the National Park Service.” Popular Culture and Civic Imagination. (2021)
- Rachel E. Moran and Nikki Usher. “Objects of journalism, revised: Rethinking materiality in journalism studies through emotion, culture and ‘unexpected objects.’” Journalism. (2021)
2020
- Jason C. Young, Brandyn Boyd, Katia Yefimova, Stacey Wedlake and Chris Coward. “The role of libraries in misinformation programming: A research agenda.” Journal of Librarianship and Information Science. (2020) https://doi.org/10.1177/0961000620966650
- Melinda McClure Haughey, Meena Devii Muralikumar, Cameron A. Wood and Kate Starbird. Proceedings of the ACM on Human-Computer Interaction, Volume 4, Issue CSCW2 (2020), Article No. 133, pp. 1–22. https://doi.org/10.1145/3415204
- Rachel E. Moran. “Subscribing to transparency: Trust building within virtual newsrooms on Slack.” Journalism Practice. (2020) https://doi.org/10.1080/17512786.2020.1778507
- Carl T. Bergstrom, Jevin D. West. Calling Bullshit: The Art of Skepticism in a Data-Driven World, Penguin Random House (2020)
OTHER PUBLICATIONS
- Carl T. Bergstrom and Joseph B. Bak-Coleman. “Information gerrymandering in social networks skews collective decision-making.” Nature. (2019): 40-41. https://www.nature.com/articles/d41586-019-02562-z
- Kate Starbird, Ahmer Arif, and Tom Wilson. “Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations.” Proceedings of the ACM on Human-Computer Interaction, Vol. 3, Computer-Supported Cooperative Work (CSCW 2019), Article 127. https://doi.org/10.1145/3359229 http://faculty.washington.edu/kstarbi/StarbirdArifWilson_DisinformationasCollaborativeWork-CameraReady-Preprint.pdf
- Peter M. Krafft, Emma S. Spiro. “Keeping Rumors in Proportion: Managing Uncertainty in Rumor Systems,” Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (2019) https://dl.acm.org/doi/fullHtml/10.1145/3290605.3300876
- Coward, C., McClay, C., Garrido, M. (2018). Public libraries as platforms for civic engagement. Seattle: Technology & Social Change Group, University of Washington Information School.
- Ahmer Arif, Leo G. Stewart, and Kate Starbird. (2018). Acting the Part: Examining Information Operations within #BlackLivesMatter Discourse. Proceedings of the ACM on Human-Computer Interaction, Vol. 2, Computer-Supported Cooperative Work (CSCW 2018), Article 20. https://faculty.washington.edu/kstarbi/BLM-IRA-Camera-Ready.pdf
- Philip N. Howard, Samuel Woolley and Ryan Calo (2018) Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration, Journal of Information Technology & Politics, 15:2, 81-93, DOI: 10.1080/19331681.2018.1448735
- H. Piwowar, J. Priem, V. Larivière, J.P. Alperin, L. Matthias, B. Norlander, A. Farley, J.D. West, S. Haustein. (2018) “The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles.” PeerJ 6: e4375.
- Carl T. Bergstrom, and Jevin D. West. (2018) “Why scatter plots suggest causality, and what we can do about it.” arXiv preprint arXiv:1809.09328
- Madeline Lamo & Ryan Calo (2018) Regulating Bot Speech. UCLA Law Review. https://www.uclalawreview.org/regulating-bot-speech/
- L. Kim, J.H. Portenoy, J.D. West, Katherine W. Stovel. (2018). Scientific Journals Still Matter in the Era of Academic Search Engines and Preprint Archives. Journal of the American Society for Information Science & Technology. (in press)
- Kate Starbird, Ahmer Arif, Tom Wilson, Katherine Van Koevering, Katya Yefimova, and Daniel Scarnecchia. (2018). “Ecosystem or Echo-System? Exploring Content Sharing across Alternative Media Domains.” Presented at 12th International AAAI Conference on Web and Social Media (ICWSM 2018), Stanford, CA, (pp. 365-374). https://faculty.washington.edu/kstarbi/Starbird-et-al-ICWSM-2018-Echosystem-final.pdf
- Kate Starbird, D Dailey, O Mohamed, G Lee, ES Spiro. Engage Early, Correct More: How Journalists Participate in False Rumors Online during Crisis Events. CHI, 2018.
- P Krafft, K Zhou, I Edwards, K Starbird, ES Spiro. Centralized, Parallel, and Distributed Information Processing during Collective Sensemaking. CHI, 2017.
- J.D. West. (2017) How to fine-tune your BS meter. Seattle Times. Op-ed
- L. Kim, J.D. West, K. Stovel. (2017) Echo Chambers in Science? American Sociological Association (ASA) Annual Meeting, August 2017
- Kate Starbird. (2017). Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter. In 11th International AAAI Conference on Web and Social Media (ICWSM 2017), Montreal, Canada, (pp. 230-339). http://faculty.washington.edu/kstarbi/Alt_Narratives_ICWSM17-CameraReady.pdf
- Ahmer Arif, John Robinson, Stephanie Stanek, Elodie Fichet, Paul Townsend, Zena Worku and Kate Starbird. (2017). A Closer Look at the Self-Correcting Crowd: Examining Corrections in Online Rumors. Proceedings of the ACM 2017 Conference on Computer-Supported Cooperative Work & Social Computing (CSCW ’17), Portland, Oregon, (pp. 155-168). http://faculty.washington.edu/kstarbi/Arif_Starbird_CorrectiveBehavior_CSCW2017.pdf
- Kate Starbird, Emma Spiro, Isabelle Edwards, Kaitlyn Zhou, Jim Maddock and Sindhu Narasimhan. (2016). Could This Be True? I Think So! Expressed Uncertainty in Online Rumoring. Proceedings of the ACM 2016 Conference on Human Factors in Computing Systems (CHI 2016), San Jose, CA. (pp. 360-371). http://faculty.washington.edu/kstarbi/CHI2016_Uncertainty_Round2_FINAL-3.pdf
- A Arif, K Shanahan, R Chou, S Dosouto, K Starbird, ES Spiro. How Information Snowballs: Exploring the Role of Exposure in Online Rumor Propagation. CSCW, 2016.
- L Zeng, K Starbird, ES Spiro. #Unconfirmed: Classifying Rumor Stance in Crisis-Related Social Media Messages. ICWSM, 2016.
- M. Rosvall, A.V. Esquivel, A. Lancichinetti, J.D. West, R. Lambiotte. (2014) Memory in network flows and its effects on spreading dynamics and community detection. Nature Communications. 5:4630, doi:10.1038/ncomms5630
- Kate Starbird, Jim Maddock, Mania Orand, Peg Achterman, and Robert M. Mason. (2014). Rumors, False Flags, and Digital Vigilantes: Misinformation on Twitter after the 2013 Boston Marathon Bombings. Short paper. iConference 2014, Berlin, Germany, (9 pages). http://faculty.washington.edu/kstarbi/Starbird_iConference2014-final.pdf
- J.D. West, T.C. Bergstrom, C.T. Bergstrom. (2014). Cost-effectiveness of open access publications. Economic Inquiry. 52: 1315-1321. doi: 10.1111/ecin.12117
- ES Spiro, J Sutton, S Fitzhugh, M Greczek, N Pierski, CT Butts. Rumoring During Extreme Events: A Case Study of Deepwater Horizon 2010. WebSci, 2012.
- J.D. West, C.T Bergstrom (2011) Can Ignorance Promote Democracy? Science. 334(6062):1503-1504. doi:10.1126/science.1216124
OTHER PROJECTS
Tracking and Unpacking Rumor Permutations to Understand Collective Sensemaking Online
Emma S. Spiro (PI), Kate Starbird (Co-PI)
This research addresses empirical and conceptual questions about online rumoring, asking: (1) How do online rumors permute, branch, and otherwise evolve over the course of their lifetime? (2) How can theories of rumor spread in offline settings be extended to online interaction, and what factors (technological and behavioral) influence these dynamics, perhaps making online settings distinct environments for information flow? The dynamics of information flow are particularly salient in the context of crisis response, where social media have become an integral part of both the formal and informal communication infrastructure. Improved understanding of online rumoring could inform communication and information-gathering strategies for crisis responders, journalists, and citizens affected by disasters, leading to innovative solutions for detecting, tracking, and responding to the spread of misinformation and malicious rumors. This project has the potential to fundamentally transform both methods and theories for studying collective behavior online during disasters. Techniques developed for tracking rumors as they evolve and spread over social media will aid other researchers in addressing similar problems in other contexts. Learn more
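The mechanics of tracking rumor permutations can be illustrated with a minimal, hypothetical sketch: grouping near-duplicate tweet texts into variants by string similarity. The threshold, normalization rules, and sample tweets below are illustrative assumptions, not the project's actual pipeline.

```python
# Hypothetical sketch: group tweet texts into rumor "permutations" by
# textual similarity. Threshold and sample data are illustrative only.
import difflib
import re

def normalize(text: str) -> str:
    """Lowercase, strip URLs and mentions, collapse whitespace."""
    text = re.sub(r"https?://\S+|@\w+", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def group_permutations(tweets, threshold=0.8):
    """Assign each tweet to the first existing variant it closely matches."""
    variants = []  # list of (representative_text, [member_tweets])
    for raw in tweets:
        text = normalize(raw)
        for rep, members in variants:
            if difflib.SequenceMatcher(None, rep, text).ratio() >= threshold:
                members.append(raw)
                break
        else:
            variants.append((text, [raw]))
    return variants

tweets = [
    "BREAKING: bridge closed after explosion https://t.co/x",
    "breaking - Bridge CLOSED after explosion!",
    "Officials deny any explosion near the bridge",
]
for rep, members in group_permutations(tweets):
    print(len(members), "|", rep)
```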
Understanding Online Audiences for Police on Social Media
Emma S. Spiro (PI)
A number of high-profile incidents have highlighted tensions between citizens and police, bringing issues of police-citizen trust and community policing to the forefront of the public’s attention. Efforts to mediate this tension emphasize the importance of promoting interaction and developing social relationships between citizens and police. This strategy – a critical component of community policing – may be employed in a variety of settings, including social media. While the use of social media as a community policing tool has gained attention from precincts and law enforcement oversight bodies, the ways in which police are expected to use social media to meet these goals remains an open question. This study seeks to explore how police are currently using social media as a community policing tool. It focuses on Twitter – a functionally flexible social media space – and considers whether and how law enforcement agencies are co-negotiating norms of engagement within this space, as well as how the public responds to the behavior of police accounts. Learn more
Mass Convergence of Attention During Crisis Events
Emma S. Spiro (PI)
When crises occur, including natural disasters, mass casualty events, and political/social protests, we observe drastic changes in social behavior. Local citizens, emergency responders and aid organizations flock to the physical location of the event. Global onlookers turn to communication and information exchange platforms to seek and disseminate event-related content. This social convergence behavior, long known to occur in offline settings in the wake of crisis events, is now mirrored – perhaps enhanced – in online settings. This project looks specifically at the mass convergence of public attention onto emergency responders during crisis events. Viewed through the framework of social network analysis, convergence of attention onto individual actors can be conceptualized in terms of network dynamics. This project employs a longitudinal study of social network structures in a prominent online social media platform to characterize instances of social convergence behavior and subsequent decay of social ties over time, across different actor types and event types. Learn more
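As a rough illustration of how convergence of attention can be read off network dynamics, the sketch below counts newly formed inbound ties (here, first-time mentions of a responder account) per day; a sharp jump marks the onset of convergence. The event tuples and account names are invented for illustration.

```python
# Illustrative sketch: attention convergence measured as new inbound ties
# (first-time mentions of a responder account) per day. Data are invented.
from collections import defaultdict
from datetime import date

# (source_user, target_account, day) mention events
events = [
    ("alice", "@fire_dept", date(2024, 6, 1)),
    ("bob",   "@fire_dept", date(2024, 6, 2)),
    ("carol", "@fire_dept", date(2024, 6, 2)),
    ("bob",   "@fire_dept", date(2024, 6, 3)),  # repeat tie, not new
]

seen = set()                       # ties already observed
new_ties_per_day = defaultdict(int)
for src, tgt, day in sorted(events, key=lambda e: e[2]):
    if (src, tgt) not in seen:
        seen.add((src, tgt))
        new_ties_per_day[day] += 1

for day, count in sorted(new_ties_per_day.items()):
    print(day, count)              # a sharp rise marks convergence onset
```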
Social Interaction and Peer Influence in Activity-Based Online Communities
Emma Spiro (Co-PI), Zack Almquist (Co-PI)
Individuals are influenced by their social networks. People adjust not only their opinions and attitudes, but also their behaviors based on both direct and indirect interaction with peers. Questions about social influence are particularly salient for activity-based behaviors; indeed much attention has been paid to promoting healthy habits through social interaction in online communities. A particularly interesting implication of peer influence in these settings is the potential for network-based interventions that utilize network processes to promote or contain certain behaviors or actions in a population; however, the first step toward designing such intervention strategies is to understand how, when, and to what extent social signals delivered via social interaction influence behavior. This project fills this gap by using digital traces of behaviors in online platforms to observe and understand how social networks and interactions are associated with behavior and behavior change. Learn more.
Detecting Misinformation Flows in Social Media Spaces During Crisis Events
Kate Starbird (PI), Emma Spiro (Co-PI), Robert Mason (Co-PI)
This research seeks both to understand the patterns and mechanisms of the diffusion of misinformation on social media and to develop algorithms to automatically detect misinformation as events unfold. During natural disasters and other hazard events, individuals increasingly utilize social media to disseminate, search for and curate event-related information. Eyewitness accounts of event impacts can now be shared by those on the scene in a matter of seconds. There is great potential for this information to be used by affected communities and emergency responders to enhance situational awareness and improve decision-making, facilitating response activities and potentially saving lives. Yet several challenges remain; one is the generation and propagation of misinformation. Indeed, during recent disaster events, including Hurricane Sandy and the Boston Marathon bombings, the spread of misinformation via social media was noted as a significant problem; evidence suggests it spread both within and across social media sites as well as into the broader information space.
Taking a novel and transformative approach, this project aims to utilize the collective intelligence of the crowd – the crowdwork of some social media users who challenge and correct questionable information – to distinguish misinformation and aid in its detection. It will both characterize the dynamics of misinformation flow online during crisis events, and develop a machine learning strategy for automatically identifying misinformation by leveraging the collective intelligence of the crowd. The project focuses on identifying distinctive behavioral patterns of social media users in both spreading and challenging or correcting misinformation. It incorporates qualitative and quantitative methods, including manual and machine-based content analysis, to look comprehensively at the spread of misinformation. Learn more
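A minimal sketch of the general idea, not the project's actual model: train a classifier on crowd-correction signals so that rumors drawing heavy denial or correction are flagged as likely misinformation. The features, numbers, and labels below are fabricated for illustration and assume scikit-learn is available.

```python
# Toy sketch of a crowd-signal classifier; not the project's actual method.
from sklearn.linear_model import LogisticRegression

# Features per rumor: [share of replies expressing denial/correction,
#                      share of reshares later deleted]
X = [[0.05, 0.01], [0.40, 0.30], [0.10, 0.05], [0.55, 0.45]]
y = [0, 1, 0, 1]   # 1 = later confirmed false (fabricated labels)

model = LogisticRegression().fit(X, y)
print(model.predict([[0.50, 0.20]]))   # high correction share -> likely false
```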
Hazards, Emergency Response and Online Informal Communication
Emma S. Spiro (PI)
Project HEROIC is a collaborative, NSF-funded effort with researchers at the University of Kentucky and the University of California, Irvine, that strives to better understand the dynamics of informal online communication in response to extreme events.
The nearly continuous, informal exchange of information — including such mundane activities as gossip, rumor, and casual conversation — is a characteristic human behavior, found across societies and throughout recorded history. While often taken for granted, these natural patterns of information exchange become an important “soft infrastructure” for decentralized resource mobilization and response during emergencies and other extreme events. Indeed, despite being historically limited by the constraints of physical proximity, small numbers of available contacts, and the frailties of human memory, informal communication channels are often the primary means by which time-sensitive hazard information first reaches members of the public. This capacity of informal communication has been further transformed by the widespread adoption of mobile devices (such as “smart-phones”) and social media technologies (e.g., microblogging services such as Twitter), which allow individuals to reach much larger numbers of contacts over greater distances than was possible in previous eras.
Although the potential to exploit this capacity for emergency warnings, alerts, and response is increasingly recognized by practitioners, much remains to be learned about the dynamics of informal online communication in emergencies — and, in particular, about the ways in which existing streams of information are modified by the introduction of emergency information from both official and unofficial sources. Our research addresses this gap, employing a longitudinal, multi-hazard, multi-event study of online communication to model the dynamics of informal information exchange in and immediately following emergency situations. Learn more
Eigenfactor Project
Jevin D. West, Carl Bergstrom
The aim of the Eigenfactor Project is to develop methods, algorithms, and visualizations for mapping the structure of science. We use these maps to identify (1) disciplines and emerging areas of science, (2) key authors, papers, and venues, and (3) communication patterns such as differences in gender bias. We also use these maps to study scholarly publishing models and build recommendation engines and search interfaces for improving how scholars access and navigate the literature. Learn more
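To give a flavor of the kind of method involved, here is a rough sketch of an Eigenfactor-style calculation: a PageRank-like random walk over a weighted journal citation graph with self-citations removed. It uses networkx's generic pagerank rather than the actual Eigenfactor implementation, and the journals and citation counts are invented.

```python
# Rough sketch of Eigenfactor-style scoring via PageRank on a journal
# citation graph; illustrative only, with invented journals and weights.
import networkx as nx

G = nx.DiGraph()
G.add_weighted_edges_from([
    ("J. Ecology", "J. Stats", 30),
    ("J. Stats", "J. Ecology", 10),
    ("J. Physics", "J. Stats", 50),
    ("J. Ecology", "J. Ecology", 5),   # self-citation, removed below
])
G.remove_edges_from(nx.selfloop_edges(G))

scores = nx.pagerank(G, alpha=0.85, weight="weight")
for journal, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{journal}: {score:.3f}")
```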
Open Access Project
Jevin D. West, Carl Bergstrom
The open access movement has made great strides. There has been a significant increase in Open Access journals over the last ten years, and many large foundations now require OA. Unfortunately, during the same time, there has been a significant increase in exploitative, predatory publishers, which charge authors to publish with little or no peer review, editorial services or authentic certification. We are developing a cost-effectiveness tool that will create an open journal market of prices and influence scores where these kinds of journals can be objectively identified [2]. Learn more
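A toy sketch of the comparison such a tool might surface: dollars charged per unit of journal influence, under invented prices and influence scores. A journal whose ratio is far out of line with its peers would stand out for closer scrutiny.

```python
# Toy cost-effectiveness comparison; all figures are invented for
# illustration and do not reflect the real tool's data or scoring.
journals = {
    # name: (article processing charge in USD, influence score)
    "OA Journal A": (1500, 1.2),
    "OA Journal B": (2900, 0.4),
    "OA Journal C": (800,  0.9),
}

for name, (apc, influence) in journals.items():
    print(f"{name}: ${apc / influence:,.0f} per unit of influence")
```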
Cross-platform analysis of the information operation targeting the White Helmets in Syria
Kate Starbird (PI)
Ph.D. Student Lead: Tom Wilson
While there is increasing awareness and research about online information operations—efforts by state and non-state actors to manipulate public opinion through methods such as the coordinated dissemination of disinformation, amplification of specific accounts or messages, and targeted messaging by agents who impersonate online political activists—there is still a lot we do not understand, including how these operations take place across social media platforms. In this research we are investigating cross-platform collaboration and content amplification that can work to propagate disinformation, specifically as part of an influence campaign targeting the White Helmets, a volunteer response organization that operates in Syria.
Through a mixed-method approach that uses ‘seed’ data from Twitter, we have examined the role of YouTube in the influence campaign against the White Helmets. Preliminary findings suggest that on Twitter the accounts working to discredit and undermine the White Helmets are far more active and prolific than those that support the group. Furthermore, this cluster of anti-WH accounts on Twitter uses YouTube as a resource—leveraging the specific affordances of the different platforms in complementary ways to achieve their objectives. Learn more
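One concrete step in this kind of cross-platform analysis can be sketched as follows: extracting YouTube video identifiers from tweet text so that their amplification can be tallied per account cluster. The regex, field names, and sample tweets are assumptions for illustration, not the study's actual data or code.

```python
# Illustrative sketch: pull YouTube video IDs out of tweet text and count
# how often each cluster of accounts shares them. Sample data are invented.
import re
from collections import Counter

YOUTUBE_ID = re.compile(r"(?:youtube\.com/watch\?v=|youtu\.be/)([\w-]{11})")

tweets = [
    {"cluster": "anti-WH", "text": "see this https://youtu.be/abcdefghijk"},
    {"cluster": "anti-WH", "text": "watch https://www.youtube.com/watch?v=abcdefghijk"},
    {"cluster": "pro-WH",  "text": "rescue footage https://youtu.be/zyxwvutsrqp"},
]

counts = Counter()
for t in tweets:
    for vid in YOUTUBE_ID.findall(t["text"]):
        counts[(t["cluster"], vid)] += 1

for (cluster, vid), n in counts.most_common():
    print(cluster, vid, n)
```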
Using Facebook engagements to assess how information operations micro-target online audiences using “alternative” new media
Kate Starbird (PI)
Ph.D. Student Lead: Tom Wilson
There is a pressing need to understand how social media platforms are being leveraged to conduct information operations. In addition to being deployed to influence democratic processes, information operations are also utilized to complement kinetic warfare on a digital battlefield. In prior work we have developed a deep understanding of the Twitter-based information operations conducted in the context of the civil war in Syria. Extending upon this work, this project examines how Facebook is leveraged within information operations, and how a subsection of the “alternative” media ecosystem is integrated into those operations. We aim to understand the structure and dynamics of the media ecosystem that is utilized by information operations to manipulate public opinion, and more about the audiences that engage with related content from these domains. Our research will provide insight into how Facebook features into a persistent, multi-platform information operations campaign. Complementing previous research, it will provide insight into how a subsection of the alternative media ecosystem is leveraged by political entities to micro-target participants in specific online communities with strategic messaging and disinformation.
Unraveling Online Disinformation Trajectories: Applying and Translating a Mixed-Method Approach to Identify, Understand and Communicate Information Provenance
Kate Starbird (PI)
This project will improve our understanding of the spread of disinformation in online environments. It will contribute to the field of human-computer interaction in the areas of social computing, crisis informatics, and human-centered data science. Conceptually, it explores relationships between technology, structure, and human action – applying the lens of structuration theory towards understanding how technological affordances shape online action, how online actions shape the underlying structure of the information space, and how those integrated structures shape information trajectories. Methodologically, it enables further development, articulation and evaluation of an iterative, mixed-method approach for interpretative analysis of “big” social data. Finally, it aims to leverage these empirical, conceptual and methodological contributions towards the development of innovative solutions for tracking disinformation trajectories.
The online spread of disinformation is a societal problem at the intersection of online systems and human behavior. This research program aims to enhance our understanding of how and why disinformation spreads and to develop tools and methods that people, including humanitarian responders and everyday analysts, can use to detect, understand, and communicate its spread. The research has three specific, interrelated objectives: (1) to better understand the generation, evolution, and propagation of disinformation; (2) to extend, support, and articulate an evolving methodological approach for analyzing “big” social media data for use in identifying and communicating “information provenance” related to disinformation flows; (3) to adapt and transfer the tools and methods of this approach for use by diverse users for identification of disinformation and communication of its origins and trajectories. More broadly, it will contribute to the advancement of science through enhanced understandings and conceptualization of the relationships between technological affordances, social network structure, human behavior, and intentional strategies of deception. The program includes an education plan that supports PhD student training and recruits diverse undergraduate students into research through multiple mechanisms, including for-credit research groups and an academic bridge program. Learn more
Community Labs in Public Libraries
Rolf Hapel, Chris Coward, Chris Jowaisas, Jason Young
Given their historic role of curating information, libraries have the potential to be key players in combating misinformation, political bias, and other threats to democracy. This research project seeks to support public libraries in their efforts to address misinformation by developing library-based community labs: spaces where patrons can collectively explore pressing social issues. Community labs have become popular across many European countries, where they are used to instigate democratic debate, provide discussion spaces and programming, alleviate community tensions, and promote citizenship and mutual understanding. This project aims to adapt a community lab model for use by public libraries in Washington. Learn more