Explore news coverage from June 2022 featuring the Center for an Informed Public and CIP-affiliated research and researchers.
- Open Mind (June 2): “How science fuels a culture of misinformation”
In an Open Mind magazine article, which was republished by Nieman Lab, writer Noelle Renstrom interviews CIP co-founder and iSchool associate professor Jevin West about misinformation in and about science. Renstrom, who also highlights insights from UW Biology professor and CIP faculty member Carl Bergstrom, writes: “Most readers, including journalists, can’t discern the quality of the science. Yet it’s ‘taken forever for the publishing community to provide banners on the original papers’ to signal they ‘might not reach the conclusion readers think,’ West says. Tentative or unsubstantiated claims can have profound social impacts.”
***
- Agence France Presse (June 3): “Baseless ‘false flag’ claims circulate after Texas school shooting”
An AFP factcheck of conspiracy allegations emerging from the Uvalde, Texas mass shooting cites a 2017 paper written by CIP director and UW Human Centered Design & Engineering associate professor Kate Starbird that explores the ways social media can amplify “alternative narratives” around major events.
***
- Agence France Presse (June 6): “Old tricks, new crises: How misinformation spreads”
CIP research scientist Mike Caulfield was interviewed in an AFP factcheck article about how many of the conspiracy tropes that emerge from mass shooting events, like the one in Uvalde, Texas, are repurposed from previous narratives.
***
- The New York Times (June 7): “Don’t believe everything you read about the man in this photo”
“I don’t think you can find an event of significant magnitude where this doesn’t happen in the aftermath — it’s almost a reflex at this point,” said Mike Caulfield, a CIP research scientist. “Nowadays, people are promoting false-flag and crisis-actor theories 20 minutes after the event, and in very formulaic ways.”
***
- Nature Human Behaviour (June 8): “Journals must do more to support misinformation research”
In a Q&A interview with Springer Nature editor Arunas Radzvilavicius, CIP postdoctoral fellows Joe Bak-Coleman and Rachel E. Moran discuss the challenges they’ve faced publishing interdisciplinary research about mis- and disinformation and how journals could change to better serve researchers in this space.
***
- The 19th (June 13): “Women secretaries of state face threats and harassment for battling election lies”
CIP postdoctoral fellow Rachel E. Moran was interviewed about research into whether women who serve as top election officials in their states are the targets of more misinformation. “There’s this idea — this underlying misogynistic framework that makes a lot of this misinformation more attractive for people to believe in. And so there is this entire gendered element that has gone pretty understudied in the past,” Moran said.
***
- Tech Policy Press (June 13): “Researchers release comprehensive Twitter dataset of false claims from the 2020 elections”
In a piece co-published by Tech Policy Press and Just Security, Justin Hendrix features a paper from a CIP-led team published in the Journal of Quantitative Description, “Repeat spreaders and election delegitimization: A comprehensive dataset of misinformation tweets from the 2020 U.S. election.”
***
- Psychology Today (June 15): “The stain on the wall: How false ideas stick with us”
Western Washington University psychology professor Ira Hyman highlights key ideas from a recent CIP Invited Speaker Series presentation and a recent tweet thread from CIP co-founder and faculty director Kate Starbird that explores the dynamics of disinformation.
***
- ProPublica (June 17): “‘Big Lie’ vigilantism is on the rise. Big Tech is failing to respond.”
An April 30 research analysis from a CIP team led by research scientist Mike Caulfield that examined the origins and spread of “ballot mule” narratives online was referenced in a ProPublica article exploring how “[s]tolen-election activists and supporters of former President Donald Trump have embraced a new tactic in their ongoing campaign to unearth supposed proof of fraud in the 2020 presidential race: chasing down a fictional breed of fraudster known as a ‘ballot mule’ and using social media to do it.”
***
- CNN Business (June 17): “Elon Musk pressured Twitter to give him access to a ‘firehose’ of data to evaluate bots. Now what?”
In an interview about Elon Musk’s access to Twitter data to evaluate bot accounts on the platform, CIP co-founder Ryan Calo, a UW School of Law professor and faculty co-director of the UW Tech Policy Lab, said: “I think it defies imagination that Musk and his collaborators would be in a better position to give a good faith estimate of what the automated activity is. So, this is kind of like sharing a lot of information for no good reason, and whenever you do that, it raises concerns.”
***
- Bloomberg News (June 23): “Meta pulls support for tool used to keep misinformation in check”
In an interview with Davey Alba of Bloomberg News, CIP director and co-founder Kate Starbird, a UW Human Centered Design & Engineering associate professor, discussed the research ramifications of Meta’s Facebook pulling support for its CrowdTangle analytics tool, saying she hopes that Facebook creates “a viable alternative” and “give[s] researchers and journalists time to redesign their workflows around the new tool.”
***
- Grid News (June 27): “Social media sites can slow the spread of misinformation with modest interventions”
In a Q&A feature with Anya van Wagtendonk of Grid News, CIP postdoctoral fellow Joe Bak-Coleman discusses a Nature Human Behaviour paper from a team of CIP researchers, “Combining interventions to reduce the spread of viral misinformation.”
***
- MIT Technology Review (June 27): “Facebook is bombarding cancer patients with ads for unproven treatments”
CIP postdoctoral fellow Rachel E. Moran was interviewed about how “misinformation like this rarely stays confined to the platform where it’s originally posted. While Facebook plays a key role in getting sensational claims about dubious cancer treatments in front of desperate patients, the groups and ads carrying those claims often link to other sites and networks that reinforce them.”
***