News coverage from January 2024 about the Center for an Informed Public and CIP-affiliated research and researchers.
- The Guardian (January 1): “‘Stakes are really high’: misinformation researcher changes tack for 2024 U.S. election”
CIP co-founder Kate Starbird, a UW Human Centered Design & Engineering associate professor, was interviewed and featured in an article in The Guardian about the political and legal attacks on misinformation researchers. “Right now, we’ve got a space where we may be in a ‘Boy who cried wolf’ situation, where there’s so much misinformation about election integrity that if we have a true threat, we may miss it,” Starbird said.
***
- KUOW Public Radio (January 3): “Tips for sorting online fact from fiction”
CIP research scientist Mike Caulfield discussed Verified: How to Think Straight, Get Duped Less, and Make Better Decisions on What to Believe Online, the book he co-authored with Stanford University professor emeritus Sam Wineburg, in an interview with KUOW Public Radio.
***
- Laconia Daily Sun (January 3): “What to expect when you’re electing: How to navigate election misinformation online”
The Laconia Daily Sun newspaper in New Hampshire republished an article originally published in the Keene Sentinel that references CIP research scientist Mike Caulfield and SIFT, the method he developed for fact-checking and contextualizing claims online.
***
- Poynter (January 4): “Look for the death of X and audio deepfakes in 2024”
In a Poynter commentary, Alex Mahadevan cites a quote from CIP director and co-founder Kate Starbird, originally published in 2023 in The Washington Post about the political and legal attacks on misinformation researchers. “It’s clear to me that researchers and their institutions won’t be deterred by conspiracy theorists and those seeking to smear and silence this line of research for entirely political reasons.”
***
- The Forward (January 9): “How Twitter activists turned a viral story about Orthodox Jews into a modern blood libel”
The CIP’s rapid research around the “new elites” of Twitter/X was referenced in an opinion article in The Forward.
***
- Nature (January 9): “How online misinformation exploits ‘information voids’ — and what to do about it”
A Nature editorial about media literacy education cited CIP research scientist Mike Caulfield.
***
- National Public Radio (January 15): “Iowa Republicans will use an app to transmit caucus results. Sound familiar?”
A National Public Radio report on an app being used to transmit results from the Iowa Republican Caucus referenced a CIP rapid research report studying conspiracy theories about voting during the caucus.
***
- San Francisco Examiner (January 18): “How to make the most — or least — of X this election year”
CIP research scientist Mike Caulfield was interviewed for a story in the San Francisco Examiner that explored tips for navigating and making sense of information on X, the social media platform formerly known as Twitter.
***
- New Hampshire Public Radio (January 18): “How to navigate election misinformation online”
New Hampshire Public Radio republished an article from The Keene Sentinel that referenced SIFT, the method for fact-checking and contextualizing claims online developed by CIP research scientist Mike Caulfield.
***
- NBC News (January 18): “Disinformation poses an unprecedented threat in 2024 — and the U.S. is less ready than ever”
NBC News reporter Brandy Zadrozny interviewed CIP research scientist Mike Caulfield for a story about the challenges academic researchers face in 2024 trying to study election-related rumors, conspiracy theories and mis- and disinformation, including limited access to platform data. Zadrozny wrote: “The delay in catching false narratives early is essentially giving disinformation a head start, Caulfield said, and could mean a delay in fact-checking efforts and context from journalists.” Zadrozny also cited a CIP rapid research blog post about election rumors emerging from the 2024 Iowa Republican Caucus.
***
- The Washington Post (January 20): “AI bots are everywhere now. These telltale words give them away.”
CIP research scientist Mike Caulfield was interviewed for an article in The Washington Post about ChatGPT error messages appearing on various online platforms, including in Amazon product descriptions and X posts. “I hope it wakes people up to the ludicrousness of this, and maybe that results in platforms taking this new form of spam seriously,” Caulfield told Washington Post reporter Will Oremus.
***
- Salon (January 20): “‘Democrat shenanigans’: Experts alarmed as MAGA fans cry ‘fraud’ in Iowa — despite Trump’s huge win”
CIP research scientist Mike Caulfield was interviewed and quoted in a Salon article about election rumors emerging from the Iowa Republican Caucus. “So far in this primary there hasn’t been a whole lot of rumor — even compared to 2016 or 2020, at least on X,” Caulfield told Salon. “That might change in New Hampshire if there is a serious challenge to the narrative of Trump’s dominance.”
***
- Forbes (January 26): “The creator economy is facing a perfect storm of AI-generated content and piracy”
CIP co-founder Jevin West was interviewed and quoted in Forbes for an article about AI-generated content. “The real question is, will users want a real human behind that content? And the data is inconclusive on that,” he said. “The danger is that when there are information vacuums, such as during natural disasters or elections, opportunists sweep in. My speculation is that it will get worse.”
***
- The Urban Activist (January 30): “Going offline to combat online disinformation”
The Co-Designing for Trust project and comments from CIP research fellow Jason C. Young were featured in a detailed article in The Urban Activist online magazine about grassroots, community-driven approaches to countering mis- and disinformation.
***
- GeekWire (January 31): “New nonpartisan AI nonprofit TrueMedia, led by Oren Etzioni, is making a political deepfake detector”
GeekWire published a story about a new tech startup developing political deepfake detection tools that mentions CIP co-founder Jevin West, who is a member of TrueMedia’s scientific advisory board, and a related Rainier Club panel discussion on AI and deepfakes.