GW Study on Misinformation Earns APA Editor’s Choice Recognition


March 14, 2025


What if understanding how we think could help stop misinformation in its tracks? The study “The role of mental representation in sharing misinformation online” takes a closer look at this crucial issue, investigating how the way people process information affects their likelihood of sharing misinformation. The research team, led by Professor David Broniatowski, discovered that while simple, gist-based messages can promote the spread of misinformation, they can also be leveraged to counteract it, offering a new, science-driven, practical approach to combating misinformation online.

Published in the American Psychological Association’s (APA) Journal of Experimental Psychology: Applied in July 2024, the study was recently selected as an APA Editor’s Choice article, an honor given only to papers deemed particularly impactful in advancing scientific understanding. For Broniatowski and co-authors Pedram Hosseini and Ethan Porter of GW and Thomas Wood of Ohio State University, the recognition underscores the importance of their research, the first to systematically test the effectiveness of gist-based interventions.

“I’m gratified to receive this recognition on behalf of myself and my co-authors. This paper builds upon established scientific theory with decades of evidence to support it and was carried out through collaboration across GW schools. The result was published in a top psychology journal and demonstrates how interdisciplinary and cross-school collaboration can move the ball forward intellectually,” said Broniatowski.

The researchers applied Fuzzy-Trace Theory, which suggests that people often simplify complex ideas into intuitive, emotional “gist” representations when making decisions, to study how this process shapes how people understand and respond to misinformation. Their findings help explain why some fact-checking organizations have been more effective than others since social media platforms began partnering with them in 2016. For Broniatowski, this underscores why platforms should work to deliver fact-checks that are aligned with users’ values rather than taking a “one-size-fits-all” approach.

Platforms have begun ending their partnerships with professional fact-checking organizations this year. For example, in January, the social media giant Meta announced the end of its third-party fact-checking program in favor of a Community Notes model. “Rather than pitting these two models against one another, our paper suggests a third path forward: platforms working together with other organizations, such as professional fact-checkers but potentially community note writers as well, to offer explicit guidance on writing notes to help users understand and contextualize why and how a piece of misinformation might be harmful to them or those they care about,” Broniatowski said.

As public health agencies, scientific communicators, professional fact-checkers, and individual social media users all work together to address the challenges of misinformation online, the insights from this study are crucial. Broniatowski hopes that this study empowers both users and platforms to make informed decisions about what content they wish to share and why.