

PLOS Blogs | PLOS SciComm

PLOS SciComm chats with Matthew Facciani, author of “MISGUIDED: Where Misinformation Starts, How It Spreads, and What to Do About It” 

BY: BILL SULLIVAN & MATTHEW FACCIANI

It was said a long time ago that “a lie can travel halfway around the world while the truth is still putting on its shoes.” With the advent of social media, the truth barely has a chance to wake up and rub the sleep out of its eyes.

Misinformation, disinformation, and lies are the greatest challenges faced by science and medical communicators today. We live in a world where scientific facts are met with “alternative facts.” Unqualified podcasters and pundits are heralded as bastions of truth while expertise is eyed with suspicion. The manic street preacher is not only considered the sane one in the room but is now worthy of holding public office and making major policy decisions.

Misinformation does more than endanger our health, economy, and environment. As Barack Obama warned, the pollution in our information ecosystem is a significant threat to democracy.

Matthew Facciani is an interdisciplinary social scientist at the University of Notre Dame. He recently published his first book, MISGUIDED: Where Misinformation Starts, How It Spreads, and What to Do About It (Columbia University Press), which is about the social and psychological forces that guide people into believing false information. Facciani argues that everyone is susceptible to being misguided, and he offers evidence-based tools to avoid this trap and rescue those who’ve fallen into it.

Sullivan: Before we discuss MISGUIDED, I’d like to ask what inspired you to write it. You started your career in a neuroscience PhD program and switched to sociology. Was there a pivotal experience that sparked an interest in misinformation that drove you to make this change?

Facciani: I’ve always been fascinated by how people form beliefs. As an undergraduate, I enjoyed my neuroscience classes and pursued neuroimaging research opportunities, which led me to start a PhD in cognitive neuroscience. However, I realized that to study the bigger-picture questions about how and why people believe what they do, sociology was a better fit. So, I “changed majors” during graduate school and switched into a sociology PhD program at the same university. I still value my neuroscience background, as it gives me a broader perspective on the research questions I explore.

Sullivan: You’re currently a postdoctoral researcher at the University of Notre Dame. What are you studying there, and how will it help address the misinformation crisis?

Facciani: Much of my work focuses on developing and testing techniques to promote media and digital literacy. For example, our team created an online game that simulates a WhatsApp conversation, where players respond to hoaxes and rumors from friends and family by deciding whether the information is credible. Through the game, players learn skills like verifying sources and recognizing psychological biases. We found that those who played the short game were better at identifying false news headlines and less likely to share them. Originally designed for an Indonesian audience, the game was also effective in English for both Indonesians and Americans. We also developed a brief online intervention to help people spot manipulated images more effectively. Overall, our research shows that quick, scalable tools can help people build resilience against misinformation.

Sullivan: A central theme in your book is the concept of identity. Can you explain how that ties into falling for misinformation?

Facciani: An identity is a set of meanings that define who we are through our social roles, group memberships, and unique personal traits. We’re motivated to align our behaviors with these identities because doing so supports our self-esteem. For example, if we see ourselves as a student, we’re motivated to study and attend class, and we feel good about ourselves when we earn good grades.

Similarly, if we have a political identity, we may process information in ways that support our politics and attack opponents, even if that means we’re not being fully objective. This illustrates how our identities can bias the way we process information, especially when it comes to misinformation. Identities not only bolster our self-esteem but also connect us to our communities, so we’re motivated to process information in ways that reinforce both. Believing a falsehood can sometimes help us stay connected to our community and uphold our identity.

Sullivan: While your book illustrates that misinformation is not new, many feel that social media is the major propagator of it today. Social media isn’t going away, so what can be done to minimize the dissemination of falsehoods? Is there a way science communicators can use social media for good?

Facciani: First, platforms need to be more transparent and collaborative. If social media companies shared more about their algorithms and data, researchers and policymakers could better understand and address the problem.

Second, we can slow down how fast misinformation spreads by adding a little friction, like prompts to pause before sharing or small tasks that make people think twice before posting.

And third, social media can also be used for good. As science communicators, we can reach huge audiences with evidence-based information, craft messages that resonate with people’s values, and engage directly with people in ways that foster trust rather than divisiveness. So even though social media presents challenges, it also offers incredible opportunities to help people think more clearly and connect with reliable information.

Sullivan: What are the most effective approaches that science communicators can employ to debunk mythical beliefs and correct those who have been misinformed?

Facciani: There are a few evidence-based strategies that stand out when debunking myths and correcting misinformation. First, timing and repetition matter: corrections are more effective when delivered early, before the myth spreads widely, and when repeated at least twice to reinforce the correct information. This also highlights the effectiveness of “prebunking,” which provides people with skills to recognize manipulation techniques and misinformation before they see them online. It also helps to offer an alternative explanation rather than just stating something is false. People need something to fill the gap left by the myth, and giving them a clear, credible alternative helps the correction stick.

Finally, the source and tone are crucial. Corrections from trusted, credible sources, delivered with empathy and respect, are more likely to be received well. A “nerd of trust” — someone in a community or network who’s both knowledgeable and relatable — can often have more impact than a faceless and abstract scientific institution.

Sullivan: What would you say is the main thing we do NOT want to do when engaging with someone whose beliefs run counter to scientific or medical consensus?

Facciani: One of the most important things to avoid when engaging with someone whose beliefs contradict scientific or medical consensus is attacking them, shaming them, or dismissing their concerns. Such approaches tend to make people defensive, reinforce their existing beliefs, and shut down meaningful dialogue.

It’s vital to establish mutual respect, listen with empathy, and avoid judgment. Making someone feel heard and validated opens the door to a more productive conversation. Attacking someone’s identity or ridiculing their beliefs is counterproductive because people are motivated to defend the self-esteem their social identities provide, even if that means rejecting accurate information.

The goal shouldn’t be to change their mind, but to remove defensive biases and increase the odds of them actually hearing your perspective.

Sullivan: A lot of scientists I know no longer feel that the public trusts their expertise the way they used to. What do you believe is behind this, and how might we regain that trust?

Facciani: People tend to trust those they see as both competent and having their best interests at heart. When science and medicine feel distant or disconnected from people’s lives, trust can erode. To rebuild it, we need to act on multiple levels.

At the micro level, individual scientists can foster trust by clearly communicating their work within their personal networks and showing humility about its limitations. At the macro level, institutions can become more transparent and accountable. But the meso level — the community level — is especially critical. Here, individuals, communities, and institutions can collaborate effectively.

For example, partnering with trusted local figures — like pastors, barbers, and teachers — to share accurate information can be much more impactful than a distant expert at a press conference. Similarly, hosting events in familiar, welcoming spaces such as churches, salons, or community centers allows people to hear messages from sources they already know and trust. The goal is to build partnerships with communities rather than relying on outdated, top-down messaging.


Sullivan: As you mentioned in the final chapter of MISGUIDED, our latest challenge in the war on misinformation is AI. What do you foresee as the major obstacle in that context, and how should we prepare?

Facciani: One of the biggest obstacles with AI in the context of misinformation is how easily and convincingly it can produce false or misleading content at scale. AI chatbots and image and video generators make it possible to flood the information ecosystem with plausible-sounding text, realistic fake images, and even deepfakes that erode trust in what we see and hear. Beyond simply producing bad information, AI can exploit the identity biases I described above.

To prepare, we need to address this challenge on several fronts. First, we must continue to improve media and digital literacy, helping people critically evaluate what they see, including teaching them the limitations of AI tools and AI-generated content. Second, we could develop tools that help verify the origin of content and flag potential manipulation. Finally, it’s crucial to remember that AI isn’t only a threat; it can also be part of the solution. Studies show that AI chatbots can help debunk conspiracy theories, improve difficult conversations, and assist human fact-checkers.

Ultimately, tackling AI-driven misinformation requires cooperation between researchers, technologists, policymakers, and the public, and a commitment to staying mindful of how our own psychology interacts with these technologies.

Sullivan: Do you have advice for fellow science communicators who hope to write a book like yours one day?

Facciani: My biggest piece of advice is to start writing long before you think you’re ready. You don’t need to have everything figured out before you put words on the page. Start by writing short pieces (blogs, op-eds, essays) and use those to test ideas and see what resonates with people.

Be clear about your audience from the start. My book is grounded in research but written for a general audience, so I had to constantly think about what would be accessible, engaging, and meaningful for people outside of academia.

Also, expect it to take longer than you think, and know it’s okay to feel stuck or overwhelmed along the way. Find a few trusted readers who can give you honest feedback, and don’t be afraid to rewrite — a lot.

Finally, remember that your voice and perspective matter. You don’t have to cover everything perfectly or be the ultimate authority. Just focus on telling the story you’re uniquely positioned to tell, and trust that your passion for the topic will come through on the page.

Sullivan: I’d like to end with a note of hope. What can you offer us as a ray of light regarding the war on misinformation?

Facciani: Despite the challenges we face today, I believe we have a real opportunity to learn from our mistakes, strengthen our institutions, embrace science outreach, and rebuild trust.

Many brilliant people are already working on these issues, and I highlight their efforts throughout my book. One especially promising strategy is leveraging shared identities to reduce polarization. Focusing on what we have in common—whether a national identity, a local community, a hobby, or simply being human—helps people become less defensive and more open to dialogue. Research shows that emphasizing these shared identities can lower hostility, even across deep political divides, and help people process information more objectively.

In my book, I also describe the “5 R’s” for fostering productive conversations: establishing respect, relating to shared identity, reframing the discussion, revising language, and repeating these steps as needed. These strategies remind us that building trust takes time, but it is possible.

Finally, I want to again emphasize the importance of cultivating trust at the community level—through local organizations, conversations, and networks—which can create ripple effects that strengthen relationships and build resilience to misinformation. While the problem is complex, I’m optimistic that through connection, empathy, and shared purpose, we can make meaningful progress and create a more informed, trusting, and resilient society for everyone.

MATTHEW FACCIANI is a postdoctoral researcher at the University of Notre Dame. He is an interdisciplinary social scientist with a background in neuroscience and psychology and a PhD in sociology. His new book, Misguided, explains the social forces that make us vulnerable to misinformation and what we can do to build resilience against it.

BILL SULLIVAN is the Showalter Professor of Microbiology & Immunology at the Indiana University School of Medicine and author of Pleased to Meet Me: Genes, Germs, and the Curious Forces That Make Us Who We Are. He is a member of the PLOS SciComm editorial board and Chair of the editorial advisory board for ASBMB Today. He has been featured in a wide variety of outlets, including CNN, Fox & Friends, The Doctors, TEDx, Science Fantastic with Dr. Michio Kaku, and more. He has published over 100 papers in scientific journals and has written for National Geographic, Discover, Scientific American, WIRED, Washington Post, Psychology Today, and more.
