The Prebunking Playbook for Communicators
Every communications professional knows the sinking feeling. You’ve spent weeks crafting clear, evidence-based messaging about a new policy or initiative, only to watch it get twisted into something unrecognisable on social media. By the time you’ve prepared a response, the misinformation has already spread to thousands who will never see your correction.
Because we do a lot of work in the climate and energy space, we see this happen all the time, and we work with our clients to anticipate and mitigate it.
This isn’t a new problem, but it’s getting worse. False information spreads farther, faster, deeper, and more broadly than the truth. And that’s particularly potent when trust in public institutions is declining. It’s also as true for internal communications as it is for external communications.
The Australian Government recognises that misinformation and disinformation threaten our democracy, society and economy. But legislative approaches alone aren’t enough. We need communication strategies that build public resilience before false information takes hold.
Understanding the landscape

Let’s be clear about definitions:
- Misinformation is incorrect or misleading content spread without harmful intent.
- Disinformation is false content shared deliberately to deceive.
- Malinformation is true information weaponised out of context to cause damage.
Why does this matter? Because intent shapes your response. You handle confused citizens differently from coordinated bad actors. Don’t waste energy trying to change the mind of someone who’s paid to lie. Focus instead on helping your unsuspecting target audience recognise and resist mis-, dis- and malinformation before it spreads. (For ease of reading, we’ll use “misinformation” for the rest of this post.)
Why traditional approaches fall short
Many people believe clear, accurate information naturally wins out over misinformation. If we just explained facts better, used more data, or found more credible spokespeople, people would recognise the truth. Wrong.
The challenge isn’t just about information quality – it’s about how people process information in an environment designed to capture attention and provoke emotional responses. False news tends to be more novel than true news, and people are more likely to share novel information: false news is around 70% more likely to be retweeted than the truth.
Traditional fact-checking, while important, often arrives too late. By the time we’re correcting misinformation, people have already formed opinions and shared false content. Even worse, research shows that repeating false claims, even to debunk them, can sometimes strengthen people’s memory of those claims rather than the correction.
“I’m telling you a lie in a vicious effort that you will repeat my lie over and over until it becomes true” – Lady Gaga, perfectly summing up how misinformation works.
Prebunk so you don’t have to debunk
While debunking or fact checking might seem like a logical step in correcting misinformation, it can be time-consuming and risks reaffirming the incorrect information if not done well. Prebunking draws on the theory of psychological inoculation: analogous to medical immunisation, prebunking involves pre-emptively warning and exposing people to weakened doses of misinformation, helping cultivate “mental antibodies” against fake news.
Rather than waiting for false information to spread and then responding, prebunking helps people recognise manipulation tactics before they encounter them in the wild. Research shows that inoculation interventions can reduce engagement with misinformation, and the beauty is that it doesn’t require people to remember specific facts. Instead, it teaches them to recognise patterns and techniques used in misleading content.
What government communicators can do
So how can you implement prebunking strategies? Here are practical approaches that go beyond traditional fact-checking:
- Build pattern recognition
Help your audiences recognise techniques used to spread misinformation. Common patterns include emotional manipulation, false urgency, cherry-picked statistics, and appeals to “secret knowledge.” When you see these tactics emerging around your issues, call them out proactively.
- Inoculate against predictable attacks
Most government policies face predictable lines of attack. If you’re announcing environmental policy, expect claims about job losses or economic impacts. Rather than waiting for these attacks, address them upfront by explaining why people might hear these claims and providing context for evaluating them.
- Use specific examples
Prebunking works by equipping people with skills they need to refute future falsehoods. Generic warnings about “fake news” don’t work. Show concrete examples of how misinformation spreads in your policy area and what manipulation techniques look like in practice.
- Leverage trusted messengers
You don’t have to do this alone. Partner with journalists, community leaders, and subject matter experts who can help explain manipulation tactics using language and examples that resonate with different communities.
- Build media literacy skills
Rather than just providing information, teach people how to evaluate sources. This includes checking publication dates, verifying sources, and recognising when content is designed to provoke emotional reactions rather than inform.
When correction is necessary
Once bad information is out there, correcting it requires strategy:
- Lead with truth: Avoid repeating myths in headlines as it can reinforce them.
- Use the “truth sandwich”: Start with truth → briefly mention the myth → end by reinforcing truth again.
- Make it accessible: Don’t rely on PDFs or dense statements. Use visuals, clear language, and formats your audience actually consumes.
Building trust
People don’t just evaluate the message – they judge the messenger. If they don’t trust you, your facts won’t land.
To build and maintain trust:
- Acknowledge uncertainty when it’s real – false confidence backfires.
- Be transparent about how decisions are made.
- Elevate community voices, not just institutional ones.
- Trust is earned slowly and lost fast. Invest early.
Looking ahead: AI changes everything
Deepfakes, fake screenshots, and AI-generated false narratives are here now. You need a plan for how to spot, respond to, and verify suspicious content. At minimum:
- Establish verification protocols.
- Train spokespeople for manipulated content scenarios.
- Brief executives on synthetic media risks.
Building resilience, not suspicion
The goal isn’t to make people suspicious of everything they read, but to help them develop better instincts for evaluating information quality. This requires a shift in thinking. Instead of seeing ourselves as information providers, we need to see ourselves as builders of public resilience against manipulation.
Government communicators are uniquely positioned to lead this shift. We understand the policy landscape, we know where misinformation typically emerges, and we have the credibility to explain complex issues accessibly. By moving beyond reactive fact-checking to proactive inoculation, we can help build a more resilient information environment.