| Debunking |
According to Countering Disinformation: A Guidebook for Public Servants, debunking aims "to explicitly expose false information to make the facts clear."
More
Debunking is not only about stating that something is false. It also involves proving and explaining why it's false, and delivering clear, corrective messaging backed by accurate evidence and credible sources.
Related terms
Debunking is a technique people can use when they encounter false information. It's reactive. In contrast, pre-bunking prepares people to recognize and address false information before they encounter it. It's proactive.
Fact-checking is an important step before debunking, as it verifies whether information is accurate and provides evidence that can be used to correct false information.
|
| Democratic institutions |
Democratic institutions are actors and stakeholders that uphold democracy and serve the public interest. These include, but are not limited to, the electorate, political parties, the House of Commons, the Senate, the Governor General, and the public service.
More
The Canadian system of government is a parliamentary democracy. It relies on processes such as elections, campaigns, and leadership races to ensure that it represents and serves the people.
The term "democratic institutions" appears in many sources but is rarely clearly defined.
|
| Disinformation |
In Countering Disinformation: A Guidebook for Public Servants, the Government of Canada describes disinformation as "deliberately false information that is disseminated to deceive or cause harm."
More
Some sources add more nuance to their explanation of disinformation, stating that it may be partially false or misleading information.
An example of disinformation is a podcast spreading false information about a political candidate to discredit them. Another example is a fake version of a medical blog recommending a new diet pill with links to purchase it from a scam website.
Works consulted
- Global Affairs Canada, Rapid Response Mechanism Canada, n.d.
- Caroline Jack, Lexicon of Lies: Terms for Problematic Information, 2017.
- Privy Council Office, Countering Disinformation: A Guidebook for Public Servants, 2024.
- Jean-Baptiste Jeangène Vilmer, Alexandre Escorcia, Marine Guillaume, and Janaina Herrera, Information Manipulation: A Challenge for Our Democracies, 2018.
|
| Echo chambers |
Echo chambers are environments where people only encounter ideas and perspectives that are similar to their own. These environments tend to amplify or reinforce what people already believe.
More
Echo chambers exist both online and offline. Some digital tools like social media algorithms tailor content to match users' interests, which can promote echo chambers. While some users may voluntarily be in an echo chamber, others may not even realize they're in one.
Echo chambers narrow our exposure to diverse ideas and perspectives, which are crucial to a functioning democracy. Considering other viewpoints can challenge our beliefs and assumptions and strengthen our critical thinking, helping us make informed decisions.
Works consulted
- Commonwealth Parliamentary Association, Parliamentary Handbook on Disinformation, AI and Synthetic Media, 2023.
- Maria Giovanna Sessa, EU DisinfoLab, Disinformation glossary: 150+ Terms to Understand the Information Disorder, 2023.
- Global Affairs Canada, How Canada advances democracy in the world, n.d.
- Government of British Columbia, Critical Thinking and Reflective Thinking, n.d.
- Government of Ontario, Critical Thinking and Critical Literacy, n.d.
- OECD, Facts not Fakes: Tackling Disinformation, Strengthening Information Integrity, 2024.
- Nina Verishagen and Diane Zerr, Disinformation: Dealing with the Disaster, 2023.
- Claire Wardle and Hossein Derakhshan, Information Disorder: Toward an interdisciplinary framework for research and policymaking, 2017.
|
| Fake news |
Fake news is an umbrella term commonly used to describe any false, manipulated, or misleading information.
More
The term "fake news" is both misused and overused. It emphasizes news, yet it is applied to many different types of information. It has also become a politicized label used by people who wish to undermine, discredit, or distract from information they dislike or disagree with.
Experts recommend avoiding the term "fake news" and using more precise terms like "misinformation," "disinformation," and "malinformation."
Works consulted
- Cherilyn Ireton and Julie Posetti, eds., Journalism, ‘Fake News’ and Disinformation: Handbook for Journalism Education and Training, 2018.
- Jean-Baptiste Jeangène Vilmer, Alexandre Escorcia, Marine Guillaume, and Janaina Herrera, Information Manipulation: A Challenge for Our Democracies, 2018.
- Claire Wardle, Understanding Information disorder, 2020.
- Claire Wardle and Hossein Derakhshan, Information Disorder: Toward an interdisciplinary framework for research and policymaking, 2017.
|
| Foreign information manipulation and interference (FIMI) |
In its 2023 report, the G7 Rapid Response Mechanism describes FIMI as "an umbrella term for foreign efforts aimed at covertly manipulating the information environment to achieve strategic objectives." These malign efforts target and threaten individuals, organizations, and governments.
More
FIMI focuses on foreign threats, not domestic threats. An example of FIMI is a foreign government conducting a disinformation campaign to influence public opinion about a particular political candidate, which can impact the results of an election.
This term has been gaining traction in recent years, especially in government contexts.
Works consulted
- Canadian Security Intelligence Service, Foreign Interference Threats to Canada's Democratic Process, 2021.
- European Union External Action, 3rd EEAS Report on Foreign Information Manipulation and Interference Threats, 2025.
- European Union External Action, Information Integrity and Countering Foreign Information Manipulation & Interference (FIMI), 2025.
- Global Affairs Canada, G7 Rapid Response Mechanism Annual Report 2023, 2023.
- Nicolas Hénin, EU DisinfoLab, FIMI: towards a European redefinition of foreign interference, 2023.
- Public Inquiry into Foreign Interference in Federal Electoral Processes and Democratic Institutions, Final Report, 2025.
- Public Safety Canada, Foreign Interference and Canada, n.d.
|
| Information disorder |
Information disorder refers to the production and distribution of information that pollutes our information ecosystem. It includes misinformation, disinformation, and malinformation.
More
Online environments amplify information disorder. False and misleading content can be produced and circulated quickly, easily, cheaply, and widely. Such content also tends to provoke strong emotions, which makes people more likely to engage with it.
The term "information disorder" is mainly used in academic contexts.
Related terms
While information disorder describes elements that contribute to an unhealthy information ecosystem, information integrity describes a healthy one.
|
| Information ecosystem |
An information ecosystem is a complex network of elements, including people, communication technologies (both digital and analog), policies, legislation, beliefs, and practices. These elements don't exist in isolation. They interact, connect, and shape how information is created, shared, and consumed.
More
Information ecosystems are dynamic and always changing. They are informed by factors like region, community, age, and language.
Related terms
When discussing threats to information integrity, the Canada School of Public Service prefers the term "information ecosystem" over other related terms like "information environment" or "information landscape." This is consistent with the Privy Council Office, Global Affairs Canada, and the Canadian Security Intelligence Service.
Works consulted
- Aengus Bridgman et al., The Canadian Information Ecosystem, 2023.
- Canadian Security Intelligence Service, Democracy's New Challenge: Navigating the Era of Generative AI, n.d.
- Global Affairs Canada, Global declaration on information integrity online, 2023.
- Privy Council Office, Countering Disinformation: A Guidebook for Public Servants, 2024.
- Tara Susman-Peña, Why Information Matters: A Foundation for Resilience, 2014.
|
| Information integrity |
Information integrity presents a vision of an information ecosystem that produces and provides access to diverse information sources that are accurate, trustworthy, and reliable. Such an information ecosystem is inclusive, safe, and secure. People are exposed to different ideas and perspectives. At the same time, they can rely on the information sources they choose to help them make informed decisions.
More
Some governments and organizations have started taking concrete steps to promote and strengthen information integrity. Examples include implementing regulatory measures that hold digital service providers accountable for content hosted on their platforms, developing educational campaigns and curricula to promote digital and media literacy at the individual level, and establishing mechanisms to monitor and counter disinformation during elections.
The term "information integrity" tends to be used in government contexts.
Related terms
While information integrity describes elements that contribute to a healthy information ecosystem, information disorder describes an unhealthy one.
Works consulted
- Global Affairs Canada, Global declaration on information integrity online, 2023.
- OECD, Facts not Fakes: Tackling Disinformation, Strengthening Information Integrity, 2024.
- OECD, Recommendation of the Council on Information Integrity, 2024.
- Open Government Partnership, Digital Governance: Disinformation and Information Integrity, n.d.
- United Nations, United Nations Global Principles For Information Integrity, n.d.
|
| Malinformation |
According to Countering Disinformation: A Guidebook for Public Servants, malinformation is "when factual information … is deliberately and maliciously used with the goal of causing harm."
More
An example of malinformation is an activist sharing private information about a community leader in an attempt to damage their reputation.
|
| Misinformation |
As described in Countering Disinformation: A Guidebook for Public Servants, misinformation is "false information that is often shared in good faith and not intended to deceive."
More
An example of misinformation is a re-shared social media post about a teen vandalizing a local restaurant. The person who re-shared it did not verify the information. They believed the post was true, but it was actually false.
Disinformation can turn into misinformation. Building on the example, let’s say we discovered the original post was created to provoke fear and anger. The original post is disinformation. The person who re-shared the post didn’t realize it was false. They didn’t intend to deceive. What they’ve shared is misinformation.
Comparing misinformation and disinformation
Sources often define disinformation and misinformation in relation to one another to highlight their similarities and differences.
A key difference between misinformation and disinformation is intent. With disinformation, an actor creates and shares false information intentionally, aiming to deceive or cause harm. With misinformation, an actor shares false information unintentionally, with no intent to deceive or cause harm. In practice, however, intentions are often unclear: some sources acknowledge that intent is difficult to assess and is often based on assumptions.
Works consulted
- Caroline Jack, Lexicon of Lies: Terms for Problematic Information, 2017.
- Privy Council Office, Countering Disinformation: A Guidebook for Public Servants, 2024.
- Jean-Baptiste Jeangène Vilmer, Alexandre Escorcia, Marine Guillaume, and Janaina Herrera, Information Manipulation: A Challenge for Our Democracies, 2018.
- Claire Wardle, Understanding Information disorder, 2020.
|
| Polarization |
Polarization occurs when public opinions and beliefs split and concentrate at opposing extremes.
More
Polarization is multifaceted. It can relate to, for example, ideologies or social identity. Individuals could be polarized in one area but not another.
Polarization can affect social trust and social cohesion, which are essential to a functioning democracy. It can also lead to a loss of diversity in opinions expressed in public discourse. For example, people with more moderate opinions may refrain from expressing them because they're afraid of being mocked or attacked.
Works consulted
- Rafael Aguirre, Stephen Bird, Brenden Frank, and Monica Gattinger, Polarization over Energy and Climate: Understanding Canadian Public Opinion, Issue 2 – Oil and Gas (PDF), 2020.
- Canada School of Public Service, Future of Democracy Series: The State of Political Polarization in Canada, 2022.
- Institute for Research on Public Policy, Democracy under threat? Polarization and public policy in Canada, 2022.
- Justin Ling, Far and Widening: The Rise of Polarization in Canada, 2023.
|
| Pre-bunking |
According to Countering Disinformation: A Guidebook for Public Servants, pre-bunking aims "to address potential misconceptions, false information and disinformation campaigns before they start."
More
Pre-bunking involves identifying and explaining tactics, sources, and messages that are manipulative, so people are better equipped to recognize false or misleading information.
Pre-bunking also involves sharing accurate information early and often. This includes regularly reviewing information and revising anything that is outdated.
Related terms
Pre-bunking prepares people to recognize and address false information before they encounter it. It's proactive. In contrast, debunking is a technique people can use when they encounter false information. It's reactive.
|
| Threat actors |
"Threat actors" is a broad term describing those who intentionally create, manipulate, or share false or misleading information.
More
Threat actors can be individuals, organizations, companies, or governments, among others. They can be domestic or foreign. They can work independently or in organized groups.
Threat actors can have different motivations. They can manipulate information for financial gain, attention, fame, or even personal enjoyment. They can also do it for political reasons, like influencing public opinion. Their motivations influence the strategies they use. It can be difficult to know what their motivations really are.
Their actions have different impacts. For example, a misleading post on social media could reach 5, 500, or 500,000 people. Within the affected group, it can cause doubt, anger, fear, or frustration. It can disrupt or destabilize their beliefs, values, practices, or communities.
Related terms
Some sources describe such actors with similar adjectives: malign, malicious, deceptive, hostile, or bad.
Works consulted
- Canadian Centre for Cyber Security, How to identify misinformation, disinformation, and malinformation, 2024.
- Commonwealth Parliamentary Association, Parliamentary Handbook on Disinformation, AI and Synthetic Media, 2023.
- Global Affairs Canada, Russia's use of disinformation and information manipulation, n.d.
- Caroline Jack, Lexicon of Lies: Terms for Problematic Information, 2017.
- Privy Council Office, Countering Disinformation: A Guidebook for Public Servants, 2024.
- The Royal Society, The online information environment, 2022.
- Maria Giovanna Sessa, EU DisinfoLab, Disinformation glossary: 150+ Terms to Understand the Information Disorder, 2023.
- Jean-Baptiste Jeangène Vilmer, Alexandre Escorcia, Marine Guillaume, and Janaina Herrera, Information Manipulation: A Challenge for Our Democracies, 2018.
- Claire Wardle and Hossein Derakhshan, Information Disorder: Toward an interdisciplinary framework for research and policymaking, 2017.
|
| Trolls |
A troll is a person who posts inflammatory, upsetting, or misleading content online to insult, provoke, and harass others.
More
Trolls are sometimes hired to participate in coordinated campaigns to spread false or misleading information. Sources refer to these organized groups as troll farms, troll factories, or troll armies.
An example of a troll is a user posting rude and hurtful messages in the comments section of a YouTube video.
Related terms
"Troll" is also used as a verb (trolling).
When discussing trolls, the concept of bots often comes up. The difference is that trolls are human, whereas bots are automated software programs that tend to imitate human behaviour. Trolls and bots can both be used to spread and amplify false or misleading information.
Works consulted
- Commonwealth Parliamentary Association, Parliamentary Handbook on Disinformation, AI and Synthetic Media, 2023.
- Maria Giovanna Sessa, EU DisinfoLab, Disinformation glossary: 150+ Terms to Understand the Information Disorder, 2023.
- Jean-Baptiste Jeangène Vilmer, Alexandre Escorcia, Marine Guillaume, and Janaina Herrera, Information Manipulation: A Challenge for Our Democracies, 2018.
- Claire Wardle, Information Disorder: The Essential Glossary, 2018.
- Claire Wardle and Hossein Derakhshan, Information Disorder: Toward an interdisciplinary framework for research and policymaking, 2017.
|