Information Integrity Concept Hub (TRN1-J28)

Description

This job aid describes common terms and concepts that relate to information integrity. It brings together ideas and perspectives from different sources, sectors and disciplines to provide a shared foundation meant to encourage reflection and discussion.

Published: March 19, 2026
Type: Job aid



Information integrity terms and concepts

This tool explains key terms and concepts that relate to information integrity. Each entry includes a concise, plain language description followed by additional information for those who want to learn more. Some entries include examples, usage notes, and related terms. The tool will continue to expand and evolve because terms in this subject area are complex and dynamic. In time, some descriptions of terms may change, be replaced by others, or be removed completely.

Disclaimer
The Concept Hub is not intended to include every term and concept related to information integrity. Nor is it meant to be a definitive or authoritative source for the Government of Canada. It is a reference tool and does not replace or add to existing Government of Canada policies.

List of terms

Glossary of information integrity terms
Debunking

According to Countering Disinformation: A Guidebook for Public Servants, debunking aims "to explicitly expose false information to make the facts clear."

More

Debunking is not only about stating that something is false. It's also about proving and explaining why it's false, and providing clear, corrective messaging based on accurate evidence and credible sources.

Related terms

Debunking is a technique people can use when they encounter false information. It's reactive. In contrast, pre-bunking prepares people to recognize and address false information before they encounter it. It's proactive.

Fact-checking is an important step before debunking, as it verifies whether information is accurate and provides evidence that can be used to correct false information.


Democratic institutions

Democratic institutions are actors and stakeholders that uphold democracy and serve the public interest. These include, but are not limited to, the electorate, political parties, the House of Commons, the Senate, the Governor General, and the public service.

More

The Canadian system of government is a parliamentary democracy. It relies on processes such as elections, campaigns, and leadership races to ensure that it represents and serves the people.

The term "democratic institutions" appears in many sources but is rarely clearly defined.


Disinformation

In Countering Disinformation: A Guidebook for Public Servants, the Government of Canada describes disinformation as "deliberately false information that is disseminated to deceive or cause harm."

More

Some sources add more nuance to their explanation of disinformation, stating that it may be partially false or misleading information.

An example of disinformation is a podcast spreading false information about a political candidate to discredit them. Another example is a fake version of a medical blog recommending a new diet pill with links to purchase it from a scam website.


Echo chambers

Echo chambers are environments where people only encounter ideas and perspectives that are similar to their own. These environments tend to amplify or reinforce what people already believe.

More

Echo chambers exist both online and offline. Some digital tools like social media algorithms tailor content to match users' interests, which can promote echo chambers. While some users may voluntarily be in an echo chamber, others may not even realize they're in one.

Echo chambers narrow our exposure to diverse ideas and perspectives, which are crucial to a functioning democracy. Considering other viewpoints can challenge our beliefs and assumptions and strengthen our critical thinking, helping us make informed decisions.


Fake news

Fake news is an umbrella term commonly used to describe any false, manipulated, or misleading information.

More

The term "fake news" is misused and overused. The term emphasizes news, but it is used to describe many different types of information. It has also become a politicized term used by people who wish to undermine, discredit, or distract from information they dislike or disagree with.

Experts recommend avoiding the term "fake news" and using more precise terms like "misinformation," "disinformation," and "malinformation."


Foreign information manipulation and interference (FIMI)

In their 2023 report, the G7 Rapid Response Mechanism describes FIMI as "an umbrella term for foreign efforts aimed at covertly manipulating the information environment to achieve strategic objectives." These malign efforts target and threaten individuals, organizations, and governments.

More

FIMI focuses on foreign threats, not domestic threats. An example of FIMI is a foreign government conducting a disinformation campaign to influence public opinion about a particular political candidate, which can impact the results of an election.

This term has been gaining traction in recent years, especially in government contexts.


Information disorder

Information disorder refers to the production and distribution of information that pollutes our information ecosystem. It includes misinformation, disinformation, and malinformation.

More

Online environments amplify information disorder. False and misleading content can be produced and circulated quickly, easily, cheaply, and widely. Such content also tends to provoke strong emotions, which makes people more likely to engage with it.

The term "information disorder" is mainly used in academic contexts.

Related terms

While information disorder describes elements that contribute to an unhealthy information ecosystem, information integrity describes a healthy one.


Information ecosystem

An information ecosystem is a complex network of elements, including people, communication technologies (both digital and analog), policies, legislation, beliefs, and practices. These elements don't exist in isolation. They interact, connect, and shape how information is created, shared, and consumed.

More

Information ecosystems are dynamic and always changing. They are informed by factors like region, community, age, and language.

Related terms

When discussing threats to information integrity, the Canada School of Public Service prefers the term "information ecosystem" over other related terms like "information environment" or "information landscape." This is consistent with the Privy Council Office, Global Affairs Canada, and the Canadian Security Intelligence Service.


Information integrity

Information integrity presents a vision of an information ecosystem that produces and provides access to diverse information sources that are accurate, trustworthy, and reliable. Such an information ecosystem is inclusive, safe, and secure. People are exposed to different ideas and perspectives. At the same time, they can rely on the information sources they choose to help them make informed decisions.

More

Some governments and organizations have started taking concrete steps to promote and strengthen information integrity. Examples include implementing regulatory measures that keep digital service providers accountable for content hosted on their platforms, developing educational campaigns and curricula to promote digital and media literacy at the individual level, and establishing mechanisms to monitor and counter disinformation during elections.

The term "information integrity" tends to be used in government contexts.

Related terms

While information integrity describes elements that contribute to a healthy information ecosystem, information disorder describes an unhealthy one.


Malinformation

According to Countering Disinformation: A Guidebook for Public Servants, malinformation is "when factual information … is deliberately and maliciously used with the goal of causing harm."

More

An example of malinformation is an activist sharing private information about a community leader in an attempt to damage their reputation.


Misinformation

As described in Countering Disinformation: A Guidebook for Public Servants, misinformation is "false information that is often shared in good faith and not intended to deceive."

More

An example of misinformation is a re-shared social media post about a teen vandalizing a local restaurant. The person who re-shared it did not verify the information. They believed the post was true, but it was actually false.

Disinformation can turn into misinformation. Building on the example, let’s say we discovered the original post was created to provoke fear and anger. The original post is disinformation. The person who re-shared the post didn’t realize it was false. They didn’t intend to deceive. What they’ve shared is misinformation.

Comparing misinformation and disinformation

Sources often define disinformation and misinformation in relation to one another to highlight their similarities and differences.

A key difference between misinformation and disinformation is intent. With disinformation, an actor creates and shares false information intentionally. The actor intends to deceive or cause harm. This differs from misinformation, where an actor shares false information unintentionally. The actor does not intend to deceive or cause harm. However, intentions are usually unclear. Some sources admit that intentions are difficult to assess and often based on assumptions.


Polarization

Polarization is when public opinions and beliefs split and concentrate at opposing extremes.

More

Polarization is multifaceted. It can relate to, for example, ideologies or social identity. Individuals could be polarized in one area but not another.

Polarization can affect social trust and social cohesion, which are essential to a functioning democracy. It can also lead to a loss of diversity in opinions expressed in public discourse. For example, people with more moderate opinions may refrain from expressing them because they're afraid of being mocked or attacked.


Pre-bunking

According to Countering Disinformation: A Guidebook for Public Servants, pre-bunking aims "to address potential misconceptions, false information and disinformation campaigns before they start."

More

Pre-bunking involves identifying and explaining tactics, sources, and messages that are manipulative, so people are better equipped to recognize false or misleading information.

Pre-bunking also involves sharing accurate information early and often. This includes regularly reviewing information and revising anything that is outdated.

Related terms

Pre-bunking prepares people to recognize and address false information before they encounter it. It's proactive. In contrast, debunking is a technique people can use when they encounter false information. It's reactive.


Threat actors

"Threat actor" is a broad term for those who intentionally create, manipulate, or share false or misleading information.

More

Threat actors can be individuals, organizations, companies, or governments, among others. They can be domestic or foreign. They can work independently or in organized groups.

Threat actors can have different motivations. They can manipulate information for financial gain, attention, fame, or even personal enjoyment. They can also do it for political reasons, like influencing public opinion. Their motivations influence the strategies they use. It can be difficult to know what their motivations really are.

Their actions have different impacts. For example, a misleading post on social media could reach 5, 500, or 500,000 people. Within the affected group, it can cause doubt, anger, fear, or frustration. It can disrupt or destabilize their beliefs, values, practices, or communities.

Related terms

Some sources describe such actors with similar adjectives: malign, malicious, deceptive, hostile, or bad.


Trolls

A troll is a person who posts inflammatory, upsetting, or misleading content online to insult, provoke, and harass others.

More

Trolls are sometimes hired to participate in coordinated campaigns to spread false or misleading information. Sources refer to these organized groups as troll farms, troll factories, or troll armies.

An example of a troll is a user posting rude and hurtful messages in the comments section of a YouTube video.

Related terms

"Troll" is also used as a verb (trolling).

When discussing trolls, the concept of bots often comes up. The difference is that trolls are human, whereas bots are automated software programs that tend to imitate human behaviour. Trolls and bots can both be used to spread and amplify false or misleading information.


About

The value of a common language

Many disciplines are grappling with the causes and consequences of misinformation and disinformation. Political scientists, policy leaders, communicators, and cyber security specialists are studying these phenomena from different vantage points. As a result, terms and concepts can vary in meaning and usage across disciplines.

Research on the information ecosystem is also evolving quickly. Terms and concepts are shifting constantly. What dominates today's conversation may not have the same significance months later. For those who are new to this subject area, it can feel overwhelming.

This tool will help you:

  • navigate the terminology: make sense of frequently used key terms and concepts related to information integrity
  • bridge gaps: clarify how perspectives on these terms vary from community to community, which is useful when working on cross-functional or multidisciplinary projects
  • align teams: facilitate reflections and discussions to achieve a common understanding of related terms and concepts

Consider creating your own version and modifying or adding terms to better suit your community, your role and your needs.

Moving forward together

To protect the integrity of Canada's information ecosystem, departments and agencies across the federal government need to coordinate their efforts and expertise: we need a whole-of-government approach.

Having a common language can help bridge diverse communities and build stronger communication between public servants navigating the information ecosystem. When we work together, we strengthen our collective ability to protect the integrity of our information ecosystem—a responsibility that belongs to all of us.

Feedback

To submit suggestions for additional terms or concepts, or comments on the descriptions, please email transferableskills-competencetransferables@csps-efpc.gc.ca.
