

THE STANDING SENATE COMMITTEE ON NATIONAL SECURITY, DEFENCE AND VETERANS AFFAIRS

EVIDENCE


OTTAWA, Monday, December 9, 2024

The Standing Senate Committee on National Security, Defence and Veterans Affairs met with videoconference this day at 4 p.m. [ET] to examine and report on issues relating to national security and defence generally.

Senator Hassan Yussuff (Chair) in the chair.

The Chair: Good afternoon.

Before I begin, I would like to ask all senators and other in-person participants to consult the cards on the table for guidelines to prevent audio feedback incidents. Thank you for your cooperation.

Welcome to this meeting of the Standing Senate Committee on National Security, Defence and Veterans Affairs. I’m Hassan Yussuff, senator from Ontario and chair of the committee. I’m joined by my fellow committee members, and I ask them to introduce themselves, beginning on my right.

[Translation]

Senator Dagenais: I am Jean-Guy Dagenais from Quebec.

[English]

Senator Al Zaibak: Mohammad Al Zaibak, Ontario.

Senator Patterson: Rebecca Patterson, Ontario.

Senator Cardozo: Andrew Cardozo, Ontario.

Senator Anderson: Margaret Dawn Anderson, Northwest Territories.

[Translation]

Senator Gignac: Good afternoon. I am Clément Gignac from Quebec.

[English]

Senator Dasko: Donna Dasko, senator from Ontario.

Senator Boehm: Peter Boehm, Ontario.

Senator Kutcher: Stan Kutcher, Nova Scotia.

The Chair: Today we begin our study on the impact of Russian disinformation on Canada. We welcome our three panels of experts who have been invited to provide a briefing on this topic.

On the first panel, we welcome from the Canadian Security Intelligence Service, Peter Madou, Assistant Deputy Minister, Operational Intelligence and Assessment Requirements; from the Communications Security Establishment Canada, Bridget Walshe, Associate Head, Canadian Centre for Cyber Security; from Global Affairs Canada, Larisa Galadza, Director General, Cyber, Critical Technology and Democratic Resilience Bureau; from Public Safety Canada, Sébastien Aubertin-Giguère, Associate Assistant Deputy Minister, National Security and Cyber Security; and from the Privy Council Office, Sarah Stinson, Director of Operations, Democratic Institutions.

Thank you for joining us here today. We invite you to provide opening remarks. I understand that, given the size of our panel, you will each deliver shortened remarks of three minutes today, as we're trying to get everybody in. Thank you very much.

We’ll start with Mr. Madou.

Peter Madou, Assistant Deputy Minister, Operational Intelligence and Assessment Requirements, Canadian Security Intelligence Service: Thank you.

Good afternoon, chair and members of the committee. My name is Peter Madou, and I am the Canadian Security Intelligence Service, or CSIS, Assistant Deputy Minister responsible for operational intelligence and assessment requirements.

I would like to thank you for inviting CSIS to appear on this issue. We recognize and appreciate the value and importance of the work you and your colleagues in the House of Commons are doing on the challenge of Russian disinformation. We welcome the opportunity to shed light on CSIS’s role in protecting Canadians against these threats.

[Translation]

Disinformation is a complex issue that CSIS, our colleagues across the Government of Canada and our democratic allies across the world face. It is harmful to global democratic interests and can have very negative impacts on social cohesion.

Foreign states use disinformation to discredit Canadian narratives and advance their own; advance inflammatory narratives to stoke tensions; manipulate international norms and standards; sow divisions among allies and partners; and erode Canadian values and faith in democracy. Both traditional and non-traditional media can be leveraged to these ends.

[English]

CSIS investigates disinformation when it rises to the threshold of foreign interference, as outlined in the Canadian Security Intelligence Service Act. That is, the efforts must be detrimental to Canada’s interests and be clandestine, deceptive or threatening.

Accordingly, we have observed a growing number of foreign states building and deploying programs dedicated to undertaking online influence as part of their daily business. Russia is chief among them. Russia employs disinformation campaigns as a cornerstone of its global foreign interference strategy, concentrating on areas of strategic importance. These operations primarily focus on neighbouring countries, the former Soviet bloc states, Five Eyes alliance members and North Atlantic Treaty Organization, or NATO, countries — Canada among them.

These efforts do not favour one party or another but, rather, aim to sow division and distrust in democratic institutions.

[Translation]

Canada is not currently targeted by Russian disinformation to the same extent as some of our allies. Given an ever-changing geopolitical landscape, it is important that Canada remain vigilant regarding the potential for future escalation against Canada’s democratic processes.

CSIS continues to engage with Canadian communities; advocacy groups; businesses; industry associations; academic institutions; federal, provincial, territorial and municipal governments; and Indigenous governing bodies to ensure that they are aware of the national security threats facing our country and give them the information they need to protect their interests.

[English]

I will conclude by noting that while CSIS cannot publicly comment on its operational activities or ongoing criminal investigations led by law enforcement, I welcome this opportunity to answer your questions.

The Chair: Ms. Walshe?

Bridget Walshe, Associate Head, Canadian Centre for Cyber Security, Communications Security Establishment Canada: Good afternoon, chair and members of the committee. Thank you for the invitation to appear today.

My name is Bridget Walshe. I am the Associate Head of the Canadian Centre for Cyber Security, or the Cyber Centre, for short.

[Translation]

The Cyber Centre is Canada’s technical authority on cybersecurity. Part of the Communications Security Establishment Canada, we are the single unified source of expert advice, guidance, services and support on cybersecurity for Canadians and Canadian organizations.

[English]

We work in close collaboration with government, critical infrastructure, Canadian businesses and international partners to prepare for, respond to, mitigate and recover from cyber incidents. Although disinformation has long been used by adversaries, Russia’s invasion of Ukraine in February 2022 gave the world a new view and understanding of how cyber activity is used to support warfare.

[Translation]

Russia’s unpredictable cyber program routinely breaks cyberspace norms, and furthers Moscow’s ambitions to confront and destabilize Canada and our allies.

[English]

As outlined in our recently published National Cyber Threat Assessment, Russia almost certainly views its cyber program as part of a multi-layered strategy to influence and shape the information environment. Russia combines conventional cyber espionage and computer network attacks with disinformation and influence operations to promote Russia’s global status and reinforce pro-Russian narratives, to erode trust in democratic institutions, to generate popular support for Russia’s war efforts — both within Russia and abroad — and to psychologically weaken or embarrass its opponents.

Russia and other countries have incorporated artificial intelligence into their disinformation campaigns. These campaigns make it difficult for Canadians to separate truth from falsehoods.

Disinformation is propagated online.

[Translation]

We encourage Canadians to have good cyber hygiene, which includes adopting security best practices on social media. That is especially important for those in public positions.

[English]

Be mindful of unsolicited or unusual emails by refraining from clicking on any links contained in suspicious emails. Use available security settings such as multi-factor authentication. Use good judgment when posting.

We prioritize monitoring cyber threats to Canada, and as they evolve, the Cyber Centre remains focused on tackling these threats.

Once again, thank you for the invitation to appear today to speak about the vital work being done at the Communications Security Establishment Canada, or CSE, and the Cyber Centre.

The Chair: Thank you.

Ms. Galadza?

Larisa Galadza, Director General, Cyber, Critical Technology and Democratic Resilience Bureau, Global Affairs Canada: Thank you, Mr. Chair, members of the committee.

I am pleased to have the opportunity to share what we know at Global Affairs Canada about Russian disinformation efforts.

[Translation]

My name is Larisa Galadza. I am the Director General for Cyber, Critical Technology and Democratic Resilience at Global Affairs Canada, or GAC.

[English]

One of my responsibilities is the G7 Rapid Response Mechanism, or the RRM.

[Translation]

Since its inception, the G7 RRM has focused primarily on countering foreign threats to democracy, such as information operations.

[English]

Rapid Response Mechanism data analysts use open-source data and research methods to monitor the global digital information ecosystem for foreign-backed information operations. Thanks to this work, Canada — together with allies and partners — is able to detect, correct and call out the Kremlin’s malign activities.

As my colleagues have said, Russia’s objective is to erode trust in democratic institutions and processes, undermine elections, replace facts with hostile and false narratives and create and amplify divisions in our societies.

Russia seeds its narratives in covert ways and is an equal opportunity divider, meaning it will engage in an inflammatory way across the political spectrum to exploit existing social issues and widen societal divisions.

Whenever Russia’s messages are repeated by influencers and people of authority, they take on credibility.

In its information operations, Russia is playing the long game, understanding that its tactics erode and fragment social cohesion over the long run.

The documents laying out the United States’ federal indictment of Tenet Media clearly show how Russia launders millions of dollars to content creators to amplify divisive narratives on issues like immigration, inflation and other topics related to domestic and foreign policy.

[Translation]

The exploitation of these types of contentious topics and narratives can pose serious challenges, confirming to us that Canada is not immune to false and misleading narratives.

[English]

Global Affairs Canada has recently called out outlets like RT, formerly Russia Today, which purported to be a media entity but is, in fact, now an arm of Russian intelligence.

We have also condemned the Kremlin’s funding of firms like the Social Design Agency and Structura. They have run a sophisticated, complex disinformation network known as Doppelganger which spams online spaces with inauthentic posts, falsified documents and deep fakes.

The G7 RRM analysis has found that, among many tactics, Russia uses layers of unbranded, partner channels to promote pro-Kremlin viewpoints and finds indirect channels to target Western audiences to get around sanctions on Russian state media in Europe and North America.

Russia’s disinformation campaigns have had significant global impacts. We have tracked their malign activities in Moldova during recent elections.

The Kremlin funded a number of influencers and pro-Russian candidates with ties to Russia and spent nearly €100 million to undermine the presidential electoral process.

In sub-Saharan Africa, after Russia’s full-scale invasion of Ukraine, Russia’s disinformation weaponized food and hunger by falsely blaming Western sanctions for food insecurity.

[Translation]

Disinformation is a tool of subversion used by Russia to achieve its strategic objectives. It is global in scope, but tactics are tailored to the region or country it seeks to influence.

The better we understand Russia’s disinformation networks, tactics and strategies, the better we can respond.

[English]

Thank you for your interest in this topic.

The Chair: Thank you.

[Translation]

Sébastien Aubertin-Giguère, Associate Assistant Deputy Minister, National Security and Cyber Security, Public Safety Canada: Good afternoon. My name is Sébastien Aubertin-Giguère, and I am the Associate Assistant Deputy Minister for National Security and Cyber Security at Public Safety Canada. I am also the Counter-Foreign Interference Coordinator.

[English]

Russia views information manipulation as a key tool to advance its interests. Over the past decade, the Russian Federation has increased its information manipulation and interference operations against NATO members and their democratic partners, including Canada.

Russia’s information efforts are pervasive, well funded and guided by the Kremlin, but they are carried out by a large, distributed network of actors across multiple jurisdictions.

[Translation]

Russia uses a range of channels and techniques to manipulate information, and these techniques keep evolving, but, overall, Russia seeks to achieve key objectives.

[English]

First, to undermine the stability of Western democratic societies by eroding social cohesion and public trust in institutions.

Second, to break NATO unity and undermine Western support for Ukraine. Russia’s influence infrastructure is designed to opportunistically take advantage of current events and crises to further polarize our society. We are watching these activities closely.

Canada has well-developed tools to address the threat. We are still working to develop further policy and operational responses.

The Government of Canada is also taking measures to address more broadly the threat of foreign interference. Bill C-70, An Act respecting countering foreign interference, received Royal Assent in June 2024.

Bill C-70 created new criminal offences in the Foreign Interference and Security of Information Act. It is now a criminal offence in Canada to work to harm the security of Canada at the direction of, in association with, on behalf of or for the benefit of a foreign power, and also to interfere surreptitiously in a political process in Canada for a foreign power.

Bill C-70 also introduced the Foreign Influence Transparency and Accountability Act, which creates a registration obligation for individuals acting on behalf of foreign principals, if they seek to influence activities within our political and governmental processes.

I would note that the recent indictment against RT in the United States included specific charges pursuant to the Foreign Agents Registration Act, or FARA, which is the functional equivalent of our Foreign Influence Transparency and Accountability Act, or FITAA.

I would be happy to discuss these issues with the committee. Thank you.

The Chair: Last but not least, Sarah Stinson.

Sarah Stinson, Director of Operations, Democratic Institutions, Privy Council Office: Good afternoon, Mr. Chair and senators. My name is Sarah Stinson. I’m the Director of Operations in the Democratic Institutions secretariat at the Privy Council Office.

I’m happy to appear here today before you to speak to the work we undertake to support Minister LeBlanc in his mandate to lead an integrated government response to protect Canada’s democratic institutions, including federal electoral processes against foreign interference and disinformation.

As we near the end of the biggest election year in history, in which more than half of the world’s population has gone to the polls, the work to protect Canada’s institutions from threats, including disinformation, has never been more important.

As part of Budget 2022, the government established the Protecting Democracy Unit within the Democratic Institutions secretariat at the PCO to coordinate, develop and implement government-wide measures designed to combat disinformation and protect democracy.

[Translation]

A key part of that work is the government-wide coordination of Canada’s Plan to Protect Democracy, which was introduced in advance of the 2019 general election.

The plan was updated prior to the 2021 general election, as were the measures to strengthen citizen resilience and organizational readiness to combat foreign interference, and to build a healthy information ecosystem.

We are constantly improving the plan in response to current challenges and the emerging issues, in accordance with Minister LeBlanc’s mandate.

Since protecting democracy requires a government-wide whole-of-society approach, we work closely with other federal departments and agencies, as well as with the provinces and territories, municipalities, civil society groups, academics and global partners.

We have a vested mutual interest in sharing information and establishing best practices. This cooperation includes developing tools that build resilience in order to counter disinformation and increase awareness of foreign interference threats.

These tools were shared with parliamentarians, political parties, public servants, as well as provincial, territorial and municipal governments. Thank you.

[English]

The Chair: Thank you for your opening remarks. We will now proceed to questions.

In order to ensure each member is able to participate fully, I will limit each question, including the answer, to four minutes. Please keep your questions succinct and identify the person you are addressing the question to.

Our first question comes from our deputy chair, Senator Dagenais.

[Translation]

Senator Dagenais: My first question is for Mr. Aubertin-Giguère.

I have a political question. Is Canada likely to experience more disinformation at the hands of Russia as a result of Donald Trump’s election as the new U.S. president? Also, did anything emerge during his first term as president that could help us better defend ourselves or reduce Russia’s capacity for disinformation?

Mr. Aubertin-Giguère: I can’t comment on the political situation south of the border, but I will say that the different parties in the U.S. have different perceptions of disinformation. For example, some believe that disinformation is more about freedom of expression, so efforts to combat that disinformation meet with different attitudes.

We are well aware of the different perspectives on the phenomenon, and we’re trying to adapt to the situation.

Senator Dagenais: Thank you.

My next question is for Ms. Stinson. To what extent is the information that intelligence agencies share with you shared with politicians in full or in part? Is some information withheld or systematically not shared? If so, for what reasons?

Ms. Stinson: Thank you for your question.

[English]

The Democratic Institutions secretariat at the Privy Council Office, or PCO, works very closely with its colleagues in the Security and Intelligence Secretariat at the PCO. We do not receive direct intelligence from the security agencies, including CSIS in particular. Rather, we use that information as part of the policy development process once that analysis has been undertaken, whether by the security agencies or by PCO’s Security and Intelligence Secretariat.

Given that, I’m not privy to intelligence that may flow up to the political level, whether it’s to the Prime Minister’s Office or to the Minister of Public Safety or to others who are interested or who need to receive that information. Our role is more in line with the policy development work, including, for example, the measures that I can outline as part of the plan to protect Canada’s democracy.

[Translation]

Senator Dagenais: My next question is for Mr. Madou.

Can you give us an idea or examples of the disinformation narratives targeting Canada, as compared with our Five Eyes partners? Also, are we as well equipped as they are to detect, and respond to, that disinformation?

Mr. Madou: Thank you for your question, senator.

First of all, I think much of the disinformation we face is fairly similar to what other countries are dealing with, given that a lot of the information is online and thus available to most Internet or social media users.

When it comes to Russia, there’s no doubt that we are less targeted than the Americans, if you want a comparison. However, the disinformation affecting Canadians lately has primarily been in relation to our position on Ukraine. The purpose of that disinformation messaging may be to influence public opinion or polarize the population in relation to our government’s support for Ukraine. That’s a real-life example of how disinformation has affected our country.

Senator Dagenais: Thank you very much.

[English]

Senator Boehm: Thank you, witnesses for being with us. Thank you, Senator Kutcher, for your inspiration in bringing this topic forward to the committee.

We’re very different from the Americans in that we don’t carry our titles around for the rest of our lives, but I’d like to thank Larisa Galadza for her role as our ambassador to Ukraine. With a compliment, of course, comes the question I’m going to ask you.

On the Rapid Response Mechanism — this is one of the successes that came out of the Group of Seven, or G7, presidency, our last one in 2018 — my information suggests that it’s the Canadian government that keeps it running. You have the secretariat. There’s a special Eastern European unit in there now. Is there as much dedication on the part of the other G7 members to the purpose of this mechanism? Are they providing funding, for example? I’m thinking, in particular, of countries that are on the front line of RT and other types of interference from Russia, such as Germany and Italy. Could you explain where exactly this mechanism is situated and whether you would see next year’s Canadian presidency as a way to give it a bit more push and life?

Ms. Galadza: Thank you very much both for the recognition and the question. I am very pleased to report that the G7 Rapid Response Mechanism is more active and more inclusive than ever before. The activity is increasing every year. It’s not just the G7 anymore. There are a number of partner countries that have joined it. Each country has a unit of some sort and takes on a piece of the digital ecosystem to watch that is particularly important for it.

Canada houses the G7 RRM secretariat, so it’s our job to coordinate among the members of the mechanism. We also have a very robust capacity of our own that looks beyond just Russia at the main threat actors in this space.

We’re seeing that there is an increasing desire to respond in a collaborative way. There’s work to draw up protocols for doing that. I have no doubt that this will be on the agenda for our presidency. It was with the Italians, and as I said, the interest in it is growing, unfortunately, because countries see exactly how malign actors are active in their information space very persistently and clearly. We know that a collective response is the strongest way that we can answer those malign activities.

Senator Boehm: Is the secretariat only Canadian financed?

Ms. Galadza: Yes.

Senator Boehm: To go to my colleague Senator Dagenais’s first question, are you anticipating any changes coming from Washington? I know it’s early days. The American machine is very powerful in terms of intelligence-gathering operations, et cetera. Would the Americans see this as something that might not be necessary in the future?

Ms. Galadza: The information that we have is that the Americans are as concerned about foreign actors meddling in their information space as we are.

Senator Boehm: Thank you.

Senator Kutcher: Thank you all for being here. I echo my colleague’s comments. Thank you again.

Sometimes I feel like Canada is in a bit of a second round of Igor Gouzenko. Some members of this committee are aware of the issues around the impact of disinformation in Canada. Thank you for bringing up the Doppelganger network, but also the influencers, some of whom are paid and some of whom are not, and the long-tail actors, many of whom are paid, who hide for quite a while and then all of a sudden pop up, and there they go. I don’t think the public is very aware of the depth and perniciousness of Russian disinformation. I only became aware of it about a decade ago in the health space and vaccines, realizing that vaccine disinformation was being pushed out by Russia.

We know from academic work that there are three things that actually work at the public level to counter disinformation. One is prebunking, one is postbunking and the other is fact-checking.

But these things have to be immediate. They have to be done in effective communication styles. They have to target specific communities. They also have to be followed by robust action to shut down the actors who are purveying this disinformation.

Could you share with us what specific things the Canadian government is doing in each of those areas that will give us comfort that we have active prebunking, postbunking and fact-checking activities going on, that they’re reaching the government and that they’re being done by both government and non-government organizations?

Ms. Stinson: Thank you very much for the question. I might touch first on some of the prebunking. As I outlined, one of the four pillars in the plan to protect Canada’s democracy is building citizen resilience. That’s a key element in that prebunking space so that when Canadians receive information online or see things that they question, they can have media literacy. The degree to which we can build that literacy in our citizenry is one of the best defences.

Some of the programs in place to do that are led by Canadian Heritage through the Digital Citizen Initiative, which, in the context of the Russian war on Ukraine, funded 11 projects to build citizen resilience and media literacy within Canada focused on that area. That program has, in fact, been in place since 2019 as part of the first plan to protect Canada’s democracy, in advance of the general election of that year.

I would also highlight that, more recently, in the spring of 2023, the government invested $5.5 million over three years in the Canadian Digital Media Research Network. You mentioned government programs, but there are also programs and efforts by civil society, because it is a whole-of-society effort to combat these threats. The Canadian Digital Media Research Network encompasses close to a dozen organizations across Canada working in the areas of disinformation, foreign interference and the overall health of Canada’s information ecosystem. They have released a number of reports, including, most recently, one on Tenet Media. They are very aware of how these threats impact Canada’s information ecosystem, and their work is led out of McGill University and the University of Toronto.

Finally, I would just add, in the prebunking and postbunking space as well, Minister LeBlanc has undertaken some work. In my opening remarks, I referenced some toolkits that were shared with parliamentarians at the federal level, as well as the provincial level and municipalities through the Federation of Canadian Municipalities. Those programs are also targeting elected officials, community leaders and public servants.

These are all efforts to create awareness and provide helpful approaches and tools to be able to respond in the event that individuals see things that they think might be threatening.

We are also currently working with the Canada School of Public Service to develop some training modules, along with the University of Ottawa, for all public servants to build that resilience within the public service as well. Thank you.

Senator Patterson: Thank you very much. I will pull on one of those threads a little bit. It’s about public trust in institutions and how this also affects Canadians outside of Canada. I will focus specifically on the Canadian Armed Forces. I knew your name was familiar, and I wasn’t sure why.

We actually know that Russian disinformation has been attacking the credibility of the Canadian Armed Forces since Operation UNIFIER when it was in Lviv. We’d often have Russian media trying to get into groups that were supporting survivors of sexual misconduct in the Canadian Armed Forces, trying to get stories on officers and promoting it.

We know right up until today, our current Chief of the Defence Staff also went through a pretty horrific time with accusations that were untrue about her time in Iraq to try to discredit her in the eyes of the Canadian public but also in the eyes of members of the Canadian Armed Forces.

We have to focus on Canada — that’s where we’re going — but we also have a federal organization that is responsible for executing defence and security, especially out of the country on behalf of Canada. With that, we don’t really hear much about it. I agree about the debunking that needs to go on, but it is kind of a very quiet space.

With issues such as this, whether they’re affecting diplomats overseas, whether they be in China, Ukraine or wherever, should there be regular reporting of these incidents? Right now, the Canadian public has nothing to balance off what officials are doing both inside and outside Canada. I would suggest it may even be impacting the ability of the Canadian Armed Forces to recruit. That’s just a small aside on which you’re not required to comment.

Should we be reporting on those activities where we can, in an unclassified manner, in order to help Canadians understand attacks that are being made on public officials such as the diplomatic corps and your Canadian Armed Forces? Perhaps Public Safety Canada can answer first or to whomever you think it belongs.

Mr. Aubertin-Giguère: Thank you for your question. Regarding the building blocks of our capacity to respond, Ms. Stinson was talking about the infrastructure we’ve put in place for election periods and democratic processes. Ms. Galadza was talking about the G7 Rapid Response Mechanism, and its capacity to draw attention to certain specific methods used by those actors.

The next step should be for each department to take a more aggressive communication approach to defending its own equities: more connected work in identifying evidence of such operations and then supporting different departments in speaking up, not debunking but setting the record straight about the source of the information or suspicion.

It’s more of a whole-of-government effort, but, as I said, we have the building blocks established around the very important core equities that come into play around elections. We need to expand.

Ms. Walshe: I might add that we, as a government, have certainly responded to certain Russian disinformation campaigns. As you note, at times we have information that might be classified for a very good reason. We certainly have even declassified information so that we can call out particular instances — and in this case, disinformation targeting the Canadian Armed Forces. It’s very important work.

Mr. Madou: Thank you for your question. A lot of the work we do is classified, but we have a lot of open information available in different languages to target different communities. We’ve also had an information campaign on X, or Twitter, relative to building more resilience in the disinformation space with Canadians. Of course, we have a robust outreach program where we reach out to academia and different communities and different levels of government.

In the classified space, aside from the investigations we do, we will also brief impacted departments, even though it’s not public, but we will build resiliency that way by briefing GAC or others with the rest of the security and intelligence, or S&I, community.

Senator Dasko: Thank you to everyone for being here today. It’s nice to see the ambassador again, too.

I was looking at the Global Affairs site. There’s a list of disinformation messages from Russia and arguments to rebut those messages.

The site is very useful, but I was looking for a little more context around these messages, so I will ask you questions about them. Which of the messages that you’ve listed — or others — are the most prevalent that we are hearing from the Russians? What are the big messages? Is it that the growing strength of NATO is what got them to Ukraine? What is the most prevalent of the Russian disinformation messages coming to Canada? Prevalence is first.

The second is the impact. Do you have any sense of the impact on Canadians of particular messages? That leads me into a third, larger question, and I’m not sure who to direct any of these questions to, but whomever might wish to answer may. Three of you mentioned Canadian social cohesion as being affected. That’s a very big topic. Do you have any sense of the actual impact of messages on social cohesion?

I know what they’re trying to do. I’m trying to understand how successful they may be, so that’s three questions.

Ms. Galadza: In terms of prevalence, my colleagues can jump in on what they see. Russia is targeting Canadians and the citizens of like-minded countries about Ukraine. The message is tailored to them and also to Ukrainians. There is the misinformation about the historical, revisionist perspectives on Ukraine’s sovereignty, independence and territorial integrity. There is disinformation that there is waning support for Ukraine in Canada, the United States and Europe. That’s targeted at Ukrainians to make them concerned, but that also gets Canadians believing — and I’ve heard from Canadians that there’s less and less support and that we don’t support it anymore. There is also the stoking of anti-refugee sentiments to say that Ukrainians fleeing Russia’s ongoing military aggression are no longer welcome in Europe, North America or elsewhere. Then, of course, the classic message we know is that their invasion was a response to NATO’s encroachments, so those are clear Russian narratives.

They are, from what we see, the most prevalent. In terms of impact, it’s very hard to measure impact, especially within our own society, but we know the impact is greater when Russian disinformation comes out of the mouths of influencers and credible authorities. Once that happens, it is very hard to debunk or roll back.

In terms of social cohesion, there is the misinformation tactic that they use, but they also sow discord. This makes debate much more difficult in a country, exploits existing divisions and creates new ones on topics that are vulnerable. So that is a separate kind of tactic. As I said in my remarks, they don’t necessarily target the left or the right of the political spectrum. Wherever there is discord to be sown, they pour accelerant on it through the information space and make public debate more difficult.

The Chair: Sorry, not to be rude, but the time on this has reached its limit.

Senator Cardozo: Thank you very much for being here. I’m both reassured by your work but also confused. I wonder if I could ask you to outline, in one or two sentences, what your agency does and how it is different from the others. And then if somebody can tie it all together, that would be useful. So that things don’t fall between the cracks and you’re not stepping on each other’s toes, maybe I could just go down the table.

Ms. Stinson: Within the PCO Democratic Institutions secretariat, we coordinate the whole-of-government approach to protecting Canada’s democracy, particularly during the election period, but increasingly, all the time. We work with all of the relevant departments and the programs, measures and initiatives that they have in place to bring that together under Minister LeBlanc’s mandate.

This includes citizen resilience tools. It also includes support through some of our policy work for the Rapid Response Mechanism, obviously under the authorities of different ministers, but we have that overall coordination role as part of the plan to protect Canada’s democracy.

Senator Cardozo: So you coordinate with —

Ms. Stinson: I do.

Senator Cardozo: And focusing on this issue of foreign interference.

Mr. Aubertin-Giguère: Public Safety Canada is a policy-coordination department. We work with other partners in Canada to come up with responses, good policies and legislation to combat any national security issue, but foreign interference in particular. We have also been the lead agency or the lead department for the new legislation, Bill C-70, and we’re also implementing the foreign influence transparency registry.

Ms. Walshe: Within the Communications Security Establishment, we have multiple responsibilities under our mandate. One of those is the collection of foreign intelligence. I come from that perspective and, as I mentioned, have even collected information on Russian disinformation, declassified that and released it to help inform the public.

Another one of our major responsibilities, within the Cyber Centre where I work, is cybersecurity. Within that remit, we look very carefully at the techniques, the tools and the threats associated with actors like Russia, including disinformation activities. A couple of things we do to counter those threats include producing publicly available assessments: an assessment of cyber threats to democratic processes, which we publish regularly — the last about a year ago — and the National Cyber Threat Assessment, made public just in October, where we outline the tools and techniques, including things like artificial intelligence, that these threat actors use to propagate disinformation, to give the public — and especially public figures — a better understanding.

Mr. Madou: Thank you for your question. From a CSIS perspective, it really comes down to the Canadian Security Intelligence Service Act. You’re quite right that the world of disinformation is rather large, and some of it is free speech. We intersect with that world when it comes to foreign interference. When it is detrimental to the interests of Canada and also deceptive, clandestine or threatening, then it becomes an investigative issue that we manage. From there, working with the rest of the S&I community, we might take some threat-reduction measures or, alternatively, we could publicly disclose information, similar to what my colleague was talking about. For us, it intersects when it is foreign interference by hostile actors.

Ms. Galadza: At Global Affairs Canada, we look at what is happening in the world that negatively affects our democratic resilience, so that’s the foreign interference piece. In my unit specifically, we have an open-source capacity as opposed to classified information. We look at the open source to see what is happening there. We also work with other countries to coordinate, monitor and respond, and we advise as part of the larger government-of-Canada effort. I work with these people all the time. We advise on the foreign policy impacts of these threats or any responses.

Senator Cardozo: Mr. Madou, did you say the S&I community?

Mr. Madou: Yes, the security and intelligence community is what I meant by S&I.

Senator Cardozo: Oh S&I. Who else is part of it beyond you?

Mr. Madou: We have our colleagues at CSE and the Canadian Centre for Cyber Security, as well as our colleagues from Global Affairs Canada, who also have an S&I function, Public Safety Canada, the RCMP and the Canada Border Services Agency. Those investigative agencies constitute the S&I community for Canada.

Senator Cardozo: Thank you.

Senator Anderson: Can you elaborate on Russia’s use of disinformation on the Arctic given the geopolitical situation and the growing interest in the Arctic as a shipping lane, potential economic opportunity and a strategic defence site? What is being done to counter disinformation? Specifically, how are you working with the communities and the territories in the North to ensure open lines of communication?

Mr. Madou: Thank you for your question, senator. From a CSIS perspective, aside from the fact that we monitor, investigate and attempt to reduce the threat of foreign interference in a broader sense, including Russia and the multiple threat vectors coming from Russia, the manner in which we attempt to build resiliency is through engagement. We engage with multiple communities, including territorial governments, as well as with industry and other sectors that are active in that space, in order to build resiliency.

We do this as part of our community effort with the rest of the community, but we play that one role.

Mr. Aubertin-Giguère: That wouldn’t be a prime area of interest from the Russian disinformation perspective, but the key messages here are that, first, they have territorial claims that compete with ours in the North. The second stream is that Russia portrays itself as a kind of actor for peace in the region, when they have a very clear strategic interest in militarizing the North. Those would be the main or key messages. I haven’t seen any concerted effort on Russia’s part to target northern populations.

Senator Anderson: I will expand on my question a little bit. According to some of the readings, Russia has targeted NATO, particularly as it pertains to the Arctic. Can you elaborate on that?

Mr. Aubertin-Giguère: Russia systematically targets NATO in everything they do. They are always accusing NATO countries of militarizing spaces, and they are positioning Russia to say that they are peace seekers, whereas it’s very clear in the Russian doctrine that they are militarizing the North and investing significant resources to upgrade their position over there.

In their messaging, they mostly target North American defence as we traditionally see it.

Ms. Galadza: I would add that the Arctic strategy released on Friday of last week recognizes the vulnerability to disinformation in the North. It says that we have to see Arctic security through traditional and non-traditional lenses, and the non-traditional includes disinformation. And, yes, there is the strong domestic narrative Russia uses of a hostile, unfriendly West, which serves in part to justify the militarization of the Arctic.

Ms. Walshe: We recognize that we have strong relationships, and we dedicate time to working with all provinces and territories, but given the unique nature of the governments in the North, we do prioritize working very closely with them to make sure we are sharing threat information, including on disinformation. We also give advice and guidance to those levels of government so they are well prepared to address that threat from their perspective.

Senator Al Zaibak: My question is directed to Ms. Walshe. Based on your assessment, what are the primary methods used by Russian actors to infiltrate Canada’s information space?

Ms. Walshe: I can speak to what we have observed from the online methods used by Russia for mis- and disinformation. Some of my colleagues may be better able to address other aspects of it.

In the opening remarks, it was noted that disinformation is often perceived as more believable and is taken up more by the public when it’s shared by someone they trust or when it reflects a narrative that already exists in the space. What we have noticed, in particular with Russia and other countries, is that through the use of artificial intelligence, they’re better able to take those narratives and promote them. For example, by creating fake accounts on a social media platform, they may be able to take a message that isn’t getting much traction and promote it so it gets more. They can also use that same artificial intelligence to spread similar messages by creating fake content, things like deep fakes. Over the last two or three years, as that capability has increased, we have noticed that Russia is taking advantage of it and using it to spread those narratives that already exist.

Senator Al Zaibak: How can an ordinary citizen detect such scandalous disinformation and identify the source?

Ms. Walshe: It’s definitely a difficult piece. My colleagues have already spoken about prebunking and debunking as methods that are used to call out particular narratives, but it also helps to rely on trusted sources and, as always, to take a good look at the information that is there and ensure that what you’re seeing matches something you believe to be true. Questioning those things acts as a sort of pre-inoculation against mis- and disinformation. An important step for people to take includes the security of their own accounts. Take note: Is this a real account that I’m seeing? Is this something that has that check showing the account is verified? Those who are in public spaces should also be noting, looking and flagging any time they may see an account pretending to be them. Misrepresentation and fake accounts are things people should be on the lookout for.

Senator Al Zaibak: Thank you. I have a burning question. Do we have any counter-campaign to this misinformation? Do we have a response? I don’t know if we do or not, but I’m here to know. I would also like to know whether we have the capability to intercept and filter out such kinds of information before it reaches the public. Any one of you can please answer.

Mr. Madou: Thank you for your question. The best way to counter disinformation is to have, as my colleague was saying, trusted sources of information, the government included. It is a massive challenge in the world of free speech and online international social media to attempt to filter out things that are identified as disinformation.

As an agency and as a community, when we do come across things that appear to be fake, avenues should be available to mitigate them by highlighting them to whoever is hosting such websites, or to counter them with our own tweets that speak against them and speak about the issue differently. This is the manner in which we, as a community, are trying to address that.

Mr. Aubertin-Giguère: You don’t counter disinformation with disinformation. You counter it with information integrity, and that’s our position.

Senator Richards: Thank you for being here. It is a kind of loop, a figure eight, with the questions we’re asking because mine are so similar to Senator Dasko’s.

Is Russian disinformation the most pervasive, more so than China or other adversary countries? In what specific ways has Russia targeted and damaged Canadian institutions, especially political ones?

You mentioned creating divisions. We have much discord here already, so in what way is this exacerbated by Russia? Is it lasting? Is the discord which Russia has tried to sow pervasive, punitive and lasting enough to turn public opinion against our institutions?

Mr. Aubertin-Giguère: China tries to portray a positive image of itself. It’s more concerned about the things that should not be said, so it’s about controlling messaging, whereas Russia has a more negative approach to information. The core approach here is to target points of weakness and social discord in a population and amplify them and also, sometimes, to provide conflicting information so that, in the end, people don’t trust anything they see. That’s been their approach. They’ve been much more aggressive in that space and have been for a long time. These are all Soviet techniques, even.

I would say the problem with that is the stickiness of negative messaging, especially in our information landscape. It’s having an impact; that’s for sure. Also, it’s difficult to measure the impact, but it’s concerning when you see messaging that is clearly Russian disinformation now becoming mainstream or being adopted by certain circles of influencers and social media personalities.

Mr. Madou: If I may, maybe in terms of impact, because it’s come up in a few of the questions, but I think a secondary order of impact of all of this disinformation space and the social cleavages it creates in our communities is violent extremism. It’s a second-order effect, but the confusion that’s created — and what we are seeing from a service perspective — is that there is definitely a rise in youth violent extremism. Some of that we can attribute to the notion that a lot of that generation lives online, and by consuming and being subjected to so much disinformation and dissent, it creates that confusion that mobilizes some to violence. It’s not the only effect, but it is definitely a secondary impact of this —

Senator Richards: Is that directly Russia, or is it other bad actors as well?

Mr. Madou: It’s the conflation of all that disinformation, so it’s not specifically directed by one country.

Senator Richards: Thank you.

The Chair: Colleagues, we are at the five o’clock mark, but given that we only have two panellists in the next session and four individuals have indicated they want to be on the second round, I will ask each of you to pose a succinct question, one after the other. Then we’ll get our panellists to be equally succinct, and we’ll try to create enough time for the next four senators to ask a question.

Each one of you, if you can please ask your questions, and we’ll get the answers.

Senator Boehm: Thank you, chair. Succinct is my middle name, so here we go.

Ms. Stinson, I have a question for you. I presume that Elections Canada has built in some measures and is ready for whenever the writ is going to be dropped for elections here. The Germans are going to have an election in February. There’s a lot of electoral activity.

Is there anything that you can say in terms of you advising Minister LeBlanc that we are resilient enough, compared to other countries, to face the disinformation that’s coming? Let me offer but one example.

The most popular platform out there is still X. I am on it. I don’t post much anymore, because of what it has become, but the owner of this platform was perfectly prepared to put his finger on the scale in either purveying conspiracy theories or trying to influence elections based on the power that one platform has, which, of course, is full of bots and other things.

Do you have a view?

The Chair: We will come back to your answer in a minute.

Senator Kutcher: Through the chair, could we please ask each of the panellists to provide to us in written form the answers to the specific questions that I asked that did not get answered? First, what are the exact prebunking activities that your organization does, who does them and how effective are they?

Second, what are the exact postbunking activities your organization does, who does them, who do you fund to do them and how effective are they?

Third, what are the exact fact-checking activities that your organizations do, who does them and how effective are they?

When you’re talking about effective, I want to know for the public as well.

And by the way, the municipal one, they do not have a clue. I did my homework.

What actual activities does the government do to shut down malignant actors? For example, how many bot farms have been shut down in the last two or three months, and what are the specific things that you have done to shut down malignant actors?

Thanks.

Senator Patterson: Mine is going to piggyback onto that beautifully, so I’m going to ask Ms. Stinson as well.

We also know that Canada, as a federation, is itself being attacked. A lot of these great fact-checking and debunking activities require provincial and territorial cooperation, including right down into our schools.

My question is: How are we helping to work with provinces and territories — just as an example — in order to get it into primary school and kindergarten hands?

Thank you.

[Translation]

Senator Dagenais: My question is for Mr. Madou. It’s short and to the point. To what extent do the Russians use traditional media in Canada to spread disinformation? Are some media outlets unknowingly being manipulated?

Senator Gignac: Thank you again for the work you do. You help to maintain the public’s trust in our institutions. North Korea is very active when it comes to cyberattacks. What role do North Korea and its partnership with Russia play in Russian disinformation?

[English]

Ms. Stinson: In response, senator, to your question, there are three aspects that I want to highlight.

The first is the work we do looking at international best practices. In that space, we have looked in particular at the U.K., which had an election in July, as well as the EU and France, to see how they responded to foreign information manipulation and interference, what tools and measures they put in place and how those could potentially be considered in a Canadian context.

For example, we’ve looked very closely at the French VIGINUM model in terms of how they approach information manipulation integrity from foreign state actors in their domestic space. Those are great sources of policy inspiration for us.

That includes how they work with social media platforms. As you may know, in advance of the 2019 and 2021 general elections, there was a voluntary arrangement that the Government of Canada had with some of the key platforms in the spirit of ensuring — recognizing that it was voluntary — principles of integrity, authenticity and transparency. The landscape has changed significantly since 2021.

Particularly as we look to the recommendations that will come out of the public inquiry into foreign interference, among other reviews and recommendations, we are considering how we can best work with social media platforms in that electoral process context to engage them and ensure, at a minimum, that their community standards are upheld and integrity is enforced during the election period. It’s an ongoing and quickly evolving space.

I would also note, in terms of Minister LeBlanc, that you mentioned Elections Canada. Currently before Parliament is Bill C-65, regarding Minister LeBlanc’s responsibilities for the Canada Elections Act. That bill focuses on both participation measures and privacy measures, but there are some safeguarding measures in there as well that further strengthen the act against foreign interference, particularly as it relates to foreign funds and the use of untraceable currencies like cryptocurrency, among other measures.

It’s through the Democratic Institutions secretariat that we provide that advice to Minister LeBlanc with respect to the administration of the elections, as that responsibility falls to Elections Canada.

The Chair: Thank you.

In regard to Senator Kutcher’s question, if you could provide written answers to the very specific points he made in a number of areas as to what each department is doing and how that is coordinated, I think that will be very helpful, as we will look at the remarks and responses as we formulate our study findings at the end of the day.

Senator Patterson had a similar question that could piggyback on that.

Senator Dagenais had a question to Mr. Madou.

[Translation]

Mr. Madou: Thank you for your question, senator. Obviously, I come at the issue from CSIS’s standpoint and the way we deal with foreign interference. We look at the tactics used and the actors involved. We don’t look at the media as a whole. I can say, though, that there is no doubt, in my view, that traditional media include a large number of people, but I think the more traditional media may be more rigorous about fact-checking and scrutinizing information. That is especially true for media based in Canada, not online. When it comes to online media, it is less clear who is behind the media organization. Certainly, a number of our more traditional media sources can be trusted, but that doesn’t mean there aren’t attempts to influence them. Traditional media that rely on multiple sources and fact-check before coming out with a story are more credible.

Mr. Aubertin-Giguère: Russia is not in the habit of using traditional media. Instead, it uses alternative media or manufactures media to start and spread stories. In some cases, the stories that come out of Russian information operations snowball and end up in more traditional media, but that is fairly rare. That isn’t their main modus operandi.

[English]

The Chair: Do you have a short point on Senator Gignac’s question?

[Translation]

Mr. Aubertin-Giguère: North Korea is very active in cybersecurity, but it isn’t an actor in the disinformation world.

[English]

The Chair: Does somebody have a short answer?

[Translation]

Ms. Walshe: About the threat North Korea poses?

Senator Gignac: Does North Korea work with Russia?

Ms. Walshe: They don’t cooperate in that way.

[English]

When I think about North Korea as a cyber-threat actor, it’s certainly something we note and track in our assessments as a threat, both to Canada and regionally. The major threat actors we see are the People’s Republic of China and Russia, which are much larger and more capable. Still, North Korea is worth noting as a threat.

Ms. Galadza: I would like to add that Russia goes to great lengths to obfuscate its involvement in the spreading of narratives or the fomenting of discord. This is an important lesson and is something that Canadians need to understand, because we can’t stay ahead of every single campaign, every single lie or exaggeration, but people need to know that is happening.

It’s particularly important for influencers to know. Our leaders and prominent people understand it, but influencers — in the social media sense — also need to know that they are being instrumentalized by Russia in the spread of disinformation. As I said, Russia goes to great lengths to obfuscate its role in the spread of disinformation and social dissent.

The Chair: Colleagues, this brings us to the end of our first panel. I want to thank Ms. Walshe, Ms. Galadza, Mr. Madou, Mr. Aubertin-Giguère and Ms. Stinson for their remarks. I would very much appreciate it if you could address, in your written responses, the questions you didn’t get to answer.

On behalf of the committee, thank you very much for the important work you do on behalf of the nation. As you know, this is a serious challenge we face in our country and for our democracy, and your work helps give Canadians pride in their government and the work it does on their behalf. Thank you for appearing here today and for going a little over time. We appreciate your time here.

Senators, we now move to our second panel. For those joining us live, we are meeting today in relation to our study on the impact of Russian disinformation on Canada.

We extend a welcome to two of our witnesses who will be testifying by Zoom.

Jean-Christophe Boucher is an Associate Professor, School of Public Policy, University of Calgary; and Pekka Kallioniemi is a Non-Resident Research Fellow at the International Centre for Defence and Security, also by video conference.

With that, we thank you for joining us today. We invite you to provide your opening remarks, which will be followed by questions from our members.

Jean-Christophe Boucher, Associate Professor, School of Public Policy, University of Calgary, as an individual: Thank you, chair and members, for inviting me. It’s always great to take a plane from Calgary to Ottawa. I’m from Aylmer. It’s always good to come back.

I am an Associate Professor at the University of Calgary, where I work at the School of Public Policy and in the Department of Political Science. My research team is funded by different agencies in Canada, including the Department of National Defence and Canadian Heritage, and we also hold SSHRC grants.

We work with partners on research teams in Germany, France, Japan, the U.K. and U.S. We have projects looking at Russian and Chinese disinformation. I’m happy to speak on either.

In the last 15 years, Russia has embarked on a massive offensive in the information space. They’re essentially waging information war worldwide. They’re trying to undermine our Western societies from within, as you have seen. They’re trying to exploit and stoke the flames of radicalism and social strife and to promote their own views.

One of the things we have to understand is that Russian information operations, like the Chinese ones, are adapting in the information space. Anything from before 2022 is old news. We have to update our thinking. What I’m trying to convey here is what we’re learning now, not what we used to know about Russian disinformation.

If we’re trying to think about Russian disinformation, we have to think of them as a strategic actor. As such, it means they are conveying and structuring their information wars around three big elements. This is strategic, because they know what they want to say.

The Russians have a clear view of their objectives. They’re pushing this forward. They’re not chaos agents. They have a sense of what they want to say and they are trying to say this. When we think about this, they have essentially three types of objectives:

Long-term objectives. They’re trying to redefine the distribution of power in world politics and reclaim the standing they lost with the collapse of the Soviet Union. Many of their information operations are designed around this status-seeking enterprise. In doing so, they also try to undermine the NATO alliance in Europe.

In the projects that we’re doing in the Indo-Pacific, we see them trying to undermine Japan, doing influence operations around the Japanese, sometimes in coordination with the Chinese. It’s not just a NATO thing. We’re starting to see them be active almost everywhere in the world.

Their medium-term goal is to sap and weaken Western societies. They’re promoting illiberal values and intolerance toward immigrants, racialized people and LGBTQ people, and ultimately trying to erode citizens’ trust in Western democratic institutions.

Their short-term goal is usually to support their military operations in Ukraine. When you think about how the Russians are structuring their ways — I heard the questions on what is more prevalent — you have to think about these three kinds of narratives.

They’re strategic because they’re allocating significant resources to their information operations. The Russians, like the Chinese, are spending billions of dollars in trying to influence Western attitudes and shape behaviours.

What we’re seeing in the data is they are active, and very active in certain places that match their strategic interests. In Africa, for example, they’re active in certain kinds of countries, but not active in other countries. They’re active in some Western states and not others. They’re allocating their resources in a way that can maximize their interests. They’re thinking through these things.

The third thing to say is that they are strategic because they study audiences. The Russians are good at studying audiences and how their messages, and what they’re trying to convey, are understood by those audiences, and then they push accordingly. We have to understand this.

When you think about Canada, what does that mean? It means certain things. On the one hand, the Russians are pushing certain kinds of narratives. They’ve been doing this for a long time. They have been active before the war in Ukraine.

In fact, we have a data set on the Crimea in 2014. We see the Russians doing things in the information space at that time.

To be honest, they have penetrated our information space, especially on the far right. I heard some of the questions. I think some of us were trying to convey this.

Technically, right now, if you’re looking at the data sets, the Russians are exploiting both the far left and far right narratives and pushing those.

Right now, if I looked at the data sets, they’re completely integrated into the far right Canadian ecosystem. What does that mean? I could be specific.

On the one hand, from leaked documents, we see that they spend a lot of time studying this far-right ecosystem. For example, we know they are looking at Rebel News and how its messages spread.

Second, we know they’ve paid Canadians to provide commentary, and some of those paid were Americans. You were talking about impact. Some of the commentators and influencers paid with Russian money were top influencers during the “Freedom Convoy”: Americans pushing anti-Canadian, anti-government narratives while getting money from the Russians. I think we have to understand how that comes through.

Third, there is a connection between Rebel News and Russian disinformation. In fact, since the beginning of the war, Rebel News has pushed Russian disinformation. Their lead contributor on international affairs works for RT International; in essence, they’re connected directly to Russian state media, and they are still doing this. Essentially, they’re bypassing our ban on RT by employing somebody who promotes it, and we see this in the information space.

If you want to ask about the influence and those effects in the questions, we can go through that.

The Chair: Thank you, Mr. Boucher.

Mr. Kallioniemi, you can begin your remarks. Thank you.

Pekka Kallioniemi, Non-Resident Research Fellow, International Centre for Defence and Security, as an individual: Good afternoon, members of the committee. Thank you for this opportunity to appear before you today.

My name is Pekka Kallioniemi. I’m a Finnish expert on social media and disinformation. In recent years, I have mainly focused on the Russian side of online disinformation.

It’s safe to say during at least the last 10 years, Russian online influence operations have been the most effective in the world. The Kremlin has attempted to interfere with elections and referendums around the world.

The latest example is the massive social media campaign Russia ran before the presidential elections in Romania. An unknown, pro-Kremlin, anti-NATO candidate gained over 20% of the total vote during the first round, only by campaigning on TikTok. The whole election was eventually annulled due to the massive Russian interference campaign exposed by Romanian intelligence agencies.

In many countries, Russians hire and manipulate people to spread false narratives online, and Canada is not an exception. There are several prominent figures parroting Kremlin viewpoints regarding, for example, Ukraine and Syria. Tenet Media has already been mentioned here, but there are various academics, journalists and other social personalities who spread Russia’s lies online. Some of them are motivated by money, others by ideology or their ego. Some may even have become victims of the Russian blackmail known as kompromat.

This is how Russia usually operates. They hide the origin of the message. It’s also one of the main reasons why their messaging is so effective. They have the ability to make it seem organic and local.

Of course, all this will be — and to some degree already is — supercharged with the use of generative AI.

After February 2022, the main goal of Russia’s influence operations has been to stop any kind of military aid to Ukraine. Long term, they have also tried to destabilize western societies, undermine trust in democratic institutions and weaken adversaries through division and confusion. The rationale behind this is that any country that is focusing on domestic disputes has, in general, a weaker foreign policy. We’ve seen this in the case of the United States, for example.

By my assessment, Russia’s online operations in Canada have focused mostly on the same topics as in Finland: NATO, aid to Ukraine, immigration, Indigenous people around the Arctic, identity politics, inflation and so on. One of the biggest previous efforts, both in Canada and in Finland, was related to COVID-19 mandates and vaccines.

Russian disinformation rarely has a large effect within the Finnish society. So how do we fight Russian disinformation in Finland? First of all, we are protected somewhat by our collective memory of fighting the Soviets during the 1930s and 1940s, but our greatest weapon against Russian lies is our high level of media literacy.

Finnish media literacy is widely regarded as one of the best in the world. From as early as preschool, Finnish children are taught through stories to critically analyze and evaluate information. Integrating media literacy and critical thinking into the school curriculum can effectively increase societies’ resilience against outside influence. Of course, this is a long-term solution, and it takes time. A lot can be done in the short term, too. For example, active civil society has been extremely effective in pre- and debunking Russia’s lies and providing a rapid response.

Let me explain briefly why this is important. People tend to remember best the first story about events. This is why the Kremlin often rushes to publish the first version of any major event. After the Russians and Ukrainian separatists shot down flight MH17 back in 2014, killing 298 people, the Kremlin quickly came up with eight different stories about what happened. Eventually, these fake stories were debunked by an independent investigative group Bellingcat, but some people are still skeptical about what actually happened. So the first story often sticks, which is why prebunking is also extremely important.

Any government that wants to fight influence operations by Russia or any other malign actor needs both short- and long-term solutions.

To conclude, Russian-style campaigns work well in so-called low-trust environments, societies where people generally have low trust for journalists, politicians, institutions and so on. Russian disinformation aims to lower this trust even further. By creating a healthy, happy society that trusts its decision makers and other institutions, we can increase our resilience against these malign influence operations.

For seven years in a row, Finland has been the happiest country in the world, which is one of the reasons why Russia’s lies have been very ineffective among our population. Canada is not that far behind. You are at spot 15, so that is also very good.

I’ll stop here, and I look forward to answering your questions. Thank you.

The Chair: Thank you very much, Mr. Kallioniemi.

Colleagues, we now move to questions. We only have time for three minutes each. If you can keep your questions short and succinct, we’ll try to get answers for everybody. We’ll start with our deputy chair, Senator Dagenais.

[Translation]

Senator Dagenais: My question is for Mr. Boucher.

Those who spread disinformation aren’t exactly known for their subtlety. As untruths get repeated over and over again, people start to believe what they’re reading, hearing and seeing.

Are there ways to measure the impact of disinformation on the general public’s mind? Is it possible to identify the types of falsehoods people are most likely to believe? In which areas is a portion of the Canadian population more susceptible? Lately, have certain things hit a nerve with people more easily?

Mr. Boucher: I’ll start with your first question about the impact of repetitive messaging. The studies show that repetition is not the only factor. It’s also about the speed at which the information travels. In other words, the first person in the information space to spread the message wins the information war to a certain degree. It is very difficult after the fact to verify the information or change the narrative with the same momentum.

The University of Calgary did a study in 2022, and other groups across the country have done similar studies. We tried to measure how much Russian propaganda was affecting Canadians. We asked four questions about Russia, and all the studies showed almost the same thing. People on the far right or people who vote for far-right parties are, in large part, more susceptible to disinformation than other Canadians. The study revealed that 80% of people who wanted to vote for the People’s Party of Canada believed that the Russian disinformation narratives referenced were true. We also found that Conservative supporters were 10 to 15 times more likely to share Russian disinformation than other Canadians. That is a problem.

I come from Alberta, so, naturally, it is a frequent topic of conversation in circles there. From those studies, we know that people who tend to vote for right-wing parties are more vulnerable, and it’s less about the information and more about the information network. As I said, in certain cases, the far-right information ecosystem is already drawing on false Russian narratives. That ecosystem is largely tied to far-right groups in the U.S. who belong to Russian disinformation networks, themselves. That is the context where we see the biggest impact. I saw the same thing in France, and other research groups observed more or less the same thing.

If you ask me where Canada needs to focus its efforts and what we really need to do, I would say we need to put our money and more time into the far-right ecosystem or environment. We need to talk about the reasons why we want to be in Ukraine, and make people understand why we promote and defend Canadian values. Hopefully, then, we could have a bigger impact in that environment. There is a reason Russia finances far-right influencers. That’s where it can have the biggest impact.

[English]

Senator Kutcher: Thank you very much, both, for being here.

Mr. Kallioniemi, it’s nice to see you face-to-face after the long discussions we have had.

For Mr. Boucher, does Canada currently have a comprehensive and effective strategy for countering Russian disinformation? If not, what could it be?

For Mr. Kallioniemi, your book Vatnik Soup is outstanding, and I suggest all my colleagues read it. You also have “Vatnik Soup” on Twitter or X. There are some interesting Canadians and Americans named on there, some of whom have just been potentially elevated to very high posts. Could you share with us how our government having this information might be able to work in this space?

Mr. Boucher: It’s a great question. My answer would be no. What I’m seeing right now is that most of our efforts to fight information manipulation by the Chinese, Russian, Iranian and Indian governments have stalled in the last two or three years. We’re not making headway for three big reasons. First, we don’t know what we want to say. There is no strategy for the narratives of what we’re defending and why we think this is the best country. I think this is a major problem.

The second reason is we’re not spending the money to do it. There are a lot of dedicated public servants, and you’ve seen most of them here. All around the Government of Canada, I see people who are dedicated to defending democracy and our values, and I see them understanding the issues. What I don’t see is money. You know as well as I do that when I teach public policy, I say to follow the money if you want to see what countries are actually doing. When I travel around the world, to France, Japan, Germany and the U.K., I see hundreds of millions dedicated to this task. In Canada, that’s not the case.

The third reason is we’re still gun-shy about actually addressing and engaging with the vulnerable populations that feed off disinformation, whether on the far left or on the far right. We’re really quick to talk about censorship and freedom of expression, but somehow we ourselves have lost the freedom of expression to defend these values. There is a difference between censoring someone and defending your own values.

Beyond all of that, there is a lack of political will to actually address this properly, more so in Canada than elsewhere in the world, unfortunately.

Mr. Kallioniemi: What can this information be used for? For example, I have researched and published information about various people whom I consider to be pro-Kremlin or to be spreading Kremlin narratives. Creating a network out of these people and seeing how they communicate with each other and how they amplify each other’s messages is extremely important if you want to have a full picture of what’s going on in this whole scene of Russian disinformation. Having that kind of map that you can use to analyze the network is extremely useful.

As Mr. Boucher previously said, following the money is extremely important. If there is some sort of money flow that can be tracked within these networks, it can also be extremely useful because there is this assumption, especially after Tenet Media, that many of these people are paid by the Kremlin. So I think these are the key elements.

Senator Patterson: It’s really hard coming after Senator Kutcher because he covered a good part of my question.

What I’d really like to do is focus on these vulnerable groups that you’re talking about, because as more mainstream Canadians we think it would be very embarrassing to be jerked around by a foreign state, but again we have to understand that there is a big psychological component to this. We know that Finland has had a lot of success in the preventive space by increasing media literacy. How can we start talking to Canadians at all levels and tailoring and shaping the messages? What messages should Canadians be getting, even within these vulnerable groups, knowing it’s up to them to change their minds? What messages should we be giving?

Mr. Boucher: It’s a great question, and we aren’t seeing those kinds of arguments right now. Everybody is talking about prebunking, and I really believe that’s where we should be spending money and focusing our efforts.

To do effective prebunking, you need three kinds of information. First, you need to know what you want to say or convey as information — why are we defending Ukraine? Why is this important? Why does it speak to what it means to be Canadian? Why is it important for Canadians to be tolerant and to have democratic values? — and to say this is not a debate, these are our values and how we feel about this.

The second part is also to understand what malign actors are targeting. We follow where they’re going. We know they’re targeting the far left. We know who the people going on RT are and which ecosystems they’re in. We should spend our money there instead of trying to do these massive communication campaigns and trying to address all Canadians. Maybe we should talk to some Canadians.

The last one is we should know our audiences and the kinds of questions and messages that speak to them. What kind of anger fuels their grievances? A lot of those are legitimate grievances. The point is not to brush them off but to understand why they think the way they do and to engage with them in a respectful manner. It doesn’t mean we will convince them, but at least we will be filling the information space with our own information, and then the Russians or the Chinese will have to fight us instead of us fighting them. That’s where we have to transfer our energy and efforts, and we haven’t been doing that.

The Chair: Mr. Kallioniemi, do you have any contribution you want to make to this question?

Mr. Kallioniemi: Yes. I think there were really good points there. The target group that also needs to be considered is people who have been marginalized, because they have the highest chance of being radicalized in the future. There was a study showing that the two main factors that lead people to believe disinformation are anxiety and a feeling of disenfranchisement. They feel like they cannot make a difference in society, and they feel anxious because of how the world is, so if we can speak to these populations, I think that’s where you can actually make a big difference.

Senator Patterson: Thank you.

Senator Boehm: Thank you to the witnesses. My question is for Pekka Kallioniemi. Dr. Boucher just mentioned ecosystems, and in Finland, you’ve had about 80 years to develop your own ecosystem to deal with misinformation, disinformation and outright propaganda from your neighbour, whether it was the Soviet Union or the Russian Federation as it is today.

Sweden has also developed its own ecosystem. Suddenly, you’re both members of NATO. I’d like your views as to how much cross-referencing, best-practice discussion can and should take place and whether some of this could be branched out to speaking with the Five Eyes a bit more than the regular consultations that occur from time to time bilaterally between CSIS, for example, and other intelligence services including your own. Do you have any thoughts on that?

Mr. Kallioniemi: I cannot really speak for any intelligence agencies. I feel NATO collaboration is very crucial, and I think many countries could learn from the Finnish system that emphasizes education, and of course, the Swedish system is also extremely good. In Finland, I feel that we have a lot of blind spots. For example, in Finland, we don’t really have a clear image of what’s going on in the Global South or what’s going on in China, so I think this collaboration would be extremely useful right now so there could be a complete picture and defence against disinformation because we talk a lot about Russia, but it’s not just Russia. It’s also China. Iran is also a big player there. I think this collaboration is crucial if we want to be successful in this information war that is going on.

Mr. Boucher: For sure, it’s interesting to see how Russia is playing its game. Some countries, like Finland, Ukraine or even Taiwan, have been on the front lines of this for a long time, so their resiliency against disinformation is high. When we do studies in Latvia on how people respond to disinformation, they can smell it when they see it. In Canada, it’s more difficult.

I think we’re still in an argument where we think this is a debate, that we are in a good-faith argument with other actors and that we’re trying to determine what the facts are. We’re not. We’re in a battle of narratives, and we have to switch the focus to say that these are our values, this is what we think, and we don’t actually care what Russia or China says. We have to defend those with every breath we can take. We’ve lost the capacity to do so, and this is the same across the world. The French have the same conversation, as do the Japanese and the Germans, about what it means to be French, Japanese or German. It will take some time to do this.

We need political will and political leaders like yourselves to lead these conversations, because it’s not going to happen without strong support from our leadership.

Senator Dasko: Thank you both for being here today. Professor Boucher, you said we have to defend our values and speak strongly. Would you agree that the message that says Canada is broken feeds right into Russian disinformation?

Mr. Boucher: Absolutely. The Russians have been very good in all societies in making the argument that democratic values are corrupted, and they use all sorts of ways to promote that. This is how illiberal forces connect Russian views with our own societies.

Speaking of the Global South, when we look at data sets in Africa, that argument that Western values are corrupted resonates with local people. So yes, it does feed into it.

Senator Dasko: Broken. Yes, thank you.

Mr. Boucher: It’s a problem.

Senator Dasko: Thank you.

As you know, effective communication requires targeted messages to targeted audiences. In terms of the Russians — and you talked about the right wing influence, which could potentially just be an echo chamber — but my question is: Who are the audiences that the Russians target? Is it just right wing, or are they looking at other segments as well in Canadian society?

Mr. Boucher: That’s a good question. In the data sets we have, they’re effective both on the far left and on the far right.

Senator Dasko: And they focus on them?

Mr. Boucher: They actually focus on them.

We see Russian accounts amplifying their content and collaborating with them. Some of these influencers go on RT; they go to Russia. They really behave as if this is a normal conversation, and we see them doing both.

In other countries, they’re going to exploit other kinds of grievances. For example, in the Indo-Pacific, we’re seeing Russia exploiting anti-Japanese views in Korean society.

They’re equal opportunity. They’re trying to find ways that can increase the grievances within a society, and then they amplify these messages. They don’t really care about being right or wrong; they care about the impact.

Senator Dasko: A lot of Canadians are on social media. Would the average Canadian see a lot of Russian disinformation or some, and what messages would the average person see?

Mr. Boucher: It depends on the platform. We’re doing projects on TikTok, for example, and it’s surprising how China and Russia have been really fast at exploiting TikTok.

In the data sets that we have, younger Canadians have a harder time identifying Russian and Chinese disinformation than older Canadians, which tells us that there is something around the way people consume information.

I would say most people who follow X or TikTok or YouTube and those kinds of platforms will be exposed in some shape or form to those narratives, especially if they’re pushed by influencers who are well perceived within our societies, who have media outlets like the Rebel News or something like that.

In Alberta, there are a lot of people following Rebel News. In fact, when I give courses to reserve units, sometimes I hear people saying, “I read Rebel News every day.” This is interesting. You are now a reserve person, a soldier, and you are reading media, and your media literacy is kind of lacking.

The Department of National Defence recently had a snafu on this, where people were sharing far-right content with little media literacy on why that was a problem.

Oftentimes we read things thinking they are innocuous, but we can link them to Russian narratives or to Russian influence operatives abroad.

Senator Cardozo: I would like to carry on that conversation you had with Senator Dasko.

You mentioned Rebel News has influencers. Could you share more about who and how they do that?

You talked about the far right and the far left. Who is on the far left? We often think of the far left as unions and peace activists, and I don’t think it’s them.

Mr. Boucher: No. 

Senator Cardozo: One hears rumours that some of the Palestinian demonstrations have —

Mr. Boucher: That’s correct.

Senator Cardozo: — influencers. I wonder if you could expand on who it is on the far right and the far left and how they influence.

Mr. Boucher: In these extreme ecosystems, what we are seeing, and we saw this in the Freedom Convoy as well, is that there is now a genuinely global far-right ecosystem, where Canadian far-right groups are associated with Americans and others.

At the centre of this in Canada, Rebel News is probably the most influential far-right outlet. It has millions of views and international connections with Americans.

Some of the people who were hired through Tenet Media were actually ex-Rebel News people, and if you look at their connections with the Americans, you see Jack Posobiec in the U.S. being part of the Rebel News ecosystem, and Tucker Carlson. They’re well established and well developed.

On the far left, what we’re seeing, for example — I could show you a bunch of influencers. A lot of them were associated with Global Research. Some of them write for different kinds of organizations. A lot of them are pro-Palestinian. They were pro-Assad.

For example, some of the Canadian influencers associated with The Grayzone, such as Aaron Maté or Max Blumenthal, have been collaborating with RT forever. They were groomed through Russian information operations during the Syrian conflict, around 2014, when they went to Syria. When the Russian invasion happened, they all activated, defended the invasion and tried to obfuscate the information space.

Today, those same people run around criticizing Minister Joly or Minister Freeland for supporting Ukraine. They’re trying to make the argument that we’re warmongers, when the only one actually at war is Russia.

They’re very active. Some of them go to Russia. Dimitri Lascaris, for example, went to Crimea and said, “Look at how great things are here; this is amazing.” A lot of them were tied to some of those demonstrations we saw against NATO recently.

It’s really this group that has ideological affinities and anti-capitalist views. They are not the unions. They are not the kind of far left we normally think of. They are really associated with these groups.

What is interesting is that when you look at those, they tend to defend Iranian, Syrian, Russian and Chinese information operations and they’re a part of those conversations. They move around these things a little bit.

Senator Cardozo: So these are actually people who are somehow paid to be influencers in those movements?

The other question I want to ask you is about the CBC. There is a proposal from one political party to defund the CBC. Some people look at that and say that it’s, sort of, the last remaining media organization as private sector television and private sector newspapers are falling apart. What’s your thought about the way the CBC would —

The Chair: Mr. Boucher, you have very little time, so you have to make it very short.

Mr. Boucher: I think having public media helps the conversation, but this is not about facts; it’s about a battle of narratives. I don’t think the CBC should be doing that work. I think that is the work of political leaders and Canadians.

Russian state media is a $3 billion operation. We are outgunned and outmatched if we really want to complain about the CBC’s funding.

[Translation]

Senator Carignan: I am trying to understand, Mr. Boucher. In principle, Russia is an extreme left-wing communist country that promotes the left and socialism. Can we say that, in light of everything coming from the left or the far left, Russia would have no interest in supporting wokism while simultaneously discrediting the right, which is more in favour of the army and military spending? I’m trying to see the logic in what you are saying. By the way, I am not putting words in your mouth to say that Canada is broken, but political criticism must not stop.

Mr. Boucher: Absolutely.

Senator Carignan: Putting a label on someone who uses that word and stating that they are engaging in Russian propaganda is a little overstated.

Mr. Boucher: I do not believe that saying our country has problems constitutes Russian propaganda. However, Russians use this argument to convey their interests and move them forward.

It must be understood that Russia’s orientation is not what it used to be. When you look at Vladimir Putin and the people around him, you realize they are not communists. In large part, they are people who promote values that are not liberal, meaning traditional, authoritarian and populist values. It is clear these messages find traction on both the right and the left. People on the right, for example, who are traditionalists or anti-LGBTQ, hear Russia’s message that it stands for traditional values and think, yes, it’s true that woke people are a problem in Canada. People on the left say, yes, it’s true that the Canadian government is corrupt and capitalist and it all has to be brought down.

In political science, we have what we call the horseshoe theory: if you go far enough toward one extreme, you end up close to the other. That’s what propels this conversation.

Russians are very good at amplifying anti-elitist arguments.

Senator Carignan: In short, they are trying to create chaos.

Mr. Boucher: Absolutely.

Senator Carignan: By equally supporting a far-left discourse and the discourse from the other side.

Mr. Boucher: That is what I believe, and that is what the data says. In other words, every time we see this phenomenon, it is the same thing.

Senator Carignan: The danger is that it also taints legitimate criticism. I may criticize certain aspects of the government in good faith, but they amplify and exploit that message, which ends up encouraging people to discredit it.

Mr. Boucher: Absolutely.

Senator Carignan: So, they’re playing all sides.

Mr. Boucher: In my opinion, if we really want to proceed this way, we need a bipartisan base, meaning that our absolute baseline is that we defend Ukraine and that our values are tolerance, democracy and freedom. No matter who tries to exploit those issues, we will not accept it, and we will defend those values. We need politicians and political leaders who draw that line in the sand and say: it stops here. The problem is that we see politicians in France, Germany and the United States who use these narratives to win political games. In doing so, we sell out our country for the interests of another, and that raises an ethical problem. As long as there is a political consensus on specific issues, Russians will have a hard time engaging on that level. When they look at the right and the left and see that everyone agrees on the same issues, there is no debate for them to exploit.

A good example is that when the Conservatives opposed the free trade agreement with Ukraine, the Russians loved it. I understand the point of view of the Conservatives, who wanted to put the motion forward. However, the process served Russia.

Senator Carignan: How do you identify the Russian source? The anti-Ukraine Russian source, I can understand. I have already spoken with Russians and I understand the reasoning. How do you identify the Russian source on a political aspect that has nothing to do with Ukraine, such as the vaccine?

[English]

Mr. Boucher: There are two ways: content creation or content amplification. Sometimes the Russians create content and push it out; other times they find content they like and amplify it. In the data sets, sometimes you will see one comment come in, and the bots will amplify it. They don’t have to create the content themselves, but sometimes they do both.

The Chair: We have three colleagues on the list for a second round. We have about two minutes each. If you could ask your questions, you have a minute or so for answers, that would be very helpful.

[Translation]

Senator Dagenais: I have a brief question for Mr. Boucher. What is happening elsewhere can certainly be taken into consideration to fight disinformation. Do you think the Russians effectively spread disinformation during the last American election? Could they have the same impact during the next Canadian election?

Mr. Boucher: In both cases, I think the answer is yes. Clearly, the Russians had an interest in a win by the Republican Party. All the data indicates that the Republicans were less favourable to Ukraine than the Democrats. Consequently, having a Trump administration plays in favour of Russia’s interests.

Naturally, that whole network, having successfully defended Russia’s interests during the American campaign, now has time on its hands and will have Germany and Canada at the buffet. Does that completely change the game? No. Would the Democrats have won without the Russians? Probably not. However, it undeniably helped the Republicans.

Senator Dagenais: It could therefore have an impact on Canadian elections.

Mr. Boucher: I think we may see the American far right interfere in our information space and try to promote its values in order to manipulate or change Canadians’ opinions, yes.

[English]

Senator Kutcher: I have the same question for both of our guests. Would it be reasonable to say that Canada is not at war with Russia in the disinformation space, but Russia is at war with Canada?

Mr. Boucher: I’ve been hogging the mic a little bit. I’ll let my colleague answer.

Mr. Kallioniemi: Definitely, I would say this is true. In general, Western democracies are in an information war with Russia and have been for a long time. It’s also a very asymmetrical war since Russia can spread lies, whereas we then have to answer to the lies.

During the Cold War, the West was setting the agenda in this kind of information war, but now the roles have changed; Russia is actively setting the agenda, and we are usually reacting. It seems to me that, if we don’t spend more resources on this information war, Western democracies will lose it.

Mr. Boucher: My sense is that we’ve conceded the battleground and the Russians and the Chinese are winning that game. I hope that we don’t go into war in the near future, because right now, we have no capabilities or capacities to actually fight the propaganda wave that will come to us.

One of the things I would like to convey is that, in the data sets we’re seeing right now in the Indo-Pacific region, Russian and Chinese influence operations are coordinating their activities. So the idea of treating the Russians and the Chinese in isolation from each other is less and less realistic. They’re now collaborating, amplifying each other’s accounts, using each other’s audiences and pushing the same narratives. We’re now fighting a battle against illiberal autocratic regimes, and we have to be careful about what is in the future. My sense everywhere is that we need to fight that battle, but we haven’t started yet.

Senator Patterson: Thank you very much. As you know, NATO is calling this the “cognitive security domain,” and it needs to be looked at; human security probably needs as much focus as we put on state security to get at this full spectrum of conflict. We need to start looking at it differently.

From both of your perspectives, with Russia having to withdraw from Syria and pulling out of its bases, it’s a loss. How are they going to message this? It’s similar to Ukraine’s incursion into Russia. How are they going to spin this? We also need to look for opportunities to make sure there’s positive messaging about Russia’s current losses.

Mr. Boucher: My assumption is that they’ll flip the script and criticize the West for bringing about the end of Syria and put the ensuing chaos on our backs. My sense is that that’s what they’re going to do. Usually the script is clear: Blame the West and say that Russia is a vanguard against that. That’s probably what they’re going to say.

Mr. Kallioniemi: It has already started. Right now they are saying that the Islamic State of Iraq and Syria, or ISIS, or extremists have taken over the country, and it will become an extremely fundamentalist nation. That’s the way they will go. They will discuss or claim that the previous so-called secular Syria was a better option than what’s coming now.

Senator Patterson: Who is their audience? Is it us, or is it the home team? Or is it the BRICS?

Mr. Boucher: The BRICS countries.

Senator Patterson: Thank you.

The Chair: Thank you very much. This brings us to the end of this panel. I want to extend a sincere thanks to Mr. Boucher and Mr. Kallioniemi for taking the time to participate with us here today. The committee greatly appreciates your participation in this study. Again, thank you so much for being patient with us and being over time. Thank you again for being here today.

We will now move to the final panel. For this evening, I welcome our new presenters to our committee. First, Francis L. Graves, President, EKOS Research Associates Inc.; Marcus Kolga, Director, DisinfoWatch and Senior Fellow, Macdonald-Laurier Institute, by videoconference. Also, Brian McQuinn, Co-Director, Centre for Artificial Intelligence, Data, and Conflict and Associate Professor, Department of Politics and International Studies, University of Regina. Thank you so much for joining us. We invite you to make your opening remarks. If you can keep them to five minutes or so, that will be perfect. Then our committee members will have a chance to ask you questions.

Francis L. Graves, President, EKOS Research Associates Inc.: Thank you very much, Mr. Chair and members of the committee, for inviting me here to talk about this important topic.

Initially, I started studying disinformation and misinformation as a member of the Federal Vaccine Confidence Task Force. Recently we have expanded the scope, and we’ve been looking at this for some time. I note that Canadians are concerned about a lot of things, but they rate polarization related to disinformation as their top area of anxiety. Even though you would think that there are other issues keeping us up at night, this one is very high on the list, if not at the top.

Although it is impossible for me to neatly disentangle the impacts of Russian activity, the extremely improbable patterns in attitudes toward Russia that I’ll talk about today are linked to those activities. This has been corroborated by colleagues in Europe and other countries, who understand the disinformation ecosystem much better than I do. My work examines the rise and impact of disinformation using large scientific samples of the Canadian public. Much of this has been conducted under the rubric of what we call the risk monitor, which, oddly enough, has been going on since March 2020, almost five years now.

Initially, we were looking at issues related to COVID vaccines, but the scope has expanded to look at other types of risks. We find there are strong connections across a range of disinformation, including outlooks on Russia, NATO, Ukraine and geopolitics in general. As our scope expanded and we consulted with researchers in other countries, we became familiar with the impacts of disinformation as a tool of statecraft, and we found clear evidence of the reach and effectiveness of Russian disinformation. I don’t have the forensic analysis of the disinformation ecosystem, but I can chart the impacts with almost incontrovertible certainty. I’ll give you a couple of examples.

In the initial stages of the war in Ukraine, we asked Canadians, “What should Canada do to help Ukraine?” We tested six different types of interventions ranging from lethal to non-lethal aid. Canadians were overwhelmingly sympathetic and had considerable ire toward Russia, and this showed up clearly in our research.

On a hunch, I said, “Let’s break this down by whether or not you have been vaccinated,” which on its face had nothing at all to do with the question. I looked at the 90% of Canadians who had been vaccinated and asked them, “Which of these things should we do?” Of that group, only 2% said that we shouldn’t do anything. There was an enormous consensus. And the 10% of Canadians who had not been vaccinated, how did they respond? It wasn’t a little different. It wasn’t somewhat higher or 10 times higher. They were 52 times more likely to say that we should do nothing. This is a group that probably couldn’t have located Ukraine on a map a month earlier but suddenly had exotic conspiracy theories. When I talked to my colleagues in Europe who were studying Russian involvement in European elections, they said that they didn’t bother with the French election for a couple of reasons, but mostly because they were training all of their fire on this issue. We saw it vividly affecting attitudes, as you can see in that particular relationship.

In a more recent test, done just a month ago, we asked Canadians what level of concern they had with foreign influence, and we broke this down by country to see if their attitudes varied. First of all, concern about foreign influence rises modestly with a respondent’s level of disinformation. But when we look at the group of Canadians who exhibit the highest levels of disinformation and get all the questions wrong, they were extremely emphatic that China was a very bad offender and that it was getting worse: 85%. When that very same group was asked about Russia, the number wasn’t 85%; it was 25%. Those kinds of imponderable differences are not statistical artifacts; they are reflections of a very effective program of disinformation, which is producing more congenial and favourable attitudes toward Russia and, by corollary, less favourable attitudes toward Ukraine, NATO and others.

Just to wrap up, since I know I have a short time: Canadians are highly polarized in ways we have never seen before. I would describe the Canadian outlook as darker and more divided than anything I’ve seen over a very long career. This is connected directly to disinformation, which is now being wielded as a tool of statecraft and being amplified and reinforced in North America by other sources. What we’ve seen through time is that sympathy for Ukraine has declined, but we see the effect expressing itself in a range of other areas, whether it be climate change, attitudes toward NATO or attitudes toward vaccines.

When we ask Canadians about this, they say it is having an enormously corrosive impact on our society, our economy and our democracy, and that it produces affective polarization, which is to say that we just don’t like each other the way we used to. What they emphatically say, in a rare area of consensus, is that they would like the federal government to institutionally fortify us against this and take strong action. Thank you very much.

The Chair: Thank you. Mr. Kolga, it’s good to see you again before our committee.

Marcus Kolga, Director, DisinfoWatch and Senior Fellow, Macdonald-Laurier Institute, as an individual: Mr. Chair, members of the committee, thank you for the privilege of speaking with you today.

I’m going to focus my remarks on new evidence of Russia’s information and influence operations, as revealed in the Tenet Media indictment and the simultaneous release of an FBI affidavit regarding the Russian Doppelganger network’s campaign. These operations have significant implications for the sovereignty of Canada’s information environment.

The Kremlin’s primary objective in targeting Canada and our allies is twofold: to advance its geopolitical agenda and to undermine our democracy.

To achieve these aims, the Kremlin floods our information environment with a toxic mix of disinformation and hate. This is designed to erode public support for Ukraine, NATO and our allies, while inciting hate toward Canadians of Ukrainian heritage.

The breakdown of cohesion in our society and trust within our democracy are equally important goals for the Kremlin. To achieve this, the Kremlin monitors, identifies and exploits the most polarizing domestic issues and then platforms, legitimizes and amplifies the voices and narratives on the extremes — both far left and far right — to apply maximum tension on our social fabric until it begins to tear.

We know this because we have concrete evidence of it: These tactics come straight from the Russian presidential administration and Vladimir Putin’s most trusted adviser, Sergei Kiriyenko.

In September, an affidavit unsealed by the FBI exposed details of official meetings between Kiriyenko and Kremlin operatives tasked with executing these operations. Their objectives included fomenting anger, hate and division within Western societies, and the incitement of conflict through disinformation to advance Russian interests.

The key Kremlin directives communicated in these meetings included the regular monitoring of Western media to identify and exploit polarizing issues, the targeting of activists and journalists by undermining their credibility, the creation of fake documents, deepfakes and fabricated audio to incite conflict, and also the recruiting of Western influencers to amplify Kremlin-aligned narratives.

We now have new evidence of how these operations are executed. The U.S. Department of Justice’s recent indictment identified two prominent Canadian social media influencers to whom the Kremlin allegedly paid US$10 million to create the online platform Tenet Media. This operation was allegedly controlled by Russian state media employees at RT, a Russian government-controlled platform that has been identified by Global Affairs Canada and the U.S. Department of State as a key component of Russia’s intelligence apparatus.

The RT-funded content was amplified through social media networks including major influencers like Elon Musk, who reportedly shared Tenet Media content at least 60 times to his 206 million followers.

It would be a mistake to assume that content produced by such influencers that are not explicitly pro-Russian doesn’t serve the Kremlin’s information warfare objectives.

The anti-government, anti-media, anti-immigrant and anti-LGBTQ content generated by them contributes to the Kremlin’s objectives of eroding our social cohesion. Such operations transcend any geographic boundaries, as Pekka Kallioniemi noted, and they appear to be working, as Frank Graves has demonstrated.

The Kremlin specifically selects influencers like those involved with Tenet Media for their ability to reach large audiences and the alignment of their views and narratives with the Kremlin’s objectives.

What can be done? The good news is we have tools such as sanctions to deter these operations. For instance, RT was placed on Canada’s sanctions list in 2022, making collaboration with it illegal.

The U.S. indictment states that RT funds flowed into Canadian bank accounts in 2023. Any potential violations need to be investigated and prosecuted. Doing so is essential to the integrity and deterrent effect of our sanctions. The proper implementation of Bill C-70, as well as the Foreign Influence Transparency Registry, should also help. We also have tools like Pekka Kallioniemi’s Vatnik Soup, which includes several Russian-aligned influencers in Canada.

Now, the bad news is that the Tenet Media case is the tip of the iceberg. Canadian academics and extremists on both the far left and the far right continue to collaborate with Russian state media outlets like RT and with Kremlin-controlled think tanks like Vladimir Putin’s Valdai Club and the Russian International Affairs Council.

In conclusion, the threat is clear. Canada’s information spaces and cognitive sovereignty are under attack by Russian information operations. Acknowledging an increasing awareness of the threat is a start.

We have to move on from admiring the problem to focusing on actively disrupting, preventing and deterring these operations to protect our democracy and society.

Thank you, and I look forward to your questions.

The Chair: Thank you, Mr. Kolga.

Our final witness is Mr. Brian McQuinn. Mr. McQuinn, you have five minutes for your opening remarks. Welcome, and thank you for participating.

Brian McQuinn, Co-Director, Centre for Artificial Intelligence, Data, and Conflict and Associate Professor, Department of Politics and International Studies, University of Regina, as an individual: Thank you for the honour of being here.

I wish to touch on three issues that I think are important. I’m going to tailor a little bit of what I was going to say based on what esteemed colleagues have spoken about before.

There are three things I wish to look at. The first is what is coming in the next two years: We are about to see a tipping point in the era of influence operations.

The second is the end of research on far-right extremism. Some of my colleagues have been too polite, in some ways, to mention this, but it is important.

Finally, I will look at some of the specifics of how Russian influence operations work in Canada, their impact and their mechanisms, which relate to many of the questions posed by the honourable senators. I’m going to touch on what we found in one of the larger studies conducted here in Canada.

As my colleagues have touched on, Russian operations and foreign influencers are continually using digital platforms and our democratic freedoms to undermine social trust, polarize communities and erode confidence in institutions. This is a trope we hear all the time.

What is often missed in all of that is the extent to which they are able to shape public discourse: how issues are framed and, as a professor, how I see students influenced by the ideas they start with in their heads. One of the previous speakers spoke about the idea that the first story to come out, the first explanation, is often the one that sticks. This is something I hope we spend some time talking about.

The first thing I want to talk about is why we are about to see a tipping point in influence operations.

Increasingly, since Elon Musk took over Twitter, now X, and fired all the trust and safety teams, the other platforms have started to follow suit. Notice that he didn’t pay a price for doing so. The U.S. government didn’t step in and say now we’re going to intervene.

Moderation, for all intents and purposes, simply isn’t happening, not just in North America but especially outside of North America. I have many examples we can get into during questions about the extent of it not happening, which means we are about to have a free-for-all, and in fact have had one for the last year or so.

The inauguration of Donald Trump at the end of January is going to supercharge Russia’s largest influence operation in the world. The Republican Party shares and amplifies more Russian messages than any other ecosystem in the world, and it is by far the largest. This is going to go global in a way that I don’t think anyone has any idea of its impact. It’s quite uncharted territory.

Increasingly, researchers everywhere in the world, but especially in the U.S., and I’ll get into this in more detail in a second, are being cut off from studying the social media data itself. The platforms are shutting down their application programming interfaces, or APIs.

Increasingly, we live in a multipolar world, with the rise of China as a near-peer challenger and a contest over who has the truth, and we do not yet know how that is going to play out. These four different elements are going to combine to create a time I don’t think anyone is prepared for.

The second issue I wish to touch on is the extent to which research on influence operations has basically ceased to exist in the United States. They obviously have a huge researcher community, but various methods, such as Republicans pulling people before Congress to testify, have basically caused most of the researchers to stop this work.

The most extreme example is that Stanford University’s misinformation research centre, the Stanford Internet Observatory, was closed due to legal challenges and to simply being harangued by Republican elected officials. It’s working. It is curtailing what is being studied. This is an opportunity for Canadian researchers. We are some of the few who are left to do this. We produce research that, at least in the U.S., is thought highly of. There is an opportunity there. It is something Canadians could see themselves as being able to advance in important ways.

Practically speaking, how do they do this? We did a study of 200,000 Twitter accounts that were basically based in Canada, where the entire ecosystem was advancing Russian influence campaigns tailored to Canadians. We looked at how they did that.

They targeted, as previous speakers have said, both the far left and the far right, but only at a two-to-one ratio. In terms of impact, this ecosystem represents one of the most active online communities in Canada.

We compared it to the online ecosystem of MPs and found the Russian influence ecosystem produced 27 times more engagement than the ecosystem surrounding the MPs.

The most important thing we found is that 83% of the ecosystem is average Canadians. This is both an opportunity and a challenge. It shows us we have the capacity to have important influence on how this plays out. At the same time, average Canadians are doing this often without realizing it.

The last point I wish to make is simply that, in the three months before the invasion, we saw a threefold increase in the number of campaigns and the amount of information targeting Canadians in preparation for it.

The Chair: Thank you, Mr. McQuinn.

Now we’ll proceed to questions from my colleagues. As usual, we’ll limit each question, including the answers, to four minutes. Keep your questions succinct and identify the person to whom you are addressing the question. The first question goes to our colleague and vice-chair of this committee, Senator Dagenais.

[Translation]

Senator Dagenais: My question is for Mr. Graves. Research shows a net difference in social values between Quebec and the rest of Canada. You just said there is an obvious difference between the reactions of people who are vaccinated and those who are not. Do you think Russian disinformation can effectively adapt to the public in Canada’s various regions? If so, do you have examples of the types of messages they might tailor to Quebec, Ontario, Alberta or British Columbia?

[English]

Mr. Graves: Yes, the methods are tailored to different demographics and regions. They are focused.

For example, our research in Quebec suggests that Quebecers are less likely to appear at the extreme end of the spectrum and more likely to appear at the middle levels of disinformation, which is a positive finding.

We find, though, as well, that there are tremendous differences in how different parts of our society are reached and with what media. We found, for example, of all of the different social media platforms, by far it seemed that YouTube was the most effective and pervasive.

Young men, by the way, exhibit four to five times the level of extreme disinformation as young women. Many or most of them are on YouTube every day getting algorithmically driven information, which is adjusted based on what they have hit on, and it is extremely effective.

The trouble now is that if you say maybe we can combat that or talk to them and so forth, they live in an insular world where all other sources of information are considered fake news, and they really aren’t consuming them.

We found an interesting relationship: your level of confidence in the information you think is true is curvilinear. The people who get most of the questions right, thankfully about half of Canadians, are pretty confident they can sort through it. But even higher levels of confidence are exhibited by the 7% who are radicalized and believe all the things we ask about, which are pretty straightforward questions.

I would also point out that the issue we haven’t talked about as much is how these are connected to elections. We find in our research that perhaps the most potent predictor of your issue preferences and priorities and your partisan choices is your level of disinformation, something we wouldn’t have seen even five years ago.

You’ll find the appetite for confronting disinformation of various types, and the Russians are very good at it, perhaps the best, is tempered because the road to political success is increasingly paved with disinformation, which puts anyone who says we’re going to try to stop it in an awkward position. By the way, the mandate from Canadians themselves is overwhelming. Regardless of your political stripes or ideology, about 90% of Canadians say we really shouldn’t have this stuff operating in the realm of politics; we should not be using generative AI to produce fake news; these things should all be digitally watermarked and so forth.

There is a strong consensus and also a tremendous level of impatience and dissatisfaction among Canadians, who feel we are not moving quickly enough, that we’re too slow-footed. They look at things like the European Digital Services Act and some of the measures — I haven’t tested this one — that are going on in Australia and they are saying, “Why are we so slow-footed on this?”

I charge you as decision makers to think about ways to pick up our game here and to accelerate our response.

Senator Kutcher: Thank you all for being here. I have two questions, one for Mr. Kolga and Mr. McQuinn and then a different question for Mr. Graves, but I’ll ask them both at the same time and ask you to take them.

The first one is for Mr. Kolga and Mr. McQuinn. We’ve heard from other witnesses that Canada does not have a comprehensive and effective strategy for countering Russian disinformation. In your opinion, what could Canada be doing that it isn’t doing now that could be helpful to us?

And then for Mr. Graves: Recently, I wrote a Policy Options paper, and the troll attacks on it have been quite interesting. I looked at the Twitter handles of these people, and they cluster with health disinformation, climate change denial, various conspiracy theories, anti-LGBTQ+ positions and pro-Russia, anti-Ukraine messaging. That’s the sort of cluster.

The question is: Are these targets part of Russia’s disinformation campaign? Do they target those specific areas as part of their disinformation?

Mr. Kolga: I’ve been monitoring, analyzing and exposing Russian disinformation for about 15 years now. I think that over the past three or four years, we’ve made a heck of a lot of progress in terms of a government response. Global Affairs Canada and the Rapid Response Mechanism have become extraordinarily effective and courageous in calling out Russian disinformation narratives, explaining them and breaking them down for Canadians so they become digestible. For the average Canadian, I think this is an important step toward addressing this problem. There is a lot more that we could do. We could be looking to Finland. I think the previous witness, Pekka Kallioniemi, outlined Finland’s digital media strategy in terms of youth and education. We could be looking to Sweden, which has set up an entire psychological defence agency over the past year and a half to deal with these issues.

We need greater coordination across government and, again, education needs to be a part of that.

Mr. Graves: Finland is a great example because in the international tests that look at people’s propensity to fall victim to disinformation, they score number one. Finland actually goes into classrooms, in kindergartens, and will show TikTok videos that are specious and fallacious, so that children apply those skills very early in life.

One point I want to make is that there is plasticity to disinformation, particularly that which is moderately held. I have found that even going back to respondents who exhibited significant forms of disinformation and explaining why something can’t be true, a substantial number will go, “You know what? I think you’re right.” But we don’t have the connections. We don’t have the platform. We’re not speaking in the right places, and they’re not looking at the places where we’re trying to deliver those messages.

We’re improving our technology somewhat in understanding what methods will be helpful. I don’t know the boundaries. It’s very clear to me, and in fact I thought it was ironic, that my finding that the unvaccinated were 52 times more likely to think we shouldn’t do anything in Ukraine appeared both in The Washington Post and in Russia Today when it came out. Russia Today, of course, said, “Look, these people who weren’t duped into taking the vaccine actually understand that we really are entitled to be in Ukraine” and so forth.

The troubling thing is these problems have become worse through time. The incidence of people who have negative views on Ukraine and our involvement in NATO is still a minority but it’s growing — that’s very disturbing — which suggests we’re not even treading water on this front. There are some other great —

The Chair: Mr. Graves, you need to wrap up to give your colleagues a chance, sorry.

Mr. McQuinn: Thank you very much. Mr. Graves was part of the team that produced the report I was speaking about earlier, so I’ll leave the whole-of-government approach to his comments.

From the researcher side, I think what is important to identify is being able to take a stance. For example, we saw Brazil recently go up against Elon Musk and win. They banned the platform. There were all kinds of things said, but very quietly he paid the fine and blocked the accounts. What would prevent Canada from taking a similar stance, saying: “If you do not give Canadian researchers, or researchers in general, access to the data of our Canadians on your platforms, we will not allow those platforms in Canada”?

It doesn’t seem like an unreasonable request, but it would take a fairly serious stance, and obviously a lot of political blowback would probably happen as a consequence. I think that is one of the important pieces that is always necessary to be said: Without data, we can’t tell you what is going on, and increasingly the platforms are preventing us from telling you.

[Translation]

Senator Carignan: Mr. Graves, I was listening to you earlier. You indiscriminately used — I didn’t see the distinction — the terms “misinformation” and “disinformation” as though they were the same thing, when they are two different terms. You talked about people who were not vaccinated, compared to those who oppose the war or the issue of Ukraine. How can you establish a correlation between cause and effect? You don’t know what sources of information they used. I know many people who refused to get vaccinated. It had more to do with their personality, or considering that it was a matter of individual freedom and what is good for them.

On another matter, such as the war in Ukraine, a person might think: “What are we doing there? I’m not getting anything out of it.” It is not a matter of disinformation, or people being victims of misinformation, because they are two different things. Rather, it’s about a person’s personality traits. I have a hard time seeing how you make the connection there.

[English]

Mr. Graves: First of all, as a researcher and statistician, I have never seen those sorts of relationships: things that are not somewhat higher, not twice as high, not 10 times as high, but 52 times as high. I haven’t conducted a randomized controlled experiment, so I can’t be absolutely certain of the causal influence. By the way, this was also triangulated by talking to experts in Europe who monitor Russian interference in election campaigns and do this very seriously, and they said this was an echo of exactly the problems they had looked at. They had researched it in some depth.

On the difference between disinformation and misinformation, pragmatically, they’re the same thing. A respondent or a citizen doesn’t know whether it’s disinformation or misinformation. Disinformation is that which the person purveying the information knows to be false. If you ask me what time the bus is coming and I say, by accident, “I think it’s coming in 10 minutes,” that’s misinformation if it’s actually coming in 15 minutes. But if I tell you that knowing it’s coming in 20 minutes and you miss your bus, that’s disinformation.

The fact is that when we ask Canadians about these differences, Canadians are five times more concerned about the conscious effort to deceive with disinformation than about misinformation. They very clearly draw those distinctions. I’m not making any moral judgments about those who decided to have a vaccine or not. My bias, as a member of the federal vaccine confidence task force, was that I wanted everyone to get vaccinated because vaccines are safe and effective and so forth.

I also think it’s important, and I would stress this, that the portion of our society that becomes affected and falls into the thrall of disinformation, which is an increasingly large portion, is, in fact, not to be pilloried or ridiculed. That is not helpful. Dismissing them as a basket of deplorables or a radical fringe adds emotional fuel to the fire that set the conditions for why they are receptive to the disinformation in the first place. We have to be careful about that, though we also have to be mindful that many of the prescriptions they’re recommending would not be in the interests of our economy, our democracy or public health, for example.

By the way, the measures I use are drawn from the international literature. I try to be very careful in how I select them. I don’t take things where I don’t know the answer. I take the items and scale them from somewhat true to mostly true. You don’t get points for a wrong answer, and if you don’t know, you get fewer points.

An example question is: Are governments intentionally concealing the real numbers of deaths from vaccines? I know how those data on adverse effects are assembled and that the number of people who died is between 4 and 400. For something recorded from the level of doctors up to hospitals, municipalities, provinces and the federal government, it would require the collusion of literally thousands of officials at various levels of government to intentionally conceal the real numbers of vaccine deaths. Simply said, that’s not true, but we have 30% of Canadians who think that’s true.

This panoply of deceit and disinformation expands into many other areas. I don’t want to get into all the examples, but in all of these, we’re careful to borrow questions from the international literature so I can compare with other countries to see if there are any differences. They’re also questions I really do know the answers to. I test them formally with measures of reliability and validity, and they perform very well.

Senator Patterson: Dr. Graves, this one is for you. What’s interesting is that many Canadians want us to do something about this, yet none of them think they themselves have a problem, so you have a weird paradox going on there. One thing we don’t do very well as Canadians is interpret data. We’ll take a data point and make it causal, for instance. As you said, there is misinformation and disinformation, and one of the tools used is to focus on a number, such as the 400 vaccine deaths, for example. What can we do to help make Canadians more literate, again for the centre of mass here, so they can understand the data they’re reading and have context to go with it?

Mr. Graves: That’s a great question. First of all, I would like to point out that three years ago, Canada, which is now in the depths of some of the lowest measures of trust in government that we’ve ever seen, actually reached a 30-year high. We had to ask, “Why did that happen?” It happened because Canada did extraordinarily well in terms of the rollout of the vaccine, the Canada Emergency Wage Subsidy, the Canada Emergency Response Benefit and so forth. In hindsight, they’re seen as more checkered, but at the time, they were seen as almost unanimously good. We were dealing with life-and-death problems that individuals couldn’t confront on their own and that the private sector wasn’t going to handle on its own, and that was a perfect role for government. Canadians rewarded government with a 30-year high in trust in government and in the direction of the country, all of which has unravelled in the last three years. I would argue that a lot of that is because of the corrosive impacts of these accelerating disinformation campaigns, which are steadily eroding trust in our institutions.

Here’s one quick example. I was looking at some retrospective data on the American election. Fifty-two per cent of people who voted for Donald Trump thought Haitians were kidnapping dogs and cats and eating them. It sounds funny, except when you consider how many of those people would not have voted if they had not believed that to be true. We have to wonder to what extent something that seems amusing is actually disrupting and changing the outcome of elections in advanced Western societies. This is something we have to squarely wonder about.

Senator Patterson: Can I please pass that to the other witnesses here today? Dr. McQuinn, you talked about students coming into universities who are going to do academic study, publish and move on to positions of influence in academia. How can we address that? How is it being addressed in universities? Because we’re not hearing the best things about that on our side.

Mr. McQuinn: It is a complicated answer. There are two things that are important to emphasize. One is that we are, and Mr. Kolga can speak to this as well, moving increasingly away from the idea of misinformation and disinformation to the idea of influence campaigns and repression campaigns. We’re looking at the goals they are trying to achieve. The most effective information campaigns actually use a combination of truths, half-truths and falsehoods. We did a study of the Taliban’s influence before the takeover in Afghanistan. We looked at the entire ecosystem, more than 150,000 accounts, and asked what they were talking about. Everyone thought it was going to be disinformation. It turned out that was a small fraction. They really were talking about the places they had taken over and portraying them as these peaceful, Shangri-La places. Partly, it was true, so you see their ability to shape the narrative came from a combination of those elements.

Going back to the previous questions regarding what the Russian goals are, how they use it and whether it’s disinformation: it doesn’t matter. It’s really about the outcomes. As we’re seeing with the students, they are coming in with a set of framings already in place that is exactly what the Russian influence operations would like to see. I think that’s the piece.

What do we do? Mr. Kolga already talked about looking at this in middle school. There are a lot of countries doing this at the middle school level and not waiting for university. At that point, we have people coming in who already have a framing of the world, which we then have to, I won’t say deconstruct, but we have to share more information, and they have to start figuring out how they’re going to navigate that.

We, as a democratic state, are always at a disadvantage — I think it’s always important to remember this — because we work more slowly and we can’t attack with the same mechanisms of pure lies as they do, so you’re always going to be at a disadvantage. It has to be a more structural approach because you can’t do tit for tat in the same way. You have to have a much more structural and holistic approach that is never going to be as responsive.

The Chair: I’m sorry, Mr. Kolga. We’ll have to come back to you on the second round.

Senator Cardozo: I have a question for all three of you, so if you can take a minute each, that would be great.

You’ve talked about how Russia is pretty tight with the Republicans, and I don’t understand how that ever happened compared to, say, 10 years ago, but here we are. Can that happen here? We see, as was talked about on the previous panel, that there are connections between Russia and Rebel News. That’s question one, and the second is: As much traditional media falls away, both broadcast and print, is there an increased importance for the public broadcaster, the CBC?

Mr. Graves: I’d love to handle that one, if I could, because I’ve studied that, and it’s caught squarely in this vortex of misinformation and disinformation. The incidence of those who want to defund the CBC is about a quarter. It’s our most trusted news organization by a fairly significant margin. People say it’s good value for money when I tell them how much it costs.

When I rate this on a continuum of how many questions you got wrong that have nothing to do with the CBC, the sort of questions I just talked about, it is enormously predictive of whether you think we should defund the CBC or not. We’re finding, for example, that those who regularly listen to CBC or other mainstream news are actually less susceptible.

I want to make one quick point on the role of university education because, of all the demographics, it is our best predictor, and the prophylactic against disinformation is to develop critical reasoning skills. There are all kinds of reasons, but here’s one quick statistic. At the beginning of this century, when I asked people, “Do you think a university education is a sound economic investment?” 85% of Canadians said yes. That number is now 40%. I would argue it’s linked to this same polarization and epistemic crisis where we are no longer just entitled to have our own opinions, but we can have our own facts. We can all have our own facts now. This is really disturbing, and this problem fits very squarely into that problem.

Mr. Kolga: Just very quickly, I wouldn’t categorize Rebel News as a conservative outlet. It’s an illiberal populist outlet that basically takes a firehose approach, pushing information into an echo chamber. They don’t espouse any traditional conservative values.

As far as Conservatives are concerned, I think we have to be reminded that Russian state media has, going back to 2014, targeted Conservative MPs. Stephen Harper was a major target. Andrew Scheer has been a target, as have Chris Alexander and Jason Kenney. Even Shuvaloy Majumdar, a newer MP, has been a target of Russian information operations over his positions on Ukraine.

I would also argue that Russia is targeting the far left in Canada as well. Grayzone News. One of the previous panellists mentioned Global Research. These are platforms with significant following, so they are exploiting both sides, the far left and the far right, and what they’re doing is weaponizing information. We talked earlier about misinformation and disinformation. When it comes to Russia, it is only manipulation of information and the weaponization of information to, again, erode the cohesion of our society and distort our understanding of the world around us.

I don’t think misinformation and disinformation, those two terms, play any part when we’re talking about Russian information operations targeting Canadians.

Mr. McQuinn: Just one point, because I agree with everything that Mr. Kolga talked about. When I said “Republicans,” I probably should have said “Donald Trump,” because I think in some ways, the classic Republican structure would have responded very differently to this, and we see this. That’s important. To even call it a Republican Party anymore, I think, is a misnomer. It is now the Party of Trump.

It’s really important that we identify this. The Russians have no right or left. Nor do they, necessarily, have any preference. All they care about is, “How do we amplify as much discourse as we can?” One of the things that I found most interesting is a study that was done a couple of years ago on the Black Lives Matter online discourse. The majority of the most avid and most rabid supporters on both sides were actually Russian agents. They were basically amplifying both sides, and this was demonstrated quite clearly. The politics are not as important as the impact and their ability to use it.

I’ll end with one last idea. What the Russians do and what they have found is that if the idea they are pushing is closely aligned to the communities that they’re pushing it into, it’s more effective. Every day, they are pumping different things — and most of them don’t work, but a few of them go viral. It’s kind of rotating themes that get pumped through. They don’t do just one. They pump these out in quite extraordinary fashion.

The Chair: Your time is up. I’ll get you in on second round. Senator Dasko. I’m trying to be fair to everybody here.

Senator Dasko: Thank you, chair.

I have a question for Mr. Graves and Mr. Kolga. The topic of polarization has come up a lot. We’ve heard about issue polarization in our conversation today. We’ve heard about demographic polarization. You mentioned young people. Then there are conspiracy theories. There are a lot of layers to the polarization. Mr. Kolga, you mentioned polarization as well. I’m just trying to understand what this polarization looks like, and then there are a lot of correlations among views that are polarized.

Are we talking about a population that is, let’s say, on a bimodal distribution? Is that what we’re looking at? Or are we looking at a certain segment that is, at one end, where all of these — let me describe them as crazy ideas — that’s the only way I know how to describe them — where they all come together, and that’s the polarized segment, and everybody else is dispersed through different modes of thought? Does this question make sense?

Mr. Graves: I can quickly try to take a shot at this. I’ve looked at this pretty carefully, and it’s a straight monotonic progression with our index on various things, the issues, for example.

For example, if I look at the question of, “How important a priority is climate change?” For the 40% to 50% of Canadians — it depends, because I vary the scales from time to time — who get all the questions right, 90% say that’s a really important priority. As I move up to the moderately disinformed, that drops to, say, 70%. For the more significantly disinformed, it drops down to 40%. As I move to the most disinformed group, who get most of the questions wrong, it’s 3%. So you go from 85% who say, “That’s an important priority, that’s an existential crisis for our country,” to 3% at the polarity; and that type of incredibly powerful distribution is something we see on a broad array of issues, and I never saw that in the past.

Senator Dasko: But there’s a huge correlation among all these views.

Mr. Graves: Oh, yes.

Senator Dasko: Well, disinformation — so that’s measured by a set of questions, but then we’ve got polarization around a whole bunch of issues that correlate with that is what you’re saying?

Mr. Graves: Exactly. It really eliminates having a consensual framework to address some of the important issues of our day, because you have such incredible levels of — and, by the way, this type of polarization is morphing into not just, “I don’t agree with you on that issue,” but it’s becoming what we call affective polarization: “I don’t like you, I don’t want you to live on my street, I don’t want you to date my daughter.” This is a particularly insidious form of polarization, which cripples our social cohesion, our sense of respect for our fellow person and citizen, and it’s something that didn’t exist to this extent at all 5 or 10 years ago.

Senator Dasko: What percentage of the population is in this extreme segment, would you say?

Mr. Graves: About 7%. Yeah, they’re radicalized. They actually share certain characteristics. They seek chaos and they lack conscientiousness. So when we ask questions like, “I really enjoy seeing disasters in foreign countries,” they go, “Yeah.” Now, that group is not particularly accessible. What we worry about is the degree to which they contaminate others. But in the group that is in the 40% to 50% range, who are moderately disinformed, there’s a lot of plasticity. They will change their views. In fact, when I compare a longitudinal analysis, following the same individuals, with a cross-sectional analysis, the cross-section makes it look like nothing is changing. When we measured disinformation at different points in time, seven months apart, half the people moved either up or down, and we could find keys as to why that was happening. So that’s important. We just have to find out how to do it and where to go. Most of them, when they actually move out of the state of disinformation, say, “Oh, I wish someone had told me.”

One of the things that we found in our analysis is that people who contracted long COVID said, “Why didn’t you tell me this was going to happen? This disinformation stuff is really bad.” We want to find a less jarring kind of prompt than contracting a post-COVID condition. But the point is that there is plasticity. People do move.

Senator Dasko: Right. In the other segments.

Mr. Kolga, you mentioned polarization. How do you see polarization unfolding? How would you describe it? You used the term, so I wanted to ask you about that.

Mr. Kolga: What I’ve been doing for a number of years is analyzing the narratives that are being injected into our information space by Russian operators. What they are very good at doing and what they’ve become exceptionally good at, especially since COVID, is identifying those narratives and those issues that are the most polarizing and basically becoming rage farmers, pouring fuel onto those issues. And it doesn’t matter — I mean, Ukraine aside, it’s any issue that polarizes us.

Mr. McQuinn brought up the Black Lives Matter issue. This was one of the first ones that came up where it was very clear that Russia was amplifying narratives on both sides of that issue. It’s done this with environmental issues. Where we see these narratives coming together on the far left and far right are on anti-NATO, anti-Western issues. When you have influencers that are tapped into, that are platformed, we see some of these narratives bleeding into the centre as well, those who are not part of the far left and the far right. This is the study that Mr. McQuinn and I worked on, to see how those regular Canadians were being impacted by this.

Those networks that we are seeing online aren’t isolated; regular Canadians are seeing what’s going on there, and they are getting exposed to those narratives. We’re not doing nearly enough to stop that contamination of those who are not at the extremes.

The Chair: We’re going to move to the second round now. Senators, you each have two minutes, including the answers to your questions. Please keep your questions succinct, and you’ll get an answer.

[Translation]

Senator Dagenais: My question is for Mr. Kolga. We are talking a lot about disinformation coming from Russia. Can you tell us about other countries’ use of disinformation and the scope of their threat to Canada?

[English]

Mr. Kolga: Sure. Thank you for that question.

We’ve heard from other panellists about China. China has been trying to emulate Russian information operations but has not done so successfully. Anyone who has looked into Chinese state media platforms, especially English-language ones, will probably agree with my assessment that the information there is indigestible. It is written in a language that’s very difficult for any normal Canadian to consume. So they haven’t done this very well. There is some suggestion that China is looking to Russia to contract out that work to the Russians because they are so effective at it.

China is extremely effective in the Chinese-language media space and in controlling its diaspora, not just here in Canada but in other countries as well. They do so through their influence operations, which means through their diplomatic missions in foreign countries and such. They’re very effective that way.

Iran is also quite effective in controlling its diaspora communities, but it is not so successful in impacting discourse in the English-language information space.

We’re seeing India rising as well. India and many of its largest English-language platforms have also, inexplicably at this point, taken to amplifying Russian narratives. It is probably to attract clicks and for revenue, but they are also becoming an amplifier for Russian disinformation. At least, that’s what we’re seeing right now.

Senator Patterson: Thank you very much. When I listen to all of this, we have to remember that Russia is using information as a weapon. It isn’t because they care about the truth. They are weaponizing information, which I think is quite hard for Canadians to comprehend. They’re using techniques you could call infiltration to co-opt groups. I’m using military terms here for a very good reason: I think they can help us understand what is going on.

When you’re on the defensive, as Canadians are, we struggle with the concepts of free speech.

Dr. Kolga, I’m going to go back to my other question of looking at our university-age students in particular, because they are voting age people who will inherit whatever we do today. What can we do to try and help them understand that these are tools that are being weaponized against them, in a gentle way, so to speak? Also, how do we actually build into university education critical thinking skills that are particularly related to this while we aim for the long term of trying to educate from kindergarten forward? Thank you.

Mr. Kolga: I’ll let Professor McQuinn talk about what we can do in universities, but to go back to the Finnish model, the Finns are extraordinarily effective at injecting digital media literacy into every single course throughout a child’s learning, from kindergarten up until they graduate from high school, and into university as well.

You may have a course on statistics, and the curriculum will have digital media literacy and critical thinking in terms of information operations built into it. It is the same with a science class or a social media class. All classes have it built into them.

That is unlike our system where we are looking at disinformation and misinformation and trying to raise awareness among younger Canadians, but it’s often one course. It might be one hour in a semester or a year. There are individual civil society organizations that are doing this work, but it’s not being done in the same systematic way as Finland is doing it.

We really need to look at countries like Finland that are inoculating all of their young people to this threat. We need to emulate that in our own school systems.

Mr. McQuinn: This is where critical thought comes in, and I think your question has the answer embedded within it. We have begun to change how we look at critical thinking in our classes, making it more explicit. The unfortunate reality is that it is not seen as a priority at universities. From my experience, we are more focused on generative AI and its impact on assignments. This is something that I implore this committee to start pushing harder on. On any international issue, because I do international studies, half of the students already have these ideas in their minds that are quite extraordinary. The Russians are successful in ways that I think are not fully appreciated in their ability to shape certain ideas.

When 70% of my students get their news from TikTok — yes, I had the same reaction as you just had — what actions are the Canadian government taking on TikTok? We have these extraordinarily brilliant civil servants. They would probably scoff at the idea that they would be pumping out 300 TikTok segments a day, but they probably should be. If they’re all doing it, the 20-year-olds aren’t going to be watching it, I can tell you that.

It’s about having to build an infrastructure that is quite similar. We simply do not think in these terms.

The challenge is this mismatch of systems and institutions.

Senator Cardozo: The point about building a new infrastructure that would work on social media is certainly important. It always strikes me as bizarre that when we have a problem with X, nobody can fix it by creating another platform that will be as successful.

Professor McQuinn, we’ve talked about this a bit, but how does the influencing happen? Are the influencers sitting in Moscow, or are they paying people in Canada to be influencers in Canada and elsewhere?

Mr. McQuinn: I’ll give a very short answer and then pass it over to Mr. Kolga, because he really is the specialist in Canada. I will talk at a structural level.

One of the things we found is the ability of the farms that produce the narratives, and there are farms tailored to different parts of that spectrum, from far right to far left, to look at the news every day, pull out whatever is happening that day and start to weaponize and direct it. It’s literally institutionalized. To match that, we would have to have equal infrastructure, and it couldn’t be run by civil servants, because it’s just not how they would think about it. The question is who would do that and who would fund it?

Mr. Kolga, I will hand this over to you, because this is right up your alley.

Mr. Kolga: Thank you, Professor McQuinn.

The Russians have been doing this very successfully for 100 years. They identify influential elected officials. I think Senator Kutcher mentioned Igor Gouzenko, whose files exposed this back in 1945. They target elected officials, journalists and academics. They were doing this back in the Second World War and throughout the Cold War, and they’re continuing this work right now.

I think the Tenet Media indictment is very clear. They identify the most impactful influencers. They connect with them somehow, either using money or it could be through other means. That’s the way. This is how Russia is operating. They’ve had an enormous amount of success doing that.

We have completely failed to disrupt these operations. We have failed to deter them and prevent them. This is an area we need to start looking at.

Swatting down various narratives through fact-checking is important work, but if we really want to get to the root of this problem, it means going after those influencers and exposing them so that Canadians are aware of who they are and whose voices they’re amplifying in this country.

The Chair: Thank you, colleagues. This brings us to the end of this panel. I want to extend a sincere thanks to Mr. Graves, Mr. Kolga and Mr. McQuinn. The committee greatly appreciates your participating in our study and taking the time to share your insights and research with us today. Thank you so much for your time here with us.

Colleagues, I would be remiss if I didn’t acknowledge that today is the last day on the committee for our colleague Senator Dagenais, who has been on this committee ever since I’ve been on it. He is the longest-serving member on this committee who continues to be here. I want to take the opportunity to say a few words.

Certainly, his past chairs, if they were here, would be echoing the same thing. I want to thank him for his sincere commitment to our great country. His career didn’t start in the Senate; he arrived here in the latter part of it. He started out as a police officer, and he served greatly. Later he was a delegate, a regional director, a vice-president for finance and, like myself, a president of a union, in his case one representing police officers, of all people. More importantly, of course, that brought him to the Senate.

He was appointed by former Prime Minister Stephen Harper. He has served on many committees of the Senate. Early next year he will be retiring from the Senate. Given that this will likely be the last meeting on this committee for this year, I want to thank him for all his contributions since he has been here. He has provided rock-solid support as our deputy chair, but equally, in all the studies since I’ve been here and have been participating, he has contributed richly to the discussion, and he has shown what collegiality is all about.

He has never abused his responsibility on the committee, in my experience. He has always shown openness in welcoming you as a new member and, equally, in supporting your efforts to be successful on this committee.

On behalf of all of us, I want to congratulate him on his retirement in the near future. I wish him a lot of health, and I hope he comes back to visit us in the near future and share the good life he will be enjoying very shortly.

Over to Senator Dagenais.

Senator Dagenais: First of all, it’s around 7:30. I would like to thank all of you for your support, and you are very good colleagues. This is the first time I will make my speech in English. I will practise my English.

Thank you, Mr. Chair. I appreciate working with you, and good luck in the future.

Thank you to all my colleagues. It has been a great experience. I have completed 12 years now on the National Security, Defence and Veterans Affairs Committee, and around 10 years of those have been as deputy chair.

I would like to thank Ericka Paajanen, because I remember when I chaired this committee, it’s not easy, but you are there, and it is so important.

I would like to thank the analysts and the interpreters. It’s not easy to translate my Quebecer expressions. I would like to thank all the technicians and the witnesses, because when we have good witnesses, it’s very interesting.

Now, I realize it’s time to write another chapter in my life. I have enjoyed this time. In my 13 years in the Senate, this has been a great experience.

Thank you, my colleagues.

Hon. Senators: Hear, hear!

The Chair: Colleagues, this will mark our last meeting for this year. We will not be meeting next week, as we’re getting into the final stages of work in the Senate.

I think I would be remiss if I didn’t thank our clerk for her steadiness in guiding us in our important work on this committee, but equally in organizing our witnesses and, when we have disruptions, making everything run as smoothly as possible.

I want to thank our analysts for the tremendous hard work they do in helping us prepare for committee and, equally, the people behind the scenes whom we don’t see: our translators, our technical advisers and others who keep the committee flowing so that the public can participate and witnesses testifying from afar are able to do so.

I want to thank all the staff on behalf of the senators who serve on this committee for the tremendous effort they put in to help senators coordinate their activity and timing. More importantly, I want to wish everybody a sincere and happy holidays and good health, and we look forward to seeing you back here in the new year.

Senator Dagenais: I would like to thank the pages, who are also so important here, too.

The Chair: With that, this will mark the end of the meeting.

(The committee adjourned.)
