THE STANDING SENATE COMMITTEE ON NATIONAL SECURITY, DEFENCE AND VETERANS AFFAIRS
EVIDENCE
OTTAWA, Monday, April 15, 2024
The Standing Senate Committee on National Security, Defence and Veterans Affairs met by videoconference this day at 4 p.m. [ET] to examine and report on issues relating to national security and defence generally.
Senator Tony Dean (Chair) in the chair.
[English]
The Chair: Welcome to this meeting of the Standing Senate Committee on National Security, Defence and Veterans Affairs. I’m Tony Dean, senator from Ontario, the chair of the committee. I’m joined today by my fellow committee members, whom I will ask to introduce themselves, beginning with our deputy chair.
[Translation]
Senator Dagenais: Jean-Guy Dagenais, Quebec.
[English]
Senator Oh: Victor Oh, Ontario.
Senator White: Judy White, Newfoundland and Labrador, replacing Senator Anderson for today.
Senator M. Deacon: Welcome. Marty Deacon, Ontario.
Senator Cardozo: Andrew Cardozo from Ontario.
Senator McNair: Hello. John McNair from New Brunswick, replacing Senator Kutcher today.
Senator Yussuff: Hassan Yussuff, Ontario.
Senator Dasko: Donna Dasko from Ontario.
The Chair: Thank you, colleagues.
On my left is the committee’s clerk, Ms. Ericka Dupont, and to my right are the Library of Parliament analysts Anne-Marie Therrien-Tremblay and Ariel Shapiro, who support us so well.
Today, we welcome three panels of experts who have been invited to provide a briefing to the committee on disinformation and cyber operations in the context of Russia’s war against Ukraine. We’re continuing our focus on Ukraine, but specifically now on cyber operations.
I’ll begin by introducing our first panel of witnesses. From Global Affairs Canada, I’d like to welcome Tara Denham, Director General, Office of Human Rights, Freedoms and Inclusion, and Kelly Anderson, Director, International Cyber Policy. From the Communications Security Establishment, we welcome back Mr. Sami Khoury, head of the Canadian Centre for Cyber Security.
Thank you all for joining us today. We now invite you to provide your opening remarks. We’ll begin with Tara Denham, who will speak on behalf of Global Affairs Canada.
Ms. Denham, whenever you’re ready, please commence.
[Translation]
Tara Denham, Director General, Office of Human Rights, Freedoms and Inclusion, Global Affairs Canada: Mr. Chair, members of the committee, thank you for your invitation to discuss disinformation and cyberoperations in the context of Russia’s war against Ukraine. It’s been two years since Russia invaded Ukraine.
As we enter the third year of Russia’s illegal aggression against Ukraine, the Kremlin continues its efforts to reduce Ukraine’s ability to defend itself. Moscow also continues to use all available means to try to reduce international support for Ukraine. These tools include cyberoperations and disinformation.
[English]
Cyber has been a domain of conflict since before the invasion, and it will remain a contested domain when the hostilities end. However, in both peacetime and war, there are rules that states are expected to follow for responsible state behaviour in cyberspace.
Russia has been, and continues to be, a particularly egregious actor in cyberspace. It has repeatedly disregarded the United Nations framework for responsible state behaviour in cyberspace, which makes clear how international law applies in cyberspace and promotes the UN norms for state behaviour.
Global Affairs Canada works diligently to promote and defend the framework at the UN and in our bilateral and regional engagements. We also make clear what is and what is not acceptable by calling out unacceptable behaviour.
Canada, along with partners including the United States and the United Kingdom, has called out malicious cyber activity by Russia seven times in the last four years. Most recently, in December 2023, the Minister of Foreign Affairs issued a statement of support to the U.K. condemning electoral and political interference against the U.K. by Russia.
In addition, Canada, along with the U.S., the U.K. and the European Union, attributed malicious cyber activity against commercial satellite communications networks to disrupt Ukrainian command and control during the February 2022 invasion. Those actions had extensive spillover impacts in other European countries not involved in the conflict.
Canada also works to lessen and mitigate the impact of Russian cyber operations against Ukraine by helping Ukraine build its cyber resilience. In February 2024, the Prime Minister announced further funding for cyber assistance to Ukraine to strengthen Ukraine’s ability to deter and counter cyber-enabled threats from Russia and Russian-affiliated non-state actors.
Canada has also been a leading voice in creating and shaping the civilian platform that organizes cyber assistance to Ukraine, which is called the Tallinn Mechanism. The Tallinn Mechanism provides a platform to enable cyber capacity building to be coordinated, avoid duplication and meet Ukraine’s priority needs. It complements similar work that takes place in the military domain.
Along with malicious cyber activities, Russia has long employed state-sponsored disinformation as part of a broader hybrid tool kit to achieve its geopolitical and military objectives globally. In the case of Ukraine, Russia conceals, blurs and fabricates information to gain military advantage, demoralize Ukrainians, divide allies and garner domestic and international support for its illegal invasion.
Russia has also increased its targeting of broader international audiences, notably in Africa and Latin America. For example, narratives about Ukraine being at fault for the global food crisis are spread by Russian political figures and furthered on social media and in state-owned media articles.
In response, Canada has adopted a strong posture to counter Russia’s efforts to manipulate information and spread false narratives in its favour. We have publicly called out the Kremlin on its disinformation tactics related to Ukraine, including through campaigns on Russia’s illegal annexation of the Donetsk, Kherson, Luhansk and Zaporizhzhya oblasts of Ukraine. We have issued video exposés highlighting Kremlin tactics on the exploitation of social media platforms, state-sponsored media and disinformation.
We work with allies to monitor, report and share assessments of Russian disinformation, such as through the G7 Rapid Response Mechanism, which was announced in 2018 in Charlevoix as part of Canada’s presidency.
Canada has deployed sanctions to target entities and individuals involved in Russian disinformation operations. To date, we have sanctioned seven individuals and three entities for their roles in disseminating disinformation targeting Ukrainian audiences. We also fund projects to support whole-of-society counter disinformation efforts.
We’ve long acknowledged that no government can tackle this issue alone, so it’s important to work with civil society and academia. For example, Canada is providing $2.5 million to the International Institute for Democracy and Electoral Assistance to increase the capacity of civil society organizations to more effectively counter foreign information manipulation.
With that, I think I’ll close my opening comments and look forward to your questions.
The Chair: Right on the button. Thank you, Ms. Denham.
We now go to Mr. Sami Khoury. Welcome back. Please go ahead whenever you’re ready. We’re looking forward to hearing from you.
Sami Khoury, Head, Canadian Centre for Cyber Security, Communications Security Establishment: Good afternoon, chair and members of the committee. Thank you for the invitation to appear today. I’d like to begin by providing an overview of the cyber-threat landscape, focusing on threats emanating from Russia. Following this, I will provide an overview of how the Communications Security Establishment, or CSE, has worked to support a unified global response to Russia’s invasion of Ukraine.
[Translation]
With technology advancing at a rapid pace, the cyberthreat landscape in Canada is also constantly evolving. In a global environment marked by destabilizing events, threat actors are adapting their activities and using emerging disruptive technologies, such as generative artificial intelligence, to achieve their financial, geopolitical and ideological goals.
[English]
Cybercrime, including ransomware, continues to be the cyber‑threat activity most likely to affect Canadians and Canadian organizations. However, the state-sponsored programs of Russia, China, Iran and North Korea continue to pose the greatest strategic cyber-threat to Canada.
Russia’s invasion of Ukraine in February 2022 gave the world a new understanding of how cyber activity is used to support wartime operations. Russian-sponsored malicious cyber activity against Ukraine has disrupted or attempted to disrupt organizations in government, finance and energy, often coinciding with conventional military operations. Cyber and military activities have also been supported by coordinated disinformation operations.
The Centre for Cyber Security’s unclassified National Cyber Threat Assessment report outlined how nation-states are increasingly willing and able to use misinformation and disinformation to advance their geopolitical interests.
Furthermore, AI-enabled technologies are making fake content easier to manufacture and harder to detect. Adversary states are constantly circulating and amplifying false content that supports their interests.
[Translation]
Since the Russian invasion of Ukraine, we have seen numerous Russian-sponsored online disinformation campaigns aimed at spreading false information about Canada’s involvement in the Russia-Ukraine conflict and about NATO allies in order to discredit them.
For example, state-controlled media have been directed to publish doctored images of members of the Canadian Armed Forces deployed on the front line and false allegations that the Canadian Armed Forces are committing war crimes.
[English]
Beyond disinformation, state-sponsored actors are targeting critical infrastructure to collect information through espionage, to pre-position in case of future hostility and as a form of power projection and intimidation.
The invasion of Ukraine has demonstrated that Russia is increasingly willing to use cyber activity against critical infrastructure as a foreign policy lever.
Closer to home, foreign cyber-threat actors, including Russian‑backed actors, are attempting to target Canadian critical infrastructure networks, as well as their operational and information technology.
While I can’t speak to CSE or the Cyber Centre’s specific operations, I can confirm that we have been tracking cyber-threat activity and have been working with Ukraine to monitor, detect and investigate potential threats, and to take active measures to address them.
The Cyber Centre has also been working closely with domestic partners and international allies to support a unified global response to Russia’s invasion of Ukraine. Specifically, we have monitored for malicious Russian cyber activity against Canada, Ukraine and NATO; bolstered the Government of Canada’s defences against known Russian-backed cyber-threat activity and countered Russian disinformation; shared cyber-threat information with key partners in Ukraine, NATO allies and Canadian critical infrastructure; and provided intelligence and cybersecurity support to Operation UNIFIER, the Canadian Armed Forces training mission in support of Ukraine.
At the request of our Latvian allies, the Cyber Centre has also deployed personnel to help defend against cyber-threats on Latvia’s critical infrastructure and government network.
[Translation]
These deployments are part of a joint mission involving cybersecurity specialists from the Canadian Armed Forces, the Canadian Centre for Cyber Security and its Latvian counterpart.
This joint mission helped defend a NATO ally against hostile cyberthreats.
[English]
Last week’s defence policy update, Our North, Strong and Free, highlights the need to respond to significant global shifts and an evolving threat landscape.
As you heard from the minister, the government has announced a commitment of $8.1 billion in further investment into Canada’s defence capabilities over the next five years. This includes a $1 billion commitment to CSE’s foreign cyber operations program and to increase foreign intelligence collection capabilities.
In total, the defence policy update commits $2.8 billion over 20 years for cyber capabilities. These investments will enable Canada to take actions through cyberspace that counter threats, advance foreign policy interests and support military operations.
[Translation]
In conclusion, as we adapt to an ever-changing threat environment, we will continue to work closely with our Five Eyes partners and leverage our unique technical and operational expertise and capabilities to confidently ensure Canada’s resilience to cyberthreats and disinformation.
[English]
With that, I thank you for the opportunity to appear before you today. I look forward to answering any questions you may have.
The Chair: Thank you, Mr. Khoury.
Colleagues, our panellists are with us for one hour. To ensure each member can participate fully, we’re limiting questions and answers to four minutes. Please keep your questions succinct and identify the person to whom you’re addressing them.
[Translation]
Senator Dagenais: My first question is for Mr. Khoury. The war in Ukraine has been going on for over two years. There’s the year leading up to the war and the time since the invasion. The Russian disinformation surely began well before the start of the conflict.
Can you tell us to what extent Russian cyberoperations have changed or evolved over time? Also, in general, how long does it take to spot and distinguish between the real and the fake?
Mr. Khoury: Thank you for your question.
Indeed, Russia has been obsessed with Ukraine since 2014 or 2015. A series of cyberactivities began at that time; some have been quite damaging to critical infrastructure.
We are well informed through our intelligence teams and through the partnership we have with our Five Eyes colleagues about what is happening in Ukraine. We have seen the evolution of Russian tactics and a fairly rapid adaptation of these tactics. So, as soon as we detect something, we issue a bulletin to help our communities defend themselves. We’ve seen that the Russians will often adapt or modify their techniques within 24 hours to try and get round what we’re doing. So they are quite agile in adapting to our measures.
Along with Ukraine and our Five Eyes partners, we have a good pulse on their activities and we are helping our Ukrainian colleagues. In addition, we are learning about ways to defend Canada and we are increasing the resilience of our organizations.
Senator Dagenais: My second question is for Ms. Anderson. I’d like to talk to you about the government’s decision-making in relation to the disinformation spread by the Russians. Have the reports of Russian cyberoperations by the Canadian Centre for Cyber Security led to changes in decision-making? Have the reports sometimes come too late to give you a more accurate picture of the situation?
Kelly Anderson, Director, International Cyber Policy, Global Affairs Canada: Thank you for the question. I would say that we work very closely with Mr. Khoury and his team. So we obtain information fairly quickly to decide what we need to do about cybersecurity issues.
That said, the process for deciding whether to make a public attribution of cybersecurity issues is fairly lengthy, because we want to be really sure, within the federal community, that we know the nature of the event and that we take into account the effects that relate to the framework for responsible state behaviour in cyberspace.
This includes the impact on international law, but also certain standards of responsible behaviour.
[English]
Senator Oh: Thank you, witnesses, for being here.
My question for the panel is how do cyber operations, such as hacking and misinformation campaigns, overlap with Russia’s military action in Ukraine? What risks does this intersection pose for Canada’s cybersecurity?
I want to add one more question: Does the Canadian government provide enough funding to counter cybersecurity operations in Canada?
Mr. Khoury: Thank you for the question. I can answer the first part of your question.
Since the invasion of Ukraine, we’ve seen a synchronization between military operations and cyber operations. Russia’s goal was to weaken civil society and the government through cyber operations and also to inflict damage, which has impacted, for example, the telecommunications infrastructure in Ukraine.
We learn a lot from all of these cyber activities. We monitor them closely through our foreign intelligence mission, but also through the partnership we have with our allies. As quickly as we can, we turn what we learn from these into alerts, cyber flashes and advisories to promote resilience. Those started even before Russia invaded Ukraine. We had a sense that these activities were going on, and we issued a number of alerts even before Russia invaded Ukraine to ask that organizations be vigilant and on top of their IT.
Ms. Denham: On disinformation, Russia uses a whole slew of tools, and it uses them very strategically, aligning one with the other. Specifically, one of the narratives we saw before the invasion — you’ll probably all be familiar with it — was that Ukraine was being overrun by Nazis. Probably people in this room heard that narrative. It started over a year before the invasion through Russian disinformation campaigns. In 2021, when you looked at Russian state-sponsored media, citations of Nazism were at about 3%, and just before the invasion, they were up to over 20% of the narrative. So you can see how they start to flood the narrative space.
That’s one of the tactics.
When you combine that with cyber activities, they are laying the foundations, leading not only Ukrainians but also the international community to question what actions are and will be taken — and, in fact, trying to sow support for that illegal invasion.
We have to be more and more aware of how military operations are completely tied into, and often preceded by, disinformation and other activities that are laid out months or even years in advance.
Senator Oh: How about the government funding? Do we need more funding for a counteroffensive?
Ms. Denham: We’re always reviewing the support that’s required. We just had the Defence Policy Update and funding announced. There’s always a recurring review.
Our disinformation team was created in 2018, in acknowledgement of the strength of Russia-specific disinformation. That team was expanded in 2022.
We’re always going through revisions. Of course, if there are more capabilities, we’re able to work on those. But we’re continuing to review those requirements.
Senator M. Deacon: Thank you to all of you and your teams for being here today. It’s very much appreciated.
Ms. Denham, you mentioned how Canada is assisting Ukraine in its cyber defences. I’m wondering how deep this cooperation can go. Mainly, does it extend to offensive operations as well? I’m wondering about the norms of war when it comes to cyber operations. Is it more of a grey area if we assisted Ukraine’s cyber operations, or would it be akin to sending Canadian troops on the ground? I’m trying to distinguish those differences.
Russia, for instance — you talked about it — is constantly at work online here in Canada. Is it fair game if we assisted Ukraine in trying to shut down a power station with a cyberattack, for instance, or would that be similar to declaring war on Russia?
I’m trying to make the distinction between cyber and boots on the ground.
Ms. Denham: Maybe I’ll start, and my colleagues can add in if they wish. Again, I’m not well placed to speak specifically to military operations. A lot of the work we do, and that I was referencing, is building up civilian cyber capabilities with Ukraine; that’s the Tallinn Mechanism we referenced. There are the military operations — National Defence and others are more engaged in those discussions — but just as important is building up the resilience of Ukrainian civil society and organizations to respond to cyber incidents when they’re happening. That is separate from, as you’re indicating, a specific military operation.
The other part that is important, and that we are always focused on, is, as was referenced, the UN norms of responsible state behaviour in cyberspace. For any actions that we would be taking or advocating be taken, Global Affairs Canada would advise on whether those actions align with what has been agreed within the UN framework.
That’s where a lot of our conversations with whole-of-government colleagues are happening: on the capabilities we’re supporting and on where some of the advice and advocacy lie in terms of behaviour in cyberspace.
I’m not sure if Sami would like to add more.
Mr. Khoury: Thank you for the question.
We’ve been a strong supporter of Ukraine and have been working through the Canadian Forces. I will defer to the Canadian Forces as to what they do on the ground, but we’ve worked through them to pass to Ukrainian partners over there either intelligence or cyberdefence tips to have them build some resilience in their society. We’ve also worked with neighbouring countries, like Latvia, to also build that layer of resilience on the perimeter.
We know that Russia has pretty impressive cyber capabilities, and they haven’t hesitated to use them in 2015 and 2016 to shut down the electric grid in Ukraine. We are very mindful of that. Whenever we see something funny or suspicious in Ukraine, we want to learn from it, because we don’t want those capabilities to make their way to Canada.
From a team perspective, we’re very concerned about cyber defence and how to build national resilience in Canada against all of these threats.
The Chair: That was a question that was on all of our minds, Senator Deacon, so thank you for asking it.
Senator Boehm: Thank you to our witnesses for being here. It’s always great to see former colleagues appear as witnesses.
Ms. Denham, you mentioned the G7 Rapid Response Mechanism, or RRM, which was established by leaders in 2018, first discussed by foreign ministers and then taken to the leaders’ table. As I recall from those discussions, because I was there, there was some concern about the viability of the Rapid Response Mechanism: whether it would continue through to other G7 presidencies, how it would be funded and whether it was nimble enough to expand.
I met with the Latvian ambassador this afternoon. We talked a lot about Canadian-Latvian cooperation. It is not that Latvia is looking to join the G7 or anything like that, but is there enough space within the mechanism, first of all, to expand cooperation with countries that are not in the G7? Is it being sufficiently financed? Finally, is it really rapid? Because “bureaucracies” and “rapid response” are not usually found in the same sentence.
Is it working effectively?
Ms. Denham: Thank you for your question and for your work at the time when this was being created. I’ll go through and hopefully touch on all of your points.
In terms of how to maintain interest and momentum behind the structure: typically in the G7, an initiative can be announced by one presidency, and then perhaps the next presidency doesn’t see the same importance or relevance in the issue. The concentration of energy can then dip.
At the time, it was decided that Canada would maintain a secretariat for the Rapid Response Mechanism. That is not how these initiatives are traditionally set up. I think it was an excellent decision at the time, because by maintaining the secretariat, we were maintaining Canada’s focus and leadership on the issue. We’ve been able to maintain that focus since 2018 and continue to build and push the RRM to be more responsive and agile and to expand beyond the G7 members.
That flows well into the other question: We do go beyond the G7 members. I would think of it this way: The response is a core part, and when there’s a need to respond, it is led and coordinated by the G7 members. For example, a G7 statement can be very impactful in having those G7 countries sign on, so when we do need to call something out, that’s where we would want to keep the focus with the G7. However, we also have other countries that participate in the conversations, and we’ve created linkages with non-governmental experts and academics. I believe you have Marcus Kolga coming later.
So we’ve tried to expand that network beyond the G7 countries. For example, Australia, New Zealand and other countries that are interested in the issue participate in the information sharing, which is at its core.
Senator Boehm: Do they participate in the financing as well, or is that strictly on our side, since we have the secretariat?
Ms. Denham: Canada finances the secretariat. Respective countries are responsible for setting up whatever structures they require in their domestic entities, as do observers. Different countries may have a different level of focus on a particular issue, and that correlates with how many resources they have invested in it.
To date, the RRM has been most prominently focused on information manipulation and disinformation; that’s where we started, and that’s been our primary focus.
To your question as to whether we are nimble: we have also recognized the issue of transnational repression as a major concern. We’ve established a working group within the RRM, and we are working with colleagues in that area. Again, the RRM was about threats to democracy. Disinformation was the paramount focus at the time, but we have to continue to expand and be nimble to meet those challenges, and transnational repression is one we’re tackling.
Finally, to close out on the rapidity with which bureaucrats can respond: there are two elements to it. One is that we work entirely from unclassified, open-source information; the ability to be rapid comes from being able to share that quickly. We don’t have to declassify intelligence; it can be passed beyond the Five Eyes. That has proven quite successful in building our understanding of the issues.
The collective response is where we’ve had challenges, and, as the secretariat, we’ve noted that. We are reaching out to our G7 colleagues, because it is very difficult to agree as a collective, as the G7, on when we’re going to call out certain actions. We have all done it individually, and at times we’ve done it with a few of the members, but our focus is that we need to be more active in the response now that we’ve built the entities that it’s based on.
Senator Boehm: Thank you very much.
Senator Cardozo: I wonder if I could ask you to share with us a little bit more about the range of attacks that take place. Ms. Denham, you talked about disinformation, and I think you mentioned the issue of food production in Ukraine. Is there truth to that, inasmuch as Ukraine produces a large amount of wheat, much of which, as I understand it, is exported to Africa? So what is truth and what is disinformation in that issue? In general, what other kinds of cyberattacks are we seeing beyond disinformation? Are there actual warfare techniques taking place?
Ms. Denham: Maybe I’ll speak first quickly to the disinformation and ask Mr. Khoury to speak more on the cyber.
It’s an excellent example of how the Russian narrative blamed Ukraine for the global food crisis after the invasion. What had actually been happening for a number of years was a building global food crisis in terms of ensuring sufficient wheat production and moving it to different parts of the world, and that was amplified and made worse through COVID. So there was actually a build-up of these issues.
Then, with the Russian invasion, Russia blamed Ukraine for not getting the wheat out, when Russia had actually imposed the blockades and was not letting the boats, the transport, leave. So they take a piece of truth — there is a global food crisis, and Ukraine has traditionally been one of the major exporters — but they manipulate it to make it sound like the food crisis was Ukraine’s fault when, in fact, Russia invaded Ukraine, put the blockades in place, damaged farmers’ crops and targeted different infrastructure. So it was Russia that amplified the food crisis through the invasion.
That’s an illustration of how information is manipulated to actually undermine the credibility of other countries.
Senator Cardozo: The other part is they were just getting that information out in large quantities?
Ms. Denham: I’ll speak quickly. Cyber and digital often get intermingled. People say it’s a cyber operation and, of course, we have particular definitions.
One of the easy ways to think about it is a hack and leak: the hack is the cyber part — using the infrastructure, breaking in and taking information; the leak is the disinformation part — the manipulation of that information. So when you talk about cyber incidents, they’re targeting or breaking infrastructure, if I think of it that way, and disinformation is the manipulation of information that may have come from a cyber incident, but not always, if that is helpful.
Mr. Khoury: Thank you for the question. If I can add, we’ve seen a gradation in the sophistication of Russian cyber activity in Ukraine, everything from defacing websites to denial-of-service attacks against websites, which prevent people from reaching them. At the other extreme, we’ve seen many waves of what at the time was wiper malware. Essentially, they would release this code on a network, and the sole purpose of that code is to delete all information on that network, so I would put that in the destructive category.
There have been multiple waves of wiper malware that the Russians have tried to unleash on Ukrainian government infrastructure. Many of them were caught, and countermeasures were developed so that they became ineffective, but it was a bit of a cat-and-mouse game: they would release something, it gets detected, we adapt, then they tweak it, and so on.
They’ve also demonstrated the capability, for example, to shut down the electricity grid, as early as 2015-16. So that is the gradation: from defacing websites to shutting down the electricity in a particular part of the country using cyber means.
Senator Cardozo: Can they do things to tanks, for example, to cause them to not work or turn in the opposite direction or things like that?
Mr. Khoury: I’m not knowledgeable when it comes to military technology, so I would defer to somebody maybe at National Defence. But we live in a connected world, so is it possible? Maybe.
Senator Dasko: Thank you for being with us today. Ms. Denham, you mentioned that seven people were sanctioned. Can you elaborate on that? How much can you tell us? I think that will help us understand what is considered sanctionable and tell us something about the situation.
Ms. Denham: Sure. On the sanctions that Canada has applied — and there have been a number of sanctions since the illegal invasion; I don’t have the exact number with me. In terms of the disinformation, let me start by saying that within our sanctions regime, which we need to apply very judiciously, there are a few criteria for applying sanctions. One is that we need to be able to back it up with information — open-source information that provides evidence of the issue; two, indications of gross human rights violations; three is — .
Ms. Anderson: International corruption.
Ms. Denham: Thank you, international corruption. Sorry, I always forget the third one.
Those are the areas in which we can apply sanctions, and the team reviews whether individuals who have been suggested or recommended for sanctions fall under one of those categories. Currently, cyber activities do not fall within those three areas. What we have been able to do is apply sanctions related to disinformation. There have been other sanctions about which you have heard — very public releases about corruption and other human rights violations taking place within Ukraine. Those packages undergo a review and are approved, and, when and if possible, we share that information with like-minded countries, because the power and potential impact of sanctions increase when multiple countries impose sanctions on the same individuals.
Senator Dasko: Was Ukraine ever sanctioned, and by whom?
Ms. Denham: I was referencing sanctions packages put in place by Canada against the Russians who had been identified as participating in, advocating for or being a member of disinformation campaigns that had been launched.
Senator Dasko: In Canada?
Ms. Denham: No, in Ukraine.
Senator Dasko: Our laws are sanctioning actors in Ukraine who are disseminating disinformation. That’s my understanding.
Ms. Denham: Yes, the individuals that were sanctioned were Russian actors — not Ukrainians — involved in disinformation campaigns, who were targeting Ukraine or undermining support for Ukraine internationally.
Senator Dasko: What kinds of sanctions did we impose on them?
Ms. Denham: I can speak generally. Again, these would be economic sanctions.
Senator Dasko: Is this public information?
Ms. Denham: The sanctions packages are all released publicly, so you can actually get the names of the individuals. I don’t have them all here with me. But if you think about the economic sanctions, examples that have been made public include blocking assets, freezing assets and so forth.
[Translation]
Senator Carignan: My question concerns the cybersecurity ecosystem. There are more and more companies specializing in cybersecurity that are selling their services, companies that become future NVIDIAs and explode on the stock market.
I’m trying to see how — or even if — these companies protect our infrastructure well. Are these companies helping to protect our critical infrastructure? I’m thinking of the electricity system, the Canadian banking system, banking transactions and all the strategic civilian infrastructure that could be targeted. Do you support them? Are you giving information to these companies or organizations, whether it is Hydro-Québec or another, that manage essential or strategic infrastructure? Or are we letting the big private companies — Israeli and American companies, for instance — that specialize in cybersecurity advise our strategic organizations or companies?
Mr. Khoury: Thank you for the question. Partnerships are very important to us, whether at the federal or provincial level or with organizations. We work with everyone to forge partnerships in order to exchange information in a timely manner, not just in times of crisis. Whether it’s with the big banks or with organizations like Hydro-Québec or others, we have partnerships that enable us to share information when needed and to encourage the exchange of technical knowledge, for instance about things we see and ways we can help them defend themselves better.
As you’ve seen, there are a lot of commercial companies specializing in cybersecurity. At the Canadian Centre for Cyber Security, we are not in a position to recommend company A over company B. We suggest that people take certain factors into consideration before choosing who they do business with. Basically, large companies have their own cybersecurity teams and they rely on organizations like us or other specialist companies to provide them with what is known as threat intel in order to better protect themselves. That’s how we operate: Ultimately, everyone has to play their part, but from the outset, partnerships are present in everything we do.
Senator Carignan: You don’t have a set modus operandi for a sector, with standards for communication or timely information? Maybe so, but I wonder, because we see what is happening with the Foreign Interference Commission. We see that there are wait times; we see that a decision is made to transmit information, not to transmit it or to transmit it within a certain timeframe. How does this work, and how can relevant information be passed on quickly without sitting on a desk because someone thinks it’s not necessarily important now, but it might be in the future?
Mr. Khoury: Our teams monitor everything that happens in the cyberworld 24 hours a day, 7 days a week. We try to react as quickly as possible. If the information is classified, we can declassify it within a few hours and share it with the private sector. On your first point about standards, there are no standards for the private sector at the moment, but Bill C-26, which is currently before Parliament, will establish standards in four main sectors. Working with our counterparts in the provinces and territories, we hope to develop a similar or equivalent system for sectors not covered by Bill C-26.
[English]
Senator Yussuff: Thank you, witnesses, for being here. I have two questions. One is following up on what my colleague, Senator Deacon, asked earlier. With regard to the war in Ukraine, what lessons did we learn in the context of the cyber dimensions of war? This is a whole new area that — it seems to me — will dominate this new century: how wars are fought and — more importantly — how information is used. Equally, how does our country respond to this? We’re playing an important role on the ground in supporting Ukraine in a multitude of ways to help them with the challenges they face. What have we learned and how does this inform us in going forward with our own challenges and dealing with these sorts of issues?
Mr. Khoury: Thank you for the question. What we’ve learned is what Russia demonstrated: Cyber is yet another tool in the range of capabilities that a military or nation has at its disposal, and the synchronicity between them can be fairly damaging. It’s not just about kinetic power; if you combine kinetic and cyber, you can achieve extra impact. We’re learning a lot from what Russia is doing, both when it’s working and when it’s not working, because a lot of things in this conflict did not go according to plan for the Russian side. This is partly because of the resilience of Ukrainian society, which has been the subject of cyberattacks for many years and so has built an element of resilience. That’s what we’re advocating for in Canada: We need to build resilience in society to be able to withstand potential conflict or potential cyberconflict.
The other thing is this concept that we have also talked about: pre-positioning on critical infrastructure. It’s not only in time of war that the enemy will come after us; sometimes they can be hiding within our own networks and, in time of conflict, spring into action. That’s why we’re also working diligently with critical infrastructure operators to make sure they go hunting on their networks for any sign of suspicious activity, because we want to avoid the scenario we picked up last summer with Volt Typhoon, a People’s Republic of China, or PRC, actor hiding on a critical infrastructure network.
Senator Yussuff: Do we have a similar capacity in our response? We’re not at war with anybody, but do we have the capacity to target their society and infrastructure in a way similar to what they are doing to us?
Mr. Khoury: Within the CSE authorities, we have the authority to conduct active and defensive cyber operations. There’s a whole approval regime on when those authorities are used. Beyond that, we don’t comment publicly on what operations have been carried out under those authorities.
Senator Yussuff: In the context of disinformation, this is broadly affecting Canadian society, almost on a regular basis. The government does not tend to move fast in how we respond to disinformation.
How can we work better with our civil society organizations, given the scope of this issue and how much it is permeating Canadian society, whether among young people or adults? Every day something is spread that is not true. We’re struggling with this in the context of Russia, China and other actors. This is a huge challenge for our country. How do we improve our ability to respond and build better capacity in civil society?
Ms. Denham: Mr. Khoury also mentioned it: We’re focused a lot on the discussions of the resilience of the society as a whole. I have spoken about the Rapid Response Mechanism.
There are certain elements that governments are best placed for: we can learn from our allies, watch what is happening internationally, hear about those tactics and share them with our domestic agencies. We can hear what some of those narratives are so that people understand which ones are based on disinformation, and we can better fill the information landscape, is how I would describe it.
Some people ask for governments to say what is truthful or not. That is not where governments should be. We should be saying these are the tactics that we are seeing. These are the type of narratives we’re seeing. Here is the fact-based information about what we’re doing or how we’re responding. The whole-of-society resilience then comes in when you have different civil society and academic experts investing in their capabilities to understand what this environment looks like.
There are a few Canadian entities that are knowledgeable in this area. You will be hearing from some of them. Continue to invest in those capabilities and capacities. Again, when you talk about calling out certain tactics, it isn’t always the government that is best placed; it is media outlets, academics, individuals being more understanding of what they’re consuming.
Traditionally, Canada hasn’t been a country that has grown up with massive disinformation campaigns; we’re not used to it. I speak to high school students as well about being very discerning: when you see something, how do you respond? If your emotions are either really excited or really angry, there’s something happening there. We have to become more discerning.
It’s making sure we have the capabilities and learning with our international allies, and within our system domestically supporting the academic community and experts who are working in this space so that they can help with the education piece.
Then there’s another whole piece of work that the Department of Canadian Heritage does working with media entities and building up that ecosystem. It really is a whole-of-society endeavour.
Senator Yussuff: Thank you.
Senator McNair: This is a question for Mr. Khoury: How does the CSE counter specific disinformation programs targeting Canadians? Do you have the capacity to respond to those disinformation campaigns? From what you said to Senator Yussuff, it appears we do have some limited capacity to retaliate in some fashion through an approval process.
Mr. Khoury: Thank you for the question.
We are not a regulator when it comes to information. We look for signs that nation states use cyber means to influence or to promote disinformation.
In some cases, we do counter the narrative — as in the case of the doctored images of Canadian soldiers, because we had intelligence to prove otherwise. That was a particular case that went through proper government engagement before we decided to release it.
As for our authorities, these are authorities to carry out cyber operations, not to run disinformation campaigns per se. It’s to impose a cost on our adversaries using cyber means — disabling infrastructure, for example, or something along those lines, so purely cyber means.
As far as countering the narrative, I’ll look to my colleague, Ms. Denham, to talk about the role of the RRM in calling out misinformation.
Ms. Denham: Sure. To add a few more examples, the RRM also publishes an annual report where we do agree with our G7 colleagues on the tactics we’re seeing and which countries are conducting them. We now have two annual reports that are available online for 2021 and 2022.
We have also put up a website specific to the Russian disinformation targeting Ukraine, to be clear. It tries to counter disinformation with facts. An example is having experts come and do interviews, sharing what tactics they are seeing and what those look like, then posting that on the website.
We’ve done social media campaigns. Before Russia held regional votes or moved toward annexing different territories, we published a playbook: here are the traditional steps Russia takes every time — they change the curriculum in schools, and so on. We’ve noted the ten steps, with the last one being annexation of territory. It’s pushing out the information, countering the narratives that we see and filling that narrative space with a counter-dialogue. Those are some of the actions we’ve taken, in addition to information sharing through the RRM.
[Translation]
Senator Dagenais: You said, Mr. Khoury, that the Russians react quickly to your reports and make some adjustments.
The questions this raises in my mind are the following: Are your communications too public, could they be infiltrated, and where do the Russians get their information to make some adjustments?
[English]
Senator M. Deacon: Carrying on from my colleague Senator Yussuff on learning, I read that Ukrainian power plants were taken off-line some time ago following Russian cyberattacks before the war and that they are quarantined from the rest of their company’s cyber infrastructure. Is that something we practise or think about here in Canada, or what else are we seeing that Ukraine does to protect its infrastructure that we might consider?
Senator Dasko: Some of the messages you have mentioned, and that I’ve seen in the documents, seem simplistic — for example, that there are Nazis in Ukraine, or that Russia believes in freedom when, in fact, there’s no independent journalism; journalists have left the country or are in jail. Those kinds of messages are highly unbelievable; they have no credibility.
There must be some messages that are more credible, salient and do have some kind of impact. I wonder if you can tell me what some of those more salient messages might be. I’m assuming the ones I mentioned are utterly ridiculous; at least they are to me.
Ms. Denham: I’ll go quickly. Mr. Khoury has the other answers.
They seem highly unbelievable, perhaps, to you, but these narratives have had a lot of resonance, particularly with international audiences.
Senator Dasko: That Russia believes in freedom of speech?
Ms. Denham: There are still audiences. On Russia’s ability to adapt, they watch which messages seem to resonate and get picked up, then they double down.
When they’re trying a narrative that says they believe in freedom, if that one doesn’t quite hit home, they will come around with other messages. They watch which ones have the biggest resonance.
Some of the ones that have been most persistent: Nazism was out early on; another has been that nuclear and biological weapons were in Ukraine and that Western powers were in control of them; they have denied attacks on civilians — we have seen that over and over. Another, more recently, is the questioning of arms shipments to Ukraine, claiming that those arms have been sold on the black market, particularly to Hamas and others.
You’ve heard pieces of those, probably, but they resonate quite strongly with different audiences internationally, and those would be the messages they amplify.
The Chair: Could we have very quick answers to the other two questions, please.
[Translation]
Mr. Khoury: In some cases, these are public messages, and the Russians realize that their technique isn’t working. That’s how they adapt: they work to understand why it didn’t work and then change the code, and if we catch it, it doesn’t work either. It’s a game between us and them.
[English]
To the question of whether we are practising hygiene, yes. For example, when it comes to a hydroelectric station, we always say to separate your IT from your operational technology, or OT, networks — the OT network being the operational network of the electricity. Make sure that is not accessible from the internet, so we don’t run into the same issue that Ukraine ran into in 2015 and 2016.
The Chair: Thank you so much. Thank you, Mr. Khoury, Ms. Denham and Ms. Anderson, for very fulsome answers to lots of tough questions. We thank you for the work that you do. We have been talking about one aspect of that work today. We know it’s much broader than this.
On behalf of my colleagues around the table, our colleagues in the Senate and Canadians, we thank you very much for the very important work that you do every day. Thanks again.
For our second panel, we now welcome Anthony Seaboyer, Director, Centre for Security, Armed Forces and Society at the Royal Military College of Canada; Svitlana Matviyenko, Associate Professor, Critical Media Analysis at the School of Communication at Simon Fraser University; and Aaron Erlich, Associate Professor, Department of Political Science at McGill University. Thank you for joining us today.
I invite you to provide your opening remarks, which will be followed by questions from our members. We’re starting this evening with Ms. Svitlana Matviyenko. Please proceed when you’re ready.
Svitlana Matviyenko, Associate Professor, Critical Media Analysis, School of Communication, Simon Fraser University, as an individual: I am glad to be here, and thank you for the opportunity to share my testimony with you today.
As confirmed by substantial research and investigations, for several decades, the Russian government has been waging multi‑vector disinformation and cyberwarfare, targeting Ukraine and many international communities. The means and dynamics of this war changed significantly after the full-scale invasion of Ukraine. Today, the Russian disinformation techniques have evolved from opinion manipulation into a system of strategic production of terror.
Disinformation is usually understood as false information that is deliberately intended to mislead and manipulate public opinion, either in a subtle way — for example, as Russia Today, or RT, did in Canada between 2009 and 2022 — by gradually building a relationship of trust with audiences to become a perfect tool that amplified and propagated information in the interests of the Russian government at a chosen moment, or more aggressively, by targeting Canadian users directly on social media from fake accounts.
Given the scope, intensity and the nature of information warfare since 2022, the term “disinformation” can no longer appropriately describe its impact on users and the aggressor’s intentions in Ukraine. Don’t get me wrong: Disinformation remains one of the core techniques of destabilization, but in the context of the all-out war, it is an integral part of more complex warfare events, consisting of both kinetic and cyber operations.
Today, the intention of the Russian aggressor in Ukraine is not to misinform but to terrorize. For example, when the Russian government and media deny the attacks on Ukrainian civilian infrastructure, the genocide in Bucha or the destruction of the Kakhovka Dam, for a Canadian user, those instances might indeed qualify as disinformation, with potential to produce uncertainty or noise when nothing is believable. The impact of such disinformation on Ukrainians is different, as they are subjected to violence several times. In addition to witnessing these horrors of war, they receive and must combat such disinformation in the media or on social media while being targeted by drones and missiles, deprived of sleep and traumatized by their personal and communal losses.
This is terrorism with the clear goals of attrition, intimidation, provocation and outbidding that is employed by the Russian Federation for the purposes of suppressing and subjugating the Ukrainian population to undermine their human dignity and, therefore, their ability to resist the aggressor.
The Russian strategy to terrorize does not stay within the borders of Ukraine. My research also follows the Russian weaponization of nuclear infrastructure, from the occupation of the Chernobyl nuclear power plant, to the occupation of the Zaporizhzhya nuclear power plant, and the use of nuclear threat for inducing terror among international audiences.
A nuclear catastrophe or a nuclear strike does not leave anyone indifferent. Radiation, as we say in media studies, was a mass medium that made the world global and erased the sense of distance before the internet. Its deadly invisible matter is believed to reach anyone, anywhere. Disinformation, accompanied by a nuclear threat, achieves its purposes more effectively.
I was reminded of that just last night by a taxi driver here in Ottawa who, at first, seemed very upset about what he called “governmental spendings.” Then he immediately jumped to the subject of the war in Ukraine: “Why send weapons if Ukraine cannot win the war?” he passionately proclaimed, explaining his argument: “The Russians have the nukes.”
I know, and you probably know, that the logic combining these subjects in one sentence is popular, and it is not coincidental. It is induced by terror, specifically, by nuclear terror, as Russia’s war against Ukraine unfolds on the nexus of cyber and nuclear.
Thank you. I look forward to your questions.
The Chair: Thank you, Ms. Matviyenko. We will now hear from Anthony Seaboyer. Please proceed when you’re ready.
Anthony Seaboyer, Director, Centre for Security, Armed Forces and Society, Royal Military College of Canada, as an individual: Thank you for inviting me to speak. My research focuses on the weaponization of information by authoritarian regimes, such as China, Russia, Iran and North Korea. I look at how they target democracies with hybrid grey-zone warfare and disinformation to undermine rules-based democratic countries like Canada, particularly with AI-enabled applications, which is my focus.
AI-enabled applications are significantly enhancing the effectiveness of information attacks on democracies. Democratic societies urgently need to take substantive measures beyond what we’re already doing to defend against attempts to influence and undermine democracies through the weaponization of information.
I would like to bring to your attention how Russia’s exploitation of AI-enabled applications is considerably increasing the corrosive impact of Russian disinformation campaigns related to Ukraine and democracies, far beyond what Russia was capable of doing before. The key takeaway I’d like to share with you is that Russian disinformation campaigns, now based on these new AI-enabled capabilities, are much more effective, larger in scale and harder to detect by target audiences and those defending against them; they are more sophisticated and much harder to defend against. They are a much greater threat to democracies than Russian disinformation has been before.
The difference comes from AI-enabled applications enabling Russian disinformation attacks to be more subtle, plausible and micro-targeted based on information that’s analyzed and used to design the attacks. What Russia is doing has not changed. It goes back at least as far as the 1960s in terms of what we call “Russian reflexive control.” What is new, though, is the precision and scale at which this can be done when using generative AI applications.
I could share many forms of how that is done, but I’m going to share just a few here, targeting the military in Ukraine specifically. Russia uses generative AI to falsify political and military orders, undermine the morale of citizens and members of the security community, divide allies and discredit military leaders. Russia indirectly targets military members while directly targeting citizens by legitimizing violent protest, sowing confusion in target audiences, polarizing societies, discrediting political leaders, delegitimizing decision-making processes and political systems, and radicalizing target audiences.
We’re all aware of the examples in the U.S. I’d like to focus your attention on the second-largest contributor of military aid to Ukraine, which is Germany. Germany has been targeted by Russia with a campaign containing over 50,000 dedicated accounts that produced over 200,000 disinformation-spreading posts a day, trying to sow doubt about supporting Ukraine. The message spread was “support for Ukraine undermines Germany’s economic prosperity and could lead to a nuclear war.”
Russian messaging was designed to look exactly like actual posts from legitimate news websites, down to the level of tone and writing style — this is what generative AI can do, which was very difficult to do before and impossible at scale. Except for the content, the posts were basically indistinguishable from actual Der Spiegel and Süddeutsche Zeitung articles, making the messaging essentially indistinguishable for the reader from original news posts.
In this way, Russia is employing AI to create entire alternative information ecosystems. It does this by actively seeking out voices of doubt and unease with Germany’s support of Ukraine and trying to amplify and enlarge those voices, not just in Germany but also in Canada, the United States and other Western countries.
For the Kremlin, like other authoritarian, undemocratic regimes such as China, the sheer existence of democracies — and this is why they’re doing this — is perceived as a threat to regime survival. Authoritarian, autocratic regimes whose political power is based in repression, violence, censorship and corruption cannot compete with the living conditions of citizens in rules-based democracies. Liberal democracies are clearly superior to strongman dictatorships in terms of living conditions, economic development, political stability and the general happiness of the people. This creates very severe pressure for those systems, and they try to counter this pressure with disinformation.
Russian AI-enabled disinformation campaigns go beyond individual targeted operations focusing on elections, for example. They are, in a much wider sense, reflexive control operations effecting behavioural change by targeting the world view of citizens. Beyond that, though, they aim foremost at eradicating organic political will formation.
They do this through the following method, which is complex. They achieve cognitive overload by flooding the information space with targeted disinformation campaigns and with misinformation that additionally creates noise and confusion, leading to an information overload perceived by members of the target audiences.
At the massive scale which can be generated with AI applications, they create information suffocation. Citizens are then so overwhelmed with information, and find it so difficult to find out what is actually happening, that they turn away from news sources, which leads to information apathy.
Over time, information apathy leads to the “deer in the headlights” effect of information paralysis where target audiences are so overwhelmed that they stop participating in the political process.
This then leads to the final goal of the exploitation of information by authoritarian regimes: the feeling of loss of agency among citizens, which effectively eradicates civil societies and prevents organic political will formation. AI-enabled applications make the loss of agency of citizens much easier and faster to achieve. Therefore, Russian AI-enabled disinformation campaigns constitute an existential threat to our Canadian democratic society and our way of life. Thank you for your attention.
The Chair: The final witness is Mr. Erlich. Please proceed when you’re ready.
[Translation]
Aaron Erlich, Associate Professor, Department of Political Science, McGill University, as an individual: Good afternoon to you all.
[English]
It’s a pleasure to be with you today.
I am a quantitative and predominantly behavioural social scientist; that is, I study the human side of the equation. I’m interested in the demographic, political and social factors that are correlated with belief in disinformation. I’m also interested in what can help inoculate citizens against believing disinformation — what the last panellists often referred to as information resilience, or resilience.
I have worked consistently running large-scale surveys and laboratory experiments in Ukraine since 2014, and I have been working and travelling in the region since 2003.
I will make six very brief points based on my empirical studies.
First, we really do need to focus on promoting critical thinking skills at home and abroad. We’ve run studies in Ukraine mirroring extensive studies in the United States. There is good evidence to believe that improving critical thinking skills and reminding individuals to think critically about media reduces belief in disinformation. Importantly, it does not harm trust in stories that are verifiably true. This type of intervention should be continually studied at home and abroad. As the last panellists noted, the deluge of disinformation created by generative AI, a lot of which will be generated by pro-Kremlin forces, is going to severely test citizens’ discernment capacities. We really need to be ready for what’s about to come. We’re at the very beginning of what is going to be an amazing ability to generate these types of stories at mass scale.
We can actually learn a lot from Ukraine. In studies that we have done, Ukraine does very well at discerning disinformation, whatever else the disinformation might be causing; with the exception of economic stories, Ukrainians on average are able to distinguish true from false stories in our studies. They’ve also done interesting things some other panellists asked about. There is, for example, a dedicated university curriculum on disinformation and information resilience currently in Ukraine.
Second, we need to teach skills to people with connections to Russia and other authoritarian contexts so they can reach out and better counter Russian and other authoritarian citizens’ belief in propaganda. I think we often discount the importance of what citizens in the sending countries actually believe. It’s hard for random strangers to do this. Recently, a team and I sent a quarter of a million messages to Russians — messages designed by top academic teams — and only one increased engagement with information about Ukraine. On the other hand, in a different project, we ran a pilot study asking Ukrainians about their communications with friends and family in Russia. There are approximately 11 million Russian citizens with family members in Ukraine. While many believed they couldn’t convince their family of anything, some did report success.
Third, we should harness linguistic reorientation. The language of messaging matters. Canada is familiar with these issues. Many in Ukraine are now opting to read in Ukrainian rather than Russian. In studies run before the full-scale invasion, approximately 50% of Ukrainians would answer survey questions in Russian; now it’s 10% in many studies. Currently, we are running studies on whether reading disinformation in Ukrainian versus Russian reduces belief in it, and some very preliminary results suggest that it does.
Fourth, we need to help NGOs target hard-to-reach populations and trust the creativity of local actors. Citizens who are most often influenced by disinformation are not on traditional media or social media. Advertising on sports betting websites, for example, is a creative, locally based solution to trying to reach some of these populations. This is an idea from a Ukrainian NGO that understands how to target populations in Ukraine.
Fifth, something the senators may be interested in is understanding what media bans do. Impressive research on Ukraine, not my own, shows how banning Russian TV has reduced support for Putin and pro-Russian positions. Ukraine has banned Russian state media. They also banned VKontakte, which is a Russian competitor to Facebook, and this has also limited exposure to harmful and malicious content, though not without freedom of speech concerns.
Sixth, as the first panellists noted, I think it’s important to continue to support research in this area at scale, not just on the media side but also on the social and behavioural sides. Thank you, and I look forward to your questions.
The Chair: Thank you very much. We’ll now proceed to questions. Briefly, colleagues, we have four minutes for questions and answers. So you know what to do. We’re starting with our deputy chair, Senator Dagenais.
[Translation]
Senator Dagenais: My question is for Mr. Erlich. When we look at the cyberoperations and disinformation launched by the Russians since the beginning of the war in Ukraine, do we have any examples that would lead us to conclude that the Russians’ tactics have really been successful? Or, with a certain amount of hindsight, can we conclude that none of their attacks has had a devastating effect?
[English]
Mr. Erlich: Thank you very much for the question. What I can say is that we ran studies before the full-scale invasion where we have a lot of information. For example, you heard about these Nazi stories; most Ukrainians don’t believe these to be true. Where Russia seems to have had more success — and, I believe, still probably the area where they can have success — is on the question of the economy. Because Ukraine’s economy has had so many problems for so long, including, we must understand, the endemic corruption that has plagued Ukraine, the Ukrainian population is actually much more susceptible to economic disinformation. Whether it’s devastating or not is hard for me to say specifically, but certainly the areas of most concern are questions related to the economy. Those are the ones where the disinformation has the most ability to sway Ukrainian public opinion.
[Translation]
Senator Dagenais: Ms. Matviyenko, we’ve taken in a number of Ukrainian refugees since the war began.
Based on your experience in the field, could you tell us whether disinformation may have had a significant effect on the decision-making of part of the civilian population trying to escape the war?
[English]
Ms. Matviyenko: Thank you for the question. I don’t think so. I think Ukrainians were able to find the information they needed, and there is a general understanding that Canada is a welcoming country. I myself received a number of requests, for example, because I was in Ukraine during the time of the invasion. I spent all of 2021 and 2022 there, so I witnessed the situation before the invasion and how it unfolded during the first year.
I was in communication with many people. People knew that I live in Canada, and I had numerous requests to help. That was amazing to see. So there is a relation of trust, and it continues.
[Translation]
Senator Dagenais: I have another question for you, Ms. Matviyenko.
Could you compare the disinformation operations carried out here in Canada to those in the United States, or to those in Europe or in certain countries that are closer to Russia than we are?
If it’s possible, I’d like to know if you see any difference between the messages and objectives of the Russians depending on who they’re targeting.
[English]
Ms. Matviyenko: I would say so, yes. I probably can’t provide you with a more specific example, but as was said in the previous panel, disinformation is itself a learning practice. This is absolutely true, and we can see how, through time, certain disinformation themes, memes or items appear and disappear, and some of them persist over time.
It’s interesting how this Nazi theme has been evolving. In fact, it is much older than several years. It began in 2004, during the Yushchenko-Yanukovych presidential campaign. That was the first time Yanukovych hired a Russian political consultant, and the idea to divide the country appeared there. Somehow, they decided to attach this Nazi talk to Yushchenko. After a few years, it disappeared, but then we saw it suddenly reappear several years ago. These things sometimes have a history. It’s interesting to trace them. Some of them indeed live a very long time.
[Translation]
Senator Dagenais: Thank you very much, Ms. Matviyenko.
[English]
Senator Boehm: Thank you to our academic panellists. You all gave very good and interesting presentations.
My question is for Professor Erlich. In the article that you co‑authored with Professor Calvin Garner, you addressed the capacity of residents of Ukraine to discern between pro-Kremlin disinformation and true statements. As you mentioned in your remarks, you found that Ukrainians, despite years of sustained exposure to Russian disinformation, are on average able to — I’ll use parliamentary language: “distinguish” as opposed to “cull” — distinguish between true stories and pro-Kremlin disinformation claims.
Based on your research — and perhaps your colleagues on the panel might have views on this as well — what sort of practices could we take from that in Canada in terms of countering propaganda of this kind locally? I ask this bearing in mind that any time any government introduces legislation, there’s always the question of freedom of expression and the infringement of the actual freedom we would have on the World Wide Web. I’m wondering if you have any thoughts based on that research you’ve undertaken.
Mr. Erlich: Thank you, senator. It’s a great question and one I’ve struggled with, because the number one thing that has probably made Ukraine so good at it is that they were invaded in 2014. In some ways, that was maybe Putin’s biggest mistake. He really gave the Ukrainians an opportunity to learn how to cope with these things.
One important thing that can be done is to instill some kind of urgency in people. I hesitate to use the word, “fear.” I don’t want to use the word “fear,” but we need to instill the idea in people that this is something urgent, and they can’t sit back. That’s what the Ukrainians did. They could no longer sit back. They felt it was urgent to learn how to deal with this. I’m sure my colleague will have many things to say, but if you talk to colleagues on the ground, they say that nobody believes anything until they’ve triangulated it. They sound like they’ve taken something from a textbook: “I’ve triangulated with three sources and checked with my friend in Kherson and my other friend, and now I believe it.” They’ve gotten that urgency.
Therefore, anything we can do to help people believe that it’s actually urgent is the key. Now, if I had the answer to that question, I would probably have a lot more money than I do now.
Mr. Seaboyer: I think there’s a lot to learn, not just from Ukraine but particularly from other countries that have been targeted for a long time. Finland is a particular example in terms of their whole-of-government and governance approach, educating children in kindergarten focusing on critical thinking. There is a lot that can be done ethically. My research focuses on how we can ethically counter disinformation and how we can ethically counter message. There’s a lot that can be done there.
Learning from Finland, they have, for example, prime time TV shows where they talk about how Russia is targeting them and why and what the exact Russian messaging is. They alert the public to these false messages. There’s a lot we can learn in that regard.
I agree with Dr. Erlich’s point. In my view, what we’re missing in Canada is an understanding of the threat that’s there. This is partially because we’re not targeted as much, for example, as the United States. However, the infrastructure to be able to target Canada as much as the United States has been set up with the echo chambers. So we need to inform the public and make this more of a topic that people are aware of, and then fund research. We heard a lot about the Rapid Response Mechanism, which is an amazing part of Global Affairs Canada. It’s doing a lot of work but is seriously underfunded. If you look at how much staff they have, it’s unbelievable what they’re putting out. Their reports are really great, but we need far more funding for RRM and for other methods used by the government to try to mitigate disinformation attacks.
Senator Boehm: It’s not existential for us. I think that would be a big factor.
Ms. Matviyenko: I completely agree with my colleagues here. I would also add that, of course, fact checking is an incredibly important thing. Indeed, in Ukraine, for the last ten years, people have been learning how to do it almost immediately. This is a skill, and it takes some time, but then you learn it.
However, we are also seeing that, actually, disinformation goes far beyond facts. What we see is an engagement with emotions, fear and even terror — as I was trying to say. These are more complex things, and probably that’s where we need a broader discussion, context, education and on-the-ground reporting. What is really missing in Canadian media — in order to understand how things unfold — is reporting. I’ve been working with media for all this time, and I see there is a huge problem in terms of how it’s done.
Senator Oh: Thank you, professors, for being here.
My question for you is this: How can Canada work with its allies and international partners to develop a coordinated response to Russia’s disinformation and cyber operations targeting Ukraine? How are we punching back at Russia on cybersecurity? Are we strong enough?
Mr. Seaboyer: About the last question — whether we are strong enough — absolutely not. Absolutely not. This has to do with many factors. Funding is one factor, but also our rules-based order. We care about transparency and accountability, and we are very careful about unintended side effects. The adversaries are not. Russia and China are not concerned about unintended side effects. They will put out 180 different messages on a topic — Russia, for example, on MH17, the airliner that was shot down — and they don’t care about what sticks and what doesn’t stick and what effects this has on target audiences. We do not do that, cannot do that, don’t need to do that and don’t want to do that. We have a completely different ethical background and different regulations dealing with that. So that tilts the playing field significantly to the advantage of the adversary, which they’re exploiting deliberately. They know that we can’t act in that way. But I argue — based on my research — that we don’t need to fight fire with fire. If we fight fire with fire, we become the enemy, and then we don’t need to fight the enemy.
I’m working on developing ethical methods of influence operations — ethical AI-enabled methods of influence operations. This is at an early stage, but we can, in a transparent and open way with white operations, or white ops, influence target audiences by informing them about the challenges they face in those countries. The difference is the fundamental corruption in Russia, the fundamental corruption in China and the living conditions of the people. If we talk about that and make people aware of it, it can be a very effective way of creating a counterbalance and a defence against the attacks.
Senator Oh: Any other comments? No?
Senator M. Deacon: I’m going off script a little bit. As I listen, I must say that we learned this first-hand at the 2014 closing ceremony of the Sochi Olympics, trying to help our young people understand what battles they can and can’t fight. All the things you’re saying today are sober reminders of the continued work in this area. Also, in working with young people, one of the most important things you’re talking about today — specifically, Mr. Erlich — is critical thinking skills. We met with hundreds of teachers last week, and they’re just looking for answers: How do we help our students with disinformation? What do we do? There are plenty of examples of activities they can do, but I couldn’t agree with you more, as I mull this over while listening today, about teaching citizens critical thinking skills. I worry, however, that the people who need to hear this most are the people least inclined to listen to any kind of public trust or public message on this. How does the government speak to people who are the least inclined to listen to or to trust the government?
If you can perhaps think about that first, and if anyone else could respond that would be great.
Mr. Erlich: Thank you for the question. It’s a great question. I mentioned one thing about meeting people where they are, and I think that is very important; targeting communities is very important. I can think of an example from Nigeria, where I did a little bit of work. They hired a famous rap star to speak out about the issue. I mentioned that some Ukrainian NGOs were placing ads on sports betting and pornography websites, because those are the types of people who aren’t going to be in your critical thinking high school classes. When we talk about primary education — yes — we can reach people, but when we’re talking about people who are already adults, we have to think about where adults are spending their free time, and there’s a lot of good research on that. Then we need to think about what kind of people are actually going to be able to convince those types of people. They could be religious leaders for a certain segment of the population. There is a lot of good work in sub-Saharan Africa using various religious leaders to counter disinformation in church services. Sports figures are often used. It involves a bit of trial and error, but also targeting and thinking about who the appropriate people are for different Canadian communities. Is it going to be a hockey star? In the Greater Toronto Area, or GTA, is it going to be somebody else, such as a popular singer? It’s not going to be one-size-fits-all; it will require community engagement to figure out what kinds of people in the communities are going to work, along with some internet-based smart targeting.
Mr. Seaboyer: Research shows — and I can share resources with you — that about 15% to 25% of the population, depending on the country, want to believe disinformation. That’s their inclination, and it’s really hard to reach those people. That’s a maximum of 25%. I would not recommend focusing primarily on those. There’s at least another 25% who are very receptive to critical thinking and information education, and it’s those — together with the remaining 50% — who can be persuaded.
I mentioned Russian reflexive control. Russian reflexive control is very vulnerable to informing people about what is happening, why it’s happening, how it is happening and what it looks like. If you know all those things, you’re far less likely to fall for it — unless you want to believe the disinformation, which is the exception. A lot can be done with education. But a fact I wanted to bring to your attention is that a lot of this happens through information overload — overstimulating the cognitive capacity of people so that they turn away because it’s too exhausting and too unpleasant, and they disengage from the political process. That has to be addressed in a different way. In my view, that is also about media literacy: How much time do we spend, and what sources do we choose? That’s a different aspect to focus on.
Senator Cardozo: My first question is for Professor Seaboyer. You mentioned falsifying military orders. I’m wondering to what extent that can happen now or is likely to happen in the next short period. I’m thinking about everything from causing tanks to fire on their own people rather than the other side, or causing fighter jets to go the wrong way or not do what they’re required to do at times like this. Are we going to be at that point in the near future?
Mr. Seaboyer: With regard to falsifying orders, the most prominent case was the deepfake that the Russians created, making it look like President Zelenskyy was calling on his troops to surrender to Russia. They’re also falsifying documents, which is something we’ve seen for a long time. With AI-enabled capabilities, it’s much easier to do this and much more difficult to distinguish from actual or legitimate documents. What you are referring to are hacking operations. In that domain — which is not my field of expertise; I focus on disinformation — the short answer is that the more connected technology becomes, and the more we have autonomous systems steered from far distances, the more drone swarms, for example, are hackable: their GPS systems can potentially be blocked, and their targeting can be blocked or changed. The risks of that are increasing exponentially with further reliance on artificial intelligence, digitization and automated systems.
Senator Cardozo: Professor Matviyenko, you mentioned that in Ukraine people have learned to do fact-checking. That’s amazing. What can we learn from that, because I don’t think we do that. How do we train young people to fact-check when they just want to see a quick slogan on Instagram and move on with the rest of their day?
Ms. Matviyenko: That is very true. There is a particular way of using technology that is very much imposed, popularized, et cetera. It is a very close attachment. Of course, as a teacher, I teach my students what we call media literacy and practice, which is precisely the technique of detachment. We discuss how media messages are constructed and that there is someone’s intention behind the messages.
I think that today we do this almost a little less than when we started. There was a time — I would say maybe 10 years ago, with the popularization of apps, the iPhone, et cetera — when there was some kind of fear that this new intimacy with technology would suddenly change us. Then it became normalized. There were definitely more courses, attention and critique. But now, in a certain way, we have absorbed it, and we don’t have enough conversation about it. We think that it’s already understood, but it’s not. This intimacy grew and became closer, and that’s why every message that is received gives you a sense that it’s addressed to you, made for you, and it almost encourages a relation of trust. That’s definitely the culture of communities who are comfortable. So when there’s a threat, this pattern is broken. I am not saying that we need to amplify the threat, but we need to speak more about the seriousness of the situation, because the war does go beyond its borders. We’re seeing so many events recently, with the infiltration of Russian agents here and there. There are attacks on critical infrastructure everywhere.
So things are happening, and the war is actually very close. We need a very sensible, rational conversation about this, without amplifying fear, as to how using media literacy is important for finding your position and place within this particular situation.
Senator Cardozo: Thank you.
Senator Yussuff: Thank you, witnesses, for being here.
Dr. Erlich, I’ll start with you. You used very precise language, which I don’t hear very often when someone talks to a parliamentary group. You said, “critical thinking.” Given what you’ve said, that’s a daunting stance in the political reality of our country — to teach people critical thinking — but I think if we do want to embark on this, we would do well to learn from the Finns with regard to what they’re doing at the early high school and university levels, because the technology that is available today for us to access information has proliferated, so we’re not going to change that. I don’t think we have learned — other than the hard way — how we can use that technology in a different way.
I thought maybe I would hear your comments, given your analytical research as well as your ongoing research in understanding how we can equip ourselves to respond better.
Mr. Erlich: Thank you for the question. I think Professor Seaboyer will have quite a bit to say about that as well.
It’s not just Finland; many of the countries across Eastern Europe have instituted various programs at the primary and university levels, and now have specific ministries or sub‑ministries that deal with this issue.
Far be it from me to suggest what government officials should do, necessarily, but it might be very useful to have a study tour of those programs and see what the variety of solutions is. Finland is a very particular example, so what works there might not work here, or something from Finland might work, but I wouldn’t necessarily say that we can just take one thing off the shelf. Looking at the variety of examples out there and then choosing the ones that best fit the Canadian context would be an ideal way forward.
I’m sure my colleagues might have something else to add.
Senator Yussuff: Given that cyber warfare is a normal reality of countries that want to disrupt and, more importantly, cause harm, do you think we’re equipped as a society to appreciate what we have to do? Equally, on the other side, how well are we learning from Ukraine and other countries that are going through conflicts with other countries, such as in the Middle East and other places?
This is not something that is going to go away any time soon; it seems like it will be the norm. That presents a huge challenge for how democratic societies respond. People are becoming less and less trustful of their government to tell them things as basic and fundamental as how society can cope with these issues.
Mr. Seaboyer: There are two challenges, one on the side of citizens and one on the side of the government. With the military apparatus, there’s the thinking that if it doesn’t explode, it’s not dangerous. I would argue that cellphones can be more dangerous than aircraft carriers. There needs to be an understanding of impacts and effects of non-kinetic warfare, which is, for example, using information as a weapon, and how it works. We’re still learning this, because every few months, there are completely new ways of exploiting information.
On the citizen side, the problem is that we’re caught with what is convenient and easy for us. More and more things become faster and easier by using our phone to do them. We don’t understand the mechanisms of how that psychologically affects us and our brains. Our phones are, basically, slot machines designed to trigger emotions in ways so that we keep using them.
So we need to create an understanding of the psychological and biological factors exploited by the effective design of these devices — more understanding in that regard is definitely going to help.
Senator Dasko: Thank you to our witnesses for being here. I was asking a question in the last session about salient messages, and I want to continue along that line of thought with Professor Seaboyer.
For messages to be salient, there’s the message side, and then messages have to be targeted toward audiences. You mentioned that specifically at the beginning of your comments. You were talking about Russia. I’d like you to elaborate on who is being micro-targeted. Who are the audiences? You can comment on the messages, too, but specifically, who are the audiences being targeted, from what you’ve said? Please elaborate on that.
Mr. Seaboyer: Russia targets individual audiences very specifically in different countries. So if the target, for example, is support for the Ukrainian military in Poland, they’re going to do this with different narratives designed to tie into historical experiences, humour and culture. AI enables them to do that much more effectively, because they can scrape the landscape and the target audiences for their preferences — when they’re online, how they’re online and what they engage with — in much more effective ways. That’s the first part.
The second one, which refers to your question to the prior panel, relates to why it is that messages that seem totally ineffective to us can be extremely effective. Messaging can have at least two different goals. One is a direct information goal to achieve direct behavioural change. For example, vote for a politician or not. That’s just one side of it.
A whole different side is spreading so much information that target audiences get overwhelmed and feel that they can’t trust information. They feel like they can’t find out what’s actually going on. Psychologically, we’re then set up to believe the largest or loudest voice. That’s deliberately created, because the governments of authoritarian regimes have the loudest voice. If people are feeling that they cannot discern the actual truth, they distrust other authorities and will listen to the loudest voice that comes in there.
Micro-targeting means individually targeting — it can be individuals, groups, politicians or citizens, depending on what the campaign is interested in achieving — based on very specific data. When are they online? At what time are they most vulnerable to falling for disinformation? That can all be identified with data. Then it is a matter of messaging them, going down to details such as the colour or tone of the messages, or the humour they respond to — all targeted based on knowledge of the individual being targeted.
With AI-enabled capabilities, there’s just so much more data to scrape and use to have much more targeted information attacks.
Senator Dasko: If we were focused on Canada and Russia’s efforts in Canada — which are not nearly as great as Russian efforts in many other countries — which kinds of audiences would they be targeting in Canada, and with what kinds of messages? Do you have any sense of what that would be?
Mr. Seaboyer: Absolutely.
Senator Dasko: Or others on the panel.
Mr. Seaboyer: Right now, they’re targeting support for weapons systems and financial support for Ukraine. They’re trying to create the impression that this is leading to a war and that this is basically participation of Canada in a war. They’re trying to target civilians. They try to target, for example, people who already have similar beliefs, to an extent, who are more likely to fall for those narratives, but they also target broader audiences and politicians directly and indirectly.
The main goal is to eradicate or stop support for Ukrainian military aid, to change the impression of both Russia and Putin, to undermine democracy in Canada and to give the impression that our system is unjust and unfair — that elections are manipulated, for example.
Senator Dasko: That’s a different kind of message than the Ukraine message. If you’re trying to focus on eroding support for Ukraine, that’s a little different from what you said at the end. That’s a different topic — undermining support for democracy.
Mr. Seaboyer: Absolutely. There are ongoing messages, for example, toward NATO being aggressive and NATO threatening Russia. Those are messages we’ve been seeing for many years already, so these are ongoing messages they’re spreading. There are newer messages directed specifically toward Ukraine.
Senator Dasko: Are these low-information people in Canada? Is that who they’re going after?
Mr. Seaboyer: It depends upon the individual campaign we’re looking at, but it’s much wider.
They go first for people who are more on the fringes of the political spectrum. They largely do not target people in the middle where it’s more complicated to radicalize them. They try to radicalize people who have belief systems that are more susceptible to the narratives.
[Translation]
Senator Carignan: I’m trying to see how we can get out of this. We don’t want to fight fire with fire; we don’t want to spread disinformation of our own or engage in counter-propaganda. If the state starts to spread propaganda or provide information, people will start to distrust information coming from the state.
If we use information that comes from verified independent media sources…. People are also suspicious of the media, the state-funded media, the corporate-funded media.
With all this, we’re trying to educate people, give them training and teach them to be critical of information, but sometimes, if we’re critical, we start to doubt all information.
When I was reading my notes in preparation for today’s meeting, I read some articles about how Europe was countering disinformation; there was a list of about 20 dos and don’ts.
I wonder how we tackle this problem. In other words, how do we inform our people without sowing doubt in their minds? How can we stop them from believing that everything can be distorted, as this belief can cause them to close in on themselves?
Mr. Seaboyer: That’s a good question.
[English]
I’m working on research trying to develop ethical influence operations and ethical AI-based influence operations.
What I’m suggesting are completely transparent white operations, in the sense that we declare what we’re doing, how we’re doing it and why we’re doing it, fully attributable, but we exploit vulnerabilities in the information spaces of those adversaries, of authoritarian regimes.
A key vulnerability is, for example, the corruption of top leadership. If you identify what China and Russia censor most in the debates of dissidents, you can see that’s what they do not want to have talked about.
What I am suggesting is we, of course, do not create disinformation. We do not lie. We don’t need to do that.
[Translation]
There’s absolutely no need for this.
[English]
All we need to do is talk about the reality of living conditions and contrast that with our living conditions. In Canada, you can post — within the law — whatever you want online. You never go to jail. In China, you can go to jail for 20 years if you post content critical of the government. Those are things we can talk about. We can make sure that communities who are interested in this are better informed, and that creates an effect.
I’m suggesting what we call complete white operations, meaning we officially say what we’re doing, how and why we’re doing it. We message about the living conditions in those countries. There are many vulnerabilities in their information spaces.
Senator M. Deacon: My question is about information and disinformation. More countries in the world, we think, are showing less support for Ukraine. I’m trying to work through the impact of Russian disinformation on this question of sustained support. Is there anything you’d like to say on that? We can’t pinpoint everything as cause and effect, but I wonder how disinformation fits into some countries beginning to reduce their support based on what they hear.
Mr. Erlich: One quick thing and then I will hand it over to my colleagues.
We often see these campaigns piggybacking off whatever domestic constituencies are anti-Ukraine. They take whoever is on the political spectrum who is not advocating for support for Ukraine and will then double down on it and proliferate that message.
They don’t start with their own message. They find whatever message is already being popularized locally, whatever the constituency or country — and there is almost always one, often because supporting Ukraine is expensive; it’s not cheap. There are always those constituencies. That was very clear in the U.S. with the congressional funding package.
Mr. Seaboyer: There are two sides to this.
We cannot effectively measure behavioural change based on disinformation. We can see who engages with messages, spreads and comments on them. But who actually changes their behaviour on that is not measurable at this point. That said, we see a lot of correlation. Is it causation? That’s difficult to tell. Based on my research, I would certainly say it seems like it is. It’s hard to prove that, though.
Where campaigns are most effective and most concentrated, we also see changes in support. I would argue that some of the messages are effective. Russia doesn’t know which messages are effective; we don’t know either. They spread so many that some of them stick. That’s the approach they use.
The Chair: Thank you. This brings us to the end of the panel.
Thank you, Ms. Matviyenko, Mr. Seaboyer and Mr. Erlich for this informative exchange. We appreciate the time you’ve spent with us. You can tell from the questions around the room that you have provoked considerable thought and brought us very relevant information for what has been a several-week deep dive into matters in Ukraine. This has certainly been among the most important of them. You’ve added considerably to our understanding of this complex situation.
Thank you. We wish you all the very best. We appreciate the time that you took with us.
Colleagues, we now have our final panel of the meeting.
For those joining across Canada, our meeting tonight is examining disinformation and cyber operations in the context of Russia’s war against Ukraine. For this next hour, we welcome Jean-Christophe Boucher, Associate Professor, School of Public Policy at the University of Calgary; and Anatoliy Gruzd, Professor and Co-Director, Social Media Lab, Toronto Metropolitan University. We welcome the return of Mr. Marcus Kolga, Director, DisinfoWatch and Senior Fellow at the Macdonald-Laurier Institute.
Thank you all for being with us today. I now invite you to provide your opening remarks, to be followed by questions from our members. We’re starting off with Mr. Jean-Christophe Boucher.
Mr. Boucher, welcome and begin whenever you’re ready.
Jean-Christophe Boucher, Associate Professor, School of Public Policy, University of Calgary, as an individual: I know I only have five minutes, so I’ll be quite brief.
At the University of Calgary, I run a research team, funded by the Department of National Defence and the Social Sciences and Humanities Research Council, that does a range of studies on foreign interference; we look at far-right, Chinese or Russian disinformation. Most of our team is a data analytics team, so we scrape Twitter and other social media, and we use machine learning and AI to understand this.
When we look at Russian disinformation, we focus on three big things. First, Russian propaganda on Twitter; we did a study on this at the beginning of the war. Second, we are looking right now at Russian strategic communication on social media — on Telegram, Facebook and Twitter. Third, we did a survey in 2022 on Canadian vulnerability to Russian disinformation. That’s basically the backbone of what I’m presenting. If you’re interested in the papers, I’ll be happy to share them with anyone.
The first thing I want to say is that Russian strategic communication and disinformation campaigns in Canada are strategic. Meaning what? Two things. On the one hand, they have a fairly comprehensive, coherent and consistent way of engaging in the information space. Some people call this the chaos theory; I think that’s incorrect. The Russians are strategic in that they’re trying to advance three things. First, they’re trying to advance strategic objectives, which are long-term objectives: emphasizing Russia’s confrontation with the West, Russia’s place in the international system, and anti-U.S. and anti-NATO narratives. They do this across the world consistently, and it’s no different in Canada.
The second part is that they’re also pushing operational objectives, which are what I call mid-term objectives. This is really to undermine society and promote mistrust in democratic institutions. This is where we see them essentially amplifying messages that go against Canadian narratives — anti-LGBTQ messaging, those kinds of things.
In the short term, they have tactical objectives which focus on the Ukraine war. They push disinformation on the Ukraine war to try to negotiate and advance their views of that war. They’re fairly consistent in those three kinds of messages. If we want to do counter-narratives, we will have to tackle those systematically.
They’re also strategic because they have a clear understanding of Canadian audiences — in fact, probably better than the Canadian government itself. They are exploiting our ecosystems in a strategic way. They are focused on two groups: the far right and the far left.
On the far right, they’re amplifying messages — both in French and English — that promote populist, anti-immigration and anti-LGBTQ views. We’ve seen this. We have data on this, showing them coming into the information space and amplifying these voices — in some ways, inauthentically.
We see they also engage with the far left. They create content for them. They collaborate with some of the far left in Canada who follow ministers around and have been to Russia. They participate in RT programs. They are ideologically connected to these views. The Russians are interested in pushing this. That’s a good way of understanding how the Russians are doing this in terms of the audience segmentation.
When we look at Canadian vulnerabilities, my concern is what surveys on Russian disinformation show about who the Canadians most vulnerable to it are. Unfortunately, in our data, we see a couple of things. On the one hand, younger Canadians have a harder time recognizing Russian disinformation. People from rural communities or with less education also have difficulty recognizing Russian disinformation.
When we look at political parties and affiliation, unfortunately, people on the right of the spectrum — PPC voters and, in some respects, Conservatives — have a harder time recognizing Russian disinformation. This is a concern for MPs and MLAs in my own province of Alberta, who come to me and ask how they can fight this. They tell me they’re encountering these things more and more when they canvass. It’s a growing concern.
To conclude, what we’re seeing now is that the Russians are engaged in the information space. They have been effective, especially on the right, at pushing their narratives. Some of the far-right groups are parroting their narratives now.
Right now, polling data suggests that, more and more, Canadians — especially on the right of the spectrum — are being influenced by those kinds of narratives. In the long run, it will have an adverse impact on our capacity to engage with and support Ukraine.
Thank you very much.
The Chair: Thank you.
Anatoliy Gruzd, Professor and Co-Director, Social Media Lab, Toronto Metropolitan University, as an individual: Thank you for the opportunity to discuss the threat of Russian disinformation in the context of the Russia-Ukraine war. I am Anatoliy Gruzd, a Canada Research Chair and professor at Toronto Metropolitan University.
Today, my comments are my own. They are grounded in research I have conducted with my collaborator, Philip Mai, and colleagues at the Social Media Lab, where we study the spread of misinformation, information privacy and how social media impacts society.
The Kremlin has a long history of using information operations domestically and internationally. In recent years, we’ve seen how Russia has expanded such efforts to include the use of bots, trolls, hackers and other proxies to create a more favourable environment for their information operations.
Their influence campaigns often run across multiple digital platforms and rely on techniques such as creating fake personas and websites, impersonating politicians, journalists and public agencies, attacking activists’ accounts and amplifying polarizing topics.
Canadians are not immune to Russian disinformation. According to our 2022 national survey, 51% of Canadians actually reported seeing pro-Russian narratives on social media in the context of the Russia-Ukraine war. We find a strong link between exposure to such narratives and belief in them.
We also find that a person’s prior beliefs and politically motivated reasoning make them more susceptible to disinformation. For example, Canadians — as we heard — with right-leaning views, and those who trust partisan media, are more likely to believe in pro-Kremlin information.
Left unchallenged, such state-sponsored information operations can undermine Canadian democracy. The question we want to discuss is: What can we do to mitigate such risks?
Blocking state-run media outlets like RT News is only partially effective, as the Kremlin circumvents such sanctions by copying content and disseminating it through other channels. In fact, they also rely on social media accounts of their diplomatic services, like the Russian embassy in Ottawa, and sympathetic media personalities in the West, directly or indirectly.
To fight state-sponsored disinformation, digital platforms should be mandated to expand their partnerships with fact‑checking organizations and facilitate access to credible news.
Unfortunately, as we have seen in recent years, the digital platforms essentially have retreated from these areas. With newsrooms closing or downsizing across Canada, more Canadians will turn to social media influencers rather than journalists. This is concerning because our research indicates that individuals who trust mainstream media are less susceptible to pro-Kremlin disinformation. Therefore, investing in a strong journalistic community, and enhancing trust in mainstream media outlets, could effectively combat information operations here in Canada.
Another line of defence I would like to discuss is implementing proactive or prebunking strategies to inoculate Canadians against future disinformation campaigns. We heard about some of them today already. For instance, running public service announcements and educational games that incorporate known false claims, tactics and sources used by foreign adversaries can reduce the perceived persuasiveness of information operations.
We have also seen an increasing use of generative AI to create disinformation about the Russia-Ukraine war. Again, we’ve heard some examples of it today.
While most recent AI fakes were quickly debunked, I expect an increase in the frequency and scale of such usage, specifically in the areas of social engineering and reputational attacks.
Therefore, we must educate not only the general public on the dangers of disinformation and the importance of cybersecurity, but also policy-makers and civil servants, who are often the targets of such attacks. Conducting readiness assessments for these groups would also identify existing vulnerabilities.
In conclusion, we must not underestimate the potential of the Kremlin’s information operations to undermine public trust in government institutions over time. Deplatforming individual sources may not be as effective, as it could also undermine trust toward government and legitimize censorship.
A more nuanced approach should consider the various forms that information operations take and develop strategies to address them more directly. This could include requiring large social media platforms to expand their trust and safety teams here in Canada, share data with researchers and journalists to increase transparency, and support independent audits.
Schools must update digital literacy programs to address the challenges of today, such as generative AI. For the general public, the government should develop prebunking campaigns to educate Canadians about foreign interference. Aspects of such educational campaigns must focus on and be informed by diaspora communities in Canada, as they are more likely to be targeted by foreign states.
Thank you.
The Chair: Thank you.
Marcus Kolga, Director, DisinfoWatch, and Senior Fellow, Macdonald-Laurier Institute, as an individual: Thank you for inviting me here today.
I’m a practitioner and an activist who has been monitoring and trying to expose Russian information operations since 2007 when this new phase of Russia’s information operations started targeting Estonia.
The broad primary objective of Russian information and influence operations is, of course, to distort our understanding of the world around us and to ultimately manipulate and affect our democratic processes and policy decisions. This isn’t new.
In 1945, a Soviet intelligence clerk serving in the Soviet embassy in Ottawa defected. He outlined the scale of this work when he identified dozens of Canadians who were working with the Soviets to influence our democratic processes. For the Kremlin, those operations are more important, of course, today than ever before. That threat, in terms of information warfare and the assault on our cognitive sovereignty, is persistent and growing.
Disinformation campaigns that once took years to execute now take minutes with the help of social media, artificial intelligence and an army of pro-Kremlin influencers that amplify Russian information narratives in Canada.
Inside Russia, the Kremlin uses information operations against its own people in order to consolidate power and silence critics. Putin is constructing a virtual Iron Curtain around Russia’s information environment, which the state controls; it controls all media. The Kremlin has criminalized most independent media outlets and civil society organizations, designating them as either “undesirable” or terrorist. This list includes the entire LGBTQ community in Russia and even Canada’s Macdonald-Laurier Institute.
Abroad, the Kremlin seeks to divide, confuse and sow chaos wherever possible. The breakdown of cohesion within our international alliances, such as NATO, has been a Kremlin objective for 75 years. Inside Western nations, Russia aims to divide us by exploiting both sides of sensitive political issues, with the goal of eroding trust in democratic institutions, media, leaders, civil society and, ultimately, each other.
In Canada, we’ve observed Russian information operations amplifying vaccine hesitancy, even before COVID, targeting MMR and other children’s vaccinations. Extremist far-right anti‑government voices within the “Freedom Convoy” movement were platformed by Russian state media in 2022. Russian anti‑LGBTQ narratives are amplified by far-right extremists and platforms in Canada as well.
On the Canadian far left, anti-Ukrainian influencers continue to write for and appear on sanctioned Russian state media and Kremlin-controlled think tanks. Ukraine, of course, has been the primary target of Russian information and influence operations for the past few years, with the objective of eroding public and government support for Ukraine. That includes false claims about corruption and the resale of the weapons that the West has donated to Ukraine. It also includes Orwellian claims by Vladimir Putin that Russia didn’t start the war, and that it is attacking Ukraine to end it. They also include narratives intended to incite hate toward Ukrainians. That includes baseless accusations about President Zelenskyy, his government and Ukrainians being neo-Nazis.
Canadian human rights legal expert Yonah Diamond at the Raoul Wallenberg Centre for Human Rights says that this narrative is part of the Kremlin’s “accusation in a mirror” tactic, by which Russia frames and presents Ukraine and Ukrainians as an existential threat, which makes hate and violence against Ukrainians appear defensive and justifiable. Those narratives are repeated by far-right and far-left platforms in Canada, and they continue to be spread in parts of the Russian diaspora community in Canada, threatening radicalization through Canadian online streaming services that evade Canadian restrictions on Russian state media and possibly our sanctions as well.
Those narratives are also impacting Canadians. According to the Ukrainian Canadian Congress, incidents of hate and violence toward Canadians of Ukrainian heritage have been rising over the past two years. Last year, a letter was sent to the Estonian Honorary Vice Consul in Toronto, threatening to spread anthrax if the Estonian community continued to support Ukraine.
The impact of such narratives is intensified when they ricochet between far left and far right platforms and influencers.
At the bottom of the political horseshoe are U.S. far-right politicians like Marjorie Taylor Greene, who regularly amplifies these false claims about Ukraine, and, on the far left, platforms like Montréal’s Global Research.
It is seemingly impacting Western opinions and policy toward Ukraine. Polling among Conservative voters indicates that support for Ukraine has significantly dropped over the past two years. In 2022, just 20% of Conservative voters believed that Canada was giving too much to Ukraine, versus 43% in February 2024. In the U.S., 48% of Republican voters believe the U.S. is giving too much to Ukraine today, versus just 9% in March of 2022.
While Canada has taken major steps to defend our cognitive sovereignty, there’s still much to learn from our allies in Ukraine and the Baltic region to challenge these narratives and the influencers who amplify them in Canada and the Western world.
I’ll leave it there for now. I look forward to your questions.
The Chair: Thank you very much. We’ll now go to questions. You know the rules. You have four minutes for both question and answer. We’ll move through this as briskly as we can.
[Translation]
Senator Dagenais: My first question is for Mr. Boucher. I’d like to talk to you about one aspect of the use of social media like Twitter — which has become X — in terms of their real impact on disinformation. Take the phenomenon of people retweeting, which is very important on these platforms. Aren’t we always reaching the same people by doing this, which diminishes the real impact of the Russians in their attempts to spread disinformation through this means of communication?
Mr. Boucher: I would say yes and no. Yes, insofar as it’s true that when you start looking at influencers, those who retweet have more influence on the platform. However, what we see in the data is that these people generally have millions of followers. In some cases, some of these influencers don’t necessarily focus on Ukraine, but work on masculinism, for example. So their reach is sometimes greater than you might think.
Twitter is very 2022, and the platform is less relevant to them. However, we are beginning to see other platforms, such as Telegram, TikTok, Reddit, Facebook and YouTube, forming part of the information arsenal of agents associated with the regime in Moscow. For example, surveys now show that young people increasingly say their main source of information is YouTube, a platform with virtually no moderation. Algorithmic recommendations surface content there that was never retransmitted on Twitter, and this is having an increasingly strong impact on young people.
In our survey, we saw that young people were less able to identify disinformation than older people. Older people are always criticized for not understanding the social media environment, and that is supposedly why they’re misinformed. In my surveys, I see young people having trouble distinguishing between what’s true and what’s false. So we need to focus on that. I would focus less on X and more on the ecosystem as a whole. The major problem we have in Canada is our capacity to collect data and monitor the environment; it’s virtually non-existent. The Canadian government, apart from the intelligence services, has very little capacity, so we don’t know what’s going on in social media and we’re always lagging behind in this environment.
One of the recommendations I would make is to increase the Canadian government’s capabilities quite radically. The group that’s part of Global Affairs Canada, Rapid Response Mechanism Canada, is the only one really working on this. Its staffing levels are far below what we need. Russia spends $3 billion or $4 billion a year on disinformation. How much does Canada spend? Maybe $20 million or $30 million, if you look at all the departments? That is too little.
Senator Dagenais: Apart from social media, have the Russians managed to infiltrate more traditional media in Canada, such as newspapers and television, to convey certain information or opinions, in order to influence political decision-making or stir up public disapproval of certain positions taken by allied countries against the Russian regime?
Mr. Boucher: That’s a good question. I don’t know. However, what I do see when I read and analyze what people are saying is that, in French, in Le Devoir and La Presse, some journalists sometimes take a pro-Russian and pro-Kremlin position, to the great dismay of all those who work on this issue. I see this more often in French than in English. When I look at the English-language media, I see this tendency more in the right-wing and far-right media. Is there pure infiltration? No, but we are seeing more and more that certain journalists feel free to convey these objectives and these stories. We tried to do a study on the penetration of Russian propaganda stories in the traditional media. The phenomenon is marginal, but the trend is growing. Mr. Kolga mentioned this earlier: It explains the slow erosion of support for Ukraine, and it’s increasingly difficult to get involved in this area.
Senator Dagenais: Thank you very much.
[English]
Senator Boehm: Thank you for being here. I’m following up where Senator Dagenais left off.
Professor Boucher, you and your research group — at least in the 2022 paper — collected more than 6.2 million tweets, as they were then known — they’re now postings. You came to the conclusion that the Russian influence on social media was prevalent. I guess the assumption would be that it’s even more prevalent now.
What is the probability that an average user of social media in Canada will encounter Russian disinformation through just random doomscrolling, since this is a sort-of doom topic? Also, you mentioned the demographic groups — urban versus rural and young versus old — but are there any other targeted areas that you would see? At the end, I would also like to know if you have an opinion on the reposting by political figures, who are perhaps doing it innocently enough in order to make a point but are essentially reposting propaganda.
Mr. Boucher: That is a great question. In the data — both on social media and in the survey — we find that 80% of Canadians are mostly not touched by that kind of narrative. The issue we have is that the other 20% fall into that rabbit hole and stay there, and that 20% has a lot of impact on our political life and slowly but surely gains more and more influence in the political sphere — especially in my own province, for example. That’s the issue. I see this as a vulnerable-population problem, where 80% are pretty okay, but there are still 20% stuck there, and they’re not getting out. Then, slowly but surely, they’re having an impact on the other part.
In terms of prebunking, if I were doing the strategic communication on behalf of the Government of Canada, I would focus on these groups and engage with them. Some MLAs and MPs in my own province — who are Conservative, of course — sometimes ask me to speak to their groups or give public talks about Russian disinformation. They’re concerned about the impact it has on our population and on our groups. That would be the kind of argument I make. It’s a good-news and bad-news story at the same time.
The second question was on Conservatives.
Senator Boehm: I didn’t mention —
Mr. Boucher: I agree. However, I still think that’s the problem. I think political elites have a massive impact in that environment, to be fair. Network effects are way more influential in spreading disinformation than the information itself. When we look at the data, people who are misinformed are misinformed on everything. It’s not about information; it’s about in-group/out‑group positions. My own position is that political elites have a vested interest in being on top of those issues and carrying that message to everyone. If they speak strongly, then people will follow.
I also think that when we wobble on those issues, we make ourselves and our society more susceptible to Russian disinformation. Foreign interference is a crime of opportunity. You need a suitable target, a malicious actor and a lack of enforcement. The malicious actors will do this anyway. We don’t have a lot of enforcement right now on foreign interference, but what we can do is make the suitable target less and less attractive — for example, if parties come together and say, “We are steadfast in our support for Ukraine. We don’t care what you will do and try to say in our information environment. We’ll do this.”
My own perspective — and that’s what I ask of my colleagues in Alberta — is that you should speak more strongly about those issues, and you shouldn’t be afraid to support Ukraine, and say, “This is a question of values. We support democracy and the rule of law, and we don’t really care which party you vote for. This is just a fundamental value and principle.” Just stand on those. That would be my answer.
Senator Cardozo: I have a couple of questions.
Professor Boucher, in terms of what is going on in the discussion, my sense is that it’s more the far right than the far left. I would put some of the far-left examples you gave me in the far-right column, in a sense — people are supporting the Russian regime.
I’m wondering why mainstream conservatives in the U.S. are pulling away from Ukraine. We’re getting a bit of that in Canada. Why is that happening?
Mr. Gruzd, I’d like to get more details from you on media education programs. I think they are awfully important. I’d be interested to know what we can be doing more of.
Mr. Boucher: I’ll go fast —
Senator Cardozo: Don’t go too fast.
Mr. Boucher: I speak like this in French and English, so I’m sorry.
I think the growth audience for the Russians right now is the far right and the Conservatives. If I had to put any money on it, I would put my money there. If we were speaking about Iran or Iranian-backed proxies, I would talk about the far left; that is where they’re making a lot of inroads. For Russian disinformation, however, the far right seems to be the ecosystem. Some of it is because of the Americans. They have been able to convince a large part of the American electorate and influencers like Tucker Carlson or Marjorie Taylor Greene of their narratives. That, unfortunately, has an impact in Canada. Our ecosystems are roughly the same. The Canadian far-right groups are integrated with American far-right groups, and that matters.
There are a lot of reasons why that happens. One of those is polarization. When we do surveys on polarization, we sometimes find that people have an affective relationship with their parties, and it really is up to the parties to decide what they’re going to say. In the U.S., the party has steered toward the right, and the Make America Great Again, or MAGA, elements have mostly gone far right. Now we’re seeing a rise of illiberal values on the far right — autocracy and opposition to pluralism — and that is concerning. The Russians find their way into this.
In Canada, we’re starting to see that a lot. Slowly but surely, the Conservative Party — unfortunately in my province — is getting more and more influenced by the far right. It’s harder and harder sometimes to know the difference, and the information ecosystems are slowly but surely starting to converge in such a way that we’re having issues within those ecosystems. My sense is that what we’re seeing in the U.S., we’ll see in Canada — unless political leaders decide otherwise.
Mr. Gruzd: The question was about media education. Before getting there, I want to briefly answer the previous question: About 51% of Canadians were exposed to pro-Kremlin information. I would love to give statistics broken down by platform, but, unfortunately, platforms don’t give us researchers the data. That’s another concern I will be happy to discuss if we have time.
Going back to the media literacy program, I hear a lot from previous panels about digital literacy and the importance of critical thinking. It’s true, but I don’t want this to be the only takeaway from this committee — from these hearings — because it puts all the responsibility on individual users. Social media platforms are very complex and are governed by algorithms. Essentially, they’re black boxes. We can’t really rely on individuals alone, though we do need to put effort into that as well.
Generative AI poses a particular challenge to digital literacy programs right now. With audio generated by AI, you cannot really tell whether it’s authentic. Therefore, no matter what literacy training you provide, it’s impossible for the human ear to separate the two. I think some issues can be addressed with digital literacy, and for others, we really need to talk to platforms.
Senator M. Deacon: Thank you all for being here today. Mr. Gruzd, I’m wondering if you could go back and talk a little bit more about — you said a lot very quickly on some of the pieces, which is fine — the readiness assessments you referred to.
Mr. Gruzd: The idea is that we’re trying to think about who the main targets of information operations are — in this context, Russian information operations. Policy-makers, politicians and civil servants would be the primary targets. My concern is that if we’re only thinking about interventions for the general public, we’re actually missing the most critical group here. In my opening remarks, when I was referring to readiness assessment, I had in mind that group of individuals who have a strong following base online and off-line and who may retweet something accidentally — or not — with an outsized effect.
The question we usually hear concerns the general public, but we should be focused on whether our elected officials and others are ready to be attacked by an information operation.
Senator M. Deacon: Thank you for that.
Mr. Kolga, I think we might know the answer in this room, but based on your experience and how you introduced yourself, I would be curious to know what it looked like inside Russia during the recent election.
Mr. Kolga: With regard to the information environment, it’s completely sealed off. There are only a few platforms that are still able to penetrate into Russia: One is YouTube and the other is Telegram. Most Russians have been conditioned over the past 24 years to believe that the state only tells the truth and that Russia is surrounded by enemies, whether that is the United States, NATO or the European Union — and now, inside Russia, the LGBTQ community is an enemy. So if you’re inside Russia, you’re surrounded by all of these enemies, and you’re presented with only one option — an individual who might run the government and protect you from those enemies. That’s Vladimir Putin. Russians are regularly bombarded with this sort of messaging, as well as anti-Ukrainian messaging that one would characterize as incitement to hate. Considering all of these things together, most Russians are living in a completely different reality or parallel universe, and the state controls that. That’s intentional.
Senator M. Deacon: Is there any way out?
Mr. Kolga: There is a way out. The way out is that Western democracies should come together to support independent Russian media. There’s a large community living in Latvia right now, and in Vilnius as well, and those governments are giving support to these communities. All the major independent media outlets are operating from abroad, so working with them to help penetrate the Iron Curtain with which Putin has surrounded his country is one way we can do that. We can also support Russian civil society organizations that are living in exile right now — supporting them and funding their work to prepare for the day when this regime comes to an end. It will come to an end one day, and now is the opportunity to support those pro-democratic forces that align with our values so they can succeed when that moment does appear.
Senator McNair: Thank you to the witnesses tonight. It’s hard to be the third panel in an evening, and you’ve covered the topic very well and kept the interest of the group. My question is to Mr. Kolga. You talked about some of Russia’s disinformation campaigns here at home trying to impact Canadian public opinion toward Ukraine. I’m curious: How does your organization measure these disinformation operations and what parameters are you using to identify whether a campaign has been successful or not?
Mr. Kolga: Thank you for that question, senator. It’s very difficult to measure whether an operation has been successful, but we try to use what is called the “breakout scale,” which was proposed a few years ago by an expert named Ben Nimmo, who is now the head of Facebook’s threat assessment unit.
What this basically does is allow us to quickly assess a narrative. As soon as we see it, we can keep an eye on it and determine the impact that it could be having and is having. For example, take the narrative about Ukraine being a government run by neo-Nazis. Initially, that narrative would have appeared on Russian state media — a single platform — and may have appeared on some fringe platforms, but at that point we’re not really too concerned about it because the impact is probably quite limited. We start to get a little more concerned when it breaks out to various other platforms. So it might break out from Russian state media to Twitter, Facebook or Instagram, and then we become a little more concerned and start paying closer attention to that narrative. We really start to become concerned when that narrative jumps into mainstream media. When you have Canadian television, or perhaps a newspaper columnist or radio, reporting on that narrative, then we become extremely concerned — especially when an elected official, an influencer or a major journalist is also repeating that narrative, and it then impacts policy decisions or provokes some sort of political action. That’s the scale we look at, and when we see those narratives moving through those different phases, we determine what sort of action to take — whether to expose it or whether to address it. Nothing replaces the kind of work my colleagues do in diving deep into data and that sort of quantitative research, but this is a quick way of determining what the impact of these narratives might be and where we might be in their life cycle.
Senator McNair: Another quick question if I may. Mr. Kolga, I’m curious to hear your thoughts briefly on the impact that disinformation is having on the Russian diaspora here in Canada. I understand there are still some streaming services that are giving air time to Russian state media. Are you concerned about this?
Mr. Kolga: I’m extremely concerned about that. There are 500,000 individuals in Canada who identify as Russians. The Canadian government has done a good job of sanctioning all Russian state media. We’ve also banned Russian state media from our public airwaves. It’s still available, unfortunately, online, but it’s also available through streaming services that are based in Canada. These are services like Amazon Fire Stick or Roku. Basically, you can go to a shop in Toronto, buy a little device with a USB connector on the back and plug it into your television. You pay $12 or $15 a month, and this allows viewers to stream all Russian state media into their homes. There was a report recently published in The Logic by a journalist named Martin Patriquin, in which he interviewed a well-known Russian-speaking journalist here in Canada, Alla Kadysh, who estimated that at least one third of Russian homes in Canada use this service. That means that one third of those Canadians are being exposed to the extremely toxic Russian state media that is pumped into the minds of Russians on a daily basis. This includes the incitement to hate against Ukrainians, among other narratives. So I’m very concerned about these services, and I wonder whether these organizations are violating our sanctions by generating revenue from rebroadcasting Russian state media. I think the government and our authorities need to take a very close look at these services and whether they’re compliant at all with our laws.
[Translation]
Senator Carignan: Mr. Boucher, in 2022, you did a study on propaganda, particularly on Twitter. In short, 75% of this propaganda was pro-Ukrainian and 25% was pro-Russian — though pro-Russian sources accounted for about 35% of the content, even if only 25% of the tweets were pro-Russian.
Has the situation changed? What is the Canadian government doing to resolve the situation, or at least to lessen its impact? In your 2022 article, you were quite critical of the Canadian government, talking about what it wasn’t doing, or rather what it should be doing.
I’d also like to hear what you have to say about the fact that a prime minister said that foreign interference in the elections wasn’t serious, that only a few MPs lost their jobs, but that it didn’t change the result in terms of the government that was elected.
Mr. Boucher: I’ll go slowly. What’s interesting is that when we did this study in 2022, the war was in its first few months; we knew nothing about the ecosystem and we didn’t know how the situation would develop — my colleague, Mr. Kolga, also worked on this.
Today, it’s much the same. The players we identified then are the same as they are today — exactly the same. It’s quite astonishing, and it surprised us. Every year it’s the same people; we know them and we know what they’re going to say.
This has given us a better understanding of the ecosystem. In my opinion, today, the far-left ecosystem has not grown and is starting to work more on in-depth stories about Iran and anti‑Semitic groups. However, the far right continues to grow; in a way, I’m under the impression that this small 25% is a little stronger and more important today.
The second question concerned the measures being taken by the Canadian government to resolve the situation. I remain just as critical. Frankly, the Canadian government and its civil servants are working very hard and they are competent. It amazes me how seriously they take their work in all the departments, such as Global Affairs Canada, National Defence and the Privy Council. However, they don’t have the tools or the policies to help them do anything.
What emerges from my conversations with people at Global Affairs Canada is that they want to respond. Yes, but respond to what, how and where? In reality, they’re not really doing that. Despite two years of war, events have occurred involving the governments of India, Iran and China, as well as Hamas. There is a whole host of players who are interfering, polluting our information space and trying to influence our fellow citizens, but there’s no real response from the Canadian government.
What worries me a lot is that when I have conversations about the 2025 election, it’s not clear whether Canada has a plan and whether it knows what to do. As Mr. Gruzd was saying, the tools that are being developed today, such as deepfakes or generative artificial intelligence, both in audio/video and in text, will be four times more effective in a year’s time.
This is the year of the big test. Everyone talks about the four billion people who will take part in elections, but that’s not so important; what’s important are the 300 million voters who will cast their ballots in the United States. All the players will be putting their resources into trying to influence this group, because they are the ones who will have the greatest impact on the war in Ukraine. This means that for a year, they will be testing all the artificial intelligence and generative artificial intelligence tools.
When the 2025 elections are held in Canada, we’ll be facing a group that will have spent a year training to manipulate the information space in the United States, and we’ll still be lagging behind. I know that people in the Canadian government are very serious and concerned about this event, but I can’t see any plans or manpower yet. The teams at Global Affairs Canada are very small. There are three or four people working very hard but managing 83 files; they spend their time briefing ministers and coordinating people within departments. At the end of the day, there isn’t really a program that’s put forward to say, “Here’s what’s being done and how we’re doing it. Here’s what we’re doing and how we’re measuring it.” I think that’s a problem.
The feedback I get from inside government is that everyone thinks it’s a problem. The Prime Minister says it’s not serious. I think he’s wrong. I think all political parties in Canada should unanimously criticize any interference from anyone, be it the far right, the Russians, the Chinese, the Iranians or the Indians. Canadians have the right to decide their future between themselves, and the Canadian government and all political parties should make this a position of principle, and say that they don’t care whether the interference comes from the United States, the far left or the far right, they have to protect Canada’s cognitive space.
I disagree with the Prime Minister. I also disagree with the Conservatives, who are a bit indolent in the face of the extreme right coming from the Americans. They think, “Well, are we going to do anything about that?” Yes — what’s good for the Liberals is good for the New Democrats and the Conservatives. I think that as a society, we have to tackle the problem.
[English]
Mr. Kolga: A couple of very brief comments. As far as the far left is concerned, Russia is truly exploiting this group. They are advancing anti-NATO narratives and anticolonialism narratives, and they’re also picking up Russian narratives which blame Ukraine for starting this war and for prolonging the suffering of Ukrainians. This is having an impact on mainstream debate about what to do with Ukraine, whether that’s to impose peace or to continue supplying weapons. They’re coming at it from a different angle, and it is clearly having an impact. I think we’ve all heard those narratives in mainstream media.
As for what Canada should be doing, I completely agree with all of Mr. Boucher’s comments. I would only add that the Europeans are doing a lot on our behalf with the Digital Services Act, which is holding those large social media companies to account. Maybe what we should be doing is looking at how we can support that European effort.
I do want to comment on Global Affairs’ Rapid Response Mechanism; it is very effective. I speak to European colleagues — elected officials — all the time. They all comment on the effectiveness of the RRM in spreading information and awareness of some of these narratives. So it is doing very good work. What we really need to be doing is working more closely with civil society, because civil society is nimble. It can do that work of exposing those narratives so that there is greater awareness. So more work with civil society is definitely needed.
Senator Yussuff: Thank you, witnesses, for being here.
I’ll take you back a bit — not in the context of Ukraine. As you know, we went through the pandemic, and we saw that a significant portion of the population that didn’t believe in vaccination was disruptive. In time, they disrupted what would be the norm for how society responds to a major crisis on something that was so fundamental for many of us. Of course, they used all the tools you were talking about. This is here; this is not Russia. These are our own folks. Where they got the information from and how they spread it was extremely disruptive. We’re now looking at it on a larger scale — how malicious actors, both states and individuals, pose a major challenge to how democracy can function, including the point that Mr. Boucher made on foreign interference.
What can we learn from what other countries are doing? Despite the fact that this has been with us for some time, liberal democracies haven’t found a proper way to address this and build consensus among citizens. Some people consider it their right to have disinformation, along with other information, as part of their lives, and believe you shouldn’t restrict them in any way. You have social platforms that, on a day-to-day basis, apply no screening to what you can see or be influenced by.
To put it in context, if we’re going to deal with this in a meaningful way as a country, it would seem to me we should learn from somebody else, but I don’t know of anybody I can point to with any significant confidence that they are better than us, other than that they are trying extremely hard. There’s been some international coordination, to be fair. I think it’s been the subject of G7 meetings and other forums, but we haven’t found a proper solution for how we’re going to deal with it.
In the meantime, the people who are extremely good at disrupting our lives in a meaningful way are not stopping their actions. What can be learned? And are we really grappling with how best to deal with this challenge, other than bringing in legislation to restrict certain ways in which people receive information — something I think society is not prepared for, not in this country, much less the neighbour beside us?
Mr. Boucher: The first part is that a lot of people are making money at spreading disinformation. There’s a fundamental right to be able to express your views and beliefs. There might not be one to make money out of it. I think we can regulate some of this, or at least make it more transparent. If you are an influencer or you have a website, your sources of funding should be transparent. Who gives you the money — say, through GoFundMe? I think that would shed a lot of light in terms of understanding who is spending millions of dollars to spread their views.
The anti-vaccine movement was not just a couple of blokes who had views about the vaccines. Behind it were corporations and groups that spent millions of dollars on ads to advance their views and tried to convince others to do the same.
The anti-LGBTQ narrative is not just a couple of concerned citizens. It’s actually groups, backed by money from the U.S., that spend millions of dollars to promote their views to Canadians.
We can tackle that part — not police speech, but police amplification. You have a right to say whatever you want. You don’t have the right to spend a lot of money on it — or if you do, it should be transparent. You want to spread an anti-vaccine narrative? Who funds you? This billionaire or that millionaire. It’s legal, but at least it gives you a better understanding of the space.
The second part is that there are other states doing this, somewhat as a group. I’m thinking of the Australians, who have set up what they call ASPI, the Australian Strategic Policy Institute, a group funded through their defence and public safety departments. They’re really good at tackling disinformation, especially from China. We have nothing like this in Canada. They have been very effective, and some of the work they do has actually helped us: The information about the Chinese “Spamouflage” campaign targeting public figures came from them. We could learn from this and spend that kind of money.
The last part is that we’ve learned that fact checking actually doesn’t work. Much of the research, data and work we’ve done on fact checking demonstrates that it does not work. When it does seem to work, it doesn’t stick, and sometimes it actually makes people more entrenched in their views.
We think now that prebunking is the most effective way to combat misinformation and disinformation, but I haven’t seen a lot of good prebunking narratives or ways of doing this. We can spend a lot of time and effort in the coming years on that and see if it’s effective in combatting disinformation.
Mr. Gruzd: We can learn a lot from COVID-19 misinformation and interventions that we implemented across various sectors in our society.
Starting with the platforms, they quickly put labels up. Every time there was a message related to the COVID-19 virus, vaccines or future vaccines, there was a link directly to Health Canada, where people could find credible information.
Social media platforms invested in fact checking. I disagree with my colleague; fact checks do work, though studies show the effect does not necessarily last long term.
Fact checking of false and misleading claims was implemented across many platforms. We’ve also seen how YouTube — related to the previous comment — demonetized anti-vaccination videos. We saw that YouTube immediately started to recommend more pro-vaccine videos than anti-vaccine videos to users. This is based on our own study.
There are a lot of things we can learn. The things I list here are what platforms implemented during that time, mostly on a voluntary basis. When the societal pressure faded, they quickly stopped those efforts across their platforms. Labelling and fact checking disappeared. What remains potentially still good are the political ad transparency initiatives, where we can actually see who is spending money on which ads related to politics during elections and to issues of significance.
I think we can learn a lot from the COVID period. The question is why those efforts were not sustained. Different stakeholders in our society have dropped the ball, and we can re-energize them.
Mr. Kolga: I would say that freedom of expression does not mean freedom from scrutiny. When we have these actors who are spreading disinformation, they need to be exposed. I think that, as a society, we’re afraid of doing that. We need to learn from our allies in Europe who do this effectively — expose the platforms and the individuals who are spreading those narratives.
Foreign governments such as Russia, China and Iran have no right to express themselves in our country. They have no right to violate our cognitive sovereignty or the sovereignty of our information space. They should be blocked whenever possible. We should, again, learn from our European allies, who have completely blocked them out of the European information space.
Who is doing this well? The European Union is doing it well. EUvsDisinfo is a wonderful platform that combines debunking and fact checking and adds contextualization to those narratives. When you’re reading it, you understand why these foreign governments are exploiting certain narratives and whom they’re targeting with them.
I also think fact checking is important for our journalists and our newsroom managers, who are actually looking for that information. Having that information exposed online, as we do on DisinfoWatch, is useful, and we know that media outlets do use that service.
Senator Dasko: My question was what our witnesses think should be done, so I feel that’s been well answered.
I do have a small question for Professor Gruzd. You said you had done some research and that 51% of Canadians have seen Russian narratives. You didn’t ask the question: Have you seen a Russian narrative? You asked about the narratives, right? What were some of the narratives you asked people about?
Mr. Gruzd: We’ve conducted a couple of national surveys like this. Every time we ask Canadians about the types of narratives they’ve seen across different social media platforms, we look at what is currently trending — the narratives we observe as researchers through the data. During that survey, the narratives were about who is to blame: whether Ukraine caused the war by starting it; whether NATO caused it through its expansion; and the claim that Ukrainian nationalism is a neo-Nazi movement. Those are common narratives that we’ve observed.
Senator Dasko: So 51% of Canadians have seen one of these narratives, but the number who believed them would be more in the 20% that Professor Boucher has talked about; right?
Mr. Gruzd: Yes. We then followed up to ask to what extent — because it is a range; it is not, do you believe or not, but to what extent do you believe those narratives? The claim that NATO expansion caused Russia to attack in order to defend itself was the most widely believed in Canada. And around 30% of Canadians believe that the Ukrainian nationalist movement is neo-Nazi.
Those are concerning numbers. Of course, this is a scale, and our interventions might not reach those who believe these narratives most strongly, but we certainly need to address the rest.
Senator Dasko: Thank you.
The Chair: Colleagues, this brings us to the end of our panel and the end of today’s meeting. I want to extend sincere thanks to Mr. Boucher, Mr. Gruzd and Mr. Kolga for, at this late hour, keeping all of our attention right to the last minute. These discussions have been fulsome, thought-provoking and concerning. There has been an alarm call for stronger reactive strategies. You could all tell the high degree of interest in this room. We thank you very much for contributing to a very important discussion.
Colleagues, our next meeting will be on Monday, April 29, at 4 p.m. eastern, when we will be chatting about tensions in the Middle East.
Thank you again for your participation here today. I wish you all a good evening.
(The committee adjourned.)