
THE STANDING SENATE COMMITTEE ON NATIONAL SECURITY, DEFENCE AND VETERANS AFFAIRS

EVIDENCE


OTTAWA, Monday, November 3, 2025

The Standing Senate Committee on National Security, Defence and Veterans Affairs met by videoconference this day at 4 p.m. [ET] to examine and report on the impacts of Russia’s disinformation on Canada; and, in camera, to consider a draft agenda (future business).

Senator Hassan Yussuff (Chair) in the chair.

[English]

The Chair: I am Hassan Yussuff, senator from Ontario, and I am joined by my fellow committee members. I invite them to introduce themselves.

Senator McNair: John McNair from the province of New Brunswick. Welcome.

Senator Cardozo: Andrew Cardozo from Ontario.

Senator Wilson: Duncan Wilson from British Columbia.

Senator Al Zaibak: Mohammad Al Zaibak from Ontario.

Senator White: Judy White from Newfoundland and Labrador.

[Translation]

Senator Youance: Suze Youance, Quebec.

[English]

Senator Ince: Tony Ince from Nova Scotia.

The Chair: Today, we are meeting to wrap up our study on the impacts of Russia’s disinformation on Canada. Our meeting will focus on the international efforts to counter disinformation from Russia.

For this discussion, we have the pleasure of welcoming Jānis Sārts, Director, NATO Strategic Communications Centre of Excellence; Edward Lucas, Senior Fellow and Senior Advisor, Center for European Policy Analysis; and Anayit Khoperiya, Deputy Head, Center for Countering Disinformation, National Security and Defense Council of Ukraine. Thank you all for agreeing to join us here today.

We begin by inviting you to provide your opening remarks to be followed by questions from our members. I remind you that you will each have five minutes for your opening remarks.

We will start with Jānis Sārts. Please proceed whenever you are ready.

Jānis Sārts, Director, NATO Strategic Communications Centre of Excellence, as an individual: Thank you very much for the invitation. For my introductory remarks, I’ll try to focus on three things: first, how our opponents — Russia and China — see this domain and why and how they pursue it; second, the best-known examples of a functional response, the good recipes to follow and the data supporting that they work; and third, what it takes to prepare for the future, because the future, as always, is going to be different than it is today.

First is the way Russia and China perceive it. I think both of these countries are on the record as saying that they perceive this as a war or as an extension of war. When we look at the actual situation, they believe they have an advantage because, being authoritarian states, they have developed the ability — and they have both the mandate and the intent — to control the information environment, control what their people consume and, they hope, control how their people behave. That’s the DNA of their regimes.

On the other side, they see us as democratic societies, very much dependent on the free exchange of ideas and the free information environment, where everyone is allowed to speak up and deliver their ideas. They use that as a vehicle to penetrate our information spaces.

This is not a new idea. It happened throughout the Cold War, but the difference today is that social media and its algorithms make it so much easier for them to attack and make an impact on the conversations of Canadian citizens by pretending to be Canadian and trying to field different kinds of conspiracy theories. Many times, they take existing wedge issues and deepen them to the point where disturbance in society occurs. In some cases, that is deliberate. In some cases, they are very opportunistic.

Obviously, for Russia today, the key focus is the war in Ukraine. I’m sure Anayit Khoperiya will talk about that in more detail, but, clearly, from the perspective of the West, they try to undermine our resolve to support Ukraine with weapons and other forms of support, and that’s why they are amping up the current attacks in Europe, such as the drone incursions into Polish airspace. These are, in my view, primarily meant for narrative change and cognitive effect. Therefore, at NATO, we’re increasingly thinking not just in terms of pure disinformation but in terms of cognitive warfare.

Now I will move on to what is working to counter that. There is a reason I pointed to this cognitive warfare because, interestingly enough, much of what is efficient and effective in wartime — and you being the National Security and Defence Committee would appreciate that — is also very similarly effective in the information space.

The first thing is — and Ukraine has demonstrated this — that by being on the defensive, you cannot win. You have to find a way to be on the offensive, where you decide the narratives and drive the information dynamic. The best example was the opening phase of the full-scale invasion where, for once, the West, by disseminating intelligence, was undermining what Russia was prepared to do. We, as a centre, did a data analysis, and we saw how Russia was unable to respond to that. They didn’t know how to respond when they had to defend. That’s the first thing.

The second thing is networks. It’s a networked environment, and in a networked environment, we need to have networks that we respond with so that it is not only the government’s capability but also civil society and engagement.

The third thing, which is also very important, is the delegation of authority. In the military, control and hierarchy are very important, but what has been demonstrated time and time again is that the more you delegate, the easier it is to respond to the immediate changes in the circumstances and the more genuine that response is.

The last piece is the future. I’ve said that social media has created the circumstances within which we are. The bigger revolution is coming. Artificial intelligence, or AI, is going to be the future of the information ecosystem, and it can be either much better or much, much worse.

If we look at it today, there is a lot of data showing that AI is already much more persuasive than an average human in conversations with humans. We’re building relationships with AI systems, especially given the level of use: about 40% of users up to the age of 25 have a relationship with their AI systems, as a friendship or in other forms. Ultimately, of course, these AI systems are now being used by Russia and China to deliver their information effects.

The Chair: Sorry, I have to ask you to conclude.

Mr. Sārts: That means that is another area that we have to focus on. Thank you.

The Chair: Thank you very much. We will get a chance to delve into that a little bit more.

Ms. Khoperiya, you are next. You have five minutes for your opening remarks.

Anayit Khoperiya, Deputy Head, Center for Countering Disinformation, National Security and Defense Council of Ukraine, as an individual: Honourable chair and honourable senators, thank you for inviting me to speak with you today. It is a privilege to share Ukraine’s experience in countering disinformation. Unfortunately, it is an experience written in real time under the pressure of war.

Ukraine has developed a comprehensive system to defend itself in the cognitive domain. At its core is the Center for Countering Disinformation, or CCD, where I serve as the Deputy Head.

The CCD was established in 2021 under the National Security and Defense Council of Ukraine. At first, we were mainly an analytical institution monitoring the information environment and briefing the national security council about emerging threats, but after February 2022, everything changed. Disinformation became an operational weapon that was launched before tanks and missiles and was used to justify aggression, divide allies and weaken public trust. Since then, our mandate has expanded from monitoring hostile narratives to orchestrating a whole-of-society resilience effort across government, media and civil society.

In the early days, our work was very tactical: We found a fake, we debunked it and we moved on to the next. But that became a never-ending loop because the problem wasn’t just individual lies. It was an ecosystem built to spread them. We decided to focus higher — to uncover coordinated disinformation campaigns, inauthentic networks and behavioural patterns that continuously pollute our information space.

Russia’s method is simple but dangerous. It searches for vulnerabilities in each country — social divisions, political tensions and economic fears — and then uses those vulnerabilities against societies themselves. What changes is not the message, but the entry point. That is why countering disinformation is not only about communication. It is national security in the cognitive domain.

The CCD is well positioned for this mission. We coordinate with all government ministries and special services, yet we also maintain open cooperation with journalists, academics and civil society.

We publish analytical reports on the mechanisms of Russian disinformation, such as “information alibis,” which means blaming the other side in advance for actions that you plan to take yourself. The goal is to hide or excuse war crimes committed by the occupying forces.

We also communicate publicly through our official social media accounts as well as through video projects and podcasts with Ukrainian influencers. They discuss disinformation in their own fields such as education, culture and sports. These voices make our message relatable and trusted.

Education is another pillar of our work. We run certified training courses for civil servants on crisis communication and countering disinformation. We developed an interactive scenario-based simulation where participants experience both the attack and defence phases of information warfare.

We also cooperate closely with similar institutions abroad, international organizations and StratCom units from many countries. Together, we identify shared attack patterns, exchange methodologies and try to find the best solutions to prevent information attacks.

We also work with big tech companies such as Google, TikTok and Meta, pushing for transparency, faster removal of coordinated inauthentic behaviour and accountability when disinformation is organized by human operators. Those who orchestrate such operations must face sanctions and legal consequences.

Our experience offers lessons that are relevant beyond our borders. First, institutional resilience matters. The fight against disinformation must be anchored in a permanent, well-resourced structure at the national security level, not fragmented across ministries or left to reactive initiatives.

Second, transparency builds trust. Citizens are more resilient when governments communicate clearly and openly.

Third, education is the vaccine. No algorithmic filter is as effective as an informed mind.

Lastly, allies must synchronize efforts. Russia’s narratives adapt across languages and platforms. Democratic nations must be just as coordinated in exposing them.

Honourable senators, the lessons we have learned, often at great cost, can help democratic nations reinforce their own cognitive borders. By sharing methodologies and investing in public resilience, we can ensure that disinformation fails where freedom and truth cooperate. Thank you, and I will be glad to take your questions.

The Chair: Thank you kindly, Ms. Khoperiya. Next is Mr. Lucas. You have five minutes for your introductory remarks.

Edward Lucas, Senior Fellow and Senior Advisor, Center for European Policy Analysis, as an individual: Honourable chair and honourable senators, thank you for inviting me here today and thank you to your clerk Ericka for organizing this. I will just take a few minutes of your time with these opening remarks, and I look forward to your questions.

First, here’s a bit about me. I know first-hand how central information is to the struggle to protect and promote our freedoms. I’ve been dealing with European security issues for nearly 40 years, starting as an activist during the Cold War. I was the only foreign newspaper correspondent living in Communist Czechoslovakia and witnessed the Velvet Revolution bring down that regime. I was the last Western journalist to be expelled from the Soviet Union, and in 1992 I founded and ran the first English-language weekly in the Baltic States. I am also the author of several books dealing with Russia and disinformation and what we now call hybrid warfare. I wrote the first of these in 2007, and it’s called The New Cold War. It was a time when most Westerners and, I’m afraid, most Canadians were still reluctant to face up to the threat that the Kremlin regime poses not just to its own people but also us here in the West. In 2018, I was the first witness to the Russia inquiry conducted by your colleagues in London at the Intelligence and Security Committee of Parliament.

I need to warn you it’s a mistake to focus solely on disinformation. Propaganda is just one weapon in Russia’s arsenal. Russia also uses physical, psychological and legal intimidation, sabotage through both kinetic and cyber means as well as subversion and many other tactics too.

I’ve seen all of these first-hand. Some of my friends and colleagues have been killed for their work. In 2010, I coordinated the legal defence for my then-employer The Economist in a libel action brought against us by a Russian tycoon. We fought him off, and it cost us roughly C$1 million in today’s money. It’s a reminder that lawfare is also an information operation.

It’s a huge honour to share this session with Anayit Khoperiya, and like my colleague Jānis Sārts, she is a world expert in her field. People from Estonia and Lithuania and, in their case, Ukraine and Latvia were warning us for years about the threat from Russia, and we in the old West didn’t listen. We were too busy making money in Moscow. The result of that greed and complacency is the disastrous war in Ukraine, so please listen to these voices now.

As they’ve alluded to, Russia’s aim — using information operations and other bits of the tool kit — is to divide and rule. Russia seeks to divide our society, so it is exploiting any fracture line using wedge issues, whether it’s cultural, demographic, economic, geographical, linguistic, political or social, to polarize and weaken us. Russia also seeks to spread apathy and cynicism. If you don’t know what to believe, then you end up not believing anything. Russia seeks to divide our alliances and our international institutions, weakening the European Union through Brexit and NATO by stirring up transatlantic rows, and it depicts the Baltic States as defenceless and failed and Russia as invincible.

The bad news is that this works. The good news is that we can counter it if we wish. We do have normative, regulatory and legal options which we could use far more effectively. We just choose not to. And just as nobody forced us to open our financial system to dirty money, nobody forced us to open our information system to our enemies.

My strongest recommendation to you is to make resilience in the information sphere and throughout society a national security priority. Other countries do this too, notably Finland, but you in Canada, with your cohesive and high-trust society, are far better placed than most to follow suit.

I’m not a Canada expert and I won’t tell you specifically what to do, but I can tell you, echoing what my colleagues have said, passivity is the road to defeat. Do not rely on free speech. Do not rely on the free market. You need an active policy to protect your information system from predators; otherwise they will eat you. Thank you.

The Chair: Thank you, Mr. Lucas. Colleagues, we will proceed to questions. Our guests are with us until five o’clock today. As always, we will do our best to allow time for each member to ask questions. With that in mind, four minutes will be allotted to each question including the answer. I ask you to keep your question succinct in an effort to allow as many questions as possible.

Senator Al Zaibak: Mr. Sārts, it was interesting to hear the term “cognitive warfare” as opposed to “psychological warfare.” I wonder if you could elaborate on the difference. My question to you is this: Given the accelerated use of AI-generated content, such as “deepfakes,” voice cloning and automated influence bots, how is NATO preparing to counter these emerging tools, and how can allied parliaments legislate effectively without restricting legitimate political speech?

Mr. Sārts: First, cognitive warfare is a kind of recognition of what we see as the phenomenon of today. It’s not just about a simple piece of information that people see and believe in. It’s about establishing — through a complex set of mechanisms — the world views. That’s why, for instance, Ukraine and efforts to undermine Russian support toward the full-scale invasion are not as easy because it’s not just about showing what the reality on the battlefield is. It’s about the perceptions many of the Russian citizens have, and therefore it’s much more intricate and wider than just the information on a one-time psychological effect. It’s a consistent effort.

AI, as I’ve said, is going to change the way information flows, and it is not just going to be about synthetic versus non-synthetic. Already, in some parts of the internet, 40% of the content is synthetic, and by some statistics, 25% of the total traffic on the internet is not human.

During Canada’s next election, I do foresee that most of the content online will be synthetic rather than human.

What should we do? We have to ensure that the infrastructure of AI conforms to democratic values and principles, and we must not fall into the same trap we fell into with social media, where freedom is nominally allowed but the algorithms change the dynamics. That is what we have to concentrate on first. Second, we have to watch whether China tries to design the system in a way that gives them at least partial control of that ecosystem, and we should not allow it. That is, I would say, more of a regulatory realm than a pure defence realm.

Senator Al Zaibak: Just with respect to the legislation, how can parliamentarians and free world parliaments legislate the use of AI without restricting free speech?

Mr. Sārts: That’s an interesting question. In the age of AI, where AI will be talking to other AI, much of the internet will be like, “What is the free speech of a human?”

Second, I think one has to be very careful, and as a European, I suggest not following the EU’s approach, which is a very extensive regulatory framework and which, I’m afraid, stamps out innovation, so you end up not leading. I think Europe is starting to recognize that. We should look more at tactical applications of regulatory frameworks within a particular context. Information, I think, is one of those contexts, given the price to pay for not foreseeing — for instance, in an election environment — the rules for AI. That is probably where I would go first.

Senator Cardozo: I’d like to continue that conversation and maybe ask Ms. Khoperiya and Mr. Lucas to talk a little bit more about what you would suggest one can do about artificial intelligence, keeping in mind that, as my colleague mentioned, you have free societies but we can also regulate AI in this country. But unless everybody does it, isn’t it kind of meaningless because it can still work through other countries? I’d like your thoughts.

Ms. Khoperiya: May I start?

Senator Cardozo: Please do.

Ms. Khoperiya: Regarding AI, of course, we see a lot of examples of how Russia has become better since 2022, and they are using it not just for “deepfakes.” For instance, the most popular “deepfakes” they use are those urging our military to surrender. They are also flooding much of the information space with AI-generated comments, and in this case, I would totally agree with Jānis. We are adopting EU standards because we have a responsibility to adopt all EU legislation in order to become part of the EU, but at the same time, it somewhat limits us in this case. For instance, our politicians give their interviews to the best TV news broadcasters, et cetera, but the Russians are creating millions of fakes, which AI models have learned from and then use to answer citizens’ questions.

In this case, I think we could work on contextual measures, such as how we work with big tech: We are not limiting them, but we are giving direction on where algorithms can operate and on deleting materials that violate our legislation. It is the same with AI. It is a lot of work to monitor all these things, but I think it is the best option for us in this case.

Senator Cardozo: How about Mr. Lucas?

Mr. Lucas: Just very briefly, I think the key thing is realness. We need to be much better at proving who we are online and better at signing material that we actually want to authenticate. If you can say that this video we are watching now is signed by the Canadian Senate or by your clerk in a way that lets me check that it really is you and not a “deepfake,” then we’re already a step forward.

If we produce the kind of cryptographic infrastructure with which people and institutions can sign their own material — some form of public key infrastructure that allows others to check that it is authentic — that would immediately push all the “deepfakes” into the world of cartoons and caricature. It may be entertaining, but you don’t believe it any more than you believe what you see when someone takes up a pencil and starts sketching on a piece of paper.

Senator White: I have a lot of questions. I know I won’t get through all of them this time. Thank you all for your presentations; it has been very informative.

My question is specifically for Mr. Lucas. Drawing on the research that you’ve done, what are the economic impacts of Russian disinformation? More importantly, what I’m trying to get at is this: How can we better safeguard our economic security, and how can governments and the private sector work together to mitigate the impacts of disinformation campaigns directed at businesses, financial markets and trade relations?

Mr. Lucas: Thank you, senator, for that excellent question. One clear effect of Russian information warfare is on borrowing costs. If they can depict a country like Estonia, Latvia, Lithuania or Poland as sitting in a dangerous frontier zone, that immediately affects those countries’ ability to borrow money on international markets. It affects the creditworthiness of their big corporates. It also affects trade, investment and indeed tourism more generally. We’ve already seen this happening with the Russian information onslaught in previous months and years. It’s beginning to bite, and it doesn’t help when Western news outlets sometimes recycle those narratives.

You have also touched on the potential solution, which is that we can’t outsource these information security questions just to government. It is tempting, particularly for you as lawmakers, to think that if there is a problem, what we need to do is pass a law. And I think there is a need for a legal regulatory response. Ukrainians have shown how to do that in wartime conditions. We may have to do similar things in peacetime. It is more important to have a whole-of-society response where there are things that we don’t do just because they are shameful. We have already seen that with terrorism, child abuse images and other sorts of extreme content, where people just don’t want to go home at the end of the day and say to their loved ones, “I spent the day promoting something that is going to make society a great deal worse.”

By building the normative response, both on an individual and on an institutional/corporate level, there should be things that companies just don’t want to do. When you start doing that, you get a race to the top rather than a race to the bottom.

Senator White: My next question is to Mr. Sārts. We’ve been hearing so much testimony about all the responses to Russian disinformation. It is really reactive for the most part. I am wondering if NATO actually has a proactive strategy to strengthen societal resilience and to build trust. Can you provide some specific examples of proactive NATO programs and/or initiatives aimed at countering Russian disinformation?

Mr. Sārts: NATO is investing much more resources into that and, in particular, in developing capabilities in this area. Of course, NATO is an alliance of 32 nations that have to agree on every step. That is a bit of a factor that has to be taken into consideration.

I can point out that different allies, for instance, in our region have developed tools and practices. Once again, going back to the military analogy for this space: It is logistics. In a military operation, undermining the logistics matters. Sanctions in our region have significantly undermined the Russian infrastructure of disinformation, both the digital infrastructure and the directly supported, paid-for infrastructure. In the Baltics, that has decreased its efficacy. That’s one way.

The second way is integrating media literacy programs into school systems and curricula, which is very important. It is also important not only to be active in creating resilience initiatives but also to be capable, when necessary, of going at Russia — not copying their methodologies but staying within the boundaries of democratic concepts, such as truth. That, luckily, is a thing we can do. Some allies have been quite good at it, although, once again, it is not one piece of information that will change minds; it is the ability to shift perceptions. I think that is still a work in progress.

The Chair: Thank you very much.

Senator McNair: This question is to Mr. Sārts. Can you speak a little bit more about the best practices for building resilience in communities against Russian disinformation? I’m curious about what the NATO Strategic Communications Centre of Excellence has found to be most effective in countering disinformation.

Mr. Sārts: In my view, the efficient way to do so is actually to go totally analog. In this digital ecosystem — and we especially have to acknowledge social media here — truthful and reliable information is always at a disadvantage because of the algorithmic tilt toward the other kind. Mr. Lucas already pointed to Finland, which uses the human networks in society — community-based, peer-to-peer communication — as leverage against the digital storm, which is, I have to admit, very hard to deal with because of the fundamentals entailed. That’s one thing. The second, as I said, is educational systems.

The third really important part is recognition of your own vulnerabilities. That is one of the things that has really driven the Baltic and Ukrainian ability to resist: recognizing that you are vulnerable. From my view of multiple countries, the most effective attacks happen when countries think they are very resilient and don’t focus on it. When countries are, by definition, weaker and have vulnerabilities but recognize them, they are many times more difficult to penetrate.

In the election interference that I have seen, the moment countries really prepare, it is unsuccessful; when countries don’t, that is when we see the interference succeed.

Senator McNair: You mentioned the Baltics. Do you find that Russia has distributed more resources to this area than in other areas, and has it affected the way the war has been perceived in the Baltics?

Mr. Sārts: No, actually, my data shows that it used to be more, back in 2012 to 2018, and then the shift went to the larger European and other countries, because the cost-versus-effect equation in the Baltics has changed. It requires quite a lot of investment to create an effect that would be smaller relative to the bigger countries, and that is because we denied the infrastructure and accessibility and did not make it so easy to operate. Of course, they still do. For instance, one of those effects came not through communication but through action — incursions, drones, et cetera — and, of course, this creates anxiety in society. That’s what they seek. By and large, I would say we are not as much of a focus for Russia anymore as we used to be.

Senator Wilson: My question is for Mr. Sārts. I was actually going to ask you a different question, but I want to pick up on this last thread. If we’re seeing that Russia has moved its attention to Europe, North America and others, my view — and it sounds like you share this view — is that we are a bit complacent here in terms of our societal concern around the issue.

What do you recommend that we should do specifically here in Canada to wake society up to that? How can we distract society from their day-to-day affairs enough to make it a priority?

Mr. Sārts: Things have improved. I remember, six years ago, I did the first round of this kind. There was very little. Now I see more institutions and capabilities, and that could grow. First is societal awareness: I would pick out a hostile operation — hopefully not a very elegant one — and then unravel it and show everyone how it happened; that is part A. Part B is the capacity not only to locate the operation but also to disrupt it. That would be the second thing. Third is training. For instance, we at the centre have created an artificial intelligence-based total information ecosystem, complete with societies, where immersed teams can train for these kinds of hybrid operations in real time but without exposure to the real information environment. Next year, we will launch the first multinational competitive exercise. I hope Canada fields a very good team and can win the competition, but that will be a realistic test of what your capabilities are across government and how you measure up against the other allied governments.

Senator Wilson: My original question goes back to the idea that the best defence is a good offence. I asked them the same question in the Foreign Affairs Committee. I was interested in what you said in particular, Mr. Sārts, about Russia’s cognitive terrorism, if you will. How do we penetrate into Russia? I know it is not something that can be done overnight. You said that we can obtain the pictures from the battlefield, but the Russian population basically can’t receive the information. How do we penetrate that?

Mr. Sārts: First, the good examples are that we use intelligence information and do much more with that than the usual process. That was successful with the full-scale invasion, showing what will happen and framing it in a particular way, so it’s about having a different approach to the way we handle intelligence. In Europe, that has increasingly been the case. Some intelligence agencies are much more vocal in trying to gain the initiative.

Second, think of new technologies such as AI. They present opportunities to game out the best way forward for the specific audience and create — for instance, from a Russian society perspective — the effect of realizing the reality when they are presented with that. These AI agent, digital twin systems can be utilized in an experimental way for that, and, obviously, at the NATO Strategic Communications Centre of Excellence, we are pursuing that track.

The Chair: Thank you. Senator Marty Deacon has joined us.

Senator Cardozo: Mr. Lucas, you talked about resilience. What do you mean when you say that we need to be more resilient? Do you have some specifics about what governments need to do?

Mr. Lucas: Thank you. I am delighted to talk about that. The first thing, which Jānis has touched on already, is to look at the countries that already do this well. I am a big fan of the Finnish system, which includes teaching kids about disinformation almost before they’ve learned to read and write properly. Information resilience is baked into the school curricula. They also have well-financed public broadcasting. They are absolutely clear that journalists are part of the national information system. It is not just seen as a business. I mentioned at the beginning: Don’t think that just saying we have free speech and a free market is enough. That does not save you.

Another thing they have in Finland is what we call soft target protection. This came up after a very brave Finnish journalist named Jessikka Aro was targeted by the Russians for her work in exposing the “troll factory” in St. Petersburg, which is the source of a lot of this. She was intimidated and humiliated by Russian information operations, and the Finns took that very seriously. Now they have a system for protecting people like her. The result of all this is that it becomes much harder for the Russians to attack; they know that most of their attacks are not going to work. As Jānis was saying about the Baltics, they go and try somewhere else.

I just want to underline that we can’t win this just by concentrating on defence. We have left it too late. If we had been having this hearing 20 years ago and if we had spent our 2% properly on defence over the past 20 years, we would be in a really good position now, but it is too late for that.

Now we need to have a greater focus on playing offence and having an escalation ladder on our side, so when Russia attacks us — whether with cognitive attacks or anything else — we immediately have responses from the country concerned or from other countries. Using that means or some other means, we can go back to the Russians and say, “We didn’t like what you did. Now we are going to do this to you, and it will hurt.” It is now too late for defence. We are going to have to place a greater emphasis on deterrence. That may mean deterring information attacks by doing things in other parts of the spectrum.

Senator Cardozo: Mr. Sārts, you mentioned China as well. Are the countries that are dealing with cyberwarfare — China, Russia, North Korea, Iran and others — learning from each other? Are they all getting — I hate to use this word — better at it?

Mr. Sārts: Yes, they are learning from each other. In fact, one important difference is that we see cyberwarfare and information warfare in different ways, but both China and Russia have it as a singular piece. In the Russian case, it is actually called the Information Security Doctrine, so you see many cyberattacks led by the influence guys for the effect in the cognitive space rather than pure cyberwarfare.

On the learning piece, we did at least three studies to see how China and Russia interact. We see that they are learning from each other. They are different, but recently, we’ve seen more collaboration in some aspects. However, clearly, Russia is more focused on exposing and using human vulnerabilities, while China is trying to create the information system of the future that they can dominate.

In that sense, there is more danger in the Chinese approach, but that is because they are capable. Russia doesn’t have the same kind of scientific capability to do that.

Senator Al Zaibak: My question is directed to Mr. Lucas. I am concerned about the vulnerabilities. From your vantage point, what weaknesses and vulnerabilities in Western democracies are currently most exploited by Russian propaganda? What institutional reforms would most improve resilience?

Mr. Lucas: Jānis has touched on this. The internet has brought an immediacy and a frictionless quality to information warfare, which we didn’t have during the Cold War. As Winston Churchill said, “A lie gets halfway around the world before the truth has a chance to get its pants on.” The ability of the Russians to gain ground very quickly with something that makes no sense but catches us by surprise is one element. It’s also the sheer quantity of what they do: the ability to create large numbers of fake or inauthentic accounts and flood the comments under a newspaper article or social media posts. It is like standing under a firehose. In fact, the RAND Corporation had a concept called the “firehose of falsehood,” which characterized it very well.

To reform it, there is no silver bullet. There is no single thing we can do. We need to tackle the algorithms, and I am a great believer in algorithmic transparency. I don’t think the big social media companies can say, as Coca-Cola did, “This is our secret sauce, and you can’t see it.” This is part of our national information system, and we need to know why they promote some things and not others, and to have some traction on that.

As I said before, anonymity is an absolute scourge. Someone can pretend that they are you, Senator Al Zaibak. They can create a Facebook page in your name, an X account in your name or an Instagram account in your name. They can create another one in the name of the committee and then start spreading disinformation back and forth. This is really hard. At the moment, we don’t have what is called, in technical terms, identity assurance, so that we know who we are dealing with.

You know you are dealing with me because your excellent clerk tracked me down. I checked to make sure that she really existed and that it was not some kind of phishing attempt to get me on Russian radio and make me look stupid. We took precautions, but you’re kind of naked when you go onto the social media stage. You don’t really know who you’re dealing with.

Those are some of the things. But as I said before: It’s not just institutional changes. It’s normative changes. We have to change the way that we as individuals think and behave. There are things that are disgraceful and things that are damaging for our national security or for our mental health. We should then stop doing them.

Senator M. Deacon: Thank you to our guests for being here from far away. It is certainly a meeting we don’t want to miss. I have two questions to ask, and I am just retooling one so that it is not repetitive.

As indicated, public broadcasting has been mentioned. A recent study from the Center for European Policy Analysis, or CEPA, found that 40% of Ukrainians distrust the state broadcaster, which consolidated a number of channels when the war broke out. Is there utility in trying to bolster traditional broadcast media to combat social media, where disinformation proliferates? How can the government do that without being seen as interfering and hurting the reliability of these broadcasters in the eyes of the public? I will ask Mr. Lucas first, and if others wish to join in, that’s fine.

Mr. Lucas: I would defer to Anayit on that. Conditions in wartime are very different from conditions in peacetime. Ukraine is at war, and they have to do things that we wouldn’t want to do in a peacetime society. I am a strong believer that you don’t deal with Putinism by “Putinizing” yourself. I don’t want to say any more on that. This is really a question about Ukraine, and we should hear a Ukrainian voice.

Ms. Khoperiya: Thank you, Edward. Let’s put it like this: When we created one broadcaster at the beginning of the full‑scale invasion, it was totally supported by our citizens. The main point I want to focus on is that even our intelligence service was much criticized for having become very popular. They created social media accounts, et cetera. We were criticized: In other countries, there is no such thing as a popular head of the intelligence service, for instance. But right now, he is among the most trusted figures in Ukraine, after the armed forces and others.

At the same time, of course, some journalists and others were criticizing us, saying there is, let’s say, censorship. But, honestly, we never legislated for it. In Israel, for instance, there is wartime censorship; we don’t have such a thing. At the same time, we understand that people need to know where to find information from the government.

There are many YouTube projects and independent journalists’ projects. There are still journalistic investigations into people which, unfortunately, can lead to criminal cases in our country. So all of that exists. Journalists are working very well, and they are not punished for this. The point is that there is no issue with explaining what is happening inside the country, specifically internal issues. However, there are issues if you want to talk about certain military matters that cannot be made public. Still, we are trying to be transparent with our people.

It is really hard to strike the balance in this case, but international journalists and our international allies have come to Kyiv from different regions, by the way, and they see how it works in our country. Thank you.

[Translation]

Senator Youance: My question is for Mr. Sārts. You mentioned earlier that countries should conduct analyses in order to measure themselves against one another. I see this as a risk analysis where the danger is Russian interference, and each country must determine how vulnerable it is and potentially assess everything that is at stake in the context of Russian interference. I find the suggestion very interesting, but it implies information sharing between countries. There has also been a lot of talk about transparency. For this to work, should there be a supplementary agreement between countries? Finally, won’t this information sharing and transparency, which is a tool to combat disinformation, make us more vulnerable?

[English]

Mr. Sārts: Thank you for the question. When looking at it from a vulnerability perspective, the assumption by many is that the Russians go in one particular direction on the political spectrum. But the data shows that they are not really picking sides. They choose whatever vulnerability in a society is exploitable. It might be on this side or that side. It might be socio-economic. It might be religious. It might be race-driven. They pick and choose. There is no ideology behind it. It is very situational.

I have seen highly different actions in two countries, for instance, and that speaks to the point about transparency between countries. We as a centre created a kind of vetting forum for the Nordic-Baltic countries because we saw Russia treating it as one region. In some cases, they were driving disinformation about the Baltics in the Nordic countries from one angle and about the Nordics in the Baltics from the other. There was not enough coordination, so we created this vetting forum, and we also started trying to coordinate efforts. That is where we saw that it is purely situational.

That is also where I want to develop a point on AI, which I want you to think about: Future choices in elections and many other areas will be based on AI advice. As was said, there is already a set of attempts, for instance in election cycles, to poison the data that AI would use to give advice to citizens, in the way and form that the hostile player wants. It is still in play; it is not assured how that goes, but we have to recognize that this is all already escalating to a different level.

And on the coordination piece, we also have to start coordinating, because if about 40% of AI users ask questions about their medical conditions and other very sensitive issues, then we can be pretty sure those people will also ask for advice about their political choices and other choices. If that system does not function in line with our democratic choices and values, it is a very difficult situation. That is where we also have to pay attention to coordinating our future positioning. No single government we are thinking of here would have enough power to persuade some of those companies, but if we cooperate, there is a possibility, right?

Senator Wilson: My question is for Ms. Khoperiya. Specifically, you could now write the instruction manual on how to do this in real time in war. But are there any things that you wish Ukraine had done in this area prior to the invasion that would have set you up better and that would be good advice for Canada?

Ms. Khoperiya: Thank you for this question, because I have thought about it throughout the whole full-scale invasion. Honestly, the government should have worked more with civil society, because since 2014, civil society has been focusing on fact-checking. Academia, of course, also tried to do research, et cetera, but it was not popularized among the citizens of our country. Unfortunately, it was our unique experience that helped build this cohesion in our society.

In this case, I would say: To make it more popular in your context, do it through communication; for instance, through influencers, in order to engage teenagers, et cetera. That is the most important thing to do. Of course, the Finnish experience is the best one, but they start from kindergarten, and we need to start right now.

In the case of Ukraine, I would say we could have started this communication and media literacy work, and made it popular among our citizens, 10 years ago, not only now. I think this is the best recommendation in this case: Make it popular through different platforms. Thank you.

The Chair: Colleagues, this brings us to the end of our time with this panel. I wish to thank Mr. Sārts, Ms. Khoperiya and Mr. Lucas for sharing their experience and for meeting with us today, helping us wrap up this study on a high note. We appreciate your contribution to this work.

Senators, our next item is a discussion about what we heard and the instructions we would like to provide our analysts as they start drafting. Is it agreed we proceed in camera for this discussion?

Hon. Senators: Agreed.

The Chair: Thank you.

(The committee continued in camera.)