THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS
EVIDENCE
OTTAWA, Wednesday, October 1, 2025
The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4:16 p.m. [ET] to study Bill S-209, An Act to restrict young persons’ online access to pornographic material.
Senator David M. Arnot (Chair) in the chair.
[English]
The Chair: Good evening. My name is David Arnot. I am a senator from Saskatchewan, and I am the chair of the committee. I invite my colleagues to introduce themselves, starting with the deputy chair.
Senator Batters: Denise Batters, a senator for Saskatchewan.
[Translation]
Senator Oudar: I am Manuelle Oudar from Quebec.
Senator Miville-Dechêne: I am Julie Miville-Dechêne from Quebec.
[English]
Senator K. Wells: Kristopher Wells, Alberta, Treaty 6 territory.
Senator Simons: Paula Simons, Alberta, also Treaty 6 territory.
Senator Pate: Kim Pate. I live here in the unceded, unsurrendered and unreturned territory of the Algonquin Anishinabeg Nation.
Senator Dhillon: Baltej S. Dhillon, British Columbia.
Senator Saint-Germain: Raymonde Saint-Germain, Quebec.
The Chair: Honourable senators, we are meeting to begin our study of Bill S-209, An Act to restrict young persons’ online access to pornographic material.
For our first panel, we are pleased to welcome the sponsor of the bill, the Honourable Senator Julie Miville-Dechêne. She has been working on this issue for a number of years. I think this is the third iteration of this work. We welcome her and her testimony here today.
We thank you for joining us here, Senator Miville-Dechêne. We’ll begin with your opening remarks, for five to seven minutes or so, and then we will open the floor to questions.
[Translation]
Hon. Julie Miville-Dechêne, sponsor of the bill: Good afternoon, fellow senators. I will give my remarks in English for five minutes, and then, I will answer questions in French.
[English]
Let’s start with some troubling facts from the Children’s Commissioner of England. Pornography is no longer something that only teenagers seek out. Primary school children are coming across porn on social media, mainly on X. And it’s not just any porn. These are violent, degrading and often illegal images, writes the commissioner. According to a recent study, 70% of children and young people say they have seen porn online. More than a quarter of them saw it at the age of 11 or younger; 58% of them have seen scenes of strangulation; and 44% have seen scenes of rape. Finally, 44% agree with the statement that even if girls say no at first, they can still be persuaded to have sex.
Therefore, an entire generation is at risk because Canada has still not legislated to require online porn distributors to verify the ages of their customers.
Five years ago, when I first started working on this issue, few countries had taken action. Today, the United Kingdom, France, Europe, Germany and roughly 20 U.S. states are protecting children from the harm caused by online porn. In California, for example, Democrats and Republicans have just voted for a bipartisan age verification bill, and in an important precedent, the U.S. Supreme Court recently confirmed age verification in Texas did not violate freedom of expression. Australia, a model in this area, has decided to move forward after investing in robust pilot projects that prove that age verification and estimation can be done effectively while respecting privacy.
You have before you a bill that has been reworked and improved, taking into account criticism and consultation. I will not satisfy libertarians who are against any regulation of the internet. That’s obvious. But this bill is eagerly awaited by the 77% of Canadians who, according to a 2024 Leger poll, want age verification to protect their children from exposure to porn, not to mention pediatricians and the Canadian Centre for Child Protection.
Specifically, here are the significant changes made to Bill S-209 compared to its predecessor, Bill S-210, which died on the Order Paper.
I have strengthened the wording of the principles ensuring privacy and the effectiveness of age verification. However, as before, the specific methods authorized will be determined in the regulatory phase, as technology evolves. Age estimation has been added to age verification as an acceptable method, as technology improves. Age estimation is more accurate than before and does not require any proof of identity.
The definition at the core of Bill S-209 has been clarified. The term “sexually explicit material,” which comes from the Criminal Code and is considered ambiguous by some, has been dropped. Instead, we have chosen the more common term “pornographic material,” which is defined as follows, drawing on elements from the Criminal Code: “the dominant characteristic of which is the depiction, for a sexual purpose, of a person’s genital organs or anal region or, if the person is female, her breasts.”
Two clauses were also added: clause 6, which ensures that intermediaries, such as internet service providers and search engines, are not covered by the law because their transmission of pornographic material is incidental and not deliberate; and clause 12, ensuring that the government will have the necessary flexibility to decide on the scope of the law. It could, or not, follow the example of the U.K. and subject pornographic websites and content on social media to age verification. But if we really want to protect children, let’s remember that 45% of minors say they have seen porn on X versus 35% on porn sites.
Society has long protected and continues to protect minors from harmful products such as alcohol, cigarettes, cannabis and online gambling. It is difficult to understand how we have allowed children unrestricted access to harmful online porn for the past 20 years. Even though pornographic platforms have the technological means to restrict minors, they have failed to do so. It is high time to legislate to achieve a better balance between protecting millions of children and protecting the individual right to freedom of expression of adults who enjoy pornography. I believe that my bill is a start to a solution.
I will now take your questions.
The Chair: Thank you, Senator Miville-Dechêne. We will now turn to questions.
Senator Batters: Thank you very much, Senator Miville-Dechêne, for that.
To start out with, I, and I am sure many other senators that are here today and maybe are watching, have received a lot of emails from Canadians indicating that they are opposed to your bill, and some of them quite strongly. I wanted to give you a chance to address some of those concerns head-on right here at the start of this committee’s process on the study of this bill.
I have been a member of the Legal Committee for quite some time, so I was a member near the end of the study that we had last time. As a result of that, I know that you significantly tightened up the language that you had from a previous iteration of the bill, and I believe you worked on that with Senator Dalphond — a distinguished lawyer and judge — to handle some of those different concerns.
Canadians have been writing to us, expressing concerns about over-broad definitions, censorship by over-compliance and privacy risks about the age verification that you are purporting to use here. Perhaps you could address some of those concerns and tell these Canadians why they shouldn’t be afraid of having this type of legislation go forward.
[Translation]
Senator Miville-Dechêne: On July 25, 2025, Great Britain introduced age verification legislation, and no major crisis ensued.
There were hiccups in the implementation, of course, but many countries have implemented similar legislation since without any massive data breaches. There were no such scandals.
I’ll talk more specifically about the improvements I made to the bill.
The first is the change in terminology at the core of the bill. The term I used before was “sexually explicit material.” It came from the Criminal Code and, according to the Supreme Court in Sharpe, meant that what was being referred to here were up-close sex scenes for a sexual purpose. It seemed like the right term to us, and a number of experts agreed. In fact, when the average person sees the term “sexually explicit material,” they don’t necessarily think of pornography.
We thought about it and decided that, while this is legal language, of course, it’s also language that needs to be understood.
What we did, then, was create a new term, “pornographic material.” It wasn’t in the Criminal Code, but we took elements from the Criminal Code.
Lastly, we took a lengthy definition of three paragraphs and boiled it down to something quite simple. The wording is as follows:
…any…representation, the dominant characteristic of which is the depiction, for a sexual purpose, of a person’s genital organs or anal region or, if the person is female, her breasts…
In this case, the important part is “for a sexual purpose,” which applies to the entire definition. We tightened up the definition, if you will. Also, keep in mind subclause 7(2), the provision excluding all pornographic material that has a legitimate purpose related to the arts, medicine or research. It provides a very strong legitimate purpose defence.
When someone says that the bill could result in a Netflix series being censored, for example, they are wrong, and the reason is simply that it’s considered art. Similarly, the expression “for a sexual purpose” cannot apply in reference to a scene.
I think we’ve reassured some people by using simpler language. That’s the first change.
You also brought up privacy. That’s very important. We received the most criticism on that point. Like you, I have gotten emails from concerned Canadians.
We strengthened the definition that had already been strengthened. Don’t forget that when I talk about what is in the bill, I am talking about principles. The details of verification and fundamental choices regarding the age verification methods will be set out in the regulations.
Look at the language being used. Originally, we said the Governor-in-Council had to “consider” whether the method was highly effective, but now, we’ve changed “consider” to “ensure.” In French, the language changed from “examiner” to “vérifier.” We are making sure that the method is not only reliable, which was the previous wording, but also highly effective. As you can see, we raised the degree of effectiveness required.
Now, the bill says that the age verification is to be carried out, not by the porn site or the platform X, but by a third party, a company with no relationship to the porn site.
Between us, it’s called a “double anonymity” system, meaning that the website hires what are known as age verification providers. They use method X or Y to carry out the age verification, but the customer gets only a token that should allow them to return to the website without having to identify themselves. The only information that’s known is that the customer must be 18 or older.
Hiring an accredited third party to carry out age verification ensures that the porn sites are not the ones doing it, since they may want to have that customer information. In that sense, I think this should give people confidence that everything will be done by the book. Keep in mind that we have privacy laws in Canada. We want them to be stronger, of course, but we live under the rule of law. Yes, this kind of thing always gives rise to concerns, but remember what we are trying to balance here: the health and well-being of millions of children with people’s ability to access a porn site immediately instead of having to wait 40 seconds for their age to be verified.
[English]
Senator Batters: Thank you. That is very reassuring. I appreciate that.
[Translation]
Senator Saint-Germain: Senator Miville-Dechêne, first and foremost, I want to recognize the leadership you’ve shown on this issue. I also want to say how open-minded and determined you’ve been, particularly regarding the amendments following the initial consultations.
First, my main question, like my supplementary question — which I will ask at the same time — has to do with the adverse effects on those under 18 of viewing pornography, which you refer to in the preamble of the bill.
It says this:
Whereas the consumption of pornographic material by young persons is associated with a range of serious harms, including the development of pornography addiction, the reinforcement of gender stereotypes and the development of attitudes favourable to harassment and violence — including sexual harassment and sexual violence — particularly against women;
My question is about the connection to those two things specifically. I’d like to know what literature or scientific research led you to make the connection between the consumption of pornography and violence against women. Second, pornography addiction has not been medically recognized in the DSM-5 or legally recognized in Canadian law. As far as you know, have any public health agencies identified that effect of pornography consumption as a serious public health issue among young people?
Senator Miville-Dechêne: I will tell you that Quebec and Canadian pediatric societies are supportive of the bill, but the question you’re asking is much more specific. This isn’t in your notes, but I’ve always said how difficult it is to conduct research on the effects of pornography. You can’t sit children down to watch pornography and measure what’s happening. That would be completely unethical, so there is no cause and effect research being done on that. The research involves correlations and focuses on groups of young people who have been exposed to pornography.
You’re right about the addiction aspect. Currently, what we know about the addictive side of pornography is anecdotal. It’s a fair bit, but the impact has not yet been documented in a formal way; things aren’t that far along.
That said, when it comes to violence, I can tell you that the normalization of violent behaviour is probably what is most serious. This means that when children see images of sexual violence, it becomes normal for them, it becomes the reality. They don’t understand that they are watching a performance. It’s not the same as when an adult watches pornography; they know that what they are watching is a performance. Children’s ability to discern what is happening is not as developed.
In addition, according to the Canadian Centre for Child Protection — a very well-known organization — this normalizing of harmful sexual behaviours makes children more likely to fall victim to predators because engaging in these acts is normal, in their eyes.
I’ll get back to you with different names and studies, because unfortunately, I don’t know them all off the top of my head. Mainly, I looked at aggregates of studies, and they led me to those findings, including, as you mentioned, the difficulty children have building healthy relationships. We are worried about a lot of things. To suggest that the research is conclusive and that no new information will emerge is false, but we know enough to exercise caution. There is a reason several countries have done something about this. The issue is real.
Senator Saint-Germain: Thank you very much.
[English]
Senator Simons: Sixty-one years ago, the United States Supreme Court was looking at a question about whether something was hard-core pornography. Justice Potter Stewart famously said that he couldn’t define hard-core pornography, but, he said, “ . . . I know it when I see it . . . .” This challenge of trying to figure out how to define pornography is not new to this bill or new to this continent. I really appreciate that you have backed away from one definition that was problematic. However, I’m not sure that this one doesn’t just open a whole new set of problems because it’s very focused on body parts — what my cousin Leslie used to call “the naughty bits” — but not on the sexual activity. You’ve described hard-core pornography in which there are scenes of strangulation and scenes of rape. I could imagine that, in some of those scenes, you might not see “the bits.” I’m wondering if this definition isn’t in its own way as confusing because it doesn’t make a direct correlation to a depiction of a sexual act.
[Translation]
Senator Miville-Dechêne: I recognize that you really have to dig to get to the truth. The truth is hard to find in this area.
The definition we’ve used partly reflects the definition that was in the Criminal Code. I don’t think a Senate private member’s bill needs to start from scratch and reinvent everything, since there is a certain amount of jurisprudence related to the Criminal Code.
For that reason I, personally, am of the view that referring to pornographic material makes things relatively clear. This brings to mind something the expert Ms. Bénédicte, who will be appearing a bit later, says. According to her, pornographers and those who are for the status quo have always argued that pornography is impossible to define. I will tell you that’s one of the arguments against passing legislation.
I’m not putting that on you, but I think it’s pretty easy to determine what counts as pornographic material. There are courts, but I also think it’s somewhat misleading to say that everything on Netflix will be censored because it contains a few nude scenes. It’s not about whether the content has nudity; it’s about whether it’s for a sexual purpose, the idea being to titillate. That’s when it becomes pornography.
[English]
Senator Simons: At what point does it become art?
[Translation]
Senator Miville-Dechêne: Precisely, but in Netflix’s case, when it comes to a show about dungeons, princes, acts….
Senator Simons: Yes, shows like “Game of Thrones” and “Bridgerton” are full of sex.
Senator Miville-Dechêne: Precisely. In this case, the main purpose of the show is not pornography. To be considered pornography, it has to have a pornographic story. In this case, yes, they show breasts and buttocks, but it’s not pornography. It’s primarily art. Generally speaking, a movie is considered to be art.
In that sense, it’s excluded from the bill. It’s important not to downplay subclause 7(2).
[English]
Senator Simons: The problem is that it’s one thing for a court to decide, but this bill would require people or, what I’m afraid of, bots and algorithms to decide what is pornography and what is not. That is what makes this so difficult, because I fear what will happen is that a lot of platforms and producers will err or set up their computers to err on the side of caution. We’ve seen situations, for example, where Facebook banned pictures of women breastfeeding because the bot is stupid. It wasn’t a person who decided. It was an algorithm that decided.
[Translation]
Senator Miville-Dechêne: There will be time to fix things. The legislation won’t be perfect the day it comes into force. In Great Britain, for example, in the first few minutes after age verification began, a site about the war in Gaza was blocked. Obviously, the platforms will be expected to do their jobs properly and make sure their algorithms block only what should be blocked.
Since the beginning, the platforms have taken no responsibility for making pornography widely available. As far as they were concerned, making sure that children didn’t have access to it was not their problem. They can be asked to take a bit of responsibility.
It’s not just the platforms. Today, children are exposed to pornography mostly on social media, especially on X. The statistics I gave you show that 45% of children are introduced to porn on X. Platforms need to take responsibility and identify where the pornography is. It’s actually already happening. Some pages with pornographic content already have pop-ups asking users whether they are 18 years of age or older or warning users that they are on a page displaying pornographic content. Platforms have the ability to see where the pornographic pages are.
[English]
Senator Dhillon: Thank you, senator, for your work on this. I think it’s important work, and you have put a lot behind this already.
I’ve also been the recipient of many of those emails that Senator Batters spoke of, and I believe that the questions around the definition of pornography and those pieces are areas where we can learn from other countries that have acted effectively on this and take advantage of some of their best practices. To that end, with respect to the age verification and age estimation methods of regulation — and this is base-defined criteria — I wonder about other countries like Australia that have adopted these different models in their industry codes and regulatory oversight that has also extended to social media platforms. To that end, what lessons have we learned from their experiences that we can adopt as we move forward with this?
[Translation]
Senator Miville-Dechêne: In the case of Australia, it took a number of years before the decision was made to proceed with age verification. It still isn’t in place, but it will be in a few months. The Australians invested $6.5 million in pilot projects to figure out which methods worked and which ones didn’t. Now they know privacy can be protected using multiple methods. The systems aren’t infallible, of course.
I’ll explain how age estimation works. The beauty of it — and it didn’t exist when I started working on my bill — is that it uses artificial intelligence to estimate a person’s age or identify their approximate age. I say approximate because, for adults 18 years of age or older, the estimate is very accurate. The problem happens when users are between the ages of 14 and 18, because it’s much harder to know their age. Normally, that would be the first method used because it’s the least invasive.
Then, if it doesn’t work, a second, slightly more complex method would come into play. The country I’ve studied most, Great Britain, has a number of other methods. It’s important to have a number of options so that customers can choose. One option is the open banking system, which asks the user’s bank for information about whether the user is over the age of 18. Photo-ID matching is another option, whereby a piece of photo ID and an image of the person in front of the screen are compared. In that case, you want a moving image. Mobile network operators can also carry out age checks; in some cases, the application of age filters indicates that the person is a minor. If no age filters have been applied, the person is an adult. Credit card age checks are another option. A number of methods are available, but there isn’t one that’s perfect, unfortunately.
Germany began implementing age verification for porn sites in 2002, if I remember correctly. In Germany, they comply. National porn sites comply. There’s never been a data leak. Germany has 80 operators authorized to carry out age verification. I realize that it’s a sensitive issue and that people have concerns, because it involves pornography, but just like other online transactions, they have to meet fairly robust privacy requirements.
Senator Dhillon: Thank you.
[English]
Senator Pate: Thank you, Senator Miville-Dechêne. I remember well your first iteration of this bill being introduced right around the same time as there was a massive outcry to address the issues at Pornhub that were revealed.
Senator Miville-Dechêne: The images of sexual exploitation of kids.
Senator Pate: Right. The question that I had then and still have — and it is a question you will be anticipating, I’m sure — is, why criminal law? In that case, it became very clear that all of the resources would go to the criminal law as a blunt instrument. All of the resources would go to preventing people, or the bar would be so high that it would be hard to identify exactly who had made what decision within a corporation like Pornhub. Why not look at things like beneficial ownership and numbered companies, some of the ways in which companies like Pornhub and others have hidden behind the corporate veil and managed to continue on with these behaviours even after there has been condemnation by, at that point, the government of the day?
[Translation]
Senator Miville-Dechêne: That’s a very good question, one that nearly derailed the bill in the Senate because many were questioning the use of the Criminal Code.
We commissioned Professor Catherine Mathieu and lawyer Audrey Boctor to do a study on whether using the Criminal Code in this bill was warranted. Legal arguments aside, the harm caused by pornography certainly fits under the various Criminal Code provisions aimed at protecting children. That means serious harm to the public health of children. The conclusion of the authors was that Bill S-210 at the time satisfied all the conditions to come within the federal criminal law power.
It was also consistent with other legislative measures deemed to fall under criminal law. Clearly, it was felt that using the Criminal Code was appropriate because the purpose was to protect children from harm.
That said, the Criminal Code has its limits, because it will no doubt be impossible to go after a company based outside Canada. That’s why my bill contains two possible avenues. There’s the Criminal Code route, where the offence under the Criminal Code seeks to ensure that children are prevented from having access to this material, but there is also the civil route. It very much lines up with how the process works in the countries that have taken action.
The idea is this: When a pornography or other company is found not to check users’ ages, the Governor-in-Council — it could also be public servants, since the process has yet to be decided — files a complaint with the Federal Court and the Federal Court decides whether the facts of the complaint are correct. At that point, if the porn website or social media platform does not comply, the ultimate punishment is to block access to the non-compliant company’s site. The internet service provider is asked to block access to the non-compliant company’s site. That is the ultimate punishment that Great Britain chose and is also what France currently uses to threaten sites that don’t want to comply. Clearly, it’s the ultimate punishment. The sites lose all their customers.
Therefore, the more countries go this route, the more porn sites will have to comply. Look at what’s happening in Great Britain. Major sites like Pornhub have opted to comply with the legislation, something that is unprecedented. In Germany, some sites have been fighting compliance for a decade or so. In Great Britain, however, they have chosen to comply and to use better standards. That is the reason for using the Criminal Code, but I understand your reluctance regarding the criminal route.
What we are talking about here are pornographic platforms with tremendous power, super rich companies with the ability to call the shots. The idea is also to send the message that this is no longer acceptable, similar to how, in normal society, children can’t walk into a porn theatre or porn shop. It’s the same thing. We want to establish that online, to extend those values to the internet in order to protect children because they don’t have what they need to analyze what they are looking at and see pornography for what it is: unhealthy sexual relations in many cases. It’s not all depictions of strangulation and the like, but it does contain a lot of things that are very concerning for women.
An unnamed university in the U.S. did a study revealing that nearly 70% of the 5,000 young people surveyed had already tried strangulation with a sexual partner. In 40% of the cases, the strangulation occurred between the ages of 14 and 17. There is no question — and I’ve said this often — that the total normalization of pornography we see today is affecting the relationships between men and women.
[English]
Senator K. Wells: My question is focused on learning more about your consultation process and developing the different iterations of this bill, particularly the one before us today.
I’m curious if you have consulted with representatives of the 2SLGBTQI+ community about any unintended consequences — for example, some of the problems we’ve seen in previous decades with the Little Sisters Book and Art Emporium case that went before the Supreme Court of Canada and how Canada Customs was interpreting a definition of the word “explicit” to censor and target materials focused on the 2SLGBTQI+ community. That’s the first part of the consultation I’m interested in. The second part, in a similar thread, is that I’m wondering if you’ve consulted with youth organizations about this bill and what the response has been from them.
[Translation]
Senator Miville-Dechêne: I can get you the names of the various young people and members of the LGBT community I spoke with. It wasn’t a formal consultation process. This is a Senate bill, so we didn’t necessarily have the budget to undertake something like that. I will say this, though: You may have talked to young people who were totally against this, as happens everywhere, but it was the opposite for me. People are divided on a bill like this. It’s obviously a controversial issue.
On the LGBT matter, you may have noticed at the last committee meeting that I was asked a number of questions suggesting that LGBT people were afraid of this bill. They’re afraid of being outed when they’d prefer to keep the information private. I’m pretty close to the LGBT community in Quebec. As you may know, I was the Quebec government’s envoy for human rights and freedoms. Generally speaking, there weren’t any real challenges with that. I think most of the LGBT people I spoke with understood the need to protect children from pornography. The biggest question was whether the methods would ensure that people’s privacy was respected. If an LGBT person submits to age verification, will their privacy be respected? That is the issue that needs to be examined.
Yes, the concerns are normal, and there are concerns from all sides but I don’t think I have a simple answer for you other than to keep saying that safeguards will be put in place in that area, as in others. This isn’t the United States either. I get quite a few emails saying that LGBT people in the U.S. are afraid and fear losing their jobs. This whole age verification issue adds to that. This is a country where people have rights, and I think that if age verification is implemented properly, there won’t be any difference fundamentally between heterosexual or LGBT people, men or women. Everyone will have to submit to age verification in order to protect children.
Senator Oudar: Thank you, Senator Miville-Dechêne, for bringing this bill forward, and for the patience and passion you have shown, which many of us share. I have questions about the application of the bill. I did my homework — thank you, we received a lot of material. Emails and documents are still coming in as we speak.
I looked to the United Nations, and in 2021, the Committee on the Rights of the Child began recommending that signatory countries implement robust age verification systems in order to prevent children from accessing pornographic material. More recently, I looked at what Europe was doing. Why? Because they’re concerned about not only maintaining privacy, a fundamental right, but also balancing what are now called individuals’ digital rights.
Senator Miville-Dechêne: That’s correct.
Senator Oudar: This is my first question: How can we make sure the right to privacy is protected?
My second question has to do with the technology Europe was looking into this summer. They’re practising on a prototype developed by the European Union. I’d like you to talk about that technology.
Europe has built a significant legislative tool kit. They’ve already started testing technology that safeguards privacy while preserving individuals’ digital rights.
I’d like you to talk to us about that protection of privacy rights as well as the technology Europe is currently pursuing. It’s being applied in real life, and we could eventually benefit from that expertise. As you rightly mentioned, France, Denmark, Greece, Italy and Spain have all brought in legislation like the bill before us.
Senator Miville-Dechêne: They’re piloting it at the moment, and some are not yet at the point of implementing age verification, but the pilot project is very promising.
The thing to understand is that Europe passed legislation requiring the biggest pornography distributors to carry out age checks on their customers. Two years later, nothing; the major platforms did nothing. Things don’t move all that quickly, so there is now an investigation under way to figure out why and how they have failed to comply with the law, which is obviously the case.
At the same time, Europe is seeking out technological tools so that all member countries are in sync. Why? Because countries like Germany have had to deal with endless litigation from platforms like Pornhub, which refused to accept that there was a law requiring the company to carry out age verification. As a result, countries are somewhat afraid of facing off against these huge platforms, which threaten lawsuit after lawsuit.
Europe indicated that it would undertake a pilot project to see whether this could work. Bear in mind that Europe is basically at the point of having digital identification. They are a bit ahead of us when it comes to identification, given that people can use their phone to prove their identity, which is much simpler than pulling out a driver’s licence.
I sent you a wonderful little chart showing how age verification works in Europe. Very quickly, I encourage you to take a look at the document: it’s an app that was developed by the EU. I imagine they had a bit of help on the app front, but the idea is for everything to happen without human intervention. Users have to upload proof of their identity in the app, at which point the app establishes the person’s age or confirms that they are over the age of 18. The app then transmits that proof of age to the porn site.
Technology isn’t my area of expertise, so that’s a very simple explanation, but that’s what the pilot project will be. I think it’s a great idea because it gives countries that want to move forward on this, Italy, Spain and others, a stronger foundation in terms of knowing whether the system works. Obviously, it costs less if it can run without human intervention. I imagine there are benefits as far as that goes. That is what Europe is working on right now.
To answer your second question, regarding privacy, I can tell you it’s something I’ve focused on. Clearly, when you champion a bill for five years and, every day, you hear people’s criticisms or concerns about user privacy being breached, you come to understand how important the issue is. That is why I amended the bill a first time, specifically to strengthen privacy principles, and then a second time.
Previously, the methods had to be effective, but now, they have to be highly effective. That gives you an idea. Every time I rework the bill, I strengthen the privacy principles, and that’s what people should look for in a bill like this.
What I’m trying to say is that it’s perfectly normal to focus on privacy in this bill. Yes, we have to protect children, but not at the expense of the privacy of other people who don’t want their private lives laid bare. There are still many taboos surrounding pornography; understandably, a lot of people don’t want their identity being associated with it.
Senator Clement: Thank you for your leadership. I still remember my first committee meeting, and this was the bill we were studying. You said, “We are a country governed by the rule of law.” It seems to me, though, that the world has changed since that first bill.
We see the president of our neighbours to the south and the decline in government trust around the world. I don’t usually agree with libertarians, but I find myself having slightly less trust in the government today, in 2025. Was that something you thought about in this new version of the bill?
In your answer to Senator Oudar, you touched on technological changes, so will this version of your bill keep pace with that constant change? How does this iteration keep pace with the ever-changing technology in the field?
Senator Miville-Dechêne: You’re right about trust: It’s in decline everywhere, and it’s really tough for any government to bring forward legislation that people support. Imagine when there is only one person pushing for the bill.
Yes, people have a lot less trust in institutions than they did before. Now, what’s the answer? Do we stop drafting bills in an effort to protect part of the population, or do we try to move forward, engage with people, listen to them and pay attention to their concerns?
Even if people no longer trust institutions, the government won’t be the one providing the age verification; it will accredit the companies providing that service, as is the case in Germany.
These are private companies that would be expected to meet certain standards. They would be the ones carrying out the age verification, quickly destroying any data that may have been collected as soon as the age verification process is complete, in accordance with the principles laid out in the bill.
There are parameters, but the matter of trust goes to the heart of the issue. To come back to the libertarians, it’s not just libertarians; trust in institutions is declining across the board. Yes, it is something I’m up against, but I figure we can’t just let a whole generation — and I’ll say it again — learn about sexuality from porn. We have to push forward because we know sexuality is at the core of relationships between men and women.
Frankly, I believe in this very much, having heard the stories of dozens of young people who struggle with relationships because of their unchecked use of pornography, boys and girls. I mentioned strangulation earlier, and that is clearly something that comes from porn, whereby pleasure is supposedly derived from strangling someone. I’m not a prude; sexuality is important. Prevention is important. Sex education is essential, and it isn’t as important everywhere as it should be.
Obviously, the new sex ed classes in Quebec, for example, allow for discussions about pornography. It can be helpful to young people once they know that pornography is not real, that it’s a performance.
Why not try to give them a little more time as kids and push back the moment they gain access to all these millions of images, which can be very hard on them? Sorry, I changed the subject.
Senator Clement: As far as changes in technology go, can this keep pace?
Senator Miville-Dechêne: It can keep pace, precisely. That is why I’ve put only the principles in the bill and why the regulations will contain the methods selected, which can always be updated.
I’ve gotten a lot of criticism for not including a method. People say, “It’s a mystery, it’s crazy, you’re not giving us the information.” However, the more information and the more specific methods you include in a bill, the harder it is to make changes.
This is what Great Britain did. It introduced a bill setting out principles, and its regulator, the Office of Communications (Ofcom), established which methods were acceptable, and there are many.
Sorry, I changed the subject.
[English]
Senator Tannas: Thank you, Senator Miville-Dechêne, for your work on this. It’s incredibly important.
You mentioned the U.K., and you mentioned Australia. I’m struck, as a senator here, by how often Australia is ahead of us on these issues and how pragmatic they are. We should be looking there for answers more often.
You’ve been at this now for five years. You’ve probably had some interaction with folks within our own Canadian government. In my time here, I’ve certainly seen the government delve into less important, less vital issues than this, in the social sphere and on the internet. Why do you think you are here alone trying to figure this out on behalf of Canadians? Why do you think Canadian governments, past and present, have not deemed this important enough to turn a wheel on it?
[Translation]
Senator Miville-Dechêne: That is an excellent question. First, to answer your question about Australia, I would say that you are absolutely right. I’ll give you an example that shows just how seriously they are taking this. A few years ago, they put a halt to all their work on age verification because they felt that the technology wasn’t far enough along to have robust enough age verification methods.
They did new testing, and now, they feel they are able to move forward with the conditions I listed for you.
Your other question is something that gave me nightmares and made me lose sleep, because, yes, it is a serious public health issue. I thought that the government, with its online safety act, which was talked about for years, was going to do something about the various harms affecting young people. Heaven knows, I said repeatedly how happy I would be to see pornography dealt with in a broader bill administered by the government. It’s interesting, because I always had the support of the Bloc Québécois, the Conservatives and the NDP most of the time, but I never had the support of the government, which asked Liberal members to vote against the bill.
The only statement by Prime Minister Trudeau on the subject was that the bill would send Canadians to the dark corners of the internet and have a terrible effect on their privacy. In short, there was no question of adding elements about age verification in Bill C-63, the online safety bill the government had promoted. They felt that the term was much too explosive. As a result, there is nothing in the bill about age verification. There is a very general phrase saying that platforms would be asked to use age-appropriate features to protect children. Age-appropriate features could mean absolutely anything. I was told, “No, Julie, it’s not a problem. The commission will decide in a few years if the methods could include age verification.” No country approached it like that, precisely because age verification is controversial. It needs to be clear in a bill, whether mine or the government’s, that that’s the direction we should take for the worst harms.
Great Britain targets not only pornography, but also aiding suicide and anything to do with suicide. That also requires age verification so that children don’t get drawn into it.
The government is very hard to decipher. For many years, I felt that they understood what I was trying to do and that it would be part of Bill C-63. When I saw it, I realized that they weren’t going in that direction. It’s hard to know why, but I think you’re absolutely right: The issue is serious enough that the government should address it. That’s what happened in all the other countries I mentioned. All of them.
[English]
The Chair: Colleagues, it is now 5:15, but three senators have second-round questions which I would like to hear asked and answered. We will impinge on the second panel for no more than 15 minutes.
Senator Miville-Dechêne: No, because we want to hear them.
The Chair: We want to hear everybody. Is there an objection to me letting the second round go to three questions?
Senator Miville-Dechêne: Maybe I should give shorter answers. I’m a little longish on my answers.
The Chair: There are four questions, so very short, please.
Senator Saint-Germain: I will be asking my question of a journalist, and you have two minutes in total.
[Translation]
The fact is that a bill’s regulations are extremely important. That’s where we see whether the government is respecting the legislator’s intent.
Your bill is centred on principles. You put a lot of trust in the government and in the organization that would potentially be set up — a private organization — to implement the act, should it be passed.
In the regulations, especially concerning user age verification, confidentiality and the role played by the private organization or authority that would be in charge of implementation, how do you think the public and its rights could be protected under the rule of law?
Senator Miville-Dechêne: I misspoke, or we misunderstood each other. The age verifiers would absolutely not be certified by a private organization. Obviously, the government and its officials would do the certifying; certification would be a government responsibility. Age-verification companies, which already have standards, would apply to be certified to operate in Canada.
I’m sorry, we misunderstood each other on this.
Senator Saint-Germain: We’ll have a chance to come back to it.
Senator Oudar: Something has been bothering me since the beginning, which is the idea of an organization or business that has commercial purposes.
Information is shared very differently now on social media. On Facebook, Instagram and TikTok, content is shared not by an organization for commercial purposes, but often by one person sharing sexually explicit or pornographic content with another person.
Senator Miville-Dechêne: My bill is aimed only at organizations. As defined in the Criminal Code, the term “organization” refers to companies, to an association of persons. From the beginning, after the first version, we chose to target only organizations. We didn’t want to target sex workers, for example, who are more vulnerable and do this as a job. Therefore, we focused on organizations and, by doing so, we will capture a lot of people.
In Great Britain, they are currently dealing with 6,000 pornography websites, if I remember correctly. There is plenty to act on at the organization level, but chat rooms and private conversations are not affected by my bill.
Senator Oudar: That’s precisely what my question is about. I’ll come back to the European example. The European Commission is conducting an investigation into Pornhub, Stripchat and XNXX, but there is also an investigation into Facebook, Instagram and TikTok.
Senator Miville-Dechêne: They’re going further than I am.
Senator Oudar: They are. Sexually explicit, pornographic content and material is shared that young people are exposed to. Isn’t your bill missing the main target, where children are exposed to sexually explicit, pornographic material through these platforms?
Senator Miville-Dechêne: The problem will never be solved 100%. Some young people will continue to look at pornographic material. It was a deliberate choice not to go into private conversations, because that brings into play privacy principles, as opposed to websites that publicly spew out pornography to children. We know that many children watch pornography on certain websites and on social media. It’s not an exercise in futility. I’m trying to target my efforts as much as possible so that the bill makes sense. It’s a signal, and we can expect it to have an effect. However, it won’t be immediate, and not all children will be protected. There is also the matter of VPNs, which we haven’t talked about here, that are used to hide the websites visited. Europe is more ambitious, which makes me very happy. As a senator, I can present a targeted bill. That’s how I can move forward. I can’t target everything.
Senator Oudar: Are you afraid that the organizations we were talking about could indirectly distribute pornographic material to young people through platforms like TikTok and Instagram, thereby doing what they would be prohibited from doing directly?
Senator Miville-Dechêne: There is nothing in the regulations of the bill that prohibits expanding beyond pornographic websites and trying to reach social media that also distribute pornography. As I said, X is a porn distributor on many of its pages. In clause 12, I leave room for the government, during the regulatory phase, to decide on the exact scope of the bill. I think it’s a question of modesty. I am not the only one who should decide on the scope of such a bill, because it’s a hard decision. I will leave it for the regulatory phase. Thank you for the question.
Senator Oudar: Thank you.
[English]
Senator Batters: With respect to that last point, I note that it continues to be a criminal offence in many of those types of situations where individuals are distributing — let’s not call it “child porn”; let’s call it “child sexual abuse and exploitation material,” in accordance with the act we passed in the last Parliament. I sponsored it in the Senate, and it was introduced by MPs Arnold and Caputo. A lot of those kinds of things, where it is appropriate and is actually a crime, could be dealt with under the Criminal Code as it exists right now.
My comment to you was just specifying exactly what your defence section says regarding the legitimate purpose. It says in clause 7(2):
No organization shall be convicted of an offence under section 5 —
— which is the part about making this pornographic material available to a young person —
— if the act that is alleged to constitute the offence has a legitimate purpose related to science, medicine, education or the arts.
That is a defence to a criminal charge that would be laid under this.
Senator Miville-Dechêne: Absolutely.
[Translation]
I think it’s a strong defence, because we don’t want this to be a censorship bill. By making the defence broad, we still allow people to show the human body as it is in all kinds of situations. An art piece could resemble pornography, but it remains first and foremost a work of art. I think this defence is important. It shows that it is not a censorship bill, but rather one that aims to protect children. Art featuring nudity is not at all the same thing as pornography. Thank you.
[English]
The Chair: Thank you, Senator Miville-Dechêne, for your testimony here today, and for coming here and explaining this bill to us all.
For our second panel, we are pleased to welcome officials from the Department of Canadian Heritage: Amy Awad, Director General, Digital and Creative Marketplace Frameworks Branch, Cultural Affairs Sector; and Charlene Budnisky, Senior Director, Communication, Legislative and Regulatory Policy. Thank you very much, witnesses, for coming today and joining us.
I will give you three to five minutes to make your opening comments, and then we will move quickly to questions. Thank you.
Amy Awad, Director General, Digital and Creative Marketplace Frameworks Branch, Cultural Affairs Sector, Canadian Heritage: Good afternoon, chair and members of the committee, and thank you for the opportunity to appear before you today on behalf of the Department of Canadian Heritage to discuss Bill S-209, the Protecting Young Persons from Exposure to Pornography Act.
Thank you to Senator Miville-Dechêne for her persistence in seeking to address what we all acknowledge to be an important issue.
[Translation]
Canadians expect to be safe in their communities, both online and offline. However, Canada’s digital policy framework hasn’t kept pace with the rapid evolution of online platforms, which has given rise to new harms, especially for children. Protecting children online enjoys broad consensus and remains a government priority.
[English]
Canadian Heritage supported the development of the policy behind the online harms act, which was Part 1 of Bill C-63 of the previous Parliament. In developing Bill C-63, the department heard extensively from experts, survivors, civil society and members of the public on what should be done to combat harmful content online.
[Translation]
A common theme emerged from the consultations: the vulnerability of children online and the necessity of taking proactive measures to protect them. In that spirit, the online harms act proposed a duty to protect children, requiring platforms to incorporate age-appropriate design features for users under 18. While Bill C-63 died on the Order Paper in January 2025, the government’s commitment to combatting online harms remains firm.
[English]
As this committee may recall, my colleague Owen Ripley appeared before the House of Commons Standing Committee on Public Safety and National Security in 2024 regarding Bill S-210, the earlier version of the current bill. He raised several concerns with that bill, including its broad scope, risks to privacy, questions of technological feasibility and implementation, reliance on website blocking, and a lack of clarity around certain elements.
[Translation]
The government is currently studying Bill S-209 in order to propose the best possible solution for Canadians’ online safety. We took note of some major changes made in Bill S-209, including narrowing the definition from “sexually explicit material” to “pornography,” adding age estimation to the available defences, and clarifying the interpretation of the expression “making pornographic material available on the Internet for commercial purposes.”
[English]
That said, some concerns do remain. Privacy risks remain significant and must be weighed carefully in light of recent data breaches. The use of website blocking as a means of enforcement remains controversial. Finally, the bill may benefit from further clarity as to the criminal and regulatory nature of the offence created and how the enforcement mechanism relates to that offence.
[Translation]
We also note that we have better access to international comparisons since the committee’s hearings on Bill S-210. The U.K. and Australia have each adopted different age-verification requirements; the U.K. requirements came into force in July, and the Australian ones will come into force in December. Canadian Heritage is paying close attention to the implementation of these initiatives and to the public’s reaction, in order to draw on our international partners’ successes and lessons learned.
[English]
In closing, the Department of Canadian Heritage recognizes the urgency and importance of protecting young persons from harmful content online. We are committed to working with Parliament, stakeholders and Canadians to develop solutions that balance safety, privacy and freedom of expression.
[Translation]
Thank you for your attention. I will be happy to answer your questions.
[English]
The Chair: Thank you for your opening comments. We’ll start with the deputy chair, Senator Batters.
Senator Batters: Thank you very much. I appreciate you both being here today to answer these questions on behalf of your government department, Canadian Heritage.
First of all, since you mentioned the online harms act, you were in the room listening to the last panel. As Senator Miville-Dechêne was saying, Bill C-63, the online harms bill that existed last Parliament, did not include this, so I just wanted to find out why it did not.
You also made a reference to your colleague Mr. Ripley testifying at committee about a predecessor of this bill, but at the time that your colleague testified at committee, Senator Miville-Dechêne’s bill was not in the final form that it is in now. As we talked about earlier, there were significant amendments made to it that really tightened it up. With that in mind, does that address a number of the different concerns that you had about those particular parts?
Ms. Awad: Thank you very much, Senator Batters, for the questions.
I’ll speak first to Bill C-63. As Senator Miville-Dechêne said, Bill C-63 included a duty for online platforms or social media services to protect children using age-appropriate design features. Although it didn’t take up a large section in terms of the drafting of the legislation, it was actually a fairly significant duty, and our expectation would have been that the digital safety commission, which would have been created by the bill, would have gone into a fair bit of detail regarding the obligations on different classes of platforms. Those obligations could have included age verification as well as other age assurance methods for access to pornographic material and other types of harmful content.
I think what was described earlier is that some elements of the current bill are about setting the principles and having a regulator work out the details later, because of the rate of technological change and the development of best practices in other countries. That was also the approach in Bill C-63, so there was nothing in Bill C-63 that closed the door to age verification. In fact, it was, I think, open to the regulator to choose age verification as a method of protecting children from, in this case, pornographic content.
Senator Batters: Perhaps it opened it a crack. I wouldn’t say that it opened it widely.
I find it astounding that an individual senator has provided a considerable framework, whereas the government did not. They just simply set out that they were going to set up this digital safety council or whatever they called it and the council would do it in the future. I think this is a more transparent way to deal with this type of issue rather than say, “Oh, we’ll set up this council,” and then the public wouldn’t really get a say in what they do until after it’s already in play.
What about my other question about your colleague Mr. Ripley? You referred to his testimony at committee, but that was before the considerable amendments had been made. Does that assist with the government’s potential approval of this bill?
Ms. Awad: Yes, the changes that were made to the bill in terms of clarification on a definition of pornography, the definition around commercial purpose and the incidental and not deliberate use — those things are useful and do address some of the concerns that the government had raised about the earlier version of the bill. I think they were useful changes.
If you’ll allow me, I can speak quickly on Bill C-63. Perhaps this is already well understood, but Bill C-63 was meant to cover a large spectrum of potential harms for a large group of people. Even the section on protecting children was, of course, about protecting them against any type of harm they might run into online. The amount of detail that would have been required if we had enumerated every single possible harm, including additional ones that might arise as technology and the use of the internet evolve, explains in part why there isn’t the amount of detail and precision you find in the private member’s bill here, which focuses on one specific harm. Context matters.
Senator Batters: Right. I see Senator Miville-Dechêne’s bill is seven pages. How many pages was Bill C-63?
Ms. Awad: I’m sorry?
Senator Batters: How many pages was Bill C-63? I imagine seven pages extra wouldn’t have been a huge chunk of it.
Ms. Awad: Yes. Fair enough. Thank you.
[Translation]
Senator Miville-Dechêne: I’m a little surprised by your answer. In its Online Safety Act, Great Britain very clearly specified that age verification or estimation would apply to two specific harms: pornography and incitement to suicide. It can be done. Why do I insist? This is a political choice. It should be addressed in bills. If the choice were left to a commission, there would be no debate, or as little as possible. I must say I’m having trouble grasping this. I would like to get an answer.
You also mentioned the fact that website blocking remains a problem for the government. Once again, it’s the ultimate weapon adopted by France and Great Britain. When these companies with lots of money and power decide not to obey the law, the ultimate weapon is blocking pornographic websites. In a sovereign country, how is it problematic to exercise the right to block a website that doesn’t obey the law after the Federal Court has issued a ruling? We need to act. Sovereign governments have rights.
[English]
Ms. Awad: I’ll speak to the first question first. I think you spoke a little bit to the fact that, for example, in the United Kingdom, they were more specific in their online safety bill in terms of use of age verification for certain types of content and so on and so forth. What I can say is that the process for developing Bill C-63 involved extensive consultations, including consultation with child protection organizations. It involved looking at international examples, and it also occurred over a period of time during which technology was evolving. I think you yourself made reference to the fact that, in Australia, initially there was hesitation to adopt certain types of age-assurance methods, but then there was an openness that came later. That is part of the context that might explain how explicit or not explicit the language was, for example, in the online harms act. I think that responds in part to the context, but in terms of political choices, I can’t speak to that. But, of course, political choices were made.
[Translation]
Senator Miville-Dechêne: What about blocking?
[English]
Ms. Awad: On the question of website blocking, I think there are a couple of different elements to it. Part of it is the technical feasibility, the mechanics and the effectiveness of it as a means of enforcement.
A website-blocking process under the bill requires the Federal Court to issue an order to essentially every internet service provider in Canada. Every order issued is an administrative burden on the internet service providers, and sometimes that can be for multiple sites, because the worst actors, in particular, might keep changing their IP addresses. So there is a consideration around the systemic burden this type of system creates, though that may be secondary given the seriousness of the subject matter we’re talking about. There are also risks around, for example, a small portion of a site being non-compliant and the order then blocking the entire site, or blocking it for everyone. Those things can be weighed, so they’re not insurmountable, but they go to a broader approach of limiting website blocking as a tool and maintaining internet neutrality, so not having the ISPs involved in content decisions whenever possible.
To be extra clear, nothing I’ve presented here represents the government’s position; I’m simply trying to share with you some of the considerations and issues that have arisen in our initial analysis of the bill.
[Translation]
Senator Miville-Dechêne: Of course, it’s complicated, but it’s a noble goal.
[English]
Senator Simons: Thank you, Ms. Awad.
I want to pick up right where you left off. It’s one thing to have a pornography site that brands itself as a pornography site, and much though I have concerns about privacy and age estimation, I can see, hypothetically, how one could manage a pornography site. But Senator Miville-Dechêne testified that 45% of minors say they have seen pornography on X. Anybody can look at X. I don’t have an X account any more, but all the posts are public. So how on earth would we go about managing a site like X, which is a general interest site accessible to many people, if it also has porn on it? How would one regulate that?
Ms. Awad: That’s where I think the value of a broader regulatory scheme from online harms comes into play. The online harms act was proposing that you have certain designated regulated entities, large social media platforms like X, and then they have a set of obligations. They have to report to a regulator. They can face administrative monetary penalties if they’re not living up to those obligations.
In the case of X, for example, I do believe that their terms of service actually prevent that type of content, and it’s just a matter of them not policing and not doing the moderation required. So there could be repercussions for a platform like X for not implementing their terms of service, not complying with the legislation. They would come in the form of administrative monetary penalties, in the same way as if they allowed other content that was prohibited in a broader regime.
When you don't have a designated, specialized regulator, you don't have that infrastructure in place, and it becomes more complicated. I think that's what you're touching on.
Senator Simons: Senator Miville-Dechêne is my very dear friend, and this is not a personal attack on her because we're friends. The bill leaves many things to be settled by regulation or by the Governor-in-Council. I worry that, were this bill to pass, there would be many, many unanswered questions about how the age verification process would work. Age estimation makes me uncomfortable because machines are stupid. If somebody is not White, a machine may estimate their age incorrectly. If someone is trans, the machine's estimate may not match their actual age. I worry about the protection of that information and the retention of that information. My colleague, Senator Wells, asked about the concerns of the LGBTQ community. The consequences of someone finding out that you have looked at gay porn could be catastrophic in a way that people finding out that you looked at regular old porn might not be. I worry that by leaving all of this to regulation, we have a black box where we don't know what the bill would look like at the end of the process.
Ms. Awad: These are important considerations. The impacts of technological error, for example, and the trade-off between accuracy and the amount of data that is retained (that balance between accuracy and privacy) are considerations in developing and scoping these types of regimes.
In terms of the reliance on regulation, there are probably two parts to it. First, the regulatory processes themselves should be open and transparent, as the legislative process is. In developing regulations, there should be a consultation process, and people, especially implicated stakeholders, should have an opportunity to feed in, through publications in the Canada Gazette and so forth. The regulatory process should also be one that is open and allows for technical input from the different stakeholders, as this current process does. And, of course, there are always going to be limitations.
It’s a balance. You know better than I do, but it’s a balance between the flexibility of having something adapt more quickly, to have a very balanced and open consultation, with being able to be extra clear at the time the legislation passes. Senator Miville-Dechêne has chosen a certain balance, and in the form of Bill C-63 there was also a balance. I’m not sure that Senator Miville-Dechêne’s version has left that much more to regulation than what the government had proposed in Bill C-63.
Senator Simons: Thank you very much.
Senator Saint-Germain: Thank you for being here.
My question is about enforcement and the institutional capacity to enforce the law. I see much legislative vagueness in this bill. From your standpoint, if it is passed, who exactly will enforce the law? The CRTC has testified that it does not have the mandate or the tools. If a new enforcement body is needed, which one could it be? How would it work? And how would it be created or funded, given that Senate public bills cannot appropriate public money?
Ms. Awad: Those are really good questions, and they’re some of the same questions we had when analyzing the bill. I had touched in my opening remarks on this ambiguity around enforcement, and I can speak to that a bit.
The bill creates an offence which, I understand in part from Senator Miville-Dechêne’s comments, is meant to be, essentially, a criminal offence. Usually, a pure criminal offence will have a mens rea or kind of an intent element to it, something that a prosecution service would need to prove beyond a reasonable doubt. To the extent that it creates a criminal offence, the prosecution of the criminal offence would usually go to a prosecution service, so a federal prosecution service.
But then the bill creates a second element. It has this criminal offence, and there are reasons why there is some ambiguity in the criminal offence in terms of the way the defences are presented, because they speak to the mens rea that usually would be part of the offence itself. But if we go to the second portion, which creates a regulatory enforcement mechanism for what I understood to be a criminal offence, it allows the Governor-in-Council to designate an enforcement authority. So through the process of designating an enforcement authority, whatever authority you designated would be responsible for enforcement.
There are some challenges there because it is not clear what the regulatory obligation is. It makes reference to the criminal offence, which is an unusual way of setting things up but probably something that could be clarified. The regulatory authority itself could be an existing regulator like the CRTC or the Privacy Commissioner, or it could be some department or branch of government, but there would be additional cost associated with it. We tried to estimate the number of potential complaints that would be received and the types of investigations that could then lead to the notices, and the work would not be insubstantial. It would be nothing as large as what was required for Bill C-63, given how broad that bill was, but this would definitely require a subset of that in order to be properly enforced. Otherwise, it would just exist on paper and there would be no enforcement action. I think the concerns you raise about the lack of a Royal Recommendation or funding authorities are valid.
Senator Saint-Germain: Thank you.
Senator Dhillon: I’m a bit confused. We’ve had this bill now for some time, and it’s been discussed and debated for some time. I’m curious, Ms. Awad, in that there are other countries that have applied a similar type of legislation. What are their best practices? What have they done with regulators? What have they done when it comes to enforcement of those bills, and what are the results of that? Are they effective? Are they working? Can we take advantage of some of those experiences?
Ms. Awad: What I can say is that we are following closely what other countries are doing. Senator Miville-Dechêne spoke about the United Kingdom and its Online Safety Act, so the age assurance elements of that. Age estimation and age verification came into force in July, so it has been a couple of months, though maybe not enough to do a full assessment. There were, let's say, some hiccups at the beginning, but I think that would be normal in any process. In Australia, which is another important comparator for us, the age verification portion of their regime will be coming into effect at the end of the year. A number of states in the United States have these types of regimes in place, but they may raise concerns that are a little different from ours.
All this to say that there is an emerging body of practices in this area that could be used to model a regime. If we look at the European, Australian and U.K. examples, these regimes usually fit within broader online safety regimes, and that is what happened in the U.K. They had a broader Online Safety Act with some elements that dealt with this issue, which meant that they had the regulatory infrastructure and expertise. In that case, it was Ofcom, their broadcasting, telecommunications and communications regulator, that essentially created a new branch of the organization focused on online safety, brought in the expertise needed to deal with children's safety issues and ran consultations on that topic. In Australia, they have an eSafety Commissioner dedicated to this particular topic. Having a specialized regulator seems to be a good international best practice; it brings efficiency and expertise and balances the unique interests in this space, which are a bit different from those of other communications regulators, privacy regulators or other existing bodies.
[Translation]
Senator Oudar: I just want to specify something about Bill C-63 and the one we’re studying today.
I went back and read the text of Bill C-63. It’s not quite the same sphere. Bill C-63 concerns online harms and aims to counter hate, specifically the online sexual exploitation of children. There is only a limited comparison to be made.
I wanted to specify that because in the earlier discussions, I was a bit confused. People listening to us and watching our discussions will see that the field of application is not at all the same.
Child pornography, sextortion and cyberbullying of children should be banned. The bill doesn’t target the same goal.
My question is about Canada’s obligations under international law. I’d like to hear from you about the fact that we are a party to international treaties. Earlier, I mentioned the United Nations Committee on the Rights of the Child. Since 2021, it has advocated an age-verification system that should be applied to restrict access by minors to pornographic websites, which is the very purpose of Senator Miville-Dechêne’s bill. Some authors even make a connection to our obligations pursuant to the 2030 Agenda for Sustainable Development, which Canada has been a party to since 2015.
That doesn’t seem far off to us, and we need to be ready. There are several objectives to the sustainable development obligations. The third objective of all our principles is good health and well-being; the fourth is quality education; and the fifth is gender equality. That calls on Canada to step up and pass legislation like the bill we’re studying today, precisely to protect children. That also shows that we need to restrict access to pornography by minors.
It cannot be overstated that pornography should never be the main tool for children’s sex education.
I’d like to hear from you about Canada’s obligations under international treaties. Correct me if I’m wrong, but I don’t think that Bill C-63 meets the objectives of international treaties, or at least not enough. Do you think the bill we’re currently studying better meets the international law requirements that are incumbent upon us as a country?
Ms. Awad: Thank you for the question, senator.
I’d like to start with the first point. I know it wasn’t the entire question, but it’s important.
In Bill C-63, three duties were created for the platforms. The first was a duty to act responsibly in general, related to the type of content you were referring to: sextortion, child pornography and so on.
There was also a duty to protect children, but it wasn’t directly related to content such as sextortion; it was more general. The duty was to set up measures to protect children on the platforms. That left regulation to a digital safety commission, which could impose duties such as age verification for pornographic content and other measures to protect children online, including avoiding addiction and other dangers they could face.
The third duty in the bill, aimed at protecting children, I think directly responded to what Senator Miville-Dechêne wants to address in this bill, but more generally and not as specifically as what she is presenting here. Right now, there is no Bill C-63, but that is the background.
I have to say that I can’t go into details about international obligations. I know that they were taken into account in the development of Bill C-63. We looked at our obligations under the international regimes related to children’s rights, and some elements of the bill try to meet these obligations. Unfortunately, I can’t give you any further details.
Senator Oudar: Do you have legal opinions indicating that it would respect the legal obligations we are subject to?
Ms. Awad: During the development process for government bills, there are reviews and evaluations that ensure that the Constitution, the Charter of Rights and our international obligations are respected. This type of review would have been done during the bill’s development process.
Senator Oudar: Thank you.
[English]
Senator Batters: I just want to go back to something coming from Senator Saint-Germain’s questions. There was a reference in that answer to the mens rea, the intent, but, of course, not all criminal offences have a mens rea or intent requirement. Some criminal offences, called strict liability offences, require only proof of actus reus, proof of the action, and not mens rea.
I also note that here, this bill applies to organizations, and it says in the interpretation section of this bill that “organization” has the same meaning as in section 2 of the Criminal Code. I just looked that up, and here is section 2 of the Criminal Code:
organization means
(a) a public body, body corporate, society, company, firm, partnership, trade union or municipality, or
(b) an association of persons that
(i) is created for a common purpose,
(ii) has an operational structure, and
(iii) holds itself out to the public as an association of persons;
Given that, it would probably also be a little unusual for that type of entity. Although it can occasionally be done, it is not the easiest thing to prove the intent of an organization.
Would you acknowledge, first, that there are strict liability offences that don’t have an intent requirement, and also that this applies to organizations, and that’s the definition?
Ms. Awad: The bill is clear. I think the reason there was ambiguity — and I wasn’t suggesting that this is fatal, just where there is an ambiguity — is that if you read in the defences section, it says, under age verification, so 7(1):
It is not a defence to a charge under section 5 that the organization believed that the young person referred to in that section —
This provision is entitled a defence, but it is actually a non-defence. It addresses what is not a defence, and it implies that there would be a defence based on the organization's belief about the young person's age. That language of belief, in our interpretation, suggests that there may have been an intention to include a mens rea, but it is not that clear when you look at the defence. This is where there is perhaps some ambiguity that could be clarified. I wasn't suggesting that this should have a mens rea element. In fact, I agree with you that a regulatory offence, essentially strict liability, is probably more appropriate for the purpose here.
Senator Batters: It seems to me that that’s what this is.
Ms. Awad: Perhaps, except there is this element to the belief of the organization as a potential defence. Some clarity, perhaps, could be useful here.
[Translation]
Senator Miville-Dechêne: I’m not a legal expert, but it seems to me relatively easy to see whether an organization conducts age verification. It’s not hard to prove, but I have trouble following you when you said that the criminal offence moves to the second part of the bill, where it’s clear that this is a rather bureaucratic aspect. If it becomes clear that an organization has not conducted age verification, another procedure applies, which is the Federal Court step. If the evidence is there, the website is blocked. You said that it wasn’t clear and it was complicated. However, it seems to me that it’s relatively clear.
[English]
Ms. Awad: There are potentially two authorities involved in this bill.
There is a prosecution service that would have to assess whether an organization, for example, has done what is in clause 5, and then also whether there is a defence that could be brought. This would be in a criminal context. In the criminal context, it is a potential defence.
Separately, there is another regulatory organization that, starting from clause 8, is going to try to determine, first of all, whether there are reasonable grounds to believe that the offence in clause 5 happened. It is not so clear whether it needs to look at the defences or not, because usually a defence would only come afterward. So it tries to determine if there are reasonable grounds; it is a different entity looking at essentially the same activities. Then, if it thinks so, it can issue a notice, and the notice must state the steps that the enforcement authority considers necessary to comply with the act. Looking at that, we asked: if the only steps an organization could take to comply with the act are to stop offering pornography or to implement age verification, it is not so clear what other elements could have been put in place here. Was this designed, for example, to allow the regulator to be more specific?
I spoke already about the fact that committing the offence, it is not so clear whether that includes consideration of the defence or not. Those are elements that could be clarified.
Senator Miville-Dechêne: But also I think you mentioned or somebody mentioned that on X, it was not permitted.
[Translation]
Posting pornography was not permitted. Quite the contrary, according to their policy:
[English]
The social media platform X says it will now formally allow people to show consensual adult content as long as it is clearly labelled as such. They have a page saying “sensitive content,” but actually, that is where the kids watch porn. So it does exist.
Ms. Awad: Fair enough.
Senator Miville-Dechêne: X knows.
Ms. Awad: Fair enough. There are examples of other social media platforms whose terms of service do not allow it but where there is still significant access. That was the example I was trying to provide: this occurs in places where it may not be consistent with the terms of service.
Senator Miville-Dechêne: Thank you.
Senator Simons: Senator Miville-Dechêne has set me up perfectly. I want to walk through this. The non-compliance notice is issued. The matter goes to Federal Court, and presumably the platform would come to defend itself in Federal Court. If it loses in Federal Court, the effect of the order may be what is stated in 10(5):
If the Federal Court determines that it is necessary to ensure that the pornographic material is not made available to young persons on the Internet in Canada, an order made under subsection (4) may have the effect of preventing persons in Canada from being able to access
(a) material other than pornographic material made available by the organization . . . .
Let’s pretend we have a very popular website called Z, and Z is run by somebody who is a very loud libertarian and determined to not abide by pettifogging rules. Mr. Head of Z decides he is not going to do this. So the implication would be that the Federal Court would then have the power to ban Z in Canada, meaning that any Canadian who wanted to have any use of Z, including their own account on Z, would be barred from doing so.
Let’s suppose you had a site called VisagePage, and VisagePage was very popular, and VisagePage sometimes had pornographic content. Presumably again, the implication would be that VisagePage would be banned so that all the grannies could not exchange their tomato soup recipes and their quilting plans.
This isn’t just potentially restricting adults’ access to pornography; it’s potentially attempting to regulate the entire internet in a way that would block mainstream sites from Canada if their owners chose not to submit to Canadian regulatory oversight, which I suspect they would. What are the consequences of that?
Ms. Awad: I think your statements are true. I’m not sure everybody would agree with the implications, but the statement saying, for example, that this could lead to a site that offers both pornographic and non-pornographic material being blocked — that’s how I read the bill as well.
The question that the government may be looking at and also the question for this committee is, what is the appropriate action when a commercial entity is operating as a digital service in Canada and does not wish to comply with Canadian law? Our response in the online harms act was administrative monetary penalties, called AMPs, and as many as possible. That was the response.
Senator Simons: If you are the owner of Z, our AMPs probably are not enough to be any kind of restraint. In the meantime, presumably, any smart 16-year-old who really wants to look at dirty pictures can get a VPN and go around. No?
Ms. Awad: Whether our AMPs would be sufficient or not, it depends on the purpose of the platform’s participation in these processes. Many of these platforms are ultimately businesses that respond to shareholders, and monetary penalties can be motivating factors for shareholders.
Senator Simons: The problem is when you have oligarchs who’ve got more money than God. It’s next to impossible to have any kind of market signal that tells the owner of Z to change his behaviour.
Ms. Awad: Those are some of the challenges with digital regulation. I can’t say more than that.
On the question of VPNs, we are thinking about that and looking at it in this context. One hiccup that occurred when the U.K. implemented age verification is that VPNs became the most downloaded app on the App Store. It seemed like a lot of people were trying to circumvent those rules for various reasons. Some may have been minors. Some may have been adults who also didn’t want to participate in the age verification process or who were concerned for their privacy as they might be accessing these types of sensitive sites.
[Translation]
Senator Miville-Dechêne: I would like to add two pieces of information. You can respond or not. It should be noted that most children 13 and under don’t have a VPN. Like all bills, this one cannot protect all children, but there is a difference between a 13-year-old child and a 17-year-old teenager.
There is an actual incident that happened with Twitter in Germany. Twitter had some pornographic pages. Germany told them that either they verified ages for their pornographic pages or they couldn’t operate in Germany. Twitter removed its pornographic pages in Germany. You might consider that a form of censorship, but Twitter didn’t disappear entirely. Twitter made a business decision. There was more money to be made from a pornography-free Twitter in Germany than from a Germany-free Twitter. These are the choices and trade-offs that the platforms need to make.
I just wanted to add that information.
[English]
Ms. Awad: That context is useful.
Part of the interest in VPNs, and in diversion generally, is that changes to the framework can lead to unintended consequences. One of the risks we want to manage is that young minors who were accessing this problematic content through websites might now try to access it through file-download services, private messaging or other areas where it might actually be even more dangerous and problematic. I think that's part of the reason the government was initially conceiving of this as part of a broader framework, to try to address the different elements.
Senator Dhillon: The work is going to be expensive. I don’t think we can cover each area and stop all the leaks all in one go, but the work is noble and important.
Part of my question is further to Senator Simons's question and comment. Could websites facing penalties bifurcate, making business decisions about what to hang on to and what to continue producing, rather than sinking the entire ship over one of those products? On that front, do you believe, from your research and the work that you've done, that fines are enough, or do we need to do more?
Ms. Awad: We consulted on this a lot for Bill C-63. This was one of the questions: Should there be website blocking or not? Ultimately, the conclusion we reached was we should start with fines. There is less risk. It is less polarizing within society in terms of the response to website blocking. Ultimately, the most problematic players, the largest players, are businesses, like I said, with shareholders who have to make business decisions. It’s more likely than not that fines will be an effective tool, but there could be exceptions. There could be platforms that run for ideological reasons or that are worried about precedent. We have seen examples in our digital regulation, both in Canada and elsewhere, where platforms take action that may not be in their best business interests in order to control public policy or to advance ideological interests. Those are the ones that require more thought and maybe other methods of dealing with them.
Senator Dhillon: Thank you.
The Chair: Colleagues, please join me in thanking our witnesses for their participation and presence here today. Thank you for assisting the committee in its work.
I have a couple of final notes. The deadline to propose special study suggestions has been extended by two weeks to October 17. A work plan approved by steering committee is currently being implemented. Members with witness suggestions are encouraged to forward them to the clerk as soon as possible. In the future, members are encouraged to provide suggestions at the earliest possible opportunity, ideally before the steering committee considers the draft work plan while, of course, recognizing that new witnesses may emerge over the course of the study.
(The committee adjourned.)