THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS
EVIDENCE
OTTAWA, Thursday, October 9, 2025
The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 10:35 a.m. [ET] to study Bill S-209, An Act to restrict young persons’ online access to pornographic material.
Senator David M. Arnot (Chair) in the chair.
[English]
The Chair: Good morning, honourable senators. I declare open this meeting of the Standing Senate Committee on Legal and Constitutional Affairs.
My name is David Arnot. I’m the chair of the committee. I invite my colleagues to introduce themselves.
[Translation]
Senator Miville-Dechêne: Julie Miville-Dechêne from Quebec.
[English]
Senator Tannas: Scott Tannas, Alberta.
[Translation]
Senator Oudar: Manuelle Oudar from Quebec.
[English]
Senator Prosper: Paul Prosper, Nova Scotia, Mi’kma’ki territory.
Senator K. Wells: Kris Wells, Alberta, Treaty 6 territory.
Senator Simons: Paula Simons, Alberta, also Treaty 6 territory.
Senator Pate: Kim Pate. I live here in the unceded, unsurrendered, unreturned territory of the Algonquin Anishinaabe Aki.
[Translation]
Senator Clement: Bernadette Clement from Ontario.
[English]
Senator Dhillon: Good morning. Baltej Dhillon, British Columbia.
[Translation]
Senator Saint-Germain: Raymonde Saint-Germain from Quebec.
[English]
The Chair: Honourable senators, we’re meeting to continue our study of Bill S-209, An Act to restrict young persons’ online access to pornographic material.
For our first panel, we’re pleased to welcome, by videoconference, Lord James Bethell; Mr. Tobias Schmid, Director of the Media Authority of North Rhine-Westphalia and Commissioner for European Affairs of the German Media Authorities; and Laurence Pécaut-Rivolier, Member, Arcom (France).
Thank you very much, witnesses, for appearing here today. We’ll start with Lord Bethell first, followed by Tobias Schmid and then Laurence Pécaut-Rivolier. The floor is yours for five short minutes to give a brief overview, and then after all the remarks are made, we’ll go to questions from the senators. Lord Bethell, please commence. Thank you.
James Bethell, Lord, House of Lords: Senators, good morning, good afternoon. Thank you very much, indeed, for giving me this opportunity.
I’m here to offer a practical perspective from the United Kingdom, where we spent five years developing and implementing legislation to protect children online, including and specifically from exposure to violent and degrading pornography. Let me be clear about the number one point: It can be done.
The U.K.’s Online Safety Act, passed in 2023, mandates very clear age verification for commercial pornography sites. Despite initial resistance, the measure has landed very well. The tech industry, including, very importantly, X, formerly known as Twitter, has complied. Public support is strong, and most importantly, the evidence demonstrates that children are expressing relief and clearly feel safer online.
Senators, privacy concerns are valid, but we have seen that they are manageable. We’ve heard libertarian arguments that age verification threatens privacy; in practice, it hasn’t. The U.K. has adopted privacy-preserving age estimation technologies such as facial analysis and document checks, and these have been shown not to store personal data. These systems are already being used in banking, gambling and alcohol sales. We applied the same standards online.
Senators, the virtual private network, or VPN, argument is overstated. Yes, some adults may use VPNs to bypass restrictions; these are costly and intrusive, but the idea that this renders the policy ineffective is a myth. Young children do not routinely use VPNs. They cost money, and they are difficult to administer.
The goal is not perfection, it’s harm reduction. Even partial barriers significantly reduce exposure and delay first contact with harmful content. The nature of the pornographic problem we face in society is not of young children seeking pornography but of young children being sent pornography by predators, by bullies and by the algorithms of big tech.
The U.K. example shows this is achievable. We worked for years to build consensus, consult stakeholders and develop technical standards. We’re the first country in the world to enforce age verification on this scale for online pornography. It’s a real-world example of liberal democracy acting decisively to protect children without compromising adult freedoms.
Importantly, there’s no question that we’re not going to go backwards on this one. Once protections are in place, they become part of the public’s expectation and part of the childhood experience. The idea of removing them is politically and socially untenable. That’s why I urge Canada to act now. Delay only prolongs the harms to children.
In the U.K., we took great care. The protections do not discriminate against LGBTQ+ communities or sex workers. The law targets commercial platforms, not individuals. It’s about corporate accountability. There’s no moral judgment.
If not this bill you’re considering, then what? Senator Miville-Dechêne’s bill is thoughtful and proportionate. If Parliament chooses not to adopt it, I urge you to offer an alternative, because doing nothing is no longer defensible. The evidence of harm is overwhelming. The tools to act are available. The public, especially parents, are demanding change, and the tsunami of AI-enabled porn aimed at our children is absolutely terrifying.
In closing, I offer this: The U.K. has shown that online safety legislation can be effective, can be proportionate and can be rights-respecting. Canada has the opportunity to lead in North America, as we have led in Europe. I urge you to seize it.
Thank you very much.
The Chair: Thank you.
Tobias Schmid, Director of the Media Authority of North Rhine-Westphalia and Commissioner for European Affairs, German Media Authorities: Dear honourable members of the Senate, I’m pleased to speak to you. Unfortunately, there was a little issue with my presentation, and that’s why I will have to speak freely, but that won’t be a problem. If there is some issue with translation, please give me a sign. Nevertheless, thank you for having me share our experience on the issue of age verification around pornography platforms in the next five minutes.
The Chair: I’m sorry for interrupting you. The reason I’m interrupting is the translators here in Ottawa are unable to interpret your voice due to technical difficulties. We’ll have the clerk speak with you, and we’ll find another day or time at your convenience and the committee’s availability to reschedule you. I’m sorry about this. This often happens, unfortunately. I apologize for that. We’ll see you at another time, sir, if you’re available.
[Translation]
Laurence Pécaut-Rivolier, Member, Arcom (France): I’m going to talk about the experience of France, which, a few years ago, introduced legislation mandating porn platforms to prohibit access by minors.
France’s penal code requires online platforms to put in place prohibitions to prevent minors from accessing content. However, until recently, all porn platforms were content to just have a check box stating “Yes, I’m an adult” and this was sufficient to access the porn sites.
We commissioned one institute to conduct a study that provided data on website visits. The study showed that in 2024, a third of minors in France visit porn sites at least once a month.
We know full well that minors start visiting porn sites between the ages of 11 and 12 — we know this because we have the data — and that from age 13, 50% of boys view porn at least once a month. This was a huge wake-up call for France, because visiting porn sites at the age of 10 or 13 is obviously a very serious matter: first, because this is really not the right age to access porn, and second, because online porn sites frequented by minors show extremely violent and inappropriate material that presents a picture of sexuality that is obviously not appropriate for their age, or even later in life.
France therefore decided to take action, and 2024 legislation gave Arcom legal powers to develop technical guidelines on age verification solutions that porn websites could put in place to prevent access by minors. Indeed, the only thing porn sites said when they went to court was, “You want us to introduce an age verification system, but none exists,” or, “In any case, anything we introduce to verify personal information will violate privacy, will be more dangerous and will not be functional.”
The French parliament asked Arcom to prepare guidelines showing that technical age verification tools existed that could respect privacy. We worked with the French national data protection agency, or CNIL, to develop the guidelines that provide for a certain number of technical age verification tools that are functional and do not violate privacy.
The technical guidelines came out in January 2025. A short time later, they were shared with the European Commission, since it is involved in implementing the Digital Services Act. Various operators were notified about the guidelines, which came into force in April 2025. That month, we implemented the 2024 legislation for the first time and sent the first letters to porn sites asking them to comply with the legislation and the guidelines.
Arcom formally vetted a total of 12 porn sites selected as having the highest number of visits in France. The results have been positive because most of these porn sites and a number of others complied and introduced age verification systems that meet our requirements, even though some of them are still under review.
The group known as Aylo, which owns the Pornhub site, among others, chose to shut down its websites in France because it felt that age verification curtailed its freedom.
We ended up blocking only one of the 12 sites we had vetted, because we did not have a contact person to whom to send a notice.
That is where we are now in relation to the largest websites. We will conduct another study on the use of porn websites by minors in a few weeks to see what impact these measures have had on the use of porn sites.
We will also expand our demands to other websites with fewer visits and work with the European Commission on its Digital Services Act when the tools are put in place in Europe.
[English]
The Chair: Thank you. We’ll now move to questions, and we’ll start with Senator Batters, the deputy chair.
Senator Batters: Thank you for being here today and taking time out of your busy days in a much different time zone than we have here. It’s much appreciated because it’s important to get this kind of information about systems that have been put into place in countries that are similar to Canada. It sounds from what you’re both saying that it’s working well, so that’s very good to hear.
First of all, Lord Bethell, thank you very much for the important evidence that you provided about how it can be done and for reminding us that the goal is not perfection but harm reduction. You said this was a thoughtful and proportionate bill, and if not this bill, then what? The evidence of harm, as you’ve seen there, is overwhelming.
I wanted to ask you more about this issue of the VPN and potential bypassing. Can you tell us a bit more about how that has not been found, in your experience thus far, to be a significant barrier in the U.K.? You described one of the reasons, as it’s a costly method to try to bypass. Could you tell us a bit more about that VPN issue?
Lord Bethell: I’ll make three quick points, if I may. Thank you very much.
Firstly, this is contested territory. There are no accurate figures. The only figures come from the VPN industry itself, and I regard that as a tainted source. They are working in collaboration with big tech, big porn, in order to try to denigrate the effectiveness of the bill, so I take with a massive pinch of salt all the details and data that come out of the VPN industry.
Secondly, installing, subscribing to and managing a VPN is technically quite tricky. It requires a credit card, unless you use a free version loaded with a very large amount of advertising, which is pretty obnoxious. Most children do not have a credit card, nor do they have 20 or 30 pounds or dollars to pay for it. So the idea that the children of Britain are subscribing to VPNs against their parents’ wishes is ludicrous.
Thirdly, it’s a wrong understanding of how porn gets into children’s lives. The Children’s Commissioner of Great Britain did a very large and substantial piece of work, interviewing 300,000 children, a massive study, on how porn gets into their lives. It is not through search. It is not randy teenagers desperate to look at videos of abusive, violent sex. It is mostly children being sent porn by bullies at school, by predators trying to lure them into conversation or by the algorithms themselves.
Senators, I’ve just got to tell you that if you subscribe on Twitter to Kim Kardashian, to Ronaldo the footballer and to hip hop artists, you get a very different experience than if you subscribe to liberal politicians and TV journalists. The idea that VPNs in any way undermine the rules, I’m afraid to say, is wrong-headed.
Senator Batters: Thank you for bringing that up, because I was going to ask you about Twitter and X, and I’m very glad to hear that, as you’ve said, they are complying in the U.K. Can you tell us a bit more about that? As a senator, I’m on Twitter, Facebook and Instagram, and I have to agree that the number of crazy, terrible messages you get sent on Instagram, and the way on Twitter you’re constantly having to block these porn sites, with the bots and things like that they use to send them out, does seem to be a very common thing. Could you tell us more about Twitter complying, and do you happen to know anything about Meta’s experience with Instagram? That seems to also be a common one.
Lord Bethell: The Children’s Commissioner’s report laid it very clearly at Twitter’s door as the main gateway to porn for children, with 42% of children saying that their first encounter with pornography was through Twitter, but much of this was not dialed up by themselves. It was either sent to them by the algorithm or direct messaged to them by people at school, or predators, because, as you know, it’s an open platform and adults can send messages to children, something that I think is inappropriate. It is the same on Instagram.
We were very worried that Elon Musk would go ape and create a great fuss, like he did in Brazil. I’m pleased to say that didn’t happen and, instead, the platform has largely complied. It has used a style of age verification which is pretty creative and low impact. We will be interested to see if it actually works, so there will need to be some validation and auditing, but I’m encouraged that the ingenuity and data skills of the tech sector have been put to work to make a lightweight and seamless experience for users. That’s ultimately what we want. No one wants heavy, clunky gateways that create turbulence for the adult experience. What we want is two internets: a safe internet for children and a fruity internet for adults.
On Instagram, I would say that the same remarks apply. Meta have, in their lobbying and campaigning, been ardent opponents of age verification. I find that very disappointing for a firm that commands so much of our children’s time on its platform and purports to be a force for good in society, but I’m pleased to say that they have taken steps to try to reform their arrangements.
Senator Miville-Dechêne: Just before asking my question to Lord Bethell, I just want to make this commentary. Contrary to what was said yesterday by Ethical Capital Partners, Bill S-209 does not just target porn platforms like Pornhub. The bill leaves this decision to the government in clause 12, and the government will decide on the scope, so the government could decide to include social media like X in its choices. I just want to say that to preface my question because it is about X.
Lord Bethell, thank you for being here. I’d like to know how X does age verification. Here in Canada, we often hear that the whole X platform would have to do age verification to prevent children from accessing porn. Can you tell us technically how it’s done? I also understand that in the case of X, it’s not a third-party verifier; you have given some platforms the choice to do it themselves.
Lord Bethell: Yes. X has largely taken the porn content off their platform in the U.K. In order to verify yourself, you give an email address. It then checks that it is your email address by emailing you, and it then matches that email address to your profile stored in data on the internet, matching it against your other social profiles and your appearances on the internet. In other words, there’s no handing over of documentation. There’s no passport, there’s no driver’s license, and there’s no credit card. If, through the combination of facial and email recognition, they are satisfied that you are an adult, then you get one experience, and if you’re a child, another experience. In other words, it’s a risk-based system. It isn’t a frontier that requires a positive assertion from everyone. I think that is proportionate for what we’re talking about.
We need to find out whether it meets the legal bar. In our law, we wrote “highly effective.” Highly effective doesn’t have a percentage, and we have pressed our regulator to put a percentage on that, and the regulator has surprisingly been resistant. To my mind, highly effective means 95%, 97%, something like that, so we’ll see whether they’re meeting that kind of threshold.
Senator Miville-Dechêne: Is the demand for an email address made only on the pages that are porn pages and harmful pages, or on all of X?
Lord Bethell: Senator, I’m afraid I can’t answer that question. My understanding is that it’s moving around a little bit, and they are trying different systems. I can write to you and give you a concrete answer, if that would be helpful.
[Translation]
Senator Miville-Dechêne: I’d also like to ask a question of Ms. Pécaut-Rivolier. Your system only targets porn platforms. Could you speak to the difficulties you’ve had with Ethical Capital Partners and Pornhub?
My understanding is that this company, which is based in Montreal, Canada, has been suing you since 2022 and has failed to comply with the current legislation.
Ms. Pécaut-Rivolier: Thank you for your question.
Indeed, before the 2024 legislation, we had the 2020 legislation that gave us a number of powers. As such, our legislation applies to sites that disseminate porn content and requires them to introduce age verification systems. Since 2020, we have been sued by a number of plaintiffs seeking to block any actions we have taken to ensure age verification is effective.
We have taken legal action to compel the platforms to implement age verification systems. The platforms have pushed back, saying that no suitable system exists. I think that right now we have about 15 cases before the courts across all jurisdictions in France and Europe, on various arguments such as potential privacy breaches and claims that the effects of age verification are disproportionate. These cases concern Aylo as well as large porn sites more broadly. We have received challenges on constitutionality and compliance with international conventions. These are hard-fought legal battles. However, what I’m seeing is that —
Senator Miville-Dechêne: I’m going to interrupt you for a minute to ask you why Aylo, Ethical Capital Partners and Pornhub exited France even though you had introduced dual authentication on your platforms.
Ms. Pécaut-Rivolier: We reached out to them to understand why they felt this was an issue. They said that the demands placed on them were too cumbersome to implement and disproportionate to the objective of protecting minors.
On their website available in France, they put an imitation of the famous painting by the French artist Eugène Delacroix, which is known as Liberty Leading the People.
Senator Miville-Dechêne: Thank you very much for that.
[English]
Senator Prosper: Thank you so much to our witnesses for helping us understand this very important subject.
I want to reiterate some of the evidence we heard in our consideration of this bill. The thing that is on my mind is the efficacy, the effectiveness, of what this bill seeks to address, because what we’re hearing from individuals like Ms. Boden with Free Speech Coalition and Mr. Friedman with Ethical Capital Partners is that it is not effective. What we’re hearing is that it is circumvented. I understand the dimension of VPNs and how you characterize that as a myth.
Through the witnesses, we have been hearing of alternatives, and I want to get comments on this from both of you. It seemed to be summed up by Mr. Friedman with Ethical Capital Partners when he said that having age verification and things of that nature site-based contains fatal flaws, and if it was on a device instead, those fatal flaws don’t exist.
What do you think about that? Do you think site-based, age-verification technologies that deny access to pornographic materials to youth contain fatal flaws? Do you think the more appropriate mechanism or avenue to address this is through the device itself? I’m curious what your thoughts are on that.
Lord Bethell: He has a point, and I will explain it, but it is not a fatal flaw and it is not a reason not to act.
I would very much like Apple and Google to put age verification into their app stores and onto the device itself. This is the only way to have a long-term, effective division of the internet into a safe internet for children and a freaky internet for adults. I am very disappointed that two serious corporations worth many hundreds of billions of pounds have resisted this so vehemently, have spent so many hundreds of millions of dollars campaigning against it and have intervened at Downing Street and at the highest levels of government to prevent this sensible measure from being introduced. It speaks very badly of both organizations.
But that doesn’t mean that we shouldn’t do web-based authorization as well. There is a kitchen-sink dimension to this; with many problems in life, you have to do several things for the response to be effective. Now, he has got a point that site-based authorization may well not last for tens of years. There may well be loopholes and ways around it. Children — I have got four of them — are incredibly inventive. I’m not saying that it is going to be perfect. However, I come from a public health background, and we deal in harm minimization or risk reduction. We’re not a Homeland Security operation, where just one terrorist getting through means you have a problem.
So I think he is overstating his point, but it is reasonable that he is angry at the device manufacturers who have basically dumped the whole problem on his doorstep.
Senator Prosper: Thank you.
[Translation]
Ms. Pécaut-Rivolier: I’m on the same page as Lord Bethell. Demands were placed on porn sites because they are the ones responsible for the images they disseminate, which should not be viewed by minors, and until there’s a more effective system, it makes sense to ask them to put protections in place.
Second, of course there will be ways to circumvent safeguards once they are put in place. That’s to be expected. These are technologies, and as technology evolves, there will be advancements to bypass current technologies. Our main goal is to protect minors, including those aged 10 to 15, who cannot access the more advanced circumvention technologies, which sit behind paywalls and are more complex to use. Protecting this group of children means we’ll have won part of the battle.
Third, this does not prevent us from having systems that include age verification at their core. We’re trying to work on that in France and in England. Indeed, we’re working on on-device verification on telephones, and we hope we’ll be successful. It would appear that Google is moving a little in that direction. However, we aren’t the least bit shocked that, for now and until more advanced on-device technologies are developed, the main providers of content unsuitable for minors are being asked to urgently implement a protection system.
[English]
Senator Prosper: Thank you both.
Senator Simons: Senator Prosper stole my question, so it is a good thing I have another one.
Lord Bethel, you have made a couple of mentions of the problem of children being bullied and being sent inappropriate content, whether that’s by mean classmates or pedophiles. How does your legislation stop that? It might stop someone from sending a link, but it surely can’t stop someone from sending an offensive or disturbing image.
Lord Bethell: Yes, that’s right. It doesn’t stop WhatsApp, for instance. It doesn’t stop someone sending a downloaded video by WhatsApp, but it does stop that message from containing a link and someone being urged or pressured into clicking on it. It stops this cycle — which we’re all familiar with now — this escalator of violence, this extremism dopamine cycle that I’m afraid even our kids get stuck on. They may be looking at topless pictures one day, but before long, they are sucked into horrible, graphic, violent porn featuring rape and sexual violence and taken to a very dark place. I’m sorry for being so blunt about it, but that’s what is happening on a very large scale.
Senator Simons: That seems to contradict what you said earlier, when you said the number one problem isn’t children seeking out pornography but children having pornography thrust upon them. Again, I have to ask: If we had a device-based system, would it be better, potentially, at protecting children from what might colloquially be called “dick pics”?
Lord Bethell: No, it is not a contradiction. The power of these algorithms means that pornography has become highly, highly addictive to adults — fine, they can look after themselves — and also children. What I said was that access was through push, not through search.
What has happened — particularly, I’m afraid, amongst the boys — is that once they get into the habit of watching, as with any addictive dopamine economy, attention economy activity, it becomes moreish, and quite soon they’re escalating from simple rumpy-pumpy into quite disturbing and disgusting porn that they then seek to enact on their girlfriends, and it gives them a very disturbing view of the world. It’s going to hit our demographic curve and is breaking up basic institutions in society, like the family.
Now, you are right that a device-based system probably would be more effective and more fundamental. It does raise its own set of problems. For instance, we are then assuming that every device has a name and therefore an age attached to it.
I think we just shouldn’t think of this in terms of single-shot solutions. As with everything in the media space and the tech space, you have to think of interlocking programs that work together to solve a broader health problem. I’m from a public health background, and that tends to be how I see things. It is a matter of “and” rather than “or.”
[Translation]
Senator Simons: I also have a question for Ms. Pécaut-Rivolier.
[English]
One of the challenges I find, as a very boring, straight, vanilla woman on the internet, is that I am constantly being served what Lord Bethell would call “fruity material,” whether I want it or not. I shut down my Senate Facebook page because Meta sent me an unending stream of soft-core porn in my feed, culminating in the day I was at the Chinese New Year event in Edmonton. I guess I was posting a lot of pictures from Chinatown, so it decided to send very bizarre Asian fetish porn to my phone in the middle of the afternoon.
I’m curious to know whether Arcom or anybody in Britain has looked at the question of asking these social media platforms — I would dearly love a button that I could push that said, “Stop filling my feed with this stuff.” My experience is that if you interact with it, it’s like chopping the head off a hydra. The moment you do anything, including trying to stop it, it just says, “Oh, you reacted, so you must want more of that.”
[Translation]
Ms. Pécaut-Rivolier: Indeed, I share your concerns, especially because in my work, I often have to click on porn sites and I end up receiving a lot of stuff, which is really hard to stop. In France, we’re working on implementing the Digital Services Act. As part of this process, we’d like to ensure Europe — France can’t do it alone — requires platforms to put in place rules to stop exposing minors, as well as adults, to this content.
The European Commission, which has just released guidelines for the application of Article 28 of the Digital Services Act, requires, first, an interoperable age verification system — we mentioned that earlier — which essentially makes it possible to verify the age of the person visiting a website and to have a set of mechanisms kick in immediately. Second, we’d like to have default rules that would ensure the content and algorithms offered to minors are age appropriate. This is very important to us because we have found this to be a source of tremendous harm. We’d also like to prevent minors from coming into contact with unknown adults who may prey on them. These are some of the things we want to move forward.
We’re doing that as part of the Digital Services Act. For the time being, Europe has allowed us — because we said it’s urgent — to work on porn sites to prevent this type of access. Obviously, we’d like to move to the next step, which is more functional and broader protection of minors across all online platforms, including the very large online platforms most visited by minors.
Senator Oudar: Thanks to our two witnesses for appearing today. My question is for Ms. Pécaut-Rivolier, but before I get there, I want to recognize France’s fight to keep children away from porn sites. The European Union has praised this fight. Ms. Pécaut-Rivolier, thank you for joining us this morning and for sharing this very pragmatic international perspective that’s informed by evidence. I am sure that your testimony will be an invaluable lesson for Canada on what works and what does not. I’d like to thank you for the input from Arcom, the digital communications regulator.
I did a lot of reading in preparation for this committee meeting, and I noted something when it came to France’s approach and the search for technological tools. The solution France chose incorporates the concept of double anonymity, which combines verification and privacy.
I’d like to hear your thoughts on this solution.
Ms. Pécaut-Rivolier: Thank you. Indeed, we’ve worked with our data protection agency to find a solution that protects minors and the privacy of adults who have to comply with age verification requirements. The most successful solution is double anonymity: a third party verifies that one is an adult without storing data, meaning that ultimately the information is not stored anywhere, because there is an intermediary.
However, this is a cumbersome solution, and we’re mindful of that. As such, from the outset, the guidelines provided that porn sites should give users a choice between a solution that protects personal information as much as possible and a solution that is less secure with respect to privacy but which users might prefer if it’s easier and more user-friendly. We considered these to be temporary solutions, and we immediately told porn sites, “Stop telling us there are no solutions.” There are solutions. They may be a bit more expensive and cumbersome for adults, but that’s the prerequisite to safeguarding minors’ mental health. Solutions do exist.
Now we are going to force the market to take action. We realized that there have been no mandatory requirements up to this point. We did not have a market that allowed us to have effective and cheaper age verification solutions. We're sensing a shift now. The European Commission is working on a European digital identity wallet, a smartphone app that would show one is an adult, regardless of the reason for entering data in the app. The app would then be able to share broadly that verification has been done and that the person is an adult. These solutions will be more functional, seamless and user-friendly for everyone. That's the future. That's what will be rolled out. We just needed to be more stringent from the outset to make all sites and all platforms realize that, at some point, they will have to comply with age verification, so that the emerging market can offer less complicated solutions.
Senator Saint-Germain: My question is for you, Ms. Pécaut-Rivolier. I’ll ask you to give a short answer because some of my colleagues would like to respond.
This morning, you mentioned that the French penal code includes a porn-related obligation. You mentioned the April 2024 legislation and a few minutes ago in response to a question from my colleague, you said that the legislation was strong enough from the outset. Bill S-209, which we’re currently studying, is not really prescriptive. It leaves a lot to enforcement, including the choice of technology. There are no criteria or any requirement about the nature of the technology.
Compared to the legislation and regulations in France — and in the UK as well, Lord Bethell — do you think Bill S-209 should be amended to make it more prescriptive?
Ms. Pécaut-Rivolier: French legislation provides for Arcom, as the country's audiovisual and digital communications regulator, to issue technical guidelines. The law itself does not stipulate any provisions, other than the fact that there has to be a system that is effective enough to ensure minors can no longer access porn sites while protecting privacy. The law left it to Arcom to determine how this could be put in place technically. The advantage of this system is that we at Arcom can amend the guidelines in response to technological advancements and future improvements.
[English]
Lord Bethell: We had the same for the online safety bill. It was enabling legislation that gave a large amount of scope to the regulators, not least because our experience is that technology changes so there is no point putting it on the face of the bill. You need to give the regulator both the obligation to engage with industry stakeholders and children, and also the ability to change the regulations if needed.
One thing that I would mention is that one of the reasons why I think we were successful is we put in a really tough enforcement regime. We brought in senior management accountability. This was one of the lessons we learned from the financial crisis. When we brought in senior financial accountability for banks, their behaviours improved dramatically. We brought this in for media companies, and that’s why we think big tech and big porn fell into line in the U.K. Fines don’t cut it. You need prison time to be threatened.
I have seen your bill. It doesn’t have that kind of enforcement regime in it. My inclination would be to go ahead with what you have got and add to it as you go along. I appreciate gaining legislative time is always tricky for any parliament, but I do think that the signal of putting something on the statute book is more important at this stage than getting every single refinement correct.
Senator Dhillon: Lord Bethell, thank you for all of your testimony here today. It is encouraging to hear the progress that you have made and the tenacity that you have in following through with that work in protecting our children. My question is, who are your regulators?
Lord Bethell: We have a large regulator called Ofcom. I’m not an expert on French regulation, but I think it is similar. Madame Pécaut-Rivolier will know more about that.
It is a very good regulator. We have four big regulators in the U.K. in finance, medicine, law and media. It is quite well funded. They may not say that, but I think it is. It recruits from the highest echelons of industry and the civil service. It has previously focused on licensed media, so television, radio and mobile phones. My worry was that they were better at partnership management than at dealing with big tech, with its tough lawyers and libertarian attitudes. It is true that they spent quite a long time — two and a half years, longer than I would have liked — engaging with tech in order to refine the regulations. However, I would send them a bunch of flowers because it turned out that they did a good job, I think, and so far, so good. On July 25, there was a real, palpable change in the online experience for children. The donkey work of engagement, drafting thousands and thousands of pages of regulations, an extraordinary effort, has proved to be, so far, worthwhile.
I would just add that we don't know what is on the horizon with AI-enhanced pornography. Its impact on the imagination and on the plastic minds of our children terrifies the hell out of me. I'm really glad that we put up some kind of barrier, but we are going to need to revisit this very soon.
Senator Clement: Thank you both for your testimony and your work.
Two things struck me, Lord Bethell, when you were speaking. I’m going to ask a question about data breaches and the risk and the importance of destroying that information. I’m going to ask about your comment about how children are expressing relief. I know that you consulted widely — or the commissioner did. Could you say more about children expressing relief, and also about how you enforce this? You said there is data showing that they do get rid of the data. How do you know that?
[Translation]
I’d also like to hear Ms. Pécaut-Rivolier’s response to the same questions.
[English]
Lord Bethell: The framework that I am thinking about is the evidence from the Children’s Commissioner and others that most children wish social media did not exist. If you ask children if they would pay to be on Instagram, Facebook and Discord, they say they wouldn’t pay a penny. In fact, they would pay for other people not to be on it. That huge social pressure to be on social media has become a real burden for our children. One aspect of it is this incessant pornographic content that is pushed at them by the algorithm, bullies and predators.
The Children’s Commissioner has done a large amount of work. I would encourage you to get her online, if you can. I’m happy to share her details. She is very impressive. She has the absolute candid testimony of children themselves who have felt for years this enormous pressure — the confusion, the moral pressure, the pressure of all the other children watching and reacting — as well as the racism, the misogyny and the violence of this porn that has been a part of their lives, and that’s particularly felt by the girls. If I could also say a word for the boys, our boys are not doing very well. They are finding it very difficult to hold down relationships. They are confused about what their role is in society, and this porn has given them a very funny idea about how they should be interacting with other boys and with other girls.
I really think it’s been one of the most damaging aspects of modern society, and I am pro-technology. I’m no Luddite or someone who wants to turn the clock back. I want to embrace technology and all the benefits it brings to us, but this taint that has been promoted by big tech, by men who do not understand the difference between children and adults and do not appreciate the damage done particularly to girls by sexual content, has been a 30-year disaster, and I think we should turn the clock back and try to change it.
Senator Clement: Thank you. What about data breaches?
Lord Bethell: So far, so good. Listen, there will be a data breach. Of course there will. Whenever you have a collection of data, some of it will leak out somewhere. That hasn't happened anywhere so far. The only thing I would say is that there have already been a ton of data breaches by porn sites. You only have to look at CheckYourName to see how many there have been. I am impressed so far.
What I would like to see is Amazon, Google and Meta using their technology, their cybersecurity protocols and their multi-billion pound infrastructure to be supporting this age verification regime. As Ms. Pécaut-Rivolier explained, so far, we’ve been largely reliant on third parties, and there is a good reason for that, which is that people sometimes trust third parties more than they do the mainstream provider, but that’s probably not a long-term solution. Mark Zuckerberg knows how to solve this problem. He’s solved some much bigger problems than keeping children away from porn. It’s time he spent some of his money, his intelligence and his creativity to solve this, and it’s a damn shame he hasn’t done so, so far.
Senator Clement: Thank you.
Ms. Pécaut-Rivolier?
[Translation]
Ms. Pécaut-Rivolier: We conducted a study that was published last week, which shows that indeed, minors themselves are saying they don't feel safe. I'll forward the report to the committee. Obviously, they visit many social media platforms and porn sites, but there is not enough oversight. They're calling for that oversight and would like adults to have a better understanding of what is going on, even if some of them circumvent it. Obviously, some young people will try to get around the rules, but at the same time, they are calling for rules that will make them feel safer. We have a quantitative study that focused on what young people want.
I agree with Lord Bethell when it comes to personal information. We have the most secure tool. Obviously, it’s not 100% effective right now, but we can provide a better guarantee to protect privacy through double anonymity than when porn sites were collecting information to date. We have a lot of hope that the European system will provide a near total guarantee because it will flow through an ultra-secure system within the member states.
[English]
Senator Pate: Thank you very much to all of the witnesses.
My question is for both of you. How has the legislation that your respective jurisdictions have put in place interacted with anti-violence work that’s being done in your countries, particularly in France? We’re familiar with the very high-profile case of Ms. Pelicot and the work to counter violence, particularly against women and children.
Also, in your experience, what has the impact of the financial penalties been? I heard you, Lord Bethell, in terms of prison time. In my experience doing work in this area for a long time, that usually means that companies lawyer up and that very few people get held accountable. I’m curious as to how the financial penalties have dealt with this issue.
Perhaps Ms. Pécaut-Rivolier can answer first and then Lord Bethell.
[Translation]
Ms. Pécaut-Rivolier: We have enhanced the role of the justice system in cybercrime substantially. Indeed, I think that we needed to recognize the importance of this phenomenon and the fact that our current legislation and methods were not tailored to the rapid pace and violence of what is playing out online. Now, in France, we have a prosecution service that specializes in cybercrime, and judges and police officers with more training in this area.
The general public has a broader awareness of these issues, which is unfortunately primarily due to recent cases, but this means that there is a genuine desire and acceptance on all sides to crack down on violence. We feel that together, these tools will put us in a better position to respond to these situations, even though we still have a long way to go. We will work on that every day. France's regulatory authority is working side by side with law enforcement and the justice system to ensure each party operates efficiently within their respective domains and contributes to collective progress.
[English]
Lord Bethell: We had a slightly different experience in the U.K. The original versions of the bill — by the way, it has taken nearly seven years to get the bill through Parliament — tried to tackle a very broad suite of online harms and problems, including extremism and online violence in the round. It was cut down — quite rightly, in my view — to focus on children and damage to children. From a political point of view as a parliamentarian, it was easier for me to make the case for the protection of children than it was to tackle the very knotty, important but tricky problems of online harm and violence online.
I would also say, as a legislative tactic, I found it easier to police the perimeter than to address the algorithm. What I mean by that is that there were other issues that we sought to tackle around suicide ideation, anorexia and other online — what we call — “harms,” which really mean the regulation of how the algorithm works. Now, I would like big tech companies to be responsible enough to use algorithms thoughtfully with children, but telling them how to do that is a tough thing to do. That’s their core model. Children account for about 25% of page views. The harms are an important accelerator of page views. Let’s be under no illusion: The reason why they drive their algorithm to promote porn and harms to children is that it creates a lot of attention and a lot of money for them. This is a commercially driven and overt decision by big tech. Our view was that policing the perimeter was an easier thing to enforce. It doesn’t solve all the problems, but if you keep the porn out, you’ve made some progress, and no one can possibly think that children watching porn is a good idea, not even Mark Zuckerberg. So from a political point of view, that is an easier thing to win.
It has not been tied, I’m afraid to say, to any offline, real-world addressing of violence to women. That is the next agenda point for us. I think it’s a bit disappointing, but it’s just the way that things played out in the U.K. I’m conscious in France of the Gisèle Pelicot case. We don’t have that at the moment, so it’s not in the national debate in the same way.
With regard to the fines, there haven’t been any fines so far. Companies have been called in for investigation. They’re having, you know, serious words with teacher. It would be my ambition that there are no fines. I want this to be done in a partnership, in a consensual way, as regulations should be sensibly done through deliberation and through partnership, but there will be people who try to test the law, and I don’t want to see anyone go to prison. I do slightly disagree with you, senator. They may lawyer up, but the threat of prison, we have found — certainly in the finance community experience — means that they don’t just try to pay the fines and then keep going. It is behaviour adjusting. I come back to what I said before: I think one of the reasons why big tech and big porn fell into line in the U.K. is that when they played through the legal consequences of ignoring the law, it did end up in prison, and none of them wanted to go down that road.
Senator K. Wells: Thank you both for spending time with us today.
We’ve heard a lot of concern about third-party verification and the potential of data breaches. In fact, I think, Lord Bethell, you said that those breaches were inevitable. I’m just wondering if both of our witnesses could speak to what the consequences are in the legislation for third parties that do breach the data safety and security requirements, and have those consequences actually been enacted in your respective jurisdictions? Lord Bethell, please.
Lord Bethell: I was rather hoping Ms. Pécaut-Rivolier was going to go first, because she’s the regulator. I will answer that briefly.
Cybersecurity is covered by a whole plethora of laws. It wasn't the role of the online safety bill to put in information regulation. We have an information commissioner, and he's a tough guy. He would come down very hard on people who breached data safety rules. They would find themselves with their businesses closed down.
Historically, some of the biggest violators of data safety laws have been the porn industry and social media companies — companies like TikTok. TikTok holds the U.K. record for the biggest fine, £29 million, for its data breaches. This is the world we live in at the moment. I don't think that data breaches are a reason to be worried about this legislation.
Senator K. Wells: What about specifically the third party age verification that you’re using? Have data breaches occurred and, if so, what responsibility or sanctions have been put on the companies that store and maintain this information or give it or sell it to third parties?
Lord Bethell: I'm not aware of any data breach since July 25. I am aware of a case where one of the third parties had wrongly set the facial recognition age threshold at, I think, 23 instead of 18, which, of course, let in more people than it should have. They were admonished and, I think, fined for that, and that's a matter of record on the ICO website. We are looking at all this closely, but I'm not aware of any major data breach so far.
Senator K. Wells: Great, and over to our regulator?
[Translation]
Ms. Pécaut-Rivolier: As far as we are concerned, the law requires us to work on technical guidelines with two authorities. Arcom regulates audiovisual communications while the CNIL specializes in regulating privacy. We have developed technical requirements that are respectful of personal information. This means that when a porn site uses double anonymity through a third party, the third party undertakes to respect privacy and if they don’t, they will be subject to the civil and criminal law that apply in this case.
Again, this is temporary. We all have an interest in all the other age verification methods, including the facial recognition that Lord Bethell spoke about, which would address all privacy issues because in that case, there is age recognition without the need to provide identification. It’s not yet fully developed but it’s heading in the right direction. There will be the European app. For now, it’s up to trustworthy third parties, who are liable to criminal sanctions if they don’t respect privacy. Ideally, we should not have the same problem going forward.
[English]
Senator K. Wells: You didn't quite answer my question about the actual breaches that have occurred. AI Forensics has reported one with AgeGO in France. What were the consequences? These third-party verification platforms or companies you're working with are holding the data and doing the verification, and when breaches happen, what are the consequences, or are there no consequences? This is to Arcom, speaking to the situation in France.
[Translation]
Ms. Pécaut-Rivolier: So far we’ve not received any complaints about data breaches related to age verification. As I said earlier, any breaches will be dealt with under existing legislation in civil and criminal law that makes it a crime to disclose personal information outsourced for purposes of protection. To my knowledge, there have been no complaints about this.
[English]
Senator K. Wells: Thank you.
The Chair: Lord Bethell, I have one question for you concerning the cost of regulation. I'd like to understand what the total budgetary commitment has been for Ofcom's age assurance and online safety programs since 2023. Public records suggest more than £160 million in initial funding has been required, plus £60 to £90 million annually thereafter. Yesterday, we heard from a witness that Ofcom currently has around 60 ongoing investigations which, of course, leads to litigation. Also, you have what you described as a very robust regime. Is it accurate to say that Ofcom's accounts indicate that you have about 1,500 staff and a dedicated online safety tech lab to help regulate legislation in the U.K.?
My fundamental question is, if Parliament in Canada were to enact a law like Bill S-209 but the government failed to provide new funds or any funds, or new powers to an enforcement body, would you expect such a regime to be unsuccessful and not have the success that you have seen in the U.K.?
Lord Bethell: Thank you, senator.
Yes, cost is a very important factor, both in terms of the financial cost of a regulator, the friction to industry and the impact on investment into the U.K. by tech. This is undoubtedly an important issue. I happen to think that this has made Britain a more interesting place for technology companies to come and invest in. I think we’re helping to fly the flag for a safer, more sustainable, more resilient internet experience for families.
In terms of the figures you put to me, they didn’t sound right I’m afraid to say, senator. The total budget for Ofcom in 2024-25 was £168 million, and that included all the regulation for all the mobile phone companies, all the television companies, all the internet and all the radio. That number, I’m afraid to say, is not quite right. I would be glad to write to the committee and give you detailed numbers, but it is nowhere near the 160 that I think you mentioned.
In terms of the numbers of people, policing age verification is a hell of a lot cheaper than trying to moderate online activities, which is a very people-expensive activity. I refer to my comments earlier on policing the fence. It’s so much cheaper than trying to address the algorithm. That’s why it’s a good place to begin. It’s not to say the algorithm isn’t important. I wish we could hold big tech much more accountable for what it’s doing in its calculations, but so far that’s a black box and a legal victory we haven’t won.
Yes, there is a cost. We do have an online tax here in the U.K., and I think it’s about 2%. The cost of doing age verification is a tiny fraction of what we get in for that online tax.
The Chair: Thank you to all the witnesses that have testified here today.
Honourable senators, continuing our study of Bill S-209, we will now hear, in person today, Natalie Campbell from the Internet Society; Monique St. Germain, General Counsel, Canadian Centre for Child Protection; Mr. Tamir Israel from the Canadian Civil Liberties Association, by video conference; and Penelope Rankin of the National Council of Women of Canada. Thank you.
We’ll start with a brief overview from each of you, and it’s going to have to be three to four minutes or so in order to get all the questions in, because senators have a lot of questions for each of the panellists.
Monique St. Germain, General Counsel, Canadian Centre for Child Protection: Thank you to this committee for inviting me to present today. My name is Monique St. Germain, I am general counsel for the Canadian Centre for Child Protection, a national charity with the goal of reducing incidents of missing and sexually exploited children. We support the objectives of Bill S-209 and welcome the ongoing dialogue on this important topic.
We operate cybertip.ca, Canada’s tip line for reporting online crimes against children. As of March 31, nearly 450,000 reports have been processed through this avenue. We also operate Project Arachnid, which prioritizes the removal of child sexual abuse and exploitation material online. Overwhelmingly, the reports to the tip line relate either to sexual material of a child or offenders communicating online with children, but since the launch, we’ve also received many reports about adult sexual content and content where it’s not possible to tell the age of the person depicted.
We get these reports because there’s really no other way for Canadians to raise concerns about this content. What we see is that sexually explicit content, whether of a child or an adult, is increasingly disturbing, including elements of sadism, bondage, torture and bestiality. It’s getting more violent and, on the child’s side, the victims seem to be getting younger. It’s very difficult for our analysts to assess the child-related material that’s coming into us, so we’ve had to do our best to limit their exposure to adult pornography. It’s simply too horrifying. But understand this: if it’s in a report to us and we can readily access it with the click of a mouse, then so can a child.
Cybertip.ca has been around since 2002. Since that time, technology has been evolving at warp speed and the harms to children are stacking up. We know a lot has been done to ensure individuals abusing children are accountable, assuming we can detect and prosecute them, but that’s a different issue. The same cannot be said about another crucial player, online platforms. These are the companies that not only facilitate predators connecting with children, but also make available and amplify harmful content.
Pre-internet, it was well understood that exposing children to pornography harmed them. Post-internet, the debate is circling around the need to protect the privacy of adults and their presumed entitlement to absolute privacy online.
We know that tackling online problems is not just about doing one thing and that there is no silver bullet, but over the last eight years our organization has appeared or made written submissions on this topic several times.
First there was motion M-47, which called for a study of the public health effects of the ease of access to and viewing of pornography. At that time, the concern was about algorithms serving up pornography even when a user hadn’t searched for it. Fast forward to now and the algorithm problem is even worse. The content is proliferating not just on adult sites but on platforms like X.
On April 11, 2017, our executive director appeared before the House of Commons Standing Committee on Health, which was studying motion M-47. She expressed deep concern about the prevalence of increasingly disturbing pornography being made freely available and the ease of access to this content by minors. This was eight years ago, and at that time she was referencing an increasing amount of research linking pornography viewed by adolescents to numerous negative health outcomes like aggression, substance abuse, depression, risky sexual behaviours and sexual deviance.
Since that time, we’ve had Bill S-203, Bill S-210, and in 2024, the study by the Standing Committee on Canadian Heritage on harm caused by sexually explicit material.
Through all of these submissions we’ve participated in, we’ve remained steadfast that the ready availability of pornography on the internet is a problem that has to be addressed. The burden of managing these harms is still being borne by parents. While this debate rages on, an entire generation of children has been failed. It’s time to move forward and try something. Other countries are doing this, and technology has greatly improved to the point it’s possible to both protect children and respect adult privacy. We welcome these efforts and are thankful for the persistence of Senator Miville-Dechêne and this committee for pursuing a workable solution.
Thank you.
The Chair: Thank you.
Natalie Campbell, Senior Director, North American Government and Regulatory Affairs, Internet Society: I would like to thank the Senate committee for the invitation to appear on Bill S-209.
My name is Natalie Campbell, and I’m the mom of two kids. That’s my first job. I’m also senior director of North American Government and Regulatory Affairs at the Internet Society.
The Internet Society appreciates the efforts of Senator Miville-Dechêne and this committee to focus on child safety online. We’re a global organization made up of many parents and caregivers, such as myself, all working to ensure the internet is for everyone, including that everyone can have safe experiences online, especially children. We do this by supporting community-driven solutions to bring fast, affordable internet to some of the hardest-to-connect places in the world. We also help policy-makers understand how these kinds of laws to address safety online could unintentionally undermine our shared goals for a safer internet and what needs to exist in the first place.
I think we all agree that children should not be exposed to pornographic content, especially unintentionally. Age checks can help protect children from unwanted or inappropriate online experiences, but how these age checks happen is critical because all types of age checks have trade-offs and none are immune to the risk of data breaches.
We are seeing a wave of these proposals globally with a wide array of methods, from requiring users to upload government IDs, to facial scanning, to relying on third-party data brokers. While these measures stem from a valid desire to protect children, they often create new and significant risks to their privacy, security and even their ability to access the internet. Many of these risks are considered in Bill S-209.
We appreciate the work done to improve this bill. The new version is not perfect, but it’s clear the sponsors have been working hard to address stakeholder concerns.
The Internet Society has several concerns with Bill S-209, but given the limited time I will focus on two.
First, age verification. As a mom, I’m obviously very concerned about my kids’ safety online and understand my personal responsibility in this matter. I also understand why I can’t trust any online service with their personal data. This is a contradiction that is a burden on parents and a complexity for age-verification policy. I don’t want my kids to be tracked online, and I do not want bad people to access their personal information when it is breached, or because people see they are children. So how do we keep our kids safe and protect their — and everyone’s — privacy and security online? Moms like me want this to be easier and safer to achieve. We need safeguards to keep children safe and safeguards to protect our sensitive information.
Bill S-209 has considered some important safeguards. It narrowed the scope of entities required to do age verification. We appreciate the effort to avoid putting that duty on services that operate important functions of the internet’s infrastructure that aren’t always aware of what content flows through their networks. This will reduce the amount of third-party services people have to trust with their most sensitive data simply to use the internet.
However, the bill’s well-intended exclusions are still too vague to provide legal certainty. They can still capture many internet services involved in basic functioning of the internet and private communications. It also needs protections to ensure no covered entity should be required to bypass, weaken or break crucial security tools like encryption to comply with the law. Encryption is our most powerful shield to protect privacy and security online, especially in the absence of modern privacy law in Canada.
Second, content blocking. This bill acknowledges that court orders could lead to overblocking legitimate content, although there are no safeguards. More troubling, however, is that the definition of “internet service provider” is very broad. Left as is, courts could easily issue blocking orders to coffee shops, community centres, public libraries and universities that provide public internet access. They could also target dozens of small non-profit community networks — from the Arctic community of Ulukhaktok to Winnipeg’s North End Connect — that bring affordable connectivity to underserved communities across Canada. Faced with a court order, any one of them would likely be forced to shut down since most lack the expertise, resources or financial means for expensive content-blocking mechanisms. That would have a disproportionate impact on Indigenous communities.
Furthermore, content blocking is ineffective and introduces security risks that undermine the goals of this bill. Long gone are the days of video rental stores that hide adult content behind curtains. The material would still be accessible to a person determined to find a workaround, like dodgy sites and apps and the many websites that simply wouldn’t comply with this bill.
We all want kids to be safe online. Unfortunately, this bill undermines our ability to reach that goal. While it might end up blocking some kids from adult websites, the risks far outweigh the goals.
I look forward to your questions.
The Chair: Next to present will be Mr. Tamir Israel of the CCLA. Please be succinct, sir.
Tamir Israel, Director, Privacy, Surveillance and Technologies Program, Canadian Civil Liberties Association: I will do my best.
Honourable senators, good morning, and thank you for inviting us to testify on Bill S-209.
Age verification is complex, difficult to legislate and implicates the privacy of everyone online. It can also have a chilling effect in that some people will not feel comfortable providing an ID before accessing a website. This can disproportionately impact 2SLGBTQ people.
We have been encouraged by this committee’s thoughtful and ongoing study of this issue and its recognition of the difficult trade-offs that arise from legislating age-assurance requirements. We welcome the addition of critical safeguards in this iteration of the bill that attempt to address legitimate civil liberty concerns raised by the bill’s predecessor. We do continue to have some concerns with this bill, and I’ll focus the bulk of my testimony today on outlining some of these concerns.
Before turning to those, however, I will point out that this is a rapidly evolving ecosystem, with new measures and countermeasures developing constantly, and age-assurance mechanisms that continue to be flawed. As the committee is aware, there have already been data breaches, substantial increases in the use of VPNs and significant drops in traffic to compliant websites. One independent study showed drops in traffic as high as 50% for compliant websites in the United Kingdom and some level of increased traffic to non-compliant websites over the same period. Age-verification mechanisms also continue to develop, including solutions that are more privacy-preserving than the mechanisms contemplated in this bill. In light of this rapid development, we would encourage you not to rush to legislate.
With respect to Bill S-209 specifically, we have some ongoing concerns regarding potential overbreadth. As a general starting point, too much of this bill is left to be determined by the government, and the bill itself is in need of stewardship by an independent regulator.
More specifically, the addition of clause 6 is very welcome. However, we were deeply concerned to hear this morning the committee’s view that social media platforms might still be directly captured by this bill, despite the addition of clause 6. We would appreciate some clarity on that.
We are additionally concerned that notices under clause 9 might be used to impose conditions on social media platforms regarding what is necessary for them to qualify as incidental and non-deliberative transmitters of sexually explicit content. If this were the case, it would effectively transform the bill into a content-moderation regime, and the bill is not designed to address that effectively. Therefore, we would urge you to consider instead a mechanism that only applies to online services that operate predominantly for the purpose of making explicit content available to the public for commercial purposes.
We are also concerned that regulated organizations could be obligated to block VPNs under clause 9, and that clause 9 could be used to undermine secure end-to-end encryption and private messaging services. That would be highly disproportionate, and we would recommend such outcomes be explicitly prohibited by the bill.
Finally, we are concerned by the inclusion of website-blocking provisions in the bill. These could be triggered by a single infraction applied to an ISP and do not allow for judicial discretion. Our view is that this is a disproportionate remedy.
Briefly, we also have AI-related concerns. Canada has no federal framework for AI, and age-assurance mechanisms frequently rely on artificial intelligence algorithms that are known to be flawed in several respects. Often, for example, general statements of accuracy of these algorithms hide significantly higher error rates when age-assurance mechanisms are measured against specific demographics. Bill S-209 does require that age-assurance methods designated under clause 12 be highly effective, but there is no requirement for this to be assessed on a regular and ongoing basis, nor is there any mechanism for doing so.
Very briefly, with respect to privacy, Canada does have an effective federal framework for privacy, but this framework includes some gaps that, in our respectful view, pose challenges for this bill. For example, PIPEDA, our federal privacy law, is limited in application to data processing in the course of commercial activity. It is possible, and even preferable, for a non-profit entity to become the third-party age-assurance provider contemplated by Bill S-209, but in that scenario, PIPEDA would not apply.
We are also concerned that Bill S-209 currently permits the retention, preservation and disclosure of personal data where required by law. Providers could be legally required to retain and even disclose sensitive personal data to foreign or domestic law enforcement agencies. Nothing in PIPEDA would prevent that. We would, therefore, urge you to include an expanded set of enforceable privacy safeguards and a requirement for some independent oversight of the applications of these within the bill.
Those are my opening remarks for today. Thank you once again for inviting us to appear, and we welcome your questions.
The Chair: Thank you.
Penelope Mary Rankin, President, National Council of Women of Canada: On behalf of the National Council of Women of Canada, I thank you for this opportunity to speak in staunch support of Bill S-209. This legislation is urgently needed to protect Canada’s children and youth from a digital environment saturated with harmful content, notably pornography — content that now accounts for more than 30% of global internet traffic in an industry generating an estimated $97 billion annually.
There are approximately 28,000 porn-related searches happening every second, which is 8.4 million in the five minutes allotted for me to speak. It is particularly disturbing that 21% of these searches target violent, misogynistic, objectifying and aggressive material. In 2019 alone, Pornhub reported more than 42 billion visits and so many new videos that they boasted that if a person had begun watching that year’s uploads nonstop in 1850, they would still be watching at year’s end.
Who wouldn’t want to protect children from witnessing nonfatal strangulation and rape videos, two egregious examples of hardcore porn circulating on the net?
The average age of first exposure is between 9 and 11 — some are younger — and 2.7 million Canadian children under the age of 11 are active on the internet. It’s simple math to conclude that we are not talking about a handful of children being at risk.
Data from Girlguiding U.K., which I have shared with you, found that 73% of girls believe online pornography damages youth’s views of sexual relationships and promotes harmful sexist stereotypes. Girls aged 7 to 10 expressed anxiety about unexpectedly seeing “rude” images, sometimes triggered by innocent searches for children’s toys like My Little Pony or Pokémon. For boys, overexposure has been linked to erectile dysfunction and disturbing thoughts about both sex and consent. “Do I have to choke her?” are six words that starkly reveal how pornography fosters objectification and deep confusion. We need to hear children: their voices, concerns and struggles. It’s their lives, after all.
Research further confirms the burden these harms impose: adolescents exposed to pornography experience higher rates of anxiety, depression, low self-esteem, distorted body image and problematic sexual behaviours. The Recovery Alberta handout, which I have shared with you, adds aggression and delinquency, clearly making this a significant public health concern.
Widespread assumptions that Canadian children are doing well are, dare I say, inaccurate. UNICEF Canada, which does support age verification, paints a different picture. Canada ranks just 19th out of 36 high-income countries in overall child and youth well-being. Life satisfaction among youth has dropped. Anxiety, depression, bullying and social isolation are rampant. The report clearly shows that policy responses and protections are failing to meet children’s actual needs, especially in relation to digital safety and mental health support.
To shift gears, Bill S-209 aligns Canada with its obligations under the UN Convention on the Rights of the Child, ratified 35 years ago, with General Comment 20 demanding protection from all forms of digital violence and General Comment 25 calling explicitly for robust age-verification measures to protect children from age-inappropriate content. These, along with the 2007 Senate report Children: The Silenced Citizens, Effective Implementation of Canada’s International Obligations with Respect to the Rights of Children, urge Canada to step up. It is called duty to care.
The advocacy of the National Council of Women of Canada, or NCWC, on this issue has mobilized broad and diverse societal support. Our open letter supporting Bill S-210, the predecessor of Bill S-209, was endorsed by 80 national, regional and local organizations across Canada, none of which withdrew their support when presented with Bill S-209. A Leger poll shows that 77% of Canadians support age verification. There is a strong, clear public mandate.
Passing Bill S-209 is a vital step to fulfilling Canada’s legal and moral commitment to children’s rights and safety. In the absence of an independent, non-partisan children’s commissioner to defend and protect their rights, the NCWC, along with the 80 organizations and many others, is committed to supporting the bill and to envisioning a digital environment where children can learn, grow and develop free from harm including exploitation, because that happens. Porn is used to desensitize and “educate” those trafficked, a disproportionate number of whom are young girls and women. For over 132 years, we have sought to better the lives of women, children and society, and in the words of our national anthem, we are standing on guard for our children.
Thank you.
The Chair: Thank you to all four witnesses for their opening remarks. We’ll now move to questions.
Senator Batters: Thank you to all of you for being here and the important evidence that you have given us to help us with this bill.
Because we had a more succinct time frame than you might have otherwise expected for your opening statements and because of the number of questions that potentially could be posed to you, I wanted to give a couple of you a bit more chance to respond and give us some more information. I wanted to start with Ms. St. Germain on that, and I wanted to give you a bit more time. I think you were going to get more into discussing the evidence behind the harm that viewing pornography has on children, and I would like to give you a bit more time to discuss that, please.
Ms. St. Germain: Thank you.
We have made several submissions on previous versions of the bill outlining the evidence that was in play at that point in time. We can resubmit that information and update it with additional research. There is no question that access to pornography is harmful to children.
Senator Batters: It would be very helpful if you could do that. Could you just give me a couple of top-line things? We would look forward to receiving that additional information, if you could please forward it to the clerk so that all of our committee members could receive it.
Ms. St. Germain: Yes.
Senator Batters: Okay. Would you like to tell me a couple of top-line points of view on that right now?
Ms. St. Germain: In terms of operating cybertip.ca, we’re hearing, of course, from Canadian children and families on a daily basis, and what we know is that families and children are struggling, not just with the pornography that is circulating online but also with everything the online world has thrown at them, which has thrust parents into the position of being the only ones protecting their child. There have been a lot of discussions, obviously, internationally, et cetera, about the need for online regulation. We strongly support that. This bill is a step in that direction. We obviously need some broad-based regulation in many different areas, and clearly, the public will is there. Clearly, we have children being harmed on a regular basis, and the Cybertip data would back that up. There is a need for us to act.
Senator Batters: Ms. Rankin, could you provide us just a bit more of that sort of commentary as well? If you don’t have enough time right now, if there is anything that you would like to submit to our committee in addition, that would be welcomed as well. Thank you.
Ms. Rankin: I have shared with you some information from the U.K. It is very hard to analyze this or speak to children directly about it. You can’t study them. You can’t expose them to pornography and then see the reactions. You can have conversations after the fact. I do feel so much for the parents who are dealing with a child who comes to them having seen something as egregious as what I mentioned, like strangulation, for example.
In that Girlguiding report — which I have shared with you all; they have been translated — you are going to see a few comments that are direct, and this is important. There is so much expertise around this table, and you have invited experts here. I’m here to remind you that we’re talking about our children. We have to listen to them and find out how they are experiencing this.
What UNICEF has come out with in terms of our Canadian youth is true. They are not doing well. They are struggling. Their relationship with the internet, the amount of time they are spending on it, their exposure, whether it is through TikTok or, as we have heard from others, being exposed to pornography without any desire to look for it — this is devastating to them. Children’s brains are not developed fully enough to process and make sense of what they are viewing.
It is important that we look at other issues, not just sexuality, but also, for girls, the fact that they are now being looked at as objects, for example, by boys. This is what’s happening. That statement I made, “Do I have to choke her?”, is a quote from a friend who works with youth who are struggling with their exposure to pornography and their understanding of what sexuality is about. It is true that girls and society at large are very much exposed to how women’s bodies are presented online.
This is just something that I feel we need to act on, and as I said before, there is no one standing up. There is no children’s commissioner standing up and saying, “We must do this.” I do suggest that everyone review the document that I mentioned, The Silenced Citizens from 2007, which was a Senate study looking at whether our country is meeting its obligations in terms of what we have ratified. General Comment 20, General Comment 25 and many other conventions that we have signed clearly suggest — well, they don’t suggest. They are demanding that we protect our kids. There is a duty to care.
Senator Batters: Thank you. I truly appreciate you also reminding us about that Senate report. Very important topic. Thank you.
[Translation]
Senator Miville-Dechêne: Ms. Rankin, there is hardly any mention of children. Since I started working on this bill, I’ve been wondering why we’re not hearing from children and yet we’ve heard a lot from those who wish to protect the privacy of porn site clients.
Obviously, it’s important to protect the privacy of clients, but I’d like to hear your thoughts on why the first thing I hear all the time is that the government has no role here and that it’s up to parents to protect their children. We hear the argument that the government should not intervene in this issue, and yet it does so in the real world; and we hear that parents should ensure that cell phones — because we’re talking about mobile phones, not land lines at home — don’t have porn. That’s the most common argument I hear.
In one of his many appearances before this committee, Michael Geist, a University of Ottawa expert on the Internet, made this argument and said that it’s up to parents to handle this matter, period.
How would you respond to that?
[English]
Ms. Rankin: There is no way that parents can handle this. Everyone around this table is having a hard time handling this, so you can’t expect a parent to be able to be the only roadblock. We have heard discussions here today about age verification in devices and in relation to platforms. This is a bit like trying to protect somebody in a car: we first put in seat belts, and then we realized we needed airbags. We need — and parents need — support. It is the Government of Canada that ratified these agreements. They have to take on the responsibility and support parents. They cannot do it alone. When you are talking about adolescents, anyway, you are talking about youth that are out there in the world. Parents are not following them around day in, day out. Wi-Fi is everywhere, and their exposure is going to happen everywhere. Certainly, parents can, to a certain extent, sit behind their children and put in certain firewalls, if you want, but it is not going to be enough.
This is like a cancer. You are not just going to use chemotherapy. You are not just going to use surgery. You are not just going to use radiation. You are going to need to support and attack this from as many angles as possible. It cannot be the parents alone that do this. It has to come from government. Not only do we need the laws and the implementation, but we also need the enforcement, and this is critical. Otherwise, we’ll have ticked off the box that we’re taking good care of our kids, but we will have failed them all the same.
It is clear from the data that I shared from UNICEF that we are failing our kids. Canada has one of the highest rates of suicide amongst adolescents in the whole Western world, and we are being compared to 38 to 42 of our peer countries, depending on the report, in terms of GDP, et cetera.
Our kids have such great opportunities on the internet. We do not want to exclude them. I would like to see there being better access in the Far North, for example, but I don’t want a system that is railroaded by an industry that is in a position to help solve this problem. It was mentioned earlier that Zuckerberg would be able to, perhaps, do something to help this move forward. Absolutely, parents cannot do it alone. I think all of us have enough trouble just hoping that we’re protected with our own cellphones. No. We need help, and we need government, and we need action. Thank you.
Senator Prosper: Thank you to all of the witnesses here.
I’m trying, probably like most of us around this table, to understand the breadth of what we’re trying to deal with in this regard. In the previous panel, we listened to testimony from Lord Bethell who was talking about the efficacy of legislation in the U.K. He indicated nothing is perfect but you need, similar to what you mentioned just now, Ms. Rankin, a host of approaches to deal with this particular issue.
One thing that intrigued me was his mention of AI-enhanced pornography, and I believe he said — and I quote him for saying this — “it terrifies the hell” out of him. I’m just going to put it out there to any witnesses here, if you could elaborate on that. I’m just curious about that concern and the gravity of that concern. Can anyone contribute to what that sort of space looks like in terms of AI-enhanced pornography?
Ms. Campbell: That’s a good question, and, as a parent, that’s a huge concern of mine as well. When it comes to child safety online, it is not a binary solution. It is not just whether the government needs to do something or parents do. It is a holistic approach that is needed, including ensuring that if we are going to do age checks, that we have considerations for privacy and security, especially in an age where AI technologies are better than ever at collecting our data and our kids’ personal data, and perhaps that might contribute to this AI-enhanced pornography. That is absolutely terrifying to me as a parent. It is something I think about every time my kids go online in terms of what they’re sharing, what they’re being forced to share with services and what could potentially be breached, leaked and used by large language models to contribute to any one of the harms that we all don’t want to see online. I appreciate your question on that.
Ms. St. Germain: If I could add to that, from the perspective of our organization, the amount of AI-generated child sexual abuse and exploitation material has been going up for the past several years in terms of what we are dealing with. We know that this has a tremendous impact on criminal justice mechanisms. We have got law enforcement who are, in some cases, not able to readily tell the difference between the two because the technology is getting so good. That’s on the child sexual abuse side.
Obviously, anybody can now be the subject of AI-generated pornography that can ruin people’s lives. Deepfake adult pornography is a big problem. Just on general AI, we are now hearing in the news about chatbots and about families that have lost a child who died by suicide because of encouragement given to them by AI chatbots. We’re at a very bad place as a society, and we have not done enough — not nearly enough — to help protect our children and to step in and take the reins back from industry. We have to do that. We can’t just keep letting this go.
Senator Simons: My questions are for Professor Israel and Ms. Campbell.
There seems to be a contradiction that you both highlighted. Clause 6 of the bill would seem to protect an organization that only incidentally and not deliberately provides a service to search for, transmit, download, store or access content on the internet — which would seem to cover, Ms. Campbell, the examples you cited as concerning: small internet cafés or independent networks, especially those serving Indigenous People in the North, being captured. Clause 6 would seem to indemnify them, but then we get to 12(1)(a), which gives the Governor-in-Council very wide latitude to specify what is or is not pornographic material made available for commercial purposes. I think this is the tension, Mr. Israel, that you were pointing to in your comments. Can you each talk to me a little bit about how we would want to scope this out so that people who are innocent third-party providers wouldn’t find themselves on the wrong end of this legislation?
Ms. Campbell: Thank you for raising this important question. That was one of our main concerns when we first looked at this new version of the bill.
We really do appreciate the efforts to scope out services that are core functions of the internet, but I think the lack of clarity could leave things uncertain for some services, and who knows what will happen in the regulation-making process. There are so many services that make up the internet’s infrastructure, and they’re so vast. My fear is that being too vague about who is carved out or exempted from having to comply might wind up scoping folks in during a regulatory process.
I think perhaps some clarity on who this actually does apply to would be helpful and make it clear that this isn’t going to prevent people from using our strongest tools for internet security and our own safety and our children’s safety, like encryption, which are used by many of the internet infrastructure services.
Mr. Israel: Thank you for the question.
We share those concerns. We appreciated the inclusion of clause 6 and the intention behind it. We wanted to clarify that the intention behind it was, indeed, to exclude any site that isn’t effectively predominantly being operated for the purpose of circulating explicit sexual material. There should be no potential for this to capture social media platforms or other types of content-sharing websites where this type of content is only shared as a small fraction of what is generally being circulated on these platforms.
In addition, there is a provision in this legislation that would allow for ordering internet service providers to block access to sites that have been found to have infringed the notice requirements. On that front as well, meeting those requirements and implementing those website-blocking provisions could be very onerous for small internet providers, like ones operating in the North or small cafés that operate Wi-Fi networks, which could be captured by the definition of ISPs and caught by a very broadly framed order to block access to a site that has been deemed to be infringing.
I think more clarity on that provision, as well as on the exclusion under section 6, would be welcome. Thank you.
Senator Simons: They would be captured for transmitting. If they didn’t follow the blocking order, they could be in trouble for that?
Mr. Israel: Yes, precisely.
Senator Simons: Yet, we heard in today’s testimony, if you were listening earlier — as I believe you were — to the testimony from the European witnesses who said that it’s actually social media sites like X and Meta that are some of the biggest concerns. I confess that I myself remain confused about whether they’re scoped in or scoped out, given the pretty wide latitude that the regulations allow.
Mr. Israel: I think if this was intended, in fact, to capture social media sites, we would want to take a closer look at it. Our understanding was that clause 6 was intended to exclude these types of sites, but then this morning it sounded like they might still be in scope, subject to the regulations and maybe how “making available” is defined or maybe how some of the compliance notices are issued. If that’s the case, we would want significantly more safeguards, and we would want to consider the bill again with that lens in mind. Our assumption coming in was that they would not be directly captured, but the exchange this morning seems to suggest that, perhaps, they could be captured, depending on how the government implemented this bill. A lot is left to their interpretation through regulations, so now we’re concerned that they could be captured.
Senator Simons: Thank you very much.
The Chair: Ms. Campbell, did you want to make a comment?
Ms. Campbell: Yes, please, and thank you for that question.
I also share Mr. Israel’s concern that, from the comments this morning, this may entail that other websites are captured within the scope of entities. If websites and social media platforms are captured, then I question whether private communications are captured. This is why I think really making it clear that there are protections for use of encryption, private communications and things that our kids rely on to communicate safely online with families is crucial. It is a non-negotiable. We need encryption to have safe experiences online. Those protections should be in this bill to help ensure that we’re not taking away people’s strongest shield to have safe experiences online.
Secondly, on the question before about the content blocking, similar to what Mr. Israel just shared, there are dozens of community networks around Canada that help fill in the gaps for underserved communities or for people for whom the internet is simply unaffordable. I lived in the Northwest Territories for many years. I have visited many communities there where this is the reality. Community networks are community-driven solutions that help communities take connectivity into their own hands and bring solutions that meet their own needs. They’re in the Arctic. They are in Winnipeg serving the largest urban Indigenous population in Canada. They are in Indigenous communities across Canada.
The Internet Society has been supporting these types of solutions for many years, and we know how important it is to communities to ensure kids can do homework, people can do remote schooling and find work opportunities online. If any one of these networks were to get a court order for blocking — I was just speaking with North End Connect last week — they’d have to shut down. There’s no way they would be able to manage the content blocking orders, let alone the fees to challenge something like this in court. It’s not feasible.
Senator Simons: Thank you very much for flagging that for us.
Senator Dhillon: Thank you to all the witnesses for being here. I have a number of questions, but I’ll limit it to one, given our time here.
Mr. Israel, the Canadian Civil Liberties Association, or CCLA, has warned about law enforcement accessing customer data from private organizations without adequate safeguards, arguing that police access cases highlight weak retention limits and access controls. As a former law enforcement officer, I’m a little confused with that claim. Can you elaborate on the evidence that supports this claim?
Mr. Israel: I think my concern was just that this bill should prevent third-party identity service providers designated under clause 12 of this bill from being compelled to keep information that they’re collecting for the operation of this bill and disclose it to law enforcement. We’re not creating a law enforcement vehicle here; we’re creating a mechanism for verifying people’s identity. It shouldn’t be used by Canadian or foreign law enforcement to be able to get people’s identification, or IDs, as submitted or to compel a third-party identity provider to keep track of what sites are being visited when these people are authenticating that they are over the age of 18.
Senator Dhillon: So you are agreeing that law enforcement is not going to be accessing this information because the information doesn’t exist?
Mr. Israel: Law enforcement can issue what is called a preservation order, compelling a company to hold on to information longer than it normally would have, and then get a production order compelling a company —
Senator Dhillon: I understand that. Initially, the claim was that law enforcement would be able to access this information extrajudicially. Are you suggesting that that was a wrong claim?
Mr. Israel: I don’t think I said “extrajudicially.” Although, if Bill C-2 passes as it is currently slated, there is an information demand power that would apply to a third-party identity service provider, such as the one contemplated in this bill, and it would allow police to demand that that service provider confirm if a particular person has set up a profile with it. In other jurisdictions —
Senator Dhillon: But guided by existing laws within the Criminal Code —
Mr. Israel: In other jurisdictions, there are many extra-judicial powers that would allow for that, and if you’re using an international provider, a lot of these extrajudicial powers do have cross-border reach.
Our concern is broader than that. We don’t think that this should be available to law enforcement. It shouldn’t be a law enforcement tool, so they should just not be able to access it with or without a court order, essentially.
[Translation]
Senator Oudar: Thank you to all four of our witnesses. We’re running out of time. We all have a few questions, so I’ll just ask one question, which goes to Ms. St. Germain.
In your opinion, what mechanisms should be put in place so that the various levels of government, including Indigenous authorities, who have not been mentioned much here, can tailor some of the implementation details to ensure relevance at the local level while supporting our commitment to consultation and self-determination and to narrowing geographical and cultural gaps? Everyone is equal when it comes to the online protection of children. I’d like to hear your thoughts on that.
[English]
Ms. St. Germain: I don’t disagree with that.
In terms of this bill, the mechanisms in here involve quite a lot of work being done after the bill is passed in terms of the regulations, and I would expect that, as part of the regulatory process, taking into account all of those considerations is going to be part of it.
I did watch the testimony of the Office of the Privacy Commissioner of Canada, as an example, where he suggested to the committee that it’s very important that the Privacy Commissioner, as the regulator on privacy in Canada, be very involved in developing the implementation of this bill, and we strongly support that. That does make sense. They have expertise in that very specific area.
In this space, it’s very complicated, and our organization has experienced this complexity through the work that we have done over the years. What we know is that Canadian children and families are really struggling with everything that is being thrown at them technologically, that the pandemic amplified all of the harms that everyone was seeing and experiencing, and that there is a desperate need for government to step in and start doing something. This is one step in the right direction.
There is a need, though, for a much broader regulatory approach. Part 1 of Bill C-63 was one attempt at that. Maybe something like that comes in again. We definitely need an infrastructure. We can see all the work that has happened in the U.K. They’ve really done their homework. We can learn a lot from them, and we can learn a lot from Australia, but the time for Canada to do something is now.
[Translation]
Senator Oudar: Thank you, Ms. St. Germain. We have the same point of view.
In summary, under the legal framework, there is no need to make additional amendments to the bill at this stage. If I’m not mistaken, all the measures we have discussed here can be achieved under the regulatory framework of section 12. Thank you.
[English]
Senator Pate: I’m conscious of time and everybody’s schedule, so my question is to all of the witnesses. If you have suggestions of what, if not this bill, could actually achieve the objectives of the bill, if you could please provide those in writing, that would be great. Thank you.
The Chair: Thank you to all the witnesses who have come today, all four witnesses, with their advice and cogent evidence to assist our committee. Thank you.
(The committee adjourned.)