

THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS

EVIDENCE


OTTAWA, Wednesday, October 8, 2025

The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4:16 p.m. [ET] to study Bill S-209, An Act to restrict young persons’ online access to pornographic material.

Senator David M. Arnot (Chair) in the chair.

[English]

The Chair: Good evening, honourable senators. I declare open this meeting of the Standing Senate Committee on Legal and Constitutional Affairs. My name is David Arnot. I am a senator from Saskatchewan and chair of the committee.

I invite my colleagues to introduce themselves.

Senator Batters: Senator Denise Batters, also from Saskatchewan.

[Translation]

Senator Miville-Dechêne: Julie Miville-Dechêne from Quebec.

Senator Oudar: Manuelle Oudar from Quebec.

[English]

Senator Prosper: Paul Prosper, Nova Scotia, Mi’kma’ki territory.

Senator K. Wells: Kristopher Wells, Alberta, Treaty 6 territory.

Senator Simons: Paula Simons, Alberta, also Treaty 6 territory.

Senator Pate: Welcome. I am Kim Pate. I live here in the unceded, unsurrendered, unreturned territory of the Anishinaabeg Algonquin Nation.

[Translation]

Senator Clement: Bernadette Clement from Ontario.

Senator Saint-Germain: Good afternoon. Raymonde Saint-Germain from Quebec.

[English]

Senator Dhillon: Hello. Baltej Dhillon, from British Columbia.

The Chair: Honourable senators, we are meeting to continue our study of Bill S-209, An Act to restrict young persons’ online access to pornographic material.

For our first panel, we are pleased to welcome today, by video conference, Pierre Trudel, Professor at the Centre for Research in Public Law, Faculty of Law, University of Montréal; Victoria Nash, Director and Associate Professor, Oxford Internet Institute, University of Oxford, also by video conference; and in person, we have Alison Boden, Executive Director of the Free Speech Coalition.

Thank you to all the witnesses for joining us today. We will begin with opening remarks.

I would ask the witnesses to restrict their remarks to five minutes — it’s very short — to give an overview so that we can get into the meat of it with questions from the senators.

I now ask you, Professor Trudel, to open with your remarks, please.

[Translation]

Pierre Trudel, Professor at the Centre for Research in Public Law, Faculty of Law, University of Montreal, as an individual: I would like to thank the committee for the invitation to participate in discussions on Bill S-209.

My name is Pierre Trudel. I am a professor of cyber and internet law at the University of Montreal, and for the past 30 years, I have been interested in internet regulations and the best way to strike a balance between freedom of expression and other societal values.

Bill S-209 is consistent with initiatives in other parts of the world that address issues arising from the circulation of certain types of content considered to have significant potential to cause harm.

Indeed, material that contains explicit sexual activities poses potential risks that must be managed. Public health imperatives are ample justification for the guardrails the bill proposes to put in place.

In most parts of the world, a number of groups that study online safety have found that exposure to pornographic material is highly likely to cause children to trivialize dangerous behaviours and to lower their inhibitions towards them, increasing the risk of problematic behaviours and hindering the development of healthy sexuality.

Bill S-209 imposes an obligation to verify or estimate the age of individuals accessing pornographic material online. It does not prohibit pornographic material as such; rather, it aims to ensure that access is restricted to individuals over the age of 18. In that respect, it is no different from the prohibitions and rules that already exist to restrict access to pornographic material off-line.

Bill S-209 creates a duty of care for online platforms that generate income from the circulation of videos and other content primarily on the internet.

Section 5 of the bill makes it an offence to make pornographic material available to persons under the age of 18 for commercial purposes.

The scope of the bill excludes material that has a legitimate purpose related to science, medicine, education or the arts. The bill primarily targets commercial distributors of pornography. It applies to organizations whose major purpose is to make pornographic material available on the internet for commercial purposes.

Bill S-209 introduces an offence that is subject to a mechanism stipulated in section 9 of the bill. Its purpose is to issue a notice to online companies to have them take the necessary steps to restrict access to pornographic material to persons over the age of 18.

A designated independent authority with the appropriate expertise will issue notices to websites that violate the law.

If the platforms, be they Canadian or foreign, do not comply after a reasonable period, the authority designated by law may apply to the court for an order to prevent access to the prohibited material by young people.

Technology is constantly evolving. It is therefore appropriate, and commendable, that the bill provides the power to make regulations prescribing age verification methods. These regulations must ensure that the mechanisms used are effective, fall under an independent organization and, most importantly, protect privacy.

In conclusion, Bill S-209 proposes to introduce appropriate legal mechanisms into Canadian law. These mechanisms will ensure that conditions governing off-line access to content that may be harmful to children will also be applicable to online platforms.

It is an effective and targeted way to clarify the duty of care incumbent on commercial entities, whether online or otherwise, that distribute sexually explicit material on the internet. Thank you, Mr. Chair.

[English]

The Chair: Thank you.

Victoria Nash, Director and Associate Professor, Oxford Internet Institute, University of Oxford, as an individual: Thank you very much. It’s a great pleasure to speak to you this afternoon. As per the introduction, I’m an associate professor at the Oxford Internet Institute at the University of Oxford, where we conduct research on the societal implications of digital technologies, including the internet and AI. My background is as a political theorist and scientist, and I conduct policy analysis trying to understand better approaches to governing digital technologies.

I was particularly pleased to be asked back for a second time and to observe the progression of this bill over the preceding couple of years. I note with some positivity the narrowing of the definition of the type of content and the type of service you’re focusing on and, in particular, as mentioned by the previous witness, the possibility of having measures which will guide the appropriate use and standard for age verification.

In the period since I last appeared before your Parliament to give evidence, we have had some quite interesting developments in both the technological space and the policy space on this topic. In particular, we’ve seen states like the U.K. implement similar approaches with our Online Safety Act, which has been in place since 2023 but through which measures regarding the responsibilities of online commercial pornography providers have taken effect this summer. We’ve seen advancements in the development of technologies around age estimation, which I gather you may hear from other witnesses about later.

However, despite some of these advances, I would flag from the very beginning that my sense is that some of the same concerns and trade-offs remain at this point in time, and in particular, the concerns about privacy should be taken very seriously by this committee.

I am much more optimistic now about the potential of age verification and age estimation technologies to be more privacy-preserving than they were perhaps two or three years ago. I think that is improving rapidly, but I would note that in many cases, it is an empirical question as to whether that privacy is appropriately protected. For example, we have seen a case in France where a forensic investigation found that an age verification service was storing much more information, including the particular videos people sought access to, so it is a practical and empirical question and not merely a technical one.

I would also flag one other point — it hasn’t changed significantly, but it has changed in scale even in the last two or three years — which concerns the baseline of information we are already providing in our daily lives online. There is often an assumption that in the absence of age verification or estimation, we can access online content and services without providing information about ourselves, without leaving digital traces. I wanted to flag that I don’t think that is the case, that the majority of us are attracting small pieces of tracking information like cookies or voluntarily handing over information everywhere we are going online, for example, when we’re using social media.

It is relevant to think about privacy in those two ways, one to do with technical but empirical practices around data collection and deletion and one regarding what the backdrop is.

To finish with a couple of points about the U.K. experience, I fear it is too early to tell whether our legislation has been effective in terms of restricting minors’ access to pornography online. I note that Ofcom, our regulator, has already initiated investigations into a number of companies. I think it’s more than 60 pornography websites, which is encouraging. On the other hand, we have experienced legal pushback from 4chan, which I believe has raised a case in the U.S. that seeks to challenge the applicability of our law in the U.K. space.

We have also noted something else that may be relevant to this committee: a purported rise in downloads of something called virtual private networks, by which individuals can bypass age verification measures by pretending to access services as if they are in another country. I saw a supposed rise of 1,400%. I don’t have independent verification for that, but I think this is something we will be monitoring closely in the U.K., both the effects on minors and evidence of technical circumvention. I am happy to take questions on any of this. Thank you.

The Chair: Thank you.

Alison Boden, Executive Director, Free Speech Coalition: Honourable senators, thank you for the invitation to testify today. As the head of the Free Speech Coalition, the trade association for the adult entertainment industry, I’ve spent the last several years almost entirely focused on age verification policy.

As both an experienced software engineer and former CEO of an online adult platform, I understand why Free Speech Coalition, or FSC, member companies have struggled to comply with laws in two dozen U.S. states, the United Kingdom and Europe touching on this subject.

Today, I’ll be sharing the real-world outcomes of website-based age verification mandates for pornography websites and suggesting changes to Bill S-209 that I believe would better protect the children and adults of Canada.

Since January 1, 2023, I’ve watched the same pattern repeat itself every time a new age verification statute goes into force: Adults refuse to comply, children remain exposed to explicit content and sex workers suffer the consequences.

Unfortunately, despite polls showing strong public support for this policy, the reality is that fewer than 5% of users actually complete the verification process. Many fail because the systems are confusing or technologically unreliable, but the vast majority simply refuse to comply.

As mentioned, when the U.K.’s rules came into force this summer, over the first weekend alone, a major VPN provider reported an 1,800% spike in new sign-ups, and data showed that an additional 66,000 Britons had resorted to using the dark web to avoid having to age verify.

Perhaps that is a rational reaction in a world where, to quote former FBI director Robert Mueller, “There are only two types of companies: Those that have been hacked and those that will be hacked.” The consequences of this can be devastating, as we’ve seen with the recent example of the Tea app, where a breach of identity verification data exposed tens of thousands of women to online doxxing and harassment.

The same pattern shows up in the United States, creating perverse market incentives. Researchers at Stanford and New York University, or NYU, found that when age verification laws took effect, traffic to compliant sites dropped by 51%, while traffic to non-compliant rivals rose by 48%.

Worse, because the laws in some jurisdictions only require age verification for pornographic websites, children are still able to stumble upon explicit content elsewhere. Research has repeatedly found that the majority of teens encounter pornographic content on social media and are more likely to come across it through search engines than adult websites.

Finally, a side effect of the way age verification laws have been written in the U.S. and elsewhere is that they exacerbate financial precarity for sex workers — a community that is already heavily marginalized despite providing a completely legal and highly popular form of entertainment.

For these independent content creators and other small businesses, the burden is often financially impossible to meet. Implementing and operating an age verification system on their websites can cost hundreds of thousands of dollars — far beyond what most can afford.

Nearly half of creators surveyed this year reported declining income as fans — and even the creators themselves — struggled to access the platforms where their content is sold.

After years of working directly with companies trying to navigate age verification laws, I’ve concluded that for a law like Bill S-209 to be effective, the following criteria must be met.

First, the law must account for the ecosystem of children’s online experiences and not exclude the websites and platforms where children are most likely to inadvertently encounter adult content.

Second, it needs to prioritize user privacy and make compliance easier than circumvention. The current approach of forcing users to verify their age using unknown third-party verification services on multiple individual websites is simply not working. Happily, this can be accomplished using existing technology that device manufacturers like Apple have already created. They simply need to make it accessible to websites.

In closing, Canada is well positioned to demonstrate global leadership by leveraging the experiences of other jurisdictions. Adopting a legislative framework that protects user privacy, significantly decreases circumvention and eases the burdens of compliance would protect Canadian children more effectively while safeguarding the rights and dignity of adults.

Thank you for inviting me today. I look forward to your questions.

The Chair: Thank you. Senators, we have about 40 minutes left and 4 minutes each for the question and answer. I ask senators to be succinct with questions and responders to be succinct with answers. The first question is from the deputy chair, Senator Batters.

Senator Batters: Thank you all for being here with us today and providing us with important evidence on this bill. I would like to start with Professor Trudel.

First, I want to note that you are a well-known expert on the internet and protection of privacy, so I thought it was particularly important when you said in your opening remarks that you think this bill introduces relevant legal mechanisms in an effective and targeted way.

Some critics of this bill question whether the age verification provisions could be circumvented by youth who use VPNs to bypass age verification safeguards. What is your response to that?

[Translation]

Professor Trudel: It is certain that on the internet, there is always a certain degree of circumvention. The main challenge is to ensure that there is as little of that as possible. Saying that we should not put a law in place because individuals may circumvent it is not, in my opinion, the right way to look at the issue.

In fact, the bill provides for protection from the outset. Companies have the primary onus of ensuring that the law is enforced, and they must take the necessary measures to enforce it. In addition, the authority that the bill proposes to establish must closely monitor trends in the use of circumvention measures or mechanisms, which are unavoidable. It should be noted that circumvention will always be there, and this is not unique to the internet. Any legislation can be circumvented. This also happens to be the case with the internet.

[English]

Senator Batters: Thank you very much. Ms. Nash, from your introductory remarks, I note that you are more optimistic that age verification measures protect privacy much more now than they did before. Can you please tell us more about that?

Ms. Nash: Certainly. I should flag that I’m not a computer scientist, but my understanding is that we now have a greater variety of tools available to us. Imagine using almost a waterfall approach where you start out with the most privacy-protecting tools, for example, age estimation based on technologies that estimate age from facial images. These don’t require you to give any information about your identity or provide any documentation.

You can start with that. Those estimation processes are not perfect; they may give you an age estimate within three to five years. If that takes you too close to the age boundary you are concerned with, you might then turn to another measure, one that requires you to provide some identity documentation.

The point is that we now have a whole toolbox of measures which can be used separately in a privacy-protected way, and you only use the ones that require more identity information in the hardest cases. That is a new and important development.
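[Illustrative sketch: the following TypeScript illustrates the tiered, or waterfall, age-assurance flow Ms. Nash describes. Every function, threshold and return shape is a hypothetical placeholder added for illustration; none of it corresponds to any real provider’s API.]

```typescript
// Tiered ("waterfall") age assurance: start with the most privacy-preserving
// check and escalate to document-based verification only in the hard cases
// near the age boundary. All names and values are hypothetical placeholders.

type AssuranceResult = { allowed: boolean; method: "estimation" | "document" };

// Placeholder for a facial age-estimation service: returns an estimate and the
// provider's stated margin of error, with no identity data collected.
async function estimateAgeFromFace(_image: unknown): Promise<{ age: number; marginYears: number }> {
  return { age: 24, marginYears: 4 }; // e.g. an estimate of 24 +/- 4 years
}

// Placeholder for a stronger check that requires identity documentation.
async function verifyAgeWithDocument(): Promise<{ over18: boolean }> {
  return { over18: true };
}

async function checkAge(image: unknown): Promise<AssuranceResult> {
  const MIN_AGE = 18;

  // Step 1: most privacy-preserving tool first.
  const { age, marginYears } = await estimateAgeFromFace(image);

  // If the estimate clears the threshold even at the low end of its error
  // margin, accept it without asking for any identity documentation.
  if (age - marginYears >= MIN_AGE) {
    return { allowed: true, method: "estimation" };
  }

  // Step 2: only users near the boundary are asked for identity documents.
  const { over18 } = await verifyAgeWithDocument();
  return { allowed: over18, method: "document" };
}
```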

Senator Batters: Thank you very much.

[Translation]

Senator Miville-Dechêne: My first question goes to Professor Trudel. According to a Leger poll, 77% of Canadians support age verification.

You are a digital expert. How do you explain the fact that some people are describing this bill as a threat to the freedom of expression of those who like pornography, given that this bill does not stop them — on the contrary — from visiting the websites? Yet, in certain circles, this bill has been described as a threat to freedom of expression.

Professor Trudel: Obviously, that is fair enough. It is a very common strategy for some individuals to describe measures that impose certain restrictions as a violation of freedom of expression. In fact, in Canada, freedom of expression is a freedom that is subject to restrictions. It has never been understood to be absolute freedom.

Most countries, and Canada in particular, have a certain number of laws that restrict freedom of expression in a limited and reasonable manner.

Consequently, invoking freedom of expression without looking at the reasonable and targeted nature of the restrictions that a law such as Bill S-209 proposes is, in my opinion, missing the real issue: Are the restrictions imposed by Bill S-209 reasonable in a free and democratic society?

Indeed, as you pointed out, in Canada there is broad consensus on the need to restrict children’s access to pornographic material. Moreover, courts have determined that restriction to be reasonable. Consequently, what we have here is a bill that imposes a reasonable restriction on freedom of expression, namely, age verification or age estimation of individuals who wish to access certain content; the aim is to protect minors and children, who may be harmed by this material.

Senator Miville-Dechêne: I have a question for Ms. Nash. You are in Great Britain, in the United Kingdom. What is your assessment of the implementation of the Online Safety Act? Age verification has been carried out since July 25. Of course people use private virtual networks, or VPNs, but otherwise, have there been data breaches and negative events that could put a damper on our efforts as far as age verification is concerned?

[English]

Ms. Nash: I wish I could say that we had any clear data to give you; unfortunately, it has only been two and a half months. As yet, those reports about VPN downloads are the only concrete data I have seen. I would note that companies have had plenty of time to adapt to this. It is very notable that the U.K. Children’s Commissioner’s most recent report, which came out in the early summer just before the measures came into effect, noted no improvements.

Senator Miville-Dechêne: Thank you very much.

Senator Prosper: Thank you to the witnesses for coming before us.

I would like to follow up on a couple of things. Ms. Boden, thank you for your opening remarks and providing us with what you referred to as “real-world outcomes.” It’s something I’m sure you have tracked for quite some time in terms of the use of age verification technologies and what happens as a result of those, as well as the statistics you mentioned about the rise in VPNs, private ones, and the dark web. Could you further elaborate on your experience in terms of, when you have the introduction of technology like this, what tends to happen in the real-world dynamics?

Ms. Boden: Thank you for your question, senator. Essentially, we are seeing that, although these laws are certainly enacted in good faith, users are very reluctant to share, be it their government ID or simply a scan of their face that is supposed to guesstimate, as was mentioned, how old they are. It feels like every week, we hear about a new enormous data breach. It is very frightening, especially when this particular type of data, regarding the sort of sexual interests you have, is so very sensitive.

Maybe the best comparison case to look at there is Ashley Madison and, sadly, the potential suicides that resulted. It is very sensitive data. Even the age verification providers cannot guarantee that the data is safe — you will see in my written submission that I grabbed four different age verification providers and took a look at their privacy policies. Each one said something along the lines of, “No system is 100% secure, and we will not ensure or warrant the security of any data that you send us.” I know as a former software engineer that any time you’re sending data through the internet, it is at risk.

VPN use is absolutely rocketing up. The research is showing that. There is also a phenomenon that we have only seen some hints of, but I have had a few identity documents sent to me at the Free Speech Coalition via email by confused folks who are not exactly sure how to verify their ages. They are having trouble figuring out these systems, and they are very vulnerable to phishing attacks. They will get a message saying, “Oh, would you like to look at my content? I am a lovely person.” And they will click on this website. It will ask them to age verify and then it will maybe take a photo of them showing their ID. That is being sold on the dark web.

So there are some very concerning use cases, and I am not shocked that people are not willing to subject themselves to them in order to avoid this.

Senator Prosper: You just mentioned the dark web. Could you please share what, in your experience, the expanse of that could potentially encompass?

Ms. Boden: Sure. I would describe the dark web as similar to the normal internet, but it is secret. It is hidden. You cannot get there on your normal browser. You have to use a special browser. On it is essentially any kind of criminal activity you could imagine.

The idea that people are looking for pornographic content there is really frightening to me, because not only can you buy drugs or guns on the dark web, but also child sexual abuse material is sold there. Revenge porn, or non-consensual intimate imagery, is on the dark web. Even just looking for mainstream pornographic content, people are in a lot of danger of ending up exposed to illegal materials.

Senator Prosper: Thank you.

[Translation]

Senator Saint-Germain: My question is for Professor Trudel. Obviously, the bill we are considering has a noble objective, and nobody is going to challenge that.

However, we have before us a bill that is very principle-focused and leaves significant room to regulations, whereas we are asking the government to prepare and implement legislation based on regulations that will determine the best mechanisms for verifying and estimating age. Experiences abroad are recent and not yet sufficiently conclusive.

Thus, from a legal standpoint, do you have any concerns that such a bill is too vague and only meets minimum legislative standards, and that, due to its flexibility, a regulator could approve a method that could then be overturned by the courts, but above all, a method that has not been sufficiently validated, so that the adverse effects and risks associated with privacy could outweigh the intended benefits?

Professor Trudel: We should assume that things evolve very quickly on the internet and for that reason, it is often necessary to use regulatory authority. There should be some kind of public regulatory authority. My understanding is that the bill proposes an authority with the necessary expertise to understand how the internet works and how it is evolving in order to act swiftly since things happen at speed in cyberspace. We must be able to act swiftly and potentially monitor trends, keep up to date with technology, with its strengths and weaknesses, and so we should work with businesses to put in place best practices and the best ways to carry out age estimation and age verification.

However, it would be a mistake to believe that we can pass legislation that applies on the internet and expect it to be enforced without the need for adjustments or a guidance process that should be run by experts in conjunction with the industry. There is a need to identify potential pitfalls and to research and develop the best possible solutions. It is an ongoing process, which should, in some way, be done in collaboration with the relevant stakeholders.

All stakeholders care about protecting children. Therefore, the public agency to be set up by the bill should work with businesses and platforms to ensure the use of best practices, and to intervene swiftly if these practices reveal any pitfalls in order to address them wherever possible.

Senator Saint-Germain: Do you think in this context that the mandate of such a public agency will be complicated and will require a lot of highly specialized resources?

Professor Trudel: Absolutely. It is clear that in order to take action in an environment as complex as the internet — we are talking about protecting children, but we have to remember that a lot needs to be done on other online platforms that are also harmful and endanger other people. It is undeniable that we need effective regulatory mechanisms with the necessary expertise in order to take action in a very complex environment. Let me add that these public agencies must work with their international partners since we are operating in a global environment.

It would be erroneous to assume that such an agency would only address events that take place in Canada. It has to work with partners in other parts of the world.

Senator Saint-Germain: I get that. Thank you.

[English]

Senator Dhillon: Thank you, panellists, for your expertise and lived experience. I certainly understand that all three of you are in favour of protecting our children from harm, so we’ll start with that as a basis.

Professor Nash, I want to give you some leeway because you said that you do not have any data to support it but understand that — and I certainly recognize that Ms. Boden confirmed this — there’s been an estimate of a 1,400% increase in virtual private networks, or VPNs, since these laws came into effect.

Professor Nash, with that understanding, do you still believe that these bills and laws are important and required?

Ms. Nash: Thank you for that question. Yes, I do believe these bills are important and required but with a caveat that I don’t think they will provide a perfect solution.

In my book, what these laws are very helpful for is making it that much harder for children and young people to access material and, in particular, making it harder for them to stumble upon it. I frankly don’t think there is a law that you could introduce that would stop a very determined teen from finding a way to access pornography online, and I don’t think that can or should be the goal.

Senator Dhillon: Thank you. We hope children in our homes are using their brains and expertise for purposes other than trying to break into these types of sites, but I accept your answer.

Ms. Boden, if I may, in a recent LinkedIn post, you shared that age verification laws “ . . . do nothing to protect young people.” Can you elaborate on that position in the context of Bill S-209 and what evidence or experience informs this view and position of yours?

Ms. Boden: Thank you for your question, senator. The issue is that we are targeting sites with a profit motive, which are not where young people first encounter this sort of explicit material, when they can go on Twitter, X, Instagram or any number of platforms. And they are often not looking for this content. They are just encountering it, whether it’s being sent to them by someone or they are just seeing it in their feeds. I know most search engines are trying to avoid this sort of thing, but if you search for a term that may have an innocent meaning to a child, it is not difficult to end up with a wall of images —

Senator Dhillon: I have limited time, so I’m going to cut you off. Can I rephrase my question so we can get to the answer?

Ms. Boden: Certainly. I apologize.

Senator Dhillon: Do you believe that measures, like the ones we’re talking about in Bill S-209 will have some effect in protecting our children from this harm?

Ms. Boden: I think that the effects of this bill as written would probably be felt more by adults than children, and if we really want to help protect children, the place to do that is on the device so that they cannot access any kind of adult material. They can be blocked from it. Then adults aren’t led down this path of danger.

Senator Dhillon: Ms. Boden, you’ve also compared the Ashley Madison data breach to the work that we’re doing here. One is adults looking to procure sexual services, and the other is commercial sites with pornographic material. Do you see these as comparable?

Ms. Boden: Sadly, I do. I’ve worked for a variety of pornographic sites, and if there is anything that I’ve learned, it is that different people like different things. They’re not always the same things you or I might enjoy, and there is a lot of shame surrounding viewing pornography in our society.

I have answered customer support emails of people desperate to have their viewing history erased so that it could never be exposed. It is not uncommon for people to be very concerned about this information getting out. And, frankly, in my own life, I’ve seen people who have had this sort of information used against them to, among other things, try to take their children away. So, yes, absolutely, this information is just as, if not more, dangerous than what Ashley Madison had.

Senator Pate: Professor Trudel, I understand that you appeared before the Standing Committee on Canadian Heritage in 2023 to speak about holding international tech giants accountable under Canadian regulations. During your testimony, you discussed how tech giants use intimidation and subversion tactics to evade regulation in Canada. Given that this bill will likely look at some of these issues, do you think such tactics could impact the operation of the legislation that’s proposed?

[Translation]

Professor Trudel: Yes. Unfortunately, a number of big platforms have more economic power than some entire nations, so one has to assume that when the platforms you are talking about wish to evade regulations, they can come up with all sorts of strategies to try to derail legal processes. In fact, lawmakers in all democratic nations face significant challenges in cooperating with their counterparts to maximize the likelihood that democratically enacted legislation to control and balance the activities of these platforms is effectively enforceable. That is a huge challenge.

[English]

Senator Pate: Do you have other recommendations about how this could be achieved to fulfill the objectives of this legislation?

[Translation]

Professor Trudel: One of the recommendations that could be made would be to insist that the agency proposed under Bill S-209 networks with comparable agencies, citizens’ groups, university researchers and stakeholder businesses across the world to ensure industry-wide monitoring that is sufficiently smooth and effective to swiftly identify pitfalls and determine the best ways and technical or regulatory strategies to address them.

This requires ongoing monitoring, as happens with medicine. There is oversight when medication is put in circulation because these are dangerous substances. Often, many activities on the web involve multi-faceted hazards. We need the means to take action to ensure effective regulation of these complex and wonderful tools, which come with risks.

[English]

Senator Pate: Thank you very much for that.

In an unrelated but, to my mind, linked area, offshore tax evasion, Canada has chosen not to pursue this. I recently had discussions with a very small country, Iceland, which has chosen to pursue it. Do you know of countries that are doing this well that we could perhaps look to, to learn how to take this seriously and address it in a serious manner?

[Translation]

Professor Trudel: In connection with the European Union, Ms. Nash talked about the United Kingdom’s experience. It seems to me that Canadian authorities should work with these countries, along with Australia.

[English]

Senator Pate: Thank you. Basically, you’re saying it takes the political will, and there are ways to do that.

[Translation]

Professor Trudel: Absolutely.

[English]

Senator Simons: Thank you very much to all of our witnesses. Ms. Boden, I would like to start with you. Up until now, most of our testimony has centred around the big players, sites like Pornhub and social media platforms like X, but you have raised a question we have not discussed here, about what you might call independent producers. Is “cam girls” the right term?

Ms. Boden: Sometimes.

Senator Simons: Yes. I mean young people and many of the women who are using OnlyFans and other platforms to do safe sex work, where they’re not in contact with customers and can make money without putting their physical safety at risk.

Can you talk a little more about what kind of imposition legislation like this would be for those small, independent producers, if I can use those terms?

Ms. Boden: Thank you for your question, senator. What we’ve seen so far is that because their customers are unwilling to perform age verification, they are no longer paying customers for these folks.

Senator Simons: You mean in places like the United Kingdom or Australia?

Ms. Boden: Absolutely. But wherever the legislation seems to be passed, they lose the customers in that geolocation. As mentioned, this is a global, worldwide internet, and so a creator — who may or may not be based in Canada — would certainly potentially lose income from Canadian customers. That has been happening. There are several instances of folks losing 20% or 30% of their yearly income simply because folks are not willing to age verify.

Senator Simons: Yes. This raises a question that Professor Nash touched on. I wanted to ask both you and Professor Nash this question. It seems to me that the danger we run into is that we could construct a system whereby it is no problem at all for a tech-savvy 17-year-old to access this material, but for a lonely 70-year-old man, it might be next to impossible. Professor Nash, is there a danger that we are actually cutting consenting adults off from material that they are legally entitled to purchase or watch, while not doing very much at all to discourage 15-, 16- and 17-year-olds from seeing the same material?

Ms. Nash: Thank you. That’s an excellent question, and I suppose, respectfully, I would suggest that this is one that you’re all very well placed to answer because it precisely concerns a matter of trade-offs. I don’t think there is a perfect technical or policy solution that would, exactly as you suggest, make it impossible for your determined 17-year-old but really easy for your 70-year-old.

Having said that, I do think that some of the newer age verification solutions that we have should be much easier in the future for older users to utilize and, in particular, wouldn’t also disenfranchise individuals who don’t, say, have passports or government identity documents.

So I think things are improving, but I frankly would agree with you that there is a trade-off, and I’m very grateful, in a way, that I’m not in your position of having to decide which of those two outcomes is the more appropriate or desirable one.

Senator Simons: Ms. Boden, I don’t know if you want to take a stab at that.

Ms. Boden: I have personally attempted to go through a variety of age verification systems. They often involve, if you’re on a laptop or desktop device, having to go get your smartphone to take a photo of yourself and then show an ID. Even if you don’t have to show the ID, maybe you have to scan it all around your face. They’re very complicated. It is definitely far more difficult for elders, as I’ve seen myself with emails that not only include identity documents, unfortunately, but also just kind of desperation and looking for help. Like, “Hey, I paid for this. I don’t understand this technology. I don’t know what to do.”

Senator Simons: I’m just curious. How close did they get to your age?

Ms. Boden: I don’t look at the documents, so I couldn’t quite answer that. Usually, it’s like, “Oh, no,” and so we delete them. I’m in my early forties. I would say they’re within two decades, certainly.

Senator Simons: Two decades? All right. I think you look as if you could be in your late twenties.

Ms. Boden: Well, thank you so much.

Senator Simons: I’d believe that.

Senator Batters: I have a supplementary question on this. You’re talking about the possibility of having cam girls or things like that included, but this act applies only to organizations. As the definition in this bill says, “organization has the same meaning as in section 2 of the Criminal Code.” That definition is as follows:

organization means

(a) a public body, body corporate, society, company, firm, partnership, trade union or municipality, or

(b) an association of persons that

(i) is created for a common purpose,

(ii) has an operational structure, and

(iii) holds itself out to the public as an association of persons;

It is quite a defined thing meant for much more organized situations than cam girls.

Senator K. Wells: My question is for Professor Nash. In your comments, you mentioned France as an example and the third-party age verification provider’s breach of privacy and information. Could you share a bit more about that situation? You mentioned that content was tracked. What kind of content, and for what purpose was it tracked? How was that provider held accountable for this privacy breach?

Ms. Nash: I can certainly follow up with the exact information. I’ve only read what’s been reported in the media on this. My understanding is that it was a third-party age verification service provider rather than an organization that was actually doing the age verification. So it was a middleman in this process.

France has a requirement that there must be provision of some services which are technically double-blind, and in this case, it was a service that was offering — in theory — this double-blind age verification service provision.

The forensic investigation discovered that, at the stage where the user was communicating an intent to verify their age with this provider, the company was making a note of the precise video the user was trying to access, and that information was then stored and linked to the potential identity of the user.

It is quite a complex and unusual case. I think Ms. Boden made a point at the beginning about the ecology of the actors in this space. This is something we are thinking about in the U.K. and might be wise to think about in this case too: What are the services that would be in scope here? Would it solely be those who are actually providing the age verification or estimation? Would there be any regulation or standards set for other third parties involved in that process to ensure they are also maintaining appropriate standards of privacy and data minimization? I’m very happy to follow up with the article and send it to the clerk if that would be helpful.
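[Illustrative sketch: the following TypeScript illustrates the “double-blind” pattern Ms. Nash refers to, under stated assumptions. The token format, key and functions are hypothetical placeholders and are not drawn from the French service or any real provider.]

```typescript
// Double-blind age verification, sketched with hypothetical placeholders: the
// verifier learns nothing about which site or video the user wants, and the
// site learns nothing about who the user is, only that a valid "over 18" token
// was presented. In the French case described above, the intermediary broke
// this property by also recording the specific video each user sought.
// A real deployment would use asymmetric signatures (the verifier signs, sites
// hold only a public key); a shared HMAC secret is used here only for brevity.

import { createHmac } from "node:crypto";

const VERIFIER_KEY = "demo-key-for-illustration-only";

// Verifier side: checks the user's age evidence, but is told nothing about the
// content or site being accessed.
function issueAgeToken(over18: boolean): string {
  const payload = JSON.stringify({ over18, exp: Date.now() + 10 * 60 * 1000 });
  const sig = createHmac("sha256", VERIFIER_KEY).update(payload).digest("hex");
  return `${Buffer.from(payload).toString("base64")}.${sig}`;
}

// Site side: checks validity and the age claim, but receives no identity data.
function acceptAgeToken(token: string): boolean {
  const [encoded, sig] = token.split(".");
  const payload = Buffer.from(encoded, "base64").toString();
  const expected = createHmac("sha256", VERIFIER_KEY).update(payload).digest("hex");
  if (sig !== expected) return false;
  const claim = JSON.parse(payload) as { over18: boolean; exp: number };
  return claim.over18 && Date.now() < claim.exp;
}
```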

Senator K. Wells: That would be very helpful. You may not know the answer to this, and it is something we can follow up on, but given the seriousness of such a privacy breach, what were the consequences — penalties or otherwise — for that provider?

Ms. Nash: I don’t think I read about any penalties being applied as of yet, which is interesting.

Senator K. Wells: This real-world example reflects one concern we’ve heard from hundreds — if not thousands — of Canadians around the safety of their data and, as Ms. Boden said, the sensitive nature of the material that is being tracked and the purposes of that tracking.

Ms. Nash: In my introduction, I did flag one other issue that is important: In many cases, these individuals are already giving up a lot of this data voluntarily to the commercial companies themselves, as well as the payment companies, for example, whose services are enabling the financial gain. Also, the pornography companies are selling that to advertising companies. A lot of this data is already out there, so we shouldn’t fixate too much on just this aspect, though this is the aspect you are focusing on.

Senator K. Wells: All of that raises broader conversations around data safety and security sovereignty. Thank you for that.

[Translation]

Senator Oudar: Thank you to our three witnesses. This is a very interesting discussion. I am going to start with Professor Trudel and then move on to Ms. Nash and Ms. Boden.

I am wondering about the recommendations that can be put to the designated authority, while of course ensuring the duty of care, to swiftly monitor technological advancements. I would also include socio-technological advancements because they have a huge impact on young people. What measures would you like this authority to take so that the provisions put in place by Bill S-209 remain relevant as digital usage evolves and new risks emerge, particularly with the creation of new platforms and non-conventional content that is becoming accessible to young persons?

In conclusion, I would like to hear your thoughts on one issue: If you could give the designated authority a couple of recommendations, what would they be? My question goes to the three of you, starting with Professor Trudel. Thank you, all three of you.

Professor Trudel: I think the main recommendation would be to establish a multi-sector stakeholder network, including government regulators, researchers, technology experts, compliance strategy experts and industry stakeholders. It would be a very tight network that would move swiftly to identify and anticipate trends and take swift and effective action. In other words, the opposite of an organization that reacts to requests and takes six months to respond. The agency should essentially be able to act in a more proactive and concerted manner. This is the biggest challenge, but in my humble opinion, it is the challenge of regulating activities on the internet.

Senator Oudar: Thank you.

[English]

Ms. Nash: Thank you very much. In my case, I would propose two things. First, that, as we have in the U.K., the designated authority or regulator be granted sufficient resources to have research teams in-house, simply to keep pace with, for example, technological advances.

The other thing I would probably suggest — but it would be a source of support for issues and policies much broader than simply this bill — is, just as regulators like Ofcom in the U.K. have consumer panels, having a youth panel to inform and advise on practices and changing behaviours, particularly those of young people. I think their voices often aren’t heard in these debates.

Ms. Boden: I fully agree with the other panellists, and I would suggest that adult consumers be part of any panels that are giving feedback, just to ensure that they’re able to deal with whatever limitations they’ll face.

I would also want that authority to consider the entire ecosystem. Where are the points at which youth can be stopped from seeing this content that aren’t necessarily each individual website, if that’s not the most efficacious place for them to do it? How can we look at the way the youths are going online and the devices they’re using? I don’t know if your refrigerator connects to the internet these days, but many do. Having the technical expertise to understand the environment would be very helpful.

Senator Clement: Thank you to all of the witnesses. I have a question for Professor Nash and then Ms. Boden.

Professor Nash, you talked about how we’re already leaving traces of our identities everywhere. All the social media platforms know that I’m a Black woman who prefers hip-hop music, but to Ms. Boden’s point, information of a sexual nature is a whole different category.

You just answered Senator Oudar and mentioned a youth panel. We heard that from the Privacy Commissioner last week. This bill has a lot of support. Is it amendable at this point? Last week, we heard Mr. Hurley say that he would want failing to destroy private information immediately, where that is required, to become a punishable offence. Are there specific amendments to this bill that you have thought of?

Then for Ms. Boden, I don’t think you think it is amendable. Your brief speaks about reimagining. Maybe speak a little more about actually having the devices doing the age verification. Lean into that so I can understand that a bit better.

Ms. Nash: Thank you for that question. Is it amendable? There’s one area I would look at. It might be an amendment for this bill, or it might be a separate matter, but it would be about what regulations are in place for your age verification sector in Canada. I’m afraid I don’t know much about that, but again, thinking about the wider ecology — from the age verification or age estimation providers themselves to the other third-party organizations — what standards are they held to, how is that overseen and do you have the resources to ensure those standards are applied? I think that is something I would be paying attention to.

I’m going to be cheeky and say one thing about the device-level solution. I think it’s another option, but I would flag, from the sociological perspective, that not every child has their own device. A lot of devices are shared, particularly in the younger teenage years, so apologies for adding that.

Ms. Boden: Thank you, senator. While not all devices are individual, I think that parents can probably be relied on not to give their children devices that are set up to look at pornography. The idea with a device-based solution is that whether it’s, for example, your Apple ID that is on your Mac laptop and phone or the device itself that somehow holds this data, it is much safer for there to be one source of truth that says, “We know this person is over 18,” and only gives out the information as a yes or no to the question, “Is this individual old enough to access this content?”

The website could ask, “Hey, is this person okay?” If the answer is no, they’re just blocked. There is no VPN getting around that. There is no showing your ID at every website. It’s merely making one server call on the back end and not sharing any private information at all.

Senator Clement: Is there already technology advanced in that —

Ms. Boden: Apple has already created an API capable of doing that. It opens it to apps but not to websites.

Senator Clement: Thank you.

The Chair: On behalf of all of the senators here, thank you to all the witnesses who have appeared and helped us with our work.

For our second panel, we’re pleased to welcome, from Ethical Capital Partners, Mr. Solomon Friedman, Partner, Vice-president, Compliance; from Aylo Freesites, Mr. Matt Kilicci, Vice-president, Trust and Safety, Risk and Compliance; and by video conference, from Yoti, Julie Dawson, Chief Regulatory and Policy Officer.

Witnesses, you’ll have five minutes each to give an overview. After that, we will move to questions from our senators.

Solomon Friedman, Partner, Vice-president, Compliance, Ethical Capital Partners: Good afternoon, chair, deputy chair and distinguished senators. Thank you for inviting me to appear before you today.

I am from Ethical Capital Partners, or ECP. ECP owns Aylo, which operates some of the most popular adult content platforms in the world. I am also a lawyer certified in criminal law by the Law Society of Ontario and an adjunct professor of law at the University of Ottawa.

We are proud that our portfolio company operates safely, responsibly and legally. Indeed, today Aylo’s platforms are the only free video-sharing platforms on the internet, adult or otherwise, which verify and maintain proof of age, consent and identification for every person appearing in content.

But Bill S-209 is not about content moderation. It is about age verification. The question before you is not whether we should protect children online. We all agree on that; of course we should.

I want to congratulate and thank Senator Miville-Dechêne for this bill and for keeping this important issue at the forefront of public consciousness.

Indeed, from ECP’s perspective, we want to ensure that absolutely no minors can access adult content. The question is whether Bill S-209 in its present form will achieve that goal. The evidence says it will not. Worse, it will expose Canadians to grave privacy risks and push users, including minors, into the very spaces Parliament most wants to protect them from.

That is because Bill S-209 provides for a system of site-based age verification. I must say, frankly, this approach does not work. It introduces significant risks, is unenforceable and will ultimately fail to achieve its intended goal of protecting Canadian children.

This prediction is based on evidence, not speculation. Indeed, we have seen this experiment play out around the world. Jurisdictions that have mandated site-level age checks have seen three clear outcomes. Because enforcement is impossible, children still access pornography by migrating to non-compliant sites, which are often unmoderated and infested with illegal content. Adults refuse en masse to provide their private data to pornographic sites and also migrate to non-compliant sites.

The small minority of adults who do participate are forced to repeatedly hand over their most sensitive personal data. VPN usage skyrockets, with many of those services selling or leaking customer data.

For example, after the U.K.’s Online Safety Act came into effect, as you’ve heard, VPN demand increased thousands-fold in just days, and the result was predictable. For those who do comply, what they watch, who they are, their device and their IP address are all collected, stored and potentially leaked. You have heard about the example in France. The same has happened in the United States. Legislating that every Canadian must repeatedly prove their age on every site will multiply, not minimize, these risks.

Even with the most robust regulator in the world, the U.K.’s Ofcom, thousands of non-compliant sites remain freely accessible. This is an experiment all of you can do from your own offices. Use a VPN and place yourself in the United Kingdom. We are happy to provide a list of sites that today are accessible, openly and freely, to children in the United Kingdom with no age verification.

Expecting Canadian regulators to police tens of thousands of offshore platforms is unrealistic.

In addition, Bill S-209 will not keep children from seeing adult content where experts say they encounter it first — on mainstream social media sites, Reddit, X, Google search engines and similar platforms. Targeting only organizations that make pornographic material available on the internet for commercial purposes results in this bill missing the mark. After all, the largest repository of explicit content online is not Pornhub or any other adult platform, but Google Images. Bill S-209, as presently drafted, would do nothing to address this.

There is, however, a better way: device-based age verification. Instead of placing the obligation on millions of websites, require three operating system manufacturers to implement age controls. Microsoft, Apple and Google already have the tools. On-device age verification means one-time verification, privacy by design and effective enforcement at the source of internet access. This will empower parents, protect minors, preserve privacy and is technologically feasible today.

Sites will never learn a user’s name, ID number or see their biometrics. It is enforceable — regulators need only oversee a handful of OS providers, not millions of websites. And it is effective — minors cannot simply route around it by hopping to a new site or mirror domain or using a VPN.

Indeed, the state of Missouri has done just this by explicitly requiring that any mobile operating system present on at least 10 million U.S. devices — those of Google, Apple and Microsoft — must have the capacity to provide digital age verification that websites and applications can rely on.

This provision recognizes exactly what we are arguing here: that system-level solutions are the only feasible, scalable and privacy-protecting way to regulate access to age-restricted content online.

Device-based age assurance can also be tailored to address other issues, keeping children from seeing ads for gambling, tobacco or alcohol and restricting adults from anonymously entering online spaces designed for minors.

I urge this committee to learn from the failures of site-based laws abroad and to chart a better course, one that actually protects minors.

Amend Bill S-209 with a requirement for device-based age verification. Lead the way with a made-in-Canada solution that will work. Let the devices be the keys. The adult platforms will gladly be the locks.

Thank you. I look forward to your questions.

The Chair: Thank you. Mr. Kilicci.

Matt Kilicci, Vice-president, Trust and Safety, Risk and Compliance, Aylo Freesites: Good afternoon, chair, deputy chairs and distinguished senators. Thank you for the opportunity to present Aylo’s perspective on Bill S-209.

Aylo operates free adult content websites, including Pornhub, YouPorn and Redtube, as well as subscription-based websites such as Brazzers and Reality Kings. We maintain comprehensive trust and safety measures, including identity and consent verification, for all content creators and their performers; automated content scanning, with over a dozen different tools and classifiers; and human moderation of all content prior to publication.

Let me state our position from the outset: We absolutely support age verification and the initiative to protect minors online. We agree that minors’ access to age-inappropriate material is a real issue and have long supported age verification.

We’re grateful for the opportunity to appear before you and are eager to collaborate with government and industry to find a solution.

Over the past few years, many countries and states have enacted age verification legislation, focused exclusively on requiring websites to verify the age of their users, in other words, at the widest possible end of the internet funnel.

This means requiring tens of thousands, if not hundreds of thousands, of websites to implement age verification measures based on differing standards and legislation, and expecting users to share personally identifiable information — or PII — repeatedly, every time they’d like to access a different adult content website.

In our experience, we have found this approach to be fundamentally flawed and counterproductive. We are here to propose a device-based solution that will work to protect minors. This recommendation is based on our real-world experience complying with existing laws and witnessing the failures first-hand.

We have seen migration of users to non-compliant or unmoderated websites, circumvention of age verification restrictions using virtual private networks — or VPNs — a near absence of enforcement and a significant rise in privacy risks.

Just this week, Discord disclosed a data breach that included scanned IDs submitted for age verification. Each new point of data collection multiplies the risk of recurrence.

Overall, while the intention is well founded, the mechanism is flawed. Minors are no better protected with these laws than they were without them. In fact, we would argue that the internet has become less safe.

The solution does exist. Device-based age verification is an effective solution to the challenges of online age assurance and the protection of minors online. It will protect minors, respect user privacy, safeguard user data, ensure equitable compliance and avoid the unintended consequences of website-based age verification. Age verification must be implemented at the narrowest possible end of the internet funnel, and that is at the point of access: the device.

For background, modern internet-enabled devices are overwhelmingly operated by three companies: Google, Apple and Microsoft. They offer built-in parental control features, and they could securely store an age classification and communicate an age signal through an application programming interface to websites and applications.

The proposed solution is this: First, enable kid-safe defaults. New devices sold in Canada could have parental controls enabled by default. For existing devices, this enablement can be done through a software update.

Second, verify age once. The age of the user is determined on the device, either at the point of purchase or through on-device verification. Verified adults can then disable the parental controls. The device transmits only the age range — just whether the user is 18 years or older or under 18 — and no other information to websites and search engines via a standardized signal.

Third, enforce universal restrictions. Websites, search engines and applications, including social media platforms, must block adult content where the age signal indicates a user is under 18.

The technology exists for operating system manufacturers to enable parental controls and determine and share an age range.
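
To make this proposal concrete, here is a minimal Python sketch of the signal flow just described, assuming a hypothetical standardized “Age-Signal” header that carries only “18-plus” or “under-18.” No such standard exists today; the header name, values and handler below are illustrative placeholders, not any vendor’s actual interface.

# Minimal sketch of the device-based age-signal flow described above.
# Assumption: the operating system attaches a hypothetical "Age-Signal"
# header ("18-plus" or "under-18") to outgoing requests; no other
# personal information leaves the device.

BLOCK_MESSAGE = "This content is restricted to users 18 and over."

def handle_request(headers: dict) -> str:
    """Serve or block age-restricted content based solely on the age range
    supplied by the device's operating system."""
    age_signal = headers.get("Age-Signal")  # hypothetical standardized header
    if age_signal == "18-plus":
        return "serve age-restricted content"
    # A missing, "under-18" or unrecognized signal is treated as a block,
    # mirroring the kid-safe defaults described in the testimony.
    return BLOCK_MESSAGE

# Example: a device whose verified adult user has disabled parental controls.
print(handle_request({"Age-Signal": "18-plus"}))
# Example: a device still in its default, parental-controls-on state.
print(handle_request({}))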

A device-based solution virtually eliminates migration and VPN circumvention, safeguards and protects user data and is realistically enforceable.

Recent legislation in California and Missouri has already begun to incorporate a device-based solution, validating the feasibility of the model.

We want to be part of the solution. Device-based age verification is that solution, and we respectfully urge the committee to embrace the approach by amending Bill S-209 to mandate device-based age verification. This will effectively protect minors from age-inappropriate content across the internet and achieve the bill’s stated objective. Every device can represent a protected minor.

Thank you. I look forward to your questions.

The Chair: Thank you. Ms. Dawson, please.

Julie Dawson, Chief Regulatory and Policy Officer, Yoti: Honourable senators, members of the committee, thank you. It’s a great pleasure to join you today.

I represent Yoti, a U.K.-headquartered digital identity and age-assurance provider. Yoti is also a member and part of the steering council at the Digital ID & Authentication Council of Canada, or DIACC, by my colleague Leigh Day. I am also co-chair at the Age Verification Providers Association and have been involved in the standards development in age assurance for eight years now, so it is a real privilege to contribute to your deliberations today.

I represent Yoti, a U.K.-headquartered digital identity and age-assurance provider. Yoti is also a member of, and sits on the steering council of, the Digital ID & Authentication Council of Canada, or DIACC, where we are represented by my colleague Leigh Day. I am also co-chair of the Age Verification Providers Association and have been involved in age-assurance standards development for eight years now, so it is a real privilege to contribute to your deliberations today.

Here is a brief overview of some of our age-assurance approaches: We offer a suite of 12 approaches spanning the full range of verification, estimation and inference. All of them are data minimized — in other words, sharing just what is needed for the use case. For instance: Is a person over 18?

We undertake over 1 million age-assurance checks daily. We have completed over 900 million checks — soon it will be 1 billion — and we do this across a whole range of the world’s largest global platforms: Instagram, Facebook Dating, TikTok, Sony PlayStation, Philip Morris International, LEGO and OnlyFans.

The majority of consumers, when we give them several options, select facial age estimation. This is accurate to within about one year for the youth age group, and we publish the exact accuracy levels. Over 95% of people who start are able to complete the check, which takes about a second. It works in three stages: it detects the face; assesses it, including injection detection; and then deletes the image. The image is immediately and irreversibly deleted. Nothing is stored — there is no biometric template or central database.
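
For illustration only, the following Python sketch mirrors the three stages just described — detect the face, assess it including an injection check, then delete the image — and returns only an over-18 attribute. It is not Yoti’s code: every function body is a placeholder, and the threshold parameter merely stands in for whatever buffer a regulator chooses.

# Illustrative three-stage flow for facial age estimation; placeholders only.

from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    over_18: bool  # the only attribute shared with the relying site

def detect_face(image: bytes) -> bool:
    # Placeholder: a real system would run a face detector here.
    return len(image) > 0

def passes_injection_check(image: bytes) -> bool:
    # Placeholder: a real system would confirm the frame comes from a live
    # camera capture rather than an injected or replayed image.
    return True

def estimate_age(image: bytes) -> float:
    # Placeholder: a real system would run a trained age-estimation model.
    return 25.0

def facial_age_check(image: bytes, threshold_years: float = 18.0) -> AgeCheckResult:
    """Detect, assess, then discard: only an over/under-18 flag is returned."""
    try:
        if not detect_face(image) or not passes_injection_check(image):
            return AgeCheckResult(over_18=False)
        # A regulator-set buffer (e.g. Germany's three years) can be folded
        # into threshold_years before the comparison.
        return AgeCheckResult(over_18=estimate_age(image) >= threshold_years)
    finally:
        del image  # drop the reference; a real deployment ensures secure deletion

print(facial_age_check(b"sample-image-bytes", threshold_years=21.0))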

Every one of the deployments that we have undertaken reinforces our conviction that effective age assurance and strong privacy protection can and must coexist.

We are independently certified to international standards: ISO 27001, for information security; ISO 27701, for privacy management; and SOC 2 Type 2, for security, which we maintain at a government and financial-services level. We are also certified against the U.K. digital identity and attributes trust framework and work closely with the DIACC Trust Framework.

Our facial age estimation has been reviewed by data protection bodies such as the U.K. Information Commissioner’s Office, which confirmed that there is no unique recognition, identification or storage of biometric data. It has been reviewed by the German regulator — the Kommission für Jugendmedienschutz, or KJM — which approved it for regulated use in 2022 and recently reduced the permitted buffer from 5 years over 18 to 3 years.

It has been reviewed by the Age Check Certification Scheme; the U.S. National Institute of Standards and Technology, or NIST, global benchmark on facial age estimation; and most recently the Australian benchmarking program, conducted just this July.

Across all of these, we have shown consistent accuracy across age groups, genders and skin tones, in every instance sharing minimized data — just an over-18 attribute.

These accreditations and benchmarks are designed to give regulators and businesses confidence that independent oversight and transparency are built into the system design and governance.

I thought you might be interested to hear about our experience of implementing the Online Safety Act in the U.K. and also of the enforcement of age checks for access to adult content in France this summer. We have drawn several key lessons. Of course, data minimization is essential for user trust, which aligns with Article 7.

We have been asked if this can meet the Canadian privacy standards limiting data collection to what is necessary and destroying it after use. Absolutely. These mirror the best practices we already follow.

Regarding learnings from the Online Safety Act, we saw that about one in four users proactively chose reusable checks. This could be an anonymized age token, facial age estimation within a reusable app or just an 18-plus attribute stored in a reusable identity app.

Contrary to fears, users were very willing, at large scale, to complete checks. We have not seen that drop off. We have not seen large-scale circumvention. Yes, there has been a lot of noise about VPN circumvention. This has been widely cited but, as yet, there is little peer-reviewed evidence and few statistics to know what percentage are legal adults, what percentage are bots and what percentage are children.

We will continue to invest in innovation to support choice for users, such as on-device liveness, face-matching models and on-device facial age estimation.

We continue to look at how to make checks affordable. We have stated on the public record that, for instance, shares from our reusable app — be that a facial age estimation or just an over-18 attribute — are offered at $0.00.

We have shared the volume statistics. I can share those afterwards, along with the amicus brief we filed earlier in the year.

We would encourage regulators in Canada and around the world to work jointly with the Global Online Safety Regulators Network and the International Age Assurance Working Group of data protection commissioners to set clear, enforceable minimum standards for the deployment of age assurance.

When standards are vague or optional, a race to the bottom can occur with sites choosing less effective or less privacy-protective options.

So let me close at this point. Around the world, societies are grappling with the same question: How can we protect children while upholding privacy rights? The answer lies not in rejecting technology but in demanding the right kind — solutions that are independent, transparent, auditable and privacy-first, combining estimation and verification to serve the public good.

Honourable senators, we stand ready to support you in the implementation of this bill and to uphold the twin values of safety and privacy equally. If done properly, Canada can set an international benchmark for responsible age assurance. Thank you for your attention.

The Chair: Thank you for your opening comments.

We have about 30 minutes, so I must ask everyone to please be succinct.

Senator Batters: Thank you to all of you for being here and helping us with this bill. First, Mr. Friedman, you are advocating that the age verification for pornography should be device-based, not site-based. But isn’t it correct that section 12 of Bill S-209 would leave it up to the Government of Canada’s cabinet to prescribe the age verification method, which could potentially allow for device-based age verification rather than site-based verification?

Mr. Friedman: It could, but given the pitfalls and potential downsides of platform-based age verification that we have seen around the world, in my respectful view, that’s too important of an issue to leave to another day. Allowing this multiplicity of mechanisms over thousands of platforms really introduces all the risks we have talked about and for which, in my view, there is great evidence. The bill itself should require age verification at the device level.

Senator Batters: You have provided some important evidence as to why you think it should be one way and not the other. I would hope that the Government of Canada, if this bill were to pass, would take that into consideration and consider all of these different safeguards that the bill has as to what an appropriate mechanism would be. Thank you for that answer.

Because I have limited time, I want to next go to Ms. Dawson from Yoti. Thank you for telling us about the 12 approaches and the data-minimizing part of that. You also said that you have not seen widespread circumvention of it despite a lot of fears expressed. We have heard about those already at our committee. I understand that your company is one of those providing age verification services for the U.K. I would like to give you a little more time to tell us about that experience and some of the things that had been feared but are not coming to fruition as far as what you are seeing so far.

Ms. Dawson: Absolutely, thank you. Yes, we have had a lot of noise that there has been mass circumvention, mass attacks and mass use of VPNs. But what we are still yet to see is any concrete data. Because adults can legitimately use a VPN, what we don’t know is how many are actually bots that have been targeted versus adults versus minors. That is what I think Ofcom is going to look at much more closely.

One of the things we have seen is the element I mentioned: about 25% of consumers picking the reusable approaches. We have seen some platforms over-remove or over-block lawful content when they did not need to. We have seen some fake sites appear, which highlights the need for an awareness campaign and either licensing or an audit of which providers are to be trusted in a given jurisdiction.

We have seen some credential reuse and federated log-ins, and actually account-based and federated log-in models can increase the risk of tracking, profiling and credential sharing. In our view, this shows there is a need for enforceable baselines and things like the NIST liveness review, review of injection detection, requirements for document authenticity, limiting the number of retries, independent certification and review of what is proportionate. One of the things NIST, for example, in the U.S. looks at is what a person can do with low skill, 10 minutes and $10 versus high skill, 1,000 hours and $1,000, and what the Goldilocks solution that’s just right in the middle is. Those are things that regulators around the world can look at.

Another thing is clarity on re-authentication frequency. For example, in France, tokens have to be updated after an hour of inactivity. That sort of clarity would be very helpful. But all in all, we very much see that a check at the point of access is really helpful.

As per Vicky Nash’s point, one of the things we have seen is lower-income families are more likely to have shared devices and hand-me-down devices. Also, of course, there are strong monopoly and antitrust issues with regard to centralizing checks at the point of access.

[Translation]

Senator Miville-Dechêne: Mr. Friedman, in your opening remarks, you said — and I will quote you word for word — that “children are migrating to non-compliant sites” in the United Kingdom.

How do you know where children are migrating from, and whether children migrate to non-compliant porn sites? Do you have magic powers? There is no way to know this, the same way we don’t know whether or not children use virtual private networks. There is no record to show whether or not a child purchased the virtual private network. How do you have the slightest idea where children are migrating to?

We know the 50 most-visited porn sites, and we know very well that the 600 others have far fewer visits. Frankly, I do not know what your sources of information are in this matter.

[English]

Mr. Friedman: First, senator, I do have magic powers. I don’t usually talk about them publicly, but that’s just between me and my close friends.

I would say there is data, and we are eager to provide it to this committee. I’m going to discuss it, and I’m happy to make it available in written form. The first U.S. state to impose age verification using site-based measures was Louisiana. We saw an attrition of 80% of traffic leaving the compliant platforms and a commensurate 80% increase for non-compliant ones.

In the United Kingdom, we have seen a drop of at least 50% of traffic and a commensurate increase of 40% to 50% for non-compliant platforms. If those platforms are not complying with age verification, then they are not complying with other regulations either.

You are right: We don’t know if those are children or adults. The point is that if children cannot get into those platforms and many thousands of other platforms remain open, then it is a very simple, logical inference that children are migrating to non-compliant platforms.

Senator Miville-Dechêne: They may not try those platforms. You don’t know that. You are saying things that you can’t prove.

I’ll go to another question.

[Translation]

You are saying that it is not your responsibility to conduct age verification and that the onus is on other operator companies, namely Google and Meta. This is extraordinary. It is akin to a company that broadcasts pornographic material asking another company to ensure that children do not access this material. It is your responsibility to ensure that children do not access your platform, not Google’s or Meta’s.

What you are doing is a reversal of responsibility and frankly, that seems pretty remarkable. Furthermore, if we ask Google or Meta to carry out verification, you believe that means that everyone will be subject to age verification and not just those who want to visit porn sites.

You exaggerate the situation, and you assure us that everyone on earth will have their age verified, even if they have no intention of visiting a porn site.

What you are doing, Mr. Friedman, is shifting the problem and ensuring that you are not the ones who will carry out any age verification. Isn’t that so?

[English]

Mr. Friedman: Senator, I would say that your own bill requires the age verification provider to be a third party who is at arm’s length. There is no question that adult platforms themselves will not be doing the age verification. What I point out is that the device is the best place to do that.

Second, this is not about absolving anyone of responsibility. This is about recognizing that site-based age verification has fatal flaws that device-based age verification does not.

With respect to your last point, you say this would require age verification for all, even if they don’t intend to access adult content. I respectfully disagree. That is not the position.

I think Mr. Kilicci put it so well, and I want all the members of this committee to reflect on this. Mr. Kilicci, who is a vice-president at Aylo, the operating company of Pornhub, said that Aylo wants all adult content to be blocked by default on every device in Canada, with no age verification required of anyone who does not want that content. That’s all it’s asking for. There would be no requirement to verify your age if you don’t want to access adult content. It’s about having device-level parental controls enabled by default, only to be disabled by age verification —

Senator, you asked me a question, if I could just finish.

The Chair: Let him answer. I’ll give you time to ask afterwards.

Mr. Friedman: I very much want to hear all of your questions, and I want to answer them.

The idea would be that the parental controls would be enabled by default. They would be on, only to be disabled or circumvented if there is age verification. This is not a situation where everybody would need to age verify.

[Translation]

Senator Miville-Dechêne: If this solution were the miracle solution you are contemplating, why is it that countries like the United Kingdom, which have been working on this issue for years, have not chosen this system? How come neither France nor Germany has chosen this system, and most states in the United States, including Missouri, chose other systems? Are you the only ones who’ve got it right?

[English]

Mr. Friedman: We are certainly not the only ones to say this. I recommend everyone read the report of the Australian eSafety Commissioner, who has recommended device-based age verification as potentially the most viable solution, as well as the Spanish data protection regulator, who made the same recommendation.

However, there is an elephant in the room — or rather there are three elephants in the room: Google, Apple and Microsoft. They are incredibly powerful and have great sway.

Our own Parliament — and I’m a very proud Canadian and very proud of Parliament — has been fighting tooth and nail for years against Google with respect to the online news bill and with Meta with respect to similar pieces of legislation. They are very hard to legislate against and do not want to be regulated in this regard — that’s the answer. That’s why it hasn’t happened.

On our side, for the adult perspective, we want to be regulated. We are here begging to be regulated. In fact, I would say that I want this law to make it an offence to not identify oneself as an adult site via a meta tag like “restricted to adult.” That should be a crime — it should be penalized — but at the same time, the place to restrict access is the device.

I know Google, Microsoft and Apple are fearsome foes when you’re on the wrong side of them, but Canada can get this right and can legislate courageously, and your bill should do that.

Senator Miville-Dechêne: I don’t think you want to be regulated, but let’s stop there.

Senator Prosper: Thank you to the witnesses. This has been an informative discussion. It has been great to listen in. My question is for Ms. Dawson.

The Chair: Ms. Dawson is not online at the moment for technical reasons. She has logged out.

Senator Prosper: Hopefully she logs back in.

I want to pick up on your earlier point in response to the dialogue you had with the sponsor of the bill. You said that there are fatal flaws associated with having things at the wider end of the pipe — site-based, let’s say — and that you should constrict it to the narrow end, at the device, the most appropriate area. Can you expand on that in a succinct manner?

Mr. Friedman: I talked about circumvention and about VPN. Let’s talk about something else: enforcement.

Think about the regulatory body that you will need to create and pay for to do this. And think about how impossible their task will be. You shut down one adult platform operating in Moldova, and they open another URL and another URL.

I notice that this bill requires action in the Federal Court. I practised in the Federal Court. I’m an amicus curiae in that court. I know how overburdened all of our courts are.

Are you going to bring thousands of court actions against brand new URLs that pop up? It’s whack-a-mole and makes no sense. Apple, Google, Microsoft — the narrow end of the pipeline makes enforcement considerably easier.

Senator Prosper: Thank you. I would like to reserve that question for Ms. Dawson.

The Chair: If she comes back on, we will ensure that.

Senator Simons: My first question was also for Ms. Dawson, so I would also like to lay down a marker, but I will ask some questions of our friends who are here in the room.

How did Missouri get this to work, then? Because you are right: Senator Miville-Dechêne and I both worked on Bill C-18 in the Senate Transport Committee, and we know how hard it is to get Meta and Google to — well, “cooperate” isn’t even the right word for it.

How did Missouri manage to get device-based regulation when they’re a rather small jurisdiction?

Mr. Friedman: We don’t know how it will work because the law takes effect next month, but they have enacted it. Those rules are published, and I’m happy to send to the clerk of the committee a copy of those regulations.

Senator Simons: Please do.

Mr. Friedman: As I said, the threshold they set is a wise one: 10 million devices in the U.S. You must now be able to transmit an age flag — over 18 or under 18 — to any platform that requires it.

I don’t know if Mr. Kilicci has more information on that.

Mr. Kilicci: I don’t have more information on Missouri specifically, but as Mr. Friedman said, the final version is set to be published in the next week, and it is set to go into effect at the end of November.

I will also add that other than in Missouri, there is a bill in California that is currently sitting on the governor’s desk. It has gone through all phases of review — the House and Senate — and it is sitting on Governor Newsom’s desk to be signed —

Senator Simons: I think Governor Newsom has been busy.

Mr. Kilicci: Granted, he has been busy, but there is a bill on his desk as well that does something similar, which is to require the device manufacturer and operating systems to determine the age of the users and make an age signal available to applications, which then disable or do not show age-inappropriate content based on the age range received from the device for that particular user.

Senator Simons: I’d like to see California move forward with that. Its economy and population are as big as Canada’s. That would make a much bigger difference than our friends in Missouri, and I mean that with no disrespect to Missourians.

The Chair: Senator Simons, Ms. Dawson is back on.

Senator Simons: Ms. Dawson, I have a quick question for you.

I want to talk about facial age estimation. You claim that your system works regardless of gender or race. I don’t know if you mentioned gender identity.

This is my concern: It’s one thing for facial estimation software to be able to tell a 60-year-old woman from a 12-year-old boy, but how can it possibly know whether you’re 17 or 18? How can it possibly adjust for people who may, because of their ethnicity or gender identity, not conform with typical benchmark data?

Ms. Dawson: Thank you. It’s an excellent question.

With regard to your first question, in terms of accuracy and how it has been built: all of us are human estimators, and we are able to estimate age to within about four to eight years because of our lived experience of the people around us.

This artificial intelligence has been built with millions of images, each with a ground truth of the face’s age in months and years. Because of that, it has become more and more accurate over the eight years we have been developing it.

In terms of people transitioning gender, we’ve worked with Sparkle and MOSAIC, which are both LGBTQ non-profits, to see that it does not further disadvantage people who are transitioning or the wider LGBTQ community. This has been independently verified, and it is something that we continue to improve.

It provides just under one year of accuracy for 16- and 17-year-olds. It’s just over a year for 18-year-olds, and we publish exactly how accurate it is for males, females, light or dark skin tones and so on, from age 6 all the way to age 70. I am happy to provide a white paper —

Senator Simons: If you can provide data, that would be terrific, because my concern is that if you’re 18 or 19, surely your right to look at things is going to be compromised if we’re using facial age estimation, because, with all due respect, I don’t know how you can tell somebody who’s 17 years and 10 months old apart from somebody who is 18 years and 3 months old.

Ms. Dawson: That’s a very good question. The German regulator uses a three-year buffer. They initially used a five-year buffer in 2022, when they first reviewed and approved it; it was awarded a seal of approval by the co-regulatory body and listed by the KJM, the regulatory body. It has now been deployed for quite a few years, and they use a three-year buffer: those over 21 years of age are able to use it to access adult content in Germany.

This is absolutely something that must be independently assessed, and we’re happy to provide those assessments to you.

Senator Simons: With a five-year buffer, it seems to me that a 13- or 14-year-old could then easily access pornography. This seems an imperfect system, at best.

Ms. Dawson: By way of background, in Germany they say that because the accuracy is about one year, they apply a three-year buffer, so you have to be over the threshold of 21 years of age to use facial age estimation. At 16 and 17, it’s actually nine months of accuracy, and this has been independently reviewed — by the National Institute of Standards and Technology, or NIST, and also through the recent Australian benchmarking. It also worked with Indigenous populations; they tested it specifically for that.

But absolutely, each country reviews what it thinks the buffer over the age of 18 should be. It is also an inclusive approach: not everybody feels comfortable, we have found, using a document-based approach, and not everyone has access to a credit card or other materials. It is an approach that can be used within a waterfall of options — it can be one of the options, and people will choose according to their preference.

Senator Simons: That information would be very helpful.

Senator Prosper: Nice to see you back, Ms. Dawson.

I wanted to put a question to you, because part of the earlier testimony here centred on a statement by Mr. Friedman, who said that, when it comes to denying youth access to pornographic material, site-based mechanisms have fatal flaws that device-based ones do not. What’s your opinion on that?

Ms. Dawson: I’m happy to share a full paper where we’ve gone into a lot of detail working with a trade body and different techs and organizations. But Victoria Nash put forward one very specific point: If you look at lower-income families, a lot of them will have shared devices or hand-me-down devices.

It will probably take several years for us to work through the antitrust issues with Apple, Google and Microsoft. This is not something that is ready to go immediately.

If you look also at approaches that are device-based, you are basically putting a parent in the loop. The parent has to go around on day zero and adjust every device. We have seen, for example with parental controls, that the CEO of Snap Inc. was quite public about the fact that only 1% of parents in your neighbour, the U.S., got to grips with parental controls on Snap. Around the world, we see very different levels of parents getting to grips with parental controls across the range of devices.

Picture a setting at home where you have a young child under 10, a teenager and a grandparent all using —

The Chair: Ms. Dawson, unfortunately, we’re not able to translate your testimony at the moment.

Ms. Dawson: I understand.

The Chair: I invite you to please try to log on through your computer, if possible. I’m sorry. I can’t do anything more about that at this time.

Senator Saint-Germain: My question is for Mr. Kilicci, and I’m glad to hear that you are the Vice-president, Trust and Safety, Risk and Compliance, because my questions are related to this fundamental question.

At the end of 2020, you said in your brief that you executed on your flagship platform, Pornhub, what outside observers later called:

. . . the most sweeping cleanup in the history of tube sites since we removed more than 8 million videos uploaded by unverified users and instituted a “verified uploaders only system.”

My first question is this: Did this purge only affect unverified users, or did it also target videos with problematic violence or inappropriate content that could eventually lead to criminal behaviour?

Mr. Kilicci: Thank you for the question. At that time, we essentially identified that all uploaders to the platform were either ID-verified or unverified. Anybody who was unverified, we pulled their accounts and all their content. That didn’t mean it was problematic content. It just meant that they were unverified uploaders and the content came down.

When it comes to what’s in the content, what remains on the site today is content uploaded by verified uploaders, meaning individuals who have been verified by supplying us with ID. Going back further than 2020, content gets moderated by humans who are employed by the company. We review content and have strict standards around what is and is not permitted.

If you’re not sure, you escalate or reject the content. If you’re confident it meets our trust and safety standards, you can allow it to be published. That process also includes validating that the individuals in the content have been ID-verified by the platform. We make sure they meet standards when it comes to our levels of service prior to allowing the publication of the content.

Senator Saint-Germain: Your brief says that you verify those who appear in the videos, their consent to appear and their agreement to the conditions, but that reviewing the content itself for violence is not the main objective of those verifications.

My second question is this: Today, at this moment, forgetting any legislation that may force action upon you, do you have any mechanism to prevent minors from using your websites? Since you are the one putting the material that is inappropriate for children online, do you feel you should assume part of the financial burden of keeping the content away from children?

Mr. Kilicci: Thank you for the question. If I may, I’m going to end my answer to the first question by saying that the verification process does involve verifying individuals, but we have strict standards when it comes to what is permitted in terms of what is depicted in the content, the actions in the content and what is being shown. That is also done as part of the moderation, and that is to check every piece of content before it is published. We don’t just check the people; we also check that the conduct in the content meets our standards, and that includes rules against violence in content, for example. Just to conclude regarding that point, that is part of what we check.

On the question of verification in terms of standards and what we have in place today when users visit our sites, first, we support parental control features. As Mr. Friedman mentioned, the “restricted to adults” label is on all of our websites and allows parental control filters to block our websites when users try to access them.

Second, we display an age disclaimer page, or age gate, on our websites when a user arrives. That gate tells them that the website contains sexually explicit content and that they must be over 18 to access it. It gives the user two options: “Yes, I’m over 18” or “No, I’m under 18.” That allows someone who accidentally lands on the site to leave it. Those are the two primary functions we have available on our platform today to help prevent access by minors.
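
For background on the “restricted to adults” label mentioned here and earlier by Mr. Friedman, the Python sketch below shows how a device-level filter might detect an RTA-style rating meta tag and block the page. The RTA label is an existing industry convention; the matching logic and sample markup are illustrative assumptions, not a description of any particular filter.

# Sketch of how a parental-control filter could honour a site's
# "restricted to adults" (RTA) self-label. Matching logic is illustrative.

import re

RTA_PATTERN = re.compile(
    r'<meta\s+name=["\']rating["\']\s+content=["\'][^"\']*RTA[^"\']*["\']',
    re.IGNORECASE,
)

def is_self_labelled_adult(html: str) -> bool:
    """Return True if the page declares itself adult-only via an RTA-style
    rating meta tag, so a device-level filter can block it."""
    return bool(RTA_PATTERN.search(html))

# Example page carrying the commonly documented RTA label string.
sample = '<head><meta name="rating" content="RTA-5042-1996-1400-1577-RTA"></head>'
print(is_self_labelled_adult(sample))  # True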

Senator Saint-Germain: What about the last question related to you assuming a part of the financial burden to keep the content away from children?

Mr. Kilicci: As I mentioned in my proposed solution — I’m assuming we’re talking about the possible solutions here — looking at the problem we’re trying to solve, it seems to me that we’re trying to keep minors from accessing age-inappropriate content online. Part of that includes adult content websites, and I think there is a role for adult content websites to play. As I mentioned in the proposed solution, number one, we’re asking for all adult websites to be blocked by default. There is less for us to do there because we’re asking for those controls to be on by default on all devices.

The part that requires not just adult sites but the wider industry to do something is receiving the age signal from the device. Whether it’s a social media site, a search engine, an app or an adult content website, these websites should be required to query the operating system on the device and identify whether the user is over or under 18. If they’re under 18, don’t let them in; if they’re over 18, let them in. Our responsibility is to query that signal and take action if it tells us a user is not 18 or older.

Senator Pate: My question is for Aylo. The last time we had witnesses representing a pornographic company appear before the committee regarding the predecessor of this bill, they were very clear that the industry would largely not comply with a bill like the one before us and that there were no incentives for them to do so. I’m hearing you say something different today. What has changed?

Mr. Kilicci: Thank you for the question, senator. What do you mean that you’re hearing something different from me today? What have I said that’s different or that contradicts?

Senator Pate: Right now, would you say that companies would or would not comply with the bill? And would you say that there are incentives for them to not comply?

Mr. Kilicci: I don’t know that I can speak to incentives, but I can speak based on experience and what we’ve seen in other places. There are 24 U.S. states that have enacted similar laws to what is contemplated in Bill S-209. The U.K. has done something similar when it comes to site-based age verification. Other European countries have done something similar. When it comes to the 24 U.S. states, we’ve seen overwhelmingly that the majority of websites just do not comply.

If we go back to January 2023, the first state that enacted a law that required websites to verify age before allowing entry to adult websites was Louisiana. To this date, the largest adult platform still doesn’t verify ages before allowing users to enter their site in Louisiana, and it has been two and a half years. My point is that it is hard to get tens of thousands or hundreds of thousands of websites in different states and countries to all comply with these standards. It is the most complex way to try to achieve the objective. There is a better way to do it.

I see no reason why what we’re seeing in the U.S., in all these states, is going to be any different in Canada. Maybe some of these operators will do it differently in Canada. It’s possible, but the history is there over the past two and a half years. We can pick up a VPN or proxy, go to any one of these states and try to go to the top 20 websites. You can see for yourself which ones are letting people in and which ones are requiring them to be age verified.

Senator Pate: You mentioned device-based methods. Is there anything else? What type of economic processes or measures could be put in place to encourage compliance by companies like yours?

Mr. Kilicci: When we’re talking about the solution, yes, it starts at the device because that’s the point at which the verification can take place. Once this has happened, it makes it easier for websites globally — not just jurisdiction by jurisdiction — to query and prevent access to users based on the signal received from those operating systems. Google, Apple and Microsoft operate 98% of the devices in Canada. That number is 97% globally. It’s not different in different parts of the world. It is three companies operating devices globally. If they are able to make the signal available, it makes it much easier for not only parental controls to prevent access in the first place but also for social media websites, adult content websites and search engines to block access to specific materials on their websites based on the age range of the users.

Mr. Friedman: Senator, can I add something to that?

Senator Pate: Sure.

Mr. Friedman: The end of your question was, “What would help make companies like yours comply?” I’m speaking on behalf of the ownership group. Aylo will always comply with the laws in the jurisdictions in which it operates. We comply today and will continue to do so. We do so in the United Kingdom — by offering age verification — and in Louisiana, et cetera. We will do so in Canada. There is no question about that. But if Aylo does it, is the good actor and puts up the age verification and thousands of other sites don’t, we know where the traffic will migrate to.

Senator Pate: You’re saying right now, if this law were passed, there would be compliance.

Mr. Friedman: On behalf of the ownership group, Aylo will comply with the laws in every jurisdiction in which it operates.

Senator Pate: Okay. We heard the opposite previously.

Mr. Friedman: From Aylo?

Senator Pate: Not from Aylo, but from MindGeek.

Mr. Friedman: Okay. I’m saying Aylo will comply with the law, just as we do today.

Senator K. Wells: My understanding is we do not have Ms. Dawson.

The Chair: We don’t, and we’re going to advise her that she can answer questions in writing, which would be distributed to all the members. Here’s the problem: There is a policy in the Senate. She only has a telephone, and the telephone doesn’t allow for proper translation.

Senator K. Wells: Is it possible to invite her back to answer questions? We have not closed the loop, so to speak, on questions.

The Chair: We’ll look into that and if she’s available.

Senator K. Wells: I do have questions for these witnesses here.

The Chair: Go ahead, please.

Senator K. Wells: Thank you for coming and sharing your statements and expertise with us. We’ve heard from yourselves and from other witnesses about device-based age verification methods. Looking at the bill as it has been written, subclause 12(2) discusses age verification and age estimation methods. Would you support an additional amendment here for, let’s say, a paragraph 12(2)(h) after 12(2)(g) that would specifically name device-based age verification as an option?

Mr. Friedman: The trouble with doing it in clause 12 is that it would essentially say this is a method of age verification that could be used. It won’t require Google, Apple and Microsoft to do anything. The Missouri approach, on the other hand — which says that if you make available operating systems beyond a certain numerical threshold, you must provide an age signal to apps and platforms that want it — will actually force them to do it.

You could have it right here in the legislation, and Google and Microsoft would say no, and we’d be in the same position we’re in today.

Senator K. Wells: So it can’t be optional. It needs to be mandatory from your perspective.

Mr. Friedman: Correct.

Mr. Kilicci: This is the second time device-based age verification has been mentioned as a method, but we’re not referring to it as a method. The method could be ID checks — that’s a method of verifying age — whereas we’re talking about where that verification takes place. We’re advocating for the verification to take place once, on the device, so that users do not have to repeat it after that.

Google, Apple and Microsoft can determine how they’re going to determine the age of the users on the device. That could be through an ID check. Maybe they use Yoti or another provider, but they are the ones undertaking the process. They store the results and then make them available to applications, search engines and websites.

It probably requires an addition to the amendment to bring operating system manufacturers into scope and to mandate that, for devices issued in Canada, for example, they determine the age of the user and provide a signal that can be shared with websites, applications and search engines, not to mention enabling the filters if that is of interest to the Senate. Default enablement of the filters could be something that would fit into that.

The Chair: I have one question for Mr. Friedman. Earlier today, you spoke about the fact that the bill contemplates action in the Federal Court of Canada. You said that you thought that the courts might be inundated with actions. Can you elaborate on that in any way? I think that’s an issue that’s not been addressed by any other witness so far.

Mr. Friedman: My perspective there comes less from my capacity representing ECP and more as a criminal lawyer of 15 years who has practised in the provincial courts, the Superior Court, the courts of appeal and the Federal Court.

The Charter of Rights and Freedoms, under section 11(b), guarantees every person, including organizations, a right to a trial within a reasonable time. Here we’re specifically talking about penal consequences. That means our courts have to prioritize certain matters. They tend to prioritize criminal matters as opposed to regulatory or administrative ones.

In a situation where we hear the chief justices in our courts crying out for further appointments to the bench because of stretched judicial resources, by potentially burdening the Federal Court with what could be thousands of actions, you will delay and deny access to justice for Canadians who need it. That’s why it should be contrasted with a simple solution where the regulatory and enforcement end involves a very limited number of entities.

The Chair: Thank you, Mr. Friedman.

To all the witnesses, thank you for your testimony here today and for coming and assisting the committee in its work.

(The committee adjourned.)
