Skip to content
LCJC - Standing Committee

Legal and Constitutional Affairs


THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS

EVIDENCE


OTTAWA, Thursday, October 23, 2025

The Standing Senate Committee on Legal and Constitutional Affairs met with videoconference this day at 10:33 a.m. [ET] to study Bill S-209, An Act to restrict young persons’ online access to pornographic material.

Senator David M. Arnot (Chair) in the chair.

[English]

The Chair: Good morning, honourable senators. My name is David Arnot. I am a senator from Saskatchewan and the chair of the Standing Senate Committee on Legal and Constitutional Affairs. I will invite my colleagues to introduce themselves. I know Senator Batters is on her way, and she will be here shortly.

[Translation]

Senator Miville-Dechêne: Julie Miville-Dechêne from Quebec.

Senator Galvez: Rosa Galvez from Quebec.

[English]

Senator Tannas: Scott Tannas from Alberta.

[Translation]

Senator Oudar: Manuelle Oudar from Quebec.

[English]

Senator Prosper: Paul Prosper, Nova Scotia, Mi’kma’ki territory.

Senator K. Wells: Kristopher Wells, Alberta, Treaty 6 territory.

Senator Simons: Paula Simons, Alberta, also Treaty 6 territory.

Senator Busson: Bev Busson, British Columbia.

[Translation]

Senator Clement: Bernadette Clement from Ontario.

Senator Saint-Germain: Raymonde Saint-Germain from Quebec.

[English]

Senator Dhillon: Baltej Dhillon, British Columbia.

The Chair: Thank you.

Honourable senators, we are meeting to continue our study of Bill S-209, An Act to restrict young persons’ online access to pornographic material.

On this panel today, we have three witnesses: from the Age Verification Providers Association, Mr. Iain Corby, Executive Director; from Needemand, Jean-Michel Polit, Chief Business Officer by video conference; and from Yoti, we have Julie Dawson, Chief Regulatory and Policy Officer by video conference, who has been rescheduled after technical difficulties from last week. Welcome, and thank you to you all for joining us.

We’ll begin with opening remarks, and then we will move to questions. Ms. Dawson, you did give opening remarks last week. If there is any additional comment that you would like to make, you can do so now; otherwise, we will just hear from Mr. Corby and Mr. Polit.

Julie Dawson, Chief Regulatory and Policy Officer, Yoti: I will allow time for our trade body.

The Chair: Very good, then. Mr. Iain Corby, five minutes or so, please.

Iain Corby, Executive Director, Age Verification Providers Association: Thank you, Mr. Chairman, and senators, for the opportunity to give evidence today.

The Age Verification Providers Association, or AVPA, is the global trade association for providers of age assurance technology. That includes age verification, age estimation and age inference techniques. I am also the technical author of IEEE 2089.1, which is an international standard for age verification, and I also sit on the working group for the ISO 27566 standard, which is going to supplement that when it is finalized next month.

In one U.S. state, it took less than 30 days from the first call I had from a concerned mother to a bill requiring age verification to be passed into law. Now, I know this bill has been a long time in the making and is perhaps at the opposite end of that time scale, but we do believe the time you have invested has paid dividends in its quality. I have already reviewed the evidence you have heard in committee, so to avoid repetition, I will focus on some key emerging issues.

First of all, on enforcement, I would like to suggest the bill would benefit from the additional powers that the U.K. regulator Ofcom has taken beyond just fines and blocking access to sites. Ofcom has the power to require critical support services, such as payment networks, to withdraw their services from non‑compliant sites. We expect this to be an important tool in securing compliance from overseas entities.

Second, I know you’ve heard calls for quite a different piece of legislation that makes the likes of Apple and Google accountable for providing age verification to the pornography industry. We understand the allure of such device-based solutions, and as I will come to, we have sought to move toward that and away from those who profit from publishing adult content. Both those two operating system providers are offering mechanisms to share age signals, but neither is offering to be accountable for those signals. Apple’s solution is actually a parental control, reliant on parents to accurately enter the age of their child and not be pestered into inflating it so their kids can play a particularly gory game. Google is happy to share age information from credentials held in their digital wallet, but they will not be the source of those credentials or take any responsibility for the age that is claimed. At the heart of this debate on where in the technical stack we should apply the age checks is the question of liability. The logical answer is that those who publish and profit from adult content should be liable if they allow it to be seen by children. This also places the control as close to the harm as possible, which is a tried and tested real-world principle.

Third, let me touch upon what we now know about the recent U.K. implementation. Ofcom is reporting a peak of 1.5 million virtual private network, VPN, users, which has now fallen back to just 1 million. We don’t know if they are all being used by 8‑year-olds or if it is just adults perfectly legally not wanting to do age checks, but we will need to keep a close eye on this since adult sites are not exempt from the U.K.’s law, even if children were to access them with a VPN. We checked with our members and found they were doing 5.7 million checks a day when the law first came into effect. We don’t represent some of the largest suppliers, so the true figure was probably twice that. When the big porn sites say they lose 80% to 90% of their users when age verification comes in, I think they really mean “users with an IP address registered in that jurisdiction.” Indeed, our calculation is that the biggest sites might have lost about 15% of their users after you account for people using VPNs, and the intended outcome of the policy, which is preventing users under 18 getting access, probably accounts for 7% to 14% of that fall.

Finally, I want to share where our industry is going next. Thanks to projects funded first by the European Commission and then by the Safe Online organization hosted by the United Nations, we have developed an interoperable, tokenized solution that will allow a single age check to be used across multiple sites. A user completes an age check using any sufficiently accurate method and can then choose to accept a signed digital token onto their device for a limited period of time that will give them immediate access to other age-restricted sites and platforms. This is not quite the same as the device-based options you’ve heard about, as the tokens can still only be used by the person who did the initial check, and they will need to prove they are still in control of the device regularly. The cost of the checks is still met by the adult sites, although it can now be shared across several operators. This solution, called AgeAware, is already live in the U.K. with three operators and will include others shortly.

In summary, age verification by the website is available right now. Millions of privacy-preserving checks are being done every day, and systems can be audited and certified to international standards to confirm privacy, data security and accuracy. There may well be other ways to do that developed in future, but those are years away. Canadian children should not be made to wait for the protection their peers in the U.K., EU and many other states already have.

Thank you.

The Chair: Thank you.

Jean-Michel Polit, Chief Business Officer, Needemand: I am based out of France — I am a French citizen — but I lived for a year and a half in Canada, close to Toronto, and I enjoyed it quite a bit. I am honoured to be part of this meeting.

I have read your Senate debates about this bill, and one of the main questions that keeps coming up is this: Should we accept foregoing our privacy online to protect our children? You are understandably worried that the personal data used in age‑estimation or -verification solutions might be stolen and fall into the wrong hands. I have some very good news for you: Age‑estimation technology now exists that does not require any personal data at any point in time during the verification, so there’s no need to worry about how and when personal data might be destroyed after the verification because it’s not captured in the first place.

How is that possible? My company, a France-based company, has developed a technology that can determine, with 99% accuracy, if a web user is over or under a certain age limit with just a few hand movements. I can’t see your faces really clearly, but I’m sure I raised some eyebrows there. It is not black magic; it is an AI solution based on medical research. It’s quite simple. From when we’re born until early adulthood, our body changes quite quickly, and our nervous system changes almost on a daily basis. As it changes, since it is managing all the movements in our body, we make certain hand movements slightly differently. It is not visible to the naked eye, but medical research has very accurately and precisely established a link between the age of an individual and the physiological features of certain hand movements.

It took us eight years of R&D, but we’ve been able to leverage the analysis of these features with our AI models. Again, no personal data is required or shared by the web user. There is no metadata and no fingerprint. If somebody starts showing their face in front of the camera — and it can be a PC, laptop, smartphone or tablet — if we start picking up an ear or maybe part of an eye, we stop the camera and ask people to move their head away from the camera. The whole process takes less than 30 seconds.

The reliability of our technology has been independently tested several times, most recently by the Australian government in a very large-scale test with several thousand people from various ethnic backgrounds in real-life settings. They used different devices under different lighting conditions. Some people were at home, and some kids were in school. The reliability has been proven. This technology cannot be faked or fooled. A 17-year-old doesn’t know how the hand of an 18‑year‑old moves. Again, these are differences that are not visible to the naked eye, so they cannot be faked. We are not fooled by recorded pictures or videos put in front of the camera, or by gloves.

Another feature is that we also have a tokenized system that’s usable across different platforms so people don’t have to go through the hand movement every time. It is a zero-knowledge-proof token, so no personal data is attached to the token either.

The bottom line is that this is very good news, because my main message to you is that technology now exists such that we don’t have to choose between protecting our children and protecting our privacy online.

Thank you.

The Chair: Thank you, witnesses. We’ll now move to questions.

Senator Batters: Thank you very much. I appreciate all of you being here today to help us with the study of this bill.

My questions will be to Ms. Dawson from Yoti. Yesterday, we heard in testimony from one of our witnesses, Professor Geist, that all of these age-verification companies being international could be problematic for Canadians in that the data would be going to these international companies, and that could result in potential privacy challenges for Canadians. Can you address that issue, please?

Ms. Dawson: Certainly. Thank you for the question.

We already provide age assurance globally, and that is always data-minimized. We do observe where there are specific requirements in countries for specific approaches to data processing, but in every instance, the data is data-minimized with just an “18 plus” being provided back to the organization that is required. We already provide services to companies globally, such as, for example, OnlyFans, where an over 18 is required. We’re very happy to provide examples of how and where this is already operating. We do over a million age checks a day for about a third of the largest global platforms. Some of those are adult and some of those are social media, dating, gaming, et cetera, but we manage in every instance to meet the data privacy impact assessments that are required, the security assessments that are required, and go through the requisite benchmarks, such as those by NIST in the U.S., the recent Australia benchmarking, those laid down by the German government and those that have been put forward in the U.K. similarly in terms of independent assessment of bias in systems. I am happy to provide any that are helpful.

Does that answer your question enough?

Senator Batters: It does, and yes, if you have either a chance orally or, if there is not time, if there is something in writing you can provide to our committee, it would be helpful.

You indicated that all images are deleted immediately after the age estimation. Could you let us know how that process actually works in practice?

Ms. Dawson: Certainly, yes. This can be done in two ways.

Some people access our facial age estimation through their Yoti reusable digital identity wallet. Within their wallets, they have undertaken a facial age estimation. If you take the German regulator, they require a three-year buffer for facial age estimation to be used for access to adult content, so anyone over the age of 21 can use the facial age estimation. They do that once within the app, and then with one press they are able to re-share that facial age estimation to a site, and only the over-18 is shared with that site.

We do have services that ask for that also on a software-as-a-service basis. They will send us an image. We do a liveness detection, first of all, a pixel-level analysis of that image, and assess if that person is over 18, and nothing is stored at all. There is never a central database. It is detect a live face, analyze it instantly, and then delete the image with nothing stored. That has been reviewed by the Age Check Certification Scheme, recently the Australia government, the KJM in Germany, and in terms of accuracy that has been benchmarked by NIST in the U.S., and our white paper similarly reviewed by the Age Check Certification Scheme.

Does that answer enough?

Senator Batters: It does. Thank you very much.

Senator Miville-Dechêne: Mr. Corby, do you have any Canadian members in your organization?

As well, could you address the issue of data breach? Yesterday, a data breach was raised here. Was it a member of your association? What can you tell us about the risk of data breach for age verification and age estimation among your members?

Mr. Corby: I don’t believe we have any members headquartered in Canada. I think Netsweeper has a CEO based in Canada, but no, no current members operating out of Canada.

I think you might be referring to the Discord breach, and this was a breach in their customer services system which they rather foolishly were using to handle appeals. Their actual age verification provider, which is an audited and certified provider, already supplies its own appeals mechanism. Discord had chosen not to use that and were instead allowing people to use their ordinary customer services system and presumably to email copies of their ID when they were unhappy with the adjudication of the original age check.

We already sent a message to all of our members urging them to tell their clients not to do this and that they should take just as much care with people’s data in an appeals process as they would take in the initial check. That would include, of course, data minimization and privacy by design, so immediately deleting those images whether they were used for a check in the first place or whether they are an ID being supplied for the purposes of an appeal.

The answer to that problem is more age verification providers, not less.

Senator Miville-Dechêne: Was Discord a member of your association?

Mr. Corby: No, Discord is a client. It’s a relying party, and it was indirectly using a member of our association. I believe it was using KID, and in turn, KID uses a number of other age verification providers. They will all be operating on the same basis as Yoti does, as Julie described, so deleting any personal data as soon as the age check is complete. This was a completely separate process being run by their customer services team, which I believe was outsourced to a different third party, but that third party never claimed to be an age verification provider. They are there to help users who get stuck with one thing or another, but appeals were being directed towards them.

[Translation]

Senator Miville-Dechêne: Mr. Polit, I am turning to you to find out whether your method, which seems almost miraculous, has been used by certain countries and whether pilot projects are currently under way.

Obviously, I must admit that I am not sure about this, but we were told yesterday that, in the context of facial age estimation, someone being a person of colour can make the estimation less effective. Can you tell us about racial differences and how they may impact hand-based estimation? Who is using your system and what are the results?

Mr. Polit: Thank you for the question.

[English]

We have clients. We have clients in production in three different industries: the adult industry, the social media industry and the online dating industry. These are companies — two out of the U.S., one out of Germany — with worldwide clients. In fact, the online dating company has clients/members in 130 countries around the world. We have passed the project or the concept stage, and we now have paying clients that deploy our technology.

In terms of ethnic bias, there is none with the hand. In Australia, as you know, a large-scale test has taken place, and people from a lot of different ethnic backgrounds have been tested. I think the Australian government was keen to figure out if First Nation people in Australia would be able to use this technology the same way that other ethnic groups would. I can share with this panel and senators the results for our technology across all ethnic groups, and they are basically the same. There is no impact of skin colour or ethnic group for the hand movements.

Senator Miville-Dechêne: The U.K. is very advanced in terms of age verification. Have they approached you for this particular technology? Are you in discussions with countries that are actually starting the process?

Mr. Polit: Yes. In the U.K., I have been in contact with Ofcom. As you know, a while ago, Ofcom released a list of technologies that they deemed to be highly effective. This list was published and put together before our technology came about. I’m in contact with a number of people at Ofcom who are looking at our technology. The fact that it’s totally private and doesn’t require any personal data is very appealing to all regulators, whether it’s Ofcom in the U.K., Arcom in France or KJM in Germany. I’m also in contact with the eSafety Commissioner in Australia. For obvious reasons, they are interested in the fact that our technology doesn’t require any personal data at any point in time during the verification and, on top of that, is very reliable and does not have an ethnic bias. It checks all the boxes.

[Translation]

Senator Miville-Dechêne: Standards have been verified by a third party. Thank you very much for that information, Mr. Polit.

[English]

Mr. Polit: You’re welcome.

Senator Prosper: Thank you so much to our witnesses.

Listening in on the testimony here and comparing it to some of the testimony from previous witnesses, I’m just curious to get your thoughts. Some of that previous testimony centred around data breaches and certain statements such as it’s not a question of “if” but a question of “when.” There is always this balance we hear with respect to privacy considerations and the protection of children.

Ms. Dawson, in terms of one practice, you mentioned that a good practice is just the elimination of the data immediately after it’s verified. Mr. Polit, you mentioned that, with your system of hand-waving, with a bit of AI and medical science involved in that, there’s no information even requested for anything like that. Mr. Corby, from what I gather in reference to the recent example of a data breach, it was people undertaking bad practices by using alternate appeal mechanisms from what was available. Am I accurate in stating that it’s your opinion that privacy considerations and the data breach of personal information is a remote issue or consideration with respect to this bill?

Mr. Corby: I’ll be the first to admit you can do age verification in a good way or a bad way. We do see bad ways, and the Discord example is a very bad way of doing it.

We always say the only non-hackable database is no database at all. It is a very bad idea to retain data. If you keep personal data, you become a target for hackers. That’s why our code of conduct for all of our members expects people to do that data minimization. If you want to be audited and certified to those two international standards that I mentioned I had been involved with — and Yoti are certified to those, for example — then you would be expected to make sure you are delivering on that promise.

This speaks to the earlier concern about data being transferred overseas. Particularly here in Europe, we’re very lucky to have the GDPR data protection laws. You have good laws in Canada as well and a Privacy Commissioner. I understand why people would be concerned about that, which is why we believe, globally, you should use certified providers where you know they’re meeting those standards, whether or not they’re in a jurisdiction that requires them. Of course, it would be up to your authorities to set the specific regulations for the implementation of this bill, and I would hope they would look at being very careful about where data is sent and ensuring it’s processed by only those who meet these very high standards that we expect of all our members.

Senator Prosper: Thank you.

Ms. Dawson: Just to add to Mr. Corby’s element, I think the key thing is the term “data minimization.” Of every one of the 12 different age assurance approaches we have for an adult site, we’re only sending back, “Is this an over 18? Yes or no?” We’re never sending back any more specific information. We use the full range of verification, estimation and inference approaches to meet requests globally, but in every instance, there’s data minimization.

I think there are several things you want to look at for your citizens. It’s a choice of approaches. You want the data minimization. You want inclusion, because not everybody will be happy with the same sort of approach. You want to ensure that you have approaches that are accessible for people with different disabilities, people with older devices or people who have different preferences. That’s why a range of approaches is required, but they must all be data minimized.

Senator Prosper: Thank you.

Mr. Polit: If I may, you are right in understanding that in our case, the technology precludes any breach because there’s no personal identity to be breached, so breach is a non-issue.

Senator Saint-Germain: My first question is for Ms. Dawson. A very worrying report was published recently by the European non-profit organization AI Forensics. It reported to the public that:

A firm claiming to provide “double blind” age assurance services to pornographic sites adapting to France’s online safety law has been found to be collecting unauthorized user data.

It was noted in this report that:

We observed that, despite claiming to offer “double anonymity” options (intended to hide user traffic), [this firm] collects the URL of the video the user attempts to watch.

The firm in question is AgeGO, and it uses Yoti as a third party to proceed with digital identification.

Can you comment on this report and your relationship with AgeGO? Will it have an impact on your relationship with this company? How can we trust third-party verification if, in addition to data breaches, we also have to worry about improper collection of data?

Ms. Dawson: Thank you very much for the question.

Absolutely, one of the things that we need to do as an organization is to look at the downstream usage. I’m not aware of the specific instance that you mention, but we’ll look into that and revert back to the committee. I know that, at Yoti, we take great care not to collect unauthorized data and that all our approaches are data minimized. Let me look into the specific example and revert.

Senator Saint-Germain: I understand if you’re not aware that there’s no provision or requirement in your —

The Chair: I’m sorry, Senator Saint-Germain, I have to interrupt. The interpreters are unable to translate Ms. Dawson because of a technical issue. Can it be resolved?

Vincent Labrosse, Clerk of the Committee: It’s because there’s too much echo in her room.

The Chair: Ms. Dawson, I’m advised that there is too much echo in your room, which is something that we can’t cure. I’m going to suggest that the answers to Senator Saint-Germain’s question be put in writing. Again, it’s a technical issue. I am sorry, but we have no other option. If it’s not translatable, it’s not evidence.

Senator Saint-Germain: My next question is very short, and it’s for Mr. Needemand.

You are the creator of Border Age. I would like to know more. I went to your website, and I still have a few questions. How many years have you been established? Would your model, your technology, fit with the definition of age estimation as described in Bill S-209? How many employees do you have?

Mr. Polit: My name is not Needemand. That’s the name of the company. My name is Jean-Michel Polit.

Senator Saint-Germain: Apologies. I’m interested in Needemand, but Mr. Polit, I rely on you for the answers.

Mr. Polit: The company was created eight years ago. The founder is a person who has been working within the AI field for a long time. He created this company to train people on AI, and he has used this business model of training people on AI to fund the development of Border Age. That took eight years, as I mentioned. Border Age came to fruition, the solution was finally marketable last summer, so it’s very recent. It’s a small start-up with six of us, but we expect to grow very quickly.

Senator Saint-Germain: Has any medical authority certified your technology?

Mr. Polit: What I can do is share with you some of the medical research that we initially used to establish our technology. It’s medical research, medical science.

Our test results and methodology were validated by the ACCS, which is the outfit that was mandated by the Australian government to run the tests in Australia. The test in Australia doesn’t really validate the fundamentals of our technology, but it validates the results, the accuracy and the robustness of the technology in a real-life environment with a few hundred people being tested.

I’d be happy to share with the committee the research papers that were the foundation of our technology. This was the beginning of the story eight years ago. We took that as a basis, a foundation for the development of our solution, but we’ve gone beyond the findings of this medical research. I’d be happy to share the sources.

The Chair: Thank you. We look forward to those documents.

Senator Simons: Mr. Corby, I’m going to start with you. When data is collected and stored in Canada, it is governed by Canadian law, not just the work of the Privacy Commissioner but actual legislation. If Canadians are sharing their data with age‑verification systems that are based outside of Canada — and I think you said you have no Canadian members — would they be governed by Canadian legislation or by the legislation of the home jurisdiction?

Mr. Corby: I think the question of extraterritorial jurisdiction is really complex when it comes to data protection. We’ve seen examples of data protection authorities in one country — for example, the United Kingdom — seeking to enforce against companies in Canada, actually. The Clearview example comes to mind. They tend to cooperate with one another. I would certainly be more comfortable with data being processed in a country that has a robust data protection regime in place, and that may be part of the regulations you would want to introduce.

What we do through our code of conduct, I think, is probably sort of built into the law we’re discussing today to some extent. One of the requirements is that the age-verification system generally complies with best practices in the field of age‑verification estimation and privacy protection. Obviously, those best practices would, in my mind, be the same as the standard levels of data protection you offer in Canada under your legal framework. I would regulate to look for equivalence of that.

Senator Simons: I guess what I’m concerned about is that if we don’t have a Canadian-based company with the technology and the expertise to do this work, Canadians could be sharing their data into other jurisdictions, and then they would have no recourse under Canadian law if there were a data breach.

Mr. Corby: Yes, and I think we saw a little bit of that in the U.K. here when we went live on July 24 and a number of very big global sites were redirecting users to age-verification providers, particularly in the U.S., that perhaps they’d never heard of. I think there was some disquiet about that.

Now, we do our best as a trade association to mitigate that through our code of conduct and through advising people to only use audited and certified solutions where effectively that privacy protection is built into the audit. I think you should look at putting on some extra layer of protection for Canadians to make sure that there is that attention to privacy being paid wherever the processing may be taking place.

Senator Simons: Perhaps for you and for Mr. Polit, there’s been some discussion around the table here about whether 18 is the correct age. After all, in Canada, the age of consent is 16 and you can be married at 17, yet we would be regulating at the age of 18 for access to pornography.

I don’t want you to answer that question, but I want you to answer for me a technical question. The difference in appearance, whether it’s your hand movements or your face, between somebody who is 17 and 11 months and someone who is 18 and 1 month is going to be very difficult to judge. Presumably, if the age were 14 or 16, it would be easier for any kind of age-estimation protocol to estimate the age. Can you talk to me a little bit about how much more effective your methodologies might be for somebody with a slightly younger age versus — the difference between 17 and 18 is very subtle.

Mr. Corby: I didn’t quite catch who you directed that at, but I’ll start briefly and leave my colleagues to talk about the specifics.

We would never recommend using an estimation tool for an exact age qualification. Estimation is generally done with a buffer age, and you use it to check the people who are a few years above the legal age or clearly older and therefore permitted. Those close to the legal age would normally need to find a verification or an inference method, which gives you a more accurate decision so you know, for example, that yesterday was their sixteenth birthday. No estimation solution is going to find that. It will get close, but if you want an exact answer, you will need to look elsewhere.

Jean-Michel, how would you apply that to the younger ages?

Mr. Polit: I just want to come back to a previous question that I did not answer and is actually tied to this question. Our technology is not age verification; it is age estimation, although we don’t send back an age. We send back a 0 or 1 for under or over a certain age limit.

The underlying principle of our technology is the fact that our nervous system matures very rapidly in our teenage years, or basically between 10 and 25, and it applies whether you’re talking about someone 18, 16, 13 or 15. It applies to all of these age groups. Today, although our technology has been tuned up for 18, it would just be a matter of adjusting it. We were in the process of doing this, for obvious reasons, because, in Australia, the age limit for social media is 16. We’re in the process of training our AI models against other age limits, and we know they will work with the same level of accuracy.

I would add to what Mr. Corby said. Our age-estimation technology is very different in its principles and results to facial analysis in the way that when we developed our technology and tried to come up with an age estimate as opposed to a zero or a one, around 18, we are plus or minus two months of accuracy. There doesn’t really need to be a buffer with us, although we’re not 100% so obviously there will be false negatives and false positives. Our interim testing yields to the fact that we were 99% accurate. In Australia, the test found that we were 97% to 98% accurate. Obviously, if somebody wants to be sure or if a platform wants to be sure that we won’t let anybody go through that shouldn’t or that we block people that shouldn’t be blocked, there needs to be a second way of verifying the age. In fact, we’re already working on another anonymous technology that will come about in the next few months that we’ll be able to combine to get very close to 100%.

The short answer to your question is, yes, our technology can be applied to other age limits than 18 — it could be 13, 16 or another age group — and it will be as efficient and as accurate.

Senator Simons: Based on what Mr. Corby said, for anybody under the age of 25, estimation probably won’t work and they will have to default to some other kind of verification where they show ID of some kind or have a much more intrusive look into their online footprint. It’s particularly problematic for young adults who are of age and are legally entitled to look at whatever they want to look at, and they are going to be the ones who will be the most affected because age estimation or approximation is not going to be suitable for them.

Mr. Corby: I would say 25 is too high of a figure to put on it given by what we know about the accuracy being delivered by the various estimation techniques. Let’s have a working assumption of 21 so we give ourselves a three-year buffer age for 18. Yes, the sort of things those people could do is share their email address, share their cellphone number or ask their bank —

Senator Simons: Precisely. That’s just —

Mr. Corby: — to confirm their age. There are multiple alternative methods, but they do rely on finding an actual date of birth, you’re absolutely right.

Mr. Polit: With our technology, the clients we have to date do not use a buffer. They go straight with our technology. Again, when I say plus or minus two months, that means that most people who are above 18 and 2 months or under 17 and 10 months will be identified correctly. Then, with that second technology that’s coming about that we’ll introduce probably before the end of the year or maybe shortly after the beginning of next year, we’ll be able to combine the two technologies. Since they’re totally different technologies, we won’t do it twice.

The Chair: Thank you, sir. That’s good.

I’m now going to ask Ms. Dawson to say a few words to see if her voice can be translated and we can receive her evidence.

Ms. Dawson: Certainly. KJM in Germany has recently stated three years of accuracy for facial age estimation to access adult content. Is that clearer?

The Chair: It’s being translated, so yes. Did you want to answer Senator Saint-Germain’s question that you were unable to answer, or do you want to just leave that for the moment?

Ms. Dawson: I was going to mention that for ages 16 and 17, we are at nine months of accuracy in terms of facial age estimation, and hence, it is the choice of the regulator if they would like to put a buffer on top of that. Germany is currently looking at a three-year buffer, and that would depend on each regulator’s view of the false positives, the false negatives and the other alternatives available.

The Chair: Senator Saint-Germain, would you like to make a comment?

Senator Saint-Germain: On a supplementary, the answer you gave me related to AgeGO and a data breach, and you said you were not aware of it. Does that mean that, as per your contract with this company, they have no accountability to you when such incidents occur?

Ms. Dawson: We find with lots of platforms that they might be using a range of different services. Say you look at Instagram, Meta or OnlyFans. They might use several different services for age assurance. They might also use their own inference approaches.

One of the things in the upcoming international age standard will be for platforms to actually issue a transparency report as to what they’re actually doing with age signals, which age signals they’re using, and which are internal to them and which are from external providers. At the moment, that isn’t something that most regulators ask for in terms of detail.

Would it be on us as an organization to sort of tell tales on the companies we work for? Is that something that the regulator would be best placed to actually do to undertake to see if a company is not doing what it ought to be doing?

Senator Galvez: Thank you for this very interesting conversation that we’re having today.

When preparing for this meeting, I was looking at the news to see how much progress has been made on this. I was very surprised to see that the number of countries adopting laws that limit the access of children to the more harmful parts of the internet is growing, particularly with respect to our peers at the G7.

I know artificial intelligence is in an algorithmic scale of development. In terms of the things you’re talking about, I think they will be solved, if not in the next month, the next year. Here in Canada — Montreal, Windsor, Toronto — we have a huge hub on artificial intelligence. I wouldn’t be surprised if soon we will have in Canada a branch or a company like yours.

Does Canada risk falling behind its peers by not adopting proactive approaches on this issue? I am new to this area, but I’m worried because I was a professor for 35 years, and I dealt with students in engineering, many of whom had drug and pornographic addictions, so this really worries me. What is in the balance if we don’t do anything? Thank you.

Mr. Corby: Perhaps to speak on the industry’s behalf in general, I’ve been doing this for six years now, and really, the momentum has increased very rapidly in the last 18 months.

We’ve seen the European Union pass the Digital Services Act, and they are now aggressively enforcing the requirement for age verification against what they call the very large online platforms. They’re also creating their own age verification app linked to the European Digital Identity Wallet so that they can guarantee there’s a way to enforce that. Obviously, some citizens are not keen to use a government-issued ID and prefer to use private sector alternatives, which they can do, but there’s been a big effort in the European Union. You’ve heard mention of Australia, which is passing requirements both around social media and adult sites. Over 24 U.S. states have passed laws in this area. Brazil is now also bringing in a law for age verification.

I think one of the issues is some countries have been concerned that maybe they tried to let the best be the enemy of the good and have been waiting for a perfect solution. Technology moves very fast. As my colleagues have said, nobody is offering perfection, but as of today, 0% of Canadian children are protected. We can guarantee 95% of Canadian children can be protected tomorrow if you pass this law. As the technology continues to improve, that will get better. I’m sure there will be other innovations with AI and other technologies as well, and I look forward to them. Anything that helps make children safer, I would welcome.

Senator Galvez: Can we hear from the other witnesses?

Ms. Dawson: I would concur that there’s been incredible innovation and investment in approaches for this across political parties of all colours. I think we could probably count about 30 nations around the world that are looking at this for different age-restricted goods and services.

We have had now two global age assurance conferences — the next one is happening next year in Manchester — where not only is there a data protection regulator but a content regulator, sometimes law enforcement, sometimes the bodies looking after alcohol, tobacco, vaping, as well as adult content and social media. They are all starting to understand that, actually, what we used to do, which was require a physical document in person, isn’t good enough anymore.

The fact is that all of us are human estimators in a way, and I could look around this room and think 21 or 25, but with technology, we can make that much more equitable. It can happen at scale, and images can be deleted. I think that’s where we are embracing technology, as Mr. Corby said, to try to support this problem of age-appropriate access to a whole range of goods and services around the world and, using the phrase of the Age Verification Providers Association, to enable the internet to be age aware, and also the offline world.

Mr. Polit: We’re getting interest from very large platforms, not only in the adult industry but beyond the adult industry, where laws have been voted on. The online gaming industry is another one. They want to do the right thing. They want to show their clients that they’re doing the right thing to protect children, because, of course, online, there are a lot of hazards and dangers for young kids. For example, if they’re chatting online as they’re playing games, there could be bullying and a lot of different issues. There are a lot of cases where things go very wrong for some teenagers. These companies — these are worldwide companies making billions of dollars — want to make sure they’re doing the right thing, even though the laws in these industries may not be pushing them specifically to go in that direction. They want to go in that direction themselves. It is the same with, in some regard, social media. I mentioned online dating also where the laws are not specifically targeting them, but they’re moving in that direction too.

My feeling is that as these very large corporations and platforms that have millions of users worldwide embrace these technologies, in our case, it ensures their clients’ privacy online, and that will entice the governments beyond the ones that have already voted on laws and beyond the sector of the adult industry to in some way push governments to follow these platforms and to basically satisfy — we’re all parents. I’m a parent, and I’m a grandfather. I want to do the right thing for my kids and grandkids. These people from these large corporations — I might be dreaming — also want to do the right thing for their own companies, because as they implement these types of solutions, they gain competitive advantage over the companies that don’t implement these solutions.

It’s a global trend. Everywhere you go in the world, parents talk about this on a daily basis. It’s not going away. Platforms, whether they are forced by law into doing it or not, are going to do it; it is a matter of time. As some big platforms out there do it, it’s going to be a matter of fact that laws will have to be devoted to go along with that very powerful trend that is going on worldwide.

[Translation]

Senator Oudar: I thank all three of you, Ms. Dawson, Mr. Corby and Mr. Polit. I believe your testimony is truly crucial and will help us assess whether all age estimation technologies can be integrated into our regulatory system without creating a surveillance infrastructure. That is where I am going with my question.

I’ll start with you, Ms. Dawson. You advocate for an organization that collects, stores and processes citizens’ personal data while being subject to strict transparency and security requirements, including requirements to publish algorithmic error margins. In short, the use of artificial intelligence to estimate age is based on massive data collection on a global scale. If Canadian regulations required transparency in model systems, would there not be a paradoxical risk of exposing exploitable vulnerabilities in cybersecurity or even circumvention of algorithms?

[English]

Ms. Dawson: The way this algorithm has been built — and we explain this in our white paper — is chiefly through the Yoti reusable digital identity Apple wallet, where over 20 million people around the world now have voluntarily set that up. At the point of setting up the app — or subsequently — there is a data element. Through that, we just take the face, with month and year of birth — no other details. That is the ground truth for the facial age estimation. When we’re doing a live check, we don’t learn anything because there is no ground truth. If, for Facebook dating or Instagram, they send us one image, we look afresh at that one image, do a likeness detection, analyze it and delete it. We don’t have any ground truth with that. So if there is more expansion of use of safe age-estimation data in Canada, we don’t learn anything about Canadian citizens. Regarding the million checks we do every day, we do not learn anything from those new facial age estimations that we do daily.

In all of the 12 approaches that we have, they all have to go through data-privacy impact assessments. We’ve done that with a third of the largest global platforms. You can imagine the likes of Lego, Xbox — all of these platforms are very clear on what is the data you have to respect, as well as the processing and the cybersecurity. So we do have all of those elements in place. We have the independent audits. We have gone through the reviews and, for example, we have the seal of approval from the German regulator way back in 2020-21, where they initially put a five‑year buffer for facial age estimation. This year, they have reduced that down to three years.

For every element of testing that has been put forward, we happily have put ourselves to, and all of those can be made available to the committee, if that is helpful.

Senator K. Wells: Mr. Polit, when looking at gesture dynamics and your technology, can you talk about how that would work or how it’s been tested with persons with disabilities, perhaps people with different body types or people born without hands or limbs?

Mr. Polit: That’s a very good question. Obviously, people who can’t move their hands for any reason in a so-called “normal way” cannot use our technology — that first technology — but, as I mentioned, we’re working on a second one that is totally different. The user can use a webcam. That is going to be able to be used by people with disabilities. Inclusivity is obviously very important. The conjunction of these two technologies — the first with the hand and second — I can’t talk more about the second because there is a patent on the way in France and internationally. Until the patent process moves further along, I cannot share more details about the technology. Currently, though, the hand technology doesn’t work if somebody has a disability.

Senator Dhillon: I appreciate everyone being here today.

I offer a quick reminder to us all that we’re building laws based upon some principles, and based upon that, the technologies and various initiatives that we’re talking about today — we’ve heard that much of this could change in the next six months, a year or five years from now. By the time this actually comes to fruition, we may be at a different place and talking about different technologies that requires zero — as we have heard today — personal data to be shared.

I will come to my question. We’re talking about age estimation, which, from what I understand, allows for the majority of folks who are looking to access particular websites to be verified. They then go on their merry way to look at whatever they choose to look at. Then, those who are not successful through age estimation move into the second stage, which is age verification, which is a choice by that individual to offer further information to confirm their age to then access those websites. Do I have that right?

Mr. Corby: Yes.

I would add one further thing to your age verification point. You can do a number of age-verification techniques entirely on your own device so that the data you share never even leaves the palm of your hand. That would include, for example, reading a driver’s licence and doing a likeness check and selfie check against a face. There are some of our members that have shrunk the technology down so the only thing you’re sharing is the signal saying you’re over 18. So it is a mistake to assume you have to give up all this personal data, even if you’re doing a verification.

Senator Dhillon: Thank you for the clarification.

That adds to my earlier submission, which is that we’re moving to a point where we are actually having to provide minimal elements, if any, to arrive at that space where that age is verified. Would I be correct in that assumption?

Mr. Corby: Indeed. In my earlier statement, I mentioned the interoperable tokenized solution. That would mean you would only prove your age once and maybe share that data, if you have to share it at all, just that once in order to have a reusable token that the regulator may approve you to use for the next three months, for example, without having to repeat the process. That, again, minimizes the data required at any point in time.

Senator Dhillon: I have a final question. In this space of age verification and age estimation, what would you say would be the likelihood of a data breach occurring where the private information of citizens is compromised to the extent that their livelihoods, personal information and any element of their identity are misused to the point where they’re placed in significant harm?

Mr. Corby: I will be honest with you. I sleep soundly at night around my own members because of our strict code of conduct and requirements for data minimization. I would be very comfortable with age-verification providers that have been audited and certified to international standards, but I cannot speak for every company in the world that might claim to offer an age-assurance process that may not live up to the same high standards. That is a responsibility based upon good regulation and data-protection authorities to make sure the regime you put in place in Canada is insulated from some of those risks on the edge cases. But I’m very confident in the high-quality industry I represent.

Senator Dhillon: Thank you very much, Mr. Corby.

The Chair: Colleagues, I see no other senators wanting to ask any questions here this afternoon, so this brings us to the close of the panel.

Witnesses, on behalf of the committee, I thank you for your presentation, and participation today and for your valuable contribution to our study.

Senators, in regard to the special studies, I want to thank all the members of the committee for submitting their ideas and suggestions. Steering had an opportunity to discuss those ideas briefly yesterday, but we had a truncated meeting, for reasons you know, and will require some additional time to consider that. What we’re proposing is that steering will defer the discussion on future business to Wednesday, October 29, at which time the full committee will have the opportunity to look at the proposed studies and make determinations and discussions on the plan for next steps. I give that to you for your information. Do you have any questions, senators?

I would also like to further note that we have Bill S-205, providing alternatives to isolation and ensuring oversight and remedies in the corrections system, Tona’s law, sponsored by Senator Pate, a member of the committee, and that has been referred to this committee. The study of the bill will be expected to begin after we conclude Bill S-209.

Senator Batters: I just wanted to make a brief comment, not related to that but on another matter that is happening right now.

The justice minister is currently holding a technical briefing. It is supposed to be for parliamentarians, which means MPs and senators, and the justice minister chose to commence that technical briefing at 10:30 a.m., the same time the Senate Legal Committee meets every single week that Parliament sits. I find this completely unacceptable, and I actually wrote to him when I found this out late yesterday, during our meeting yesterday. I wrote to him and copied it to the government Senate leader, Pierre Moreau. I have heard nothing from either one of them. I asked that it be rescheduled for senators because all of us are here at this meeting, and all of our key policy staff are here at this meeting. I believe that the justice minister is bringing forward a very important bail reform bill that he has been long talking about. The fact that we’re not able to be part of that technical briefing today is not acceptable and the government should not do things like that.

The Chair: Senator Simons, do you have a comment on this issue?

Senator Simons: Hear, hear! Senator Batters is correct. I thought the timing was very unfortunate.

The Chair: Are you suggesting that the chair of this committee make a communication? The idea would be that we find a time for a special Senate technical briefing. I’m not sure what kind of time that would be, maybe a Monday or Friday next week. I will make that outreach, and I guess we’ll expect cooperation, for reasons that should be quite obvious. Thank you very much.

Senators, this meeting is now adjourned. Thank you.

(The committee adjourned.)

Back to top