
THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS

EVIDENCE


OTTAWA, Wednesday, October 22, 2025

The Standing Senate Committee on Legal and Constitutional Affairs met with videoconference this day at 4:15 p.m. [ET] to study Bill S-209, An Act to restrict young persons’ online access to pornographic material.

Senator David M. Arnot (Chair) in the chair.

[English]

The Chair: Good afternoon, everyone. My name is David Arnot. I am a Saskatchewan senator and chair of this committee. I invite my colleagues to introduce themselves.

Senator Batters: Senator Denise Batters from Saskatchewan.

[Translation]

Senator Miville-Dechêne: Julie Miville-Dechêne from Quebec.

[English]

Senator Tannas: Scott Tannas from Alberta.

[Translation]

Senator Oudar: Manuelle Oudar from Quebec.

[English]

Senator Prosper: Paul Prosper, Nova Scotia, Mi’kma’ki territory.

Senator K. Wells: Kris Wells, Alberta, Treaty 6 territory.

Senator Simons: Paula Simons, Alberta, from beautiful Treaty 6 territory.

Senator Pate: Kim Pate. Welcome. I live here on the unceded, unsurrendered, unreturned territory of the Algonquin Anishinaabe people.

[Translation]

Senator Clement: Bernadette Clement from Ontario.

Senator Saint-Germain: Raymonde Saint-Germain from Quebec.

[English]

Senator Dhillon: Baltej Dhillon from British Columbia.

The Chair: Honourable senators, we are meeting to continue our study of Bill S-209, An Act to restrict young persons’ online access to pornographic material.

For our first panel, we are pleased to welcome, virtually via videoconference, from the Digital Governance Council, William Darryl Kingston, Executive Director. We also have, in person, from Google Canada, Jeanette Patell, Director of Government Affairs and Public Policy. Welcome, and thank you both for joining us today. We will begin with your opening remarks, and then we will move to questions. We will start with Mr. Kingston, to be followed by Ms. Patell. Mr. Kingston, the floor is yours for five minutes or so.

William Darryl Kingston, Executive Director, Digital Governance Standards Institute, Digital Governance Council: Thank you very much, honourable senators. My name is William Darryl Kingston, and I go by Darryl. I really do appreciate the opportunity to appear before you on Bill S-209, An Act to restrict young persons’ online access to pornographic material. I am the executive director of the Digital Governance Standards Institute, which is a Standards Council of Canada-accredited standards body and part of the Digital Governance Council.

Bill S-209 addresses a real and urgent issue. Young Canadians are increasingly exposed to explicit online content, and the organizations that profit from such material have a clear responsibility to ensure it is not made available to minors. The bill recognizes that modern age-assurance technologies can help meet this obligation in ways that respect privacy and individual rights.

A key feature of Bill S-209 is that the Governor-in-Council will determine what constitutes acceptable age-verification methods. This authority is central to the bill’s success. Determining what is acceptable must rest on a foundation that is technically sound, privacy protective and trusted by Canadians. It requires an approach that offers clarity to organizations and accountability to the public.

I’m pleased to share today with this committee and with Canadians a new National Standard of Canada for age-assurance technologies. It was published in August of this year by the Digital Governance Standards Institute and approved by the Standards Council of Canada. The National Standard of Canada was developed through a public- and consensus-based process and has been designated as CAN/DGSI 127. The standard sets requirements for the responsible design, implementation and use of technologies that estimate or verify an individual’s age. It offers a structured approach to privacy, proportionality and risk management, and addresses, with requirements, the criteria prescribed in subclause 12(2) of the bill, turning the intent of Bill S-209 into responsible action that protects our youth and that Canadians can trust.

For that reason, my recommendation would be that the Governor-in-Council formally prescribe technologies that have been independently verified as conforming to CAN/DGSI 127 as meeting the requirements of age-verification and age-estimation methods specified in subclause 12(2) of the bill. Such a recognition would provide consistency and clarity for organizations in demonstrating compliance and would give Canadians confidence that verified systems are meeting the high standards expected of technologies used in the public interest. By linking Bill S-209’s legislative framework with a recognized and approved National Standard of Canada, the Government of Canada would ensure that this important act is implemented in a way that is transparent, privacy-preserving and trusted by Canadians.

Honourable senators, Bill S-209 is a timely and necessary measure. The alignment of this act with a national standard like CAN/DGSI 127 is not just good governance; it is a model for how Canada can lead responsibly in the digital era. It demonstrates that protecting children online and protecting privacy are not competing goals but shared obligations.

Thank you very much for your time.

The Chair: Thank you.

[Translation]

Jeanette Patell, Director, Government Affairs and Public Policy, Google Canada: Good afternoon, Mr. Chair and honourable senators.

[English]

Thank you for the opportunity to appear today.

[Translation]

Thank you for inviting us to contribute to your study on Bill S-209, An Act to restrict young persons’ online access to pornographic material.

We understand that this bill will be targeted at providers of online pornography — and not a broader class of websites and services, such as search engines or internet service providers. And we are proposing an amendment that will provide greater clarity in this regard.

[English]

At Google, we treat age assurance as part of a multilayered approach to youth protection, and we support a risk-based, service-level response that balances privacy, access and safety considerations. Google Search does not itself host content. Instead, as our mission statement says, we seek “ . . . to organize the world’s information and make it universally accessible and useful.” We have a range of systems, tools and policies designed to help people discover content across the web while not surprising them with mature or explicit content that they have not searched for.

More kids and teens are spending time online than ever before, and parents, educators, child safety and privacy experts, as well as policy-makers, are rightly concerned about how to keep them safe. We know that each family’s relationship with technology is unique, and we provide tools and product features that put parents in control of their children’s online experiences. We’ve also built default protections that create safer age-appropriate experiences across the board.

Since 2017, Google’s Family Link has enabled parents to manage their children’s accounts and devices as they explore online, with a range of features designed to help parents set digital boundaries, protect users’ privacy and automatically filter out content not appropriate for kids. Google SafeSearch, which helps filter out explicit results, is on by default for all signed-in users under 18, including those who have accounts managed by Family Link. Our dedicated YouTube Kids app, teacher-approved apps in Google Play, and the Gemini app offer experiences that are customized for younger audiences.

To help build awareness of the tools available and to empower kids to be smart and safe online, we collaborated with global experts to develop the Be Internet Awesome program, a suite of free educational resources for children, educators and parents. We bring this program to life through our Online Safety Roadshow, which we launched in Canada this spring with visits to schools in the Edmonton and Waterloo regions. We will be here in Ottawa next week to kick off Media Literacy Week with the Ottawa Catholic School Board.

[Translation]

We see online safety as a shared responsibility. Our investments and innovations have often placed us at the forefront of positive change in the industry in this regard.

[English]

Earlier this year, we announced the rollout of our age estimation model, which helps us estimate whether a user is over or under age 18. While it is one of our most complex challenges, understanding the ages of our users helps us ensure we’re applying the right protections to the right users so we can treat teens like teens and adults like adults. In our view, to be effective and appropriately balance privacy, safety and access to information, age-assurance measures need to be proportionate to the underlying risks. Higher-risk content and services such as pornography sites should require higher assurances of age, such as verification. As we said, we believe that the content providers, the websites themselves, are best positioned to establish the risk associated with their service and to put in place the right method to limit access to the service to underage users.

Turning to the legislation before the committee today, there is an understandable concern about the ease with which some young people access pornographic content online. Bill S-209 seeks to address this challenge by creating an obligation to determine the age of an individual before they can access online pornography. The intention appears to be to establish this new obligation for providers of online pornography and not a broader class of websites and services, such as search engines or internet service providers.

To provide greater certainty on this and to clarify that the obligations don’t inadvertently scope in other unintended services, we are proposing an amendment to clause 6 of the bill that emphasizes the primary intent of the service.

I am happy to answer any questions with regard to this proposed amendment and Google’s approach to age assurance more broadly.

[Translation]

Thank you again for giving us the opportunity to contribute to your study on this bill.

[English]

I look forward to engaging with honourable senators and this committee on this important topic.

The Chair: Thank you, Ms. Patell.

We will move to questions from senators. So that the public understands our process: senators signal to the clerk that they want to ask a question, and they ask their questions in the order presented here.

Senator Batters is the deputy chair of this committee and will ask the first question.

Senator Batters: I appreciate you both being here today.

I would like to start out with Mr. Kingston. I was trying to make a few notes but missed a bit of the specificity of what you were saying on a few points. I certainly noted that you believe that this legislation, Bill S-209, could actually be a model for how Canada treats this type of material and a model for jurisdictions around the world. Could you please tell us a bit more about that, but also tell us a bit about the Digital Governance Standards Institute, just so that Canadians who may be watching can know a little bit about your organization and what you do?

Mr. Kingston: Thank you for the question.

The Digital Governance Standards Institute, as noted, is part of the Digital Governance Council, which is an executive forum composed of members from both private and public organizations. We have three streams of activity, if you will.

The first is the executive forum, where we bring together senior leaders to discuss collective priorities in the digital space and then to take some type of collective action to address some of the priorities that they have identified.

The second is the Digital Governance Standards Institute, an accredited standards development organization, which essentially means that we are able to develop standards and then publish them as national standards of Canada. We do that through an open, transparent, consensus-based process that we lead, where we bring together stakeholders from across the country, as well as international stakeholders, and leverage their expertise in the standard that is to be developed.

The last activity is that we offer verification as well as validation activities to the standards that we develop. If an organization is seeking third-party accreditation to demonstrate their conformance to a standard, we offer both those verification and validation activities.

As part of the work we’ve done in developing CAN/DGSI 127 and in putting together that first draft, there is a research component that goes into looking at what exists out there in the market, whether it is existing standards or existing policies. We benefited from lots of international as well as national expertise around the table when developing this standard. We looked at models in the U.K., et cetera.

In my remarks, what I was referring to is that we do have an opportunity to leverage a standard that was developed using an open, transparent process where we solicited the input of subject-matter experts and then to rely on the standard as a benchmark, if you will, to uphold what is being proposed in Bill S-209.

Senator Batters: Thank you.

Ms. Patell, first of all, I’m not sure of the exact corporate structure, but I want to acknowledge that YouTube is a Google company, one of your affiliated companies, and YouTube, as you were mentioning to a few of us beforehand, does not have pornography on its platform. That is because, as you were saying earlier, it does not meet your standards. I appreciate that.

I’ve just been taking a very quick look at the amendment that you propose, and I think it might go a little far from what I can see. It would remove the wording in that particular clause referring to an organization that “incidentally and not deliberately provides a service” and would leave only “an organization that provides a service that is not primarily intended.” From my point of view, that goes a little too far, because it could exclude services where such material is an intended use of the service, just not its absolute primary intention; it should not be required that it be the primary intention 60% of the time or something like that. Maybe you could comment on that.

Ms. Patell: Of course.

When we looked at the language that has been proposed in clause 6, “incidental and not deliberate” — please correct me if I am wrong and if I’m making any assumptions here — this was not language we have seen used in the context of age-verification or age-assurance legislation elsewhere. Given the important topic here and the need to strike the right balance in maintaining access to information, preserving privacy and protecting users, we thought that precision would be prudent. That way, we have clarity on the scope of services envisioned to be covered by this legislation rather than leaving it to the discretion of the Governor-in-Council. We looked at language that has been used in other jurisdictions that does speak to the primary intent.

Regardless of the ultimate language used in clause 6, we’re driving toward that precision and clarity because the conversations and the trade-offs become very different if you’re talking about requiring or conditioning access to search engines, internet service providers or cloud storage on Canadians disclosing sensitive personal information. My understanding is that those were not the intended services that the bill envisioned, so we wanted to ensure that that was accurately and adequately captured in the text itself.

Senator Batters: The Governor-in-Council, I should say, is the federal cabinet. Regardless of whether it says “not primarily intended” or whether it continues to say “incidentally and not deliberately provides a service,” it will be the Governor-in-Council that decides the scope of how far Bill S-209 goes, if it becomes law. Is that not correct?

Ms. Patell: That’s an important point.

Regardless of where this needs to be established in the bill itself, there is a lot of value in having clear and precise language about the scope of services that is subject to the law, and it makes a lot of sense to have flexibility around the technologies that are deployed, because this is an evolving space.

When it comes to the scope of services, I think the conversation becomes very different if the conversation is about predicating access to high-risk content to underage users versus access to information and services that are so foundational to how the internet works. As policy-makers engage in this conversation and in discussions with stakeholders: What trade-offs are involved? How do you preserve privacy? How do you maintain access to information for users? How do you treat teens like teens but also treat adults like adults and not like children — unless they prove otherwise — just to access something as fundamental as search? We want to make sure that that is well captured in the legislation itself instead of leaving it to the discretion of a future government and empowering them to expand the scope of services that would be subject to the law.

I might be repeating myself here, but this goes back to having the right conversations with stakeholders about what exactly is envisioned. We believe that you need to place the controls, which are very legitimate, closest to the point of risk for the high-risk content. That is very consistent with the offline world. If a teenager is going to the mall — or when any of us go to the mall — we don’t get ID’d going into the mall. However, if you try to buy a bottle of wine, the LCBO clerk might ask for your ID — if you are lucky. That recognizes striking that balance of having a targeted approach that puts the control at the point of risk. A mall cop doesn’t need to know how old you are, but the LCBO clerk might. That’s the approach we think is appropriate and strikes that critical balance.

[Translation]

Senator Miville-Dechêne: Ms. Patell, my first question has to do with the amendment drafted by our legal experts, who are specialists in these matters under Canadian law. According to these legal experts, it is clear that the mere fact of mentioning the incidental nature — which means secondary — means that internet service providers and search engines are covered, because it is a secondary consequence of distributing material; it is not deliberate, it is incidental. In that sense, the wording is the one that was chosen. I would like to know which country you’re basing your wording on.

My second question is this: You’ve said several times that age verification should be done at the point closest to the pornography. I would like to know what you think about what Ethical Capital Partners said here, namely that it is you, Google, and other operators who should be doing this verification. It would be much simpler for everyone. Everyone would be covered, and we wouldn’t have the problem of porn sites refusing to comply with the law.

I’d like you to speak first to the amendment, and then I’d like to know why Google isn’t helping to protect children from pornography.

Ms. Patell: I’ll answer in English because it’s a very important and very specific question.

[English]

With regard to the language we’re suggesting, the closest reference, I understand, is from the U.S., but I will double-check with my colleagues to see if there were other jurisdictions that they were also inspired by.

Thank you for the opportunity to respond to what we heard from Pornhub just the other week here. When I take a step back and look at what Pornhub is proposing, it would essentially require all Canadians to provide ID simply for accessing the internet. It undermines privacy, it excludes those without ID and it disproportionately impacts marginalized groups. Not surprisingly, maybe, it also removes any incentive for sites like Pornhub to strengthen their own systems to prevent underage users from accessing inappropriate content.

No other jurisdiction has taken the approach that Pornhub is suggesting, and there are some good reasons for that. The first is this mass disclosure of data by default, and it is incredibly privacy-invasive. It is also disproportionate to the risk. When you want to have a targeted, risk-based approach, you focus the controls on a narrow subset: those users who are trying to access the high-risk content. You protect that point of risk as opposed to a blanket approach that absolves Pornhub of any responsibility for their own user base and the content they are making available.

There is the impact of mass data disclosure by default to marginalized groups, some of whom may not have access to government IDs, such as the unhoused or newcomers, and I would be remiss not to talk about the practical challenge, which is that it does not reflect the real situation of many families who share devices.

When we looked at that proposal that Pornhub put forward, it was such a clear misrepresentation of both the technology and trade-offs involved. Frankly, that is why other jurisdictions have not allowed porn companies to write the rules of the internet for everyone else.

Senator Prosper: Thank you to the witnesses for being here today.

Like most of us, I am trying to navigate my way through this. It has been quite a process and journey through the evidence and listening to witnesses. You discuss different approaches. This is a question to both of the witnesses, Mr. Kingston and Ms. Patell. There was a suggestion by the content providers — I think they used it in terms of an analogy — that if you want to get to the crux of things, you have to look at the device and seek out ways to have a certain level of protection on the device itself. If you were to leave it to the content providers, which is something I think you subscribe to — them having in place certain systems and processes for age assurance and things of that nature — you are dealing with a huge number of sites. One of the things cited is that it’s not really effective even in some jurisdictions that have legislation, such as France, I believe. What do you say to those who say that we should work from the level of the device itself as opposed to further down the chain of a wider substructure of content providers? Ms. Patell first, and then Mr. Kingston.

Ms. Patell: Thank you.

I will return to what I said earlier about a device-level approach being both problematic in terms of the implications for privacy for Canadians as well as practically problematic. To the question you asked earlier, we have seen that there has been a claim that it has been ineffective in other jurisdictions to require site-level verification. In addition to being wildly disproportionate to predicate access to the internet at the device level on providing sensitive personal information simply to access the world’s information, I also think that would have an even bigger blowback. If users are concerned about giving personal information to access high-risk content, being required to give something like government ID to access information writ large would, I think, invoke a whole different set of concerns from Canadians, and rightly so. That is sensitive, personal information, so it is not an appropriate solution to bring to the problem.

Senator Prosper: Thank you.

Mr. Kingston: Thank you for the question.

I think I agree with Ms. Patell. At the technical committee that developed our standard, there was obviously some discussion as to whether it is the device or the age-assurance-providing party. What I can say specifically about the standard is that some of the requirements on which we came to consensus were not around the device but around the technology and ensuring that it is designed, developed and deployed based upon the tenets of privacy by design, privacy by default, security by design as well as proportionality and data minimization principles. We also went as far as to ensure that it needs to be designed to be transparent, accountable and respectful of the rights and interests of individuals, including members of vulnerable populations. So in responding to your question, I would agree with Ms. Patell, just based on the discussions we had at the technical committee and where we reached consensus.

Senator Prosper: Do you wish to add to that?

Ms. Patell: Yes. Just to add one thing, it is a shared responsibility. When we look at how our products are designed, we are designing for an age-appropriate experience. As I said in my opening statement, things like SafeSearch — which, by the way, was developed in Montreal — ensure that, for account holders who are under 18, we are filtering out explicit results by default. We are equipping parents with controls, in tools like Family Link, to ensure they are having the right conversations and that they are setting the rules of the road for their children. Families can make the choices that are most appropriate for them. Even in a signed-out state, which is important to preserve, SafeSearch blur applies so that users are not surprised by explicit content.

We recognize that we have a role to play, while also ensuring that the content providers themselves, which best know the risks of the content they put on their sites and the users who are accessing that content, have the responsibility to protect users and protect against underage users accessing that content.

Senator Saint-Germain: Welcome to both of you.

My first question is for you, Mr. Kingston. In your opening statement, you said the council has now issued a new national standard. It is a minimal standard, if I understand. You recommended to us that we refer to the Governor-in-Council to formally prescribe technologies that would be acceptable. My question to you is: Do you believe that there is now, in Canada or elsewhere, a technology or technologies that would respect your minimal criteria?

Mr. Kingston: Thank you for the question, senator.

The answer to your question is that I truly do. As part of our technical committee that developed the standard, we had vendors of age-assurance technologies — those that develop such tools. We certainly feel strongly at the council that, by certifying age-assurance technologies and processes against consensus-based standards, it does provide added confidence and assurance that the design, use and accuracy of these technologies is reliable, while still maintaining users’ privacy and protecting their personal information.

Senator Saint-Germain: I am concerned about the dire consequences of data breaches. That is why I asked the question. I do not mean to trap you, so be at ease in responding in the way you deem appropriate.

A brief that this committee received from the Internet Society mentioned the recent data breach that impacted the platform Discord and numerous countries using third-party verification systems. We were told that:

The problem with these ill-thought-out laws, often devised by hapless politicians with no background in tech and who do not understand how the internet works, is that they are already breaking the open internet.

According to both of you, are we at risk of breaking the open internet with this bill?

Ms. Patell: My mic is on so I guess that is to me.

This is an evolving technology space. It is a delicate issue, and getting it right is difficult. We have seen that in jurisdictions all over the world. This committee is dealing with the question with the appropriate care and hearing from stakeholders. This is a committee known for diving deep into specifically, as the name implies, the legal and constitutional implications of the proposals in front of it. Having an approach that is targeted and risk-based, understanding that there are trade-offs involved and there is a balance to strike, is my best recommendation as you take this forward.

Mr. Kingston: I agree as well. I do believe that there are trade-offs, and you have to look at the implications of compliance and the risk assessment that is required. I would say to you that obviously there is some serious concern. Last week, or maybe the week before, we had the Canada/EU week, and one of the main themes was building trust, because in Canada right now there is a lack of trust. I would agree: I think you’re taking the right approach. It is a rigorous, deliberate approach to understand all sides before coming to a final decision.

Senator Simons: I wish to start with a question for Mr. Kingston. I am trying to understand this, and I think this is where Senator Saint-Germain was also going. You set standards. Who enforces those standards? How do you know that someone is following those standards? Can your standards be applied to international players in this field, or is it only for Canadian-based companies?

Mr. Kingston: There are many questions there to unpack. I will do my best.

We are a standards development organization accredited by the Standards Council of Canada. They have a mandate directly from Parliament that allows them to accredit organizations to develop standards as national standards of Canada. We do so by bringing together subject-matter experts, as well as anyone who has an interest in the subject matter, to contribute their thoughts to the development of these national standards of Canada.

When it comes to levers, obviously there are a number of different ways we can see our standards implemented, whether it’s through a reference by a regulator in legislation or by a company picking a standard up and referencing it in an internal policy. There are many different ways that standards can be recognized and then leveraged. Perhaps one innovative example: for one of our standards around cybersecurity, or cyber resiliency, in the health care space, an insurance company that provides insurance for health care organizations offers reduced rates to those organizations that can demonstrate through a third party that they are meeting the requirements of the national standard of Canada. There are many different levers, if you will, that either we or certification bodies can use to try to get organizations to demonstrate their conformity to the standard.

Another thing I wanted to add is that you were talking about international applicability. The Digital Governance Standards Institute has actually been recognized by ISO, the International Organization for Standardization, as well as IEC, the International Electrotechnical Commission. Essentially, what that means is we have the ability, with our national standards of Canada, to submit them to the international community for consideration as an international standard. A couple of weeks ago —

Senator Simons: I want to be clear, though. If I invent an age verification program in Canada, I am not required to live up to your standard. It is a thing I can put on my advertising to say I conform with the standards council, but you cannot enforce that standard on my company?

Mr. Kingston: The Digital Governance Council does not have the ability to do that, but if you require organizations to get third-party certification through either legislation or one of these other levers, you then can ensure and make organizations providing that service or providing that product meet the requirements of the standard.

Senator Simons: I will ask a two-part question and then be quiet. Are you suggesting that we should amend Senator Miville-Dechêne’s bill to say, perhaps in clause 12, that it must comply with the Canadian standard? And if we did so, could we impose that standard on an international company that provided age verification, a third-party company?

Mr. Kingston: Yes. As it pertains to the remarks that I was making, that is exactly what we are suggesting within that section. You are absolutely correct. For organizations who are offering these services to Canadians, if you make that part of the legislation, they are then required to meet those certification requirements given that it is part of Canadian legislation.

Senator Simons: Even if they are an international company?

Mr. Kingston: Yes.

Senator K. Wells: I want to pick up on that question Senator Simons was asking. To be clear, if we gave you the power today to amend this bill you would be looking to put in the requirement to meet those national standards; is that correct?

Mr. Kingston: Again, I cannot speak out of both sides of my mouth. There is due diligence and rigour required. But as part of our remarks and our experience in standards development, the recommendation we are making today is for honourable senators to look at standards and how standards can be used and leveraged in legislation to require organizations to meet those requirements.

Senator K. Wells: Let’s assume, then, that this legislation passes with an amendment for standards, whether it’s in this legislation or future regulations. What should be the penalty for violating those standards in case of a third-party breach?

Mr. Kingston: I’m not sure exactly what the penalties are for breaches of others, but there should be some obvious consequences assigned to those organizations that get caught not meeting the requirements as specified by the legislation or regulation.

Senator K. Wells: We’ve heard that some have advocated for monetary fines, and others have said that is not significant enough and it should be criminal in those cases. Given the gravity and potential damage of a breach if this legislation moves forward, what should the consequences look like in terms of holding those companies accountable, considering the public trust that, if they are going to give this information, it will be appropriately protected? One witness said that, in this age, data breaches are inevitable.

Mr. Kingston: From my own personal perspective, as I said, I probably agree. This is not a position of the Digital Governance Council, and I will make that clear now, based on the fact we have not discussed this. From what we’ve seen in the work we do, I would agree that trust by Canadians is not at a level that it needs to be when it comes to digital technologies.

The other factor is that in dealing with minors and the seriousness of the subject matter, I would agree, too, that if there is an opportunity to look at criminal implications for those who are not meeting the requirements of the law, then that would be an appropriate action to take. Again, I am just saying those are my views personally as a father of two children and not necessarily the views of the Digital Governance Council.

Senator K. Wells: I certainly appreciate that.

Ms. Patell, following on Senator Prosper’s comments, I am trying to understand device-based verification. We’ve heard from some witnesses that it is the least intrusive and the most protective of private information. Whether of Google’s own volition or as a federal government requirement, on any device such as a laptop or, let’s say, a smartphone, parents could enable age restriction — you’ve outlined many different ways that you are already doing this — at the device level without having to share age verification or age identification. They could then give that device to their child, protected by a password or a code to ensure that it could not be changed. We’ve heard from many witnesses that device-based verification is the most secure, the least intrusive and the least likely to lead to a third-party privacy breach.

Ms. Patell: Thanks for the opportunity to speak to that. There are a number of different challenges with this type of approach.

Maybe just to put it in concrete terms, when I read the proposal from Pornhub and how they envisioned this happening, it started with website blocking by default, where any websites with explicit content would be blocked and you would have to provide identification in order to unlock access to that information, which I would note is legal information. And then, essentially, the device providers — they focused on three device providers, and I would note that there are more, as this is a diverse ecosystem — would somehow have to monitor the usage across all of the interfaces, surfaces, what have you, of how that device is connecting into the internet.

Is that truly a risk-based approach? Is the risk at the device level, and is it proportionate to require individuals to share their personal information and tie their personal identity to how they are engaging with the world of information, which many find an incredibly personal experience, simply to access information that is not high risk? There are many challenges with even just the reality of how the ecosystem works and how the technology works that mean that information doesn’t transfer. We’re not telling the age of users to every website that they access.

This type of proposal invokes such important privacy considerations that it would need to be, I think, really well understood. The conversation with Canadians about what they are being asked to do — preconditioning their ability to use their devices, and how they access information, on their disclosure of sensitive personal information — is something we think there would be serious concerns around. It sounds like an easy solve, a silver bullet, but it is neither a silver bullet from the technology perspective nor from the trade-offs perspective.

I will return again to the very real situation of shared devices. We already know that when users are sharing a device in our ecosystem, the signals get a little bit scrambled. Right? That is just a reality.

If you want to, understandably, put controls in place to prevent access to high-risk content so that it is only being accessed by those of appropriate age, the right place to do that is at the content itself. That may mean that it is inconvenient or challenging for the content providers, and I understand that, but, nonetheless, when it comes to how Canadians access information and the proportionality principles and data minimization principles, it is important.

The Chair: You are at eight minutes.

Senator K. Wells: If there is time in the second round, we can come back and finish this conversation.

The Chair: Do you have a quick answer?

Mr. Kingston: I don’t think there is a quick response on device-based.

[Translation]

Senator Oudar: First of all, I’d like to thank both witnesses. This is really interesting and very important for the committee’s work.

I’d like to return to this discussion about the whole issue of privacy breaches, particularly with you, Ms. Patell. In terms of the position you’re expressing, there is indeed support for the protection of minors, and you are promoting a safe digital experience, voluntary codes, and the parental control tools you mentioned. However, we also understand from your testimony that you have some reservations about age verification, particularly with regard to the right to privacy.

I would like to know whether you’re aware of the position of the Privacy Commissioner, who appeared before us and published his opinion on Bill S-209; if so, what is your opinion on his opinion?

[English]

Ms. Patell: Thank you for the opportunity to speak to that.

In terms of my opinion of his opinion, it is limited. Correct me if I’m wrong, but my understanding is that he is also working under the belief that the bill is more narrowly targeted and that that was an appropriate approach. I think we would be aligned with that, and I would see as a shared goal to have a targeted, risk-based approach that minimizes privacy implications. Where the concerns around privacy are invoked would be around more of the proposal put forward by Pornhub to shift the application of verification upstream to the device level. That, I think, invokes a different privacy conversation, and I would welcome the Privacy Commissioner’s point of view with regard to what that would entail or the idea of privacy implications for having to provide identification in order to access something as fundamental as search. Those are the things we are speaking about when we talk about privacy implications for this bill. As I said before, there are trade-offs here. Striking that balance of protecting users while preserving privacy and maintaining access to information, which is a fundamental right of Canadians — achieving all three of those is our goal.

[Translation]

Senator Oudar: Thank you.

In fact, the Privacy Commissioner said that he was satisfied with the amendments made to the bill, particularly those limiting its scope, specifically with respect to improvements to the criteria applicable to the age verification mechanism in order to ensure privacy rights. In terms of the evolution of the legislation, he expressed satisfaction with the version we have before us today.

You also seem to have some reservations about the powers of the Governor-in-Council. I was surprised to learn that. I think it’s important for legislation to be able to evolve. I would like to make it clear that the regulatory powers can never exceed the scope of the legislation. I want to reassure everyone. It is important to have regulatory powers in the legislation that are well defined so that we don’t have to come back every year and be forced to amend a legislative framework that is too rigid.

I would also like to reassure you that this regulatory power isn’t a political act. It will adapt to technology and innovation. Regulations will be able to evolve.

They are pre-published in the Canada Gazette. You will be able to comment. In fact, the Privacy Commissioner asked to be consulted as well. The committee has taken note of that. Correct me if I’m wrong, but I may have sensed a lack of confidence in this power. I was surprised to hear that.

I would also like to reassure you about the use of regulatory powers by governments, whether federal or provincial. I believe they are exercised properly in Canada and in the provinces, because there is significant consultation that takes place and democracy that can also be expressed.

Does it reassure you to have these regulatory powers in the law? Reassure me too, at the same time.

[English]

Ms. Patell: Thank you. It is reassuring to hear the Privacy Commissioner’s interpretation that services like search engines or internet service providers should fall out of scope and that, in his view, it is accomplished in clause 6. This is a conversation where we are aligned on the goal and have some questions about the specific text to achieve that goal. That’s the purpose of our effort, to ensure that there’s clarity on the scope of services that are subject to the act.

When it comes to the role and the powers for the Governor-in-Council, I think I was responding to some of the conversation with Senator Batters about how ultimately it is for the Governor-in-Council to decide if a service is in or out of scope. That’s where I do see a distinction to be made in terms of the services in scope and the other rule-making powers regarding how you are achieving that goal. In our view, it is important to have precision on the services that are in scope because they invoke different considerations when it comes to all of the other elements here.

I appreciate very much the spirit in which everyone here is engaging, and I recognize that there is, perhaps, a different interpretation about what the current language says, but I am reassured that we seem to have the same goal.

Senator Pate: I would like you both to further elaborate on why you would propose amendments to the legislation rather than use the regulatory approach. From my perspective, the regulations could be kept more up to date rather than having to go back and deal with legislative change every time there is some new technology or new advancement. We are all very concerned about AI. I am concerned, but that also comes from a place of ignorance, of not even knowing what the space looks like. Why would you recommend legislative change versus regulatory change?

Ms. Patell: There is a desire to ensure that services have clarity and predictability in their operating environments. The tenor and content of the conversation that we would have with Canadians about how they are engaging with the world’s information, how they are engaging with their devices, how that information is being used and what is required, that changes if you are talking to Canadians about a specific site that a subset of Canadians are trying to access. It is a subset of content for a subset of Canadians who are specifically trying to access high-risk content. The conversation there necessarily involves a different set of trade-offs as opposed to some of the considerations required if you broaden that. If suddenly internet service providers were also captured under the same clause as clause 6, if suddenly access to internet service providers meant they were required to verify age, I think Canadians would have expected a different kind of conversation around that.

We are here to say we would like more precision. It sounds like people feel there is sufficient precision on this point, but to have precision on the scope of services to which the law applies at the front end ensures that there is trust among the Canadian public about what we are all agreeing to.

I take your point when it comes to the implementation of the bill. This is an evolving technology space, and there is work ongoing on standards. That work should be allowed the space to continue to evolve, and that’s completely appropriate for that to be set in regulation.

Mr. Kingston: I might add that, again, the standard focuses explicitly on the role of age-assurance technologies and facilitating a user’s cyber safety and articulates how best to implement age assurance in a manner that is privacy-preserving, secure, effective and efficient, as well as easy to use. Our note was around certifying or bringing that certification piece, because we believe that consensus-based standards provide added confidence and assurance that the design, use and accuracy of these technologies is reliable.

To your point about the update of legislation versus a standard that happens to be referenced in legislation, just so everyone has the same information, a National Standard of Canada has a shelf life of five years. There is a requirement for accredited standards development organizations to revisit a standard before its five‑year anniversary to reconfirm whether or not it is still relevant to the market it is directed towards. The Digital Governance Council actually has an internal two-year policy where, every two years from the publication of a National Standard of Canada, we go out and test the market to see whether there have been any changes or gaps that people have identified in using our standards. The hope or desire is to always remain market-relevant with the standards that we develop.

Senator Pate: Thank you.

Senator Clement: I will try to be quick. I see the time has passed. Thank you both for your testimony.

My question is for Ms. Patell. I remember interacting with you on Bill C-11, I believe. It is good to see you again. I heard you say “shared responsibility,” and you talked about the different programs that Google has, like Be Internet Awesome. Do they work? Do you have data on whether that stuff works? Are you spending more of your budget as Google on this type of safety? That, I think, is relevant for us to understand. We’re trying to do, as parliamentarians, what we do, which is to regulate and legislate, because we are feeling pressure, but I would be interested in knowing how you keep track of what you do, whether it’s working and whether you are spending more money on that.

Ms. Patell: That’s such a good question. It is hard to answer in concrete terms.

I do think it works to equip parents and families with more digital literacy. This is a space that is evolving, and I would love to share some information about our Online Safety Roadshow. I would love to invite you to the one we are hosting on October 27 here in Ottawa because this is engaging kids in an interactive way to help them be good digital citizens. We equip parents with resources on how to use Family Link and how you can have the conversations that you need to have in your own families and set the rules of the road that are the right ones for you. We work with organizations like MediaSmarts to translate — sometimes we need to have a translation — between big technology companies and families. There is a really 360-degree approach here. I think equipping families, whether you are a child, teen or adult, with knowledge does work.

When you look at our investments in this space, absolutely, we are investing a lot. We do that on an ongoing basis. Not only are we investing to continue to refine our products and to make sure that we are adopting and developing technology, like age estimation, to ensure we’re applying age-appropriate experiences for users on our services, but we are also working with external experts. That is something we are doing in-house, we are bringing in outside knowledge, and we are very much committed to that continuous improvement.

The Chair: Senator Wells, we are out of time. Do you want to ask a question to which the witnesses could perhaps provide written answers later? No? Okay.

I have a question for Ms. Patell and Mr. Kingston on the issue of digital literacy. Mr. Kingston, your organization is a proponent of digital literacy. You won’t be able to answer this question verbally here tonight, but if you would like to amplify any advice you might have for the committee on that issue as it pertains to this bill, that would be quite helpful. Similarly, Ms. Patell, if you would be so kind as to consider that, also. I know you spoke about it just now, but if you would like to amplify that, it would be helpful to us.

Thank you, colleagues, for your questions. Thank you, witnesses, for taking the time to be with us here today and assisting this committee.

For our next panel, we have with us Michael Geist, Canada Research Chair in Internet and E-commerce Law and Full Professor with the Faculty of Law at the University of Ottawa, by video conference. In person, we have Ms. Emily Laidlaw, Canadian Research Chair in Cybersecurity Law and Associate Professor with the University of Calgary. Via video conference, we also have Janine Benedet, Professor of Law at the University of British Columbia.

Each witness will have the floor for five short minutes. After we hear from all three witnesses, we will move to questions from the senators.

Michael Geist, Canada Research Chair in Internet and E-commerce Law and Full Professor, Faculty of Law, University of Ottawa, as an individual: Good afternoon, everyone. In addition to being a professor and research chair, I am a member of the Centre for Law, Technology and Society. I am appearing in a personal capacity, representing only my own views.

I would like to thank the committee for the invitation to appear. I had the chance to appear before you on this bill’s predecessor, Bill S-210, and as committee members may know, I’ve been critical of this approach. It should go without saying, but just in case: this criticism is not due to the purported objectives of the bill, which are laudable. The problems are that this bill is overbroad in scope, intentionally capturing far more than pornography sites; it relies upon largely foreign-based age-verification services, which raises serious privacy concerns and puts our data sovereignty at risk; and the use of court-mandated website blocking contemplates restricting access to lawful content for those entitled to view it. Allow me to expand upon each.

First, the bill’s preamble states:

. . . a significant proportion of the pornographic material accessed online is made available on the Internet for commercial purposes — in particular through pornographic websites . . . .

That’s how I think many would understand this bill, yet Senator Miville-Dechêne told this committee:

Bill S-209 does not just target porn platforms like Pornhub. The bill leaves this decision to the government in clause 12, and the government will decide on the scope, so the government could decide to include social media like X in its choices.

Respectfully, that is not how the bill is drafted. The default is that social media and other services are subject to its requirements, and the government could later work to exclude them by passing the necessary regulations. Indeed, I think many Canadians would be stunned to learn that the bill, as drafted, covers social media sites such as X or Reddit, internet and wireless service providers simply providing connectivity, and AI services such as ChatGPT. As it stands, all of these services relied upon by many millions of Canadians would be required to verify the ages of their users or face penalties and potential blocking from Canada.

Second, the reliance on age-verification or age-estimation services at this time raises significant privacy concerns. In the case of age estimation, it simply doesn’t work very well for this use case of distinguishing between, say, a 17-year-old and an 18‑year-old. Moreover, we know age estimation is less accurate for those with darker skin tones, meaning that the burdens of increased data disclosures and privacy risks will inequitably fall on racialized communities and persons of colour. White Canadians are more likely to be able to rely on age estimation, while persons of colour will be forced to surrender their government IDs.

For millions of Canadians of all backgrounds, there will be privacy risks due to Bill S-209 simply to use the internet or social media services. There are already cases of data breaches involving government IDs, including, most recently, tens of thousands of IDs collected by a third-party service provider that was used by Discord.

Moreover, at the very time when there’s been a growing emphasis on data sovereignty and the desire for Canadians to ensure that their data is well protected, this bill will result in millions being required to disclose that data to private, for-profit foreign entities to whom, as we heard from the Privacy Commissioner, enforcement of Canadian privacy laws may be limited.

Third, the reliance on blocking sites as an enforcement mechanism remains troubling. The danger of over-blocking legitimate websites raises serious freedom-of-expression concerns, particularly since our past experience suggests that over-blocking is a likely outcome of blocking systems. In fact, the bill expressly states that blocking non-pornographic material is also permitted. Widespread website blocking will increase the costs of internet access, may violate the Charter of Rights and Freedoms and could well result in retaliation from other countries as violations of our trade obligations.

The net effect of this bill as drafted is this: A service — say X, to use the senator’s example — will be required to obtain age verification from millions of Canadians. Each will be subjected to age verification with at least younger and racialized Canadians required to submit government ID to a third-party, foreign-based service with limited protections from Canadian law. If X refuses, the entire service could be blocked in Canada, leading to a constitutional challenge and trade retaliation. This could be replicated with ChatGPT, Reddit and dozens of others.

Is it really the intent of this committee to support a bill that expressly contemplates nationwide blocking of non-pornography sites used by millions of Canadians? If so, it should go on the record so Canadians know what is at stake; if not, the bill needs to be scrapped or at least significantly amended.

I look forward to your questions.

The Chair: Thank you.

Emily Laidlaw, Canadian Research Chair in Cybersecurity Law and Associate Professor, University of Calgary, as an individual: Good evening, and thank you for inviting me today.

How to best protect the interests of children from technology-facilitated harms is one of the most pressing challenges we face. It is about protecting them from exposure to pornography, yes, but it’s also about so much more than that. It is about the best interests of the child — their right to privacy, freedom of expression, freedom of thought — in all online spaces.

Age assurance is one of the key ways to do any of this, and how to do this well is the question for the coming years. However — and this is a big however — it is easy to do age assurance in a sloppy way. The Tea app — an app for women to spill the tea on their exes — experienced a data breach a few months ago. They used age verification to restrict access to women aged 18+. Their age verification method was rudimentary and their cybersecurity practices poor. The app essentially collected women’s photo IDs and selfies to confirm who they were and retained their data. When the app was hacked, the bad actors gained access to photo IDs, private chats and thousands of images, all of which were publicly shared. This is a reminder that privacy and cybersecurity are of paramount importance when using age-assurance tools.

Turning to Bill S-209, I have two main points.

First, this should be part of online harms legislation, which I know is outside the scope for you right now, but age verification for pornography sites is just one small piece of the puzzle of child safety online. In my view, it is a bad idea to pass this kind of narrow obligation without any of the wider protection measures: risk management duties, transparency obligations and safety by design. Australia’s age verification law is overseen by their eSafety Commissioner, whose mandate involves an array of online harms and safety solutions, and age verification is just one small part of that.

Second, if you proceed with the bill, it requires amendment to be fit-for-purpose. I will focus on three key amendments.

First, under clause 5, age verification would be required by any organization that, for commercial purposes, makes available pornographic material on the internet. As the speakers in the earlier session mentioned, clause 6 aims to clarify the meaning of “commercial purpose” by excluding those that “incidentally and not deliberately” provide such a service. This is an improvement on the previous iteration of the bill.

That said, the definition still risks capturing all kinds of platforms, such as X, YouTube and Google search, because they are commercial organizations that deliberately make available a service through which pornography can sometimes be viewed. I would like to know the kinds of platforms that are sought to be covered by this bill. If the goal is to target sites whose dominant purpose is to make pornography available for commercial purposes, such as Pornhub, then this provision should say that explicitly. I think that is the only realistic choice for this legislation. I acknowledge that this leaves a great swath of pornographic content outside of scope, but otherwise there is a real risk that the legislation would require age-gating the entire internet, and that is untenable and, at minimum, requires a much bigger conversation and consultation.

Second, clause 12(2) sets out the requirements for age verification or estimation. Missing from the list is a requirement that reasonable security standards are used. Cybersecurity should be of paramount importance, as we saw with the Tea app.

In addition, clause 12(2) requires that the collection and use of personal information are solely for age verification, but there are more privacy principles relevant here — for example, that data is retained only as long as necessary. The Tea app should not have retained photo IDs after ages were verified. In addition, this technology continues to evolve, and the least privacy-invasive approaches possible should be used by these companies, and collecting IDs is often one of the most rudimentary.

Third and finally, clause 10 enables website blocking by court order. Website blocking is a blunt solution that should rarely be used because of its impact on freedom of expression. It can be an important tool, as contemplated in this bill, where a website refuses to comply with a court order, particularly a foreign-based website out of reach of any other enforcement powers. However, there must be strict guardrails on the use of website blocking, based on principles of proportionality in human rights law. Instead, this bill does the opposite and condones broad blocking of even lawful content. I recommend that the criteria for website blocking, if used, be specifically set out in the legislation.

Thank you.

The Chair: Thank you.

Janine Benedet, Professor of Law, University of British Columbia, as an individual: Thank you for the opportunity to speak to this bill, whose passage I strongly support. I thank in particular Senator Miville-Dechêne for her leadership and persistence on this issue.

I approach this bill from a somewhat different perspective than some of the other speakers you’ve heard today, from the context of having worked for 30 years on ways to use the law as a tool to address the harms of pornography as part of the continuum of male violence against women and girls. This bill deals with an important but relatively narrow aspect of this problem — age verification or estimation — to limit the access of minors to online pornography, and, in my view, it should attract all-party support. The technology exists, the harms are undeniable, and other comparable jurisdictions have moved ahead with such laws.

I continue to hear objections to these kinds of laws that sound exactly like the objections I have heard every other time the law has been used, or proposed to be used, to interfere with men’s unlimited access to pornography since the late 1980s when I first became involved in this issue. It’s just now been repackaged for the internet age. To me, those objections were unconvincing then and they are unconvincing now.

When I started doing this work, those who supported the pornography industry and its consumers wrapped themselves in the mantle of freedom of expression, defined expansively. Today, to me, it is an expansive definition of privacy, but the arguments are the same.

The claim is that pornography is personal, a private fantasy and not harmful, that it is the responsibility of parents, not a state responsibility, to protect children, and that it is impossible to define pornography so that we will end up censoring or restricting or punishing mainstream books, television and movies. It used to be “American Psycho,” and now it’s “Game of Thrones.”

What I would say to this committee is that pornography is a harmful cultural product. While the means of production and distribution have been made seamless with the advent of the smartphone, it has been true for many decades that pornography routinely presents women in servile positions, sexually insatiable and enjoying pain, humiliation and degradation. Billions of dollars are earned through this industry, mostly by people other than those whose bodies are used to make it.

When I started this work, my focus was entirely on the harms to adult women from pornography, both from its production and its consumption. Increasingly, however, we see the research demonstrating harms to its male consumers as well. The point is not, as we used to sort of look at, that pornography takes normal men and makes them rapists but that the conditioning of the sexual response of boys and young men to the sexual acts presented in pornography shapes what it means to be a “normal” man. It is what is meant by the recognition that we live in a rape culture. Pornography’s construction of sexuality, for boys but increasingly for girls as well, shapes young people’s understanding of what sex is supposed to look like and what their bodies are supposed to look like.

A recent research article by Grant, Sheehy and Gotell in the Dalhousie Law Journal documents some of the ways that pornography shapes the behaviour of men and the legal arguments they make, in cases of sexual violence and homicide against women. A notorious recent example is the killing of Cindy Gladue, an Indigenous woman. Her killer, Bradley Barton, searched for online pornography presenting ripped and torn vaginas in the days before her death from this sort of wound.

No one can continue to pretend that there is no harmful impact on the sexual development of young people from this material, and the need to act is urgent. The protection of children from harm is not just a private responsibility; it is also a state responsibility.

The arguments about defining pornography and the impossibility of doing so are disingenuous. The producers have no difficulty knowing what to push on their platforms, and men who go looking for it do not have trouble finding it. They aren’t constantly being redirected to the Bible or “Game of Thrones.” If its producers and consumers know exactly what it is, it is capable of legal definition to a reasonable standard of certainty.

In conclusion, I would say that if porn producers and consumers have to bear some additional modest burdens of the kind contemplated by Bill S-209 for the obvious and necessary benefit of Canadian youth, this is entirely justifiable.

Thank you.

The Chair: Thank you.

Now moving to questions.

Senator Batters: Thank you to all of our witnesses for being with us today on this bill.

I wish to direct my time and questions on this panel to Professor Benedet. I first heard about your important work in this area when I was taking my third-year feminist legal theory class at the University of Saskatchewan law school from Professor Wanda Wiegers. I remember being impacted by your work then, and I thank you for 30 years of this important work.

Something that we have been hearing from time to time during this committee study is doubt about whether there is, in fact, appropriate evidence about the harms that pornography can have on children, which goes to the very goal of this legislation. Can you provide us with your additional thoughts, based upon your 30 years of experience in this area, and point us to any additional evidence we should consider?

Ms. Benedet: Absolutely. I am certainly happy to provide that to the committee in writing. That may be the most efficient way to do that.

There are excellent resources. Gail Dines, who has been a researcher in this area for a very long time, has a very good website collecting peer-reviewed research on this topic. The Dalhousie Law Journal article I just mentioned summarizes some very good research on the topic as well.

Two things stand out to me from recent research. The first is very interesting research about the ways in which the consumption of pornography decreases young people’s capacity for empathy, in part by examining the impact of viewing women mistreated in the ways that pornography normalizes and whether or not its consumers are bothered by that. Some of that research is very disturbing. There is also interesting research — this is really more with young adult men — looking at the link between the recent rise in erectile dysfunction in young men and the overconsumption of pornography. There is a lot of literature to draw on.

Obviously, we do not do research studies of the kind we used to do in the 1970s of showing people a bunch of porn and then measuring things, and we certainly do not do that for children, but there is still good research from a variety of disciplines about these harms. As I said, I’m happy to provide more to the committee.

Senator Batters: Yes. If you could, it would be very helpful to provide that to our committee, and our committee clerk can ensure that senators on this committee have that important material.

I have a follow-up question to Professor Geist. It is nice to see you again. You were detailing a number of the reasons that you are not in favour of this bill. I wanted to confirm that you would also not be in favour of having a device-based approach as has been advocated by Pornhub.

Mr. Geist: Thank you, senator, for the question.

There are a number of different ways we can be thinking about this. We have seen proposals about a device-based approach. We have seen, frankly, many of the different platforms come up with different approaches. Your committee heard from the U.K. lord talking about Twitter using emails and scanning the internet as somehow an appropriate system. There are real concerns and challenges with many of these different approaches.

My concern at the moment with this bill is that, whatever approach might get used, it is being applied well beyond the harms that we just heard about from Professor Benedet and well beyond pornographic websites. The concern here is scope; it is website blocking; and it is the privacy-related issues from many of the kinds of technological solutions currently being used to try to address this issue.

Senator Batters: I want to go back to my question, though, specifically about device-based approaches and what Pornhub is advocating. Perhaps you heard the individual from Google who testified on the previous panel when she talked about the type of ID requirements just to access the internet if a device-based approach was used. Would that be your concern as well?

Mr. Geist: Right. Two things.

First, to be clear, your bill right now potentially requires age verification to access the internet. Internet providers and wireless providers are covered by this legislation as currently drafted.

In terms of the device-based approach, there are real concerns about a device-based system. Shared devices, among other scenarios, raise some real challenges as well. I do not think it is the silver bullet or the real solution here. At the same time, as I say, you have age estimation, which raises significant concerns for racialized Canadians, given what we know about how those systems work. Age verification with government ID — we heard it from Professor Laidlaw as well — clearly raises significant issues from a privacy perspective in terms of breaches. There is no perfect solution here.

If, in the trade-offs, we say there is a need to do something, what then becomes essential is ensuring that we have narrowed the scope of this so that we are dealing with a specific limited harm, not applying this as broadly as this legislation currently does apply.

Senator Batters: Of course, the Governor-in-Council will decide those specifics. Thank you very much.

[Translation]

Senator Miville-Dechêne: My question is for Mr. Geist.

I am a bit surprised by your testimony, because it completely contradicts the testimony of the Privacy Commissioner, Philippe Dufresne. You claim that I have broadened the scope of the bill, which is completely inaccurate, since the purpose of the bill has always been sexually explicit and pornographic material. Now, what I’m doing in paragraph 12(1)(a) is giving the Governor-in-Council the power to decide what’s included or excluded from the act. How does that broaden the scope of the bill? It is completely contradictory.

Also, if I understand correctly, you completely disagree with the Privacy Commissioner, who said that he now supports the bill, because I have limited its scope and improved the criteria around age verification to ensure that privacy is protected. So you think that the Privacy Commissioner, whose primary responsibility in life is to ensure privacy is respected, is wrong and that you are right?

My other question is about X. As you know, in Great Britain pornographic sites and social networks are affected by the legislation. X certainly didn’t say it was going to leave or wasn’t going to comply with the legislation. Not only is X verifying email addresses, but it will soon be covering age verification and will therefore be complying with the law. As a result, you’re expecting the worst, as always, but the worst doesn’t always happen.

[English]

Mr. Geist: Thank you, senator. There are many questions there. Let me try to make sure that I get through all of them.

First, with respect to the scope of the legislation, it is clear in the way the legislation is drafted. Clause 6, combined with clause 5, makes it clear who is covered: under clause 5, any organization that makes this content available for commercial purposes is covered by the legislation. You don’t need a regulation for that; everybody is covered. There is an attempt to clarify the potential for some limitation in clause 6, but, based even on some of the interpretations I’ve heard from some of the committee members, the argument would be that even a social media site like Twitter would be covered. I heard you a couple of days ago at a conference talk about how these social media companies play a significant role. Clause 12, with respect to the government establishing regulations, does not say that the government sets who is included; it says who may be excluded. The default in your legislation is that everybody is in. Then, in clause 12(1), the government may make regulations specifying the circumstances in which someone is excluded. The default for all these services is that they are in.

Any prudent company will look at that legislation and conclude that it has to comply, exit the market or find some mechanism to deal with some of these issues. That’s what you’ve structured. It would be an improvement if you said that the only time anyone is subject to this is once the Governor-in-Council has established regulations that would include them. In other words, you’re not in by default; you’re only in when the regulations say so. That would go a long way to addressing some of the scope-related concerns.

With respect to the Privacy Commissioner’s comments, I heard the commissioner express some support for age verification, but I also heard the commissioner acknowledge, quite explicitly, that his ability to enforce the law with any sort of penalties over these violations is extremely limited. In fact, there are, of course, at the moment, no penalties available to the commissioner in enforcing the law. This becomes an increasingly problematic concern with respect to the very companies that would be providing these services. They are invariably foreign-based entities over which the Privacy Commissioner has even less ability to enforce the law. Where there are breaches — and, as we have heard, there will be breaches — the ability of the commissioner to do much of anything is exceptionally limited.

With respect to your third question on Twitter, I read the comments of Lord Bethell. I must admit that I was rather stunned to hear him speak somewhat approvingly of Twitter’s approach to trying to stay in the country and an approach in which — this is what I believe he said — we ask users to submit their email addresses, and then we scan the internet for everything we can find about them, and then we use that, together with unseen AI algorithms, to determine whether this person is of age or not. That kind of surveillance-based approach is so anti-privacy that it is stunning to see, frankly, a regulator or Lord Bethell suggest that, somehow, this is an appropriate approach for making this kind of determination. It is something that we should roundly reject.

[Translation]

Senator Miville-Dechêne: I’d like to ask Ms. Benedet what she thinks of that testimony, and particularly the use of privacy as the main criterion when considering age verification.

Ms. Benedet, do you agree with Mr. Geist’s comments on this?

[English]

Ms. Benedet: I don’t want to diminish the significance of anyone’s testimony or their expertise, and our expertise is quite different.

I will say that I am concerned about an overemphasis on privacy in the digital environment, where people routinely engage with these companies and these algorithms in ways in which they quite willingly surrender all kinds of information about themselves. As someone whose work focuses on violence against women, I see red flags when privacy is invoked, because privacy is used as a justification for abuse. Abuse happens in private. The idea that the state has no business in the private sphere is something that historically has really been to the detriment of women and children. I want us to keep that in mind when we are talking about the privacy of internet users or adult porn consumers. We must give that appropriate weight and not elevate it to something that ends up making it impossible for us ever to do anything about this very serious threat to children.

Senator Prosper: Thank you to all the witnesses here for providing great testimony.

I have a question for Professor Geist and Professor Laidlaw.

We’ve been hearing, through testimony related to this particular bill, about the intent, obviously, but there has been quite a bit of discussion on the level of detail — or lack of detail — within it. One of the justifications we heard was that, with evolving technology, you don’t want to be overly prescriptive. We heard earlier that you do not want to go back and amend and amend. You want to have a legislative framework that can adjust to a changing environment and be responsive to it.

What I am hearing from you, Ms. Laidlaw, with your suggested amendments, is that you’re thinking that leaving things to regulations is just not sufficient for the purposes of the points you raised.

Professor Geist, if I were to consider your evidence, it is a little harsher. I think you used the word “scrap” in reference to the legislation. But I note you mentioned certain things with respect to data sovereignty, the bill being overly broad and some issues with respect to racialized persons potentially being harmed because the technology is not really there.

When I think about a legislative framework that is not as prescriptive, is it your position that no amount of regulatory detail can remedy the vagueness of this piece of legislation and that it is best scrapped?

Ms. Laidlaw: Thank you for the question.

One of the challenging issues when writing any law when you’re dealing with technology is the challenge of it evolving. I tend to have quite a bit of faith in leaving things to regulations to handle that evolution as long as you have the right structure in place.

My concern is actually purely a drafting one. I do think that clause 6 is a huge improvement on the last iteration. It is still unclear to me exactly what the objective is and precisely what types of platforms are sought to be regulated. If you want to leave some details to later regulations, that’s fine, but clause 6 still needs to be amended to make that clear. That would also provide some business certainty, so that we know, generally speaking, the types of platforms you are looking to bring within scope.

Given what is sought here, I would start with just platforms for which the dominant purpose is making pornography available. I think the goal right now is Pornhub. If you’re looking to bring in all kinds of other services where there is a lot of pornography that is viewable by and accessible to children, we have to talk differently about age assurance. That would be the Googles of the world and the Xs of the world. We think of them as dual-use technologies in some ways because they involve so many other kinds of content. That’s where you start to get into that “age-gating the internet” conversation, which is so much bigger.

Mr. Geist: Thank you for the question.

I would align myself with much of what Professor Laidlaw just said, but I will add, first, that I appeared before a number of committees dealing with Bill C-11 and Bill C-18. One of the primary concerns about that legislation — frankly, if we’d had hearings on Bill C-63, the online harms bill, we would have dealt with it there — is that too often there has been a tendency to punt on some of the really hard questions and leave them for either a future regulator, like the CRTC, or for future regulations. But we have seen, as some of these bills have unfolded — in the case of both the Online Streaming Act and the Online News Act — that this in itself creates real harm. There is confusion in the marketplace. We end up in the courts. We actually don’t get where we want to go because there hasn’t been sufficient clarity from the legislator and too much is left for later on. I think it is incumbent within this legislation to be very specific and precise.

My concern here is not whether or not there’s regulation on this issue. It is, in a sense, what exactly are you trying to achieve with this bill? If it is, as referred to in the preamble, about dealing with Pornhub, I don’t hear people saying it’s hard to define or that privacy trumps, respectfully. I do hear people saying, yes, that’s a real issue, and we need to find a way to deal with it. The problem that arises is that we also have, respectfully, the sponsor of the bill telling the committee, “No, no, no. This can apply to social media as well.” We see many references to X as part of that discussion. Is the intent that this should apply to X? Is it to apply to some of these sites? I must admit, I hear members of this committee and the sponsor of the bill suggesting that it is.

If that is, in fact, the case, then this is enormously problematic. It is overly broad. It is not a matter of requiring men who want to access porn to age verify. It is actually requiring everyone to age verify, regardless of their purposes, to engage in just regular conversation, to use a chat bot, to use generative AI, to use a search engine or to use any sort of social media company. That strikes me as wholly unreasonable, given the more specific, narrow goal of dealing with porn sites, where I think, frankly, just about everybody is aligned that it is appropriate to find a mechanism to ensure that only those who meet a particular age can access that content.

Senator Prosper: Thank you.

Senator Simons: My question will be for Professors Laidlaw and Geist. I want to pick up where Professor Geist left off.

I want to look specifically at clause 10(5)(a) of the bill, which says:

If the Federal Court determines that it is necessary —

“Necessary” is not defined.

— to ensure that the pornographic material is not made available to young persons on the Internet in Canada, an order made under subsection (4) may have the effect of preventing persons in Canada from being able to access

(a) material other than pornographic material made available by the organization that has been given notice under subsection 9(1) . . .

Now, it would seem to me that if you put in a clause like 10(5)(a), you’re perforce suggesting that site doesn’t just have pornography. Would that be the way you would read that as well?

Ms. Laidlaw: Yes, that’s correct, and that is partly why I am a bit confused by some of the drafting in the legislation, because of the definitions in clauses 5 and 6. Then we get to clause 10, and it seems to contemplate the idea that you’re going to block all of X, for example, if they are failing to comply with age verification because pornography is made accessible. There is quite a bit of pornography on X, so it is a legitimate conversation to have, generally, but that is the way that I read it here.

For the most part, if I can say this, when website blocking has been used and where courts have approved it in other jurisdictions, it has been, say, in the U.K., for sites that predominantly just existed for sharing copyright infringing content, and they were based out of jurisdiction, so there were no other avenues to hold them accountable.

Senator Simons: Professor Geist, would you agree?

Mr. Geist: Yes, Senator Simons, I certainly agree.

Frankly, I don’t even see how this part of the bill is constitutional, to explicitly say that we’re going to block legitimate content. Let’s recognize that there are actually two elements to the blocking. It is both the content itself, which we readily acknowledge is lawful content and not even pornographic content, that is being blocked, and it is the people entitled to view it, which is literally all Canadians, not just underage Canadians, who are blocked from being able to access it. That strikes me as wholly disproportionate and unlikely to survive constitutional scrutiny.

Let’s recognize as well that website blocking, in our experience, over-blocks all the time. When Telus, years ago, attempted to block a single union site, it blocked 600 other sites, including a breast cancer fundraising site, at the same time. It is imperfect.

The notion that, somehow, we ought to trust that these companies are ultimately going to find a way to cooperate, as we heard one senator suggest earlier — I heard the same thing with respect to Meta and the Online News bill. Two and a half years later, we don’t have any links to news on that platform. When companies make it clear that there are red lines about what they are willing to comply with — and, frankly, requiring millions and millions of Canadians to engage in age verification strikes me as a kind of red line — many of those same companies are either going to challenge you in court or they’re going to simply walk away.

Senator Simons: Thank you very much.

Senator K. Wells: Thank you all for being here.

There is not enough time for an important conversation with all of you. My question will be to Dr. Geist and Dr. Laidlaw. You may not be able to answer this in the time we have, and I would welcome your written feedback on this. If we were looking at this proposed legislation, should it pass in its current form, through the lens of the Charter, I am interested in your sense of whether it would comply. Where would the Charter breaches or concerns be? Can we have a mini Charter analysis? If I am putting you on the spot here, I would welcome — as I am sure the committee would — a written submission with more detail.

Mr. Geist: Thank you for that, senator.

I just highlighted at least one example. The notion of including website blocking with express permission to the court — as if the court needs permission in this — to block lawful content as part of a blocking order strikes me as something that would be very tough to sustain, because, let’s recognize, when you combine that with the scope of this legislation, we are not just blocking what most would conceive of as pornographic websites.

We are blocking sites — let’s stick with Twitter or X as an example. Of course, we all recognize there is some pornography on that site. One of your earlier witnesses spoke about the percentage of youth who may access it as being quite high. But that’s not really the relevant metric. The relevant metric, if you are Twitter or X, is what percentage of the site is porn. It’s a tiny percentage of the overall traffic on a site where literally billions of tweets or comments take place at all times. It is a very small proportion. Yet it, too, would in theory be covered here; so, too, would search; so, too, would image-based generative AI. The list is enormous. Reddit certainly would be covered as part of this conversation as well.

Given the scope, I must admit that the freedom of expression provisions within the Charter begin to kick in here as well. Given the broad scope that I think has been quite deliberately used, because we’ve heard senators say, “Yes, we think it can apply here as well,” then you end up with something that is open to challenge.

To quickly close, again, if you flip the approach and you say that the only way someone is covered by this is by way of regulation, you protect yourself at least with respect to the scope issue from a constitutional perspective because then you are ensuring the Governor-in-Council is going through that more fulsome analysis as to who is really intended to be covered here, and they are including them by way of regulation.

Ms. Laidlaw: I will build off of Professor Geist’s comments to say that I don’t think there’s any way, as it stands, that the website-blocking provisions would withstand constitutional scrutiny, so maybe I can tell you how to fix it so that it would.

First, narrowing the scope to the pornography providers — those whose dominant purpose is to provide pornography — is a way to narrow the focus of age verification. Website blocking, when it has met human rights principles of proportionality, has done a few things. One, it is mainly focused on websites against which a state basically can’t enforce its law. You are looking at foreign-based websites. You are looking at ones that are perpetually not complying with Canadian law and that just exist to share pornography. Some of those do exist. They are sharing intimate images without consent. They are revenge sites, as well as pornography sites.

When the website blocking is approved by a court and only by a court, which is still provided here, it is time limited, and it is narrow in scope, if you can keep it to specific content. In this case, you would be talking about sites that are for pornography, and then the court revisits that in a short period of time. It is more about quick compliance and forcing them to the table instead of a perpetual ban. That is a way to do this as a last resort mechanism in a very narrow capacity that, I think, could withstand constitutional scrutiny.

Senator Saint-Germain: My question is for you, Professor Geist.

Before I proceed, I want to put something on the record. We heard from Lord Bethell from the U.K. House of Lords, and he did not seem to be overly concerned about the risk of the impact of a data breach relative to the benefit of protection of children. On data breaches, he said in front of this committee:

So far, so good. Listen, there will be a data breach. Of course there will. Whenever you have a collection of data, some of it will leak out somewhere. That hasn’t happened anywhere so far.

Well, he was right. On the day he appeared before this committee, a serious data breach was reported in the U.K. and in other countries using a third-party age-verification system.

In an article published this morning, the U.K.-based law firm, Kennedys, had this to say:

While the new legislation was implemented for the explicit purpose of protecting children from adult content, it also brings with it an unintended consequence: platforms complying with the rules could become prime targets for cybercriminals. Verified identity data is a high-value prize — linking real identities to online activity — and if compromised it could have implications far beyond the law’s well-intentioned aims.

My two questions for you, Professor Geist, are the following: Do you believe this recent breach in the U.K. is evidence that, if we adopt Bill S-209 soon, similar breaches will affect law-abiding adults in Canada, and even teenagers, and that would be putting their data at risk? In your opinion, is this bill salvageable, including in reference to the recommendations for amendments that Ms. Laidlaw gave to us minutes ago in her testimony?

Mr. Geist: Thank you for the question, senator.

I think that security breaches, data breaches, cannot be viewed as unintended consequences. Respectfully, they are going to be intended because they are foreseeable at this stage. Essentially, if you pass this legislation as is, you must go in knowing that there will be these breaches. It is not that you intend for them to happen, but they are entirely foreseeable. They are virtually guaranteed. As I mentioned, they are more likely to affect racialized Canadians and persons of colour than others, given that attempts to find less invasive technology, like age estimation, do not work nearly as well on darker skin tones. The reality is that this is not a risk; it is a guarantee. Over the last hour, we have now heard of multiple breaches that have taken place with exactly this kind of technology.

Those risks are real, and they are compounded by the fact that the Privacy Commissioner directly told you about his limited ability to mete out punishment where these breaches occur, much less to enforce Canadian law against service providers, virtually all of whom, as far as I know, are not in Canada. It is positively stunning to think this legislation would require millions of Canadians to send their government-issued IDs and other identification outside of the country, largely beyond the effective reach of the Privacy Commissioner from an enforcement perspective. That is the trade-off that takes place here, which is why it is so essential that, if we are going to move on this, the gains, the benefits, have to be very clear, and we have to limit the risks as much as possible.

In my opening statement, I said, frankly, I think it should be scrapped. I think Professor Laidlaw provided the right course of action. We know that an online harms bill is going to come back. This is essentially dealing with an online harm. The right place to deal with this is in the legislation that follows from Bill C-63, which would bring it within a broader framing with more effective enforcement and administration.

Senator Dhillon: Thank you, everyone. I appreciate you being here today and your responses.

I will direct this question to Professors Geist and Laidlaw. Before I ask the question, I wish to remind everyone and put on the record that we have evidence about the harm that children are suffering and that there is enough validated evidence to treat early porn exposure as a public health issue and not an educational issue and not just a moral panic. There is a great desire and urgency to deal with this issue. How we get to it, the vehicle and the instruments we use, are what is at question. There is no question that we all agree there is harm that our children are suffering every day that we delay some type of effort in protecting them. There was one witness who also said that sometimes you have to take the kitchen-sink approach to come at this for the betterment of all and the betterment and protection of our children. We are also talking about access, privacy rights and balancing that against the harm our children are suffering.

I am no law professor, so I will have you correct any of what I am saying here today. Certainly, someone’s right to expression is guaranteed, but there are no guarantees to access that expression. There are limited scopes and places where there are guarantees around access. In this instance, in this scope, I would argue that is not a guarantee.

When we talk about the protection of one’s privacy, I return to what the Privacy Commissioner of Canada said, that laws are based on principles. That is what we are doing here, working with principles. We are going to rely on the Governor-in-Council and, as you said, Professor Laidlaw, the trust we place and the confidence we have in the regulations that come following the principles we put in place here.

With that, the question I am asking here is this: Do you agree that in this instance, balancing and keeping in mind the harm, understanding that access is not a right and understanding the ability to address the issues around privacy breaches — and the Privacy Commissioner said when he was here that there is no expectation and that there would be follow-up — I take your point, Professor Geist, that the Privacy Commissioner did say there are more teeth we can provide to him when it comes to following through with some of the penalties. Putting that aside, when we approach it from that perspective, is this not a good step forward? Is this not the right move? Is this not the right thing to do today?

Ms. Laidlaw: Thank you. That is an excellent question.

The short answer I will start with is that I think access to pornography sites should be subject to age verification. I do land there. There are many circumstances where we need to show IDs. We had to show ID in the physical world to go into a video shop and access pornography. We have all kinds of environments where we have to show IDs now, like to access gambling sites or our banking. All of our data is vulnerable.

Professor Geist is right that this does create a privacy and cybersecurity risk. What we have to think through is how to ensure that legislation of this kind is narrowly scoped and done in a way that forces the privacy-preserving and cybersecurity measures that are critical, so that privacy is protected as much as possible and the least amount of data is collected and retained at any point in time.

I echo that, in the end, I think this is best in online harms legislation, but specific to your question, that is my answer.

Mr. Geist: Thank you, senator.

I would say I’m not here to argue that we have to have a right to pornography. I’m fine with seeking to regulate that access. The problem with this bill is that there are rights to access the internet, rights to engage in legitimate expression outside of the pornography world and there are rights to privacy. This bill undermines all three. So no, it is not an appropriate trade-off, particularly when there are alternatives available to deal with these harms, particularly through an online harms bill which will be more effective, more likely to survive constitutional scrutiny and not jeopardize some of those other rights at the same time.

Senator Dhillon: I would like to hear Professor Benedet’s commentary on this as well.

Ms. Benedet: I would add, on the question of sites dedicated to producing pornographic content versus those for which it is only part of the content they provide, that I do not trust the pornography industry at all not to work very hard to get around any limits placed or any opening that is given. If they have to make you watch two cat videos before you can access porn so they can put themselves outside the scope of this bill, they will do it. I want to sound a note of caution about who we place our trust in when it comes to commercial enterprises. When we try to draw lines around people's legitimate access to the marketplace of ideas in the digital space, or the digital commons, we must not be naive, frankly, about the entities we are dealing with and the incredible incentive they have to get young people exposed to their material and habituated to it. No amount of digital citizenship courses by Google is going to be a match for that kind of incentive.

The Chair: We have a hard deadline of 6:35. I hope all senators can get their questions in. If some witnesses have to add to their testimony in writing later, that is fine.

Senator Pate: Some of us have other commitments and expected to stop at 6:15. In the interests of time, is it possible that we could all ask our questions and receive written responses?

The Chair: Sure.

Senator Pate: That would be great. Thank you.

The Chair: Everyone agrees with that?

[Translation]

Senator Oudar: Thank you to all three of you for being with us today. My first question will be for Ms. Benedet and my other question will be for both Ms. Laidlaw and Mr. Geist.

My first question is this: Ms. Benedet, your work shows that child pornography isn’t just a moral problem or obscene content, but that it’s a structural vector of gender inequality and a framework for sexual socialization. You have demonstrated this. In this context, how could Bill S-209 go beyond simply filtering content and instead contribute to lasting legal and cultural change? In other words, how could age verification legislation become an instrument of real equality rather than a mechanism for technical compliance?

If you could answer the question in writing, I would appreciate it. This could also be done through instruments related to Bill S-209 or in the bill itself. Could you expand the response to include instruments other than the bill?

My question for Mr. Geist and Ms. Laidlaw is this: What do you think of the European Commission? You didn’t mention it at all in your presentations, which I find a little surprising because Europe is quite advanced, and they’ve even been testing an application since this summer. There has been a lot of oversight in place for some time. I would have liked to hear your opinion, so I look forward to receiving your written response.

[English]

Senator Pate: I wish to build on my colleague’s question, because Senator Dhillon went into a number of issues that I was going to raise. What would each of you propose to address the very real issue of trying to prevent online harms? If it’s not this legislation, if there’s particular legislation you have in mind, that would be helpful. Thank you.

Senator Clement: Thank you to Professor Geist for coming back repeatedly to the issue around how this is going to affect racialized communities. For the Black Canadians I am connected to, the amount of lived experience around this is just mounting terribly, and many Canadians are afraid, so thank you for coming back to that.

My question is specific. We had a witness here, a criminal defence attorney, Mr. Hurley, provide suggestions around amendments for clause 12. He talked about adding “Canadian” into that, going back to your points, Mr. Geist, about being worried about foreign or outside-of-Canada age verification. Mr. Hurley suggested inserting a requirement there that it should be Canadian. I think it was in 12.2(b). He also suggested that the collection of information needs to be destroyed immediately, but there is no punishment if you don’t do it. I wondered what you would think about those kinds of specific amendments. I take your point, Professor Laidlaw, about clause 6 and what you’re saying there, but I would like you to look at clause 12 as well. Thank you.

Senator Batters: Professor Benedet, given your 30 years of significant work in this area, what do you view as likely more harmful to Canada’s children: the relatively rare situation of a data breach involving the data of those children, or the much more frequent situation of Canadian children being exposed to violent pornography because proper safeguards do not exist in this country?

The Chair: Thank you to all the witnesses for your testimony and for assisting the committee. We welcome your input and look forward to your written answers to the questions posed today.

With that, colleagues, I will declare this meeting adjourned.

(The committee adjourned.)
