Skip to content
LCJC - Standing Committee

Legal and Constitutional Affairs


THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS

EVIDENCE


OTTAWA, Wednesday, June 2, 2021

The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4:00 p.m. [ET] to study Bill S-203, An Act to restrict young persons’ online access to sexually explicit material.

Senator Mobina S. B. Jaffer (Chair) in the chair.

[Translation]

The Chair: I am Mobina Jaffer, Senator from British Columbia, and it is my pleasure to chair this committee. Today, we are holding a meeting of the Standing Senate Committee on Legal and Constitutional Affairs.

Before we begin, I would like to offer several helpful suggestions that we believe will help you have an effective and productive meeting. If you encounter any technical difficulties, particularly in relation to interpretation, please let the chair or clerk know, and we will work to resolve the issue.

[English]

Senators, I will do my best to get to everyone who wants to ask questions to the witnesses. In order to do so, I ask senators to try to keep their questions and preambles to questions brief. In our first panel, we only have one witness, so you can have up to five minutes.

Members of the committee, I ask that you only signal to the clerk through the Zoom chat if you do not have a question. If you are not a member of the committee, please signal to the clerk if you do have a question.

As you know, senators, today we are resuming our study of Bill S-203, An Act to restrict young persons’ online access to sexually explicit material.

Commissioner, before we start, I would like to introduce the members of the committee: our deputy chairs, Senator Campbell and Senator Batters; Senator Boisvenu; Senator Boniface; Senator Carignan; Senator Cotter; Senator Dalphond; Senator Dupuis; Senator Pate; Senator Simons; Senator Tannas; and Senator Miville-Dechêne, the sponsor of the bill.

For our first panel, we have with us the Privacy Commissioner of Canada, Mr. Daniel Therrien. He is joined today by Martyn Turcotte, Director of the Technology Analysis Directorate.

Before you start, Mr. Therrien, I want to tell you that the committee has very much appreciated your working with us. We look forward to working with you on bills in the future. Thank you very much for making yourself available today.

[Translation]

Daniel Therrien, Privacy Commissioner of Canada, Office of the Privacy Commissioner of Canada: Madam Chair, honourable senators, thank you for the invitation to speak with you today. With me is Martyn Turcotte, Director of the Technology Analysis Directorate at the office.

Protecting children in a digital environment is of the utmost importance. As you know, the UN Committee on the Rights of the Child has recently emphasized that the rights of every child must be respected, protected and fulfilled in the digital environment, and this includes the right to privacy. We support efforts to incorporate special consideration for children’s rights in the digital environment, including through implementing privacy safeguards that specifically address their interests.

Bill S-203 raises a number of privacy-related issues primarily to do with the age-verification scheme and the protection of the personal information that is required to be collected to facilitate the process. It is on these issues that I will focus my remarks today.

Based on the current text of the bill, the Governor-in-Council may make regulations prescribing the age-verification methods referred to in subsection 7(1). Because these methods have yet to be determined, I will provide you with a number of considerations, regardless of the methods chosen.

[English]

This committee has suggested encryption and the use of third parties specializing in age-verification services as a way to reduce risks of privacy violations. You have also discussed some challenges around the use of biometrics, facial recognition technology in particular.

First, Canada’s private-sector privacy law, PIPEDA, applies to private-sector companies implementing age-verification technologies in a commercial context. Current digital age-verification systems use diverse technologies, analytical methods and safeguards. No two systems are identical; the design, implementation and potential for vulnerabilities may differ from one system to another. Moreover, the risks are constantly evolving.

In our opinion, these factors lead us to the view that the key is to ensure that there are several lines of defence.

Regardless of the mechanism chosen, the user will ultimately be required to provide some amount of personal information. However, in any digital age-verification system, the principle of data minimization should be applied to reduce data-matching and surveillance of individuals. There should also be strict controls on access to user data. It is also possible to use a system of tokens to substitute sensitive information with a random string of characters that have no value in identifying individuals.

Encryption, which was discussed at your committee, is a process used to ensure the privacy and security of data. When data is encrypted, it can be sent over the internet and generally be stored without fear of its confidentiality being compromised. However, inappropriate and outdated encryption technology, or flaws in its implementation, can render it less effective or even useless. In this context, encryption is a good method, but it is not a foolproof method, to eliminate the risk of user re-identification.

On the other hand, the use of biometrics, or facial recognition, to verify or estimate a user’s age raises unique privacy concerns. Biometric technologies are generally very intrusive. In addition, their effectiveness in accurately verifying age remains to be demonstrated. We would be glad to elaborate on the lack of total efficacy in age verification with facial recognition or biometrics, if you wish.

Other methods of age verification that do not require the digital storage of personal information could also be considered. For example, an individual could have their government-issued ID card visually verified at a point-of-service location to have their age confirmed and then receive a verified key or code that can be used online in a way that cannot be traced back to them.

[Translation]

The age-verification process will also apply to adults. In the absence of proper privacy measures, it could increase the risk of revealing adults’ private browsing habits. The adoption of clear practices on how to verify the age of users will help reduce the risk of breaches, unauthorized use or reputational harm.

Due to the nature of the risks associated with the collection and use of data needed for age verification, there must be clear requirements for privacy, including technical safeguards. It will be important that the chosen age-verification method be designed and implemented with sufficient privacy protections so that Canadians feel secure in providing their personal information.

Thank you for your attention. Mr. Turcotte and I welcome any questions you may have.

[English]

The Chair: Thank you very much, commissioner.

Senators, I erred. I did not introduce Senator Boyer.

We will now go on to questions. We will start with the sponsor of the bill, Senator Miville-Dechêne.

[Translation]

Senator Miville-Dechêne: Thank you for your presentation, Mr. Therrien.

Of course, you’re right. The whole issue of deciding how to do age verification was left to the regulations, so it will be done after the bill is passed, if it’s passed. However, I did want to get you to consider two points.

First of all, there can be a licensing system given by the government to private companies, which would commit to a number of privacy rules. I’m wondering what your thoughts are on the licensing system that Great Britain is considering, in particular.

More specifically, I would also like to hear you talk about biometric recognition, because I understand that the issue of facial recognition — many will say — is a breach of privacy. However, when we talk about the simplest method of verifying age — because you can’t just take ID cards, whether these cards belong to the person — many now use a 3D self-portrait with liveness identification, ensuring that it is the photo of the person they are talking to, and to ensure that there is a match between the card and the person requesting permission to go to the pornography site.

Do you think this kind of method is possible, imaginable or not at all?

Mr. Therrien: I would say that yes, it’s possible, and that facial recognition is a method that can provide information about a person’s identity.

Perhaps at the margin, there are problems with age verification based on the current state of facial recognition technology, which I am told is not capable of determining age extremely accurately. There are margins of error that can range from two to three years on average, but it depends on the technology — some are more accurate than others. So when I say “at the margin,” for some ages, obviously, a margin of error of two or three years is not relevant, but for others, it would be extremely relevant.

I will follow up on your question about licensing and then ask Mr. Turcotte to follow up on the effectiveness of biometrics and facial recognition.

A licensing system could work. I would say at the outset that it’s good that the bill doesn’t prescribe methods of protection because the technology is evolving too quickly. So these issues need to be able to be defined in terms of the changing technology, changing context, and so on, so that’s a good thing.

Second, what is important is that qualitative criteria, capable of evolving with technology, be developed and defined; and finally, a licensing mechanism could help. I think in order: (1) it has to be scalable; (2) the criteria has to be clearly adequate to verify age; (3) it has to be upgradeable in terms of technology; and (4) proposing a licensing method could help.

Perhaps Mr. Turcotte could follow up on the issue of facial recognition accuracy.

Martyn Turcotte, Director, Technology Analysis Directorate, Office of the Privacy Commissioner of Canada: To continue on the subject of age verification systems, particularly with respect to biometrics, they’re considered to be emerging systems, so they’re still new, and they haven’t necessarily reached a certain level of accuracy or precision.

The other thing that needs to be considered is how these algorithms were trained — with which photos. Do they take into account, for example, gender, race, ethnicity, differences between people? Do they take into account the aspects that may influence the assessment of the technology, an estimate of the person’s age? So there are many biases that need to be taken into account before using systems that are appropriate to properly estimate age.

As the commissioner mentioned, we see that there are still two- to three-year margins of error with this type of system. We have to be realistic about the 1.5 margin of error, and we have to look at the age range we’re looking at. Right now, what the legislation is focused on is basically 13- to 18-year-olds, so certainly if I take an age range that goes from 25 to 30 years and include 13-year-olds, my margin of error will be smaller or there will be an impact on that margin of error.

[English]

Senator Batters: Thank you very much, Mr. Therrien.

From testimony we’ve heard, the challenges surrounding privacy appear to pivot around the software programming, as you’ve just been discussing in French, used of age verification. For example, when we heard from Julie Dawson, Director of Regulation and Policy at Yoti, a U.K. technology company, she testified that their system of facial analysis and age estimation does not store any data about the person. Would a system like that address concerns that you have about the privacy aspect of it? Which of your concerns might still remain?

Mr. Therrien: Thank you.

The fact that an organization or a company’s system or given technology would collect less personal information to come to its age estimate is obviously a good thing from a privacy perspective, but I would say, going back to my statement and the reference to several lines of defence, that you need to look at a number of considerations. The amount of information collected by, say, a third party age verifier is one factor, and the less information collected, the better. Then there would be questions of security safeguards and encryption. Is their encryption system adequate and safe? Do they rely on facial recognition, which then leads to the issues we’ve just discussed? There are a number of layers to the onion. You should not look at the individual layers; you should look at the sum total of the layers. Ideally, of course, each of the layers is designed in a way that protects privacy as much as possible.

Senator Batters: Thank you.

In your opening statement that you provided to us, you talked about some other methods of age verification that wouldn’t require digital storage of personal information, and you gave the example of a government-issued ID card that could be visually verified. Could you please tell us a little bit more about that sort of system? In your view, would that be a preferable method of age verification from a privacy standpoint?

Mr. Therrien: The way this system would work is, again, we’re dealing with a third party whose task is to verify age for the purpose of allowing access to the website in question. Here the third party would see the individual and the identification card and would be able to confirm identity and age, and then they would transform that knowledge and information into, say, an anonymous code. The code itself is a good way to protect privacy, as opposed to providing personal information, but the respect of privacy of that system would depend on how much information the third party in question would collect, store and for how long before issuing the anonymous code in question. For the website provider, there would be less personal information collected, which is good, but if the third party were to collect a whole lot of information for a long time, that would not be so good. Again, it’s a number of factors.

Senator Batters: Yes. That onion you’re talking about. Thank you very much.

[Translation]

Senator Dupuis: Thank you, Mr. Commissioner, for being here to help us to better understand the issues of this bill.

In your brief, you set out a certain number of criteria. From what you explained in response to a question earlier, there are several layers of considerations to look at. Ultimately, it’s all these considerations that give us a better idea of the way forward.

My concern is with what you just said about personal data that is collected, how it’s used and how long it’s kept.

If an age verification mechanism is introduced, everyone who wants to access the site will have to provide that information, regardless of whether it’s through facial recognition or an ID card, right?

Mr. Therrien: Clients of the site will need to provide at least some personal information, either to the site owner or to a third party, to be able to verify their age.

Senator Dupuis: The bill aims to protect children and block their access to these sites.

If systems are emerging, the margins of error are not negligible, especially around the age of majority. In addition, there is a problem with data collection, and it is not clear how the data is stored. The approach is much broader because everyone’s data is obviously being collected.

In your experience or in the studies you have done to ensure privacy through technological means, are there other methods that could address Senator Miville-Dechêne’s concerns?

Mr. Therrien: I will confirm that at least some personal information must be given to at least one third party.

Senator Dupuis: Yes.

Mr. Therrien: With respect to the length of retention, for example, by a third party, that could be part of the regulatory criteria that would be adopted and potentially subject to permits or licences.

We can’t overlook the fact that at least some information must be collected or collated — obviously, the least amount of data that is kept for the least amount of time. You have to ask yourself whether it is better to have it done by a third party or by the host of the site. The question bears asking.

A set of regulatory criteria should be adopted, which may actually mitigate the privacy risk, but not eliminate it completely.

Senator Dupuis: As part of your mandate and responsibilities, have you established criteria that could be used as a guide?

In an article about the law in France, I read that there has been an attempt to sanction sites that disseminate pornographic images that may be viewed by minors. Since the publishers of these sites are located in tax havens outside of France, the government has no legal control over these sites. Wouldn’t we have the same problem here?

Mr. Therrien: It’s always a problem where the site is hosted. The bill, as drafted, targets organizations and businesses that make sexually explicit material available. They may be outside the country, which is a problem for enforcement. So this is a real issue.

However, I don’t think that means that we should give up and not adopt rules. The rules, if I may say so, are entirely justified. Second, there are very real enforcement issues that go beyond my mandate, and they involve the extraterritoriality of the sites in question.

As for the criteria defined by the Office of the Commissioner, I covered them in my presentation. One important principle is data minimization, which means collecting as little information as possible. Retention is important; ideally, information would be retained only as long as necessary for age verification. Encryption is not a perfect mechanism, but it helps a lot when used properly. I think my presentation speaks to the most important criteria that could very well be regulated, as the bill provides.

Senator Dupuis: Thank you.

Mr. Therrien: When it comes to privacy, the risk is generally not eliminated. You try to reduce it as much as possible. I think the structure of the bill is such that you can reduce the risk of privacy breaches without completely eliminating them.

Senator Dupuis: From our perspective, what we are also trying to do is pass legislation that can be enforced. If the majority of these sites are outside the country, we may have done some good by passing legislation, but we won’t have solved anything either. That was my concern. Thank you, Mr. Commissioner.

Mr. Therrien: Great.

[English]

Senator Boniface: Thank you to Mr. Therrien for being here, and thank you for your opening comments because they actually clarified some of the questions I had in my mind.

I’m wondering, in your experience in what you’re seeing around the world, in your opinion, is there an age-verification system that is more protective of private information than other options at this stage in technological developments? I’m wondering if you have been able to look at other options and whether you are seeing the technology actually advance to meet the type of criteria you are referring to, particularly on keeping it for the least amount of time.

Mr. Therrien: We haven’t, is the short answer. The slightly longer answer is that even if we had, the situation could change in three months because technology evolves constantly and the context changes constantly, so I really think to leave the definition of standards to sub-delegation is the way to go.

Senator Cotter: Thanks, Mr. Therrien, for being here and sharing your insights with us.

I have two questions. I think one is related to Senator Boniface’s question. It seems as though internet technology creates the many opportunities but also the risks to young people here that we are actually trying to address in consideration of this bill, and then at the same time, the technology and the technology risks present the exact kinds of difficulties that potentially impair, compromise or prevent the legal application of a provision like this, the privacy points that you raise. I’m wondering in general terms whether there are ways of thinking about these solutions. You mentioned the need for kind of an evolving regulatory structure, perhaps. That’s the first question.

The second question is a slightly different one and perhaps not directly in your field of responsibility. Based on some work I’m doing on another issue on another bill before the Senate, I have come across the strategy that the United States uses to prevent online gambling in poker. What they have done is impose sanctions on the institutions that might finance bettors who try to bet online to play poker. I’m wondering whether mechanisms like that, to sanction violators, would be a somewhat indirect but perhaps more effective way of addressing the problem and forcing compliance by people who are presently doing things that we want to see stopped? Thanks.

Mr. Therrien: Thank you for this. I’ll try to answer the second question first, and maybe you’ll want to add so that I can address your point more precisely.

I think the bill targets the company or organization that provides access, which obviously is the focal point, but also the internet service provider, which is in Canada normally. The fact that the bill addresses or defines responsibilities at both of these ends is a good thing. Are there others who should be caught through the scope of the bill? Potentially. But I see that as a good feature.

To your first point about the technology having benefits and risks, including the fact that many technology providers are outside Canada, the fact that the bill focuses in part on internet service providers who are generally in Canada is helpful. I’ll just say that under the PIPEDA, the long general application in the commercial sector that we apply, an important factor as to the scope of application of that law is whether the company that is collecting information has a sufficiently serious relationship with Canada, operates in Canada. The fact that it is housed outside Canada but operates in Canada is a factor of application of PIPEDA. We have been successful in applying our legislation to many companies that are headquartered outside of Canada but operate in Canada. Does it solve all problems of extraterritoriality? No. I would say, again, that the fact that internet providers are caught by this proposed legislation is a good thing, and it may be that there are other players that should be caught as well.

Senator Cotter: Thank you.

Senator Simons: Thank you, Mr. Therrien, for being with us here this afternoon.

As Senator Batters mentioned, last week we heard from a witness from Britain, Julie Dawson from a company called Yoti. She was very careful to push back every time we called it “facial recognition software.” She called it “facial analysis software.” In the course of questioning, she told us that their company is selling their technology not just to pornography companies but to retailers in the United States who are using it at the point of purchase, say in supermarkets, to make sure that they are verifying the age of people who are purchasing products such as cigarettes or alcohol or lottery tickets perhaps. When I expressed concern about that, she went on to explain that this is very useful because it stops there from being sometimes ugly confrontations between a sales clerk and the person being forced to show ID to buy something. It concerns me that the adoption of this kind of technology may normalize and regularize a consumer expectation of showing their face for artificial intelligence analysis quite routinely.

I wonder if you could tell me if your office has been tracking the adoption of this kind of technology in Canada, if you know if there are private companies or government actors who are using it and if you are concerned that we could be opening the door to tech that could be quite invasive in the private lives of all kinds of Canadians?

Mr. Therrien: That’s a very good question.

Biometrics and facial recognition or analysis — whatever the expression used by the representative from Yoti — collects at some level information as to identity that is essentially of a permanent nature. That’s the core of the privacy concern with that technology, that it may be used in this instance for a good purpose, which is to ensure that people who want access to certain services are of a certain age, but you have to have trust in the company in question that it will apply all of the criteria to reduce privacy risks. The criteria that might be adopted under regulations would reduce the risk but would not eliminate it. It’s a question of will the companies comply? Will the sanctions work? If they do not work, then it’s a question of trust in the companies in question.

If a technology is well designed, it can be useful and privacy protective, but if it is badly designed, and in the worst case scenario leads to the collection of information that can lead to essentially surveillance, then obviously your concerns become relevant. That’s the nature of technology, as Senator Cotter was saying. It can provide benefits here such as age verification, but is Parliament sufficiently confident that the regulatory criteria are adequate and will be enforced to ensure that the risk is sufficiently reduced is ultimately the question.

Senator Simons: Do you know of examples of this kind of technology being used at point of sale, for example, in Canada? I understand you are the federal Privacy Commissioner. That may be a more provincial Privacy Commissioner question.

Mr. Therrien: We have not investigated this.

Senator Simons: Because it seemed to me very problematic that you are conditioning people that there is an expectation that they have to provide their face for facial analysis or facial recognition, whatever you choose to call it, and that we are leading to acceptance of the surveillance culture by shifting the paradigm of what the expectations of privacy in the public realm might be.

Mr. Therrien: I’ll answer indirectly. We haven’t investigated the use of technology for the specific purpose to which you allude, but we have investigated, as you may know, a facial recognition company called Clearview AI that offered its services to law enforcement agencies and companies, and we are about to complete an investigation into the RCMP. We will publish the results of that investigation very soon, and more to the point of your question, propose guidelines as to how police agencies should use facial recognition technology. But in that document, we say — and that goes to the societal debate that I think you are referring to — this can be very useful technology, but if it is used in many circumstances, either to prove age or not to have to take your wallet out as you pay for a certain good and leaving a grocery store or other purposes, if this technology is used so pervasively, what does that do to the expectations of individuals being able to go about their private lives without the risk of being surveilled for commercial or other purposes? That’s a societal debate.

Among the good uses of technology, what is in this bill is probably on the good side. There is a good aspect to the use of the technology. But if it is used for that purpose, or to catch criminals, that’s one thing. If it is to allow someone to again more conveniently leave a store in order to pay for something, if it is for another purpose, then the sum total of these practices is of concern.

Senator Simons: Thank you.

Senator Pate: I’m interested in hearing a little bit more on the way that you see guaranteeing that the code that you’re talking about would be untraceable. There is a particular concern, of course, around the tracking, in particular the discriminatory approaches to tracking for hiring, for housing particularly for racialized and marginalized Canadians. How would you see that type of code being guaranteed as being untraceable?

Mr. Therrien: This would be provided by, for example, a third party. I think the reduction of the privacy risks would come from rules and regulations that would reduce the period of retention of the information and would require as little information as possible from this third party to issue the anonymous code. One can envisage, for instance, to take the example in my statement, that a 19-year-old produces a garment ID — physical digital ID to a company. The company doesn’t take any information other than to acknowledge that the person presenting the card is indeed the person in front of them, and an anonymous code is issued on that basis and no information is collected. That would be ideal. But there may be other systems that can be envisaged.

The Chair: Commissioner, I have a few questions for you, if I may. In your opening remarks, you have covered some of this but I will ask this again. You said there must be clear requirements for privacy, including technical safeguards. How would you suggest these requirements be put in place? Should privacy requirements or technical safeguards be included in this bill or in other related regulations?

Mr. Therrien: If the question is should they be in the bill, the legislation or regulations, I think it’s preferable that they be in regulations because of the evolving nature of technology. Under the legislation that we apply, PIPEDA, the technical standards are not prescribed in detail. It is a standard of adequate standards of protection consistent with the sensitivity of the information. That’s the standard that applies, and that is prescribed by law. Frankly, I think that is the best approach, to have a flexible standard in the law itself and to have technical details in regulations.

The Chair: Thank you.

Commissioner, we have heard about privacy measures in place for some online age-verification systems. As an individual, would you use online age-verification systems if this law came in tomorrow, or would you have privacy concerns?

Mr. Therrien: I would ask how many layers there are to the onion being presented to me. I would ask for many levels of protection.

The Chair: Thank you, and you explained that earlier. We will go to second round.

[Translation]

Senator Miville-Dechêne: Mr. Therrien, I want to come back to one of your premises. This is not about protecting commercial and corporate interests. As you said, it is about protecting children. That’s why we are looking for a balance, of course, between some loss of control over the private data, which may be temporary, and protecting children from commercial interests. That seems to me to be the main argument.

More generally, I would like to hear about the measures taken by betting and lottery sites, because age verification must be done on all those sites, since people are not allowed to bet if they are under 18. Right now, in Canada and, of course, in the United States, age verification systems are in place. We don’t need to talk about facial recognition, but we often talk about the option of seeing the person who submits their cards to make sure there is a connection between the two.

Are you shocked by this sort of more minimal verification or is this the price we have to pay to protect children from the harm caused by viewing pornography?

Mr. Therrien: I think I said earlier that, among the age verification practices, the one in the bill is among the more justifiable. So no, it doesn’t shock me.

Then comes the question: Which technical measures? We have talked in some detail about facial recognition, its risks and the inaccuracy of the technology. As you say, choosing the technology is important. Protecting children is clearly a laudable goal that can be achieved through technology.

Senator Miville-Dechêne: That is why I would like to take you to the current technologies used by sports betting and lottery sites. Do these sites seem to respect the privacy of players?

Mr. Therrien: Unfortunately, we have not investigated those sites or how they perform age verification. That is why I would not be able to speak first-hand.

Senator Miville-Dechêne: Thank you.

[English]

The Chair: Thank you, commissioner, for once again making yourself available. We look forward to working with you in the future. Mr. Turcotte, thank you as well for making yourself available.

For our next panel, we have Brian Hurley, Director, Canadian Council of Criminal Defence Lawyers, and as an individual, Emily Laidlaw, Canada Research Chair in Cybersecurity Law and Associate Professor, Faculty of Law, University of Calgary.

Senators, you will have five minutes to ask your questions.

Brian Hurley, Director, Canadian Council of Criminal Defence Lawyers: Good morning, ladies and gentlemen. As a quick introduction, I’m a criminal defence lawyer in Edmonton. I have been doing this for the last 28 years in Alberta, Saskatchewan, B.C., the Northwest Territories and Nunavut. I also happen to be the father of four children who are now in their late teens and early 20s, so certainly this is an issue that my wife and I have grappled with over the years — also the issue that our children, like every child in Canada, posts absolutely everything about their entire life online.

I will open with a few concerns that have been expressed by other witnesses. Obviously, age verification is something that may work; it may not. It has been tried on a lot of sites, some of it pretty basic, some of it pretty complex. Presumably, as the preamble states, we have, or can find, effective age verification that does not sacrifice privacy. I’m not sure if that’s realistic at this point in time with technology. I think to have effective age verification, you have to divulge a lot of information, so that, for instance, my 16-year-old son cannot pretend he’s me when he wants to look at something.

With the act itself, I am concerned, as a criminal defence lawyer who has represented several youth and as a father who has talked to his kids and many other parents, that this is truly aimed at commercial purpose, large providers, large internet providers and things such as child pornography. I have represented dozens of 16- and 17-year-olds who have been charged with child pornography provisions and talked to dozens of parents when their child shares a photo of themselves, for example, mooning a school.

The act itself seems to be focused on commercial purpose, and there is clause 4, which starts, “Every person who, for commercial purposes. . . .” Then I connect that back to clause 11(a), which opens things up to further regulation and provides that this might encompass things that are posted free of charge. Once again, as both a criminal defence lawyer and as a father of kids sharing everything they do, even inappropriate photos, it made me think.

I think the law ought to be a lot clearer regarding who has the duty to do this and who it’s aimed at. Are we aiming at the Shaws, Bells and TELUSes of this world? Are we also including the Pornhubs and other sites like that of this world? Or, as subclause 11(a) leaves open, are we aiming this at absolutely everyone?

As a defence lawyer, I’d be concerned with clauses 4 and 5 being quite broad.

I’m also concerned, as a defence lawyer, when I have to Google words in a piece of legislation. I would hope that by now I’m used to most of the technical words that come up in criminal legislation. I had an interesting Google of “mandatary” and its connection to the Pope. My children are bilingual and I’m not, so I haven’t looked at the French wording. However when I see words like that, I get concerned. There is also a word like “acquiesced,” which is not part of the criminal lexicon of the English criminal law we have in Canada. I get concerned about that, particularly when we’re looking at six months in jail as the upper end here.

Additionally, I’m concerned about the corporation liability for employees, corporate directors, et cetera, these things being very broad.

This legislation includes the terminology “legitimate purpose,” which is contained in other legislation often focusing on this. “Legitimate purpose” can be tricky. It’s vague. It hearkens to old Criminal Code sections such as immoral theatrical performance or mailing obscene material, which relied upon very difficult and prone-to-vague enforcement and vague interpretation community standards tests. In Edmonton, we celebrated the anniversary of a raid on a bath house many years ago here. Of course, community standards back then allowed for that type of thing. So I am concerned about “legitimate purpose” and inviting a community standards test that can be problematic, particularly for marginalized members of the community.

This is something that is occurring more and more often in the federal legislation, and I’m referring to clause 8. We have a piece of legislation that is telling judges what is aggravating. With the greatest of respect, ladies and gentlemen, our judges know what is aggravating and they know what is mitigating, as do our defence lawyers and Crown prosecutors. This particular piece of legislation simply states that the purpose of the act, obscene materials, is aggravating. It seems a little redundant, and I would suggest that it’s part of a trend to be instructing judges on what is aggravating. We appoint very bright people to be judges. Those men and women know what is aggravating and mitigating.

The terminology of “obscene material” and then the term “explicit material” in the legislation cause me concern. Once again, I haven’t looked at the French terminology. I actually asked my daughter, who is going to law school and is bilingual, to have a look at it, but we didn’t get to that lengthy discussion. But I am concerned we have both “explicit” and “obscene” material in the same piece of legislation that presumably refer to the same thing — or do they? I don’t know.

Regarding clause 9, once again, as a defence lawyer, I have concern about it being quite broad. Does this get service providers investigating personal information of individuals and turning it over to authorities — turning over information that can be then used for prosecutions under this legislation or potential criminal prosecutions?

Senator Paula Simons might know about the old Hunter et al. v. Southam Inc. and a raid on a newspaper here in Edmonton that goes back decades under the Canadian Combines Investigation Act. Our Supreme Court told us clearly that warrantless seizures are presumptively unreasonable. This would seem to be a warrantless seizure, based purely on the minister’s belief that there’s a reasonable issue here, and that’s problematic. It was problematic in the 1980s under Hunter et al. v. Southam Inc., and it is problematic today.

Clause 11(c), the penalties for service providers failing to comply with the previous clause 10, is left to regulation. We set out other penalties. As a defence lawyer, I’m concerned about penalties left to regulation. It can and often does invite extortative types of penalties, particularly when we’re dealing with big companies. We all see the U.S. example of the $50-million penalty. I wouldn’t like to see Canada move that way. Let’s have it spelled out in the legislation what the penalty is; let’s not leave it to a regulation that can be changed in a matter of weeks and can be specifically focused on a service provider. Now, we never want to have a Donald Trump in Canada, but we write legislation on the belief that we might have that risk at some point in time, so we write legislation carefully so that regulation can’t be used for an extortative purpose somewhere down the road.

Those are my quick comments. Thank you.

The Chair: Thank you very much, Mr. Hurley. We will now go to Professor Laidlaw.

Emily Laidlaw, Canada Research Chair in Cybersecurity Law and Associate Professor, Faculty of Law, University of Calgary, as an individual: Thank you very much.

I approached this from a slightly different angle. Brian, I was carefully listening to your comments. My comments today will focus on two aspects of the bill, and those are the website blocking and age-verification aspects.

I want to begin by saying that I think the goals of the bill are commendable. The impacts of children viewing pornography can be profoundly harmful. Arguably, children have a human right to be free from exposure to pornography; there’s been some literature on that point. But we do have to be mindful that, for adults, viewing pornography is legal and protected expression.

Let me turn to website blocking. The right to freedom of expression — and I’m taking more of a human rights lens here based on the work I’ve done — is much wider than people often understand. It’s the right to seek, receive and impart information and ideas, regardless of frontiers. It’s not an absolute right, but any limits on the right have to be necessary and proportionate to the public policy goal. Also, foreign and domestic content should be treated the same. Privacy is also part of that story; it is a prerequisite to enjoying the right to freedom of expression.

What does that mean for website blocking? It has historically been frowned upon in democratic societies, because it’s seen as a prior restraint on speech. It’s hard to do it in a way that’s human-rights-compliant. It can be a blunt tool, it’s easily circumvented, it tends to block more than it should for longer than it should, it tends to be a bit of a due-process nightmare, it can cause collateral damage and it has a global impact. No matter how you cut it, the internet is a global medium.

There has certainly been a trend by states toward using blocking in certain circumstances, but there is little international consensus on when it is appropriate, except to address child-sexual-abuse images.

If we look at the bill, I am sympathetic to the enforcement problem against foreign-based websites that prompt the focus on website blocking. I’ve thought about this quite a bit since earlier conversations about this bill, but I will lay out my concerns.

One, in principle, blocking should be a last resort, if at all, and it’s not structured that way at the moment in the bill. Alternatives are collaboration with law enforcement and other stakeholders. Education is one of the primary points to address access to child pornography by children, as is blocking at end-user points, which is often tied closely with education.

Second, to meet due process, anything like this should only be administered through a legal body, not a minister, as proposed here. I know that’s something that has been thought through by those involved already.

Three, it needs to be narrowly scoped. This is something that I think Brian was talking about. Who is the target? Here, it’s any site that makes available sexually explicit images, so it’s not just focused on sites primarily devoted to such content. You can imagine anime websites, mainstream video-sharing websites, blogs, email providers and different artists who are posting their work. Those might be captured at the moment — or is captured at the moment — with the way the bill is drafted. I would also suggest here that perhaps those in the sex worker industry be impacted by maybe some of the knock-on impacts of capturing some of these providers to their work.

Four, it should not be outsourced to private companies to administer and set the terms. Outsourcing to ISPs does generally raise net neutrality concerns. My concerns are more about how problematic it can be to enlist ISPs into a policing role.

The last point I want to make about website blocking is there needs to be a focus on whether it can be effective here to address the mischief that you’re concerned with. I have some concerns about that. One concern is that individuals can route around the problem. VPN access, for example, is a way to route around it. The other is this might not necessarily reduce the most common ways kids are exposed to pornography, such as stumbling upon it, inadvertent pop-ups or content that somehow beats whatever the filtering tools that are used on mainstream sites like YouTube or whatever it is.

That brings me to age verification. In principle, age verification is a good idea, but I’m not sure the tech is there yet. I wasn’t able to hear the earlier comments, although it seems like this has been a point that was made to you before. What we can deploy easily in the physical world is far more complicated to do online.

There are really two options here: a trusted third party or through each website. My understanding is the thinking is that this would be through some sort of trusted third party. At least that’s been explored.

There are challenges with age verification, at least where we’re at right now with it. One challenge is social stigma. It creates a registrar for all Canadian pornography consumers. This is a free-expression concern because it creates a high bar for adults who want to access this legal content. It also creates a privacy concern. It compels them to disclose their viewing habits. Even on a larger scale, just that this is part of their viewing habits. The problem is that it’s not waving a driver’s licence in a bouncer’s face to get into a club. This essentially would operate as an ID system. It does also raise equality concerns, because even economically disadvantaged adults might not have, for example, credit cards, if that is one of the ways identity is being established.

The other point is that this creates a significant data breach risk. This is sensitive data. If we look at MindGeek, the owner of Pornhub, it has suffered several data breaches since 2012. If we leave it to every website to use their own verification system, there’s even greater risk of varying levels of security safeguards. Equally, if you house it in one place, even with a trusted party, it then is also a target for a data breach. There’s no easy way out of this one.

Age verification, at least where it stands right now, it’s questionable whether it would be effective, for many of the reasons I identified when we were examining the website blocking issue. I do think there is some hope down the road. Researchers are trying to develop an ID system that is privacy preserving. I know work is going on right now in the United Kingdom. Getting that right is a necessary first step to being able to implement what is being sought here.

My recommendation is that you work with the Privacy Commissioner’s office on age verification and the issues associated with these broader ID systems and more closely scrutinize what the enforcement mechanisms or regulatory options are. The goal is to reduce the exposure of children to pornography that they should not have access to. There are regulatory options that are available to achieve that goal. And more closely scrutinize what the potential free speech implications are of website blocking.

Thank you.

The Chair: Thank you very much, Professor Laidlaw.

We will now go on to questions from senators, starting with the sponsor of the bill, Senator Miville-Dechêne.

Senator Miville-Dechêne: Thank you very much to both of you. You gave me a lot of food for thought.

I want to reassure Mr. Hurley that we are aiming at porn sites, not at individual children, not at sexting, not at any of that. That’s why the word “commercial” is there. I understand why you think it could be wider, but the aim of the bill is commercial porn sites, and it is where the kids are going. Obviously there’s a lot of space for making this bill better. Thank you for your advice.

I want to ask Ms. Laidlaw about what she said about blocking sites. I’ve reflected a long time on this question, and it seems to me that since most of those websites are outside of the country, we don’t have many means to intervene. If a porn site says, “No, I’m not going to verify age,” or says nothing, apart from blocking the site, what else is possible? I would say that it has been done in Canada. There was a trial involving Gold TV. Gold TV was a site that was basically stealing material from others, and the Federal Court gave permission to act. So it is possible.

Obviously, as you say, we have to be sure that the law is respected, but I don’t see what else we can do except blocking the website. Mr. Hurley, once again to reassure you, I’m rewriting this clause to make it clearer what the means are, and blocking will be specified. Could you tell me, Ms. Laidlaw or Mr. Hurley, if there are other means besides blocking a site if it doesn’t comply?

Dr. Laidlaw: I’m very familiar with the case law on website blocking. I lived in the U.K. at the time this started being more actively used there as a method to address precisely what you’re talking about, which are these foreign-based sites that aren’t complying with local law.

There are a few problems with it. Let me say what the problems are, and then I’ll tell you what the options are. The problem is that there are real questions about whether you would be compliant with your international human rights obligations under the International Covenant on Civil and Political Rights if you implement a website blocking program, and that said despite the case involving Gold TV.

The way to make it work is usually a very narrowly scoped website blocking route. But again, it has to be a last resort. It has to be that other ways of trying to address this particular content have been set down in the law, like a graduated scheme, in a sense, until you get to that as the last point. It’s not currently structured that way in the bill.

The problem with website blocking, I’ll emphasize again, is that it is prior restraint. You are saying that people in Canada cannot access content that is for them legal. That hurdle is very high to try to overcome.

The other way to try to deal with that particular content is usually at the end-user point, and it is at that point where — and this is also a flawed approach — at a family level, mechanisms are put in place to block access to particular content online, such as child safety features, for example, so they can’t access particular sites. What are the flaws in that? It’s dependent on each family to be able to implement them.

Senator Miville-Dechêne: Exactly.

Dr. Laidlaw: It doesn’t completely work, but that’s where education often plays a particular role to try to implement those features. Let’s face it: Kids are often savvier than their parents when trying to get around some of these features.

There’s no easy way out of this. The problem is that jumping straight to website blocking is jumping over several hurdles. I have to think through what more creative in-between options there are. There aren’t many. That’s the problem we have. You have flaws at the individual and family level in trying to deal with these features, but jumping to website blocking implicates all kinds of human rights. I’m not giving you a good answer entirely, but that’s where we’re at with this.

Mr. Hurley: I would echo all of what Emily said. As a parent with kids that are far more tech savvy than me and are now young adults, I think all parents would appreciate some assistance from Net Nanny and things like that to block things from their children. That is something maybe this government can get involved in.

The technology exists to block websites. There are regimes all over this world that block all sorts of websites. You can shut down pornography if you want. You can block it out. Other countries have done it. It’s a question of what we give up as far as adults’ freedom to look at things that are perfectly legal in order to protect our children. On a family level, that’s a decision we made with our own computers in our house. We were very locked down years ago, and I would have to go into the office to do research on things. That’s obviously changed as my children have aged, but as a nation, that’s what we would also look at. Are we prepared to lock down websites and lock out adults from certain things for the sake of children? It’s a balancing act. It’s doable. If we do it on the basis of people who violate age-verification restrictions, we have to be careful about what information we put out there that can be hacked into, as we find over and over again.

Senator Batters: Mr. Hurley, thank you for setting out some of the concerns that you have with this bill. I also appreciate what you said about being a parent, because there are many of us who definitely support the noble intent of this bill. We want to make sure this bill is the best it can be to accomplish the goal of restricting young people having access to this type of material. For adults, it’s one thing, but for very young kids, it’s quite another. From the legal perspective that you bring to it, which is very important to this issue, what are the two main ways that you could see some of the concerns you have with this bill being made more compliant with, perhaps, a couple of amendments that you would suggest might be able to be fixed in this bill?

Mr. Hurley: Judicial authorization to get material and information from private citizens or corporations, and a true focus in the bill that this is about commercial organizations and business, so if my 17-year-old son’s buddy shares a photo with 20 people, that they’re not caught by this, which are real phone calls that I take about clients and children. Perhaps some of the terminology that I mentioned may be seeking terminology that is more common in Criminal Code-type language.

Senator Batters: Taking out things like “acquiescence” and not having two different words, “explicit” and “obscene.” What other types of terms did you have trouble with that you had to Google, and what words would you suggest they be replaced with to more properly bring it to the precision that criminal legislation should have?

Mr. Hurley: I think what the bill is trying to say is that if you’ve instructed an employee to do something, you’re liable for those actions, and the liability goes up and down the food chain at the corporation. The word that I had to Google was “mandatary” something that makes it explicit up and down the corporate ladder. The terminology we often look at in criminal court is “with knowledge.” Are we going after people with knowledge? Then let’s state it. Are we going after people who are reckless? Then let’s state that. “Acquiesce” would suggest a level of guilt, if you will, or consent, but something far lower than knowledge and something far lower than recklessness, I would think. The terminology is tried and true in the Criminal Code as far as whether someone is knowingly doing something, recklessly doing something or willfully doing something. That terminology exists, and I assume it’s chosen carefully. As I said, “acquiesce” and “mandatary” are not something I was familiar with in criminal language, and “mandatary” I wasn’t familiar with at all.

Senator Batters: Thank you. I appreciate that.

[Translation]

Senator Dupuis: My question is for both Ms. Laidlaw and Mr. Hurley, and I have a more specific question for Mr. Hurley. I understand the intent of the legislation, but I have a concern. If we tighten up the wording of the legislation to make it more in line with what is traditionally used in criminal law, does that address the issue of children accessing pornographic sites? In other words, if different families have different rules for accessing the Internet — for example, I could deny my children access, but all they would have to do is go to the neighbour’s house — then we have just missed the point of the legislation.

My more specific question for the two witnesses, whom I thank for being here today, is this: based on your experience, Mr. Hurley, wouldn’t it be simpler to have something else in the Criminal Code, like an offence? In other words, if I ask you more bluntly, would your career of defending people who have allegedly accessed pornographic sites under the age of majority be shorter and less lucrative if we changed the Criminal Code, instead of passing a bill that raises so many legal problems, as you have both just told us?

[English]

Mr. Hurley: It’s a very specifically focused bill, but we do have specifically focused pieces of legislation in the Criminal Code. The closest to this is 171.1 of the Criminal Code, which is essentially sending pornography to a child to invite a sexual liaison. So that’s there, and this certainly could be incorporated into something like that as part of the Criminal Code but aimed at corporations.

If you want to truly aim at a corporation, then a stand-alone piece of law may be the best to suggest that we are focusing specifically on corporations, and there’s nothing really like that in the Criminal Code at the moment that I’m aware of. That may be a benefit of having it as a stand-alone bill to make it very clear that this is something meant to focus specifically on commercial purposes.

My focus on it and my take on it, as a criminal lawyer, was whether this is going to withstand Charter scrutiny. Will I be able to poke holes in it when TELUS or SaskTel pays me a huge retainer to go after it? I think there are problems in that. An experienced defence lawyer could poke a lot of holes in it.

To your question about if this effectively protects children, I don’t know. Is there too big a risk with the amount of information that’s necessary for age verification that it will create more harm than good? That’s the problem, Senator Dupuis.

[Translation]

Senator Dupuis: Thank you. You understood my question very well about your brilliant career in finding the holes in this bill. That is why I asked you the question: Would it not be better to turn to an amendment to the Criminal Code, which already has provisions? Thank you for your answers. Ms. Laidlaw, do you have an answer?

[English]

Dr. Laidlaw: I would obviously defer to Brian’s comments about specifically the Criminal Code. What I’ll comment on is more about effectiveness.

There’s no way through this where we are going to be able to wholly protect children. This is about simply reducing the risks of exposure. But with that lens, though, it means that the scrutiny of the way that other rights are impacted in order to achieve that goal needs to be carefully considered here. This isn’t going to be a perfect system. Children are going to be exposed, no matter what.

What are the risks that you are taking when it comes to individuals having to register their particular interests? Again, this is me saying that I don’t think the tech is there. I don’t know what conversations you had earlier with this committee.

It means, too, that if you select the route of website blocking, it needs to be narrowly scoped to the extent that you have oversight and you have due process principles built into this. It is only a matter of last resort. You have other options that are available, for example fining the company. I know that’s one of the things that’s set out. Website blocking should be rarely used. It’s certainly something that other countries have used only recently, but that’s because it’s an easy mechanism and not necessarily the right mechanism. That’s just what I want to emphasize.

[Translation]

Senator Dupuis: Thank you, that answers my question. I think we have a problem. English has a very interesting expression, “feel-good legislation,” which ultimately leads to more problems than it solves. I can’t find the equivalent in French. I don’t know whether we are doing more than that with Bill S-203. Thank you for your answers.

[English]

Senator Boniface: Thank you to both witnesses for being here. I think you’ve probably answered all of my questions. I feel like I’ve had a lesson in criminal law review, so I’m grateful for that.

Mr. Hurley, can you expand on your comments on the aspects of this bill that would be in regulation versus legislation? That was one of the concerns I had. Could you just clarify your point on that?

Mr. Hurley: The concern, as you all know, is that regulations can be very quickly done and achieved and can really change the focus of the bill and the punishment of the bill and markedly change the bill, versus legislation that takes a lot longer to get done. The main one I was concerned about is a penalty section back in 11(c). This one did seem truly aimed at punishing corporations but putting penalties in the regulation where penalties are spelled out for other things in the act. Let’s spell it out rather than put it in regulation. That was, I think, my main focus.

Senator Boniface: Thank you. I think of it from the concept of criminal law and firearms, for instance. You’ll find some things in regulation, but the penalties, I believe, are in the legislation itself. I was trying to think of other areas of criminal law where you may find this type of structure.

Mr. Hurley: Generally, in criminal law, you leave things to regulations where you are concerned there might be a quick change or new development that has to be addressed quickly. People are always dreaming up a new type of firearm or weapon that has to be dealt with quickly. The police become aware of it, it’s on the streets and we need to deal with it immediately. That type of tinkering, obviously, is left to regulation. If it’s truly tinkering, leave it there. However, something as significant as what the maximum penalty might be strikes me as a lot more than tinkering.

Senator Boyer: Thank you, panel, for being here today. This has been a very interesting panel.

My question is for Professor Laidlaw, and it’s about the realistic possibility of the enforcement of this legislation. I worry that if Canada passes this legislation, the companies it’s trying to regulate would simply move their operations offshore and then would be able to effectively ignore the regulations. Professor Laidlaw, you had touched on this when you spoke about the web blocking, but I’m wondering what would prevent minors from using things like VPNs to simply circumvent the regulations. Is there anything, in your view, that can be done to make the legislation more effective and enforceable?

Dr. Laidlaw: That’s a great question, and you are right. One of the issues here is that they can just root around it using a VPN. The other issue, too, is what it drives if you end up blocking, for example, a major website. Those Canadians who are adults might go other places looking for access to pornography, for example, and to the dark web or wherever it is. There are a lot of different issues at play.

If the goal is to reduce the amount that children are exposed, you’re not going to be able to perfectly enforce this, either, against the bad actors. What do we do? One of the in-between roads, which I sort of mentioned in passing earlier, is collaboration — working with other countries and different stakeholders to be able to enforce it abroad. One option is to block the website, but, of course, there are implications for that when it comes to various factors I have already mentioned. The other option is then just working directly with law enforcement.

It’s not going to work for every country. Not every country will collaborate. Some of these sites are based in countries that aren’t as collaborative, and you aren’t going to be able to be effective against those particular sites. That’s kind of the cat-and-mouse game you are dealing with here.

Senator Boyer: Could you be looking at some international treaties between different countries, then, in relation to the enforcement of this kind of issue?

Dr. Laidlaw: It is, and in fact Brian might be able to speak to this better than me. This is starting to go outside the area where I am comfortably an expert, which is countries that we have particular collaborations with to be able to enforce it. I might toss this one to Brian if this is an area that he knows better than I do.

Mr. Hurley: As Emily said, you are looking at mutual assistance and mutual enforcement treaties that exist. However, you’re always going to have the problem that all it takes is one bad state actor and everyone just sets up home there. That gets around all of our best-intended treaties.

I don’t want to be too cynical, but as Senator Dupuis said, is this just a feel-good thing or window dressing, or can we actually make it work? I’m not sure if we can, actually, make it work. Perhaps we’re better served with funding and focusing on helping parents with net nannies and other things and the technology they need to protect their children — where we give them devices, put rules and attach various things to it. There are concerns that no matter what we do, all it takes is one bad state actor and everyone moves there.

Senator Boyer: Thank you to both of you.

[Translation]

Senator Boisvenu: I would like to thank our two witnesses for their very interesting presentation. I have a general question for both witnesses, because they raised many points that make us wonder about the effectiveness of the bill.

As parents, we all want our children to be spared from the evil of pornography, especially at such a young age.

With respect to this bill and the status quo, is there another avenue we could take to better protect our children by preventing them from accessing those products? I’m referring primarily to flagrant pornography, which has a much greater impact on children. Perhaps Mr. Hurley could answer first, followed by Ms. Laidlaw.

[English]

Mr. Hurley: Thank you, sir. It’s a difficult question.

I don’t know that there is a way that we can absolutely block this from children. My children were not very old when they were far more tech savvy than I was. My children were not very old when they were far more tech savvy than even a tech-savvy adult. That’s a reality we deal with. I don’t think anything is going to be 100% protective.

As I also stated, we also deal with children who share absolutely everything with each other online, including inappropriate pictures of themselves. So even if we are blocking sites, they will get inappropriate pictures of their brother’s girlfriend or sister’s boyfriend. That’s going to be out there. So I don’t think we can ever achieve perfection. That certainly doesn’t mean we don’t try and certainly doesn’t mean that we don’t look at something like this.

But the short answer to your question is, as a father or a criminal defence lawyer, I can’t see something that works perfectly and keeps children away from pornography. That’s just unrealistic. A 14- or 15-year-old boy — speaking from experience as a father and once a 14- or 15-year-old boy — is going to find this.

[Translation]

Senator Boisvenu: Mr. Hurley, can we take this bill as a first step?

[English]

Mr. Hurley: Personally, I don’t think so. I think part of that is because of the nature of age verification. I don’t think you can have effective age verification at this point in time in the nature of our technology without sacrificing a gigantic amount of personal information. I think that creates a huge risk as far as that information being in people’s hands, that it’s going to be hacked into and cause a big problem. Personally, I don’t think the technology is there for safe and effective age verification that doesn’t give up so much information that you create a larger risk to privacy of individuals looking at perfectly legal things.

[Translation]

Senator Boisvenu: Ms. Laidlaw?

[English]

Dr. Laidlaw: In some ways I think that this bill is ahead of its time because of the fact that the age-verification technology, as least as far as I know of it, is not quite there, given some of the privacy risks. So where does that leave us? The hard truth is that some of the best ways to deal with this are things that we almost don’t have a lot of control over, which is resources. We need to throw a tremendous amount of resources at a local level working with, for example, schools that actually work with the parents and that teach them about the different mechanisms to try to control children’s access. Actually, I do have tremendous concern about children’s exposures. I don’t like the analogy that we were all sort of exposed to it when we were growing up because we are facing something profoundly different now. No one has said that here but I have heard that elsewhere. There is a real risk and harm that we are facing. Resources are what are needed until there are more human-rights compliant, privacy-preserving tools available to us that we can then piggyback a law off of.

[Translation]

Senator Boisvenu: Thank you very much.

[English]

Senator Simons: I want to thank Professor Laidlaw and Mr. Hurley for being with us on what I can personally attest is a very beautiful day in Edmonton, on which we would all rather be outside.

I will direct my questions to both of you from your different areas of expertise. I’m looking specifically at clause 4 of the act. I understand that Senator Miville-Dechêne clearly intends this bill to be focused on large commercial, online distributors of pornographic images. But the language in clause 4 says, “Every person who, for commercial purposes, makes available sexually explicit material . . . in the case of an individual . . .” I hadn’t thought about it until Mr. Hurley said this. It’s possible it will capture little Johnny who is selling pictures he took of his sister in the shower online to his friends for a buck a peek. Is there a legitimate concern that clause 4 could capture individuals rather than corporations, and if so, is there a way to rewrite that clause so that it would be clearer that this was not meant to capture that kind of very small, private, intimate exchange of images, even for some commercial consideration?

Dr. Laidlaw: I went through the process of legislative drafting recently for the Uniform Law Conference of Canada, but we were looking at civil law for intimate images. One way is to focus on I think it shouldn’t be on the person; it should be on an entity. So every entity. And I think it should not be focused on just all entities because you are still capturing all of the internet in a lot of ways. Where you would want to focus would be on every entity where, for commercial purposes, the dominant purpose of its website is the sharing of sexually explicit material.

I actually had some concerns about how broad the definition of sexually explicit material is in the Criminal Code. I chose not to address it in my remarks, but it does capture — and maybe Brian can speak to this — it’s not just videos and images of humans, but it’s also drawings, I think. Isn’t that the way the provision is phrased? If we imagine here that anime, for example, would be captured by this. I know that anime pornography is relatively popular. I know it is being shared amongst children, so maybe that is something you want to make within scope. Or maybe that is the thing that you narrowly say, “This is going too far; we need to focus just on the major websites, on particular content that involves real live humans and that their dominant purpose is for sharing that.”

Again, then you might capture websites where children are actively seeking out this particular content. What you would not capture, then, under this law would be all the ways they just sort of stumble on and find this particular content because it just pops up or some video appears on YouTube, which still happens. The filter is not perfect. It never will be.

Senator Simons: Thank you. That’s the sort of answer I wanted to hear.

Mr. Hurley, can I put the same question to you?

Mr. Hurley: I always approach it when I look at it as what is law enforcement going to do with this or potentially do with this, which is often different from what the legislators had intended. Law enforcement quite often finds a problem and then finds the piece of legislation or the Criminal Code section that might deal with it, whether that’s what Parliament intended or not. This certainly, particularly with the inclusion of “person,” leaves it open to go after what I’ll term “the little person,” who may be involved at a low level. If that’s not what the Parliament intended, then I do agree with a reworking of the wording.

Obviously, as a defence lawyer, if you charge the corporation, I do everything in my power to make sure anyone connected with the corporation can jump ship as quickly as possible to avoid prosecution, and that’s the trade-off you have.

Senator Simons: Thank you very much.

Senator Cotter: My question is for Professor Laidlaw. Firstly, I guess I should say I have followed your career since you left the University of Saskatchewan and have taken it to significant heights as a Canada Research Chair. I know that you’ve done a lot of work about online abuse. That seems to me to be the kind of atomized, individualized incidents that are not dissimilar to the challenges that are being addressed here. I wonder if you could say more, maybe following up on Senator Simons’ question about whether it’s possible from your perspective to take the ideas that exist in trying to address those atomized instances to the higher, more systemic level that we are wrestling with here. Are there insights we can draw from those individualized kinds of circumstances that you looked at to this higher level of treatment?

Dr. Laidlaw: Yes, thank you. It’s wonderful to see you, professor.

I have been thinking about this. There are two avenues in the area of online abuse and platform regulation that might help your thinking. It was something I didn’t mention following on Senator Simons’ question on how to focus the definitions and the targets here.

Recently some of the focus and platforms, for example, the European Digital Services Act, have actually differentiated between different types of platforms. They are really only targeting platforms of a certain size with the regulation so that you only would be hitting the major players essentially. It might be that they have a certain number of users or whatever their global turnover is per year. We have seen this with the GDPR for privacy regulations. It’s a way to target the bigger players. The drawback of this is when it comes to what kids are finding online, that’s not necessarily going to capture the players that you’re aiming to capture here. But it’s one way to narrow things. Some of the ways to address some of the human rights concerns are small and several decisions that just kind of carefully carve this a little bit better.

There have been a lot of lessons learned about due process through dealing with online harms. One of the things would be it’s aiming to have a minister, for example, make the decision about website blocking, but let’s say you moved that to a different body. How is that structured? Is it a transparent process? Do we know what the rules are? Is there an opportunity to appeal? How narrowly is that crafted? Is there opportunity to be heard even? How does the process work so that we are replicating the elements in the due process? For example, if we look at the website blocking in the U.K., all of that is done through a court. It’s not done through a minister or even through some sort of an agency or tribunal. It is set out in the Copyright, Designs and Patents Act. It is an application to the court, and therefore you have all those kinds of structures that we would expect of a legal system. These are some of the ways that might address some of the issues at least.

Senator Cotter: Thank you.

Senator Pate: Thank you to the witnesses.

The benefit of going near the end is that lots of questions have been asked, so really I will provide opportunity, starting with Dr. Laidlaw and Mr. Hurley, to expand on some of what you have already said.

Dr. Laidlaw, your work has focused on corporate social responsibility. Working in and around the criminal legal system for about 40 years, one of the challenges I see is whenever we criminalize something, those individuals most likely to be caught up in the system are often those who are easiest to catch. So in particular, those who are most marginalized, who may have experienced trafficking or exploitation themselves, could be caught in this. I’m curious as to whether, in addition to what you already said, you have other suggestions of things we might do to safeguard against those realities? Thank you.

Dr. Laidlaw: That’s a great question. I would say two things.

The way it is crafted now, one of the knock-on effects is the only ones to comply with these particular regulations end up being the big players. You have to be careful about that because of some of the impacts this might have on marginalized groups, I think that is real the way it is drafted now. But I hear from the committee there might be some goals to maybe narrow the drafting of that, because at the moment, it would apply to every person. If you are imagining that individuals have created a website, for example, they don’t necessarily have the machinery to try to manœuvre and avoid obligation, whatever it is, there are individuals that will be more vulnerable to this structure than others.

I think that either you let go of the goals of this legislation and focus more on a kind of education, so you don’t proceed with the bill, or the other option is going ahead with some of the recommendations I made before about much more narrowly targeting what players you want to regulate with this.

Mr. Hurley: Yes, I agree wholeheartedly with Emily. I’m not sure if this is just a bill before it’s time and we don’t have the technology to enforce this, but if we want to make it more targeted, as Senator Pate indicated, time and time again, we go after the low-hanging fruit, the small fish. They are easier to catch. That’s not what we want to do here. So perhaps a focus exclusively on commercial purposes and corporations — we all know the danger with that, that we simply kill the corporation and move along. We do want to go after the actors within the corporation. I’m not sure of the solution, quite frankly, Senator Pate. In three decades of doing this, it’s always the low-hanging fruit and the small fish that we go after. It’s rare that you represent the big fish. They swim away, and I don’t know how to fix that unfortunately.

The Chair: I have a very few minutes before the clock strikes 12 and we have to close this committee. I have been struggling with this. I thought you would be the best one to answer. The actus reus and the mens rea components of the offence are included in clause 4, providing sexually explicit material to minors. In other words, what activities are specifically prohibited by the bill? What is the necessary mental component for an accused to be found guilty? Did they need to be intentionally making sexually explicit material available for commercial purposes to a young person, or is it enough that they were reckless and negligent in doing so?

Mr. Hurley: At least on the English version, with the inclusion of the word “acquiesce,” I think the mens rea is very low and gets below the criminal standard of mens rea. The actus reus is one of those failure to do something actus reus, a failure to adequately set up your site to screen out children when there may not be good enough age verification to do that. It’s a failure to act actus reus, which exist in the Criminal Code. They are a little complicated. And the mens rea I think is well below criminal mens rea despite a criminal sanction, which would be my first attack on this bill if I’m representing someone.

The Chair: Thank you very much, Mr. Hurley.

Professor, I had a question for you but we have run out of time so I’m forced to not ask my next question.

I want to thank both of you. We have found this extremely interesting, as you have seen from the question. You both have worked hard to educate us. Thank you to both of you. We look forward to working with you in the future.

Senators, next week we have clause by clause. This is the final day of study of the bill. Thank you very much to all of you for being here.

(The committee adjourned.)

Back to top