THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS
EVIDENCE
OTTAWA, Wednesday, April 6, 2022
The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4:16 p.m. [ET] to study Bill S-210, An Act to restrict young persons’ online access to sexually explicit material.
Senator Mobina S. B. Jaffer (Chair) in the chair.
[English]
The Chair: Honourable senators, I am Mobina Jaffer, from British Columbia, and I have the pleasure of chairing this committee. Today, we are conducting a hybrid meeting of the Standing Senate Committee on Legal and Constitutional Affairs.
[Translation]
If you run into any technical difficulty, particularly with the interpretation, please let me or the clerk know, and we will do our best to get the problem resolved.
Now I would like to introduce the members of the committee who are participating in today’s meeting.
[English]
We have Senator Boisvenu, deputy chair; Senator Batters; Senator Campbell; Senator Clement; Senator Cotter; Senator Dalphond; Senator Dupuis; Senator Harder; Senator Pate; Senator White; and Senator Wetston. As a reminder, please signal to the clerk if you have a question, and I will call your name.
Senators, today we continue our study of Bill S-210, An Act to restrict young persons’ online access to sexually explicit material. Today, we are happy to welcome the following people: Kevin Westell, Member, Criminal Justice Section, Canadian Bar Association, and I want to thank Mr. Westell for all his efforts to attend our committee meeting; as an individual, Pierre Trudel, Professor at the Centre for Research in Public Law, Faculty of Law, Université de Montréal; and Colin McKay, Head, Public Policy and Government Relations, Google Canada. Mr. McKay, we appreciated how quickly you agreed to appear, so thank you very much.
I will now ask our witnesses to make opening remarks, to be followed by questions.
Kevin Westell, Member, Criminal Justice Section, Canadian Bar Association: Thank you very much.
I appear on behalf of the Canadian Bar Association’s Criminal Justice Section. The CBA is a national association of 36,000 members, including lawyers, notaries, academics and students across Canada. We believe we bring a unique perspective in the sense that we are neither a defence-side nor a prosecution-side group; our membership encompasses both sides of the justice system. I myself continue to work on both sides of the aisle, as a prosecutor and as defence counsel.
As a section, we support the overall goal of the bill to protect children from serious harm arising from increased accessibility to internet pornography. We acknowledge that the Senate has heard from other experts and that there is a real risk, especially related to prolonged exposure of children to this kind of material. While we generally support the bill and its purpose, we raise a few concerns about its implementation and offer recommendations for improvement.
First, we say that the age limit may be arbitrary. In Bill S-210, a young person means an individual who is under 18 years of age, but the baseline age of consent to sexual activity in Canada is 16, and our code allows children as young as 12 to engage in sexual activity in certain circumstances. We say it is therefore discordant with the code’s overall legislative scheme to permit an older young person to engage in sexual activity while simultaneously making it illegal to distribute material depicting sexual activity to that same young person. The CBA suggests lowering the age in Bill S-210 from 18 to 16 for that reason.
Second, we say the defence outlined in the bill of “legitimate purpose” is overbroad and confusing. Subclause 6(2) of the bill outlines the defence of legitimate purpose. It states:
No organization shall be convicted of an offence under section 5 if the act that is alleged to constitute the offence has a legitimate purpose related to science, medicine, education or the arts.
We suggest this language is confusing and overbroad. After all, the word “legitimate” is really only a synonym for “lawful.” Its use does not help prosecutors or members of the public who are trying to interpret the law to understand what the bill is getting at in that regard.
Also, the list of qualifying descriptors of “science, medicine, education or the arts” covers a very broad swath of human experience. While the CBA section agrees with the overall contention that exposure, and especially prolonged exposure, to pornography is harmful to children, it sees possible practical interpretive issues related to the wording of the section. For example, in many instances, a document can be both artistic and pornographic. If it’s both, determining whether it is also legitimate may be an entirely subjective inquiry. This makes it difficult, almost impossible, to actually apply this law. The interplay leaves the public with little or no notice of which precise conduct is defensible by the act.
Since the word “legitimate” offers no guidance as to the proper application of these provisions and as the qualifying descriptors are broad and engage topics that are both construed broadly and experienced highly subjectively, we believe the defence of legitimate purpose as set out in the bill should be replaced. We recommend amending the bill to make the defence more precise. We suggest the bill should specify that the legitimate-purpose defence could never apply to the transmission for the entertainment or sexual gratification of the viewer of pornographic material depicting sexually abusive behaviour. This would narrow the ambit of the defence to a clearer standard — still not a perfect standard, but clearer than the one we’ve currently got.
Last, one of the desired effects of the bill is to force pornography distributors to implement prescribed age‑verification methods to limit access to sexually explicit material made available for commercial purposes to persons who are at least 18 years of age. The bill’s preamble claims that online age‑verification technology is increasingly sophisticated and now can effectively ascertain the age of users without breaching their privacy rights. The bill, however, contains no specifics on how the government will practically balance privacy and protection. Instead, it mentions the development of regulations for carrying out the purposes and provisions of the bill.
The CBA section is a strong proponent of privacy protection for Canadians and is skeptical that the appropriate balance is possible, absent evidence to the contrary. Age-verification requirements may authorize the Canadian government to either collect or supervise the collection of private and sensitive information.
I can see that I’m coming up against my time limit. Our concern, I’ll say in closing my opening statement, is whether we actually have the evidence that the technology can protect the privacy of Canadians to the extent it would need to, so that innocent members of the public aren’t abused through this legislation rather than protected by it.
Those are my comments in opening. Thank you.
The Chair: Thank you very much, Mr. Westell.
[Translation]
Pierre Trudel, Professor at the Centre for Research in Public Law, Faculty of Law, Université de Montréal, as an individual: Honourable senators, a growing number of countries have recognized the need to update their legislation governing online content, in order to mitigate the proven adverse effects of instant access to content that is known to be harmful.
Bill S-210 is consistent with efforts undertaken elsewhere in the world to deal with situations in which content deemed to have significant potential for harm is shared.
Many organizations, some of which have submitted briefs — which I have read — argue that sexually explicit material poses potential risks that warrant oversight. Laws in a number of Canadian jurisdictions do just that; they already restrict children’s access to this type of material in non-Internet settings. It makes perfect sense that the government would want to introduce measures to restrict that access online as well.
This bill is not about banning sexually explicit material; rather, it is about limiting access to individuals who demonstrate that they are 18 or older. It requires that organizations take reasonable steps to verify the age of individuals who access sexually explicit content. In that sense, Bill S-210 establishes, and explicitly lays out, a duty of care for online platforms, which earn considerable revenue from the distribution of videos and other online content.
The bill excludes sexually explicit material that has a legitimate purpose related to science, medicine, education or the arts. The courts have made clear that this sense of “sexually explicit material” — already provided for in the Criminal Code — does not apply to any representation of casual nudity or intimacy. Therefore, it seems unreasonable to claim that the notion would be so broad as to cover all material containing nudity, as has been argued.
In Sharpe, the Supreme Court of Canada clarified that the expression “explicit sexual activity” refers to intimate sexual activity represented in a graphic and unambiguous fashion, and intended to cause sexual stimulation to those who consume such material. Accordingly, it is my understanding that Bill S-210 relies on those same definitions, through which the Supreme Court of Canada clearly described the material that could be regulated, mainly for the purpose of protecting children.
Bill S-210 applies to commercial distributors of pornography, in other words, “organizations” within the meaning of section 2 of the Criminal Code. That term is also well established and defined in the case law. What really makes Bill S-210 different is that it allows for a dialogue of sorts between companies and the enforcement authority responsible for issuing them notices. The offence that the bill introduces depends on a process whereby the online company is notified that it must take the necessary steps to restrict access to sexually explicit material. Only when companies fail or refuse to take the necessary steps to verify users’ ages will they likely have to respond to the offence, practically speaking.
The bill also makes it possible to block access to content on sites that are in breach of the law and that do not have an age verification system. In Canada, the courts have recognized that online sites or content can be blocked. The Supreme Court of Canada recently declined to hear an appeal against a unanimous Federal Court decision confirming the validity of a site-blocking order to prevent clear infringement of the law, specifically copyright law.
It has therefore been established in Canadian law that the courts can issue orders to block access to illegal sites, provided that the illegal nature of the activity has been demonstrated. What Bill S-210 would make illegal is failing to implement an age verification system for users who access this material.
As the Supreme Court’s decisions show, courts do not take the decision to issue site- or content-blocking orders lightly. Canadians can rest assured that sites or content will be blocked only when a judicial process has clearly demonstrated that the content is in breach of the law.
In conclusion, Bill S-210 would introduce in Canadian law appropriate legal mechanisms ensuring that the conditions that apply to offline access to content deemed harmful to children would also apply to similar content on online platforms. From that standpoint, the bill levels the legal playing field, so to speak, by ensuring that what is prohibited offline is also prohibited online.
[English]
Colin McKay, Head, Public Policy and Government Relations, Google Canada: Thank you for the invitation to speak with you today.
At Google, we are committed to building products that are secure by default, private by design and that put people in control. Search engines like Google Search do not offer direct access to content. Instead, we seek to organize the world’s information and make it universally accessible and useful. We have a range of systems, tools and policies designed to help people discover content from across the web while not surprising them with mature content they haven’t searched for.
Technology has helped kids and teens during the pandemic stay in school through lockdowns and maintain connections with family and friends. As kids and teens spend more time online, parents, educators, child-safety experts and privacy experts, as well as policymakers, are rightly concerned about how to keep them safe. We engage with each of these groups regularly, and we share these concerns.
While our policies don’t allow kids under 13 to create a standard Google account, we have worked hard to design enriching product experiences specifically for them as well as for teens and families. One product, which was developed by our team in Montreal, is a protection called SafeSearch, which helps filter out explicit results when enabled. As well, in recent years we have invested heavily in creating policies to better protect children on our platforms, including policies that govern the ads and commercial content kids see. We created Family Link, which allows parents to create a Google account for their child. We have strict advertisement rules for how and what types of ads can be served to Family Link accounts, which are designed to protect users’ privacy and filter out content not appropriate for kids.
Turning to the subject at hand today, there is an understandable concern about the ease with which some young people access pornographic content online and on social media. We believe that content providers are best positioned to establish the risk associated with their service and put in place the right method to limit access to the service by underage users. Like this committee, some countries are considering and even implementing regulations in this area. As we at Google comply with these regulations, we are looking at ways to develop consistent product experiences and user controls for kids and teens globally.
We begin by applying strong protections to address this issue. The community guidelines that help shape our users’ experience on YouTube prohibit pornography and sexually explicit content on that service. YouTube already gates content that we determine to not be appropriate or that creators tell us is not appropriate for users under 18. We turn SafeSearch on by default for users under 18 to help filter sexually explicit and violent content. Mature ads are not shown to users under 18. As well, users under 18 are not able to download apps classified as 18+ on Google Play.
The legislation under consideration today seeks to create an obligation to verify the age of an individual before they can access online pornography. Having an accurate age for a user can be an important element in providing experiences tailored to their needs. Yet, knowing the accurate age of our users across multiple products and services — while respecting their privacy, as you heard one of our other speakers today emphasize — and ensuring that our services remain accessible is a complex challenge.
When we become aware of a user who may be under the relevant age of consent, we route that user through our underage accounts process. For example, a user may provide a birth date and/or ID that indicates they are below the relevant age of consent. If they are unable to demonstrate the account is controlled by a user over the relevant digital age of consent, they are given the option to add supervision to their account. If they fail to do so, their account is disabled. We disable tens of thousands of accounts each week as part of this process, and I’ll emphasize that this number is global.
I know that some members of the committee have asked about advertising in particular. It’s important that our advertising experience across Google products is useful, informative and, above all, safe for all our users. We have a series of comprehensive Google Ads policies that establish expectations and obligations for businesses and individuals that seek to buy ads with Google. For all users regardless of age, these policies prohibit content related to dangerous products or services, inappropriate content — for example, bullying, hate-group paraphernalia or self-harm — counterfeit goods and the enablement of dishonest behaviour.
Our ads policies also prohibit sexually explicit content under our inappropriate content policy. Under this policy, ads themselves are not permitted to contain any sexually explicit content such as text, image, audio or video. Searches using sexually explicit search terms should not yield any search ads. Similarly, this policy also applies to the destination site referenced in the ad, so we do not permit ads to refer to sexually explicit websites.
We have recently updated our enforcement policies to make violation of the sexually explicit content policy an “egregious violation,” meaning that if we find violations of this policy, we will suspend the advertiser’s Google Ads account upon detection and without prior warning, and they will no longer be able to run Google Ads.
In addition to standard Google Ads policies, we work to ensure the advertising content shown on Google’s products contains extra protection for kids and families.
I have only touched on the issue under discussion and how Google is taking steps to make our products and services safe for our users. An effective solution to the problem we are discussing today will require input from regulators, lawyers, industry bodies, technology providers, child-safety experts and others to address this challenge and to ensure that we build a safer internet for kids and families.
Again, I thank you for the opportunity to speak with you today and look forward to our discussion.
The Chair: Thank you, Mr. McKay.
I have a question for you, Mr. Westell, on age restriction. I see in the brief you sent to our committee — and also at committee today — you suggested the bill be amended to forbid organizations from distributing pornography to young persons under the age of 16 rather than 18, which is the age limit currently stated in the bill. You cite the fact that the age of consent is generally 16 and can be as young as 12 in Canada. Can you provide to the committee a more fulsome response as to why you believe that the age limit should be lessened by two years, please?
Mr. Westell: Thank you very much for the question, Madam Chair.
It’s really a matter of practicality and whether the bill needs to be as restrictive as it is, given that lawmakers have already decided that those under the age of 18 can engage in the kind of sexual conduct that could be featured in the very pornography they are being restricted from viewing. We just see that as discordant with the purpose of the act. If they are of age to engage in sexual behaviour, they should be allowed to view sexually explicit material. Our section doesn’t see a reason why that restriction should be put in place. I’m not disputing that for a 16-year-old, just as for a 20-year-old, prolonged and unhealthy exposure to or use of pornography is not a good thing and can lead to real problems with addiction, but we say that it’s simply unnecessary to impose that restriction on those aged 16 and over, given that they have been made, by legislation, valid sexual actors.
The Chair: Thank you very much.
We will now go to questions from the sponsor. Senators, I remind you that you have four minutes to ask questions.
[Translation]
Senator Miville-Dechêne: My question ties in with Senator Jaffer’s. It’s about age. Mr. Westell, I’m trying to understand the bar association’s recommendation because it doesn’t appear anywhere else, not in provincial legislation or in the legislative regimes of other countries. The age for sexual consent across all the provinces is 16, but the age to watch pornographic material offline is 18.
In Germany, for example, the age of consent is 14, and the age at which someone is allowed to watch pornography is 18. It’s similar in France, where the age of consent is 15 and the age at which a person can watch pornography is 18.
That would make Canada the only country where individuals under 18, so 16- and 17-year-olds, could legally access pornography, which is increasingly hard core, as you know. Sexual consent involves the young person’s body, so their power over their body, a bit like consent for medical treatment for minors, whereas the consumption of pornographic material involves an industry and a company. Those are two different things. What exactly is your recommendation based on, since it would make us the only ones in the world to have such a law?
[English]
Mr. Westell: I can’t respond other than to say that, in our view, it’s common sense that if we’re entrusting people as young as 16 to make those big decisions about consenting to the use of their bodies for sexual congress, then we should be trusting them to look at sexual material as well. If they are being entrusted by law to make decisions about engaging in sexual conduct, we’re having difficulty seeing the link in terms of why they would be entrusted to do such a dramatically important and impactful thing around sexual activity yet not be trusted to regulate themselves appropriately in relation to the viewing of content. I have struggled with the logical link there.
I take your point that other countries have not done what we’re suggesting. It’s a suggestion. We’re not necessarily saying that it’s the end of the world that there is that discordant difference in ages, but it is something we thought worth bringing to the table as a valid criticism.
Senator Miville-Dechêne: Just a short question: Isn’t it because more and more research shows that online porn can be harmful, especially to young people, and studies show that there are more and more links between a lot of porn being watched and aggressive behaviour among boys?
Mr. Westell: I take your point about those studies and what they say. We say that it’s also not the case that everyone who views pornography is addicted to it or necessarily reacting to it in the way that studies have shown. I would like to see whether those studies also show that, in the age bracket between 16 and 18, that’s really where the problem lies. We take your point, but we think that we have a valuable counter-perspective on that point and we have set it out as best we can.
Senator Miville-Dechêne: Thank you.
Senator White: Thank you very much.
I was going to ask some questions again around the ages. I have spent some time looking at the age of consent. Really, what you’re doing is drawing on the federal Department of Justice’s position on the age of consent.
Mr. Westell: Yes.
Senator White: Would you agree that if we kept it at 18, we would be unsuccessful in a court challenge? Is that what you’re suggesting, or have you not given it that much consideration?
Mr. Westell: I’m not here to say that there would necessarily be any given result with respect to a court challenge, but when these sorts of misalignments appear in the law, you can see situations where you create vulnerability for court challenges.
We wanted to bring this issue to the panel, not that we’re pounding the table and saying, “Oh, my God, 16-year-olds need to be able to look at pornography unrestricted.” That’s not the intent here. We wanted to highlight this issue and have the committee consider this issue and the question, taking into account all the other studies and results and all the other concerns the CBA shares with the committee and other witnesses about the harmfulness of pornography, of whether or not it makes sense to say, “We trust you to have sex, but we don’t trust you to look at sex.”
Senator White: Thank you very much for that.
When I review federal Department of Justice documents, they talk about “close in age” exceptions in some other areas as well. Is there anything else that concerns you about the age 18 — or even if we dropped it to 16 — that there could be arguments made around “close in age” exceptions? Or do you feel that if we were in line with age 16, as the federal government is, we would be on a safer path?
Mr. Westell: I’m not of the view that there is a need to lower the age any lower than 16 or that it would be appropriate to. The “close in age” exceptions are just meant to acknowledge the reality that children of a very young age, whether we like it or not, will engage in sexual activity with each other in some circumstances. We don’t think that somebody as young as 12, 13, 14 or 15 should be unrestrictedly watching pornography on their own, freely.
We say that once you’re 16 and you’re out of those “close in age” exceptions, for the most part, you’re getting closer to engaging in an adult type of sexual activity. The reality is that we’re entrusting them with a lot of responsibility around how to conduct themselves around their own bodies at that age. There is a legitimate question as to whether a mature 16-year-old has the ability to regulate themselves appropriately around how much pornography they consume.
Senator White: What it would mean, then, is that a 16‑year‑old could watch pornography. As long as the performers weren’t between the ages of 16 and 18, it would be pornography, but if they watched somebody between the ages of 16 and 18, it would be child pornography?
Mr. Westell: That’s true. We’re not meaning to pull in anything to do with child pornography. We don’t think anyone should be watching child pornography, regardless of whether it is a 16-year-old watching another 16-year-old. There is a danger in allowing this; and on a practical level, there is an inability to police. What is the age of the person you’re watching? If you’re 16, do you get an exception? That’s not possible, practically speaking.
We think that also engages the question about whether a 16‑year-old should be allowed to make the decision to perform in pornography. We’re not going anywhere near that. We’re not of the view that that should be allowed, not in any way, shape or form.
Senator White: Thank you very much, Mr. Westell, for your responses.
[Translation]
Senator Boisvenu: Welcome to the witnesses. My first question is for Kevin Westell. In your brief, you expressed concerns about the protection of Canadians’ privacy and the risk that the government could collect private information. This is what you said:
Age-verification requirements may authorize the Canadian government to either collect or supervise the collection of private and sensitive information.
Can you explain how the government could collect that information and what the consequences would be?
[English]
Mr. Westell: Thank you.
The answer to the first part is that we don’t know how the government would end up in possession of particular information, other than the fact that if they are the regulators and members of the public are clicking or linking through their IP address to verification steps about what age they are, there is a natural link there. People are watching and regulating this. There is a question as to whether the government all of a sudden has the ability to review and understand what Canadian citizens are watching — what type of pornography they watch and what type of pornography they don’t watch. That information may be collected, whether that’s the purpose of the government or not.
The concern is if that information is breached or hacked or falls into the wrong hands, then there’s a repository of highly sensitive and reputation-damaging information that’s in the government’s hands and being collected on purpose or by accident through these verification steps through the technology. We don’t pretend to know exactly how they intend to execute this, but our concern is with the right to be forgotten and the right to privacy and what steps are going to be taken. When the government takes steps to know who is watching or visiting what pornographic sites, how is that information being protected?
[Translation]
Senator Boisvenu: I understand what you’re saying, but in your brief, you point to a risk. What I want to know is how the government could address that risk.
[English]
Mr. Westell: They can deal with the risk by being incredibly transparent with Canadians about exactly what steps are being taken. What is the technology being used? What is the security being used? How is this information being stored? What information is being stored? What information isn’t being stored? It’s incredibly important that the government, if they are going to engage in this level of information verification and, therefore, gathering, be incredibly transparent about how the data is being collected, where it goes and at what point it becomes erased or deleted. It starts with transparency.
I don’t pretend to have a technical background and to know the ins and outs of how the technology would work, but that goes to my point that it would have to be translated in a way that the average Canadian can understand what the consequences are of clicking on that age verification button and what the consequences are for information. How much information falls into the hands of the Canadian government when that box is clicked? Where does that information go? That’s my answer.
[Translation]
Senator Boisvenu: Thank you, Mr. Westell.
[English]
Senator Dalphond: Thank you to all the witnesses who are appearing before us today.
My question is to Mr. McKay. I have two questions. First, do you have any data about the use of VPNs by users who are consuming or viewing pornographic material, especially young people? Second, do you consider that you are one of the potentially targeted organizations pursuant to the definition of “organization” and the definition of the offence? Do you consider that Google might be targeted as providing or facilitating provision of pornographic material?
Mr. McKay: Thank you, senator.
To your first question, I’m afraid we don’t have data around VPN use. That is more likely something that a user’s internet service provider would be able to provide some information on.
Second, we recognize that there is an ambiguity in the definition and that there needs to be a conversation about what the intent and expectation of Parliament are in this legislation. Our interpretation is that the obligation, as it stands, is for age verification by the provider of the online pornography, which is the site, and that they should be enforcing any local as well as federal laws. We would expect that while we have an interest in this issue, and we’re appearing today to describe to you how we take steps to create a safe environment for our users, we, in fact, would simply be operating in a number of different ways but primarily as a search engine.
Senator Dalphond: My next question will be for Mr. Westell of the Canadian Bar Association. I understand the age restriction proposal. You have also expressed privacy concerns, but I’m more interested in the defence. Am I right to understand that you propose that we redraft clause 6(2) — which reads: “ . . . if the act that is alleged to constitute the offence . . . related to science, medicine, education or the arts.” — and add the sentence that you have proposed?
Mr. Westell: I have no issue with the section as it currently reads, plus an addition. It’s not always the case in legislation, but occasionally there are these clarifying clauses that just, in my mind, serve to further sharpen the ambit and the focus of the legislation. We think that would be a helpful addition. I understand that someone might respond and say, “That should be obvious.” I don’t think it always is obvious, and it’s certainly not obvious enough to ward off court challenges that we would prefer not happen. By adding a small clause of that type, we say that we can be clear and intentional in the legislation and focus on what harm this is aimed at and what it isn’t aimed at.
Senator Dalphond: Thank you.
Senator Pate: Thank you to all of the witnesses for appearing.
My question is for Mr. McKay. While you were speaking, I thought I would google a sex act and see what comes up. The first link that came up was Pornhub. Would that mean that they are advertising or gaining some benefit somehow by having themselves come up as number one from that search?
Mr. McKay: That is simply a result that was delivered as a product of your search and your expressing interest in those terms. Those results are delivered if you're not identified as an underage individual or as someone who is a member of a family group on Google.
As I began to describe, there are many more details. There are environments that we provide for both adults as well as for users we’ve identified as underage that do not produce results when searching for terms like that. But that was not a sponsored result, and that was not a result that derived any revenue for Google.
Senator Pate: One of the things that would be useful to know: there was a prompt you can have for some sort of safety measure; I forget the term you used. That did pop up, but there was no question of what my age was, or even who I was. I went through just the general site.
You’re saying it doesn’t mean they are an advertiser. It doesn’t mean they get a sponsored position just because they come up first. Do you have a list of advertisers, the fees you impose and the policies around this? How would this be impacted by this legislation if it passed?
Second, you mentioned you have content reviewers. Could you please provide the training materials to this committee, as well as the list of advertisers and fee structure and the policies that are implemented to ensure that appropriate content is on the sites?
Mr. McKay: I could certainly provide the committee with information about our advertising policies, especially the ones relevant to sexually explicit material, as well as the guidelines and the policies that are applied in circumstances like this.
In this case, just from what you're describing to me, this was not content that was reviewed, other than our simply understanding, from the limited information we have, who you are and what you're looking for, and then delivering a relevant result. In this sort of inquiry, we would expect that the site you click through to is the one that verifies your age before allowing you access to its content. In this case, the site being Pornhub, I assume it's online pornography.
Senator Pate: What it says is "Some results may be explicit. Turn SafeSearch on to hide explicit results," the presumption being that it's up to the person doing the Google search to take those steps. With this legislation, you would presumably have to do that age verification.
You mentioned that you would be able to provide the policies. That’s great. Could you also provide the list of advertisers, the fees and how you determine when someone has breached the policies, as well as the content training material?
Mr. McKay: We don’t accept sexually explicit advertising, so I can respond in writing with detail about that policy.
Senator Pate: Okay. Do you have training for your content checkers?
Mr. McKay: In this case, when we’re speaking about the advertising and the search results, it’s the application of our policies and guidelines. There is no content review of the result that’s being given to you other than an automated verification that it’s meeting the expectations included in the search terms you provided to us.
Senator Pate: Okay, so whatever policies you have that provide the review of the content and the policy to protect children would be extremely helpful. Thank you.
Mr. McKay: I’ll be happy to follow up with that.
[Translation]
Senator Carignan: My question is for the Google representative. To be fully transparent, I should say that I sent the question to the witness in writing before the meeting — something I very seldom do — and I wanted to be transparent about that.
Here’s my question. How much ad revenue does Google generate when people conduct Google image searches using “XXX” and other pornography-related terms? Does Google compensate producers of explicit content at all?
[English]
Mr. McKay: Thank you, senator, for your questions, and thank you for submitting them in advance.
As I said, our explicit content and sexual content policies mean that we don't sell advertising and we don't generate revenue, especially on image search. If you are running a search on Google Images for specific terms and you're seeing results, that is simply an index of relevant results based on the information you provided and what we understand about you, and one of those elements is an inference about your age and a verification that you are not, in fact, identified as underage in your Google account.
To your second point, no, we do not provide remuneration to producers of explicit content, because we are trying to design an environment online through our services where, one, users are not surprised by the results they find, and two, they have an environment that they feel safe within. There’s an active conversation in Parliament that began yesterday about remuneration for a specific element of content, but certainly in this case there’s no transaction.
[Translation]
Senator Carignan: Can you explain how Google benefits by allowing explicit images to be produced or distributed if it doesn’t bring in any revenue for Google? You’re not a non-profit organization, after all.
[English]
Mr. McKay: You have to look backwards to our foundation and the creation of Google Search, which was an effort to apply technology to make information more universally accessible and informative. Our mandate since those first pieces of research and development of technology is to provide our users with access to the information that’s available online.
There are limiting factors to that information, certainly, if information and content are illegal and explicitly prohibited under Canada’s Criminal Code and other legislation around the world. Within those conditions, we do restrict access to some material, but otherwise we are providing information that we believe is the most relevant and also the most informative on the questions that you pose through our various services. Depending on your search terms, you may receive some very specific and very relevant results from a variety of sources where we don’t actually make any money, because for us, across all these products and services, we are trying to deliver a service that meets those initial goals with our users, and in many, many cases, we are not generating that revenue. Thank you.
Senator Wetston: There is a lot of activity in Europe and the United States dealing with big tech — I think, Mr. McKay, you would obviously agree to that — and just to make this question a bit more targeted, the Digital Markets Act in the EU, the Digital Services Act, the General Data Protection Regulation, or GDPR, in Europe. How would any of that legislation affect your opinion or view with respect to access to pornography by young adults, and does that legislation in any way impact that view?
Mr. McKay: Thank you for the question, senator.
As a baseline, when any jurisdiction around the world passes legislation to frame a conversation online or to shape how a community interacts online, we both meet our obligations and seek to apply technology that meets them as effectively as possible, respecting the intent of the legislators as well as the community.
In the case of legislation — and you described a few different frameworks with different purposes in Europe — the impact it would have in Canada is that it often is world-leading in terms of scope and application. The challenge it presents for a company like Google, or any other technology company, is how do we apply the tools we have at hand to meet the expectations of the legislators and the community in question? And then, importantly, are we continuing to respect and meet the obligations and requirements of the jurisdiction within which we're operating? So does meeting the expectations in Europe create a conflict in Canada?
On the issue at hand today, what that means, particularly in the conversation that members of the committee have been having with Mr. Westell around age verification, is what mechanisms are there to verify the age of users in the least privacy-intrusive way, and how can we proceed with age verification while respecting not only the privacy expectations of our users but also the privacy obligations set by regulators around the world, whether in Europe or Canada? So, by extension, the deliberation and then passage of legislation in a number of different areas in other jurisdictions leads to technological improvement, which in turn improves the experience for Canadians.
I will touch on an example that came up only briefly earlier today, child sexual abuse imagery, where there is consistent global agreement on its reprehensibility and on the necessity of not enabling the distribution and creation of child sexual abuse imagery. This is an example where technology is applied on a global basis, both in terms of image identification and in terms of intervention when someone demonstrates an interest in such imagery.
I hope that answers your question.
Senator Wetston: I think that’s very helpful and quite enlightening from the perspective of understanding the implications, because as you know, Europe is much further ahead than Canada with respect to addressing these issues. You may not like what they are doing, but they are much further ahead. I mean, we don’t even have our privacy legislation. It’s been amended. We’ve been talking about it for quite a while, but we still haven’t done that. It’s a very important component of dealing with advancements in digital technology. Europe has had the GDPR for a number of years now, a couple of years, anyway.
I think your comments are very interesting, and I guess I would just add one more. We would benefit from these technological advances, with respect to this area at least, because Europe is likely to do it or is doing it, and as a result of the global nature of this business, we could benefit from that, even in the event that this bill is not passed. Are you more or less saying that?
Mr. McKay: In fact, I am saying that it does happen. Canadians are protected by what we believe is the strongest data privacy regime for users of an online service through Google, not only because of the leading work of Canadian privacy experts and legislators 20 years ago, but because of the follow-up work by privacy advocates and legislators around the world, culminating in Europe with the production of the GDPR.
In the specific case of the legislation under conversation today, the research that has had to be conducted into age verification, in part in advance of Europe's own moves in this area, is reflected in how we and other companies attempt to identify whether someone is under or over 18 through some very simple signals. As Mr. Westell identified, there are some very specific requirements around personally identifiable information, and those requirements create a series of obligations for not just governments but companies when that information is collected. Absent an explicit need and an explicit framework under which to collect that, we try to use technology to address the policy challenge and then the legislative obligation, recognizing that it's absolutely imperative to meet that legislative or legal obligation once it's passed.
Senator Wetston: Thank you, Mr. McKay.
[Translation]
Senator Dupuis: I thank the witnesses for joining us. I have a question for you, Mr. McKay, first, and then another one for Professor Trudel.
Mr. McKay, I did the exercise Senator Pate did earlier. I have never done a Google search using the term "pornographic site." I searched in French, so the response may be different depending on whether French or English is used, and I understand that. I looked at the top of the page and found SafeSearch. I had to look for it. I tried to update it, and it got me nowhere. Then I clicked on "close," and I got 13.5 million results in 35 seconds. I can have access to a whole range of pornographic sites.
This is not a question I am putting to you. I wondered whether it was because two senators did the search that we all of a sudden got an array of porn websites appearing on the screen. You said something interesting earlier, when you said that it was the ones who input content, so content providers, who should have verification mechanisms — and I am not talking about age verification, but about effective barriers. What effective barriers could be implemented on sites that could indirectly protect you from prosecution, on the grounds that you come under the definition of this bill or of another bill? What kind of effective barriers should be required of content providers?
[English]
Mr. McKay: Thank you. I admit I'm having difficulty making the transition between your description of your experience searching for a term and finding the results, and then what a barrier may be — whether you mean there should be a barrier to your seeing the result, or a barrier to a clear understanding of what is in the result before you get to that site.
[Translation]
Senator Dupuis: I don’t want to prevent you from answering, but what you told us is that content providers should be putting up barriers to block access for children. So my question to you is this: what do you think effective barriers would be?
[English]
Mr. McKay: Well, I think at the moment the obligation is age verification. Not being an expert in this field, I understand that some sites do ask for some signal of compliance with local obligations, and I understand many don't. The basic requirement is that they undertake that verification before any of the content is served when you arrive at the site.
[Translation]
Senator Dupuis: So you don’t have other barriers aside from this mechanism to suggest to us. Is that right?
[English]
Mr. McKay: That’s the most immediate and the most practical one that addresses the challenge you are identifying.
[Translation]
Senator Dupuis: Thank you. Professor Trudel, the bill talks about the age of 18 years. Have you perhaps looked into the issue of emancipated minors? On the issue of emancipated minors, the Civil Code of Quebec provides that, at 16 years old, depending on whether the child is partially or fully emancipated, they can engage in all legal acts. Have you had an opportunity to look into this issue in terms of the definition of age, in the bill, which is set to 18?
Mr. Trudel: No, I have honestly not considered that issue. I have not evaluated the impact of that notion of an emancipated minor.
Senator Dupuis: If I asked you to consider it now, would you have any thoughts on it off the top of your head?
I understand that you cannot anticipate every question, but I really wondered about that. Do you have any insight that could help us?
Mr. Trudel: I think the fact that legislators are imposing an age limit to access material can address all kinds of considerations. For example, it is not because a minor can engage in sexual activities from a very young age that it automatically means they should have the right to access any sexually explicit material.
My understanding of the logic underlying the bill is that it presupposes a choice, that there are more inconveniences than benefits to setting the age for accessing that material to 16 instead of 18. But, at the end of the day, there is always some choice that will generally stem from a risk analysis.
The purpose of legislation like this one is to help fight against the development of behaviours that lead to sexual violence and to behaviours that can end up being very damaging over the long term. So preventing people under the age of 18 from accessing content doesn’t seem contradictory to me if we acknowledge that, in their personal life, people under the age of 18 can engage in sexual activities.
Senator Dupuis: I understand very well. I wanted to see whether there could be an issue with the selected criterion because the age would be set and an attempt would be made to group together everyone under the age of 18, which may not be possible because of a technical element such as the fact that someone is an emancipated minor. That is also an issue. This is not at all a matter of the bill’s objective, but rather a matter of technical issues, as there are generally always technical elements like this one in bills.
Thank you.
Mr. Trudel: Thank you.
Senator Clement: I thank all of our witnesses.
My question is for Professor Trudel, and I will have another one for Mr. McKay. Mr. Trudel, technology is clearly advancing at a rapid rate, as are ways to circumvent technology, especially by young people. So can this piece of legislation remain useful in that context?
Mr. Trudel: I think that any law can be circumvented and, in my opinion, that is not an appropriate criterion. We cannot use the fact that a piece of legislation can be circumvented to say that it has no purpose. Otherwise, the entire Criminal Code should be abolished because, unfortunately, many of its provisions are circumvented or ignored in Canada on a daily basis. So I think the real issue is about ensuring that laws are updated to promote the deployment of technologies such as age verification.
In fact, the best incentive for developing those technologies is precisely the implementation of an obligation like the one proposed in the bill. We can assume that, when that type of obligation is imposed, it encourages industry to develop and refine technology. So it helps reduce the risk of people circumventing prohibitions.
However, it would be unrealistic to assume that this legislation, like any other, will never be circumvented. All legislation, especially in an environment like the internet, can be circumvented. I think that is not the right criterion because we may think it is important, or at least do what we can, to limit access to material considered harmful, while knowing there will be some degree of circumvention. Noting that degree of circumvention also helps determine the type of non-intrusive technologies that protect privacy that could also be developed to limit potential circumvention activities that could occur once a piece of legislation like this one comes into force.
I would add, to echo some of the other comments, that this legislation must also apply in the context where legislation on privacy protection exists. So there are already a number of mechanisms under the Privacy Commissioner’s responsibility that will necessarily have to work in conjunction with the development of technologies that could be required under the regulatory power provided for in the act.
So, according to my understanding of the bill, it should be implemented by collaborating with the Privacy Commissioner, but also by encouraging industry to do what needs to be done in research and development to engage in the process of continuously developing those technologies.
Senator Clement: That’s the issue, collaboration.
Mr. Trudel: If we really want to ensure that Canadians’ rights are respected on the internet, I don’t think we have a choice. We have to promote greater collaboration between the worlds — those of technology developers and legislators.
[English]
Senator Clement: Thank you.
Mr. McKay, we heard some testimony about how porn sites will resist this kind of legislation because, if they have to put in age verification, then they drop down in terms of the ranking that a search engine will attribute to them. Does that make sense?
Mr. McKay: I’m not certain about the example you’re describing and the particular circumstance, but I’ll respond in a very general way. Thousands of different signals are assessed when we deliver a result to someone as a product of their search. We look at many different things, including the relevance of the site to the terms entered. But we also look at whether or not other users click through to that result, which is a signal that that site is relevant to what they asked, especially if they return to it. If there’s anything in the way — for an average site, it could be bad graphic design or poor content — and we see that people do not return once they click on that result, then we see that as a signal that it’s not the best result for those search terms and for that query. I could see that applying in this circumstance.
Senator Clement: Right. So if there’s an age-verification process, that will create a bounce rate. People will bounce away from sites with age verification to sites without it.
Mr. McKay: Understanding what I do about human behaviour, I would assume so, yes.
Senator Clement: Is there any way for a search engine, like Google, to evaluate things differently? I mean, porn sites are saying, “We’re not going to put in age verification unless everybody else does.” Do you know what I mean? That’s how they resist this. If Google were to say, “Well, we’re going to rate you more highly if you have age verification,” wouldn’t that help?
Mr. McKay: I think it’s better to look at this in the context of what the committee and Parliament are trying to do, which is to create legal boundaries for this behaviour. If there are legal boundaries for this behaviour, like any other behaviour, that, one, creates an obligation on those sites but, then, two, creates an expectation around the results that are delivered to individuals within that jurisdiction of which that demand is being made.
From our perspective, collecting that data about bounce rates isn’t as relevant as a legal system that identifies websites that are not complying with the existing law and then seeks enforcement, whether from internet service providers or from the websites themselves, or within the framework of copyright enforcement, as Professor Trudel highlighted, creating a formalized process that intervenes with companies like ours and others.
Senator Clement: What is your opinion of this law?
The Chair: Senator Clement, I really let you go on over seven minutes.
Senator Clement: That’s fine. Thank you.
[Translation]
Senator Miville-Dechêne: I would like to put a brief question to Mr. Westell.
[English]
You spoke about a government registry when you talked about private data. That’s what I heard. Is there any place in Bill S-210 where you see mention of a government registry?
Mr. Westell: The concern is we don’t know exactly how the regime will work and how the regulations will be enforced. My understanding is that a government agency was going to be established to oversee how age-verification processes take place and whether or not they abide by the regulations and law being set out here. My concern is that the government will be closely watching and potentially involved in the collection, at arm’s length, of the information from these age-verification procedures.
Senator Miville-Dechêne: Okay, because there’s no government registry. Obviously, as you probably saw, the regulations will deal with all of that in respect of the laws that Mr. Trudel was talking about. That’s my question. Thank you.
Senator Dalphond: My question is for Mr. McKay. Do I gather from some of your answers that Google is defining a profile of everyone who does a search on the internet? So if he’s looking for teenager music, he’s probably younger than me. Maybe he’s looking for expensive restaurants or for hotels. With those kinds of things, you are able to define a kind of user profile? If that’s the case, can you then have an idea about the age of the user?
Mr. McKay: There are several different circumstances under which we understand individual pieces of information about a user.
In the broadest case, if you’re a regular Google user, we have an understanding from what you’ve searched, from the sites you visited and other pieces of information that you share using our products and services, about your general age range. We make an estimation about other aspects of what you’re interested in. We use that to inform our response to your queries, using whichever product. It’s very rare that we have specific information about an individual that allows us to actually pinpoint. In fact, our entire framework is built around being able to provide the most accurate and relevant results without, in fact, having a comprehensive understanding of you as an individual.
Importantly, whether or not you have a Google account, you also have the ability, by going to myaccount.google.com, to see what we understand about you based on your use of our products and services. When I say “our products and services,” we also identify exactly what that means: You can see, well, they understand where I’ve been travelling and where I’ve been asking for directions, things like that.
Senator Dalphond: Let’s say somebody is using the internet at home. There might be three or four users on the line, so you have different profiles. Is it possible for you to identify that, at this time of the day it’s somebody who is young, looking at young music and pop stars that are popular with teenagers, and so to have a profile that indicates that this person is probably not an adult?
Mr. McKay: No. If we’re describing an environment where everyone in that house is using our products and they are not signed into a Google account and they don’t have a level of identification that allows us to distinguish between the users, then we’re going to see a mixture of signals and data about all of the people in that house, and you may be receiving information about contemporary pop stars as well as your children.
I mentioned in my comments a tool called Family Link. Family Link is an explicit effort on our part to identify the situation you’re describing, where you have a household with parents, guardians or adults who want to provide some level of control and understanding of what their young family members are doing online. It creates more restricted activity and gives them the tools to restrict that activity even further.
Senator Dalphond: Thank you.
[Translation]
Senator Dupuis: My question is for Colin McKay. I understand the differences in technologies, but we know full well that credit cards track all transactions made. That has existed for decades, and we are used to it. When our credit card is used fraudulently, we are notified that one of our transactions is not in line with our consumer profile, and we are asked to confirm whether we made the transaction. So, our banking transactions have been tracked through credit cards for a long time.
What has changed at Google that enables you to be more accurate when you track me, as a senator, today?
[English]
Mr. McKay: Thank you for your question.
What you’re describing is actually two very different environments, because your bank and your credit card issuer know very specific information about you based on your application. They also know very specific information about you —
[Technical difficulties]
The Chair: Senator Dupuis, I think we have lost the witness.
[Translation]
Senator Dupuis: I think we have lost the connection with the witness. The screen is frozen.
In that case, Madam Chair, could we ask him to send us the answer to this question in writing? I don’t want to delay our work needlessly.
The Chair: Yes.
Senator Dupuis: Thank you very much.
The Chair: The clerk will write to him.
[English]
Mr. Westell, Professor Trudel and Mr. McKay, thank you very much. We learned a lot from you. You can see there is so much interest, and we could ask you many more questions. Thank you very much for being here.
Senators, we will continue in camera.
(The committee continued in camera.)