THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS
EVIDENCE
OTTAWA, Wednesday, March 2, 2022
The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4:18 p.m. [ET] to study Bill S-210, An Act to restrict young persons’ online access to sexually explicit material.
Senator Mobina S. B. Jaffer (Chair) in the chair.
[English]
The Chair: Honourable senators, I’m Mobina Jaffer, senator from British Columbia, and I have the pleasure of chairing this committee. Today we are conducting a hybrid meeting of the Standing Senate Committee on Legal and Constitutional Affairs.
[Translation]
If you experience any technical difficulties, with the interpretation in particular, please inform me or the clerk and we will do our best to fix the problem.
[English]
A reminder to please signal the clerk only if you do not have a question. Otherwise, all members are on my list for questions.
[Translation]
Now I will take a few moments to introduce the committee members taking part in today’s meeting.
[English]
We have Senator Boisvenu, Senator Campbell, Senator Carignan, Senator Clement, Senator Cotter, Senator Dalphond, Senator Harder, Senator Dupuis, Senator Pate, Senator Wetston and Senator White.
Senators, today we continue our study of Bill S-210, An Act to restrict young persons’ online access to sexually explicit material.
We are very happy today to welcome the RCMP and the CRTC. From the RCMP, we have André Boileau, Officer in Charge, National Child Exploitation Crime Centre; and from the CRTC, we welcome Scott Hutton, Chief of Consumer, Research and Communications, and Peter McCallum, Legal Counsel-Consultant.
May we hear from you, Mr. Boileau? Can you start with your presentation, please.
André Boileau, Officer in Charge, National Child Exploitation Crime Centre, Royal Canadian Mounted Police: Good afternoon, Madam Chair and honourable members of the committee. Thank you for the opportunity to speak with you today. The safety of our children is a very important issue for all of us. As children and youth are spending more time online, it is important to raise awareness of the risks that they may encounter, and to take preventative and collective action to keep them safer on the internet.
To begin, I would like to offer some context as to our role within child protection. In 2004, the Government of Canada announced the National Strategy for the Protection of Children from Sexual Exploitation on the Internet, which brings the RCMP, Public Safety Canada, the Department of Justice and the Canadian Centre for Child Protection together to provide a comprehensive, coordinated approach to enhancing the protection of children from online child sexual exploitation. The Canadian Centre for Child Protection is a non-governmental organization that operates cybertip.ca, Canada’s tip line to report suspected online sexual exploitation of children.
[Translation]
The RCMP’s National Child Exploitation Crime Centre is the national law enforcement arm of the national strategy, and functions as the central point of contact for investigations related to the online sexual exploitation of children in Canada and international investigations involving Canadian victims, Canadian offenders or Canadian companies hosting child sexual exploitation material.
The centre investigates online child sexual exploitation and provides a number of critical services to law enforcement agencies, including immediately responding to a child at risk, coordinating investigative files with police of jurisdiction across Canada and internationally, identifying and safeguarding victims, conducting specialized investigations, gathering, analyzing and generating intelligence in support of operations, engaging in operational research, and developing and implementing technical solutions.
The centre is also mandated to investigate transnational child sexual offences. The centre has seen a dramatic increase in reports of online child sexual exploitation in recent years. In fiscal year 2020-21, the centre received 52,306 complaints, reports and requests for assistance related to online child sexual exploitation. This is a 510% increase compared with the number of reports received in 2013-14.
The majority of our referrals come from the National Center for Missing and Exploited Children in the United States, and every report is assessed. Reports that are deemed actionable are forwarded to the police of jurisdiction for further investigation.
[English]
In addition to high volumes of reports, online child sexual exploitation cases have become increasingly complex. Technological advancements, such as encryption, the dark web and anonymity tools, have made it significantly easier for offenders to operate undetected by law enforcement.
Like many cybercrimes, online child sexual exploitation is often multi-jurisdictional or multinational, affecting victims across jurisdictions and creating additional complexities for law enforcement. No single government or organization can address this crime alone. The RCMP works diligently with its partners at the municipal, provincial and federal levels in Canada and internationally, as well as with non-governmental organizations, to strengthen efforts to identify and remove victims from abuse and to bring offenders to justice. The RCMP is a member of the Virtual Global Taskforce, an international police alliance dedicated to the protection of children from online sexual exploitation and transnational child sex offences. The VGT consists of law enforcement, NGOs and industry partners working collaboratively to find effective response strategies.
The RCMP also works closely with the private sector, as offenders regularly utilize platforms operated by internet and communication service providers to carry out a range of Criminal Code offences relating to online child sexual exploitation.
[Translation]
Sexual offences committed against children are among the most deplorable of all crimes. Not only are children victimized through sexual abuse, but they are also often re-victimized throughout their lives, as photos, videos and/or stories of their abuse are shared repeatedly on the internet amongst offenders.
The Criminal Code provides a comprehensive range of offences relating to online child sexual exploitation. Canadian police services, including the RCMP, are responsible for investigating these offences when there is a possible link to Canada. The Criminal Code also authorizes courts to order the removal of specific material, for example, a voyeuristic recording, an intimate image and child pornography that is stored on and made available through a computer system in Canada.
We are very supportive of the legislation currently in place that serves to protect children from offences related to online child sexual exploitation, and we advocate the continued strengthening of these laws.
Over the years, we have observed a number of amendments and additional offences added to the Criminal Code of Canada, which have offered law enforcement enhanced measures to bring offenders to justice for the crimes that they commit against children.
[English]
Sexually explicit material, as defined in subsection 171.1(1) of the Criminal Code of Canada, does not include child pornography. However, it can be used in the commission of an offence against a child. Specifically, under subsection 171.1(1) of the Criminal Code of Canada, it is an offence to transmit, make available, distribute or sell sexually explicit material to a child for the purpose of facilitating the commission of designated sexual and exploitative offences against that child.
Within the context of online child sexual exploitation, some offenders provide sexually explicit material to children to normalize the sexual acts depicted and to prepare for the commission of a sexual offence against the child, online or offline. This is referred to as grooming and is often a gradual process that can lead to the sexual exploitation of a child. Charging under subsection 171.1(1) of the Criminal Code of Canada is an avenue that police can pursue when sexually explicit content is used within the context of committing an offence against a child. The National Child Exploitation Crime Centre does not investigate matters related to sexually explicit material beyond the scope of online child sexual exploitation. I highlight, again, that it is an offence to transmit, make available, distribute or sell sexually explicit material to a child for the purpose of facilitating the commission of designated sexual and exploitative offences against a child. This is not the case in the context of a child coming across adult pornography online.
[Translation]
We support measures in place that serve to protect children online and prevent their victimization, and continued efforts to strengthen these safeguards. We all have an important role to play in child protection, and we must take a holistic approach to keep our young people safer online.
The National Child Exploitation Crime Centre is committed to working together with stakeholders to enhance Canada’s ability to protect children from online child sexual exploitation and transnational child sex offences.
Thank you for inviting the National Child Exploitation Crime Centre here today, and I would be pleased to answer your questions.
[English]
The Chair: Thank you, officer.
We will now go to Scott Hutton from the CRTC.
Scott Hutton, Chief of Consumer, Research and Communications, Canadian Radio-television and Telecommunications Commission: Thank you, Madam Chair, for inviting us to appear before your committee. I am joined today by my colleague, Peter McCallum, a Legal Counsel-Consultant for the CRTC.
We are aware of your study of Bill S-210, which proposes to restrict young people’s online access to sexually explicit material.
The internet has exponentially increased access to all kinds of content, including sexually explicit material. We acknowledge and share the concerns about the adverse effects and negative social impacts that exposure to pornography can have on youth and adolescents. This is a global issue that, in our view, requires a comprehensive, whole-of-government approach and many different tools.
There is no simple solution to regulate harmful online content. Many jurisdictions are struggling with this issue. I would note that the Standing Committee on Canadian Heritage recently adopted a motion to conduct a study on the harms caused by online access to sexually explicit material.
Of course, there is no place on the internet for harmful or illegal content. There are provisions in the Criminal Code to address this type of content, and several organizations at the federal level are actively engaged in these files. They include the Department of Public Safety Canada, the Department of Justice, the RCMP’s National Child Exploitation Crime Centre and the Canadian Centre for Child Protection.
[Translation]
At the moment, Canadians can control their children’s access to inappropriate content using filtering software and parental controls.
Bill S-210 would enable designated enforcement authorities to take steps to prevent sexually explicit material from being made available to youth on the internet in Canada. While we support the aims of the proposed legislation, the CRTC does not currently have such authority. Canada’s Telecommunications Act does not clearly provide for the regulation of content with respect to internet service providers, ISPs.
Our legislation is built on the foundational principle of net neutrality. This refers to the concept that all traffic on the internet should be given equal treatment by ISPs. ISPs should not manipulate, discriminate against, or give preference to the content that passes through their networks.
The CRTC was one of the first regulators in the world to implement an approach to uphold net neutrality. We have issued three decisions that, combined, form the current regulatory framework for net neutrality in Canada.
Even if the CRTC were given the power to order ISPs to verify the appropriateness of the content passing through their network, it may not be technically feasible for them to implement an age verification system. In terms of content, the CRTC’s powers were designed with the traditional broadcasting system in mind.
As you are most likely aware, Bill C-11, which is currently being debated in the House, proposes to modernize the Broadcasting Act. If adopted by Parliament, Bill C-11 will empower the CRTC to ensure that online broadcasters contribute to Canadian content and achieve other important public policy objectives.
The legislation would give us the three key elements that we are missing to regulate online platforms: the clarity of jurisdiction, the ability to gather data and the necessary enforcement tools.
[English]
That being said, let me repeat that harmful and illegal online content is a global problem. Various countries are looking at different strategies to prevent minors from accessing this type of content online. For instance, Australia is working to implement a mandated age-verification system, but its approach also recognizes the need for greater education, awareness and understanding of respectful and harmful sexual behaviours among youth. The Australian eSafety Commissioner is consulting the public and stakeholders. One of the key insights of its initial consultation is that a one-size-fits-all technological approach would not be effective.
In 2021, the European Council proposed amendments to the draft Digital Services Act to improve provisions related to the use of age-verification and parental-control tools to mitigate the risk of exposure to harmful content. These provisions would apply to large online platforms and even search engines. Debates are expected to begin soon on the final text of the act.
The European Commission has also funded a project to enable service providers to verify the age of their users, which will be piloted this year by 1,500 children, parents and adults from at least three different countries in the European Union.
Finally, the United Kingdom is in the process of implementing new regulations to ensure that video-sharing platforms implement measures that would protect users from harmful content. These regulations require the platforms to establish age-verification systems, with priority for those providing access to pornography.
[Translation]
Clearly, these international efforts are in the early stages, and it remains to be seen how effective they will be at protecting children. This is certainly a challenging area, as research has found that children are increasingly adept at finding workarounds to age verification systems. To be frank, there is no single organization or single measure that can effectively address this issue.
We believe that it is important to learn from our international counterparts, take the time to evaluate the most effective means to prevent minors from being exposed to harmful online content and develop a whole-of-government approach.
We would be pleased to answer your questions. Thank you.
[English]
The Chair: Thank you very much, Mr. Hutton. We will now go on to the first question, and that will go to the sponsor of the bill, Senator Miville‑Dechêne.
[Translation]
Senator Miville-Dechêne: My question is for Mr. Hutton, from the CRTC. I realize the CRTC doesn’t have the authority to deal with the issues covered by Bill S-210, but I’d like to ask you about two things.
You said that asking ISPs to remove pornographic content that had not been subject to verification would go against net neutrality.
However, if Canadian legislation required pornographic sites to verify a user’s age, it would be illegal to show children this type of content. Therefore, I don’t exactly understand what you mean when you talk about net neutrality. ISPs already remove illegal content such as images of child exploitation because they are considered illegal content.
If they are doing it for illegal content such as child sexual abuse, why wouldn’t they do it for pornographic sites not adhering to the law? How is net neutrality at all diminished when we are talking about illegal content?
Mr. Hutton: The reason I talked about net neutrality is that our powers in the telecommunications sector flow from the Telecommunications Act, which, itself, is based on the exchange of information and the non-interference of telecommunications service providers or others involved in the distribution of content.
Of course, it’s possible to block content, but that ability is very limited because of the powers conferred to us. It’s not that it can’t be done; it’s just that it’s difficult under the current jurisdiction.
Senator Boisvenu: I have a question for each witness. Thank you all for your informative presentations. This is my question for Mr. Boileau. If the bill is passed, what do you think your role will look like?
Mr. Boileau: In the event the bill is passed, I can’t speculate as to the role of the RCMP or other organizations in the country because I don’t have any information about the enforcement of the act.
Senator Boisvenu: When you read the bill, do you think it lays out the police’s role clearly, or does it need more clarity around the role of police, whether it be Quebec provincial police or another police force?
Mr. Boileau: It is not clear from reading the bill what the mechanism for its enforcement will be.
Senator Boisvenu: My other question is for Mr. Hutton. This doesn’t worry me, but you said you didn’t have the legal authority to enforce the legislation. I don’t find that too concerning. Legislation can be amended.
What I find more concerning is your statement that it would not be technically feasible for the CRTC to enforce the act. Is that due to resources or the training of CRTC staff? What do you mean when you say that it’s not technically feasible to enforce the legislation?
Mr. Hutton: The answer to that is twofold.
Yes, the jurisdiction is one thing, but the CRTC has to be given the necessary tools in case it is called upon to enforce this legislation, tools it doesn’t currently have.
It’s about the tools, not just the jurisdiction. I brought up Bill C-11 to illustrate something. Obviously, having a clear understanding of what it applies to is important. Can those things be dealt with under the Broadcasting Act? That’s one of the questions. If we are supposed to impose monetary penalties, do we have the authority to gather the necessary information to do that? Are we allowed to collect the data and information needed to enforce the act? That’s not clear either. Ultimately, what does seem to be clear is that there will be a monetary approach to support enforcement.
The other technical consideration — perhaps more for the ISPs than for the CRTC — is this. For some content, it’s very challenging for ISPs to block access and set up an age verification system, but not for other content. In addition, the applicability of the measure can differ depending on the nature of the service provider. Whether the ISP manages its network end to end or leases its facilities from a third party, can the provider enforce the legislation in each case?
Senator Boisvenu: Thank you.
[English]
Senator White: I have a question for the RCMP relating specifically to the difficulty with investigating offences that would occur should this bill be enacted and whether you have any concerns about the RCMP’s capability if you were given responsibility for those investigations.
Mr. Boileau: I apologize, sir. I was not able to fully hear your question.
Senator White: My question pertains to whether you have any concerns about the difficulty of conducting investigations should this legislation pass, and whether you are concerned about the capability and capacity you would have to take on more investigations in your unit, for example, if the responsibility were borne by the RCMP.
Mr. Boileau: We cannot speculate as to how this bill may impact law enforcement, as content controls placed on online platforms in any legislation or regulations in that regard would fall to other federal departments. The NCECC does not investigate matters related to sexually explicit material beyond the scope of online child sexual exploitation.
Senator White: So who would it fall to, if it doesn’t fall to the National Child Exploitation Crime Centre?
Mr. Boileau: As I mentioned previously, I cannot speculate as to whom it would actually fall to.
Senator White: Okay. I’ll ask a question to the CRTC representative, if I may. I’m wondering, having read the legislation, whether you would recommend any other avenue that you believe could be adopted or adapted in Canada to lessen the impact of such material and to actually deal with the internet providers.
Mr. Hutton: I think the position we have put forward in our opening remarks is very much that a single solution will not address this serious issue. We need to work collectively with a number of different parties who are involved and, as mentioned by Mr. Boileau, on different fronts and different arms of the RCMP, Public Safety and even other departments of law enforcement.
There is also very much a public education issue that needs to be addressed and dealt with. There is, naturally, the matter of ensuring that the designated institution has the authorities and the tools itself, beyond simple resources, to enact the provisions of the law. As I indicated in my answer to Senator Boisvenu, there are some issues that need to be addressed. We may be able to address those with Bill C-11, which is currently being debated and would modernize the Broadcasting Act and provide the CRTC with some of these tools. But that is another activity that needs to occur and be looked at.
Senator White: There are other jurisdictions that have started similar steps and begun to try to limit the access for people of certain ages. Do you know of any that have been successful and if so, which? And if not, where you saw them stumble.
Mr. Hutton: It appears no single measure has been successful. Simple age verification has not been comprehensive enough. Simply educating families and providing them with blocking tools, as is currently available in Canada, has not been enough, because workarounds occur. In speaking and exchanging with regulators in other countries on this issue, the consensus seems to be that a multipronged approach is needed, one that includes education, tools for families to be able to protect themselves and responsibilities applied to various providers. As I mentioned in my opening remarks, that can cover everything from streamers and social media companies to the search engines themselves, which the European Union is actually looking at addressing. There has not been a solution that has been fully successful to date.
Senator White: Thank you.
[Translation]
Senator Dalphond: I will start with my question for the CRTC representatives. The bill creates two types of situations. On one hand, it creates an offence whereby the party who is found guilty is liable to a significant fine, and that means some sort of criminal proceeding has to be held. On the other hand, it creates an administrative process whereby internet service can be suspended, or Rogers or Bell can be prohibited from making the offending site accessible. In the CRTC’s enforcement of the various pieces of legislation for which it is currently responsible, do you have authority and do you initiate the type of criminal or administrative proceedings that the bill would involve?
Mr. Hutton: I’m going to ask Mr. McCallum to answer, but first, I want to say something. The CRTC is a quasi-judicial body—an administrative tribunal, if you will—so its current authority under the Telecommunications Act and Canada’s anti-spam legislation provides for administrative penalties, not criminal penalties. Mr. McCallum should be able to provide more details on that.
Peter McCallum, Legal Counsel-Consultant, Canadian Radio-television and Telecommunications Commission: Yes, that is indeed the case. The CRTC has the ability to impose administrative monetary penalties when offences are committed.
Criminal proceedings are possible under the current legislation, but that avenue is rarely used since the CRTC can impose administrative monetary penalties, and does when necessary. I hope that clarifies things.
Senator Dalphond: I gather you regularly communicate with other regulatory agencies, whether in the U.S., Europe or even Australia. It’s an issue you’ve been working on for a while now. How do you regulate access to content that has a harmful impact on users? Has any research been done on that? In your conclusion, you say that a single solution — which you seem to think this bill is — can’t do the job and that a more comprehensive approach is needed.
Mr. Hutton: Yes, we are indeed in regular contact with quite a few regulators around the world, and that helps us understand how various laws in various countries are developed. When I say we’ve been working on this for a few years, what I mean is that we have been examining the full range of issues related to online content in that regard. Unfortunately, we don’t have any specific research, other than our discussions with foreign regulators about the types of issues addressed in Bill S-210.
[English]
Senator Dalphond: Thank you. In your ability to pursue illegal or criminal pornography, do you encounter problems with the use of VPNs? We are told that many youth, 15-, 16- and 17‑year‑olds, have access to VPNs. Is that something you encounter in your attempts to pursue or trace illegal child pornography on the internet?
Mr. Boileau: The answer to your question is yes. We do encounter the usage of VPN technology. For most offenders, it’s one of the technical aspects they use to make themselves undetectable by law enforcement. For their ability to move forward, put things in place and not be detected and identified, VPNs are used a great deal.
VPNs could be used for other purposes. They are already being used for legal purposes, and that is why we have so many of them. But, unfortunately, they are also made available to people for illegal activities.
Yes, a VPN could be used in that way. It goes back to what my colleague from the CRTC mentioned. People can have different means to circumvent measures put in place, and that could be one example.
Senator Wetston: I’m going to begin my question with the CRTC, Mr. Hutton. Your description of why you need to wait and see is one I have heard for many years, having been in public service for most of my career.
Having said that, I think what you are saying, with all due respect, is let’s see what other countries do first, and then we might act. That sounds to me as though you are not prepared to take the step forward in this area, if you agree with what this bill is attempting to achieve. By saying, “We’ll wait, let’s see what other countries do, let’s observe their mistakes and then we might put it together because it is a whole-of-government initiative,” you are suggesting it is not a whole-of-European Union initiative; it’s not a whole-of-U.K. initiative; it’s not a whole‑of‑Australia initiative. I’m not trying to be disrespectful toward your comment, but I would like to have you address that issue.
Second, we recognize the role that you have at the CRTC over the ISPs, and we understand your concerns about the limited jurisdiction that you have. But you do have considerable jurisdiction. Why is the CRTC not capable of presenting a series of policy statements or guidelines that might allow for the ISPs to agree or disagree with the possibility of joining an effort to deal with this matter? You have considerable authority to do that. You can do that, and you do not need explicit legislative jurisdiction to implement a guideline approach in consultation with the ISPs.
Mr. Hutton: Thank you. I don’t think we are suggesting a wait-and-see approach. We are here to talk about the proposed bill you are studying. The view we are bringing to you comes from our experience, because we do implement a number of different pieces of legislation, including Canada’s anti‑spam legislation, where we deal with international players and have specific authorities and directions to be able to accomplish those roles. Those are the messages we are bringing to this committee today.
When we talk about a whole-of-government approach, we do have significant experience with the Broadcasting Act and the current broadcasting system, where we have a number of different measures to try to ensure that Canadians under 18 do not access this type of programming.
I would suggest that is the message we are trying to bring to you as we look at this bill, and we are certainly in agreement with the issues and concerns raised by its subject matter.
We have been working and studying, and we have been calling upon government for the need to modernize our authorities to fully deal with these issues in a proper manner. Certainly, the CRTC is supportive of the concept behind Bill C-11, which is currently before the House and will come, hopefully, before the Senate for consideration in due course.
With respect to the role of the ISPs themselves, we have considerable authority on many fronts, but under the Telecommunications Act and how we have to implement it, it is certainly not clear — and I’m being quite honest when I say “not clear” — that we can deal with content under that act. We deal with content under the Broadcasting Act, and hopefully that can be modernized to allow us to take a more interventionist approach in this area.
Senator Wetston: Thank you. Madam Chair, I didn’t get a comment on my guidelines.
The Chair: Mr. Hutton, can you answer Senator Wetston’s question on the guidelines?
Mr. Hutton: I didn’t go to the guidelines because when we are dealing with ISPs, we don’t regulate ISPs under the Broadcasting Act, and the Supreme Court has been quite clear on that issue. They are clearly telecom providers, and, as I said, guidelines on content are something that wouldn’t be clearly within the authority of the current telecom act.
Senator Wetston: Thank you.
[Translation]
Senator Dupuis: Thank you to the three witnesses. My first question is for Mr. Hutton. I was fascinated by what you said about net neutrality. I want to better understand what exactly that principle covers. Let’s say I decide to start an internet platform, become a service provider or launch a website, does net neutrality protect me against any CRTC intervention? Does it mean that, whatever I post online — or allow to be posted — I won’t have the CRTC on my back since it doesn’t pay any attention to content because of net neutrality?
Mr. Hutton: Only partially. You mentioned three different players when it comes to online content. One was an internet platform, a player that may be better covered by Bill C-11, which fits in with the objectives of the Broadcasting Act. Now if you have a website, you aren’t currently covered by the two acts we enforce.
The third player you mentioned was an internet service provider, which is a telecommunications company. Telecommunications companies are subject to the Telecommunications Act, and the underlying principle there is to ensure the exchange of information without the interference of the provider.
Net neutrality means that the tools we are given and the frameworks in place must foster the exchange of information. It doesn’t mean you are immune on all sides. Obviously, activities that can be deemed criminal would be dealt with under the corresponding pieces of legislation, as Mr. Boileau discussed today.
Senator Dupuis: Then, would you say that one of the reasons it is so difficult to achieve something in this field is that so little progress has been made around the world, for all sorts of reasons, including a political position not to disturb those who are doing whatever they want on the net? Progress has been minimal regardless of location — as you told us, you’ve checked and it’s clearly the case. Am I mistaken, or does part of the problem stem from the principle of net neutrality? To date, we have felt that it was a very good thing — except in relation to the Criminal Code, for the most obvious content — but now we want to try to break from a mentality that is very deeply rooted, with acquired rights, and it will be very difficult to shake things up.
Mr. Hutton: When I talk about net neutrality, I’m not talking about the same thing as you, but I hope that I’m not putting words in your mouth. It is a very open approach toward everything occurring on the net. We are talking about telecommunications companies, and we need to ensure that they don’t control the content that is distributed, so that information sharing can occur without barriers.
For a few years now, the CRTC has taken a position in various reports and in a master report on all internet content. We have taken a position and recommended a greater contribution by providers, whether they be platforms or people who broadcast online, since their presence here, in Canada, gives them advantages because they are distributing content to Canadians. We believe that they have an obligation to make a contribution to Canadian society and, in accordance with the objectives of the Broadcasting Act, to create and promote Canadian content.
Under the act, broadcasting in the country must also meet high standards. One of the objectives of Bill S-210 would be to establish norms in Canadian broadcasting.
Senator Dupuis: Do I have any time remaining?
The Chair: No, senator.
Senator Dupuis: I’ll wait for the second round, then.
Senator Clement: My first question is for the RCMP representative. Do you see the usefulness of this bill, given your mandate to protect children? If you do see its usefulness, what would be its major weakness, if there is one?
Mr. Boileau: Any initiative that seeks to protect children on the internet is necessarily an excellent one. Concerning your question about the weakness of the bill, unfortunately, I cannot speculate. The regulatory authorities of this new legislation have not yet been defined.
The centre’s mandate specifically deals with investigations of child pornography material on the internet. What you are referring to is children who might have access to adult pornography, which does not fall under the centre’s mandate.
Senator Clement: Thank you.
[English]
My next question is for the CRTC representative. If you had authority — let’s say we wave a magic wand and you have the required authority as passed by the house and the Senate — do you see this bill as a useful piece of the global approach that you discussed? If you do, then tell us what the difficulty or weakness might be in the bill. I’m trying to get more specific around this bill.
Mr. Hutton: As we are a tribunal that implements acts, it is certainly the prerogative of legislators to develop and come up with those bills on that front. As my colleague, Mr. Boileau, has highlighted, this is one initiative, and we need many initiatives. Certainly, it is viewed as positive on that front. We are aware of and acknowledge the potential social impacts of minors in this country accessing this type of content.
I will go back to where we would see the need for us to be able to intervene. Yes, there is the question of jurisdiction, but there is also the question of clarity on who this applies to. These are questions that need to be highlighted. As I mentioned, there are many different types of players who can distribute content over the internet. The clarity question is one that is quite important.
On the question of tools, the simple issue is that we do have experience administering administrative monetary penalties, but to be able to administer a penalty, you need to ensure that you have elements of jurisdiction, or tools to require data provision, to understand what is occurring in the networks and to understand the financial implications of a potential administrative monetary penalty. Yes, if it were up to the CRTC to do this, certain elements would be needed to allow an institution like ours to fully implement these provisions.
Senator Clement: So elements of a technical nature, then?
Mr. Hutton: They are legislative tools. I know I said jurisdiction before, but it’s certainly the tools in the legislation. For example, look at what is currently being proposed in Bill C-10, after the CRTC called for action with respect to broadcasting. For the players who would be captured by the Broadcasting Act, we would certainly be able to intervene in this domain, and it does provide us with tools to address both Canadian and international players.
Right now, for example, the CRTC, in respect to broadcasting, has only two tools. One is licensing and applying regulations to licensees; the other is exempting parties who broadcast from licensing. If you hold a licence, you must be Canadian, so we do have limits as to whom we can apply our regulations to at this point in time. The bill certainly looks at providing other tools, such as regulation-making powers that can be applied to these particular players, making sure they are subject to our act, and being able to ask for information, conduct investigations and put in place conditions of service, such as requiring age verification.
Senator Clement: Thank you.
Senator Harder: I have a question for the RCMP. You’ve stated that you are not able to answer the excellent question by my colleague, Senator Clement. Who should we be calling before us to answer that question?
Mr. Boileau: I would defer to the Department of Justice in regard to being specific to the legislation. You’ve got people from Public Safety who would probably also offer some guidance to the questions that are being asked.
Senator Harder: So are you saying by implication the RCMP would have no role in this area?
Mr. Boileau: That is not what I’m saying. In the bill as we see it, the regulatory body responsible for the application of the law is not identified, nor is it identified how its application will go forward. We don’t want to speculate on certain questions that are being asked, not knowing exactly what is intended in that legislation.
Senator Harder: My second question is to the CRTC.
The Chair: Senator Harder, before you go to your second question, just for your information, the Department of Justice declined to come and Public Safety deferred to the RCMP. Thank you.
Senator Harder: To the CRTC — I’m hearing you and I’m interpreting what you’re saying as not saying anything is wrong with this bill, but it really wouldn’t be all that important or helpful and the real tool box is the one that’s before the House of Commons. Am I reading your language correctly?
Mr. Hutton: It is. For us to implement, this bill may be helpful, but we need to have the tools either in this bill or tools like those in the bill currently in front of the House of Commons to be able to be effective.
Senator Harder: Let me take that further. If you have only the tools of this bill, what difference will it make?
Mr. Hutton: There will be some efforts that can be put in place, but it would be difficult if the CRTC were named as the party to implement. There would be elements needed for us to fully live up to the purpose of the bill.
Senator Harder: Thank you.
Senator Pate: Thank you to the witnesses, and thank you to my colleagues for their questions. I’d like to follow up on the line of questioning that has been explored in the last little while.
If the CRTC were put in the role of the regulator in this context, what resources are you saying you would need in order to fulfill that role? It sounds like that’s the next step.
For the RCMP, during the investigation of Pornhub, jurisdiction was put forth as the reason that some of the investigations couldn’t continue. One of the issues being raised, and certainly something that Senator Miville‑Dechêne I understand is trying to do with this bill, is to allow for the pursuit of large companies that are benefiting significantly from these kinds of exploitation. So what tools do you feel you need? Building on Senator Harder’s excellent question, Senator Clement’s and everybody’s excellent questions — what tools do you think are needed, and what would you add to this bill to do this job if you feel it’s insufficient?
Mr. Hutton: For the CRTC, I think you were talking about resources. Primarily, what we’ve been talking about so far is jurisdictional: tools in the form of legislation, and clarity in that legislation as to whom it applies. Those are the elements I have raised. I have not spoken about actual resources, whether personnel or dollars and cents. I’m sure that if an act goes forward, the CRTC would be in a position to find a way to seek those additional resources, but it would certainly take additional resources to act on that front.
I think the other message is that there is no single solution. The CRTC could play a role with regard to certain players, but public education and much wider international collaboration also need to occur. The resources and the whole-of-government approach would certainly need to be looked at much more widely than just the CRTC.
Mr. Boileau: For the RCMP, in answer to your specific question in regard to MindGeek, the government recently completed a public consultation on the matter and published a “what we heard” report. Within the consultation, the government proposed amending the mandatory reporting act in the following ways: centralizing the mandatory reporting of online child pornography offences through the Royal Canadian Mounted Police National Child Exploitation Crime Centre; clarifying that the mandatory reporting act applies to all types of internet services, including social media platforms and other application-based services; enhancing transparency by requiring an annual report to the Minister of Public Safety and Emergency Preparedness and Justice from the National Child Exploitation Crime Centre; imposing a 12-month preservation requirement for computer data, as opposed to the current 21 days; adding a requirement for persons who provide an internet service to provide, without a requirement for judicial authorization, additional information to the National Child Exploitation Crime Centre where a child pornography offence is identified; and, most importantly, designating a person and regulations for the purpose of collecting information to determine the application of the mandatory reporting act. That would be the compliance tool.
Senator Pate: Thank you.
Senator Cotter: I had intended to ask a question along the lines that Senator Clement asked of the CRTC representative, so I won’t repeat that. My question is for Mr. Boileau. While we’ve been having this discussion with you, I’ve noticed in the media that a major child sexual abuse ring has been broken up and 40 or so people charged in Canada. Congratulations; your work produces good results.
Mr. Boileau: Thank you.
Senator Cotter: As I understand your role, it is trying to protect children from sexual predators. In that sense, it is almost the reverse of what we’re talking about here, which are the issues related to children accessing sexually explicit material. I’m interested in your sense of whether there is an interrelationship between those two. You had mentioned, for example, the grooming point, but I’m wondering whether one aspect of this legislation is related to the platform of, I don’t know, sexual tolerance or whatever it happens to be that could lead to the problems that you have to work on.
Mr. Boileau: The interaction in which we have to work is basically this: we’re speaking here, with Bill S-210, about online platforms that can be used by children for all sorts of purposes but can also be used by offenders to engage with children and facilitate the commission of sexual offences against them. Online conversations can start within a public space where the children are, an environment based on and actually created for children, and then the offenders move to a more private conversation where risks are increased and offenders can more easily exploit a young person’s vulnerabilities.
So they go into a space that is for the children, they build trust throughout the grooming process, and then they pull that child out from that environment or platform and isolate them, using other technologies, in more private one-on-one conversations. That is where kids are more vulnerable.
In regard to Bill S-210, our understanding is that we are looking at websites that host specific material to which we want to limit young persons’ access.
Senator Cotter: I will ask a supplementary, officer. In the work that you and your colleagues do, do you perform follow‑ups to determine the preconditions that created the vulnerability, and do they include young people’s access to sexually explicit materials that could cause them to be drawn into the web and trap of the exploiters?
Mr. Boileau: The RCMP’s National Child Exploitation Crime Centre does not offer victim services to young people who view sexually explicit material, and we don’t research in that specific area. As such, we cannot speculate as to how children might be impacted through the exposure to sexually explicit content.
Senator Cotter: Do you see that as a feature of the way in which they might get trapped, though? I’m trying to understand whether, in a debrief of the child victim, you learn the route that caused them to be accessed by the predators.
Mr. Boileau: The predators or offenders will go where children actually hang out. So if we walk away from the internet and go to the real world, we all know that, for many years, we would go to the park with our kids and look at who is in the park and how safe it is for children to play there. Now we’re walking into a virtual environment, a space that is also dedicated to children, but unfortunately, we do find offenders on those platforms, and unfortunately, we don’t have the same ability as in the real world to see them visually; on those platforms, you just don’t see them until they actually present and expose themselves.
Senator Cotter: Thank you.
The Chair: Senators, unfortunately, we have run out of time. We cannot go to a second round. I want to thank Mr. Boileau, Mr. Hutton and Mr. Peter McCallum for attending today. You can see, from the senators’ comments and questions, their interest in finding out more about what you can do. Thank you for making yourselves available today.
[Translation]
Senator Dupuis: Can we ask the witnesses who referred to specific reports whether they can send us the links to those documents?
[English]
The Chair: Witnesses, did you hear Senator Dupuis’s request?
Mr. Hutton: Yes. I don’t think I referenced a precise report, other than perhaps our own report, Harnessing Change, and we can certainly pass on the link to the committee.
The Chair: Thank you. If you can send it to the clerk, I would appreciate it.
Mr. Hutton: Yes, we’re familiar. I thank senators for their questions and for having us here today.
Mr. Boileau: Thank you very much.
The Chair: Now I would like to introduce our three witnesses. From CIO Strategy Council, we have Mr. Keith Jansa, Executive Director, and Mr. Tim Bouma, Director, Verification and Assessments. From the Media Authority of North Rhine‑Westphalia (Germany), we welcome Mr. Henning Mellage, Adviser.
We will start presentations with you, Mr. Jansa.
Keith Jansa, Executive Director, CIO Strategy Council: Madam Chair and honourable members, thank you for the opportunity to present today. The CIO Strategy Council is a national forum that brings together the country’s technology C‑suite to collectively shape common digital priorities. I am joined today by my colleague, Tim Bouma.
I commend the senators on this committee for recognizing the importance of standards in achieving policy objectives. Without standards, we would have no product safety or security, no common technology unifying global markets, no interoperability and certainly no opportunity for competition to thrive.
My perspective today is as an expert in the development and strategic application of standards and more importantly as a father of three young children with daily access to the internet.
Bill S-210 rightfully recognizes the harmful effects of the increasing accessibility of sexually explicit material online for young persons and acknowledges those effects as an important public health and public safety concern. However, I would like to call upon the committee to expand age verification technologies to include adequate controls on the use of artificial intelligence algorithms to reflect the online harms that our children face daily.
I draw the committee’s attention to some very troubling trends that illustrate harms beyond those specific to pornographic websites and that compel the need for a broader application of age verification technologies.
In past weeks alone, investigations have uncovered sexually explicit content on online gaming platforms geared toward children.
Virtual reality apps restricted to adults are, in practice, an online hunting ground for sexual predators to exploit young children.
Live-streamed content is a click away for children — from the live-streamed abuse of children to the heinous acts displayed in real time, in raw, uncensored form, online during the “Freedom Convoy” protest; online ads going unchecked, targeting children with harmful content; public and private chat spaces luring vulnerable young children into indecent acts; and so-called “challenge” videos that kill children going viral on social media.
This is a crisis. Children have the right to privacy. They have the right to an identity that no one should be able to take away or manipulate. They have the right to protection from physical and mental harms.
Canada is lagging behind other jurisdictions when it comes to protecting young persons online. While other countries are unveiling comprehensive strategies to mitigate this crisis more effectively, as your peers in the U.K.’s House of Lords have done with the age-appropriate design code, our solutions must be designed for no less than the contemporary world our children face daily.
To achieve this, legislation must set out the regulatory framework that encompasses or empowers the right bodies with the authority to create a comprehensive rule-setting and enforcement regime that protects our children online. Standards and certification offer the flexibility required to effectively regulate this and are a go-to compliance mechanism for addressing new, emerging and evolving technologies to protect children’s rights.
I have three principal recommendations to focus your priorities moving forward, which are further detailed in my submission. They are to enact Governor-in-Council powers to approve standards, codes of practice and certification programs to provide sufficient safeguards for online safety and on the design and use of age-verification technologies; add the words “privacy preserving, age-appropriate in design, and trustworthy” before “age verification method” in the text of the bill, and add “responsible use of artificial intelligence algorithms” to protect children’s rights; and leverage trusted digital identity programs already established by Canadian jurisdictions and apply national standards that ensure conformance to ensure trusted age verification for all Canadians.
We bear no greater responsibility as a society than protecting our children. Thank you. I look forward to your questions.
The Chair: We will go on to the next speaker, Mr. Mellage.
Henning Mellage, Adviser, Media Authority of North Rhine-Westphalia, Germany: Thank you, honourable senators, for your invitation to participate in the study of Bill S-210. My name is Henning Mellage, and I serve as a legal adviser to the Media Authority of North Rhine‑Westphalia, Germany. I studied law at the University of Cologne. I also chair a working group on youth protection and digital media of the German Commission for the Protection of Minors in the Media, the KJM. I will come to this later.
The Honourable Senator Miville‑Dechêne, whom I met on a panel at an online event, recommended me to the committee to give you an impression of the situation in Germany with regard to age verification systems and online pornography, as Germany has had an age verification regime since 2003. I honestly have to say I feel a little bit like an alien, as it has been mentioned before that no country in the world has set up such a regime.
Media regulation in Germany is the responsibility of the federal states, as Germany is a federal republic, but it has to be organized independently of the state. This is carried out by the 14 media authorities in Germany. As a regulatory body, we have the Commission for the Protection of Minors in the Media, the KJM. It is the central supervisory authority for youth protection in private broadcasting and telemedia. The KJM, as mentioned, is the regulatory body of the 14 media authorities.
The legal basis in the field of protection of minors in the media in Germany is the Interstate Treaty on the Protection of Human Dignity and the Protection of Minors in Broadcasting and in Telemedia.
The treaty came into force on April 1, 2003. In this treaty, we have a clear regulation with regard to providers of pornography: they have to implement an age verification system or delete the respective content.
In article 4, paragraph 2, sentence 2 of this treaty, it states that “content is legal in telemedia services” — and this has to be emphasized — only, “if the provider has ensured that such content is accessible for adult persons only (within a closed user group).” This is what the text of the law says.
This covers content that is pornographic in any other manner, meaning content that is not child, youth, violent or animal pornography; those categories are absolutely forbidden under German law. The regime also covers indexed content and content that is evidently suited to seriously impair the development of children and adolescents.
Age verification systems are used to ensure those closed user groups, as the law states. The law does not contain any further requirements for age verification, apart from it having to be an effective barrier. The KJM, our regulator, set up criteria for age verification systems, which were also confirmed in a lawsuit by the Federal Court of Justice in Germany.
According to these criteria, age verification for closed user groups must be ensured through interrelated steps. The first step is at least a one-time identification, meaning an age check, which generally must be done through personal contact. “Personal contact” means a face-to-face check with a comparison of official identification data, such as an identity card or passport. Unlike in the United Kingdom, for example, a driving licence is not an official identity document in Germany.
Under certain conditions, it is possible to fall back on a face-to-face check that has already been carried out, for example, when you opened a bank account or a mobile phone contract. This is quite new, about two and a half years old. The criteria of the KJM say a face-to-face check can be waived if the identification has taken place by means of software, by comparing the person’s biometric data with the photo on the identification document, together with an automatic recording of the data of the identification document.
The second step is authentication at each individual use of the platform, to effectively reduce the risk of access authorizations being passed on to minors.
The interstate treaty does not contain a recognition procedure for closed user groups or AV systems. For this reason, the KJM, our regulator, has developed a positive evaluation procedure and evaluates appropriate concepts upon request from companies and providers. This serves, in our opinion, to improve the protection of minors on the internet and, at the same time, gives providers more legal and planning certainty. More than 80 concepts have been evaluated so far by the KJM.
As I said before, this law has been in force since 2003. It led to the effect that some providers of pornography left Germany and started to offer their content from abroad. Those providers who stayed in Germany implemented an age-verification system and still do have enough customers. There is one provider — I don’t want to advertise, but one website can be seen as a role model.
Of course, the media authorities and the KJM also found that the majority of pornographic websites are operated from abroad. However, some of these are also aiming specifically at the German market. For this reason, the Media Authority of North Rhine‑Westphalia has initiated proceedings against large websites based in Cyprus and Ireland. One company which has been mentioned is MindGeek.
Although the country-of-origin principle applies in the European Union, European law permits exceptions to this in certain cases. One of these cases is the protection of minors. In these cases where we took proceedings against the large websites, prohibition orders were issued to the providers to the effect they may not address the German market as long as —
The Chair: Mr. Mellage, may I please ask you to wind up?
Mr. Mellage: Yes, I will. We took legal action and gave a prohibition order. They are not allowed to address the German market unless they use age-verification systems. This has gone to court; they filed a complaint about it. The Administrative Court of Dusseldorf, in a first decision, said that the orders we issued were lawful.
I will end now with this, Madam Chair, and I would be happy to take your questions. Thank you, honourable senators.
The Chair: I have a few questions for Mr. Bouma. Mr. Bouma, can you give examples of age-verification methods that could be prescribed?
Tim Bouma, Director, Verification and Assessments, CIO Strategy Council: Yes. There are quite a number of age‑verification technologies. It’s quite an innovative and very busy marketplace. One report identifies 10 different methods. The simplest one is self-declaration, where you enter your date of birth or age, which basically provides no assurance.
Other methods include providing your identity information, which could include hard identifiers like an identity number, which is then checked against public databases to confirm your age. Then you get into other techniques like biometrics, similar to facial recognition but determining your age by analyzing your face. There are also profiling and inference technologies; artificial intelligence; capacity testing, quizzing the individual on whether they actually have the capacity of a certain age; cross-account authentication, where they may have authenticated to another account that has collected personal information that can be shared; and third parties that provide these services. There are commercial providers in the U.K. being developed, and there are a few that can provide age assurance. You can also ask other account holders.
Then you get into other techniques which are not so much age verification but really controlling access from the source like controlling the device, controlling the network et cetera. So there are a lot of technical controls out there that could be qualified as age verification.
The Chair: What issues or concerns might be related to these age-verification methods?
Mr. Bouma: As I was saying, these are wide and varied technologies. They give rise to a very complex mix of security and privacy issues. For example, the third-party providers might collect the information for age verification but actually use it for other purposes, which is not good at all. Then there is the whole issue of profiling, surveillance, et cetera. And then, from the point of view of the consumer, there is the acceptability of the techniques. Some of the techniques might be quite innocuous, but others, like measuring your face to try to determine your age, people might view as very invasive.
You never know with these types of technologies. They are very personal, if you will. There might be unintended or even opposite consequences, with people rejecting them. You have to be very careful. It is not just a matter of putting these safeguards in place and whether they are effective; it is actually ensuring that they are acceptable.
The Chair: I have a question for Mr. Jansa. What efforts are under way in Canada that would assist with implementing age verification?
Mr. Jansa: Thank you for the question. There are certainly a number of jurisdictions that are implementing digital identity programs to assert and issue identities and credentials associated with folks and their age. I will pass it to Tim Bouma, having experience with respect to the federal government’s acceptance of some of these jurisdictions’ identities that have been issued.
Mr. Bouma: There has been a lot of good work, the federal government working with the provinces and territories developing what we have been calling “trusted digital identities,” accepting trusted digital identities. Really, it is confirming that the individual is presenting their own name and date of birth and perhaps a few other pieces of information from which an age can be inferred. The federal programs, namely Canada Revenue Agency and Employment and Social Development Canada or Service Canada, are using that data to enroll those individuals in those programs. And by extension that information can be used for eligibility purposes: Are you ready to retire? Are you at a certain age?
While it hasn’t been specifically contemplated, there is a possibility that the work that has been done with the provinces and territories could be carved out into an age-verification capability.
The Chair: Thank you very much.
Senator Miville-Dechêne: This is a question for Mr. Mellage from Germany. First of all, thank you for telling us that in one country, Germany, there is a law on age verification and it seems to be working and has been working for 19 years. That is good to know. I would like to know a bit more about it. Could you tell us if security and privacy issues were raised? I understand you have a certification process of about 80 different companies. Have there been big privacy issues? Can you tell us about how much success you had in convincing companies? I’m thinking finally about Pornhub and xHamster, two companies you are in court with. What is happening there? Will they obey or will you have to pull the plug?
Mr. Mellage: Thank you, Senator Miville‑Dechêne. Yes, we have this kind of evaluation process by the KJM, our regulator. Of course, we are faced with concerns about data protection and privacy protection. As far as I can see from the more than 80 systems I have reviewed myself or whose evaluation I was involved in, what we can state from the KJM’s side is that most of the AVS providers come from the banking and financial sector. So they have to comply with the strict regulations for the banking sector. They must comply with anti-money-laundering legislation and, of course, they are compliant with the GDPR in the European Union, which is basically the regulation on privacy and data protection.
As you mentioned, we have been in court over our proceedings against this big website. As I mentioned, the MindGeek company went to the Administrative Court of Dusseldorf, and the court said that our prohibition orders were lawful, but they appealed. The case is now before the Higher Administrative Court of North Rhine‑Westphalia, and the decision is still pending. It’s probably the same situation in Canada as well: court decisions take some time.
As for the other website you mentioned, I cannot confirm it is the one you named because I’m not allowed to. If you follow the press, you might get your own idea. The senator is quite right.
In this case, we addressed the actual provider and gave a prohibition order. They did not go to court, so we have a final, non-appealable decision in Germany. However, they did not respect this decision and continued without an age verification system. Therefore, we addressed the host provider, based in the Netherlands, and obtained a prohibition order against the host provider. The host provider did not go to court, so our decision is binding there as well.
The next and final step among our instruments is a blocking order to the ISPs. The KJM is going to decide on that in the next hours or days.
[Translation]
Senator Boisvenu: Thank you to our witnesses.
My question is for Mr. Mellage. Does the German commission for the protection of minors have jurisdiction over Germany only, or does its jurisdiction also extend to Europe?
[English]
Mr. Mellage: Thank you. The jurisdiction under the interstate treaty covers mostly German providers, but there is an article that allows us to address providers not based in Germany when they are targeting the German market.
[Translation]
Senator Boisvenu: I would like you to tell us briefly about the efficiency and effectiveness of your legislation or regulations, in terms of cost and the resources that you have to carry out your work and ensure its effectiveness. Have a large number of offenders been found as a result of your research?
[English]
Mr. Mellage: I’m afraid I can’t give you figures on costs, but a lot of personnel is needed. What I can say is that my employer, the Media Authority of North Rhine‑Westphalia, is the first media authority in Germany to bring proceedings against porn providers from abroad. We have more than 100 employees, but in the supervision team that handles these proceedings, we are six people.
The effectiveness is not yet showing because we are waiting for the court’s decision. On the other hand, we found that other porn providers — I know of just one, to be honest — have voluntarily implemented age verification systems. It is FanCentro. FanCentro stated in an official Twitter message that it had implemented an age verification system supported by the company Yoti from the United Kingdom, in order to be compliant with German law. I think this is a good step.
Senator Dalphond: My question will also be addressed to you, Mr. Mellage. You referred to the enforcement that you are trying to do against the providers. I understand they are all based outside Germany except one and that one has complied.
For those that are not compliant, if the user is using a VPN system, can you trace that, or do you not know how many people are connecting with foreign porn sites without being, first of all, in the system?
Mr. Mellage: Thank you, senator. If I understand your question correctly, it is: if a porn provider is using age verification, can a user get behind it using a VPN? No. If a porn provider implements an age verification system, the idea is that, for example, like Pornhub —
Senator Dalphond: My question is about those providing pornography from outside Germany.
Mr. Mellage: Okay. You mean with the blocking.
Senator Dalphond: If they use VPN to connect, do you have a way to know how many are doing that? Is that something you are able to trace, or do you have no idea about that?
Mr. Mellage: I don’t have any numbers. If we had a blocking order, there is the chance to use a VPN and get access to this website, yes.
Senator Dalphond: Thank you.
Senator Pate: Thank you to all three of our witnesses for appearing. My question is for Mr. Mellage.
Last year there were news reports about Germany moving to block access to sites where corporations, organizations or individuals were not cooperating with German laws. I’m curious to pick up on something that Senator Dalphond just raised. How do you monitor compliance with the websites, given their number and sophistication? How do you keep track of the various ways they may try to get behind — or in front of, depending on your perspective — the system? Are there practices that you have developed that you think we could look at as some of the best ways to try to address this?
A number of people have also raised concerns about the idea of the government blocking sites. Could you clarify what steps you’ve taken in your system to make sure the access doesn’t go beyond non-compliance with this particular legislation or access by youth to pornography?
Mr. Mellage: Thank you, senator. There was a bit of a glitch in the video call, so I hope I understood your question correctly.
The AVS systems that the KJM has evaluated so far can quite easily be used by porn providers, of course. The idea is that, as a customer or user who wants to get access to a website, you are identified once, during an onboarding process. This is an answer to all the people who think data and privacy protection might be a problem: the only data the AVS provider gives to the porn provider is that person X is of legal age, nothing else. No address, no name and no date of birth are given. I think this is a really robust way to do age verification.
A company I’ve mentioned before is Yoti in the United Kingdom. They use a technique by an American company, FaceTec, that allows an age estimation — not age verification, but an age estimation — using just the biometric data of your face. I know that there are concerns about this when it comes to biometric data and faces, and all the alarm bells are ringing about what happens to this data. But as far as I understand the method, it is not face recognition; it is an age estimation. So if I were to access a porn website where age estimation is required, the website would not know that Mr. Henning is sitting in front of the computer. It would just know that there is a person sitting in front of the computer who is estimated at 35 years old — but in fact 40 years old — and that’s it. In the end, the picture is deleted.
Senator Cotter: My question is for Mr. Mellage. Would I be correct in understanding that the regime with which you are associated and working is a kind of administrative oversight with sanctions as opposed to the use of essentially a criminal law power? It’s a tension that we’re wrestling with in the Canadian context. Could you speak a little bit to that?
Mr. Mellage: We have more or less two ways to address porn providers. We can take administrative action, and we can charge fines. Charging fines is a little complicated in the international context, because international law does not, at the moment, give us a way to collect the money if we impose these fines. If you drive from Germany to France and are caught speeding, as a car driver you have to pay that fine. But in the context we are speaking about, there are no such treaties within Europe.
Senator Cotter: That answered my question. Thank you.
The Chair: Do you have anything else to say, Mr. Mellage?
Mr. Mellage: What I would like to say when it comes to criminal law is that, in German criminal law, giving people under 18 access to pornography is a criminal offence. It’s not just a question of media regulation; Germany’s legislation holds that pornography is harmful to minors, full stop. The other thing the German legislator considers is that even adults may not want to be confronted with pornography. It is part of the right of sexual self-determination not to be confronted with pornography, for whatever reason, whether religious or moral. So in Germany there are two sides: the protection of minors and the protection of adults who do not want to be confronted with pornography.
Senator Cotter: Thank you.
Senator Clement: I have a question for Mr. Jansa, but I will start with a question for our witness from Germany. You’ve been at this for a long time, and I imagine you’ve read the bill sponsored by Senator Miville‑Dechêne. Are there any recommendations that you would make? What do you wish you had done differently and that we could get right now?
Mr. Mellage: Well, this is a good question. I don’t think that as a German I’m in a position to evaluate a Canadian bill, but we are happy that other countries are making an attempt to implement age verification.
I’ve studied the bill by Senator Miville‑Dechêne, and it’s quite similar to the regulation in Germany. The only difference, as far as I’ve understood, is that it applies to porn providers who are doing this — I’m missing the word right now — sort of like a professional, doing it as a job. In Germany, no one, not even a private person, is allowed to spread their own pornographic content to the world without an age verification system. This has been criticized recently in an article by a sex worker, who said sex workers must have the opportunity to advertise their work on the internet, so why should there be an age verification system? Our answer would be: yes, you can advertise your work and what you are doing, but please not in a pornographic way, and do so in a way that protects minors.
Senator Clement: My question to Mr. Jansa is similar. You have read the bill and you have made three recommendations, if I understood you correctly. You’re the father of three children. You made a point of saying that. What do we need to improve here? What is the main recommendation or main weakness you think we need to address in this bill?
Mr. Jansa: Thank you for the question. First, given the variability of the age verification technologies on the market, it is important to ensure that appropriate safeguards are associated with those technologies and that we can have confidence and assurance in the technologies being implemented.
So the first recommendation is to define, within the bill, what qualifies as an age verification technology, to ensure that these technologies are privacy-preserving, age-appropriate in design and trustworthy.
The regulations that would ensue, complemented by my other recommendation — to enact Governor-in-Council powers to approve standards, codes of practice and certification programs — would have a combined effect: legislation, regulation, standards and certification programs working together as the go-to compliance mechanism.
So those two specific recommendations add some text to the bill to qualify age verification technologies and, given the evolving nature of these technologies, enact powers to approve standards and certifications, so that we can continue to ensure appropriate safeguards are in place after the legislation is, in fact, passed.
[Translation]
Senator Clement: Thank you to our witnesses.
Senator Dupuis: Thank you to our witnesses. I have two questions: the first is for Mr. Jansa, and the second is for Mr. Mellage.
Mr. Jansa, my first question deals with your second recommendation. I don’t have your document in front of me, but you spoke about adding algorithms or the issue of algorithms in the act. Is that right?
Mr. Mellage, what criteria did Germany use to determine whether its legislation was effective? You said that the objective was the protection of minors in the context of telecommunications. In your opinion, what shows that the legislation is effective? Has the incidence of violence against children decreased? We’ve seen much more sexual exploitation of children online, but have you seen a drop in this phenomenon in Germany, for example?
[English]
Mr. Jansa: Thank you for your question.
Expanding the definition of age verification technologies to include adequate controls on the use of artificial intelligence algorithms recognizes the types of age verification technologies on the market. We heard it in testimony today with respect to age estimation: these sorts of technologies are, in fact, underpinned by artificial intelligence algorithms. We need to ensure we have the appropriate safeguards in place.
We need to be cognizant that age verification technologies run a wide gamut; ensuring we have appropriate safeguards means ensuring they are trustworthy, privacy-preserving and age-appropriate in design.
Mr. Mellage: Thank you, Senator Dupuis, for your question. Yes, this is a good question as to how we measure success or the effectiveness of age verification.
In recent years, Germany has really been the only country to have this. Now it’s going to be rolled out more widely after the amendment of the AVMS directive in the European Union; the member states have to transpose it into national law, and the directive requires age verification systems for porn providers. So the effect, or the success, is not yet measurable, to be honest.
But to speak from the regulator’s perspective, we are very happy in Germany. We are glad that now we’ve got the AVMS directive in Europe, and even Cyprus and Ireland have transposed this directive.
Countries outside the EU, like the United Kingdom, Australia and now Canada, are thinking about such an approach. I think it will really help to make age verification an international matter. My personal belief is that only internationally can we have a good and effective system for protecting our children and minors.
Senator Wetston: I have three brief questions for Mr. Mellage.
What is the age verification age in Germany? Is it 18?
Mr. Mellage: Yes, 18.
Senator Wetston: Does Germany adopt the principle of net neutrality?
Mr. Mellage: Yes, it does, and this is European law as well.
But on the point of net neutrality — because I think Mr. Hutton said it before — a DNS blocking order can be a violation of the principle of net neutrality.
I’ve got to look at a note I made on my computer. But such a violation can be justified if there is a court or administrative order to block an offering.
Senator Wetston: The last question is: Who regulates content on the internet in Germany?
Mr. Mellage: The media authorities — the 14 media authorities. We regulate content in the field of the protection of minors. We’ve got this central regulatory body, the KJM, and therefore no individual authority takes a different approach than Bavaria, for example.
Senator Wetston: I take it the interstate treaty allows this to evolve in Germany, with the media authorities as the regulator of content. Is that same organization the regulator of telecommunications services?
As you know, in Canada, with the CRTC — you’ve heard that — there is an issue with the Broadcasting Act. Do you have that integrated in Germany or are they separate functions?
Mr. Mellage: It’s separate. For telecommunications, it’s a federal authority called the Bundesnetzagentur.
Senator Wetston: I see.
Mr. Mellage: I don’t know the English name for this, I’m very sorry, Senator Wetston.
Senator Wetston: That’s fine. Thank you, Mr. Mellage. I appreciate your information.
The Chair: Thank you very much to Mr. Jansa, Mr. Bouma and to Mr. Mellage. Thank you for your presentations.
Mr. Jansa, we will be distributing the presentation you made. We’re going to have it in both languages. I’m sure that the members will also get a lot of information from that presentation as well.
I want to thank all three of you. You have really helped us understand this issue in front of us and we thank you for the time and effort you’ve put into this.
That ends this meeting, senators.
I just have a follow-up for the next meeting that I’d like to share with you. As we have received government Motion No. 14 with a reporting date of March 31, we will move on to that in our next hearings after the break weeks, and then we’ll return to Bill S-210 afterwards.
Thank you for your understanding.
(The committee adjourned.)