THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS
EVIDENCE
OTTAWA, Wednesday, October 29, 2025
The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4:15 p.m. [ET] to study Bill S-209, An Act to restrict young persons’ online access to pornographic material.
Senator David M. Arnot (Chair) in the chair.
[English]
The Chair: Good evening, honourable senators. I declare open this meeting of the Standing Senate Committee on Legal and Constitutional Affairs. My name is David Arnot. I am a senator from Saskatchewan, and I am the chair of this committee.
I invite my colleagues to introduce themselves.
Senator Batters: Senator Denise Batters from Saskatchewan.
[Translation]
Senator Miville-Dechêne: I am Julie Miville-Dechêne from Quebec.
[English]
Senator Tannas: Scott Tannas, High River, Alberta.
Senator Prosper: Paul Prosper, Nova Scotia, Mi’kma’ki territory.
Senator K. Wells: Kristopher Wells, Alberta, Treaty 6 territory.
Senator Simons: Paula Simons, Alberta, also Treaty 6 territory.
Senator Pate: Kim Pate, and I live here in the unceded, unsurrendered, unreturned territory of the Anishinaabe Algonquin Nation.
[Translation]
Senator Clement: I am Bernadette Clement from Ontario.
Senator Saint-Germain: I am Raymonde Saint-Germain from Quebec.
[English]
Senator Dhillon: Baltej Dhillon, British Columbia.
The Chair: Honourable senators, we’re meeting to continue our study of Bill S-209, An Act to restrict young persons’ online access to pornographic material.
For our first panel, we are pleased to welcome, by video conference, Julie Inman Grant, the eSafety Commissioner for Australia. Our second panellist is Dr. Tobias Schmid, Director of the Media Authority of North Rhine-Westphalia and Commissioner for European Affairs of the German Media Authorities. Welcome, witnesses. Thank you for joining us.
We’ll begin with opening remarks. We’ll start with Ms. Inman Grant, followed by Mr. Schmid. I would ask both of the witnesses to speak for approximately five minutes to give an overview of your points, and then we’ll move to questions from the senators.
Julie Inman Grant, Commissioner, Australia eSafety Commissioner: Thank you so much, chair, and thank you to the committee for having me here. Just raise your hand if I go over time. I want to make sure I give the committee some quick context on what we do and how we do it.
Thank you again for inviting me to speak about the leading role Australia is taking in the protection of children online and about our multi-layered, holistic approach to executing this important mission.
From eSafety’s very beginnings back in 2015, our regulatory focus has always been on helping to protect children, and in those early years, this centred on protecting children from online bullying and the scourge of child sexual exploitation and abuse material.
But after just two years, our remit expanded to cover all Australians, and, in 2022, the Online Safety Act, the third iteration of that legislation, came into force. This represented a landmark shift that expanded both our powers and our responsibilities.
For the first time, there was a clear set of expectations for online service providers. They could no longer claim that user safety was someone else’s problem. They were now expected to be accountable for the harms playing out on their platforms and services.
We have always taken a risk- and harms-based approach to our work. That means targeting the most serious threats and doing so in a way that is proportionate, fair and grounded in evidence.
We’re also one of the few countries in the world that provides hands-on assistance to its citizens when they become victims of online harm. We do this through a range of complaint schemes, including an adult cyberabuse scheme; a child cyberbullying scheme; and an illegal and restricted content scheme, which covers material such as instruction in violence. As you can imagine, my team has been dealing with the Charlie Kirk assassination video, with the stabbing of the Ukrainian student and, sadly, also the decapitation of a Dallas hotel manager, all of which have gone viral across the mainstream platforms. Unfortunately, the platforms did a really poor job this time of applying labels, or what we call interstitials, which are meant to blur and protect innocent eyes from content that people may stumble upon incidentally and then cannot unsee.
Through our complaint schemes, we also achieve a 98% success rate in removing the non-consensual sharing of intimate images and videos, including “deepfake” image-based abuse. This is often referred to as “revenge porn.” We do not call it “revenge porn” because that is akin to victim blaming. Revenge for what? It also trivializes the harm when we know, particularly for young people, this can be deeply denigrating and humiliating. Now we are seeing “deepfakes” that are so hyperrealistic it is difficult to tell what is real and what is not. So we refer to this practice, which is happening almost every week in schools across Australia, as “deepfake image-based abuse.”
Now, this is an incredibly important part of our work and provides meaningful help to Australians who find themselves in the midst of an online crisis. At the same time, we know it’s just dealing with the symptoms rather than searching to find the cure. This is where our systemic powers come in: to change the online landscape from within, to take an ecosystem approach and prevent the harms from occurring in the first place by putting the burden back on the platforms themselves, to assess the risks and harm and build in safety by design.
I’ll probably speed up a little bit, but it’s worth the committee knowing that we just registered eight industry codes that will require a broad range of platforms, from gaming sites to search engines to AI companions and chatbots, to protect children under 18 from coming across high-impact violence, violent pornography, and suicide, self-harm and disordered-eating material. Some of these codes will come into force in December, including the search engine code. That will require the search engines, which often already do this in a safe-search mode, to blur explicit violence, like the videos I just mentioned, but also pornography, so that kids cannot come across this incidentally, accidentally and “in your face.”
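To illustrate the safe-search blurring Ms. Inman Grant describes, here is a minimal sketch in Python. The explicit-content score, threshold and field names are entirely hypothetical; this shows the concept, not any platform’s actual implementation:

from dataclasses import dataclass

@dataclass
class SearchResult:
    url: str
    thumbnail: bytes | None
    explicit_score: float  # 0.0-1.0 from a content classifier (hypothetical)

BLUR_THRESHOLD = 0.8  # hypothetical policy threshold

def render(result: SearchResult, verified_adult: bool) -> dict:
    # Interstitial: hide the preview behind a click-through warning
    # whenever the result is likely explicit and the user is not a
    # verified adult, so any exposure is deliberate, not incidental.
    if result.explicit_score >= BLUR_THRESHOLD and not verified_adult:
        return {"url": result.url, "preview": None, "interstitial": True}
    return {"url": result.url, "preview": result.thumbnail, "interstitial": False}

An adult who wants to proceed simply clicks through the interstitial, which matches the testimony that adults can still reach an adult site with one click.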
Our research has shown that almost 30% of Australian kids under the age of 13 — usually around the ages of 9 and 10 — are not deliberately seeking out pornography but come across this by using a simple search engine or playing games. And they describe this as being “accidental, unsolicited and in your face.” Of course, if you are an adult and you want to continue going through to an adult porn site, you can just click a link.
This will also require the platforms, when somebody is searching for specific suicide instruction, to direct them to a local mental health support site rather than to a terrible site that instructs in lethal methods.
I’m probably getting close to time. I know you’re probably interested in knowing a little bit more about our social media minimum-age legislation. This will come into force on December 10 of this year, and social media platforms that are captured by this will need to take reasonable steps to prevent children under 16 from having accounts.
This is not an absolute prohibition or a ban. As a result, I refer to this as a social media delay. What it’s essentially doing is increasing the minimum account age from the current 13 to 16, recognizing the developmental vulnerabilities of younger users. The aim of the legislation is not to punish parents or children. There are no penalties for users. Instead, the onus is placed back on the platforms to do better.
We know from our research last year that 84% of 8- to 12-year-olds in Australia already had a social media account, and when we spoke to them, about 90% said their parents had helped them set up these accounts. This is a conundrum I think parents face around the globe: They are worried about the harms of social media, but they don’t want their children to be excluded.
This step will lead to a real normative change where parents can say, “I’m not preventing you from having social media or a smartphone, just not yet. And the government is saying that 16 is the appropriate age, so you will not be left out.”
It is really important for the companies to take responsibility for enforcing their own terms of service. Again, many of the 8- to 12-year-olds who already had accounts said they had never been banned or suspended for being an underage user. You can imagine that it should be pretty easy, given that the companies can target most of us with deadly precision when it comes to advertising, to be able to tell an 8-year-old from a 13-year-old.
I can make the regulatory guidance available to the committee. What we’re asking the companies to do is take reasonable steps to identify children who are under 16. By December 10, they will be expected to either deactivate or remove those children’s accounts, and we will be monitoring.
The government has tested about 50 different age-assurance technologies, and these were tested not only for accuracy and robustness but also to ensure that they were privacy-preserving. Almost every single one of these technologies used a form of AI. We’re not mandating the use of any specific technology. We know the companies will know their user base and their own infrastructure better than we will, but what we are asking them to do is take a multi-layered, “waterfall” approach to age assurance. That means applying different measures at different points in the user journey, from sign-up to account activity, supported by clear, accessible and timely review mechanisms. They are not allowed to ask only for government ID. They must also have user-reporting mechanisms for underage users who are missed, and an appeals process for those who are “overblocked” or whose accounts may be inadvertently blocked.
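The “waterfall” approach described above can be pictured as a chain of progressively stronger checks, each applied only when the previous one is inconclusive. A minimal sketch, with entirely hypothetical signal names and user record; real deployments choose their own signals and ordering:

from typing import Callable, Optional

# Each check returns an estimated age in years, or None if inconclusive.
AgeCheck = Callable[[dict], Optional[float]]

def declared_age(user: dict) -> Optional[float]:
    return user.get("declared_age")  # sign-up form value; easily falsified

def behavioural_estimate(user: dict) -> Optional[float]:
    return user.get("behaviour_model_age")  # e.g. activity/language signals

def facial_estimate(user: dict) -> Optional[float]:
    return user.get("facial_age")  # stronger check, reached only as a later step

WATERFALL: list[AgeCheck] = [declared_age, behavioural_estimate, facial_estimate]

def assure_age(user: dict, minimum: int = 16) -> str:
    # Run checks in order; stop at the first conclusive answer. A user
    # blocked in error would use the appeals process mentioned above.
    for check in WATERFALL:
        age = check(user)
        if age is not None:
            return "allow" if age >= minimum else "deny"
    return "refer_to_review"  # nothing conclusive; escalate to human review

The point of the cascade is that low-friction signals handle most users, and the stronger, more privacy-sensitive checks are reached only when needed.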
I’ll just finish by saying that privacy is an important consideration, and this will actually be a co-regulatory bill. The Australian privacy commissioner has also put out privacy guidance, and we will be co-regulating — I will oversee the implementation of the safety and age-assurance requirements, and the privacy commissioner will make sure that companies are not violating elements of the Privacy Act.
I do believe that we can strike the right balance between privacy and safety, and I think it’s imperative. I also think the ship has sailed. I was just out in Silicon Valley, and most companies now are looking at some form of age assurance, including gaming platforms and companies like Reddit. This is because there is a real global movement toward protecting children and making sure they are having age-appropriate experiences. The U.K., Ireland and the broader European Commission are looking at these things as well, so companies are having to step up.
I will just finish by saying that we were the first online safety regulator 10 years ago, and we were the only one for about seven years. When other countries came online, we started an organization called the Global Online Safety Regulators Network. I am pleased to say that Canadian Heritage and our friends at the Canadian Centre for Child Protection, probably some of our most valued partners, are observers to that network so that they can learn from all of us and we can learn from them.
Thank you so much. I’m happy to take questions whenever appropriate.
The Chair: Thank you, Ms. Inman Grant. Now, please, Mr. Schmid, for five minutes or so.
Tobias Schmid, Director of the Media Authority of North Rhine-Westphalia and Commissioner for European Affairs, German Media Authorities: Dear Mr. Chair, dear honourable members of the Senate, first, of course, thank you for having me. It is an honour to describe the German perspective and experience in this area.
Just as an administrative remark, I will draw some examples from one of our states, North Rhine-Westphalia. The reason is very simple: This is the area I am responsible for. But for your understanding, every measure we take in one state is always effective all over Germany.
Having said that and coming to the core question, how to find the right legal approach to protect minors in the online world is a very complicated and, often, emotional question. But why is it so important?
In the end, it’s very simple. Our studies say that minors are spending 3 hours and 21 minutes online daily. We have to realize that the online world is part of their reality. It is a little bit like the first time they go out of the home on their own: They take part of their life into the online world. This normally happens at about the age of 10.
In this reality, children are confronted with all kinds of content, including pornography. We have numbers that say 47% of all children have already seen pornographic content, and the average age at which they first come into contact with pornography is 13. It is also important to know that 56% of the children stated that they have seen pornographic content they would have preferred not to see. One last number that is very striking: 75% of the children believe that what they see on pornographic platforms mirrors reality.
That leads to the first and central point from our perspective. To make this absolutely clear, the problem is, of course, not pornography or the use of pornography by young adults. The point we have to take care of is how to protect children around 13 and not let them come into contact with something they are obviously not yet able to put in the right social context.
What is the broader situation in Germany? It’s very simple. Since 2003, the law has said that pornographic content is legal in digital services if the provider ensures that such content is accessible to adults only. Therefore, every content provider has to establish closed user groups.
One of the measures for this is age-verification systems. The media regulators in Germany have, for years, evaluated these age-verification systems, a little bit as our colleague from Australia described. We have around 100 different systems evaluated and approved.
That’s why, in the end, it is a very simple legal situation. What is the reality? I can describe an interesting case, also from the Canadian perspective. In 2019, the Media Authority of North Rhine-Westphalia initiated proceedings against content providers distributing pornographic content without respecting the legal rules that we have. The biggest of them is Aylo, operating the YouPorn and Pornhub sites. In both cases, there is only a button that says, “Yes, I’m 18.” This is not an appropriate age-verification system.
Aylo was consulted and informed about their legal obligations, but they did not stop conducting their business this way. That’s why we issued an administrative order stating that they may not operate in the German market as long as their websites fail to implement age-verification systems.
Of course, Aylo appealed these decisions before an administrative court. The court ruled that the order of the authority was lawful. There is still one pending decision of the Higher Administrative Court here. However, this appeal does not have a suspensive effect, and that is important to know. There is a clear obligation for Aylo to implement age verification or to stop doing business.
Aylo is respecting neither the decision of the authorities nor the decision of the court. It is still operating. That’s why, in 2024, we issued orders to the German internet service providers to block the services of Pornhub and YouPorn. Then, interestingly, Aylo reacted by circumventing the block by simply changing the web addresses. That leads us to a new step: The German legislator will, in the next two weeks, adjust the German media law, with the consequence that, in future, blocking one internet page will also block all of its mirror pages, so the block can no longer be circumvented.
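The mirror-page problem Mr. Schmid describes is that a blocklist keyed to a web address stops working the moment the operator registers a new address. A minimal sketch of the difference, using a purely illustrative content fingerprint; real enforcement systems identify services in more robust ways:

import hashlib

blocked_hosts: set[str] = set()          # old rule: block by address
blocked_fingerprints: set[str] = set()   # new rule: block by what is served

def fingerprint(page_content: bytes) -> str:
    # Illustrative only: a hash of the served content stands in for
    # whatever service-identification evidence a regulator relies on.
    return hashlib.sha256(page_content).hexdigest()

def block_service(host: str, page_content: bytes) -> None:
    blocked_hosts.add(host)
    blocked_fingerprints.add(fingerprint(page_content))

def is_blocked(host: str, page_content: bytes) -> bool:
    # A mirror at a fresh address evades the host rule but still serves
    # the same content, so the fingerprint rule catches it without a
    # brand-new proceeding.
    return host in blocked_hosts or fingerprint(page_content) in blocked_fingerprints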
I think that’s an important example not so much because it’s Aylo and it’s Canada but because it shows that if you implement something, then it’s also important to see how we can execute it and how the protection of minors can be efficient.
To conclude, I have just two little points because, in my experience, there are classic prejudices against age assurance or age verification. The first is always, “Yes, but this can be circumvented by VPNs.” That’s right. Of course, you can circumvent it; more or less every legal framework can be circumvented. But the core point is that we are talking mainly about the protection of children under the age of 13. They are not interested in circumventing anything. They have to be protected from coming into contact with something they are not even searching for.
The second point is, as my colleague said, the issue of privacy protection. Yes, it’s an important point, but here, too, I have two arguments. The first is, as I said, that we have 100 systems in the German market for age verification, and the number of cases where we have had problems because of privacy is exactly zero. The second and last argument is that there may well also be data protection or privacy issues, as the colleague from Australia said; the solution is again simple. The companies have to fulfill two standards: the protection of minors and the protection of data. That is the industry’s risk to manage. I’m quite sure that we shouldn’t lower the standards for the protection of minors just because there is another standard the industry has to fulfill; that is the industry’s problem, not the regulator’s. Thank you very much.
The Chair: Thank you, Mr. Schmid.
Senator Batters: Thank you to both of you for being here. You are in very different time zones than we are, so we really appreciate that because it’s an important part of our study to find out how different jurisdictions have handled this matter. Frankly, the more I hear from different jurisdictions, the more it reinforces to me how far behind Canada is, so thank you for helping us with this.
First of all, I would like to ask Mr. Schmid about Germany. Germany has enforced mandatory age verification for pornographic content since 2003. After two decades, what is your overall assessment of its effectiveness in limiting minors’ access to such material in your jurisdiction?
Mr. Schmid: Our experience, in the end, is as I described, and I think this is, or will be, the experience in other countries and states as well. First, you have to figure out how to make fulfilling the legal framework possible. That’s why it was so important that there was an evaluation process for all these age-verification systems, so the industry had a chance to get systems like these approved.
We also have porn distributors based inside the German market, and 100% of them use age verification; internally, the market is fully compliant. Unfortunately, we have an unlevel playing field because of services coming from outside, and in 2019, as I said, we started to enforce execution. The reason it is a bit more difficult to reach those providers is, of course, the European legal structure.
Aylo, for example, has a subsidiary operating out of Cyprus. That’s why Cypriot law is relevant first; only if Cyprus does not act can the German regulator enforce. That took a little bit of time, but the process is now clear. I am quite optimistic that, especially with the changes to German media law that we will get at the end of this year, there will be an effective system.
As I said, and this is really important, the internal German market has already been working to that standard for years and, again, without any additional problems in the area of privacy. That’s why the last thing I would like to share is that we have to be careful not to fall for every complexity trap the industry sets up.
Senator Batters: Yes, and some have argued that such a system would be technically difficult to implement or could fail on a large scale, yet Germany has applied it nationwide for over 20 years. What lessons do you think Canada can draw from your experience?
Mr. Schmid: The easiest lesson we have learned is that it is not a problem. The technology exists. There are in reality no problems with data protection and privacy. It works. The only change for the industry is that they earn a little less money because they are not putting children in danger. To be honest, the easiest answer I can give you is just do it. It will work.
Senator Batters: One further thing, you got into this a little bit, but I would like to hear a bit more. I also want to thank you for the statistics you provided us with because they really set out the alarming nature of what we are dealing with here. This stat jumped out at me: “One in two respondents . . . stated they have seen things in pornography that they would have preferred not to see.” I think this is what we are trying to prevent, dealing with children who are potentially really harmed by the types of things they are seeing, possibly involuntarily.
My last question to you is this: When a pornographic website is hosted abroad but clearly targets the German market, how do your authorities proceed in practice to enforce that law? Could you describe the different steps from issuing prohibition orders, perhaps, against the provider to involving hosting services or internet service providers, and could you also tell us whether those measures have proven to be effective in practice, particularly in cases brought before the courts, like MindGeek?
Mr. Schmid: Thank you very much for this complicated question. The starting point is very simple. The first point of contact is always the content provider, the company that is distributing the pornographic content. As I told you, we have contacted all the platforms. Our logic is that we first contact the platforms with the highest reach because the potential for danger is highest there. The reaction from the international platforms, some of them located in Canada, was, as I said, no reaction at all. They have not respected this rule.
That’s why German law says we can escalate. If the content provider is not reacting, we have the ability to address, in the next step, the internet service providers, like Deutsche Telekom, Vodafone and so on, and issue blocking orders. The blocking orders were respected by the service providers. That was not a problem; they were executed in about three days.
Our experience was that the industry circumvented this with the simple trick of changing the web address. In German law, we then have to start a new procedure. This procedure is time-intensive because we have to respect the European mechanisms. I will not bore you with that.
The solution is that now the so-called mirror domains will be addressed. That means that when we block content at one web address, every web address distributing the same content is also blocked. To be very honest, for us, blocking content in the online world is the absolute ultima ratio; it does not sit easily with the idea of media freedom. But in this case, we have a very clear legal situation in which the way the content is distributed is illegal, and that’s why we are taking these intensive measures.
At the same time, and just because Europe is a little complicated, there is a second track: It is also the responsibility of the European Commission to act in cases involving the so-called very large online platforms, and Pornhub is, of course, one of them. They have to ensure that they systematically protect minors. The German media regulators raised these cases, and the European Commission started an investigation against the same companies because there could be systemic faults in how these companies are structured if they are not implementing age verification.
To make a long story short, one way or another, it will in the end have the same effect that Ms. Inman Grant has also described. There will be change, especially because, in the end, it is easy: They only have to respect one rule, which says there has to be a mechanism to protect minors, and then they can do their business as before. It is not really complicated.
Senator Batters: Thank you.
Senator Miville-Dechêne: I will address my question to Ms. Inman Grant. You gave us a good overview of the whole system that you are dealing with as a commissioner. You must be a very busy woman.
I would like you to focus a bit more on pornography because, in that case, Australia went forward, then stopped and then decided to go again. I want you to tell us a little bit about why you hesitated a couple of years ago to go forward. What changed? Also, could you explain in more detail the pilot projects you started? I’m wondering how efficient you found the different methods, especially across demographic groups, including Indigenous groups. What have you found? Are there some biases?
Ms. Inman Grant: Lovely to see you again, senator. These are great questions.
I have been working on age verification in some way, shape or form since 2008, when I was at Microsoft and at the Harvard Berkman Klein Center. They created a big internet safety technical task force. From the industry perspective, we sought to roll out an age-verification pilot in Australia back in 2008. What I can say from this historical perspective is that the industry, the technology and the ecosystem have come such a long way.
We were asked by a committee in 2021 — there was an inquiry called Protecting the age of innocence — to put together an age-verification road map. What is really interesting is that the dialogue has changed a lot. It was a much more polarizing discussion then, particularly around facial age estimation and the keeping of biometrics. People were adamant that you could not have all three legs of the stool. This is the way I look at safety, security and privacy: If one leg of the stool is weak, the stool falls over. All of those imperatives, as Mr. Schmid said, have to be taken into account. This is precisely what we found in the technical trial.
It took us two years to put the age-verification road map together because we did such deep consultation. There is an extensive background paper with lots of research. When we tabled it with government, we made a number of recommendations. One of the first was not to mandate any specific technologies or systems without first testing the technology and looking at how it would be implemented in the Australian context.
We had watched, frankly, what had happened in France, where some of the major porn companies did actually apply age‑verification systems, and that created friction. Those that were doing the right thing in terms of having some age verification lost business, and the French users just moved to less compliant sites elsewhere. Of course, there is always going to be a lot of jurisdictional arbitrage.
Interestingly, at the time, the government did not accept our recommendation to do a technical trial. This became a big political issue between the government and the opposition — again, as an independent statutory authority, I stayed out of it. We made recommendations. What happened then was quite a populist movement around the social media age-restriction bill that started in our states, thanks to Jonathan Haidt and The Anxious Generation. One of our premiers said, “If the federal government is not going to do something, we are going to institute parental consent for 14-year-olds.” Then another state said, “We’re going to do it differently, but for 15-year-olds.” Another state said, “We’ll do it for 16-year-olds.” I think the government understood this had to be a federal, national issue that we do together.
What happened two years later is that right as they were considering the legislation, they finally endorsed and funded the age-assurance trial. So that was the back and forth.
I have now been working toward deploying 16 or 17 sets of codes. The first set deals with illegal content, like child sexual abuse material and terrorist content, and the second deals with harmful content, which is what we call “class 2” content. We have a unique codes-based structure. It’s co-regulatory, separated into eight different sectors of the economy. The industry actually writes the codes for itself. My role is to determine whether or not they meet appropriate community safeguards. If they do, I register them. If they don’t, I move to making a mandatory standard.
That was the delay and why it’s taken so long to implement these codes. With the age-verification bill, the social media age‑restriction bill, we were kind of building the plane as we flew it.
Senator Miville-Dechêne: If I can interrupt you, what about those tests? What can you tell us about those pilot projects that would assure you that somebody of another colour or another demographic, including Indigenous Peoples — there are such communities in Australia as well — will not be discriminated against?
Ms. Inman Grant: That’s an important question, and that was tested for. This goes back to the vendors needing to train their models. Safety by design has to be part of every element of the AI life cycle when they’re developing these tools.
The data they choose, is it representative of the population? Of course, most of these vendors are overseas. We did a number of tests to ensure the unique ethnicities of Australia were picked up. Again, they’re getting better all the time.
For some of the platforms, accuracy lagged a little for darker skin, Asian ethnicities and First Nations Australians. But where these companies know there is a lack of accuracy, they can continue testing in a privacy-preserving and consent-driven way, training their classifiers on these different ethnicities.
One of the facets of our regulatory guidance is that you need to work toward consistent improvement. If your technology shows perceived bias — I don’t think we went as far as to find bias, just not the levels of accuracy across the broad range of ethnicities we need to see — the burden will be on the companies to continue training their classifiers so that they pick this up. It is going to be different for every country.
Senator Miville-Dechêne: When is your age estimation and age verification for porn content going to come into force?
Ms. Inman Grant: The age-restriction bill is coming into force on December 10 of this year, so very soon, and then there are the codes, which will require the companies to prevent under-18s from accessing the content. The age-restriction bill is about raising the age to 16; the codes are about the content. The content protections will come in December and March. We are very close.
Senator Prosper: Thank you to the witnesses. This has been a good education, sitting in and listening to your evidence.
Ms. Inman Grant, I believe you said there are over 50 different age-assurance technologies out there. I believe that’s what you said earlier.
Mr. Schmid, you mentioned that 100 different systems were evaluated and approved with respect to the technology.
What we have heard previously from other witnesses — an item of consideration pitted against this type of technology — is its intrusiveness, privacy considerations and data breaches, even statements to the effect that a data breach is not a matter of if but of when; I recall one witness stating that.
Given your experience, do you think that is a false claim? Or do you think that, given where the technology is — I guess it all comes down to what information is provided up front and how long various companies or entities hold on to it — data breach is more and more of a risk?
Could you provide your opinion on whether there are major concerns with respect to privacy when you think about it from a technological perspective of all these different age-verification technologies?
Ms. Inman Grant: Sure. What I probably should have clarified with respect to the age-assurance technical trial is that companies that wanted to be independently evaluated or tested put themselves forward.
The 50 technologies tested are not an exhaustive list of all the technologies out there. There was quite a great variation, including an interesting one from France, called BorderAge, that looks at hand movements. It uses AI; apparently, there is a lot of medical science around the ligaments and how you move your hands. I thought it was interesting, and it tested well in terms of accuracy and robustness. That speaks to a concern many people have: If biometrics and a facial age estimate are taken of your face, can you be reidentified?
It is a 1,400-page technical document. The overall finding was that age verification can be done in a privacy-preserving way that is accurate, robust and effective, and it will continue to get better. Data breaches are going to continue to happen; that is a burden on the companies developing the technology, but the way companies deploy the technology matters.
There was recently a data breach reported by Discord. It wasn’t actually the age-verification technology; it was a third-party vendor handling an appeals process who didn’t handle the data correctly.
Yes, there could be a data breach, but many of the technologies we look at take an image of your face, and the companies are not allowed to keep that information. We are not supporting the building of a different kind of surveillance system, if you will.
If I’m honest, the ship has already sailed. That is how the companies monetize your data and your children’s data. That is how the system has been built. We have done everything possible in our construction of this to ensure that this does not happen. We have had a lot of pushback from the companies. We have 2.5 million 8- to 15-year-olds. They want to continue monetizing them and building a pipeline for the future.
We need to try everything we can to ensure we are protecting kids from content they are not ready to process. I believe we are seeing a real change in the sexual socialization of an entire generation: harmful sexual behaviours that kids see online and then perpetrate against each other. Thirty percent of the porn they see now is choking and strangulation porn. This is not the Penthouse in your dad’s sock drawer — not your dad, of course. Violent and explicit porn is readily available online and, as I think was noted, on social media sites. About 75% of kids are seeing it on social media sites, not just porn sites. It is a problem we have to get ahead of.
Senator Prosper: Mr. Schmid, can I get your perspective on this?
Mr. Schmid: Yes. I have two answers to this point. The first, as I said, is trust in practical experience. As you correctly said, we have about 100 of these systems approved, and the practical experience, at the moment, is that we have zero cases of problems with data protection or privacy protection.
Of course, we are in close exchange with our colleagues from the privacy agencies so that there is a close watch. My practical experience at the moment — and, of course, that says nothing about the future — is that there are no problems.
For the future, I think the most interesting part is the technology. It’s interesting that you in Australia, Ms. Inman Grant, also have the hand-movement system at the moment; it is the same technology we are looking into. It’s remarkable how fast this technology is developing, and with minimized risks.
I think one additional point could be the structure between the company delivering the age-verification system, on the one hand, and the company delivering the content, on the other. It could be a good idea to especially support structures where there is a separation between the two. Then, as we all know, age verification is needed not only in the area of pornography; it is also relevant for banking. In Germany, we also have a very strong law around gaming, and there, too, we need age verification. That’s why this industry is growing, and the technology is growing. It could be an interesting aspect to look at separating the structure that hands out the permission, or the certificate, that a user is 18 or older from the pure business of, for example, porn platforms, but also banks or the gaming industry.
These are my two points. I know that this is very dominant in the discussion at the moment, and, if you allow me, as I said, it could also be a bit of a complexity trap that industry can use to protect its interests. I was a lobbyist in my past; that’s why I have some ideas about the dark side of the world. On the other side, as I said, the practical experience is more reassuring. It’s not something you can be relaxed about, but the practical experience is what it is. In addition, it could be interesting in future to look at the structure of who owns the data and who owns the business, so to say.
Senator Simons: Dr. Schmid, I wanted to start with you. I was in North Rhine-Westphalia this summer, so I’m enjoying hearing from you.
You say that in Germany porn providers have been required to verify age since 2003. And yet your data also shows that porn consumption is rising and rising, which suggests to me that whatever you’re doing is not working. So if you have these tight restrictions, why is consumption of pornography by younger teenagers rising so dramatically?
Mr. Schmid: Thank you for your question and, of course, for your visit to the wonderful North Rhine-Westphalia.
The question is a good one. The answer is, in the end, very simple. The porn content consumed by children, and also by adults, comes mainly from companies that are not located in Germany. That’s why enforcing the rules we have is a bit more complicated. Either the content comes from outside the European Union, in which case it is legally very simple in the end, because German law applies and it is more a matter of execution. Or, and this is the structure mainly used, the company places a subsidiary inside Europe. Then the so-called country-of-origin principle applies, which says that, in the first instance, the authority where the company is located is responsible. Maybe that is one of the reasons why all these companies are so in favour of being in Cyprus. I don’t know why; maybe because it’s nice there. But that creates a very complicated path, because we also have to respect, of course, the European rules.
To make a long story short, that’s the main reason. The answer is that we have to enforce these rules across borders, which is what we have done, starting in 2019 and then very consistently since 2024. That’s a good path. To be honest, I think this is in the end a good sign, because it shows that authorities and media regulators take very seriously the question of whether blocking content is appropriate. We are, of course, always also talking about freedom of expression, even for porn. That’s why, from my personal perspective as a lawyer, it is necessary to be very strict in respecting the rules of the process, in Germany but also in the European Union.
So the short answer is that the content at the moment comes mainly from Pornhub. We have unofficial numbers that say Pornhub has 110 million users per month. That is rather impressive, especially given the fact that I know nobody personally who has ever used the service. But that’s another story.
So it is mainly that, plus the complicated structure involved in finding a legal answer.
Senator Simons: But this is my question. It seems to me that if you can’t enforce the law — and from the brief you provided us, it sounds like Pornhub has been, to put it politely, unwilling to abide by the law — then what is the point of the legislation except to harvest data from people? I know you’re proud of the fact that you’ve never had a breach, but one suspects it is only a matter of time until something happens.
Why are you collecting all this information if it’s not actually stemming the flood of pornography?
Mr. Schmid: Because we will be effective. As I said, by the end of the year we will reach a situation where we can block the circumvention. To be very honest, we have an interesting case here: In my years as a regulator, I have never before seen a company so consistently ignoring all orders from courts and authorities, and the consequence will be that we stop the online distribution. As far as I can see, this will happen at the latest at the beginning of next year. That’s it. In the end, it will work. It’s a long way, but it is what it is.
Of course, I can’t speak to the data risk. As you said, there is a risk; it is what it is. But from our perspective, that’s my job: I’m responsible for protecting minors, and that’s not something I can stop doing just because there is an additional issue.
Senator Simons: Ms. Inman Grant, it sounds like Germany’s law is very much focused on pornography sites, like Pornhub, that exist to distribute porn. But you testified that a lot of children see porn incidentally on social media sites. When Australia’s laws come into effect, will they only target people who are in the pornography business? Or will they also reach other sites and platforms where porn may appear from time to time but which are not primarily porn sites?
Ms. Inman Grant: For the social media minimum age, again, that is about restricting children’s access under the age of 16. The burden falls on the platforms to age-verify them and make sure that they are not holding an account.
For our codes, it is quite complex, but I believe that responsibility and accountability have to exist up and down the technology stack. The equipment manufacturers have responsibilities; the app stores have responsibilities; the search engines have responsibilities; the internet service providers have responsibilities. And they are all different; each code is different. So, for instance, I mentioned the search engine code; safe search is a common practice. What we’re asking all of the search engines to do is to blur explicit content when people come across it so that they don’t have that incidental exposure.
We’re asking the app stores to create age-assurance signals for social media sites, for instance. And then, we have got a couple of broad ranges of services. Our designated internet services code does cover porn sites specifically. It also covers gaming sites, AI chatbots and companions. I just want to reiterate that AI is a very present and probably catastrophic harm. I am sure you have seen that there has been litigation in the United States against both Character.AI and ChatGPT. But ChatGPT, or OpenAI, is now talking about making erotica available, even though they don’t have fully deployed and well-designed age protection or age-assurance tools yet.
Senator Simons: When you say you’re asking the apps and asking the platforms, are you asking them, or are you telling them?
Ms. Inman Grant: They are mandatory requirements. If companies violate them, they are subject to fines of up to AUS$49.5 million. We’re starting to use some of those powers now. I was just able to get a chatroulette site called OmeTV, which basically pairs pedophiles with young people, taken down. They refused to respond. This jurisdictional arbitrage that Mr. Schmid is talking about is very live. So I went to Google and Apple, which are gatekeeper services with their app stores. They take a 30% cut of every piece of revenue that comes from each of those apps, yet chatroulette sites are against their terms of service. So I said, “You have responsibilities under the app store code. We will take you to court if you don’t de-platform them.”
So we have to become clever about moving upstream and making sure that all of these players have some kind of responsibility. It can’t just be porn sites and social media sites. It has to be the device manufacturers, the app stores, the search engines and everyone in the ecosystem. That’s how ours work.
Senator Simons: Thank you.
Senator Saint-Germain: My first question is for Ms. Inman Grant. It’s a continuation of the question that my colleague Senator Simons asked both of you. Ms. Inman Grant, could you give us a bit more information about the recent amendments to the act to include social media age restrictions? In a nutshell, what is the reason for them, and how would you implement them?
Ms. Inman Grant: This is something that our prime minister felt very strongly about — the fact that social media companies need to have more social responsibility, and that they are systematically using harmful and deceptive design features, such as endless scroll, Snapstreaks or opaque algorithms that drive kids down rabbit holes. They are very powerful forces that kids cannot see, let alone fight against.
So the decision by the government was passed in a bipartisan manner. It was supported by both of the major parties. It was to give them a delay and put the onus on the platforms to prevent children under 16 from having and holding an account.
There wasn’t much information about how that would be done, so my team and I had to figure that out. It is laid out in the regulatory guidance. Basically, we have told the companies what they need to do by December 10. Prior to December 10, they need to identify underage users in a kind and compassionate way. We have already worked with them through transparency reports; we know, for instance, that 96% of kids are on YouTube and that there are 2.5 million 8- to 15-year-olds. That gives you a clear indication of how many underage users are on there. So we expect they will identify who those users are, and then, on December 10, they will start deactivating and removing them.
They will have to make user reporting available in case they miss someone so that parents and educators can say, “Hey, my under-16 is still on this.” They have to have an appeals process in case they overblock.
The other thing that is important, which touches on what Mr. Schmid said, is that the burden falls upon them to prevent circumvention, whether it’s location-based circumvention through VPNs or age-based circumvention. Kids are creative, right? They will try to get around things. They will probably try and spoof the age verification systems, and they can do that now with generative AI. They can do it with gaming graphics. They can wear a plastic mask.
We also expect that there will be a black market for “pre-age-verified” accounts. We have told the companies in very specific detail what we expect them to do in terms of “pre-age verifying.” Then they have to be very transparent about whether or not their deployments are working. Again, if they aren’t putting the technology systems and processes in place to prevent future under-16s from having and holding an account, that’s when they would be subject to an investigation and up to AUS$49.5 million in fines.
We don’t expect this to be perfect. However, I’m not going to let perfect be the enemy of the good. It’s putting important friction in the system. It’s also worth noting that I’m working with 12 specialist academics from around the world to do an evaluation that is separate from the legal review. We’re going to look at things like whether kids are sleeping more. Are they actually getting out on the sports fields? Are they interacting more? What are the unintended consequences? Are they going to darker corners of the internet? Is 16 the right age? When other countries want to look at what we are doing, they will have the benefit of an evidence base that we weren’t working from.
Senator Saint-Germain: Is your organization funded only by the Australian government, or is there a partnership in the financing of your organization?
Ms. Inman Grant: We are an independent statutory authority. I report to the Information and Communication Technology, or ICT, ministers, and we are fully funded by the government. I believe we’re now close to AUS$60 million and 250 employees. When I started, we were AUS$10 million and 30 employees, but as the impact and the regulatory schemes have grown, we have had to grow with them. Education is a very important part of this. I talk about the three Ps: prevention, protection through our regulatory schemes, and proactive and systemic change.
This is about how we can minimize the threat surface for the future. How are we looking ahead? How is the metaverse going to affect things? What are our concerns about AI and the quantum internet? Again, we’re looking ahead so that these things aren’t hitting us in the face, and we’re putting the burden back on the platforms through safety by design, just as we did with seat belts. Parliaments required the embedding of seat belts so that traffic fatalities would go down. We think technology needs to have its seat belt moment: assessing the harms and risks up front and building the protections in up front rather than retrofitting after the damage has been done.
[Translation]
Senator Oudar: Thank you, Ms. Inman Grant and Mr. Schmid. I think your input is essential, so thank you for sharing your expertise and experience with our committee.
I’d like you to comment on something you mentioned in your presentations, the fact that the digital ecosystem doesn’t stop at the border. In fact, countries aren’t really separated by borders anymore.
In your experience, what international coordinating mechanisms have the most potential to reduce the number of minors moving to less regulated platforms? For example, could technical interoperability, mutually recognized certification or harmonized standards help? Ms. Inman Grant can go first.
[English]
Ms. Inman Grant: That’s an excellent question. There actually is a technical standard that is pending or almost in place, ISO/IEC 27566, which will provide consistent technical standards for age-assurance systems. This is really important because the engineer is king in Silicon Valley, and standards are what they build towards. So that is very important.
I think interoperability when we talk about different laws and legislative systems is probably going to be the biggest challenge we all face in the internet regulatory space.
We often speak about permissive hosting environments. Mr. Schmid named Cyprus; I believe XHamster, another one, is based there. A lot of the worst gore sites, terrorism sites and child sexual abuse sites that we investigate are located in the United States and the Netherlands, and also in France, although the French are doing a better job.
Where there aren’t strong laws and online harm is enabled, that’s going to spill over all of our borders. They say that a rising tide lifts all boats. This is why we speak to other parliaments. We would love Canada to come on board. Of course, we watched the Online Harms Bill, and we’re sad to see that it didn’t get through, because we would love to partner with you. I think we have so many similar sensibilities in terms of how we approach things. Again, until all of us are safe and all of us have regulators in place, I think none of us is totally safe.
Mr. Schmid: That’s a good question, and it’s not so easy to answer. On the technical side, I agree with the point that Ms. Inman Grant made.
The more complicated point in your question is the political situation, of course, because we are talking about regulation of media, and it starts with the fact that all over the world, we have obviously a very different understanding of terms like freedom of expression and the question of how strongly states, state regulators and so on should intervene. I’m not so optimistic that we will find a global standard for that.
To be more optimistic, I think it makes sense to differentiate between two areas. I am sure that, like Ms. Inman Grant, we are not only taking care of minors; we are also protecting values like human dignity against extremism, fighting against disinformation and so on.
My experience is that there are two very different settings. The first setting is business-driven; pornography, for example. In the end, it’s about earning money. The second area is politically driven, by extremists, by enemies of democratic systems, et cetera.
For the first part, I would say this is something we can get under control over time because, in the end, all these companies would like to earn money in big markets. Therefore, in the end, they will respect the rules, because if they do not, they can’t earn money. So the answer for this area is good collaboration between regulators, consistent rules and clear enforcement, to make clear that if there is a rule, you have to accept it; if you accept it, then you can do your business here. That’s it. This is not a matter of different layers. In the end, it’s about money, and we have to balance that against the idea of a free society.
The other area is much more complicated because the motivation behind disinformation, manipulation, hate and so on is mainly, or at least in the first instance, not economic. That’s why it’s harder to find answers. It’s also harder to define the borderline between protecting users and society, on the one hand, and respecting the idea of freedom, on the other, because with every measure you take to protect your society against this kind of aggression there is always, at the same time, the risk of destroying the freedom you would like to protect.
To make a long story short, for the more business-driven side — and this is mainly for the protection of minors — I’m quite optimistic. Excuse me; let me give you one additional piece of information from my legislator, the German legislator. As I said, they will adjust the law, and there will be another interesting rule: the follow-the-money rule. From the beginning of next year, we will be able to stop credit card companies from offering their services to porn platforms that do not respect the media law.
So I’m quite optimistic that the will to respect the law will increase in the same measure as the income goes down. That makes me more optimistic. In the end, collaboration with Canada, collaboration with our partners in Europe and learning from the experiences in Australia are the answer. And then we need to be patient, because it takes some months to find answers.
[Translation]
Senator Oudar: Thank you very much to the both of you.
[English]
The Chair: I’ve just been informed that Ms. Inman Grant is not able to stay to give further testimony. Is that true?
Ms. Inman Grant: I was told that it would end at 10. I have to catch an airplane.
The Chair: All right. Thank you.
Ms. Inman Grant: I can take a couple more questions if that’s helpful.
The Chair: All right. We’ll see.
Senator Dhillon: Mr. Chair, could we direct questions to Ms. Inman Grant, and then we can come back to the other witness?
The Chair: Sure.
Senator Dhillon: Ms. Inman Grant, thank you for staying with us, and thank you for the important testimony you have given today.
Aside from the ongoing harm that we now know and understand our children will continue to face, what other risks or challenges are we going to face in Canada if we don’t act now?
Ms. Inman Grant: I often compare online safety to water safety. Maybe for Canada an ice-skating analogy would work better.
We were one of the first countries to start with phone bans, very much as we were one of the first countries in the world to require fencing around pools backed by enforcement because of tragic backyard drownings. When the government started contemplating this whole idea of a social media ban, I said, “This isn’t the same as fencing pools; this is like trying to fence the ocean.” This is not how we approach water safety. We have to take a holistic approach. We always supervise our kids in the water. We teach them to swim from the earliest stage. On the beaches, we have lifeguards. We teach them to swim between flags. When there are sharks, we have shark nets.
I think the risk of having no regulation means you’re effectively not having any protections for your citizens. Of course, I think we would all agree that children are the most vulnerable.
We often talk about social media being a great social experiment. I worked for 22 years in the technology sector, and I tried to change things as a safety antagonist from within. One of the requirements of this job was that the eSafety Commissioner had to have significant standing in the internet industry and experience in online safety. The reason was that they wanted an expert who knew exactly what the talking points of the technology companies were going to be, who would know what they were capable of, but who would also understand their limitations so that we would not ask them to do anything beyond their power or control.
I think, again, this has been an unregulated experiment. These are the most powerful, richest technology companies in the world, and they are telling us they can’t deploy the technologies they already use to make their platforms safer. Somebody has to hold them to account. If no one holds them to account, I believe they will continue to run roughshod over governments.
I think it’s imperative. How a country decides to do that is up to it and its democratically elected parliament. Obviously, we want to get the balance right between freedom of expression and protection.
Senator Dhillon: Thank you. Have a safe flight.
The Chair: Thank you, Ms. Inman Grant, for assisting the committee in its work. Thank you for coming today. Safe travels.
Ms. Inman Grant: Thank you. Best of luck to you all.
Senator Tannas: Mr. Schmid, I wanted to confirm what I thought I heard you say earlier in an answer. Maybe it was actually in your opening statement. You mentioned that some of the larger pornography providers were openly not complying with regulations in your jurisdiction. Did I hear that right?
Mr. Schmid: Yes.
Senator Tannas: One of them, I think, was Pornhub. Is that right?
Mr. Schmid: Yes.
Senator Tannas: Mr. Chair, I think I would like to raise a question of privilege around this. I believe we have been misled by Pornhub, by their representatives, who are Canadian, conveniently.
I would like to ask the steering committee to consider engaging our legal department to review the testimony and then come back to us with a recommendation on whether we should use our powers to compel the individuals from Pornhub, and from any other organization associated with them that appeared here, to testify under oath. We need to get to the bottom of this. I think we owe the world the chance to have these people come clean and clarify. They seem not to be doing what they said they were doing in other parts of the world, and we are the only ones who can actually bring them forward and have them tell us the truth. Thank you.
The Chair: So a question of privilege has been raised. Are there any comments or debate on the question?
Senator Pate: I support that, and I think we also need to revisit the testimony from the previous iteration of this bill, when pornography websites appeared before the committee and actually said they would not follow the law. You may recall that I asked that question when the lawyers for the new iteration of Pornhub were here, and I think pulling all of that material together would be vitally important.
An Hon. Senator: Agreed.
The Chair: Do any other senators wish to make a comment on the question of privilege?
In this regard, then, the steering committee will be asked to engage legal counsel to revisit this testimony, pursuant to the question that has been raised, and potentially to compel witnesses who have previously appeared before the committee, and who may have misled it, to testify under oath.
All right. Thank you.
Senator Clement: Thank you, Mr. Schmid, for being here and providing testimony. I have two questions. I want to come back to the idea of data sovereignty and the desire of Canadians to ensure that their data is well protected. We heard from Michael Geist, an academic and professor, who said that the bill would result in millions of Canadians being required to disclose data to private, for-profit foreign entities against whom, as we heard from the Privacy Commissioner, enforcement of Canadian privacy laws may be limited.
I wonder if you could comment on that particular concern from your perspective. Canadians are feeling nervous these days about all kinds of people, even our close neighbours who we thought were good friends.
The second question is one that Senator Miville-Dechêne asked Ms. Inman Grant, about the bias and systemic racism built into these market operators. You talked about some 100 systems in your German market. What do you know about what they are doing? Do Germans, particularly Black and racialized Germans, trust these systems? I ask because I’m hearing from my folks here in Canada that there is a lack of trust around some of that. Any comments you have about that would be appreciated.
Mr. Schmid: Thank you for the question. On the first point, yes, I understand the nervousness, especially about the question of where the data goes. It has been an interesting experience for us, too, that relations with partners can obviously change a little, including across the Atlantic. But I cannot give you a good answer to that; it is what it is.
One point we need to keep in mind, as I said, is that it would be very worthwhile to look closely at the structure of the companies that verify age in relation to the companies that earn money from the business. That is something I am structurally interested in.
The second point — and this is really important from my side — is what the answer should be if we do not get a satisfying answer on data security. Could the answer be that we then lower the protection of minors? I think that makes no sense. My answer would be that this is a risk the industry has to solve, because the industry created the risk. In the end, we cannot set up a kind of competition between two values. It is up to us as a society to protect our inhabitants in that way. That is crucial.
To be honest, I cannot give you a good answer to the question of how secure this data will be in general going forward. We have very strong data-protection authorities — our privacy authorities here in Germany. At the moment, we have no evidence of an existing problem in the systems implemented in the market. That is what I can share. But, of course, as I said, this is not an answer for the future.
On the other question — yes, that is a good question. I was very interested in Ms. Inman Grant’s answer, because I think they have more experience with this than we do. The reason could be that the structure of our population is a bit different, so the issue is not as prominent. To be very open, I have no evidence or knowledge of issues in this area in our market at the moment. That does not mean they don’t exist, but they have not been brought to our attention so far.
One reason could be that, of the roughly 100 age-verification systems, not all are based on biometrics or AI. Some are classic procedures — you go to the post office and present your identity card. Those are much simpler, because no technical discrimination is possible. So the figure of 100 does not mean 100 cases of artificial intelligence using your biometric data, to put it in perspective.
For the systems that do use biometric data, I have, at the moment, no specific knowledge of discrimination issues. But certainly, as in all areas where we use biometric data, we have to be careful, whether because of bad intentions or because — as Ms. Inman Grant said — the systems need enough material to learn from. We have to take care that these mechanisms are fed with data as balanced as reality itself, so that, in the end, the systems behave as they would in reality and are not skewed toward one colour or ethnicity or another.
Senator Pate: Thank you very much for being here. My question is this: What is the critical mass? Have you looked at how many countries or jurisdictions need to take this on for it to have a meaningful impact globally? We know that companies keep moving; as you mentioned, they change platforms, names and configurations. Looking forward, what are the next steps you see for us to really address online harms in a meaningful way globally?
Mr. Schmid: That is a very good question. As I said, we have to differentiate between the areas. Where the interests are business-driven, I think it is a bit easier. There, you can look at which markets are most important to these companies. I am sure that if the European Commission, together with the national regulators in Europe, enforces the rules as we are doing at the moment, that will be a strong argument for companies to change. It is also important to remember that they can still do their business. Pornography is not forbidden; they just have to respect one rule.
By the way, they all have these techniques; it is not rocket science. That is why I think that if we are able to use the German law, and if the initiative from our colleagues in the U.K. and France continues, then you have three strong European markets that will encourage the European Union to stay active on this matter. Then you have the European market as a whole, which is an important market for this business. And if, at the same time, countries like Canada and Australia address this topic, there will be a tipping point where the industry says, “Okay, if we want to continue our business, we have to respect this and meet the standards that have been set.” I am quite optimistic.
At the moment, all these companies are in a bit of a dilemma. The company that changes first will lose money at first, because there is a risk that users will move to another platform. On the other hand, my experience is that once we have enforced the rule in one or two cases, the industry normally follows, because then there is again a level playing field.
That is why I am optimistic. Of course, it will take some more time. An initiative like the one you are discussing at the moment, coming from a country like Canada, would be extremely helpful. As Ms. Inman Grant said, the atmosphere is changing in this direction, because the story that this is not possible is not true. It is possible, and we can do it.
For the other area — the protection of human dignity, the protection of pluralism, and the fight against disinformation, manipulation, hate and so on — the answer is much more complicated. I would say there is one central point, and if we establish it in as many democracies as possible, that will help: to remember that the whole idea of a society having rules is that society sets the rules, through the legislator, and not the industry. The industry has to follow the rules. If the industry does not follow the rules, then it is up to the authorities to enforce them.
For too long, we went along with the idea that the online world would regulate itself, or that the platforms are just pipes and nobody can change anything about that. That is not the truth; of course we can change things. In the end, someone decides something, and people are responsible for the things they do. The whole European discussion about the Digital Services Act is exactly the same discussion you are having at the moment. It is driven mainly by the moment you decide to come back to the idea that society sets the rules, and the others then act in line with what you, as legislators, have set. If we do so, we have a chance to bring this, step by step, a little more under control.
In the end — I am sorry, but this is an important topic for us as well — that will stabilize freedom and democracy. As we have learned in recent years, when we cannot enforce certain central ideas and cannot protect certain central values in the online world, it undermines the idea of democratic freedom. That is why I think it is so important that we find a good, balanced way to bring this part of our reality, too, under the democratic control that already exists in the “old world.” It is an idea that works.
Senator Pate: Thank you.
The Chair: Thank you, Mr. Schmid, for coming before the committee and assisting the committee in its work. We really appreciate and value your testimony.
Mr. Schmid: Thanks a lot. It was a pleasure.
The Chair: Colleagues, we were expecting to conclude the testimony of witnesses today, but the steering committee now has specific instructions to explore the possibility of having witnesses who have already testified return for a second appearance to testify under oath. So I am sure we will have to adjust the work plan somewhat.
Senator Saint-Germain: When you are done, I’m going to make a comment.
The Chair: I’m not quite done. We were hoping to do clause-by-clause consideration on Wednesday. Following that, we plan to work on Bill S-205. The steering committee is looking at the work plan tonight. Tomorrow, we will reconvene and devote the discussion to the special studies the committee may undertake — a set of potential studies that the steering committee has circulated.
So I encourage members to review that list of studies and familiarize themselves with the issues being put forward, along with the background material the clerk has circulated.
Senator Saint-Germain: I would like to suggest that when we are ready to go to clause by clause on Bill S-209, we hold an in camera meeting to discuss things before proceeding to clause by clause.
The Chair: Any questions?
Senator Batters: Senator Saint-Germain, why would we need to do that in camera when all the testimony has been in public?
Senator Saint-Germain: If this committee decides to do it in public, I have no issues.
Senator Batters: That would normally be how we do it. I just wondered why you suggested going in camera.
Senator Saint-Germain: It was so that some colleagues might feel freer to speak and exchange comments or views. But, once again, I am for transparency. Personally, I have no issue, so if the view of committee members is that we should have a general discussion and an overview of the bill before we go to clause by clause, from my standpoint it is very relevant to do so in this case.
The Chair: My understanding is that the chair would ask everyone, “Are we ready to proceed to clause by clause?” At that time, we can address this issue.
Senator Batters: Chair, we could also certainly have a more general discussion about the bill. I know it has sometimes been the practice in the last year or so at this committee for clause by clause to start right away, going through the different sections. It used to be the practice in this committee to have a more general discussion first, with the opportunity to ask questions and more of a debate. That can certainly happen at the start, and it doesn’t need to be in camera.
Senator Saint-Germain: Personally, I have no issue with having the conversation in public.
The Chair: We don’t have to make a ruling on that now. We will deal with it when it next comes up, which will be prior to clause by clause. In the interim, we have work to do tomorrow on the potential special studies, and the steering committee has a number of questions to deal with shortly.
(The committee adjourned.)