
THE STANDING SENATE COMMITTEE ON LEGAL AND CONSTITUTIONAL AFFAIRS

EVIDENCE


OTTAWA, Wednesday, May 26, 2021

The Standing Senate Committee on Legal and Constitutional Affairs met by videoconference this day at 4 p.m. [ET] to study Bill S-203, An Act to restrict young persons’ online access to sexually explicit material, and to study, in camera, the subject matter of the elements contained in Divisions 26, 27 and 37 of Part 4 of Bill C-30, An Act to implement certain provisions of the budget tabled in Parliament on April 19, 2021 and other measures.

Senator Mobina S. B. Jaffer (Chair) in the chair.

[Translation]

The Chair: Honourable senators, I am Mobina Jaffer, senator from British Columbia, and I am pleased to chair this committee. Today we are holding a meeting of the Standing Senate Committee on Legal and Constitutional Affairs. Before we begin, I would like to offer a few suggestions that we feel will help us have an efficient and productive meeting.

If you encounter any technical difficulties, particularly in relation to interpretation, please signal this to the chair or the clerk and we will work to resolve the issue.

[English]

Senators, I will do my best to get to everyone who wants to ask a question of the witnesses. In order to do so, I ask senators to try to keep their preambles and questions brief. Members, you will have four minutes to ask questions. I ask that you signal to the clerk through the Zoom chat only if you do not have a question. If you are not a member of the committee, please signal to the clerk if you have a question.

As you know, senators, we have two orders of business: We will start by resuming our study of Bill S-203, An Act to restrict young persons’ online access to sexually explicit material, and we will then move in camera to consider our report on the pre-study of certain sections of the budget implementation act.

Senators, there has been an updated agenda; we have more witnesses. Please check that you have an updated agenda.

Before I begin, I want to introduce to our witnesses the members of our committee. We have our two deputy chairs Senators Campbell and Batters, Senator Boisvenu, Senator Boniface, Senator Carignan, Senator Cotter, Senator Dalphond, Senator Dupuis, Senator Pate, Senator Simons, Senator Tannas and Senator Miville-Dechêne.

For our panel on Bill S-203 this afternoon, we have four witnesses. We welcome you today. Some of you accommodated us on very short notice, and I thank you for that.

From Yoti, we have Julie Dawson, Director of Regulatory and Policy; from Justice Defense Fund, Laila Mickelwait, MPD, Founder of Trafficking Hub; from Oxford Internet Institute, Victoria Nash, Professor and Director; and from the BC Coalition of Experiential Communities, Susan Davis, Sex Worker and Director.

We will start with a presentation by Julie Dawson.

Julie Dawson, Director of Regulatory and Policy, Yoti: Thank you for the opportunity to join the Canadian Senate today.

Yoti is a digital identity platform, a team of over 300 people headquartered in London but also represented in Canada. At Yoti, we provide age assurance, both with hard identifiers from documents and also low-friction age estimation via facial analysis. We do this globally. We are keen that there is broad understanding by regulators around the world of the more recent approaches that have undergone thorough scrutiny and approval cycles.

In order to do this, we’ve worked with testing authorities in Germany and the U.K., where all approval and audit mechanisms have been put in place. We have passed audits, for instance, those set up by the BBFC, whose audit body was NCC; by the Age Check Certification Scheme, delegated from the Home Office; and in Germany by the FSM and KJM.

We are also keen on following standards, so our approaches meet those of the PAS 1296 for age checking. That’s the Publicly Available Specification. I also have the honour of serving on the drafting committee of the next ISO, International Organization for Standardization, which will follow on from the PAS.

We believe in stakeholder consultation, so we recently held another round table on our age approaches, inviting 55 regulators and NGOs from seven countries. We’ve since been briefing government teams from the U.S. and New Zealand who are also researching these areas.

We’re proud to be one of the largest global providers of age assurance approaches, and we work across a whole range of industry sectors, including social media, such as Yubo, the live-streaming social media site; age-restricted or adult services; law enforcement; and non-profit organizations. Our solutions have been integrated by some of the world’s largest adult content sites, notably mentioned publicly by MindGeek, and our approaches are already in use by them for certain use cases, such as model age and identity verification.

Our approach is also in use by law enforcement to ascertain the age of victims and perpetrators in child abuse material, by the leading child safety non-profit, Childline, and with the Internet Watch Foundation, whereby a young person can apply to remove an indecent or sexting image by proving just the fact that they’re under 18.

To date, we have conducted over 500 million age checks — that’s in about the last 18 months. We transparently publish the results of our facial analysis age estimation algorithm. At the moment, accuracy levels — that’s mean absolute error — for 13- to 25-year-olds are within 1.5 years.
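To illustrate what a mean absolute error figure of this kind means in practice, here is a minimal sketch. The ages below are invented for illustration and are not Yoti data.

```python
# Hypothetical illustration of how a mean absolute error (MAE) figure
# such as "within 1.5 years" is computed; these ages are invented.
true_ages = [13, 16, 19, 22, 25]
estimated_ages = [14.2, 15.1, 19.8, 21.0, 26.4]

# MAE is the average of the absolute differences between the
# estimated ages and the true ages.
mae = sum(abs(est - true) for est, true in zip(estimated_ages, true_ages)) / len(true_ages)
print(round(mae, 2))  # 1.06
```

An MAE of 1.06 years for this toy sample would mean the estimates are off by about one year on average, which is the sense in which "within 1.5 years" describes the quoted accuracy.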

To give you an overview, we have three approaches and we can offer these side by side. Two of them rely on a root identity document, where just the data-minimized attribute — that could be over 18, or the full age if the regulator requires that in use cases where it’s needed for legislation — is shared with the site. To be clear, an adult site would not need to have access to any other biographical or identity details.

Let me take the first one, which is the reusable Yoti app, a digital identity app. This requires a three- to five-minute set-up by a user on their mobile phone, where they add an identity document that is verified. Over 10 million people globally have set up the Yoti app and in two clicks can share an 18-plus or an age attribute anonymously with a site. There’s no surveillance with this. We provide the private key to the individual and they share just the data-minimized attribute.

The second approach is for somebody who doesn’t want to set up an actual app. It allows an organization to integrate the ID checking in their own flow. The user uploads their identity document straight from their browser. That document is reviewed, just the date of birth is extracted, and then only a data-minimized age attribute, such as 18-plus, is shared. No document or image needs to be shared with the content site.

The third and lowest-friction of our approaches is one that’s been devised with inclusion in mind, for people who might not have access to identity documents or might not own one. This is age estimation using facial analysis. To be clear, this is facial analysis and definitely not recognition. There’s no recognition of an individual. The image is instantly deleted each time an age estimation is done. No data is retained. This has been independently tested, reviewed for bias and approved.
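The "data-minimized attribute" idea described above can be sketched as follows: the content site learns only a yes/no answer, never the document, the name, or the full date of birth. This is a hypothetical illustration of the general pattern; the function name and return shape are my own, not Yoti's API.

```python
from datetime import date
from typing import Optional

def data_minimized_age_attribute(date_of_birth: date, threshold: int = 18,
                                 today: Optional[date] = None) -> dict:
    """Return only the minimal attribute a content site needs:
    whether the person meets the age threshold. The name, document
    image and full date of birth never leave the verifier."""
    today = today or date.today()
    # Compute completed years, adjusting if the birthday hasn't occurred yet.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return {"over_threshold": age >= threshold, "threshold": threshold}

# The content site receives only this dictionary.
attr = data_minimized_age_attribute(date(2005, 6, 1), today=date(2021, 5, 26))
print(attr)  # {'over_threshold': False, 'threshold': 18}
```

The key design point, as described in the testimony, is that the verification step and the content site sit on opposite sides of this boundary: everything to the left of the return statement stays with the verifier.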

We’re very happy to share these details if there’s further interest. There’s also a place on our site where you can have a go, if you’re feeling brave, and have your own age estimated.

All three of these age-assurance methods can be integrated by an organization in just a few hours, be they a large or small content platform, without prohibitive costs. We also offer a customizable portal where all three approaches can be offered side by side in one single integration. It’s either stand-alone or the three side by side. We publish all of our pricing transparently so that it is accessible to large, medium and small content producers.

The final area that might interest you is around age tokens. This allows an adult to securely prove to one or more sites that they have been age verified by Yoti. In one sitting, a user would not have to re-prove their age multiple times. So a non-shareable, one-time use age token lasts for just one session or the period of time agreed by the regulator.
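The single-use age token described above can be sketched like this. It is a hypothetical illustration of the general mechanism — an opaque, unguessable token that carries no identity data and is consumed on first redemption — not Yoti's actual implementation, and the session length is an assumption standing in for whatever a regulator would set.

```python
import secrets
import time

_TOKENS = {}               # token -> expiry timestamp (in-memory store)
SESSION_SECONDS = 30 * 60  # assumed session length; a regulator would set this

def issue_age_token() -> str:
    """Issue an opaque token after a successful age check.
    The token carries no identity data; it only proves the check happened."""
    token = secrets.token_urlsafe(32)  # cryptographically unguessable
    _TOKENS[token] = time.time() + SESSION_SECONDS
    return token

def redeem_age_token(token: str) -> bool:
    """A site redeems the token exactly once; pop() makes it single-use."""
    expiry = _TOKENS.pop(token, None)
    return expiry is not None and time.time() < expiry

t = issue_age_token()
print(redeem_age_token(t))  # True: first use, within the session window
print(redeem_age_token(t))  # False: the token has already been consumed
```

Because redemption removes the token from the store, a token that leaks after use is worthless, which is one way to make such tokens "non-shareable" in the sense the witness describes.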

I hope that gives a good overview. Thank you very much for your attention today.

The Chair: Thank you very much, Ms. Dawson. We appreciate you making time for us on such short notice.

We will now go to the Justice Defense Fund, Laila Mickelwait.

Laila Mickelwait, MPD, Founder, Trafficking Hub, Justice Defense Fund: Thank you very much. Please excuse the graphic nature and language that will be used. I want to make sure we’re all on the same page and understand the reality of what we’re talking about when we use the word “pornography.”

I took a sample from Pornhub and Xvideos, the world’s most popular porn sites, yesterday and today of videos on their home page. These are videos that children would easily access: Stepsister gets cum blasted in the face. Schoolgirl in white socks masturbates in front of teddy bear. Blonde slutty anal stretching and rough gaping anal pounding. Daddy rough face fuck and spanking schoolgirl. Young man is fascinated by tender body of teen who licks his butt. A tiny teen fuck toy gets hard rough anal from stepbrother. He destroys her in a sex fight and humiliates her. Mom went to work and dad woke me up and recorded me. He said it was to masturbate when I went to class. Spy camera on sister-in-law. Finally, my parents are not home and I tied up my sister and did everything I wanted.

This is the most popular and accessible pornography on the world’s most popular porn sites today. These mega, multinational porn corporations are making hundreds of millions of dollars per year via advertising and user data collection because of unlimited, unrestricted access that drives enormous amounts of traffic to their sites.

In 2020, Pornhub boasted 47 billion visits per year and 130 million visits per day. That traffic also consists of underage child users. These corporations clearly care about profit over people, especially the most vulnerable children.

Pornhub’s senior community manager was caught saying the following about age verification for children. She said “Pornhub stands to lose 50% of traffic,” and that, “It costs money for us to verify” and that “overall it’s a disaster.” She also said: “Mindgeek loses money. Any age verification devastates traffic.”

Clearly, these profit-driven corporations cannot be trusted to self-regulate, and it is the job of government to protect the most vulnerable — especially children — from abuse, and children accessing hardcore pornography is abuse. Not only is the content that children are accessing on mainstream porn sites like Pornhub often violent representations of assault, incest, rape and other sex crimes, but because these sites do not verify the age or the consent of those in the videos, the videos on these sites are often real sex crime scenes — crime scenes of actual drugged, unconscious sexual assaults, child rape, sex trafficking and other forms of criminal, non-consensual, image-based sexual abuse.

A recent study published in The British Journal of Criminology found that one in eight videos on the home pages of the most visited and popular porn sites in the world — Pornhub, XVideos and xHamster — contains depictions of real sexual crime. Many people argue it is the responsibility of parents to protect their children from these sites. This is a callous and naive view. We all know there are many children in Canada and around the world who have neglectful parents, or parents who are working too hard to put food on the table to pay attention. Many children have no parents at all. What about those children? Who is going to protect them from Pornhub?

Even children with attentive parents are not under the supervision and watchful eye of those parents 24-7. It is unfair to parents and unjust to children to place the burden of protection solely on those parents. I am often contacted by distraught mothers and fathers whose young children accidentally or intentionally out of curiosity access mainstream porn sites.

One mother recently lamented to me:

My children have been exposed to porn from Pornhub as early as 10 years old. The damage has affected my children and my home and all my children are now in therapy. One of my sons ended up molesting a child after being exposed to porn on Pornhub and acting out what he saw.

A distraught father said:

My 12-year-old son accidentally came across Pornhub while searching for PS4 on Google. He was devastated after seeing Pornhub because he thought the women were being raped. My son hasn’t been the same since. I would like to sue Pornhub for what the company did to my child.

However, we don’t need to rely on anecdotal evidence because we have over 40 years of peer-reviewed research that makes it clear that child access to pornography harms children. I will share this compiled list of research on this issue with the committee.

Lastly, I will end with this. Studies have shown that child exposure to pornography is most often unwanted. The Kaiser Family Foundation has reported that well over two thirds of 15- to 17-year-old adolescents have seen pornography when they did not want to or intend to access it, with nearly half reporting being very or somewhat upset by it. Unwanted sexual experiences are what we call sexual assault. You see, when a child witnesses porn through unwanted exposure or even by curiously stumbling upon it, unaware of what it would contain, it can become a serious and traumatic event in the life of that child. Viewing pornography is a sexual experience. We know why thanks to advances in neuroscience. Researchers now understand the function of what are called mirror neurons in the brain and the role they play in how the psyche and the body react to viewing things with our eyes. Mirror neurons are a type of brain cell that react in the same way when an action is actually performed and when that same action is witnessed being performed by someone else.

These mirror cells were first discovered in the 1990s by a team of Italian scientists who identified this phenomenon in the brains of monkeys. When the monkeys reached for a peanut, the exact same individual neurons in the brains of the monkey fired as when they simply watched another monkey grab a peanut. The neuroscientists who discovered the existence of these mirror neurons said:

If watching an action and performing that action can activate the same parts of the brain in monkeys — down to a single neuron — then it makes sense that watching an action and performing an action could also elicit the same feelings in people.

Indeed, it does. This phenomenon can be applied to the case of children viewing pornography.

The Chair: Can you please wind up now?

Ms. Mickelwait: Yes. Probably 30 seconds more.

Neuroscientist Donald Hilton has said that viewing other humans’ experienced sexuality is sexuality to the viewer. An 11-year-old boy or girl, therefore, experiences the unwanted, often violent, criminal sexual visuals in pornography as a form of sexual trauma.

In 1970, Thomas Emerson, a legal theorist who was a major architect of civil liberties laws, said that imposing what he called erotic material on individuals against their will is a form of action that has all the characteristics of physical assault.

The UN Convention on the Rights of the Child, Article 19, says:

State Parties shall take appropriate legislative . . . measures to protect children from all forms of physical or mental violence, injury or abuse . . . including sexual abuse . . . .

I urge this committee to see the importance of Bill S-203 and the urgency with which it needs to be passed. Thank you.

The Chair: Thank you very much. We will now go to the Oxford Internet Institute and to Professor Victoria Nash.

Victoria Nash, Professor, Director, Oxford Internet Institute: Thank you, Senator Jaffer. It’s a real pleasure to be invited to speak this evening.

I thought it might be helpful to just introduce my own background and area of research before I give my preliminary comments. I am a social scientist, a political theorist and I conduct research into children’s internet use and the strengths and weaknesses of different digital policies targeted at children.

On this particular topic of pornography and age verification — or AV — or age assurance, my primary experience came several years ago when the U.K.’s digital economy bill was being prepared and I was asked to conduct a review of evidence related to children’s access to online pornography, the routes by which they access that pornography and the efficacy of age verification or age assurance measures.

More recently, we’ve been looking at this topic again but from the perspective of data protection. In the U.K., we have something called the Age Appropriate Design Code which has its foundation in data protection regulation that seeks to deliver principles to companies providing online services likely to be used by minors, such that again it introduces incentives for companies to identify the age of users.

I would say, first and foremost, that as an academic researcher it is very much up to all of you as senators to determine whether or not age assurance or age verification are appropriate policy tools for preventing access to pornographic material online. But I would flag several quite important considerations that deserve to be taken into account when looking at the utility of this particular tool. Vitally, the factors I’m going to briefly list highlight certain trade-offs that will inherently exist in any legislation you produce that has its basis in verifying or sharing the age of users.

The first factor that I think is worth considering is that age assurance is generally a very blunt tool that has no regard for the maturity or vulnerability of the child. This is perhaps less of a problem when we are looking at a narrow scope of online content. For example, looking at say commercial pornography providers, certainly in the U.K., it would be illegal anyway to produce and publish that material to minors. Arguably it may be more of a problem if your legislation wishes to cover a wider array of online content, so things like social media for example.

The reason it matters is this: when I say that AV or age assurance is a blunt tool, the point is that it will tell you whether that user is over or under 18, or over or under 13. What it won’t tell you is whether that particular individual is particularly vulnerable and might need additional protections, or is particularly mature and may be well placed to enjoy an element of online risk. Again, I think it matters what type of platform you are using these tools on.

This matters in particular when thinking about the benefit young users derive from being able to enjoy those platforms. Again, we can demonstrate quite strongly some of the incredibly powerful arguments for keeping children off commercial pornography sites. But I would say those same arguments don’t apply quite so strongly in relation to social media sites, where there are many clearer benefits to younger users, which we would not wish to see them separated from.

I think Julie did a very good job at the beginning of talking about the importance of privacy protection in using age assurance tools. I would add that I think a company like Yoti has high standards of data minimization and protection, including for very young users. In the U.K. context, we had some qualms about companies more closely linked to commercial pornography providers delivering age verification or age assurance, because we felt there was more incentive for them to use a user’s data in generating improved services, which probably wouldn’t exist in the Yoti case.

Third — and I only have one after this — competition issues are worth bearing in mind. If we look at online companies, it is very much the case that some may actually benefit from the introduction of high-cost regulatory barriers. In other words, the companies that make the most money are the ones that traditionally will be able to comply with regulations or legislation. That may not be so easy for smaller providers. I don’t know what our next speaker will say, but I have read some interesting research in the context of pornography that suggests what that might do is disadvantage, for example, ethical providers, feminist providers, providers of content suitable for different sexual minorities and so on. I would be interested to think more about that.

Last but not least, it’s worth considering freedom of expression and information. Again, these considerations may seem more limited in relation to commercial pornography provision, but depending on the breadth of the legislation you’re intending to introduce, I think it’s important to remember two things. First, sites with heavy sexual content may themselves be educational. Certainly, we had problems in the U.K. when we first introduced at-home filtering, such that some important sexual health charities were blocked for teenagers. Second, it is worth bearing in mind that as teens age, they are going to seek out risk. They are going to seek out sexual content, and the nearer they are to the age of majority, the more it’s worth considering what counts as free access to information, even sexual information. It is important for that older age group.

In summary, I would suggest it’s really important when you look at this legislation to think about what both age verification and age assurance would offer in terms of protection, but also what they may close down, depending on the scope of your laws. Second, I would note, in concluding, that all our research suggested that age assurance and age verification are no silver bullet. A determined teen who wishes to seek out this material will always find ways of doing so. That is why education policy, sexual health education and relationships education are a vital accompaniment to tools of this sort. Thank you very much.

The Chair: Thank you very much, Professor Nash. I know you were contacted at a very late time and you still accommodated us. We thank you for that.

Susan Davis, Sex Worker, Director, BC Coalition of Experiential Communities: Thank you for taking the time to hear from me today. I would like to start by acknowledging that I’m testifying today from the unceded traditional territories of the Squamish, Musqueam and Tsleil-Waututh Nations.

I’m here today to represent Canadian sex workers as the Director of the BC Coalition of Experiential Communities, and I want to start by stating that I am not an academic. I am a sex worker actively working in the sex industry as an independent escort for 35 years. I have broad experience in many areas of the industry as many sex industry workers do. I migrated across the industry and this country with the availability of work. I have worked in micro brothels, businesses run from residential spaces, escort agencies, massage parlours, adult film and on the street in Halifax, Moncton, Montreal, Toronto, Surrey, New Westminster and Vancouver. I have spent time in prison and survived numerous assaults and attempts on my life. I have battled cocaine and heroin addiction and survived four overdoses. I also note that it seems that I’m the only Canadian witness here today, which I think is a bit of a shame.

I’m not a representative of a minority of sex workers in Canada. I am a representative of the majority. My experiences with criminalization, violence and the outcomes of biased laws have led to my 19 years in advocacy for the rights and safety of Canadian sex workers. My greatest achievement has been the adoption of the lowest level of enforcement policy by the City of Vancouver and Vancouver Police Department, which has now been adopted by all 45 police services and the “E” Division of the RCMP here in British Columbia. The net result of those sex worker safety centred policies is a tremendous success, with safer indoor workspaces stabilized and no murders of sex workers in the city of Vancouver for 12 years and counting.

I am here today to try to represent the interests of my community and to draw attention to the many flaws and impacts Bill S-203 could have on our lives and safety. First, there is the biased and flawed data being used as the foundation of this bill, with no regard for the rules set out in this country governing research involving human beings. There is also the current context of sex workers making difficult choices during the COVID-19 pandemic, with a significant migration from in-person sex work to online and adult film sex work; further restricting people’s options for survival in this regard will have predictable negative outcomes.

Of course, facial recognition and age verification technology are flawed and experimental technologies that have been shown in several recent studies to be biased, racist and sexist on platforms where they have been implemented. The use of “age verification” and “age estimation” as terms describing these technologies reveals these flaws and the potential for bias or algorithmic mistakes. As for privacy, protected storage of this kind of data by tech companies has proven to be impossible. Breaches are happening daily, and given the intimate nature of this data, the risk is unacceptable. The stigma faced by sex workers and the potential for weaponizing that data using stigma and fear are real and predictable. The United Nations Declaration of Human Rights prohibits the use of the rights of one group to destroy the rights of another. Sex workers’ rights and safety are not a reasonable casualty in the experiment proposed by this bill.

Missed opportunities — the exclusion of the people with the most experience in this area of trying to verify the age of the users of their sites, and who will be directly impacted by this bill, will lead to negative outcomes. They have lots to offer in terms of knowledge and experience, and I think it’s a mistake to bypass them as witnesses in this work. Impartiality — statements made by committee members in support of this bill before your work is complete demonstrate a lack of impartiality toward sex workers, who are citizens of Canada with rights. The Standing Orders and the codes of conduct that govern your work require impartiality.

The anti-sex work lobby is working hard, of course. This is an attack on our community and our ability to feed and house ourselves and our families. All sex workers post sexually explicit images on the internet and all sex workers will be impacted by this bill. We would like to acknowledge how difficult it can be to hear the stories of tragedy and the most extreme cases being presented. These stories and assertions are meant to impact you emotionally. No one is trying to dismiss the stories of those impacted by exploitation or non-consensual image sharing or those negatively impacted by consuming pornography. However, we are asking that you try to understand the political agenda of the anti-sex work lobby. They want you to be offended. They want you to feel repulsed. They want to abolish our community and choices. They make biased, emotional and ideological arguments against our ability to feed and house ourselves with no consideration for the impact on our lives.

We appeal to you for impartial and rational attention to these issues and for understanding that the extreme examples being given are not a reflection of our industry, and that the majority of people working or operating businesses in our industry are ethical and share the same concerns about exploitation and the proliferation of violent sexual images. This bill falls flat, with no strategy or funding proposed for the obvious remedy to issues with young people viewing sexually explicit material: sex education. There is no planning for this critical aspect of providing information and support for youth who are exploring their sexuality.

Last, we are people with families who love us. Our lives and safety depend on your decisions here. This cannot be overstated given the current context of sex workers’ choices in Canada. We hope to see a more balanced approach to these issues moving forward that respects sex workers’ lives and safety as citizens of this country.

We recommend that the committee take the time to scrutinize all research and data presented for consideration and decisions regarding this bill, to ensure ethics and to meet the test of the Tri-Council Policy Statement, and that the committee reach out to adult film business operators to draw on their expertise and experience with age verification.

Further, we ask that the committee seriously consider the risks to the lives and safety of sex workers when making decisions about this bill; that the committee remember its commitment to impartiality and try to refrain from emotional responses; that the committee remember the difficult situation faced by Canadian sex workers during this incredible time and work to not further narrow people’s choices for housing and feeding themselves. We request that the committee recognize why other countries have abandoned the idea of facial recognition and age verification technology and that the committee reject Bill S-203 in its entirety based on the clear and predictable threat it poses to sex workers and the privacy of all Canadians. Thank you.

The Chair: Thank you very much, Ms. Davis, for your presentation. Ms. Davis, I want to assure you that we’ve heard from many panellists and many Canadians on this issue. We are honoured that our international witnesses are assisting us with this bill and showing us what is happening in their countries and internationally. We appreciate them being here, but we’ve heard from many other witnesses as well.

We will now go on to questions and we will start with the sponsor of the bill, Senator Miville-Dechêne.

Senator Miville-Dechêne: Thank you to all witnesses for your very thoughtful testimony. I want to come back to this idea of privacy on age verification because since I’ve presented this bill, I’ve been asked this question many times.

To Victoria Nash and Julie Dawson, do you think porn platforms, porn sites, should do this age verification or should it be a third party who does it?

More pointedly, to Julie Dawson, you said you now have a contract with Pornhub. Pornhub was asked by the French authorities to verify the age of their customers a few weeks ago. Are you involved with that particular project?

Ms. Dawson: To your first point, I think it’s crucial that you don’t mark your own homework. I would think that one of the key learnings from the Digital Economy Act 2017 was the recommendation that age checking should be done outside of the companies.

To your second point with regard to MindGeek, we are working with MindGeek. We have presented material with members of an organization called Point de Contact, which is probably the French equivalent of the Internet Watch Foundation. They operate a hotline on this subject and they have presented materials around this. They have been part of the committees for the CSA — le Conseil supérieur de l’audiovisuel — and we have shared materials with them. We haven’t, as yet, been invited to a session such as this with them, though we have offered to do that should they so choose.

The integration with MindGeek goes across several areas. One is for actual models and assessing the age and identity of models, and the second area is with regard to users. That is something that was already a discussion in the U.K. two years ago.

Crucially, to recap a couple of points, it is most definitely facial analysis. It is not facial recognition. First, it detects a face, and second, it looks at the characterization of the face and assesses age. In that training set, there is no recognition. It’s just an anonymous face with the month and year of birth used to train the algorithm. It can never recognize an individual. I hope that is helpful.
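Ms. Dawson's distinction between facial analysis and facial recognition can be sketched in code. The following is a minimal illustration only, not Yoti's actual implementation; the function names and the toy age model are hypothetical. The key property is that the pipeline returns only an estimated age and a confidence, never an identity, and retains nothing from the input image.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AgeEstimate:
    # The result of facial *analysis*: an estimated age and a confidence.
    # Deliberately contains no identity information of any kind.
    years: float
    confidence: float


def estimate_age(image_bytes: bytes) -> AgeEstimate:
    # Hypothetical pipeline standing in for a trained model:
    # 1. detect a face in the image,
    # 2. extract age-related features,
    # 3. regress an age from those features.
    # Unlike facial *recognition*, no template is stored and no attempt
    # is made to match the face against known individuals; the image is
    # discarded as soon as the estimate is produced.
    if not image_bytes:
        raise ValueError("no image supplied")
    # Toy stand-in for a trained regressor: derive a deterministic
    # pseudo-age from the pixel data so the sketch is runnable.
    pseudo_age = 18 + (sum(image_bytes) % 40)
    return AgeEstimate(years=float(pseudo_age), confidence=0.9)


# The caller receives only the estimate -- the only "trace" is a number.
result = estimate_age(b"\x01\x02\x03")
print(result.years, result.confidence)
```

The design point this illustrates is Ms. Dawson's claim that the system "can never recognize an individual": nothing identifying ever crosses the function boundary.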

Senator Miville-Dechêne: It is. Dr. Nash, perhaps you could expand on the question of who should verify what, as well as privacy.

Dr. Nash: This is a very important aspect; I'm glad you're focusing on it. I would absolutely agree with Ms. Dawson that it should be a company one removed from the pornography providers that is responsible for carrying out age assurance checks.

I actually think that for two reasons. The first is the marking your own homework point, which has already been made. The second consideration, though, is around the use of individual data. From a privacy perspective, it is helpful to use companies who are not incentivized to analyze the data showing, for example, which adult sites are being visited by which individuals, how frequently, what type of content, et cetera. Again, I think keeping these companies at arm’s length helps with that.

One thing I'll add that we missed in the U.K. in our Digital Economy Act 2017 is that there do have to be minimum standards for data protection and privacy set up in conjunction with this sort of legislation to ensure that any approved providers for age assurance do meet minimum standards and can assure users there will be no possibility of data collection and data breaches.

Senator Miville-Dechêne: Thank you.

Senator Batters: My first question is for Laila Mickelwait, founder of #Traffickinghub and CEO of the Justice Defense Fund.

First, it’s a good point you make to call it child sexual abuse material instead of child pornography because clearly that’s what it is. I understand you are supportive of this legislation, but my question is: Does your organization believe that this bill goes far enough or are there additional measures that parliamentarians should be taking to restrict young persons’ online access to sexually explicit material?

Ms. Mickelwait: Yes. I think that this bill is an extraordinarily important step. I think it does go far enough. I think that being able to make sure that those who are used in pornography behind the screen are of age and are consenting adults and not victims of trafficking or child rape, for example, is crucial. As is protecting those who are in front of the screen and making sure that those accessing this content are not children.

In my opinion, these are common sense, protective measures that need to be implemented. We don’t allow children to go into a sex shop and purchase a pornographic video. We should not allow them to enter a virtual sex shop and do the same.

Like I mentioned before, the examples I gave at the beginning of my presentation were not extreme examples. They’re meant to elicit disgust, but they are frankly samples of the most popular home page materials on the world’s most popular and most frequented porn sites. These are not extreme examples. This is the content that the children will access when they land on these sites.

Senator Batters: My second question is for Julie Dawson from Yoti.

When the U.K. was contemplating this type of approach, there were many skeptics who pointed to online tutorials and step-by-step instructions on how to sidestep age verification technology. For example, one U.K. journalist demonstrated how he sidestepped age verification in less than two minutes by generating a non-existent credit card number. I'm wondering: has the technology improved since 2018, and does it concern you that resourceful minors might be able to easily work around this technology?

Ms. Dawson: Thank you for your question. Yes. Absolutely, more approaches have come onto the market over the last several years and those will keep on evolving. Absolutely, you're right that there will always be those people around the world. In the same way that we have a fraud landscape, we will always have people looking at how to circumvent things. It would be disingenuous to say otherwise.

However, our methods use anti-spoofing checks to see how hard it is to circumvent the technology. If you’re an audit body, that should be the test you’re undertaking. That is one of the things we have offered. We have suggested to the different audit bodies that they employ the age technologies to assess if the work that platform is doing is working, but also it’s vital that NGOs and the auditors do check and find out how easy it is to circumvent the technology.

If you're looking at age estimation using facial analysis, we've looked at, for instance, the impact of masks and mask wearing over the last period. We've also looked at makeup. We've performed a whole range of different tests, and on an ongoing basis we look at what the next attack vectors are. That is crucial.

Senator Batters: Thank you.

Senator Dalphond: Thank you to the panellists for being with us today. We’ve certainly had a wide range of perspectives and this has been very informative.

My question is for Professor Nash. I know it’s late, but I see that you’re still alert. You’ve done a lot of research. I found the age verification techniques very interesting — the report you did in 2013 about gambling. I would like to really speak about the introduction in the U.K. of the age restrictions to adult content online. Could you give us more substance about how it is happening, where it’s going and what the problems are, if any?

Dr. Nash: I will do so. You might ask Julie, as well, who I suspect sees this on the ground even more than I do.

Thank you for the nice words about that report. Many elements of that 2013 report are fundamentally still relevant now. There are lessons we can learn from sectors such as illegal gambling that have, in particular countries, introduced quite rigorous age checks.

The U.K. situation is an odd one. We had a Digital Economy Act introduced in 2017 that purported to mandate age checks for — I can’t remember the exact definition; I know it’s commercial pornography providers, but it was for when more than a third of the content on a site was deemed pornographic. The regulator was set up to be the British Board of Film Classification and we all expected to see the full roll out of that scheme a few years later. In fact, it never happened and I’m sure Julie and others in the age verification sector felt very frustrated by that.

We currently have a new bill before Parliament, the Online Harms Bill. Again, that was expected to include very specific measures around accessing adult content and it was, again, expected to introduce new measures regarding age assurance. It's actually missing. So the suggestion at the moment is that it might be introduced as an amendment later in the parliamentary term. I believe that the Secretary of State is open to that. But yes, it's very interesting it didn't appear.

I don't know the rationale as to why it has been excluded or why it failed. After the Digital Economy Act was introduced, a lot of the media coverage suggested it was due to concerns about privacy and data protection standards. I suspect a fair amount of it was due to industry lobbying as well.

Senator Dalphond: Ms. Dawson, do you want to add something about the inability so far to provide some kind of workable measures to restrict access?

Ms. Dawson: Yes. There were probably several factors — Brexit, and a new term of government that might not have wanted it. We call it the "Daily Mail test": the newspapers might make bringing in something quite controversial and unpopular not the best thing to do in your first 100 days in office. So probably the timing wasn't the best.

However, there are several things that have been happening. Professor Nash mentioned the Age Appropriate Design Code coming in this September in the U.K. It’s focusing a lot of attention on age checking across a whole range of platforms — all platforms looking at designing appropriately for age. Across Europe, there is also the Audiovisual Media Services Directive that is looking at video sharing content. There is also the online safety bill that Professor Nash mentioned.

Across Europe, there is also euCONSENT, an EU project building an interoperable approach for age verification and parental consent that kicked off on March 1. From the adult industry side, I took part in the development of PAS 1296, where it was clear to me that the adult companies were looking for a level playing field. A lot of them were parents and, despite what people might think, they said that if there is a level playing field and a specific date, they will meet it. What they didn't want was a different playing field for social media providers versus adult providers, or for enforcement to be unclear.

We’ve seen that even more in the last six months, whereby there is a lot more proactive discussion from the content companies that have probably done quite well over the last 18 months of lockdown — them looking and thinking how they should get ahead of the curve when they see on the horizon the Audiovisual Media Services Directive, the online safety bill, as well as the Age Appropriate Design Code. Clearly, the writing is on the wall; hence, a lot of those providers have been proactively doing due diligence behind closed doors, looking at what methods they would integrate. We have integrated with quite a few different ones, of which only a minority are already public.

Senator Dalphond: Thank you.

Senator Pate: Thank you to the witnesses.

We've heard from experts and young people about the lack of effective sex education for young Canadians, as well as young people worldwide, and how this can exacerbate the harms that young people experience when they come upon pornography, with those sites becoming their source of education in the absence of other sources.

I’m wondering if each of you have thoughts about best practices for sex education. Do you have some examples we could be looking to in terms of observations this committee might make in terms of some of the necessary corollaries for this type of legislation? Thank you.

Ms. Mickelwait: I can add a thought here.

Regardless of age verification implementation, there will always be those who are going to end up being exposed to pornography. It’s important to develop resilience in children to be able to go through those encounters and come out the other side unscathed.

There needs to be a significant amount of education for those who are caretakers of children in schools and also for parents. We need to create shame-free zones where children feel that they can speak about these issues and that they can go to a trusted adult to be able to ask questions and get healthy answers versus going to mainstream pornography sites where they will see sexual violence and possibly actual sex crime scenes as their sex education.

Ms. Davis: I would just like to say that there’s never been adequate sex education in this country. This is a huge issue that needs to be addressed in a much broader way. There needs to be funding to NGOs and others to develop these programs. It has to be in partnership with youth; the people who will be receiving the training need to be included in the development of the training.

I have never heard of a really good sex education program in any country, anywhere, that’s actually met this sort of a need. It should include something about internet literacy and teaching children how to avoid these kinds of things if they’re worried about it. We have help lines and things like that for people to call. Educate them about what resources there are for them, where they could call somebody and discuss something if they found something they were uncomfortable with.

But the main point here is that this bill, with none of that in place, is pointless. All you’re going to do is narrow choices for youth who are trying to explore their sexuality under the umbrella of trying to protect children and put nothing in the void or in place of that. It just makes no sense to me that the bill has no provision for that whatsoever.

To me, the main issue with this bill is that there is no provision, consideration or even thought given to the people who work in this field. No one seems to be mentioning them whatsoever.

So, yes, for the sex education portion, work with youth and elders in First Nations communities. Talk to people about what it is that’s appropriate for various communities across the country and make sure it’s appropriate.

Senator Pate: Was anything like this done in Britain when it was introduced?

Ms. Dawson: I remember some materials existing through the NSPCC. I don’t know if Professor Nash would have access to the sources.

Dr. Nash: I probably would. The one key step for us was that we did actually, as a result of that discussion, introduce compulsory sex education at the secondary school level, which we did not previously have. That alone is a significant step. The first curriculum, I believe, is being taught this year so you might want to look at the evaluation of that to see how successful it is.

I very much agree with everything the previous speaker, Susan, said. I would note that I think there is a risk sometimes that we see pornography as a stand-alone element in which all children or young people experience sexual content. But the reality is that we live in a very sexualized culture, whether it’s music, music videos, film or TV. I think the rationale is for a strong sexual education. A depiction of sexual content as a form of fiction, whether that’s in pornography or films, is important as a component of our education. But beyond that, it’s not a field I’m an expert in.

Mark Palmer, Clerk of the Committee: Senator Campbell or Senator Batters, unfortunately Senator Jaffer was cut off, so if one of you could take over as chair that would be much appreciated.

Senator Campbell: Go ahead, Senator Batters.

Senator Denise Batters (Deputy Chair) in the chair.

The Deputy Chair: I’m sorry; I don’t know who was next on the list to ask a question. Mark, could you let me know?

Mr. Palmer: I believe it’s Senator Dupuis.

[Translation]

Senator Dupuis: I have a question for Susan Davis and Julie Dawson.

You have been very clear, and I fully understand your position that this bill is totally inadequate with respect to sex workers. Can you tell me what you think could or should be removed or added to a bill like this to make it acceptable to you as a representative of sex workers?

[English]

Ms. Davis: Unfortunately, I don’t think there’s anything that could be added to this bill to make it appropriate for sex workers. The goal of the bill is not only to limit access of youth to sexually explicit content, but to limit access to sexually explicit content for everyone.

I think we’re missing a piece here where every single sex worker in this country has sexually explicit ads with pictures, whether that’s for escort services or clip sales. The bill itself doesn’t give any definition of what is sexually explicit. What exactly are the limitations on this? Is it narrowly targeting adult film or is it broadly targeting the entire industry?

It’s fairly well-known that Senator Miville-Dechêne is against the sex industry, so my assumption is that it’s targeting the entire industry. Do we want to see sex workers forced back to the street and removed from the internet completely? Is this the goal? For me, there is nothing we could add to this bill that would make it any less dangerous for my community especially given the migration towards online work because of the pandemic, and even before that.

For me, this bill was ill-thought out, ill-planned and didn’t take into consideration the Canadians who work in this field, who feed and house their families using income from the safest, most high paying jobs in the sex industry. For me, there’s nothing that could be added to this bill and it should be completely scrapped.

[Translation]

Senator Dupuis: If you say that nothing can be added, would you say that something should be removed?

In other words, should people who are engaged in the sex trade be excluded because it is a trade rather than a commercial operation, as is a porn site like Pornhub? Since we can’t add anything to the bill to make it acceptable, do you think it should exclude the practice of a trade such as sex workers? Should we say that, if they are practising their trade, they must be excluded from this sort of bill, because they are not engaged in a commercial activity that allows them to make money off of children?

[English]

Ms. Davis: People who work in adult film are sex workers, so I don’t see how there could be an exclusion given that the bill is targeting adult film and adult film workers. The assertions that have been made here about how every single video is exploitative and is rape and all these things are the same things we hear about sex work from anti-sex work people all the time. Of course, they would likely say that I am living in a false consciousness and that I only think I’m choosing sex work, but really I’m so damaged that I don’t know what I’m doing.

This argument has been going on in Canada for decades. I wish we could count on facts, ethical data and research to make these decisions. The kinds of assertions made by witnesses in previous panels about how children become rapists and all this sort of thing could be vetted.

I don't understand how a bill like this could be formed, founded on unreliable, unethical data that does not meet the test of the Tri-Council Policy Statement on research involving human beings in this country. I really don't understand.

If there was to be legislation like this, I would say scrap Bill S-203, begin again and include people who work in the field, including those who own businesses and those who work as adult entertainers and performers. Then perhaps you could come up with something that’s a little more reasonable. This is so extremely biased against my community that I don’t see any way to salvage it.

[Translation]

Senator Boisvenu: Welcome to our guests. Your presentations were really interesting. My first question is for Ms. Mickelwait. In your opinion, have any studies established a link between the development of violent behaviour among children and the frequent viewing of pornographic pictures and films?

[English]

Ms. Mickelwait: Thank you for that question. I think it’s very important. Like I said in my presentation, we have over 40 years of peer-reviewed research that demonstrates the harm pornography does to children who are viewing this content.

We talk about viewing and doing. A study was done which showed that over 88% of mainstream pornography films contain sexual violence. When children view this content, research has shown that it does something in their brain that creates permission-giving beliefs, which then enable them to more easily act out in sexually violent ways.

[Translation]

Senator Boisvenu: So they will develop behaviours by imitation. My other question is for Ms. Davis. We are somewhat uncomfortable with your position, which I understand fully. I’m not sure whether you have children. We know that today, children have easy access to the Internet, especially to video games. More and more often, when children play these games, windows with pornographic ads or advertising on slot machine games pop up through overlay techniques. If you are a parent — and you may be — how are you going to protect your children from the violence that is part of online pornography? How can we protect our children if we can’t legislate?

[English]

Ms. Davis: First, I would say that I don’t think that pornography is synonymous with violence. I think that’s a flawed assumption from the beginning. In terms of my children, I talk to my children. I create a trusting relationship. I talk to them openly about sex and sexuality.

[Translation]

Senator Boisvenu: Ms. Davis, as I am sure you know, there are two types of pornography. There is what I would call soft porn and hardcore porn, and children do not distinguish between the two.

When they are exposed to pornography that shows violence, sexual exploitation and assault, do you think that it is the sort of pornography that does not influence the future behaviour of our children?

[English]

Ms. Davis: I sort of understand French and I’m not getting any translation here, but I think I got the gist of what you were saying.

The thing for me is that when you look into the data and the research that is being referenced by Ms. Mickelwait, there are questions about whether or not that data is ethical. The numbers being bandied about, such as 88% is violent — or whatever it was that she said — and in the testimony of the other witnesses in the other panel, we heard that 11-year-old boys are 60% likely to act out the violent scenes that they’ve witnessed.

When you look into this research and try to find the source of those numbers, it’s not there. I have learned about policy. This is how I have been successful in my advocacy work. I’m not trying to say that we don’t need to protect children. Children need to be educated about these things, given the tools to make safe decisions when using the internet and to have people they can talk to about their sexuality in a safe way.

Senator Cotter: I have two questions and one might be to enable Ms. Davis to complete her point. Maybe I could start with that question for Ms. Davis. I also have a question for Ms. Mickelwait, if I may.

Ms. Davis, the impression I’m drawing from your testimony and your responses is that you are opposed in principle to putting in place restrictions on young people’s access to pornography on the internet. I’d like you to be able to respond to that. If I’ve misstated it and you are supportive of the philosophy of the bill, I’d be interested in your thoughts on how that can be achieved.

With respect to Ms. Mickelwait, I’ve read your material, the writing in the New York Times. I think this is a larger philosophical project for you, or this is one part of a larger philosophical project. In Canada, the question for us is limited to young people and access to pornography. The sex industry and sex workers are constitutionally protected in the Canadian framework by a decision of the Supreme Court of Canada, so that’s sort of off the table for our conversation here. Would you say this bill is structured in a way that can achieve the more specific and limited objective of protecting children from the exposures about which you are concerned?

Ms. Davis: I am not opposed to trying to educate and ensure that children are not exposed to violent pornography. That is not the point I am making at all. I am trying to defend the people who work in that sector because these are some of the highest paying, safest jobs for sex workers in this country, especially given that people are migrating online due to the pandemic and were migrating there before. My issue here is that we’re using the excuse of protecting children to try to limit people’s access to adult films without any consideration for people who work in that sector.

My point is, we’re basing all of these decisions on research that has been proven to be flawed. When I looked at it myself — and I’m a layperson. Granted, over the last 19 years I’ve learned a lot about what the Tri-Council Policy Statement means. But when we use data that is in conflict with the rules that govern research involving human beings, we always end up with negative outcomes. If we don’t include the people who will be most directly impacted in the development of these policies, we will not have the outcome that we want. It’s not justifiable to throw sex workers’ safety under the bus to protect children and youth in the way that is being proposed here.

Senator Cotter: Got it. Could I ask Ms. Mickelwait’s thoughts before my time runs out?

Ms. Mickelwait: Yes. Thank you. I think that if there is not a way that we can protect children without harming those in the industry, then we have a different problem on our hands. We do not allow children to enter a strip club; we should also restrict children from being able to enter a virtual strip club.

We talk about violent content — whether this is staged, studio-produced content or whether this is real sexual violence, there’s no question that this content does contain quite a lot of violence. Even if it didn’t, even if this was what we would call normal sexual interactions, children are not able to process that kind of material without being harmed at young ages. It doesn’t take a rocket scientist to know that children should not be having those kinds of sexual experiences and they need to be protected from these experiences. We need to be able to do that in a way that prioritizes the safety of the most vulnerable in society and children above everybody else.

Senator Cotter: Thank you very much.

Senator Simons: Thank you. My question is for Ms. Dawson. Pornography is not my cup of tea, but it is legal in Canada and Canadians have a constitutional right to access it. I share Ms. Davis’s concerns that the inevitable result of facial — not recognition, what did you call it?

Ms. Dawson: Analysis.

Senator Simons: Facial analysis software would be to frighten away adults who don't want to risk having their face shown and recorded, who are worried about blackmail and their privacy being violated. I also worry about the efficacy of the software to make fine distinctions. How does your software differentiate between someone who is 17 and a half and someone who is 18? Is there a danger that it might not accurately recognize the age of a Southeast Asian Canadian or someone who is trans and going through transition — transitioning from a female identity to a male one — who might appear to be adolescent or younger than their age?

How does the software accurately ensure that everybody who has a right to access this material gets to exercise their right? What do you say to the legitimate concern that people would have that, even as law-abiding adult consumers, they don’t want to put themselves through the proposition of having their faces analyzed?

Ms. Dawson: Thank you very much for that question. Taking your points with regard to the efficacy, when we've worked with regulators, what they have done is set a buffer. The technology is accurate to within 1.5 years for ages 13 to 25. For example, in the U.K. Digital Economy Act, the regulator was looking at a three- to five-year buffer, and that was two and a half years ago. In our latest conversations with regulators, depending on the sector, they are again looking at around a three-year buffer. Only people over the age of 21 would be able to see adult content for 18-plus using that one mechanism, which is why we offer three mechanisms side by side, and there are obviously other mechanisms on the market. That would enable people who do not want to use age estimation using facial analysis to use other approaches.
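The buffer Ms. Dawson describes reduces to a simple rule, sketched below under the assumptions she states (a three-year regulator buffer on an 18-plus threshold); the function name and defaults are illustrative, not taken from any real system. Anyone whose estimated age falls below the buffered threshold is not refused outright but must use one of the other verification mechanisms.

```python
LEGAL_AGE = 18          # age threshold for the restricted content
REGULATOR_BUFFER = 3    # the roughly three-year buffer per the testimony


def passes_facial_estimation(estimated_age: float,
                             legal_age: int = LEGAL_AGE,
                             buffer: int = REGULATOR_BUFFER) -> bool:
    # With a three-year buffer, only people estimated at 21 or older
    # clear the facial-analysis route for 18-plus content. Anyone below
    # the buffered threshold falls back to another method (app, ID check),
    # absorbing the estimator's margin of error.
    return estimated_age >= legal_age + buffer


print(passes_facial_estimation(22.0))  # True: clears the buffered threshold
print(passes_facial_estimation(19.5))  # False: must use another mechanism
```

This is why the buffer and the side-by-side mechanisms go together: the buffer protects against under-age false positives, while the alternative methods keep legitimate 18-to-20-year-olds from being locked out.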

However, in the testing we have done — in the retail sector, for instance — we’ve worked with one of the largest bricks and mortar retailers in the U.S., and over a two-week period — these were people in a self-checkout buying age-restricted goods — we had over 70% of shoppers using this approach. It’s also live in other countries.

We have live implementations of this software already and we have not found it to be a material disinhibitor. We always recommend to the organizations to absolutely offer different methods side by side.

To be very clear, we share exactly how accurate it is across genders and ages, and we update that every few months. Specifically with regard to skin tone, we test it across skin tones using the Fitzpatrick scale, for skin tones one to six, and we show clearly how accurate it is. We ensure there is no material disadvantage to those of any skin tone. For some of the younger demographics, we are more accurate on the darker skin tones, but we continue to look at that and share that every few months.

Senator Simons: The word "Orwellian" gets thrown around far too easily in our political discourse, but I find there's something extraordinarily dystopian about the world you are painting — that someone should have to have this kind of analysis to buy cigarettes at a grocery store or wine at a wine shop.

I’m repulsed by the kinds of pornography that were described to us. I am equally as repulsed by the idea of living in a world where either government or commercial actors are surveilling me as I go about purchasing things that are my legal right to purchase.

Ms. Dawson: The difference here is that there is no surveillance. No one has actually recognized you. The same way I might look at you and say “45” —

Senator Simons: All right. You’re my new favourite witness of all time.

The Deputy Chair: With that, we have to move on. If Senator Carignan is on, great. If not, we will go to a brief second round, starting with the sponsor of the bill, Senator Miville-Dechêne.

Senator Miville-Dechêne: On this first point, Ms. Davis, we’ve talked at length about those questions, and obviously we do not agree, but you characterized me as being against sex workers and all that. I just want to say that I’ve said in my past life that there was a lot of exploitation in prostitution, and this is my position. I don’t have any grudges against sex workers.

That being said, I want to go back to Ms. Dawson to let her finish her thought about this interesting question that Senator Simons asked, which is surveillance. What do you do with the information you’re getting?

Ms. Dawson: Thank you.

So the image is instantly deleted with every age estimation using facial analysis. In the same way I look at one person and give a personal judgment, that's exactly what the algorithm does. It's only been trained with anonymous faces. Each time it looks, it says 42, 36, 72 and it instantly deletes that. Yoti never receives back the result; it goes to the company with a confidence score.

You may or may not think this is Orwellian, but this has been used to triage hundreds of thousands of child sexual abuse images to ascertain ages of victims and perpetrators. This is enabling triage for lots of sites that couldn’t have enough humans doing that looking and estimating. That’s all it’s doing.

Looking at it in the retail context, across the world we have rising levels of verbal, physical and racial abuse directed at retail staff at the point of checking age. We have people losing documents — a million driving licences and 400,000 passports are lost or stolen in the U.K. each year — putting those people at increased risk of identity theft and fraud.

We offer several mechanisms side by side. If someone is not happy with facial analysis age estimation, they can use one of the other two methods or just ask for somebody in person, as previously. That’s what we recommend to sites: Offer a range of options and do not be dictatorial; let the market decide what they want to use. Thank you.

The Deputy Chair: Thank you very much. We will now go to the last second-round question.

[Translation]

Senator Dupuis: My question is for Ms. Dawson.

You mentioned three methods that you described well: the Yoti app, ID checking, and facial analysis, which you say is less controversial. All these methods leave traces. For example, if I try to access a site, I inevitably leave traces. You tell me that it is a series of numbers that go by and are erased as they do. I believe you when you say that the system is set up in such a way that it does not retain anything, but that is not my concern. My concern is: What do you do with the trace that I inevitably leave, and that stays permanently in your systems, when I try to access a pornographic site?

[English]

Ms. Dawson: Thank you for that question. I think I’ve understood your question to be this: What trace do I, as a user, leave with Yoti rather than with the site?

[Translation]

Senator Dupuis: If I want to access a site and I have to have my age checked, my access will be controlled and blocked. I must first prove my age by giving my date of birth, for example. That leaves a trace, since I have registered on the site in question, and we know that the data is permanent. What guarantees that the information will not be used by others?

[English]

Ms. Dawson: Understood.

With the Yoti app, we have a set-up based on a non-relational database. Once you set up your Yoti, we shard all of your different attributes, such as name, age and date of birth, and you alone choose to share just your over-18 attribute. When you do, we do not know that it was Senator Dupuis or anyone else who shared that attribute; we only know that an attribute has been shared, and the site receives only that attribute.
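
The sharded, selective-disclosure mechanism described above can be sketched as follows. This is a hypothetical illustration of the general idea, not Yoti's actual implementation; the class and method names are invented for the example.

```python
import secrets

# Hypothetical sketch of a sharded, non-relational attribute store.
# Each attribute lives in its own shard under a random token, so
# redeeming one token reveals a single attribute value and nothing
# about the person it belongs to or their other attributes.

class ShardedAttributeStore:
    def __init__(self):
        # token -> attribute value; deliberately no user identifier
        self._shards = {}

    def enroll(self, attributes):
        """Split a user's attributes into independent shards and return
        the tokens. Only the user keeps this token mapping."""
        tokens = {}
        for name, value in attributes.items():
            token = secrets.token_hex(16)
            self._shards[token] = value
            tokens[name] = token
        return tokens

    def redeem(self, token):
        """A relying site redeems a token and receives only that one
        attribute value: no name, no link to the other shards."""
        return self._shards[token]

# Usage: the user enrolls once, then shares only the over-18 attribute.
store = ShardedAttributeStore()
tokens = store.enroll({"name": "A. User", "over_18": True,
                       "date_of_birth": "1990-01-01"})
shared = store.redeem(tokens["over_18"])  # the site learns only: True
```

The design point is that the store holds no row linking the shards together; only the user's own token mapping can reassemble their identity.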

In the case of age estimation using facial analysis, we provide this on a software-as-a-service basis, and the site, say, PornHub, requests the use of that software. The site sends an image to the server, in effect saying, “This is a face.” The software looks at the face, gives its assessment and instantly deletes the image, and all the site receives is a confidence level that the person is over the age of 18. We retain no trace of who that was or which face it was. We do not know the face or the name; we only know that we have given the site a confidence score that the person is over or under the requested age.
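
The estimate-then-delete flow described above can be sketched as follows. This is a hypothetical illustration under stated assumptions: the model is a stand-in placeholder, and the confidence heuristic is invented for the example, not the vendor's actual scoring.

```python
# Hypothetical sketch of the estimate-then-delete flow: the service
# receives an image, returns only a verdict and a confidence score
# against the requested age threshold, and stores nothing.

def estimate_age(image_bytes):
    """Stand-in for the trained estimation model; a real model is
    assumed here, so this returns a fixed placeholder value."""
    return 42

def check_age(image_bytes, threshold=18):
    estimated = estimate_age(image_bytes)
    # Toy confidence: high when the estimate is well clear of the threshold.
    confidence = 0.99 if abs(estimated - threshold) >= 7 else 0.6
    over = estimated >= threshold
    del image_bytes, estimated  # image and raw estimate are discarded
    return {"over_threshold": over, "confidence": confidence}

result = check_age(b"<face image bytes>")  # only the verdict survives
```

The site calling this service never receives the raw age estimate or the image back, only the boolean verdict and the confidence score.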

[Translation]

Senator Dupuis: I am interested and concerned at the same time about —

[English]

The Deputy Chair: Senator Dupuis, it’s past time, so we will have to end it there.

Thank you so much to all the witnesses and to the senators for being patient with me. I’m a very inexperienced person in this particular chair. Thank you very much to all the witnesses especially, as Senator Jaffer mentioned before, for accepting at the last minute and providing us valuable information as we study this important bill. Thank you very much.

(The committee continued in camera.)
