Safeguarding Podcast – The Dirty Dozen with Haley McNamara, Director of the International Centre on Sexual Exploitation (ICOSE)

By Neil Fairbrother

In this Safeguarding Podcast with Haley McNamara, Director of the International Centre on Sexual Exploitation (ICOSE), we discuss the Dirty Dozen report. Companies featured include Amazon, Twitch, OnlyFans, Wish, Discord, Google Chromebooks, Verisign, Snapchat and Visa. We also discuss Age Verification, PornHub and the Creeper Act.

https://traffic.libsyn.com/secure/safetonetfoundation/SafeToNet_Foundation_podcast_-_The_Dirty_Dozen_with_Haley_McNamara_Director_ICOSE.mp3

There’s a transcript below, lightly edited for legibility, for those who can’t make use of podcasts or who simply prefer to read.

Neil Fairbrother

The Dirty Dozen list is an annual campaign calling out 12 mainstream entities for facilitating or profiting from sexual abuse and exploitation, particularly of children. Since its inception in 2013, the Dirty Dozen list has galvanized thousands of individuals to call on corporations, government agencies, and other organizations to change specific policies to instead promote the concept of human dignity. Or at least so say the organizers.

Organizations impacted by this campaign include online social media companies, as you might well expect, but also an interesting mix of traditional offline businesses, such as the Hilton Worldwide hotel chain and Walmart.

To discuss this year’s Dirty Dozen report, I’m joined by Haley McNamara, Director of the International Centre on Sexual Exploitation (ICOSE). Welcome to the podcast, Haley.

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Thank you so much for having me.

Neil Fairbrother

It’s an absolute pleasure.

Could you provide us please, Haley, with a brief resumé, so that our listeners from around the world have an appreciation of your background and expertise?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Sure. So as you said, I’m the Director of the International Centre on Sexual Exploitation, which is a division of the US-based National Centre on Sexual Exploitation (NCOSE). You can’t tell from my accent, but I am living in England and my background really is focused on corporate responsibility. So for a number of years, the Dirty Dozen list has really been my bread and butter.

Through it we’ve been meeting with companies like Google, Instagram, Snapchat, and many more, and really just finding ways that mainstream corporations can do better at preventing grooming, sex trafficking, exposure to pornography and a host of other similar issues. So now I’m focused primarily on the international aspect of not only corporate work, but also legislative and educational work on really the web of sexual exploitation issues.

And that is the kind of philosophy our organization approaches these issues with: you can’t solve sex trafficking in a vacuum. It’s absolutely connected with child sexual abuse, sexual violence, pornography, and a host of other issues. So we want to take a really holistic approach to this issue, and we think corporate responsibility fits into that very well.

Neil Fairbrother

Okay. So could you define sex trafficking for us then, or simply trafficking in itself?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Sure. So sex trafficking is the exchange of sexual contact or content for something of value. That could be money, or it could be, you know, food, to have dinner that night. But it happens under force, fraud or coercion (including psychological coercion), or where someone is under the age of 18.

Neil Fairbrother

Okay. And this can take place both online and offline?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Absolutely. Yeah. Increasingly, it’s happening online.

Neil Fairbrother

Okay. Well, we might explore some of these issues as we run through the list on your Dirty Dozen report. Now you’ve been running this campaign since 2013, so getting on for close to a decade, and I’m sure you’ve seen differences throughout each year’s report. Last year, 2020, is probably unique in that we’ve all been impacted by COVID. What COVID-related impact have you seen in this year’s report when compared with non-COVID years?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Absolutely. So in general across, you know, the child protection sphere, we are seeing increased online grooming, more child sexual exploitation images (sometimes known as child pornography), and more online facilitation of child abuse and sex trafficking, across the board and across borders. This seems to be the rising trend due to COVID.

So our list this year in some ways reflects that. Some of the targets are ones where children are especially vulnerable to online abuse. And some of the targets are ones that have had record growth due to COVID. So, you know, Amazon and Netflix, those are some you can immediately recognize as having grown and profited more during COVID.

And some had an increased responsibility to protect children due to COVID, such as with the move to online learning, a responsibility we believe they failed to meet. So we have some online learning services on the Dirty Dozen list this year.

Neil Fairbrother

Okay. What criteria did you consider for a company to be included on the Dirty Dozen list?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Well, we really spread the news far and wide. We try to get feedback from concerned citizens, parents, direct service providers, survivors. So we spend a good amount of the year just trying to get an understanding from people on the ground of what they’re experiencing and what they’re seeing.

And then, you know, we do our own research, and we make decisions based on how big a problem we think it is, and how it fits into the current narrative. So if there’s a very mainstream company that we think has been facilitating or normalizing sexual exploitation in the last year, we try to highlight them.

We also try to weigh where we think we can actually get victories, because at the end of the day this isn’t just about awareness. This is an active campaign where people have a chance to participate, to send emails to these corporations, to post on social media, and actually get some of these policies improved.

Neil Fairbrother

Okay. So some people might say this is a sort of “name and shame” kind of exercise, which it may or may not be. What measures of success do you have?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

We’ve had incredible success. We’ve had victories at Google, at the US Department of Defense, Instagram, Snapchat. There’ve been policy improvements at maybe five major hotel chains, which stopped selling on-demand pornography after activism related to the Dirty Dozen list.

So it is incredibly helpful for changing policies and also for developing relationships with some of these companies. Some of these companies, after being named, will reach out to us and ask what they need to do to be removed from the list, and that’s an opportunity: we try to connect these executives with survivors, or with the concerned parents who are being impacted by hurtful policies, and it becomes a learning opportunity. So, you know, there is an aspect of name and shame. We think when a mainstream company is facilitating some of the really egregious stuff that’s happening, they should be called out for it. But we also want to, you know, build bridges with those that are willing and help them improve those policies.

Neil Fairbrother

Okay. So you mentioned some interesting things there but what is the fundamental cause of them being on the list? Is it a lack of knowledge, or is it lack of foresight and thinking things through, or is it a deliberate ignoring of the problem for other reasons, or is it a mix of various factors or something else completely different?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

It’s definitely a mix. I think especially this year there are certain companies, like OnlyFans, that are on the list and know what they’re doing, and I believe they have no reasonable way of claiming ignorance of the abuses that are going on on their platform. And that’s true for a number of targets this year.

And then for some of them, the problem is just that they’re not making these issues a priority. So some of them knowingly facilitate it; for some of them, fixing the problem is simply not a priority. And I think there’s also just a general ignorance about the way that different forms of sexual exploitation and abuse connect with each other.

So a company’s policy regarding pornography, for example, is actually very important, because as soon as you’re incredibly permissive about that, a host of child sexual abuse material, sex trafficking, videotaped sexual assaults and more often begins to arise on your platform. And so I think in some cases like those, people are a little bit ignorant about the steps that they need to be taking to keep their platform safe.

Neil Fairbrother

Okay. Well, let’s have a look at some of the candidates on your list. And if we start in alphabetical order, I think we begin with Amazon, which might surprise people, because Amazon isn’t generally regarded as a social media platform per se, but as a global shopping mall where you buy books and groceries and so on and so forth. But it’s much more nuanced than that, because Amazon does a lot more. So what is the issue with Amazon?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So Amazon unfortunately has a number of issues. Like you said, they’re really a mega-corporation with many tentacles and many kinds of sub-genres to their business, and there are problems in several of those sub-genres. So one of those would be Amazon Prime, where they don’t have the kind of parental controls that they really need. And so that’s one problem.

Neil Fairbrother

This is Amazon Prime, the video streaming service, right?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes. So the streaming service really needs to improve their parental controls. It’s very easy for kids to get around them.

Neil Fairbrother

Okay, sorry. So Amazon Prime, you want to watch a video. And what you’re saying is that there needs to be better parental control, but presumably the issue then is that young children are watching inappropriate content. Is that correct?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes. Yes. I mean, Amazon Prime, or Amazon Video, is really producing a large amount of content that eroticizes sexual violence and portrays really extreme sex scenes, which parents should have the option to protect their children from seeing, if that’s their choice for their family.

So there are some simple things that Amazon needs to improve. Right now, children can just move from one profile to another without a PIN code: maybe the parents have set up their own profile and a separate profile for the child, but nothing requires a PIN for the child to switch into the parents’ profile. This is a problem that Netflix had last year, and after we called on them, they actually fixed it. So one thing that is consistent across corporations is that for parental controls to be effective, they need to be able to be locked in; otherwise they’re simply suggestions.
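For readers who want to see what “locked in” means in practice: switching out of a child profile should require a secret the child doesn’t hold. Below is a minimal illustrative sketch of that idea in Python; the names and logic are hypothetical, not Amazon’s (or Netflix’s) actual implementation.

```python
# Hypothetical sketch of PIN-gated profile switching: the control is enforced
# by code, not merely suggested, because leaving a child profile demands the
# parent's PIN. (A real service would store a hashed PIN, not plaintext.)
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    is_child: bool

def switch_profile(current: Profile, target: Profile,
                   entered_pin: str, parent_pin: str) -> Profile:
    """Permit the switch unless it would escalate a child into adult content."""
    escalating = current.is_child and not target.is_child
    if escalating and entered_pin != parent_pin:
        raise PermissionError("Parent PIN required to leave the child profile")
    return target

kid = Profile("Sam", is_child=True)
parent = Profile("Parent", is_child=False)
try:
    switch_profile(kid, parent, entered_pin="0000", parent_pin="4821")
except PermissionError as err:
    print(err)  # the switch is blocked without the parent's PIN
```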

Neil Fairbrother

Okay. All right. So you mentioned Twitch, what is Twitch?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So Twitch is owned by Amazon and it’s essentially streaming for video games: people can watch others play video games and interact with them in real time. And unfortunately, you know, that again has parental control problems. There are also a number of other issues, such as grooming: Twitch is live streamed, so there’s a lot of grooming happening on there, and child-owned accounts are being bombarded with sexual harassment as well.

Neil Fairbrother

Okay. So as a child, I might go into Twitch to do what? This is a gaming platform, is it?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So a lot of people really enjoy just watching other people play video games. You can send in your own chats, and you can also tip them or pay them money. And so many people, you know, stream themselves playing video games full time and even make a living on Twitch through this system. That seems great, but like anything it can then be used for grooming, through those direct message and comment features, and sometimes that grooming is used to move the child off the platform, whether into other forms of online abuse or even into in-person abuse.

Neil Fairbrother

Okay. The general public perception of Amazon stays on the surface of Amazon’s services; as I said, it’s generally regarded as an online store. But actually they have services that go deep into the architecture of the internet, or at least the worldwide web anyway. They have something called Amazon Web Services and Amazon S3. What are they?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So Amazon Web Services essentially hosts the domains for a number of websites, you know, the way that maybe GoDaddy does; sometimes people are a little more familiar with that company, they’ve done a lot of advertising. We’ve unfortunately seen that Amazon Web Services hosts a number of pornography websites through that system. So that’s something that we’re calling on them to address.

Neil Fairbrother

Okay. Just before we leave Amazon I would just like to spend a couple of minutes talking about the Amazon store that everybody is familiar with. Now you referenced in your report the “Creeper Act”. What is the Creeper Act and what’s that got to do with Amazon and child abuse?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So the Creeper Act is a bill that’s been proposed in the United States, which has not yet passed, that would essentially outlaw child-like sex dolls, or, a better term for them, child sex abuse dolls. And that ties into Amazon because over a number of years we have found that they’ve been selling child-like sex dolls and adult sex dolls on their site.

Even more, they’ve been hosting a quantity of incest-themed written pornography, sometimes called erotica, and it’s really, really disturbing. You can often find this just by searching “stepdaughter” or “stepdad”, and on the first page there are a number of incest-themed, eroticized, sexualized stories, which is especially problematic considering incest child abuse is a really serious problem plaguing our society right now. So sex dolls, sexual abuse dolls, and they even host some books by pimps instructing people on how to groom and coerce women in the sex trade.

Neil Fairbrother

Okay. Now these sex dolls, these child-like sex dolls, are manufactured in Japan by an organization called Shin Takagi, I think that’s the correct pronunciation. And they reportedly argue that these dolls help prevent pedophilia towards actual children. Is that the case, do you know?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

It’s really not, unfortunately. You know, there’s this old phrase, “neurons that fire together wire together”. Basically, it’s the idea that you can’t practice a desire away. So when pedophiles are practicing sexual acts on a doll, they’re actually encouraging that behaviour. And there have been a number of individuals who’ve spoken out on this. Many people who’ve been caught with child sexual abuse images, or child pornography, or have been caught sexually abusing a child, are found in possession of child-like sex dolls. So in reality, it’s just not true that practicing sex abuse on a child-like doll will somehow discourage you from doing it in real life.

Neil Fairbrother

So next on the list you mentioned “OnlyFans”, which I’m not sure how many of our audience will be familiar with. What is “OnlyFans”?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

OnlyFans has really become very popular in the last year, the last 18 months. Essentially it’s a subscription website where you go and subscribe to a person, and they then sell images, videos and live streams that are exclusive, so only you and the other subscribers get to see that content.

The problem is that OnlyFans is essentially monetized pornography. That’s essentially all that’s going on on OnlyFans. You know, there might be one or two small handfuls of people selling workout videos or something, but it’s widely recognized to be filled with pornography.

And it’s also branding itself as an influencer platform, so it’s really working to normalize the sex trade, which is problematic as well. And there have unfortunately already been cases of people sex trafficking individuals primarily for the purpose of posting explicit videos of them on OnlyFans and profiting from that. There’s also a very wide trend of individuals just going around trying to recruit girls to post on the site. It’s mostly women posting pornography, although there are some men as well.

Neil Fairbrother

Yeah. Now when it comes to that kind of activity, the trafficking aspect of it, I believe that the individuals that do that get a finder’s fee. Is that correct?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes. So the structure of OnlyFans actually encourages trafficking and pimping, because there is a recruitment link: if someone signs up with your link, you get 5% of their proceeds for their first year. So many women have spoken out about how people are kind of cruising among young women, trying to get them to sign up with their link, because then they can make money.
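To put rough numbers on that incentive: the 5% cut is the figure Haley cites, while the earnings and recruit counts below are invented purely for illustration.

```python
# Back-of-the-envelope arithmetic on the recruitment incentive: a recruiter
# takes 5% of each referred creator's proceeds for the creator's first year.
# All inputs except the 5% figure are assumptions for the example.
REFERRAL_CUT = 0.05            # the 5% cut cited in the interview

monthly_earnings = 1_000       # assumed average proceeds per creator per month
recruits = 20                  # assumed sign-ups via the recruiter's link
months = 12                    # the cut applies for the first year

recruiter_income = monthly_earnings * recruits * months * REFERRAL_CUT
print(f"{recruiter_income:,.0f} per year")  # 12,000 on these assumptions
```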

I think something else that’s really interesting about this platform is that a lot of people will say, well, this is a way that people are able to make money during the pandemic; a lot of people are going to it out of concern about economic security. But the problem with OnlyFans is that it has been flooded. The market has been saturated to the point where many women are speaking out about this environment of competition: for someone to actually make real money on the platform, they typically have to do increasingly degrading, painful and shocking sex acts in order to stand out from the crowd.

Neil Fairbrother

Okay. So the third on the list is a retail shopping outlet called Wish, which sounds very innocent. What’s the problem with Wish?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Right. So the Wish shopping app has some problems similar to what we’ve seen with Amazon. They have a history of hosting sex dolls, including child-like sex dolls. Right now it appears that they’ve removed most of them, so we’re keeping our eye on them on that matter, but that is progress.

They are also selling spy cams or nanny cams, and that in itself is fine, but they’re marketed as useful for filming women nude without their permission. So they’re marketed with photos of the spy cam being set up to spy on someone undressing, to spy on someone in the shower, to spy on someone clearly about to enter an intimate situation. So that’s a real concern, and something that they should very easily be able to fix and regulate with their providers.

We also know that Wish is one of the top advertisers with MindGeek. MindGeek is the owner of PornHub, which of course has been found to be rife with sex trafficking, child sex abuse videos and more. They’re being grilled right now by the Canadian parliament for what increasingly appears to be knowing criminal conduct. And so we’re also calling on Wish to stop advertising with, and aligning themselves with, this corporation.

Neil Fairbrother

Okay. So perfectly legitimate business is being carried out by perfectly innocuous, innocent members of the public, buying regular goods, products and services through the Wish platform, which I think has some 500 million users worldwide. And presumably they are not aware that some of their payments, their commerce on that platform, is contributing to the continued success of MindGeek and PornHub. Is that what you’re saying?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Absolutely. And sometimes people will ask, well, do I need to boycott this company? Do I need to never shop with the Wish app again? We’re not necessarily calling for that. If that’s something that someone wants to do, they can, but we often find it’s more powerful for people to take a positive action. So we have on the www.DirtyDozenlist.com website multiple ways that you can contact the company, send them an email, or share about it on social media, so that other people find out how this corporation is facilitating that.

Neil Fairbrother

Okay. Discord is apparently a popular platform which allows users to connect and chat with each other in real-time and boasts over a hundred million active monthly users. So it’s a fair sized platform. What is the issue with Discord?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So Discord really is very, very popular, one of the most popular platforms for video gamers, and it started as a virtual meeting place for gamers. But unfortunately, again, we’re seeing this problem where people are grooming children for sex abuse and sex trafficking on the platform, and also trading pornography, including child sexual abuse materials and non-consensually shared or recorded pornography. And so this is a really big problem.

You know, many young boys especially, and also young girls, love video gaming. Right now, especially in countries that are still locked down or under intense social distancing, people go to these platforms in order to have that social element, but Discord is definitely not doing enough to proactively remove those who are grooming and abusing children or sharing this kind of exploitative content.

Neil Fairbrother

Okay. Now you say in your report that Discord has inadequate Age Verification procedures in place. How can Age Verification be applied? How can it work?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Right. So there are a number of different ways that it could work. We try not to be too prescriptive about what they need to do; they obviously know their system best. But essentially they need to know the ages of those who are signing up, in order to make sure that adults are not grooming children, and that children are not accessing pornographic channels, or “servers” as they’re called on Discord. And this is very doable technologically.

I’ve spoken with individuals at other social media companies about the ability to have someone state their age when they first sign up, and then develop algorithms that detect those who are clearly lying. For example, a 40-year-old man who’s pretending to be a 15-year-old girl so that he can groom some young boys online. It’s very possible to catch a large number of those kinds of predators based on how they act online, if you prioritize building out those algorithms and systems.
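Platforms don’t publish how this kind of detection works, so the following is only a toy sketch of the idea Haley describes: compare the age a user declares at sign-up with an age estimate inferred from behavioural signals, and queue large mismatches for human review. Every feature, weight and threshold here is invented for illustration; a production system would use trained models over far richer signals.

```python
# Toy sketch of behaviour-based age checking. The linear "model" and all
# thresholds are invented stand-ins for a trained classifier.
from dataclasses import dataclass

@dataclass
class Account:
    declared_age: int
    avg_message_length: float    # longer, formally punctuated messages skew older
    late_night_activity: float   # share of activity after midnight
    minor_contact_ratio: float   # share of DMs initiated toward declared minors

def estimated_age(a: Account) -> float:
    # Invented linear heuristic standing in for a trained model.
    return 10 + 0.15 * a.avg_message_length + 20 * a.late_night_activity

def flag_for_review(a: Account, tolerance: float = 8.0) -> bool:
    est = estimated_age(a)
    mismatch = abs(est - a.declared_age) > tolerance
    # Haley's example: an adult posing as a 15-year-old and messaging minors.
    posing_as_minor = est >= 18 and a.declared_age < 18 and a.minor_contact_ratio > 0.5
    return mismatch or posing_as_minor

suspect = Account(declared_age=15, avg_message_length=180,
                  late_night_activity=0.7, minor_contact_ratio=0.9)
print(flag_for_review(suspect))  # True: the behaviour reads as an adult's
```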

Neil Fairbrother

So you would infer the age through how they act and what they say on the platform, rather than using some kind of formal identification document, such as a passport, an ID card or a driving license?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Right.

Neil Fairbrother

So Google, everyone knows Google. But everyone may not be aware that Google provides products as well as its search engine, real-world, physical, tangible products in the shape of Chromebooks, which are basically their laptops, and they come with their own operating system and their own apps and all that good stuff. What is the issue with Google?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So Google Chromebooks specifically are on the Dirty Dozen list this year. You know, there’s a host of other problems with, you know, Google Play and YouTube that we’re constantly in communication with them about. But Google Chromebooks, as you said, are used especially for online learning. Before COVID, there were more than 40 million students and teachers who were using these devices for schooling. And now that number has skyrocketed far, far beyond 40 million.

The problem is that Google does not default these devices to be safe for children, even when they know that these laptops are going to children to be used for school. And so what that means is that Safe Search is not turned on and parental controls are not turned on. Every year we hear from a number of parents and students who were exposed to pornography, or even groomed, on their Chromebooks because they weren’t set up with the right parental controls that might keep them in safer places online.

So our request to Chromebooks really is this: if you’re giving these products to children in schools, if you know that’s where they’re going, you need to turn on the filtering and safety tools by default, because schools are overwhelmed, teachers are overwhelmed, parents are overwhelmed. I actually have a colleague with children in school who were given Chromebooks, and it took her, a tech-savvy mom who’s very aware of the importance of these issues, a couple of hours to get everything set up the right way to protect her child. Google needs to stop putting that burden on parents and schools.
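A footnote for the technically curious: enforced safe defaults are technically straightforward. Chrome’s documented enterprise policies include ForceGoogleSafeSearch and ForceYouTubeRestrict, and a managed Chrome install on Linux reads policy files from /etc/opt/chrome/policies/managed/. The sketch below writes such a file; school-issued Chromebooks would in practice receive equivalent settings through the Google Admin console, which is precisely the kind of provisioning the campaign wants turned on by default.

```python
# Sketch: write a managed-policy file that locks SafeSearch on and forces
# YouTube's strict Restricted Mode for a Chrome install on Linux. The two
# policy names are documented Chrome enterprise policies; running this
# requires root, plus a Chrome restart for the policies to take effect.
import json
from pathlib import Path

policy = {
    "ForceGoogleSafeSearch": True,  # lock Google Search's SafeSearch on
    "ForceYouTubeRestrict": 2,      # 2 = enforce strict Restricted Mode
}

policy_dir = Path("/etc/opt/chrome/policies/managed")
policy_dir.mkdir(parents=True, exist_ok=True)
(policy_dir / "safe_defaults.json").write_text(json.dumps(policy, indent=2))
print("Policy written; verify at chrome://policy after restarting Chrome")
```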

Neil Fairbrother

Yes, this is a really interesting problem. Google, I’m sure, would argue that they are providing a universal product to everyone everywhere, and that it is in fact the school’s responsibility, as the purchaser and distributor to the end consumer, to set up these controls, and that therefore the problem lies with the schools. I haven’t spoken to Google, I don’t know if that’s what they would say, but it seems like a reasonable argument from their point of view. Why is that not a reasonable argument? Surely if schools are having to deliver blended learning, as it’s called these days, then the responsibility does in fact lie with the schools to provide a safe product?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yeah, that’s really Google’s argument at this point. What we’re saying, though, is that when Google is knowingly entering a contract with a school (so I’m not talking about parents who purchase a Chromebook for their child; Google could never know that’s happening), they need to take the responsibility on themselves to make sure it’s set up safely, because many school systems don’t even have IT departments.

Teachers are underpaid and overworked, and administrators don’t have the tech-savvy to set up these systems. We’ve spoken with a number of administrators who are handed a thousand Chromebooks and have to go through each individual Chromebook and look up what the safety settings are, which often change, and which often take upwards of 10 steps to actually set up. It’s not as simple as a single checkbox saying you want the Chromebook set up safely. You have to go through steps across multiple different facets: Google Search, YouTube, Chrome itself, and many other sub-products. At that point it clearly becomes overwhelming.

And we think Google has a social responsibility here. They could make this so much easier. You know, maybe they offer it to the school and say: if you want, we can set these filters up for you before shipping them. I think the vast majority of schools would say yes!

Neil Fairbrother

I’d like to talk about an organisation that may not be a high street name or a household name, and that is Verisign. Who is Verisign? What do they do? Why are they important when it comes to the internet and the worldwide web?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So Verisign… almost no one will know who this company is. I didn’t know about it until, you know, the last year, year and a half. But they’re a company that has basically exclusive management over .com and .net domains.

Neil Fairbrother

So by .com and .net domains, you mean the website name of a particular organization, which might be safetonet.com for example; that is owned, or managed, by Verisign?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yeah, they have exclusive management over it. And so in doing business, you know, you might buy your domain from someone like Amazon or GoDaddy, but they’re the ones that are ultimately in charge. They’re a little bit shadowy, in the upper echelons of the internet.

Neil Fairbrother

Yeah. So as a consumer, as someone who wants to set up a website, you need a domain name. You will buy it from a domain name registrar, such as GoDaddy as you’ve mentioned, or 123Reg, there are loads of them out there. You won’t buy them directly from Verisign. Verisign themselves don’t sell the domain names, but they are responsible for the management of them. Is that correct? Is that what you’re saying?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes.

Neil Fairbrother

Okay.

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

And as it turns out, more than 70% of web pages that contain child sexual abuse images are on .com and .net domains; that was according to a 2019 report from the Internet Watch Foundation. Now Verisign has it in their contract that you are not allowed to share child sexual abuse images, or child pornography, on those domains; it’s pretty reasonable to have that in your contract.

Neil Fairbrother

And then this would be a contract between Verisign and the domain name registrar?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes.

Neil Fairbrother

Okay.

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

But they do not enforce it. And they really need to be doing more to make sure that these kinds of websites are not hosting this kind of material. When it becomes incredibly clear that a company is trading in child sexual abuse materials, Verisign should step in.

Neil Fairbrother

Oh, okay. So you’ve got a little chain going on here. You’ve got Verisign, who is responsible for managing the .com and .net domain names. You’ve got the vendors, the purveyors or sellers of those domain names, which are the domain name registrars such as GoDaddy (and we’re not, by the way, implying that GoDaddy or any other registrar is doing anything wrong here; this is just an example). And then you’ve got the purchaser, the person who might want a website. And what you’re saying is that the person who buys the domain name and sets up a website, and who ends up trading or sharing this kind of illegal content, is somehow bound by the contract between the registrar and Verisign?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes. And there are a number of things here that get highly, highly complicated. You know, it’s on our website: endsexualexploitation.org/verisign is one way to learn about this, or just dirtydozenlist.com.

But another aspect of this is that two years ago they promised to implement a “Trusted Notifier” program, so that there would be some transparency and accountability for .com websites that had child sexual abuse materials on them, and a mechanism to make sure that those websites are taken down. But after two years, it seems like no progress has been made on that front. So it seems like it’s not enough of a priority for them.

Neil Fairbrother

Okay. Now, you also mentioned in your report about Verisign something that sounds slightly odd to the lay person, I think, and that is that they have something called a “Thin whois” when they’re supposed to implement a “Thick whois”? What on earth is all that about?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes. So “Whois” is basically a database; it’s like the internet’s white pages. It lists the individuals, businesses and organizations who register domains. If you register a domain, you have to provide your name and your contact information. And it’s a way to really police and make sure that these exploitative websites are taken down, and that the people who host them are held accountable.

But Verisign has not implemented this “Thick whois”, which is what many members of the child protection community think is best, because it’s where you actually have robust information on who is managing the websites. And Verisign is really one of the only ones that is not requiring that more robust data collection, which then makes it very difficult even to give information to law enforcement, for example, if a website is facilitating exploitation.
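For the curious, the thin/thick distinction is easy to see from a raw lookup. WHOIS is a simple text protocol over TCP port 43 (RFC 3912): you send a domain name, you get text back. Querying Verisign’s .com registry server returns only the sponsoring registrar and a referral to that registrar’s own WHOIS server, with no registrant contact fields; that referral is the “thin” model in action. A minimal sketch:

```python
# Minimal WHOIS lookup against Verisign's .com registry server. Because .com
# is a "thin" registry, the reply names the sponsoring registrar and its own
# WHOIS server, but carries no registrant contact details; those would need
# a second query to the registrar's server.
import socket

def whois_query(server: str, query: str, port: int = 43) -> str:
    """Send one WHOIS query over TCP port 43 and return the raw text reply."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall((query + "\r\n").encode())
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

reply = whois_query("whois.verisign-grs.com", "example.com")
print(reply)  # note the "Registrar WHOIS Server:" referral line
```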

Neil Fairbrother

Okay. So I buy a website with a .com or .net domain name and I register that name with the seller, which is GoDaddy or 123Reg or whomever. They then collect information from me as the purchaser, which they are presumably supposed to relay back to Verisign so that it ends up in the “Whois” database, and that is simply not happening?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Right. And because of that, Verisign says that they don’t have the ability to give law enforcement the information that’s needed to shut off domains that are knowingly hosting child abuse materials. They say that they just can’t, because they don’t have the information. But they could have the information; most other domains [registrars] do require that kind of information.

Neil Fairbrother

Okay, brilliant. Now we are rapidly running out of time, unfortunately. So, very quickly, I’d just like to spend a couple of minutes on your “Watch List”, because not only do you have the Dirty Dozen list, the full version of which is available from your website (we just haven’t had time to go through all of it), but there are also a couple of interesting companies on the Watch List. What is the Watch List? And can you tell us a little bit about why Snapchat is on there and why Visa is on there?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Yes. So the Watch List really serves a couple of purposes. One is that it puts an entity on notice: maybe they have not been named to the Dirty Dozen list yet, but we’re very concerned about problems going on on their platform. And it can also be a place where a company is put after having been on the Dirty Dozen list, when we see that they’re making improvements and we’re encouraged by that, but we want them to know: you’re still on our watch list, we’ve got our eyes on you, so you’re not exactly off the hook yet.

On that list, like you said, this year we have Snapchat, TikTok and Visa. Visa was rightly on our Dirty Dozen list last year because of processing payments for PornHub, which again has been found to be knowingly facilitating sex trafficking, child abuse, and much, much more. By the end of last year Visa rightly cut ties with PornHub, so we’re very grateful to them for making that change.

But we believe that it’s only paused right now; there’s a chance that they might re-engage with PornHub, so that’s one reason they’re on the Watch List. In addition, they’re processing payments for brothels in Nevada, and we believe prostitution causes psychological and physical trauma that can’t be regulated away; there have been a number of cases of sex trafficking in the legal brothels in Nevada in the US. They also process payments for OnlyFans and for Seeking Arrangement, which is a sugar dating website, essentially prostitution with dinner, and there have been many cases of sex trafficking through Seeking Arrangement as well.

Neil Fairbrother

Okay. And Snapchat on the Watch List because?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Snapchat is on the Watch List because, unfortunately, the way Snapchat is built inherently means that it’s a primary place for people to go when they’re trying to groom someone, and especially to extort someone with sexually graphic images, either into abuse or into sex trafficking.

Snapchat really just needs to… well, they’ve made a number of improvements. We’re encouraged, we think that they’re on the right path, but we just need a little bit more prioritization of proactively removing accounts that are monetizing pornography, and that are grooming and engaging in the trade of child sexual abuse images.

Neil Fairbrother

Those improvements that Snapchat made, just to be fair to Snapchat, are what?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

So one improvement, a couple of years ago, is that they removed Snapcash, which we had conversations with them about. It was primarily used for the sale of pornography and prostitution, a little bit of a precursor to OnlyFans, so it’s interesting to see that connection.

They’ve also made improvements to their Discover section, which is a portion of the app where they have stories from, for example, Cosmopolitan Magazine or Men’s Health; it’s essentially a curated news section. In the past it had a number of really problematic things in there, with articles encouraging minors to engage in sexting, and very hyper-sexualized content that was visible to minors. So they’ve done a good job of cleaning that up.

And so we’re encouraged. We think that they are working on keeping children safe, with age gating and those kinds of algorithms to prevent grooming. But we are watching them; they’re not quite there yet.

Neil Fairbrother

Okay. Thank you very much for that. And where can people go to find the full Dirty Dozen report Haley?

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

At www.dirtydozenlist.com.

Neil Fairbrother

Thank you very much. It was fascinating. It’s a shame we didn’t have time to go through everything; it’s a fascinating read, and good luck with it. And we will see next year’s report in 12 months’ time, I guess.

Haley McNamara Director of the International Centre on Sexual Exploitation (ICOSE)

Good. Thank you so much.