Safeguarding Podcast – the Trichan Takedown with Lloyd Richardson and Professor Michael Salter
By Neil Fairbrother
In this Safeguarding Podcast: Lloyd Richardson Canadian Centre for Child Protection & Professor Michael Salter talk us through the Trichan Takedown, how ISPs fought against CSAM message boards and how some ISPs colluded with them, the role of CDNs such as Cloudflare in CSAM replication and sharing, and the evasive tactics used by the CSAM message board admins to hide their vast libraries of child abuse images and videos.
https://traffic.libsyn.com/secure/safetonetfoundation/SafeToNet_Foundation_Podcast_-_The_Trichan_Takedown_-_Lloyd_Richardson_and_Michael_Salter.mp3
There’s a lightly edited for legibility transcript below for those that can’t use podcasts, or for those that simply prefer to read.
Welcome to another edition of the SafeToNet Foundation’s Safeguarding Podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.
Neil Fairbrother
Internet Service Providers and network operators often use the “mere conduit defense” to not get involved in content moderation and online child safety issues as, they say, they don’t want to be the “arbiter of truth”. But one truth is that child sexual abuse material, or CSAM, is universally held to be illegal, so there’s nothing to arbitrate. So what role could ISPs have in eradicating this content from the online spaces that they connect their users to?
To guide us through a project known as the “Trichan Takedown”, I’m joined by two guests, Lloyd Richardson, CTO of the Canadian Centre for Child Protection in Canada and Professor Michael Salter in Sydney, Australia. Welcome both to the podcast.
Lloyd, this is your first time as a guest on the SafeToNet Foundation podcast, so could you give us a brief resumé please of your experience and background so that our audience around the world can appreciate where you’re coming from?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Absolutely. So in reverse chronological order, as you stated, I’m the Director of Technology here at the Canadian Centre for Child Protection. I’ve been working in the space of technology and child protection for about 15 or 16 years now, it’s hard to remember how long, but prior to that I worked for a hosting provider doing technical work until I moved into this space. So I definitely had a foothold in technology before getting into the charity angle of technology and protecting children.
Neil Fairbrother
Okay. Thank you. And our regular listeners may recognize Michael’s name as we did a deep dive in a previous episode into his work on organized sexual abuse of children, but Michael, could you perhaps also give us a brief resume of your background?
Professor Michael Salter
No problem, Neil. So I’m a criminologist based here in Sydney, Australia. A key focus of mine is child sexual exploitation, both online and offline, so I’m interested in the ways in which child sex offenders network with one another and conspire in the abuse of children. Over the last four or five years I’ve worked quite closely with the Canadian Centre, and it’s been a real pleasure to see the tremendous work that they do on behalf of children, not just in Canada, but around the world.
Neil Fairbrother
Okay, thank you. Now, before we delve into the detail of what’s known as the Trichan Takedown, which you’ve written a fabulous analytical report on, we need to explore and understand a little bit about the structure of the internet, or at least the different types of internet service providers that there are and how they all relate to each other. But before we do that, let’s just talk a little bit about Project Arachnid, which is a key component of the Trichan Takedown. And that’s over to you Lloyd, I think for the first question on that.
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Sure. So at a very 10,000-foot level, this is what Arachnid is. Essentially it began back in 2016, 2017, as a pilot project where we started doing proactive detection of child sexual abuse material on the internet. Historically we’d done what most Tip Lines do and waited for reports from the public, then reacted to those reports. Obviously, that’s a bit of a slow approach when you’re looking at the breadth and scale of child sexual abuse material on the internet.
So like I said, it started as a web crawler, but it’s actually a lot more than that. I’d say the web crawling portion is a small piece of what Arachnid actually does right now. We have a team of analysts that sit in what I call the driver’s seat of Arachnid, doing the actual classification of child sexual abuse material. We’ve brought in about 11 Tip Lines around the world to assist in this massive classification effort within Project Arachnid. So it’s really global in nature in terms of pulling in other Tip Lines and being efficient about how we use analyst resources, to reduce not only victimization, but also the exposure to images that analysts have.
So outside of the global Tip Lines, we also deal with industry. We created an API for industry to come in and do what’s called proactive detection of child sexual abuse material. They’re able to bounce images and URLs directly off Project Arachnid to get a response as to whether or not they’re known child sexual abuse material. To be honest, that’s a more ideal approach: we’re not chasing people to find that material on their services, they’re actually making an effort to find it themselves. And that’s generally a much better way to find this type of material.
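To make that industry workflow concrete, here is a minimal sketch of what such a proactive-detection call might look like from a provider’s side. The endpoint URL, field names and response shape are illustrative placeholders, not the real Project Arachnid API.

```python
# Hypothetical sketch of an industry-side proactive detection call.
# Endpoint, fields and response shape are placeholders for illustration,
# not the real Project Arachnid API.
import hashlib
import json
import urllib.request

API_URL = "https://api.example.org/v1/check"  # placeholder endpoint

def check_upload(image_bytes: bytes) -> dict:
    """Hash an upload locally and ask whether it matches known CSAM."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    payload = json.dumps({"sha256": digest}).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. {"match": true, "classification": "..."}

# A provider would call check_upload() on new content and block or report
# anything that comes back as a known match, rather than waiting for notices.
```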
The last piece is the Notice and Takedown side of things. So when we find something on a particular provider, there’s a routing system that gets that notice into the right hands, and that repeats those notices until the material is taken down. Because what we’ve learned is that with many of these providers, their abuse desks get a massive amount of abuse email, some of them are understaffed, and sometimes these things can get missed or ignored. So if you don’t have any sort of policy for detecting when this stuff is removed and for repeating your notices, then you’re going to be woefully inadequate at removing this type of material.
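The “repeat the notices until the material is taken down” logic can be pictured as a simple loop. This is a toy sketch only, with a hypothetical send_notice() helper; a real system would track hashes, escalation paths and provider response times.

```python
# Toy sketch of a notice-and-takedown loop: keep re-notifying until the
# reported URL actually stops serving content. send_notice() is hypothetical.
import time
import urllib.request

def is_still_live(url: str) -> bool:
    """Re-check the reported URL; treat any error as 'no longer serving'."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False

def pursue_removal(url: str, send_notice, recheck_hours: float = 24) -> None:
    while is_still_live(url):
        send_notice(url)                  # re-send to the provider's abuse desk
        time.sleep(recheck_hours * 3600)  # wait, then verify removal again
```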
Neil Fairbrother
Okay. Michael, could you give us perhaps a big picture view of the scale and even severity of the type of content that Project Arachnid is designed to find?
Professor Michael Salter
No problem. I mean, it’s broadly recognized that we’ve seen essentially the saturation of internet platforms, social media platforms and electronic service providers by child sexual abuse material. So we’re now dealing with a massive volume of reports to authorities all around the world, well beyond the capacity of law enforcement and well beyond the capacity of hotlines to manage. In the late nineties and early noughties, jurisdictions around the world set up hotlines or tip lines designed to take reports, review the content and seek removal under a civil notification scheme.
So essentially the removal of CSAM overwhelmingly is post-hoc, it’s not proactive, it’s passive, it waits for a report from the public or increasingly from industry itself and at the moment industry is responsible for the vast majority of reports by scanning their services for CSAM.
And then hotlines and tip lines will seek removal from electronic service providers, usually with very limited or no enforcement mechanisms, often because they’re seeking removal from electronic service providers that are not in their jurisdiction. But even when they are in their jurisdiction, very few countries actually have much teeth in the way of forcing electronic service providers to remove illegal content. It hinges on good faith. It hinges on collaboration. And frankly, if a private sector entity doesn’t feel like being compliant with those notifications, there’s historically been very little that hotlines can do.
Neil Fairbrother
Yeah, we may well come across some of that behaviour as we drill into this remarkable project. Now, I did say that we need to understand a little bit about the way that the internet service provider or ISP market is architected. And I think there are three tiers of ISP, Tiers one, two, and three. What are these and how do they relate to each other?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So I’d say there’s not a specific definition for each of the tiers, but when you’re talking about Tier 1 or backbone internet providers, they’re generally these very large multi-nationals, global network providers, that don’t have to pay for “Transit” to any other network on the internet. So they have an entire view of the internet, as it were. A Tier 2 is also a larger entity, but normally has to pay for Transit.
And then the smaller ones tend to be the Tier 3 providers that have a local Point of Presence, or only maybe one or two upstream links to the internet. So you can think of the two lower tiers, Two and Three, as always having what you would call an “upstream”, people they’re beholden to in terms of getting access to the wider internet, whereas a Tier 1 provider is beholden to no one: they have peering arrangements with a lot of people, but no one is what I’d call upstream of them.
Neil Fairbrother
Yeah. So they’re at the top of the pile, so to speak. The man in the street, the person at home, would generally deal with a Tier 3 service provider for their internet access. Is that right?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Yeah, for the most part, though I’d say it’s more complicated than that. There are Tier 1 providers, for example, that are local incumbent telecommunications providers. So you will find ones that provide internet access to Joe Public over the last mile of connectivity, but are also a Tier 1. There are examples of Tier 1s like that.
Neil Fairbrother
Yep. Okay. Now another key component of this, which you may have mentioned just a few moments ago, is something called “internet transit”, which is a service which we all use every day without knowing it. What is internet transit?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So basically, it’s a means by which networks find other networks. Very few companies have an entire view of the internet and are peered with everyone else, so they need a way of knowing how to get packets to a certain location. Normally these entities engage in what are called “Peering Arrangements”. Some Peering Arrangements are free, where they’ll carry traffic towards each other without incurring any cost, whereas others might have a cost, because one entity has more power, or more users wanting content, than the other. So you’ll see a discrepancy: I have this many users that want to pull bandwidth from this area, therefore you’re going to have to pay me this fee to do that. These contracts can get very, very complicated in terms of how they operate.
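As a rough mental model of the transit and “upstream” relationships Lloyd describes, here is a toy sketch with made-up provider names. Real transit and peering contracts are, as he says, far more complicated.

```python
# Toy model of transit relationships: each network lists its paid upstreams.
# A Tier 1 has no upstream at all; everyone else climbs toward one.
transit = {
    "tier3-isp": ["tier2-isp"],   # local ISP buys transit from a Tier 2
    "tier2-isp": ["tier1-a"],     # Tier 2 buys transit from a Tier 1
    # "tier1-a" is absent: it only peers, it pays no one for transit
}

def path_to_tier1(network: str) -> list[str]:
    """Follow the chain of upstreams until we reach a network with none."""
    path = [network]
    while network in transit:
        network = transit[network][0]
        path.append(network)
    return path

print(path_to_tier1("tier3-isp"))  # ['tier3-isp', 'tier2-isp', 'tier1-a']
```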
Neil Fairbrother
Now, there are other actors in this structure known broadly as Content Delivery Networks or CDNs, and one of the most well-known of those, which certainly featured in the Trichan Takedown, is called CloudFlare. What is a CDN, and what is a CDN’s role in all of this?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So Content Delivery Networks, there’s a number of ways to set them up. What they’ll do is they’ll primarily cache content close to where the users are. So for example, Michael in Australia there, if he wants to access a service that sits in the United States and that service employs a CDN like CloudFlare, what CloudFlare will do is essentially rent or own infrastructure within Australia, so when Michael goes to that website he will actually be pulling the larger assets of content off of local nodes within Australia, rather than crossing the ocean to pull it off of a node within the United States.
So this has enormous savings in terms of the speed at which websites load, and it also gives higher capacity to websites that don’t have huge download capacity of their own. So if you think about a mom and pop shop that sets up their website on a particular provider, if they put it behind CloudFlare, all of a sudden they’ve been afforded the ability to serve up content to a much larger audience at a lower price.
That also confers a few other benefits, such as protection from Distributed Denial of Service (DDoS) attacks. It’s really a protection service in a lot of ways. In the case of CloudFlare, they have web application firewalls. So it’s really more than just the CDN side of things, but primarily with the CDN it’s about delivering the content local to the user in some way.
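The caching Lloyd describes boils down to a pull-through cache. This is a minimal sketch of one CDN edge node, with a placeholder origin address; real CDNs add expiry, purging and geographic routing.

```python
# Minimal sketch of a pull-through CDN edge node: serve from the local copy
# if we have one, otherwise fetch once from the origin and keep the result.
import urllib.request

ORIGIN = "https://origin.example.com"  # placeholder backend server
_cache: dict[str, bytes] = {}

def serve(path: str) -> bytes:
    if path not in _cache:                      # cache miss: one trip to origin
        with urllib.request.urlopen(ORIGIN + path) as resp:
            _cache[path] = resp.read()          # the edge node now holds a copy
    return _cache[path]                         # later requests stay local
```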
Neil Fairbrother
Okay. Now Trichan is a reference to three different message boards or image boards, which I think were known as 180Chan, 144Chan and 155Chan, which all sounds rather cumbersome. What exactly were they and where were they based? What did they do?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So, as you said, they’re image boards. There are many image boards on the internet with particular topics they talk about. These three were related to the exploitation of children, and they existed for a long time. The nature of the material on the sites changed over time, I’d say starting from what you might call a gray area, where they might have partially clothed images of children, and then accelerating over time to more egregious imagery that was clearly child sexual abuse material.
So what would happen is users would go and upload images to the website and they’d communicate around the images, essentially. It’s a very 1990s look and feel of a website, a very simple design where users can reply to each other and you have a chronological view of what users have posted on those particular sites. There were other sections of the site that appeared to be more commercial in nature, where you had people trying to make money off content they uploaded.
These image boards were really unique in their scope and scale. The amount of child sexual abuse material that sat on them, on a clear web host like this, was I’d say pretty unprecedented, and given that they were able to stay up for so long, I think they just kept pushing the envelope. So while they might have got away with it originally because of the gray area nature of the content, they kept pushing that bar until they realized, oh, we’re still up, we’re still up.
Neil Fairbrother
We’ll have a look at some of the games they played to stay online shortly. Michael, perhaps I could bring you in here to make a comment on the size and scale of the content on these particular message boards. I think there was something like 52,000 images which have been classified as triple-vetted. What does triple-vetted mean and how big a scale of the problem was this?
Professor Michael Salter
I mean, as Lloyd said, I think it’s not hyperbole to say that this was probably the largest case of child sexual abuse material on the clear web during the period of time that Trichan was functioning and it was up and functioning for at least seven years.
Staff at the Canadian Centre undertook a deep dive into the material that Trichan was hosting as part of its efforts to seek not just the removal of the content, but ultimately the takedown of the Trichan operations themselves.
Triple-vetted refers to the fact that three analysts at C3P had seen the image and agreed on its classification. And if the work of C3P is taken as representative of all of the content on Trichan, then the boards were actually hosting somewhere in the realm of 180,000 images of child sexual abuse and about 50,000 images of possible abuse material. When we talk about possible abuse material, we’re talking about sexualized, potentially nude images of children that might not meet the definition of illegal content, but given that it sits adjacent to such a massive amount of illegal material, we can recognize it for what it is, which is, you know, erotic material of children.
Trichan and sites like it also have an important function for child sex offenders in a range of ways. One of the ways that sex offenders now seek to avoid prosecution is by avoiding storing this content on their own hard drives, instead allowing it to sit online and viewing it online. And as part of the Trichan research that we did, we certainly came across offenders who were talking about the fact that there was content that was only on Trichan, victims’ content that wasn’t anywhere else. Trichan was the only place that they were seeing images of that child victim.
Neil Fairbrother
Okay, now these message boards were, in fact, commercial operations. They had moderators who were paid, I believe, to manage and promote them. How did this business model work? Where did the revenue for this stuff come from?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
I’ll preface this by saying we’re not entirely certain. There are certainly some theories surrounding how they were able to monetize things. There was certainly an offsite advertising aspect to their operation, because of the scale of what they did; I mean, it didn’t cost $0 for them to host their services, and we can get into that. It wasn’t direct payment from what we could see on the outside, it was mostly advertising, click revenue. In some cases, people would host things on what we call premium file hosting sites, where there was perhaps a little bit more of a direct connection to money. And as an aside, the numbers that Michael just spoke of are just the ones that we knew of; that’s the absolute minimum. We know there was far more than that, because there are corners that you can’t necessarily get into because they’re paywalled or what have you, there are other barriers to actually scraping that type of material. And some of it was related to commercialized content like this.
Neil Fairbrother
Okay. Now you said in your paper that the Trichan Takedown project began in 2019, by which time these channels had been in operation for at least seven years, as you’ve just said, partly because of CloudFlare’s business model, CloudFlare being the Content Delivery Network we spoke about earlier. So what was that all about? What was CloudFlare doing? How does that business model allow this stuff to stay online?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So I would clarify part of that: by the time we got into the real takedown efforts with the Trichan boards, they were no longer being fronted by CloudFlare. But to get back to the first part of your question, CloudFlare’s business model is such that they have two tiers of customers. They have their big paying customers, and then they have what’s called a free tier of customers. And I think it’s pretty easy to guess where you start seeing the problems when it comes to their customer base. When you’re dealing with free on the internet, it’s not actually free, right? There’s some reason that a company would give you free services, and it tends to come at the expense of properly policing them, I would say.
So you can easily spin up a website and put it behind CloudFlare and get this massive Denial of Service protection, and you’re also insulated from anyone knowing where you’re actually being hosted. So Neil, if you were to go set up a website somewhere in the UK and you put it behind CloudFlare, and I wanted to find out where you were hosted, I wouldn’t necessarily be able to right away, because it would tell me that you’re hosted with CloudFlare. It’s a way of hiding your backend.
The reason that they do it that way is somewhat legitimate in that if you’re hosting a website and someone’s going to deny you service, you need to hide where your server is actually located for obvious reasons. That’s part of the business model. However it creates another layer of sort of obfuscation for the sites that are doing things that are nefarious.
So it’s all well and good that CloudFlare will give you the information and say, oh, it’s actually hosted here, if you contact them. But again, it stymies the idea of being able to automate some of these takedown notices, because you always have to go to CloudFlare first to ask, can you please give me this information, rather than having it available and ready at your fingertips.
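You can see the hiding effect Lloyd describes with an ordinary DNS lookup. A sketch using a placeholder domain: for a CDN-fronted site, the addresses that come back belong to the CDN’s edge network, not to the origin server.

```python
# Sketch: DNS resolution of a CDN-fronted domain returns the CDN's edge
# addresses, not the hidden backend. "example.com" is a placeholder.
import socket

addrs = {info[4][0] for info in socket.getaddrinfo("example.com", 443)}
print(addrs)  # for a fronted site, these IPs belong to the CDN, not the host
```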
The other piece with CloudFlare that I find interesting is that they fight tooth and nail to say they don’t host any content, which is the most ridiculous thing in the world. We know that CloudFlare is probably the biggest hoster of content on the planet, because of the scale of their CDN, by its nature. So in the example I gave earlier, where Michael’s accessing the content of a US site, he’s hitting those CloudFlare nodes to pull off all of the images and things that are cached by CloudFlare. CloudFlare sees all of this content. It sits on their servers; they know about it and are aware of it. So the idea that they have no responsibility for it is, I think, completely false.
Neil Fairbrother
So a cache, just to be clear, is a straightforward copy, is it?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Yup! Basically it’s the bit in the middle, so it pulls the original off of that backend server that they’re hiding, and then stores a copy of it on their service. So I would say they’re in possession of the content in that case.
Neil Fairbrother
Okay. All right. So let’s try and bring all this together then with what actually took place during the Trichan Takedown, because you started by serving the message board channels themselves, 180Chan, 144Chan and 155Chan, with takedown notices. So for you, it was business as usual: you’d come across some content, you served some takedown notices, I think in early 2019. What was the response?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Actually, a slight modification to that: I’d say we first contacted what I’d call the hosting provider before we contacted the image boards themselves. That’s typical practice for us. There was a group of what I’d call sketchy hosting providers in the Netherlands that seemed to be clustered around the hosting of these particular boards. In an arrangement like that you have to rely on the hosting provider to go and provide the content administrator, in this case the administrator of the Trichan boards, with the notices that we were sending.
What we noted was that there wasn’t really a lot of uptake on the notices we’d send. We’d repeatedly send notices and there was back and forth with the hosting providers, but nothing was really getting done. We did have a few reach-outs from the actual administrator of the boards; there was some back and forth relating to some material, arguing about the illegality of it or what have you. But for the most part, our notices were being ignored.
Neil Fairbrother
Okay. So having been unsuccessful with a direct approach, so to speak, with these message boards, you came up with what I thought was quite an ingenious indirect approach involving the Tier 1 ISPs, which resulted in almost a chess match between you, the Trichan message boards and the ISPs. What happened next?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Yeah, so basically what happened was we were building up more and more evidence of what was sitting on these boards. As we crawled and scraped the image boards, we’d have analysts classify the material, and the detection rate of Arachnid would go up and up and up. Each day we’d see another thousand images that had been posted, or that we’d newly classified on there. We’d been notifying them; they hadn’t been taking it down.
So at that point we were looking for possible solutions, going around to the internet service providers that were providing them access. The interesting thing about these particular providers was that they were playing a game of upstreams with each other. So the hosting provider that we were contacting had two other upstreams, two jumps upstream, that we talked to about this, and who also seemingly ignored us.
What was interesting was that these two providers sat between them and the larger, more reputable companies. I believe, though I can’t prove it, that they were all colluding; they were essentially the same company, providing insulation for the companies below them.
So what happened was we went up to what I’d say are the more reputable providers. It wasn’t one Tier 1 provider; it was a couple of Tier 1s and a Tier 2 provider that were upstream of the edge of what I’d call this bad network.
And what we did was compile a large amount of evidence that had been recorded within Project Arachnid, and we went to these three large providers, like I said, two of them Tier 1 and the other more of a Tier 2, saying: we’re not telling you what to do, but here are the notifications we’ve provided to these companies, no action’s been taken, you’re providing transit services for them, what can you do?
Now, I’d like to say that process was quick; it obviously wasn’t. When you’re dealing with Tier 1 providers, I think you’ll probably find a lot of reluctance to interposing in any kind of activity like this. But on the other hand, you can wield that power: you don’t really want your brand dragged through the muck when it comes to hosting child sexual abuse material, no one wants that next to their brand. So I think in this case we probably got a little more attention because of both the scale and the egregiousness of this particular actor.
So it didn’t happen instantaneously, but there was some back and forth where, well, I don’t know what goes on behind the scenes, but what likely happened was they had conversations with the people that do the peering arrangements between these companies. So that Tier 1 provider says: I’m peering with this other company, why am I doing this? What’s going on here? Can we rectify this? And it got to the point where one of the Tier 1 providers acted. Instead of sinkholing that entire internet service provider, because as a Tier 1 you don’t want to do anything that’s going to cause collateral damage, if you can be surgical about something and specifically take out those three image boards, that’s probably the option you want to go with.
And like I said, we didn’t prescribe any approach these companies should take. We simply gave the evidence and said: no action’s being taken, what can you do? So there was a little game of what I’d call whack-a-mole, where a few of the Tier 1s would sinkhole, or null route, the IP address that was hosting the Trichan websites, and then all of a sudden they’d pop up on another IP address on that same provider. So you have to ask the question about the sort of collusion between the content administrator and these hosting providers; it seems to me that they would have had some idea that something like this was happening.
But this went back and forth for a while, so you’d have the Tier 1 providers moving to black hole the next IP address as the boards moved. Eventually, and again I can only speculate because I only see it from this side of the desk, my guess is that the hosting provider got sick of moving the boards around, gave up, and kicked them off their service, because at some point it’s not worth it as a business to keep allocating new IP addresses that just get burned by Tier 1 providers like this. So what happened was the site went down for a while, and then they moved to a different provider and decided to rinse and repeat with that.
There was another phase as well, just one last thing: a provider similar to CloudFlare, providing CDN and anti-denial-of-service services, but smaller and, I’d say, sketchier, so I won’t name them. They basically got in front of the Trichan boards so we couldn’t see where they were hosted. And so we had to engage in a whole back and forth again about, okay, we need to know where this site is hosted, because their response was similar to CloudFlare’s: well, I can’t tell you where it’s hosted, I’m not responsible for it because I’m not hosting the content, I’m just this wall in front of it.
Neil Fairbrother
Okay. Now you referred to a technique called Null Routing. So from what I understand, the Tier 1s would Null Route the traffic that was destined for the IP address of one of these message board channels, which I think means that although the content is still there, the people coming to it, the traffic going to it, can’t get at it: it’s diverted off, given an error message of some sort, and you can’t actually get to the content, even though it’s still there. Is that right?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
That’s correct. And it’s actually only for a certain proportion of internet users. To cover the entire internet, you would have needed all three of these particular Tier 1 providers to play ball. So you’d need them all to Null Route it. When the first one Null Routed it, the site would have been inaccessible to a large proportion of the internet, so maybe 40% of their traffic would have disappeared. But once we were able to get the other large providers to play ball, that inevitably becomes a hundred percent. So like you said, the content is still there, but no one can get at it.
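Null routing is conceptually simple: the backbone router is told there is no valid next hop for the target address, so packets to it are dropped. A toy illustration, using documentation-reserved example addresses:

```python
# Toy illustration of null-routing: packets for a black-holed prefix are
# silently dropped; everything else is forwarded as usual.
import ipaddress

BLACKHOLED = [ipaddress.ip_network("203.0.113.0/24")]  # example prefix only

def forward(dst: str) -> str:
    ip = ipaddress.ip_address(dst)
    if any(ip in net for net in BLACKHOLED):
        return "drop"               # the content still exists upstream,
    return "route to next hop"      # but traffic through us never arrives

print(forward("203.0.113.7"))   # -> drop
print(forward("198.51.100.9"))  # -> route to next hop
```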
Neil Fairbrother
Okay. Now I think at one point the service provider in the Netherlands where this stuff was being held, referred to as service provider P5 in your paper, came back and said that they weren’t hosting any of the content, they were only acting as a proxy, which sounds like a similar argument to the CDN argument.
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
Yeah. And that was exactly it with this particular provider. They were reverse proxying the content. So it’s a similar argument, and maybe one with a little more ground to it, but you’re knowingly doing something that you’re not supposed to be doing, right?
So if you’re going to reverse proxy something like that, and you’ve been told that you’re reverse proxying child sexual abuse material, you have sort of two options there. Number one, you stop doing it, which is the preferable one, instead of just saying, you know, I can’t be involved, I’m just reverse proxying.
Or number two, you let us know what the backend IP address is: if you’re reverse proxying it, you tell me where the server’s located. So there was some back and forth, and eventually they disclosed the backend server IP address to us, so we were actually able to find out who the hoster was. And it was another related provider in the Netherlands, closely related to the original provider they had been sitting on. So more fun and games.
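For readers unfamiliar with reverse proxying: the client talks only to the proxy, which fetches content from a backend whose address it alone knows. A bare-bones sketch with a placeholder backend address; real deployments add TLS, caching and header rewriting.

```python
# Bare-bones reverse proxy sketch: clients see only this server's IP;
# the backend address (a placeholder here) is known only to the proxy.
from http.server import BaseHTTPRequestHandler, HTTPServer
import urllib.request

BACKEND = "http://192.0.2.10"  # hidden origin server (placeholder address)

class ReverseProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        with urllib.request.urlopen(BACKEND + self.path) as upstream:
            body = upstream.read()              # fetch from the hidden backend
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                  # relay it as if it were ours

if __name__ == "__main__":
    HTTPServer(("", 8080), ReverseProxy).serve_forever()
```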
Neil Fairbrother
Okay. Now, Michael, I just want to clarify something here. This content is illegal. There are no privacy carve-outs for hosting this stuff. There’s no freedom of speech carve-out, there is no First Amendment carve-out, there is no Constitutional carve-out. There is no excuse whatsoever, I believe, for anyone to be legitimately hosting this stuff. Is that correct, Michael?
Professor Michael Salter
I agree with you in principle, Neil, but it’s not correct in practice. You know, the bad network that Lloyd’s describing was clustered in the Netherlands. The Netherlands has a notoriously laissez faire approach to internet governance that prioritizes freedom of speech, but effectively what freedom of speech in this context means is that it prioritizes the financial and corporate interests of the technology sector over public safety and child protection.
The Canadian Centre for Child Protection this year released a major report looking at the jurisdictions in which the child sexual abuse material that Arachnid has detected is hosted, and the Netherlands was found to be hosting 50%, half of all child sexual abuse material detected by Arachnid over the last couple of years. So the Netherlands in theory does not tolerate this material, but in practice, what we’ve found, and certainly what this case study illustrates, is that there are certain jurisdictions where bad actors know that they can operate, and they are essentially operating with impunity.
Now, law reform efforts are underway in the Netherlands but we need to be very clear that there are jurisdictions in Europe and around the world where child sexual abuse material can be hosted. There are also corners of the technology sector and private sector actors in the internet infrastructure stack, delivering services and so on, who knowingly collude in the trade in child sexual abuse material.
Lloyd referred earlier to premium file downloading services, where CSAM offenders can store large amounts of CSAM and essentially sell access to that material. Now, with premium file downloading services, there’s clear evidence that some of those companies know that they have cornered the market in pedophilia, and they choose not to proactively screen the content that sits on their service.
So, you know, we have a technology sector that has been largely unregulated now for decades, and CSAM and CSAM offenders have become parasitic on the internet to such an extent that we’re now facing a global child protection catastrophe, I think.
Neil Fairbrother
I believe at one point the admin of the Trichan message boards attempted to deceive the Canadian Centre for Child Protection, or C3P as it’s known, and also possibly their own hosting provider. In this instance, they were now on a service provider known as P4 in your report, and the administrator used a technique called “referrer protecting” images. What is that?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So it’s not an uncommon practice. Some folks don’t want people linking to content on their site because they’re monetizing it in some way. What that means is that if you take a URL to a specific image and place it directly in your web browser, you’re going to get an error message, because they want you viewing it on their site. They don’t want people essentially stealing images off their sites.
Originally the Trichan image boards didn’t have that sort of protection in place, but once it appeared, our notices on particular images would just cause confusion. We’d send a notice on a particular image, and then the hosting provider would tell us, oh, we’re getting a 404, this image isn’t up anymore, and we’d have to explain to them: no, you have to actually look at it on the actual site, and then you’ll see the image come up.
It’s another one of those tricky things that comes into play when you’re dealing with automation at high scale, right? I don’t think they were doing this to prevent people from viewing the images on their site; they were doing it because they knew it would cause issues with the transmission of notices from hosting provider to content administrator.
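In server terms, referrer protection is just a check on the Referer request header before an image is served. A framework-free sketch, with a placeholder site name:

```python
# Sketch of referrer-protecting an image: serve it only when the request
# claims to come from the board's own pages. Site name is a placeholder.
ALLOWED_PREFIX = "https://board.example/"

def serve_image(request_headers: dict, image_bytes: bytes):
    referer = request_headers.get("Referer", "")
    if not referer.startswith(ALLOWED_PREFIX):
        return 404, b"Not Found"   # direct links and naive crawlers see this
    return 200, image_bytes        # in-page views load normally

# A hotline re-checking a reported URL must therefore send the board's own
# Referer header, or it will wrongly conclude the image has been removed.
```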
Neil Fairbrother
Okay. Now, another technique that I think they used to try to protect themselves was to exploit a strength of Project Arachnid, which is that it uses image hashing to identify CSAM. The admin tried to exploit this by adding what was described as “digital noise” to the images on their site. What is digital noise? Was it successful?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
So when we send notices on child sexual abuse material that we find on hosting providers, within the notice itself we’ll include a cryptographic hash of the image in question, saying this was the cryptographic hash of the image we found at this particular URL at this particular point in time. My guess is that the administrator of the Trichan boards believed that we only match on exact images. So his theory was: okay, I’ll take an image and change a couple of pixels in it, which, when you’re dealing with cryptographic hashes, completely changes the hash value. So every time you’d go to view an image on the site, it would technically look like a different image, which means that when we reported an image, it would never actually match what we were saying.
But unfortunately for them, we use a perceptual type of hash, which I think you’ve talked about in one of your previous podcasts with the wonderful Dr. Hany Farid. We use PhotoDNA behind the scenes to match similar images, which made it so we continually sent notices on all of these images. It really didn’t have a big impact, but it was interesting to see the lengths of the countermeasures that this particular individual went to, to try and keep the site up for as long as possible.
But I would add that this all wasn’t discovered in two seconds on our side of things. If you can imagine what goes on on our side to determine: okay, I’m looking at this image, it’s the exact same image, let’s do a diff on these two images to see what’s actually different about them; okay, there are random pixels changed, this guy’s actually messing around like this. We have a lot of notices we send to a lot of providers. If you think of the amount of time and energy we had to put into continually going after this guy and his little ways, this little game of cat and mouse was unbelievable.
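The pixel-tweak trick, and why it fails against perceptual hashing, can be shown in a few lines. This uses a simple “average hash” as a stand-in; PhotoDNA, which C3P actually uses, is proprietary and far more robust.

```python
# Why "digital noise" defeats cryptographic hashes but not perceptual ones.
# aHash below is a simple stand-in for a perceptual hash such as PhotoDNA.
import hashlib

def sha256_hex(pixels):
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

def ahash(pixels):
    """8x8 grayscale 'average hash': one bit per pixel, above/below the mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return "".join("1" if p >= avg else "0" for p in flat)

original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
tweaked = [row[:] for row in original]
tweaked[0][0] += 1  # the admin's "noise": change a single pixel

print(sha256_hex(original) == sha256_hex(tweaked))  # False: crypto hash broken
bits = sum(a != b for a, b in zip(ahash(original), ahash(tweaked)))
print(bits)  # 0 here: the perceptual hash still matches, so notices keep flowing
```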
Neil Fairbrother
Okay. Now this project started, I think, in early 2019, March I think, and by November the 18th, 2019, the Trichan admin gave up and posted a message that basically said they’d run out of money. You mentioned earlier that it’s not necessarily a free ride to host this stuff. But he also said in his message that he “…decried the increasing online censorship of CSAM, without which, he suggested pedophiles will be left totally disconnected from their sexual identity”. And he worried that “…little girls will not be allowed to fully express themselves in a world that censors nude and sexual images of children”. Michael, what on earth can we make of a message like that?
Professor Michael Salter
I think it’s really important, because what it indicates is that very serious child sex offenders are part of a broader online social movement that sees pedophilia not as a sexual pathology or a sexual disorder, but as a legitimate sexual identity. And unfortunately this is a social movement that has significant overlap with discourse that we see on social media, on Twitter. There have been academic champions of the view that pedophilia is a legitimate sexual orientation, even if its expression is harmful. And this child abuser is linking his views very clearly to libertarian ideals of a completely uncensored and unregulated internet: that the natural state of the internet is that it somehow stands above terrestrial law and operates in some other dimension, as it were, and that it’s a place where, you know, little girls can express themselves through exploitative and sexual images.
And so I think it’s a really significant statement from the administrator of these child abuse websites, because it clearly points to some of the ways in which pro-pedophile advocacy has been moving into the mainstream over the last 5 to 10 years, and we might have significant reason to worry about some of the kinds of discourse about child sexual abuse legitimization that we’re seeing on social media, and even from certain advocacy organizations.
Neil Fairbrother
Okay, now we are rapidly running out of time, so we’ll have to wrap up fairly soon. Let’s be as brief as we can on some key points, because there are some quite startling takeaways for me, anyway, from your paper on the Trichan Takedown program. The first of which is that at a “…granular level beneath the higher order concerns of multi-stakeholder dialogue, providers of internet transit and other key services are revealed to be routinely entering into commercial arrangements with service providers and clients involved in abuse material”. How is that possible?
Professor Michael Salter
Well, I think we need to be really clear that ultimately the internet is largely owned and run by the private sector, and it’s governed by private sector prerogatives. By and large, private sector providers have not taken it upon themselves to impose ethical obligations, above and beyond basic legal requirements, on their customers and their clients, and they haven’t looked too closely at the sorts of activities their customers and clients are engaged in.
And as a result, what this means is that major legitimate companies, as our case study shows, are unknowingly providing services to bad actors who are trafficking in child sexual abuse material. Now, if they were to take an ethical lens to their business practices, then it’s possible to remove massive amounts of child sexual abuse material from the internet.
You know, in this case, when Trichan went down, we’re talking about, at a minimum, a quarter of a million child sexual abuse material images going offline in an instant. So rather than a hotline seeking individual removal of each of those images, a quarter of a million images, overnight the internet became a much safer place for children, and a much safer place particularly for the children in those images.
So, for me, I think this case study really teaches us the importance of looking beyond the kind of utopian vision of the internet as this decentralized, immaterial cloud that floats above us and is made possible by collaboration between government, civil society and the private sector. We need to get real about the profit prerogatives that drive the internet, and the ways in which those prerogatives have facilitated child sexual abuse material.
Neil Fairbrother
Speaking of governments, another takeaway for me is that you say “…in their zeal to realize the economic promise of e-business, governments have exempted internet service providers from the same responsibilities to prevent the circulation of illegal content that is required of traditional media”. Is that the case?
Professor Michael Salter
It’s absolutely the case. So, you know, I can access the New York Times here in Australia; I could receive it in print or online. The New York Times does not print child sexual abuse material, and if it did, it would be held to account by law in the United States. And yet the United States does not impose that obligation on other kinds of content distributors, particularly online content distributors.
So we’re in a bizarre situation where individual internet consumers face increasingly punitive sanctions for accessing illegal content such as child sexual abuse material, when that content is legally delivered to their computers by massive multi-billion dollar private sector organizations. No other media distributor enjoys such privileged status, and it is completely reasonable for the community to expect internet service providers to be held to the same standards as a newspaper or a TV station or a radio station.
Neil Fairbrother
Is this bordering on a Section 230 discussion?
Professor Michael Salter
Look, I mean, I think it obviously is, and Section 230 is just one of many challenges that we face around internet regulation. But it is worth recognizing that, you know, myself here in Australia, Lloyd in Canada, when it comes to our work online, we are operating effectively under United States law. What we find is that the basic community expectations of my country are not being met by the internet services that are delivered to us; they don’t reflect our laws, they don’t reflect our expectations.
And it’s not only the United States; obviously this case study illustrates the ways in which the Netherlands has not been fulfilling its global obligations to children all around the world. And so this vision that we’ve had of the internet now for a quarter of a century, as this collaborative, utopian, sort of libertarian ideal, has clearly failed. And you know, neither Lloyd nor I want a censored internet; we don’t want a restricted or limited internet, and we appreciate the many benefits that the internet has delivered. But that does not mean that we can any longer camouflage the harms that the internet, as it’s currently configured, is doing to children.
Neil Fairbrother
In the UK, we have the draft Online Safety Bill working its way through the parliamentary process, and there’s an opportunity here perhaps to influence it, because one of the other takeaways for me is that in the case of the technology industry, there are “…no legal obligations to deny service to customers who collude or facilitate child sexual exploitation”, and that such commercial arrangements are fundamental to the ongoing CSAM epidemic. Is there a place for such legislation?
Professor Michael Salter
I mean, I think there absolutely is. The online safety reform in the United Kingdom is certainly world-leading, and I think governments around the world are watching very closely how it progresses. The focus in the UK has been on a statutory duty of care, particularly for social media companies, and while that is welcome, I think that duty of care should be extended to all internet service providers.
It is not much to expect that an internet service provider does not extend service to a bad actor who is known to ignore, or indeed flout, notifications about child sexual abuse material. And we have a network of approximately 50 hotlines around the world, including in the United Kingdom, who would have records of noncompliant service providers that are not responding to these notifications. In fact, we don’t even need law. We just need the technology companies and telecommunications providers to agree amongst themselves to these basic ethical standards on which they found their business.
Neil Fairbrother
Okay. Michael and Lloyd, I think we’re going to have to end it there. Thank you so much for your contribution to this debate and for taking us through the Trichan Takedown, an absolutely fascinating project. Is it going to be repeated?
Lloyd Richardson Director of Technology for the Canadian Centre for Child Protection (C3P)
It has been repeated and I bet it will be repeated again in a different context.
Neil Fairbrother
Okay. Thank you so very much. Thank you.