In this Safeguarding Podcast we discuss with Denton Howard, Executive Director of INHOPE, their international network of Hotlines. We explore the EU’s Strategy for a More Effective Fight Against Child Sexual Abuse, ask whether the Child Abuse Directive is still fit for purpose and whether Project AviaTor has taken off, we See No Evil Hear No Evil, and examine why Holland has such a problem with CSAM.
There’s a transcript below, lightly edited for legibility, for those who can’t make use of podcasts, or for those who simply prefer to read.
Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.
The Council of Europe estimates that one in five children in Europe falls victim to some form of sexual violence and in the past few years, there’s been a dramatic increase in reports of child sexual abuse online within the EU from 23,000 reports in 2010 to more than 725,000 in 2019, which included more than 3 million images and videos.
Reports indicate that the EU has become the largest host of child sexual abuse materials globally, from more than half in 2016 to more than two thirds in 2019. Partly as a response to this crisis, the EU last year published their Strategy for a More Effective Fight Against Child Sexual Abuse, and today’s guest, Denton Howard, Executive Director of InHope, will explain it to us. Welcome to the podcast Denton.
Denton Howard, InHope
Well, thank you very much, Neil. Thank you for inviting us to speak with you today.
Can you provide us Denton with a brief resumé so that our listeners from around the world have an appreciation of your background?
Denton Howard, InHope
Okay, Neil, thank you for that. Well, my name is Denton Howard and I’m formally known as the Executive Director of InHope, which is the international association of internet hotlines. We’re an organization of 47 member Hotlines spread across the globe.
Just to briefly explain what a Hotline does. A Hotline offers the public a way to report anonymously, on a national basis, any content that they feel may be illegal in regard to child sexual abuse material, and some other topics as well. The Hotline will then investigate that report and check it for illegality, and if it is confirmed child sexual abuse material, that report will then be acted on.
And if it’s found to be hosted in another country, that information will then be shared with their law enforcement partners and their industry partners, but more importantly, it will be shared via the InHope network, so that the Hotline in the country where the content is hosted is advised of the report and can take action rapidly, in many cases almost instantly, to get that content removed from the internet, with the evidence preserved for investigation purposes by law enforcement.
But the objective is to get that content down from the internet as rapidly as possible. I don’t want to bog you down with too much detail, but we do that with an infrastructure, and it’s very important to mention this. It’s an infrastructure that sits in the background that nobody sees, but it’s funded through the European Commission, and without that facility we simply wouldn’t be able to do our job.
Thank you, Denton. Now, although we’re talking about the EU, the InHope network does stretch beyond the EU, I believe?
Denton Howard, InHope
That’s correct. We are present in all the EU member states bar one, temporarily, and that’s going to be addressed very rapidly. But our 47 members are spread across the globe: North America, South America, Southeast Asia, Australia, New Zealand, Africa. We have a member in South Africa and hopefully we’ll be expanding that as well.
And the UK is represented by the Internet Watch Foundation.
Denton Howard, InHope
That’s correct. They would be one of our founding members.
Now, when it comes to the report I mentioned which we’re going to discuss, there are a number of other related organizations involved, and I just wonder if you could spend two minutes briefly explaining them. First of all, there’s the EU itself. There’s the European Commission and the Council of Europe. What’s the difference between those organizations?
Denton Howard, InHope
Okay, well, I’m not a political analyst so I’m going to give it to you from the practitioner’s perspective. So there might be slight textual or legal definitions that I’m going to basically ignore because I don’t know. But the European Commission is in effect the executive arm of the European Union. So it’s like the civil service of the European Union.
The European Commission is split up into different DGs or Directorates-General, just like in a government there are different departments. Each of those departments is headed by a Commissioner, and each of the Commissioners reports to the President. And then each of those departments or Directorates-General has a hierarchy that sits below it, and they enact their areas of activity, just like government services.
They propose and put forward legislation which then goes to the European Parliament, which everybody’s familiar with, and the Commission’s proposals are then debated and voted on, just like in a normal parliamentary system.
The European Commission then implements policy. So in my world, the primary organizations or DGs we would deal with would be DG Home Affairs and DG Connect. DG Connect deals with internet safety, awareness and programs. DG Home Affairs deals with the police and security aspects of implementation.
The Council of Europe, one of the key things that it does is it identifies key areas of policy and then makes recommendations to the European Parliament and Commission. One key part of that, which I am involved in, is the Lanzarote Committee, which deals with developing a European policy in regard to online child sexual abuse. It has many other remits as well, but that is the part I’m familiar with.
I suppose in the interest of openness and transparency, I have to say to you that the European Commission is one of our large funders, mainly through projects, in partnership with our awareness partners, the EUN who run the InSafe project in Europe.
Okay. Well, we might look at the InSafe project later in our discussion. So you recently published an eight part blog post series, which is very interesting and it explained the EU’s Strategy for a More Effective Fight Against Child Sexual Abuse.
So let’s dive into this. There are eight parts, eight points, and the first one was to “Ensure a complete implementation of the current legislation”, the current legislation being the Child Sexual Abuse Directive of 2011. Now this current legislation, the Child Sexual Abuse Directive, was published 10 years ago, in 2011. So what areas are left to implement after 10 years? Surely it’s all been done?
Denton Howard, InHope
Well again, I have to start this conversation with a few caveats. I’m not so much a political scientist in terms of how their policies on a political level get implemented. I’m more of an in the trenches kind of guy, and we’re kind of very focused on the mission first and foremost. So sometimes the political nuances get a little bit lost.
But what I will say is that when the European Parliament votes on legislation and it goes into a Directive, it then has to be transposed or implemented on a national basis, and sometimes different countries are at different stages in their awareness or priorities in regard to how they will implement a Directive in their national legislation.
Now from a distance, I have to be honest, I find it difficult to understand why there is such a massive delay in the implementation. I’ve heard individually, you know, people say “It’s on our plan of legislative implementation”, but at the same time, I have to say to you, I do not have an explanation as to why it takes so long to implement something which is so important.
Indeed. How can the public keep track of this, do you think Denton? Is there a league table of progress that is easily referenceable by the general public?
Denton Howard, InHope
The straight answer is, if there is, I’m not aware of it. And let me just briefly explain. In the European Union you have something like twenty-seven different Parliaments, each on different parliamentary cycles, each with different priorities, each with different cultural approaches and languages.
From our perspective, there’s one particular thing, which is that in many legislative texts around the world the term is “child pornography”. And we never use that expression in our day-to-day business. We refer to it, and you referred to it also, as child sexual abuse material. Now the thing about it is that if you don’t have alignment in language, then it’s very hard to align with legislation.
I think ultimately, if there’s political will to do something, things happen. So all I can say to any of your listeners to this podcast is, wherever you happen to be, check the legislation that operates in your country: does it say child pornography, or does it say child sexual abuse material? If it says child sexual abuse material, chances are the rest of the language and legislation is probably well up-to-date. If it doesn’t, please talk to your local political representatives and please get them to address that, because if they deal with that issue, in effect and indirectly they will deal with many of the other issues. I mean, that’s a little bit of a personal rant, but it’s an important thing.
The second blog post in your eight part blog post series dealt with the issue of “Ensuring that EU legislation enables an effective response to child sexual abuse”. And in the blog post entry, you say that the Commission intends to assess whether the Child Abuse Directive requires updating. Now as we just mentioned, the Child Sexual Abuse Directive is 10 years old. So is it still fit for purpose? Is it still relevant? Because an awful lot has changed, particularly online, in the last 10 years?
Denton Howard, InHope
Well, legislation that governs things that are evolving should always be reviewed on a regular basis, because it is not carved in stone. And the comment that we made on the blog is a reminder to say, look, if something is evolving, then you need to review that legislation to make sure that your legislation evolves in equal measure.
I’m going to take a line from a song. I can’t remember who sang it, but it was like, “Don’t get in your own way”. In other words, you should never write legislation which will stop you from doing what you’re trying to do in the first place. Regulators have to remember that we’re dealing with the internet in a fluid sense. You ask, is the legislation up-to-date? But it’s not good enough to just look at whether it is up to date for today.
It should also be forward-looking. Is it thinking, if you like, a little bit around the corner, not just where we are now? I think it’s probably an impossible ask, but it should be an aspiration to make legislation as future-proof as is practical. Try to anticipate issues. Try to keep the language generic rather than specific, so that if one specific thing changes, or if one piece of legislation hangs on one word and that word ceases to be relevant in that specific sense, then all of a sudden… so I will give you an analogy.
There were people using mobile phones while driving, and the law said, well, hold on, you were on the telephone. And people were arguing, “I wasn’t on the telephone, I was using a personal device and it was using an internet connection to communicate, I wasn’t using a telephone device”. And so convictions failed because of those kinds of wordings.
The legislation has to be thought through. So again, you asked me the question in terms of the look back: is it relevant? Is it still relevant today? That review has to happen, but it should also be asking, is it going to be relevant into the future?
Okay. Well, one area that may well be for review in the EU Strategy for a More Effective Fight Against Child Sexual Abuse is the eCommerce Directive. And the reason I say that is that the eCommerce Directive is the EU equivalent of the American Section 230, which grants platforms immunity from liability for content posted on their site and Section 230 itself is under intense pressure for being at least modified if not scrapped, or secondary legislation passed that has the effect of modifying it. So should the eCommerce directive be looked at again in terms of liability for content on these social media platforms?
Denton Howard, InHope
Well, I can’t give you a straight answer to that question because I’m simply not a practitioner in that area. Again, it comes back to the point of regular review of legislation to check that it is still relevant and still up to date. And does it still match the aspirations of the society in terms of what you want it to be? The eCommerce Directive covers many different strands. You can’t just go in and modify it for one thing, you have to look at the whole thing, and I wouldn’t like to underestimate what would be involved in that… It’s also important to say that when you modify a directive, there’s the cascade effect. Assuming it’s all agreed and gets through the parliamentary side, then it has to go through consultation with the different national governments. And then they have to figure out a legislative plan to implement it in their own laws. So think about the waterfall. It takes a long time for the water to get all the way to the sea.
Okay. Now InHope I believe launched a campaign “See no evil. Hear no evil”, which was related to this particular point of ensuring the EU legislation enables an effective response. What was the “See no evil. Hear no evil” campaign, Denton?
Denton Howard, InHope
I need to give you a little bit of background first, to give it a little bit of context. The ePrivacy Directive was originally put together using again, old concepts and technologies. It’s basically to protect the privacy, your privacy, my privacy, in communications that we have, but it was before PhotoDNA.
Now PhotoDNA was a technology that was developed originally in partnership between Microsoft and Dartmouth College, led by a man called Hany Farid, who you may have heard of. And they developed this basic technology, which allows you to compare files without actually looking at them. So what you do is you have an algorithm, which you put a file into, and it spits out a big number.
So let’s say you know a file is illegal. So you have a sample and that gives you a sample output number. Then using technology, you can scan millions of files almost instantly and you compare the numbers that come out of that with the number that you have on the sample. If you got a match, you know that that file is a match of this file, and if we’re talking about child abuse material, that means that we know it’s child abuse material, but without actually having to look at the file.
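The matching described above can be sketched in a few lines of Python. This is a minimal illustration only: it uses a SHA-256 digest as a stand-in for PhotoDNA, whereas real PhotoDNA computes a proprietary *perceptual* hash that survives resizing and re-encoding; the function names and sample data are hypothetical.

```python
import hashlib

def fingerprint(file_bytes: bytes) -> str:
    # Reduce a file to a fixed-size number. (Stand-in: a cryptographic
    # hash; real PhotoDNA uses a perceptual hash robust to re-encoding.)
    return hashlib.sha256(file_bytes).hexdigest()

# Fingerprints of files already confirmed as illegal by a human analyst.
known_bad = {fingerprint(b"<confirmed sample 1>"),
             fingerprint(b"<confirmed sample 2>")}

def scan(files: list[bytes]) -> list[int]:
    # Return the indices of files whose fingerprint matches a known
    # sample -- no one ever has to view the file contents themselves.
    return [i for i, f in enumerate(files) if fingerprint(f) in known_bad]

uploads = [b"holiday photo", b"<confirmed sample 2>", b"cat video"]
print(scan(uploads))  # -> [1]
```

The key property is the one Denton describes: only the numbers are compared, so millions of files can be checked without anyone looking at their contents.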
So companies were able to do this on a massive scale and detect child abuse material automatically on their services, but without actually going in and looking at anybody’s content and they were able to then take action. That technology is called pseudo-anonymization. And so what that means is that that activity under the wording of the ePrivacy Directive, because it was designed before this technology existed, could be seen as potentially illegal, breaking the Data Privacy regulation.
So many companies, and InHope, have said that because the Directive means this pseudo-anonymization of data could be seen as infringing data rights, it could be illegal to operate. And so we campaigned that that couldn’t be allowed to happen. To do that required a thing called a Derogation, in other words an exception to the rule, and that had to happen before the ePrivacy Directive came into full force on the 21st of December.
We mounted this campaign to try and in effect lobby or encourage and educate the European Parliament committee that was handling this, in this case the LIBE committee, to see the importance of this issue and that if it failed, and if the Derogation was not granted, that many companies potentially would have to reassess their use of these technologies and stop doing it.
Facebook were an implementer of this technology, and under the US system, if they become aware of any content on their platform, they must automatically advise the authorities in the US. Now the authorized body in the US is NCMEC, and they’re also our member in the US. In 2019, there were 17 million reports to NCMEC, the vast majority automatically detected using PhotoDNA or similar technologies. That is 17 million incidences of child abuse. Now the question there is, if the technology gets switched off, that content and traffic won’t stop, but we will not see it.
You mentioned the “See no evil” [campaign]? Everybody complains, industry must do more, industry must do more, and we agree industry must do more. But you can’t expect them to do something and then tell them they can’t use any of the tools at their disposal. So that is why we put up the campaign, “See no evil”. Now I’m sure that’s probably the longest answer to a question that you ever wanted, but I needed to give you the context.
Denton, that’s fine. The third blog post you wrote on the EU Strategy for a More Effective Fight Against Child Sexual Abuse, concerns “Identification of legislative gaps, best practices and priority actions”. Now, with all of these, we could spend a very long time discussing each point, but in relation to what you were just talking about, PhotoDNA and the finding of this kind of content, CSAM content, this section refers to the point that “offenders have become increasingly sophisticated in their use of technology and technical capabilities, including encryption and anonymity, particularly peer-to-peer file sharing and the use of the dark net”. What is the impact of end-to-end encryption here for online child safety?
Denton Howard, InHope
There’s a lot in what you just asked, so I think I probably need to break it down. There were sort of three elements to your question. The first one is legislative gaps. Just to keep this conversation simple, I’ll just talk about the European Union, but this applies across the globe.
There are differences in interpretation in terms of levels of legality across different countries and to give you a specific example, there are a number of countries where possession of child sexual abuse material is not actually a criminal offense that you will end up in prison for. In other countries, it is a very severe offense. Now, you might ask the question “How did somebody come to possess it? Where did they get it?” They have to download it from somewhere. So the thing is the different legislative gaps have to be bridged, in other words [there needs to be] consistency across the board. Okay. So that if it’s illegal here, it’s also illegal there. Okay. That’s the first thing.
Then you mentioned offenders using different technologies, avoiding situations and becoming more savvy. We’re not just talking about child sexual abuse material; no criminal wants to get caught, and they will use disguises. They will use whatever way they can to avoid getting identified and prosecuted. So the technology just happens to be a different set of tools, and they will always use it. In many cases the really smart guys are very hard to catch, so you always try to go for the low-hanging fruit, and that’s what police officers in effect have to do, in the hope that it creates examples.
Now, then you mentioned end-to-end encryption. Now end-to-end encryption is not new. It has been around for quite a period of time. If anybody’s a user of WhatsApp, or Signal, or Telegram, or any of those communications applications, they are encrypted from your phone to the receiving phone or device.
So it doesn’t matter if you intercept it; if you have tapped a phone, you’re just going to get nonsense. So in terms of person-to-person communication, end-to-end encryption is alive and well. But it does present challenges for everyone, for society. People will, on one hand, want to have secure communication between themselves and whoever they want to speak to.
And then we also have a situation: “Yeah, but what if they’re a child abuser or a terrorist? We want to be able to get into that communication.” This is a societal and legislative challenge. It’s not something that is new. It’s already out of the bottle. It’s a reality. While we may have concerns about it, we will have to develop solutions around it.
Now I am going to get ahead of what you’re probably going to ask me next, which is well what do we do about it? Well there are technologies there that can detect patterns within encrypted communication. While the data itself may be encrypted, there are certain things in the patterns within that data, which apparently do leave footprints. Now I am not a technical expert and it’s all in the theoretical realm at this point, but that’s probably something which will come down the tracks.
Okay. Thank you for that. The fourth blog post that you wrote on the EU Strategy for a More Effective Fight Against Child Sexual Abuse concerns “…the strengthening of law enforcement efforts at both a national and an EU level.” And I believe that InHope, to try to help in this area, is an active participant in the Project AVIATOR, which stands for Augmented Visual Intelligence And Targeted Online Research. What is Project AVIATOR?
Denton Howard, InHope
To set the scene, InHope and Hotlines can only exist with the cooperation, support and partnership of law enforcement, because we’re dealing in illegal material. InHope Hotlines are civilian organisations, so they do need to have those agreements, because Hotlines just can’t go off and do solo things.
So how we approach that is we work in partnership with law enforcement, and we do that on a national level for Hotlines and on an international level with organisations such as Interpol and Europol, who would be our primary law enforcement partners. So in the case of AVIATOR, AVIATOR is a project that we are a partner in. Now, again, I don’t want to get into too much technical detail, but I mentioned the reports that get sent to NCMEC, the big number in the earlier question, the 17 million reports.
Now, a lot of those reports go to the United States, and they are separated by what country they originated in, because you can do that by the IP address where the data comes from. IP addresses are categorized and then in effect they create, and I’m going to use a very American term, buckets. So there’s a UK bucket and there’s an Ireland bucket and a Dutch bucket and a Norwegian bucket. And basically as NCMEC identify reports, they say, okay, that report is from the UK, and they push it into the UK bucket. Just to give you the imagery.
Well, what was happening was that there was in effect a “Europe bucket”. Stuff was just being fired into Europol, and then they were trying to pull apart the reports: where does this go and where does that go? They were being asked to do a really difficult, almost administrative job.
So Project AVIATOR is a system which was created to automatically redistribute the reports, within very technical criteria, to the correct national police force, so that the right information went to the right people as quickly as possible. AVIATOR has grown; gradually many more police forces have been joining the project because they see the value in getting the right information to the right people as quickly as possible, and equally, and this is important, in not being sent false positives, so that they get to see the stuff that they need to see.
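The "bucket" idea is easy to sketch. The following is an illustrative simplification only, not AVIATOR's actual implementation: real systems resolve an IP address to a country via a GeoIP database, whereas the prefixes, field names and report data here are hypothetical.

```python
from collections import defaultdict

# Hypothetical prefix-to-country table standing in for a GeoIP lookup.
GEO = {"81.2.": "UK", "89.10.": "NL", "84.20.": "NO"}

def country_of(ip: str) -> str:
    # Resolve an IP address to a country code (simplified prefix match).
    for prefix, country in GEO.items():
        if ip.startswith(prefix):
            return country
    return "UNKNOWN"  # catch-all bucket for manual triage

def route(reports: list[dict]) -> dict[str, list[dict]]:
    # Sort incoming reports into per-country "buckets" so each national
    # police force only receives the reports relevant to it.
    buckets = defaultdict(list)
    for report in reports:
        buckets[country_of(report["ip"])].append(report)
    return dict(buckets)

reports = [{"id": 1, "ip": "81.2.33.4"},
           {"id": 2, "ip": "89.10.0.7"},
           {"id": 3, "ip": "81.2.99.1"}]
buckets = route(reports)
print(sorted(buckets))                   # -> ['NL', 'UK']
print([r["id"] for r in buckets["UK"]])  # -> [1, 3]
```

The payoff is exactly what Denton describes: instead of one undifferentiated Europe bucket landing on Europol's desk, each report lands directly with the force that can act on it.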
And then there’s another important thing, which is that the police are also concerned about impending danger: if there is something to indicate that somebody is going to be in danger of abuse, you know, not just stuff that’s happened historically. The police all over the world work really hard on that, and we try to support them as much as possible. And that’s what AVIATOR is all about.
Okay. You mentioned in your blog post that it’s being used by the national police in the Netherlands. But according to figures from the IWF, the Netherlands is actually one of the largest hubs, if not the largest hub, for CSAM. So what’s going on in Holland?
Denton Howard, InHope
The Dutch authorities are a victim of circumstance. I deal a lot with the Dutch national police and they do a fantastic job. They really do. And they work very closely with our member Hotline EOKM in the Netherlands and they have had to overcome, I would just say, a mountain. But let me explain why they have had to really work so hard and I can’t sing their praises highly enough. So if I sound like a bit of a zealot, it’s because I am.
The Netherlands is the largest hosting provider country of all content. Now let me just explain that. If you are a company that wanted to build a hosting infrastructure, it makes the most sense to actually be based in the Netherlands with your infrastructure for the simple reason the physical infrastructure is very easy to access. The taxation structure is very favorable, and I’m talking about fully legal corporate businesses. I’m not talking about bad people. It’s basically internet infrastructure friendly.
What many people don’t realize is that internet traffic normally goes via undersea fiber-optic cables, and a lot of them make landfall in the Netherlands. So you want to be as close to those cables as possible so that you have the quickest access to the internet. It’s all down to physics.
So consequently, the large-scale hosting providers, where the physical servers and data centers are, a great many of them are in the Netherlands. It’s disproportionate, is the polite way of saying it. There is a certain proportion of illegality on all services generally, and it just means that it’s a bigger bubble, and thus the Netherlands is over-represented. So that’s the macro explanation.
There were instances of a number of what you would call “Bulletproof providers”, Bulletproof Hosters, operating in the Netherlands, who, I won’t say knowingly hosted it, but they didn’t go out of their way to make sure their services were clean, and they did not take action proactively to remove content from their services. And they are being addressed in the legislative process in the Netherlands at the moment.
Okay. Point five of the EU Strategy For a More Effective Fight Against Child Sexual Abuse is all about “…enabling member States to better protect children through prevention.” And InHope is running a prevention platform called “Stop It Now”. What is the “Stop It Now” platform, what does that do?
Denton Howard, InHope
There’s many angles on prevention and awareness, but the Stop It Now program originally started in the UK and the Netherlands to a certain degree. Basically, it’s a way of offering support to potential or actual offenders who may not want to offend or who may want to stop offending. There’s a Stop It Now in the US as well, and there’s a few other programs around the world in a similar vein with slightly different names.
So in other words, it’s prevention from the abuser side first and foremost. There’s an organization in the UK, the Lucy Faithfull Foundation, who would be the leaders in this area, and that’s a really great organisation. But I will say that if we were having this conversation say six, seven years ago, I would very much have seen that as people talking to the dark side. I would have seen people dealing with offenders as betraying the victims to a certain degree, because we would all have had a very victim-centric perspective.
But I’ve kind of come around, I think many people have come around to the view that we can’t ignore the fact that if it’s possible to also reduce abuse material online by if you like, interdicting bad behavior at an early point, maybe we’re preventing stuff from happening in the future. So we advocate for it. We’re not directly part of it. We do support it. And I often meet the guys and speak with the guys and work in partnership to a certain degree with Stop It Now.
The sixth part of the blog post series on the EU Strategy For a More Effective Fight Against Child Sexual Abuse concerns the construction of a European centre to prevent and counter child sexual abuse. And it says that the Commission details its plans to start working towards the possible creation of a European centre to prevent and counter child sexual abuse, which on the face of it sounds pretty good, but it also sounds fairly vague and long-term. “Plans to start working towards the possible creation”. Is this getting lost in the political long grass again?
Denton Howard, InHope
It’s in the air at the moment, but I genuinely don’t think so. I think this is going to become reality. Because nobody knows what it’s going to look like, it remains to be seen what will actually transpire. When things get published, we’ll have a clearer idea.
I mean, I don’t think anybody can argue with the logic of it, but it also has to be practical and implementable. I think the first part will be the creation of an organization. Then it’s probably going to have to research its role, because there’s no point in saying it will do X, Y and Z, and then when it’s launched it comes up against an obstacle, let’s say a legal obstacle somewhere, that they didn’t foresee. So there will be quite a bit of research before anything becomes [of it]. And then my guess is it will be implemented in stages.
I will say as well that this is at the forefront of the new European Commission’s policy front. In other words, everybody has bought into it in the European Commission, and all the Commissioners have included it in the things they will work to achieve during their tenure. And so the EU centre concept has a lot of political goodwill. So I look forward to what eventually comes out of it, and we’ll support all the research activities. And hopefully it will work towards, and I’m going to be a little clichéd when I say this, our vision and our mission, which is an internet that is free of child sexual abuse material. So we will work with any organization, any initiative, that will help us towards that vision.
Okay. The seventh and penultimate blog post entry that you wrote on this EU strategy was to “…galvanize industry efforts to ensure the protection of children in their products.” That is to say the internet industry and the social media industry, and you quoted some interesting numbers here, some of which we’ve mentioned in passing. So Facebook sent almost 16 million reports, 94% of the total I think, to NCMEC, while other US-based companies sent fewer than 1,000 reports, and some fewer than ten. Ten reports really doesn’t seem very credible, so what’s going on there? Why is there such a disparity, do you think?
Denton Howard, InHope
What I’m more concerned about is the other end of the scale, as you mentioned, the eights, the nines, the tens. There are large companies down in those low numbers, brand names that you would recognize and be aware of.
So the question is, you’re going, “Wait a minute. Is there nobody using those services to do bad things? Is it that they’re not actively looking on their services to detect bad things? Is it that they are reporting them to other organizations rather than a mandated organization?”
I will say, in my experience, that the large brand names, the customer- or consumer-focused services, tend to be quite proactive on this because they want to do the right thing. But they’re also conscious of the perception of doing the right thing. Then there are others further down the list where you’re going, “Well, hold on. Why aren’t they doing more?” So it’s only by asking and delving into the questions, and asking why that number is so low.
This particular blog post entry also referred to some work that the InHope organization is doing, which is the “InHope Summit.” What is the InHope Summit?
Denton Howard, InHope
Well, the InHope Summit… We realized, going back to 2016 or 2017, that a lot of what happens on the internet is decided in Silicon Valley, or in California in general. What we wanted to do was to increase awareness of 1) child abuse material online, 2) hotlines, and by connection InHope as a representative body for Hotlines. And we wanted to basically push our agenda and get in front of a technology company audience to say: “This is who we are. This is what we do. This is why it’s important, and this is why you should listen.” We did that first on a pilot basis in 2018 and it got a good response, considering it was the first attempt.
So then the next year, which was 2019, we ran another one, obviously to get our messaging across. And when we do the Summit, we tend to bring experts with us. So on the child safety side of things, for example, John Carr, who you may be familiar with; he’s on our Advisory Board. We brought our partners from Interpol with us to give the international law enforcement perspective. And then on the national basis, we had representatives from NCMEC come along, just to show how all of the dots connect.
Can I just say, it has paid off in terms of impact. The issue was sort of on the radar, but when you put something in front of people, all of a sudden it changes that dynamic. It opened up communication, it helped us to open a lot of doors for Hotlines, and it opened a lot of doors for policy conversations.
Then in a very ironic twist of fate, we were going to have a bigger event than ever before. But with COVID, which has affected all of our lives this year, we obviously couldn’t travel and couldn’t have meetings. So we had to pivot online, which ironically worked out really well because of the convenience of being able to log on and join.
We got 1) more people to participate than we ever could have imagined, and 2) it also allowed organizations to spread the word internally. Let’s say John Smith in an organization joined the event, and 20 minutes in he was able to say, “Oh, I need to get the other guys into this!” and pull other people in. So consequently the audience didn’t just double, it swelled. That’s the beauty of online scalability.
So we were able to get our messaging out, and as for the impact that has had… by far the most important part is that it has opened up a lot of organizations’ awareness of what’s actually going on and who they can talk to, because many organizations come to us and say, “We just didn’t know who to talk to.”
And sometimes we’re not the most relevant people to talk to. Sometimes we’ll say, “Oh, well, I’m glad you talked to me,” and we’re able to put people in contact with more relevant people. It’s about opening connections for people. And it’s only by coordination and working together, because InHope is never going to solve this problem on its own. Industry is never going to solve it on its own. A police officer said to me once, and it’s been repeated many times: “We’re not going to arrest our way out of this problem.”
So it’s only by coordination and cooperation, and by people at events like this hearing what others are doing, without any agenda. Because we’re not a company, we don’t have to protect any intellectual property or anything like that. So we can be super open, and it gives other people a forum to discuss and to challenge.
But ultimately, again, I’m going to come back to it: it feeds into our mission of combating online child sexual abuse material. Anything that will support that mission, we will do our best to pursue. I know I’ve given you the longest answer in history, but the point is that we try to extract as many benefits out of these things as we possibly can.
Okay. Now the eighth and final blog post in the series on the EU’s Strategy for a More Effective Fight against Child Sexual Abuse takes a global view, saying it’s there to “…improve the protection of children globally through a multi-stakeholder cooperation”. And the document itself has an astonishing map in it. It’s a global map showing red pinpricks where CSAM has been downloaded or accessed, and on this map, at this scale, the dots are so close together that Europe is just a mass of glowing red. It’s really quite an astonishing image to see. So is this showing that Europe has a specific issue with CSAM, or is it that the reporting and tackling efforts are more effective in Europe than elsewhere?
Denton Howard, InHope
Okay, well, I’m biased, because we have Hotline representation in every country. While there are differences in legislation, you have coordinating bodies for law enforcement that work very well and very closely together. There is no single absolute answer to your question, but there are certain things that do affect it.
In Europe you have a very high rate of connectivity to the internet. And generally speaking, when you’re dealing with hosting or accessing child abuse material, if the connectivity rate goes up, the abuse content rate goes up. If you look at that map again in terms of affluence, you’re going to see it’s almost like an affluence map. As soon as internet connectivity reaches a particular region in a country, let’s say a developing country, you will see the content issues rise as well.
So is it a bigger problem per capita in Europe than anywhere else? I’m afraid you’d have to talk to a sociologist about that, but speaking from a practitioner’s perspective, from what I’ve heard and what I’ve seen, the proportion of people who have a sexual interest in children remains fairly consistent. They exist in all cultures. They exist across the globe. If you look through history they have always existed, and it’s about society’s response to it.
Okay. Denton we’re really very much out of time now, so one final question if I may. What’s next for InHope? What does 2021 look like for you?
Denton Howard, InHope
I’ve been in this post for three years now, and we’ve reorganized how we operate and how we’re funded, and going forward we have embarked on a number of programs. The first is a technology-based program: we’re developing a number of platforms, which will hopefully start coming to fruition towards the end of 2021.
We are expanding our network, which is the network of Hotlines. There will be a lot of expansion in Latin America, hopefully we will add four or five new members in Latin America this year, and hopefully two in Southeast Asia and hopefully one in Africa, but that’s still a work in progress.
I have a new Board of Directors, and together with them we’ve developed a new strategy with two main strands. The first is technology, and the second is people. One strand is to create and support technology development that will feed into achieving our objectives. The second is the people who run the systems and work in those Hotlines: supporting them, developing them, and making them more effective in what they do.
Again, I’m a zealot! I make no apologies for it. I’ve been working in this area for the past 16 years and I’m probably more committed to it now than I was when I started. So if I sound a little bit excited, that’s because I am!
Denton it’s a great topic to be a zealot for, I think! So good luck for InHope for 2021. I hope all of your plans work out and it’d be great to keep in touch and maybe we can revisit some of those technological developments you’re looking at later in the year.
Denton Howard, InHope
That will be great, Neil, and thanks very much to you and to SafeToNet for the opportunity to take part today. That’s great.