Safeguarding Podcast – The 2021 Global Threat Assessment with Iain Drennan WPGA

In this Human Rights Safeguarding Podcast: Iain Drennan, CEO of the WeProtect Global Alliance, discusses the 2021 Global Threat Assessment report, research with the Technology Coalition, the Global Partnership to End Violence Against Children and Crisp Thinking, the Global Strategic Response, Online Sexual Harms and the seven recommendations they make for governments, civil society and online service providers.

There’s a transcript below, lightly edited for clarity, for those who can’t use podcasts or who simply prefer to read.

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the Law, Culture and Technology of safeguarding children online.

Neil Fairbrother

The WeProtect Global Alliance has published their third Global Threat Assessment, in which they share research conducted with thousands of young adults globally about their experiences of online sexual harms. They provide findings from the technology industry and make seven recommendations, aligned to their Global Strategic Response framework. There’s a lot in this report, and to help guide us through it I’m joined by the Chief Executive of the WeProtect Global Alliance, Iain Drennan. Welcome to the podcast, Iain.

Iain Drennan, Chief Executive, WeProtect Global Alliance

Great to be here.

Neil Fairbrother

Thank you Iain, could you give us a brief resume please, so that our listeners from around the world have an appreciation of your background and experience?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Absolutely. So Iain Drennan, I am Executive Director of WeProtect Global Alliance. We are an independent civil society organization. We have 98 governments, 53 companies, 61 civil society organizations and international institutions on board, and we exist to bring together the knowledge and expertise that’s needed to end child sexual abuse online.

Neil Fairbrother

Okay. Thank you. You weren’t always an independent organization; you were previously funded, I think, by the British government, is that correct?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Back in the early 2010s we were funded by the UK government, with support from the European Union and the US Department of Justice, and up until April 2020 we were hosted and staffed out of the UK Home Office. So I was originally leading the international response to child sexual abuse from the Home Office, and when we transitioned to become an independent entity, I was absolutely delighted to take up the invitation to spin out with it.

Neil Fairbrother

Okay, well done on that! And how is WeProtect Global Alliance funded?

Iain Drennan, Chief Executive, WeProtect Global Alliance

We have a range of funders, predominantly private philanthropy. We also have some funding from the European Union and from the private sector as well.

Neil Fairbrother

Okay. Thank you so much. Now this is, as you say, the first Global Threat Assessment report that you’ve published since becoming an independent organization, but the third one overall. What is the purpose of your Global Threat Assessment reports?

Iain Drennan, Chief Executive, WeProtect Global Alliance

So fundamentally, if we don’t understand the scale and nature of the problem that we’re facing, we’re not going to be able to address it. A key strategic objective for us as an Alliance is knowledge: to be the definitive source of knowledge on both the threat and the response. And so in this report we give a global overview, and we invested a lot of effort in gathering information sources from around the world, not just the global North. From that, we’re able to draw recommendations on how best to respond.

Neil Fairbrother

Okay. And on this occasion you worked with a couple of other partners, the Technology Coalition, and also EVAC, the End Violence Against Children organization. What was the purpose and background behind that collaboration?

Iain Drennan, Chief Executive, WeProtect Global Alliance

So there’s the Technology Coalition, which is a group of technology companies focused on work that they can do to tackle child sexual abuse and exploitation. We’ve got a longstanding partnership with the Global Partnership to End Violence Against Children, and another element, I think, is Economist Impact, formerly known as the Economist Intelligence Unit; we commissioned them to do some research to listen to young people themselves. So there are lots of exciting elements there.

So from the Technology Coalition, we were able to survey their membership, along with our own private sector membership, about what tools they are using and how they are responding to this threat. Then with the Global Partnership to End Violence Against Children, we had some great case studies from the projects that they’re funding on the ground, which you can find within the Threat Assessment. They’re real, active examples of good practice.

And then with Economist Impact, it was absolutely essential to listen to what young people are telling us about their experience of life online. And there are a lot of challenges with doing that. We spoke to 18 to 20 year-olds about experiences that they’d had as children. So it’s retrospective, but we felt that the ethical challenges of actually engaging with young people remotely were too great. But I think this does give us a really good picture of the kind of challenges that young people face online.

Neil Fairbrother

Yes, you’re absolutely right. There is also the issue, perhaps, of unintentional revictimization when you talk to children in particular, and the younger the child, perhaps the greater that impact is when they revisit what’s happened to them, even with the best-intentioned interviews. Now the WeProtect Global Alliance also produces a Global Strategic Response, or you have produced a Global Strategic Response, and the Global Threat Assessment links, I think, to four different areas of the Global Strategic Response. What were those areas? What were those links?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Absolutely. We identified four focus areas from the Threat Assessment. The first was around internet regulation. This is somewhere we’ve seen some real progress since our last Threat Assessment in 2019, with countries like Australia, the UK obviously, Ireland, Germany and the European Union all thinking about how to design effective online regulation. And we’re also hearing from companies, from Monika Bickert of Facebook earlier this week, that they welcome regulation. So we feel this has the potential to make online environments safer for children, but there needs to be the right level of consultation, and there need to be the right supporting legal frameworks, to ensure that this is done effectively and it’s not just going to sit on the statute books and not be effectively enforced.

Then the second area is around law enforcement capacity building. While this report focuses on the need for prevention, and argues that we should be pivoting towards a more prevention-focused response, we’re still going to need effective law enforcement to be able to tackle the threat. And what we find is that most police agencies dealing with this crime are underfunded, under-equipped and frankly overwhelmed by the scale of offending. That means there will be cases that won’t be investigated, and fundamentally there will be children who won’t be protected or rescued.

Then thirdly, on technology: regulation isn’t going to be a silver bullet. There needs to be action from within companies, and there needs to be continuing evolution of the online safety sector, which the report shows has grown in leaps and bounds. There are principles and plans out there: if you look at Australia with Safety by Design, or the Voluntary Principles initiative that the UK, US, Canada, Australia and New Zealand launched in 2020 with a number of technology companies, there are frameworks out there. We need to make sure that action is being taken, and is being seen to be taken.

And then finally, around Societal Initiatives. That’s quite a broad and all-encompassing term, but what we’re specifically thinking about is the drivers of the problem. So this is looking right back: how are young people learning about sexual behaviours? How are we supporting them to develop a good understanding of that?

And that links into a phenomenon that we’ve identified within the report of enormous growth in self-generated sexual imagery. That’s a very complex problem, but I think one of the things we can do is identify where the intervention points are to empower young people, and then also reduce stigma. So you mentioned there can be a gap between abuse trauma taking place and people being ready to talk about it; I think that’s why this has historically been an under-reported area. There are encouraging signs that the conversation is starting to happen. You look at Everyone’s Invited, which was about teenagers sharing experiences of sexual harassment; there is an opening up of that conversation. So that’s a quick canter through those four areas where we think there’s most potential for action.

Neil Fairbrother

Let’s try and get a handle on the scale of the problem. You have statistics that illustrate the scale of the challenge. For example, in 2020 over a million individual media files were, I think, identified or exchanged via the INHOPE network, and so on. Could you shed some light on the scale of the problem, Iain?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Yeah, so what we’re seeing is a hundred percent increase in reports from the public to the National Center for Missing & Exploited Children of online sexual exploitation from 2019 to 2020. In the same period the Internet Watch Foundation has reported a 77% increase in child self-generated sexual material. You’ve got analysts at the US National Center processing 60,000 reports a day. We’re seeing evidence from across the board: looking at law enforcement, looking at referrals, looking at reports from hotlines, from analysis of the dark web. We’re seeing an increase in grooming, an increase in the volume of child sexual abuse material, an increase in sharing and distribution. We’re also seeing an increase in live streaming for payment and, overall, more monetization of abuse, which is a really worrying trend.

Neil Fairbrother

One of the outstanding statistics that really leapt off the page to me was actually from the Canadian Centre for Child Protection where their Project Arachnid system you say has processed an unbelievable 126 billion images. That’s not a typo, I take it, that is not meant to be million. That is a billion, 126 billion images. Is that right?

Iain Drennan, Chief Executive, WeProtect Global Alliance

So you’re looking at enormous numbers of images and videos circulating around the internet. Some will be new, some will be historic, and it’s groups like the Canadian Centre, like the National Center for Missing & Exploited Children, like the Internet Watch Foundation, that are going out and looking for it, and where they look, they find.

And I think we need to look at the question in the round: how do we get sight of the scale of the problem? How do we best encourage that? I think for me, a big part of that is companies looking proactively at what they’ve got on their platforms and sites. The biggest risk for us is if we move into a see no evil, hear no evil mentality. If you draw a comparison with child sexual abuse offline, that’s what happened for years in Rochdale and Rotherham and other towns in England, where victims weren’t listened to and people turned a blind eye, and I think there’s a serious risk of that happening online if people don’t go out and look for the material we know is out there.

Neil Fairbrother

Yes. Now, based on the survey that you did in this report with the Technology Coalition, your report says that 87% of tech companies are using image hash matching, such as PhotoDNA and the like, but only 37%, so well under half, are using tools to detect online grooming. So the focus seems to be on output rather than process. Wouldn’t it be better if the focus was on the process, as that would then automatically reduce, if not eliminate, the output?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Absolutely. We completely agree. And I think it’s one of the big themes of the report that we move to stop the harm before it happens. That makes moral sense, it makes operational sense, and it makes financial sense. The UK Child Sexual Abuse Strategy looked at the costs of child sexual abuse, and it came up with an annual figure of £10 billion. And that’s just one country.

We know these are harms that can affect people all their lives in terms of mental health, in terms of physical health, in terms of reduced ability to take advantage of life opportunities. And so we should be shifting that investment towards making sites safer, towards deterring potential offenders, towards empowering children to make choices and to report effectively. So yeah, we would strongly encourage investment from companies into identifying grooming, identifying where those risk areas are and then being able to intervene and stop that from happening.
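
[Editor’s note: for readers unfamiliar with the image hash matching discussed above, the sketch below shows the general idea behind perceptual hashing. PhotoDNA itself is proprietary and far more robust than this; the code is a toy illustration only, assuming Python with the Pillow imaging library, and the known-hash set is an empty placeholder.]

# A toy sketch of perceptual hash matching, the general technique behind
# tools like PhotoDNA (which is proprietary and far more sophisticated).
# Assumes Python with the Pillow library; file paths are hypothetical.
from PIL import Image

def average_hash(path, size=8):
    # Shrink to an 8x8 greyscale thumbnail, then record whether each pixel
    # is brighter than the mean: a 64-bit fingerprint that survives
    # resizing, re-compression and small edits.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(h1, h2):
    # Number of differing bits between two fingerprints.
    return bin(h1 ^ h2).count("1")

# In a real system this would hold fingerprints of verified material
# supplied by a hotline; here it is an empty placeholder.
known_hashes = set()

def matches_known(path, threshold=5):
    # Flag an upload whose fingerprint is within a few bits of a known
    # hash; near-matches catch lightly edited or re-saved copies.
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)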

Neil Fairbrother

And presumably those interventions would have to be in real time, to prevent another victim?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I think there are a number of different models and techniques to disrupt that, whether it be in real time or looking at accounts and terms and conditions. We’re not here to specify the precise methodology. What we want to do is to focus on the outcome that we want to achieve.
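
[Editor’s note: grooming detection tools of the kind counted in that 37% typically score conversation text rather than images. The sketch below illustrates the general shape of such a classifier using scikit-learn; the handful of training lines are invented placeholders, and a real system would need large labelled corpora and human review of every flag.]

# A toy sketch of a text classifier for flagging risky conversation
# patterns, illustrating the general shape of grooming detection tools.
# Assumes Python with scikit-learn; the example data is invented and far
# too small for real use.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Placeholder training data: 1 = risky pattern, 0 = benign.
messages = [
    "what school do you go to",           # probing for personal details
    "don't tell your parents about us",   # secrecy pressure
    "you can trust me, it's our secret",  # secrecy pressure
    "did you finish your homework",
    "see you at football practice",
    "what did you think of the film",
]
labels = [1, 1, 1, 0, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# A deployment would score whole conversations over time, not single
# lines, and route anything above a threshold to human moderators.
score = model.predict_proba(["remember, this stays between us"])[0][1]
print(f"risk score: {score:.2f}")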

Neil Fairbrother

Okay. So let’s have a look at some of the information that you’ve gleaned from new research with young adults, where they were reflecting on what they had experienced when they were children. So these 18 to 20 year olds were telling you about some of the incidents that happened to them when they were in fact under 18, when they were children online. And one of the first items that they mentioned was being sent sexually explicit content from an adult or someone they did not know, before they were 18. What was that all about, Iain?

Iain Drennan, Chief Executive, WeProtect Global Alliance

We looked at a series of four of what we called Online Sexual Harms. These are specifically not the most serious forms of abuse, because we’re very alive to the risks of retraumatisation. What we’re looking at here are risk factors, ultimately: things that point to more serious problems, things that children feel uncomfortable with and that you shouldn’t be expecting children to experience when they’re online.

Receiving sexually explicit content; adults asking children to keep online sexually explicit interactions secret; sharing images and videos of them without permission; and doing something sexually explicit online that they were uncomfortable doing. And for all of those, it’s between a quarter and 34%, and that’s worldwide; across the globe, people are experiencing those harms.

And I think if you look at what children, as children, were able to do, it’s only a relatively small percentage that either reported the problem to the platform or spoke to a trusted adult. The most common response was to delete or block the person. And I think that speaks to how easy we need to make those reporting mechanisms, and how we can encourage adults to be able to talk to children about their online experiences.

But it was really eye-opening, both in terms of the breadth and in terms of the particular challenges experienced by transgender, LGBTQ+ and disabled children, who were more likely to experience these online sexual harms during childhood. And that’s something we want to do more research on, to understand it a bit better.

Neil Fairbrother

Okay. Now there is another partner that you worked with on this report, called Crisp Thinking, who I think did some analysis of conversations on dark web offender forums. So who are Crisp Thinking, and what did their research tell us?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Sure. So Crisp are an online safety company that focus on actor risk intelligence, I think is the term; I want to use the right terminology! What they’re talking about is the agendas and tradecraft of individuals and groups, with the aim of preventing online harms, misinformation and abuse. So they spend their time going through the dark corners of the internet, looking at digital conversations to predict online harms for private clients. And it was fantastic that we were able to involve them in the production of this report, because they were providing information about what offenders are talking about; a bit of an insight into how offenders are acting on the dark web.

And I think what jumped out at me from the assessment was the analysis of the kinds of topics that come up, and the largest share, over a third, was about which social media platforms to use. So they understand what the vulnerabilities are. There’s a very sophisticated understanding of where the gaps are: where can we exploit that? Where can we take advantage of that?

The next one, some way down at just under 15%, was around “secure tooling”, as it was described: how can I communicate safely? What’s the software I need to be using? What’s the hardware I need to have? And then below that, 13% on content storage and exchange. So it’s very, very practically focused, but I think what the report brings out is that you don’t need to be a technical mastermind to become a serious abuser, a serious offender.

A lot of this material is available off the shelf. You can get Tor, the most well-known dark web portal, by just going to the app store, downloading it, and away you go. And then you’re able to find your way onto these kinds of forums, where you will be instructed and given a lot of very detailed guidance about how to operate, how to cover your tracks, how to take advantage of the vulnerabilities that there are.

So I think it underlines two things for me. There’s that element of sophistication, but also that element of how easy it is to draw people in, compared to 25 years ago, when people may have thought these things but wouldn’t have had the first idea of where to go. Now there are very, very clear and easy pathways for them to take.

Neil Fairbrother

Indeed. Now, there is a huge amount in your report and we can’t possibly hope to cover it all, so let’s have a look at the seven recommendations that you make towards the end of the report. The first recommendation is all about funding, where you say that “…the current levels of investment are neither proportionate to the scale and scope of the issue, nor sufficient to deliver the step change in the global threat response”. And for me here, the issue is almost a business case, and it was interesting that you mentioned the UK government’s estimation of the cost to society of online sexual abuse of children, which is put at £10 billion. So how does this investment work, whose investment is it, who should fund it and, indeed, how much should it be?

Iain Drennan, Chief Executive, WeProtect Global Alliance

So it’s very difficult to put an overall figure on it, but I think funding needs to be sustainable. What has been missing in this space is sustained global funding for this issue. The problem has been quietly growing, but the resourcing has not kept pace. And I think it’s one of the secondary impacts of COVID-19 that the resource climate is now even more competitive than it was before; but COVID-19 also underlines that this is a problem that has been exacerbated by the pandemic, and that comes through in the report.

I think the two key elements here are governments and the private sector. Governments, because they are able to support legislation, to support the enforcement of legislation, to put in place support for victims, and to build this into the overall child protection landscape. And that’s really, really important. This shouldn’t be treated as some kind of separate, niche issue. It should be integrated within the overall provision of child protection, whether in the health context or the education context; it should be baked into what governments are doing.

But then secondly, the private sector: you’re providing, you’re owning, you’re managing the environments in which this abuse is taking place. There’s a responsibility there to make those environments as safe as possible if you’re going to have children using them.

So in the same way as companies take responsibility elsewhere: you don’t put a product to market if it’s vulnerable to viruses, and you don’t put a product to market if it’s violating international law on copyright or intellectual property. In the same way, one of the questions you should be asking before you launch a product is: is it safe? Can it be exploited? Are there vulnerabilities? What we’d love to see is that kind of business-as-usual investment, where this is one of the issues taken into account at that base level. Not “oh, we need to put in some funding for online child sexual abuse”; it needs to be integrated more deeply into planning.

Neil Fairbrother

Yeah, we interviewed your chair Ernie Allen several months ago, and Ernie said that towards the end of the 20th century, the production and distribution of child sexual abuse material in the offline world had all but disappeared. And then along came the internet.

Now, the internet is far more than just social media companies. For a start, to get to the social media companies you’ve got to have some kind of device in your hand and some kind of network, and those networks are getting more and more complicated, with not just a hierarchy of internet service providers but also what are known as content delivery networks. So should the funding for these preventative measures come from the entire online digital context, not just the social media companies? Is it unfair to expect the social media companies to fund everything when they are only one part of the whole ecosystem that gets predators in front of children?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I think part of the reason we exist is to have those conversations, because any time money, and who has to pay for what, comes into it, those conversations become quite challenging. But that’s why we exist as an Alliance. So we have telecoms firms like Vodafone and AT&T on our Alliance. We have the big tech companies. We also have online safety companies and finance companies, and all, I think, have a role to play. And any company that has the potential and the ability to take action, our view is that they should be doing so.

We would encourage all of our members to take responsibility for what they can do, but also to collaborate with others to try and deliver a greater effect. Again, we’re not in a position to dictate who should pay for what. Our view is that it should be paid for, and we are quite happy to facilitate those conversations and dialogues about how best that should be done.

Neil Fairbrother

Recommendation Two refers to Policy and Legislation. And here you say that “…there should be laws for industry reporting and removal of CSAM, and there should be lawful use of tools to detect CSAM”. But what isn’t said in the report is that there should be a law requiring proactive searching for CSAM. Should person-to-person service providers, search engines and indeed content delivery networks have a legal duty to proactively search out and then take down CSAM?

Iain Drennan, Chief Executive, WeProtect Global Alliance

What we’ve seen where there is proactive searching is that it delivers really great results. It really helps to lift the lid on the extent of the problem. We’re conscious that for some countries, with their legal frameworks, that’s very challenging. However, I think it’s something we should aspire to and look at, because relying on reactive reporting is only ever going to give us a small proportion of the problem.

Where you look at statistics in jurisdictions where proactive searching is allowed, and you look at the balance between what they get from proactive searching versus what they get from reactive reporting, it’s massively skewed in favour of the proactive. In terms of the material that’s coming through, there are going to be a lot of reports of the same material, and we’ve got to aim off for the fact that there are people just sharing material for a twisted sense of humour, or a misplaced sense of responsibility, where they feel that they need to share it for action to be taken. But even with that, I think it’s definitely something we want to aspire to, with that analysis on top of it so we understand what we’re doing.

Neil Fairbrother

The final point on Recommendation Two is that NCMEC reports that of the 196 Interpol member countries, only 32 require ISPs to report CSAM offenses, which is a very low number. Why is that? What can be done to incentivize the remaining countries to take action here?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I think it’s, again, something we should be encouraging. In some countries the priority remains digital connectivity and getting more people online, and that’s where they are in terms of focus. What we can do is provide the frameworks and recognize that people are at different stages of the journey. Something we’re going to be looking at more over the next few months is how we can break down those priorities a bit more and see where our members are at different points. But certainly we want to have as much information as possible. Those reporting flows are absolutely crucial in being able to build up our understanding of the problem; then we know how to design our response to match.

Neil Fairbrother

Okay. Recommendation Three is all about Criminal Justice. And here you say that “…there should be investment in deterrence and rehabilitation to help those at risk of offending to change and manage behaviours”. So what kind of deterrence might work, because self-evidently the current deterrent system isn’t working, given the scale of the problem?

Iain Drennan, Chief Executive, WeProtect Global Alliance

Prevention is always a very, very difficult thing to measure, because you’re trying to understand something that didn’t happen, so I think it’s very tricky to put metrics in place. However, we’ve seen examples from our members: look at the Lucy Faithfull Foundation being able to link support to search terms. So when, for example, you put a search term connected to child sexual abuse into Pornhub, you are directed to support services provided by the Lucy Faithfull Foundation. And they’ve seen significant increases in the numbers of people coming to use their services. Politically, I think it’s always going to be easier to advocate for more resource for law enforcement, for police, than it is for prevention, but it’s absolutely crucial, because there’s no silver bullet here.
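
[Editor’s note: the search-term deterrence model Iain describes can be sketched in a few lines. This is a hypothetical illustration, not how Pornhub or the Lucy Faithfull Foundation actually implement it; the term list, message and URL below are placeholders.]

# A hypothetical sketch of search deterrence: queries matching a curated
# term list get a support message instead of results. Everything named
# here is a placeholder, not any platform's real implementation.
DETERRENCE_TERMS = {"example-flagged-term-1", "example-flagged-term-2"}

SUPPORT_NOTICE = (
    "Searching for this material is illegal. Confidential, anonymous "
    "support to change your behaviour: https://example.org/get-help"
)

def run_normal_search(query: str) -> list:
    return []  # stub standing in for the platform's real search backend

def handle_search(query: str) -> dict:
    # Route the query: a deterrence notice for flagged terms, otherwise
    # pass through to the normal backend. Interventions can be counted
    # (anonymously) so their reach can be measured.
    if set(query.lower().split()) & DETERRENCE_TERMS:
        return {"results": [], "notice": SUPPORT_NOTICE}
    return {"results": run_normal_search(query), "notice": None}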

We need to be able to address the “radicalization”, in inverted commas, of offenders and people who are on that cusp, people who are potentially considering crossing that line. How do we address that? How do we make it as difficult as possible for that to happen, in the same way as in the pre-internet world, when it was a very difficult thing to do?

But there’s some really interesting data coming through from the Finnish child protection charity, who did some analysis of the consumers of child sexual abuse material on the dark web. What that illustrated to me was the links between online and offline abuse: 37% of the people they surveyed had reached out to children in real life after viewing abuse material. And even if we’re successful in making arrests, those people are, for the most part, going to come out at some point. How do we make sure they don’t offend again? There’s an amazing prevention opportunity there, but we need to take it and we need to resource it properly.

Neil Fairbrother

Okay. Recommendation Four: Victim Support Services and Empowerment. And here you say that “…there should be standards set for the timely removal of CSAM from the internet, the reduction of image recidivism (which you’ve sort of referred to just now) and the design of child-friendly reporting processes”, all of which are very hard to argue against. But why not also include standards for eradicating or eliminating the creation and sharing of child abuse images in the first place?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I think it’s a question of priorities. Based on the data that we’ve got from the Threat Assessment, and this has all gone through an expert steering group drawn from around our membership, these were the things that they felt were particularly important to address at this point. That’s in no way arguing against what you’ve suggested. I think it’s about what we prioritize now and then what we move on to next.

Neil Fairbrother

Recommendation Five is all about technology. We love technology. You say here, “…online service providers must take a safety-by-design approach based on children’s rights”. The UN this year published General Comment No. 25, which enshrines children’s rights online. Should this be incorporated, for example, in service providers’ terms and conditions, using General Comment No. 25 as the basis of child-centric terms and conditions for social media sites and the like?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I think we would strongly support that. It’s an internationally agreed statement, building on one of the seminal documents on child rights, so I think we’d have no argument whatsoever with that. These recommendations are focused specifically on an overall approach, a mentality almost, that I referred to earlier: that this is just one of the things that you need to do. It’s just routine, it’s normal: antivirus, legally compliant, does it work, and is it safe? Is it safe for children to use? And if you don’t want children to use it, then you take meaningful steps to ensure that children aren’t using it.

Neil Fairbrother

Okay. Now you also refer to the use of age estimation tools here. And it seems to me that if you focus those purely on children, you will end up with the fox in the hen coop problem, in that predators will fool the system into classifying them as a child. So these age estimation or age verification tools really need to be for everyone, not just for children.

Iain Drennan, Chief Executive, WeProtect Global Alliance

With my background being in government policy, I’m extremely alive to the risk of perverse incentives, and to trying to ensure that things are designed such that they can’t be easily exploited. Just listening to some of the testimony that has been going through Parliament recently: there are account farms in places like India or Sub-Saharan Africa, where children are basically paid a few dollars to play on an account for a year or so, until it’s seen as a legitimate account, and then it’s sold to the highest bidder for nefarious purposes. So the levels of ingenuity in the offender community, and some of the Crisp reporting brings that out very, very clearly, mean we should be moving to avoid loopholes and vulnerabilities.

And there’s some really exciting technology out there. There are some brilliant companies, many of which are members of ours, who are looking at this age verification challenge. But it links back to the legislation point: it needs to be within the context of an effective legislative framework, so that companies, service providers, online safety companies and age verification firms know where they stand in terms of how they can operate, and so that the privacy of all users is prioritized, including that of children.

Neil Fairbrother

Okay. So two other quick points, if I may, on technology. One concerns end-to-end encryption. What impact does end-to-end encryption have on all of this, Iain?

Iain Drennan, Chief Executive, WeProtect Global Alliance

It’s an issue we discuss in the report. And I think, again, it’s something that links back very strongly to the legislative framework, because at the moment encryption opens up a number of vulnerabilities in our current posture of addressing child sexual abuse material. But what I’d also say is that the genie is out of the bottle on encryption. There are a number of encrypted services out there that offenders are using. I think we need to be preparing for encryption.

I think we talk in the report about innovation. So we look at device-level solutions. We look at digital signatures, homomorphic encryption, there are exciting developments, but I think there needs to be, you know, a proper open conversation. You bring in encryption, what are the impacts of that on child safety?
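
[Editor’s note: homomorphic encryption, which Iain lists among the innovations being explored, lets a service compute on data it cannot read. The toy below uses the textbook Paillier scheme to add two numbers while they stay encrypted; the tiny primes are for illustration only, and real proposals for detection in encrypted environments are vastly more involved than this.]

# A toy of the textbook Paillier cryptosystem: multiplying two
# ciphertexts adds the hidden plaintexts, so a server can compute a sum
# it cannot read. Tiny primes for illustration; real keys are enormous.
import math, random

p, q = 101, 103                 # toy primes
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)    # math.lcm needs Python 3.9+
mu = pow(lam, -1, n)            # simple form, valid because g = n + 1

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    return (pow(c, lam, n2) - 1) // n * mu % n

a, b = encrypt(20), encrypt(22)
# The party multiplying the ciphertexts never sees 20, 22 or their sum.
assert decrypt(a * b % n2) == 42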

And I think one of the biggest risks that I can see is a vicious circle, where the effectiveness of detection tools is reduced because they lose access to the child sexual abuse material they need for training. And regulation is important, but that needs greater international collaboration than we’re seeing, because we don’t want a so-called “splinternet”: we don’t want a confusing array of different regulations and a displacement of offenders to more lax environments. But we need to have an open conversation about this. It’s not a black and white issue. It’s not “privacy good, encryption bad”, or vice versa. We need to get beyond any kind of simplistic analysis.

So from our perspective, we start with the child user: how are they experiencing this? What are the risks that they face? How can we mitigate those risks effectively? We’re looking at the needs of all users. So we’re very keen to get involved in this debate, and to be as supportive as we can. We exist to bring people together and have those conversations.

We’re strong supporters of Apple’s announcement, and we hope that they come back to it; we released a public letter in support of that. We want to see that kind of innovative thinking in all of our private sector members, because it’s an issue that’s not going to go away.

Neil Fairbrother

Interesting you should mention Apple. One of the other points you make under technology is that there should be more use of classifiers for “unknown CSAM”. Now, Apple’s NeuralHash scans for what’s termed “known CSAM”, as do more established technologies such as PhotoDNA. In other words, it’s looking for images pre-hashed by NCMEC. WhatsApp says they are already looking for unknown CSAM, but to find unknown CSAM, doesn’t that mean that all images on a device must be checked? Because those pre-hashed images don’t exist; that’s not what it’s looking for. The classifier is looking for un-hashed images. And if that’s the case, isn’t this even more invasive of privacy, or at least wouldn’t some people say that it is even more invasive of privacy than Apple’s pre-hashed image search?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I think because this is such a polarising issue that arouses strong feelings, and there are impassioned voices on all sides, a staged approach makes sense. So I’m not saying that Apple’s solution solves everything, I’m not saying that by any means, but what I do say is that it’s an important milestone on the way, a step forward. And we want to see that implemented, as a foundation for further work.

But you’re absolutely right: if all we were scanning for were pre-existing hashed imagery, there would be a massive blind spot, because offenders are creating material all the time. We know that for certain dark web forums, the price of entry is a set amount of new material per week or per month.

We also know that a lot of the scanning technology doesn’t work as well with videos. And we have a massive vulnerability in terms of live streaming, where nothing is being collected: it’s a kind of video on demand, where you don’t have those massive collections of imagery.
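
[Editor’s note: the known/unknown distinction in this exchange maps onto two different mechanisms: a lookup against fingerprints of previously verified images, and a machine-learning classifier that scores never-before-seen ones. The sketch below is a hypothetical illustration of how a platform might layer the two; every name in it is a placeholder.]

# A hypothetical sketch of layering the two mechanisms discussed above.
# All functions are placeholders; real systems (PhotoDNA, NeuralHash,
# the various classifiers) differ widely in design and deployment.
KNOWN_HASHES = set()  # fingerprints of verified material from a hotline

def perceptual_hash(image_bytes: bytes) -> int:
    # Stub: in practice a perceptual hash like the earlier sketch.
    return hash(image_bytes) & (2**64 - 1)

def classifier_score(image_bytes: bytes) -> float:
    # Stub standing in for a trained model returning P(abusive) in 0..1.
    return 0.0

def triage(image_bytes: bytes) -> str:
    # Layer 1, "known" material: cheap and precise, but blind to images
    # that have never been verified and hashed.
    if perceptual_hash(image_bytes) in KNOWN_HASHES:
        return "block_and_report"
    # Layer 2, "unknown" material: a classifier can flag brand-new
    # images, at the cost of evaluating content that matches nothing on
    # any list, which is the privacy trade-off raised in the question.
    if classifier_score(image_bytes) > 0.99:
        return "queue_for_human_review"  # never act on a score alone
    return "allow"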

But I’m fundamentally a technology optimist. I believe the brain power exists within the private sector to be able to solve these problems. Are we giving them the right incentives, whether in terms of legislation or in terms of financial incentives, to take action, to make this more than a nice-to-have? What we want is for this to be part of the business model, one of the issues that comes up at board level in these companies. We’re already seeing it as a fundamental strategic risk for companies that particularly target their products at children, but for companies with wider audience bases, I think what we’ve seen in the news over the past few weeks underlines that this is an issue that people care about.

They care about it at a very deep level. We’re all part of families; we’re all linked into that network. And I think we need to have a calm, open conversation about what we can do. Let’s move on to that, rather than what we can’t do. And if it’s a staged process, taking it step by step, that’s fine, as long as we’re pointing in the right direction and there’s a plan to move forward.

Neil Fairbrother

Okay. Recommendation Six, the penultimate recommendation, is all about society, or societal issues. And here you say that “…governments must integrate online safety into school curricula”, which is all well and good, but don’t schools, and teachers in particular, have enough on their plates already delivering the subjects that they teach, without this extra burden?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I am very understanding and appreciative of everything that children [Iain meant to say teachers] do, as I’m in the midst of half term with two small girls running about the house. But I think we’re already seeing these conversations happening, with the fantastic work of Safer Internet, and teaching kids about what will be a really important part of their lives. The online world will be absolutely foundational to their future working lives, and how they operate and behave online is incredibly important. It’s like when you or I would have got our road safety certificate; it’s about how children are able to operate in an environment that is absolutely essential to their wellbeing.

So yes, there’s a lot on there already, but I think this kind of overarches everything. If children are coming out of primary school and they’re not equipped, not empowered, to operate online, we’re going to store up problems further down the line, so it’s a worthwhile investment to make.

Neil Fairbrother

Okay. The final recommendation you make, Recommendation Seven, is all about research and insight. And here you call for yet more research, which is what researchers always call for. Do we not have enough research already to make some decisions and make progress towards eradicating CSAM from these technology-based services?

Iain Drennan, Chief Executive, WeProtect Global Alliance

I would say we do; we have enough to take action. However, what we provide with this report is a global assessment, it’s worldwide. So if you are sitting in country X and you want a detailed breakdown of the scale and nature of the problem in your country, that’s not what it’s intended to do, but there’s enough there to take action. I used to work in counterterrorism, and if you compare the amount of research and investment, how much we know about this issue compared to terrorism, it’s not even comparable, it’s not even close. So I think there’s a long way to go before we have the right level of information to be able to fully get around this problem, but we have enough.

Neil Fairbrother

Okay. Where can people find this report, Iain?

Iain Drennan, Chief Executive, WeProtect Global Alliance

It’s on our website, weprotect.org. It’s available in English, French, Spanish, Portuguese and Arabic, and we’re very excited to share it. There’s more detail on the Economist Impact findings as well; you can find it all on the website.

Neil Fairbrother

Thank you, Iain, for your time and thank you for the report. Fascinating reading, as these Global Threat Assessments always seem to be, so good job, well done, I think.

Iain Drennan, Chief Executive, WeProtect Global Alliance

Thank you for the invitation.

 
