Safeguarding Podcast – the Perfect Storm with Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

In this Safeguarding Podcast, Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children, discusses General Comment 25 on the UN CRC and what it means for online child safety, encryption and privacy versus safety, the EU’s ePrivacy Temporary Derogation, the requirement for age-appropriate design, and the tech chicken-and-egg problem. Note that this was recorded before Apple’s recent announcement of its plan to counter CSAM across its iOS and iCloud ecosystem, which you can read more about here.

A transcription, lightly edited for legibility, is below for those who can’t use podcasts or who simply prefer to read.

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.

Neil Fairbrother

The United Nations’ Convention on the Rights of the Child, or the UN CRC as it’s known, is an international human rights treaty which sets out the civil, political, economic, social, health and cultural rights of children. The UN General Assembly adopted the Convention and opened it for signature on the 20th of November 1989, when we very much led analog lives.

UNICEF estimates that there are now some 750 million children online globally. And to make children’s rights online clear, the UN has recently published what’s referred to as General Comment 25. To guide us through what this is and what it means, I’m joined today by Howard Taylor, who is the Executive Director of the Global Partnership to End Violence Against Children.

Welcome to the podcast, Howard.

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Thanks Neil. Thanks for having me on. Very much looking forward to the conversation.

Neil Fairbrother

It’s a pleasure to have you as a guest, Howard. Could you give us please a brief resumé so that our audience from around the world has an understanding of your background and experience?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Very happy to do so. I began my career, I won’t say exactly how long ago, but some time ago, working in the UK government for what was then the Department for International Development, now the Foreign, Commonwealth and Development Office. I spent 14 or 15 years with DFID, including spells in India and Ethiopia. I then moved across to the west coast of the US, where I ran the Nike Foundation and subsequently spun out a major Nike Foundation initiative called Girl Effect as a separate social enterprise, which is now headquartered in Nairobi. And following that I had the fantastic opportunity and privilege to come into my current role, which is, as you noted, Executive Director of the Global Partnership to End Violence Against Children.

Neil Fairbrother

Okay, thank you for that. And what is the Global Partnership to End Violence Against Children? The clue’s in the name, I guess, but perhaps you could flesh it out with a little bit more detail?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

The clue is in the name, but the name’s a little long. The End Violence Partnership and its fund were launched in 2016, a year or so after 193 world leaders agreed the Sustainable Development Goals. The Global Partnership to End Violence Against Children, the fund, and a set of evidence-based strategies to end violence against children were all launched that year.

And we now have over 600 partners in the Partnership, and we really are a platform for collective advocacy, action and investment. That’s both investment through the fund and making the case for a significant scale-up of investment to end all forms of violence, abuse and exploitation against children.

We have partners from every region and every sector, because it takes multiple sectors to successfully tackle violence and abuse against children. And we gather around a shared vision that every child, wherever they live in the world, deserves to grow up safe, secure, and in a nurturing environment. In terms of what we’re going to be talking about today, General Comment 25 on the UN Convention on the Rights of the Child, the online piece, the digital piece, we know that one in eight children are sexually abused before the age of 18. We know that’s just the tip of the iceberg. And digital technologies, while of course they have many benefits for children, also pose some serious risks and harms.

Neil Fairbrother

Yes, and by Sustainable Development Goals, this is the UN SDGs, and presumably you’re focusing mostly on 16.2. Is that correct?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

That’s exactly right. So, 16.2 is the SDG target to which we roll up, as it were, and all our efforts are aimed towards driving progress, accelerating evidence-based progress, towards SDG 16.2.

Neil Fairbrother

Okay. Before we get onto General Comment 25, I would just like to get a sense check from you and your organization as to where you think we are with online child safety, as you seem to have a global perspective on this. So, what are the risks of online harm for children?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

I think one of the biggest risks is that many people are not really fully aware of the risks. So without bombarding you with too much data, here are a few data points just to frame the conversation. First of all, it’s estimated that about 69% of young people globally were online in 2019. And as I said earlier, that brings huge opportunities, whether it’s learning, networking, socializing, gaming, et cetera, for young people, but also exposure to risks, including the risks of being groomed and of violent and sexual abuse online.

NCMEC, a US-based organization, the National Center for Missing and Exploited Children, had more than 20 million reports in 2020 of child sexual abuse material; that’s images and videos depicting children being sexually abused. Just to reflect on that for a moment: more than 20 million reports of images and videos of children being sexually abused, and that’s just in a one-year period.

And we know that over recent years that figure has been on a fairly dramatic upward trend as there’s been digital expansion and connectivity, which is a good thing, but along with that come risks if safeguards are not built in to protect children.

And just to reflect also on the last year and a half or so, during the COVID-19 pandemic of course we’ve seen many more children, and much more of children’s lives, move online. So children who may have been online some of the time before were suddenly online much more. Many children found themselves learning online, their school moved online, and of course their ability to connect with their friends, et cetera, moved onto social media and other digital platforms.

So suddenly many more children, and much more of children’s lives, moved online, and the perpetrators who wish to harm children didn’t then move offline. In fact, I’ve seen anecdotal suggestions that the COVID-19 pandemic has actually made it easier for those who wish to harm children, because if you’re sitting, as I am now, in my basement at home, it would be easier to perpetrate online abuse here than it would be if I was sitting in an office where people might see what I’m doing.

We know that at any one time there are an estimated 750,000 people online globally with the intention of harming, grooming or sexually abusing children. And I’ve not seen data on this, but we can only assume that figure has also gone up during the COVID-19 pandemic, when so many people have been working from home. So it’s not just children who have moved much more of their lives online, but also the perpetrators, those who wish to harm them, as well.

Neil Fairbrother

You’ve described what seems to be a massive problem in terms of scale. What do you see as the main obstacles to addressing that? How can the internet be made safe for children, if indeed it can be?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah, I think the starting point is to believe that it can be, and it will take a certain scale-up in will, resources and commitment to make that happen. So, first of all, I think there’s a huge mismatch between the scale of the issue, and its growth rate and evolution, and the response to it.

We’ve seen over recent years that the way children are abused, the formats and types of abuse that happen online, have been evolving as technology has also evolved. So there’s a mismatch in awareness, a mismatch between the size and growth of the problem and the attention, the action, the investment, the resources needed to address it.

And that’s in terms of public awareness, but it’s also in terms of governments, public policy and legislation, and industry as well: the tech industry, the telecom industry, those who run the platforms on which the violence and abuse happen.

And then I think there’s a bunch of specific obstacles, Neil. Just to reel off three or four: I think there is still limited evidence and research about child sexual exploitation and abuse online, and that’s because it is a new, fast-growing and fast-evolving area. Secondly, there’s not yet enough international collaboration, so cross-sector collaboration, international knowledge sharing, identification and sharing of best practice, et cetera. And then even national responses tend to be very fragmented. You need to have multiple parts of any government involved, so law, justice, social services, education and more. So even at the national level it’s difficult to coordinate some of those things, but of course, because the internet is global, that coordination has to be not just national but regional and international as well.

And then finally, on the tech industry specifically, there is a really uneven pattern of prevention, detection and response by the tech industry. There are no really formal, clear or coherent accountability mechanisms in place, and no real standards of best practice across the tech industry. So there’s a lot to do.

But to go back to where I started and answer your question, I don’t approach this issue from a place of despair. I think that whilst there is a lot to do, recognizing this as a new and fast-growing issue, we are at a point now where we know enough to significantly scale up our collective response globally, regionally and nationally to make a real difference.

Neil Fairbrother

Okay. Obviously End Violence is involved in all of this. I believe you’ve just published your annual report, which I haven’t had time to read. What have you done specifically that helps to address this issue?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah, well, I know that you’ll enjoy our annual report when the weekend arrives and you have time to read it, but in the absence of your having read it, let me just say a few things. We’ve done a combination of three or four things over the last four or five years: programmatic interventions, evidence generation, promotion of and investment in technology solutions and innovation, and advocacy.

And since 2017, through the End Violence fund, we’ve now invested around $44 million in 53 projects, which have an impact in 70 countries. I’ve sometimes been on platforms, and I’ll say the same here: we believe we are the single biggest financial investor in this space. And I say that, Neil, not as a badge of honor or a boast, but because it’s woeful. If we’re right in our calculations that we are the biggest investor, and we have invested $44 million over a few years, that, compared to the stats I gave earlier about the size and scale of the issue, is of course woefully inadequate. It just underlines the huge mismatch of resources currently, with financial resources as just one of the proxy indicators for the effort being put into this space.

We also recently launched a new fund with the Technology Coalition. The Technology Coalition is a group of global technology companies working in this space, and we have launched a fund with them which is going to, through research, develop actionable insights for those companies, for the tech industry, to use to make their platforms and their services safer for children.

So it’s a combination of advocacy and collaborative action, really pushing the links between what needs to happen globally for collaboration, but also regionally and nationally, and making links to broader initiatives, whether that’s initiatives to expand the internet or other initiatives being undertaken globally.

We’re often the voice in the room, or the people on Zoom these days, making the case that it’s great to expand access, we need to connect everyone globally, we need to connect every child to give them the opportunity of what the internet can provide, but we must do so safely.

Neil Fairbrother

In the UN SDGs, there is one under the infrastructure section, I think it’s SDG 9, which talks about exactly what you just said: encouraging more infrastructure so that more people can go online. But that seems to be at odds with 16.2, which is about ending violence against children. On the one hand the SDGs encourage more people, and therefore children, to go online, but equally there’s this tension with 16.2, which is about ending violence against children.

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah, I don’t think there is a tension, but there would be a tension if the expansion, getting everyone and more people online, was done in a way that doesn’t put safeguards in place for children. So for example, some of the investments that we’ve been making are about supporting governments: we work through the 5 Rights Foundation to support the government of Rwanda to develop its first policy in this space and an implementation plan, and with the government of Albania, who have just published their first national cyber security strategy, which for the first time incorporates a whole section on children’s protection online.

We make a lot of investments in parents, caregivers, teachers, those around children who can actually educate them to keep them safe online, so that children know how to protect themselves online, or the parents, caregivers and teachers help to play that role as well.

So those kinds of investments are critical, because one thing I’m very fond of saying is that at the moment we are investing, and not just the End Violence partnership but collectively, globally, those involved in this space, to protect children online today on an internet which is inherently unsafe for children. At the same time, we’re advocating for an internet that is safe by design and made safe for children tomorrow. So I don’t see a tension in those two SDGs, as long as the expansion of access, and children coming online, is done in a way that doesn’t just acknowledge the risks but actually takes specific actions to make sure that as children come online, they are doing so in an age-appropriate and safe way.

Neil Fairbrother

Okay. Just one final question before we get onto General Comment 25, if I may. You mentioned impact. How do you measure impact?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

In a variety of ways. There’s through the investments I mentioned: the roughly $44 million that we put into 53 projects, and each of those projects has specific measurements, indicators of impact. As you’d expect, that’s about the reach of children, but also, for some of the efforts we’ve supported, for example the policy efforts, the outcome will be the development, passing and implementation of legislation that we know is going to have a significant impact across this space.

So that’s one area. And then of course, with the broader agenda around advocacy, et cetera, I think it’s slightly different in terms of how we would measure our impact alongside other key voices in this space, where we’re advocating for legal change, policy change, action and investments to be made.

Neil Fairbrother

Okay. Thank you for that Howard. So when it comes to General Comment 25 then, what is General Comment 25 and why is it so important?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah, so General Comment 25, it’s a bit of jargon, isn’t it? Let’s be frank, it’s not going to be obvious to the person in the street what General Comment 25 is about. So let me just back upstream a little to speak briefly about the UN Convention on the Rights of the Child, which is the most ratified human rights treaty globally. And what’s exciting about General Comment 25 is that, for the first time, it embeds children’s rights online into that global UN children’s rights treaty.

And the reason why we, and others who are champions and advocates for children’s online safety, see that as such an important development is both the signal it sends and the substance that’s embedded into General Comment 25. The signal is that for the first time this very important treaty, this very important convention, is taking account of the digital era, the digital world, and the fact that children are engaging digitally and are at risk digitally. That signal is just fundamentally important across governments, across regional bodies, et cetera, globally.

And then there’s the substance of what the Comment embodies, which includes an anticipation of future trends because, as I’ve already mentioned a couple of times, this is a fast-evolving space. So what General Comment 25 tries to do is to anticipate future trends, but also to underline the criticality of safe, empowering online environments for children, leading towards preventing violence and abuse against children online before it happens. So whilst response and justice and healing are absolutely vital for children who have experienced online abuse and exploitation, coming upstream and trying to prevent it happening is, of course, the goal.

And then there’s reconfiguring the internet with children in mind, because the internet wasn’t created with children in mind. General Comment 25 also leans towards that reconfiguring of the internet, which in other spaces people refer to as “safety by design”: making sure that digital services, digital platforms, the internet, are made safe for children.

So we see it as really exciting. As I say, it’s General Comment 25; it is what it says on the tin, and you’d scratch your head unless you’re an expert, but open the tin and dig beneath and it’s actually a very exciting development.

Neil Fairbrother

Yes, it is. Now you’re based in New York, which is, I think, the home of the UN in America, and America is where most of the social media companies used in the Western world, at least, are based, but America hasn’t ratified the UN CRC. So, is that a problem? Should they? Could they? Why haven’t they? Will they ignore General Comment 25?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah. I’ve been very encouraged by what the current American administration has done in just a few months. The Biden/Harris administration, just in the space of a few months, has taken a bunch of steps with regards to child sexual exploitation and abuse, not all specifically with regards to what’s happening online. For example, there was a presidential proclamation marking April the 8th as a day on which child sexual exploitation and abuse will be marked and noted, and there will be an energy around that. And I know that there’s more underway behind the scenes in the US government right now across this whole agenda.

So I’m encouraged by what I see from the current administration. And just to step back further from that and ask, would it be good if every country ratified the UN Convention? Yes. But as a partnership, the End Violence Partnership, we choose to encourage, to work, to support, to understand where the blockages are and to work with those. And we have many of our partners based in the US. It’s not just the UN agencies which are based here; we have many, many great partners based in the US, not just tech partners but partners in CSOs, in faith networks and others, who are really coalescing and advocating around these agendas and supporting and working with the agencies of the US government on these issues.

So, as I say, we take a position of trying to gather the evidence, looking at the evidence, looking at what’s working elsewhere, sharing that with the relevant parts of governments, whether it’s the American government or any other government, and taking a position of encouragement, encouraging governments to do more, encouraging them to learn from others.

And as I say, I think there are some things coming up, not for me to speak to today, but I know there are things working away behind the scenes. This administration has been in office for what, five or six months? But there is, I know, more coming that we’ll see in the coming months, and I think we’ll see this administration taking more action across this agenda, both on child sexual abuse and exploitation broadly and on the online, digital dimensions of it, which is exciting.

And also, given what happens in America, we want to see more in every country, not just the US; we want to see more action taken, but we also want to see the US increase its global presence and its global role on this, because it has, I think, a leadership role to play there too.

Neil Fairbrother

Okay. Just to dig into some of the detail of General Comment 25 then. In section 5 I think it is, “Evolving capacities” it’s called, it says that “…States parties should respect the evolving capacities of the child as an enabling principle that addresses the process of their gradual acquisition of competencies, understanding and agency”. What does that actually mean? And how do you identify who is a child?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

I think the essence of this part of General Comment 25 is that online services and platforms need to be age appropriate. As we know, children’s development, abilities, behaviour and resilience are different at different ages, and so online services need to take account of all of that.

And so if you think about very, very broad buckets: from infancy to 5 years, when children are babies and toddlers, they are very dependent on parents and caregivers. At ages 6 to 11, starting school, they are a little bit more independent and capable of self-care. And then from 11 or 12 to 18, children are increasingly independent, they have more autonomy, and they lean more on their peers for approval and support. So you can slice it in different ways, but just think about how radically different children are between the ages of zero and 18, and at the moment digital services are not designed with these evolving capacities of children in mind.

And it’s frightening. I heard a great analogy a few months ago about the car industry. You know, cars have evolved; they have safety standards in most countries. And so when you or I get in a car and drive it, we expect the brakes are going to work. If we have a crash, there’s an airbag that is going to work. There may be warning signals and all sorts of things. There’s a bunch of stuff in car technology, motor vehicle technology, which has evolved through global and national standards over many decades. The internet, of course, is much younger. But actually for many of us, and I have two teenage children, when we let our children go online, we are letting them go into an incredibly dangerous environment.

And I think many people, including parents and caregivers, don’t actually appreciate that. I share that as an overarching comment because, when you think about the different ages of children, they’re all on the same internet. Now there may be some apps, there may be some things you can download to add protections, but my anecdotal sense is that not many parents do. And so when you look at the figures for how many 4- and 5-year-olds are online playing with a parent’s iPad, or whatever it might be, through to 17- or 18-year-olds, they are all on the same internet. It is just as dangerous for all of them, whatever platforms, services, et cetera, they may be accessing.

So that kind of reinforces the point: the whole thing is not built for children, and it’s not made safe for children in the way that we expect safety in other industries. But I think what this part of the General Comment does, Neil, which I think is really important, is to recognize that it’s not just about making the internet safe for children, it’s about being age appropriate in terms of children’s development, abilities, behavior, resilience, decision making, their independence, all of those factors which we know evolve over a child’s lifetime, over their adolescence, before they become an adult.

Neil Fairbrother

Well, indeed. But to do that, surely you need to know what their age is? So you need to have some kind of age verification system in place, yes?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah. And I think there are ways to do that. And then you’ve also got to be thinking about the role of parents and caregivers in terms of what their children are able to access online. So you would have some kind of age verification process, for sure, in the way that many, many products do now. In fact, there are many social media platforms where there’s a very light-touch age verification; you have to confirm you’re a certain age to go on a certain platform. And at the moment I think quite a lot of children just don’t put their actual birth date, because otherwise the platform wouldn’t let them on.

Neil Fairbrother

Exactly. Okay. Moving on, under “General measures of implementation by states parties”. Now it is quite a detailed section, but amongst other things the parts on data collection, research and independent monitoring, as well as commercial advertising and marketing, all say that the “…profiling or targeting of children of any age for commercial purposes should be prohibited”. So that being the case, should children’s social media accounts be algorithm- and advert-free?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

I think they should. And as indicated in the General Comment you just referred to, I think all businesses should be preventing their digital networks and services from being used in ways which violate children’s rights. That includes data privacy, and it includes, I think, the profiling or targeting of children for marketing and commercial purposes. I think that’s absolutely what should be happening.

In fact, as I say, I have two teenagers myself who are bombarded with adverts, commercials, et cetera, based on, I presume, what they’ve looked at. I think children should not be bombarded by all of that, so the networks should prevent that happening.

I might just mention here, Neil, I don’t know if you saw it a couple of weeks ago, but one of the partners we’ve invested in, the 5 Rights Foundation, had a really innovative advertising campaign called Twisted Toys.

Neil Fairbrother

Yes, I saw that.

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

It’s slightly related to this point, but I’ll just give it a bit of a plug, because what they did in that campaign is unmask some of the hidden risks that children face on the internet using toys. They’ve renamed them, things like “Share Bear” rather than Care Bear, “Stalky Talkie” rather than walkie-talkie. They’ve taken traditional children’s toys, re-imagined and renamed them, and shown how they connect, and do connect, children to strangers across the world, which just shows how dangerous and inappropriate the violation of data privacy can be. And that, I think, emphasizes the data privacy protection that we should expect our children to have online.

Neil Fairbrother

Okay. Section K relates to access to “justice and remedies”. Now in a previous podcast, we covered the case of Twitter being sued by a 16-year-old because he alleges that Twitter didn’t remove, when he requested it, his intimate images that were extorted out of him as a 13-year-old. And by a simple bit of maths, these images were online for at least a month before Twitter eventually removed them. Is this what is meant by access to justice and remedies, that children have to engage with lawyers to get action taken?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

I think access to justice and remedies encompasses a very wide range of services for children who’ve been victims of online sexual exploitation and abuse: child-friendly and age-appropriate reporting and complaint mechanisms, and also services available if their rights have been violated or abused. The Canadian Centre for Child Protection recently launched a powerful campaign, you might’ve seen that one, with survivors of online child sexual abuse, and they called out Twitter on its birthday for the spread of videos and imagery of their assaults. That specific focus on Twitter is based on a study by the Canadian Centre last year, which you may be familiar with, that reviewed the reporting functions for child sexual abuse material on a variety of popular platforms. Based on that study, Twitter was classified as poor across the four indicators, which shows that it is very difficult to report child sexual abuse material on Twitter.

So I go back to some of the earlier points I made about the criticality of research and the generation of data to inform advocacy efforts, to demand the change and the accountability we need to see from tech companies, their platforms and their social media offerings, to better protect children.

And on the broader conversation around justice and remedies, I’m aware of a nascent survivor-propelled movement that’s gathering momentum in the US, and I think globally, around child sexual exploitation and abuse: people, now adults, who have experienced child sexual exploitation and abuse, really framing an agenda around prevention, but also around healing and justice.

And one of the things you mentioned as you introduced this question was about some material staying online, and I think that’s a broader point that goes beyond this question and this issue. One of the many reasons why sexual abuse material, videos and images, is so devastating is that it often lives on for many, many years online. So again, that detection and removal is as important as the upstream disruption and prevention; detecting and removing it is a vital task as well. Otherwise you have people who may be adolescents, older children and then adults, and those images of them being abused are still out there for many, many years after the abuse took place.

Neil Fairbrother

And that neatly segues into the next question, which is all about privacy. The right to privacy is often used as an argument, particularly by the large tech companies at the moment, for end-to-end encryption. In fact, I watched Tim Cook at the recent Apple developers’ conference, and one of the lasting images in my mind was Tim Cook standing in front of a huge screen with the word “privacy” splashed across it. Now these private spaces, these encrypted spaces, can first of all be dangerous spaces for children online, but they also make it impossible to find the very content that you’ve just talked about. How does General Comment 25 address this?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah, that’s a great question. It’s really one of the few very live, very current, very important debates. So General Comment 25 is clear that where encryption is considered an appropriate means, States parties should consider appropriate measures which enable the detection and reporting of child sexual abuse material. And those measures, it won’t surprise you, must be strictly limited according to the principles of legality, necessity and proportionality. And the reason I say this is that it is such an important debate: the end-to-end encryption and privacy debate is ongoing within a wider conversation, which is looking at ways to balance the online safety and the privacy of children while also ensuring that the tools to do what you just said, to detect and remove child sexual abuse material, are able to operate.

And I think what we’ve seen, Neil, in recent years is that technology advances have tended to favour the exploiters. What I mean by that is that encryption, peer-to-peer networks and the dark web have all made it easier for supply to meet demand: the supply of this material meeting the demand of those looking for it, and doing so anonymously and, frankly, relatively free of discovery, exposure, detection and prosecution.

And we know that the rise of online child sexual exploitation and abuse material is really fueled by a perfect storm, which is, as we’ve mentioned a couple of times in this discussion already, increased internet access and the rise of sophisticated transnational criminal networks that exploit the technologies and the weaknesses in how governments legislate and how the tech companies respond, whether it’s across geographies, jurisdictions or content platforms.

There has been progress, but we have to be honest that at this point long-term solutions have not yet been found. One example I’m sure you’re familiar with is the recent EU legislation which restricted the use of tools to detect and remove online child sexual exploitation and abuse material. It took many, many months and concerted advocacy by many partners before the political agreement on a temporary derogation was secured.

So the companies can continue to use those tools, but the long-term solution hasn’t yet been found, and that should have real urgency. And I go back to the broader point I made about public awareness: it’s not just awareness among governments, tech companies and legislators, who are increasingly aware, but there is very little public awareness about this. If most parents were aware of the risks that their children face, what is actually happening online to children, and how quickly a child on a connected device in a private space such as their bedroom can get into fairly serious trouble, and there are so many well-documented stories of this, sadly, I think they would be horrified.

I mean, certainly I know, just anecdotally, from talking to friends, at events I go to, in social settings, people are often horrified when I give them just a glimpse of what’s going on in this space. And these are people who are otherwise very well informed about many aspects of parenting and child safety, but this is one area where there seems to be a real absence of information and awareness. That awareness has to be a critical part, I think, of the pressure, consumer pressure, parent pressure, voter pressure, calling on governments to legislate for change if necessary, and calling on the tech companies and others to go further than they have gone.

I don’t know which comes first, Neil. Is it the legislation? Is it the companies choosing to make the platforms safer? Is it parent, voter, consumer pressure, or is it a mix of those things? Likely it’s a mix of all three. I don’t know which is the chicken and which is the egg, which is going to come first, but we just need to see a great increase in awareness that will drive, I hope, the demand for the steps to be taken by governments, tech companies and others to make the internet safe.

Neil Fairbrother

Indeed. And what you were just saying about the Temporary Derogation: it really just re-established the status quo. It has allowed companies to keep using products such as PhotoDNA, which does a retrospective analysis of content, and that has a place. General Comment 25 does have a kind of catch-all, which says “…special protection measures, protection from economic, sexual and other forms of exploitation.” And as we’ve discussed, technology can be an enabler of all of this, and to some extent technology is already used to help prevent it, but retrospective technologies like PhotoDNA, while they may have a place, don’t stop things from happening. It seems to me the only way you can stop things from happening is if you can do it in real time.

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yes, I think you’re right. And that’s why, you know, the existing tools, you mentioned PhotoDNA, are what we have today, and that’s why this push I spoke about matters: yes, it’s legislation, it’s public awareness, it’s consumer awareness, it’s educating parents, caregivers and children, but it’s also tech innovation and tech companies doing more.

And so the existing tools frankly need to be adapted to different contexts, different systems around the world. They need the licensing, the training, the maintenance, all those things that you would expect for such tools. And then the companies involved in the technology sector need to do more, to create more tools to detect and to remove. As you say, the closer we can get to real time the better, and those tools need to be open source and shareable, not just within the companies, because governments need them, as do NGOs, businesses and other sectors; this has to go across so many different sectors, the education sector as well.

And that’s why one of the recent rounds of investment from the End Violence fund was in technology-enabled solutions. So I hope that when we speak again in a year or so, we’ll be able to tell you about some of the things that we’re investing in, and some of the things we hope will come out of that.

There’s also the partnership we have with the Technology Coalition, where, with funds from the Technology Coalition, we’re doing research to get insights that can be acted on by industry. We’re still in the foothills of this agenda. I mean, this is a mountain to climb, and frankly we’re in the foothills. We can see the peak that’s ahead of us, and collectively we have to move much faster up that mountain, if I can stretch that metaphor a little, because we have what we have today, and we know that we need more tools, we need better ones, and we need them more widely adopted and used.

Neil Fairbrother

Okay. Thank you very much for that; we are sadly running out of time. I feel like I could talk to you all day about this, Howard. Very briefly, what’s next for the Global Partnership to End Violence Against Children? What’s the next big thing that you’re working on?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Yeah, a few things. First of all, we’re at an interesting inflection moment. The Global Partnership to End Violence Against Children and its fund are five years old this month. As I mentioned earlier, we were launched in 2016 on the back of the Sustainable Development Goals, the SDGs, as a platform for collective advocacy, action and investment. We’ve learned a lot in these early years. We know it’s not just the online space; it’s keeping children safe at school, safe at home, safe in their communities, and there are multiple forms of violence against children which manifest in different ways in different settings. So we’ve really been learning as we’ve moved over the last five years, and having built now, as I said earlier, to over 600 partners, we’re currently preparing our next strategy for the next three years.

We’ve refreshed and simplified some of our governance, and we’re really positioning ourselves, I hope, for a collective effort where we can accelerate progress to end all forms of violence against children by 2030. That’s the SDG, SDG 16.2, that we are striving towards.

And then maybe just to close out, Neil, on some things, because we’ve been talking more specifically about online safety, just to reinforce: the internet was not created with safety for anyone in mind, and certainly not children’s safety. We’ve seen an explosion in CSAM, child sexual abuse material, in recent years, and we know that it’s gone up even higher during the COVID-19 pandemic. One of the things we have also seen during the pandemic, though, is that nascent public awareness. I think there’s been a little more public awareness because more parents have been aware, of course, that their children have been living more of their lives online.

We did some work with a coalition of partners globally, WHO, UNICEF, Parenting for Lifelong Health and others, to get parenting advice and tips to parents and caregivers. It’s now reached, I think, 193 million families globally, and that advice included specific advice about protecting children online. So we know there’s a demand for that sort of advice for parents and caregivers.

There’s a much wider agenda, which we’ve been talking about, and I won’t rehearse all of that now, but the whole internet has to be one digital space where we collectively do much better in terms of public awareness, evidence generation, prevention, technology-enabled solutions, and a significant upturn in the financial investment needed to get all of this done.

We, the End Violence partnership, are in the middle of what we’re calling the “Together To End Violence” campaign and solutions summit series. It was launched last December, it’s running through a series of events across this year, and there will be a leaders’ event at the end of this year, in December 2021. At that leaders’ event, what we’re working towards is leaders across all sectors, so government, the UN, CSOs, media, the private sector, technology and elsewhere, coming together and making policy, financial and other commitments about what they’re going to do to prevent and respond to all violence, abuse and exploitation of children, including the online piece.

I would just add, stepping back a little, why we do all of this. There’s a moral imperative, but there’s also an economic case. We know that violence against children, whether it’s online or otherwise, undermines all the other investments in children, in their health, their development, their education, et cetera, often, sadly, with lifelong consequences for so many. So there’s a really strong investment case for why it makes sense to invest upstream, for all the things we’re talking about, and for why we’re so excited about General Comment 25 on the CRC. It’s one more tool in the toolbox to help advocate for, push and strengthen government, tech industry and other efforts to protect children online.

Neil Fairbrother

Maybe Facebook could use some of the $1 trillion that they’re now valued at to help a little bit more?

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

It would be good to see increased resources coming into this space, for sure. And yes, the dissonance between the resources for some things and what’s available for this space is not lost on us. But I just think we have to build, frankly, a movement around this, a public awakening and awareness around this issue. We need more pressure, more demand from parents and consumers, as voters, as consumers, et cetera, on governments, on tech companies and others to go much further than most have gone so far in doing all they can, and I mean all they can, to make the internet safe for children.

Neil Fairbrother

Brilliant. Howard, we’ll have to end it there. Thank you so much for your time. It’s been a fascinating discussion and it has certainly brought General Comment 25 to life.

Howard Taylor, Executive Director of the Global Partnership to End Violence Against Children

Thanks, Neil. Great to speak to you.

 
