Safeguarding podcast – Missing the target. A discussion with Anna Borgstrom of NetClean.

In this safeguarding podcast we discuss the prevalence of child sexual abuse material (CSAM) in the corporate IT environment, its scale and impact. We discover that IT teams typically don’t have the specialist systems in place to find this illegal content, and that processes and procedures are inadvertently designed to make matters worse. What can corporate IT teams do to eradicate it from their IT estate? Listen to Anna Borgstrom, CEO of NetClean, to find out.

There’s a transcript of the podcast below, lightly edited for legibility, for those who can’t use podcasts or who simply prefer to read.

Neil Fairbrother

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast where we talk about all things to do with safeguarding children in the online digital context. Child safeguarding in the online digital context is at the intersection of technology, law, ethics and culture, and it encompasses all stakeholders between a child using a smartphone and the content or person online that they are interacting with.

The corporate workplace can be a source of child sexual abuse material, but many businesses aren’t aware that the IT that they supply to their employees and contractors is being misused to collect and disseminate illegal and damaging child sexual abuse images and videos, and this at a considerable scale. Our guest today can shed some light on this issue, the problems it poses, the solutions that are available and the impact it has. Welcome to the podcast, Anna Borgstrom, CEO of NetClean.

Anna Borgstrom, CEO NetClean

Thank you very much.

Neil Fairbrother

Anna, could you give us a brief resume of yourself and tell us what NetClean does?

Anna Borgstrom, CEO NetClean

Yes, sure. So my name is Anna. I come from Sweden and NetClean is a Swedish company as well; I will come back to that. I started out as a software developer at Ericsson, after that I went into project management and program management, and I ended up at NetClean 10 years ago. The reason I found NetClean interesting was that it was an opportunity for me to combine two things I really care about and feel passionate about, and that is technology and doing good for society. And what better solution is there than technology that could potentially save the most vulnerable in our society?

Neil Fairbrother

Indeed.

Anna Borgstrom, CEO NetClean

Yes, it was a perfect match. And I’ve been here now 10 years and it’s been a fantastic journey.

NetClean was founded back in 2003, and back then, you know, it was quite early days for the internet. It was a time when law enforcement saw that images of child sexual abuse were being uploaded to the internet, on the open net, and they were finding child sexual abuse material in their investigations. The founders of NetClean thought that if law enforcement was finding images, and classifying and marking them as child sexual abuse material, there could be ways to find that type of material in other places such as the workplace, because no employer would want to have their equipment misused.

Neil Fairbrother

Indeed. And in the context of the corporate work environment, how does your product NetClean work?

Anna Borgstrom, CEO NetClean

Well, we have different technologies, and our main product for the workplace is an endpoint solution.

Neil Fairbrother

Okay. What do you mean by an endpoint?

Anna Borgstrom, CEO NetClean

Yeah. So it sits on every computer and every server in a corporate environment, and when someone uses the computer to consume known child sexual abuse material, an alarm is sent to the company’s security department and the company handles that alarm according to their processes. What we do is work in collaboration with law enforcement agencies around the world and with organisations such as the National Center for Missing and Exploited Children (NCMEC) in the US and the Internet Watch Foundation (IWF) in the UK, and we get hashes, fingerprints, of known child sexual abuse material that we compare against in the corporate environment. So that’s how we protect the endpoints.
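As an illustration of the hash-matching approach Anna describes, here is a minimal editorial sketch of comparing file fingerprints against a set of known hashes. This is not NetClean’s implementation: real endpoint products typically use robust or perceptual hashes rather than the plain SHA-256 used here, and the hash set, scan root and alert hook below are hypothetical placeholders.

```python
# Editorial sketch only: hash-set matching against a list of known fingerprints.
# Real endpoint agents use robust/perceptual hashes and vetted hash lists from
# organisations such as NCMEC and the IWF; the values below are placeholders.
import hashlib
from pathlib import Path

KNOWN_HASHES = {
    # hex digests of known illegal material (placeholder value only)
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def send_alert(path: Path) -> None:
    # In a real deployment this would notify the security department with a
    # hash-only report, never the image itself.
    print(f"ALERT: known fingerprint matched at {path}")

def scan(root: Path) -> None:
    """Walk a directory tree and raise an alert for any file whose
    fingerprint matches the known-hash set."""
    for path in root.rglob("*"):
        if path.is_file() and sha256_of(path) in KNOWN_HASHES:
            send_alert(path)

if __name__ == "__main__":
    scan(Path("/data"))  # hypothetical scan root
```

The key design point is that the endpoint only handles fingerprints, so matches can be reported without anyone having to view the material itself.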

Neil Fairbrother

Okay. You do a lot more than just deliver a technology solution. First of all, you have an excellent series of annual reports, which we’ll talk about shortly, but you also run what you call the “Brighthood” conference. What is the Brighthood conference?

Anna Borgstrom, CEO NetClean

Yeah. So one really important thing for us is that we want to develop the right type of technology, technology that can be the most efficient. To be able to do that, we need to know a lot about the crime type that we are trying to address, and we also want to share the knowledge that we have gained over the years working in this field. Hence the report.

The Brighthood conference is a way to gather all the stakeholders in society and talk about solutions, about how different types of stakeholders work in order to combat child sexual abuse and exploitation. Our aim for the Brighthood conference is to look at the whole of society, from the political level down to, you know, companies and technology providers in this field.

So we have one day a year where we gather all the stakeholders. It’s an open conference; everyone is welcome to join. And we talk about solutions, about the obstacles different stakeholders are facing and what they are doing to overcome them. It’s usually very, very interesting to learn from each other as well.

Neil Fairbrother

Okay. Now we interviewed Ernie Allen of the WeProtect Global Alliance in a previous podcast edition, where we discussed the Model National Response, and you’ve taken that Model National Response and adapted it into what you call a Technical Model National Response. What is that, Anna?

Anna Borgstrom, CEO NetClean

Yeah, yeah, we did. We thought it was such a good model, and we thought about how we could apply that model with technology. The model looks at what different capabilities exist in society and what is needed in order to combat child sexual abuse material; it describes how a society can work together to combat child sexual abuse. We looked at what technology can be applied by internet service providers, what technology can be applied within the law enforcement community, what technology there is for, you know, social media and the social platforms, and also for companies and the corporate environment. And then we try, in a pedagogic way, to show how material can be found, and how, by following the trail of a detected image, you can actually safeguard children.

Neil Fairbrother

Okay. Now every year for the last five years, I think, you’ve been producing what you call the NetClean Report. What is the purpose of this report, why do you produce it, and how does it differ from other reports which cover similar topics?

Anna Borgstrom, CEO NetClean

Yeah, that’s a good question. What is different about our report is that we survey the police community globally, and we do that with help from our sister company Griffeye. So we use their users, police officers who are working with sexual crimes against children on a daily basis. We ask them how they see these crimes evolving, what obstacles they have when they are investigating this crime type, and how they see technology as a driver of the crime but also as a driver to combat the crime. The purpose of the report is to highlight this topic from a global perspective. And since two or three years back, I think, we also look at what the private sector is doing in this field and how it can work together with law enforcement, because law enforcement can’t arrest their way out of this crime. They need help, you know, from everyone in order to be able to investigate these crimes.

Neil Fairbrother

Okay. The 2019 report covers eight sections or eight topics, and unfortunately we don’t have time to go through all eight in a lot of detail, but I do want to have a look at some of these sections, in particular at the livestreamed child sexual abuse section. Can you tell us what the livestreaming of child sexual abuse is and how it happens? What’s the modus operandi of the predator in this area?

Anna Borgstrom, CEO NetClean

Yeah, well, livestreaming. We have heard about how livestreaming is a challenge and an increasing problem in the last three reports that we did, and we know that it’s a big technology challenge to be able to detect abuse in a livestream. So we wanted to look into whether the abuser who is livestreaming is the same abuser who is collecting child sexual abuse material. That was our starting point when we did this report. And we also looked at what type of material is being livestreamed and what course it takes. What we found was that voluntary self-produced livestreaming featuring children or teenagers is actually on the rise.

It could be, you know, teenagers that are in a state of undress or engaged in sexual behaviour; it might be intended for a boyfriend or girlfriend, or posted via a game or an app without sexual intent. But law enforcement see that those types of images or material are ending up in child sexual abuse material collections. We also found something that is called “induced livestreamed child sexual abuse”, which is a result of grooming or sexual extortion. In the case of grooming, the child is coerced into livestreaming while they are in a state of undress or engaged in sexual behaviour, and in the case of sexual extortion, they are threatened and forced into livestreaming the abuse. And I think that those are the worst victimizations in livestreaming.

Neil Fairbrother

Yes. One of the statistics that leapt off the page, to me anyway, is that it’s actually the voluntary and not the coerced self-produced livestreaming that is the most common type of livestreaming. I think over half, 53%, of livestreaming incidents were voluntarily self-produced and not coerced, which is quite astonishing. What do you think is driving that behaviour? Is it simply slightly older children, perhaps teenagers, mid-teens, exploring their sexuality, or is there something else there?

Anna Borgstrom, CEO NetClean

Well, I think it could be, and I think it’s most likely that. I mean, we are talking about a generation of people that has grown up with the internet, and for them, sharing images and videos of themselves is natural. So I think it could be that. Definitely.

Neil Fairbrother

Okay. Now your report had two more categories when it comes to live streaming. One is called “live stream” and the other is called “distance live streamed”. What is the difference between those two?

Anna Borgstrom, CEO NetClean

Well, distance livestreaming is, how can I say it? It’s something that we saw was important to look at, because we were curious whether the livestreaming that occurred on the internet was, for example, people in Sweden abusing children in Asia, or whether the victims and the offenders were in the same region, or in different parts of the world. So distance livestreaming is where people don’t live in the same country as the victim.

Neil Fairbrother

Okay. Okay. And when it comes to victims and predators is there much of a difference in locations around the world? Are predators more prevalent in certain regions or countries and are victims more prevalent in certain regions or countries?

Anna Borgstrom, CEO NetClean

Well, we actually found that many victims, especially of voluntary and induced livestreamed child sexual abuse material, come from the US and from Europe, while victims of distance livestreamed child sexual abuse come primarily from Asia, but also from Europe, Russia and the US. And I think we also found, if I remember correctly, that the abuser and the victims are often in the same region or country, but that could also depend on how law enforcement investigates the crime. I mean, if you’re the Swedish police, you probably investigate crimes in Sweden, so that could be a bit misleading in that way. We were quite surprised that so many victims of voluntary or induced livestreamed child sexual abuse come from the US and Europe, on the scale that our report shows.

Neil Fairbrother

Yes. It’s an interesting one, because if you look at the UK, for example, the IWF, who you mentioned previously, have done some fantastic work in serving takedown notices and getting hosted images and video taken down in the UK, and we have way below 1% of known content hosted in this country. But livestreaming is another issue completely, because your report suggests that it actually takes place here too.

Anna Borgstrom, CEO NetClean

Definitely. Yeah, definitely.

Neil Fairbrother

Okay. Now, as many people are now discovering thanks to the Covid-19 lockdown, there are many different streaming services that are available. Is there a most used streaming service for streaming child sex abuse at all? Or is it evenly distributed across them all?

Anna Borgstrom, CEO NetClean

I think it’s now, maybe not evenly distributed, but definitely on every platform, I would say. We know that Skype, for one example, is one of the frequently mentioned platforms in our reports. But I would say, and I know the Swedish police just launched a report today, I think, that they see this type of material on every platform, really.

Neil Fairbrother

Okay. I’m looking at the section of the report that focuses on CSAM in the business environment, in the enterprise environment. You focused or concentrated on four key areas, the first being whether businesses have policies in place that let all staff know and understand that it’s prohibited to handle CSAM within the company’s IT environment. Did you find that those policies exist, or is this not addressed by the enterprise?

Anna Borgstrom, CEO NetClean

We found that the policies do exist, and we are very positive about that, because we didn’t know whether they existed at all. So it was a very positive response that policies do exist in corporate environments, and that it’s explicitly written in the policies that it’s illegal to handle child sexual abuse material. That was a very positive finding from the report.

Neil Fairbrother

Okay. The second area of the study on the enterprise market asked whether businesses have an action plan in place to deploy if CSAM is found in the organization’s IT environment. Do they have action plans? Do they know what to do?

Anna Borgstrom, CEO NetClean

Yes and no, I would say. When you read it, 8 in 10 say that they have action plans in place. But when we asked further what these action plans were, I would say that they fall short. I mean, you would expect more; if you see the images as being crime scenes, you would handle them in a different way.

Neil Fairbrother

Yes, yes. Your report says that part of many of these action plans is simply to delete the material. And as you rightly say, these images are often regarded as crime scenes. If they’re deleting the material, they are deleting a crime scene, preventing further investigation and removing the chance of a child perhaps being identified and rescued. So that surely can’t be good practice?

Anna Borgstrom, CEO NetClean

No, no, it’s not good practice at all. I think it has to do with a lack of knowledge of what this really is, and I think that we all have a lot to do in that space to educate everyone that it’s not just an image, it’s actually a crime scene with a victim and a potential perpetrator. So even if the image is known in a different database, it needs to be fed back to law enforcement, because the victim in the image may not have been found yet.

Neil Fairbrother

Yes, indeed. And some companies report that they will try to accrue more evidence and will analyse what they find. But looking at this material is, for the most part, for most people, illegal. If they try to analyse this content by looking at it, does this not then make them culpable themselves? Are they not then guilty of revictimizing the victim and of looking at illegal content?

Anna Borgstrom, CEO NetClean

Yeah, it could be, depending on the laws, definitely. But also, I mean, you don’t want to see these images, and you don’t want the job where you are the one who has to make a judgment call on whether this is child sexual abuse or not. I mean, the images we are talking about are really the rape of children, and it affects everyone who looks at them. If you’re not trained for it and you don’t have a support and backup system behind you, I would say that your mental health may suffer if you’re exposed to this type of material. So I think that if you don’t have the right policies, the right processes and the right technology in place to find this type of material in the corporate environment, you potentially expose your staff to viewing images that could be mentally devastating for them.

Neil Fairbrother

Indeed, and illegal as well. And in fact, the third area of focus of your enterprise study was whether companies have specific technologies in place to detect, to block, CSAM images. Do they, by and large? I’m imagining not, but surprise me.

Anna Borgstrom, CEO NetClean

Yeah, no, you’re right, they don’t. Once again, I think it’s a lack of understanding of how the consumption pattern in this crime works. I mean, most companies have filter solutions, and filter solutions do a good job of filtering what they are focusing on, which could be pornography, could be gaming sites, could be places on the internet that are very static.

I mean, if you have a porn site, you want as many viewers as you can get on your site, because you sell advertising and that’s how you make money. Whereas if you are a distributor of child sexual abuse material, you don’t want to be found, so you tend to find other ways to distribute the material. You don’t do it over the open net, which is what filter solutions block. They use the dark web, they use peer-to-peer networks, they use encrypted social media applications and things like that. So if you use filter solutions in a corporate environment to protect yourself against child sexual abuse material, I think you are missing the target.

Neil Fairbrother

The fourth area was whether CSAM had ever been found in the business IT environment. What was the prevalence of that?

Anna Borgstrom, CEO NetClean

Yeah. The prevalence was that 1 in 10 companies said that they had found material, and they reported having found it in a wide variety of ways. It could be by IT specialists in different check-ups or IT scans, by someone else reporting it to IT, or with the help of technology. So there were different ways in which they found the material.

Neil Fairbrother

So 10% of companies have found this material within their IT estate?

Anna Borgstrom, CEO NetClean

Yeah. And we thought that number was quite low, actually, because from our own statistics, from our endpoint solution which I was talking about earlier, we know that approximately 1 in 500 computers in a corporate environment is used to consume child sexual abuse material. So yeah…

Neil Fairbrother

So in a 10,000 employee firm, that’s 20 people using the corporate IT infrastructure to store and perhaps disseminate child sexual abuse material?

Anna Borgstrom, CEO NetClean

Yes. And that could depend, I mean, on the gender balance of your staff. We can also see from our customers that if you have more men employed, you are more likely to get more alarms than if you have more women.

Neil Fairbrother

Is there a distinction in the type of staff doing this? I mean the seniority, perhaps, of the staff doing this? I had a very quick conversation this morning with my sponsor’s safeguarding expert, Sarah Castro, and she said that in research she had previously been involved in, the more senior staff were more often the culpable ones. Is that what you found, or do you not have data on that?

Anna Borgstrom, CEO NetClean

Oh, well, we do have data on that; it’s in our 2018 report. We found that [the age of] people that are using the company computer [for CSAM] is between 20 and 55, if I remember correctly, but most are between 40 and 45.

Neil Fairbrother

So senior in age, if not in responsibility. You spoke at the UN launch of the Broadband Commission report recently, and you outlined there the scale of what you’ve been finding, which isn’t just the number of predators. We gave a number of 1 in 500 machines within the corporate IT infrastructure being used for this material, but you had a different scale here. You put up a slide that contained some truly staggering numbers. You said that in a normal case a predator might have one to three terabytes of content, which would be about 1 million to 10 million images, but in a worst-case scenario an almost unbelievable 100 terabytes of content, which you say is 100 million images and 100,000 hours of video content. That’s 33 years’ worth of nine-till-five, eight-hour-a-day watching. That is an astonishing volume. So how do investigators even begin to analyse 33 years’ worth of video?
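As a quick editorial sanity check of the viewing-time figure quoted above, assuming eight hours of viewing every calendar day (an assumption, not a figure from the slide):

```latex
\frac{100{,}000\ \text{hours}}{8\ \text{hours/day}} = 12{,}500\ \text{days},
\qquad
\frac{12{,}500\ \text{days}}{365\ \text{days/year}} \approx 34\ \text{years}
```

which is roughly the 33 years cited; counting only working days, the figure would be closer to 50 years.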

Anna Borgstrom, CEO NetClean

Well, that’s the thing. I mean, can you imagine having their job? They are true heroes, and we have to make sure that they have the right technology and the right resources around them in order to be able to do their work. So I would say the way they do it is with really efficient technology that can help them go through the massive amount of material that they have to work with. And I believe that artificial intelligence could be really, really helpful for law enforcement going forward.

Neil Fairbrother

Yeah. This volume of content… first of all, we’ve briefly mentioned the potential impact on the mental health of people who stumble across this material. You know, an IT manager is probably not trained to deal with this; he or she finds some of this content on a laptop and looks at it, and that can be enough to affect their mental wellbeing.

But then there must also be an impact on the perpetrator. As they hoard more and more of this material, their mental wellbeing must be suffering, and it’s almost like an addiction; it gets worse and worse. I’m not trying to defend predators, by the way, but there must be an impact on them as well that perpetuates what they do.

Anna Borgstrom, CEO NetClean

Yeah, definitely. And I think that what you are saying, and I’m not an expert on perpetrators, but I think some experts may agree, is that if you are exposed to violent content and you get an addiction to it, you need to see more and more violent content. And I know from other reports that you can see the trend of this type of material getting more and more abusive and violent, and that is really terrifying.

Neil Fairbrother

Yes, it is. And obviously there’s a huge impact on the victims. We must never forget that behind all of these images is a real person and a real story.

Anna Borgstrom, CEO NetClean

Just to pick up on the addiction story that you brought up, I think that’s true too, because what we see in corporate environments is that many of the alarms come from USB sticks. So you have your private USB stick with your own collection, and then you use the computer to view those images. I think that you get hold of the images through peer-to-peer networks, the darknet or other places which can be hard to block or detect with technology, and then you store them on USB sticks or personal or external devices. We also see that cloud storage is increasing at the moment.

Neil Fairbrother

I worked in a very large corporation a few years ago and if you used a USB stick it would flag an alarm and the IT team would come and visit you. Does end-to-end encryption provide a safe harbour for these perpetrators? And if so, what can be done about it?

Anna Borgstrom, CEO NetClean

Oh, definitely, it does. What can be done about it is that you shouldn’t be able to run a service or an application if you don’t agree to open up the traffic in a way that allows child sexual abuse material to be detected.

Neil Fairbrother

A lot of people find that unacceptable, because they claim that this is a back door to an invasion of privacy by the state, or that it provides a back door for other criminal activity, for example by weakening the privacy of perfectly legitimate things like online banking. Is that your view?

Anna Borgstrom, CEO NetClean

Yeah, I understand that discussion. But I mean, there have to be ways to do this without invading privacy and your banking. But, and this is quite interesting, when we talk about privacy we always talk about the privacy of the users, and the user is an adult. What about the privacy of the children that are pictured in the images? I don’t think that we have the child’s view on this at all. We talk about privacy and human rights; what about children’s rights, and what about the children that are being exploited?

Neil Fairbrother

Well, yes. And you are very much a supporter, I think, of the UN’s Sustainable Development Goals, because you have a project running called Agenda 2030. What is that?

Anna Borgstrom, CEO NetClean

Yeah, Agenda 2030 is where the member countries of the UN have agreed upon 17 goals that will make our world better, safer and healthier. One of those goals is 16.2, which talks about ending violence against children and the sexual exploitation of children. So we very much work towards that goal, of course, with our technical solutions, and our customers also work towards that goal. So our solution is a really tangible activity to put in your corporate social responsibility agenda if you are working towards Agenda 2030.

Neil Fairbrother

Okay. One of the most popular trends in corporate IT over recent years has been Bring Your Own Device, or BYOD. Does this make it harder to track child sexual abuse material in the corporate environment?

Anna Borgstrom, CEO NetClean

Yes, of course it does. I mean, it requires that the company has policies and procedures in place to safeguard, not just against child sexual abuse material, but also against other types of threats that could potentially end up in their networks.

Neil Fairbrother

So in your 2019 report you refer to some emerging trends in the corporate IT space for tracking down and eradicating child sexual abuse material. What are these emerging trends?

Anna Borgstrom, CEO NetClean

Well, the emerging trends are an increase in the use of cloud storage, encryption and smartphones, and encryption is the biggest challenge for law enforcement.

Neil Fairbrother

And we are running out of time unfortunately, so we’ll have to wrap it up soon. But what would you say is considered best practice for a corporate IT team if they find child sexual abuse material on their IT estate?

Anna Borgstrom, CEO NetClean

Well, if they find it by accident, they should definitely not delete it. They should make sure that they report the findings to law enforcement and then take law enforcement’s guidance on how to proceed. If the images are found by technology, for example our endpoint solution, you will not be exposed to the images at all; you will just get the fingerprint, the hash value, along with a comprehensive report that you can send to law enforcement.

But they should definitely report to law enforcement and give law enforcement a chance to investigate further, because what law enforcement find, if they start an investigation and execute a search warrant, is that in most cases they find new, newly produced images on devices owned by the offender. And in many cases they also find children that live in proximity to the offender, because there is a strong correlation between viewing child sexual abuse material and committing abuse oneself, and most of the abuse happens close to and in the home of the abuser.

Neil Fairbrother

Anna, on that note, I think we’re going to have to wrap it up. Thank you so much for your time, it’s been very, very interesting indeed and I think what you’re doing is fantastic. So I wish you well during this lockdown period that we’re now in, and please stay safe.

Anna Borgstrom, CEO NetClean

Yes, you too Neil, thank you for a great podcast.

 
