Safeguarding Podcast – CSAM Users in the Dark Web with Tegan Insoll & Anna Ovaska, Protect Children

In this Safeguarding Podcast with Anna Ovaska and Tegan Insoll from Finland’s Protect Children we discuss their research programs “Help Us To Help You” and “No Need for Help”, the compassionate approach Protect Children takes to offender treatment, the drivers, motivations, attitudes and habits of CSAM consumers, and whether offenders’ responses can be trusted.

There’s a lightly edited for legibility transcript below for those that can’t use podcasts, or for those that simply prefer to read.

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the Law, Culture and Technology of safeguarding children online.

Neil Fairbrother

It’s hard to argue against the idea that all children have a right to non-violent childhoods and that no child should be subjected to sexual harassment, grooming, or any kind of sexual violence, whether offline or online. Based in Helsinki, Finland, Protect Children establishes, they say, an “…innovative intervention strategy to prevent online sexual offenders through an offender-led approach.”

To discuss their recent Redirection Project, I’m joined by Anna Ovaska and Tegan Insoll from Protect Children. Welcome to the podcast, both of you.

Anna Ovaska, Protect Children

Thank you so much, Neil, for having us on. My name is Anna Ovaska and I’m the legal specialist here at Protect Children. I’m also a global criminal lawyer with a specialization in transnational criminal law and crimes against children.

Neil Fairbrother

Okay, thank you Anna. And Tegan, what is your role at Protect Children?

Tegan Insoll, Protect Children

Thanks for having us on the podcast Neil. I’m Tegan Insoll, I am the project researcher at Protect Children. I have a background in international human rights law with a specialization on the rights of the child and also global health law.

Neil Fairbrother

Okay. And Protect Children is what?

Anna Ovaska, Protect Children

We are a relatively new and pretty small non-governmental organization here in Helsinki, Finland. We have one sole aim, which is to protect children from all forms of sexual violence, harassment, and grooming both online as well as offline. And that’s all that our work focuses on.

Neil Fairbrother

Okay, thank you. Now Finland has a population of around five and a half million people I believe. Save the Children’s Crisis Helpline receives about 3,000 tips in Finland every year regarding messages that could be construed as sexual abuse aimed towards children or the distribution of illegal images through the internet. And a recent report on Yle’s website suggests that nearly 90% of children in one survey at least had received sexually explicit messages from an adult. How does this compare with other European countries?

Anna Ovaska, Protect Children

Well, we obviously don’t have the data from every single European country, but what we can definitely say is that this is a problem that is not confined to one place. This is a very global problem. We face it everywhere in the world. And recent European data also shows that two thirds of CSAM is actually hosted within Europe. So I think it’s definitely something that is a problem here. It’s a problem in Finland, it’s a problem in the EU and in Europe as a whole, but it’s also a problem globally, everywhere. And that’s why we at Protect Children are taking a global kind of approach to this problem. We launched the survey on the dark web to reach individuals globally to gather some information.

Neil Fairbrother

Okay. Now I’d like to explore that a little bit, because this is quite an interesting approach that you’ve taken. In your report, you say that traditional research on individuals who use or consume CSAM, child sexual abuse material, is often inherently biased as it is focused primarily on convicted and known offenders, whereas CSAM criminality, as well as other forms of child sexual abuse and exploitation, is often well hidden. And your survey attempts to overcome this bias by inviting responses from all CSAM users, not just those who have been convicted. So how did you manage to get quite a large response I think actually, from all CSAM users?

Tegan Insoll, Protect Children

Yeah. As you said, the previous research on this specific crime has largely focused on convicted offenders, which, when we look at the data on CSAM use, is obviously not very representative because it’s quite a small proportion of users who will actually ever be convicted of their crimes. So we really saw this need to take a look at the broader group of offenders who will unfortunately never be caught and never be convicted for their crimes.

So we did this by, as you said, putting a survey into the dark web where a lot of CSAM is viewed and shared. We’re not saying that the majority of CSAM is viewed and shared on the dark web because it also happens a lot on the surface web, on normal social media platforms and other web browsers. But yeah, to gather such a large number of responses, we really pin this down to adopting quite a compassionate approach.

So this is quite a core of our work at Protect Children. We believe that a compassionate approach, which is a sort of psychological approach, can actually increase the effectiveness of firstly intervention programs, but also gathering responses from individuals. So this is shown with our survey which is actually titled “Help Us To Help You”, so when the individuals are searching for CSAM on the dark web, they are instead met with our survey, Help Us To Help You and also “No Need for Help”.

So this actually sort of allows the users to click on the survey, and they feel as if they’re not being confronted by something that’s telling them that they shouldn’t be searching, which of course is our view. It basically allows the users to be met with a compassionate approach, which helps us to gather a bit more data from them.

Neil Fairbrother

Okay. So I’m interested in the mechanics of how that survey worked. So if I were on the dark web and I’m searching for this kind of content, how did you get your survey in front of that search for CSAM?

Tegan Insoll, Protect Children

So when searching on the dark web for CSAM, we’ve learned from a number of law enforcement agencies, including global law enforcement, that often users use specific search terms to search for this content. So they’re not simply searching for the very basic terms, but they often use sort of different terms to help basically search for material. So we gathered these lists of search terms from law enforcement and basically, with the help of the founders of some dark web search engines, including Dr. Juha Nurmi who founded the search engine “Ahmia”, we implemented the survey so that it appears when a user searches for these specific terms. So that means we know that a user answering the survey has searched for CSAM using these specific terms.

Neil Fairbrother

Okay. And was this focused on Finland or was it a wider international project?

Tegan Insoll, Protect Children

It was an international project. So we actually gathered search terms in different languages from across the world, really trying to increase the spread of the users searching. So not just Finnish search terms, which would of course gather Finnish respondents, but also terms gathered from various law enforcement agencies around the world, so that it would be quite representative and we would gather quite a good sample from across the world.

Neil Fairbrother

Okay. Now in the report, you say that “…sexual violence against children is a public health problem and that it is not effective and nor is it in accordance with the rights of the child to focus solely on tertiary prevention”. What are the implications of it being a public health problem and what do you mean by “tertiary prevention”? Why is that not effective?

Anna Ovaska, Protect Children

Well, we believe that to prevent such a massive problem, to protect children from this very widespread problem, we need to focus on prevention from all possible perspectives. By “tertiary”, we mean that we cannot simply protect children from getting revictimized and we cannot simply try to prevent individuals from re-offending. What we mean is that we want to protect children from being victimized in the first place, as well as preventing individuals from offending in the first place.

So we’re trying to take this comprehensive approach whereby one of our other projects very much focuses on teaching children these digital safety skills, how to stay safe online, but this Redirection Project focuses on the offender’s side, on the demand side, where we’re trying to intervene. We’re trying to help these individuals to change their behaviour, to change their seeking of CSAM. That’s why we’ve also created this self-help program that is meant for individuals who are searching for CSAM, but who want to change their behaviour, who want to stop. So we’re trying to implement this very broad and comprehensive prevention mechanism for this global public health problem.

Neil Fairbrother

Okay. Now you’ve talked about the compassion-based approach or your compassion-focused therapy, which I can understand in terms of your approach to the consumers of CSAM and getting their cooperation, if you will, gaining their trust, to provide information to you. But how well does your compassion-based approach play with the general public? Because for a lot of the public, locking these folk up and throwing away the key seems to be the view.

Anna Ovaska, Protect Children

Well, I just wanted to give a little bit of background first to add on to what Tegan was already saying about our compassion-focused approach. So this whole innovation of the Redirection Project and of the Help Us To Help You compassion-focused approach to our survey came from our Executive Director and Senior Specialist, Nina Vaaranen-Valkonen. She’s also a social psychologist and a cognitive psychotherapist and she works with individuals who have been sexually abused in their childhood.

She does not work with offenders per se, however, you know, her belief as well as all of our belief is that we’re all just individuals with normal human brains. And we have to believe, and we have to understand that everyone has the ability to change their behaviour. And when our aim is to protect children from sexual violence, we cannot simply focus on, you know, the positive side of teaching children their safety skills. We also have to look at the unfortunately dark and, you know, devastating side of the offenders. We want to help these offenders to change their behaviour. And it’s definitely something that is controversial and it’s difficult for some to understand but what we want to highlight is that our aim is to protect children and we will take every single possible approach and avenue to do this.

And we want to meet these individuals who are willing to change their behavior with a compassionate approach. We want to help them. We’re not, you know, we’re not trying to lure these individuals in by answering the survey and then, you know, trying to capture them with law enforcement, nothing like that. We’re simply trying to find the people who want help and then offer it to them with this anonymous rehabilitative self-help program.

Neil Fairbrother

Okay. We’re going to have a look at some of the detail in the surveys, the Help Us To Help You survey and also the No Need For Help one, but before we get onto that, just a quick question on the self-help program, if I may. I believe it uses Cognitive Behavioural Therapy, CBT and it guides users through a number of tasks asking them to consider their own thoughts, feelings, behaviours regarding their use of CSAM and that the end goal of the program is that the behaviour is in fact changed. How is success defined in that program and how is it measured?

Tegan Insoll, Protect Children

It is difficult to measure the success of this kind of program as it is anonymous and only over the internet. So only through the dark web or through the surface web can users actually access the program. So we can’t really get in contact directly with the users in order to measure the success. But we have already seen at least a quite large number of users that have started to click on the program. It’s only been available for about two or three weeks now, and in the first week alone we had about 300 users per day accessing the program. So measuring the success is quite difficult, but we do believe that by seeing these numbers, at least it’s showing some sort of interest. And even if the users are just looking at a few tasks, we do believe it will help them sort of change their attitudes and hopefully be motivated to reach out and get proper in-person help as well.

Neil Fairbrother

Okay. Well, good luck with that. Those initial numbers are surprisingly large, and it does reflect, I think, some of the numbers in the Help Us To Help You survey, which we can start to have a look at now. The survey has a lot of questions and the first of those is “What age was your first exposure to child sexual abuse material?” And the standout figure for me was that the largest cohort of respondents, 37%, said that 13 or younger was the age they first came across this, which I thought was quite astonishingly young.

Tegan Insoll, Protect Children

Yeah, this is actually one of our top key findings from the report. It’s something that really shocked us as well. We were expecting quite young first exposure to CSAM, but this figure of under 13 was quite a shock, because we have been seeing the trends over the past few years that internet use and also pornography use is becoming a problem affecting younger and younger ages. So I think there is quite a correlation there: if younger and younger children are using the internet to access pornography and other material that is harmful to them, this also links quite clearly with the use of CSAM. So that’s sort of one explanation we have for this quite shocking number.

But the other explanation we have relates specifically to the word “exposure” to CSAM, because it’s not necessarily that children are searching for this material, but they can, in some instances, be shown this material, potentially by their own abusers. So you see this kind of cycle of abuse, where young children are shown material and they become victims of CSAM themselves, or of child sexual abuse as well. And then in the future, they may be searching for CSAM, potentially to even look for material depicting themselves, but also to understand their own abuse.

Neil Fairbrother

Yeah, my understanding is that during the grooming process, it’s not unusual to show children this as a way of desensitizing children and to portray almost as a normal activity, to normalize this activity.

Anna Ovaska, Protect Children

Absolutely yeah, and we can also see that children often, you know, as a part of the grooming process, are also shown this material to shock them, to scare them. And a child with limited cognitive abilities at a young age will be so shocked and so surprised that they might then go on to do what the groomer or the offender is asking them to do. So there’s kind of many ways in which offenders may be using CSAM against the child victim as a part of the grooming process and as a part of that abuse as well.

Neil Fairbrother

Okay. Now the following question was “How users first saw CSAM?” and 51% said it was accidental. And I can’t help but be a little bit skeptical about that response because they would say that anyway; oh, it cropped up, I wasn’t searching for it, it just appeared. Is it right that I should be a little bit skeptical about that number? Or do you feel that it’s a genuine reflection?

Tegan Insoll, Protect Children

I would say it’s definitely right to be skeptical, just as it is right to be skeptical about a lot of the other answers, because we can’t say what level of honesty the respondents are answering these questions with. However, we do think that a lot of the respondents are being quite honest because we see in the open-ended answers quite a lot of direct quotes from the respondents. And actually, a lot of them do talk quite openly about their experiences. And of course there’s a lot of sort of justifications saying, oh yeah, I just stumbled across this material, but there are also very clear, open, honest answers saying that they are voluntarily searching for this material, or that they may have stumbled upon it accidentally but have now gotten hooked on it and really can’t stop their behaviour.

Neil Fairbrother

Nearly 20% reported that they first saw it after deliberately searching for it, so this does seem to underline the honesty aspect of what you’ve just been saying. Now, length of use of CSAM. In other words, for how long have people been looking at this stuff? 19% reported using it for five or more years and this particular question had nearly, I think, 4,000 responses. So at nearly 20%, that’s around 800 long-term users of CSAM. That sounds like a hardcore embedded problem.

Tegan Insoll, Protect Children

Yeah, absolutely. I think we see as well from, again, the open-ended answers that using CSAM often is quite an embedded long-term problem. Even though a lot of the users in this survey are quite young users, or they haven’t been using it for a very long time, we can see this sort of pathway that they go down, because if they’re using this material so frequently, it can become a kind of addiction where they can’t go a day without using CSAM. And they use the CSAM to deal with their emotional problems, their stress; it can be tied into substance abuse as well. So we do see it as some kind of addiction, similar to internet addiction or pornography addiction. It’s something that individuals rely on to get through the day. So this number actually didn’t surprise us so much, that quite a few of the users are using it for a long time.

Neil Fairbrother

And indeed your report does say that 10% of respondents, which equates to 400 people, say they do in fact use it on a daily basis. So the type of CSAM was quite shocking as well, at least it was to me, with 24%, so a quarter of people, saying that they watch violent, sadistic or brutal material?

Anna Ovaska, Protect Children

Yes, this is heart-wrenching and it’s really, really sad for us to see as well. However, this is something that we know through, you know, analyzing CSAM. Our Executive Director, Nina Vaaranen-Valkonen, has analyzed CSAM for nearly 10 years now and this is what she sees daily, as well as others working through Project Arachnid, which we’re also a part of. However, it is of course, you know, devastating to see those numbers through the survey.

Neil Fairbrother

Yeah. Perhaps rather unsurprisingly the vast majority at 45% watch CSAM relating to girls but rather distressingly the age range here is from 4 to 13 years old.

Tegan Insoll, Protect Children

Yeah. We do see that this is the most common age group and gender for CSAM users to be watching, which definitely backs up the previous research and other things that we already know about gender-based violence, that violence against girls is quite prevalent and really is something that we need to be focusing on. But I think an issue here as well is that we need to not be blindsided by these numbers and not start to believe that this is a crime that only happens against girls. We really need to ensure that our child protection efforts of course provide extra safeguards to girls, but definitely don’t exclude boys as well.

Neil Fairbrother

One of the surprising results in the survey is regarding time of use. So what time of the day is this content consumed and for me, it was surprising to see that most people viewed this stuff early in the morning, from seven to midday. My impression was it would be a late-night activity when people, the rest of the family might be asleep or whatever, they’re online, on their own. 7am to midday was a surprising statistic. What are they doing about work for example?

Tegan Insoll, Protect Children

Actually, I think these numbers are a little bit strange, because as you said, the largest single group of respondents say that they watch between 7am and midday. However, the other answer options are between 1pm and 3pm, 4pm and 8pm, and 9pm and midnight, in the later hours. And in total, the numbers are much higher in the later half of the day. So it’s about 70% that watched between 1:00pm and 6:00am. So the numbers are, as you presented, maybe a bit strange there. But we do see in total that most of the respondents say they watch it in the evening or in the afternoon.

Neil Fairbrother

Okay. Now there is hope in all of this. One of the questions you asked was “What is your willingness to stop using CSAM?” and half say they do and half say they don’t, which may mean something or may mean nothing, but if they really do want to stop, what is stopping them from doing so?

Anna Ovaska, Protect Children

We don’t really understand well enough what is happening in the brains of CSAM users, what the biological kind of hurdle is that they have to get past when they’re trying to stop. And that’s why we’re taking this compassion-focused approach. That’s why we’re offering our help to these individuals through this anonymous rehabilitative self-help program. We’re not trying to guilt these individuals into stopping. We’re trying to help them to help themselves. And we believe that that’s kind of the first step that we have to take.

Tegan Insoll, Protect Children

I would also say what makes it very difficult for the CSAM users to stop is the taboo nature of the problem. With a lot of other sort of mental health issues or substance abuse issues, it’s not so difficult to reach out for help, whether that’s talking to a family member, a friend, or going to seek medical care and attention. Because of the taboo nature of using CSAM, it can be very difficult for these users to seek the proper help and care that they need to be able to stop their problem. And we’re not saying that the taboo shouldn’t be there, because of course it should; it’s a terrible, heinous crime. And we shouldn’t simply be excusing these offenders, but that doesn’t negate the fact that they need proper programs and help to change that behaviour. And this in turn is what we need to protect children.

Neil Fairbrother

Okay. Now where offenders or consumers of CSAM have stopped consuming CSAM, 41% reported that the reason they stopped was negative feelings or shame. How could you induce that feeling, or would it be appropriate to induce that feeling, in the hardcore respondents who said that they never think about stopping?

Tegan Insoll, Protect Children

Well, in the hardcore respondents that say they never think about stopping, we see very, very strong cognitive distortions, where the sort of traditional view that society has, which we stand by, is that watching CSAM is wrong. And we see from the respondents that say that they have stopped based on feelings of guilt and shame that they understand the societal implications and the societal norms. But the other group of users who say that they will never want to stop, they have this sort of cognitive distortion where they don’t at all align with these societal norms. So they in fact think that watching CSAM is absolutely okay, that there’s nothing wrong with it.

And we see this again in the open-ended answers quite explicitly. So this group of users, we believe, are very, very hard to reach because they have such strong cognitive distortions that basically it might be impossible to change their mind. That’s why with our self-help program, we’re trying to at least reach the users who have some kind of motivation to stop and potentially aren’t too far gone to change that behaviour.

Anna Ovaska, Protect Children

And just to add on that, like you said, Neil, about 50% said that they have the willingness to stop. So although 50% don’t have the willingness to stop, we do have the other half of the individuals who do want to change their behaviour, who do want help in doing so. And for us as a child protection organization, that 50% of individuals who do want to stop is a huge amount of people. And we really need to help at least those individuals who do want to change.

Neil Fairbrother

Okay. So the CSAM consumers’ own self-perception is an interesting area and you have explored that. You asked the question, “What did these users feel before using CSAM?” And 67% basically said that they felt great about themselves and 30% reported that they felt optimistic and good about themselves, even though they must know that what they are doing is illegal. They may not believe it damages children, but they must know it’s illegal. So are they in some kind of denial about what they’re doing?

Tegan Insoll, Protect Children

Absolutely. I would say they may know that it’s illegal, but for some of them that’s actually sort of the thrill of it. They know that they’re doing something illegal, but they also know that the likelihood is they’re never going to be caught for doing it. So again, with sort of watching taboo pornography and then moving on to CSAM as well, they know they’re doing something wrong, but that’s actually giving them some sort of good feeling. So this is a part of the explanation, but another part is, again, the cognitive distortion, where they really don’t believe that what they’re doing is wrong and they disagree with the laws that prohibit it.

Neil Fairbrother

Okay. Now, one of the common conceptions is that consuming CSAM is harmless, the abuse has already happened. So someone else is performing the abuse, all you’re doing is watching an image or a video, and you’re not abusing anyone at all and there’s no inherent danger there. But is there a link between direct sexual acts and consuming content?

Anna Ovaska, Protect Children

Well, first of all we know that every single time an image is viewed or distributed again, the child is revictimized. Every single time an individual views the same image again, or forwards it or distributes it, the child victim will be revictimized. And this in and of itself is just an incredibly harmful fact. And these individuals who believe that their viewing of CSAM has no negative impact on anyone, that they’re not doing any harm as they say, this is in and of itself already wrong, because just viewing the material is incredibly harmful to the child victim portrayed in it.

Moreover, another very, you know, surprising result of our survey was that nearly 40% of the individuals stated that they have sought direct contact with a child after viewing CSAM. And this shows the clear link between CSAM use and direct offending. You know, maybe in the past we have believed that CSAM viewers are just merely viewers. However, our research results demonstrate very clearly that there is a huge risk of CSAM viewers actually contacting these children.

Neil Fairbrother

Okay. Now, another reaction that I get when talking about this topic particularly on social media platforms is that most abusers of children are already known to the child, but if any abuser anywhere can contact any child at any time, is this actually the case? Or is this a common misconception that most abusers of children are already known to the child?

Tegan Insoll, Protect Children

Well, we do see the link between, for example, CSAM use and direct contact offending. One of the risk factors for moving from simply viewing CSAM to contact offending is actually having access to children. So this does increase the likelihood of an offender actually directly contacting and directly offending against children. But I would say that having this sort of direct access to children, which could mean knowing the child personally, is just one factor. I wouldn’t say it’s the sole factor. And definitely there are many, many cases of abuse where strangers are contacting children.

Anna Ovaska, Protect Children

And I think it’s increasingly prevalent now with the development of technology. Children are on these platforms and unfortunately we know that where the children are, the offenders are as well. And through social media platforms, through all of these very popular platforms, we know that even total strangers can contact children and abuse them. So definitely, physical proximity to a child is no longer necessary.

Neil Fairbrother

This does raise the question about age verification. Do you have a view on whether the social media companies should be running some kind of age verification at whatever age, whether at 13 or 16?

Anna Ovaska, Protect Children

Absolutely we do. We have the strong belief that social media companies should play a stronger role in the protection of children. Young children, with their level of development, are not able to recognize, you know, a threat, a threat of grooming for instance, and we cannot put that pressure and that responsibility on the children either. So we believe that children who do not have the cognitive skills to be on these platforms should not be on them, and therefore we need stronger age verification. We need to ensure that children cannot access the platforms before, you know, they are old enough to do so.

Neil Fairbrother

Okay. Now there is an impact not just on the children but also on the consumers of CSAM. Half your respondents say that they have thought about self-harming or even suicide as a result of their activities.

Tegan Insoll, Protect Children

Yeah, definitely. This is quite a problem, because using CSAM in itself can of course be very harmful to the children, and that is what our main focus is, but we do also see this harm to the users themselves.

Neil Fairbrother

Okay. So, some of your respondents have said that they have sought help to curtail or limit or even stop their CSAM use. I think 13% of your respondents said that they have received help, is that correct?

Tegan Insoll, Protect Children

13% of respondents have tried to get some help, but actually only 3% have in fact received help. So the extra 10% there say that they have sought help, but they did not get any help. So, we are definitely interested to understand these results a bit, as to why these 10% have tried to get help but haven’t received any. Perhaps they have visited a doctor or something similar, but they have been refused help for some reason.

Neil Fairbrother

Okay. And do you think part of that might be in relation to the taboo nature of helping consumers of CSAM content?

Tegan Insoll, Protect Children

Absolutely, because definitely in some areas of the world this taboo is stronger. So even when reaching out to a medical professional, you may be turned away because of this problem. In some places, if you go to a doctor and say that you’ve been using CSAM, you will be directed to the proper help and care, but this is not the case everywhere.

Neil Fairbrother

Yeah. And it’s interesting that one of the questions that you asked is “What would help you as a CSAM consumer stop consuming CSAM?” And there’s a range of answers: medication to lower sex drive, having a relationship with a partner, an alternative way to feel release, less access to CSAM, all the way through, down to simply having a safe space to confess. How can all of that be moved forward?

Anna Ovaska, Protect Children

Well, we’re taking the first step with our self-help program. So like Tegan said, this is a very taboo topic for a reason, of course, but to offer help to those individuals who really want to change their behaviour, we developed this anonymous self-help program that is fully accessible through the dark web, as well as the surface web. The individuals don’t need to say any details about themselves. They don’t need to log in at all to access the self-help program. And we believe that this sort of very low threshold help is the first step necessary. And also in our self-help program, we offer multiple next steps for the individuals to take, including, you know, consulting a medical professional, but where that isn’t possible, we have a link to a dark web therapy service.

Neil Fairbrother

All right. So when it comes to your other report, we are sadly running out of time, but I’d like to spend a little bit of time on the No Need For Help report, just by way of a summary. There are a number of reasons, I think, why people say that they don’t need any help with consumption of CSAM. The first one has a heading of “Cognitive Distortions”, which you’ve referred to earlier, where basically they’re saying I am using CSAM, but I don’t need help because it’s not wrong. How can we persuade these folk that what they are doing is in fact wrong?

Tegan Insoll, Protect Children

I think it’s very difficult to persuade them, because what we know about cognitive distortions is that it’s not simply that they’re thinking this way, or that they’re sort of trying to make themselves think this way, but they’re really stuck in this thought pattern and it’s taken a while for them to get there. So over time they’ve developed this thought pattern to justify their own behaviour and started to really, truly believe that what they’re doing is okay. But something that we can do is to prevent this from happening for future users. And that is to educate the young population about the harmful effects of CSAM, which is obviously a very tricky problem.

Neil Fairbrother

Okay. What are the next steps for this report and for your organization, Protect Children?

Tegan Insoll, Protect Children

The next steps include developing the research, because we have quite a lot of data now, and what we present in the reports is just the sort of preliminary analysis. So right now we’re really looking into analyzing specific sections of the data, including looking into live streaming, looking into contact sexual offenses, and also looking a lot into the young age of first exposure to CSAM. So again, as you mentioned earlier about age verification, we’re looking into that, and into the responsibility that corporations, especially tech companies, may have in relation to protecting children.

Neil Fairbrother

Okay. And whereabouts can people find this report?

Anna Ovaska, Protect Children

They can find it on our website at www.protectchildren.fi, it’s available there for free download.

Neil Fairbrother

Okay, well, thanks very much, Tegan, thanks very much, Anna for a fascinating insight into the profile of consumers of CSAM, and I wish you all the best for your report and your research, and let’s hope that there are some positive outcomes soon.

Anna Ovaska, Protect Children

Thank you so much, Neil, for having us on. Thank you.
