Safeguarding podcast – How Safe are our Children? A discussion with the NSPCC
By Neil Fairbrother
In this safeguarding podcast we talk to Holly Bentley and Martha Kirby of the NSPCC about the 2019 edition of the NSPCC’s “How Safe are our Children” report, which this year focusses on the online digital context. This podcast explores the 10 indicators that form the framework of the report and is an ideal companion for it. The 2019 How Safe are our Children report can be downloaded from here.
http://traffic.libsyn.com/safetonetfoundation/How_Safe_are_our_Children_A_discussion_with_the_NSPCC.mp3
There’s a lightly edited transcript of the podcast below for those that can’t use podcasts, or for those who simply prefer to read.
Neil Fairbrother
Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast where we talk about all things to do with safeguarding children in the online digital context. The online digital context comprises three areas, the first of which is technology, the second of which is law, and the third of which is ethics or culture. Child safeguarding is right in the centre of that Venn Diagram, and it encompasses all stakeholders between a child and the content or person online that they are interacting with.
Today’s podcast asks the question, “How Safe are our Children?”, particularly in the online context, and to help guide us through this, we have not got one guest today, but two: Holly Bentley and Martha Kirby, both of whom are with the NSPCC where we are today, thank you for inviting me in. So could you give us a brief intro, a brief resume, of who you are, your backgrounds and areas of expertise?
Holly Bentley
Okay. So I’m Holly Bentley, I’m Senior Information Specialist for Statistics at the NSPCC. My role is about looking at how the NSPCC uses data to inform its work and how it talks about data publicly, and a big part of my role is working on the “How Safe…” report. I’ve been working on it since the first edition, which was seven years ago now. It’s evolved over time and I’ve got more involved as it’s gone on and now I’m the lead author on the report.
Neil Fairbrother
Fantastic. Thank you very much. And Martha?
Martha Kirby
I’m a Policy Manager for Child Safety Online in our policy and public affairs team. My role is quite varied, we do quite a lot of work around research and we do a lot of research with children and young people and also develop policies and ideas about how to keep children safe online. For instance, our campaign, the Wild West Web, which calls for the statutory regulation of social networking sites.
Neil Fairbrother
Okay. Thank you. So the “How Safe…” report, as you say, has been running for seven years now, and in this latest edition I was really pleased to see a focus on online safety. So, how safe are our children online? What’s the answer?
Holly Bentley
Well I think a lot of what this report is saying is that the measures that we have aren’t quite there yet and one of the reasons we thought it was so important to have a report that looks specifically at online abuse is to draw attention to some of those gaps in the data as well. So we’ve identified 10 measures here, which we think help tell the story of how safe children are. It raises a lot of issues. We can see there are lots of areas where more needs to be done, but it’s only part of the picture. There’s more research that needs doing, there’s more data that needs collecting as well. So not a straightforward answer for you, but it never is with statistics!
Neil Fairbrother
Well indeed. And Martha, what’s your view? How safe do you think our children are online?
Martha Kirby
Well, I think as Holly said, it’s very difficult to tell with the statistics that we have at the moment. This is very much an emerging area and field of research and so we are just starting to kind of piece together different bits of research in different surveys and reports to try and get an understanding of how safe our children are online. What we do know is that children love being online. It’s very exciting. They can meet new friends, develop new hobbies, but that comes with attendant risks and it’s really important that we’re aware of those risks.
Neil Fairbrother
Yes, indeed. Okay. Now your report was created using 10 indicators and I thought that would be a useful framework for this discussion. So start at the start with indicator number one, which is all about inappropriate content. What do you mean by inappropriate content? Is there a definition of what that is?
Holly Bentley
So I’ll start from a data perspective. For this indicator, we’ve looked at lots of different pieces of research that try to look at inappropriate content and different researchers have taken different approaches to that. So there’s some Ofcom research in there where children were asked if they’d seen anything that they personally found worrying or nasty.
Obviously, what a child might find worrying online is not necessarily the same as what an adult would define as worrying, but there’s also other data in there from NetAware and O2 research, and I think there’s something from London Grid for Learning (LGfL) as well, that looks at what researchers would perhaps consider inappropriate. So there are questions about whether they have seen any hate messaging online, any bullying, any violent images, sexual images, or any content where children are being encouraged to hurt themselves, or any expressions around suicidal thoughts and feelings. So there’s a real mixture of content in this indicator.
Neil Fairbrother
Okay. Martha, what’s your view on this?
Martha Kirby
So I guess there are some areas where it’s slightly easier to define, for instance in the area of pornography, which is obviously illegal for children under the age of 18 to see. So we do have some areas where inappropriate content is quite easily defined. But then, as Holly was saying, the rest of it is a bit more nebulous and it’s very much based on what children feel is upsetting or scary or that they’re not happy seeing. And that also depends on the age of the child in question as well. That’s one of the reasons why it’s quite important for sites to have a graded experience for children and young people, so that children of maybe 10 have a slightly more walled experience and then, as they get older, those kinds of protections are taken away, and that helps to prevent children seeing content that’s inappropriate for their age as well.
Neil Fairbrother
Okay. So, to provide age appropriate content, you need to know the age of the person. Does that imply that you need to have an age verification or age estimation system in place to be able to do that?
Martha Kirby
You need some way of being able to tell. The easiest thing, and lots of sites don’t even do this, is to ask for the age of the person that’s trying to use their service. And then there’s age verification, which is quite a strict way of doing it, but there are also other kinds of age estimation tools using “know your user” data, which is data that you gather from the way that people behave and interact online, which can then be used to try to estimate the age of the person.
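To make that idea concrete, here is a minimal, purely illustrative sketch of age estimation from behavioural signals. The feature names and thresholds are assumptions invented for the example; a real system would use a trained model over far richer data rather than hand-picked rules like these.

```python
# A hedged, illustrative sketch of behavioural age estimation.
# The signals and thresholds below are invented for the example;
# a real platform would use a trained model over much richer data.

def estimate_age_band(follows_school_accounts: bool,
                      median_session_hour: int,
                      avg_message_length: float) -> str:
    """Return a rough age band from simple behavioural signals."""
    score = 0
    if follows_school_accounts:          # follows school/youth-oriented accounts
        score += 1
    if 15 <= median_session_hour <= 21:  # mostly active after school hours
        score += 1
    if avg_message_length < 40:          # short, informal messages
        score += 1
    return "likely under 18" if score >= 2 else "likely adult"

# Example: a user who follows school accounts, is typically active around
# 4pm, and sends short messages lands in the "likely under 18" band.
print(estimate_age_band(True, 16, 25.0))
```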
Neil Fairbrother
Okay. Some children are better able to cope with exposure to inappropriate content than others. So not only is it relative to age, it’s also relative to that particular child, which makes things even more complicated for the site owners to deal with, because you may identify, for example, that a child is 10, but a second child who is also 10 may be more resilient and better able to deal with seeing some inappropriate content than the first. So not all 10 year olds are the same. So how could that be implemented? How can you, from a technology point of view, limit that exposure, that experience? You said when people reach a certain age, take the limits off, but how would you impose limits in the first place?
Martha Kirby
I think it really depends on the site and what kind of material the site has online as to how you would do that. For children and young people, I think there’s no perfect solution and it’s never going to work seamlessly, but you have to go off understandings of developmental stages and those types of things to try and unpick some of it. But then there can also be issues; for instance, children that have learning disabilities will all react very differently. And I think it’s just about making sure that children have the option to opt into safer, more walled spaces online, and giving them the chance to take away those protections as they feel it’s appropriate.
Neil Fairbrother
So Indicator number two then was Online Sexual Abuse, which is clearly an illegal activity, I think everyone can agree with that, and some of the stats you cited were that 21%, or one fifth, of surveyed girls aged 11 to 16 said that they had received a request for a sexual image or message, which is quite alarming. But the report also said that this indicator does not come from a representative sample of the UK child population. So does that mean that the results are not accurate? One fifth of girls are saying that they’d had an issue with this, but if it’s not a representative sample, is there a danger that some actions might be taken, either in law or through implementing some technology, that might actually make the situation worse because it’s not an accurate reflection of the population?
Holly Bentley
So firstly a quick correction, it’s 11 to 18 year old girls, not 11 to 16 year olds, so a bit of an older age range. It’s not a representative survey; it was a survey that was done in schools that were chosen to reflect the UK population. You’re right, it’s not a precise measure. Really what we found when we were looking at “How Safe…” was that there’s not that much out there yet, and rather than put nothing out there and say “We don’t know”, it’s good to look at the best information that we do have. I think just having that indication that there is a problem, and that there is something that young people at least need support with, is important to put out there.
Since this research was published, we’ve also done some further, representative research around children and young people and their access to images, which is going to be published later this year, that Martha’s been looking at…
Neil Fairbrother
Okay, so Martha. Can you reveal a bit of that information or is that not something you can do just yet?
Martha Kirby
Not at this point in time, but I’m sure we could update your listeners later on in the year when we can talk about those results.
Neil Fairbrother
Okay. Fair enough. So Indicator three then, Online Sexual Offences. How is that different from Indicator two, which was Online Sexual Abuse?
Holly Bentley
Indicator two was all based on self-reported experiences. This is where young people are given the opportunity in confidential surveys to say “This has happened to me”. Indicator three is all based on offence data, the reports that are made to the police. We know that a lot of abuse goes unreported and unrecorded, which is why we thought it was so important to start the report with two indicators looking at young people’s own self-reported experiences. So in this one we’ve got all the cases that police have either had reported to them by people who are concerned, or have unearthed through proactive investigation into child sexual abuse. So it’s a slightly different measure.
Neil Fairbrother
Okay. Martha, do you have a view on that?
Martha Kirby
Yes, I think as Holly explained, it’s the difference between data reported to the police and self-reported data, and we see much higher levels in the self-reported data.
Neil Fairbrother
Okay. Now you did fantastic work, the NSPCC did fantastic work, in lobbying for a new offence, which has been introduced and included in, I think, the Serious Crime Act, which is “Sexual Communication with a Child”. Since that became an offence in England and Wales, you’re reporting that 5,211 offences against children under the age of 16 have been recorded by police forces, but the rate of growth of that reporting seems to be slowing down. So does that mean that we’ve reached a peak, or is there even more to come, do you think?
Martha Kirby
I think it’s quite normal with any piece of new legislation that it will start off slightly lower than you might expect, because there’s less awareness by the police that they can use it to record instances that before they wouldn’t have been able to record. So we would expect to see that lower level and then an increase, but it’s quite early to tell still whether it’s levelling off or not. We’ve only got a year and a half’s worth of data and we’re just in the process of collecting the next six months’ worth, so hopefully at that point we’ll start to see whether there are any trends or not. But I think that the general trend has been for it to increase and I would expect that to continue.
Holly Bentley
And I think the important thing is that this offence didn’t used to exist, there was no way of recognising that this was happening to children before, in terms of criminal law or anything. That’s a real win for children.
Neil Fairbrother
Yes. It totally, it absolutely is and well done you for making it happen, that’s a brilliant piece of work.
So police forces across the UK have introduced an “online crime flag” to mark when an offence involves an online element. But you say that there are large variations in the proportion of child sexual offences being flagged as online crimes by each police force and there’s anecdotal evidence to suggest that the online crime flag is currently being underused. So what can be done to ensure better use of this flag across all of the different police forces do you think?
Holly Bentley
I think this is another case where when new forms of recording come in, it takes a while for police to get used to using them, it’s a new system. So part of the solution is time. We’ve seen increases over time already in the number of crimes that are being flagged and they’re continuing to go up. So part of it is just police getting used to this new system and applying it accurately. I don’t know if you have any other thoughts, Martha?
Martha Kirby
I think we have seen an increase in the number of offences that are noted as being cyber-flagged and I think that is helpful; it probably reflects a better understanding of online abuse and the impact that it can have. But I think it is important to note that this is probably a vast underestimation of the number of offences that include an online element, and I think that’s quite important because it means that we’re not necessarily reflecting children’s experiences.
Children don’t necessarily distinguish between their online lives and their offline lives. It’s just part of their overall experience. So I think it’s important that the police find a way to note that most grooming cases, for instance, will have some form of online communication, because that’s how children communicate. So I think it’s really important that we start to see better use of these kinds of cyber-flags.
Neil Fairbrother
Speaking of child sexual abuse and grooming, the National Crime Agency data showed an 893% increase over the last five years in the number of UK-related industry reports of child sexual abuse material, primarily from the US-based National Center for Missing and Exploited Children. What’s triggered that huge increase?
Holly Bentley
The NCA originally submitted that data for the inquiry into child sexual abuse and they were asked the same question at the time: what did they think was happening? As with all data, part of the answer is that we don’t know for sure, but they’ve said it’s likely to be a combination of increased offending and also increased detection.
So people are getting better at finding out that this is happening, and reporting it and referring it. But they do think that there is also more of it happening; the ease of online communication has contributed to that. I mean, if you look in the same indicator at the number of obscene publication offences being reported to the police, a lot of which take place online, there’s been a similar upward trend. It’s the proliferation of online communication that enables that.
Martha Kirby
What’s important about this kind of data is that you are also talking about child sexual abuse material, which is the sharing of indecent images between perpetrators, and one of the things that will have led to this kind of vast increase is the fact that we have technology now that’s able to detect child abuse material automatically.
It’s called PhotoDNA and basically it creates a way for artificial intelligence to immediately detect that abusive image and for it to be removed. So actually we’ve got technology now that’s much better able to deal with finding child abuse material and then removing it as well. And as that database increases, and more images get put onto that database, the more images you’ll be finding and taking down.
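For readers curious about what “that database” means in practice, here is a minimal sketch of hash-database matching. It is not the real PhotoDNA algorithm, which uses a proprietary, robust perceptual hash rather than the exact cryptographic hash used here; the sketch only illustrates the workflow of hashing an upload and checking it against a set of hashes of known abusive images.

```python
import hashlib

# A minimal illustrative sketch of hash-database matching, NOT the real
# PhotoDNA algorithm. It shows the workflow only: hash the incoming file,
# check it against a set of hashes of known abusive images, and flag it
# for removal if it matches.

def file_hash(path: str) -> str:
    """Return a hex digest for the file at `path`."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_remove(path: str, known_hashes: set[str]) -> bool:
    """Flag the upload if its hash matches the database of known material."""
    return file_hash(path) in known_hashes

# As more confirmed images are added to `known_hashes`, more re-uploads
# of the same material are detected and taken down automatically.
```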
Neil Fairbrother
That’s the “hashing” technology, but clever and marvellous as that may be, it is still after the fact; the abuse has to happen first, so you still have a victim. And so whilst this may be great at removing the content that’s been produced, it’s not stopping anything at source. Is there a way to use technology to stop it at source?
Martha Kirby
So there definitely are ways that we could use technology to prevent it at source, and I think prevention is absolutely vital; that’s why the NSPCC is calling for statutory regulation of social networking sites through our Wild West Web campaign. And that’s very much focused on the ways in which technology can be used to prevent these crimes from happening in the first place. There are ways that companies can use analysis of metadata, for instance. Perpetrators might send out lots of friend requests to children and young people with the intention that only a few of those would get a reply, but that creates certain patterns in terms of friendship-making on social networks, which companies can use to detect and find accounts that they need to look at in greater detail.
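As a rough illustration of the metadata signal just described, the sketch below flags accounts that send large numbers of friend requests to child accounts but get very few acceptances. The field names and thresholds are assumptions made up for the example, not any platform’s actual detection rules.

```python
from dataclasses import dataclass

# A hedged, simplified sketch of the kind of metadata signal described
# above: an account sending large numbers of friend requests to child
# accounts, only a few of which are accepted. Thresholds and field names
# are illustrative assumptions, not any platform's real detection rules.

@dataclass
class FriendRequestStats:
    sent_to_children: int   # requests sent to accounts flagged as minors
    accepted: int           # how many of those requests were accepted

def looks_like_mass_targeting(stats: FriendRequestStats,
                              min_requests: int = 50,
                              max_accept_rate: float = 0.1) -> bool:
    """Flag accounts that blanket children with requests but get few replies."""
    if stats.sent_to_children < min_requests:
        return False
    accept_rate = stats.accepted / stats.sent_to_children
    return accept_rate <= max_accept_rate

# Flagged accounts would then be queued for closer human review,
# not automatically actioned.
print(looks_like_mass_targeting(FriendRequestStats(200, 5)))
```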
There are also ways in which companies can use things like linguistic analysis to detect grooming. So there are lots of ways in which AI can be used to help prevent crimes from happening in the first place. But we also shouldn’t underestimate the importance of being able to find and remove those images, because it’s really important to remember that behind every child abuse image is a child that’s been abused, and we know from research conducted elsewhere that the impact of having those images out there can be very, very damaging for people whose abuse has been recorded and then widely shared. So it remains important for those victims to make sure that those images are being taken down and removed as well.
Neil Fairbrother
Just because you were [only] looking at an image, does not mean there isn’t a victim.
Martha Kirby
No, no. There’s a victim behind every one of those images. And I think another reason why it’s so important that we get on top of this abuse imagery is that the more people that are looking at it, the more that fuels perpetrators going on and committing new abuse to create new images. So it is that very unfortunate supply and demand system that you have there.
Neil Fairbrother
Indicator five – moving on rapidly. Childline Counselling Sessions. Childline obviously was set up many years ago by Esther Rantzen of That’s Life, which I remember watching as a child, and the NSPCC has taken it on and continues to operate it; it is still doing fabulous work. From what you’re saying, you are using the data that you collect for analysis, and this is one of the contributing data sets, and there does appear to be some good news here because the number of counselling sessions for children about online sexual abuse has decreased by 23%, which is quite a drop; you can’t ignore 23%. So is this good news? Is this a good news story, or is there something else going on here that the number itself doesn’t explain?
Holly Bentley
Well, there are a few things going on here. We’ve seen in recent years a slight drop-off in the number of counselling sessions we’ve been delivering; there’s a health warning in the report that there has actually been a decline in the number of counselling sessions over the last year. So part of that decline is because there are fewer counselling sessions overall, and we’ve put in place a whole development programme to address that.
We’re recruiting more volunteers and we’re looking at more efficient ways of delivering our counselling at the moment. That decline is largely because there’s been a real shift, and this ties into the online theme: children don’t want to phone anymore, they want to talk to counsellors online, and obviously with a chat online, where you’re typing things out, you’re reading them, someone’s responding, then you’re processing that and typing out again, an online counselling session takes over twice as long as a phone counselling session. So we need to keep increasing our capacity to deliver counselling in order to meet that demand.
Part of that decline is because there are fewer counselling sessions, but that doesn’t explain the whole decline. There is natural year-on-year variation in what children are talking to us about, but we’ve done a lot of proactive work in the last year signposting children to information and advice on the Childline website, which is another really important resource where children can find information about the problems they’re facing. There are also message boards for peer support. There are loads of resources there, and we’ve been really investing in increasing the amount of resources that are out there so that children can proactively find that information. So we’re hoping that’s also having a bit of an impact on our figures. But if you’ve got any other thoughts, Martha?
Martha Kirby
No, I think you’re absolutely right, and Holly’s definitely right to point out that we’ve had certain pilots in place which were very much aimed at driving children towards the resources that we have online. So in a way it is a good news story that children are able to find what they need online, so that the need to call up is less as well.
Neil Fairbrother
Okay. There were a couple of interesting data points. 26% of perpetrators were adult acquaintances, but also 26% were children who were not a friend of the victim. And the question I really want to ask about this is: were those actually children, or adults posing as children? In other words, isn’t there an element of catfishing going on here?
Holly Bentley
So from a data perspective, we can’t be sure. We at Childline are very child-led, so what a child tells us they’ve experienced determines how we record what happened in that counselling session. But the child may not know whether the person that they’re interacting with is a child or an adult. We have done some more in-depth analysis where we’ve actually gone into the individual counselling sessions, read through all the notes and looked at trends, and there is a separate report that was published a couple of years ago. That’s on the NSPCC Learning site, and it looks in much more detail at the types of things children were talking to us about in counselling sessions. So that’s worth a read to get a bit more context. But Martha’s probably got some thoughts on adults posing as children…
Martha Kirby
From the research that we’ve done elsewhere, we do know that it can be very difficult for children to tell what age someone is online, and it’s very difficult for them to really have a strong sense of whether somebody is an adult or a child. And I think that’s why it’s really important for adults to be having regular conversations with their child about what it is that they’re doing online and the friends that they’re making online, and making that a very normal part of day-to-day conversation, like they would talk about what they’ve done at school. I think it’s through adults having these conversations with children, and helping them to think through what’s happening to them online, that adults can really help children to protect themselves.
Neil Fairbrother
Okay. Indicator six. NSPCC Helpline Contacts. In the report it says that although the number of children calling into Childline has decreased, the number of concerned third parties calling about children has increased by 19%, and again, predominantly about girls: 36% aged 10 to 13 and 37% aged 14 to 17. What’s causing that increase?
Holly Bentley
Part of it is the opposite of what happened with Childline. We’re seeing increases in the number of adults contacting the helpline about anything; this is part of a general trend. But again, it’s gone up more than contacts in general have gone up. Contacts to the helpline are affected by things like awareness of the issue, and I think there has been a real push to make sure that parents and adults in children’s lives are aware of those online risks, and perhaps they’re seeing more of those risks because they’re more open to noticing that these things are happening.
We’ve also got a helpline specifically around online issues with O2, which we run and promote, so that’s also encouraging more people to talk to us about this issue, where they might not have previously thought that it was something that they could call our helpline about.
You mentioned predominantly about girls, so for Childline we know that’s largely, although not entirely, because girls are more open to talking to Childline. Girls in general seem to be more willing to talk to other people about their problems. So part of that gender disparity is because there are more girls talking to Childline, although again, it’s more pronounced for online abuse than for other issues.
For the helpline, we don’t really know. It could be partly that girls are more vulnerable, or it could be that adults are more likely to be concerned or to see worrying behaviour in girls than in boys. But I don’t know if any of the other research you look at, Martha, supports this?
Martha Kirby
Yeah, I mean it’s very difficult to know. As Holly says, it could be that girls are more vulnerable than boys online and that they are more likely to be targeted, or it could be that adults are more readily able to see that vulnerability in girls than they are in boys and recognise it more quickly.
What we do know is that girls are predominantly targeted; we see that through Childline and we see that through the helpline. But we also see it in the number of children, or girls specifically, where there have been reports of sexual communication with a child. We find that in the majority of cases, I think it’s around 70% of the cases of sexual communication with a child that are reported to the police, the victims are girls aged between 12 and 15. So there does seem to be something around that cohort that is in some way more vulnerable.
Neil Fairbrother
Okay. Indicator seven then is data from our very good friend the Internet Watch Foundation, and they are reporting that there has been a 54% increase in the number of URLs confirmed by the IWF as containing child sexual abuse imagery since 2015 but less than 1% of that is hosted in the UK. Mostly it appears to be in the Netherlands. So it’s a different country, different jurisdiction. It’s in the EU. Guess what? We may soon not be in the EU. What can the UK do about that when it’s in a foreign jurisdiction?
Holly Bentley
I’ll hand this over to Martha in a minute, but just on the data point, I think in some ways it’s a good news story. When the IWF first started in 1996, the UK hosted 18% of URLs, and it’s dropped down now to less than 1%; it’s 0.04% actually. So it does show that when there’s a real concerted effort and focus in a country to tackle the problem, you get this demonstrable result.
Martha Kirby
Absolutely. It is a very difficult area, and I think what this really highlights is that, as Holly was saying, you can make a difference to whether these images are being hosted in a particular country. Countries can make a difference on their own, but really it highlights the fact that this is a global issue; the proliferation of child abuse images is an international issue that requires international responses.
Neil Fairbrother
Now also in this section you say that there is limited information available from social networks about the levels of abuse that occur on their platforms, and that what information is available is not provided in a consistent or comparable way. Is this something that the proposed regulator from the Online Harms white paper could help with, do you think?
Martha Kirby
Yes, the easy answer, or the quick answer, is yes, absolutely, that’s part of the idea of the regulator. It’s really important that we start to understand from the sites themselves what kinds of risks they are seeing and what levels of risk they’re seeing, because at the moment, as I think most of this report shows, we’re feeling around in the dark for different snatches of insight.
But the best way to really understand the scale of the risk, what kinds of things are coming up, what reporting is like for children and young people, and what impact new methods and new ways of trying to protect children online are having, the only way we can really do that is by understanding what is going on on those sites, and that requires them to be open about the information that they share.
Neil Fairbrother
Well we’ll look forward to what comes out of the white paper review period later this year, perhaps early next year.
So Indicator eight concerns itself with survey responses and is subtitled advice seeking and awareness, and in here there’s something that I found quite interesting. 62% of surveyed key professionals in Primary schools and 84% in Secondary schools said that they were confident in their understanding of online threats. But other subject matter experts I’ve spoken to in other podcasts indicate the opposite: that this complex topic isn’t institutionalised in schools and that teachers are ill-equipped to deal with cyber-related issues. And this is borne out again by the graphic on page 42 of the report, which shows that the majority think they are only reasonably confident, which basically means they are unsure. So there’s a slight contradiction in the data itself and there’s a contradiction in what I’m being told by other people. So what is actually the truth here? Is there a truth?
Holly Bentley
Is there ever a truth? Firstly, they were key professionals that were asked these questions. So these were members of the senior leadership teams in schools, they were the designated safeguarding leads or they were the school’s network managers. So they are the people within the school team that you would most expect to know about this topic, and part of their role requires them to be a bit more on top of this topic than perhaps everyone else. They have a key role in keeping children safe.
And despite that, you’ve still got a significant proportion that don’t feel that confident, and you’re right that within the bracket of teachers who said they did feel confident, most of them said they were reasonably confident, not very confident. So 39% of Primary school teachers and key professionals and 43% of Secondary said that they were reasonably confident, and only 23% of Primary and 41% of Secondary said they were very confident.
So there’s clearly still a need for a lot more support and information provided to teachers. I wouldn’t want that stat to make people think it’s all fine or there’s no more support needed. And certainly we find on the NSPCC Learning site, which is aimed at professionals, that our key audience is teachers and we have an eLearning course on keeping children safe online and that is predominantly used by teachers as well. So there’s obviously still a strong need for more information and training on this topic.
Martha Kirby
I think this also the can be a bit of an issue between how teachers think about something in an abstract way of “Would I be able to handle this?”, and then what happens when they are actually in that situation and faced with actually then needing to deal with something. I think sometimes there’s a bit of a disconnect there between the kind of theoretical being able to do it and then the practical.
What we do also know is that quite often teachers can find the online sphere very scary and they find it difficult to pick apart what to do when a child needs safeguarding online. And I think the most important thing that we can say for teachers is that it’s the same as what you would do for any type of safeguarding concern, it’s just that it happens to be online, but the most important thing is still for teachers to talk to children and young people about what is going on in their lives. Try and kind of open up that kind of dialogue and that conversation as much as you would in any other type of safeguarding situation.
Neil Fairbrother
So, Indicator nine, the penultimate one, we’re almost there. We’re almost done. So we’re cracking on. This one was all about children taking action to stay safe online. There’s an assumption that children know much more than their parents about the use of technology; this is almost regarded as an unwritten rule by many people. But your report suggests that children aren’t as savvy as we might assume, because your data shows that less than half, I think it’s 44%, of children aged 12 to 15 said they knew how to change their settings to control who could view their social media. So is it just a myth that this generation of digital natives are technological experts?
Holly Bentley
Well, I’m not wanting to jump ahead to indicator 10 too soon, but this is also reflected in how easy it is to find the settings in the first place. So more than half of young people in a separate survey said the privacy settings should be better, easier, or clearer to use. So it’s not made easy for children to change these settings. And another thing to bear in mind from these findings: while the Ofcom survey found that less than half of children knew how to change their settings, even fewer actually do change them; it drops down to 32% who said they actually went on and changed their privacy settings.
So there’s one question around children having the knowledge to be able to do these things to start off with and then there’s also a separate question about the awareness of the importance of doing these things and why they should be doing them in the first place. So I think all of this shows, there’s still a key role for adults in children’s lives to make sure that they talk to them about online safety, that they’re teaching them those skills and also talking about why you might need to be thinking about online safety when you’re online to start off with and be aware of how important it is.
Neil Fairbrother
And a child-focussed design would be a good idea as well.
Holly Bentley
Oh yes, definitely. Anything that could make it easier for children to take those actions would be a big help.
Neil Fairbrother
Okay. So, last but not least, Indicator ten. We’ve got there! What needs to be done to improve online safety? In this section there are some quite interesting stats: 41% of 11 to 18 year olds surveyed said that they thought websites, apps, and games weren’t doing enough to keep them safe online, and 92% of 11 to 16 year olds surveyed said that social media platforms should be required to protect children from inappropriate content and behaviour, and so on. So what practical steps can be taken to safeguard children in the online digital context that don’t restrict their freedom of speech, freedom of expression or freedom to participate?
Holly Bentley
This is definitely a question for Martha, but I’d just like to reiterate what you’ve said. All the data points to the fact that there’s a real sense that something needs to be done, that not enough is being done at the moment. It gives a real impetus to the type of work that Martha’s doing.
Martha Kirby
For the NSPCC I would say one of the biggest things that we can do to help keep children safe online is the introduction of statutory regulation for social networking sites. This would force companies to act in a way where they have a duty of care to children and young people online and they have a duty to act in the best interests of the child.
So when they are thinking about releasing a new feature, for instance the proposals by Facebook to encrypt all of their services, we would say that they then have a responsibility to think about the impact that this could have on children and young people, and whether it’s appropriate to offer it to children and young people, because what’s most important is that we’re able to keep them safe online. And obviously children absolutely have a right to privacy too; it’s a really difficult balance, and it’s always going to be a difficult balance, to ensure that you’re safeguarding alongside ensuring freedoms and protections.
But it’s really important that we do that and that we take those steps to keep children safe online because I think although the data’s patchy and although we’re still learning about the impact that the Internet is having on children’s safety and wellbeing, we know that a lot of children are experiencing abuse online and that something really needs to be done now to stem that flow.
Neil Fairbrother
Brilliant. Martha. Holly, thank you so much for your time. It’s been a fascinating discussion and I look forward to next year’s “How Safe are our Children?” report where hopefully all the indicators are down.