In this safeguarding podcast, Will Gardner CEO of Childnet International discusses international aspects of safeguarding in the online digital context. We explore international projects such as Project deShame, how young people can be empowered to protect themselves and other young people online, the power of young people’s voices and how they can be heard, Brexit, the Online Harms white paper, Childnet’s film competition and Safer Internet Day.
Below is a transcript of the podcast, lightly edited for legibility, for those who can’t make use of podcasts or who simply prefer to read.
So welcome to another edition of the SafeToNet Foundation’s safeguarding podcast where we talk about all things to do with safeguarding children in the online digital context.
Today’s podcast has an international focus. Today’s guest is the CEO of an organization that represents and speaks for children globally and I’m delighted to welcome Will Gardner, CEO of Childnet International to the podcast. Will, could you give us a brief resume of who you are, your background, and indeed of Childnet International.
Brilliant. So maybe I’ll start with Childnet. Childnet is a children’s charity and it was set up in 1995 with the mission to help make the Internet a great and safe place for children. We work in a number of different ways, three main ways really.
We work in the area of education. We believe very strongly in the opportunities that technology brings to children and young people, but we are committed to making sure that young people appreciate the opportunities that are there and know how to exploit them, as well as raising awareness and empowering young people to manage the risks that they encounter whilst they’re online. We have an education team that goes out to schools and we talk to children throughout all school ages, even children as young as three, to give them messages about how to get the most out of technology and use it safely. We also work with parents and carers, and with school staff, in that way.
That is the information source for all the work that we do. We’re talking to children and young people and those that support them every day. And we then recognize and identify key issues that need to be focused on and we try to raise money from different places to develop educational materials, which we put on online using the power of the Internet to disseminate, which is available for all those who are working with children or children directly to access for free. So that’s kind of the education side of it.
Then we also work in the policy area. So we want to work collaboratively with governments, with industry, with other organizations to try and help shape this environment, or these environments, which are just so popular with children and young people, with young people’s interests at heart.
So we will almost take the voice from the schools where we are listening, from children and young people and embed that within the policy work that we do. We work collaboratively with these key stakeholders.
And then underpinning the education and the policy, we have youth voice. We want young people to have voice within education, both in terms of critiquing and informing the resources that we’re developing in this space. But we also know that young people feel very passionately about this issue of online safety, and we want to harness, mobilize and encourage that so they can have agency within this area, within their local communities or beyond. So quite often we will hold events and bring the ears of policymakers, so they can hear children and young people directly. We do that often on Safer Internet Day and on other key events.
We have young people on the stage and sometimes we have government ministers or industry representatives hearing directly from children and young people. We have a “Digital Leaders” program which helps to encapsulate some of that work.
Yes, can you expand on the Digital Leaders program?
Yes, the premise is as I’ve outlined: it’s about using the passion of young people in this space. One Safer Internet Day several years ago we had this amazing piece of research in which 15,000 children responded about their rights and responsibilities online, ranking what they thought their top ones were. We invited 10 secondary school children and 10 primary school children down, I think we held it at Microsoft, to articulate some of the issues that they wanted to express, and it was very powerful and very moving.
But there was one primary school child who spoke, took the microphone and spoke to the audience, probably 120 policymakers. He said, (and it did help that he’d been studying Martin Luther King and the civil rights movement just the week before), but he took the microphone and he said:
“I have a dream. I have a dream where me and my friends can use the internet without the fear of being bullied”
It was enormously powerful. And it just shows the power of a young person’s voice could really penetrate in a way that’s quite different to the other tools we have at our disposal, and if we are prepared to go that distance to give young people that opportunity, then the impact we can have in that space is so much the greater. This program is really about that.
So we have an online platform and schools register for the Digital Leaders program. About 10 young people from each school go through the online training modules and become, at the end of it, qualified Digital Leaders, and then they go and earn badges by running a session for their peers or running a session for parents.
And this isn’t just for older pupils?
We started it with secondary aged children, and then we had primary schools that said we want to have this too. So we then expanded it to include primary, and we have children who are in year two who are active Digital Leaders within their school community, running sessions for peers, for parents, even for school staff. And it really is enabling young people to articulate the current [situation], this is what is happening now. They are credible voices in this space because they’re experiencing the same thing.
We’re not devolving responsibility for online safety to children and young people. I think it’s important to say that. We’re not saying “Right, well this is too tricky, you know what you’re doing. You sort this out”. It’s about mobilizing the assets we have at our disposal to try and deal with this very challenging issue of how we stay safe online. And it’s very, very exciting. You give something to children and young people and they end up doing things that you wouldn’t have predicted they would do.
So we have secondary schools going to visit primary schools, and we have a secondary school in Stratford who have gone to visit their local PRU, Pupil Referral Unit, to work with young people there, working with children who’ve been excluded as a result of their behaviour with technology. As they come back into school, they are [having] reintegration sessions with the Digital Leaders to kind of help them understand how they need to be working with technology in a positive way. We even had 70 young people who responded to questions related to the Online Harms white paper that just came out.
So we are genuinely giving young people the opportunity, the microphone if you like. We’re amplifying their voice and making sure that the people who need to listen are listening.
That’s absolutely fantastic. It’s inspirational to hear that. You do have an international flavour, the clue is in the name, Childnet International. Do you find any cultural differences in the countries in which you operate? Is there a difference between UK and Greece, for example? I know you’re doing a project in Greece.
There are actually. I guess I should start off answering this question [by saying that] Childnet International is international because the Internet is international. This is an amazing technology that doesn’t respect borders in that way. And in the last 10 or 15 years perhaps, we’ve seen a real shift in the experience of young people using technology. We work very closely with our EU partners, and this is absolutely the case across the EU: young people are not only using the same devices but they’re using the same services more and more.
We’ve seen the rise of global, phenomenally popular services, so there’s a lot more similarity in young people’s online experience than there has been before. I think we’ve seen that, but there are obviously some cultural differences in what we’re talking about.
So at Childnet we will be doing our work with children, young people, mainly here [in the UK], and we will work with partners, working with children, young people, in different countries, and then we will look to share what we’re doing with people across the world. We put all our content up, it’s online, it’s free, anybody can use it and see it, and we don’t need to keep reinventing the wheel. We will draw from and learn from others and others will look at what we’re doing and use that.
You mentioned Greece. We have our book called “Digiduck”, an illustrated children’s picture book, and the model is a really exciting one: we’re trying to encourage that dialogue between parents and children at a very early stage. We know we need to start this conversation early. It’s also for teachers and kids to have that conversation early, and in Greece they were really excited by that model. They saw the book and it was literally a question of, well, let’s translate it and make it work for the Greek context.
We’ve just completed our second book and we’re talking about how we can take that further and there are other resources that we’ve done which have been used in different parts of the world, such as a cyberbullying video, which we’ve had subtitled and used in Denmark, it’s not been translated but it’s used in Australia, it’s used in Germany and France.
You know, this is a shared issue that young people are facing and those supporting young people are facing globally. The opportunity to be international in this space is hugely important and we need to make sure that we’re using that to the best.
Yes. One of the other international projects you’ve been running was Project deShame. What is Project deShame?
Project deShame is a project that we run in partnership with Save the Children in Denmark, Kek Vonal in Hungary and the University of Central Lancashire, and it’s focused on online sexual harassment amongst teenagers. We did some focus group work back in 2016 with children in schools in different parts of the country here in the UK, where we were updating some guidance for schools on preventing and responding to cyberbullying. And in those conversations, the young people were talking about a type of bullying that was slightly different. It was very sexualized cyberbullying. They were talking about “Bait Out” pages.
Can you explain Bait Out for us?
So these are pages on social media sites like Instagram that are connected, linked, to local communities. The person who set up the page will be from a school and they’ll be encouraging images or gossip about other young people within their school community. Sometimes there’ll be images solicited from others in the school, sometimes sexual images, sometimes not, but it will generate sexualised gossip on the page. So it’s the sexualised form of cyberbullying.
And then when talking to the professionals’ Online Safety Helpline, which is part of the UK Safer Internet Centre, they were talking about getting cases and calls about such issues too. We felt that we needed to stop and have a deeper look, so we applied to the European Union under their Rights, Equality and Citizenship programme and launched Project deShame.
The idea was first to ascertain, well, what is going on? What are young people’s experiences of this, not necessarily direct experiences, but witnessing this, how much of a feature of their childhood is this issue, before [we] try to address it?
The goal of the project is to tackle the issue of under-reporting, because young people in these focus groups said that they would never tell anybody about this particular thing. And it’s also about developing multi-agency work. So those are the two goals of this project. And we did some incredible quant research, which you can see at deShame.eu, and we’ve broken online sexual harassment into four different areas.
We’ve got the non-consensual sharing and the non-consensual taking of images.
We have coercion, with 10% of young people in the UK saying that they’ve received sexual threats online, including rape threats.
And then we have sexualised bullying, which is the gossip, for want of a better word, the slut shaming, the kind of gossip about perceived sexual activity of others, or even the sexual orientation of others, which takes place online.
And then the last one is unwanted sexualisation, being sent sexualised images or messages that you hadn’t asked for.
These are the broad areas we looked at. And what was really striking is that the experience of young people, and we are talking about 13 to 17 year olds, it should be clear this is teenagers we asked, is more similar than different across Denmark, Hungary and the UK. So we deduced from this that this is actually a more widespread issue. It’s not culturally or nationally bounded. We think this is an issue that many young people are facing.
And even if they’re not the ones who were receiving a rape threat directly, 30% of young people said they’d seen that happen to people that they know online. And the concern that we have is there’s the risk that harmful norms will be created and people’s expectations of what happens online will encompass this type of behaviour, which is behaviour that nobody wants to see online.
So Project deShame is about recognising it and then addressing it. We’ve developed educational materials to try and raise awareness, so that people can recognise what online sexual harassment is when it happens, and to talk about how you report it when it happens and how you respond to it when it happens.
So we built this toolkit, if you like, for educators to use. It also has information for law enforcement and for young people too, as we try to bring this up into the public discourse because it is a problematic behaviour.
You’ve been talking a lot about rape threats and so on, so I’m assuming that most of the victims are girls and most of the perpetrators are boys?
Yes and no. There is not an exclusive gender split. Girls do have a more negative experience online, and certainly within the category of sexualised bullying the girls’ reputational damage is different from the boys’ reputational damage when it’s talking about sexual activity and so on. There is a gender differential in the rape threats area too; I need to check the figures, but you can find them at deShame.eu. I would anticipate that there would be a gender split in that area too.
You mentioned that this project was at least part funded by the EU, and we cannot avoid the topic of Brexit. It does pop up in these podcasts, and the reason for that is that a lot of organisations are dependent on EU funding, but also there are EU-wide regulations as well, so it does impact safeguarding our children online. Do you see there’s an issue here or is everything going to be okay?
Well, if you’re asking me what’s going to happen with Brexit I’m going to happily say, “I don’t know”, like everybody else. I don’t know what’s going to happen with Brexit, but what I can say is the EU has been really important in relation to online safety in the whole area, whether that be for hotlines, for helplines, for awareness centres, the network of National Safer Internet Centres that we have across the EU, and as you know Childnet is part of the UK Safer Internet Centre. The EU has been really important.
The Internet is one of those things like climate, as I said before, it doesn’t respect borders in that way and we do need to have that international cooperation to make a better response in relation to this and learn from each other’s experience. So the EU has been a really significant player and a significant funder in this space for the last 20 years.
Some areas of the work that we’re doing, we know the Government is looking to guarantee going forward. So the UK Safer Internet Centre, a project which we haven’t spoken about yet, the Government has said that they will guarantee that project. For all the Safer Internet Centres in all the different EU countries, we are currently on a project cycle that ends at the end of 2020, and we’re waiting to see what happens after 2020. The expectation is there’ll be a continuation of the program, and as partners of the UK Safer Internet Centre we’re absolutely committed to seeing what we can do, whether we are in the EU or not, to carry on that input and cooperation, as the UK is a really important part of that network and of a community of people who are active in trying to support children and young people in that space.
Well let’s explore the Safer Internet Centres. What are they, how do they work, what do they do?
So there’s one in every EU country and they have three main components. They have a Hotline, where members of the public can report illegal content online and in the UK that’s the Internet Watch Foundation (IWF), where you can report child abuse images that you come across online. The IWF is a global leader in this particular area and it’s one of the few Hotlines that is able to do proactive searching and to go out and look, not just to rely on reports from the public, they can actually go out and find content and get it removed.
The second component is the Helpline of the Safer Internet Centre, and in the UK we have the professionals’ Online Safety Helpline (POSH), which is for all professionals working with children and young people. So if you’re a teacher, social worker, police officer, medical professional or youth worker and you have an issue around safety relating to children in your care, or it might relate to you personally (we sometimes get contacts from teachers who’ve been bullied by parents of kids in their school via social media), you can contact POSH and they can take steps to support you, or even sometimes to escalate concerns that you might already have raised with social media to the social media contacts that they have, in order to get content taken down. So that’s the Helpline.
And the third part is the Awareness Centre, and that’s something that we do at Childnet together with our partners, the Southwest Grid for Learning (SWGfL), and in amongst that we develop different educational materials for specific target audiences, covering different topics around the online safety theme.
We run the Digital Leaders program to try and encourage youth voice in this area and we run Safer Internet Day, which happens every February. Next year (2020) it’s going to be on February the 11th, it’s the big national opportunity to raise awareness about these particular issues.
So that’s what the Safer Internet Centre comprises, and there’s one in every EU country, and twice a year we meet with our counterparts in the Awareness Centres and the Helplines to share trends, issues and good practice, because as I said earlier, we’re all dealing with the same things and there’s a lot we can learn from the approaches and experience of others, and collaboration makes us better at what we do. And that’s a really important element of this work.
When it comes to international work, it’s really quite an interesting area as we’re beginning to discover. Safeguarding [online] is bounded by technology, law, and ethics or culture. Each country has got its own national laws, but internationally is a different thing because there isn’t a global government. The closest thing we’ve got I guess might be the UN and there is the UN Convention on the Rights of the Child, the UNCRC, which gives some guidance and almost every country in the world has signed up to the UNCRC with the notable exception of Sudan, and also the USA.
Which is an interesting point, because a lot of the social media companies are American companies. So given that the USA has not recognized the UNCRC, or not signed up to it, or not adopted it, and yet these are American companies, is there a conflict there? Does that cause any issues?
Well, it’s a good question and I can’t categorically answer it, but these are universal rights that people recognise: the rights of the child to information, to education. These are universally accepted, and it’s the duty of states to then put these rights into law.
Now clearly you’re going to notice that I’m not a legal expert and I’m not a lawyer, but I would say that we do see the embodiment of these rights come to fruition in national legislation, or in EU Directives that then come into national legislation. But there are elements of balance between some of the rights that we have to strive to find, the right to privacy for example, and the right to be safe; we have to try and balance how these things work in the optimum way for individuals. You might have some people thinking that encrypted services are the way people are moving, with people demanding more privacy in their communications given the popularity of different services working in that space, but what limitations does that bring in terms of safety?
Those conversations are really important conversations and I don’t think we’ve got nearly as far as we would have liked to have seen in having that sort of policy direction and discussion.
One of the fundamental tenets that you subscribe to is to be “child focused”, and the ICO (the Information Commissioner’s Office) has just ended a period of consultation for its Age Appropriate Design Code. Did you provide input into that Age Appropriate Design Code and if so, what was it?
We responded to the consultation, absolutely, and we are supportive of the direction of travel. I mean the fundamental thing that we want to see is people being informed. If people are using these services, and we do see young people using these services and getting a lot of enjoyment out of different online services whatever they may be, they just need to be clear about what the deal is in relation to the data that they are sharing through their use of these services.
We want to make sure that when people are [using these services], they know the rules of the road, and I think that is an important piece. A lot of the Age Appropriate Design Code was about making absolutely clear, or using defaults to help ensure, that the educational element within the journey that users have within these particular services is promoting transparency and openness, and really developing young people’s awareness and understanding of their data and how it’s used by the different services that they’re interacting with.
Obviously to have an age appropriate design, you need to know the age of the child. So does an age appropriate design go hand-in-hand with some kind of age verification or age estimation process?
I wouldn’t let that stop progress in this area. There’s been a lot of talk about Age Verification, and some do see it as kind of the Holy Grail of online safety in some areas. It’s easier to verify whether somebody is 18 or not 18: the mobile operators use a credit card verification system to ascertain whether somebody is 18. It’s harder to differentiate between the different ages of young people who are under 18.
What we would advocate is this: let’s say you’re setting up a new social media profile. You could be an adult, you could be a child, but when you start it, your starting point on the social media profile is default private, it is a private profile, and in order to make it more public, you have to consciously make your profile more public.
You’re making an informed decision about the status of your online presence on a particular service. I think that’s relevant and absolutely important for children and young people, and I think that’s also really important for adults in this space too. And while the social media provider is unable to ascertain whether somebody is a child or an adult, I would suggest that would be the best way of working this system for your service, because that way you don’t need to have age verification. You are effectively applying the same rule for all.
Talking about social media companies and “age”, a key piece of legislation which affects children across the world is a US piece of legislation, the FTC’s, the Federal Trade Commission’s, Children’s Online Privacy Protection Act or COPPA, which was last amended in 2013. They say then it was all about giving parents greater control over the online collection of their children’s personal information and it covered areas such as persistent identifiers like cookies that track a child’s activity online as well as geolocation information, photos, videos, audio recordings and such like. They’re now looking to review COPPA again. What would you like to see in this really important piece of legislation?
That’s a good question and I don’t know. I mean, I think people use this age of 13 as a kind of safety threshold. Well, it’s not a safety threshold. This is about companies collecting personal details from children and being able to do it without parental permission, which is why the age of 13 is set. It’s about privacy and personal data, it’s not about safety. It doesn’t mean that your child knows how to use social media safely because they’re 13. It means that the company is able, without parental permission, to collect data about children using the service.
I think there is that conflation between the two sometimes, that this age requirement is about safety. And it can be an element. When we’re working with teenagers, a lot of the conversation is about social media, and when we’re talking to children under the age of 13, a lot of it ends up being about information sharing, which is related to social media.
In essence, as we know, many children under the age of 13 are using [social media]. So there is a lot of work for us to do on both sides of this age divide. I think having an age is useful. It is useful for us to be able to tell parents that these services do have age requirements, and there are parents who take notice of that, and some children and young people recognize that there are age requirements there too.
But the educational piece needs to be broader than the age requirement. We need young people to be prepared for social media before they start using it, but we need to do that in a way where we’re not marketing social media to young people who are under the age requirement. So we’re always striving to find the balance where we are empowering young people to engage constructively and responsibly, so they can look after themselves and look after their peers in this online environment, absolutely, but doing it while absolutely respecting that there is an age requirement for these different services and not promoting them to children and young people.
The IWF and indeed the NSPCC both provide pretty compelling evidence that 13 seems to be the peak age for child sexual exploitation, so should the minimum age be raised to say 16?
Well, I think it’s an important discussion to be had, and I don’t think it’s a discussion that will probably ever end. I know that GDPR has given guidance towards 16, but some countries are choosing to use 13. It will end up coming down to a practical discussion, and I don’t think the age requirement has been as effective as it could have been; just changing an age by itself is not going to bring the change that people might want and expect to see.
If that was going to be the case, I think there’d certainly have to be a lot of work done around it, but I do think you’d be taking away an incredible medium which young people are absolutely engaging with, which they are using to communicate with their friends, and which they are using in a very positive way. I think there would be an element of almost disenfranchising young people from the technology which they are using.
You know, we do different research around how young people engage with technology and one piece of research was looking at how technology has become important within friendships and relationships. 80% of the young people said they have used social media when they’ve seen a friend that’s been upset, they’ve used social media to reach out and see if they’re okay or to comfort them. We need to recognise that there is this positive element of young people’s experience and that 13 to 16 year olds are very actively using this technology in very positive ways. I think we need to just keep all of that in our minds as we are thinking and having our discussions about what the optimal age is.
Okay. We talked about UK legislation, the Online Harms White Paper, which may well have international repercussions with any legislation that may be forthcoming from it. What is your view of the Online Harms White Paper? Some people regard the proposals as being somewhat controversial.
I think the theory behind the thinking and the objective behind what has been proposed is absolutely right. There are issues around the online space, and the Online Harms White Paper is very focused on social media, although it does have a slightly wider remit. It is focused on really bringing transparency and accountability to services, making sure that things that break the terms and conditions of a service are taken down, and it’s looking for systemic failure within that space. Stuff which is below the legal threshold, that harmful band of activities, behaviours and content which might appear on social media, does need to be taken down, and it’s trying to make sure that that happens.
We’d absolutely want to try and support that. How will that work in practice? The Duty of Care model that’s been proposed, which is designed in a way that’s flexible and future-proof, seems to me a very practical and sensible way to go forward. So we are looking forward to engaging with, pursuing and supporting this as it develops. We’re waiting to see what’s going to happen next. I understand the DCMS have many responses to the White Paper to digest, and I think we should expect to have something towards the end of this calendar year.
I’ve got a couple of responses here which take a slightly different view. Graham Smith runs a blog called Cyberleagle, he’s a lawyer and he’s written a response as well which is published online, which makes very interesting reading. And he has some concerns about the Duty of Care, because in UK law, “…private individuals and bodies, [which could be a social media company] generally owe no duty of care towards individuals to prevent them from being harmed by the conduct of a third party.”
So in other words, you might have a duty of care to me as a visitor to your premises with a loose floorboard, but you don’t have a Duty of Care to stop one of your staff bopping me on the nose or being abusive to me. There’s no Duty of Care law on that. So putting it into a social media context, what the government is saying to social media companies is “Well, you now have a Duty of Care for all this user-generated content, and not only do you have a Duty of Care, but you as an individual might be personally liable.”
Now that’s going to have enormous ramifications for freedom of speech, because they will become so conservative about what they allow on their platforms that they will actually suppress freedom of speech and take down legal content that they deem inappropriate. But it might also have the impact that they will be fined for failing to remove lawful speech. It may be lawful but it may be undesirable, so it has to be taken down because it’s harmful, even though it’s not illegal?
You’re asking me to pre-empt the discussions that are going to happen as a result of the consultation. I share the goal, and I think everybody sees the objective, what it is we are looking to try and achieve. And I think what was being proposed is the next part of the consultation that we’ve just had. These views are all going to be taken on board and we’re going to see where we end up. If we’re looking to find a way to get social media to be more responsive and responsible in relation to the reports that they receive from the public, then I think that’s a good thing. And you know, we’ll have to make sure that we are on the lookout to see whether we think it’s overstepping the mark or not going far enough, as we go forward.
I think it’s a very ambitious proposal that they’ve put forward, and I think it is a world first in relation to what’s been outlined. You know, there have been some other types of initiatives that have taken place, with the eSafety Commissioner’s Office in Australia and Netsafe in New Zealand. Here there is talk about “the regulator”, and I think this is the key ingredient for whatever actually happens with the White Paper. So let’s see.
I guess if I do have a concern around this space, it is that it will draw the whole focus of the online safety discussion to this particular issue, when in fact regulation is important but it’s not the only piece within online safety. And you know, as we are operating to try and raise awareness and empower children and young people, that work absolutely has to be continued. It has to keep this broader breadth.
The regulator will be dealing with some really important issues, but there are wider elements that we need to talk about, including the self-esteem and the mental health of children and young people: making sure that young people know how to look after themselves and look after their friends online, and trying to encourage young people to be good digital citizens. There is a strong educational, digital media literacy element within this area, and although that was alluded to in the White Paper, it is a key element in the [space]. If you want the UK to be the safest place in the world for children and young people, that has got to be the key component within it.
In the discussions about the regulator, I want to make sure that education is still seen as the key component within it. Of course industry needs to do more, and we just need to make sure that the focus around the regulator doesn’t detract from the other important work that needs to take place.
Talking about media literacy and education, Childnet runs the Childnet film competition. Tell us about that.
It’s an amazing competition, open to all schools across the country, or youth clubs, or places where young people are brought together. The challenge is for groups of young people to create short films: for primary age children a one minute film, for secondary children a two minute film. It’s normally on a theme around online safety. This year’s theme, and we just had the film event in July, was about how young people see the future of the Internet. We had some amazing, amazing films, and I encourage anybody listening to go and have a look on the childnet.com website and watch the films.
What happens is we shortlist the entries we get, three primary school entries and three secondary school entries, and we invite the young people involved in the creation of these films to the British Film Institute on the South Bank. The British Board of Film Classification, the BBFC, rate the films, and then we all watch them with other stakeholders from government, charities and industry. We watch the films on a big screen at the British Film Institute, and before the screening we have a judging session where we have David Austin, CEO of the BBFC, whose signature you see on the black classification card before films every time you go to the cinema, along with the BBC, BAFTA, and the British Film Institute, all judging the films. It’s incredible.
One of my favourite moments is when I see the excitement on children’s faces as their film comes up onto the screen. Seeing themselves six foot wide in front of that audience is amazing in itself, but then we interview the young people involved, so they get a chance to talk about why they decided to approach the topic the way they did, and we get to dig a bit deeper. The exciting thing is we’ve got content created by young people that is therefore shareable, and we know that schools, when they submit [their films], often share [them] in assembly with the whole school community. We put [them] online on our website, and the BBC “Own It” site has highlighted some of the winning films too.
So it gets a broader audience, and we have effectively got young people’s voices talking to other young people about this creative space. At this year’s film competition we had some amazing entries and some amazing winners. I really can’t recommend enough that people go and have a look at them.
Yes, please do! Unfortunately podcasts and films don’t naturally go well together, you can’t really see them terribly well, but I was really impressed by the sophistication of these films, given that they’re produced by youngsters. I thought they were magnificent! They are using green screens and animation and stop motion and all sorts of things. They are really astonishing.
The creativity, the imagination and the unpredictability of how young people approach the particular topics are remarkable. The prizes for the competition are for the school or for the youth club: a camera, a green screen, the all-important clapperboard, the kinds of things which will enable and encourage that creativity in children and young people. While the children are making the films it’s an incredibly fun and creative process, and the learning about online safety is, not so much a secondary goal, but a goal in disguise, and it’s very exciting from that point of view.
We had one school who won a camera. They already had a camera, but that was the camera that was kept in a cupboard. Here they won another camera, and that almost democratised camera use within the school community, enabling more children to access, play and learn with what is a really fundamental tool for our 21st century living.
One last question as we really are out of time now. When is Safer Internet Day?
It’s on February the 11th next year, in 2020, and I’d like everybody to mark that in their calendar.
This is the best opportunity in the calendar year that we can use to raise awareness about online safety and to empower children, parents, carers and schools. The reach is astronomical: last year we reached 46% of 8 to 17 year olds, which for a national campaign is extraordinary, and that’s because people are collaborating and working together, because this issue is not just important for Childnet or for schools. It’s important for absolutely everybody. Go to the UK Safer Internet Centre website, look at Safer Internet Day, and see what you can do to get involved. Next February the 11th!
Will Gardner CEO, Childnet International, thank you so much for your time.