Safeguarding Podcast – Year in Review 2019
By Neil Fairbrother
In this review of 2019 we remind ourselves of the work done by the Internet Watch Foundation, we discuss Brexit and Age Verification, we define and measure cyberbullying, we cover grooming for radicalisation as well as for sexual exploitation, we delve into the digital age of consent, we ask whether the UN CRC is fit for purpose in the 21st century and explore some Ecumenical aspects of the complex and dynamic topic of safeguarding children in the online digital context.
http://traffic.libsyn.com/safetonetfoundation/SafeToNet_Foundation_podcast_-_A_review_of_2019.mp3
There’s a transcript below, lightly edited for legibility, for those who can’t use podcasts or who simply prefer to read.
Neil Fairbrother
Welcome to the SafeToNet Foundation’s safeguarding podcast where we talk about all things to do with safeguarding children in the online digital context. In today’s episode we’re reviewing 2019 and the contributions from many of our podcast guests to this dynamic and complex debate.
First of all I’d like to thank our guests for giving up so much of their time and for providing their insights into the technical, legal and ethical and moral aspects of keeping children safe online. I’d also like to thank our listeners from around the world for taking the time to download and listen and thank them very much indeed for the fantastic feedback and encouragement that I get. I’d also like to thank our sponsor, SafeToNet, for their continued financial and technical support.
Where to start with this review? Well, how about at the beginning, with Susie Hargreaves OBE, CEO of the Internet Watch Foundation who bravely volunteered to be my first guest. Here she is describing some of the work they do and the impact that online sexual abuse can have:
Susie Hargreaves OBE, Interview with the IWF, 5.2.2019
Okay, so the IWF is an international hotline for reporting and removing online child sexual abuse. To give you a size and scale of the issue, the police say that in the UK about a hundred thousand people at any one time are looking at child sexual abuse. But for the IWF we’ve definitely seen the amount of content we remove grow and grow. In fact, in 2018 we removed over 105,000 web pages of child sexual abuse and that’s millions of images and videos, so it is a very serious matter.
People ask what we are talking about. It is anybody who’s under the age of 18, but [with] child sexual abuse there’s a massive range. So you’ve got babies through to 17 year olds who’ve taken a self-generated image of themselves. The majority of what we see is shifting slightly: we’ve seen more self-generated content, but we see an awful lot of children under 10. And by self-generated, we define that as content that’s produced by the child, but inevitably it’s nearly all done on live streaming, and then recorded, which means we don’t see the other side of it. We see the child and the end result, but clearly these are children who are coerced, who are deceived, who are groomed; we would never say these children are not victims.
Interestingly, we’ve been seeing so much of this content that we now record it, and have done for the last six months: when we take action on anything we now record whether it’s self-generated. So that would be a video of a child in a room, where there’s definitely not another adult in the room, and they are doing something at the behest of someone we can’t see.
And then we’ll categorise it. And unfortunately, in the six months we recorded it in 2018, about one in four of the reports that we took action on were self-generated [images]. And about 78% of the children in those images were girls that we assessed as being 11 to 13 years old, in bedrooms or household settings. And even if your 11-year-old is just in their bedroom, if they’ve got a camera-enabled device and Internet access, they’re potentially at risk if they don’t know what they’re doing.
People think it is a victimless crime. It’s not a victimless crime. We don’t make up these pictures. They’re real children and they’re really sexually abused.
I would like to tell you a story about a girl we call Tara, who I met in the States, and who was sexually abused by her stepfather from birth right up to the age of about 15, when she was rescued. But she now speaks about her experiences. She had an assigned police officer, and in the United States you can opt in to be notified any time someone’s caught with your images on their computer, because it’s linked to damages: you have the right to sue them for damages. She opted in, and by the time I saw her she had had over 1,500 notifications, and the police officer who worked with her told me that one of her images had been shared 70,000 times.
—
Neil Fairbrother
All children are vulnerable online, but some are more vulnerable than others. Award-winning author and researcher Adrienne Katz on vulnerable children in a digital world.
Adrienne Katz, Vulnerable Children in a Digital World, 22.2.2019
There are children who are vulnerable because they themselves have a special need or a disability that makes something difficult for them and their daily life. And there are children who are vulnerable because of the home life that they experience.
We were able to study five vulnerable groups.
We looked at children in care or leaving care, and we also included young carers in what we call family social vulnerability. Young carers turned out to be very vulnerable online because, as you can imagine, they spend lots of time at home. They can’t be out with their friends doing what teenagers maybe do, because they are caring for a very sick person at home, or helping with siblings at home, so they go online to have a teenage experience.
We looked at children with special needs, which included learning difficulties and a whole range of special needs including the autistic spectrum.
We looked at people with physical disabilities, which includes chronic and long-standing illness because you can imagine if you’ve got a really longstanding illness, your access to the Internet is fantastic. It’s your lifeline really.
We also included in the physical disabilities vision loss.
We have another category which was communication difficulties. And into that we have people with speech and language difficulties, hearing loss.
There was a particular interest in young people with hearing impairment because they had emerged in the Cyber Survey as doing quite daring things. They were sending images quite frequently, self-generated [intimate] images.
They were meeting up with people they only met online, which may have been safe, they may have been meeting other deaf teens in a good group, but it wasn’t always the case. It was as if the Internet facilitated a normal teenage social life for them, with the risk taking that that implies.
The other communication difficulty was not having English as your first language, because all the advice on staying safe is in English. The terms and conditions on most of the sites, if you even read them, are in English. Your parents then probably don’t speak such good English and may not be able to help you with this advice or navigate the English-speaking Internet.
So we included them because they came out as also having certain difficulties like cyber-scams. They fell prey to those kinds of things.
There is a hierarchy of risk in that some are more at risk than others. But the interesting thing is that they are at risk in different ways; they’re not all at risk in the same way. This was important because the analysis could predict these risks: it meant that if you are a front-line worker helping a young person, or caring for a young person, you could take extra care to support them in terms of that risk. You could really look for that, talk to them and prepare them better. Give them more targeted support.
—
Neil Fairbrother
One of the most momentous topics in British politics in recent years has been Brexit and March saw me at the House of Lords where I interviewed John Carr OBE where this topic came up as it did in some other podcasts.
John Carr OBE, Blinded by the Light, 6.3.2019
I am a consultant to the Council of Europe. I’m also a consultant to a global NGO based in Bangkok. I travel a lot around the world and I often visit small countries outside of Europe. One of the things that they almost all say is we can’t get anybody to listen to us.
You know, they ask me if I can help them make a connection with Facebook or Google or Microsoft or one of the big American companies, and of course I do because I work with these people and I know them. But it’s simply this, these small countries, small markets, the big companies don’t care that much. I mean, obviously they will never say it like that, but that’s the reality.
The European Union, different kettle of fish. They have to care what the European Union says and does because it’s the biggest single and richest market in the world and if we’re outside it, who knows?
I can remember, years and years ago before we formed the European coalition of children’s organizations, going as British groups to talk to big American companies and saying, we don’t think this is right. We want to explore this. And they would say to us, oh, it’s funny, you’re the only people saying this anywhere in the world. We never believed that, but they were able to say it.
Once we got our act together, and once we formed this European-wide coalition of children’s organizations, we knew perfectly well that what the children’s groups in the other European countries were saying is pretty much identical to what we were saying.
So being in big international blocks such as the European Union is incredibly important, and it’s another of the many tragedies that will follow if, in the end, we do Brexit, although I think we’re going to have to stay aligned, broadly speaking, to EU rules. I mean, if the United States and China are going to have to adapt their policies to make sure they can sell and work in Europe, I’m pretty sure that an independent Britain outside of Europe, outside of the European Union, is going to have to do the same.
—
Neil Fairbrother
Dr Holly Powell-Jones took us through aspects of British law as it pertains to children in the online digital context.
Dr Holly Powell-Jones, Online Media Law, 21.3.2019
The kinds of things that I cover in my training for schools and for young people, probably going in order of most important downwards, I would say. Firstly, we teach them about sharing indecent images, the law around what’s sometimes called sexting or revenge porn, stressing the illegal nature of sharing those kinds of images. We also talk about harassment online, online abuse and threats, kind of malicious or threatening or menacing communications online.
We talk a bit about hate speech, material used to incite hatred and violence towards people, for example, on the grounds of race, religion, or sexuality. We also talk about things like anonymity provisions, people who are entitled to an anonymity such as sexual offenses claimants or children involved in court proceedings and a bit about contempt of court. And as if that wasn’t enough, we also talk about civil rights online as well, that will include things like an introduction to defamation law, and a bit about privacy and copyright.
Neil Fairbrother
All of that sounds pretty fantastic, but if you place yourself in the context of a young teenager who is maybe 12 or 13 on social media, they’re not going to bear all of this in mind are they, before they send a message or something? They’re not going to go through a tick list of am I able to send this? Is this okay?
Dr Holly Powell-Jones
Well, they’re certainly not going to, if nobody’s ever told them about what is or isn’t illegal, that’s kind of where I’m coming from. Yeah, of course, certain things might be common sense, or might seem like common sense, not to send people threats. But actually, my research has shown that there are quite a lot of young people who maybe don’t see that as a criminal risk to the sender.
I’ve had a lot of students say to me things like, “Well that can’t be illegal because I see stuff like that online all the time”, and so the idea that their perceptions might actually not match up to the laws that we have is a really important gap in knowledge. I can’t promise that everyone who sits in my training will adhere to the law, but I certainly think it’s useful for them to know about it as a starting point.
Neil Fairbrother
Okay. So, the law versus terms and conditions. Now we know that there is a minimum age to be on social media, which is usually but not always 13, and that’s there because of another law, COPPA, which is a US-based law. If a child in the UK is under 13 and he or she is on social media, are they breaking the law or is it simply terms and conditions?
Dr Holly Powell-Jones
Well, this is what’s interesting as well because my understanding is that terms and conditions constitute an agreement and I’m pretty sure you can’t sign a contract until you’re 18, so we have these grey areas. We have commercial law like you talked about, advertising law, then we have criminal law…
But I think that where it gets complicated is what we mean by child and what we mean by adult. For example, in this country you are technically a child until 18, and there are certain things you can’t do till you’re 18; sign a contract, get married, etc. But you can be held criminally responsible from the age of 10.
You can’t legally own a pet until you’re 13, you can’t have sex until you’re 16 but you can’t share indecent images till you’re 18, so what we’re talking about here is a process of becoming an adult, which is stretched out over many, many years. And we’re trying to put on top of that legal rights, risks and responsibilities at different age stages. But I think it’s very confusing because why do we have such a difference between, for example, the age at which you’re allowed to have a contractual agreement with someone versus the age at which you could be prosecuted for a crime, for example.
Neil Fairbrother
So could you end up with a situation where a child argues the case that the contract with the social media company, i.e. the terms and conditions, are null and void because they’re too young?
Dr Holly Powell-Jones
That’s a really interesting question and one you would probably have to ask a lawyer, she says dodging it completely, because that’s not my area of expertise. But I think that would be a really interesting case to watch if that’s something that could be done.
—
Neil Fairbrother
Cyberbullying, although not defined in the Online Harms white paper, was defined by Martha Evans of the Anti-Bullying Alliance and Associate Professor Lucy Betts came up with a method for measuring it.
Martha Evans, Director Anti-Bullying Alliance, Choose Respect with the Anti-Bullying Alliance, 24.4.2019
The ABA has a shared definition of bullying. Bullying has four elements. The first element is that it’s repetitive, so it has to happen more than once. It is also intentional, so there has to be intent behind it. So that’s where you take out the elements where people talk about the horrible word “banter”, so it has to have intent behind it.
There also has to be a power imbalance, which is one of the most important elements of our definition of bullying, because you could have a power imbalance in many ways. You could be taller than somebody else, you could be the majority faith in a school. You could, by definition of being anonymous online, have power over somebody else because they don’t know who you are, for example. So the power imbalance element of the definition is very important.
And finally, it has to be hurtful, so you have to hurt somebody. So the definition is that bullying is the repetitive, intentional hurting of one person or group by another person or group, where the relationship involves an imbalance of power. And that could be offline or online.
Lucy Betts, Cyberbullying and how to measure it, 29.3.2019
This piece of research was conducted through two studies. In the first study, based on our previous work with young people, we gave them a whole range of items that linked to experiences of cyber-victimization, so being on the receiving end, and also bullying behaviours. And then through a process of statistical analysis, which I can say more about if you want me to, we refined those items down to two scales. The first scale was a cyber-victimization experience scale, and within that were three subscales. The first looked at experiences of threats in the cyber-world, the second was sharing images, and the third was experiencing gossip.
The other scale that we developed we call a cyberbullying behaviour scale, because this focused more on the behaviours that children were engaging in. There were also three subscales, but they were slightly different: sharing images, gossip, and personal attack. We then did a follow-up study to validate these subscales.
—
Neil Fairbrother
Age Verification was another big topic this year, with legislation being pulled at the 11th hour by the UK Government, just as the industry was about to implement a solution. Here’s Rudd Apsey of the Digital Policy Alliance explaining the point and purpose of Age Verification.
Rudd Apsey, Age Verification, 12.4.2019
Under the Digital Economy Act, which passed into law in 2017, Section 3 requires website owners serving content to UK citizens to have a form of Age Verification in place. It doesn’t specify what that methodology should be, but it does indicate it should respect privacy and anonymity.
So this isn’t about creating one central Government database where people register, which has been a method used by other countries in Europe; Germany in particular has a central registry methodology for accessing adult content. So at the moment we’re at the very edge of what, in the data world, they call probability. I don’t need to know your name, but I do need to know some indicators about you that tell me you are over 18. Don’t give me your exact age, but [tell me that] you’re over 18.
So that’s the sort of technical challenge. There’s a number of ways we can do that. We can search for credit information, or an electoral roll type of information, to check that you are who you say you are and your date of birth is correct, which is a similar sort of check that you would have gone through if you take out a loan or you do some banking. They always ask for your date of birth and passport.
There’s documentation that could be associated with you; passport and driving license [for example], there are different ways of sounding out your paper identity in the passport and driving license world. Two years ago, we were simply looking at modelling data, taking information from the passport itself and sense checking that against some other information that you’ve given us, so that we know in probability you’ve got that document on your person at that time. In the last two years that’s now moved on to where we can take a picture of those documents and extract your date of birth information from the document itself.
—
Neil Fairbrother
As a response to the Online Harms white paper, PA Consulting published a report called “A Tangled Web”. Nick Newman of PA Consulting explained the disturbing “tradecraft of paedophiles”.
Nick Newman PA Consulting, A Tangled Web, 7.5.2019
Well, when PA Consulting first became involved in this space working in the early years of CEOP, the Child Exploitation Online Protection centre, it was clear that paedophile groups were intensely private. They would typically be private individuals who were petrified that they would be caught. They were petrified that if law enforcement were to locate their homes or the places they were operating from, that the evidence of their crime would be there on their servers.
And what we’re seeing now is that these groups are able to communicate much more freely in these highly encrypted, secure sites on the dark web, safe in the knowledge that they are unlikely to be penetrated by law enforcement, and so it creates this “safe haven” where they can share tips about how to groom children. The most disturbing thing that we encountered when we were researching the global threat assessment was that the membership fee to be part of these networks is often to produce new abuse imagery. So if you can produce a video clip of a child being abused that hasn’t been seen before, or new photographs of a child that hasn’t been seen before, then providing that every month is your membership subscription fee.
—
Neil Fairbrother
Jonny Shipp of the Internet Commission discussed how AI can obfuscate and obscure online harms.
Jonny Shipp, Licensed to Operate with the Internet Commission, 24.6.2019
…the transparency is only useful insofar as you’re asking the right question. So if you’re asking the right question of the AI kind of environment, then maybe transparency has a role. I’m sure it has a role, but the more important thing is that the owners of AI technology are accountable for the way they deploy it, so that they can show that they are deploying it in a responsible way and that the social impacts are positive.
It’s absolutely right that the AI can be deployed. It may be good, may be bad. I believe that it should be the responsibility of the owners of the AI, the organisations that deploy it and actually use it for impact in the world to make sure that it’s positive for people and they should be accountable for ensuring that.
—
Neil Fairbrother
The Online Harms white paper and Brexit were both discussed with Baroness Sal Brinton, President of the Liberal Democrats.
Baroness Sal Brinton, Don’t go into the Dark Woods, 2.7.2019
In theory, this sounds straightforward. The reality is that most young people are very clever at finding a way round the age barrier to get onto their preferred social media platform. And again, the family often don’t know about it. And whilst it’s important that we hold social media companies’ feet to the fire to make them put in as much as they can to make sure that they check when people sign up, the reality is that that isn’t always going to work.
And then you’ve got to look at the response time when something goes up, or when they may have had a complaint about a dialogue that’s going on in a private bulletin board, between someone who might not be a child, putting other children under some sort of threat. The speed of their response and investigation is what’s really, really important, and we need whatever protections we can get. But social media companies can only respond; so much is going on that they cannot monitor everything. We are deluding ourselves if we think they can. They can’t. But we still need firm barriers, so things like registration and age proof, yes, they’re important.
I think my perspective is the chaos that we have in Parliament in trying to translate everything that’s currently in EU legislation, into laws in this country to make sure that we don’t inadvertently end up with a hole somewhere in the system.
So the current bullying legislation is UK legislation, but most of the social media legislation and the way social media companies have bases in the EU means they have to follow EU law. We need to make sure that regulations that relate to those are translated back into UK law as well.
In terms of our relationships elsewhere, I think the big thing that my colleagues talk about is the European Arrest Warrant. At the moment, being part of the EU, if somebody has committed a crime in the UK, or done some cyberbullying or cyber-crime against young people in this country, but lives abroad, we have the power to get them straight back through the European Arrest Warrant, without having to go through the very slow and long extradition processes that we have to use with other countries.
I don’t think Brexit in itself would change the role of organizations like the IWF, but if we are tying ourselves up in knots in Parliament with regulations and legislation, it becomes quite difficult to pick up if there is a problem because of a trade treaty with another country, that might mean that we couldn’t access social media companies.
—
Neil Fairbrother
Although social media companies get a lot of the blame for online harms, the Online Digital Context comprises much more than just social media companies. Here’s Andrew Kernahan, Head of Public Affairs for the Internet Services Providers’ Association discussing the “mere conduit” defence that is used by ISPs.
Andrew Kernahan, The Good Samaritan with the ISPA, 21.7.2019
So the “mere conduit” defence comes from the eCommerce Directive, which was implemented in the UK as the eCommerce Regulations. This gives protections to ISPs in that they are judged as a mere conduit: they are not legally liable for the activity transmitted on their network by their customers, unless there is another legal protection or piece of legislation that requires them to act.
So, for example, we have this mere conduit defence, but at the same time copyright infringement blocking takes place by a notice to ISPs to block access to sites that infringe copyright. But crucially, a Judge there is making a determination, or a Court is making the determination, as to whether that content needs to be blocked or not.
And I mentioned age verification as well. So again, we have a regulator that’s put in place to essentially pass blocking injunctions to ISPs to get them to block things that happen on their network. So it’s not a Wild West where ISPs are just given this mere conduit defence so they needn’t care about what happens on their network. But it does mean that where policymakers and society want ISPs and access providers to act, they need to do so, ideally using legal mechanisms, so that ISPs aren’t being the sort of judge and jury of online content.
—
Neil Fairbrother
David Wright, Director of the UK Safer Internet Centre at South West Grid for Learning (SWGfL), discusses the “digital age of consent”.
David Wright, Digital Ghost Stories and Blue Whales, 3.8.2019
So it is not illegal for anyone under 13 to have an account, for example, on Facebook or Whatsapp or Snapchat, although Whatsapp is 16. It’s merely a violation of their terms and conditions. The offence that’s committed under COPPA is by the provider: if you report somebody for being under 13 and the provider doesn’t take action, they’re then the ones committing the offence. So trying to provide some clarity around what this actually means: GDPR introduced what is essentially a “digital age of consent”, which ranges between 13 and 16 across Europe, depending on the decisions taken by each member state.
So for the UK it’s 13, so anyone over 13 can provide their own [digital] age of consent and verification, whereas in other countries, for example Germany, it’s 16. And that’s why we saw the changes that Whatsapp introduced, going from 13 to 16, the increase in the minimum age requirement for use of Whatsapp at the beginning of last year. So it is a complicated picture, and any form of clarity, particularly the focus and the support from the ICO, is always very welcome.
—
Neil Fairbrother
Professor Sonia Livingstone OBE discussed the Online Harms white paper, the FTC’s COPPA legislation, and whether the UN’s Convention on the Rights of the Child, the UN CRC, is fit for the digital age.
Professor Sonia Livingstone OBE, The Extraordinary Paradox, 12.8.2019
I’m very much in favour of there being an independent regulator. I think we’ve spent the last 10 or 20 years, many of us in the research world and also the policy world, exploring the possibility of industry self-regulation, and we’ve reached a point where public trust in industry is pretty low in relation to safety, privacy and harm. Any other domain has regulators, in the world of transport or food or health or whatever, but in relation to the digital world we haven’t got that, and the existing regulators are kind of saying “Not my problem”.
So I think there should be a regulator. I think it could be an existing regulator like Ofcom or the Information Commissioner’s Office (ICO), but it’s really important that it is trusted by the public, engaged with productively by the industry, transparent in how it works and sufficiently expert to properly know what’s going on and how the digital world is changing.
So I don’t think it’s an easy task, and I don’t think it’s easy for the white paper as it eventually becomes an Act [of Parliament]. I don’t think it’s easy to say how that’s going to work, but I think we have to try.
…what happens in this domain is that the technology moves faster than the regulation. So COPPA was developed in 1998 as a way of ensuring that if television companies were going to send advertising to children, the parents would have a right to say yes or no, as it were, to know that this was happening and that’s where the age of 13 was first set to say if a child is under 13, then the parents should know if the child’s getting advertising.
…any psychologist will tell you that roundabout 12 or 13 is generally accepted as a really kind of crucial turning point, both emotionally if you like as it is the start of puberty more or less, but also cognitively. There’s a lot of theoretical work that says roundabout that age children become more critically aware of the world that they’re in and are able to make better judgements about it.
So that’s also a good question. So just to be really clear, I think the Convention on the Rights of the Child is actually beautifully written and entirely relevant to the digital age. But what’s being written now is called a General Comment, which is a kind of a short text which explains it, which does the translation job, which explains how that convention written 30 years ago, how it applies.
So the Convention says the child has the right to be protected from harm or the right to participate or the right to have their privacy protected. So what does that look like in the digital environment where “privacy” has come to mean something about data collection and where “participation” has come to mean being on social networks and where “harm” has got all kinds of new meanings given the kind of risks that the Internet poses. So it’s like saying we need a translation document to really explain that the Convention doesn’t just apply to the offline child. It really applies to the whole child.
—
Neil Fairbrother
While grooming can be used for the sexual exploitation of children, it can also be used to radicalise them for terrorism, as in the case of Shamima Begum, as explored with Dal Babu.
Dal Babu, the Online Grooming of Shamima Begum, 2.12.2019
The processes are the same. It’s about identifying individuals that are vulnerable. So whether it’s a paedophile who does that, whether it’s a gang member, or whether it’s a radicaliser, there’ll be some sort of vulnerability. Some vulnerabilities will be the same, but they will identify those vulnerabilities. They will then engage with that individual. Then they will provide them with attention, provide them, in a perverted, twisted way, with what they might call “love”, a sense of belonging. And from there they’re able to manipulate those individuals.
I think what the paedophiles, the radicalizers, the gang members do is they will be on Twitter [for example], they will see exchanges from vulnerable individuals. That individual’s address, that Twitter name and address will be visible to them. They will then have an opportunity to monitor that and then identify those individuals that are likely to be susceptible to grooming, in their eyes.
Whether it’s about radicalisation, whether it’s about gang membership, whether it’s about grooming for any other purpose, for sex, gangs, and radicalisation, I think that’s the initial process.
Exactly. And I think not only do they identify them, they are then able to have a conversation in that child’s bedroom, on a device, a smartphone. It will be making sure that they build up that system where that child will not tell other people about what’s happening. It will be about saying, “look, this is our secret, don’t let me down” and sending them a gift, grooming them in whatever way they want to.
Certainly, with Shamima Begum, I think her latest interview talks about how she was tricked. I think she’s probably the most hated woman in Britain.
I’ve never met Shamima Begum, but I’ve worked with the families.
What’s really struck me about this was that she was a young girl. She hadn’t even got a passport, hadn’t been out of the country; she stole her sister’s passport in order to travel. She wasn’t even a practising Muslim before; she got radicalised and then became a practising [Muslim]. And I’ve seen pictures of her, she’s just like a normal youngster, doesn’t wear head covering or anything [like a] niqab. She’s just wearing normal western clothes. And then as time’s gone by, she’s been radicalised and has started practising the faith.
So, it just really struck me that while people had a lot more understanding of young girls, in particular young women, who had been groomed on the Internet and sexually exploited, they found it very, very difficult to understand how somebody could be groomed on the Internet [for radicalisation]. I think that’s probably a sobering lesson for all of us to understand, the power of groomers.
—
Neil Fairbrother
Carlene Firmin pondered whether Pierre Bourdieu’s social theory would apply online.
Carlene Firmin MBE, Contextual Safeguarding, 15.12.2019
Potentially, because you would have social rules at play in various different online platforms, and people’s behaviour changes platform by platform: they might do something on one social media platform that they wouldn’t do on another, or whatever.
Their social capital is who they know: the number of friends you have, or who you’re friends with on social media, will give you a certain amount of status in an online setting, for example.
Young people’s economic capital may have an impact online in terms of whether they can afford to buy certain access to different gaming sites or upgrades or whatever it is. And young people’s cultural capital, their understanding of the language, the cues, the do’s and don’ts – do you post that image? Do you not post that image? Do you comment on that? Do you use a pseudonym for that? – will all have an impact on safety in the online world.
So you can definitely utilise Pierre Bourdieu to explore those things, in the same way that you would say, what are the cultural cues in the youth club? Do you understand them? Who do you need to know when you’re there? Who are your friends? Do you have the money to participate in the activities that everyone else is participating in at the youth club?
—
Neil Fairbrother
We explored some Ecumenical questions with Father Hans Zollner, President of the Vatican’s Centre for Child Protection.
The 11th Commandment with Father Zollner 5.10.2019
Neil Fairbrother
Safeguarding as a spiritual matter sounds very interesting indeed. What do you mean by that?
Father Zollner
We mean that people of all faiths, and we are of a Christian faith, need to realise, and need to live up to the fact, that if we believe in God, and if we believe in Creation, and if we believe that human beings are created in the image of God, there is no bigger commandment than to love one’s neighbour as we love God and ourselves.
So I believe that when we talk about spirituality, it means that it is the outlook on life and the outlook on relationships, on how we deal with people, how we try to work so that people are not exploited in any way. So I think this is one of the aspects of the spirituality that is implied. If you consider safeguarding of the whole creation, of which Pope Francis has reminded us so many times, that of course includes also all human beings and especially the ones who are more in danger of being exploited and being abused.
Neil Fairbrother
You mentioned confession, so if a member of your congregation during confession discloses that he or indeed she has been abusing children as perhaps a first step towards repentance, what are the duties of the clergy in this respect?
Father Zollner
The priest who hears a confession needs to tell the person that he or she needs to report him or herself to the police… priests are not obligated to give the absolution if they have doubts about the honesty of the confession.
Neil Fairbrother
I read a very interesting blog post on the CCP’s website and the author, which may have been you actually, I’m not sure who the author was, says, “It seems like society underestimates the power of the internet and the rate at which evil can spread in its innumerable forms”. Evil can be defined as ungodly or irreligious. It’s the very opposite of all the values of your faith. Can someone who attends Mass, takes the Holy Sacrament, goes to Confession even, and yet commits these evil acts be a true Roman Catholic or are they just acting a part? It’s a surface gloss with a rotten core?
Father Zollner
Yes, of course. As human beings, be it Catholics, Christians in general, any believer, we, all of us, are caught in the tension between what we promise, what we strive for and what we really do. Jesus has come precisely to call the sinners, not the perfect ones; and those bringing him to death were those who believed that they were the perfect ones.
This is not to condone sins or crimes, but human reality is that evil will always be here. And this is another current in the discussion that I really find quite disturbing, that people think with even more perfect measures and even more perfect technology and even more perfect guidelines and laws we will do away with evil. I find this is a very dangerous illusion.
Neil Fairbrother
Yes. Now talking about evil, if these acts are evil and the Devil is defined as the supreme spirit of evil, is this abuse of children, the Devil’s work?
Father Zollner
Evil is a reality that is impossible to describe. Ultimately evil is committed by human beings. Evil deeds are committed by human beings. You see that sometimes evil deeds are committed that are beyond understanding. Why, why would any human being do that to another human being? So there are aspects that go beyond our understanding and our reason. I think we need to realise that all of us are somehow unable to detect the concrete roots of what is going on.
—
Neil Fairbrother
What have I learned from all of this? What are my conclusions? Well, there are a lot of fantastic people doing a lot of fantastic work that should not need to be done. Had the social media companies in particular really thought through how human nature would subvert their high ideal of everyone being connected in a global village, had they adopted a “safety by design” approach from the outset, then social media would be very different from what it is today.
In fact it may not even have been launched in the first place.
I’d like to think that Zuckerberg’s famous “Move fast and break things” didn’t deliberately mean breaking children’s lives, but that’s what’s happening. Somehow, all the players in the online digital context, the social media companies, the internet service providers, the mobile network operators, the handset manufacturers, the law makers and legislators, the candlestick makers, all need to find a way to retrofit the kind of safety features and legislations that should have been there in the first place.