Safeguarding Podcast – Jane, with Glen Pounder, COO Child Rescue Coalition

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother exploring the law, culture and technology of safeguarding children online.

In this Safeguarding Podcast with Glen Pounder, COO Child Rescue Coalition: Apple’s CSAM detection announcement, the impact on privacy, real-time filtering in iMessage, Private Relay, Threshold Secret Sharing’s weird 30-file trigger, photoDNA and hashing, real-time streaming, and Jane.

There’s a lightly edited for legibility transcript below for those that can’t use podcasts, or for those that simply prefer to read.

Neil Fairbrother

At the SafeToNet Foundation, we’ve long wondered why device manufacturers couldn’t, or don’t, make more use of the amazing technologies in their smartphones to do more to protect children online. It seems to us that this is the most logical place in the entire online digital context for proactive, protective tools to be deployed. Relying on cloud-based services alone requires an historical or retrospective analysis, which means it’s too late. The abuse has happened, and therefore there’s a victim, often in a world of pain.

Apple’s recent CSAM detection announcement has shown some of what can be done on a device, but it’s complicated. To help guide us through this, I’m joined by Glen Pounder of the Child Rescue Coalition.

Welcome back to the podcast, Glen.

Glen Pounder, COO Child Rescue Coalition

Thanks so much Neil and it’s good to talk to you this morning.

Neil Fairbrother

Could you just for the benefit of our global audience give us a refresher of your background and expertise please.

Glen Pounder, COO Child Rescue Coalition

Yeah, so I have a background principally in law enforcement for three decades, and then for over two years I’ve been the Chief Operating Officer at the Child Rescue Coalition, a non-profit that provides technology free of charge to law enforcement to allow them to proactively investigate criminals in their jurisdiction.

Neil Fairbrother

Okay. Now we recently recorded a podcast with you on those very technologies, but could you give us a very quick two minute synopsis of what they actually are and what they do?

Glen Pounder, COO Child Rescue Coalition

Yeah, in essence, we monitor unregulated spaces on the internet often called the deep or dark web, and we provide an investigative platform to law enforcement. They receive a three-day training course and it allows law enforcement to see the targets in their jurisdiction and then proactively investigate those with a dangerous and also illegal sexual interest in children.

Neil Fairbrother

Okay. Now as I mentioned, Apple, one of the largest and most valuable companies on the planet and obviously extremely well-known for their iPhone series of smartphones, has recently launched a solution, or at least the first stage of a solution, for trying to address the appearance of CSAM, or Child Sexual Abuse Material, on their ecosystem, which extends, I believe, their search for this content out from the cloud, or the internet, into the devices themselves. And this has stirred up quite some controversy. So Glen, before we get into the detail, what is your overall view of this move by Apple? Is this to be welcomed?

Glen Pounder, COO Child Rescue Coalition

Yeah, there is a lot of detail here to be discussed. But overall, it is, in my opinion, absolutely to be welcomed. I know on our previous podcast we very briefly touched on both Apple and Google with regards to the device level, and we kind of touched on it and then went away from it because I think, you know, we were probably both of the opinion that they’re probably not going to do this. So to see them doing something productive in this space is very encouraging news.

Neil Fairbrother

Yeah, I think so. As I say, it’s something that we at the SafeToNet Foundation have long pondered: what could be done on the device? And I think there are a lot of good things here, and I think there are some strange things here as well. But what is your take on some of the good things, assuming you have any, Glen?

Glen Pounder, COO Child Rescue Coalition

I certainly do. I think from what I read of the protective measures for young children, those under 12 in particular, images will not be rendered where Apple’s apparent AI identifies those images as being potentially nude, potentially offensive. And I really like the steps that Apple’s then taken to encourage children: “Hey, are you sure you want to see this?”

Neil Fairbrother

Yeah, this is in their messaging app, I believe?

Glen Pounder, COO Child Rescue Coalition

It is apparently in their messaging app, to the child. “Hey, are you sure you want to see this?” If the child says yes, well, “Are you really sure? Because we’re going to let your parent know.” I think that’s a way of encouraging the child to really think about the next step being taken, which is great. Honestly, when you hear about the number of women who receive unsolicited images, I can see some women who might want to say, actually, I’d like to add that feature to my device. But I don’t think that’s an option Apple are considering right now. In terms of child protection, though, it’s very positive.

Neil Fairbrother

Yeah. Well, the adult case is, I think, a different case from children. I think children are very much a special case. We quite like that as well, and I think the reason we like it above all is that we’ve long maintained, having discussed this issue in quite some detail with people such as Sonia Livingstone and Stephen Balkam from the American Family Online Safety Institute, that it’s good to get parents and carers involved with children in the discussion about children’s online lives, not intrusively so. But I think that this is a technical move that can aid that conversation to take place.

Glen Pounder, COO Child Rescue Coalition

Yeah, absolutely. And again, with the size of the resources at Apple and the number of I’m sure very large brains that work in there, then this seems like a very, very positive step.

Neil Fairbrother

Now the big sort of $64 million question seems to be one of privacy. Apple has long made a very strong play of protecting their users’ privacy, to the point where they seem to be declining requests from American law enforcement, and possibly others around the world, to get access to devices through some kind of backdoor. As far as you’re aware, does this, as some people claim, break Apple’s privacy promise, or is that privacy promise still intact?

Glen Pounder, COO Child Rescue Coalition

For this aspect of it? I don’t think it goes anywhere near breaking Apple’s privacy promise. Not at all. It’s certainly not any kind of access for law enforcement. It seems very much, if you like, a family contract between a parent and a child, and again, I would say that a child of that age needs that extra level of protection, and certainly if my kids were that young, I would very much welcome this additional resource on my child’s phone.

Neil Fairbrother

We have to bear in mind here that although the legal definition of a child is, generally speaking, anyone under 18, this is really addressing the thirteens and younger, because there is no really effective age verification process on any of these social media platforms. And we know that, although 13 is often the minimum age, there are plenty of under-thirteens on these platforms who simply are not worldly-wise enough to discern exactly what’s going on and who they’re talking to.

Glen Pounder, COO Child Rescue Coalition

Yeah, absolutely that. And of course, with those very young children, it is absolutely the parent’s right to know what’s going on with their child’s device, and anything that, if you like, makes that easier for parents is great.

I’m honestly not sure whether or not it would work for apps that are installed on that phone. My impression is that it does not, which again is, I think, unfortunate, because I’d love to see that at the device level, so that no image would be rendered by the device without that safety in place for a young child. Because let’s remember, a child having that device is often a matter of safety as well, right? If the parent is away from the child, the child can instantly call the parent if it has a device at 11 or 12 years old.

Neil Fairbrother

Yeah, indeed. Absolutely. And by rendering the image, you mean displaying the image on the screen?

Glen Pounder, COO Child Rescue Coalition

Yeah.

Neil Fairbrother

Okay. Now one of the things that struck us was that Apple seem to have chosen to use a very well-known piece of technology to underpin this, as far as image detection is concerned, which is called photoDNA. And we, I think, like that idea because photoDNA is mature. It’s tried, it’s tested, it’s been around for a long time now, and it’s got a long track record of success in finding hashed CSAM images. And also, I’m not sure if this is quite the right phrase to use, but it’s approved for use in some ways by the EU’s ePrivacy Temporary Derogation, which we’ve covered in previous podcasts. What’s your view on photoDNA and Apple using it?

Glen Pounder, COO Child Rescue Coalition

If we’re stepping away now from what I understand the AI to be, which is more about an image, frankly, of a naked penis, I think that’s what it would block with regards to that child: the rendering of the image on the device, in iMessage.

Moving over to the photoDNA space, photoDNA is highly accurate. And when I say highly accurate, I mean it’s in the realms of being far more accurate than human DNA. I’m happy to expand on why that’s so important if that’s useful.

Neil Fairbrother

Yes, perhaps you could, that would be fantastic.

Glen Pounder, COO Child Rescue Coalition

There are, well, not hundreds but thousands of people now in prison through the use of the analysis of human DNA, and that’s because human DNA is so unique. Now photoDNA is far more unique, and by that I mean far more accurate. So think of those people who were put in prison through the use of human DNA as evidence, and, more importantly, those people who are not in prison because they’ve been exonerated through an analysis of human DNA; the chances of it being wrong are so infinitesimally small as to make no difference. photoDNA is more accurate still, to the tune of several more zeros. So there’s a one in billions chance of photoDNA being wrong, which is why it’s such a great industry standard. And it’s so great that that work was done back in 2009 to develop the photoDNA technology.

Neil Fairbrother

Yeah, the comparison with real-world DNA is a good one because of course these photos are graphical representations, graphical recordings, of actual crime scenes. And we’re all familiar with DNA being found at crime scenes in the offline world. This is simply extending that principle and practice to the online space.

Glen Pounder, COO Child Rescue Coalition

Very much so. And the fact that it’s many times more definite than human DNA indicates how the machines can really do the work for humans here, because the machines are not going to be wrong, basically. The reasonable man in the street is, you know, something that is used in Commonwealth countries to prove cases “beyond reasonable doubt”, a phrase that you’ll hear on TV when somebody is found guilty of a crime. Well, photoDNA is far more accurate than beyond reasonable doubt.

Neil Fairbrother

Okay. Now the way this works is by hashing an image. What does that actually mean to the man in the street?

Glen Pounder, COO Child Rescue Coalition

Yeah, so hashing the image applies a mathematical formula to that image, by which I mean the individual pieces of that image are hashed, are given a “digital fingerprint”, which is the easiest way to think about it. The only way to match against that digital fingerprint is for another image to be either exactly the same or very, very, very close. And I’m happy to expand on why that’s important in terms of my concerns around what some of Apple’s changes mean.

Neil Fairbrother

Yes please do Glen.

Glen Pounder, COO Child Rescue Coalition

So if there’s a match in photoDNA, even if one or two of the pixels have been changed, photoDNA will still say that is a match. Now that could just be a slight change of the colour, a slight change in one or two of the thousands and thousands of pixels, but in essence the image will still be, in this case, child sexual abuse material.

Neil Fairbrother

My understanding is that one evasive tactic that purveyors, peddlers and collectors of this kind of content use is to doctor the images somewhat. So they will change the colour grading, they will crop the image. And what you’re saying, I think, is that photoDNA is smart enough to recognize those changes and still classify it as a CSAM image related to the original one.

Glen Pounder, COO Child Rescue Coalition

Yes, exactly right. Which as you said earlier, is the digital evidence of a serious crime.
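PhotoDNA itself is proprietary, and Apple’s on-device system uses its own NeuralHash, so neither is reproduced here. Purely as an illustration of the “digital fingerprint” idea Glen describes, though, a minimal “average hash” sketch shows how an image can be reduced to a compact fingerprint that survives small edits such as a slight colour change, while unrelated images produce very different fingerprints. The file names below are hypothetical.

```python
# Illustrative perceptual hashing sketch: NOT PhotoDNA or NeuralHash, just a
# minimal "average hash" to show how a digital fingerprint can tolerate tiny
# pixel-level changes while still distinguishing unrelated images.
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink to hash_size x hash_size greyscale, then set one bit per pixel
    depending on whether it is brighter than the image's mean brightness."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # a 64-bit fingerprint for the default hash_size of 8

def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits; a small distance means a near-duplicate image."""
    return bin(h1 ^ h2).count("1")

if __name__ == "__main__":
    # Hypothetical file names, purely for illustration.
    known = average_hash("known_image.jpg")
    candidate = average_hash("slightly_edited_copy.jpg")
    # A tolerance of a few bits out of 64 treats minor colour or crop tweaks as
    # a match; unrelated images typically differ by 25-35 bits.
    print("match" if hamming_distance(known, candidate) <= 5 else "no match")
```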

Neil Fairbrother

Okay. Now my understanding also is that the Apple solution is using the on-device analysis, not to rummage through the entire contents of a photo library on an iPhone, but it’s to see if it can find evidence of these hashes, is that correct?

Glen Pounder, COO Child Rescue Coalition

My understanding of it is, from a privacy perspective, Apple’s gone a step further than that to only check images when they are connected to the iCloud. So if somebody is on their Apple device and decides not to store those images in the iCloud, then Apple will have no notification at all.

Neil Fairbrother

Okay. Irrespective of whether or not that’s a CSAM image? So you might have someone there with these illegal and damaging images on their phone who doesn’t share them to iCloud, and in that instance Apple’s CSAM detection tool won’t be triggered.

Glen Pounder, COO Child Rescue Coalition

That’s right, exactly right. That’s how I read it anyway and I’ve read it a lot!

Neil Fairbrother

Indeed. Okay. So that’s all well and good for privacy. But let’s assume that someone does attempt to upload it to iCloud. I believe that what happens next is that the “hash match” takes place. In other words, the hash that’s found on the image is compared to a database of other hashes, or codes, or digital fingerprints provided by NCMEC amongst others, and if there’s a positive match, this then goes to human arbitration. Someone will manually inspect the image.

Glen Pounder, COO Child Rescue Coalition

That’s not how I read it. How I read it is that the matching takes place by machines, and then the machines will not refer it to a human moderator at Apple until there have been 30 matches. The reason that’s so crucial comes, again, from our cases at the Child Rescue Coalition. One of the things I did yesterday was to check, and it turned out there were scores and scores of cases, probably hundreds, where investigators using our technology are identifying and arresting these dangerous criminals, and where the number of matches is less than 30.
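A highly simplified, hypothetical model of the flow being discussed here looks something like the sketch below. Apple’s published design does this cryptographically, with on-device blinded hashes, private set intersection and threshold secret sharing, precisely so that nothing about individual matches is visible server-side below the threshold; the sketch ignores all of that and only models the counting behaviour under debate. All names are illustrative.

```python
# Hypothetical, simplified model of the per-account threshold under discussion:
# each photo is fingerprinted at iCloud upload time, compared against a set of
# known-CSAM hashes, and the account is only referred for human review once the
# match count reaches the announced threshold of 30. Apple's real system hides
# below-threshold counts cryptographically; this sketch does not attempt that.
from collections import defaultdict

MATCH_THRESHOLD = 30                      # Apple's publicly stated starting point

known_csam_hashes: set[int] = set()       # fingerprints supplied by NCMEC and others
match_counts: dict[str, int] = defaultdict(int)

def on_icloud_upload(account_id: str, photo_hash: int) -> None:
    """Called, hypothetically, for each photo an account uploads to iCloud Photos."""
    if photo_hash not in known_csam_hashes:
        return                            # non-matching photos are never flagged
    match_counts[account_id] += 1
    if match_counts[account_id] == MATCH_THRESHOLD:
        refer_for_human_review(account_id)

def refer_for_human_review(account_id: str) -> None:
    """Only at this point would an Apple moderator confirm the matches and, if
    confirmed, file a report to NCMEC."""
    print(f"Account {account_id}: threshold reached, queued for manual review")
```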

Neil Fairbrother

Okay. Now this falls under the Threshold Secret Sharing feature, or component, of this overall solution. And what Apple say is that the reason for that threshold is to help eliminate or reduce the likelihood of a false positive. In other words, a mistake happening, and someone being accused, if you will, of having a CSAM file on their device which actually isn’t a CSAM file. And Craig Federighi, I think Apple’s Senior Vice President of Software Engineering, has publicly said that this number is a starting point and that it will reduce in time.

Glen Pounder, COO Child Rescue Coalition

Yes, I read the same thing. But if we step through the process of what happens, Apple wouldn’t be accusing anybody of anything, right? So if there was one match, and let’s remember the chances of that not being CSAM are infinitesimally small, like beyond any reasonable doubt that would be a match, that’s only to refer it to somebody in Apple to moderate it, to say, is this a match for CSAM?

So let’s say they did set it at one, okay? The threshold was one. Well then, the human moderator at Apple would see that image and confirm whether or not it was child sexual abuse material and only then would it be referred to our friends at the National Centre for Missing and Exploited Children. Even then nobody is accusing anybody of any crime.
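The “Threshold Secret Sharing” Neil mentions is a standard cryptographic idea: every match contributes one share of a secret (in Apple’s description, the key needed to open the matching “safety vouchers”), and the secret can only be reconstructed once at least the threshold number of shares exists, while fewer shares reveal nothing. As a minimal, generic illustration of that t-of-n property, not Apple’s actual protocol or parameters, here is a textbook Shamir secret-sharing sketch.

```python
# Minimal Shamir secret-sharing sketch illustrating the threshold property:
# any 30 shares reconstruct the secret, 29 or fewer are effectively useless.
# Generic textbook construction only; not Apple's implementation.
import secrets

PRIME = 2**127 - 1  # prime field large enough for a toy secret

def split_secret(secret: int, threshold: int, num_shares: int):
    """Points on a random degree-(threshold-1) polynomial whose constant term
    is the secret; one point per share."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0; correct only with >= threshold shares."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

if __name__ == "__main__":
    secret = 123456789                          # stands in for a voucher decryption key
    shares = split_secret(secret, threshold=30, num_shares=1000)
    print(reconstruct(shares[:30]) == secret)   # True: 30 shares recover the key
    print(reconstruct(shares[:29]) == secret)   # (almost certainly) False: 29 reveal nothing
```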

Neil Fairbrother

Okay. And what would NCMEC then do with that report?

Glen Pounder, COO Child Rescue Coalition

And I think this is part of the reason why it’s so crucial to discuss this threshold, because, as you know, and as probably most of your listeners know, NCMEC are the clearing house for these types of reports from Facebook and Gmail and Hotmail and everybody else.

So let’s say there’s been a report from Facebook and now another report of one image from Apple. Well, in terms of triage, when they refer those cases on to law enforcement for further assessment, again before anybody has been accused of anything, if a potentially lower-level risk has been presented by Facebook, and law enforcement, when they receive the cyber tip, don’t know anything else about what’s being reported, then that person may never be investigated, ever, right? Because law enforcement has too much work.

We all know this is not really just a law enforcement problem, but law enforcement have too much work in this space.

If law enforcement, via the cyber tip, find out that not only was there a report from Facebook, but also, over here, between one and 29 CSAM images identified through Apple’s good work, well then, triage-wise, when deciding which suspect to investigate next, that could push that target into being investigated by law enforcement.

And again, it is of course about finding the offenders who are consuming this horrendous material, and perhaps we really need to focus on what it is we’re talking about, but it’s also really about rescuing children from being raped.

Neil Fairbrother

It is. Now, what information does Apple, to the best of your knowledge, provide to law enforcement? I mean, for this to be effective, one would imagine that Apple would have to provide quite a lot of information about the account that has these images. Apple must know the ID of the account holder; iCloud accounts aren’t free, so they must have their credit card number. So does all of that get handed over to law enforcement, do you know?

Glen Pounder, COO Child Rescue Coalition

I do know. I used to deal with cyber tips and I never saw a cyber tip from Apple. I don’t know if that’s because, to take 2020 as an example, they reported 265 total leads to NCMEC. So not 265,000, but 265 total. It’s quite a few years since I’ve seen a cyber tip, but I’ve never seen one from Apple.

They are not compelled by law to report all those extra details, which I’m sure they do have, and quite often what generally happens is law enforcement, having had the initial tip, often needs to serve a search warrant back on to Apple to ask for those further details that you’ve just mentioned. So again, I think this is very important in terms of the process, because nobody’s been accused of anything, right? Law enforcement still has an investigation to do for evidence to be gathered, et cetera.

Neil Fairbrother

Okay, I take your point then about the 30-match threshold. Is one image enough evidence to convict someone of in fact storing or sharing this kind of illegal and offensive imagery?

Glen Pounder, COO Child Rescue Coalition

Yeah, Neil, that’s a great question. Although one image could be enough, after the law enforcement process and the judicial process, to convict somebody of possession and intent to distribute child sexual abuse material, it goes much further than that, because I know of cases where our technology was used by law enforcement to identify the location, where that one image in our system led to that house, and the offender was found to be sexually abusing his own daughter. She was 18 months old.

And so when we think about cases like that, that’s what concerns me with regards to the threshold. So again, one image was the absolute minimum crime that we knew about in our system. But law enforcement often begin their investigation when they arrive at the premises, to figure out: okay, we know about this. What else has this potential offender done, right?

Neil Fairbrother

18 months old?

Glen Pounder, COO Child Rescue Coalition

18 months old. Yup.

His wife was at work and he was abusing his own daughter. And the evidence was found at the premises, on the devices of course.

Neil Fairbrother

Okay. Now you wrote in a blog post last week that “Apple is designing in a way to automatically mask a user’s IP address and make it impossible for Apple, the internet service provider, or anyone else to trace the user, even when a judge agrees that they have to be identified.” Is that still your understanding?

Glen Pounder, COO Child Rescue Coalition

Yeah, that is my understanding. And in terms of this threshold, I don’t know, and I have actually asked a prosecutor friend of mine to look at this, but it seems to me that Apple have designed a way so they themselves will not be notified until the threshold is met. And we know the threshold right now is 30.

I don’t understand how that can be legal, because their machines will know that there’s been one match, two matches, three matches and so on, right up to 30. Well, those machines belong to Apple. So in terms of the law, when they have identified known CSAM, when they themselves have identified that, they are compelled to report those crimes to NCMEC.

Neil Fairbrother

Yeah, I’ve been reading through one of the documents that they’ve released, which is called the Security Threat Model Review of Apple’s Child Safety Features, and they do refer to this. Page 10 of the document, actually, gives a description at least of the principles behind their thinking. And what it seems to be focusing on is this false positive rate. What they say is extremely long, and I can’t read it all out, otherwise we’d be here all day just on that particular piece, but they are concerned about safety margins for individuals. Obviously, they’re concerned about online child safety, otherwise they wouldn’t be doing this. Working back from this really low rate of false positives, I think that’s the way to put it, they’ve settled on an initial 30 images because that’s where the maths takes them.

But it does say in this document, and as I said, Craig Federighi mentioned this in the Wall Street Journal or a Washington Post interview, “Since this initial threshold contains a drastic safety margin, reflecting a worst case assumption about real world performance, we may change the threshold after continued empirical evaluation of NeuralHash [which is the technology they refer to] false positive rates, but the match threshold will never be lower than what is required to produce a one in 1 trillion false positive rate for a given account.”

Now, false positives would be a plague on this system, because they would just devalue it and make the whole thing nonsensical. So it seems as if what Apple is saying is that there has to be a small area of doubt, and that small area of doubt will decrease over time.
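To make the arithmetic behind that “one in one trillion” target concrete, here is a back-of-the-envelope sketch. The per-image false-positive rate and library size used below are assumptions chosen purely for illustration, not Apple’s published parameters; the point is simply how quickly a 30-match threshold drives the per-account probability down.

```python
# Back-of-the-envelope illustration of how a tiny per-image false-positive rate
# combines with a 30-match threshold. The rate and library size are assumptions
# for illustration only; Apple publishes the one-in-a-trillion per-account target,
# not NeuralHash's exact per-image rate.
import math

def prob_at_least(threshold: int, num_photos: int, per_image_fp_rate: float) -> float:
    """P(at least `threshold` false matches) for an account with `num_photos`
    photos, using a Poisson approximation to the binomial (accurate when the
    per-image rate is tiny). Terms are computed in log space to avoid underflow."""
    lam = num_photos * per_image_fp_rate        # expected number of false matches
    total = 0.0
    for k in range(threshold, threshold + 200): # later terms are negligible
        total += math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))
    return total

if __name__ == "__main__":
    # Assume a per-image false-positive rate of 3 in 100 million and a library
    # of 100,000 photos (both hypothetical):
    print(f"P(one or more false matches):       {prob_at_least(1, 100_000, 3e-8):.2e}")
    print(f"P(crossing the 30-match threshold): {prob_at_least(30, 100_000, 3e-8):.2e}")
```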

Glen Pounder, COO Child Rescue Coalition

I agree. That’s a reasonable interpretation of what they say, but when we look at what a trillion is, that’s 12 zeros, okay? So that’s way, way, way beyond what anybody in the street could term “reasonable doubt.”

Let me put it to you another way, okay? If your child’s infant school teacher has 29 child sexual abuse images of five and six year olds being raped, would you, or would you not, like that report to go to the National Centre for Missing and Exploited Children? By adding those zeros, by making it one in a trillion, personally I think it goes way beyond what the man in the street would think was reasonable.

Neil Fairbrother

Do you think they’re just being overly cautious due to the extreme sensitivity of this particular issue?

Glen Pounder, COO Child Rescue Coalition

It is really hard to figure that out. And again, unfortunately we can’t all interview the VP of Apple, but of course we’d all like to hear those same things answered because I feel like he, and many of the Apple employees and employees of other tech companies would have those same concerns, right?

Neil Fairbrother

Yeah. Indeed. Tell us about Private Relay, Glen. What is Private Relay, which is another component of Apple’s solution?

Glen Pounder, COO Child Rescue Coalition

Yeah, the Private Relay element, in essence, at least from what I read, seems to turn Safari into a browser that would be completely untraceable. Again, from a privacy perspective I understand that, but if law enforcement are not able to identify an IP address for a suspect because it’s being deliberately cloaked, well, personally, I can see that being a wholly separate, but still dangerous, problem.

Neil Fairbrother

So an inadvertent consequence of what Apple are trying to do with online child safety might actually end up protecting the privacy of predatory pedophiles?

Glen Pounder, COO Child Rescue Coalition

It certainly could. I’m glad you mentioned privacy. You know, we often seem to focus on privacy, and for good reason, but there seems to be very little thought for the child’s privacy. So, you know, again, coming back to this threshold, what about those children being revictimized because there’s been a conscious decision not to report matches one through 29, even though the math would tell us that almost certainly each will be a match? And again, this would only be to refer it to an Apple employee who says, oh my God, that is a terrible image, it needs to be reported to NCMEC.

Neil Fairbrother

Well, yeah, I mean, that’s yet another pair of human eyes seeing these images, which for the child victim is traumatic knowledge to have: even though the people seeing these images are trying to do good, it’s still another person seeing these highly intimate images of children.

Glen Pounder, COO Child Rescue Coalition

It’s absolutely the case. And there have been cases here in the US where, because of the accuracy of the likes of photoDNA, a photoDNA match has been argued in court to be the equivalent of the human eye confirming the CSAM. In one case in particular, it allowed law enforcement to review that same image without going back to the service provider for a warrant. Because again, if there’s a one in a billion chance that this is wrong, that the machines have got it wrong, well, any reasonable man in the street would say, one in a billion? Yeah, I’ll take those odds all day long as being reasonable steps to take in order to protect children.

Neil Fairbrother

We’ve talked about PhotoDNA, which, as we’ve said, is a tried and tested, well-known tool. It’s been around for at least a decade, and it has secured convictions. But there is one fundamental weakness with it, and the clue’s in the name “PhotoDNA”. As far as I’m aware, and according to Hany Farid who invented it, it is blind to video.

Now I don’t know what the distribution curve is for this kind of content, but I assume that video is highly attractive to the purveyors and collectors of it. iPhones and other smartphones produce 4K, highly detailed video content, and in slow-mo mode can run at 120 frames a second, which is a vast number of images over a few minutes of video, yet photoDNA would be blind to this kind of content. So one assumes, therefore, that Apple’s current implementation of their CSAM detection is also blind to video content.

Glen Pounder, COO Child Rescue Coalition

Potentially. I mean, again, that’s not a detail that I’ve seen in there. I know some organizations are talking about the equivalent of photoDNA for video, going down to the frame level to do the matching. Again, I don’t know what Apple have done. Certainly, as I think I covered previously, we’ve done some in-depth work on that with a company, analyzing videos and where they’ve been changed, and we got a hundred percent match rate on video content.
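The frame-level matching Glen mentions is the usual way the photoDNA idea gets extended to video: sample frames, fingerprint each one, and compare those fingerprints against a known set. The sketch below, assuming OpenCV and the same kind of simple average hash as the earlier image example, is only a rough illustration; production systems (and the work Glen describes) are far more robust to re-encoding, cropping and scene edits.

```python
# Rough sketch of frame-level video matching: sample every Nth frame, compute a
# simple perceptual hash for it, and flag the video if any sampled frame is close
# to a known fingerprint. Illustrative only; real systems are far more robust.
import cv2

def frame_hashes(video_path: str, every_nth: int = 30):
    """Yield a 64-bit average hash for every Nth frame of the video."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_nth == 0:
            grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            small = cv2.resize(grey, (8, 8))
            mean = small.mean()
            bits = 0
            for value in small.flatten():
                bits = (bits << 1) | (1 if value > mean else 0)
            yield bits
        index += 1
    cap.release()

def video_matches(video_path: str, known_hashes: set, max_bit_diff: int = 5) -> bool:
    """True if any sampled frame is within max_bit_diff bits of a known fingerprint."""
    for h in frame_hashes(video_path):
        if any(bin(h ^ known).count("1") <= max_bit_diff for known in known_hashes):
            return True
    return False
```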

Neil Fairbrother

Okay. Now some social media organizations have hit the headlines, for example WhatsApp, whose CEO, I think his job title is, Will Cathcart, said that they won’t be adopting Apple’s CSAM detection system, which I think you alluded to earlier. If an app developer has the option to use this or not, then it doesn’t seem to be mandated across Apple’s ecosystem, and people can opt out of it.

Now it depends really, I suppose, on what Apple’s aim is. If Apple is aiming to rid their iCloud ecosystem of this kind of content, then what they’ve done seems to work. But if they’re aiming to eradicate CSAM from all of the internet, including social media, then this non-compulsory use of this feature by apps in their app store seems to be a bit of a get out because people who want to share this stuff can simply move off the Apple platform and use something else such as WhatsApp.

Glen Pounder, COO Child Rescue Coalition

Yeah, that’s absolutely my understanding as well. And again, you know, I think Mr. Cathcart’s submission really seemed to focus on privacy rather than child safety. And we know that amongst those reports last year there were 400,000 reports from WhatsApp to NCMEC around suspected child sexual abuse. I don’t know how many of those 400,000 were found by WhatsApp themselves, or whether people using WhatsApp reported those suspect images to WhatsApp for onward reporting to NCMEC.

Neil Fairbrother

Well, either way WhatsApp seems to have a problem with CSAM content, so you’ve got to ask the question, well, why wouldn’t they do something like this if they’re really interested in online child safety?

Glen Pounder, COO Child Rescue Coalition

Yeah. And again, I think in the horrific equation of privacy versus safety, privacy in that case has won out, as we know it will when more and more organizations head towards end-to-end encryption, because then they have the best of both worlds in some ways, right? They can potentially still target us with adverts, but they can say, we don’t know what’s inside the communication, we can’t see it, because…

Neil Fairbrother

And it’s only once they know that they are legally obliged to report.

Glen Pounder, COO Child Rescue Coalition

Absolutely. Which, coming back to Apple, is quite the reverse. They’re basically saying, through that breakdown: yeah, we all know, 29 times over, almost certainly, almost beyond a shadow of a doubt in terms of the mathematics, but we’re not going to choose to be notified ourselves, even though they are our machines, until it reaches 30.

Neil Fairbrother

In Apple’s CSAM Detection Technical Summary document, they don’t mention macOS, which powers their desktop and laptop machines. They refer only to iOS 15 and iPadOS 15, which are the operating systems on iPhones and iPads. So presumably this means that these tools won’t be available on those devices, and yet those platforms are often used to edit video or photographic content, or even to live stream this kind of abuse of children through platforms such as Zoom and Skype and so on.

Glen Pounder, COO Child Rescue Coalition

Yeah, that’s absolutely the case. It’s difficult, right, because of course we all welcome any move away from only 265 reports a year. Again, that number from last year is so small, it’s like, well, is that just where people have come to you to ask you to report it? But in any event, what they could do within the law is obviously not being done in terms of child protection. They’re very much more on the privacy side. And I agree with you, there’s absolutely no move to combat anything around the live streaming of children, and through some new technology that we’re working on, we know that’s going on hundreds of times a day.

Neil Fairbrother

So if this was an app in an app store, Glen, and you had to rate it one to five stars, how many stars would you give it?

Glen Pounder, COO Child Rescue Coalition

You know, again, I applaud them for doing something, don’t get me wrong, but I just think they could have done so much more. And the devil really is in the detail. When you look at the detail, they’ve gone a long, long way to protect user privacy. I saw some headlines when it first came out along the lines of, well, it’s going to report me for taking a picture of my child in the bath. That’s clearly not the case. None of your own photos, none of the photos on your device, are being analyzed at all. It’s only when it connects to the cloud, and it’s only if those images are almost certainly child sexual abuse material that there is any interest from Apple. They’re very much more focused on the side of privacy than safety.

Neil Fairbrother

Is this focus on privacy, whilst it’s important, overstated and overplayed?

Glen Pounder, COO Child Rescue Coalition

In my opinion, yes, because again, unfortunately, the things we see at the Child Rescue Coalition are obviously quite disturbing, but we’re talking about the rape of very young children, about the rape of toddlers.

Now, our responsibility as a society to protect those kids just seems like common sense: to promote the protection of children above all else. I mean, let’s not get drawn into the whole societal issue, because we know children subjected to sexual abuse are that much more likely to become drug users and have, you know, all kinds of other problems. But if we just focus on what, I would say, the man in the street would expect from a large company like Apple, I think they would expect them to do a little bit beyond the bare minimum, really.

Neil Fairbrother

I just published a podcast interview with Howard Taylor, who heads up the End Violence against Children Coalition, and he said that one of the biggest risks he thinks children face online is that many people are not really fully aware of the risks. It seems to me that one of the fallouts from this Apple announcement has been a massive global discussion in the mainstream press about an issue that was previously taboo. And that’s got to be a good thing, hasn’t it?

Glen Pounder, COO Child Rescue Coalition

Yes, it is absolutely a good thing. The more people know about this subject area, the better. But as we constantly see, I think the severity of what we’re dealing with is very much underplayed, and it becomes rather an intellectual discussion around, you know, the algorithms, and, you know, if you can find a good mathematician to show what one over a billion times 30 is, then please let me know.

But what we’re really talking about is the rape of children, the rape of a four-year-old. We received a communication yesterday from a lady who was worried about sending her four-year-old to a sleepover at a friend’s house. And sure enough, you probably know what’s coming, the four-year-old was sexually abused.

The abuse happens in real life. These reports to our friends at NCMEC are not accusations of crime; that doesn’t happen until law enforcement gets to do their job. The whole point of all of this is to prevent the sexual abuse of children and to really honour those kids who’ve already been abused and whose images are being consumed on a daily basis by thousands and thousands, if not millions, of criminals around the world. Is that the society that we want to live in?

I don’t know, but from a privacy perspective, in what I constantly read everything seems to be extreme; these days it always seems to be black and white. And I think there’s a middle ground for protecting privacy without this being the thin end of the wedge for government monitoring, because that’s not what this is. This is not opening the door to government monitoring of your devices and every picture you take in your house; that’s not what this is at all.

But it always seems to be, oh well, here’s the thin end of the wedge, because this is the beginning of the end for your privacy. That’s not the case at all. And if we have to have a specific carve-out for child protection, then perhaps that’s what we need. And I’m sure there must be some very, very clever lawyers, you know, in government, who can come up with wording that would create that balance, create that child safety balance and protect users’ safety.

Neil Fairbrother

Well, there are plenty of carve-outs for child safety in the real world. So why on earth should there not be any carve-outs for child safety in the online world?

Glen Pounder, COO Child Rescue Coalition

I absolutely agree with you. How and when this is going to change, I’m frankly at a loss to say, because reasonable legislation is proposed, and then you don’t hear anything more about it and it dies somewhere in a dusty hole. And then we seem to start again, and we’re like, why wouldn’t there be some legislation to compel the companies? I think some of the companies themselves have proposed it. I think even Mark Zuckerberg agreed there should be some regulation. Then the companies know what framework they’re working within, right?

So we have elected governments; whether you voted for this particular government or not, they’re elected by us, the people. So elected governments should set a framework that the people are happy with for child protection. And then the companies can say, “Hey, we’re just complying with the law”, rather than the companies setting the agenda and it all being based on voluntary principles.

Neil Fairbrother

Just one final question, Glen, what was the name of the 18 month old girl that you described earlier?

Glen Pounder, COO Child Rescue Coalition

Unfortunately, for legal reasons, we can’t reveal the name of the child. But we give her the name Jane, because obviously she’s a real child, and let’s hope she’ll be able to forget that terrible crime that was perpetrated against her. And let’s hope that other children can be rescued when law enforcement are given the right information.

Neil Fairbrother

Okay, Glen, I think we’re going to have to leave it there. We are out of time. Thanks very much for your insights on this really important issue. I think that this development from Apple is going to be the first from a number of handset manufacturers, I hope anyway it will be, and I hope to see other handset manufacturers come in with other solutions that can also help to address this very pressing online child safety issue.

Glen Pounder, COO Child Rescue Coalition

I hope so too. And I hope an example like Apple leaning forward into this will encourage some of the other service providers. If more service providers were actually leaning forward into this issue rather than doing the absolute bare minimum, then more children would certainly be protected.

 
