Safeguarding podcast – Bad News for Bad Guys with Glen Pounder, Child Rescue Coalition
By Neil Fairbrother
In this Safeguarding Podcast with Glen Pounder from the Child Rescue Coalition, we explore the Deep Web, explain how it works and the scale of CSAM sharing on it. We find out whether it’s as safe as people think for the distribution of CSAM, discuss Section 230, whether social media platforms are a no-win situation, why handset manufacturers are silent on child safety and the UK’s Online Safety Bill.
https://traffic.libsyn.com/secure/safetonetfoundation/SafeToNet_Foundation_podcast_-_Bad_News_for_Bad_Guys_with_Glen_Pounder_Child_Rescue_Coalition.mp3
There’s a transcript below, lightly edited for legibility, for those who can’t use podcasts, or for those who simply prefer to read.
Welcome to another edition of the SafeToNet Foundation safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.
Neil Fairbrother
The Dark Web came to the forefront of the general public’s awareness a few years ago with the Silk Road drug e-commerce case. For many of us though, it’s still a place of mystery, but it is increasingly being used for the publication and dissemination of child sexual abuse material, in the belief that the anonymity it provides will render its users safe from detection.
To guide us through the deviousness of the Dark Web and why it’s perhaps not as safe as some people might think it is, I’m joined by Glen Pounder of the Child Rescue Coalition. Welcome to the podcast, Glen.
Glen Pounder, Child Rescue Coalition
It’s a pleasure Neil. Thank you for having me on today.
Neil Fairbrother
It’s good to see you again, Glen. Can you give us a brief resumé please, so that our audience from around the world has an appreciation of your background and experience?
Glen Pounder, Child Rescue Coalition
Yes, I’ll try and keep it brief. So, many years ago I joined Her Majesty’s Customs and Excise at the tender age of 17 and then worked for the Queen for 31 years across Customs and Excise and the Serious Organised Crime Agency, finishing up with the National Crime Agency, which I left just over two years ago to come and work at the Child Rescue Coalition.
Neil Fairbrother
You’re now based in the USA, I believe?
Glen Pounder, Child Rescue Coalition
That’s right. Yeah. So we’re a non-profit, a charity based here in the US.
Neil Fairbrother
The Child Rescue Coalition is what? What are your objectives?
Glen Pounder, Child Rescue Coalition
Our objectives, effectively, are to protect children, and we achieve that, or we try to achieve it, by providing technology, free to law enforcement worldwide, which allows them to investigate suspects in their area who have an illegal sexual interest in children.
Neil Fairbrother
Just before we start to drill a little bit into some of the technology that you’re using and the success you’ve had with it, the UK Government last week published their Online Safety Bill. Do you have any initial thoughts on this draft legislation?
Glen Pounder, Child Rescue Coalition
Yes, perhaps my initial thought is that I’m showing my age because I now get excited by new legislation! The Online Harms Bill, I think a lot of us would agree, is somewhat overdue, because we’ve tried self-regulation for big tech companies on the internet, and I think we can all agree that it doesn’t work.
So there are some big takeaways for me with regard to the Online Harms Bill, and hopefully it passes sometime this year. Although we at the Child Rescue Coalition supply this technology free to law enforcement, we all say that law enforcement alone cannot solve this problem of online harms.
And of course, even the name itself, online harms, suggests somehow that that harm stays online. Whereas obviously we know that, whether it be the risk of suicide or, in the arena where we work, combating child sexual abuse, these are real-world harms. So they might be facilitated through the big tech companies, as an example, but their mind-boggling profits from using our data need to eventually result in them having some kind of duty of care. Because, like I said, self-regulation is just not working.
So a couple of big takeaways from what I see in the Bill are legislation with teeth, by which I mean the £18 million or $25 million fine, or 10% of profits. And as we know in other…
Neil Fairbrother
Well, it’s 10%, I think, of annual global revenue as opposed to profit. So it could actually be much larger than 18 million quid.
Glen Pounder, Child Rescue Coalition
That’s right. Yes, 10% of worldwide turnover, which again, to me, in the online space seems to mirror the idea of GDPR. And we know that GDPR got itself recognized in the Boardroom through those teeth that it has. So the legislation has to have teeth and consequences.
Personally, I believe it’s a good idea that in the Online Harms Bill they’ve held back criminal accountability, because of course holding somebody criminally responsible will mean a high degree of proof that the Directors were knowingly concerned in facilitating the online harm. But it’s good to have that in the background as well, some kind of criminal liability. But let’s see.
It’s a really exciting prospect. I know some other countries are developing their own legislation as well, and I’m sure countries will be working together to see what the big tech companies respond with and to see whether some of the principles they’ve come forward with will actually work. But yes, with regards to the Online Harms Bill, it’s an exciting time, and hopefully the regulators like Ofcom can actually show this working in real-world examples.
Neil Fairbrother
Now on your website, the Child Rescue Coalition’s website, you say that in 2017 the Bureau of Justice announced that the annual number of persons prosecuted in commercial exploitation of children cases filed in the US District Courts nearly doubled between 2004 and 2013. What’s driving that increase, Glen, and how much of it is down to the online exploitation of children?
Glen Pounder, Child Rescue Coalition
Yes, that’s a great question. I think there were a number of factors. Of course, online access to child sexual abuse material certainly fueled some people who perhaps previously would never have been able to access what is really quite horrific abuse material. As an example, back in the nineties we would intercept a package, such as a VHS tape, at the UK border, and look at where we are now.
So that VHS tape was sometimes actually quite difficult logistically for the suspect to obtain, whether they’d ordered it online or, in some instances, where they traveled, committed the abuse, recorded it and then sent the tape back to themselves. Whereas now we sometimes say offenders are just three clicks from accessing really horrific child abuse material.
Neil Fairbrother
Yes. And the stats around it that you have on your website are quite mind-blowing. You say that the most-shared child abuse file is currently being seen on over two and a half million unique IP addresses, and that the Child Rescue Coalition has identified 72.5 million unique IP addresses worldwide sharing and downloading sexually explicit images and videos of children. What does that mean? Does that mean people?
Glen Pounder, Child Rescue Coalition
An IP address, in simple terms, is a little bit akin to when people write their own address on the back of an envelope and send it in the post. The IP address advertises on the internet: this is the location that this information was sent from. So through legal process, law enforcement, for example, can resolve an IP address to a real-world location, a physical address.
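To make the “return address” analogy a little more concrete, here is a minimal sketch in Python, purely illustrative and not part of the Child Rescue Coalition’s system. On its own, an IP address only publicly identifies the network it belongs to (often the ISP); matching it to a subscriber’s physical address is the step that only the ISP can perform, via legal process.

```python
# Purely illustrative: what anyone can learn about an IP address in public.
# The final step (mapping the address to a named subscriber at a moment in
# time) is held by the ISP and requires legal process, as Glen describes.
import socket

ip = "8.8.8.8"  # a well-known public example address, not a real case

try:
    hostname, _, _ = socket.gethostbyaddr(ip)      # reverse-DNS lookup
    print(f"{ip} reverse-resolves to {hostname}")  # typically names the network operator
except socket.herror:
    print(f"No reverse-DNS record published for {ip}")
```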
Neil Fairbrother
I think there are two different types of IP address though, there are dynamic and there are static IP addresses. Does the dynamic nature of some of these IP addresses pose a particular problem for that kind of work?
Glen Pounder, Child Rescue Coalition
So the nature of dynamic IP addresses, i.e. those IP addresses which change, is something that’s decided upon by the Internet Service Provider. In some instances the IP address might stay the same for weeks or even months, but in other instances somebody’s house can have a different IP address three times in one day. So it’s really all to do with the exact timing of when an offense took place. And that exact timing is obviously what’s very important with regards to an Internet Service Provider providing the correct information to law enforcement.
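A small, purely hypothetical sketch of why that exact timestamp matters: with dynamic addressing, the same IP can be leased to different subscribers on the same day, so an ISP can only answer “who had this address?” for a specific point in time. The lease records and subscriber names below are invented for illustration.

```python
# Hypothetical ISP lease records, invented for illustration only:
# the same IP address is assigned to three different subscribers in one day.
from datetime import datetime
from typing import Optional

leases = [
    ("203.0.113.7", datetime(2021, 5, 1, 0, 0),  datetime(2021, 5, 1, 9, 30), "subscriber A"),
    ("203.0.113.7", datetime(2021, 5, 1, 9, 30), datetime(2021, 5, 1, 18, 0), "subscriber B"),
    ("203.0.113.7", datetime(2021, 5, 1, 18, 0), datetime(2021, 5, 2, 0, 0),  "subscriber C"),
]

def subscriber_at(ip: str, when: datetime) -> Optional[str]:
    """Return whoever held the IP address at that exact moment, if known."""
    for lease_ip, start, end, who in leases:
        if lease_ip == ip and start <= when < end:
            return who
    return None

# The same address on the same day resolves to different people,
# which is why the precise time of the offence matters so much.
print(subscriber_at("203.0.113.7", datetime(2021, 5, 1, 8, 0)))   # subscriber A
print(subscriber_at("203.0.113.7", datetime(2021, 5, 1, 15, 0)))  # subscriber B
```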
Neil Fairbrother
So the fact that the IP address might change a few times a day doesn’t really pose a problem, because that can easily be tracked?
Glen Pounder, Child Rescue Coalition
Yes correct.
Neil Fairbrother
Okay. Now in the US, where you’re based these days, there is, as far as I understand it, no legal requirement for the online platforms to proactively look for CSAM content. The only requirement on them from a legal point of view is to report it once it’s found. So once they know it’s there, they’re legally bound to do something about it, but as long as they don’t know about it, they don’t have to do anything at all. What can be done about this? I think the EARN IT Act tries to address this, but I’m not quite sure where we’ve got to with the EARN IT Act.
Glen Pounder, Child Rescue Coalition
Yes, the EARN IT Act would help address it. The elephant in the room here for me, Neil, is Section 230 of the Communications Decency Act. It was discussed in the mid-nineties and enacted in 1996, when Mark Zuckerberg hadn’t even started high school. So, you know, I think we can all say that we are now decades behind the technology with regards to legislation. And in my opinion, it’s urgent now that real legislation is enacted to amend Section 230 of the Communications Decency Act, which I know you know, and probably a lot of your listeners know, provides somewhat of a comfort blanket to the big tech platforms.
Neil Fairbrother
There is a little bit of a contradiction in some of the US law, at least as far as I understand it. The US Child Protection and Obscenity Enforcement Act that was passed in 1988 says… “it is illegal to use a computer to transport information in interstate or foreign commerce concerning the visual depiction of minors engaging in sexually explicit conduct (child pornography)”. Now that is what we know today as CSAM. That Act was passed in 1988. So what went wrong there? How have we got to this contradiction where one law makes it quite clear it’s illegal to use a computer in this way and another law seems to turn a blind eye to it?
Glen Pounder, Child Rescue Coalition
Yeah, I mean, my personal opinion is that the Communications Decency Act allowed for the rapid, and obviously in some cases the very welcome, development of the internet and internet-based technologies. But I think somehow in those last 20-odd years we’ve lost our way with regards to the protection of children. And, you know, I completely agree with you. In fact, we mounted a campaign to have the law changed from “child pornography” to “child sexual abuse material” in the US, because I think in some circles people are under the impression that child abuse material might somehow be a 20-year-old dressed up as a schoolgirl, and somehow it’s not really abuse material, it’s not really rape of a child. Whereas an organization like ours, the Child Rescue Coalition, and others know that that’s not the case. And some of the material is really quite horrific, involving children sometimes only months old.
Neil Fairbrother
Now when platforms do remove content and report it to NCMEC, as they’re obliged to do once they are aware of it, this tends to generate headlines such as “Facebook is responsible for the majority of child sexual abuse images”, which seems to be a little bit unfair. The social media companies have gone to the effort of finding this stuff and they’ve reported it. They’ve been quite transparent in those respects. Are the platforms in a no-win situation with this problem?
Glen Pounder, Child Rescue Coalition
Yeah, I see your point with regards to that. So, in Facebook’s case, because we are better at finding it, then somehow it’s our fault. Well, part of the question is, why aren’t the other tech companies finding proportionately a similar percentage, based on the number of users, to what Facebook are? So, you know, are they using the same tools? Are they using different tools?
I think the interesting aspect of this as well, and I’ve worked with NCMEC cyber-tips while I was still in the NCA, is that even according to Facebook’s own analysis some of the material is inadvertently shared, or it’s joke material, which is still illegal and still a problem, but it’s not the highly damaging abuse material in all instances. Now we also know from many, many cases that grooming of children and abuse material is sent in Messenger and other Facebook-based platforms, so both things are true.

So there are highly abusive, highly illegal acts taking place on some of these platforms, but equally, because they’ve improved their technology at finding the known material, some of what they report is illegal but isn’t necessarily the highly damaging and dangerous child abuse material, which is the rape of children.

So they’ve got both things on that platform. And an organization like ours, the Child Rescue Coalition, has been helping some organizations, Western Union being one example, where we are really helping them to find a needle in a needle stack. And by that I mean Western Union have got some great resources to find a needle in a haystack. If there’s a transaction that looks strange, they’ll find it themselves. But when they’ve got such a high number of transactions, millions and millions of transactions going on, we have unique data so that we can say to them: hey, here are the IP addresses of those in many Western countries who possess, and are advertising, highly illegal child sexual abuse material, and they’re able to…
Neil Fairbrother
Well, we’ll come to that in just a moment, but before we dive into the detail of that, just a couple of other questions first. I mentioned the dark net in my intro. The dark net isn’t a single place online, or a single service. It’s a mix of technologies that actually have legitimate uses, but which are used by bad actors for nefarious purposes. And one of the first technologies that purveyors of child sexual abuse material turn to is the peer-to-peer network for distributing their content. What is a peer-to-peer network, Glen?
Glen Pounder, Child Rescue Coalition
It’s a great question. And I think the area we are in, which is these peer-to-peer file-sharing and chat networks, really is the deep web rather than the dark web. I’m sure we’ll come onto the dark web, but the peer-to-peer networks are highly efficient ways for people to share, in this instance, child sexual abuse material, but also other types of files. So highly efficient, completely free, and importantly with absolutely zero regulation. There’s no headquarters and, importantly, no central server, so they can’t be shut down by traditional means. You can’t seize the server, because these networks exist on people’s computers throughout the world.
Neil Fairbrother
Probably the most famous example of a peer-to-peer network was Napster, which was used for the dissemination of music, copyrighted music, which was obviously a massive threat to the music industry at the time. And as you rightly say it’s a distributed network so everybody that uses it on their laptop or desktop is in fact a server of the network?
Glen Pounder, Child Rescue Coalition
Exactly right, Neil. And of course, when law enforcement use our core technology and arrest one of these bad guys, they’re only taking out a very tiny proportion of the whole network, right?
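To illustrate the point about there being no server to seize, here is a toy sketch in Python. It is not any real file-sharing protocol, just a picture of the idea: every peer both advertises and serves content, so removing one peer leaves the material available from the rest of the swarm.

```python
# A toy picture of a peer-to-peer swarm, not any real protocol: each peer
# advertises the hashes of the files it holds, and anyone can ask the swarm
# who has a given file. There is no central server to take down.
peers = {
    "peer-1": {"hash-of-file-X"},
    "peer-2": {"hash-of-file-X", "hash-of-file-Y"},
    "peer-3": {"hash-of-file-Y"},
}

def who_has(file_hash: str) -> list[str]:
    """Return every peer currently advertising the given file hash."""
    return [peer for peer, advertised in peers.items() if file_hash in advertised]

print(who_has("hash-of-file-X"))   # ['peer-1', 'peer-2']

del peers["peer-1"]                # one peer goes offline (say, an arrest)

print(who_has("hash-of-file-X"))   # ['peer-2']: the file is still out there
```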
Neil Fairbrother
Another technology that’s often used in the belief that it provides anonymity and safety is a VPN or virtual private network. What is a VPN?
Glen Pounder, Child Rescue Coalition
Yeah. As you say, a virtual private network provides anonymity for somebody’s online activity. And of course, as we know, there are some paid services and there are some free services. What I would say, and we come back to online harms and legislation here, is that some of these VPN providers, for-profit businesses, are effectively making money from child sexual abuse. And my question really is, well, eventually we need to hold some of these companies to account and say: okay, you provide a VPN service in whichever country, and your service is very, very popular with those who like to consume child sexual abuse material. What should we do about that?
Neil Fairbrother
Yes, well, indeed, we’ll get onto how you can find out whether a VPN is being used to share this stuff shortly. And the final tool in the, well, dark net or deep web if you want to call it that, is the Tor browser. What is that, how does it work and why is it a potential problem?
Glen Pounder, Child Rescue Coalition
Yes, again, really fascinating, and I’ve worked several cases involving Tor-based offending, not just child sexual abuse material of course, but also drugs and guns and other nefarious activity. The fascinating thing being that it was developed originally for good purposes. It was developed to protect freedom of speech, and it was developed by DARPA, the US-based research agency. So on the one hand, freedom of speech, protection for journalists, all very good things, I think we’d all agree. But on the other, as with any technology, it’s been taken over and abused by bad guys for child abuse, the selling of guns and everything else. So again, it provides anonymity, at least to a certain extent.
Neil Fairbrother
Okay. Now, when we were planning for this podcast a couple of weeks ago, you showed me quite an incredible piece of technology. If our listeners can imagine an interactive map on their screen, updated in real time and covered in a plethora of different-coloured dots. What was this map? What was it showing?
Glen Pounder, Child Rescue Coalition
So I gave you a visual demonstration of our system and what we see live every day. Unfortunately, all of those computers are advertising that they are in possession of child abuse material and are willing to distribute it across these networks. So you saw our visual representation of the tool that we provide to law enforcement, and the different colours are the different networks that we’re monitoring.
Neil Fairbrother
Each dot represented what?
Glen Pounder, Child Rescue Coalition
Each dot effectively represented an IP address, basically putting its virtual hand up in the air and saying, I have the abuse material, I have it, I have it, and so on and so forth around the world. And again, where there’s a high prevalence of internet presence, that’s where we see more targets, of course.
Neil Fairbrother
Okay. And this was looking at traffic on a number of peer-to-peer networks?
Glen Pounder, Child Rescue Coalition
Exactly right. And of course, you know, this is completely separate from the work our friends at NCMEC do, who receive their leads from Facebook and Twitter and everybody else, as you know. So this is a completely separate area of criminality that’s going on, on a daily basis.
Neil Fairbrother
So the purveyors of this kind of content, the distributors of this kind of CSAM content, who elect to use a peer-to-peer network in the belief that it gives them anonymity from detection and therefore prosecution, they are wrong in that belief?
Glen Pounder, Child Rescue Coalition
They’re right and they’re wrong. They’re wrong in that belief because an organization like ours exists, but without an organization like ours, and the tools that we provide to law enforcement, they would be completely correct. And in some jurisdictions, unfortunately, just because of the way the laws are crafted, they are still right, because law enforcement either isn’t able, or isn’t resourced, to investigate those leads. So they are right to some extent, bearing in mind, like I said, some countries just are not operating proactively in this space.
Neil Fairbrother
Okay. Now as far as I understand it, based on a report I recently read, the providers of this kind of content may use various tricks with their video and picture content, such as cropping the image to change the shape of it; they might change the video codec; they might speed up or slow down the video, in an attempt to avoid detection. Does that work?
Glen Pounder, Child Rescue Coalition
It does, to an extent. The challenge is that the material we’re monitoring for is identified by a digital hash, which is an exact fingerprint of the image or video as it was when it was produced. Now, if somebody doctors that video in a way that you’ve just suggested and reposts it on one of these networks, then yes, we wouldn’t be able to see that.

That said, there are technologies now, which are very close to being available for use, that are able to detect when a video has been altered in that way. So, as you know, PhotoDNA works very well for images, but there is now a technology which is very useful for altered videos, including cropping and changing colour and everything else.
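As a rough illustration of why an exact digital hash is so brittle, here is a minimal Python sketch. It assumes a standard cryptographic hash (SHA-256) as the “fingerprint”; the Child Rescue Coalition’s actual hashing and the newer video-matching technology Glen mentions are not described here.

```python
# Minimal sketch: an exact hash is a fingerprint of the file's bytes, so even
# a one-byte change (a crop, a re-encode, a colour shift) produces a digest
# that no longer matches a library of known hashes.
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as an exact fingerprint of the bytes."""
    return hashlib.sha256(data).hexdigest()

original = b"...bytes of a known file..."     # placeholder content
altered  = original + b"\x00"                 # stands in for a doctored copy

known_hashes = {fingerprint(original)}        # a hypothetical hash library

print(fingerprint(original) in known_hashes)  # True: an exact copy is detected
print(fingerprint(altered) in known_hashes)   # False: an altered copy slips past exact matching
```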
Neil Fairbrother
Okay. A video is 25 frames a second, 30 frames a second, or something like that. So it’s a number of photographs per second, basically. And if you’ve got one technology that can be used for photographs, then it would be fairly easy, I guess, to adapt that for video?
Glen Pounder, Child Rescue Coalition
Not from our experience, no. And we’ve done quite a lot of testing of this new technology. We see it as being very different from the traditional still image-based [solution].
Neil Fairbrother
Okay. Now the volume of this content seems quite significant; vast would be a good word to use, I think. But if we took YouTube, which I know isn’t CSAM, but in terms of volume of video it has something like 10 hours of content uploaded every second. Could your technology cope with a system of that size?
Glen Pounder, Child Rescue Coalition
Well, we’ve been in collaboration with a company called PEX around this protection technology, and from everything we’ve seen and tested it can work on the major platforms as well. Uniquely, what it does is enable a platform to block known CSAM before it’s uploaded, which is obviously the reverse of what generally happens now, the notice-and-takedown procedure where abuse material is taken down after the event. In this instance it would never be uploaded in the first place, which of course has a knock-on effect with regards to the leads to NCMEC and the number of cybertips that law enforcement are basically being drowned with on a daily basis.
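A hedged sketch of that idea in Python, not PEX’s actual API or the Coalition’s system: the upload handler checks the file’s hash against a library of known material and rejects the upload before it is ever stored, rather than taking it down afterwards. The hash library and function names are invented for illustration.

```python
# Hypothetical pre-upload check: block a file whose hash matches a library of
# known material *before* it reaches the platform, instead of taking it down later.
import hashlib

KNOWN_HASHES = {"<digest from an approved national hash library>"}  # placeholder

def accept_upload(file_bytes: bytes) -> bool:
    """Return True only if the upload does not match any known hash."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # Blocked pre-upload: the material never appears on the platform,
        # so there is no take-down notice and no new cybertip for this copy.
        return False
    return True
```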
Neil Fairbrother
The problem with the take-down notice, the traditional way, is that it does rely on the content actually being taken down. And in a recent podcast we published, Twitter is being sued by a 16-year-old because they failed to take down images of him that he was extorted into taking when he was 13.
Glen Pounder, Child Rescue Coalition
Yes. And of course, all of us will be watching that case with bated breath, to see how it goes with regard to the complaints he was apparently making, proving his age, and the material still not being taken down. Like I said, for child abuse material from approved national libraries, where it’s clear that it is child abuse material rather than some kind of freedom-of-speech video being blocked, well, again, this new technology would block that in the first place. So whether that be for child abuse material or, eventually down the line, perhaps even for so-called revenge porn, where somebody has been abused and effectively it might be either material they never intended to be published, or even the rape of an adult.
Neil Fairbrother
The issue of false positives crops up every time I have a discussion around this kind of technology. What is a false positive, first of all, and what is your false positive rate? Can you say?
Glen Pounder, Child Rescue Coalition
Well, in the test we had zero false positives. We ingested thousands of images and videos and doctored them much in the way that you said earlier, and every single one was found in the test. Of course we didn’t use live child abuse material, but we doctored, cropped and changed colours, and we didn’t have any false positives at all.
Neil Fairbrother
And a false positive is what?
Glen Pounder, Child Rescue Coalition
Where the technology indicates child abuse material or other nefarious material amongst everything being uploaded when the content is in fact innocent. And of course, that would be a huge problem, especially if it created a lead for law enforcement. So for cases that led to law enforcement, they would, of course, still need to review the actual material that was identified by the technology, right.
Neil Fairbrother
OK so this won’t completely avoid human eyes having to view this kind of content before legal action is taken?
Glen Pounder, Child Rescue Coalition
Yeah. It would reduce the number of human eyes needed, which again is a welfare issue for those involved in this horrific area. But to my mind it wouldn’t completely negate it, especially where there was potential for a criminal outcome.
Neil Fairbrother
Okay. Talking about law enforcement I believe that your technology has aided in the arrest of something like 12,000 predators and rescued more than 2,700 children. Is that correct? Is that an accurate assessment?
Glen Pounder, Child Rescue Coalition
It’s something over 13,000 now, and over 3,000 children. And you know, I think people who listen to your podcast probably know this already, but this is not some, you know, dirty guy sitting in a basement. We’ve had pediatricians and teachers, teachers of very young children, arrested who’ve been consuming this horrific material, and also, you know, direct victims who have been raped in the home. And again, this is something that’s not regularly talked about, but most studies show that most children who are being physically or sexually abused are abused by somebody they already know, right?
Neil Fairbrother
Yes. And another type of content recently reported on by the IWF was the self-production or self-generation of intimate images by children themselves, often in the locked family bathroom.
Glen Pounder, Child Rescue Coalition
Yes. I mean, what a huge challenge. And again, that’s not a law enforcement challenge, it’s a societal challenge, where even in some of those self-generated images there can be parents heard shouting in the background. I mean, really, children are effectively taking horrendous risks online and sometimes the parents are just downstairs.
So, you know, children are experimenting in ways that just weren’t possible even 20 years ago, and then effectively sharing some of this material with somebody they might suspect is the same age as them, but who, as we’ve seen in many cases, sometimes is not. That can lead to the material being shared on some of the networks we monitor. And we’ve seen that it can lead to the child being forced to produce ever more horrific material, and even, in some instances, to that child being compelled to abuse a younger sibling.
Neil Fairbrother
One of the impacts of the Online Safety Bill is that it’s very much focused on the larger social media companies. If they have their house fully in order and they become very difficult to share this kind of content on, will this simply move the problem elsewhere, onto less well-regulated platforms?
Glen Pounder, Child Rescue Coalition
Potentially, yes. And again, for those very large companies, it would be something if they could help the smaller startups, if you like, achieve some level of safety by design as they’re launching their new apps. So, for example, a new app or new technology would automatically get free access to tools which allow protection and safety by design as the app is being developed, rather than safety being, as it currently is, some kind of afterthought.
Neil Fairbrother
Yes. If we look at a recent technology that’s exploded, if that’s the right term, the electric car, the battery-powered car: Elon Musk with Tesla was not exempt from safety requirements for the first 10,000 or 50,000 cars. He had to comply fully with all safety regulations from the get-go, from the very first car. And yet that’s not the case at the moment for platforms that provide access to children by anyone, from anywhere, at any time.
Glen Pounder, Child Rescue Coalition
Absolutely. And frankly it’s mind-boggling that the companies profit from our data, yet have no duty of care to those on their platforms that they’re making money from. Frankly, that position is not tenable and has to change.
Neil Fairbrother
Okay. Now we spoke about peer-to-peer networks, VPNs and the Tor browser. Is there anything that you’re aware of in the Online Safety Bill that addresses the use of those platforms for the distribution of CSAM?
Glen Pounder, Child Rescue Coalition
Honestly, I haven’t taken a deep dive into the legislation, so I don’t know that there is. Personally, I think there have to be some common industry standards. So if and when there’s proven technology which can help protect children by design and default, then, unless the platforms have their own technology which is the equivalent of it, I think they should be compelled to use technology which is proven to protect children.
Neil Fairbrother
Okay. Time is running short, I know you’ve got a busy day, but a couple of short questions, if I may, although the answers may be long, I don’t know. There’s a lot of talk about platforms. In fact, it’s almost exclusively about the platforms when it comes to online safety, which is kind of understandable, but there are two voices in particular that very rarely get a mention in this space. One of them is Apple and the other is Google, with Android. The handset manufacturers are noticeably absent from this discussion about online safety, particularly as far as children are concerned. Should they be, and what should, or could, they be doing to make their products safer?
Glen Pounder, Child Rescue Coalition
Yes, I mean, I don’t see why they shouldn’t be. I mean, clearly as you say, their devices are the ones that are used to facilitate child abuse too, right. The networks are being used to transmit it, but these devices are the ones being used to record and literally physically send the material across the networks. So I don’t see why they should remain outside the conversation. I think they should be fully part of it.
Neil Fairbrother
Okay. Finally, what is the future looking like for the Child Rescue Coalition? What are your plans for this year, Glen?
Glen Pounder, Child Rescue Coalition
The future is actually very exciting from a technological development perspective. We are working actively to help law enforcement in the live-streaming abuse space but also in the app space. On the live-streaming side, law enforcement are effectively drowning in data, and the Child Rescue Coalition, we are many things: as well as being a charity, effectively we’re also a technology company. So we’re able to help law enforcement view the data they’ve got back from their legal process in a way that they previously couldn’t. That technology is still being developed, but it’s already proven to be very exciting. And it’s bad news for bad guys, because it means that, with regards to the leads around the market for child sexual abuse online and live-streaming abuse, some bad guys will be getting a knock on the door a little bit earlier than they probably expected.
Neil Fairbrother
Well bad news for bad guys sounds like good news to me.
Glen Pounder, Child Rescue Coalition
Yes, absolutely.
Neil Fairbrother
Glen, listen, thank you so much for your time. We’d better wrap it up there. I really appreciate that. Good luck with everything you do, and it’ll be great to keep in touch and to hear more about this technology as it comes on stream.
Glen Pounder, Child Rescue Coalition
Excellent. Thanks so much.