Safeguarding podcast – Hunters, Talkers & Loopers with Austin Berrier, HSI officer
By Neil Fairbrother
In this Safeguarding Podcast: Austin Berrier, Homeland Security Investigations Officer, discusses the impact of Apple’s Child Safety tech on Law Enforcement, the live streaming of child sexual abuse on encrypted video streaming services, how online predatory pedophiles hunt in packs, Project Mercury and how Zoom worked with international law enforcement to indict 300 child abusers.
https://traffic.libsyn.com/secure/safetonetfoundation/SafeToNet_Foundation_podcast_-_Hunters_Talkers__Loopers_with_Austin_Berrier_HSI.mp3
There’s a transcript below, lightly edited for legibility, for those who can’t use podcasts, or for those who simply prefer to read.
Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.
Neil Fairbrother
Most criminals want to remain hidden for the simple reason they don’t want to get caught, but there’s one type of criminal that proactively posts online evidence of their offline crime scenes. And they even use this digital evidence as a form of trading currency. From a law enforcement perspective, this might seem like a handy way to catch criminals, but it’s not that easy, especially when those crimes are committed in real time, on end-to-end encrypted streaming video communication services. To guide us through how law enforcement can catch these criminals, I’m joined by Austin Berrier, a Homeland Security Investigations Officer. Welcome to the podcast, Austin.
Austin Berrier, HSI
Thank you very much. I appreciate you giving me this opportunity to speak today.
Neil Fairbrother
You’re most welcome Austin, and I should say at this stage that this podcast may well contain triggering moments, and if you are affected by this podcast, then please turn to a trusted advisor or organization for advice and guidance.
Austin, could you please provide us with a brief resumé, so our audience from around the world has an appreciation of your background and expertise?
Austin Berrier, HSI
Yes I can. So I’ve been with Homeland Security Investigations since about 2003 and during that time I’ve worked with a wide variety of crime types. However, since 2009, I have been engaged in investigating child sexual exploitation and child sexual abuse material types of investigations. Prior to my time with Homeland Security Investigations, I spent about five years in municipal policing in the State of Virginia on the East coast of the United States. And prior to that, and prior to university, I spent four years in the United States Marine Corps where I was a Military Police officer. So I first pinned on a badge, I like to say, in 1993, and I’ve worked through a variety of different types of law enforcement agencies since then.
Neil Fairbrother
Okay, thank you Austin, for that background. Before we get onto the issue of live streaming of child sexual abuse, Apple recently announced their child safety proposals, which use hashcode matching, and this has caused a debate, at times a furious debate it seems, between the privacy community and the child protection community. As an experienced law enforcement officer in this space, what is your view of Apple’s child safety announcement?
Austin Berrier, HSI
Anytime we see a company such as Apple or Zoom or Microsoft, or any of the big corporations out there nowadays, that perhaps aren’t using all the technology that’s available to them, I find it disheartening. I think part of the issue with the hash value discussion regarding Apple is that the average person out there who’s reading these news releases or the press releases from Apple doesn’t quite understand how that works.
Just talking to friends and family of my own, they believe that Apple’s system is what they view as some kind of “Big Brother” project, where their phones are being scanned, you know, real time, constantly, for any type of photo whatsoever. And once I explain to them how the hashing system works, and that it’s based on “known CSAM” that has been determined to be CSAM by, you know, law enforcement experts, not just some person sitting in a cubicle somewhere, I find that the average person I speak to, with that new understanding, is actually in favor of the technology.
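For readers unfamiliar with hash matching, here’s a minimal sketch in Python of the “known CSAM” lookup principle Austin describes. The function name and the example digest are hypothetical, and a plain cryptographic hash (SHA-256) is used purely for simplicity; Apple’s announced design actually uses a perceptual hash it calls NeuralHash, wrapped in further cryptography, so that resized or re-encoded copies of a known image still match.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of images that have already been vetted
# as CSAM by trained reviewers (e.g. via NCMEC). Matching is simply
# membership in this set; no other photo is classified or "understood".
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_list(image_path: Path) -> bool:
    """Hash the file's bytes and test membership in the vetted set.

    An image that is not already on the vetted list can never produce
    a match, no matter what it depicts.
    """
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_CSAM_HASHES
```

The point of the sketch is the one Austin is making: the system is a lookup against a list of previously identified material, not a real-time scan that interprets every photo on the phone.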
Neil Fairbrother
So it’s a question of better communications from Apple?
Austin Berrier, HSI
Yes. I think law enforcement and child protection NGOs and organizations and stakeholders are definitely losing the battle in messaging. Law enforcement, obviously, we’re supposed to remain above the fray, so to speak; it’s not for us to judge what companies do or do not do. And that’s where, you know, victim rights advocacy groups and NGOs can come in and perhaps better explain to the general public, legislators and privacy advocates how it actually works.
Neil Fairbrother
Okay. Well one way that it works is that Apple say they will manually inspect images flagged up as having an NCMEC hash match. They will report this to NCMEC, as they are then bound to do by law having discovered it on their platform. But then they also say that they will shut down the offending iCloud account, which seems to me to be preempting law enforcement. If Apple have shut down the iCloud account at the same time as reporting the offending account to NCMEC, does this end up tipping off the offending account holder, and would that hamper your law enforcement investigation?
Austin Berrier, HSI
I think it does, and it would. There are mechanisms in place here in the United States that allow us to, I’m not going to necessarily say prevent the shutting down of the accounts, but at least preserve the evidence. I can understand the quandary a private organization or corporation is in when they find contraband in their possession, so to speak, on their servers. It becomes a question for them, I guess, of at what point is there liability, or do they accept the liability of that? And that’s a hard, hard question to ask and to answer. I think it’s pretty complicated.
Now the shutting down, yes, it obviously can tip them off. I know most companies have a procedure in place where this could perhaps be delayed and evidence can be maintained, so it does not affect the overall investigation, but it does require an open dialogue between law enforcement and the companies, right? There has to be some kind of cooperation and some assurances given.
I always equate CSAM to a more physical contraband, like hard narcotics. If we had asked a corporation or a business or a person, “Hold on, I’ll be on the way, I’ll come pick up that cocaine, just hold it for me for a couple of days until I can get there”, you know, that person may feel that seems a little dodgy, right? So they might want some kind of, we call it a feel-good letter, some kind of piece of paper that they can wave if they stumble into some other law enforcement entity by accident. So it is a fine line that we have to work, but there are processes in place and there are checks and balances that enable that to happen.
Neil Fairbrother
Okay. Now you’ve referred to a different type of offline crime scene there, in referring to some form of drug crime, narcotics of some sort. One feature of Apple’s proposal is called Threshold Secret Sharing, and it won’t be triggered until an account has 30 or more matched CSAM images. Is that the equivalent of saying, in a different type of criminal investigation, that the investigation won’t begin unless you have at least 30 fingerprints or 30 samples of DNA to work with?
Austin Berrier, HSI
I think that’s a very good analogy. I think when a private corporation makes the determination whether or not they’re going to report the crime, they’re taking away that prosecutorial discretion from the law enforcement entity, the prosecutorial entity. It’s up to the prosecutors to make the determination of whether there’s a certain threshold.
I can imagine that in certain places in the US and, you know, under municipal laws, or perhaps in the UK, there may be some threshold that has to be met, you know, for certain types of investigations or charges, but that should be up to, again, the prosecutorial agency. I liken it to this: if Apple were to discover 29 kilos of fentanyl or heroin, or some other hard narcotic, in their locker room or in their cafeteria on their campus, do they just choose not to report that because it’s not 30? I think again, that should be left up to law enforcement. It’s our job to determine if charges should be pressed and levied, it’s not the private companies’.
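As an aside on the mechanics for interested readers: in Apple’s published design, each on-device match uploads a “safety voucher” containing one share of a per-account decryption key, and the server can only reconstruct that key, and so review the flagged images, once it holds at least 30 shares. What follows is a minimal sketch in Python of the textbook Shamir secret sharing that underlies this kind of threshold scheme; the names are illustrative, and Apple’s real protocol layers further cryptography (private set intersection, synthetic vouchers) on top.

```python
import random

PRIME = 2**127 - 1   # a Mersenne prime; all arithmetic is in the field mod PRIME
THRESHOLD = 30       # the match threshold Apple stated

def make_shares(secret: int, n_shares: int, t: int = THRESHOLD):
    """Split `secret` so any t shares reconstruct it and fewer reveal
    nothing: a random degree t-1 polynomial with f(0) = secret,
    evaluated at x = 1..n_shares."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares) -> int:
    """Recover f(0) from t shares by Lagrange interpolation at x = 0."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Each hypothetical match contributes one share; only once 30 have
# accumulated can the key be recovered and the vouchers opened.
key = random.randrange(PRIME)
shares = make_shares(key, n_shares=40)
assert reconstruct(shares[:THRESHOLD]) == key       # 30 shares: recoverable
assert reconstruct(shares[:THRESHOLD - 1]) != key   # 29 shares: effectively random
```

With fewer than 30 shares the polynomial is underdetermined, so the server mathematically learns nothing about the key; it is the cryptography, not policy alone, that enforces the threshold Neil describes.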
Neil Fairbrother
Okay. I should point out that Apple aren’t the only ones that shut these accounts down; I think WhatsApp say that they also shut down accounts once they’ve discovered this kind of content on their servers. Anyway, in 2020, Homeland Security Investigations published a paper called “Live Stream Child Sex Abuse”. What was the purpose of this document, Austin?
Austin Berrier, HSI
Well, this is, you know, this is not a new phenomenon. I think, unfortunately, the pandemic, you know, 2020, kind of brought live streaming technologies into everybody’s frontal consciousness, right? It’s something that the kids are having to use every day in school. People were having to work from home. People were teleconferencing like we are now, video conferencing. But it’s technology that has been around quite some time, and if we think about how long Skype has been around, that’s nothing new, right?
But how pervasive live streaming has become in the lives of children, especially with the live streaming tools moving from a laptop or a desktop version, let’s say like Skype or the robust business platforms like WebEx or Zoom, into these mobile applications, you know, TikTok, Periscope, Bigo, there are dozens of live streaming apps that are really… they’re live streaming but they’re also social media.
The purpose is to influence people. The purpose is to generate positive feedback for children or for the user. And the purpose of that document was to try to educate law enforcement in general that, while this is not a new crime type, it’s becoming more pervasive and more common, and we need to come up with some techniques to investigate it. We need to understand how the technology works, what some of the problems with the technology are, investigatively and forensically, and then some of the ways that we can actively combat it.
Now, obviously I can’t go into great detail on that, but the purpose is an educational document for law enforcement. We’re moving away from the day when offenders maybe just went on the internet and downloaded content from, you know, perhaps a peer-to-peer platform or some other type of platform, to a point where they can actively engage one-on-one or in groups with children in real time by using technology.
Neil Fairbrother
Okay. Now the document has introduced another acronym, it’s Livestream Child Sex Abuse, which is LCSA. But is having another acronym really helpful? Can we not standardize on CSAM, which seems to be the preferred acronym?
Austin Berrier, HSI
I don’t think LCSA is supposed to supplant or replace CSAM. CSAM is like the general, broad, you know, acronym that we’re using for all types. Unfortunately, law enforcement and government love acronyms, right? And frankly, some of it comes down to when we’re writing reports and warrants and affidavits and other court documents where we have to describe something, we have to explain it for the reader. You know, we have to write for somebody who doesn’t understand our job oftentimes. So we have to come up with descriptive names, and then after that you come up with the abbreviation so you’re not typing 35 letters. So it’s more of a tool for law enforcement, I think, that acronym. I just refer to it as “live streaming”, and I think most investigators that I work with, we just call it live streaming, just to differentiate or to drill down on a type of technology, right? It’s not post-production CSAM, it’s real-time.
Neil Fairbrother
Yeah. And I guess an additional benefit is it does sharpen the mind somewhat, it does bring some focus to it, because there are some specifics about this which we might get onto later. But you have identified, I think, three different types of LCSA. You’ve got Victim Self-Produced, Peer Streaming and Financially Motivated, which may be somewhat self-explanatory, but perhaps you could go into a little bit more detail about each one. What is Victim Self-Produced, first of all?
Austin Berrier, HSI
Right. And so that’s very self-descriptive. This is when the child victim is physically separated, oftentimes alone in a room, their bedroom, their bathroom, somewhere, you know, in their home or some location, but they’re physically separated from the offender who’s perhaps grooming them. So an example would be, your victim could be in their bathroom somewhere in the UK, the offender could be somewhere here in the United States, and they’re using that technology to groom the victim from a distance, which is common. But then there’s that actual one-on-one contact, so instead of having to meet in person and offend on that child physically, by touching them, they can offend on that child in real time.
So the children are often directed to produce certain types of content and the offenders use the same standard grooming techniques that we have seen for years. They use fear, you know, force, fraud, coercion. Perhaps they build up some type of “loving relationship” with the child where the child feels that they perhaps have a healthy relationship with this person.
That’s the self-produced. There’s nobody actually in the room; the child, unfortunately, is the one hitting the record button oftentimes, and that presents some difficulties. Sometimes we have to explain to people that the child’s not doing it willingly, right? They’ve been manipulated into producing that CSAM.
Neil Fairbrother
Do the perpetrators of that kind of crime think that this is a way to claim that they are not responsible for what the child is doing?
Austin Berrier, HSI
I’ve certainly seen offenders trying to minimize by saying, “Well, at least I wasn’t touching the child”, or “At least I wasn’t having physical contact”. I imagine at some point that, you know, whether they believe it truthfully or not, that is probably one of the ways that they minimize it by saying, “Look, I wasn’t the one who did it, this child willingly hit the record button”.
Now in the US and in most countries, there’s an Age of Consent, and the purpose of that is that under that Age of Consent, a child cannot legally make that decision, right? They’re not developed enough mentally or emotionally or whatever to make that decision. Unfortunately though, we often have to explain to, let’s say, a jury, or just people in general who don’t understand, that even though no one held a gun to that child’s head and forced them to push the button, they were absolutely tricked into doing it.
Neil Fairbrother
And Peer-Streaming is what?
Austin Berrier, HSI
That’s where the offenders will use a platform, it can be either a traditional social media style one like TikTok or Periscope, or a more robust business platform like WebEx or Zoom, and you have offenders streaming content to each other. Now it can be prerecorded, post-production stuff, CSAM images or videos that have been around for many years, for example, or they can actually use that technology to offend: an offender could have access to a child, they could offend on that child on camera and share it with other offenders. Or there’s even a crowdsourcing-type platform or methodology where, again, you have multiple offenders working in concert to offend on a child.
Neil Fairbrother
And the third one was Financially Motivated?
Austin Berrier, HSI
Yes. And that’s what we see most commonly in Southeast Asia. That’s probably the oldest style of live streaming child abuse that’s been around. That’s where, unfortunately, in countries like the Philippines or Thailand or Cambodia, many people are barely surviving, you know, living at a subsistence level, hand to mouth, day to day. And unfortunately, oftentimes in those countries, children are seen as a commodity or a way to make money. And what we have are victimizers in those countries contact offending on children, and then live streaming it to a customer here in the US or the UK who’s paying for it. And oftentimes it’s absurdly small amounts of money, $5, $10. I mean, an offender in an English-speaking country can spend $5 or $10, or £5 or £10, and watch a child be brutally raped in the Philippines, and that’s unfortunately been going on for quite some time.
Neil Fairbrother
Presumably the financial industry, the online payments industry and credit card companies could have a role to play here, could they not?
Austin Berrier, HSI
Yes. And actually, they do a pretty good job at it. You know, in the US the financial industry is heavily regulated, initially due to the surge in narcotics in the eighties and nineties, but that of course has expanded to all types of criminal behaviour. The credit card companies and a lot of the financial institutions in the US have figured out, or they’ve determined, you know, patterns to this type of activity, source countries, payment patterns and stuff like that. And they have done a very good job of working with law enforcement, either bringing that information to our notice or, again, if we have an investigation going on, being very responsive to legal process, warrants and subpoenas and production orders.
Neil Fairbrother
Okay. One of the counterintuitive findings in the report, for me anyway, is that the offenders prefer to use the clear web as opposed to the dark web for promulgating CSAM. Could you outline what the difference is between the clear web and the dark web, and shed some light on why the clear web is preferred for this type of crime?
Austin Berrier, HSI
I think, you know, there’s a lot of misconceptions about the dark web and the clear web. On the dark web, you have to use some type of special browser, special software, the Tor browser, Freenet; these are areas of the internet that you can’t access without those specific tools. Whereas obviously the clear net is the open net. Sites there are indexed by a search engine, or maybe they’re behind password protection or a paywall so you can’t get to them with a search engine, but you can still get to them, you don’t need special tools, right? And I think the biggest reason that we’re seeing the use of the clear net, especially when it comes to live streaming, is the offenders are going to go where the children are.
I have children, and none of my children, as far as I know, hopefully, are on the dark web using Tor or some other tool like that. You know, kids nowadays are on mobile devices using these social media apps that just don’t lend themselves to the dark web. And if I’m an offender who wants access to kids, I need to go where they are.
Now we definitely know that there’s children being abused and their content is being traded on the dark web. That’s absolutely happening. I think for some of this, a lot of offenders just aren’t as technologically savvy, they don’t have the time or the patience to do that.
And again, believe it or not, I think there’s kind of a social aspect to this. While offenders definitely operate with a sense of safety and security, trying to avoid law enforcement and arrest, I think there’s a certain human need to find like-minded individuals and congregate with them and interact with them. And there are definitely offenders, you know, pedophiles, people that have a sexual interest in children, who I think seek out others not just to trade material, but for validation. It is a human need, right, to feel validated, and offenders have it as well, I believe.
Neil Fairbrother
You’ve mentioned some of the names of livestreaming companies which are not surprising, but also some that are surprising. So Facebook Messenger, Instagram, WhatsApp, TikTok and the like, but also you’ve mentioned some very corporate names, WebEx, for example, Zoom, GoToMeeting, which you would not normally associate with this kind of activity at all. Do all developers of these kinds of live streaming, privacy-enabled, encrypted apps have a legal duty to ensure that this activity doesn’t take place through their platforms?
Austin Berrier, HSI
I think the legal duty is to report the activity if they come across it. I think it’s very difficult for any kind of software developer to make a platform or a tool or a space on the internet that is impervious to this. Offenders, criminals are very savvy. They will find a way to exploit a tool if it suits their needs. I think the legal duty comes in where that developer finds out that their product is being misused for criminal purposes and they have to do their best, they need to use best practices to try to deter and detect that.
And I can’t go into that too much, but there are companies that I’m aware of that we work with that are fantastic at that. They study known criminal activity on their platforms and then they develop tools or techniques to detect and deter it and make their platforms less hospitable for the offenders. And for them, I think that’s a good business sense or a good business decision. Nobody wants to be known as the platform that supports the exploitation of children. And if a company can stand up and say, “look, here’s all the things that we’re doing, we’re trying our hardest, we’re not perfect”, you know, for them, I think that’s fantastic public relations.
Neil Fairbrother
Okay. Now we’ve talked about different aspects of child abuse through these encrypted streaming platforms. We’ve talked about grooming, sexual extortion and crowdsourcing, but I’d just like to drill into those a little bit more. In the report that we’re talking about, there’s a short sentence that says “…offenders may fake personas or frame LCSA to victims as a game”. So what we’re talking about here is the “gamification” of grooming. How does that work?
Austin Berrier, HSI
When we say gamification, we want to make sure we don’t confuse people into thinking that this is happening on a PlayStation or something like that. But by gamification, think about the average child when they’re young, right? I mean, there’s a famous psychologist, Piaget, who has his stages of childhood development, and in the sense that, you know, when children are very young, in the first and second phases of their mental development, they don’t understand the ramifications and the long-term repercussions of their actions.
If there’s anyone listening who has children, think about when your kids are little, 3, 4, 5 years old; they jump out of the tub or the shower, they might run around the house naked, they think it’s funny. And they don’t quite understand that it’s perhaps inappropriate behaviour.
So with the younger victims [Austin said “offenders” in the podcast but it’s clear he meant victims], that’s the gamification of it, some of what we’re seeing. To them, it’s just innocent play. They don’t understand yet at that early age that their body is something to be respected, protected, kept from common view. And the offenders will use that. They’ll play on that.
As children get older and start becoming self-aware sexually or curious about their sexuality or just somebody else’s sexuality or sexuality in general, again, the offenders use that natural curiosity. There’s not a whole lot of fear needed, or there’s not a lot of force, kids are just naturally curious and they’re innocent and that’s what the offenders do. They don’t make it this taboo thing: “Oh, don’t tell your mom, don’t tell your dad”. They just normalize that behaviour.
Again, I’m sure there’s probably listeners who’ve at some point, you know, been in the shower as an adult when their child at three or four happens to walk into the bathroom. And as parents, we don’t freak out and say, oh my gosh, you can’t do that. It’s just normal, kind of accidental behaviour. And that’s what the offenders use. They use that.
Neil Fairbrother
The feature of child sexual abuse on online streaming platforms that I found most intriguing and really quite frightening was Crowdsourcing and the three different roles that members of a Crowdsourcing child sexual exploitation team have: Hunters, Talkers, and Loopers. Could you go into some detail there? What does a Hunter do? What does a Talker do? And what does a Looper do?
Austin Berrier, HSI
Some of these roles, of course, can be mixed and matched. Somebody might have multiple roles, they can rotate, and stuff like that. But for example, a Hunter: their job is to go out and find children at risk. One of the ways they can do that is on, let’s say, live streaming platforms. They may be out on, pick your platform, Bigo, Periscope, Livemeet, and they’re just looking for kids that are engaging in at-risk behaviour.
And what’s at-risk behaviour? If you have some young girls that are maybe doing like their cheerleading or dance routine and their mother or father are also in the frame and they’re in the family room and everybody can see what they’re doing, perhaps that’s not at risk because there’s adult supervision. But if you take those same young girls and you have them in their bedroom behind closed doors, and maybe they’re doing that routine in their underwear, now that’s a child that’s at risk, right? They’re doing something that perhaps their parents wouldn’t approve of and they’re willing to take risks.
So that’s the Hunter’s role, to go out and find those kids. It sounds horrible, but it’s like they’re harvesting, right? They’re harvesting children that are ripe for the picking. And that sounds terrible, but that’s the truth.
The Talkers are the chatterers. That’s what they do. They’re glib, right? The silver-tongued devil. It’s their job to figure out the in with that kid. And in my experience, you catch more flies with honey than vinegar, right? So if they can befriend that child, or be kind to that child, perhaps they are the ear that that child can share their secrets with, they can be the shoulder to cry on. They can be that understanding friend, or perhaps they build friendship through video games or dance routines or music, but they build a rapport with that child victim. And that way, when that exploitative behaviour is requested or comes up in conversation… you know, we do things for our friends that we wouldn’t do for strangers, and at this point, that child may feel that the person they have been talking to for days, weeks or months is a friend.
And then finally the Loopers. Their job is to desensitize children. So one of the things they’ll do is, while they’re talking to the kids, or while someone else is talking to the children, they may stream adult pornography. They may stream even something that’s not pornography, just suggestive, some type of erotica, you know, movies that, while legal, just have sexual innuendo. And then eventually they may start streaming child sexual abuse material to show the children: “Look, hey, you know, other kids are doing this, you’re not any different from anybody else, everybody’s doing it”, you know, like a mild form of peer pressure. So those jobs, again, will be interwoven with each other. One person can do all of them.
Neil Fairbrother
That’s really quite distressing, I think, the idea that there are gangs of these predators working together with such discrete functions to ensnare children. I think we’ve got a pretty good idea of the general picture of this kind of illegal activity. As law enforcement, I think you were involved in Project Mercury, and there may be various operational aspects of this we can’t get into, but what was, or indeed is, Project Mercury?
Austin Berrier, HSI
So it’s completed. It was a three-year investigation into the livestreaming abuse of children. It covered both real-time, live abuse, meaning actual abuse happening in real time that the investigators were witnessing, as well as use of the tool, the platform Zoom, to share post-production or existing content. So you had offenders that were using the tool to stream, you know, images or videos that had been identified by NCMEC or another organization, known images. And then you had individuals who had access to children and were offending on the children themselves. There were a few instances where we had self-produced content, but it was mostly peer-streamed, where there was an offender and the child in the room together.
I can’t commend Zoom enough. They were excellent partners in this investigation. They were an excellent example of how privacy and protection can co-exist. They were an excellent example of how they can work with law enforcement, and also still meet their obligations to their shareholders and to their customers.
That’s a real high-level overview, but what happened was we spent three years working the platform Zoom. This was from 2015 to 2018, by the way, so long before Zoom was on everyone’s radar because of the pandemic, but the offenders were already using it six, seven years ago.
In general, what was happening was individuals who had access to children were offending, they were sexually abusing, you know, the kids that they had access to, and they were sharing it with individuals all around the world, with groups of 40, 50, up to 200 individuals at a time logged in. And not only were they watching the live stream, they were oftentimes directing it through comments, telling the offender what to do, which was horrifying.
We ended up globally indicting, I want to say, 250 individuals; there are still some stragglers being indicted, so we’re probably closer to 300 by now. And I want to say in total around 90 children were safeguarded over the course of those three years. I don’t like the term rescue, it just makes me feel weird, but there were about 25 to 30 kids that we were able to identify as being actual contact victims and get out of the situations they were in, and another 50 or 60 or so that were at risk: somebody who had access to them on a regular basis was streaming or viewing, or had interactions with, copious amounts of CSAM, and we had developed intelligence that these kids were at risk. They were up next, so to speak; they were prevents. And frankly, as a law enforcement officer, I would rather prevent a child being abused than identify one, right? And that’s a whole other topic, but when we identify a victim, it unfortunately means they had to have been victimized, right? I think prevention is a far more victim-centric approach.
Neil Fairbrother
I guess the word “rescued” implies it’s all over, but of course it’s not for the victim, because the impact lasts a lifetime.
Austin Berrier, HSI
Yeah. Yeah. But it was a fantastic investigation. I mean, we had US, Canadian, the NCA, lots of municipal police services in the UK, Australian, New Zealand, Italian, Belgian, Dutch and Danish law enforcement, all involved in identifying child victims and getting offenders incarcerated. We had clergy, we had, oh my gosh, we had university professors. We had medical personnel, we had primary and secondary school teachers and principals. We had nurses and doctors. We had regular people too; you know, there was one individual, I believe he was in Wales, whose family had an inn. There was one individual there in the UK who was a defrocked Catholic priest and a previously convicted sex offender, and who after his release from prison had come back and was doing it again. So there was a wide range of people.
Neil Fairbrother
This is clearly an international crime, and your report into Project Mercury says that because it’s international there are clearly different law enforcement jurisdictions. And I think in the UK we have something called “reasonable suspicion” for warrants, and we don’t need to provide what is known in the US, I think, as “probable cause”. Now you’ve mentioned a raft of countries. How did you manage to coordinate the law enforcement activities across the different law enforcement regimes in all of those countries? Did that present a problem, or was this an opportunity?
Austin Berrier, HSI
It was a problem that turned into a fantastic opportunity. We recognized that the English-speaking cousins, so to speak, were the primary investigators in this operation. And what we did early on is we had a meeting where we had investigators, agents and police from a number of different countries, the UK, Canada and the US in person, and then our allies, you know, the New Zealanders and Australians, virtually. And we had prosecutors: United States Attorneys from the United States, and Crown Prosecutors from, again, the UK and Canada, who specialize in CSAM investigations. And we all met in Toronto early, early on. And that was probably what made the investigation so successful, as we determined what standards needed to be met to make everyone happy.
It wasn’t until that point that I had actually learned that in the UK “reasonable suspicion” was the standard, and that kind of difference makes it difficult when a UK law enforcement officer sends information over to the United States thinking there’s plenty of information here to act on, and here in the US we’re thinking, I can’t really do anything with this. And unfortunately, bureaucracy gets in the way and the two working cops never talk, right?
So that was one of the things that we did: we figured out ways to, again… if the UK is doing things and it’s going to go to the US, you’ve got to meet our standards. For example, in the US, unfortunately, the legal term is still “child pornography”, and that’s being worked on, and I have to use that term in US legal documentation. But if I’m going to draft reports or send information to our UK partners, say the NCA or the Met or Merseyside, I have to draft it in a way that would work in UK courts.
And we came up with a game plan and it was very, very effective. We, and the investigators, had the ability to contact each other instantaneously at a moment’s notice without having to go through, you know, the Home Office, the State Department, and all the other bureaucrats.
Neil Fairbrother
You mentioned a lot of different predator types a few moments ago. One of the individual successes I think you particularly were involved in with respect to Project Mercury involved a character who went by at least the online name of Augusta Byers. Tell us about the Augusta Byers case, if you can.
Austin Berrier, HSI
There’s so much to this one, and, you know, the listeners could actually just Google his name at some point if they want the three-hour talk: William Augusta Byers. So he was actually identified by our good friends with the Toronto Police Service. A very dear friend of mine, a Detective Constable, was logged in and stumbled into the abuse, in real time, of a small six-year-old victim. And they reached out to me here in the US because we had been working together.
This was actually the genesis of Project Mercury, this is what kicked it off. We, law enforcement, had been dabbling with the live streaming issue on Zoom, and we were still trying to figure things out and come up with best practices. This Constable called me and advised me that this was going on; we really didn’t have a process in place yet, we were still trying to figure it out.
We were able to reach out to Zoom, who frankly had never really been contacted by law enforcement before. And at that time, they weren’t the juggernaut that they are now, they were a small 20-person shop in San Jose. And within probably 90 minutes, we got to the person we needed, somebody who could make things happen. And they got us the information that we needed, through legal process, right. And then we tracked down that IP and worked with the ISPs, and we figured out that this poor victim, this poor young child, was in a rural area of central Pennsylvania on the east coast of the US.
And by then it was, I don’t know, probably 10 o’clock at night, and our local office there was only maybe four or five people, and they worked with local law enforcement there, the police services, and were able to obtain a search warrant, get in, and identify that child within 14 hours from the time that the Constable in Toronto saw it to the time that that child was removed from that situation.
And when you think that, at that time, we didn’t even have a process in place for this company. It was something new that we had to make up as we went along, so to speak. You know, I tell people we went from Toronto to Phoenix, Phoenix to San Jose, San Jose to Atlanta, Atlanta to rural Pennsylvania in 14 hours. And that was amazing.
But when we got in there, we realized that the suspect was sharing this information, or, sorry, live streaming this abuse, to about 50 people globally: the US, Canada, the UK, Germany, all around the world. And we spent about six months identifying those other individuals, and eventually, at least here in the US, we ended up charging 16 individuals in total.
And that was, again, the genesis, the beginning of Project Mercury, where we realized what a problem it was. And as for the sentences, we in the US have far stiffer sentences; you know, individuals are taking plea deals for 35, 40 years for, in effect, virtually raping a six-year-old from the comfort of their home in San Diego, or Miami, Florida, or Chicago, Illinois.
Neil Fairbrother
What lessons have you taken, Austin, from Project Mercury?
Austin Berrier, HSI
You know, it’s almost overused, unfortunately, but it’s a hundred percent true that it takes a network to defeat a network. Talk to any CSAM investigator, the NCA or Taskforce Argos in Australia, or the RCMP or HSI or the Bureau; while it sounds trite, it’s absolutely vital and accurate.
There’s no “open air” CSAM market. What I mean by that is that a detective in Pittsburgh, Pennsylvania, or an HSI agent in Arizona, or an NCA or Met Detective, can’t just drive down to their local corner and investigate CSAM. It’s not like narcotics or gun crimes or crimes of violence.
We find a victim or a crime before we know the actual location or jurisdiction or bailiwick. So I sit at my desk in Phoenix and find the crime, and then once I figure out where it is, I have to have that network to reach out to, whether it’s our colleagues in Europe or Asia or South America. And that’s that network. That’s what I learned.
It’s that it’s law enforcement’s job to worry about all kids, not just the kids in their postcode, right? If we only investigated the CSAM violators in our postcode, it wouldn’t work, because I can’t just go out and draw a circle around Phoenix on the internet and hope to find offenders there. So we have to work together. And that was probably the biggest eye-opener for me with Project Mercury: working together, we have different capabilities, different skill sets, different techniques that we can all utilize.
Neil Fairbrother
And for the companies that are providing these encrypted, private live streaming services, what would be your recommendations to them to stop this from happening?
Austin Berrier, HSI
I think first and foremost they need to understand, and the public should understand too, that when law enforcement is attempting to gain access to internet content or activity, it’s a robust process that we have to go through. We have to have either probable cause, or reasonable suspicion in the UK. There is a different branch of the government, a magistrate, you know, the magisterial branch or the judicial branch, that’s involved. We have to articulate and document and have some evidence upfront. We can’t just do it willy-nilly. Unfortunately, television makes it look like we can, and that’s not accurate. So I think first and foremost, the companies and the public need to understand that.
Next I think if industry understands that, we’re not asking for a back door and for us to keep the keys to the back door, that’s not what we’re asking. We’re asking, please build in a back door, you guys keep the keys, and when we show up with proper permission from a judge or magistrate only give us the key to that specific thing that we’re looking for. That’s all we’re asking.
We’re not asking for some unfettered, unrestricted access where we can read everybody’s everything. I don’t want that. I don’t want to be on the receiving end of that either. And I don’t think most law enforcement officials do, but that would be the first thing.
And the next thing, I liken it to this. You know, oftentimes in their terms of use, the tech industry will have things like, you know, “contraband” or “illegal activity”, which is kind of broad. And I understand that: if you make it too narrow, something new will come up tomorrow that’s technically not in the terms of use. But contraband is a great term, as is illegal activity.
But what I find interesting is that some of the tech industry will say, “Well, I can’t control what happens in our space”. And I find that difficult to believe, simply because, you know, if I am driving in my car and I drive by a restaurant, the next time I log into social media I can have an advertisement for that restaurant, one that I didn’t even go to, I just happened to drive by. If they can do that in my social media, they can certainly identify illegal activity.
I had somebody once explain it to me like this: if I purchase a car, and since we’re talking to the UK, say I purchase a Vauxhall from Vauxhall, and I take the car home and use that car to break the law, you know, Vauxhall does not have any care, control, custody or concern in that vehicle anymore, because I own it. But if I took that car and committed a crime on Vauxhall’s property, say there in the showroom or in the parking lot, then even though I own the car, I’m on their property, and they would have some obligation to report that.
Now take that over into the digital world. Yes, I create an account, it’s my account on whatever platform I’m on, but whose backbone, whose environment, whose space am I living in? If I create a Facebook account, it’s my Facebook, but I don’t leave the Facebook environment; I’m still within the environment that Facebook controls. I haven’t left their property, so to speak. If I’m on Facebook, the app, committing a crime, I’m still on their property, it’s just their digital property. To me it would be the same as if I were on Facebook’s actual physical property committing a crime. And that’s how I try to explain it to people.
Neil Fairbrother
Austin, thank you. I think we’re going to have to leave it there, we’re out of time. That’s a fascinating, if somewhat hair-raising, insight into the world of the criminal who rapes children online, and thank you for providing us with an insight into how law enforcement deals with this matter, particularly from the international aspect.
Austin Berrier, HSI
Thank you.