Safeguarding Podcast – John Doe vs Twitter with Lisa Haba, Haba Law & Peter Gentala, NCOSE

In this Safeguarding Podcast with Lisa Haba, partner at the Haba Law firm, and Peter Gentala, Senior Legal Counsel for the National Centre on Sexual Exploitation (NCOSE) law centre, we discuss what recourse children have against social media companies who refuse to take down their intimate images. Victimised first by predators, then allegedly by Twitter, John Doe is taking legal action in a true David vs Goliath fight. Could it change the world?

There’s a transcript below, lightly edited for readability, for those who can’t use podcasts or who simply prefer to read.

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.

Neil Fairbrother

When he was just 13 years old, John Doe was contacted by someone on Snapchat with whom he began to exchange messages and to develop a relationship, which is not an unusual thing for a modern teen to do. The other person who John Doe was led to believe was a 15 year old girl, eventually asked him to send a nude picture of himself. As soon as he did that, the nature of their correspondence changed and John Doe found himself trapped in a world of abuse and exploitation that he never imagined. And three years later, things got even worse and quite surreal when Twitter got involved.

To explore what happened, or perhaps more accurately what didn’t happen, I’m joined today by Lisa Haba, partner at the Haba Law firm, and Peter Gentala, Senior Legal Counsel for the National Centre on Sexual Exploitation law centre.

Welcome both of you to the podcast. Lisa, could you provide us with a brief resumé so that our listeners from around the world have an appreciation of your background and Peter, perhaps you could do the same after Lisa?

Lisa Haba, Haba Law firm

Sure, thank you so much for having us on the podcast. I am Lisa Haba, I’m a partner at the Haba Law firm. Prior to being at the Haba Law firm, I was a State Prosecutor in the State of Florida for approximately eight years, and during my time as a prosecutor I primarily prosecuted human trafficking, sex crimes and crimes against children, along with a variety of other, you know, horrible crimes that happen to residents of Florida.

Since going into private practice the focal point of our law firm has been to help survivors and victims of both human trafficking and sexual abuse put together both the civil aspects of their cases, and then other necessary and needed assistance in legal help as it comes about.

Peter Gentala, NCOSE law centre

Thank you, Neil. It’s good to be with you. My name is Peter Gentala, I’m the Senior Legal Counsel with the National Centre on Sexual Exploitation (NCOSE). And within the Centre, we have a law centre, which is a legal team that focuses on the civil court system and helping survivors and victims of sexual exploitation.

I have a background in American constitutional law and the federal court system in particular. I also have a background in state public policy. For seven years I was the General Counsel of the Arizona House of Representatives and my policy portfolio included all the matters that came before the judiciary committee, which include criminal justice issues and child safety issues.

I also worked at a national nonprofit in the United States called Childhelp, which runs child advocacy centres focused on the forensic investigation of criminal child abuse, often criminal sexual child abuse. This is the way in the United States that law enforcement teams up with healthcare and behavioural health to provide a therapeutic environment for the forensic side of investigations, which must take place, but does so in a compassionate way that is sensitive to child victims. And with NCOSE I specialize and focus on the various ways that sexual exploitation takes place online.

Neil Fairbrother

Okay, thank you both for that. Now, you’re both acting, I believe, as at least legal advisors on behalf of a child who we call John Doe in a lawsuit against Twitter. But before we start drilling into that particular action against Twitter, I’d like to just very quickly, if we can, cover the background to the case, because it didn’t actually start on Twitter. It started some years before the Twitter incident took place. It actually started on Snapchat. So what took place on Snapchat? What’s the origin of this story?

Peter Gentala, NCOSE law centre

Well, Neil, your introduction was right. This exchange that began between our client who was a young man and someone else he thought was his age, did occur on Snapchat. And he essentially thought he was talking with a peer, someone that if he didn’t know them directly would have been in his own community. And he was just 13 years old at this time.

So at the age of 13 he didn’t have an appreciation for the dangers that are there through online correspondence, and at some point during this conversation he was asked to exchange nude pictures. He would provide a picture of himself and in return he would receive pictures of who he thought he was speaking with on the other side. And everything changed once that exchange of nude photos took place. From that point on the conversation was extortive. He was subjected to blackmail, and whoever was on the other side, and there’s reason to believe that he was ensnared in a significant criminal conspiracy at this point, demanded many things of him, including more sexually explicit, more sexually graphic photographs that would also include other minors. And they also attempted to meet physically with him, to try to schedule an in-person meeting.

Thankfully, that meeting never took place, because John Doe could have been in a more dangerous situation if he ever met face-to-face with the people who were on the other side of that. But he was in a very, very difficult place. And at 13 years old, very few young people are equipped to handle that kind of a situation. He didn’t know what he could do, or who he could trust.

Part of the blackmailers’ or extorters’ or sex traffickers’ plan was to divide him from the people in his community that he should be able to trust, like his parents or the authorities at his school or his coach. And they were saying, if you don’t do what we tell you to do, we will provide the picture you’ve already taken, or pictures you’ve already taken, to those people. So he was in a difficult position.

Neil Fairbrother

I believe he managed to extricate himself from that position, and over time this threatening and extortive correspondence dried up, it finished?

Peter Gentala, NCOSE law centre

It’s true that after a while there was radio silence. He decided he wasn’t going to comply with any more demands, but by that time he had provided a few extra photographs to the traffickers that included sexually graphic images of himself and another minor who was also 13 years old at the time.

Eventually he decided he wasn’t going to comply with any more demands and after a few more threats from the traffickers, there just was no more communication. And he thought at this point in time that it was just a nightmarish episode that was brief and over with, and that he could move on with his life and everything would be normal. But sadly and tragically it boiled back up to the top through the involvement of Twitter about three years later.

Neil Fairbrother

Okay. Well, we’ll have a look at that in a moment, but before we get onto that, I would just like to very, very quickly, if we can just cover a couple of other points, one of which is the 1996 Communications Decency Act and in particular Section 230, because there is an impression given that this provides immunity to electronic service providers for the content that is posted by third parties on their services. Is that the case, do they actually have this immunity? What does it really mean?

Lisa Haba, Haba Law firm

Well, that’s a really complicated question and it’s going to be a complicated answer. But the short version is there was a statute put in place by Congress in 1996 that we know as the Communications Decency Act. And yeah, it’s a huge statute. A small piece of it is called Section 230, which talks about the “Good Samaritan” exception to the proliferation of content on the internet service providers’ websites.

So when you read the statute itself, the plain language of the statute is quite clear, it’s called the Communications Decency Act. If an internet provider is trying to do the decent thing, trying to do the right and lawful thing, and removes content that they believe to be offensive or harmful to let’s say a minor for example, or something of that genre, then they would have an immunity. They couldn’t be prosecuted.

So for example, if somebody put up pornography on the internet and the platform decided, no, we don’t want this on our platform, we’re gonna take it down and then they got sued for violating Freedom of Speech or something to that effect, they would potentially have an immunity under Section 230.

What our case law has over the years delved into, and candidly, we intend to dispute this, is kind of the opposite side of the same coin. What this has been interpreted to mean, and it started with one case at the beginning that has turned into a line of case law following this belief, is that it’s not about being decent. It’s not about protecting Good Samaritan actions as is codified in the statute. It has been turned into an exception that is so broad and so overreaching that now internet providers are acting with impunity, saying that they can do whatever they want at any time. There’s no recourse. There are no repercussions.

At this point, we’re literally dealing with a situation where we have alleged in our complaint, in quite some detail, that child pornography was posted and that Twitter refused to take it down, which is completely illegal content on their platform. And the response that they have provided to us is that they are immune under Section 230.

So now at this point in time, it seems ironic, and it seems to defy logic and common sense as well as Federal law, that if somebody did something face-to-face that would be seen as criminal, they could be liable civilly for that tortious conduct, but on the flip side, if they do it online, all of a sudden it doesn’t matter, because we have a statute immunizing all comers.

It doesn’t make sense given what the law stands for and what common sense stands for. And so our position is that Section 230 does not immunize Twitter from this. They are not immune under the laws they would like to claim they are, we seek to have them held to account, and they did violate the rights of our client.

Neil Fairbrother

Okay. Thank you, Lisa, for that. There were a couple of other laws that pertain to this case. Also the FOSTA SESTA Act. Could we have a quick two minute resumé of what the FOSTA SESTA Act is please?

Peter Gentala, NCOSE law centre

Well, Lisa’s explanation is a great setup for this. FOSTA SESTA is the one place where Congress has stepped back in and focused on Section 230 of the Communications Decency Act. And they did so specifically in the area of sex trafficking and human trafficking in general. What Congress was concerned about, in the dynamic that Lisa articulated where several Federal courts had been interpreting a very broad immunity, one might even say a breathtakingly broad immunity, for online providers, was that some of the cases that were getting thrown out of court summarily included trafficking cases.

Congress was of an overwhelming mindset that that is not at all what was intended by Section 230 and so they clarified for the Federal court system that nothing in Section 230 should be construed to prevent civil causes of action for survivors or victims of sex trafficking in the United States who could meet the elements of the Federal sex trafficking statute. So that’s what FOSTA SESTA is about.

That change to the law is only a few years old. While the Communications Decency Act has been with us since 1996 in the United States, FOSTA SESTA came in 2018. And with the way litigation works in the United States, we’re seeing the FOSTA SESTA cases work their way through the court system currently, and it’s going to take a while before the reform of FOSTA SESTA is fully felt throughout the United States.

But it is part of the legal framework of this case that we’re here talking about today because John Doe was the victim of sex trafficking in the United States. And Twitter knew about a trafficking venture that was taking place and still allowed child pornography to be disseminated on its website. And it profited from that dissemination. So those are the allegations in the complaint and they bring FOSTA SESTA squarely to bear. And it’s yet another way that Section 230 does not apply to this case.

Lisa Haba, Haba Law firm

I’d add one thing: it’s not just that it’s child pornography online, either. It was a depiction of the human trafficking acts that our client was victimized by. So it is child pornography, and it also is human trafficking, and a video of such.

Neil Fairbrother

Okay. And is there a definition of human trafficking, just so we can clear up that particular phrase, because I think it has a certain connotation to certain people which may not be correct.

Lisa Haba, Haba Law firm

Okay, there’s sex trafficking and labour trafficking. Sex trafficking is what we’re dealing with in this case, so I’ll focus on that for now. There are actually two ways under Federal law that you can be sex trafficked. One is that you have an individual who is essentially treating a human being as a commodity. So more or less they’re recruiting a person, they’re selling a person, they’re transporting a person; there’s a tremendous number of verbs that go into this. But the essence of it is they’re treating a human being as a commodity to be bought and sold in the marketplace. And if they’re doing that for the purpose of exploiting that person and causing that person to be a victim of commercial sexual activity, they’re exchanging sex for something of value. That could be money, or it could be something else that they find to be valuable, and that would be sex trafficking.

But the way that we’ve handled that in this case, and Twitter’s implication here, is actually through the other form of sex trafficking, which is: you might not be the person that actually committed the trafficking act in the respect I just described, but if you profited from that sex trafficking act, for example, if you’re a hotel and you know sex trafficking is happening in your rooms, or you should have known and the signs were all there, and you decided to just keep collecting a profit despite that knowledge and failed to act appropriately, then you are participating in a venture with sex traffickers and profiting from it.

And so in our case, we’ve alleged essentially that Twitter knew or should have known that there was sex trafficking on its platform. And instead of doing the socially responsible thing, instead of doing the decent thing and removing it, they decided to profit from it, and continued to profit from it until the Federal government got involved.

Neil Fairbrother

Okay, well, we’ll have a look at some of the numbers on this case in a short while and that’s a good segue to the Twitter case you’re bringing. Now there seems to be a timeline that starts on or around the 25th of December, Christmas day, 2019 when a concerned citizen raised an issue with Twitter, that two user accounts were posting CSAM, child sexual abuse material, which turns out to have been of the 13 year old John Doe, who by this time was 16 and in high school. Is that where it starts on Twitter?

Peter Gentala, NCOSE law centre

That’s the timeframe, and the point of that allegation in the complaint is that even before John Doe and his family reached out to Twitter in a state of deep concern about the images of John Doe and the other young person that were being distributed on the Twitter platform, Twitter already knew about the very user, or one of the very users, who was disseminating that type of image: there were direct reports to Twitter saying this user is disseminating child pornography, please take it down.

So, well before the family actually reached out in relation to John Doe, Twitter was already on notice that the user in question, or one of the users in question, was a bad actor using its platform to broadly disseminate child pornography.

Neil Fairbrother

Okay. And on learning of the resurgence or re-emergence of this material that John Doe had assumed was consigned to history, what impact has that had on him?

Peter Gentala, NCOSE law centre

Well, it was a drastic impact. And perhaps the best expression of it is the way John Doe’s mother found out that the materials were being circulated on Twitter. She found out because a friend called and said, are you aware that your son is talking about taking his own life? Up to that point she had no idea of the situation he was in.

And then she learned that these images of John Doe were circulating throughout the school community, and that many people knew about it. John Doe was enduring tremendous anguish, as you can imagine; he was not being treated well by some of his peers, and in his community he was subjected to ridicule and bullying. It was a very difficult situation.

But this is also where the story takes an encouraging turn, because the family rallied together. John Doe’s mother, and really the entire family, supported him, and they resolved together that they were going to find someone, anyone, who could help them. And so they began a very concerted, focused, committed campaign to take these images down. What it was, Neil, was a series of images that were spliced together and placed into a compilation video. So that’s what was being circulated on Twitter by at least two users. And so the family worked together to try to get Twitter to remove these images.

Neil Fairbrother

Okay. Now Jane Doe, John Doe’s mother as you rightly say, got involved, and both of them corresponded with Twitter on a number of occasions through this month-long period from the 25th of December 2019 through to the 30th of January 2020, and the replies that they received from Twitter read like simple, automated replies. They don’t appear to have been written by a caring, thoughtful person on the other end who was planning on taking action. They read like automated, scripted replies. Now, I don’t know whether they were or not. Were they? Or was someone actually behind those responses from Twitter?

Lisa Haba, Haba Law firm

Yeah. I don’t know that we can answer that question right now without going through Discovery. Obviously we’re going to have to seek the answers to those very questions you just asked. We have been asking as well, and we’re waiting to find out the answer.

Neil Fairbrother

Okay. That’s fair enough, but one of them does say “…if there’s a problem with a potential copyright infringement, please start a new report.” Why would they be talking about a copyright infringement issue if someone had actually bothered to look at the complaints raised and the material that was being complained about?

Lisa Haba, Haba Law firm

I feel like you were in some of our discussions; I’ve asked the same questions myself. You know, honestly, at this point, without knowing, I can’t delve into the minds of Twitter and their employees and what they did and didn’t intend at this time, but hopefully during the Discovery process we’ll get some answers to these very important questions.

Peter Gentala, NCOSE law centre

Neil, one thing to point out about the correspondence at that point, when copyright was mentioned, which is really quite stunning, is that Twitter had already by that point been provided with all of the evidence of John Doe’s age and of the nature of the images that were being disseminated and circulated on Twitter. Twitter’s emails back and forth basically asked for all the proof that John Doe is who he said he was, at the age he said he was. So by that point in time, however that process is designed, it had brought the information forward from the victim, and Twitter was in possession of everything that it needed to take action, and take action quickly. And sadly, that didn’t happen.

Neil Fairbrother

Okay. Now this incident was brought to closure with the involvement of Homeland Security, the US Homeland Security Investigations team I think. What happened there? What did they do? Why were they successful?

Peter Gentala, NCOSE law centre

Well, there were a series of conversations with law enforcement that the family had, both with local law enforcement and, eventually, the family was very fortunate to come into contact with a Federal Agent with the United States Department of Homeland Security. And that was a very important development for them, because that Agent immediately recognized the signs of sex trafficking, understood the gravity of the situation, and also had the ability, because of his office, to get Twitter’s attention and make Twitter eventually do the right thing.

And so the Agent reached out and specifically sought that the materials would be taken down and reported under Federal law as child pornography, which is the Federal obligation of any online provider in that situation. So that was a really important development, just getting to a point of relief for the family, that the images were taken down, but by that time significant harm had been done.

Through screen captures and things like that, we’ve been able to learn that at least partway through the period in which the family was begging Twitter to take these images down, the abuse video had been viewed over 167,000 times, and that’s only partway through the period. It was live even longer than that. So it gives you some idea of the scope of the broad dissemination of the material. And it had been retweeted at least 2,200 times.

So Neil I know you know what that means. That we’re talking about a very broad scale, and there are so many things we can’t measure that are included there, like user downloads, other screen captures, other uploads to other platforms that may have happened. The potential scope and scale of this very harmful material, harmful to both John Doe and the other young man who is depicted in those images, is really staggering. And it’s really tragic.

Neil Fairbrother

Yes. And in your case against Twitter, you explain how Twitter makes money from its content, how it monetizes content, and you’ve mentioned just now the scale of this particular instance. So in approximately 30 days this content had, as you say, over 167,000 views and 2,223 retweets, amongst other things. Do you have any idea at all of the scale of money that they may have made out of this?

Lisa Haba, Haba Law firm

I don’t know, at this point in time, if we can tell you how much they’ve made, but we certainly know the mechanism by which they’re making money. They’re making money in two different ways. One is their advertising, and the second way they’re making money is their data licensing. So every time you go on Twitter, as you go down a feed and you’re reading the different tweets and responses and commentary that goes back and forth on Twitter’s platform, there are advertisements laced throughout that.

Each of those advertisements is strategically placed by Twitter. Each of those advertisements is placed to be able to target the users. They can either be clicked on, creating a profit stream, be viewed creating a profit stream or lead the user to another website, which also would create a profit stream through the click per ad mechanism that exists. Some of these are targeted advertising based on what has been gathered and collected by Twitter about the user that’s viewing that content.

Others, I understand, may not be, but the point being they’re making tremendous money through the advertising mechanism, and this is a billion dollar company, right? So they’re making tremendous money through the advertising mechanism by taking the very tweets that we’re discussing in this case, which have advertisements throughout them. Every single person that clicks on those tweets, or reads those tweets, or views those tweets, although illegal, Twitter’s making a profit off of that. And so that’s the first way in which they’re monetizing the exploitation of John Doe.

The second one is the data licensing. Through data licensing, as we’ve spelled out in our complaint, they’re profiting, and they have obviously put in their own materials that there’s a tremendous amount of data that comes into Twitter, or leaves Twitter, in different ways throughout how you use the Twitter platform, which includes data they collect about the users. They can sell that data, and they can use that data to target more advertising, and more targeted advertising, towards each of the users, again monetizing the exploitation of John Doe.

So what we found very interesting, and it’s in our complaint, is that there was a tremendous amount of commentary on the very compilation video of John Doe. That compilation video had comments of people saying things like, “I think that’s a minor. I think they both are”. Comments that would alert anybody who’s reading and using that content that we’re talking about child pornography and the exploitation of children.

Yet instead of doing the socially responsible thing, instead of doing the decent thing as required under the statute, all they showed was complete indifference to the dignity of John Doe and to his freedom from exploitation online.

Neil Fairbrother

Okay. Let’s look at the Counts that you’ve brought against Twitter. And we could spend all day just on this one aspect, or probably even just on one Count, so if we could be as brief as we can, because we’ve got 11 of them to get through.

Count one is “Benefiting from a sex trafficking venture in violation of the Trafficking Victims Protection Reauthorization Act”. Now, is that a piece of Federal law or is it State law? And is the key point here that the defendant, i.e. Twitter, knowingly benefited financially from the sex trafficking and exploitation of John Doe, which resulted in his serious harm?

Peter Gentala, NCOSE law centre

Sure. Yes this is Federal law Neil. This is the Federal sex trafficking statute, and we’ve already discussed it in some detail. These are the components that Lisa broke down earlier of sex trafficking under Federal law and it’s also what Congress, through the FOSTA SESTA Act, specifically said is different from Section 230 and is not immunized or blocked by Section 230.

So this is the reason that it’s the first Count in the complaint, that the elements of it are clearly established by what is being alleged there and it shows that right up front, from the very get-go, this is not a Section 230 case.

Neil Fairbrother

Okay. So there’s no defense?

Peter Gentala, NCOSE law centre

Section 230, what it does is it creates an immunity at the very outset of litigation. And when it applies, it’s been applied to actually block the very beginning of the trial process, including Discovery. So where a court has decided that Section 230 applies, plaintiffs aren’t even entitled to go through the very initial stages of Discovery. So it’s about as complete a shield from liability and immunity as you can imagine. It’s not necessarily a defense, it’s used to choke lawsuits off before they really even begin.

Neil Fairbrother

Okay, I understand, thank you. So Count two then is a “Violation of Duty to Report Child Sexual Abuse Material”, and Twitter, I think, are obliged once they know to report this content to NCMEC?

Lisa Haba, Haba Law firm

Well, that’s the very allegation: they did not do that. It wasn’t until the Federal agents got involved that they suddenly said, oh, of course we’ll report it to NCMEC. But by then Federal agents were already involved, so it was a little too little, too late. It’s a Federal crime, and there’s Federal litigation in civil law that supports Count two, and we plan to proceed accordingly.

Neil Fairbrother

Okay. Count three is “Receipt and distribution of child pornography”. Now, the term “child pornography” isn’t really the preferred term these days; it’s generally referred to as child sexual abuse material. Notwithstanding that, the point here I think is that there is sufficient evidence that Twitter had been notified that they had this material in their possession, but they didn’t act. It had been brought to their attention clearly enough for anyone to realize that this content was in their possession. Is that correct?

Peter Gentala, NCOSE law centre

Yeah, that’s correct Neil. Keep in mind that they’re not just possessing it, but they’re actively distributing it. The Twitter platform is designed to broadcast things as loudly as possible through its system of tweets and retweets. And so the whole mix of 330 million users is at least conceptually involved with the dissemination that’s there.

And one word about the phrase “child pornography”. You’re absolutely correct that the preferred term is child sex abuse material, and Congress right now is looking at converting the federal legal system over to the use of that phrase. It hasn’t quite fully happened yet. So we’re using the terms as codified in the United States code right now, but eventually I think the United States will join the worldwide consensus that the best term is child sex abuse material, because it actually shows that each image is an actual crime against the child.

Neil Fairbrother

Okay. You also make the point under this particular Count that Twitter’s conduct through their apparent inaction was “…malicious, oppressive or in reckless disregard to John Doe’s rights” and indeed health?

Peter Gentala, NCOSE law centre

Yeah, absolutely. They asked a sensible question in their reporting process, which is, “Can you provide us evidence that the material depicted is of a minor and that you are who you say you are?”. And John Doe, as a child, was able to do that and demonstrate that he’s a child, and share with them that the other person depicted is a child. And once they’re equipped with that knowledge, which is what they need under the federal reporting statute, it’s what they need to do the right thing, then they proceed to do the wrong thing, which is to just say, we don’t think this image violates our policies. We’re going to stick with it. We’re going to keep it on our platform. And of course they continue to benefit from it as well. So that constitutes reckless disregard in our view. And that’s why it’s alleged there in the complaint.

Neil Fairbrother

Okay. Now Count 4 I found particularly interesting which is the California Products Liability Act. The reason I found it interesting was that you’re alleging that Twitter’s products aren’t effective because they do not perform as safely as an ordinary consumer would expect them to perform when used in an intended or reasonably foreseeable way, really in much the same way as an appliance or a car or a fridge or even a laptop would be expected to work. Is that correct?

Lisa Haba, Haba Law firm

I think what we’re alleging more specifically is that there is a concern on our part that, as you previously stated, this could have been in part a robot, part of the algorithm on their platform, responding to John Doe’s inquiries. It could have been a live person. It likely was a combination of both. But we have to take into account that there might be part of Twitter’s algorithm behind the responses that were given. And so, as a result, if that’s the case, if Twitter’s algorithm has children reporting that they’re being sexually abused online, that there are depictions of that online, and Twitter’s response through their algorithm is just the generic “Nope, we aren’t going to take this down”, then that’s a severe defect that was harming children and allowing sexual exploitation to continue.

On the other hand, we also alleged in our complaint various other mechanisms on their platform which are equally as damaging. There is prolific dissemination of child sexual abuse material on Twitter. You can easily go into a search bar and type in known hashtags that disseminate child pornography. I don’t want to say what they are here and further promote them, but if you type those in, it’s astonishing.

We literally typed one in, and it gave us suggestions of keywords to make it even easier to find child sexual abuse material. We typed in the hashtag and it suggested adding the term “young” to it, and that’s horrifying. Children could use this. Adults use this. The user agreement doesn’t say 18 and up, and it’s all illegal. So if we’re dealing with that material, there should be a mechanism in place to protect people from being exposed to such horrific material. But unfortunately, in Twitter’s algorithm, our evidence and research and investigation to date has shown quite the opposite.

Neil Fairbrother

Yes, the Canadian Centre for Child Protection has a view on some aspects of this product functionality, in that they have concluded that Twitter makes it hard for users to report CSAM. The reporting function, it seems, is so hard to use that it verges on obstructing the reporting of CSAM. Is that correct? Is that your view?

Lisa Haba, Haba Law firm

Well, I went online and tried to report an image that I found to be offensive. And it took me, an experienced attorney who deals in this matter day in and day out because of this lawsuit, about 20 minutes to find the right button to press to report such a thing. So, no, I don’t think it’s easy. And if you’re not invested in figuring it out and invested in reporting it, you know, especially in today’s culture and age of instant gratification, I agree with that completely. It’s very hard to find, and it’s hard to figure out how to use.

Neil Fairbrother

Okay. Count 5 concerns Negligence and “…the possession and distribution of CSAM is not only against the law but it’s also against Twitter’s very own terms and conditions”. What do their terms and conditions have to say about this?

Peter Gentala, NCOSE law centre

You noted this earlier, Neil, but with that last count on the product liability side, we’ve now pivoted to State law claims in the United States. There are both Federal and State law claims, so we’re now on the State law side of things. And of course in the United States, that means we’re building off the bedrock of our common law in most States. We’re building off our shared common law legal heritage with the United Kingdom.

But the concept of Negligence in most States is that there’s a duty owed and that duty has been breached and as a result of that, someone has been caused significant harm. The duty here is expressed in several different facts that we’ve alleged in the complaint and most notably to start with it’s what Twitter says that it will do itself to protect children on its platform, its own commitment.

Twitter claims that it has a zero tolerance approach when it comes to sexual exploitation of children. And of course, as this case has shown, the exact opposite is true. They are exceedingly tolerant. They are so lackadaisical, in fact, that they failed to take action even when the child himself came to them and said, please take down an abuse video featuring myself and another child. So Twitter’s own policies and standards, which they commit to the community, commit to the world, that they will stand by, they do not stand by.

And then of course, other ways that their duty is established are federal law, state law and the fact that they have an obligation to report. If child pornography is not reported in the American legal system, the means of tracking these images as they make their way across the world can’t be put into effect. There’s a system involved that allows each image to be cataloged. It’s called hashing, and it allows specific identification, and this ultimately is the only way that abuse videos and images of children can be tracked down and removed wherever they pop up. By keeping these abuse images out of that system, Twitter utterly failed in its duty.

Neil Fairbrother

Okay. Now Count six is closely related to Count five because it’s Gross Negligence. What’s the difference between gross negligence in this instance and negligence?

Peter Gentala, NCOSE law centre

It’s very, very close. It’s basically that what has happened shocks the conscience. It’s a more severe form of negligence, and here it has everything to do with the knowledge that Twitter had. Again, this is not one of those cases where they have someone saying, “Hey, I think you should check out this user account because of child pornography”. They already had that level of knowledge, but then they had the additional layer of knowledge where the victim, the child victim himself, is saying, “Please take down my abuse videos”, which are illegal under federal law, and they’re doing nothing with it. So that’s what steps it up from the common negligence standard, where there’s a duty owed, to an egregious form of reckless disregard for that duty.

Neil Fairbrother

Okay. Count seven I guess is also related because it’s Negligence per se. What is Negligence per se?

Peter Gentala, NCOSE law centre

Sure. I’ll just close out these Negligence claims by mentioning this one. Negligence per se occurs when the duty is framed by a legal standard that’s been codified in law. There are a number of legal duties here that are established under Federal law and under State law, and this is what Twitter was obligated to do. We don’t have to argue about whether they had a duty when it’s right there on the face of the legal texts themselves, as enacted by Congress and as enacted by the California legislature, and that makes them subject to strict liability under the principles of the civil system in the United States. So because there were so many different statutes that were violated here, that’s why we have this “Negligence per se” claim before the court.

Neil Fairbrother

Okay. Now Count eight is still related to Negligence, but it does focus on the victim, and it says “Negligent infliction of emotional distress”. And you’ve mentioned this in passing, the impact that this has had on John Doe; you mentioned that some of the content his family were made aware of was him talking online about suicidal ideation. So he must, at one point, have been in a pretty bad way. And this particular point is focusing on the impact on him. Is that correct?

Lisa Haba, Haba Law firm

Yes, it is. The very essence of this type of allegation is that it caused severe emotional distress and mental anguish to our client. You know, we talked about before how he was suicidal. Peter laid out the negligence that Twitter exposed our client to, and their negligence was not only a cause of that harm, it was a substantial factor in causing John Doe to become suicidal and severely emotionally harmed. So this particular cause of action focuses on that harm, just like you said, and John Doe was severely harmed.

Neil Fairbrother

Okay. Count nine talks about the “…distribution of private, sexually explicit materials.” And the point I picked up from here is that the issue of consent has been raised. But of course, when these images were first created John Doe was 13, and when they appeared on Twitter he was 16. He was still a minor, still under 18, and therefore surely consent couldn’t have been given in the first place?

Peter Gentala, NCOSE law centre

Absolutely, that’s correct. This particular statute is the expression under California State law of concern for non-consenting situations. A variety of legal jurisdictions have laws like this, and they focus on the problem of nonconsensual distribution of intimate photographs. Sometimes they’re referred to by the phrase “revenge porn”. You’ve probably heard that. I think the better phrase is “nonconsensual pornography” or “nonconsensual sexually explicit images”.

But you’re absolutely right, a child cannot consent. Consent can’t happen in that context. That’s clear under the entire fabric of both Federal and State sex trafficking laws. And on top of that, of course, there was fraud here in the first place, just in the conversation that was happening between John Doe and the traffickers. So there’s no conceivable way that consent could come into play at all.

And of course, as we’ve been saying, Twitter knew about the problems with consent, knew he was a minor. He also expressed to them that what had happened to him was coercive and that the traffickers had extracted these images from him. So Twitter had everything it needed to know that there was no consent here.

Neil Fairbrother

The penultimate point you’re raising is “…Intrusion into private affairs”. And here you say that Twitter intentionally intruded into John Doe’s private life through negligence, through not taking this content down. Is that the same thing? Is inaction the same thing as an intentional intrusion?

Lisa Haba, Haba Law firm

Well, we’ve alleged the negligence aspect of it, but we’re also alleging the intentional act, and no, they are not the same thing. Twitter’s negligence became intentional when they were notified that there was a problem. They were notified of his minor status. They were notified of the coercive acts behind the lack of consent that Peter’s described. And so in ignoring those signs and failing to act at that point, that was a deliberate and intentional choice, and John Doe was harmed severely because of it.

Neil Fairbrother

Okay. And the final Count, Count 11, is the “Invasion of privacy” under the Californian constitution, article one, section one, which reads to me, very much as a layman and a non-American layman at that, as very similar to parts of your famous Constitution, the American Constitution. It says that all people are by nature free and independent, and have inalienable rights, among these are “…enjoying and defending life and Liberty, acquiring, possessing and protecting property and pursuing and obtaining safety, happiness, and privacy”. And presumably this count is focusing on those last three points of safety, happiness, and privacy?

Peter Gentala, NCOSE law centre

It certainly is. And this gets to a very fine point of American law, which is that State constitutions can be more protective of individual rights and individual liberties than the United States Constitution. Our Bill of Rights in the United States Constitution gets a lot of focus and attention and perhaps that’s because so many constitutional issues are resolved under it on a nationwide basis, but States themselves are free to create constitutional rights for individuals that are even more expansive, more protective of individual Liberty. And that’s what California has done here with this privacy provision, and a lot of that turns on what you might call a “reasonable expectation of privacy”.

Would a person reasonably expect that there would be an expectation of privacy for something like intimate photographs of children? And of course the answer to that is yes, that would be a reasonable expectation.

And the other point about this constitutional provision that’s somewhat unique is that it binds not only the government. The United States Constitution almost exclusively binds the government and government actors to its provisions, but this is a State constitutional provision that the courts have held to be not only the duty, the authority and the guide for government actors, but a guide for private action as well.

Neil Fairbrother

Okay. Now we are massively out of time I’m afraid so we have to finish very quickly, but I would like to sneak in one or two additional questions if I may. Where in the judicial process are you with this case? Presumably you’ve filed your complaint, is there a court case due, is there a date?

Lisa Haba, Haba Law firm

Well, the current status is that we have filed a complaint and served Twitter. They have filed a motion asking to dismiss our case for a variety of reasons, many of which we have talked about today, and we are now in the process of responding to that. And I guess we’ll fight it out in court. Our next hearing is going to be in June to have that battle.

Neil Fairbrother

Okay. Now we are getting into an area of speculation, which you may not want to engage in, I appreciate that. If the decision goes in John Doe’s favour, which I assume you hope it does, and against Twitter, would this open the door to more such cases or possibly even a class action against Twitter and possibly other social media service providers? And what impact might that have on the industry as a whole, do you think?

Peter Gentala, NCOSE law centre

Well, I know that John Doe hopes that the result of his lawsuit will be justice in his particular situation, but also lasting change. The goal here is to bring accountability where there is no accountability right now, to change the status quo, and to achieve a posture where Twitter is actually doing what it says it’s doing.

It claims that it has a zero tolerance policy for child sex abuse material, and for the exploitation of children and yet our investigation has demonstrated to our satisfaction that the platform has a massive problem with child sex abuse material and exploitation of children. So that needs to change.

It’s one of the largest communication providers in the world, and it needs to start doing the right thing and become an example of a good corporate citizen, instead of one that fails to uphold its commitments when it comes to child safety.

Neil Fairbrother

Okay. I think Peter and Lisa, we’re going to have to leave it on that note. Thank you so much for your time. It’s been a fascinating walk through the legal side of your proceedings, and I wish you well with the case. And I look forward to seeing the results in due course.

Lisa Haba, Haba Law firm

Thank you.

Peter Gentala, NCOSE law centre

Thank you, Neil. Thank you for your work.
