Safeguarding podcast – Agents of the State: A discussion with Rick Lane

By Neil Fairbrother

In this safeguarding podcast we discuss with Rick Lane, formerly “the point person for cleaning up MySpace”, the legal issues around Section 230 and why social media platforms are fighting any change to the status quo which provides them with immunity from liability for what’s on their platforms, and we reach a startling and frightening conclusion as far as child online safety is concerned.

There’s a lightly edited transcript below for those that can’t use podcasts, or for those that simply prefer to read:

http://traffic.libsyn.com/safetonetfoundation/SafeToNet_Foundation_podcast_-_Agents_of_the_State_with_Rick_Lane.mp3

Neil Fairbrother

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast where we talk about all things to do with safeguarding children in the online digital context. Child safeguarding in the online digital context is at the intersection of technology, law, ethics and culture, and it encompasses all stakeholders between a child using a smartphone and the content or person online that they are interacting with.

The early 2000s were a heady and giddy time for social networks and many came and went. Ecademy, for example, was a leading-edge British business networking site, which even today has a lasting legacy, but one of the largest global consumer sites was MySpace, which in its heyday was adding some 70,000 users every day and led to the Arctic Monkeys and Lily Allen becoming global stars. Today’s guest is someone who recently described himself as being “the point person for cleaning up MySpace” in the early days of the News Corp acquisition: Rick Lane, welcome to the podcast.

Rick Lane

Well, thank you for having me.

Neil Fairbrother

Thank you Rick for making so much time. Can you give us a brief resume please of yourself, Rick, so that our audience has an understanding of your background and please bring us up to date with what you’re doing today.

Rick Lane, Iggy Ventures

Happy to. My friends will laugh when they hear you ask me to be brief! So I apologize if it goes on too long. After I graduated I started working up on Capitol Hill as an Appropriations Staffer, which is on Congress’s funding committee. From 1988 through ’93 I followed this thing called the “information superhighway”, because we were funding it through the National Science Foundation. And I became very intrigued about this new network that was being created.

When I left the Hill, I created a non-profit called the Modern Educational Technology Centre in 1993 to bring technology into the classroom, and not just computers, but we were using Mosaic and ISDN lines to connect teachers, and training them in how to really take advantage of this new medium that was being created and developed. From there, I went to a law firm, Weil Gotshal and Manges (LLP), where I’m not a lawyer, but I was their Director of Congressional Affairs, focusing on tech policy back then. Two of the major bills that I worked on were the 1996 Telecommunications Act, which includes Section 230, and in 1998 the Digital Millennium Copyright Act. So both deal extensively with platform, device and technology liabilities.

Neil Fairbrother

Okay. And what are you doing today, Rick, where have you found yourself after all of that?

Rick Lane, Iggy Ventures

Sure. Today I have kind of two things that I do. One is I volunteer my time to advise around 150 anti-human trafficking and survivor groups on technology-related policies, because of the broad impact that has on protecting kids and going after entities that are knowingly facilitating sex trafficking and human trafficking.

And then I have a “for profit” side where I’m a strategic advisor to several technology companies and focusing on companies that are going to make a positive societal difference.

Neil Fairbrother

Okay. Thank you. Now in a recent LinkedIn seminar run by Parry Aftab, the children’s rights lawyer and activist, you described MySpace as being “a smorgasbord for sexual predators”. What was your involvement at MySpace and News Corp and what did you discover about online safety issues at MySpace?

Rick Lane, Iggy Ventures

Sure. Let me clarify that. I did not say that. There was a State Attorney General who, when we met with him, said that MySpace was a smorgasbord for sexual predators, but I was always proud to say that by the time we were done, he said we were the gold standard of protecting children. So I was always very appreciative of that State AG working with us. But when we bought MySpace I was the point person for all tech policy, digital transmission and distribution. And so [when] we bought MySpace it fell in my lap, you know, to deal with all the public policy issues, and knowing the privacy and cybersecurity and the child protection issues from COPPA on down, I knew that we were going to have our hands full.

Neil Fairbrother

Okay. And what did you discover? What specific issues? The State AG that described MySpace as being a smorgasbord, for example, why did he say that? What was going on at MySpace?

Rick Lane, Iggy Ventures

Yeah, I think what was happening is there was this perception of MySpace, because it was an open-standard type of social networking site and the first one, of grooming of children online, and there were a lot of stories coming out, shows [such as] “To Catch a Predator”, and all these different shows were talking about how people were meeting young people on MySpace, and there was all this negative press going on.

But the reality was that even on “To Catch a Predator”, none of those people ever came from MySpace. They actually came from Yahoo. So that was interesting. But MySpace was the way to get press and the way to get headlines back then because it was growing.

We did find that there were some things that we could do to make it better. And I brought in a guy named Hemanshu Nigam, who was recommended to me by Ernie Allen, who was the CEO of the National Centre for Missing and Exploited Children. When I first met Ernie on this, I said, Ernie, tell me who the best is I can hire to do this, and one of the folks he mentioned was Hemanshu Nigam. And so we got him to begin working on the technical side, making MySpace safe.

Neil Fairbrother

Okay. And what lessons did you learn from implementing safety policies at MySpace?

Rick Lane, Iggy Ventures

Sure. One of the major lessons we learned was, there are a lot of State AGs; we had 50 State Attorneys General in the United States on the collaborative effort looking into MySpace. And one of the solutions that they wanted to put forth was age verification. And so we looked at it and we found that we could not do age verification, because it’s impossible to verify somebody who is really under the age of 23 or 21; they have no records, and we didn’t want MySpace having health records or school records or criminal records of children, which are the only public records available to try to identify them. So we tried to figure out other ways to protect children online and we came up with some really, I think, innovative new ideas and got legislation introduced and passed in several States and at the Federal level to help address how to protect kids online.

Neil Fairbrother

And how did that age verification work then?

Rick Lane, Iggy Ventures

It wasn’t age verification. What we did was we came at it from a different perspective, because trying to do age verification, as I said, is impossible for younger people because there are just so many workarounds. And so what we did is we drafted legislation, this is here in the United States, that basically required that anybody who was a convicted sex offender not only had to register their physical address, but they also had to register their online handles and emails and other types of usernames, so that we could block those names and emails from signing onto MySpace.

And so folks would sit there and say, well then they can just get fake email addresses. And the answer is yes, but we were also using tagging in photographs. We had access to the database of all their pictures and we’d also run that against MySpace. So if we found people doing that, it was a very proactive way to try to find these individuals. If we found somebody on our site where they were using a fake handle, then they were in violation of their probation or parole and they’d go back to jail. So we looked at that as being a very proactive step, versus being passive and just waiting for somebody to kind of come to us. We wanted to help them.
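To make the mechanism Rick describes a little more concrete, here is a minimal, purely illustrative Python sketch of screening a new sign-up against a registry of offender handles and email addresses. The data and names are hypothetical; this is not MySpace’s actual system, just the general idea under those assumptions.

```python
# Illustrative only: hypothetical registry data, not MySpace's actual implementation.
# Screens a new sign-up against handles/emails that convicted offenders were
# required to register under the state laws described above.

REGISTERED_OFFENDER_IDENTIFIERS = {
    "knownhandle99",          # hypothetical registered handle
    "offender@example.com",   # hypothetical registered email
}

def screen_signup(handle: str, email: str) -> bool:
    """Return True if the sign-up matches the registry and should be blocked."""
    candidates = {handle.strip().lower(), email.strip().lower()}
    return bool(candidates & REGISTERED_OFFENDER_IDENTIFIERS)

# Example usage
if screen_signup("NewUser123", "someone@example.com"):
    print("Blocked: identifier matches the offender registry")
else:
    print("Sign-up allowed")
```

In practice, as Rick notes, such a check would be combined with photo matching against the registry’s image database to catch accounts created under fake handles.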

Neil Fairbrother

Okay. And those local laws, those state laws, sorry, those federal laws I think you referred to, are they still in place or are they being ignored by this current generation of social media sites?

Rick Lane, Iggy Ventures

So my understanding of that, and I’d want to bring in Hemanshu Nigam on this because he’s a lawyer slash safety expert, but my understanding is some of the laws got shot down because they were litigated by some of the civil rights groups that were out there, the ACLU [American Civil Liberties Union] and others, as being unconstitutional either at the state level or the federal level. So I’m not sure exactly where all of the laws are, how they stand and how they’re being enforced and implemented. But at the time we ended up in 23 States, and we did have a federal law as well on the books.

Neil Fairbrother

Talking about laws and legislation. Back then the raging legal argument seems to have been whether online service providers were regarded as publishers or booksellers. What is the difference between the two? Why were those comparisons being made?

Rick Lane, Iggy Ventures

Sure. That was back in 1996, even before MySpace; I was part of the Communications Decency Act debate about which direction the law should go. And the difference is pretty simple. Think about it more from a libel standpoint. Who’s responsible if somebody says something horrible about you that ends up being libellous? And again, that would be illegal speech, right? That’s not legal speech.

And there were two court cases at the time, CompuServe and Prodigy. Remember, back then these were bulletin board services; you know, AOL was a big player, but mostly CompuServe and Prodigy were bulletin board services. And one case decided, because the platform was moderating its site, that it was more of a publisher: because they were moderating, they had knowledge, and if you have knowledge of an illegal activity, you can be held responsible for it, legally liable for it.

So then there’s another case that said no, these bulletin board services are really more like newspaper stands. And so if I’m selling a newspaper, it’s the Times of London [for example], and there’s something libellous inside the Times of London, you can’t sue the newspaper stand owner. You could sue the newspaper but not the newspaper stand.

So the question was, which of these types of common law experiences should go forward? And that’s when the legislation, Section 230 of the 1996 Telecom Act, was introduced by Ron Wyden and Chris Cox to say we’re going to create what the liability standards should be in this new online environment.

Neil Fairbrother

Okay. So what then is Section 230, what is its purpose?

Rick Lane, Iggy Ventures

The purpose of Section 230 was to clarify these two conflicting standards. And the goal at the time was, as part of a broader child protection effort, Senator Exon had provisions dealing with school filters and library filters, a very proactive way of trying to protect children online. And then Congressman Wyden at the time, now Senator Wyden, and Congressman Cox had this other provision that dealt with the platform liability issues.

The platform liability provisions really do several things. One, they exempt platforms from any type of state criminal and civil law that is quote “inconsistent” with the Act, which has been interpreted very broadly. And then they also provided platforms with immunity if they took down content, which is actually what we think of as the “Safe Harbour” provision.

So if they go on their site and look and find something that is illegal and take it down, or something that they find harmful to children and take it down, they cannot be sued. And so those were really the big provisions of the Bill. So for a platform in the United States, they’re really only able to be held criminally responsible at the federal level, not at the 50 state or local level.

Neil Fairbrother

So you might have had a situation then, before this Act, where they came across some illegal material, it might be some child sexual abuse material, which was already illegal. They would take it down, and then they could themselves be open to some sort of legal action because they’d found and removed that illegal material. Is that what you’re saying?

Rick Lane, Iggy Ventures

Yes. Yeah. Before the law, because they had knowledge of, say, let’s say child sexual exploitation photos online, if they found those, and say there’s no place to put them, right? They know they’re there and they take them down. Now they have knowledge it was on their site. They could have been held criminally liable. And so that was the concern. So you wanted them to be proactive.

The other thing that happened as well, with this whole working with the National Centre, was that if they found child pornography on their site then they were required by law to send it to the National Centre for Missing and Exploited Children, and they can’t be held criminally responsible for sending it. Actually at one point, and this is how laws can be a little quirky, you were required by law to send the photos, but if you sent them, you were violating federal law by transmitting child pornography. So we had to fix the law to say no, if you’re sending it to NCMEC [the National Centre for Missing and Exploited Children], it’s okay.

Neil Fairbrother

Okay. So Section 230 gives them immunity from prosecution for handling this stuff when they’re doing the right thing, which is to take it down and report it to NCMEC, but it also gives them immunity for hosting it. In other words, if they are hosting it, they are not liable because they are not the publisher. They are regarded as the bookshop.

Rick Lane, Iggy Ventures

Yes. And that’s exactly the segue to the EARN IT Act. The problem is that there is no legal incentive for platforms to proactively take action against illegal activities that are happening on their sites by third parties; there’s no punishment.

Neil Fairbrother

Ah, yes. Okay. But when Section 230 was first created, there was a proviso, was there not? Was there not something like “You can have this immunity but you must do certain things to gain that immunity” built into Section 230 at the time?

Rick Lane, Iggy Ventures

It was not. It was not. And there was a reason for that.

Neil Fairbrother

And that was?

Rick Lane, Iggy Ventures

The argument for the platforms back then, and you have to remember what the internet was like in 1996 and ’95 when these things were being drafted, was that the platforms were mostly for-profit, subscription-based platforms. And so the argument was, if I’m AOL, we are going to proactively do all these things because we want people on our site, you know? We want to be the clean site, we want to be the good site. And so, you know, market forces will resolve it if we don’t do the right thing.

But what dramatically changed over the years is when the internet went from a subscription base to an advertising base, especially when it went to targeted advertising. It no longer made economic sense for the companies to take anybody off their site or to take anything down because they were selling bulk advertising no matter who it was.

Neil Fairbrother

Okay. So this is the balance of punishment, so to speak, versus incentive. There’s no incentive to comply with the law and, because there’s no punishment for not complying with the law, the commercial imperative takes precedence.

Rick Lane, Iggy Ventures

That’s the problem. That’s the fundamental problem that those of us who want to modify Section 230 have been pointing out: there is no legal incentive for platforms to do anything. And if they do do something, it’s a complete cost centre. So if you’re looking at rolling out a new function or feature, the cost-benefit analysis for any other product is usually, well, can we be sued, right? Is this going to be harmful to the consumer? And if it’s going to be a harmful product or function that you add to your device, for example, you could be sued for that, because you’d say, oh, we knew it was harmful and therefore we should not have put it in. In the online world and the platform space, there is no legal reason not to roll out a feature, because if you roll it out and it’s used for bad purposes by a third party, you have no responsibility but you have all the upside of the revenue that that new function or feature will generate.

Neil Fairbrother

Now, people often talk about a Tesla analogy where they say, well, Tesla was a start-up car manufacturer. It’s way beyond being a start-up now, but even just to sell their first car it had to comply with all of the car safety regulations, otherwise they wouldn’t have been able to manufacture it, and had they tried to do so they would have been sued.

What you seem to be saying is that that analogy isn’t quite correct, because a social media platform has no safety laws to comply with. They can just get on and do it, or in the famous or possibly infamous words of Mark Zuckerberg, they’re moving fast and breaking things. But in this case there is no law to break. They’re just getting on and reaping the financial benefits.

Rick Lane, Iggy Ventures

Well, it’s even worse than that, because they were able to move fast and break things because if they broke things, there was no legal problem for them. Right? No one could sue them; civil litigation was thrown out the window with Section 230. And so there was no harm, as long as they weren’t engaging in direct criminal behaviour. And I mean direct, which means that they themselves were engaged in illegal behaviour. There was no legal incentive to act if somebody else was using their services to engage in illegal things.

Neil Fairbrother

Okay. In the legal world it often takes a legal case to set a new direction. And one of the big legal cases concerned Backpage, which was a social networking site, well, it was an online site that had the equivalent of back-page adverts, so a lot of personal adverts were posted there. It was launched in 2004, and by 2011 critics and law enforcement agencies were accusing Backpage of being a hub for sex trafficking of both adults and minors, and this all came to a head in a legal case. Can you tell us about that?

Rick Lane, Iggy Ventures

Sure. I highly recommend that your listeners watch a movie called I Am Jane Doe, which was directed and produced by Mary Mazzio. It’s on Netflix. And it goes through the history of the Backpage.com litigation from the survivor perspective as well as from the Attorney General perspective, and which led to the legislation called FOSTA-SESTA.

What was happening in the Backpage.com situation was that survivors were trying to sue Backpage for the harm that it caused, after being frustrated that their daughters, or they themselves, were being trafficked and they had no recourse to deal with it. And so then they went to the state AGs, the state Attorneys General here in the United States, and went after Backpage.com, but because Section 230 says platforms are only responsible under federal criminal law, the state Attorneys General’s hands were tied, as were local law enforcement’s. So this is what created the frustration with Section 230 at the state and local level that led to the FOSTA-SESTA legislation.

Neil Fairbrother

Okay. Now FOSTA is F-O-S-T-A and SESTA is S-E-S-T-A, and they seem to be a pair of laws that are almost like Siamese twins, joined at the hip. What do they enable?

Rick Lane, Iggy Ventures

Oh, that’s probably a whole other podcast, the FOSTA-SESTA laws; they are two separate bills and there’s a whole reason why the two came together. But in general, in the broader sense, what they do is they provide local and state law enforcement and survivors with the legal tools to go after websites that are knowingly facilitating sex trafficking.

Neil Fairbrother

So by sex trafficking does that include all types of sexual abuse, particularly of children?

Rick Lane, Iggy Ventures

No, it doesn’t include the pictures and video. It only includes the trafficking of humans. So it’s the selling. It’s the selling of young children, you know, online.

Neil Fairbrother

Now recently yet another law has been proposed in the US with the rather catchy title of the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act” or, rather more catchily, the EARN IT Act. What is the EARN IT Act?

Rick Lane, Iggy Ventures

The EARN IT Act goes after the issue that you were just talking about, which is the sexual exploitation of children online through video and photographs. As I mentioned, FOSTA-SESTA dealt with the sex trafficking side, the selling of individuals, but there’s nothing dealing with the pictures. And so what Senators Blumenthal and Graham have introduced is legislation to provide that legal incentive to make sure that platforms are proactively looking for and trying to find child pornography on their sites, so that it can be taken down and sent over to NCMEC.

Neil Fairbrother

Okay. So does this mean that this kind of content is not already illegal?

Rick Lane, Iggy Ventures

It’s illegal, but again, there’s no legal incentive for the platforms to proactively monitor their sites. So what this legislation does is create best practices, through a commission and then approved by Congress. And if the sites are certified as adhering to the best practices then they continue to get their Section 230 immunity; if they don’t adhere to the best practices, then they lose it.

Neil Fairbrother

Okay. And there’s a nice play on the title of the act, the EARN IT Act: they have to earn their immunity, which at the moment is just automatic.

Now there are people objecting to the EARN IT Act. I mean, it sounds pretty straightforward and common sense to me, but there are people objecting to it. And they set out their objections by citing the Fourth Amendment and raising the spectre of “agents of the state”. What is the Fourth Amendment in this respect and what do they mean by “agents of the state”?

Rick Lane, Iggy Ventures

Sure. Just remember, I’m not a lawyer, but you know, I play one on TV… no, just kidding! So the big arguments, there are several. They always use the old argument: “it’s going to break the internet, hinder innovation, stifle free speech and harm small businesses”. That’s their common thread on all these types of platform liability issues. So we’ll put those aside.

The two new arguments that they’re using on the EARN IT Act are the Fourth Amendment argument, and that it’s going to require them to build a backdoor for inspection. So those are the two. And the Fourth Amendment issue that they’re arguing is that under the US Constitution, the Federal government cannot ask a private sector entity to do something that would be a violation of the search and seizure protections of the Fourth Amendment, because in the US you have to have a warrant and/or a subpoena to search someone’s home computer and so forth.

And so what they’re arguing, what they’re saying, is that if you follow the best practices and you’re providing information to law enforcement, then you’re doing something that law enforcement and the Federal government could not do themselves without a warrant, and therefore it’s a violation of the Fourth Amendment.

Neil Fairbrother

And in what way is it not a Fourth Amendment violation?

Rick Lane, Iggy Ventures

Sure. First of all, these websites are already required by Federal law, if they find child pornography on their site, to give it to NCMEC. And NCMEC already uses that, working with law enforcement, to go after these people.

Neil Fairbrother

And NCMEC is the National Centre for Missing and Exploited Children?

Rick Lane, Iggy Ventures

Correct. So there’s already Federal laws that require any website or platform, if they discover what they even believe is child pornography, they are supposed to report that and send that to the National Centre for Missing and Exploited Children.

So the question is, is that already a violation of the Fourth Amendment? I hope they’re not saying that. I hope they’re not going to be arguing on behalf of [that]… because there are some people who have been convicted of child pornography offences trying to make a Fourth Amendment argument that the evidence was obtained by NCMEC, and NCMEC was given that information by a private sector entity, and therefore it’s a violation of the Fourth Amendment. I hope that’s not what they’re arguing, because that would completely undermine the ability to really go after these people. But that’s kind of what they’re arguing, and that’s a problem.

Neil Fairbrother

It is a problem because, if that argument is successful, it would seem to render useless any legal protection against children being sexually abused online.

Rick Lane, Iggy Ventures

Yeah, it would make it [so that] the only way law enforcement could ever find child pornography is if [law enforcement] themselves knew that there was a child pornographer on the website, on the platform, got a subpoena, and then tried to search that person’s, you know, history, versus, you know, having it on Facebook where you see it, people are uploading it, and sending that to the National Centre, which then does the investigation. It would make the internet even worse than it is today in this area.

Neil Fairbrother

That’s quite a mind-blowing image to portray, I have to say. I wasn’t expecting that.

Rick Lane, Iggy Ventures

Yeah, their Fourth Amendment argument could undermine that… The interesting thing though, and this is what I said to my friends on the Senate Judiciary Committee, is that the National Centre for Missing and Exploited Children is pushing this legislation; if they felt that there was any chance that this would allow people who are engaged in the sexual exploitation of children to not have to go to jail, [then they would not support it]. And the only ones arguing that it is a Fourth Amendment problem are the Internet Association, NetChoice and all these different entities that represent Google and Facebook.

Neil Fairbrother

I understand the principle of their argument. But surely children are a special case; children are not adults, and there are already plenty of laws in place that put limitations on what children are able to do as they progress through their journey from childhood to adulthood. They are proscribed, for example, from buying certain types of products from a store. They can’t buy alcohol, they can’t buy fireworks, they can’t buy firearms. In the UK a child can’t even buy a pet until they are over 13, I think it is. So there is already a progression of laws in place to protect children. Surely everyone should defend the right of children to be free from sexual abuse, well, all abuse, but particularly online?

Rick Lane, Iggy Ventures

I agree. But the tech platforms, the Googles and the Facebooks and the Reddits of the world, are fighting this with literally millions of dollars to try to stop this legislation. What they’re fearful of is the whole issue that somehow, as a platform, they now may have some liability, and they’re throwing everything they can at making the argument and trying to scare people by saying, “Oh, this is going to be a First Amendment problem” or “This is going to hinder innovation. This is going to be a trial lawyer’s dream bill, because now you’re going to have all the trial lawyers involved”.

And then their other big one is the whole encryption issue, especially because of concerns on privacy and cybersecurity. So they’re throwing everything but the kitchen sink to try to stop this legislation.

Neil Fairbrother

And the reason that they’re pushing so hard for encryption, is that because they then would be blind? They would not be able to know what is on their site, and therefore they wouldn’t have to implement systems at great expense to take action against this kind of content?

Rick Lane, Iggy Ventures

Well, think about what they’re advocating for. If you add together the Section 230 liability protections, you add encryption, you put that in with digital currencies, and then this other issue that I would love to talk to you about, which has huge child safety implications, which is this whole “Whois” database and ICANN, you put all that together, what they’re advocating for is a DarkNet.

Neil Fairbrother

Okay. Just very quickly because we are running out of time. You mentioned two new things there. You mentioned ICANN and “Whois”, can you briefly just outline what they are?

Rick Lane, Iggy Ventures

Sure. So ICANN is the Internet Corporation for Assigned Names and Numbers, a non-profit that basically controls all of what are known as “generic Top Level Domains”: .com, .net, .org, go down the list. They are in control of them. As part of when ICANN was created, there was a database called “Whois”, and the purpose of the Whois database is that it’s used by law enforcement and child protection agencies to know who owns a website, who is behind the website. That way, if there is a problem with those websites, you know who to contact and where to find them, and law enforcement, cybersecurity folks and child protection advocates use Whois all the time. Because of the GDPR, it has gone dark.
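For readers unfamiliar with Whois, the lookup itself is a very simple protocol: you send a domain name to a registry’s Whois server over TCP port 43 and read back a plain-text record. Here is a minimal sketch in Python, using the .com registry’s public Whois server as an example; note that, as Rick says, many registrant fields now come back redacted because of the GDPR.

```python
# Minimal Whois lookup over TCP port 43 (the standard Whois protocol, RFC 3912).
# The server shown handles .com domains; other TLDs use other servers, and real
# tools also follow referrals to the registrar's own Whois server.

import socket

def whois_query(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """Send a Whois query for `domain` to `server` and return the raw text response."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Example usage: print the registry record for a well-known domain.
print(whois_query("example.com"))
```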

Neil Fairbrother

So what you’re saying then is that the social media companies of the world, the Googles of the world, are actively pursuing a policy to put the lights out, to make the whole internet dark, but at the expense of child online safety. That is the price that they are willing to pay.

Rick Lane, Iggy Ventures

Yes, exactly, you’ve nailed it. And it’s also Verisign, the entity that controls .com and .net, who has done nothing to help in terms of child protection, and they’re pushing the dark Whois, and so it’s a mess. People aren’t paying attention. [They] have to realize that under the guise of allowing folks to be anonymous online, and we don’t have a law here in the United States [that allows you to] be anonymous, except in certain aspects, they’re going to make the internet go dark and the only ones who are going to have information will be the platforms. And if the platforms are blind because everything’s encrypted and they can’t see anything, who’s going to protect the children?

Neil Fairbrother

Rick, we are out of time, but what a note to end on. Thank you so much for your time. That’s been absolutely fascinating and I hope you remain safe during this rather bizarre period of lockdown. I look forward to keeping in touch and I will follow this with a great deal of interest.

Rick Lane, Iggy Ventures

Happy to have the chat and look forward to future conversations.