Safeguarding Podcast – Licensed to Operate with Jonny Shipp, Internet Commission

By Neil Fairbrother

In this safeguarding podcast we talk to Jonny Shipp of the Internet Commission about how their proposed frameworks of Digital Responsibility could be used to hold social media companies to account, and how the use of AI can obfuscate and obscure much-needed transparency. We also discuss the Online Harms white paper, the regulation of the internet and social media’s “license to operate”.

http://traffic.libsyn.com/safetonetfoundation/Licensed_to_Operate_with_the_Internet_Commission.mp3

Below is a transcript, lightly edited for ease of reading, for those who prefer to read or who can’t make use of podcasts.

Neil Fairbrother

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast, where we discuss all things to do with safeguarding children in the digital context. This year the UK government published an Online Harms white paper, which promises a new culture of online transparency, trust and accountability, particularly as far as children are concerned, in its drive to make the UK the safest place in the world for people to be online.

Given that the Internet, the worldwide web and social media are largely privately owned, often by foreign companies beholden to the laws of different states, some might question the UK’s laudable ambition. The Internet Commission thinks there’s a solution and has devised an independent evaluation framework to help make the Internet, and everything on it, a safer place for us all, but especially of course for children. Today’s guest is the Internet Commission’s CEO, Jonny Shipp. Welcome to the podcast Jonny.

Jonny Shipp, Internet Commission

Thank you. Nice to be with you.

Neil Fairbrother

So Jonny, can you give us a bit of background about yourself? Who are you? What’s your background?

Jonny Shipp, Internet Commission

I’m a social entrepreneur and an advisor on technology strategy and public affairs. I spent many years working with Telefonica, with senior roles in the Internet and communications industry spanning corporate affairs, regulation, marketing and technology programmes, both in the UK and Europe and in the Americas. For six years I was vice-chair of the Internet Watch Foundation. And for some time now I’ve been a visiting Fellow at the London School of Economics in the Media and Communications Department, working with Sonia Livingstone.

My interest there is in what drives digitalisation, how it is reshaping everyday life, and in digital policy in the UK, in Europe and globally. My key focus at the moment is the development of the Internet Commission.

Neil Fairbrother

That’s a nice segue into my next question Jonny. What is the Internet Commission?

Jonny Shipp, Internet Commission

So we think accountability, rather than just transparency, should be the main focus. Whilst transparency is important, it’s not a solution on its own. If we want to restore trust, we need to find a way of building accountability, and we think that setting your own criteria for transparency might lead to less, not more, accountability. So what we’re working on is a framework of independent evaluation of how platforms are responding to the various issues, with a particular focus on content moderation.

We’ve seen over a number of years a lack of true understanding of how Internet platforms are dealing with content moderation at scale, and of what their processes are for tackling online harms. And there’s a reasonable amount of difficulty in surfacing the evidence required to tackle those issues. So we’ve been working towards a set of questions that we think will help us to broker that much-needed evidence about how firms are tackling, for example, illegal content, hate speech, cyberbullying, self-harm and fake news, the whole range of issues. We want to do this in a collaborative way with business, but in a way that helps to shine a light on the processes at work in these privately-owned platforms that the public use.

Neil Fairbrother

Now on your website, you refer to something that you call “Dialogue on Digital Responsibility”. What do you mean by that?

Jonny Shipp, Internet Commission

I mean, it’s really critical that there is an informed stakeholder dialogue in order to address these online harms issues. And so a big part of what we’ve set out to do is to inform that dialogue, to bring together a range of stakeholders who have an interest in advancing digital responsibility, and to bring to that dialogue better evidence of how issues are being tackled. So our Dialogue on Digital Responsibility is about convening and informing a new dialogue on digital responsibility, and working towards better-informed discussion and action around how companies and wider society are addressing digital responsibility issues.

Neil Fairbrother

Now in the context of that, you refer to the UN’s Global Goals, of which there are 17. It’s a laudable framework for the UN to have put together, but it’s not immediately obvious which of those goals relate to online harms. So how do you make that link?

Jonny Shipp, Internet Commission

This is about a better digital world, right? So the link with the Sustainable Development Goals is that we started a conversation about “what is the vision for better?” If you’re talking about a better internet, what is a better internet, and what is that better world that we’re trying to get to? We need some kind of reference point for that. And there is one place to look, which is the UN-agreed vision of a better world as expressed in the Sustainable Development Goals.

So in that part of the dialogue, what we were looking at is how you could take this view of a better world and then understand how digitalisation is going to get us there. We were working on the principles that would underpin that path to a better digital world, having in mind where we were going, informed by that UN vision. And that led us to develop some underpinning thoughts and the framing of our questions, the 45 questions that we’ve since developed to prompt an understanding of how content moderation at scale is happening.

Neil Fairbrother

Yes, those 45 questions that you put together in the new accountability framework cover six different sections: Reporting, Moderation, Notice, the Appeals process, Resources and Governance. Now these are things that you want the privately-owned companies, and probably in particular social media companies, to take account of. On the one hand it seems quite thorough: 45 questions, six sections, that sounds like a substantial body of research or governance. But on the other hand, is it really enough, given the complexities of the topics that we’re covering?

Jonny Shipp, Internet Commission

I’m sure one question will lead to another, but we need a framework to start to understand, and our methodology is to build case studies. So what we’re working towards is a number of case studies which will be informed by these questions.

It’s important to say that no two companies acting in this environment are exactly the same, so we will find some ways to compare and contrast, but there won’t always be an exact comparison between the ways people are working. There will be some, but they may mean different things in different contexts, as there’s quite a wide range of different types and sizes of companies, a huge range in fact.

And so we see the questions as a set of best-practice indicators. Perhaps some things will be more appropriate to larger companies, some smaller companies may not do them, and different types of companies may take on board different types of activity. I mean, a video-sharing company, a dating company, a games company: they’re quite different, although they may all need to do content moderation at scale.

So it’s a framework which represents a starting point for understanding best practice and for building case studies, which we will then compare and contrast between the different companies.

Neil Fairbrother

What do you mean by a case study? What might a case study include?

Jonny Shipp, Internet Commission

It’s going to be built around the answers to these 45 questions, in the sections that you outlined, and it’s focused on painting a picture, or creating an agreed picture, of how content moderation happens: how it is decided whether or not a particular item of content should be there; what the processes are for removing it and for interacting with the content owners; what the safeguards are around privacy and freedom of expression; the extent to which these processes are manual or automated; the ways in which the human moderators are organised, the resources that are put into them, the training they have, the cultural context they operate in; the way the automated processes work; the governance around it; and the possibilities for appeal. So it’s a whole picture of the process; it’s a focus on process.

And so we think that establishing these pictures, these case studies, will allow us to compare and contrast how different companies are operating, highlight best practice, and highlight areas where different companies differ.

And as I said, large companies adopt some practices and others may do things in different ways, and there are opportunities for innovating in how content moderation is done. So we’ll produce a comparative analysis, taking account of all these different contexts, but built on a base set of questions which we think is grounded in best practice. And of course, as we go through that process, we’ll be updating the questions and the view of what best practice is, so there will be a cycle of improvement.

Neil Fairbrother

One of the outputs that you’ve referred to is “confidential disclosure”, which you say can play an important part in ensuring internet companies are taking sufficient steps to safeguard people, children in this context. But how does that work?

Because if the disclosures are confidential, doesn’t that remove the public pressure for change? Isn’t that the opposite of transparency?

Jonny Shipp, Internet Commission

It goes back to what I was saying at the beginning, which was that there’s a difference between transparency and accountability, and if you are transparent then you can choose what to be transparent about. So someone else needs to set the questions, and that will be us. But first of all, it’s difficult to compare like with like, and it’s also difficult for an independent organisation without the backing of law, without legal powers, to insist on information.

So as an independent organisation, what we have decided is that a good way to make a contribution in this space is to work confidentially with companies, to build these case studies with their collaboration, but only publish information which is agreed. We won’t be producing an analysis which is in the interest of the companies; we’re not owned by the companies; we will be producing an independent expert analysis. But it makes sense that companies won’t disclose everything to the world, and they’re never going to.

And people, as I mentioned, have tried for many years to get information from the big platforms about how things operate. It hasn’t happened so far, and I don’t believe it’s likely to happen in any full way, even with the force of legislation. So what will probably happen is that there’ll be a law that says certain numbers or certain indicators must be disclosed. But the difficulty is that it’s such a complex environment that knowing exactly which questions to ask is hard, and as it keeps changing, it will be quite difficult to keep pushing with the force of law.

So that will be part of it. But we think there is a complementary activity that can also work much better across borders, because national law obviously has its limits: a way of working that is collaborative with companies in order to help them demonstrate accountability. It’s not self-regulation, because it’s not something that is owned by the companies; it’s clearly independent and not following the companies’ agenda. But it nonetheless works behind closed doors to start with, and produces an analysis behind closed doors, which can then be published as useful evidence, with the details agreed with the companies involved.

Neil Fairbrother

Yes. And getting their agreement is fundamental to achieving this. Now, one of the phrases you use, which struck a chord, certainly I found it interesting, is this: “Internet platforms and service providers have much to do in this space. Current transparency efforts lack the ambition and the oversight necessary to rebuild consumer confidence and to sustain a license to operate.”

What do you mean by license to operate?

Jonny Shipp, Internet Commission

Well, companies have to have, broadly, the blessing of the communities that they work in in order to continue to survive. So “license to operate”, in my understanding, is about the permission from society to carry on their business. It’s not just about goodwill; it comes out in the law as well, and that can be soft law as well as hard law.

So a license to operate is the permission of society, through different mechanisms, for the company to carry on its business. And we’ve seen some initial challenges to companies operating in the online space, so they need to pay attention to this.

Neil Fairbrother

Okay. Now we’ve been talking about transparency and we’ve also been talking about accountability, and in the mix of all of this is technology and AI, Artificial Intelligence, which can be used, and to some extent is being used, to tackle online harms. But how do you ensure that the AI is doing the right thing? Because of course AI itself is a product of human beings, and it’s difficult not to build in human bias. I think we’ve seen the results of human bias in some AI systems, albeit potentially unwitting, so wouldn’t that exacerbate the problem?

And it’s interesting, the Electronic Frontier Foundation, the EFF, are quite cautious in their view about how AI should be used, because the whole point of AI is that it creates its own, in inverted commas, “intelligence”. It has a different way of thinking, or of coming to a conclusion, given a set of data. And there’s a lot of secret sauce in how that works; there’s going to be a lot of commercial confidentiality. So is there a transparency issue there, as well as an accountability issue, as you start to bring in AI?

Jonny Shipp, Internet Commission

Well, again, transparency is only useful insofar as you’re asking the right question. So if you’re asking the right question of the AI environment, then maybe transparency has a role. I’m sure it has a role, but the more important thing is that the owners of AI technology are accountable for the way they deploy it, so that they can show they are deploying it in a responsible way and that the social impacts are positive.

It’s absolutely right that AI can be deployed, and it may be for good or for bad. I believe it should be the responsibility of the owners of the AI, the organisations that deploy it and actually use it for impact in the world, to make sure that impact is positive for people, and they should be accountable for ensuring that.

Neil Fairbrother

Who makes that judgment call on whether it’s good for people?

Jonny Shipp, Internet Commission

That goes back to this question about the SDGs, the Sustainable Development Goals. Someone’s got to have a vision of what “good” looks like. And I say someone, well of course the government has a view of that and they will make rules. But actually, we’re talking about, in some sense, places which are a bit beyond the reach of any national government.

So it then has to be about the responsibility of the actors, and if they’re global corporations, then there is an important question about how to ensure the responsibility of those corporations and how to ensure that they do act for a better world. That’s a developing situation, but we need global corporations to be ethical, and that needs to be increasingly embedded in the rules under which they operate at a global level.

Neil Fairbrother

Yes. And in fact, interestingly, we saw recently, I think it was Verizon, shareholders placing child safety at the centre of shareholder action, so maybe the tide is turning in that sense.

Jonny Shipp, Internet Commission

Well, there have been a few of those. A year and a half ago or so there was similar pressure on Apple, and yes, Verizon, the first ever shareholder vote on child exploitation, apparently backed by 34% of its shareholders. Harnessing the owners of the companies in this journey towards more ethical corporations is obviously a very important thing. I think Government could actually do more to encourage that kind of activity.

Neil Fairbrother

Okay. Now if they are more involved, or if legislation is more involved, it’s a difficult thing to do, because you acknowledge that the processes by which online content is created, controlled, distributed, targeted, curated and promoted are complex and dynamic, and that advertising fraud, misinformation and personal data leaks suggest there is a real need for legislation.

But you also say that lawmakers are ill-equipped to regulate the new global processes that only a small number of private actors know and understand.

So you’ve got a conflict here, I think, between a not-well-enough-informed regulator and Government, and Government is notoriously bad at IT and technology, and this dynamic, unregulated space. The former wants to regulate the latter, so isn’t there a conflict here?

Jonny Shipp, Internet Commission

Well, it’s very difficult, but it comes back to what I was saying about the purpose of the Internet Commission, which is to bring some more evidence to that situation. Policymakers need to understand more about the processes at work in the Internet companies. The companies are not easily compelled to provide that information, or at least it’s hard for policymakers to know exactly which information to compel them to release.

So we need to find a mechanism for building an understanding on behalf of everyone, really, so that policymakers can have a sharper focus on the areas that they need to regulate, and so that companies don’t suffer collateral damage, from their point of view, by being regulated in areas which didn’t need to be regulated.

So it’s in both parties’ interests to find a mechanism for bringing a richer understanding of what is going on into the discussions and deliberations about policy, in a way that doesn’t compromise commercial operations. Companies are allowed to operate with commercial secrets, and have to operate in that way. So that’s what the Internet Commission is setting out to do: to play that intermediary role, brokering a fair and independent analysis of what is going on in content moderation at scale, in order that the processes can be benchmarked and improved.

Neil Fairbrother

Okay. Your accountability model, and we’ll put a link to this on the SafeToNet Foundation’s website, is really quite neat. It’s got nine key areas, some of which seem to bear directly on the online experience of children, in terms of Truth, Respect, Inclusion, Wellbeing, Safety and Security. So how does that particular spider-diagram model work?

Jonny Shipp, Internet Commission

Our “beautiful flower”, as we call it, is a mapping of the digital responsibility issues that ultimately are in scope of that better world. So, if we’re looking for a better internet for a better world, there is a whole bunch of digital responsibility issues that we think need to be tackled.

Our Accountability Framework is not attempting to look at all of those issues straight away. We’re looking at issues which are linked directly, or can be linked directly, back to issues of content and content moderation. So that’s our first area, and as we develop in the future, we’ll look to have other areas of independent evaluation and new frameworks which address other parts of that model. The impact of AI on employment is on there, which is obviously not directly to do with content. Issues of access and inclusion are important in terms of the impact of technology, but they’re not our current focus in terms of our framework. But we may get to them in the future.

Neil Fairbrother

A lot of moderation today is done by people; it’s a manual process, which has issues with scalability and cost. But there is already a set of established principles, the Santa Clara Principles on Content Moderation. Who created them? What was behind them?

Jonny Shipp, Internet Commission

The Santa Clara conference brought together leading academics and industry around the challenge of moderation at scale. Santa Clara Law led the process, I think a year and a half ago, and came up with the Santa Clara Principles, which we took as one of the big inputs to our accountability framework.

Neil Fairbrother

And are all the big social media companies using these guidelines, do you know?

Jonny Shipp, Internet Commission

I don’t, I mean that’s partly what we’re trying to find out, but at least they are established as some kind of best practice. And the big companies were certainly all involved in the discussions.

Neil Fairbrother

Any form of moderation will be decried by some as censorship, as an infringement of freedom of speech and liberty and so on. Can the online or digital context be a safe space for children with no moderation at all? Isn’t this a compromise that we’re just going to have to accept, whether or not it is an infringement of civil liberties or freedom of speech, for the good of the children?

Jonny Shipp, Internet Commission

Well, I think it’s there already, but there’s a reasonable argument that it needs to be kept in check. I don’t want to live in a society where everything is completely monitored, and I have doubts about the way in which this operates in other parts of the world. I think in Europe we have an opportunity: we sit in a sort of odd position between China and the US, the maybe more extreme places where there are more extreme philosophies on this.

And I think the challenge and opportunity for Europe is to find the balance, and that’s being played out in practice in these companies where the balance is being found between freedom of expression and privacy and safety. And both people and machines are part of that mix. What we need to have is a way of demonstrating that that balance is being found in an appropriate way, and firms need to be accountable for that.

Neil Fairbrother

Now we’re at a funny point in history for the UK, because we have voted to leave the EU; you mentioned Europe. It seems to me that there’s an argument for scale here: the bigger the numbers, the stronger the argument. Being part of a region of half a billion people, which has got some rules around content and moderation and all those kinds of things, the license to operate as you termed it, you’d have more impact as part of that than as a small island with 66 or 70 million people in it. So is this a good thing or a bad thing, or does it make no difference?

Jonny Shipp, Internet Commission

Well, there’s been a lot of interest in the [UK Government’s Online Harms] White Paper in Europe. I think lots of other European governments are looking to see what the UK Government can come up with. It’s a massive undertaking to try and regulate this space; they’ve had a go at it, and I think the other countries are interested and will be informed.

The ultimate shape will be determined collaboratively across the world, really. I think it’s fairly certain that the UK will still be involved as part of a European approach to this. I mean, I speak broadly; I use the word Europe as opposed to China and America. We’re not going to suddenly become Chinese, or part of the United States, if we leave the EU!

Neil Fairbrother

Indeed. In your briefing paper, “Values for Digital Responsibility: Agency, Intention and Stewardship”, there’s a nice little equation you’ve used that says “Accountability = Transparency plus Values”. What do you mean by that?

Jonny Shipp, Internet Commission

We’ve spoken about transparency not being a silver bullet on its own; you need to apply a perspective to the questions. That’s what I’ve been saying: in order to reach accountability, you have to have a directed transparency.

So some questions have to be asked which are independent and grounded in some sense of European values. This was some of the work that we were doing to link our question set with what the “better world” looks like and with the Sustainable Development Goals.

Neil Fairbrother

Coming back to the Government’s Online Harms white paper, they very strongly suggest that we need some kind of regulator for the Internet. Who could this be? Is it an existing organisation or a new one? Does a new bureaucracy need to be created?

Jonny Shipp, Internet Commission

I think they’ve pointed quite strongly to Ofcom, at least in the short term, and maybe there’ll be a new internet regulator later. They were clear there will be a regulator, and it will take some time to appear. My main thought about the question of the regulator is: what happens in the meantime?

I think this process will continue, but there’s no reason for inaction in the meantime. The Internet Commission has something ready to go now, that we’re working on, which can help to bring a bit of light to this situation, and we don’t have to wait for the regulator to be in place in order to help shape policy in that way.

When the regulator is in place, the Internet Commission’s activity will have developed, and hopefully we’ll be in a good place to help the regulator at that point, as I said, providing insight into the way companies are operating in this space, in order to make sure regulation is sensible and smart.

Neil Fairbrother

Jonny, on that note, we’re going to have to wrap it up. We’ve run out of time, I’m afraid, so thank you so much for your time.

Jonny Shipp

You’re welcome. Thank you.