Safeguarding podcast – The Digital Sweetshop with Baroness Beeban Kidron

In this safeguarding podcast with Baroness Beeban Kidron of 5Rights, we discuss the rights of children online. What are the moral and legal obligations of businesses as far as children are concerned? How can large-scale digital service providers encapsulate children’s rights as defined by the UN CRC into their services? What are Age Appropriate Digital Services? Is “privacy” absolute, and is that all that’s needed for safety? And is Section 230 the root of all evil?

There’s a transcript below, lightly edited for legibility, for those who prefer to read or who can’t use podcasts.

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children.

Neil Fairbrother

Children haven’t always had rights online; indeed, the concept of the rights of the child is a relatively recent thing, codified by the UN Convention on the Rights of the Child only in 1989. Since then the UN CRC has become the most widely ratified international agreement in history, yet there are still some countries and many organizations that have yet to recognize the full range of children’s rights that this convention describes.

One organization that is determined to ensure that the digital world at least caters for children and young people is the 5Rights Foundation and to guide us through their work, I’m joined by Baroness Beeban Kidron, OBE. Welcome to the podcast, Beeban.

Baroness Beeban Kidron

Thank you.

Neil Fairbrother

Beeban, could you give us a brief resumé please, so that our audience from around the world has an appreciation of your background?

Baroness Beeban Kidron

Yes, of course. I’m a slightly unusual creature in that I’ve had two lives, at the very least. For the first 30 years of my professional life I was actually a film director, and I made movies both in Hollywood and here in the UK. I made telly, I made documentaries, I made big feature films with stars, and I very much enjoyed doing it, but I did it at a time when very few women were directors. And so I’ve always had a little bit of one eye on what might not be quite right in the world.

Now towards the end of that period, I actually started a different charity, not 5Rights, but one called Film Club; it’s now called Into Film. And the idea was that children who perhaps were not getting a full range of world views, a full understanding of a broader world, would get that if they watched films. And so we set up film clubs up and down the country for state school pupils from the age of five to 19, and they watched films of all kinds, on different subjects.

And I’m saying this in long form partly because it was the reason that I was offered a place in the House of Lords. It was on the back of this incredible growth, in which one third of all UK children came to watch films in our Film Club and had a very, very educative experience; a very big culture shift.

But it was also in that context that, in 2012, I noticed something happening to children. And as many people who listen to your podcast will realize, 2012 was also the point at which the smartphone reached, roughly speaking, a price point at which parents would give it to a child. And all around me, among the hundreds of kids that I was interacting with through Film Club, I noticed a difference.

I became very interested in it and actually found myself making a film about it. It was called “In Real Life”, and it was about teenagers and the internet. Making that film transformed my life completely. And it’s because of that film, and because of watching children engaging online, that I actually set up 5Rights and that I have the life that I have now.

And if I say one thing, just one thing that it pointed out, which is really at the core of all we do: in the course of the film, I interviewed so many people who were credited with inventing the web and inventing the digital world. And they kept on saying the same thing, which is that all users would be treated equally, and they said it as this marvellous thing. And actually, at first blush, it sounds like a marvellous thing, but of course what happens is that if you treat all people equally, you treat a child as if they’re an adult.

Neil Fairbrother

Indeed. And therein lies the problem, I believe. Now, why is 5Rights called 5Rights?

Baroness Beeban Kidron

It’s called 5Rights because originally, when it was first set up, we really wanted to look at how to transfer children’s rights into an online setting, and at the time everybody was talking about digital rights. I was getting quite cross because, as you’ve already said in your introduction, children have existing rights; the UN CRC has been there for 30 years. In fact, for the same 30 years that we’ve had the internet, children have also had those rights.

What we must do is recognize that their digital life is the same for them. It’s their lived experience. There is no difference for them between on and off[line], and the sorts of norms, expectations and rights that they enjoy offline must actually be embedded online. So the 5Rights work, let me try and find a subtle way of saying it, was our attempt to say: look, here’s the UN CRC; if you don’t think too hard, in the digital world it means these things.

Now, since that time we’ve grown and we’ve developed, and as I’m sure you know, one of our very major projects is actually doing the addendum to the Convention on the Rights of the Child, the General Comment on the digital world. So we are actually going to do that piece of work formally for the Committee on the Rights of the Child, and how children’s rights manifest in the digital world will be adopted by nation states by the end of next year.

Neil Fairbrother

Okay. And you’ve created what you call the 5Rights Framework, which I think is reasonably well established. What is the 5Rights Framework, and how will it feed into the work you are doing on the UN CRC?

Baroness Beeban Kidron

So really what it looks at is this: I think that people are very, very concerned, and a lot of the time they see the problem for children as being one of contact. But what they don’t look at is really the question of design, of service, of what the business model is and how that discriminates against children and how that problematizes childhood.

And so we work in five different areas. We work in data protection. We work in the actual design of services: are the aspects of design good for children, safe for children? We look at child online protection in a very holistic way, making sure that absolutely everything has been considered. We look at embedding children’s rights as a principle into the design and into the services that children use. And then finally, we really look at children’s participation.

If the digital world did recognize children, it would know that one in three of its customers, one in three of its users, are in fact under 18. And there’s not enough understanding of what those children’s needs are, nor what their attitudes are. And so the fifth thing we do is work very, very closely with children, to help them understand what’s happening to them and to capture their feelings and, actually, very often their solutions to the digital problems that we all face.

Neil Fairbrother

Yes. And that digital participation is partly reflected in the UN CRC, because of course children have a right to participate, which is enshrined in the CRC. But it’s also encapsulated by service design thinking, which is one approach to creating digital services. How do you think that large-scale online digital service providers should work with children in the context of what you’re calling digital participation? I’d like to explore some of the other aspects that you’ve mentioned as well, but let’s start with digital participation.

Baroness Beeban Kidron

It’s a really good question. I always say, when I do my public speaking, that if I had five pounds for every time a tech designer, entrepreneur, governance person or executive said to me, “I never thought about that before”, I would be able to fund quite a lot of design myself. They always say this phrase, and the reason they never thought about it before is because they’re not really talking to their users, and they’re not really talking to their users from the right perspective.

So, only yesterday I was with a bunch of kids who were taking the text of the General Comment on the Convention and helping me make a child-friendly version. One of the things that they talked about was not just language. We’re used to saying, in our environments: okay, it should be clear to children what they’re getting into, it should be presented in language that they understand.

But this young boy said to me, “No, no, no. It’s not just about language. It’s also about form. I actually would like to see it in a video. I learn everything in video. Can I have a video?” And then, you know, a young woman said, “No, no, actually I would like it in a podcast, and I would like to have someone my own age telling me what it’s all about, because I would find that more acceptable.”

Anyway, we got into this conversation about language and about format, and suddenly an 11 year old sitting somewhere up in Scotland turned around and said, “Well, I don’t think it’s about the words. And I don’t think it’s about the picture. I think it’s about the timing. If I want to get onto a service in the moment, I will agree to anything. It doesn’t matter. You know, I mean, literally I will agree to anything. But maybe in a cooler moment, when I’ve had a look and I’ve had a see, perhaps if they asked me then, I might be in a situation where I think maybe that’s not worth that.”

So you need both the clarity and you need the understanding of where a child’s emotional state is, and not to exploit the timing. Now, I think we know that, and some researchers know that, but think about the fact that an 11 year old child said that yesterday. I’ve had that again and again: about security, about swear words, about bullying, about CSE [child sexual exploitation], about all sorts of privacy issues. Children have really creative ideas, but they also have a very, very clear understanding of their experience and what is wrong for them, and they will say so.

So the tech companies could do a great deal more to understand that, but that does take a commitment to do what the children say. And I will say that I have had some very poor experiences of running workshops where the clear outcomes are seen but the tech companies are not willing to act on them. So I think we have to not be naive in this. It’s not just about having a bunch of kids on your bean bags and then ignoring what they say; there is absolutely, in the heart of children and in the imagination of children, an understanding of what would make their experience better.

Neil Fairbrother

Okay. And that links nicely into another one of your key areas, which is service design. I think you’re working with the IEEE to create standards for what you’re calling Age Appropriate Digital Services. Who are the IEEE? Why have you chosen them, and what are the standards? How will they work? How would they be measured?

Baroness Beeban Kidron

Well, okay. So the IEEE is actually the Institute of Electrical and Electronics Engineers. It’s the largest professional engineers’ association in the world, and I believe they have something like a quarter of a million engineers as members around the world. It does many, many things, but one of the central things it does is create standards. So anybody who’s listening who has used WiFi today has very likely used an IEEE standard for what good WiFi is. And if you’ve driven a car, you’ve probably got a little thing on the windscreen that says this glass meets that standard. So basically it’s an international standards organization.

And so we are working with them because they are experts and they have engineers all over the world. We are working with them because they have standards and a standards program that is picked up by companies all over the world. And we are working with them because I had a wonderful meeting with the head of their standards organization, and he was talking about how worried his membership was that digital engineering was getting a bad name. He pointed out that there are a handful of companies that determine the culture, but actually there are hundreds of thousands of companies that use digital technology that would like to do the right thing, but they don’t know what the right thing is.

And I said, you know, as one does: well, why don’t we look to create some age appropriate standards? Let’s start putting the best interests of children, let’s start putting children’s rights, at the core of engineering standards, and let’s start looking at what that would mean in terms of rejigging the system. And the one that we’re in the middle of right now is about age appropriate published terms.

Published terms are the terms and conditions, community rules and privacy notices. We’re looking at what it would be to reimagine those three sets of published terms as being age appropriate: what questions you’d have to ask yourself, and then what steps you would have to take in designing your service, having answered those questions, in order to engage with children.

And we’ve been doing that with a group of engineers right across the globe. Our meetings are at 7:00 PM UK time, but we have people joining at 4:00 AM and 11 in the morning from all over the world. It’s been a hugely intellectually challenging process, but it’s going to have a very, very practical output, with a standard probably early next year that companies will be able to adopt; those that want to treat children right. And I would like to emphasize that virtually all companies are beginning to be digital in some form, and many of them want to do the right thing by kids.

Neil Fairbrother

Indeed, indeed. So the third area that you mentioned was child online protection. Protection is often couched in terms of privacy, using technology such as encryption, but safety is also important for protection, and yet there seems to be an area of conflict here. Frequently when you talk about safety, people from, if you want to use this term, the “privacy camp” start to say: well, you can’t have less privacy. You have to have lots of privacy, you’ve got to have lots of encryption, and that’s what makes people safe. But clearly that kind of privacy can make people less safe; we all know, I’m sure, that encrypted video streaming services are used for online child abuse. So clearly safety is not as important as privacy in some people’s minds. Do you think that that is the case, or is there a case that says that privacy is all?

Baroness Beeban Kidron

So the quick answer to the last bit of your question, is there a case that privacy is all? No. All rights have to be put in balance with other rights, and all people’s rights have to be balanced with other people’s rights. That is what makes us communities and human, and why we bother making rights frameworks: so that we can actually balance out these things.

But I actually want to challenge a little bit, not your question so much as this idea about privacy. I think that we have to think about privacy in more than one form. We all fall into the companies’ hands when we talk about privacy, because what it seems to mean is privacy either from each other, or privacy from the State; it doesn’t necessarily mean privacy from them.

So when they offer perfect privacy, I think that we all have to ask where the money is, what they know about what’s happening on the device, what they can find out from other sources and from traffic and meta[data], and what happens to the data when it’s open on your phone, when it’s landed, after it’s been encrypted. So there are a lot of questions I think you have to ask first about what the privacy offer is, because otherwise we get into a really ridiculous set of things.

The other thing is we also have to ask who we’re making things private for, and for what purpose. I’ve had really sensible conversations about, you know, people in autocratic states and citizen journalists and so on, and those people do need protections; we must offer those people protections. But I’ve also had a lot of very ludicrous conversations about, you know, the adult male right to access pornography without being exposed being a greater right than a child’s right to be protected from child sexual abuse material, because, you know, somehow in protecting the child we’re going to expose the other one.

Well, I think that at the point at which an adult male’s right to access what is legal pornography puts a child in danger, we have to change some of our thoughts and attitudes about pornography, and just not make that a risk for that adult, because it’s ludicrous. So I think the trouble is your question has a very complicated answer.

And the other thing, just before I even get to end-to-end encryption, is that actually there are layers of anonymity, there are layers of privacy, and there are layers of protections that the platforms afford. So, you know, I think a lot of us are very worried that Zoom [for example] offers a service until the Chinese State says, well, you can’t serve Chinese rights activists, and then they don’t offer the service in China, or to those rights activists.

So actually they make all sorts of moral choices all the time, according to all sorts of environments and pressures. And again, it’s about a hierarchy of what is most important. So let me answer and give you my attitude: once you start unpicking it all, you start saying, hey, you can know who someone is but not share it; you can let us all know who someone is; you can protect people in particular environments or for particular reasons. There are many more options than on or off. And I would say that you may not withdraw any oversight of child sexual abuse material or grooming that we currently have without putting something equal or better in place.

It is outrageous that they are suggesting they do so, and I think the real reason is money, not privacy; the real reason is reputation, not privacy. It’s an abject moral failure not to consider children’s safety on top. And I don’t mean by being in every single person’s pocket; I mean in all sorts of other ways, and there are other technical ways. But I want to just challenge anyone who’s listening: do you know any other business where it would be acceptable, on the balance sheet, to say that child sexual abuse at scale was part of doing their business? If they can’t run that business without it, then they should go out of business. They’re not fit, I’m afraid.

Neil Fairbrother

Yes, indeed. And that brings us into an area I think we spoke about in your fairly recent webinar, which is Section 230 of the Communications Decency Act. Is that the root of all evil? Should companies who offer up content, whether it originated from the general public or not, if they offer content up on an algorithmically curated basis, does that not make them publishers, and should they therefore be treated as such?

Baroness Beeban Kidron

Well, I think your question is its own answer. I mean, I think any business should be treated as a business, and it’s a price of business to be responsible for your service. So absolutely, 230 is out of date. It’s based on the idea that they’re cables and conduits, and they’re not. But I think that we don’t have to concern ourselves only with 230, although I do think we need to do that. The other part of your question is that at the point at which they are rating and ranking, at the point at which they have algorithmic preferences based on any kind of value, whether it’s monetary value or popular value, whatever it is, at the point at which they choose it, then I think that is clearly a place where regulation can come in, irrespective of 230.

That is to say, there are more tools in the tool shed. And that’s why, as you mentioned earlier, data protection is one of the few tools that we’ve actually managed to manifest in the real world to make a difference to the user experience and the user’s rights: because it’s talking about their interaction and their choices and what they take of ours. Yeah? And I think that the next battleground will indeed be recommendation loops and algorithmic preferences and so on, and that stands absolutely separate from any question of 230.

But yes. I mean, I’m sorry, it’s a bit 101, isn’t it? If you are a business, you’re responsible for the harms that your business creates, whether they be societal or individual. That is the basis upon which all businesses function, and if they’re not a business, I don’t know what they are.

Neil Fairbrother

Okay. Now, just to come back to the five areas that you specialize in, the fourth area is children’s and young people’s rights. Should children’s rights under the UN CRC, particularly now we are all so connected in a digital online world, be formally taught in schools?

Baroness Beeban Kidron

Well, I would love to see that, absolutely. And I think that there is a terrible thing in education where we’ve sort of extracted wisdom and inserted nudge, you know, where they get very narrow things and they tick the box and learn this thing and so on. The wisdom of how people live together and the wisdom of the past, it’s something that’s missing.

I’d like to see a bit more philosophy, a few more rights discussions, and perhaps a little bit more household management, accounting, cooking, carpentry and driving. But you know what, I think my views on pedagogy may be surplus to this discussion. I think there’s something more important than teaching it in schools, which is actually living up to those rights that we have already offered children.

And we have set them out; you said the word, it’s “codified”. Children’s rights are codified, and they’re codified on the basis that it is bleeding obvious that someone who is young cannot necessarily look after themselves in every way, act in their own best interests at all times, or have the maturity of an adult.

And we know that there are developmental reasons, physical reasons, emotional reasons for that to be the case. So the reason for codifying children’s rights is that they get them irrespective of whether they know them, irrespective of whether they’ve learned them and irrespective of whether anybody thinks they’re right. They are their rights. And what we have to do is have digital services that embody those rights above and beyond children’s ability to enact them on their own behalf.

Neil Fairbrother

And this leads nicely onto the final, fifth part of your areas of focus, which is data literacy. Children are often regarded as digital natives; they’re app savvy. What’s the problem?

Baroness Beeban Kidron

Well, it is a sort of funny thing. It’s like, you know, just because your kid can do the plastic track across your living room doesn’t mean you put them in an HGV on the public road. I mean, I think it’s a sort of ludicrous thing, this idea of digital natives, because it’s not about understanding what they’re using. So yeah, they’re quick with their thumbs, and they have a facility for using the technology. Well, the technology is very well designed to need only a very low-grade facility, and once you get the idea of it, it works actually very well.

But if you ask any parent, any teacher, or actually most children: is it designed in your best interest? Does it do what you want it to do? Is it causing you any particular issues that you wish it wouldn’t? You get a whole host of things, from “I wish it was a bit less demanding” to “It’s waking me up all night” to “I can’t get them off” to “It’s actually very attention grabbing” to “I see things I don’t want to see” to “It polarizes my friendship groups.”

I mean, I could go on to the end of the podcast, and the truth is that everybody can see, everybody knows, and everybody can articulate all sorts of problems. So if there are all these problems, then how come it’s just okay that they’re quick with their two thumbs? Just because a seven year old kid can cross the road in front of you to the corner shop, with you standing on the front doorstep in a fit of independence, would you actually say, “Off you go to school. I’m not going to tell you where school is. I’m not going to tell you when it starts. I’m not going to tell you how to get there or what its name is. Off you go”? Well, that’s ludicrous, isn’t it? So I think we’ve just got to be more careful in the language we use. They’re not digital natives, they’re children.

Neil Fairbrother

Indeed. Now, 5Rights has recently launched a project called “Freedom, Security, Privacy, and the Future of Childhood in the Digital World”, and as part of that you’ve published a book, which I have to say is an absolutely fascinating read, with contributions from many leading lights in the areas we’ve been talking about. Obviously we don’t have time to go through everything, but just to give a flavour I would like to ask four questions, and as we are running short of time, if you could keep your answers brief, Beeban, that would be fantastic, or as brief as possible. So, from the first area, Freedom, as an example: Susie Alegre refers to something called the “forum internum”, which is the right to keep our thoughts private and free from manipulation. Is this something you think that social media in particular ignores?

Baroness Beeban Kidron

Oh, absolutely. And if you think about that sort of idea, that Facebook knows your sexuality before you do, or that they know what you’re going to buy next, or that Google knows where you want to go on holiday before you even think it, you know for a fact that the amount of information they hold, plus the amount of information they hold on other people, plus the nudges, means that they actually suggest things before you have thought them. And then there’s a muddle between what is your interior life and what is their suggestion. I think there’s a lot of evidence to say that that is very problematic, particularly when you’re at a developmental stage where you’re not fully formed.

Neil Fairbrother

Indeed. Dr Ian Levy, writing in the Security part of this fascinating book, discusses the need for “safe software”. What does he mean by safe software?

Baroness Beeban Kidron

Well, I do think that’s a wonderful essay. I mean, you’re right, it’s a wonderful book, and I didn’t write it so I can say so. But Ian really is taking our concept of designing the digital world to be fit for childhood to another level. He’s just saying: safe software. Build the thing with safety in mind. I think this is something people forget. This is an entirely man- and woman-made world. Everything in it is engineered, and therefore everything in it can be changed. So what he’s talking about is, instead of letting everything go and then looking at the harms and chasing after them, think about software from the get-go as a thing that needs to be safe. And the best way I have of saying it is, you know, we do like our medicines tested, don’t we?

Neil Fairbrother

We do indeed. Francesca Fajado is a young participant in the 5Rights Data Literacy workshop, and she argued in her contribution to the Privacy section that her data, well, all of our data in fact, irrespective of age, is being, quote unquote, “peddled away by conmen”, which I think is possibly a shorthand way of saying what you’ve been saying throughout this podcast?

Baroness Beeban Kidron

I think it is, but of course Francesca said it better than I do. She’s 18 now; she was 17 when she did our course, and she represents a lot of young people in that they have no idea that their phone knows their gait and therefore their height, their breath, their heartbeat, their sexuality, you know? They don’t know that they’ve signed away the rights for someone to listen to their conversations, and they are appalled when they find out. I think that young people are very outraged at this sort of fundamental unfairness of what the deal is, and I think that they would be prepared to settle for a bit less, for a bit more privacy.

Neil Fairbrother

In one of the essays towards the end of the book, in the Future of Childhood in a Digital World section, Jay Harman, who is one of your team members, wrote an interesting article where he said that we could learn two lessons from the famous Babel Fish of The Hitchhiker’s Guide to the Galaxy. One is the law of unintended consequences, and the other was a quote: “Every virtue, if carried to the extreme, becomes a vice”. What does he mean by those two lessons that we could learn?

Baroness Beeban Kidron

Well, starting with the end: I have on my screensaver, it says “Ice cream makes you happy”, yeah? I’m very committed to ice cream, but if I ate nothing else, you know, it would not do me well. And I think that very often we talk about the endless possibilities of the digital world, but endless possibility is not always what you need. Sometimes you need someone to take care of you, or to think about your needs, or perhaps even to constrain you.

But I think that, you know, if you think about the digital world like a digital sweet shop, it’s not just a sweet shop. There’s also a proprietor at the door saying, don’t go out, don’t go out, come and have some more, come and have some more. So I think that’s, you know, too much of a good thing. It’s not that digital is bad. Yeah?

And on the unintended consequences, I actually don’t know; I think we’d have to ask Jay exactly. But I do think there’s something that is really worth saying as a guiding principle. It’s not that everybody who makes digital things is automatically bad, automatically wilful, automatically thoughtless, but we are way beyond the nursery slopes. These are the biggest companies in the world. They have the biggest capital value and the biggest revenue, and actually it is not okay to not understand what it’s doing to children, to not understand the anxiety levels, to not deal with the self-harm on your platforms, to not deal with the child sexual abuse, to not stop introducing strangers to kids by automated friend requests.

Just because you think your service is bringing people together, it doesn’t make it okay for you to bring a predator together with a child. So I think the truth is, it’s not about what the intention is, it’s about what the actual reality is, and that is the price of doing business. If you want to be a business in the real world, you have to take care of your customers.

Neil Fairbrother

What conclusion, Beeban, can we draw from this book? Is the future of children in a digital world bright?

Baroness Beeban Kidron

Well, it has to be bright. I mean, all of our futures are entirely dependent on us building a digital world that children deserve: one that is appropriate and kinder and more equitable, and doesn’t nudge them around. One of the very important principles of 5Rights is, you know, we love tech, we use tech, we want kids on tech; we’re not blockers and banners. What we want is a more equitable design and moral system that says: actually, taking care of kids is a price of doing business. We will do that first, and then we’ll check our profit margin. And that is an absolute for us.

And so the best interests of the child must come first, and every now and again someone’s going to have to sacrifice for that. But the truth of the matter is we do need a good digital world, because if we don’t have that, we’re going to have very deformed kids in a very deformed world, and it’s going to be very difficult to have the world we want, because this is indeed our future, both the technology and the kids.

Neil Fairbrother

Beeban, I could talk to you all day, but unfortunately we have run out of time. Thank you so much for your time and insights; absolutely fascinating. And where can people find this book that I’ve read?

Baroness Beeban Kidron

Oh, it’s on the 5Rights website. We do have hard copies for people who refuse to read books online, and they can get in touch with us at info@5rightsfoundation.com. But it is actually available on a microsite via our website.

Neil Fairbrother

Brilliant. Thank you so much.

 
