Safeguarding podcast – the Age Appropriate Design Code with Steve Wood, ICO

In this Safeguarding Podcast I’m joined by Steve Wood, Deputy Commissioner and Executive Director of the Information Commissioner’s Office to discuss the Age Appropriate Design Code.

Online data sits at the heart of the digital services that children use every day. From the moment a young person opens an app, data is gathered about them. What is the Age Appropriate Design Code, who does it apply to and how does it help protect children’s most personal data?

There’s a lightly edited transcription below for those that can’t use podcasts, or for those that simply prefer to read:

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children in the online digital context.

Neil Fairbrother

Online data sits at the heart of the digital services that children use every day. From the moment a young person opens an app, plays a game or loads a website, data begins to be gathered about them, which raises some questions: who’s using the service? How are they using it, and how frequently? Where from, and on what device? And what possible harms might all of this represent to children?

To help guide us through this topic and their Age Appropriate Design Code, I’m joined by Steve Wood, Deputy Commissioner and Executive Director of the Information Commissioner’s Office, the ICO. Welcome to the podcast, Steve.

Steve Wood, ICO

Good morning!

Neil Fairbrother

Steve, could you give us a brief resumé of your background, please, so our listeners from around the world understand a little bit more about your experience?

Steve Wood, ICO

Yes. Thank you. So I’m Deputy Commissioner at the ICO, we’re the UK’s independent data protection regulator. My job at the ICO is to oversee our approach to regulatory strategy, so to make sure we’re thinking ahead to future challenges, thinking about joining together what we’re understanding about new technologies, changes in society, and how data protection legislation protects people from the greatest harms, but also allows the wider public to enjoy the benefits of services that use data as well. So our role is to both enable and protect. We also operate internationally. The other thing under my portfolio is that I’m also Chair of the OECD’s Working Party on Data Governance and Privacy.

Neil Fairbrother

And the OECD is?

Steve Wood, ICO

The OECD is the Organization for Economic Cooperation and Development. So it’s an international organization based in Paris, which most major economies are members of.

Neil Fairbrother

Okay, thank you for that. Now earlier this year, you introduced your Age Appropriate Design Code. Who does that apply to and what is its purpose?

Steve Wood, ICO

So the Age Appropriate Design Code was really an additional requirement brought in by the UK parliament to sit alongside the data protection law that we already have in the UK, the GDPR, which most of your listeners will be familiar with. So the requirements that the UK parliament added were for the ICO to produce an Age Appropriate Design Code, which was aimed at online services which are likely to be accessed by children.

So any of those services where there is evidence that children are likely to be using them, the providers of these services need to take account of the standards set out in the Age Appropriate Design Code, which really focus on taking account of the best interests of the child when services are designed, and on putting the child at the forefront of thinking when services are collecting data from children. That’s broken down into 15 practical standards in the Code. The Code has now been laid before parliament, it’s completed its process there and is now going into a transitional period of a year, after which it will come fully into force.

Neil Fairbrother

Okay. Now in the Explanatory Memorandum for the Age Appropriate Design Code, it says that “…the territorial extent and application of the Code is the United Kingdom”, but many of the social media platforms, whether they are designed intentionally or otherwise for use by children, are based in other countries, notably the USA, where they operate under very different laws and regimes such as the Children’s Online Privacy Protection Act, otherwise known as COPPA. So how does the Age Appropriate Design Code apply to them?

Steve Wood, ICO

Essentially the aim of the Age Appropriate Design Code is really to follow the jurisdiction of data protection law (which I mentioned earlier, the GDPR), which essentially covers services which are delivered to children wherever those services are based in the world: if they’re directed to children in the UK, then the requirements of data protection law can apply and therefore the Age Appropriate Design Code can apply as well.

Neil Fairbrother

Okay. When it comes to children under Section 123(3) of the Data Protection Act, the Information Commissioner is required to prepare the Code or amendments to it in consultation with children, which I think is a particularly interesting area. What steps did you take to include children in this? How were they involved and what was their contribution?

Steve Wood, ICO

Yes, it was a really important requirement that Parliament placed on us there to make sure that we fully took account of the views of children and also parents in this process, as well as the normal formal consultation we would do with business. So we realized we would need to really involve and engage other experts to help us get the information we needed, so we went out and commissioned an expert company called Revealing Reality to run a series of surveys and focus groups with children to better understand their views and really to ground it in the reality of how children were using online services, and also to understand the challenges and the feelings and the reactions children had to discussions about how their data was used by these services as well.

So we went through a series of pieces of research, which then enabled us to understand how best to set out the requirements in the Code, in the standards, so that the areas which we knew children cared about would work in practice and would address the underlying concerns, and also to understand the language and the concepts which would work best for children, particularly taking account of a wide range of ages.

Neil Fairbrother

Yes. That in particular, must’ve been quite a challenge because obviously what we’re talking about is a complex topic and you addressed children across a range of ages from the youngest of 3 up to 18. How did you manage to convey these complexities to children, say a 10 or 13 year old who might be more typical of a social media user?

Steve Wood, ICO

We got the researchers to use a range of concepts which would work for children depending on their age. So thinking about whether children could understand what data is and how data might be collected about them obviously would increase over a spectrum as children got older, and with younger children we were obviously also thinking about getting the views of parents as well.

But it’s important we do understand the full spectrum of children’s use of the online services, because I think the phrase we coined a few years ago was, you know, children are learning to use iPads before they’re riding bikes, and it’s important that the different contexts are taken account of, but equally as well, it’s important we understand that all of the online services are different as well.

So we don’t want to be prescriptive in saying exactly what online services should do. It’s up to these online services to understand where the range of their users is most likely to be, and then focus their efforts in terms of the language, the transparency, the information they provide, so that it works best for their users as well.

I think equally as well, it’s important to stress that the Code is not about getting children off the internet. It’s about recognizing that it’s a key part of a child’s development these days to start using these online services at an early age, and they’ll also be used in schools as well. And so it’s about children learning to explore and play online, but finding the right environment for them to sort of progressively learn how to control their data over time as they get older and more confident, and to provide them with the right space really in the default settings and the right protections for them.

Neil Fairbrother

Okay. Now the ICO’s Regulatory Action Policy states that “…the ICO has a range of enforcement and sanctioning powers from light to severe including monetary penalties of up to 4% of global turnover should the most serious and harmful contraventions take place.” So how does that align with the proposed regulator for the internet, which is Ofcom, and is this actually sufficient for it not to become simply the cost of doing business for these companies?

Steve Wood, ICO

Yeah, so it’s important I think to view the Age Appropriate Design Code that we’ve produced as a piece of a jigsaw, because the issue of protecting children online is a complex area. Data, as you said in your opening, is a crucial part of this because of the profiling which is done using data. But it could equally be questions about content, which we as the data protection regulator don’t cover; it’s likely that the prospective regulator, Ofcom, will cover content if they’re brought in under the new online harms legislation over the next few years by the government.

But it’s important that all of these systems work together as well, so we already work closely with Ofcom, we have a Memorandum of Understanding with them. So we know it will be important that there are joined up and clear messages going out to the outside world from both of us, and we will work together collaboratively when we have, say, common issues with the same company, et cetera, to make sure that our actions are complementary.

In terms of the actual actions we can take, as you said, we have a fining power, which is one tool we have in our cupboard for, I’d say, the most serious and systematic breaches, particularly when it looks like a company may have essentially profited from using the data of individuals and particularly children. In this context, we may use our powers in that situation to penalize and to set an important precedent, to send a message to that company and to a wider sector. So we will use our powers, but we will always use them proportionately.

But it’s important to recognize, particularly with a new area like this, that some companies just might not get it right the first time. They’ve got a year to get this right. In some circumstances, it’s better that we get alongside a company and use our powers to audit, because ultimately we’re outcome based. We want the services to improve if they’re not at the right standard, and it’s working out how serious the breach is, and whether, for example, the company has cooperated with us, as to what powers we need to use.

Ultimately as well, the fines are only one strong sanction we have. The ultimate sanction we have is to issue what we call an Enforcement Notice, which is an order which can essentially tell an organization to stop using personal data. And obviously that is as powerful a mechanism as a fine, because it can ultimately address the root cause of the concerns which have arisen as well. That can also go to the heart of how a service is designed and delivered as well. We will need to learn how to use those powers in this particular area as we start to move forward after the transitional period has completed for the Code, using the Code really as a tool in our enforcement action for assessing how companies have done in protecting children’s data.

Neil Fairbrother

Okay. Now one of the areas that you say that the Code aims to improve is that of the Freedom of Thought. What is the Freedom of Thought and how does the Code achieve protection of that?

Steve Wood, ICO

Yes, I think it’s important to recognize with children that there are many different sorts of rights at play within the overall rights of the child, and the overall conception is the best interests of the child. So that will include protecting the data and the privacy of the child, but equally the child’s right to receive information, to learn, to explore, to be educated online. Therefore all of these different factors will need to be considered in designing online services, taking account of how data is used, and equally areas like profiling, which may lead to certain types of content being delivered to a child and which therefore affects what a child learns and thinks about. So making sure the child has the right to control the settings, and to understand what content may be delivered to them on the basis of the decisions they make, is really important. So it’s recognizing that privacy isn’t just one catch-all; it’s got to operate in the real world, and freedom of expression online is a very important issue as well, which we recognize as a regulator.

Neil Fairbrother

Okay. Now you’ve mentioned the phrase “in the best interest of the child” and that is enshrined in the UN Convention on the Rights of the Child. So do you see this Age Appropriate Design Code as being almost an extension of the UNCRC?

Steve Wood, ICO

That was very much the UK Parliament’s intention when they introduced the provisions in the Data Protection Act requiring the production of this Code. They referenced the UN Convention, therefore it was clear that was sort of setting a foundation stone of what needs to be considered, which is why we have the best interest of the child as the first standard in the Code.

What we recognize, though, is that it needs breaking down, and companies have got to understand what that means in practice, because there will be trade-offs and different considerations. I think as well, it’s important to recognize that it doesn’t just mean banning things or very prescriptive measures or stopping things happening. It means really assessing the risks and understanding the impacts of how the service is designed, how those design choices affect how the child’s data will be used, and then the wider impacts of that data use on the child.

And it’s also recognizing that in some circumstances, uses of data can be positive as well. So some uses of data could be in the interests of the child if they’re related to a safety measure online, or even to how a certain feature in a service works, because that is generally how the child would expect that service to work as well. So it’s really recognizing that this is a nuanced concept, but what it can’t be is ignored. So when companies are designing the services, we want there to be a greater shift. Some companies are moving in this direction, we recognize, but we want children’s rights, and thinking about the impact on them when the services are being designed, to be a consideration right at the forefront.

Neil Fairbrother

Okay. Now there seem to be two different types of data, so to speak. There’s what you might call explicit data: the age of a child, for example, the sex of a child, the name of a child, possibly their location. But there’s also inferred data. What is meant by inferred data, and does the Age Appropriate Design Code also cover inferred data?

Steve Wood, ICO

Yes, it is an important distinction. Services, when someone signs up to them, either a child or an adult, will ask for certain information. We type it into a form; that’s generally, as you said, quite an explicit process of that data being collected. But then there’s really the behavioural data, the trails of data that we leave behind as we use the service; our actions, our behaviours, the different activities we do online are tracked essentially, and that data is recorded and can then be used essentially to create a picture of someone. And often that footprint is used, for example, to target advertising at someone or to target content they may be more interested in.

Often a user may be aware of these services and indeed may find them beneficial, but obviously it comes with a number of risks. Personal data is still being used in that context: it may be inferred data putting a child in a particular bucket to say they’re interested in that type of content, and therefore they’re going to get fed more of that content. But because that data has been used in that way, it’s still their personal data, therefore it’s still covered by Data Protection law, and the Age Appropriate Design Code of practice does cover those situations, particularly in the standards relating to profiling and detrimental use of data.
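To make the idea of inferred data a little more concrete for technically minded readers, here is a minimal, purely illustrative sketch (not taken from the Code or from any real platform; the event fields and categories are invented) of how a trail of behavioural events might be turned into an inferred interest profile of the kind Steve describes:

```python
from collections import Counter

# Hypothetical behavioural events left behind while a child uses a service.
# In a real service these would be clicks, watch times, searches, likes, etc.
events = [
    {"user": "child_42", "category": "football"},
    {"user": "child_42", "category": "football"},
    {"user": "child_42", "category": "fitness"},
    {"user": "child_42", "category": "music"},
]

def infer_interests(events, top_n=2):
    """Derive an inferred interest profile from raw behavioural data.

    The user never typed "I like football" anywhere: the label is
    inferred from their trail of activity, yet it is still personal
    data about them, so data protection law still applies.
    """
    counts = Counter(e["category"] for e in events)
    return [category for category, _ in counts.most_common(top_n)]

print(infer_interests(events))  # ['football', 'fitness']
```

The point of the sketch is the one Steve makes: the child never declared an interest anywhere, the label is derived from behaviour, yet it is still their personal data and so still falls within data protection law and the Code’s profiling standards.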

Neil Fairbrother

Okay. On the Age Appropriate Design Code, you said earlier that there are 15 standards which make up the Code, and unfortunately we don’t have time to go through all 15, although the document itself is freely available and is a very easy read, even if it is quite a long read. But let’s cover some of them; I think there are five that I’m particularly interested in. And we’ve somewhat covered the first one, which is “the best interest of the child”, but does that conflict with the “legitimate interest” argument that social media companies use to apparently sidestep consent, the age of digital consent? You know, they claim that they have a legitimate interest in the data of children because they are an advertising based organization?

Steve Wood, ICO

I think the first thing to outline is that legitimate interest as a legal basis in the law as set out in the GDPR doesn’t necessarily allow organizations to sort of sidestep, if you like, important data protection considerations. The most important opening consideration is really, is this a situation where consent is a relevant legal basis? If you can’t use consent, you can consider legitimate interests, but this very much doesn’t just give an open door to an organization to do what they want.

For example, the transparency provisions of the law still have to apply, and legitimate interest doesn’t just mean that the legitimate interest of the company, for example to conduct their business via an advertising model, overrides everything else. The consideration of legitimate interests as set out in the law requires the organization using the data to balance that legitimate interest against the rights and freedoms of the user. In this context, we’re talking about a child user, but it applies to all processing scenarios in terms of legitimate interest.

So it doesn’t really move us away from those key concepts in the Code, because you have the child user there. The organization would have to still think pretty carefully if it was relying on legitimate interests rather than consent, whether that was appropriate and making sure really it had done the assessment of the impacts of how they were going to use the data, taking account of the child’s best interests in any case.

What we’re saying to organizations is that if they make those decisions, it’s very important that’s documented and recorded, and that they go through a thorough process. So it’s not just a quick process of thinking advertising is important to our business model, therefore that will be an overriding consideration; it’s really thought through in the context in which that company is using data. And it’s also about the different safeguards which might actually mitigate some of the risks, because it is often not an “either or”: it’s whether the right safeguards can be put in place to minimize the risks as well.

Neil Fairbrother

Okay. The second standard is Data Protection Impact Assessments or DPIAs. What are DPIAs, and should these be made public on the service providers’ websites and in their apps?

Steve Wood, ICO

So Data Protection Impact Assessments, DPIAs, are an important requirement of the GDPR, the data protection law we have in the UK, which requires in essence the organization to make an assessment of the risks when they’re in one of a number of different scenarios, which could include using children’s personal data, for example, to profile.

So they must produce some documents, but it’s not meant to be, you know, a piece of paperwork, a piece of bureaucracy; it’s actually meant to be an open and clear assessment of the risks and also of what can be done to mitigate the risks. So it can also be an enabling document. It doesn’t need to lead to the organization being stopped from using data, but they’ve got to assess the risks and then think about what they can put in place to address those risks, particularly if they come out as high. Organisations should also consult when they’re conducting their Data Protection Impact Assessment.

It depends on the circumstances as to whether an organization should publish their DPIA. It’s not an absolute requirement in law. In our guidance at the ICO, we encourage it wherever possible; even if it’s not possible, for example for reasons of commercial confidentiality or an advance product launch, we encourage the organization still to conduct some form of consultation, even if they can’t release the whole document.

But certainly taking examples in the public sector in the UK, for example the NHS COVID app, the data protection impact assessment for that has been published. So it certainly is possible to publish them. It depends on the circumstances as to whether the whole document is published.

But the other thing is that we can demand to see it as a regulator. So it might often be a starting question: if we investigate a concern that’s been brought to us by a member of the public, we may ask an organization, have you done a Data Protection Impact Assessment? Can we see it? So it’s often a good starting point for an organization to demonstrate to us how thorough they have been.

Neil Fairbrother

Okay. And would you make it public?

Steve Wood, ICO

We might make the details public if we felt it was in the public interest, or if it was necessary to do so as the result of an investigation. So it’s one of those situations where it depends on the context, but because transparency is so important in Data Protection law, it will always be a key consideration to make sure the public are informed. Data Protection Impact Assessments are obviously quite technical documents in terms of the assessment and the risks, so they’re separate to the requirement in the GDPR for there to be transparency and essentially a privacy notice for individuals. So we’d also look, for example, at whether, when a user was signing up to a service or at different points while they’re using the service, they have the right information in front of them in very plain language, clear and accessible as well. Data Protection Impact Assessments can often be useful as well for sort of expert members of the public, like civil society, academics and other technical groups, to scrutinize. So it’s another layer of transparency, I would say.

Neil Fairbrother

Okay. Standard 3 refers to Age Appropriate Application and Recital 38 of the GDPR, which says that “…children are a special case with regard to the personal data, as they may be less aware of the risks, consequences, and safeguards concerned, and their rights in relation to the processing of personal data.” Does the Code require an age verification system of some sort, given that the current self-declaration processes that are used are pretty ineffectual?

Steve Wood, ICO

So this is an important topic, and actually it was one of the key areas in the consultation we ran on the draft of the Code, so we had quite a lot of feedback from industry about this area. And really we came up with the term Age Appropriate Application to set out what we were focused on here, which was not mandatory age gating, but really an appropriate system focused on risk, so that the right safeguards could be put in place depending on the level of risk in how the organization uses its data, to make sure that the standards in the Code are effective. But equally we don’t want to place unfair requirements on businesses, particularly small and medium sized businesses; if they are actually doing quite a small amount with a child’s data, then equally we respect that what they would need to do shouldn’t be disproportionate to their business.

So essentially the Code sets out a sliding scale of risk. The greater the risk, the greater the effort will need to be from the organization in verifying whether they have a child user or not, so that the right standards can be applied. The other alternative we set out is that if an organization doesn’t want to do any verification, they could apply the standards in the Code to all users, which wouldn’t really be treating adults like children; it would actually just give a high level of protection. So there are really choices for organizations in terms of what they can do.

For the higher risk areas, which have a greater aggregation and concentration of data uses, in our assessment some form of formal age verification might be necessary, so we recognize that in the Code. At the lower end of the spectrum, we also recognize that in certain circumstances self-declaration in a low risk scenario might be appropriate as well.

I think we’re recognizing that there’s a real big sweep of businesses involved here because the conception is broad, it’s any service likely to be accessed by children. Therefore, we’ve got to take a proportionate approach, I think, to make this work in practice and that’s where we learned more during our consultations with business.
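As an aside, the proportionate approach Steve describes, where the effort put into establishing age scales with the risk of the data use, or a service simply applies the child standards to everyone, can be pictured as a simple decision rule. A minimal sketch follows; the risk tiers, thresholds and assurance levels are hypothetical illustrations, not figures or categories taken from the Code:

```python
from enum import Enum
from typing import Optional

class Assurance(Enum):
    SELF_DECLARATION = "self-declaration"        # may suit genuinely low-risk data use
    AGE_ESTIMATION = "age estimation"            # stronger signals for medium risk
    FORMAL_VERIFICATION = "formal verification"  # hard verification for the highest risk

def required_assurance(risk_score: float, apply_code_to_all_users: bool) -> Optional[Assurance]:
    """Illustrative sliding scale: the riskier the data use, the greater
    the effort expected in establishing whether the user is a child.

    If the service simply gives every user the Code's child-level
    protections, it may not need to establish age at all.
    """
    if apply_code_to_all_users:
        return None  # everyone already gets the high-protection defaults
    if risk_score < 0.3:   # hypothetical threshold for "low risk"
        return Assurance.SELF_DECLARATION
    if risk_score < 0.7:   # hypothetical threshold for "medium risk"
        return Assurance.AGE_ESTIMATION
    return Assurance.FORMAL_VERIFICATION

print(required_assurance(0.8, apply_code_to_all_users=False))
# Assurance.FORMAL_VERIFICATION
```

The design choice the sketch illustrates is the one Steve sets out: either invest in age assurance in proportion to risk, or skip age assurance entirely and give everyone the higher level of protection.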

Neil Fairbrother

Okay. So is risk then a function of scale?

Steve Wood, ICO

So risk will certainly be driven by the amount of data that an organization gathers, but it will be focused on the nature of the data sets, how sensitive the data is, so how much has been learned about a child essentially in the data collection and how that data is being used in terms of the profiling, the intrusiveness really of the use of data in creating that risk profile. We tried to set that out in the Code to give some guidance for organizations to sort of work that through.

We’re also planning to do some more work with different sectors as well. That’s something we’re doing during the transitional period, because we were aware that that approach might look slightly different for an online gaming company, compared to a social media platform, compared to a connected toy manufacturer as well. So we’re working with trade bodies as well, so we can actually bring this to life and people really understand what risk means for their sector as well.

Neil Fairbrother

Standard 11 refers to Parental Controls and says that “…if your online service allows a parent or carer to monitor their child’s online activity or to track their location, you must provide an obvious sign to the child when they are being monitored.” Now, the use of parental controls to monitor or even block access to services seems to me to impact the child’s right to privacy as recognized by Article 16 of the UNCRC, and it also impacts their rights to association, to play, to access information and to freedom of expression. And it can in fact result in children becoming evasive and deceptive because they feel so strongly that they need to express themselves and to communicate with their friends. So how does this circle end up being squared? Can it be squared?

Steve Wood, ICO

I think we learned from our consultations on this with parents, children and businesses that it is a tricky circle to square. And certainly I think we had a range of views there: from parents who like these controls and want to use them, to online safety campaigners who think these protections are important, and some companies want to start to offer them as part of their services.

But equally, as you said, there’s a balance there in the autonomy of the child, and particularly how that changes over the course of a child’s life. The autonomy a child might have aged 9 is different to the autonomy a child might have when they’re 15, for example. So that’s why we really wanted to strike the right balance.

We want to essentially remain neutral on parental controls, saying that if you use them, then this is the way you should do it, recognizing they can have some benefits, but also saying they’re not a silver bullet in terms of solving this issue, which is why they’re just one of many standards, and really letting parents and children also have the conversations in their homes.

I think this is something we have said on a number of occasions as well, that the Code is a piece of guidance from a regulator which would improve the design of these services and improve the protections. Other work, like education, raising awareness and, you know, the discussions which need to take place in the home, plays an important role which sits alongside some of this as well.

So we are doing our bit in providing some guidance about parental controls, but equally there’s a wider sort of public policy effort, which we know the government will lead through their online harms work and digital literacy, to sit around this. I think it’s an area we’ll probably need to monitor to see how it develops in future, whether the Code needs to add more context, or whether we come to a different understanding about how this operates in practice.

Neil Fairbrother

And are parents sufficiently motivated to learn about these kinds of online risks, and about how appropriate or inappropriate some of the services that their children are using are?

Steve Wood, ICO

I think, again, there’s a spectrum out there. Some parents are motivated, some aren’t, for many different reasons. I’m a parent myself, so I know what it’s like, particularly when the children were younger, how much time you have to check settings or monitor things and get things set up. But I think there is still a role there for parents. Parents obviously have the right to play a role in shaping the environment and how their children use the service. So that is important.

But the thing is also recognizing, with the design approach we set out in the Code, that it is very challenging for parents to do all of this, particularly given that children in the course of, you know, an evening might download three different apps and delete two, then the next day install another three apps. It’s an ever changing picture, which is why really the important starting point is for those apps, when they’re downloaded, to be built with the right protections in by default.

But this doesn’t necessarily mean banning everything and saying all uses of data are switched off; it means that at least the default settings are switched off. And then the child can perhaps learn how to switch things on, still with the right protections in place, over time. And that’s the situation we want to get to really: strong protections of the child’s data when the thing is downloaded and it is up and running, and equally time and space for that important parent-child relationship and the other awareness work which needs to take place. So as ever, it is going to be a combination.
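For readers who design these services, the “high privacy by default, switched on deliberately over time” idea can be sketched as a small settings object. This is only an illustration of the design pattern; the setting names and the explanation check are invented, not requirements quoted from the Code:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChildPrivacyDefaults:
    """Hypothetical high-privacy defaults for an account likely to belong to a child.

    Anything that profiles, shares or tracks starts switched off and is only
    enabled deliberately, after an age-appropriate explanation has been shown.
    """
    personalised_ads: bool = False
    behavioural_profiling: bool = False
    geolocation_sharing: bool = False
    share_with_third_parties: bool = False
    changes: List[str] = field(default_factory=list)

    def enable(self, setting: str, explanation_shown: bool) -> None:
        # Only flip a default once the child has seen a clear,
        # age-appropriate explanation of what switching it on means.
        if not explanation_shown:
            raise ValueError("Show an age-appropriate explanation before changing a default.")
        setattr(self, setting, True)
        self.changes.append(setting)  # keep a record of what was switched on

settings = ChildPrivacyDefaults()
settings.enable("geolocation_sharing", explanation_shown=True)
print(settings.geolocation_sharing)  # True
```

The sketch mirrors what Steve describes: nothing is banned outright, but the data-hungry options start off, and turning one on is a deliberate, explained step rather than the default state.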

Neil Fairbrother

Okay, now the final standard I’d like to ask you a question on due to shortage of time is Standard 12, and we’ve referred to this earlier in the conversation, and that is Profiling. We’ve kind of explained what profiling is and the Age Appropriate Design Code suggests that “…profiling should be off by default, but there should be put in place measures that protect the child from any harmful effects, in particular being fed content that is detrimental to their health or wellbeing.” But the whole point about social media is that it is algorithmically driven. So should social media companies then restrict the use of algorithms when it comes to serving up content to children and would that help prevent another awful situation, such as the tragic case of Molly Russell?

Steve Wood, ICO

In terms of the situation you described, you know, the tragic case of Molly Russell, the role of algorithms in this context has, I think, been recognized in feeding content at a greater sort of volume and [velocity] at the child. Although the harm obviously ultimately comes from the content itself, the connection between the data and the content is getting stronger. So that’s why we focused on this aspect of profiling in the Code.

But as I said earlier, we know probably tackling the algorithm wouldn’t completely address this issue because you still might have this content existing regardless of whether it’s fed by an algorithm, which is why the UK government’s Online Harms regime is another important component. But really what we want the companies to do is to have a greater understanding of the impacts of their algorithm and to make sure they understand whether they have a child user in front of them, which can therefore improve the technologies they use.

And they are all looking at using artificial intelligence now to protect [children] from a number of harms online. So that can be matched to the process of how the child’s data is used in connection with the algorithm. It’s an emerging area, but it’s clear that there are further steps which can be expected from the companies in that regard.

And also we’re talking about, I think, very clearly evidenced harms. So we also referenced in the Code that where there is evidence from external third party experts, for example experts on child psychology or experts from the UK medical community who have published guidance about how certain types of content can be harmful for children, that places, we would say, a greater onus on the companies to understand how data is used to feed that content.

Neil Fairbrother

Okay. And that is an area that you would monitor or the regulator of the internet, Ofcom, would monitor?

Steve Wood, ICO

We’re going to need to work together on this. So we’re responsible for the data, and they will be responsible for assessing the policies and procedures that the companies have in place in relation to content. So there’s an intersection. We’ll always be focused on whether the child’s data has been used fairly; that will be our ultimate consideration. It’s likely that we may need to share information between us, as regulators, to work out how we need to tackle certain companies if we get evidence of a particular issue arising. So we’ll need to work that through. We don’t have the full details of the legislation yet for online harms, but I think what we can say is that the data piece is the here and now piece, because data protection law is already in force and the Age Appropriate Design Code has been through parliament.

So to a certain extent, we will be at a slightly earlier stage on this compared to Ofcom, because our regime has come in first. But what we have been clear to say is that we can’t overstep our remit and start to step into a purely content based regulatory decision, because that would be going outside our jurisdiction. So for a while, it will be more a case of the ICO being focused on this issue. But over time, as Ofcom gets up to speed, we’ll be working together in the outside world and you’ll probably be seeing joint messages from us. We’ve already done research together on these issues as well over the last few years.

Neil Fairbrother

Okay. Now you’ve mentioned the transition period, where are we with the Code? And what’s the next step? Is it going to become a law?

Steve Wood, ICO

So the Code itself isn’t law. The law we regulate at the ICO is data protection law, which is the GDPR. The Code is essentially a statutory piece of guidance that the ICO has produced. It has a higher status because it’s been laid in parliament, so it enables the ICO to formally take account of it in an investigation. Also tribunals and courts would need to take account of it because of its high status as a piece of guidance.

So we therefore use the Code to guide things like audits and investigations, really using it as a benchmark and saying: this is our expectation as a regulator as to what organizations should be doing to protect children’s personal data. But ultimately our powers of enforcement are derived from data protection law.

So where we’re up to in terms of the process is that the Code has been laid in parliament and it’s completed its passage. There were no objections to the Code in parliament. So it’s now starting a transitional period of a year before it must be taken account of. So at the moment, it is there as guidance, but it doesn’t come fully into effect until the 2nd of September 2021.

So this period we’re in, which now has less than a year to go, is very important for us to work with companies in the key sectors, answering more of their questions and seeing if there are further tools or support needed to help them implement the Code in practice. So that’s what that transitional period of the year is for. Once that period has passed, the ICO is then free to use the Code and take it into account, as are the courts and tribunals.

Neil Fairbrother

Okay. So what would your advice be to any companies that are operating in this area where the Age Appropriate Design Code might apply?

Steve Wood, ICO

The first thing to do, if a company thinks they might be caught by it, is to go to our website. We’ve got a special hub of information building up there: obviously the Code itself, a short guide to the Code for small businesses, FAQs, and we’re building up more sector specific guidance there as well. So companies should go to our website and start to work through some initial questions, like: are they covered?

That will enable them to understand the test which is in the Code, “likely to be accessed by a child”, and whether they fall within that; and then, if they decide they do, to start to work through the 15 standards and assess the different types of risks that I’ve talked about earlier, in working out what they will need to do to prepare for the Code’s introduction.

I’d also flag that if a company is part of a trade body, we’re particularly keen to work with trade bodies, because it’s a really good way for us to cascade the information down and to produce guidance which will work best in the language and in the terms for those companies in their sectors as well. So if you have got a trade body, contact your trade body; if your trade body hasn’t been in contact with us, you know, ask them to get in contact, because we can arrange round tables and workshops which can get groups of likeminded companies together to discuss some of the challenges. We’ve already started that process, working with a number of bodies in the UK such as TechUK, who are the trade body for the tech industry in the UK.

Neil Fairbrother

Okay. Steve, thank you so much. I think we’re going to have to leave it there as we really have run out of time. Really appreciate your insights into the Age Appropriate Design Code. It’s also a very important piece of work. Good luck with it and I’ll be interested in seeing its success or otherwise as it unfolds over the next few months.

Steve Wood, ICO

Thanks very much, very happy to speak to you.


The Age Appropriate Design Code can be downloaded from here.

 
