Safeguarding Podcast – How Old Are EU? with Tony Allen CEO Age Check Certification Scheme

By Neil Fairbrother

In this Safeguarding podcast with Tony Allen, CEO Age Check Certification Scheme, we discuss the five principles of the EUConsent project, age verification across the EU, the eIDAS Regulation, the dangers of age verification for children’s data, age verification for delivery and the gaping hole in the UK’s Online Safety Bill.

https://traffic.libsyn.com/secure/safetonetfoundation/SafeToNet_Foundation_podcst_-_How_Old_Are_EU_with_Tony_Allen_CEO_Age_Check_Certification_Scheme.mp3

There’s a lightly edited transcript below for those that can’t use podcasts, or for those that simply prefer to read.

Neil Fairbrother

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother exploring the law, culture and technology of safeguarding children online.

The EU is the world’s largest trading bloc and comprises some half a billion people. As has been seen with the GDPR and the ePrivacy Directive, what the EU does has global impact. One of the EU-wide projects that could affect far more than the 70 million or so children within the EU 27 member states is Age Verification, a complicated enough topic without all the different languages and cultures that make up the EU. Today’s guest will guide us through all this complexity and explain how it might work. Welcome to the podcast, Tony Allen.

Tony Allen, CEO Age Check Certification Scheme

Hello, my name is Tony Allen and I am the Chief Executive of the Age Check Certification Scheme.

Neil Fairbrother

Excellent. Thank you. So Tony, could you give us please a brief resumé so that our listeners around the world have an appreciation of your background and experience?

Tony Allen, CEO Age Check Certification Scheme

Yeah, so my background is principally as a Trading Standards Officer, which in the UK is the law enforcement agency that is responsible for trading laws and regulations. I spent 20 years in that role in a specialist area of under-age sales, that’s preventing children from accessing alcohol, tobacco, fireworks, knives, et cetera.

Since leaving public service, I set up a training company which provides training to people who sell those goods. And then a couple of years ago, in response to industry demand really, we started to explore creating a formal certification scheme for everything associated with testing age check systems and identity systems to make sure that they work. That’s really what we’ve developed and worked on, and we’re working on the development of the standards to go with that. And this then leads into this piece of work across the European Union.

Neil Fairbrother

Okay. And who are your members, Tony?

Tony Allen, CEO Age Check Certification Scheme

We don’t have members as such. We are an independent third-party certification scheme. So we have clients that come to us and say, we’ve got a piece of kit, can you test it? And so they will pay for an independent evaluation of their equipment and that could be age estimation equipment, artificial intelligence. It could be age verification online. It could be checking against databases. It could simply be we’ve trained our staff, can you come into our shop and see whether or not they ask for ID? So we have a panel of around 2,000 18-year-olds that we can send to shops and we send them to all sorts of places like Sainsburys, B&Q, Iceland [a UK frozen goods supermarket], to see whether or not they are served without being asked for ID. And then we have a series of online avatars that we can use to put your age verification system online to the test. And quite fundamentally what we do is we see whether it works.

Neil Fairbrother

That sounds very interesting. What do you mean by an online avatar and how does that work?

Tony Allen, CEO Age Check Certification Scheme

So one of the challenges with testing any kind of identity system, or anything that is trying to establish an attribute about you, is that you need to create what we call an avatar. It’s a pseudo identity that we can present to a system, to see whether or not a) it can tell the attributes of that identity and b) whether it tries to find that identity on an electoral register or on a credit record or whatever method it is using. And of course, it shouldn’t, because we’ve made it up. And so part of our testing is to see whether or not it falsely returns a verified identity when it’s one that the team here made up.
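
To make that false-positive check concrete, here is a minimal sketch in Python. The `Avatar` type and the `verify_identity` callable are hypothetical stand-ins for the scheme’s actual tooling and the system under test; they are illustrative only.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Avatar:
    """A fabricated (pseudo) identity presented to the system under test."""
    name: str
    date_of_birth: date
    address: str


def find_false_verifications(verify_identity, avatars):
    """Present fabricated avatars to the system under test and return any
    that it wrongly reports as verified. Because the avatars are made up,
    none of them should be found on an electoral register, credit record
    or any other data source, so every hit is a false positive."""
    return [avatar for avatar in avatars if verify_identity(avatar)]


# Illustrative use with a placeholder verifier that (correctly) finds nothing.
test_avatars = [
    Avatar("Alex Example", date(2006, 5, 14), "1 Imaginary Lane, Testtown"),
]
failures = find_false_verifications(lambda avatar: False, test_avatars)
assert failures == []  # any returned avatar would be a test failure
```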

Neil Fairbrother

Okay, thank you for that. Now we’re talking about a project that’s, I think, funded by the EU called the EUConsent project. What is it? What are its objectives?

Tony Allen, CEO Age Check Certification Scheme

So it is a project which is looking at children’s Identification and Trust services in Europe. So it’s all about enabling children to enjoy a safer digital world throughout the European Union. And it was part of a project commissioned by the European Parliament and through the Commission, to look at infrastructure dedicated to how you do that in an online environment.

So it looks at, for instance, the requirement in the General Data Protection Regulation (GDPR) that, when you’re processing data on the basis of consent and the person is under a certain age, parental consent is provided in addition to the child’s consent.

One of the problems with this is that the age [of digital consent] differs across Europe. But typically it’s between 13 and 16, and the aim of the project is to put into place an extension to something called eIDAS, the European Union’s framework for electronic identification and trust services, to basically create a node within that for children and young people.

Neil Fairbrother

Okay, well, we’ll explore some of the eIDAS Regulation shortly, but before we come on to that, on your website you say that the EUConsent project will be designed with the help of Europe’s children. Having user involvement in the design process is always a good idea. So how would this operate?

Tony Allen, CEO Age Check Certification Scheme

So the project is broken down into work streams, as all these sorts of projects are, and there are two very important work streams. One is being run by a series of academics, and those academics are researching, you know, what the needs are, what the risks are, what the research base is for how you engage with children and young people. And they’re using a series of focus groups and working with children to identify that.

But then the other work stream is being run by a specialist research agency that is working with children’s groups and children across at least five member states of the European Union to explore how they would interact with that. So that the work streams that are involved in the technical build bits of it are building something that children will feel comfortable using and will be able to use and will be engaged with.

One thing that’s very important in this, in the kind of principles of this project, and I’m sure we’ll come onto the principles, but one thing that’s really important is that often when you talk about age verification, age assurance, or things associated with that, it’s in the context of stopping children from, you know, accessing gambling or pornography or alcohol or whatever it may be. And so it’s seen in quite a negative light.

But this project is about enabling children to enjoy a safer internet. It’s not about blocking them from doing things, it’s about enabling them to have age-appropriate and age-supported materials, enabling them to give their consent to the use of their data in a way that they can trust and understand, and enabling them to interact with their parents where the consent of a parent, or of somebody exercising parental responsibility, is required.

Neil Fairbrother

You mentioned the principles of the EUConsent project, and I believe there are five such principles, and the first one is “Children have the right to participate in a digital world to the fullest extent possible”, which you’ve touched on somewhat. This does sound like it’s linked to the UN Convention on the Rights of the Child, and perhaps more relevantly to the recent General Comment 25. What does it mean in practice?

Tony Allen, CEO Age Check Certification Scheme

It is indeed linked to General Comment 25 of the UN CRC. It is about ensuring that children have as much access to the digital world as adults do and enjoy that. And we have to bear in mind that, unlike people like yourself, or a little bit older, when I was growing up as a child, you know, Google didn’t exist, Twitter didn’t exist, Snapchat didn’t exist. In fact, you still had Encyclopedia Britannica, which you had on the shelf in the study, if you had a study, and which was your method of finding out anything about the world.

Children today are growing up in an entirely different digital world and it’s normal from a very, very young age, and they have rights to be able to enjoy that. They have the right to access information. They have the right to be able to learn. They have the right to start to develop understanding of politics and the way in which decisions affect them, and all those things. So they also have the right to be protected from things that may harm them. And so the whole basis behind the program is always expressed in the positive, that this is about enabling children to enjoy those rights in that digital world.

Neil Fairbrother

Okay. The second principle is that “Providers of digital services and content directed at children should have a robust, trusted framework to deliver high quality age-appropriate materials”. What does a trusted framework mean?

Tony Allen, CEO Age Check Certification Scheme

So one of the challenges for any organization that wants to do the right thing, in terms of delivering the right content to the right audience, the right people, is that there isn’t actually, at the moment, a very clear and neat framework within which you can do that, where you are able to know who your audience are; even if you don’t know the individuals, you’re able to know the age range within that audience. There is no international standard for that. There is no process that is widely recognized to be able to do that. And so part of this project is about developing that trusted framework. So let’s say I run a chat room for how you build and play with train sets; I can do that in a way which means that I can plug into a trusted network, which means that in that space those children are discussing train sets in a safe space for them to do that.

Neil Fairbrother

Okay. You said that in this particular principle the service is directed at children. Now many services are not directed at children, but are nonetheless used by children, partly because there’s no age verification process. But if these services are not aimed at children, but children are using them nonetheless, does this program actually apply to those services?

Tony Allen, CEO Age Check Certification Scheme

I would say it definitely assists. If you’re doing age-appropriate design, we have to draw a line around the scope of what we’re looking at. So we are building this on the basis of services that are intended to be used by children, but if you’ve got a service that may be used by children, you can still use that trusted framework; it would still operate for you. But for the purpose of scoping of the particular project within the Commission, it is looking at services that are directed at children or potentially directed at children.

Neil Fairbrother

The third principle is that “People with parental responsibility or guardianship of children should have confidence in the standards and framework to enable permissive content for their children”. Permissive is an interesting word there. But in the context of this, what does it mean?

Tony Allen, CEO Age Check Certification Scheme

So this is one of the real challenges with growing up isn’t it? It’s that you start off with a toddler where everything that they do digitally is set up by somebody with parental responsibility for them. And they have largely controlled access or even limited apps on a children’s tablet or things like that. And then as children get older and older and older, this is something which is very clearly recognized in the General Comment 25, and in the Age-Appropriate Design Code of Practice, they go through development and they go through stages where they start to have more freedom and more ability to do their own thing.

Now, in the context of European law, if you’re processing data on the basis of consent, then you need the consent of a parent or someone exercising parental responsibility for younger children. Typically, it’s up to the age of 13 and in some member states, it’s up to the age of 16, but it’s usually between 13 and 16. But then beyond that, you’ve then got this situation where young children start to develop more of their own freedom to make their own decisions.

And you take that to other bits of European law, such as the directive on protection from child pornography and the Audiovisual Media Services Directive, and there are bits in there which are about the age of 18 and preventing access to things for those under 18. And it’s the same in things like the Tobacco Products Directive, the Pyrotechnics legislation and various other bits of European law. So this is about enabling parents, particularly for those younger age group children, to have confidence that if they set their permissions at a certain level, the framework will permeate through the network in relation to what their children are then able to access and see.

That’s one of the things that, in the research, is missing. Parents think, I’ve done my security settings, I’ve set my system so that they can only see, you know, a certain type of YouTube or Google or whatever, but then I’m finding that they can still see stuff that I wouldn’t want them to see. They can still see content that is for either older children or for adults. And that lack of trust and confidence in that framework is at the heart of that process.

Neil Fairbrother

Okay. Now you mentioned consent here, but as far as I understand it, some social media companies claim a “legitimate interest” to work around this consent. So won’t this simply fall down because of that?

Tony Allen, CEO Age Check Certification Scheme

So you’re absolutely right. I mean, within GDPR, consent is only one of the lawful bases for processing the data, and the restrictions on children and parental responsibility only apply, this is Article 8 of GDPR, if your processing is based on consent. That having been said, the legislative journey that we’re on at the moment, particularly with the European Commission’s Digital Services Act, and in the UK we have the Online Safety Bill, and there are various processes in the US, we think that that journey is going to extend the range of things that would require those kinds of child protection mechanisms to be in place, beyond just the consent basis.

So part of this project, and the reason why the parliament commissioned this project, was not only to look at the existing laws, but to look at where the gaps in legislation may be. And one of the deliverables of the project is to explore where the Commission and the parliament may look at things like the European Digital Services Act and things like that, as to how they might be more aligned with the child protection needs of a digital world.

Neil Fairbrother

Okay. The fourth principle of the five core principles that you have is that “Adult services and content should not be available to children to access either intentionally or by accident and illegal content should not be tolerated”. Now, you mentioned the UK’s Online Safety Bill just then and many people were expecting the somewhat delayed introduction of age verification to be included in it. But it wasn’t. Is this a missed opportunity do you think?

Tony Allen, CEO Age Check Certification Scheme

Just on the principle, first of all, I mean, that’s a fairly straightforward principle that providers of content and services for adults should be able to do that in a way which means that they can be compliant. I mean, regardless of what you might think of what they do, it’s legitimate in the sense of what they do, but it’s not for children. So there is the ability for them to do that as well.

In terms of the Online Safety Bill, yes, we were a little bit surprised, to say the very least, that the only mention of Part 3 of the Digital Economy Act, which was bringing in controls on access to online pornography, in the Online Safety Bill is to repeal it. There’s no mention whatsoever in the whole Bill about the contents of that Act.

That Act, and Part 3, did run into what I would describe as some practical and administrative difficulties when it came to implementation. But the principle of it, the concept of it, was I think broadly accepted and broadly understood and broadly welcomed, and certainly had a lot of political support. And so, although there were some genuine difficulties with the way in which the government initially sought to bring it into force, they were solvable with a bit more thought, basically. And our expectation was that the government was buying itself some time with the Online Safety Bill to bring about implementation in a bit more of a structured and a better way. So we were therefore quite surprised that the Online Safety Bill made no mention of it.

We were also quite surprised as well, that the scope of the Online Safety Bill about this whole concept of user-generated content effectively meant that the vast majority of pornography that is online was outside of that scope. And so although the Bill purports to bring the sort of harms and protections in, that would apply to a very, very small percentage of online adult content providers on the current policy. Obviously, it’s got its journey to make, it’s going through Parliament, and I’m quite sure that parliamentarians are quite aware of that point and are not going to miss the opportunity to amend it.

Neil Fairbrother

Yeah, just to be clear, it is the first draft of the Online Safety Bill that was published, not the actual Bill itself. As you rightly say, it has to go through parliamentary process. But the interesting thing about all of this is that both the Online Safety Bill and the age verification legislation come out of the same department, the DCMS, the Department for Culture, Media and Sport. Is this an indication perhaps of lack of joined up thinking, do you think, in the same department?

Tony Allen, CEO Age Check Certification Scheme

I wouldn’t go that far. What I would say is that, although it comes out of the same department in terms of the department name, the entire team has changed. None of the people that were involved in the Digital Economy Act are still there, none of the ministers, none of the civil servants, none of the organization; it has entirely changed. So you have a new, fresh team looking at this with a new, fresh set of eyes. And I think it will change as part of the parliamentary process. I think even the Secretary of State, Oliver Dowden, has conceded in the initial evidence to the DCMS Select Committee that he’s open to ideas on how this can change and how they can strengthen the issue in relation to access to online adult content. In particular, this potential loophole, where it’s not so much a loophole as a gaping hole, for online adult content providers that aren’t doing user-generated content.

Neil Fairbrother

Okay. The final and fifth principle that you have for the EUConsent program is that “A regulatory ecosystem should encourage market solutions through a robust framework of accreditation, certification and interoperability across the European Union”. This seems to be quite a fundamental driver behind the EUConsent project. Will there be a pan-EU regulator of some sort, or indeed will that be your own organization, or could it be the European Commission itself?

Tony Allen, CEO Age Check Certification Scheme

So generally speaking with technology, centralized government-based systems don’t work very well. There are exceptions to that, but generally speaking market-led, industry-led solutions are likely to work better in practice, and they’re also more likely to be adopted because they are driven by, you know, ultimately they’re driven by profit motives. So there has to be a marketplace there, there has to be a need, there have to be recognized requirements.

What you need from a government is to set the requirements to say, you know, you’ve got to do this. You have to do X, Y, and Z. We have to protect children, et cetera, et cetera. And you need government to support the ecosystem that will all flow around that.

I always describe this as: you are sat on a chair at the moment and you have bought that chair and you have quite happily sat down on that chair, trusting that it’s going to hold you up. Why have you done that? What has been the basis of why you are sat on that chair? It’s because there is a system of standards, there is a system of certification, there is a system of testing, and you place your trust, when you buy that chair, that you can sit on it and it will hold your weight. That comes from the fact that you trust that there is certification. There are standards, there are testing regimes. There are test houses out there that are testing that chair.

And the same principle applies to age verification, age assurance, if you’re placing your reliance on that, that you would trust that there is somebody out there that’s checked that it works, checked that it is compliant, that there’s a standard as to what it applies to.

Now, the problem we have at the moment is there isn’t an international standard for that. We’re a certification scheme. We are relatively new, so we are testing that these things work, and you need to get the standards right. We’re not a regulator. Our role is simply to test somebody’s claim that they can do something and then issue a certificate to say whether or not they can do that. It’s for the regulators, probably plural, to determine what they might require as a gateway.

So it’s not about having a single centralized database or system or something like that. It’s about having an ecosystem within which that trust can operate. And generally speaking, centralized databases don’t work and they’re also not trusted. Consumers don’t have that much trust in centralized data. So distributed mechanisms and trust frameworks are generally speaking, seen as being a better way to go with how you do this sort of thing.

Neil Fairbrother

Okay. Now, if I was a child in America and I wanted to access a system that happened to be in the EU, how would that work?

Tony Allen, CEO Age Check Certification Scheme

That’s a very good question. I would say that there are already challenges for non-EU countries, third countries, in relation to the GDPR. When GDPR came out initially, you would have seen the vast majority of American websites suddenly becoming inaccessible to EU citizens and, vice versa, the vast majority of EU websites being inaccessible to American citizens. As I think they’ve understood GDPR more, and they’ve got better at the processes that sit behind it, you start to see that open up again. So EU citizens can now access most of the American news sites and things like that.

There is this ongoing challenge though, between the regulatory system in the US and the regulatory system in Europe, in particular. And of course for Asia as well, not to forget about that. But the reforms are happening in the US as well. So there is a lot of debate about what’s called the Section 230 publisher liability in the US and reform of that. And that really links into online harms and materials and things like that.

The other thing that’s happening there is the issue in relation to where the home state is, where the material is distributed from. Now in the old days, you would have a computer in your office with a server on it, and then you would issue a website from that server. Now, of course, it’s Cloud all over the world, distributed, you don’t have a single publisher location. And so all of that is feeding into the considerations of things like the Digital Services Act and other restrictions and requirements around how you serve online content.

Neil Fairbrother

Okay. The eIDAS Regulation that you referred to earlier doesn’t actually refer to age verification, but what it does talk about a lot is digital ID and digital signatures. So is this regulation being co-opted for age verification, or is it the most appropriate regulation to be used?

Tony Allen, CEO Age Check Certification Scheme

The best way to describe this is what we’re looking at developing is a node within the eIDAS that enables age verification to happen…

Neil Fairbrother

Okay. What do you mean by a node?

Tony Allen, CEO Age Check Certification Scheme

It’s like an option. It’s something you could tap into. So, one thing that’s important about age assurance, and I use the term assurance versus verification, is it’s not necessarily the case that you need to identify the person. So for eIDAS, you might use the identity aspect if you want to do some sort of hard identifier of the person. But for age assurance you might not need that, and that might not be appropriate. So you might use something like artificial intelligence, you might do something like face analysis or hand geometry or whatever.

Neil Fairbrother

Yeah. Within the eIDAS Regulation there are indeed three levels of ID Assurance, as it’s known. There’s Assurance Level Low, Assurance Level Substantial and Assurance Level High. Now, clearly children are different from adults in many different ways. One of those is that there’s less official documentation; you know, generally speaking, they don’t have driving licenses, or some of them don’t have passports. Some may not have any form of formal ID at all. So how do these three levels of ID assurance relate to children?

Tony Allen, CEO Age Check Certification Scheme

So they probably don’t directly relate. Part of this project actually is to explore the interaction between eIDAS and what we’re potentially looking at here. But the principle that you have a kind of zero, basic, standard, enhanced or strict level of age assurance is probably core to how you would do that in an international standard. And that piece of work has started. But it’s not necessarily a direct correlation to what you see in eIDAS. That might be a formative part of an aspect of it, but I wouldn’t say it gives you a direct correlation.

It’s more than likely, for some of the reasons that you highlight in relation to children (they don’t have an economic footprint, they don’t necessarily have documentary evidence, you know, driving licenses and things like that), that there isn’t a direct correlation there. But that is absolutely one of the parts of the project we’ve got to explore with the eIDAS Regulation and with the standards around eIDAS: whether they are actually fit for the purpose of age assurance and age verification.

Neil Fairbrother

Okay. And does it help that many EU citizens including children already have an ID card?

Tony Allen, CEO Age Check Certification Scheme

It may do, and of course your identity is a construct of what you claim to be called, or what you claim to be, and what the State recognizes you as. So if you have an identity system, a national identity system, and some European member states do, then that is effectively the written agreement or the construct of what you claim to be and what the State claims you to be. But then one of the things about that identity is that the date and place of birth are probably the two attributes that never change. Everything else about your identity can change: your name, your address, your marital status, your sex or gender. All of that is changeable, whereas your date of birth and place of birth are constants.

So one of the issues that you have with countries that don’t have a national identity system, which is quite a large part of Europe, you’ve then got the issue of, well, how do you record that? And that’s often recorded through things like banking, driving records, border immigration or passport records. And of course for those, children don’t have bank accounts, they may never have traveled. They may never have driven or they won’t have driven. And so you’ve got a much lower set of records. So you start to get into things like education records, learning records, what level of assurance might they be able to provide and do you access them? And then you open up all kinds of questions about safeguarding and access to children’s data and information and all sorts of complexities in the overall picture.

Neil Fairbrother

Yeah. One of the common criticisms of age verification is that it is very intrusive into a child’s digital footprint and does represent a security risk. How is that countered?

Tony Allen, CEO Age Check Certification Scheme

It can be. Certainly, if you are looking at quite strict levels of assurance of age verification, then you can be looking at gathering quite a lot of data. And one of the problems that we quite commonly see is that there’s a tendency to gather too much data and then either not use it or delete it, or it ends up just being stored. And then you create these sorts of data sets of children’s data, which could get misused or abused or hacked.

But this is where the standards come in, and really Part 3 of the Digital Economy Act, even though it never came into force, was the catalyst. We’re seeing quite a lot of systems and technology emerging out there which are very privacy-protecting and only share your age attributes.

So you see systems that create your own identity on your own mobile phone or on your own system, and then enable you to share that you’re over 18, but not share who you are, where you are, all those sorts of things. And you see systems that don’t even try to identify you. They just simply look at facial features or hand geometry or cognitive ability, so sentence structure and things like that, to estimate how old you are. If that is sufficient for the use case that you are using it for, then that’s perfectly fine.

And what we do is we test that those claims work. So if you have a company that comes in and says, you know, we can tell how old you are from how you vibrate your head, which is one that I saw on the news yesterday about artificial intelligence on head vibration, we would look and see whether or not that’s a genuine, valid claim and test that it works.
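
As a rough illustration of the “share the age attribute, not the identity” approach described above, here is a minimal sketch that issues and checks a signed over-18 assertion. The provider key, field names and HMAC scheme are assumptions for the sake of the example; real age assurance products use standardised token formats and proper key management.

```python
import hashlib
import hmac
import json
import time

# Hypothetical shared secret held by the age assurance provider; a real
# deployment would use asymmetric signatures and key rotation instead.
PROVIDER_KEY = b"demo-key-not-for-production"


def issue_age_attribute(over_18: bool, valid_seconds: int) -> dict:
    """Issue a signed assertion carrying only the age attribute. No name,
    address or date of birth is included, so the relying service learns
    'over 18: yes/no' and nothing else."""
    claim = {"over_18": over_18, "expires_at": int(time.time()) + valid_seconds}
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return claim


def verify_age_attribute(claim: dict) -> bool:
    """Accept only an unexpired, correctly signed age attribute."""
    signature = claim.get("signature", "")
    unsigned = {k: v for k, v in claim.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected) and claim["expires_at"] > time.time()


token = issue_age_attribute(over_18=True, valid_seconds=3600)
assert verify_age_attribute(token)
```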

Neil Fairbrother

One of the other concerns about age verification or age assurance systems is that once you’re in, you’re in. In other words, if an adult can fool the system into getting verified as a child, you then have the “Fox in the hen coop” problem; that you have someone who’s officially recognized as a child who isn’t a child, and therefore won’t be picked up when they are perhaps performing things they shouldn’t be performing.

Tony Allen, CEO Age Check Certification Scheme

Yeah. So one of the principles that I think will emerge as part of the standards is the concept that your age verification token will diminish with time. Obviously one of the principles of having age verification is that once you’ve verified somebody’s age, they don’t get any younger, they only get older. But if you have a system like the one you described, where that becomes a lifelong token, then you will have a situation where someone has spoofed the system, got in, and they’re always going to be seen to be older than they are, or younger.

Now, having a principle that age verification diminishes with time, that could be instant. So if you’ve got a very strict requirement for age verification, that may well be set to require that you are verified every single time you enter the site or the service. A lower level might last for a year or something like that, or…
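
One way the “diminishing token” idea could be modelled, as a sketch: the validity window shrinks as the assurance level gets stricter, down to zero for the strictest case, which forces re-verification on every access. The level names follow the zero, basic, standard, enhanced, strict scale Tony refers to below; the durations themselves are purely illustrative, not anything from the draft standard.

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative validity windows only: stricter assurance levels expire sooner.
# "Zero" (self-asserted) is omitted because it issues no verified token at all.
VALIDITY_BY_LEVEL = {
    "basic": timedelta(days=365),
    "standard": timedelta(days=90),
    "enhanced": timedelta(days=7),
    "strict": timedelta(0),  # re-verify on every access
}


def needs_reverification(verified_at: datetime, level: str,
                         now: Optional[datetime] = None) -> bool:
    """True once the age-verification token has 'diminished' past the
    validity window for the given assurance level."""
    now = now or datetime.utcnow()
    return now - verified_at >= VALIDITY_BY_LEVEL[level]


# A token issued 30 days ago still holds at "standard" but not at "enhanced".
issued = datetime.utcnow() - timedelta(days=30)
print(needs_reverification(issued, "standard"))   # False
print(needs_reverification(issued, "enhanced"))   # True
```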

Neil Fairbrother

Okay. But that might introduce what the service providers refer to as resistance in the user experience. And therefore, they’re not going to be that willing to implement such systems if they think it will put their customers off using their service.

Tony Allen, CEO Age Check Certification Scheme

Yeah. So it depends on the risks associated, and it depends on the specific use case. And so the purpose of the standards that we are writing is to describe the different levels of assurance that might be applicable and what they mean. So you might have zero, basic, standard, enhanced, strict levels of assurance. And so the standard would describe what all those mean. It’s then up to the policymakers to decide which one should apply to which particular use case.

So you might have, as I mentioned, you know, the train sets; that’s a relatively low risk environment. It may be something that can be done at a relatively low level, maybe a bit more than just self-asserted, which would be the zero level, but with a bit of basic age verification. At the other end of the scale, you might have, let’s say, models in a photographic agency where they’re doing nude or erotic photography; there you might have a very strict level. And so the expectation there would be that every single time they would be age verified to make sure they’re over 18.
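
A sketch of how a policymaker’s mapping from use case to required assurance level might be expressed, using the zero/basic/standard/enhanced/strict scale and the two examples Tony gives. The table itself is hypothetical, not a statement of any actual policy or regulation.

```python
from enum import IntEnum


class AssuranceLevel(IntEnum):
    """Levels of age assurance on the scale described above; higher
    values mean stricter checks."""
    ZERO = 0      # self-asserted only
    BASIC = 1
    STANDARD = 2
    ENHANCED = 3
    STRICT = 4    # verified on every access


# Purely illustrative policy table: in practice it is for regulators and
# policymakers to decide which level applies to which use case.
REQUIRED_LEVEL = {
    "train_set_chat_room": AssuranceLevel.BASIC,
    "adult_photography_agency": AssuranceLevel.STRICT,
}


def meets_requirement(use_case: str, achieved: AssuranceLevel) -> bool:
    """True if the assurance level actually achieved for a user is at
    least the level the policy table demands for that use case."""
    return achieved >= REQUIRED_LEVEL[use_case]


print(meets_requirement("train_set_chat_room", AssuranceLevel.BASIC))       # True
print(meets_requirement("adult_photography_agency", AssuranceLevel.BASIC))  # False
```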

Neil Fairbrother

Okay. So here we’re talking about age verification of everyone, not just children, we’re looking at age verification for over eighteens, as well as under eighteens.

Tony Allen, CEO Age Check Certification Scheme

The game of age verification is about everything. It’s not necessarily about under-18s or over-18s. It is about how you interact with systems and processes. So it’s not just about children. But the project, going back to the EUConsent project, is focused particularly on how you enable children’s access in a digital world; age verification or age assurance in a broader sense is about everything.

Neil Fairbrother

Okay. Time is short so just a couple of other short questions, if I may. One of the participating partners in the EUConsent project is the LSE, and the LSE’s Sonia Livingstone ran the EU Kids Online project, which defined a table of online risks, commercial risk being one of them. And I’m sure with your background you’ll relate to this: buying age-inappropriate products is covered by this, but so is delivery of age-inappropriate products. When we buy things online, that I guess is relatively easy to consider under the auspices of the EUConsent project, but what about delivery of such a product? How do you prevent, for example, an older sibling taking delivery of an age-inappropriate product for a younger sibling?

Tony Allen, CEO Age Check Certification Scheme

This is a bit outside the scope of the EUConsent project, but it’s definitely a challenge for people providing home delivery. We test a lot of this and levels of compliance aren’t great in relation to home delivery. There are a number of opportunities to carry out age verification during the process of going through a checkout or going through the delivery or the fulfillment stage of the order. Different companies do it in different ways. And I think therein lies some of the problem that you have. There is, again, a lack of standards and recognized methodologies for doing these sorts of things, so companies are trying to find ways of doing it and in some cases floundering. Again, it’s got to be related to risk. So how risky is the product?

And one thing that you will see coming out in the next few months, and we believe the Autumn time of this year in the UK, is the Offensive Weapons Act, which is going to place significant restrictions on home delivery of knives and bladed articles including a double age verification where you have to not only verify at the point of sale, but you also have to verify at the point of delivery, on knives and offensive weapons.

So there’s an example of what the government consider to be a very high risk product being home delivered and the stricter controls coming in on that. I think some local licensing authorities are also imposing stricter controls on home delivery of alcohol. But in other sectors we certainly see things like e-cigarettes, where the restrictions maybe aren’t as strict and maybe the levels of compliance aren’t good.

Neil Fairbrother

Okay. Thank you. And what is the next stage then for the EUConsent project? When will the results of this trial be known?

Tony Allen, CEO Age Check Certification Scheme

So we’re a couple of months in. It’s an 18-month project; we started at the beginning of April. The next stage, I think what you’ll start to see in the next couple of months, will be a lot of the academic research papers coming out from people like Sonia at the LSE and from Leiden University and from Aston University as well. They’ve been doing this research piece on where the gaps are, what the issues are. You’ll also start to see the standards start to emerge as well. So there’s an international standard being developed on age assurance systems, led by BSI, and I’m the lead author for it. And that is currently at the contributions stage at an international level. That will start to emerge. You’ll start to see things like the glossary and definitions and those sorts of things come out very soon.

The first real pilot trials will be starting to take place in the Autumn of this year, where we’ll be looking at the technical infrastructure, what the IT lot are doing in relation to interoperability and stuff like that. And then the project is due to complete basically towards the Autumn time of next year. And you will start to see all of the materials out and the information being available. There is a website, EUconsent.eu, and there’s lots of information on there.

Neil Fairbrother

Brilliant. Thank you. I recommend people go and have a look at that. So thank you, Tony, for that fascinating insight into the EUConsent project and maybe we can revisit this in 18 months time when the results are known.

Tony Allen, CEO Age Check Certification Scheme

So, yes, it’s good to speak to you, and hopefully it’s been helpful information for you.