Safeguarding Podcast – The Five Pillars with Sean Litton, CEO, Technology Coalition

In this Safeguarding Podcast: Sean Litton, CEO of the Technology Coalition, discusses their Five Pillars to eliminate CSAM, Universal Video Hashing, universal Terms of Service, the UN CRC and General Comment 25, Age Verification, their work with EVAC and WeProtect, and the UK’s draft Online Safety Bill.

There’s a transcript below, lightly edited for legibility, for those who can’t use podcasts or who simply prefer to read.

Welcome to another edition of the SafeToNet Foundation’s Safeguarding podcast with Neil Fairbrother, exploring the Law, Culture and Technology of safeguarding children online.

Neil Fairbrother

Formed in 2006, the Technology Coalition is, they say, an industry alliance of leading technology firms that have come together to build tools and advance programs that protect children from online sexual exploitation and abuse, and to realize their vision of an industry which is fully mobilized to keep all children safe from online sexual exploitation. And to do that, they have a Five Pillar Plan. To guide us through that Five Pillar Plan, I’m joined by Sean Litton, Executive Director of the Technology Coalition. Welcome to the podcast, Sean.

Sean Litton, Executive Director, Technology Coalition

Hi Neil, it’s great to be with you.

Neil Fairbrother

Thank you, Sean. Could you give us a brief resume please, so that our listeners from around the world have an appreciation of your background and experience?

Sean Litton, Executive Director, Technology Coalition

Sure. Thank you, Neil. It’s wonderful to be with you today. I began my career as a lawyer in Washington DC, working in what in the United States we call commercial litigation, which means representing large companies in court. And then after about four or five years of that, I transitioned to work for a global human rights organization called the International Justice Mission. I worked for them in Southeast Asia for about six years, working at that time on cases of child trafficking, human trafficking and child sexual abuse. I then worked in that organization’s headquarters in Washington DC for several years, ending there as Global President. Last year I took a break from that and came to work for the Technology Coalition, working with these firms to try and protect children online.

Neil Fairbrother

Okay. And what is the Technology Coalition? What purpose does it serve?

Sean Litton, Executive Director, Technology Coalition

So the Technology Coalition is really the place where industry comes together to solve this problem and work on this issue. It’s a very difficult problem. No single company can solve it. No single actor can solve it. And so this is the place they come, really, to collaborate, to build tools, build new practices and policies, share information, and try and get synergy out of their combined resources to eliminate this type of abuse from their platforms and generally make the internet safer for children.

Neil Fairbrother

Okay. And your members are typically whom?

Sean Litton, Executive Director, Technology Coalition

So our members are technology companies who have online platforms that can be impacted by this issue. And it’s a number of different sectors within that industry. So we have, you know, the video game industry, live streaming, private file storage, a lot of social media companies, companies that work only in text-based content, web hosting services, education platforms, and even infrastructure and cloud services. So these are all the different sectors of the internet, of the digital world, that can be impacted by the issue of child sexual abuse and exploitation.

Neil Fairbrother

Sean, it seems to me that there’s a whole pile of things going on here, which hopefully we’ll drill into during the podcast, but you’ve got companies that are naturally competing against each other, yet they’ve come together to collaborate, as you say, to fight child sexual abuse material. So on the one hand, you’ve got all the commercial sensitivity of competition. You’ve got sensitivity around illegal content involving children. You’ve got privacy issues, you’ve got competitive advantage issues. How do you enable discussions between these competing organizations on and across these really sensitive areas?

Sean Litton, Executive Director, Technology Coalition

Thanks Neil. Yeah, it’s a very challenging circumstance for these companies. And what I really feel is needed is a place, and I would just call it this, “a safe place”, where they can come together as an industry, where they can be completely honest with one another about: this is where we’re struggling, this is the approach we’re taking, this is what’s working well, this is what’s not working well, without any fear of punishment or negative repercussions. And as they share that information with one another, they can learn from each other and then more quickly improve their capacity to address the issue.

At the same time, new companies who are well behind the ball and are desperate to ramp up because they don’t want their platform to be used to harm children, can come in and immediately get access to all that these companies have learned over the past 15 years and all the resources that they’ve developed.

So, you know, the thought and the heart and the spirit behind the Technology Coalition is this is the place where industry can come together and work to improve everyone’s game. And it’s where new companies can come in and get the support and the counsel and the wisdom and the resources that they need so that they can very quickly put in place the strategies, the policies, the technology, and the practices that will keep children safe on their platforms.

Neil Fairbrother

Okay. So for an aspiring developer who has the next bright idea for a social media network of some sort, they can come to the Technology Coalition and gain insight and experience which will help them accelerate the development of their new service, such that it doesn’t have the same child safety issues that historical services may have had.

Sean Litton, Executive Director, Technology Coalition

Absolutely for sure.

Neil Fairbrother

Okay. Thank you for that. Now you are also working with a couple of other NGOs: EVAC, or the End Violence Against Children organization, and also the WeProtect Global Alliance. What are you doing with those guys?

Sean Litton, Executive Director, Technology Coalition

So the partnership with End Violence is focused on funding research that will guide industry’s action on these issues. We provide the resources, and End Violence supervises the awarding of the grants and the supervision of the grantees. And the idea is, you know, we’ve funded five different researchers right now. They’re working on different issues in different geographies around the world, but it’s all about how we can guide industry’s intervention. So for example, there’s a researcher in Latin America studying the language patterns and behaviors of predators. And that’s important because it’s in the Spanish language, you know, studying that to figure out how we can build tools that will identify that behaviour and protect children. So it ranges across a number of different issues, and the whole idea is that the research will come back and feed into the development of new technology.

With WeProtect, they’re our primary partner for engaging with external folks outside of the technology industry. WeProtect is a broader global alliance that has governments in it, civil society, survivors and industry, and not just the tech industry; also, for example, the financial industry. So WeProtect would be the forum, and our partner, in reaching out to that broader set of stakeholders to try and generate common roadmaps for moving forward and collective goals.

Neil Fairbrother

Okay. Thank you for that. Now, as I said in the intro, you have an action plan consisting of five pillars, the first of which is Technical Innovation, and you sort of alluded to this a little bit in your answer just now. You say that you will invest in accelerating the development and uptake of groundbreaking technologies to support a cross-industry approach to thwarting child sexual exploitation and online abuse. Could you give us some examples of where some technologies have been delivered that help protect children?

Sean Litton, Executive Director, Technology Coalition

So this is our number one priority. Right now we’re focused on two different types of technology. The first one is what we call “Universal Video Hashing”. A hash is a digital fingerprint for an image that can be associated with known sexual abuse material, and a database of those hashes can be used to identify that material on any individual platform. For still images there is a standardized way of hashing: people use different technologies for the hashing and identification of those images, but the fingerprints are standardized, so they can all use the same database.
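
To make that fingerprint-and-database idea concrete, here is a minimal illustrative sketch. Real systems use perceptual hashes, such as PhotoDNA for still images, which survive resizing and re-encoding; the cryptographic hash and the sample database below are stand-ins chosen only to keep the example self-contained and runnable.

```python
# Sketch of hash-based matching against a shared database of known hashes.
# A real deployment would use a perceptual hash, not SHA-256.
import hashlib

# Stand-in for the shared, centrally maintained hash database.
known_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_material(image_bytes: bytes) -> bool:
    """True if the upload matches a hash already in the shared database."""
    return fingerprint(image_bytes) in known_hashes

if __name__ == "__main__":
    upload = b"example image bytes"
    if is_known_material(upload):
        print("Match: block the upload and file a report.")
    else:
        print("No match against the known-hash database.")
```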

In the case of video, it’s not standardized. Companies use proprietary technologies to identify video and they have proprietary hash sets, and those hashes don’t translate from one technology to another. And so we’re working with our members to build a translator so that they can all access the same hash set.

So whether it’s member A, member B or member C that identifies a video of child sexual abuse and hashes it, their different technologies can read that hash set, and we can speed up the identification and takedown of video CSAM. And that’s really important because increasingly video is being used for child sexual abuse material. So there’s an increasing amount of video, and we need to increase our ability to identify it and take it down quickly.

The second piece of technology we’re working on right now is building the capacity of our members to share information very quickly, almost in real time. And this goes beyond sharing hashes, these fingerprints of known images, to additional information that lets us more quickly identify harmful content, more quickly identify harmful actors, and more quickly intervene to protect children.

Those are the two things we’re working on now; we have a list of other things that we hope to be working on in the future. We’re partnering with Thorn and NCMEC, the National Center for Missing & Exploited Children, on the video hashing project, and we’re partnering with just our member companies on the signals and intelligence sharing platform.

Neil Fairbrother

Okay, thanks for that. The video hashing project sounds extremely interesting. One of your members, Apple, caused some controversy quite recently with their announcement that they were going to implement a hash-based CSAM technology for images, but as far as I understand it, NeuralHash, which is their technology, doesn’t cover video. Some of your other members might be using similar technologies that do cover video. But what you’re saying is that the disparate solutions don’t provide a common thread, so to speak, and so there may be different outcomes. And what you’re trying to do is introduce consistent outcomes?

Sean Litton, Executive Director, Technology Coalition

Right. I guess the easy analogy is that they speak different languages. They do the same thing, they identify harmful video content, but when they give it a name or a digital identifier, they’re not speaking the same language, and so there’s no interoperability between the different proprietary video hashing technologies.

And so we’re building a translator, basically. We’re working with NCMEC, who have a massive hash set of harmful video, and we’re working with NCMEC and Thorn to create a translator that will allow the different proprietary video hashing technologies to access that hash set, use it, and also contribute to it. And so the members will benefit from what each member identifies.
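
One way to picture that translator, as a hedged sketch: the shared hash set could keep, for each known video, its fingerprint under every participating proprietary scheme, so that a member querying in its own “language” still resolves to the same record. The scheme names, hash values and API below are hypothetical, not the actual NCMEC/Thorn design.

```python
# Sketch of a cross-scheme hash "translator": one canonical record per
# known video, indexed by each member's proprietary hash format.
from dataclasses import dataclass, field

@dataclass
class SharedRecord:
    record_id: str                               # canonical ID for one known video
    hashes: dict = field(default_factory=dict)   # scheme name -> hash value

class Translator:
    def __init__(self):
        self._index = {}      # (scheme, hash) -> record_id
        self._records = {}    # record_id -> SharedRecord

    def contribute(self, scheme: str, video_hash: str, record_id: str):
        """A member adds its own-format hash for a known video."""
        record = self._records.setdefault(record_id, SharedRecord(record_id))
        record.hashes[scheme] = video_hash
        self._index[(scheme, video_hash)] = record_id

    def lookup(self, scheme: str, video_hash: str):
        """A member queries in its own format; any scheme finds the record."""
        record_id = self._index.get((scheme, video_hash))
        return self._records.get(record_id) if record_id else None

translator = Translator()
# Members A and B hash the same known video with different technologies.
translator.contribute("scheme_a", "a1f3-hypothetical", "video-0001")
translator.contribute("scheme_b", "9c0d-hypothetical", "video-0001")
# Member B's lookup in its own format resolves to the shared record.
assert translator.lookup("scheme_b", "9c0d-hypothetical").record_id == "video-0001"
```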

Neil Fairbrother

With these hash-based technologies, whether it’s looking for images or video, it is a retrospective analysis of content that’s already been taken, and no matter how quickly it’s identified and hopefully taken down, there is still a victim. Is there any prospect, do you think, of having a real-time capability in a device that would stop it at source?

Sean Litton, Executive Director, Technology Coalition

Yeah. There are things that are currently going on, and there’s a lot of work focused on this right now, both on the prevention side, to prevent any image from ever being uploaded, and then secondly, to identify, as you say, in real time images that haven’t previously been identified and hashed. And that uses what we call classifiers, which are some type of machine learning or artificial intelligence that identifies images that are likely to be child sexual abuse material and flags them. They are then typically sent for human moderation to confirm whether or not they are indeed CSAM.
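
A minimal sketch of that classifier-then-human-review flow might look like the following. The scoring function is a placeholder for a real machine learning model, and the threshold value is invented for illustration; nothing is actioned automatically without a human moderator.

```python
# Sketch of classifier triage: score content, queue high scores for
# human review, take no automated action on its own.
from collections import deque

REVIEW_THRESHOLD = 0.8   # hypothetical tuning value

moderation_queue = deque()

def classifier_score(content: bytes) -> float:
    """Placeholder for an ML classifier returning P(harmful)."""
    return 0.9 if b"flagged-example" in content else 0.1

def triage(content_id: str, content: bytes) -> str:
    score = classifier_score(content)
    if score >= REVIEW_THRESHOLD:
        # Flag only; a human moderator confirms before any report is filed.
        moderation_queue.append((content_id, score))
        return "queued for human review"
    return "no action"

print(triage("upload-1", b"flagged-example bytes"))  # queued for human review
print(triage("upload-2", b"ordinary bytes"))         # no action
```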

Neil Fairbrother

Okay. The second pillar of your five pillar approach is called “Collective Action” and states that “…combating CSEA [child sexual exploitation and abuse] requires a whole of society approach in which everyone has a vital role”. Could we drill into this a little bit? Because I think some of your members may feel that they don’t have such a direct vital role. For example, the content distribution networks, the CDNs of the world, often claim that they are acting as a mere proxy and not hosting anything at all, but some people say, well, that can’t be right, because they clearly are hosting content. So is there a little bit of a conflict between some of your members’ actions, and how they explain themselves, versus this laudable whole-of-society approach?

Sean Litton, Executive Director, Technology Coalition

Well, I can only speak for the companies that are members of the Technology Coalition, and they are taking their responsibility seriously. That’s my experience. And the idea behind collective action is really that, you know, we need a coordinated response on whatever the issue is, and we need to share information. So, you know, when we all come to the table, industry has a very important role, but regulators and policy makers also have an important role. Law enforcement has an important role. Social services that support victims and survivors have an important role. Civil society and survivor groups have an important role. So the idea is, you know, how can we coordinate our actions so that we can move forward in a collective manner and drive greater impact across the entire ecosystem?

Neil Fairbrother

Yeah, some technology companies, such as ISPs for example, use what’s known as the “mere conduit” defense, because as far as they’re concerned, they’re simply passing packets of data across their network and they don’t want to be seen as the arbiters of truth. To be fair to them, unless they’re doing deep packet inspection or something like that on all the individual packets that go across their networks, their room for maneuver is limited, but nonetheless, there are things they could do. So how do you bring organizations like that around to support the view of a whole society approach, do you think?

Sean Litton, Executive Director, Technology Coalition

So within industry, and even within our members, you know, many of our members have several different platforms and several different technologies in play. The issue of privacy also comes into play, you know, the expectation of privacy in private communications and whether private communications should be screened, et cetera. So each company makes its own judgment on where it’s appropriate to intervene and where it is not, based on its own set of principles and practices that it publishes in its user agreement. These are not simple or easy issues, but I think the idea is that we find this harmful material where it’s being used in the most harmful manner, find it as quickly as possible, identify the generators of the material, and take it down as quickly as possible.

Neil Fairbrother

Okay. The third pillar of your five pillars approach is “Independent Research”, and research is very interesting; it’s great to feature it in these podcasts. But I just wonder how much more research we actually need. You know, we already know that social media platforms seem to be rife with CSAM. We also know that CSAM is illegal in most jurisdictions around the world, if not every jurisdiction. So do we actually need any more research? Hasn’t this topic been researched to death, and could we just get on and implement solutions?

Sean Litton, Executive Director, Technology Coalition

I mean, there’s tons of work to be done on implementation. There’s no doubt about that, but there’s also a great need for research. So for example, you know, the idea is: how can we prevent this from ever happening, right? Stop it before it happens. And so, you know, research on the behaviour of potential or actual predators, so that we can work to quickly identify them and protect children from them. Research on how this issue affects newer technologies and what type of interventions would be most effective. And so, for example, live streaming is a technology that is problematic, right? Because it’s happening live. How can we appropriately intervene in a live streaming context? What are the particular ways that predators might use a live streaming context to access and harm children, and what are appropriate steps that can be taken to prevent that and arrest it as quickly as possible? I think the more research we can get the better.

At the same time, as you said, on the implementation side, for what we already know there’s also a lot of work to be done. And I would say that, you know, no single member of the Technology Coalition has this all figured out; no single company has this all figured out. And the idea behind the Technology Coalition isn’t that, you know, here is the list of the perfect companies that have no problems, but here are the companies that have come together in a very, very committed fashion to share information and help one another continue to improve their performance on this issue, and here is the place where other members of industry can come to quickly accelerate their progress and improve their game on this issue. But as you said, there is work to be done and there will always be more work to be done.

Neil Fairbrother

Yes, you touched on a really interesting topic there, that of how to prevent pedophiles from doing what they do in the first place. How do you identify these people? That seems like an impossible task without some kind of screening of children as they grow up, to analyze their behaviour and see where they end up. Is that even feasible?

Sean Litton, Executive Director, Technology Coalition

Well, I can’t speak to that, but I can speak to the fact that there are patterns of online behaviour that can be detected and identified as a risk. As to the whole psychology that goes into creating that behaviour, I don’t know, but in terms of the behaviour itself online, there tend to be common patterns, and research can identify those patterns. And then companies, with that information, can build tools to identify that behaviour on their platform, flag it, and then look into it further.

Neil Fairbrother

Yes. So my understanding, for example, of grooming is that it is a well-defined area, that there’s nothing new in grooming; grooming for sexual exploitation, or indeed for radicalization, is the same process. The same practices are perpetrated, and the outcome may be sexual exploitation or radicalization. So are those the kinds of patterns that you’re referring to?

Sean Litton, Executive Director, Technology Coalition

Exactly. Yeah. Those types of patterns online, and then developing tools to identify them, flag them and investigate further.
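
As a very rough sketch of that kind of pattern-based flagging: count simple behavioural risk signals for an account and route it to a trained human reviewer past a threshold. The signals and threshold below are invented for illustration; real systems, such as the grooming classifier discussed later, are far more sophisticated.

```python
# Sketch of rule-based behavioural flagging; flags are reviewed by
# humans, never actioned automatically.
RISK_THRESHOLD = 2  # hypothetical

def risk_signals(account: dict) -> int:
    signals = 0
    if account["new_contacts_with_minors_per_day"] > 10:
        signals += 1
    if account["asks_to_move_platform"]:        # "let's chat somewhere else"
        signals += 1
    if account["requests_private_images"]:
        signals += 1
    return signals

def should_flag(account: dict) -> bool:
    """Flag the account for human review past the threshold."""
    return risk_signals(account) >= RISK_THRESHOLD

suspect = {
    "new_contacts_with_minors_per_day": 25,
    "asks_to_move_platform": True,
    "requests_private_images": False,
}
print(should_flag(suspect))  # True -> route to a trained reviewer
```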

Neil Fairbrother

Okay. Now, you did say that privacy is an important thing here, and there was a very strong reaction from various people against Apple’s proposal, because Apple were suggesting that they would have to do some scanning on the device, and it was the on-device scanning that got a lot of people up in arms against the proposal. If you’re then looking at people’s behavioural patterns, that must fall into the same kind of privacy, or potentially anti-privacy, category, because you will have to analyze, on their device, their own communication patterns and behaviours.

Sean Litton, Executive Director, Technology Coalition

Not necessarily on their device. It could be what’s taking place on the platform itself. And I think, you know, the idea, to the extent that this has been done, is always to do it in the least intrusive way possible, and anonymously, until particular thresholds are crossed; only after a particular threshold has been crossed do people begin to look into it more specifically.
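
A minimal sketch of that “anonymous until a threshold is crossed” idea: signals are counted against an opaque token, and only once the count passes the threshold is the case escalated for specific review. The token derivation and the threshold are hypothetical.

```python
# Sketch of privacy-preserving escalation: reviewers see only an
# opaque token until a threshold of signals has accumulated.
import hashlib
from collections import Counter

ESCALATION_THRESHOLD = 3  # hypothetical

event_counts = Counter()

def opaque_token(user_id: str) -> str:
    """Reviewers see only this token until escalation."""
    return hashlib.sha256(user_id.encode()).hexdigest()[:16]

def record_signal(user_id: str):
    token = opaque_token(user_id)
    event_counts[token] += 1
    if event_counts[token] >= ESCALATION_THRESHOLD:
        # Only now is the case looked into more specifically.
        print(f"escalate case {token} for human review")

for _ in range(3):
    record_signal("user-123")
```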

And again, this is something that’s unique to each member: how they approach this, the standards they set for it, the protocols they build around it. There’s not one single approach being taken by industry or by the members of the Technology Coalition. So, you know, I think in this case, if you want to know how different companies are actually working on grooming, for example, we’d have to ask them individually; they’re all taking unique and different approaches.

And again, what we’re doing in the Technology Coalition is getting the companies to share these different approaches, in the hope that, you know, as Company A, Company B and Company C are all at different levels of maturity and different levels of capacity, they can learn from one another. So, you know, we hosted an internal member webinar on grooming, for example, and we had Thorn come, because they have a new grooming technology, a new grooming classifier, that is available to our members…

Neil Fairbrother

Is that the one that Microsoft developed?

Sean Litton, Executive Director, Technology Coalition

It’s related to that. It started with Microsoft and Hany Farid, who I know has been a guest previously, and now Thorn has moved on to, you know, their own, I believe it’s their own, proprietary classifier, but it began with a project with Microsoft and Hany Farid. So Thorn presented on that, explained how their classifier worked, and made it available to our members.

Then one of our larger members presented on their particular approach, and they only presented on one piece of their approach, some of the analysis that they do, and then another member, a smaller company that is doing a lot of this manually, presented their approach. And again, it goes to identifying particular patterns of behavior, but we’re all taking different approaches. Then the other companies that are present can learn from this and begin to decide: okay, how do we want to approach this issue? What technology is available to help us get there? Or what do we want to develop on our own? And where do we want to use it, where do we not want to use it, and how do we want to use it? These are very much individual platform decisions.

Neil Fairbrother

The fourth pillar is “Information and Knowledge Sharing”, which has a number of subsections. But the piece that stood out for me was the third bullet point, which says that “…our work in the coming years will likely include the development of rapid response protocols and mechanisms for sharing leads on abuse factors”. What do you mean by rapid response protocols?

Sean Litton, Executive Director, Technology Coalition

Right. And this is what I referred to earlier; it’s also related to tech innovation. This is facilitating a platform whereby our members can share information rapidly, rather than through a periodic download of a hash set. They can immediately elevate a threat, they can immediately elevate an image and push that across everybody’s hash detection platforms almost instantaneously, rather than Company A uploading a hash set to a hash database and then there being a delay before the other companies might download that hash set. So it’s the idea of doing it much more quickly, and sharing information beyond hashes.
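
A hedged sketch of the difference between periodic downloads and an immediate push might look like the publish/subscribe outline below. The class and member names are hypothetical, not the Coalition’s actual platform; a real system would run over a message bus with authenticated endpoints.

```python
# Sketch of push-based hash sharing: a newly verified hash is pushed
# to every subscribed member at once, instead of waiting for each
# member's next periodic download.
from typing import Callable

class HashExchange:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, handler: Callable[[str, str], None]):
        """Register a member's ingestion handler."""
        self._subscribers.append(handler)

    def publish(self, source: str, new_hash: str):
        """Push a newly verified hash to all members immediately."""
        for handler in self._subscribers:
            handler(source, new_hash)

exchange = HashExchange()
exchange.subscribe(lambda src, h: print(f"member B ingests {h} from {src}"))
exchange.subscribe(lambda src, h: print(f"member C ingests {h} from {src}"))
exchange.publish("member A", "f00d1234deadbeef")
```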

Neil Fairbrother

Okay. And the fifth and final pillar is “Transparency and Accountability”, and in here you say that you’re “…inspired by the recent release of voluntary principles to counter online child sexual exploitation and abuse, and also the UN Convention on the Rights of the Child”. Now, the UN Convention on the Rights of the Child is particularly interesting in this context, I think, for two reasons. One is a recent addendum to it, known as General Comment 25, which defines children’s rights online, and the other reason it’s of interest is that a lot of your members are American organizations, operating obviously under American law and jurisdiction, but America as a nation hasn’t ratified the UN CRC. Should America ratify the UN Convention on the Rights of the Child, and if it doesn’t, will it take any notice at all of General Comment 25?

Sean Litton, Executive Director, Technology Coalition

Yeah, Neil, I’m not in any way prepared to answer that question on behalf of the United States!

Neil Fairbrother

Okay. Fair enough. Let me phrase it another way perhaps! The General Comment 25 addendum to the UN Convention on the Rights of the Child defines and extends children’s rights online. Is this a useful document, do you think, for your members to use as the basis for perhaps their own terms of service as far as children are concerned?

Sean Litton, Executive Director, Technology Coalition

With respect to General Comment 25, yes, it is a helpful guide. It’s not, for example, global legislation, but it is a helpful guide for companies as they seek to better protect children online and understand what our corporate social responsibility is with respect to this very vulnerable population.

Neil Fairbrother

Okay. Now, when it comes to online child safety, there is a view that it should be an integral part of a business plan for a company providing services that may be used by children, or are likely to be used by children, in exactly the same way, for example, that Tesla had to comply with all car safety regulations from the get-go. Is this something you might agree with, that even if you’re a small startup in the social media space, you should comply with safety legislation such as it is?

Sean Litton, Executive Director, Technology Coalition

Well, you know, obviously to the extent you’re operating in a country and there is legislation in force, of course you should comply with that legislation. Where there’s not legislation, companies are left to use their best judgment. And generally, my sense has been, at least from working with the members of the Technology Coalition, that their business interests align a hundred percent with protecting children on their platforms and doing whatever they can to ensure that children are not harmed on their platforms. So I find no shortage of motivation to deal with this issue.

The challenge, for a smaller company, can be a resourcing challenge. It actually requires a great deal of resources to implement these technologies: engineering resources, human resources to implement human moderation, work on policy and practice, et cetera. And, you know, there’s only one guidebook on how to get there that I’m aware of, and we’re trying to write it at the Technology Coalition.

And so, you know, the idea of the Technology Coalition, I think, the thing you seized on that’s really important, is new startup small companies. A platform may have been designed by a 19-year-old, right, who had a great idea but never thought about how this new social media platform they developed could be abused by people who want to harm children. This is just an individual; maybe they’re an engineering student, maybe they dropped out just to work on their new app. And so once that app starts to gain traction, how does that new platform quickly improve its child safety practices, right? They don’t have the resources of a Google or a Facebook, et cetera.

So the idea is that they can come to the Technology Coalition and we can help them do that. And we’re trying to position ourselves and build out the resources and the expertise so that, you know, any small company can come in and we can help them quickly elevate their game and make sure that their platform isn’t used to abuse children.

Neil Fairbrother

Okay. You referred in your fifth and final pillar to Transparency and Accountability, and the UK published a draft Online Safety Bill earlier this year, which is working its way through due parliamentary process. It focuses very much on transparency and accountability, to the point where it asks for a named individual within a company that falls within its remit, so it’s covering user-to-user service providers and search engines, and for that named individual to possibly be open to criminal proceedings if they don’t comply with the transparency and accountability reporting requirements. Is that position something that the Technology Coalition supports, or do you feel it’s too draconian?

Sean Litton, Executive Director, Technology Coalition

Right. I really can’t speak to that particular issue. I can say that, you know, the impetus behind that type of regulation is to drive greater transparency within industry, right, and to ensure that there’s accountability both for actions and for the transparency itself. And so, you know, we do support greater transparency by industry and greater accountability, and this is what we’re working on. We published our first annual report this year, and in that there’s a whole section on transparency, kind of an amalgamation across all our members. It was only a start and there’s a lot more to do, and we’re working with all of our members on their own individual transparency reports, increasing the level of transparency and increasing the level of reporting.

Neil Fairbrother

Now Ofcom, which is the new regulator we have in the UK, have recently published some guidelines for the regulation of video sharing platforms, and this will impact a number of your members, I’m sure, in a number of different ways. The guidance is that video sharing platforms, or VSPs as they’re known, need to comply with new regulations “…to protect under 18s” specifically from harmful content, such as hate speech in videos and adverts, some of which might incite violence against protected groups, terrorism content, child sexual abuse material, which is, I think, your group’s speciality, and “…racism and xenophobia”. So it’s quite an extensive list, but this does raise the spectre of Age Verification, as it’s focused on under eighteens. Is Age Verification something that your team at the Technology Coalition is working on?

Sean Litton, Executive Director, Technology Coalition

Yeah, you’re right. That’s a huge issue. And I think, either later this month or early next month, we’re hosting a webinar specifically on that issue. And what we will do, like I talked about with grooming, is bring in different members, different size members with different types of platforms, and talk about how they’re implementing age verification on their platforms. Again, with the idea of sharing, you know, sort of what their practices are and helping other members consider how they want to approach this issue.

Neil Fairbrother

We are running out of time, unfortunately, but I would like to ask something about the Terms of Service that your members use. Clearly, they are coming together within the Coalition for a common cause, but equally they are competitive service providers. They compete with each other at many different levels, but as we touched on earlier, they are probably all in agreement, I would certainly hope they’re all in agreement, that it is universally illegal to create, share, upload and store CSAM. There’s no privacy carve-out, there’s no Freedom of Speech carve-out, First Amendment carve-out, or Constitutional carve-out for this kind of content at all.

So, as well as having that common hashing technology you referred to earlier for video, would it be possible for your members to create a common, unified approach across all of their Terms of Service in this respect, so they can compete on other aspects, but in this respect, CSAM and trying to eliminate CSAM, could they have a common approach that’s defined within the Terms of Service?

TikTok, for example, mentions CSAM a lot in their Terms of Service, and I know TikTok is one of your members. Other similar service providers, such as Telegram, which may not be one of your members, have terms of service so short as to be almost not worth the pixels they’re written with. But could there be a unified approach within the Terms of Service that is then communicated to service users when they sign up, a common standardized approach that simply eradicates this stuff from all these services?

Sean Litton, Executive Director, Technology Coalition

Right. This idea of a common standardized approach is something that we have talked about. One of the real big challenges to this is that each platform is different from the others. They operate differently. And I’m not trying to avoid responsibility for taking on the issue of CSAM; I’m just talking about the difficulty of setting common standards. What we’ve developed within the Technology Coalition is what we call a maturity model, which shows sort of the path, a very general roadmap with some mileposts, that a company would follow as it matures on these issues. But even that is problematic because it’s not customized to each sector; we have all the different sectors represented in it. And so one of the things we’re looking at in the coming year is developing roadmaps that are more sector-specific, developing clear and objective benchmarks. And as we gather around to talk about those things, we’ll see if that moves at some point to common standards across all platforms.

You know, we do have some common standards now that we insist all of our members implement. They must have, for example, an ability to receive reports of CSAM. They must have an ability to report CSAM to, you know, whatever body the government has designated for reporting in their jurisdiction. So there are some basic requirements right now that industry is aligned on. But as to how the technology gets implemented on the different platforms, it’s quite complicated.

But, you know, this is something we’ll be working on, and maybe in a year we’ll come back and I’ll have more news on it. What I see with the smaller companies is that what they’re looking for is: tell me, what are the benchmarks? Where do I need to be on these issues? And we’re trying to make that as clear as possible for them, so they have a very clear target to aim for, in terms of basic minimum performance on these issues.

Neil Fairbrother

We’re coming up to the end of the year; it’s the middle of October as we’re recording this podcast. What’s left in store for the Technology Coalition this year?

Sean Litton, Executive Director, Technology Coalition

Right. So we do have this webinar coming up on Age Verification, and, you know, we’re excited to see where that will go with our members. And then we’re really looking towards 2022: what are our plans there, and how can we better build out a set of resources to help new companies? How can we better support companies in their efforts? And then we’re doing a lot of work on transparency: how can we facilitate greater transparency among our members, among industry, and how can we facilitate a genuine dialogue between industry and external stakeholders?

Neil Fairbrother

Okay. Sean, thank you so much for your time. A really interesting insight into the Technology Coalition and your members; I look forward to seeing your next annual report and hearing the results of the webinar on Age Verification.

Sean Litton, Executive Director, Technology Coalition

Thanks, Neil.
