Safeguarding podcast – Tools for Today’s Digital Parents with Stephen Balkam CEO FOSI

In this Safeguarding Podcast with Stephen Balkam, CEO of FOSI, we discuss FOSI’s report “Tools for Today’s Digital Parents”, which aims to help parents keep their children safe online. We cover “resilience”, Age Verification, Parental Controls vs “SafetyTech”, the different attitudes towards and practices of online child safety among Millennial, Gen-X and Boomer parents, the six key takeaways from FOSI’s report, and what children themselves think.

There’s a lightly edited transcript below for those that can’t use podcasts, or for those that simply prefer to read.

Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online.

Neil Fairbrother

Children it seems have been online forever. Being always on, always connected, has become so normalized to them that it’s simply considered as natural as eating or sleeping. But the online environment has never been designed with child safety in mind, which has led in some cases to some extreme abuse and outcomes.

There are online safety tools available for parents who are minded to use them, which can help protect children from the worst excesses of the online world and COVID has arguably made the case for using these even stronger. Yet many parents don’t use these tools and even if they do use them, just how effective are they?

To discuss this, I’m joined by Stephen Balkam CEO of the Family Online Safety Institute, or FOSI. Welcome back to the podcast, Stephen, because this is in fact, your second appearance on our safeguarding podcast.

Stephen Balkam, CEO FOSI

Well, thanks for having me back Neil, I really appreciate it.

Neil Fairbrother

Well, thank you, Stephen. It’s great you’ve made so much time for us. For our new listeners and perhaps as a reminder for our existing audience, could you provide us please with a brief resumé, so that everyone understands where you’re coming from, what your background is?

Stephen Balkam, CEO FOSI

Well I’m the Founder and CEO of the Family Online Safety Institute. I set it up in 2007 with the idea of bringing together government, industry and the nonprofit sector to collaborate and to innovate in the space of online safety.

We’ve grown from seven members to about 30 members, from Amazon to Yahoo in the alphabet, if you will. Facebook, Google, Microsoft are all members. And we do research, we do projects. We do something called “Good Digital Parenting” and we work behind the scenes with the tech companies to up their trust and safety efforts.

Prior to that, I ran something called ICRA, the Internet Content Rating Association, which I set up in 1999, actually in the UK. I’m a dual national, I’ve gone back and forth over the years. We got a big grant from the European Union as part of their Safer Internet program, and that was a rating system for websites, a self-rating system around nudity, sex, language and violence. And then we created a filter that could help parents to block the stuff they didn’t want their kids to see.

Prior to that, I was part of another effort, a US-based rating system called RSAC, the Recreational Software Advisory Council going back to 1994. So I’ve been in this space for a while, have seen the kind of emergence both of the web and then Web 2.0 and social media in particular. And it’s been quite a ride!

Prior to all of that I lived in the UK for many years, ran various nonprofits there from Camden Community Transport to Islington Voluntary Action Council, even the National Step Family Association based in Cambridge. And so yeah, long, long history of working in the nonprofit sector world, in the UK as well.

Neil Fairbrother

Excellent. Thank you for that. Now in November last year, you held your annual FOSI conference, entitled “Building Resilience” and presumably that was an online conference last year as were faced with COVID lockdowns and the like. How did the event go? What was it like and what did it involve?

Stephen Balkam, CEO FOSI

Well, yeah, as you say it was a virtual conference. We had to make that fateful decision, I think back in April or May of last year, when it became clear that by November things would not have cleared up. And so, yeah, funnily enough, we used a British-based platform called “Hopin” as our platform for the conference. We also used Zoom for the video feed and, you know, ironically, we were able to have far more folks involved and engaged, international ones as well.

So we had folks from the UK, from Brussels, from Australia, from New Zealand, because they didn’t have to travel, and so it was a lot easier for people to attend. Because Hopin has this really excellent chat function, one-to-one networking and virtual exhibit booths, people were able to interact with each other in ways that pleasantly surprised the delegates, and we got some very, very good positive feedback for the whole experience.

So yeah, we even changed the title of the conference to “Building Resilience” because you know, when we first conceived of it earlier in 2020 there was no pandemic on the horizon, but we felt that resilience was a key word for kids, for parents, for teachers, even lawmakers to get through this ridiculously tricky and challenging year.

Neil Fairbrother

Okay. And what do you mean by “resilience” in the context of children online, Stephen?

Stephen Balkam, CEO FOSI

If you think of the word itself, it’s about quickly recovering from a difficult situation or, in some cases, trauma. And I think that, you know, we’re very big believers in giving kids the opportunity to make mistakes, giving them enough freedom to learn and to recover from difficult situations.

So for instance, if they’re cyberbullied online, well one way to build resilience in your kids is to give them the tools, or at least show them how to report, how to block, how to filter out the people that they don’t want to continue to bully them. And in some more severe cases, taking the situation to the school, or even to law enforcement and showing that the kids actually have the power and the agency to overcome these challenges, rather than just simply turning everything off, throwing things in the bin, you know, and trying to walk away from it… because that’s just not going to happen. Digital technology is with us and certainly with our kids for the rest of their lives.

Neil Fairbrother

So it’s a little bit like an inoculation against a virus.

Stephen Balkam, CEO FOSI

Yeah. And you know what? Sometimes inoculations hurt a little bit to begin with, you even might have an adverse reaction, but basically you recover and you build resilience against the invader, if you will…

Neil Fairbrother

Okay. Now at the event, which I attended and I have to join the people who congratulated you on it, I thought it was excellent, at the event you launched a major report that you had produced I think in collaboration with Verizon, called “Tools for Today’s Digital Parents” and this report describes attitudes to what are known as Parental Controls. What are Parental Controls?

Stephen Balkam, CEO FOSI

So traditionally we’ve thought of Parental Controls, and I talked to you about my work going all the way back to the nineties. You know, the original concern, the original problem as it were, was porn on the internet. We wanted to create filters, Parental Controls, so the kids wouldn’t get to porn. That was the kind of genesis of all of this.

Neil Fairbrother

This is adult legal porn?

Stephen Balkam, CEO FOSI

Legal adult pornography, which is in this country protected by the First Amendment and in many other countries and cultures it is protected speech. Not all countries, obviously, more conservative countries don’t allow it at all.

So over the last two decades all kinds of Parental Controls have emerged to give parents the ability to control the kind of content that their children access. I mentioned earlier the emergence of Web 2.0, typically around 2004, 2005, with MySpace, later Facebook, Twitter; social media, basically. User-generated content created a new dilemma for parents where kids weren’t just accessing harmful content, they were creating the content we used to try to keep them away from. So they were sharing nude pictures of themselves, or they were bullying each other, or simply they were just spending way too much time online and, you know, their schoolwork and so on was being adversely impacted. So Parental Controls morphed and changed to deal with not just content, but also their conduct online, as well as contact; who they were contacting and who they were allowing to contact them.

And then I would guess over the last four or five years, we’ve seen the emergence of one other category, what I call “Online Safety” tools, and these are more directed at young people themselves, the tweens, the teens, folks in their twenties. These are ways in which young people can report stuff that’s happening. They can block stuff that’s happening. They can take issue with the types of comments that are coming in. And we’re seeing those sorts of tools developed by Snapchat, by TikTok, by Twitter, by Facebook. And these tools are ones that teens and young people highly value as opposed to parental controls, which they mostly despise.

Although, you know, quite frankly, they probably wouldn’t admit it, but maybe some are relieved that their parents have restricted some stuff. But all I’m saying is there’s this interesting new category and it’s one that I think we need to pay a lot more attention to, particularly as kids grow older, so that we don’t just give up because the kids have learned how to hack the Parental Controls we’ve set, but encourage them to create the parameters around where they feel safe on their apps and when they’re online.

Neil Fairbrother

Okay. Now the report was called “Tools for Today’s Digital Parents” and that implies there’s such a thing as “digital parenting” and in the report you say that digital parenting is a journey towards trust. What does that mean and how is this trust earned by the child and recognized by the parent?

Stephen Balkam, CEO FOSI

Well, okay. So first of all, kids are getting digital tools and devices and going online at a younger and younger age. You know, Amazon’s Kindle Fire for Kids is aimed at the two year old market, you know? And so you wouldn’t hand a kindergartener a device with unfettered access to the internet. That just doesn’t make any sense, to give them the ability to go wherever they want, whenever they like. By far the majority of parents would not allow that to happen. You don’t trust your four and a half year old to know, and to be able to discern, where to go and where not to go. But of course, you know, childhood is this remarkable continuum from zero to say 18, and there’s a vast difference between a six-year-old and a 16-year-old. And so, as they get older, and typically around 12, 13, there is a shift, and that shift is a little bit like the training wheels coming off a bike: as you teach your child to ride, you start to trust them more.

But as I think it was Ronald Reagan who said “Trust, but verify”. In other words, trust them, but stay as copilots with them along this journey where you say, “Okay, I’ve bought this phone for you, or you know, you’re 13, you’re allowed on Facebook and on Twitter and these other social media sites. But here’s some rules that we have about the ways in which you use your devices and the types of apps and websites that you go to. And I’m going to keep an eye on you. I will keep an eye on, I will monitor, I will check in. And on top of that, I want you to tell me what online safety tools you’re using on these platforms. Tell me how you’re going to keep your posts private. Tell me how you report when something bad happens”.

So the trust builds up over time and of course by the time they’re 18 it’s “Goodbye, thank you very much!”

Neil Fairbrother

Okay. Now in the report, you talk about concerns and online risks that typically parents have for their children and the peak age of concern that parents have is when their children are in the age bracket of 7 to 11. But often the minimum age to be online is 13. Many social media sites have that as the lower level of age. So is there a requirement for some kind of robust age gating, age assurance, age estimation, age verification system, rather than the all too porous self-certification that is currently used?

Stephen Balkam, CEO FOSI

Well, you know, I guess it’s the Holy Grail of the online safety world. I’ve been a part of not one, but two different congressional commissions looking at this and all of the various solutions that have been provided have had some flaw, some difficulty, some issue that comes along with it.

Ironically to validate an 11 or 12 or 13 or a 14 year old, you have to gather even more personal information than you would do if it was a 22 year old. 22 year olds can typically show a driver’s license or some such, a government ID which is not available to much younger kids, or you use biometrics, which in itself creates issues around databases of kids’ biometrics. After all these years, there doesn’t seem to be a solution that works on a lot of different levels. So we’re stuck with this self-declaration of your age, as well as parents getting more and more involved with their kids’ digital lives.

Now, some parents decide that they’re going to let their kids on to Facebook or Snapchat at a younger age, regardless. They just want them on there because that’s the way the family is sharing their photographs, or that’s the way they organize their gatherings or once gatherings start to happen again. So some parents will actively lie for their kids to get them onto these social media sites. And I guess that’s their decision.

Now having said that, if one of these social media sites discovers that there is an 11 year old on there, even though they declared that they were 15, and they have proof of that, then they will shut down their profile. So it’s still a vexed issue. We’re not fans of the idea of encouraging parents to lie to let their kids onto social media platforms. It’s just not the best introduction to the notion of digital citizenship to begin your journey by, you know, lying about your age.

Neil Fairbrother

Okay. So that then places some responsibility on parents, and one of the interesting points that came out of the report is the generational impact of parents. The different ages of parents, whether they are Millennial parents, Gen X-ers, or Boomers, will impact their attitudes and practices towards keeping their children’s digital life safe. Could you define for us what a Boomer, a Gen X-er and a Millennial are, and what are their different attitudes when it comes to keeping their children safe online?

Stephen Balkam, CEO FOSI

This was probably the most fascinating part of the research. We asked parents two different questions. One was, who has the most responsibility for keeping kids safe online? And the other was, what are your top concerns?

So on the most responsibility one, Boomer parents, 1946 to 1964, I guess the older folks, if you will, the older parents: 57% said the parents had the most responsibility for keeping their kids safe online. Gen X, it drops down to 43%, but dramatically, Millennials, only 30% said that parents had the most responsibility. They were half as likely to say that as Boomers. They thought that the tech industry and government, and even schools, shared responsibility for online safety.

And I would agree. We’ve created a concept called a “culture of responsibility” where we talk about you know, government making informed decisions, the tech industry creating robust tools for parents, law enforcement having the resources to catch the bad guys, but also parents, teachers, and the kids themselves all sharing different, but overlapping areas of responsibility.

So that was one question. The other one was, what are your top concerns? Boomers overwhelmingly talked about outside threats, particularly predators. It was a real fixation with the idea of the boogeyman coming to get your kids. Gen-X was more about harmful content like adult porn. Millennials, interestingly enough, focused on bad behaviour, including their own kids possibly being the perpetrators of cyberbullying or sexting, or, you know, intimidating others or whatever.

And I think that’s fascinating because of course Millennials are ones who’ve grown up with this technology and may themselves have participated in some of this behaviour, whereas Boomers, you know, and I speak as one, you know, the internet happened halfway through my life. So I didn’t grow up with that as a teen. Whereas Millennials see it from a very, very different perspective.

Neil Fairbrother

Okay. From my notes, just for clarification: Millennials are adults aged between 23 and 38, as of 2019 anyway; Gen-Xers are those born between 1960 and the late 1970s, I believe that’s how that breaks down; and Boomers are, as you say, 1946 to 1964.

Okay. So thank you for that. You have six outcomes or conclusions in your report and I think it would be well worth exploring those outcomes. So in reverse order the sixth key takeaway or outcome from your report is that “…online safety centre destinations have become an industry best practice for media and tech thought leaders.” Could you take us through that? What does that mean?

Stephen Balkam, CEO FOSI

You know, this trend probably started about a decade ago and Facebook was one of the first ones. In this country, Verizon, AT&T, some of the other ISPs started to create online safety centres within their broader platform. And so it became a repository of information about how to stay safe on that particular platform or ISP. And then there would be blog posts and there would be events and other things generated around it to help parents to keep their kids safe.

Now we think that’s great and we’re all about encouraging the various different companies… In fact, I can’t think of a company now that’s active online, connecting kids to their parents or providing some kind of social media platform for user generated content, that doesn’t have a safety centre. It just makes total sense. The last thing they want is for people to feel unsafe on their platform, or for parents to feel that their kids are unsafe on their particular platform or app.

The issue, I guess is for parents, it becomes difficult and confusing to have to go to all of these different safety centres as their kids get older. And you mentioned between 7 and 11, this is when there’s a great expansion, a great explosion of the ways in which kids use the net, the different sites they go to and the apps that they start to download. It’s a lot for parents to have to visit all of these safety centres and make sure they’re keeping up-to-date.

Neil Fairbrother

Yeah, and that links neatly actually into the fifth outcome or conclusion that says that “…it is critical for media companies to reach parents before, or when, kids in the household reach the 7 to 11 age range.” That seems to be a really key age there. So how can media companies or tech companies do that?

Stephen Balkam, CEO FOSI

Let’s start with the mobile phones, for instance. We would love to see in actual mobile phone stores our little mini online safety contract that parents can sign with their kids, or tips and tools to keep them safe, literally given to the parents as they’re signing the contract for the phone for their eight or nine or 10 or 11 year old. So that there’s an exchange, kind of like the safety information you get when you buy a new car say, or, you know, you buy some machinery for your house. There’s usually some kind of document that comes with it that talks about how to remain safe with this new device. So we would like to see more of that.

Similarly with downloading apps, we would love to see some inline messaging so as the app is being downloaded, there is a link directly to the ways in which parents can control what their kids are going to see and do, or a link to the online safety tools for the kids themselves, as I was describing those different ways in which you can create safety. Parental controls, of course are top-down, online safety tools are more bottom up, but we would like to see that messaging happening much earlier on.

Neil Fairbrother

There is a model, much like the Genius Bar for technical questions in an Apple store, that could be deployed within the cell phone vendors’ stores: a “Safeguarding Genius Bar”. Would that make sense?

Stephen Balkam, CEO FOSI

I love that idea. Yeah. I mean, I just think we have to, I hate to say this phrase, it’s so hackneyed, but think outside the box. We have to get a lot more thoughtful and imaginative about how to reach parents, and do it in a way that is simple, easy to understand and can be actioned right away, rather than two or three weeks down the road when a problem arises and then all hell breaks loose in the family.

Neil Fairbrother

Okay. So the fourth in reverse order, the fourth outcome you had in your report, was that there is “… a similarity across digital parenting tools and features, and that leaves room for innovation and differentiation”. I think what you’re saying there is that there are plenty of tools out there, but they are all pretty much the same. Does that mean that they are not doing the job?

Stephen Balkam, CEO FOSI

Okay. This was somewhat… I questioned this a little when I saw it come out of the research. There are definitely similarities, but I’m afraid there are probably more differences than similarities. And here’s the thing. I would actually like to see the companies use similar or the same language, the same logos, the same graphics across the various different tools, so that when the parents come in… it’s a little bit like how we all settled on that triangle which means “play”. You know, if you go onto any kind of device, the play button, the pause button, the fast-forward button, they’re all the same across lots and lots of different devices. The industry got together and, you know, it was better for consumers to be faced with the same icons than to have to figure it out for each device.

I would love to see something like that for parental controls, and unfortunately the interfaces vary dramatically between them. Having said that, the similarities are that they, for the most part, focus on adult pornography and violent images, and, you know, so yes, you do see similarities there. Same with time controls. But when you actually go into the time controls, the way in which the layout sets out the 24 hour period is different on each of the devices.

Neil Fairbrother

How much of an issue is it for these products to operate in real time?

Stephen Balkam, CEO FOSI

I think there have been some vast improvements over the last five or six years with the use of AI. Artificial intelligence allows these filters to filter out on the fly. In the old days, you literally had to create a database of URLs of bad websites, and those would be blocked. AI allows stuff to be flagged or blocked or filtered out down to the image or down to the word.

They’re not perfect though. Usually the problem is over-filtering rather than under-filtering.

Neil Fairbrother

Is that a bad thing? The precautionary principle applies…

Stephen Balkam, CEO FOSI

If you over-filter, here’s the problem. If you over-filter, typically a message comes up which says, you have been blocked, please enter your password to access this content or this site. So it’s usually, you know, a kid calling from another room, “Mum, what’s the password? I can’t get in. I’m doing this homework, you know, I’m doing biology. I need to see naked ladies, you know, give me the password.” And so the Mum has to constantly come in and type in the password, or inevitably what happens after the third or fourth time, Mum just shouts the password out. And then Johnny now has the password and doesn’t even have to bother. So that’s the problem with over-filtering. Under-filtering, obviously, means that Johnny gets in regardless, and the filter is not working as it should.

Neil Fairbrother

The third conclusion you drew from your research is that Millennial parents, having grown up with technology, are looking for more digital parenting support from the industry. What do you mean by the industry there? And is the industry, however it’s defined, doing enough? Are they moving in that direction or are they not?

Stephen Balkam, CEO FOSI

Yeah, so this was fascinating. And this goes back to who has the most responsibility? And parents, only 30% of them said they had the most responsibility. They really see the responsibility lying with the folks who are creating these tools or these apps or these websites, which makes some sense if you think about it.

I mean if we buy a car, most of us would probably say that the car manufacturer has the most responsibility for that car being safe. We might also say, “Oh, government also has a role in ensuring these cars meet the safety standards that government sets”. But we would probably not say, or only a minority of us would say, “Oh, we’re the ones who have the most responsibility for this car being safe.” Do you know what I’m saying?

And the Millennials are the ones who’ve grown up driving the internet in a way that Boomers and Gen-Xers have not. So they’re putting the responsibility back on Facebook, back on Twitter, back on Snapchat, back on TikTok, which I think is right. I mean, these are our members, but we also put pressure on them to constantly innovate and improve their own trust and safety features so that the onus doesn’t constantly fall on busy and harried parents.

Neil Fairbrother

Yeah. And the busy and harried parent does link to the penultimate conclusion that you drew which is that “Digital parents feel overwhelmed”. What is causing that feeling of being overwhelmed? Is it the fact that they simply have too busy a life and this topic of online child safety simply doesn’t have enough space or they can’t make enough space for it in their lives? Or is it too complicated? The design of the apps, the disparate designs of the apps, makes it difficult? Or is it that the number of online harms that is possible to be perpetrated on a child is all too complicated?

Stephen Balkam, CEO FOSI

Yes! Yes to all of that. So just as an example, when I first got started on this in the nineties, you know, our advice was put the family computer in the sitting room, right? So you can keep an eye on what your kids are doing. Well, now most households have over a dozen internet-connected devices, and kids are walking around with supercomputers in their pockets, which is what a mobile phone is now.

So the explosion of digital devices that are connected to the internet is one part of the overwhelm. And it’s not just mobile phones and laptops and tablets, although it’s all of those things, video game players, which by the way, used to just play video games, are also now connected online. As is your smart speaker. As is probably your front doorbell. I mean, it is astonishing now with the Internet of Things and the inclusion of AI into ordinary household objects, how much is now connected. That’s one part of it.

The second part of it is again, in the early days of Web 1.0, it was relatively static websites that we went to, if you will, to look at stuff and maybe eventually to play video. Now, of course we’re uploading video, we’re uploading stuff. The user generated content part of it has made it far more complicated and more difficult because we’re dealing not just with content issues, we’re dealing with behavioural issues.

Then you layer on top of that a staggering statistic: 5,000 apps are uploaded to the app store every day. That’s 5,000 new apps worldwide created every day. So if even only one [each day] becomes a hit, you’ve got 365 apps that you’ve got to keep some kind of track of as a parent. And if you have kids, you will notice that the apps come and go, the video games come and go in terms of popularity, and you have to constantly get your head around that whole ecosystem as well.

Never mind that we’re now mostly working and being schooled at home. The onslaught of digital input is extraordinary and unprecedented, this last year in particular. And the whole vexed issue of screen time has kind of blown up. Whereas a year or so ago that was like the number one issue, now it’s like, “Oh, thank God they’ve got something to entertain themselves with!” Because if we didn’t have these screens, we’d probably all be going absolutely crazy and be killing each other. Sorry, we wouldn’t be killing each other. We’d be fighting…

Neil Fairbrother

So the first or prime conclusion you drew from your report is one that says that “…consumer desire for a one stop shop and resource on parental controls is an opportunity for tech and media companies”. Can you expand on that for us?

Stephen Balkam, CEO FOSI

Well, yeah, in the one-on-one conversations with parents, they were like, “I wish there was just one place I could go to get the answers that I need about how to keep my kids safe online”. Now in the UK, you guys have got a pretty good resource in Internet Matters. That’s something that we’ve certainly studied here in the US and there are other ones, the eSafety Commissioners’ Office in Australia have got a pretty good hub and resource for parents.

Here, the ESRB [Entertainment Software Rating Board], which is the video game rating system, has got a pretty good hub, but it’s only for the seven or eight video game platforms: Sony, Nintendo, Sega, Xbox… So what they’re saying is, we need something where we can go to find out what we need and then how to set it up, rather than having to go to all of these different sites and all these different resources to try and figure it all out.

Again, you know, in the early days we had one computer in the living room and it was connected to a wire in the wall, you know, we could keep some kind of control over it and it was usually dial up so there wasn’t a lot of noxious stuff that could come down. Now, I mean, unfortunately a lot of parents put a television, a laptop, and a phone in a kid’s bedroom and walk out the door, which we’re not big fans of. But that’s the reality in a lot of people’s houses.

Neil Fairbrother

We’ve spoken a lot about parents and we’ve spoken a lot about different types of products, safeguarding or parent control type products. We haven’t spoken about children yet. We are running out of time, unbelievably already, so very quickly, what is the perspective of children on all of this?

Stephen Balkam, CEO FOSI

I think it was summed up nicely by a 16 year old when we asked her about online safety tools as opposed to parental controls. I don’t have her quote in front of me now, but she basically said, you know, when she thinks of online safety tools, she thinks of empowerment. She thinks of, you know, control that she has over the platforms and the apps that she’s using. When she thinks of parental controls, she thinks of being controlled by her parents. She feels like it’s an invasion of her privacy or of her agency, which is kind of a trendy word at the moment. So kids have very strong feelings about parental controls, not surprisingly in the negative, but very positive around the online safety tools that they use themselves.

Neil Fairbrother

Okay. One final question, if I may please Stephen, what is 2021 looking like for FOSI? Or FOZI?…

Stephen Balkam, CEO FOSI

Yeah, we call it FOZI. You know, it’s an interesting one. I think it’s going to be a hybrid year, you know. We’re already beginning to plan our next annual conference in November. We have a hotel booked, the Intercontinental at DC Wharf, which is this fabulous new development right down on the water. My guess is we will have some people in person and we’ll probably have a majority of people being beamed in on flat screen TVs.

Even with the vaccinations and everything else my guess is that we’ll still only struggle through this year in terms of in-person events. Otherwise you know, we’re looking at some new research topics for this year. We have a number of proposals already for the idea of creating that one-stop shop here in the US as well as our other work on public policy. And we have obviously a new administration with a new focus on technology, which is going to be fascinating to see how that plays out.

Neil Fairbrother

Okay, Stephen. Well, thank you very much for that. Thank you for your time. A fascinating insight into the world of parental controls and safety tech, certainly from the USA. Good luck with your conference later on in the year, and I look forward to attending.

Stephen Balkam, CEO FOSI

Thanks so much for having me, Neil, I really appreciate it.

 
