Our 2021 review of Human Rights Safeguarding Podcasts includes many of this year’s amazing guests. Topics include Addiction, Apple’s NeuralHash CSAM, Age Verification, John Doe vs Twitter, WHOIS, County Lines, General Comment 25, pack-hunting predators, and the TriChan Takedown: https://traffic.libsyn.com/secure/safetonetfoundation/SafeToNet_Foundation_podcast_-_2021_review.mp3 Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother exploring the law […]
In this Safeguarding Podcast: Austin Berrier, Homeland Security Investigations Officer, discusses the impact of Apple’s Child Safety tech on law enforcement, the live streaming of child sexual abuse on encrypted video streaming services, how online predatory paedophiles hunt in packs, Project Mercury, and how Zoom worked with international law enforcement to indict 300 child abusers.
Balancing Trust and Safety in End-to-end Encrypted Platforms: The Stanford Internet Observatory hosted a panel of speakers presenting views on new products and services intended to protect children in encrypted spaces. These are the key points we feel each speaker raised in the debate, along with our responses to each of them.
Welcome to another edition of the SafeToNet Foundation’s safeguarding podcast with Neil Fairbrother, exploring the law, culture and technology of safeguarding children online. In this Safeguarding Podcast with Hany Farid, Professor at the University of California, Berkeley: PhotoDNA, what it is and how it works, what PhotoDNA doesn’t do, what are Hashes and do they
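PhotoDNA itself is a proprietary perceptual-hashing algorithm, so we can’t show it here. But as a simple illustration of the hash-matching idea it builds on, the sketch below uses an ordinary cryptographic hash (SHA-256): identical files always match, while changing even a single byte produces a completely different digest. That brittleness is exactly why image-matching systems use perceptual hashes, which are designed to survive resizing and re-encoding, rather than cryptographic ones. The byte strings are hypothetical stand-ins for image data.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of raw bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for an original image file and a trivially altered copy.
original = b"example image bytes"
modified = b"example image bytes."  # one byte appended

# A cryptographic hash matches only byte-identical files...
assert sha256_hex(original) == sha256_hex(original)

# ...and any alteration, however small, yields a completely
# different digest, so exact hashes alone can't catch an image
# that has merely been resized or re-compressed.
assert sha256_hex(original) != sha256_hex(modified)
```

A database of known illegal images can therefore be stored and compared as hashes, without the matching service ever holding or viewing the images themselves.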
Apple’s recent CSAM detection announcement has caused a controversy about online privacy. Privacy of course is something we should all take seriously, especially with the unregulated commercial intrusion into our private use of social media services and of the internet in general. Apple is a private-sector company. They provide a service, iCloud, for the
If you already use Apple’s iCloud service and are objecting to their recent announcement that they will search it for evidence of pre-hashed, or digitally fingerprinted, images of illegal child abuse, then it’s too late. You’ve already agreed to allow them to do this, as they explain in their iCloud user agreement, or terms of
You can’t make an omelette without breaking eggs, or so goes the saying. Apple’s recent privacy announcement included details about the online child safety features they intend to introduce, which are, we feel, to be broadly but not unquestioningly welcomed. For too long the handset manufacturers, a key element of the Online Digital Context, have been