You’ve already agreed to Apple’s CSAM detection; you just didn’t know it.

If you already use Apple’s iCloud service and are objecting to their recent announcement that they will search it for evidence of pre-hashed or digitally fingerprinted images of illegal child abuse, then it’s too late. You’ve already agreed to allow them to do this, as they explain in their iCloud user agreement, or terms of service.

Before looking at this contract, for that is what you have entered into with Apple, there are two points worth making. One is that, from Apple’s perspective, you can do what you want with your iPhone. Apple’s CSAM Detection Announcement makes it pretty clear they are not rummaging through your photo library, not even if you do have illegal images of children being abused in it.

As previously mentioned, their CSAM detection tool looks for the hashes, or digital fingerprints, of images held in the NCMEC database, and this search only kicks in once you try to share these images on their iCloud service; even then, a minimum threshold of 30 matching files currently has to be reached before anything is triggered.
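
To make the mechanism concrete, here is a minimal sketch, in Swift, of what “matching hashes against a database, with a threshold” amounts to. This is emphatically not Apple’s code: their published design uses a perceptual hash called NeuralHash, a blinded on-device copy of the NCMEC hash database, and cryptographic techniques (private set intersection and threshold secret sharing) so that nothing is learned about any image until the threshold is crossed. The function names, the placeholder hash and the simple counting below are our own illustrative assumptions.

```swift
import Foundation

// Illustrative sketch only; not Apple's implementation. All names, the
// placeholder hash and the simple threshold count are our assumptions.

/// Stand-in for the database of hashes of known CSAM supplied by NCMEC.
/// In Apple's published design this ships on-device in a blinded form.
func loadKnownHashes() -> Set<String> {
    return []
}

/// Stand-in perceptual hash. A real system uses a perceptual hash
/// (Apple's is called NeuralHash) rather than a cryptographic digest,
/// so the same image still matches after re-encoding or resizing.
func perceptualHash(of imageData: Data) -> String {
    return imageData.base64EncodedString()
}

/// Only images being uploaded to iCloud Photos are checked, and nothing
/// is flagged for human review until roughly 30 of them match known hashes.
func shouldFlagForReview(uploads: [Data],
                         knownHashes: Set<String>,
                         threshold: Int = 30) -> Bool {
    let matchCount = uploads
        .map { perceptualHash(of: $0) }
        .filter { knownHashes.contains($0) }
        .count
    return matchCount >= threshold
}
```

As we understand Apple’s description, even this counting does not happen in the clear: the device cannot tell which images matched, and Apple’s servers learn nothing until the 30-match threshold has been crossed.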

The other point to make is that this content is illegal. Period. Creating it is illegal. Period. Storing it is illegal. Period. Sharing it is illegal. Period. Looking at it is illegal. Period. Other than for law enforcement purposes there are no exceptions. Period. There are no privacy grounds whatsoever that provide a safe harbour or carve out. None. What. So. Ever. Children’s rights to not be abused and raped far exceed your rights to use a private service operated by a private, limited, for-profit technology company, especially for creating, storing and sharing this illegal content.

Apple’s intention seems to be to remove this content from their iCloud service, and you’ve already agreed to let them do it. Here are the relevant parts of the (UK) iCloud User Agreement with emphasis added to certain parts and our comments in italics:

Content

“Content” means any information that may be generated or encountered through use of the Service, such as data files, device characteristics, written text, software, music, graphics, photographs, images, sounds, videos, messages and any other like materials. 

SafeToNet Foundation comment: basically, any digital content that can be uploaded to or shared on iCloud.

You understand that all Content, whether publicly posted or privately transmitted on the Service is the sole responsibility of the person from whom such Content originated. This means that you, and not Apple, are solely responsible for any Content you upload, download, post, email, transmit, store or otherwise make available through your use of the Service. 

SafeToNet Foundation comment: The onus is on you.

You understand that by using the Service you may encounter Content that you may find offensive, indecent, or objectionable, and that you may expose others to Content that they may find objectionable. Apple does not control the Content posted via the Service, nor does it guarantee the accuracy, integrity or quality of such Content. You understand and agree that your use of the Service and any Content is solely at your own risk.

SafeToNet Foundation comment: If you do find something objectionable, report it to NCMEC or to your own country’s reporting hotline, which in the UK is the IWF. And don’t be surprised if someone reports what you’ve posted if it is illegal.

Your Conduct

You agree that you will NOT use the Service to:

  1. upload, download, post, email, transmit, store or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy, hateful, racially or ethnically offensive, or otherwise objectionable;

SafeToNet Foundation comment: Child Sexual Abuse Material falls across a broad range of Apple’s forbidden categories, but until recently Apple, like others with similar contractual clauses, hadn’t done much about enforcement.

  2. stalk, harass, threaten or harm another;

SafeToNet Foundation comment: Involving children in the abuse needed to produce this content is harmful to them. Threats are one technique often used to make children participate.

  3. if you are an adult, request personal or other information from a minor (any person under the age of 18 or such other age as local law defines as a minor) who is not personally known to you, including but not limited to any of the following: full name or last name, home address, zip/postal code, telephone number, picture, or the names of the minor’s school, church, athletic team or friends;

SafeToNet Foundation comment: Predatory pedophiles will often solicit this kind of information to track down victims. According to US Homeland Security Investigations, the three main roles used in the practice of crowdsourcing CSAM generation are:

  • Hunters – those that find child victims and bring them to a communications platform
  • Talkers – those that convince child victims to engage in sexual activity
  • Loopers – those who play a pre-recorded video of another child victim to normalise the requested sexual activity

  4. pretend to be anyone, or any entity, you are not — you may not impersonate or misrepresent yourself as another person (including celebrities), entity, another iCloud user, an Apple employee, or a civic or government leader, or otherwise misrepresent your affiliation with a person or entity (Apple reserves the right to reject or block any Apple ID or email address which could be deemed to be an impersonation or misrepresentation of your identity, or a misappropriation of another person’s name or identity);

SafeToNet Foundation comment: Catfishing is an example of this.

  5. engage in any copyright infringement or other intellectual property infringement (including uploading any content to which you do not have the right to upload), or disclose any trade secret or confidential information in violation of a confidentiality, employment, or nondisclosure agreement;
  6. post, send, transmit or otherwise make available any unsolicited or unauthorized email messages, advertising, promotional materials, junk mail, spam, or chain letters, including, without limitation, bulk commercial advertising and informational announcements;
  7. forge any TCP-IP packet header or any part of the header information in an email or a news group posting, or otherwise putting information in a header designed to mislead recipients as to the origin of any Content transmitted through the Service (“spoofing”);
  8. upload, post, email, transmit, store or otherwise make available any material that contains viruses or any other computer code, files or programs designed to harm, interfere or limit the normal operation of the Service (or any part thereof), or any other computer software or hardware;
  9. interfere with or disrupt the Service (including accessing the Service through any automated means, like scripts or web crawlers), or any servers or networks connected to the Service, or any policies, requirements or regulations of networks connected to the Service (including any unauthorized access to, use or monitoring of data or traffic thereon);
  10. plan or engage in any illegal activity; and/or

SafeToNet Foundation comment: As previously mentioned, the production, storage and sharing of CSAM is illegal. Period.

  11. gather and store personal information on any other users of the Service to be used in connection with any of the foregoing prohibited activities.

Removal of Content

You acknowledge that Apple is not responsible or liable in any way for any Content provided by others and has no duty to screen such Content. However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.

SafeToNet Foundation comment: CSAM storage and sharing is illegal. Apple don’t want it on their platforms; they have the right under this user agreement to take steps to identify it, and once they have found it they have a legal duty to report it to law enforcement.

Access to Your Account and Content

Apple reserves the right to take steps Apple believes are reasonably necessary or appropriate to enforce and/or verify compliance with any part of this Agreement.

SafeToNet Foundation comment: Apple believes their CSAM Detection system to be reasonably necessary, within the context of their overarching privacy safeguards. iCloud is their universe: it’s their software and their servers. They have responsibility for what’s on this service and it’s entirely up to them what the rules are, as long as those rules are legal. There’s no constitutional obligation on anyone to use either an Apple product or an Apple service.

You acknowledge and agree that Apple may, without liability to you, access, use, preserve and/or disclose your Account information and Content to law enforcement authorities, government officials, and/or a third party, as Apple believes is reasonably necessary or appropriate, if legally required to do so or if Apple has a good faith belief that such access, use, disclosure, or preservation is reasonably necessary to: (a) comply with legal process or request; (b) enforce this Agreement, including investigation of any potential violation thereof; (c) detect, prevent or otherwise address security, fraud or technical issues; or (d) protect the rights, property or safety of Apple, its users, a third party, or the public as required or permitted by law.

SafeToNet Foundation comment: You agree that Apple can use their CSAM detection tools, in the way they describe, on your use of their iCloud service. If you are not using the iCloud service to store or distribute this illegal content, then this won’t apply to you, even if such content is actually on your iPhone.

Apple’s CSAM detection service will provide information about the matched images to law enforcement. As far as we currently understand it, Apple will also shut down the offending account and report the user.

We do question this strategy, however: by shutting down the account before law enforcement has had time to act, Apple is forewarning an offender that an investigation is underway, and would seem to be acting as a law enforcement agency itself.

Termination by Apple

Apple may at any time, under certain circumstances and without prior notice, immediately terminate or suspend all or a portion of your Account and/or access to the Service. Cause for such termination shall include: (a) violations of this Agreement or any other policies or guidelines that are referenced herein and/or posted on the Service; (b) a request by you to cancel or terminate your Account; (c) a request and/or order from law enforcement, a judicial body, or other government agency; (d) where provision of the Service to you is or may become unlawful; (e) unexpected technical or security issues or problems; (f) your participation in fraudulent or illegal activities; or (g) failure to pay any fees owed by you in relation to the Service, provided that in the case of non-material breach, Apple will be permitted to terminate only after giving you 30 days’ notice and only if you have not cured the breach within such 30-day period. Any such termination or suspension shall be made by Apple in its sole discretion and Apple will not be responsible to you or any third party for any damages that may result or arise out of such termination or suspension of your Account and/or access to the Service.

SafeToNet Foundation comment: Apple’s description of their CSAM detection process says that once human moderators have confirmed hash, or digital fingerprint, matches between 30 images in your iCloud account and the NCMEC database, they will shut your account down. If you lose your collection of illegal content as a result of having your account shut down, you have no redress against Apple.

As we noted in an earlier blogpost on this, the efficacy of this CSAM detection system depends on Apple’s motives. Are they motivated by displacement or eradication? It seems the motive is, at best, to expunge CSAM from their iCloud ecosystem, as opposed to also eradicating CSAM from the apps that use the Apple ecosystem, from social media and from the internet in general. In that case, assuming this solution is effective, purveyors and pedlars of perverted pedophilic pictures will simply move elsewhere, such as to WhatsApp.
