Apple has confirmed that it already scans iCloud Mail for CSAM, and has been doing so since 2019. However, it does not scan iCloud Photos or iCloud backups.
This clarification came after reporters questioned an Apple anti-fraud executive's rather bizarre claim that Apple is "the biggest platform for child porn distribution." That claim immediately raised the question: if the company didn't scan iCloud Photos, how could it know that?
What did it find out?
Apple has confirmed that since 2019 it has been scanning outgoing and incoming iCloud Mail for CSAM attachments. iCloud Mail is not encrypted, so scanning attachments as messages pass through Apple's servers is trivial.
Apple also indicated that it conducts limited scanning of other data, but did not specify what kind, beyond describing it as "small scale." The company did say that this "other data" does not include iCloud backups.
While these statements sound convincing, there is reason to suspect they do not tell the whole story. As far as is known, the total number of CSAM reports Apple files each year is measured in the hundreds, which means email scanning alone says little about the true extent of the problem on Apple's servers.
What's the reason for the excitement?
The explanation probably lies in the fact that other cloud services were already scanning photos for CSAM and Apple wasn't. If other services disabled accounts for uploading CSAM while iCloud Photos did not (because the company did not scan there), the logical conclusion would be that more CSAM accumulates on Apple's platform than anywhere else.
Apple's scanning initiative has inexplicably met with enormous resistance, including from various governments. This is all the more surprising given that virtually every major photo hosting service, from Dropbox to Facebook and Google, has been doing similar scanning for years.
Illustration: Stephen Phillips