News from Apple earlier in August interrupted Washington, D.C.'s quiet cyber policy summer. To stem the flood of child sexual abuse material (CSAM) spreading across online networks, Apple announced a new client-side scanning (CSS) system. The move may be a response to criticism of Apple's device encryption standards, which have long frustrated law enforcement.
Of course, the goal is admirable. No decent actor wishes to see CSAM spread. However, Apple's preferred method, granting itself the ability to scan content uploaded from a user's iPhone without the user's consent, raises significant legal and policy concerns. Apple's leadership evidently disagrees, as the company has moved ahead with a CSS system. The effort serves as a valuable real-world case study of CSS implementation and its obstacles.
Apple revealed the new program in a series of public statements, including a synopsis on its website. The statements expressly framed the new technology as a child-safety measure, tying it to efforts to stop the spread of CSAM. First and foremost, it is critical to recognize that Apple announced three distinct new technologies, two of which have no direct bearing on CSS: new tools in the iMessage app that provide more robust parental controls, and a feature allowing Siri to intervene and warn when CSAM material may be accessed. Both are unrelated to the CSS debate, and this discussion sets them aside.
The third effort raises concerns. Here’s how the company describes it:
Apple will report these incidents to the National Center for Missing and Exploited Children (NCMEC). Notably, any positive match is forwarded to Apple without first alerting the user that a match has occurred. In a public defense of the technology, Apple stated that, as a matter of policy, the threshold would be roughly 30 photos on a given phone matching known CSAM before an alert is triggered.
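The threshold logic described above can be illustrated with a short sketch. This is a hypothetical simplification, not Apple's actual NeuralHash and private set intersection protocol: the hash function, the `REPORT_THRESHOLD` constant, and all function names here are assumptions for illustration, and a cryptographic hash stands in for the perceptual hash a real system would use.

```python
import hashlib

# Apple's stated policy threshold: roughly 30 matches before an alert
# is triggered. The exact mechanism here is a simplification.
REPORT_THRESHOLD = 30

def hash_image(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real CSS system would use a
    # hash robust to resizing and re-encoding, not SHA-256, which
    # changes completely if a single byte differs.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photos, known_hashes) -> int:
    # Count how many uploaded photos match the known-CSAM hash set.
    return sum(1 for p in photos if hash_image(p) in known_hashes)

def should_alert(photos, known_hashes) -> bool:
    # An alert is raised only once the match count crosses the
    # threshold, so isolated false positives do not trigger review.
    return count_matches(photos, known_hashes) >= REPORT_THRESHOLD
```

The threshold is the key design choice: it trades off detection speed against the risk that a handful of hash collisions or mislabeled database entries flags an innocent user.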
The Pros and Cons of Apple’s Approach
Privacy groups quickly raised concerns about the implementation. While many supported the overarching purpose, many also perceived considerable privacy hazards. NCMEC's executive director dismissed these complaints as the "screeching voices of the minority." Meanwhile, several security experts argued that the limited nature of the scanning in question would pose few significant privacy risks.
So, what comes next? Some observers may understandably fear that Apple’s move is just the beginning. So far, though, there appears to be opposition. WhatsApp, for example, has stated that it will not follow in Apple’s footsteps. It remains to be seen if WhatsApp and others can sustain that position in the face of the inevitable political winds.
Apple's new technology leaves some of the most challenging policy and execution questions unaddressed. Much of the system's merit rests on how far one trusts Apple to implement the plan as stated while resisting mission creep and other political pressures. That level of trust is highly situation-dependent and varies widely from person to person.