What Does EU Law Say About Apple’s Plan to Have iPhones Detect Child Abuse Photos?

Apple recently announced plans to use software to detect and report iCloud users who store “known” child abuse images. In the technical summary it published, Apple says the following:

CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). This process is secure, and is expressly designed to preserve user privacy.

Apple’s system does not perform any scan locally (nor indeed server-side) but relies instead on a system of hashes, which are unique numbers associated with images. Apple’s CSAM software relies on a series of hashes obtained from the US National Center for Missing and Exploited Children (NCMEC) and compares them with hashes generated from the user’s iCloud photos. If the hashes match, and if the threshold number (currently set at 30) is met, the images are decrypted for individual analysis and, if suspicious, the account is disabled and reported to NCMEC for further processing. Apple does not itself engage in manually comparing photos, nor does it hold the database of incriminating photos on its servers. The technology only applies to its iCloud service, not to locally stored photos.
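For readers unfamiliar with hash matching, the comparison-and-threshold step can be illustrated with a deliberately simplified sketch. This is not Apple’s implementation: Apple’s published design relies on a perceptual hash and cryptographic matching rather than plain file hashes, and the database, function names and constants below are illustrative assumptions only.

```python
import hashlib
from pathlib import Path

# Illustrative stand-ins: Apple's actual system uses a perceptual hash and an
# NCMEC-supplied database, neither of which is public, so SHA-256 and an
# empty set are used here purely for demonstration.
KNOWN_CSAM_HASHES: set[str] = set()   # would hold hashes of known images
MATCH_THRESHOLD = 30                  # the threshold figure cited by Apple

def image_hash(path: Path) -> str:
    """Hash a photo's raw bytes (a byte-exact stand-in for perceptual hashing)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def account_exceeds_threshold(photos: list[Path]) -> bool:
    """Count photos whose hash appears in the database; flag once the threshold is met."""
    matches = sum(1 for p in photos if image_hash(p) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```

Only an account that crosses the threshold would then be subject to individual review and, where appropriate, reported, as the technical summary describes.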

Apple only intends to roll the system out in the United States, at least initially. Perhaps unsurprisingly, the announcement met with instantaneous and fierce criticism. The gist of that criticism centres on Apple’s previous promises to protect privacy and provide strong encryption, now potentially broken, and on the dangers of uncontrolled surveillance. The purpose of this short entry is neither to analyse Apple’s technology in detail, nor to assess its effectiveness in fighting child pornography (which will, presumably, be negligible, as child pornography is mainly not distributed through the iCloud service), but to examine the extent of the legal limitations which EU law might impose on surveillance of this type.

Several methodological points are of interest. As a preliminary matter, since the surveillance takes place both on a telecommunications/electronic communications device (a telephone with a data plan) and on an e-commerce service (iCloud), both the telecoms and the e-commerce regulatory frameworks are relevant, at least in theory. Beyond that, different legal disciplines cover the issue. First, a contractual relationship (between Apple and the iCloud user) governs the process, and it is possible for Apple to insert the relevant provisions in the user contract itself, simply eliciting consent for any surveillance performed. Second, various platform laws, both existing ones (such as the E-Commerce Directive) and proposed ones (such as the Digital Services Act), govern what platforms may or may not do. Third, cybercrime laws, including laws directly dealing with child pornography, cover various aspects of both child pornography production and distribution. Finally, data protection laws cover issues of lawful data collection and processing, but also data retention. While an analysis of each of the above would be appropriate, it is only the last that I will concentrate on. This is primarily because the EU has already classified the problem as falling within the scope of privacy laws (see below), but also because surveillance is essentially a derogation from a situation in which data collection would otherwise be illegal.

The telecommunications regulatory framework covers the privacy of communications in the 2002 ePrivacy Directive, revised in 2009 and 2016 (consolidated version here). This instrument is a lex specialis in relation to both the General Data Protection Regulation (GDPR) and the European Electronic Communications Code (EECC). The question which needs to be posed here is whether the provision of iCloud services is (also) a telecommunications service. Article 2(4) defines electronic communications services as services “normally provided for remuneration via electronic communications networks”, excluding services under editorial control but including “interpersonal communications services”. In the Gmail case (C-193/18), decided under the old telecoms framework but relevant here, the CJEU decided that Gmail is “a web-based email service which does not itself provide internet access […], does not consist wholly or mainly in the conveyance of signals on electronic communications networks and therefore does not constitute an ‘electronic communications service’”. By analogy, iCloud services are not subject to the general telecommunications framework. As an exception, they may be subject to those provisions of the ePrivacy Directive which cover terminal equipment. Article 5(1) thus prohibits “listening, tapping, storage or other kinds of interception or surveillance”, except with the user’s consent or where authorised under Article 15 of the Directive (which includes Member States’ laws on public policy interests such as crime prevention, detection and prosecution). Also relevant is Article 5(3) of the ePrivacy Directive, which provides that “storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information, in accordance with Directive 95/46/EC, inter alia, about the purposes of the processing.” This means little more than that specific consent to the CSAM surveillance needs to be obtained and its purposes explained, creating few barriers to hash-based surveillance. Finally, Article 6 of the Directive requires that traffic data be erased or anonymised when no longer needed. The latter does not in itself prevent monitoring for illegal content, which in any case takes place at the point of transmission to the servers.

Specific measures aligned with the EECC were proposed by the Commission on 10 September 2020, in the form of a Proposal (COM(2020) 568 final) on a temporary derogation from Articles 5(1) and 6 of the ePrivacy Directive, which protect the confidentiality of communications and traffic data. The gist of this proposal is that Articles 5(1) and 6 of the ePrivacy Directive shall not apply to number-independent interpersonal communications services as defined in the EECC where data is collected “for the sole purpose of removing child sexual abuse material and detecting or reporting child sexual abuse online to law enforcement authorities and to organisations acting in the public interest against child sexual abuse”. A number of conditions are imposed, including that the processing be proportionate, the technology reliable, the processing limited to key indicators and to the detection, reporting and removal of child abuse online, and that the provider publish an annual report. The Proposal insists that derogations from the ePrivacy Directive may only be made for a limited time and:

  • to proportionate requests by law enforcement and other relevant public authorities;
  • for the blocking of the concerned user’s account;
  • in relation to data reliably identified as child pornography, for the creation of a unique, non-reconvertible digital signature (‘hash’).

The Proposal specifically mentions “hashes”, which is the very technology Apple uses. It is worth noting that iCloud would probably not be classified as a “number-independent interpersonal communications service” even if and when the Proposal is adopted. If adopted, the Regulation would cover telecoms devices and services, leaving ordinary information society services outside its scope and under the general GDPR/ePrivacy regime.
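To make the terminology concrete, the “unique, non-reconvertible digital signature” the Proposal refers to is a short, fixed-length digest computed from the content. The sketch below uses SHA-256 purely as an assumed example algorithm (the Proposal does not prescribe one): identical content always produces the same digest, yet the content cannot be reconstructed from it, which is what makes hash comparison less intrusive than analysing the communications themselves.

```python
import hashlib

# A hash is a short, fixed-length digest of the content: identical bytes
# always produce the same digest, but the original bytes cannot be
# recovered from it ("non-reconvertible").
photo_bytes = b"raw image data would go here"
digest = hashlib.sha256(photo_bytes).hexdigest()

print(digest)                                             # 64 hex characters, whatever the input size
print(hashlib.sha256(photo_bytes).hexdigest() == digest)  # True: the same input always matches
```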

In the study on the Proposal’s potential impacts, the European Parliament emphasises that, while the EU has the competence to adopt the proposal, “the impact of such practices on human and fundamental rights has not been adequately addressed”. This is broadly in line with the criticism which Apple has received in the United States. It was further observed that the Proposal does not adequately address remedies and that some technologies have a disproportionate impact and require additional safeguards. In particular, the Parliament noted that, while cryptographic hashes may be proportionate in terms of Article 52(1) of the Charter of Fundamental Rights, technologies that analyse original communications data (such as text messages) are not and would require additional safeguards. The Parliament further considers that only those practices that satisfy GDPR Articles 6(1)(d) and (f) (see below) can be considered for an exception. In other words, for an exception under the ePrivacy regime to kick in, the processing has to be lawful under at least one of these two bases. Finally, the Parliament calls for extra safeguards such as protection against transfers to third countries, prior authorisation from data protection authorities and better internal review mechanisms. In its own final version, as yet not agreed in the Council, the Parliament proposes ways to address these problems.

The GDPR itself imposes no extra requirements in this situation beyond those governing the conditions under which data can be collected. It seems that these are capable, without further additions, of legitimising certain cases of surveillance in the interests of protecting children. Article 6(1)(e) establishes that processing may take place where it “is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”. Possibly more relevant are Articles 6(1)(d) and (f), the former covering the protection of the vital interests of the data subject and the latter covering “legitimate interests pursued by the controller or by a third party”. Article 9(2)(g) further establishes that processing of special categories of data is allowed “for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject”. It is difficult to argue that the protection of minors from sexual abuse is not a reason of substantial public interest. It seems, however, that rather than claiming that the GDPR “allows” these types of surveillance, it would be more appropriate to say that it never considered them properly and that creative interpretation of the existing provisions is needed. It is also important to note that the Data Protection Law Enforcement Directive, a sister instrument to the GDPR, contains a special regime covering, among others, “persons with regard to whom there are serious grounds for believing that they have committed or are about to commit a criminal offence”. Data collection in such cases is not only specially regulated but also somewhat different in substance from the GDPR regime.

Two conclusions can be derived from the above. The first is that the EU legal regime on data protection probably already allows a targeted attempt to filter child pornography on users’ devices of the type Apple proposes to engage in. While some aspects of such collection remain unclear (safeguards, oversight, remedies, etc.), it is not per se illegal to perform it. The situation might be conceptually different in the case of server-side surveillance (in that the ePrivacy provisions on terminal equipment probably do not apply, or only do so to a limited extent), but here too the GDPR opens up the possibility of lawful filtering.

The second conclusion is that the Commission is aware of the importance of the issue and has proposed an instrument that specifically enables surveillance for this purpose. As seen from the Parliament’s comments, this instrument is far from ideal and suffers from various problems, the most prominent of which is the lack of appropriate safeguards. Even if adopted, this instrument would probably not cover the situation currently analysed, as cloud services and platforms are almost certainly not telecoms services. To include them would require extending the instrument to content-side services, which goes beyond its scope. The contrary would make for a smoother adoption but is likely to generate CJEU case law treating platforms as ordinary intermediaries.

I do not believe that Apple’s proposal will substantially contribute to fighting child pornography. This is for the simple reason that the production and distribution of child pornography is a criminal enterprise which requires considerable technological resources suited to an illicit covert operation of that kind. iCloud is not an appropriate tool for that, and Apple’s efforts are not likely to yield significant results even if and when the surveillance is extended to more states (and are probably not meant to). A more lasting consequence of this operation may be the fear that device-side surveillance is spreading and that other purposes, both legitimate and less so, might emerge: the political control of dissidents, the monitoring of compliance with financial or health regulation, and so on. This is a legitimate concern. The EU legal framework is complicated and leaves a multitude of loopholes, making such acts potentially lawful, at least in certain cases.

Until now, it has been clear that contractual terms and conditions and data protection policies regulate server-side surveillance and content control. It has long been possible for a platform to state in its terms that it will monitor for illegal content. The “Good Samaritan” provisions, present in the United States and now proposed in the EU’s Digital Services Act, would not make an active platform liable in cases where it intervenes to remove illegal content. The present situation is different in that it technologically changes the process, moving it device-side while maintaining some aspects of privacy. In my opinion, this may be useful, but it presently raises too many concerns. We may trust Apple with our data, but would we trust another firm? Would we trust a sub-contractor we are not even aware is handling our data? How good an overview of the process do we possess? These are only some of the questions. My feeling is that the EU’s approach of having sector-specific legislation is probably the right one. The Parliament adopted a final version of the proposed instrument in July 2021, adding safeguards and remedies as well as a public list of organisations acting in the interests of child safety, thus opening the door to a better law. This almost certainly removes the main objections in cases of child pornography screening but does not resolve the tension created by the technology itself. For that, either a revision of the GDPR or another sector-specific law would almost certainly be needed.
