Should search engines implement de-listing requests globally? And do they have to remove sensitive data as a matter of course?

The CJEU has considered two further right to be forgotten cases. The first concerns the territorial scope of the right to be forgotten. Here, the CJEU concluded that de-listing requests should be implemented across the EU, not just in the member state of the relevant data subject.

It also determined that there is no general requirement under the Directive or the GDPR for a search engine to apply de-listing globally. However, the CJEU specifically upheld the right of a supervisory authority or member state court to require global de-listing in a particular case, if this would be required under national standards in that member state.

The second case considers how search engines should deal with de-listing requests involving special category data (for example, information about religious belief) and information relating to criminal offences and convictions. The CJEU confirmed that reports of court proceedings and investigations fall into this category, even if there is no subsequent conviction. The CJEU noted that, while the interests of data subjects would ordinarily outweigh the interests of internet users in accessing information, search engines do have an obligation to consider the interests of freedom of information. In specific cases, these interests may justify a refusal to de-list. If a search engine considers that de-listing is not appropriate, it must nevertheless ensure that search results about criminal offences and convictions reflect the current legal position, with current information appearing first.

Global de-listing?

The first case, Google v CNIL (C-507/17), considered whether search engines have to implement de-listing requests globally.

The case was triggered by an order of the French supervisory authority, the CNIL, in May 2015. Google had applied a de-listing request to all EU domain name extensions. It was also willing to block searches for the data subject's name made from a French IP address. However, the CNIL required Google to apply the de-listing to all of its domain name extensions, so applying the de-listing globally. Google refused to comply and the CNIL imposed a fine of €100,000 on Google. Google appealed to the Conseil d’Etat to annul the decision, which, in turn, referred questions to the CJEU on the territorial scope of de-listing requests. Although the case was triggered by a decision of the CNIL before the GDPR became applicable, the Court noted that the right to be forgotten is specifically written into the GDPR at Article 17. It also considered the relevant provisions of the GDPR in some detail.

The CJEU noted that search results which are accessible outside the EU could have a significant impact on individuals in the EU [57] and that this would have justified the EU legislature in requiring global de-listing [58]. However, the CJEU concluded that, based on the text of the GDPR, the EU legislature did not intend to require global de-listing [62]. The CJEU also noted that many third states take different approaches to de-listing and that the balance to be struck between privacy and freedom of expression varies around the world [60]. Accordingly, a search engine operator does not have to implement a de-listing request globally and cannot be required to do so under the GDPR [64, 65].

The CJEU noted that the objective of the GDPR is to ensure a consistent and high standard of data protection within the EU, which means that de-referencing should be carried out across all Member States, not just in the Member State where the data subject is based [66]. This is not a straightforward task, as the balance between protection of personal data and freedom of expression is not prescribed in the GDPR itself; instead, the GDPR requires each Member State to provide the derogations necessary to achieve this. Accordingly, the rules vary between Member States. The CJEU acknowledged this and noted that, for cross-border processing, the cooperation and consistency mechanisms set out in the GDPR (i.e. the so-called ‘one stop shop’ provisions) should be used in order to reach a consensus decision which would be binding on all supervisory authorities [68]. Curiously, the court did not reference recital 153, which says that, in this situation, the law of the Member State to which the controller is subject should apply. The CJEU did note that the measures deployed must be sufficiently effective to prevent, or at the very least “seriously discourage”, internet users in Member States from accessing the contested data by means of a search on the individual’s name [70], so effectively mandating Google’s proposed geo-filtering technique.

The CJEU went on to note that, whilst the GDPR does not require global de-listing, neither does it prohibit it. This means that a supervisory authority, or a judicial authority in a member state, remains competent to determine that, in the light of national standards, global de-listing is required [72].

Special category data and search engines

The second case, GC, AF, BH and ED v CNIL (Case C-136/17), looks at the obligations of search engines when processing special category data.

GC, AF, BH and ED had all made de-listing requests to Google. Google had refused to de-list; the individuals complained to the CNIL, but the CNIL declined to require Google to de-list, leading to court proceedings by the individuals against the CNIL. The search results for these individuals related to an article about an affair between a public official and a mayor, an article about membership of the Church of Scientology, an article about the commencement of criminal proceedings (which were later closed with the individual being discharged) and an article reporting the sentencing of an individual for sexual assaults on children.

Search engines rely on their, and their users', legitimate interests to provide a lawful basis for processing personal data when they return search results. This works for 'ordinary' personal data, but the legitimate interests condition is not available for special category data, or for data about criminal offences and convictions. It has been difficult to see which condition search engines could rely on to justify processing such data, either to justify returning search results containing such data in the first place, or to justify refusing a de-listing request. The CJEU was asked to consider this. The de-listing requests were all made under the 1995 Data Protection Directive but, as in the Google v CNIL case above, the CJEU also considered the equivalent provisions under the GDPR.

Google's approach to this difficulty was to argue that it should be exempted from compliance with the restrictions on processing such data. The CJEU rejected this. However, it did accept that the obligation to comply with the relevant restrictions only applied from the point at which a supervisory authority was asked to verify Google's response to a de-listing request [47].

The CJEU also accepted that, in principle, certain of the conditions for processing special category data, set out in Art.9 of the GDPR, could apply to Google. The two considered were explicit consent and processing of data manifestly made public by the data subject. Explicit consent was quickly rejected: it would strain credibility to argue that individuals had consented to such processing and, even if they had, such consent could be revoked. However, the CJEU accepted that Art.8(2)(e) of the Directive (the equivalent of Art.9(2)(e) of the GDPR, covering processing of data manifestly made public by the individual) could apply to a search engine as well as to the original publisher of the article [63], although this would not stop Google having to consider de-listing requests if the individual could show compelling legitimate grounds relating to his or her particular situation to object to the processing [65 & 69].

When a search engine receives a de-listing request relating to special category data displayed following a search on the data subject's name, the CJEU noted that the search engine must also consider the right of freedom of information, that is, the interests of internet users in accessing the information. Ordinarily, the rights of the data subject will override the rights of internet users to access information, but the balance will depend on the specific case, taking into account the sensitivity of the information for the data subject, the interest of the public in accessing the information and whether the data subject plays a role in public life, as set out in the original Google Spain case (C-131/12) [66 & 69].

The CJEU concluded that articles discussing (criminal) legal proceedings, judicial investigations and any subsequent convictions amount to data relating to offences and criminal convictions, and so are protected by Art.10 of the GDPR. This is the case irrespective of whether the individual was ultimately convicted of the offence [72]. A search engine's processing of such data would, therefore, need to comply with Member State provisions relating to processing of such data; the CJEU noted that this may be the case where the information had been made public by public authorities in accordance with applicable national law [73].

The freedom of information arguments applicable to the processing of special category data are also relevant to search engines in this context [75], and the Court referred to the case law of the European Court of Human Rights, which balances the interests of the individuals concerned with the interests of the press in reporting legal proceedings and the interests of the public in accessing information and being able to conduct research into past events, including criminal proceedings [76]. The CJEU suggested that the relevant criteria to consider in this balancing exercise are: the nature and seriousness of the offence; the progress and outcome of the proceedings; the time elapsed; the role of the data subject in public life; their past conduct; the public's interest at the time of the request; the content and form of the publication; and the consequences of publication for the individual [77]. The CJEU specifically noted that this approach is relevant to de-listing requests relating to articles concerning earlier stages of proceedings [77]; it did not address requests to de-list topical reports, where the interest in availability would be much higher such that, presumably, de-listing would not be appropriate.

The CJEU also noted that processing of the information may, over time, cease to be lawful, in particular if it appears irrelevant or excessive given the passage of time [74]. This means that, where a search engine receives a de-listing request and concludes that de-listing is not appropriate, it must still adjust the results so that the overall picture reflects the current legal position, with the current information appearing first.

