Bad pictures, dire comments, past scandals. The internet never seems to forget. How convenient would it be to suppress any record of past events that might adversely affect our honour and dignity, or simply expose our private life in an undesirable way?
According to the recent ruling of the ECJ, anyone wishing to have personal information deleted from a search engine's index will have to request it from the search engine service provider.
This decision is undoubtedly good news for those who would like to suppress information about themselves from the internet, but it is likely to cause problems in practice.
It is unquestionable that people have a right to privacy, and that this privacy extends to information about them. They therefore have the right to control their personal data and can ask for its deletion when they no longer have an interest in its processing or storage and there is no legitimate interest in its being kept by a data controller.
According to this decision, search engines have to deal with individual complaints, or else they can be challenged before the courts or a supervisory authority.
In order to implement the decision, Google had to devise a mechanism (an online form available on its website) to receive requests for link removal, and it was inundated with requests for personal data deletion from nationals of the 28 EU member states. This represents a major additional administrative burden for Google, and it will affect all other web intermediaries as well, since the repercussions of the ruling extend well beyond Google.
Search engine service providers will then decide on the balance of the rights at stake, weighing up whether it is in the public interest for the information to remain.
Does it make any sense to establish this principle when there is no evident and easy answer as to what privacy means nowadays, or where we draw the line between what belongs exclusively to the private sphere and what belongs to the public domain?
It is a fact that Google has been removing links to copyright-infringing content on demand from copyright holders. In those cases, however, there is no individual assessment and the process is largely automated, which is a very different scenario from what the ruling entails. Each request would have to be considered individually and demands an appraisal that a search engine is not equipped to make. Complying with this ruling could therefore prove very costly, because such assessments cannot be made by algorithms.
As much as I may agree with the motivation behind the ruling, it simply does not make sense. The ECJ passes the responsibility for striking the right balance over to private entities, businesses whose primary concern is profit, even though they have neither the expertise nor the legitimacy to act as a legal authority. As a general principle, the deletion of websites, or of search links, should be decided by a legitimate entity vested with public authority, ultimately a court.
In any case, pertinent questions arise regarding the interpretation of the rules set out in the judgment. What exactly is a public figure? What qualifies as public interest? How long must pass before personal data is no longer relevant? How can a search engine determine whether data is inaccurate, or whether its processing is excessive or disproportionate? How will the rights of the publisher be safeguarded in the internal process of a private company?
In the absence of any rigorous criteria for balancing the rights of the data subject against the search engine's economic interests and the public interest, search engine service providers lack a precise test to apply when assessing data subjects' requests to remove links to websites containing their personal data.
The advisory committee set up by Google might not be the most adequate solution. The agreement reached by national data protection authorities to form a subcommittee to establish uniform handling of such requests is much more welcome.
In order to avoid any risk of liability for breaching data protection law or for unlawful processing of data, search engine service providers might prudently remove links, regardless of any public interest in their disclosure, rather than weigh the balance of rights in every request. And who could blame them?
The most evident and worrying outcome is that our ability to find information about other individuals is substantially reduced.
But it leaves room for other discrepancies as well. Will links deleted from the index created for a specific country (e.g. one serving a particular national language) remain available in searches performed in other countries? Or will only search engines with no connection to the EU be able to serve the results?
We might be heading dangerously toward a tiered and fragmented internet, with search results in Europe being less complete than elsewhere. What about the free and open internet, then?
And what about managing the potential interests of third parties also concerned by the deleted information, who might likewise have personal data published but who, on the contrary, wish to be easily found online through the search engine's results?
The Advocate General has already warned (Paragraph 134 of the Opinion) that the suppression of legitimate public-domain information would amount to censorship. For the sake of full access to information and freedom of expression, it is imperative that Google remain a neutral platform. To that end, empowering a search engine service provider with censorship powers is far from a desirable result.
The main search engines operating in Europe have already met, in Brussels, with the Article 29 Working Party, which brings together data protection authorities from across the EU and intends to issue EU-wide guidelines in order to achieve a uniform implementation of the ruling.
Considering the possible scenarios foreseen, these guidelines are very welcome. However, it seems we are now trying to correct a system that was wrongly built from the start. Indeed, according to its press release, the DPAs asked the search engines, for instance, to explain their de-listing process and the criteria they apply when balancing their economic interest, the public interest in accessing information, and the rights of the person who wants the search results de-listed.
Google has now planned public hearings, in an avowed quest for transparency, in order to stimulate debate on the implementation of the ruling.
More recently, at the WP29 plenary meeting of 16 and 17 September, the European data protection authorities decided to put in place a network of contact persons to develop common criteria for handling the complaints presented to data protection authorities as a result of search engines' refusals to "de-list" links from their results.
Now that the confusion has begun, one may wonder where all this will end.