Tag: Right to be Forgotten

Practical difficulties of the GDPR – the ‘right to be forgotten’ applied to online social platforms

Of all the legal challenges that the GDPR will present for businesses in general, I would like to address in this post the issues raised by its implementation with regard to social network platforms, which are quite popular nowadays.

Article 17 of the GDPR establishes the ‘right to erasure’ or, as it has come to be referred to, the right to be forgotten. It provides data subjects with the right to require data controllers to erase the personal data they hold about them, and imposes on controllers the corresponding obligation to comply with such a request, without undue delay, when certain conditions are fulfilled.

Considering that infringing the ‘right to erasure’ may lead to the application of significant economic sanctions, there is a risk that social platforms will be tempted to adopt a preventive approach by complying with all deletion requests, regardless of their validity, thus erasing content on unfounded grounds. This is particularly worrisome because it may directly lead to the suppression of free speech online. Moreover, online businesses are not, and should not be deemed, competent to assess the legitimacy of such claims, a point that I have already tried to make here.

While it seems that a notice and take down mechanism is envisaged, without much detail being provided as to its practical enforceability, a particular issue in this context concerns the entities on which such an obligation falls. Indeed, implementation of the ‘right to be forgotten’ can only be required from those who qualify as data controllers.

As data controllers are defined as the entities that determine the purposes and means of the processing of personal data, it is not clear whether providers of online social platforms can be defined as such.

Considering the well-known Google Spain case, it is at least certain that search engines are deemed to be controllers in this regard. As you may certainly remember, the CJEU ruled that individuals, provided that certain prerequisites are met, have the right to require search engines, such as Google, to remove certain results about them that are displayed following a search based on the person’s name.

That said, it is questionable whether hosting platforms and online social networks focused on user-generated content, as is the case of Facebook, qualify as such, considering that the data processed depends on the actions of the users who upload the relevant information. On that view, the users themselves would qualify as controllers. The language of Recital 15 of the GDPR about social networking is inconclusive in this regard.

The abovementioned Recital provides as follows:

This Regulation should not apply to processing of personal data by a natural person in the course of a purely personal or household activity and thus without a connection with a professional or commercial activity. Personal and household activities could include correspondence and the holding of addresses, or social networking and on-line activity undertaken within the context of such personal and household activities. However, this Regulation should apply to controllers or processors which provide the means for processing personal data for such personal or household activities.

This is not an irrelevant issue, though. In practice, it amounts to enabling someone to require, and effectively compel, Twitter or Facebook to delete information about him or her even though it was provided by others.

And considering that any legal instrument is only as efficient in practice as it is capable of being enforced, the definition of who is covered by it and ought to comply with it is unquestionably a paramount element.

As I recall reading elsewhere – I fail to remember where, unfortunately – some have wondered whether the intermediary liability regime foreseen in the e-Commerce Directive would be an appropriate mechanism for the enforcement of the right to erasure/right to be forgotten.

Articles 12-14 of the e-Commerce Directive indeed exempt information society services from liability under specific circumstances, namely when they act as a ‘mere conduit’ of information, or engage in ‘caching’ (the automatic, intermediate and temporary storage of information), or when ‘hosting’ (i.e., storing information at the request of a recipient of the service).

Article 15 establishes that online intermediaries are under no general duty to monitor the information they transmit or store, or to actively seek facts indicating illegal activity on their websites.

Taking into account the general liability regime for online intermediaries foreseen in the e-Commerce Directive (Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market), a particular distinction will perhaps apply according to the level of ‘activity’ or ‘passivity’ of the platforms in the management of the content provided by their users.

However, this liability regime does not fully clarify the extent of the erasure obligation. Will it be proportionate to the degree of ‘activity’ or ‘passivity’ of the service provider with regard to the content?

Moreover, it is not clear how both regimes can be applied simultaneously. While the GDPR does not refer to any notice and take down mechanism and expressly states that it applies without prejudice to the liability rules of the e-Commerce Directive, the fact is that the GDPR imposes the ‘duty of erasure’ only on controllers. As the intermediary liability rules concern accountability for the activities of third parties, this is a tension that is not easy to overcome.

All things considered, the much-awaited GDPR has not entered into force yet, but I already cannot wait for the next chapters.

The ‘right to be forgotten’ extended to Google.com

Forgetting everywhere

As you might well remember, the Court of Justice of the European Union, in the judgement better known as the ‘right to be forgotten’ ruling, held that individuals, provided that certain prerequisites are met, have the right to require search engines, such as Google, to remove certain results about them that are displayed following a search based on the person’s name (you can read more here, here, here, here and here). According to the ruling, the original information will still be accessible under other search terms or by direct access to the publisher’s original website.

In this context, in cases where the criteria for deletion are met and search engines do not remove the links requested by the data subject, the latter may complain to his or her national data protection authority or judicial authority.

Against this background, last week the Article 29 Working Party, which gathers representatives of the 28 national data protection authorities (hereafter DPAs) of the EU Member States, adopted Guidelines on the implementation of the judgement. This has not really come as a surprise, as the Working Party had already announced its decision to establish a common approach to the right to be forgotten.

Indeed, the ruling left many questions unanswered. For instance, it remained to be seen whether Google was obliged to block requested names on European domains only, or should do so across all Google search domains. Moreover, it was left unanswered how the balance between the relevant rights and interests at stake could be achieved and how the concept of ‘public figure’ could be defined.

According to the Article 29 Working Party, the ruling only refers to search engines as “data controllers” and does not apply to the original source of information, so the information is not to be removed entirely, just from search indexes.

Furthermore, considering that users are directed to localised editions of the company’s search service when they initially try to visit the Google.com website, Google’s current practice consists in de-listing results which appear in the European versions of its search engine, but not in the international ‘.com’ version. In this regard, the Article 29 Working Party considers that Google has therefore failed to effectively implement the abovementioned ruling. It considers, indeed, that limiting de-listing to EU domains, on the grounds that users tend to access search engines via their national domains, does not sufficiently guarantee the rights of data subjects. In fact, it concluded that de-listing should be conducted on all relevant domains, including ‘.com’. This conclusion is certainly in line with a recent position of a French court, which decided that Google should remove a link to a defamatory article for a particular search on both its ‘.fr’ and ‘.com’ domains.

In addition, the document clarifies that Google is not required to block links if searches lead to the same result without using the individual’s name and that, although all individuals have a right to data protection under EU law, DPAs should focus on claims where there is a clear link between the data subject and the EU.
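The mechanics described here – suppression only for name-based queries, with the content still reachable through any other search terms – can be sketched in a few lines. The function and data structures below are purely illustrative assumptions, not any search engine’s actual implementation:

```python
# Illustrative sketch of name-based de-listing: a URL is hidden only when
# the query contains the data subject's name; it is never removed from the
# index itself, so other search terms still surface the same page.
def filter_results(query, results, delisted):
    """results: list of URLs; delisted: {person name: set of URLs to hide}."""
    q = query.lower()
    hidden = set()
    for name, urls in delisted.items():
        if name.lower() in q:  # the search is based on the person's name
            hidden |= urls
    return [url for url in results if url not in hidden]

index = ["example.org/article", "example.org/other"]
delisted = {"Jane Doe": {"example.org/article"}}

# A name-based query omits the de-listed link...
print(filter_results("Jane Doe debt", index, delisted))
# → ['example.org/other']

# ...but the same page remains reachable through other search terms.
print(filter_results("1998 property auction", index, delisted))
# → ['example.org/article', 'example.org/other']
```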

Moreover, referring to the notice posted by Google at the bottom of search results stating that “some results may have been removed under data protection law in Europe”, the Working Party deems that informing users that the list of results to their queries is not complete has no legal basis under data protection law, and is only acceptable if presented in such a way that users cannot conclude that the results relating to one particular individual have been de-listed.

Likewise, because there is no legal basis for such routine communication under EU data protection law, and in order to avoid a ‘Streisand effect’, the Working Party considers that search engines should not, as a general practice, inform the webmasters of the pages affected by removals that some web pages cannot be accessed from the search engine in response to a specific name-based query. However, it accepts that contacting the original publisher of the content targeted by a de-listing request may be appropriate when more information is required in order to take a decision.

Furthermore, the guidelines establish a list of 13 common criteria which the data protection authorities should apply when handling complaints following refusals of de-listing by search engines.

It is also stated that no single criterion is determinative and that, in most cases, more than one will have to be taken into consideration. However, each criterion has to be applied in the light of the interest of the general public in having access to the information.
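As a loose illustration of this kind of balancing – several non-determinative criteria aggregated and then weighed against the public’s interest in access – a sketch might look as follows. The criteria names and scores are hypothetical, not the Working Party’s actual 13 criteria:

```python
# Hypothetical sketch of multi-criteria balancing: no single criterion
# decides; an aggregate of the factors favouring de-listing is weighed
# against the public interest in having access to the information.
def assess_delisting(criteria, public_interest):
    """criteria: {criterion: score in favour of de-listing, each in [0, 1]};
    public_interest: score in favour of keeping the link, in [0, 1]."""
    if not criteria:
        return False
    in_favour = sum(criteria.values()) / len(criteria)
    return in_favour > public_interest  # de-list only if the balance tips

request = {
    "data_subject_is_minor": 1.0,
    "data_is_inaccurate": 0.6,
    "plays_role_in_public_life": 0.0,  # weighs against de-listing
}
print(assess_delisting(request, public_interest=0.4))  # → True
```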

The Working Party further concluded that the impact of de-listing on individuals’ rights to freedom of expression and access to information will prove, in practice, to be very limited, and that the balance between the public interest and the rights of the data subject will have to be assessed on a case-by-case basis.

The abovementioned list includes guidance on what can constitute ‘public life’, considering that, while details associated with the private life of a public figure may be de-listed, information regarding his or her public role or activities should remain available for search:

It is not possible to establish with certainty the type of role in public life an individual must have to justify public access to information about them via a search result. However, by way of illustration, politicians, senior public officials, business-people and members of the (regulated) professions can usually be considered to fulfil a role in public life. There is an argument in favour of the public being able to search for information relevant to their public roles and activities.

A good rule of thumb is to try to decide where the public having access to the particular information – made available through a search on the data subject’s name – would protect them against improper public or professional conduct. It is equally difficult to define the subgroup of ‘public figures’. In general, it can be said that public figures are individuals who, due to their functions/commitments, have a degree of media exposure.

In addition, search engines are called upon to be more transparent regarding their de-listing criteria. This is a legitimate concern, as Google has released only very limited and abstract information in this regard.

Despite the fact that these guidelines are not legally binding, they reflect a consensual position of national regulators and therefore will certainly influence the enforcement decisions taken by the Member States’ data protection authorities. Considering all the issues surrounding the implementation of the ruling, these guidelines are undoubtedly useful. Nonetheless, it remains to be seen whether Google will actually follow the guidance and extend de-listing to ‘.com’ as well.

In my personal opinion, it is quite outlandish and unrealistic, from both legal and technical points of view, to assume that the ‘right to be forgotten’ can become a global rather than merely a European scenario. Considering the overlapping jurisdictions on the Internet, how come nobody acknowledges that such global de-listing creates a divergence, at the international level, between the EU Member States and the rest of the world?

The Internet should not be subjected to geographical boundaries. Assuming that the rules of one country can apply worldwide online can easily be associated with web censorship. In fact, the EU is not alone in its aim for the global implementation of its rules: Russia and China, for instance, have quite global ambitions regarding Internet governance. Of course, one may argue that the EU’s motivations are legitimate, intended to protect individuals’ private sphere, and do not amount to censorship. But considering that the alleged legitimacy of the measures is the most frequent argument invoked to justify censorship, is this really the example the EU wants to set?

The Google Affair – Crossing the Border

You will cross the border. Just saying.

Today I am referring again to the famous Google Spain judgement, better known for ruling on what the press has been popularly calling the ‘right to be forgotten’. The amount and complexity of the questions raised by that decision did not allow me to address all of them in the previous posts (here, here, here, and here)… And as I like to honour my promises, I will not promise that this will be the last post regarding that matter.

So, although worldwide attention has been focusing on the fact that individuals may address deletion requests directly to search engines, the ruling also dealt with a key topic that seems to have been undervalued, even though it is equally important for businesses.

I am specifically referring to the territorial scope of Directive 95/46,[1] i.e., whether or not it applies to Google Spain, a subsidiary of Google Inc., given that the parent company is based in Silicon Valley.

In order to fall within the territorial scope of the national provisions implementing the abovementioned Directive, the data processing must, namely, be carried out in the context of the activities of an establishment of the data controller on the territory of the Member State, as stated in Article 4(1)(a).

As foreseen in its recitals, “establishment on the territory of a Member State implies the effective and real exercise of activity through stable arrangements” and “the legal form of such an establishment, whether simply branch or a subsidiary with a legal personality, is not the determining factor”.[2]

In this regard, the main relevant facts that the ECJ took into consideration were that the Google search engine is operated by Google Inc. from outside the EU, and that Google Inc. has a subsidiary on Spanish territory which sells advertising connected to its Internet-related activities.

In parallel, the ECJ rejected the argument according to which Google does not carry out its personal data processing activities in Spain and Google Spain is a mere commercial representative for its advertising business. Instead, the ECJ noted that, pursuant to Recital 19 of the Directive, an establishment on the territory of a Member State implies the effective and real exercise of activity through stable arrangements.[3]

Moreover, it held that Google Spain engages in such activity and, as a subsidiary of Google Inc. with its own legal personality, constitutes an establishment.[4]

According to the ECJ, Article 4(1)(a) of the Directive does not require the processing of personal data to be conducted by the subsidiary itself, but only that it be carried out ‘in the context of the activities’ of the subsidiary.[5] That would be the case, for instance, if the subsidiary promotes and sells advertising space offered by the parent company, which serves to make the service offered by that engine profitable.[6] Since the advertisements are displayed next to search results and finance the website, both activities are inextricably linked.[7]

Furthermore, the Court considered that the very display of personal data on a search results page constitutes processing of such data. As results are displayed on the same page as advertising linked to the search terms, the Court concluded that the processing of personal data is carried out in the context of the commercial and advertising activities of the controller’s establishment on the territory of a Member State.[8]

For all these reasons, the ECJ concluded that the processing of personal data carried out in the context of the activities of a subsidiary of the controller established in an EU Member State, which is intended to promote and sell advertising space offered by the search engine and which orientates its activity towards the inhabitants of that Member State, does fall within the territorial scope of application of the Directive.[9]

Last but not the least, the Court noted that, in light of the objectives of the Directive, the rules on its scope ‘cannot be interpreted restrictively’, and that it had ‘a particularly broad territorial scope’.

I must confess that I was not particularly surprised by the conclusion that the Directive is applicable to companies based outside the EU, as long as they conduct a noteworthy local activity that has some link to the Internet activities of the parent body.

In fact, notwithstanding the divergence of viewpoints regarding the ‘right to be forgotten’ issue, the ECJ broadly confirmed the Advocate General’s opinion regarding jurisdiction.

The Advocate General had previously delineated the scope of application of the Directive, pointing out the very nature of the business model of search engines and the inextricable link between Google Inc. and its subsidiary. Thus, the view that a controller should be treated as a single economic unit leads to the conclusion that a controller is established in a Member State if the subsidiary which generates its revenues is established in that Member State. In this context, the fact that the technical data processing operations were conducted outside the EU was also disregarded.[10]

As a result, the ruling has broadened the territorial scope of the Directive. Since it does not refer specifically to search engines, it applies to every data processing operation carried out “in the context of the activities of an establishment”. Hence, businesses with operations in the EU might generally be subject to EU data protection rules.

The concept of establishment may therefore include non-EU businesses which have branches set up in a Member State. This is particularly relevant, as it might affect foreign companies simply by virtue of their having local sales subsidiaries in the EU. Moreover, it might potentially extend to every business that has a stable presence in the EU market, even without any European representation.

This is in line with the wider reach of the territorial scope of the forthcoming General Data Protection Regulation, which is intended to be applicable not only to businesses established in the EU. The Regulation will, in fact, introduce some key changes to the existing legal framework.

Firstly, while the current Directive applies to data processing conducted by an establishment of a data controller in the EU, the new legislation will also cover personal data processing carried out in the context of the activities of an establishment of a controller or a processor in the Union.

In addition, the Regulation will also be applicable to the processing of personal data of individuals residing in the EU, by data controllers who are not established in the EU, when the processing activities are related to the offering of goods and services to data subjects in the EU or the monitoring of their behaviour (profiling), as far as their behaviour takes place within the EU.

If adopted, the proposed changes will bring all foreign companies which process EU citizens’ data, many of which have kept their data processing abroad to avoid being subject to the current Data Protection Directive, within the scope of EU law.

As a consequence, non-EU based businesses will have to reconsider their arrangements for subsidiaries to ensure full compliance with EU Data Protection requirements.

References

1. Directive 95/46/EC of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data
2. Recital 19 of the Directive
3. Paragraph 48 of the ruling
4. Paragraph 49 of the ruling
5. Paragraph 52 of the ruling
6. Paragraph 55 of the ruling
7. Paragraph 56 of the ruling
8. Paragraph 57 of the ruling
9. Paragraph 60 of the ruling
10. Paragraphs 64, 65, 66 and 67 of the opinion

The match of the year: Right to be Forgotten vs Right to know

Round 1, Fight!

As is well known, the ‘right to be forgotten’ ruling extended the possibilities foreseen under the current EU Data Protection Directive for data subjects to exercise their rights to erasure of data and to object to personal data processing with regard to search engine service providers, which were deemed controllers.

Therefore, facing a deletion request, search engines will have to decide on the balance of the rights at stake, namely freedom of expression and right to privacy, weighing up whether it is in the public interest for the information indexed in its search results to remain.

From the very beginning, public opinion reacted with both enthusiasm and concern. The main questions were: how would the decision be enforced? Isn’t the removal of links to legal and accurate information damaging to freedom of speech and the right of access to information? The debate has been most vivacious between free speech advocates and privacy campaigners and hasn’t faded away with the course of time. The former insist that it will lead to a whitewashing of the past, whereas the latter uphold that it will enable individuals to limit the visibility of some personal information.

Google – despite affirming that the enforcement of the ruling could hamper free speech, warning of potential abuse by those seeking the deletion of important information, and complaining that the ruling’s requirements for compliance were vague and subjective – started dealing (efficiently?) with the astonishing amount of link suppression requests received, rejecting some and admitting others.

In fact, Google says it has received approximately 143,000 requests, related to 491,000 links, in the last five months, involving everything from serious criminal records to embarrassing photos and negative press stories. Considering the data revealed by Google itself, the company has refused about 30 per cent of the demands, and about 50 per cent of the links were taken down. According to its online transparency report, Google has removed more links to content on Facebook from its search results than from any other site. In this regard, Reputation VIP – the company that provided Forget.me, the first ‘Right To Be Forgotten’ removal service – pointed out that, ironically, most requests refer not to unflattering or inaccurate web pages written by third parties but to content authored by the requestor.
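For a rough sense of scale, the reported figures work out as follows. This is a back-of-the-envelope sketch; the percentages are the approximations published at the time, and the remainder is simply what is left over:

```python
# Back-of-the-envelope arithmetic on Google's reported figures
# (approximate percentages, as published at the time).
requests = 143_000  # removal requests received in roughly five months
links = 491_000     # individual links those requests covered

refused = round(requests * 0.30)        # ~30% of demands refused
removed = round(requests * 0.50)        # ~50% taken down
pending = requests - refused - removed  # remainder still under review

print(round(links / requests, 1))  # → 3.4 links per request on average
print(refused, removed, pending)   # → 42900 71500 28600
```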

Google even set up an advisory council to advise on how to handle the requests. This council is headed by the company’s executive chairman, Eric Schmidt, and chief legal officer, David Drummond, and includes academics, technologists, legal experts and a journalist.

Most recently, Google decided to launch a public debate regarding the balance to be achieved between a person’s right to be forgotten and the public’s right to information. To that end, it organized a grand tour of hearings across Europe and has been on the road for about a month now.

The good intentions behind this initiative failed to convince everyone. For instance, Isabelle Falque-Pierrotin, who heads the Article 29 Working Party, which gathers all 28 EU national data protection authorities, didn’t hesitate to share her scepticism about the Google initiative, which she described as part of a “PR war”:

Google is trying to set the terms of the debate. They want to be seen as being open and virtuous, but they handpicked the members of the council, will control who is in the audience, and what comes out of the meetings.

Although I do not share such a pessimistic view of the initiative, I do have some doubts regarding the openness and transparency it is intended to achieve. Indeed, when the public debate was first announced, I expected that it would allow for a better understanding of Google’s current processes for dealing with requests. But, as far as I am aware, the hearings have centred on abstract and rather philosophical discussions.

Considering the ongoing negotiations on the EU data protection reform, which are already well advanced, the question to be asked is: how much could the ruling and Google’s efforts in fact influence the direction of the discussions?

According to the European Commission’s initial proposal, the right to be forgotten would build on the right to erasure of personal data and the right to object to data processing operations, which already exist under the current Data Protection Directive. Therefore, the data subject could exercise the right against the original data controller when and if: the data is no longer necessary; consent is withdrawn or the storage period has expired; the data subject objects to the processing on specified grounds; or the processing is no longer valid on some other ground. Freedom of expression was among the exemptions foreseen.

The European Parliament was quite favourable to this proposal, having voted its opinion last spring. However, it ensured that the right could also be exercised directly against third parties, and added the possibility of exercising the right following an order by a court or regulatory authority.

The Council of the European Union had already discussed the issue, but decided to suspend the respective debates in order to wait for the CJEU’s ruling. However, negotiations on other issues of the reform kept going, and Member States have even agreed on a partial general approach since then.

A subsequent statement issued by the Italian Presidency made clear that the provision concerning the right to erasure would take into account the principles set out by the CJEU. Indeed, the recently issued revised version left no doubt about it.

I find this utterly confusing, as it is for the Council of the European Union and the European Parliament, as co-legislators, to make the law as it will stand in the future, and for the CJEU to interpret the law as it exists. Taking into account the judicial interpretation of the law we are about to replace when defining the upcoming legislation is, in my opinion, quite puzzling. The ruling should not dictate the content or drafting of the future Regulation.

Nevertheless, something has to be done regarding the enforcement of the ruling. As things stand at the moment, it has been up to Google to determine the balance between the conflicting interests at stake. The criteria as defined by the CJEU are undoubtedly insufficient.

And if the ruling is to be taken into account in the upcoming legislation at all, the legislation most certainly has to address the scope of the right to be forgotten, the grounds on which it can be exercised, and the need to balance this right with the freedom of information, as the judgement itself does not establish with rigour how the right shall be applied in practice.

In this context, it must be noted that the Regulation has a horizontal nature and is thus intended to apply to all controllers, regardless of their nature. Search engines are not the specific target of the future legislation although, as controllers, they are covered by its scope.

Regarding the scope, one may wonder whether the distinction made by the European Commission between personal data initially disclosed or uploaded by the data subject and personal data disclosed by third parties will be kept.

Moreover, as there seems to be no doubt that search engines – now considered controllers – may receive deletion requests, it is important to clarify the position of social media providers, such as Facebook, where it is possible to argue that the processing is based on consent or a contract.

As for the grounds on which the right can be exercised, I think it won’t be easy to determine who will be required to assess whether the initially lawful processing of accurate data has become unnecessary, inadequate, irrelevant or no longer relevant, or excessive in the light of the purposes for which the data were collected or processed and of the time that has elapsed. Who is better suited for that role: the search engine or the original controller?

In this context, one cannot assume that, if the initial processing is lawful, the subsequent processing is also lawful. There might be cases where the two assessments reach different outcomes. What then?

Furthermore, should requests for deletion be addressed directly to the controller? Should they be addressed, instead, to the supervisory authority? Or to the competent courts? And if so, which court would be the competent one?

In addition, should the data subject have the right to choose against which controller to exercise the right to be forgotten and erasure? I believe that, at least theoretically, it should be possible for the data subject to exercise these rights against the processing carried out by the search engine before, after, or independently of exercising the same or other rights against the original controller. But one should bear in mind that it is quite unrealistic to ask operators of search engines to track information and the replication of data across the web.

As we can see, many questions are yet to find their answers.

The most popular is:

How will the right to the protection of personal data be fairly articulated with the right to freedom of expression?

Understandably, certain Member States have shown legitimate concerns regarding the freedom of expression and the interest of the public at large in having access to information, which may end up being underweighted in the balancing process. So the debates are currently ongoing.

One of the big issues at stake is that, according to the spirit of the founding treaties, the reconciliation of the right to the protection of personal data with freedom of expression should remain within Member States’ legislative power. This implies that the European co-legislators, the Council of the European Union and the European Parliament, are not entitled to regulate this matter in detail. However, if it is up to Member States to reconcile the two potentially conflicting rights, neither harmonization nor a unified application of the law is ensured.

In this context, it will be important to delineate the concepts of ‘public interest’ and ‘public figure’, whose scope has not been satisfactorily developed in data protection law, given how swiftly the digital era has evolved.

Moreover, it will be important to establish that bloggers and individuals generally expressing themselves online fall within the scope of the ‘freedom of expression’ exception, even if they are not professional journalists. After all, Article 11 of the Charter of Fundamental Rights of the European Union establishes that everyone has the right to freedom of expression, including the freedom to hold opinions and to receive and impart information and ideas, and that the freedom and pluralism of the media shall be respected.

On another level, and as is well known, Google has been systematically alerting websites when it cuts links to their pages from results presented for searches based on a person’s name, which is in line with the European Commission’s proposal. But should search engines be barred from informing publishers, as Google has been doing, when articles have been delisted from search results? Are there cases where it would be appropriate to involve a publisher? Which ones?

These notifications are particularly problematic because of the possibility of republication, which could cause additional harm or distress to the data subject. And indeed, they often lead to the republication of a version which indicates precisely which URLs are being removed from the search index.

In my opinion, it is preferable for the data subject that the search engine, as a second controller, contacts the controller which first published the information (the original controller), as otherwise it might not always be easy to strike the correct balance.

In parallel, Google has unilaterally restricted the deletion of internet links to its European websites only – Google.es, Google.de, Google.co.uk… well, you get the idea. But shouldn’t the removal be global, considering the very nature of the Internet? Shouldn’t links be removed from all versions of Google, including Google.com? This is particularly important considering that most European users of the search engine use local domains rather than google.com.

The Justice and Home Affairs Council met in Luxembourg on 10 October to discuss the Regulation and the Directive. A partial general approach on Chapter IV of the General Data Protection Regulation, which deals with the obligations of data controllers and processors, was agreed. There is, nevertheless, still plenty to be agreed upon, so one may wonder whether the deadline set by the incoming European Commission President Jean-Claude Juncker for the end of negotiations – within six months of the Commission starting work – can be met.

Meanwhile, the Article 29 Working Party is preparing guidelines which will set out a common approach for dealing with the different types of complaints coming in from citizens. To that end, it has met with media and search engine companies – Google, Microsoft and Yahoo – to gather their views on how to strike a balance between freedom of information and privacy. The guidelines are expected to be finalized by the end of November.

Considering the current state of play, let’s hope that some of these thorny questions will have been answered by then…

The Google Affair: Forget Me, Forget Me Not – Part II

I can forget you, you know?

Bad pictures, dire comments, past scandals. The internet never seems to forget. How convenient it would be to be able to suppress any record of past events which might adversely affect our honour and dignity, or simply expose our private life in an undesirable way.

According to the recent ruling of the ECJ, someone wishing to have personal information deleted from a search engine’s index will have to submit a request to the search engine service provider.

This decision is undoubtedly good news for those who would like to suppress information about themselves from the internet, but it is likely to cause problems in practice.

It is unquestionable that people have a right to privacy, and that that privacy extends to information about them. Thus they have the right to control their personal data and can ask for its deletion when they no longer have an interest in its processing or storage and there is no legitimate interest in its being kept by a data controller.

The issues regarding the qualification of search engines services providers as ‘data controllers’ have already been raised here and here. Unfortunately, our concerns don’t end there.

According to this decision, search engines have to deal with individual complaints, or else face challenges before the courts or a supervisory authority.

In order to implement the decision, Google had to devise a mechanism (an online form available on its website) to receive link removal requests, and was swamped by the number of deletion requests received from nationals of the 28 EU Member States. This represents a major additional administrative burden for Google, and it will affect all other web intermediaries as well, as the repercussions of this ruling extend well beyond Google.

Search engine service providers will then have to decide on the balance of the rights at stake, weighing up whether it is in the public interest for the information to remain.

Does it make any sense to establish this principle when there is no evident and easy answer as to what privacy means nowadays, or where we draw the line between what belongs exclusively to the private sphere and what belongs to the public domain?

It is a fact that Google has been removing links to copyright-infringing content at the request of copyright holders. In those cases, however, there is no individual assessment and the process is mostly automated, which is a very different scenario from what the ruling entails. Indeed, each request will have to be considered individually and demands an appraisal for which a search engine is ill-suited. Complying with this ruling could thus end up being very costly, because this kind of assessment cannot be carried out by algorithms.

As much as I agree with the motivation behind the ruling, this just doesn’t make any sense. The ECJ thus passes the responsibility for finding the right balance over to private entities – businesses whose primary concern is profit – although they have neither the expertise nor the legitimacy to act as a legal authority. As a general principle, the deletion of websites, or of search links, should be decided by a legitimate entity endowed with public authority, ultimately a court.

In any case, pertinent questions arise regarding the interpretation of the rules set out in the judgment. What exactly is a public figure? What qualifies as being in the public interest? How much time has to pass before personal data is no longer relevant? How can a search engine determine whether data is inaccurate or its processing excessive or disproportionate? How will the rights of the publisher be safeguarded in the internal process of a private company?

In the absence of any rigorous criteria for balancing the rights of the data subject against the search engine’s economic interests and the public interest, search engine service providers lack a precise test to apply when assessing data subjects’ requests to remove links to websites containing their personal data.

The advisory committee set up by Google might not be the most adequate solution. The agreement reached by national data protection authorities to form a subcommittee to establish uniform handling of such requests is, by contrast, most welcome.

In order to avoid any risk of liability for breaching data protection law or for unlawful processing of data, search engine service providers might prudently remove links, despite any public interest in their disclosure, rather than weigh the balance of rights for every request. And who could possibly blame them?

The most evident and worrying outcome is that our ability to find information about other individuals is substantially reduced.

But it leaves room for other discrepancies as well. Will links deleted from the index created specifically for one country (where, for instance, a specific national language is required) remain available in searches conducted in other countries? Or will only search engines with no connection to the EU be able to serve those results?

We might be dangerously heading toward a tiered and fragmented internet, with search results in Europe being less complete than elsewhere. What becomes of the free and open internet, then?

And what about managing the potential interests of third parties also concerned by the deleted information, who might likewise have personal data published but who, on the contrary, wish to be easily found online through the search engine’s results?

The Advocate General has already warned[1] that the suppression of legitimate public domain information would amount to censorship. For the sake of full access to information and freedom of expression, it is imperative that Google remain a neutral platform. To that end, endowing a search engine service provider with the powers of a censor is hardly a desirable result.

The main search engines operating in Europe have already met, in Brussels, with the Article 29 Working Party, which brings together data protection authorities from across the EU and intends to issue EU-wide guidelines in order to achieve a unified implementation of the ruling.

Considering the possible scenarios foreseen, these guidelines are very much welcome. However, it rather seems that we are now trying to correct a system that was wrongly built from the start. Indeed, according to its press release, the DPAs asked the search engines, for instance, to explain their de-listing process and the criteria applied when balancing their economic interest, the public interest in accessing information and the rights of the person who wants the search results de-listed.

Google has now planned public hearings, in an alleged quest for transparency, in order to stimulate debate on the implementation of the ruling.

More recently, at the WP29 plenary meeting of 16 and 17 September, the European data protection authorities decided to put in place a network of contact persons in order to develop common case-handling criteria for the complaints presented to data protection authorities as a result of search engines’ refusals to “de-list” links from their results.

Now that the confusion has started, one may wonder where all this will end.

References

1. Paragraph 134 of the Opinion

The Google Affair: Forget Me, Forget Me Not – Part I

Am I forgetting something?

The ruling better known as the ‘right to be forgotten’ decision is unlikely to be forgotten any time soon.

To make a long story short, the ECJ ruled that the operator of a search engine – in that particular case, Google – is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages published by third parties and containing information relating to that person, even when the publication is otherwise lawful and the information factually correct.

I understand (and welcome) the motivation behind the judgment. It is comforting to know that individuals may be able to move on from their past where there is no legitimate – and consequently stronger – public interest involved. As it is very difficult to restrict our lives to the offline world, Google and search engines in general are nowadays the primary instruments for finding information about everything and anyone, to the point that ‘google’ is commonly used as a verb for an online search conducted on that search engine. Without search engines, information would not, in the vast majority of cases, be so easily accessible.

And that is certainly the main issue at stake. As stated by the Court:

(…) processing of personal data, such as that at issue in the main proceedings, carried out by the operator of a search engine is liable to affect significantly the fundamental rights to privacy and to the protection of personal data when the search by means of that engine is carried out on the basis of an individual’s name, since that processing enables any internet user to obtain through the list of results a structured overview of the information relating to that individual that can be found on the internet — information which potentially concerns a vast number of aspects of his private life and which, without the search engine, could not have been interconnected or could have been only with great difficulty — and thereby to establish a more or less detailed profile of him.[1]

So a ‘right to be forgotten’ might as well be a necessary principle in an unforgetting world.

But, truth be told, the principle of giving citizens more control over their personal data – including its deletion – is not a new right in town. It existed in several Member States’ legislation long before the ECJ’s decision, although without such a sticky nickname. Furthermore, it is foreseen in Directive 95/46 on data protection, which remains applicable to the Internet as it has developed since. And a related provision is expected to be included in the General Data Protection Regulation, intended to replace Directive 95/46/EC and currently being negotiated at the Council of the European Union. The principle that individuals should be able to move on from their past is therefore nothing new.

What did come as a surprise – and not a good one, I must admit – was the conclusion that Google, while processing information published by third parties, acts and is liable as a controller, and must delete links to articles lawfully published by those third parties, even though the articles can remain available on the latter’s websites.

One must admit that this conclusion is in line with the obligations that Article 6(2) of the Directive imposes on controllers when processing personal data: the controller must weigh its own interests, those of third parties and those of the data subject.

That conclusion is also in line with the observations of the Court in the ASNEF and FECEMD ruling,[2] regarding the relevance of the data’s previous appearance in public sources.

And, setting aside all the questions (as mentioned here and here) that unavoidably arise when a search engine is considered a ‘controller’, the ECJ tried to establish criteria for the removal of links and enshrined a balance to be achieved to that end.

Therefore, search engines shall erase data deemed inadequate, irrelevant or no longer relevant, or excessive in the light of the time that has elapsed.[3] According to the ECJ, even accurate data, despite being lawfully published initially, can “in the course of time become incompatible with the directive”.[4] These key criteria – “inadequate, irrelevant or no longer relevant, excessive in relation to the purposes of the processing” – shall be balanced against the economic interests of the operator of the search engine and the interest of the general public in finding that information upon a search relating to the data subject’s name.[5]

However, as rightly pointed out by the ECJ:

(…) that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question.[6]

The preponderant public interest is consequently the ultimate test for ensuring that information which ought to remain widely available is not removed.

I personally doubt that the Court succeeded in defining the rigorous criteria necessary to achieve a fair balance between the rights to privacy and personal data protection on the one hand and freedom of expression and information on the other. In fact, the ECJ provides very little legal certainty, as it merely sketches an unclear balancing act between the rights of the data subject, the search engine’s economic interests and the public interest, to be determined case by case.

And this can have very serious repercussions, considering that the removal of links from the list of results affects the legitimate interests of the publishers of the information and of internet users potentially interested in having access to it.

As pointed out by the Advocate General:

The data protection problem at the heart of the present litigation only appears if an internet user types the data subject’s name and surnames into the search engine, thereby being given a link to the newspaper’s web pages with the contested announcements. In such a situation the internet user is actively using his right to receive information concerning the data subject from public sources for reasons known only to him.[7]

And he added:

In contemporary information society, the right to search information published on the internet by means of search engines is one of the most important ways to exercise that fundamental right. This right undoubtedly covers the right to seek information relating to other individuals that is, in principle, protected by the right to private life such as information on the internet relating to an individual’s activities as a businessman or politician. An internet user’s right to information would be compromised if his search for information concerning an individual did not generate search results providing a truthful reflection of the relevant web pages but a ‘bowdlerized’ version thereof.[8]

It most certainly doesn’t help that the ECJ establishes that the data subject’s rights protected by Articles 7 and 8 of the Charter override, as a general rule, that interest of internet users.[9] Does this mean that the right to privacy is the general principle, subject only to sporadic exceptions where freedom of expression and the right to information exceptionally take precedence?

This priority does not seem to be established in either the Charter or the European Convention on Human Rights. One would rather expect a data subject’s right to the protection of his private life to be balanced against other fundamental rights, namely freedom of expression and freedom of information. Additionally, the conclusion that inclusion in a search engine’s list of results constitutes a more significant interference with the data subject’s fundamental right to privacy than the publication on the web page itself[10] seems to forget that, nowadays, freedom of expression is not just about the right to publish, but also about being found online. We have reached a historic moment where not being found through a search engine and not existing online are pretty much the same thing.

Therefore, the fact that the ruling applies to search engines but not to news websites or other journalistic activities – which benefit from the “media” exemptions under European data protection law, so that the original information itself can remain – does not represent a satisfactory guarantee of the right to information at a time when people have access both to newsworthy information and to publishing tools.

Considering the challenges at stake, it is very unlikely that a search engine service provider will satisfactorily balance all these conflicting interests. The widespread fear that the implementation of this ruling will entail a sacrifice of freedom of expression and information may not be mere alarmism.

It is worth asking whether and how this ruling will affect the negotiations on the General Data Protection Regulation currently ongoing at the Council of the European Union. The Italian Presidency has circulated to the DAPIX working group a note entitled ‘Right to be forgotten and the Google judgment’, to examine how the future legislation on the right to deletion should be developed.

This creates some confusion, considering that it is for the Council of the European Union and the European Parliament, as co-legislators, to make the law as it will stand in the future, and for the ECJ to interpret the law as it exists.

References

1. Paragraph 80 of the ruling
2. Joined Cases C-468/10 and C-469/10 ASNEF and FECEMD [2011] ECR I-0000, paragraphs 44–45
3. Paragraph 94 of the ruling
4. Paragraph 93 of the ruling
5. Paragraph 97 of the ruling
6. Paragraph 97 of the ruling
7. Paragraph 130 of the Opinion
8. Paragraph 131 of the Opinion
9. Paragraph 81 of the ruling
10. Paragraph 87 of the ruling

The Google Affair: To be or not to be a Data Controller – Part II

Data controller or Data processor?

With an evident and unfortunate confusion between the roles of ‘data controllers’ and ‘data processors’ (as analysed previously), the European Court of Justice (ECJ), in its famous ruling, qualifies Google as a data controller even though it merely reproduces existing information.

This has far-reaching implications and raises a number of complex and problematic issues. In this post, we will only be able to address some of them.

Indeed, considering search engine service providers to be data controllers for the purposes of EU law entails that, when they process personal data, they are obliged to respect the rights and freedoms of data subjects and to comply with the set of principles and obligations laid down in Directive 95/46, and that they face liability should they fail to do so.

This obviously forces a shift in search engine service providers’ responsibilities towards the individuals whose information they process.

A very pertinent and practical question must be asked: how can search engine service providers fulfil the obligations incumbent on controllers, as provided for in Articles 6, 7 and 8 of the Directive, in relation to personal data sourced from web pages hosted on third-party servers?

If an internet search engine service provider is to be considered a controller, it must guarantee that the personal data processed is adequate, relevant and not excessive in relation to the purposes for which it was collected, kept up to date, and kept for no longer than is necessary for those purposes.

Considering that a search engine merely locates information made available by third parties, indexes it automatically and makes it accessible to internet users according to a particular order of preference, it remains to be explained how a search engine service provider could possibly assess its compliance with those requirements. Isn’t the publisher of the website concerned in a better position to conduct that assessment?

As to the criteria governing the legitimacy of processing data made available on the internet, including personal data, in the absence of the data subject’s consent, it is unquestionable that internet search engines serve legitimate interests. Indeed, they allow easy and quick access to information and contribute to the dissemination of the information uploaded on the internet.

But what happens when the search engine processes information falling within a special category of data (e.g. personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, or data concerning health or sex life), the processing of which is prohibited unless, for instance, the data subject has given his explicit consent?

It is important to reflect on the hypothesis raised by the Advocate General in his Opinion:

if internet search engine service providers were considered as controllers of the personal data on third-party source web pages and if on any of these pages there would be ‘special categories of data’ referred to in Article 8 of the Directive (e.g. personal data revealing political opinions or religious beliefs or data concerning the health or sex life of individuals), the activity of the internet search engine service provider would automatically become illegal, when the stringent conditions laid down in that article for the processing of such data were not met.

Would the processing of special categories of data by search engines be deemed illegal whenever the requirements for the processing of such data on the third-party source web pages were not met? Is that even a conceivable scenario?

Additionally, considering that, in order to be lawful where no other criterion applies, the processing of personal data must be carried out with the consent of the data subject, it is only legitimate to ask how search engine service providers can obtain consent from data subjects with whom they have never been in contact.

Furthermore, one must wonder about the effective exercise of the data subjects’ right of access to data, as foreseen in the above-mentioned Directive, and about Google’s capacity to comply satisfactorily with that obligation, given that its activity of caching webpages for relevance-based index ranking depends on an algorithm that performs the content analysis automatically. Given this notion of relevance, for instance, it will be impossible in practice to distinguish between people who share the same name.

I guess we all just have to wait to see how all the implications will work in practice…

The Google Affair: To be or not to be a Data Controller – Part I

Is Google a Data Controller?

What am I?

It would be very difficult – and to some extent odd – to start a blog intended to express my viewpoints on privacy issues and related matters without mentioning the already thoroughly commented ruling, better known as the “right to be forgotten” decision, in which the European Court of Justice (ECJ) concluded that, in order to comply with the rights laid down in Directive 95/46:

the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.

It might not come as a surprise – after all, I have just created a blog – but I too have some comments that I would like to share about this ever-growing, globalized and interconnected technological world in which freedom of expression (still) thrives.

However, I do not intend, at least today, to reflect on how the ‘right to be forgotten’ may or may not work in practice.

Instead, I would like to refer to a recent speech by Martine Reicherts, who stated that internet search engine service providers, such as Google, have a big responsibility to ensure that personal data is handled properly.

And it is precisely this new notion of ‘responsibility’ for search engine service providers that I find disturbing. After all, until very recently, internet search engine service providers’ liability for the third-party content they transmit and/or store has always been quite restricted. I therefore have some doubts and concerns about this sudden and innovative understanding of internet search engine service providers as data ‘controllers’.

One should not forget that the role and legal position of internet search engine service providers have not been expressly regulated in EU legislation.
Their activity consists in locating information made available on the internet by third parties, indexing it automatically, storing it temporarily and, finally, making it accessible to internet users according to a particular order of preference.

In this context, we agree with the ECJ’s finding that an internet search engine provider processes personal data (where that data relates to an identified or at least identifiable subject) within the meaning of Directive 95/46, regardless of the fact that it carries out the same operations on other types of information and does not distinguish between the latter and personal data.

However, in this same context, the internet search engine service provider cannot automatically be considered the ‘controller’ of the processing of such personal data!

A controller is, according to the aforementioned Directive:

the natural or legal person […] which alone or jointly with others determines the purposes and means of the processing of personal data.

A special reference to the processing of personal data by information society services that act as selection intermediaries is absent from the Data Protection Directive.

In this particular case, which relies on fully automated search algorithms that filter data and deliver results based on relevance and popularity, Google is neither aware of nor exercises any control over the data included in publishers’ webpages. It is unable to distinguish between personal data and other data, and it cannot change the content held on the host servers.

It is therefore evident that Google acted as a data processor, i.e. an information society intermediary between content providers and internet users, merely storing information and supplying an information location tool.

As stated by the Article 29 Data Protection Working Party:

[t]he principle of proportionality requires that to the extent that a search engine provider acts purely as an intermediary, it should not be considered to be the principal controller with regard to the content related processing of personal data that is taking place. In this case the principal controllers of personal data are the information providers.

I thus strongly support the view expressed by the Advocate General in his Opinion on this case.

Surprisingly, the ruling points in the opposite direction, and Google was held responsible, under data protection law, for the results it returns. The fact that the processing of data is conducted entirely by complex algorithms, outside its control, was completely disregarded by the ECJ.

Instead, the ECJ stated that search engines bear a responsibility separate from, and additional to, that of publishers, whose processing consists in loading the data onto an internet page. Indeed, according to the ruling, search engines are liable for making outdated information accessible simply because it is still available on a publisher’s website:

that activity of search engines plays a decisive role in the overall dissemination of those data in that it renders the latter accessible to any internet user making a search on the basis of the data subject’s name, including to internet users who otherwise would not have found the web page on which those data are published.

This is just too far-fetched…

© 2019 The Public Privacy