
The ‘right to be forgotten’
extended to Google.com

Forgetting everywhere


As you might well remember, the Court of Justice of the European Union, in the judgement better known as the ‘right to be forgotten’ ruling, held that individuals, provided certain prerequisites are met, have the right to require search engines, such as Google, to remove certain results about them from the list presented in response to a search on their name. According to the ruling, the original information will still be accessible under other search terms or by direct access to the publisher’s original website.

In this context, where the criteria for de-listing are met but a search engine does not remove the links requested, the data subject may complain to their national data protection authority or to a judicial authority.

Last week, the Article 29 Working Party, which gathers representatives of the 28 national data protection authorities (hereafter DPAs) of the EU Member States, adopted Guidelines on the implementation of the judgement. This did not really come as a surprise, as the Working Party had already announced its decision to establish a common approach to the right to be forgotten.

Indeed, the ruling left many questions unanswered. For instance, it remained unclear whether Google was obliged to block requested names on its European domains only or on all of its search domains. It was also left open how the balance between the relevant rights and interests at stake could be achieved and how the ‘public figure’ concept should be defined.

According to the Article 29 Working Party, the ruling only refers to search engines as “data controllers” and does not apply to the original source of the information: the information is not to be removed entirely, just from search indexes.

Furthermore, users who initially try to visit the Google.com website are directed to localised editions of the company’s search service, and Google’s current de-listing practice consists in removing results from the European versions of its search engine, but not from the international ‘.com’ version. In this regard, the Article 29 Working Party considers that Google has failed to effectively implement the abovementioned ruling. Limiting the de-listing of search results to EU domains, on the grounds that users tend to access search engines via their national domains, does not, in its view, sufficiently guarantee the rights of the data subjects. It therefore concluded that de-listing should be conducted on all relevant domains, including ‘.com’. This conclusion is certainly in line with a recent position of a French court, which decided that Google should remove a link to a defamatory article, for a particular search, on both its ‘.fr’ and ‘.com’ domains.

In addition, the document clarifies that Google is not required to block links if searches lead to the same result without using the individual’s name and that, although all individuals have a right to data protection under EU law, DPAs should focus on claims where there is a clear link between the data subject and the EU.

Moreover, referring to the notice posted by Google at the bottom of search results stating that “some results may have been removed under data protection law in Europe”, the Working Party deems that informing users that the list of results to their queries is not complete has no legal ground under data protection law, and is only acceptable if it cannot be concluded from it that results relating to a particular individual have been de-listed.

Likewise, because there is no legal basis for such routine communication under EU data protection law, and in order to avoid a ‘Streisand effect’, search engines should not, as a general practice, inform the webmasters of the pages affected by removals that some of their web pages cannot be accessed from the search engine in response to a specific name-based query. However, it is accepted that contacting the original editor of the content targeted by a de-listing request may be appropriate when more information is required in order to take a decision.

Furthermore, the guidelines establish a list of 13 common criteria which the data protection authorities should apply when handling complaints following refusals of de-listing by search engines.

It is also stated that no single criterion is determinative and that, in most cases, more than one will have to be taken into consideration. However, each criterion has to be applied in the light of the interest of the general public in having access to the information.

The Working Party further concluded that the impact of de-listing on individuals’ rights to freedom of expression and access to information will prove, in practice, to be very limited, and that the balance between the public interest and the rights of the data subject will have to be assessed on a case-by-case basis.

The abovementioned list includes guidance on what can constitute ‘public life’, considering that, while details associated with the private life of a public figure may be de-listed, information regarding their public role or activities should remain available for search:

It is not possible to establish with certainty the type of role in public life an individual must have to justify public access to information about them via a search result. However, by way of illustration, politicians, senior public officials, business-people and members of the (regulated) professions can usually be considered to fulfil a role in public life. There is an argument in favour of the public being able to search for information relevant to their public roles and activities.

A good rule of thumb is to try to decide where the public having access to the particular information – made available through a search on the data subject’s name – would protect them against improper public or professional conduct. It is equally difficult to define the subgroup of ‘public figures’. In general, it can be said that public figures are individuals who, due to their functions/commitments, have a degree of media exposure.

In addition, search engines are called upon to be more transparent regarding their de-listing criteria. This is a legitimate concern, as Google has only released very limited and abstract information in this regard.

Although these guidelines are not legally binding, they reflect a consensus position of national regulators and will therefore certainly influence the enforcement decisions taken by the Member States’ data protection authorities. Considering all the issues surrounding the implementation of the ruling, these guidelines are undoubtedly useful. Nonetheless, it remains to be seen whether Google will actually follow the guidance and extend de-listing to ‘.com’ as well.

In my personal opinion, it is quite outlandish and unrealistic, from both legal and technical points of view, to assume that the ‘right to be forgotten’ can become a global scenario rather than just a European one. Considering the overlapping jurisdictions regarding the Internet, such global de-listing would set up a divergence at the international level between the EU Member States and the rest of the world.

The Internet should not be subjected to geographical boundaries. Assuming that the rules of one country can actually apply worldwide online can be associated with web censorship. In fact, the EU is not alone in its aim for the global implementation of its rules: Russia and China, for instance, have quite global ambitions regarding Internet governance. Of course, one may argue that the EU’s motivations are legitimate, intended to protect individuals’ private sphere, and do not amount to censorship. But considering that the claimed legitimacy of the measures is usually the most frequent argument for censorship, is this really the example the EU wants to set?

EU PNR – A plane not yet ready to fly

Plane Not Ready to fly!


The Civil Liberties, Justice and Home Affairs (LIBE) Committee of the European Parliament has recently discussed the Passenger Name Record (hereafter PNR) draft Directive according to which air carriers would be required, in order to help fight serious crime and terrorism, to provide EU Member States’ law enforcement bodies with information regarding passengers entering or leaving the EU.

This airline passenger information is usually collected during reservation and check-in procedures and comprises a wide range of data, such as travel dates, baggage information, travel itinerary, ticket information, home addresses, mobile phone numbers, frequent flyer information, email addresses, and credit card details.
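To make the breadth of a PNR record concrete, here is a minimal sketch of such a record as a data structure. The field names and values are purely illustrative assumptions of mine; they do not follow the actual PNRGOV/EDIFACT message formats used by airlines.

```python
from dataclasses import dataclass
from typing import List

# Illustrative only: a hypothetical PNR record covering the categories of
# data mentioned above (travel dates, itinerary, contact and payment data).
@dataclass
class PNRRecord:
    travel_dates: List[str]       # outbound and return dates
    itinerary: List[str]          # airport codes along the journey
    ticket_number: str
    home_address: str
    mobile_phone: str
    email: str
    credit_card_last4: str        # payment details
    baggage_count: int
    frequent_flyer_id: str = ""   # optional loyalty-programme number

record = PNRRecord(
    travel_dates=["2014-12-01", "2014-12-15"],
    itinerary=["LIS", "AMS", "JFK"],
    ticket_number="074-1234567890",
    home_address="Example Street 1, Lisbon",
    mobile_phone="+351 900 000 000",
    email="traveller@example.com",
    credit_card_last4="4242",
    baggage_count=2,
)
print(len(record.itinerary))  # prints 3
```

Even this simplified sketch shows how much a single reservation reveals about a person's movements, contacts and finances, which is precisely what drives the proportionality debate below.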

Similar systems are already in place between the EU and the United States, Canada and Australia, through bilateral agreements allowing those countries to require EU air carriers to send PNR data regarding all persons who fly to and from them. The European Commission’s proposal would now require airlines flying to and from the EU to transfer the PNR data of passengers on international flights to the Member States of arrival or departure.

Nevertheless, the negotiation of the proposed EU PNR data exchange scheme has been quite wobbly. The European Commission put forward its proposal in 2011, but it was rejected in 2013 by the abovementioned committee, allegedly because it did not comply with the principle of proportionality and did not adequately protect personal data as required by the Charter of Fundamental Rights of the EU (hereafter CFREU) and by the Treaty on the Functioning of the EU (hereafter TFEU).

But concerns over possible threats to the EU’s internal security posed by European citizens returning home after fighting for the so-called “Islamic State” restarted the debate. Last summer, the European Council called on Parliament and Council to finalise work on the EU PNR proposal before the end of the year.

However, the ruling of the Court of Justice of the European Union on the EU’s Data Retention Directive last April, which declared the mass-scale, systematic and indiscriminate collection of data a serious violation of fundamental rights, leads one to question whether these PNR exchange systems with third countries are actually valid under EU law.

Similarly, many wonder if the abovementioned ruling shouldn’t be taken into account in the negotiations of this draft directive considering that it also refers to the retention of personal data by a commercial operator in order to be made available to law enforcement authorities.

And there are, indeed, real concerns involved.

Of course, an effective fight against terrorism might require law enforcement bodies to access PNR data, namely to tackle the issue regarding ‘foreign fighters’ who benefit from EU free movement rights which allow them to return from conflict zones without border checks. For this reason, some Member States are very keen on pushing forward this scheme.

However, the most elemental principles of the rule of law and the most fundamental rights of innocent citizens (the vast majority of travellers) should not be overstepped.

For instance, as the proposal stands, PNR data could be retained for up to five years. Moreover, the linking of PNR data with other personal data will enable access to the data of innocent citizens, in violation of their fundamental rights.

As ISIS fighters are mostly well known to law enforcement authorities as well as to the secret services, it is questionable how reasonable and proportionate such unlimited access to this private information can be as a means of preventing crime. How effective would tracking people’s movements be in the fight against extremism? Won’t such widespread surveillance ultimately turn everyone into a suspect?

That said, from the airlines’ point of view, the recording of such an amount of data would undoubtedly imply an excessive increase in costs and, therefore, an unjustifiable burden.

The European Data Protection Supervisor (EDPS) has already considered that such a system on a European scale does not meet the requirements of transparency, necessity and proportionality, imposed by Article 8 of the CFREU, Article 8 of the European Convention of Human Rights and Article 16 of the TFEU. Similarly, several features of the PNR scheme have been highly criticized by the Fundamental Rights Agency (FRA).

At the moment, the European Commission has financed national PNR systems in 15 Member States (Austria, Bulgaria, Estonia, Finland, France, Hungary, Latvia, Lithuania, the Netherlands, Portugal, Romania, Slovenia, Spain, Sweden, and the UK), which leads to a fragmented and incoherent landscape. This constitutes a very onerous outcome for airlines and creates a need for harmonisation among data exchange systems. The initiative is therefore believed by some MEPs to be intended to circumvent the European Parliament’s opposition to the Directive.

All things considered, it is legitimate to question whether the EU PNR scheme will be finalised, as first intended, before the end of the year. Given the deep differences between MEPs and among Member States, it appears more and more unlikely that the deadline will be met.

The ‘One Stop Shop’ mechanism reloaded

Get all your data protection matters handled here!


The ‘one stop shop’ mechanism is one of the most heralded and yet most controversial features of the General Data Protection Regulation, whose draft is currently being negotiated within the Council of the European Union.

According to the most recent proposal of the Italian Presidency of the Council of the European Union, where the data protection compliance of businesses operating across several EU Member States is in question, or where individuals in different EU Member States are affected by a personal data processing operation, businesses would only have to deal with the Data Protection Authority (DPA) of the country where they are established.

Cases of pure national relevance, where the specific processing is solely carried out in a single Member State or only involves data subjects in that single Member State would not be covered by the model. In such circumstances, the local DPA would investigate and decide on its own without having to engage with other DPAs.

These cases are, however, deemed to be the exception, as the mechanism aims for better cooperation among the DPAs of the different EU Member States concerned by a specific matter.

Therefore, in cross-border cases, the competence of the DPA of the EU Member State of the main establishment does not exclude the intervention of the other supervisory authorities concerned by the matter. While the supervisory authority of the Member State where the company is established will take the lead in the ensuing process, the other authorities will be able to follow, cooperate and intervene in all phases of the decision-making process.

In this context, if no consensus is reached among the several authorities involved, the European Data Protection Body (hereafter EDPB) will decide on the binding measures to be implemented by the controller or processor concerned in all of their establishments set up in the EU. Similarly, the EDPB will have legally binding powers in case of failure to reach an agreement over which authority should take the lead.

Businesses operating across multiple EU jurisdictions, which handle vast amounts of personal data, would highly benefit from this ‘one stop shop’ concept, as it would reduce the number of regulators investigating the same cases. Indeed, as things presently stand, a company with operations in more than one EU Member State has to deal with 28 different data protection laws and regulators, which unavoidably leads to a lack of harmonisation and to legal uncertainty.

The Article 29 Working Party has already manifested its support for a ‘one stop shop’ mechanism under the proposed EU General Data Protection Regulation.

However, in the past, Member States have manifested numerous reservations regarding this mechanism. Among the main concerns expressed were the following: businesses would be able to ‘forum shop’ in order to ensure that their preferred DPA leads the process; a DPA would not be able to take enforcement action in another jurisdiction; individuals’ rights to an effective remedy under EU laws would not be appropriately recognised; authorities without the lead position would not be able to influence processes related to data protection breaches involving nationals of their Member States.

As the practical implementation of the ‘one stop shop’ mechanism is one of the main obstacles preventing Member States from reaching an agreement on the wording of the new EU General Data Protection Regulation, let’s hope that the solution proposed by the Italian Presidency of the Council of the European Union does come closer to a suitable accommodation of the various concerns expressed by Member States.

The EU external border’s security at travellers’ fingerprints

One fingerprint down, only nine to go!


Last semester, the Council of the European Union and the European Parliament voiced technical, operational and financial concerns regarding the overall implementation of the ‘Smart Borders Package’. In this context, the European Commission initiated an exercise aimed at identifying the most adequate ways to implement it. This exercise would include a Technical Study, whose conclusions would subsequently be tested through a Pilot project.

The Technical Study, prepared by the European Commission, has been recently issued.

But let’s create some context here…

The EU is currently witnessing a very significant increase in the number of people crossing its borders, namely by air. Millions of people cross the external border of the EU every day, at several points and for the most diverse reasons. This makes airports the most relevant way in and out of the EU.

Therefore, if border management, namely through check-in procedures, is not duly modernised and supported by a proper legal and technical structure, longer delays and queues are to be expected. Added to this is a paramount security concern, due to the growing number of foreign fighters and refugees.

Indeed, under the current framework – the Schengen Borders Code – a thorough check at entry is required for all travellers crossing the external border, regardless of their level of risk or of how frequently they actually travel in and out of the EU. Furthermore, the period of time a traveller stays in the Schengen area is calculated based solely on the stamps affixed in the travel document.

So one of the main goals of the ‘Smart Borders’ initiative is to actually simplify and facilitate the entrance of “bona fide” travellers at the external borders, significantly shortening the waiting times and queues they have to face. Additionally, the initiative aims at preventing irregular border crossing and illegal immigration, namely through the detection of overstays, i.e., people who have entered the EU territory lawfully, but have stayed longer than they were authorized to.

In this context, biometrics [2] appear as a solution. In fact, biometric technologies [3] are cheaper and faster than ever and are increasingly used in both the private and the public sector. They are mainly used in forensic investigation and in access control systems, as they are considered an efficient tool for truthful identification and authentication.

Indeed, the use of biometric data for purposes other than law enforcement is currently being furthered at the EU level. The biometric systems first implemented were deployed in regard to third-country nationals, such as asylum or visa applicants (Eurodac [4] and VIS [5]) and criminals (SIS and SIS II [6]). In 2004, their use was extended to the ePassport of the European Union.

Later on, in 2008, the European Commission issued a Communication entitled ‘Preparing the next steps in border management in the European Union’, suggesting the establishment of an Entry/Exit System and a Registered Traveller Programme.

Subsequently, in 2013, the European Commission submitted a ‘Smart Borders Package’, including three legislative proposals. In this regard, the proposal for an Entry/Exit System (hereafter EES) was intended to register entry and exit data of third country nationals crossing the external borders of the Member States of the European Union. Likewise, the proposal regarding a Registered Traveller Programme (hereafter RTP) aimed at offering an alternative border check procedure for pre-screened frequent third-country travellers, thus facilitating their access to the Union without undermining security. In parallel, the purpose of the third proposal was to amend accordingly the Schengen Borders Code.

The foremost aspiration of these instruments was better management of the external borders of the Schengen Member States, the prevention of irregular immigration, information regarding overstayers, and the facilitation of border crossing for frequent third-country national travellers.

Therefore, the EES would make it possible to record the time and place of entry and the length of stay in an electronic database and, consequently, to replace the current passport-stamping system. In parallel, the RTP would allow frequent travellers from third countries to enter the EU subject to simplified border checks at automated gates.
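The kind of calculation an electronic entry/exit register would automate can be sketched simply. The short-stay rule in the Schengen area allows at most 90 days of presence in any 180-day period; the record format and function names below are my own assumptions, not those of the proposed EES.

```python
from datetime import date, timedelta

def days_present(stays, reference):
    """Count days spent inside the area during the 180-day window ending
    at `reference`. `stays` is a list of (entry_date, exit_date) tuples;
    an open stay (exit_date=None) uses `reference` as provisional exit."""
    window_start = reference - timedelta(days=179)
    total = 0
    for entry, exit_ in stays:
        exit_ = exit_ or reference
        start = max(entry, window_start)   # clip the stay to the window
        end = min(exit_, reference)
        if start <= end:
            total += (end - start).days + 1  # entry and exit days both count
    return total

def is_overstay(stays, reference, allowance=90):
    """Flag a traveller whose presence exceeds the short-stay allowance."""
    return days_present(stays, reference) > allowance

# Hypothetical travel history: one completed stay, one still ongoing.
stays = [(date(2014, 1, 10), date(2014, 2, 5)),   # 27 days
         (date(2014, 3, 1), None)]                # still in the area
print(days_present(stays, date(2014, 4, 15)))     # prints 73
print(is_overstay(stays, date(2014, 4, 15)))      # prints False
```

The point of the sketch is that such a check only works if entry and exit records are complete and accurate, which is exactly why the current stamp-based system is considered inadequate.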

Although generally considered a welcome initiative in terms of modernisation, this has nevertheless awakened some concerns regarding privacy and data protection. Indeed, the proposal focuses on the use of new technologies to facilitate travel for frequent travellers and to monitor EU border crossings by third-country nationals. In practice, it means that hundreds of millions of EU residents and visitors will be fingerprinted and have their faces electronically scanned.

Last year, the European Data Protection Supervisor (EDPS) adopted a very negative position regarding the proposal to introduce an automated biometrics-based EES for travellers in the region, calling it “costly, unproven and intrusive“. The data retention period in the EES, the choice of biometric identifiers, and the possibility of law enforcement authorities to access its database were among the main concerns raised.

As the proposed system would require ten fingerprints to confirm the identity of individuals at borders and to calculate the duration of their stay in the EU, the EDPS pointed to the unnecessary collection and excessive storage of personal information, considering that two or four fingerprints would be sufficient for identification purposes. The EDPS also expressed apprehension regarding the access to the EES database which would be granted to law enforcement authorities, even if the individuals registered were not suspects of any criminal offence. Questions were also raised regarding the possible exchange of information with third countries which do not have the same level of data protection.

Since then, the Technical Study – which I referred to at the beginning of this post – has been conducted in order to identify and assess the most suitable and promising options and solutions.

According to the document, one fingerprint alone can be used for verification, but it is acknowledged that a higher number of fingerprints could lead to better results in terms of accuracy, despite a more difficult implementation, “in particular, taking into account the difficulty of capturing more than 4 FPs [fingerprints] at land borders where limitations in enrolment quality and time may rise regarding the travellers in vehicle and use of hand-held equipment”. Nevertheless, the enrolment of four or eight fingerprints is recommended as one of the test cases of the pilot project.

Moreover, the study noted that “if facial image recognition would be used in combination with FPs [fingerprints], then it has a beneficial impact on both verification and identification in terms of speed and security leading to lower false rejection rate and reduction in number of FPs enrolled”. In addition, the Study has concluded that the use of facial identification alone is an option to be considered for EES and RTP.

That said, concerns regarding security should not overshadow the fact that biometric data are personal data. In fact, fingerprints can be qualified as sensitive data insofar as they can reveal ethnic information about the individual.

Therefore, biometric data can only be processed if there is a legal basis and the processing is adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed. In this context, the purpose limitation is a paramount principle. The definition of the purpose for which the biometric data are collected and subsequently processed is therefore a prerequisite to their subsequent use.

In parallel, the accuracy, data retention and data minimisation principles have to be considered, as the data collected should be precise, proportionate and kept for no longer than is necessary for the purposes for which it was first collected.

Besides, the processing of biometric data must rest on a legitimate legal ground, such as the consent of the data subject, which must be freely given, specific and fully informed. The performance of a contract, compliance with a legal obligation and the pursuit of the legitimate interests of the data controller may also constitute legal grounds to that effect.

It must be noted that the processing of biometric data raises these and other important privacy and data protection concerns that, more often than not, are not acknowledged by the public.

To start with, biometric data in general, and fingerprint data in particular, are irrevocable due to their stability over time. This makes potential data breaches all the more dangerous.

In addition, the highly complex technologies which electronically read and process biometric data, and the diverse methods and systems employed in their collection, processing and storage, cannot ensure full accuracy, even though fingerprints do present a high level of precision. In fact, low quality of the data or of the extraction algorithms may lead to wrongful results and, therefore, to false rejections or false matches. This might have adverse consequences for individuals, namely regarding the irreversibility of decisions taken based on a wrong identification.
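The trade-off between false rejections and false matches can be illustrated with a toy example. A biometric matcher typically reduces each comparison to a similarity score and applies a decision threshold; the scores below are made-up sample data, not output of any real matching algorithm.

```python
# Hypothetical similarity scores (0 = no match, 1 = perfect match).
genuine_scores = [0.91, 0.88, 0.62, 0.95, 0.79]   # same-person comparisons
impostor_scores = [0.15, 0.42, 0.68, 0.33, 0.21]  # different-person comparisons

def error_rates(threshold):
    """Return (false-rejection rate, false-match rate) at a given threshold."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

for t in (0.5, 0.7):
    frr, far = error_rates(t)
    print(f"threshold={t}: false rejections={frr:.0%}, false matches={far:.0%}")
```

Lowering the threshold lets more impostors through, while raising it turns away more genuine travellers; no threshold eliminates both errors at once, which is why a non-zero error rate is inherent to any large-scale biometric system.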

Moreover, the risks associated with the storage of biometric data, and its possible linking with other databases, raise concerns about the security of the data and about uses incompatible with the purposes which initially justified the processing.

That said, we will have to wait for the results of the Pilot Project being developed by the eu-LISA Agency [7], which is expected to be completed during 2015, in order to verify the feasibility of the options identified in the Technical Study.

References

1. Copyright by Frettie under the Creative Commons Attribution 3.0 Unported
2. The concept refers to metric related to human features, i.e., to elements which are specific to the physical or psychological identity of a person and, therefore, allow to identify that person. Physiological biometrics, which we are specifically considering in this context, refer to human characteristics and traits, such as face, fingerprints, eye retina, iris and voice.
3. Technologies which are able to electronically read and process biometric data, in order to identify and recognize individuals.
4. Eurodac is a large database of fingerprints of applicants for asylum and illegal immigrants found within the EU. The database helps the effective application of the Dublin convention on handling claims for asylum.
5. The Visa Information System, which ‘VIS’ stands for, allows Schengen States to exchange visa data.
6. The Schengen Information System, which ‘SIS’ stands for, is the largest information system for public security in Europe.
7. The acronym stands for Agency for the Operational Management of large-scale IT Systems in the area of Freedom, Security and Justice.

The ‘risk-based’ approach to Data Protection, too risky for SMEs?

Balance is hard, very hard.


For those businesses which collect, process and exploit personal data, the draft of Chapter IV of the forthcoming EU General Data Protection Regulation is particularly relevant as it foresees the possible future compliance obligations of data controllers and data processors.

Considering the latest position of the Council of the European Union regarding this chapter, a ‘risk-based’ approach to compliance is a core element of the accountability principle itself (see Article 22 of the Council’s document).

In fact, the Article 29 Working Party (which gathers a representative of the supervisory authority designated by each EU Member State, a representative of the authority established for the EU institutions and bodies, and a representative of the European Commission) recently issued a statement supporting a ‘risk-based’ approach in the EU data protection legal framework.

But what is meant by a ‘risk-based’ approach?

It mainly refers to the consideration of any potential adverse effects associated with the processing and implies different levels of accountability obligations of data controllers, depending on the risks involved within each specific processing activity. It is therefore quite different from the ‘one size fits all‘ approach, as initially proposed by the European Commission.

In this context, the respect and protection of data subjects’ rights (for instance, the rights of access, objection, rectification and erasure, and the rights to transparency, to data portability and to be forgotten) shall be guaranteed throughout the data processing activities, regardless of the level of risk involved in those activities.

However, principles such as legitimacy, transparency, data minimisation, data accuracy, purpose limitation and data integrity, and the compliance obligations incumbent upon controllers, shall be proportionate to the nature, scope, context and purposes of the processing.

This ‘risk-based’ approach is developed throughout Chapter IV, namely in the provisions related to the data protection by design principle (Article 23), the obligation of documentation (Article 28), the obligation of security (Article 30), the obligation to carry out an impact assessment (Article 33), and the use of certification and codes of conduct (Articles 38 and 39).

These accountability obligations, in each phase of the processing, will vary according to the type of processing and the risks to privacy and to other rights and freedoms of individuals.

In this context, the proportionality exercise will have an effect on the requirements of privacy by design (Article 23), which consists in assessing the potential risks of the data processing and implementing suitable privacy and data protection tools and measures to address those risks before initiating the processing activities.

Besides, the introduction of the ‘risk-based’ approach is also likely to be relevant for controllers not established in the EU, as they will most likely not be required to designate a representative in the EU for occasional processing activities which are unlikely to result in a risk to the rights and freedoms of individuals (Article 25).

Moreover, a ‘risk-based’ approach will also be implemented regarding the security of the processing, as technical and organisational measures adequate to the likelihood and severity of the risk to the rights and freedoms of individuals shall be adopted (Article 30).

In parallel, it is foreseen that the obligation to report data breaches will be restricted to breaches which are likely to result in a high risk to the rights and freedoms of individuals. In this context, if the compromised data is encrypted, for instance, the data controller will not be required to report a verified breach (Articles 31 and 32).

The weighing assessment is also expected to be relevant for the data protection impact assessment (Article 33), required for processing activities that are likely to result in a ‘high risk’ to the rights and freedoms of individuals, such as discrimination, identity theft, fraud or financial loss.

Another important requirement is the consultation of a Data Protection Authority prior to the processing of personal data when the impact assessment indicates that the processing would result in a high degree of risk in the absence of measures taken by the controller to mitigate it (Article 34).

Of course, “nothing is agreed until everything is agreed”, and this chapter will be subject to further revisions. There is, indeed, ample room for improvement.

For instance, it is questionable whether a ‘risk-based’ approach actually makes data protection standards stronger, considering the inadequacy of risk assessment methodologies when it comes to fundamental rights.

In parallel, the definition of ‘high risk’ is still too broad, covering almost all businesses operating online. Similarly, the impact assessment process presents itself as complex, burdensome and costly. At the current state of play, small businesses and start-ups are the most likely to be negatively affected by the administrative and financial burden that some of the abovementioned provisions will entail. This is quite ironic, considering that precisely this concern is at the core of the understanding that SMEs should be exempted from the obligation to appoint a Data Protection Officer.

However, it is important for businesses to try to anticipate how the compliance requirements will be set in the future in order to be prepared for their implementation.

We will see in due time how onerous the regime will be. Whilst we do not know the exact content of the text that will eventually be adopted, it is evident now that substantive accountability obligations will be imposed upon businesses handling personal data.


The match of the year: Right to be Forgotten vs Right to know

Round 1, Fight!


As is well known, the ‘right to be forgotten’ ruling extended the possibilities foreseen under the current EU Data Protection Directive for data subjects to exercise their rights to erasure of data and to object to the processing of personal data against search engine service providers, which were deemed to be controllers.

Therefore, when facing a deletion request, search engines will have to weigh the rights at stake, namely freedom of expression and the right to privacy, assessing whether it is in the public interest for the information indexed in their search results to remain.

From the very beginning, public opinion buzzed with both enthusiasm and concern. The main questions were: how would the decision be enforced? Isn’t the removal of links to lawful and accurate information damaging to freedom of speech and the right of access to information? The debate has been most vivacious between free speech advocates and privacy campaigners and hasn’t faded away with the passage of time. The former insist that the ruling will lead to a whitewashing of the past, whereas the latter uphold that it will enable individuals to limit the visibility of some personal information.

Google, despite affirming that the enforcement of the ruling could hamper free speech, warning of the potential for abuse by those seeking the deletion of important information, and complaining that the ruling’s requirements for compliance were vague and subjective, started dealing (efficiently?) with the astonishing number of requests for the suppression of links it received, rejecting some and accepting others.

In fact, Google says it has received approximately 143,000 take-down requests, relating to 491,000 links, in the last five months, involving everything from serious criminal records to embarrassing photos and negative press stories. According to the data revealed by Google itself, the company has refused about 30 per cent of the demands, and about 50 per cent of the links were taken down. According to its online transparency report, Google has removed more links to content on Facebook from its search results than from any other site. In this regard, Reputation VIP (the company behind Forget.me, the first “Right To Be Forgotten” removal service) outlined that, ironically, most requests do not refer to unflattering or inaccurate web pages written by third parties but, instead, to content authored by the requester.

Google even set up an advisory committee to handle the requests. This council is headed by the company’s executive chairman, Eric Schmidt, and chief legal officer, David Drummond, and includes academics, technologists, legal experts and a journalist.

Most recently, Google decided to launch a public debate regarding the balance to be achieved between a person’s right to be forgotten and the public’s right to information. To that end, it organized a grand tour of hearings across Europe and has been on the road for about a month now.

The good intentions beneath this initiative failed to convince everyone. For instance, Isabelle Falque-Pierrotin, who heads the Article 29 Working Party, which gathers all 28 EU national data protection authorities, didn’t hesitate to share her scepticism about the Google initiative, which she described as part of a “PR war”:

Google is trying to set the terms of the debate. They want to be seen as being open and virtuous, but they handpicked the members of the council, will control who is in the audience, and what comes out of the meetings.

Although I do not share such a pessimistic view of the initiative, I do have some doubts regarding the openness and transparency it is intended to provide. Indeed, when the public debate was first announced, I expected that it would allow for a better understanding of Google’s current processes for dealing with requests. But, as far as I am aware, the hearings have centred on abstract and rather philosophical discussions.

Considering the ongoing negotiations regarding the EU data protection reform, already well advanced, the question that should be asked is: how much can the ruling and Google’s efforts in fact influence the direction of the discussions?

According to the European Commission’s initial proposal, the right to be forgotten would be built on the right to erasure of personal data and the right to object to data processing operations, which already exist under the current Data Protection Directive. Therefore, the data subject could exercise the right against the original data controller when and if: the data is no longer necessary; consent is withdrawn or when the storage period has expired; the data subject objects to the processing on specified grounds; or the processing is no longer valid on some other ground. Freedom of expression was among the exemptions foreseen.

The European Parliament was quite favourable to this proposal, having voted its opinion last spring. However, it ensured that the right could also be exercised directly against third parties, and added the possibility of exercising the right following an order by a court or regulatory authority.

The Council of the European Union had already discussed the issue but decided to suspend the respective debates in order to await the CJEU’s ruling. However, negotiations regarding other issues of the reform kept going, and Member States have even agreed on a partial general approach since then.

A subsequent statement issued by the Italian Presidency made clear that the provision concerning the right to erasure would take into account the principles set out by the CJEU. Indeed, the recently issued revised version left no doubt about it.

I find this utterly confusing, as it is for the Council of the European Union and the European Parliament, as co-legislators, to make the law as it will stand in the future, and for the CJEU to interpret the law as it exists. To take the judicial interpretation of the law we are about to replace into account when defining the upcoming legislation is, in my opinion, quite puzzling. The ruling should not dictate the content or drafting of the future Regulation.

Nevertheless, something has to be done regarding the enforcement of the ruling. As things stand at the moment, it has been up to Google to determine the balance between the conflicting interests at stake. The criteria as defined by the CJEU are undoubtedly insufficient.

And if the ruling is to be taken into account in the upcoming legislation at all, that legislation most certainly has to address the scope of the right to be forgotten, the grounds on which it can be exercised, and the need to balance this right with the freedom of information, as the judgement itself does not establish with rigour how it shall be applied in practice.

In this context, it must be noted that the regulation has a horizontal nature and is thus intended to apply to all controllers, regardless of their nature. Search engines are not the specific target of the future legislation, although, as controllers, they are covered by its scope.

Regarding the scope, one may wonder if the distinction made by the European Commission between personal data which have been initially disclosed or uploaded by the data subject and the personal data which have been disclosed by third-parties will be kept.

Moreover, as there seems to be no doubt that search engines, now considered controllers, may receive deletion requests, it is important to clarify the position of providers of social media, such as Facebook, where it is possible to argue that the processing is based on consent or a contract.

As for the grounds on which the right can be exercised, I think it won’t be easy to determine who will be required to conduct the assessment of whether the initially lawful processing of accurate data has become unnecessary, inadequate, irrelevant or no longer relevant, or excessive in the light of the purposes for which the data were collected or processed and of the time that has elapsed. Who is better suited for that role: the search engine or the original controller?

In this context, one cannot assume that, if the initial processing is lawful, the second processing is also lawful. There might be cases where the two assessments reach different outcomes. What then?

Furthermore, should requests for deletion be addressed directly to the controller? Should they be addressed, instead, to the supervisory authority? Or to the competent courts? And if so, which court would be the competent one?

In addition, should the data subject have the right to choose any of the controllers to exercise the right to be forgotten and erasure? I believe that, at least theoretically, it should be possible for the data subject to exercise the rights against the processing carried out by the search engine before, after or independently from exercising the same or other rights against the original controller. But one should bear in mind that it is quite unrealistic to ask operators of search engines to track information and replication of data across the web.

As we can see, many questions are yet to find their answers.

The most popular is:

How will the right to the protection of personal data be fairly articulated with the right to freedom of expression?

Understandably, certain Member States have shown legitimate concerns regarding the freedom of expression and the interest of the public at large in having access to information, which may end up being underweighted in the balancing process. So the debates are currently ongoing.

One of the big issues at stake is that, according to the spirit of the founding treaties, the reconciliation of the right to the protection of personal data with the freedom of expression should remain within Member States’ legislative power. This implies that the European co-legislative institutions, the Council of the European Union and the European Parliament, are not entitled to regulate this matter in detail. However, if it is up to Member States to reconcile the two potentially conflicting rights, neither harmonisation nor a uniform application of the law is ensured.

In this context, it will be important to delineate the concepts of ‘public interest’ and ‘public figure’, whose scope has not been satisfactorily developed in data protection law, given how swiftly the digital era has evolved.

Moreover, it will be important to establish that bloggers and individuals generally expressing themselves online fall within the scope of the ‘freedom of expression’ exception, even if they are not professional journalists. After all, Article 11 of the Charter of Fundamental Rights of the European Union establishes that everyone has the right to freedom of expression, including the freedom to hold opinions and to receive and impart information and ideas, and enshrines the freedom and pluralism of the media.

On another level, and as is well known, Google has been systematically alerting websites when it cuts links to their pages from results presented for searches on a person’s name, which is in line with the European Commission’s proposal. But should search engines be barred from informing publishers, as Google has been doing, when articles have been delisted from search results? Are there cases where it would be appropriate to involve a publisher? Which ones?

These notifications are mostly problematic due to the possibility of republication, which could cause additional harm or distress to the data subject. And indeed, they often lead to the republication of a version which indicates what URLs are being removed from the search index.

In my opinion, it is preferable for the data subject that the search engine, as a second controller, contacts the controller which first published the information (the original controller), as otherwise it might not always be easy to establish the correct balance.

In parallel, Google has unilaterally restricted the deletion of internet links to European websites only, for instance Google.es, Google.de, Google.uk… Well, you get the idea… But shouldn’t the removal be global, considering the very nature of the Internet? Shouldn’t links be removed from all versions of Google, such as Google.com? This is particularly important considering that most European users of the search engine use local domains rather than google.com.

The Justice and Home Affairs Council gathered in Luxembourg, on the 10th of October, to discuss the regulation and directive. A partial general approach on Chapter IV of the general data protection regulation, which deals with the obligations of data controllers and processors, was agreed. There is, nevertheless, still plenty to be agreed on, so one may wonder whether the deadline established by the incoming European Commission President Jean-Claude Juncker for the end of negotiations – within six months of the Commission starting work – will be met.

Meanwhile, the Article 29 Working Party is preparing guidelines which will set out a common approach to dealing with the different types of complaints coming in from citizens. To that end, it has met with media and search engine companies, including Google, Microsoft and Yahoo, to gather their views on how to strike a balance between freedom of information and privacy. The guidelines are expected to be finalized by the end of November.

Considering the current state of play, let’s hope that some thorny questions will have been answered by then…

The Google Affair: Forget Me, Forget Me Not – Part I

Am I forgetting something?


The ruling better known as the ‘right to be forgotten’ decision is unlikely to be forgotten any time soon.

To make a long story short, the ECJ ruled that the operator of a search engine – in that particular case, Google – is obliged to remove, from the list of search results displayed following a search made on the basis of a person’s name, links to web pages published by third parties containing information relating to that person, even when the publication is otherwise lawful and the information factually correct.

I understand (and welcome) the motivation behind the judgment. It is comforting to know that individuals may be able to move on from their past where there is no legitimate, and consequently stronger, public interest involved. As it is very difficult to restrict our lives to the offline world, Google and search engines in general are nowadays the primary instrument for finding information about everything and anyone, to the point that ‘google’ is commonly used as a verb referring to an online search conducted on that search engine. Without search engines, information would not, in the vast majority of cases, be so easily accessible.

And that is certainly the main issue at stake. As stated by the Court:

(…) processing of personal data, such as that at issue in the main proceedings, carried out by the operator of a search engine is liable to affect significantly the fundamental rights to privacy and to the protection of personal data when the search by means of that engine is carried out on the basis of an individual’s name, since that processing enables any internet user to obtain through the list of results a structured overview of the information relating to that individual that can be found on the internet — information which potentially concerns a vast number of aspects of his private life and which, without the search engine, could not have been interconnected or could have been only with great difficulty — and thereby to establish a more or less detailed profile of him. (paragraph 80 of the ruling)

So a ‘right to be forgotten’ might as well be a necessary principle in an unforgetting world.

But, truth be told, the principle of giving citizens more control over their personal data – including its deletion – doesn’t come as the new right in town. It existed in several Member States’ legislation long before the ECJ’s decision, although without such a sticky nickname. Furthermore, it is foreseen in Directive 95/46 on data protection, which remains applicable to the Internet as it has developed since. And a related provision is expected to be included in the General Data Protection Regulation, intended to replace Directive 95/46/EC, currently being negotiated at the Council of the European Union. The principle of individuals being able to move on from their past is therefore nothing original.

What was indeed a surprise, and not a good one I must admit, was the conclusion that Google, while processing information published by third parties, acts and is liable as a controller, and shall delete links to articles lawfully published by third parties, even though those articles can remain available on the latter’s websites.

One must admit that this conclusion is in line with the obligations that Article 6(2) of the Directive imposes on controllers when processing personal data: the controller shall weigh its own interests against those of third parties and those of the data subject.

That conclusion is also in line with the observations of the Court in the ASNEF and FECEMD ruling (Joined Cases C 468/10 and C 469/10 ASNEF and FECEMD [2011] ECR I 0000, paragraphs 44–45) regarding the relevance of the data’s previous appearance in public sources.

And disregarding all the questions (as mentioned here and here) that unavoidably arise when a search engine is considered a ‘controller’, the ECJ tried to establish criteria for the removal of links and enshrined a balance to be achieved to that end.

Therefore, search engines shall erase data deemed inadequate, irrelevant or no longer relevant, or excessive in the light of the time that has elapsed (paragraph 94 of the ruling). According to the ECJ, even accurate data, despite being lawfully published initially, can “in the course of time become incompatible with the directive” (paragraph 93). These key criteria – “inadequate, irrelevant or no longer relevant, excessive in relation to the purposes of the processing” – shall be balanced against the economic interests of the operator of the search engine and the interest of the general public in finding that information upon a search relating to the data subject’s name (paragraph 97).

However, as rightly pointed out by the ECJ:

(…) that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question. (paragraph 97 of the ruling)

The preponderant public interest is consequently the ultimate test for ensuring that information which ought to remain widely available is not removed.

I personally doubt that the Court succeeded in defining the rigorous criteria necessary to achieve a fair balance between the rights to privacy and personal data protection and the freedoms of expression and information. In fact, the ECJ provides very little legal certainty, as it merely portrays an unclear balancing act between the rights of the data subject, the search engine’s economic interests and the public interest, to be determined case by case.

And this can have very serious repercussions considering that the removal of links from the list of results can have effects upon the legitimate interest of the publishers of the information and of internet users potentially interested in having access to that information.

As pointed out by the Advocate General:

The data protection problem at the heart of the present litigation only appears if an internet user types the data subject’s name and surnames into the search engine, thereby being given a link to the newspaper’s web pages with the contested announcements. In such a situation the internet user is actively using his right to receive information concerning the data subject from public sources for reasons known only to him. (paragraph 130 of the Opinion)

And he added:

In contemporary information society, the right to search information published on the internet by means of search engines is one of the most important ways to exercise that fundamental right. This right undoubtedly covers the right to seek information relating to other individuals that is, in principle, protected by the right to private life such as information on the internet relating to an individual’s activities as a businessman or politician. An internet user’s right to information would be compromised if his search for information concerning an individual did not generate search results providing a truthful reflection of the relevant web pages but a ‘bowdlerized’ version thereof. (paragraph 131 of the Opinion)

It most certainly doesn’t help that the ECJ establishes that the data subject’s rights protected by Articles 7 and 8 of the Charter override, as a general rule, that interest of internet users (paragraph 81 of the ruling). Does this mean that the right to privacy is the general principle, subject only to sporadic exemptions represented by the exceptional precedence of the freedom of expression and the right to information?

This priority does not seem to be established either in the Charter or in the European Convention on Human Rights. It is only to be expected that a data subject’s right to the protection of his private life must be balanced against other fundamental rights, namely the freedoms of expression and information. Additionally, the conclusion that inclusion in a search engine’s list of results constitutes a more significant interference with the data subject’s fundamental right to privacy than the publication on the web page itself (paragraph 87 of the ruling) seems to forget that, nowadays, freedom of expression is not just about the right to publish, but also about being found online. We have reached a historic moment where not being found through a search engine and not existing online are pretty much the same.

Therefore, the fact that the ruling applies to search engines but not to news websites or other journalistic activities, due to the exemptions for the ‘media’ under European data protection law, so that the original information itself can remain online, does not represent a satisfactory guarantee of the right to information at a time when people have access to both newsworthy information and publishing tools.

Considering the challenges at stake, it is very unlikely that a search engine service provider will satisfactorily balance all these conflicting interests. The general fear that the implementation of this ruling will entail the sacrifice of freedom of expression and information may not be mere alarmism.

It is worth questioning whether and how this ruling will affect the negotiations for a General Data Protection Regulation currently ongoing at the Council of the European Union. The Italian Presidency circulated to the DAPIX working group a note entitled ‘Right to be forgotten and the Google judgment’ to examine how the future legislation on the right to deletion should be developed.

This raises some confusion, considering that it is for the Council of the European Union and the European Parliament, as co-legislators, to make the law as it will stand in the future, and for the ECJ to interpret the law as it exists.


The Google Affair: To be or not to be a Data Controller – Part II

Data controller or Data processor?


With an evident and unfortunate confusion between the roles of ‘data controllers’ and ‘data processors’ (as analysed previously), the European Court of Justice (ECJ) qualifies Google, in its famous ruling, as a data controller, even though it merely reproduces existing information.

This has far-reaching implications and raises a number of complex and problematic issues. In this post, we will only be able to address some of the questions.

Indeed, considering search engine service providers as data controllers for the purposes of EU law entails that, when they process personal data, they are obliged to respect the rights and freedoms of data subjects, to comply with the set of principles and obligations foreseen in Directive 95/46, and to face liability should they fail to do so.

This obviously forces a shift in search engine service providers’ responsibilities towards the individuals whose information they process.

A very pertinent and practical question must be asked: how can search engine service providers fulfil the obligations incumbent on controllers, as provided in Articles 6, 7 and 8 of the Directive, in relation to personal data sourced from web pages hosted on third-party servers?

If an internet search engine service provider is to be considered a controller, it must guarantee that the personal data processed is adequate, relevant and not excessive in relation to the purposes for which it was collected, kept up to date, and retained for no longer than is necessary for those purposes.

Considering that the search engine merely locates information made available by third parties, indexes it automatically and makes it accessible to internet users according to a particular order of preference, it remains to be explained how a search engine service provider will be able to appraise its compliance with those requirements. Isn’t the publisher of the website concerned in a better position to conduct that assessment?

As to the criteria concerning the legitimacy of processing data made available on the internet, including personal data, in the absence of the data subject’s consent, it is unquestionable that internet search engines serve legitimate interests. Indeed, they allow easy and quick access to information and contribute to the dissemination of the information uploaded on the internet.

But what happens when the search engine processes information falling within a special category of data (e.g. personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and data concerning health or sex life), the processing of which is prohibited unless, for instance, the data subject has given his explicit consent?

It is important to reflect on the hypothesis raised by the Advocate General in his Opinion:

if internet search engine service providers were considered as controllers of the personal data on third-party source web pages and if on any of these pages there would be ‘special categories of data’ referred to in Article 8 of the Directive (e.g. personal data revealing political opinions or religious beliefs or data concerning the health or sex life of individuals), the activity of the internet search engine service provider would automatically become illegal, when the stringent conditions laid down in that article for the processing of such data were not met.

Would the processing of special categories of data by search engines be deemed to be illegal if the requirements for the processing of such data on third-party source web pages were not met? Is that even a conceivable scenario?

Additionally, considering that, in order to be lawful, and if no other criterion is applicable, the processing of personal data must be carried out with the consent of the data subject, it is only legitimate to question how search engine service providers can ensure the consent of data subjects with whom they have never been in contact.

Furthermore, one must wonder about the effectiveness of the exercise of the data subjects’ right of access to data, as foreseen in the abovementioned directive, and about Google’s capability to satisfactorily comply with this obligation, given that its caching of webpages for relevance-based index ranking relies on an algorithm that performs the content analysis automatically. Given this notion of relevance, for example, it will be impossible in practice to distinguish between people who share the same name.

I guess we all just have to wait to see how all the implications will work in practice…


© 2018 The Public Privacy
