Tag: Data Protection Regulation

Bits and pieces of issues regarding the happy sharing of your children’s lives on Facebook

It’s just a picture of them playing, they don’t mind. Like!

As in other EU Member States, Portuguese courts have been struggling to apply traditional legal concepts to the online context. Just recently, in a decision which I addressed here, a court held that a person in possession of a video containing intimate images of an ex-partner is under an obligation to guard it properly, and that the failure to take adequate safeguards is a legally relevant omission.

That said, there is one particular decision, issued by a Portuguese appeal court last year, which I failed to address in due time and which concerns the image rights of children in the online context. Considering the number of pictures that appear on my Facebook wall every time I log into my account, and the concerns expressed in the upcoming GDPR regarding the collection and processing of data referring to minors under sixteen, I would like to address it today.

The appeal court confirmed the decision of the court of first instance, issued within proceedings regulating the parental responsibilities of each parent, which forbade a separated couple from divulging on social media platforms pictures or information identifying their twelve-year-old daughter. It sternly stated that children are not things or objects belonging to their parents.

One would have expected that a court decision would not be necessary to reach the conclusion that children have the right to have their privacy and image respected and safeguarded, even against acts practised by their parents. In fact, one would hope that, in the online context, and considering their specific vulnerability and the particular dangers facilitated by the medium of the Internet, their protection would be ensured primarily by their parents.

Ironically, the link to the news about this court decision was widely shared among my Facebook friends, most of whom have children of their own. The same ones who happily share pictures of their own kids. And who have not shared any less information about their children since.

It is funny how some people get offended or upset when someone posts a picture online in which they look unflattering or which embarrasses them, and are quick to demand its removal, yet never wonder whether it is ethical to publish a picture of, or information about, someone who is not able to give his or her consent. Shouldn’t we ask what kind of information children – our own, our friends’, our little cousins or nephews – would want to find about themselves online in the future?

Every time I log into my Facebook account, there is an array of pictures of birthday parties, afternoons by the sea, first days at school, promenades in the park, playtimes in the swimming pool, displays of leisure activities such as playing musical instruments or practising a sport… In one particular case, it was divulged that a child had a serious illness, which has fortunately since been overcome, but which received full graphic and descriptive coverage on Facebook as it unfolded.

I have seen pictures in which my friends’ children appear almost naked or in unflattering poses, or from which it is clearly identifiable where they go to school or spend holidays. Many identify their children by name, age, school attended, extracurricular activities… In any case, their parenthood is quite well established. I always think that, in the long run, this would permit the building of an extensive and detailed profile by anybody who has access to such data. And, if you have read any of my other posts, you know by now that I am not exactly referring to the Facebook friends.

More worryingly, these details about children’s lives are often displayed on the parents’ online profiles without any privacy settings being implemented, perhaps out of simple distraction or unawareness. Consequently, anybody with a Facebook account can search for the person in question and access all the information contained on that profile.

I do not want to sound like a killjoy, a prude or a moralist. I get it, seriously, I do. A child is the biggest love, and it is only human to want to proudly share their growth, development and achievements with relatives and friends. It has always been done, and now it is almost effortless and immediate, just a click away. In this regard, by forbidding the sharing of any picture or any information regarding children, the abovementioned decision seems excessive and unrealistic.

Nevertheless, one should not forget that good sense and responsibility are particularly required in the online context, considering how easy it is to lose control over the purposes to which published information is put beyond the initial ones. As many seem to forget, once uploaded to an online platform, information is no longer within our reach, as it can easily be copied or downloaded by others.

That said, while it is certainly impossible to secure anonymity online, the amount of information published should be controlled for security, privacy and data protection purposes.

Anyway, this common practice of parents sharing pictures and information about their children online makes me wonder how companies such as Facebook, and other platforms focused on user-generated content – which process data at the direction of the user and, consequently, unintentionally collect and process personal data of children below the age of sixteen – may be expected to comply with the new requirements of the GDPR in that regard.

If such processing is only lawful if and to the extent that consent is given or authorised by the holder of parental responsibility, and if, as the Portuguese court has understood it, parents are not entitled to dispose of their children’s image on social media, a funny conundrum arises. If parents cannot publish such information themselves, they cannot authorise its publication either and, consequently, children and teenagers will not be able to rely on their parents’ authorisation to use social media.

Practical difficulties of the GDPR – the ‘right to be forgotten’ applied to online social platforms

Of all the legal challenges that the GDPR will present for businesses in general, I would like to address in this post the issues raised by its implementation with regard to social network platforms, which are quite popular nowadays.

Article 17 of the GDPR establishes the ‘right to erasure’ – or the right to be forgotten, as it has come to be referred to – which provides data subjects with the right to require data controllers to erase the personal data held about them, and the consequent obligation of the controller to comply with that request, without undue delay, when certain conditions are fulfilled.

Considering that infringing the ‘right to erasure’ may lead to significant economic sanctions, there is a risk that social platforms will be tempted to adopt a preventive approach by complying with all deletion requests regardless of their validity, thus erasing content on unfounded grounds. This is particularly worrisome because it may directly lead to the suppression of free speech online. Indeed, online businesses are not and should not be deemed competent to assess the legitimacy of such claims, a point that I have already tried to make here.

While it seems that a notice-and-take-down mechanism is envisaged, without much detail being provided as to its practical enforceability, a particular issue in this context concerns the entities upon which such an obligation falls. Indeed, compliance with the ‘right to be forgotten’ can only be required of those who qualify as data controllers.

As data controllers are defined as the entities which determine the purposes and means of the processing of personal data, it is not clear whether the providers of online social platforms can be defined as such.

Considering the well-known Google Spain case, it is at least certain that search engines are deemed to be controllers in this regard. As you may remember, the CJEU ruled that individuals, provided that certain prerequisites are met, have the right to require search engines, such as Google, to remove certain results about them subsequently presented in a search based on a person’s name.

That said, it is questionable whether hosting platforms and online social networks focused on user-generated content, as is the case of Facebook, qualify as such, considering that the data processed depends on the actions of the users who upload the relevant information. On that reasoning, the users themselves would qualify as controllers. The language of Recital 15 of the GDPR on social networking is inconclusive in this regard.

The abovementioned Recital provides as follows:

This Regulation should not apply to processing of personal data by a natural person in the course of a purely personal or household activity and thus without a connection with a professional or commercial activity. Personal and household activities could include correspondence and the holding of addresses, or social networking and on-line activity undertaken within the context of such personal and household activities. However, this Regulation should apply to controllers or processors which provide the means for processing personal data for such personal or household activities.

This is not an irrelevant issue, though. In practice, it would amount to enabling someone to require, and effectively compel, Twitter or Facebook to delete information about him or her even though it was provided by others.

And considering that any legal instrument is only as efficient in practice as it is capable of being enforced, the definition of who is covered by it and ought to comply with it is unquestionably paramount.

As I remember reading elsewhere – I cannot recall where, unfortunately – some have wondered whether the intermediary liability regime foreseen in the e-Commerce Directive would be an appropriate mechanism for the enforcement of the right to erasure/right to be forgotten.

Articles 12-14 of the e-Commerce Directive indeed exempt information society services from liability under specific circumstances, namely when they act as a ‘mere conduit’ of information, or engage in ‘caching’ (the automatic, intermediate and temporary storage of information), or when ‘hosting’ (i.e., storing information at the request of a recipient of the service).

Article 15 establishes that online intermediaries are under no general duty to monitor or actively seek facts indicating illegal activity on their websites.

Taking into account the general liability regime for online intermediaries foreseen in the e-Commerce Directive (Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market), a distinction will perhaps apply according to the level of ‘activity’ or ‘passivity’ of the platforms in the management of the content provided by their users.

However, this liability regime does not fully clarify the extent of the erasure obligation. Will it be proportionate to the degree of ‘activity’ or ‘passivity’ of the service provider in regard to the content?

Moreover, it is not clear how both regimes can be applied simultaneously. While the GDPR does not provide for any notice-and-take-down mechanism and expressly states that its application is without prejudice to the e-Commerce Directive liability rules, the fact is that the GDPR only imposes the ‘duty of erasure’ on controllers. As the intermediary liability rules concern accountability for the activities of third parties, reconciling the two regimes will not be easy.

All things considered, the much-awaited GDPR has not even entered into force yet, but I already cannot wait for the next chapters.

The ‘Safe Harbor’ Decision ruled invalid by the CJEU

Safe harbor?!? Not anymore.


Unfortunately, I had not had the time to address the ruling issued by the CJEU last October, by which the ‘Safe Harbour’ scheme enabling transatlantic transfers of data from the EU to the US was deemed invalid.

However, due to its importance, and because this blog is primarily intended to be about privacy and data protection, it would be a shame to finish the year without addressing the issue.

As you may be well aware, Article 25(1) of Directive 95/46 establishes that the transfer of personal data from an EU Member State to a third country may occur provided that the latter ensures an adequate level of protection. According to Article 25(6) of the abovementioned Directive, the EU Commission may find that a third country ensures an adequate level of protection (i.e., a level of protection of fundamental rights essentially equivalent to that guaranteed within the EU under the Directive read in the light of the Charter of Fundamental Rights) by reason of its domestic law or of its international commitments.

Thus said, the EU Commission adopted its Decision 2000/520, by which it concluded that the “Safe Harbour Principles” issued by the US Department of Commerce ensure an adequate level of protection for personal data transferred from the EU to companies established in the US.

Accordingly, under this framework, Facebook has been transferring the data provided by its users residing in the EU from its subsidiary in Ireland to its servers located in the US, for further processing.

These transfers and, unavoidably, the Decision itself were challenged in a reference to the CJEU (judgment in Case C-362/14) following the complaint filed by Max Schrems, a Facebook user, before the Irish DPA and subsequently before the Irish High Court. The main argument was that, considering the access to electronic communications conducted by its public authorities, the US did not ensure adequate protection of the personal data thus transferred.

According to the AG’s opinion, “the access enjoyed by the United States intelligence services to the transferred data constitutes an interference with the right to respect for private life and the right to protection of personal data”.

Despite considering that a third country cannot be required to ensure a level of protection identical to that guaranteed in the EU, the CJEU found that the decision fails to comply with the requirements established in Article 25(6) of the Directive and that the Commission did not make a proper finding of adequacy but merely examined the safe harbour scheme.

The facts that the scheme’s ambit is restricted to adhering US companies, thus excluding public authorities, and that national security, public interest and law enforcement requirements – to which US companies are also bound – prevail over the safe harbour principles were deemed particularly decisive in the assessment of the scheme’s validity.

In practice, this amounts to enabling the US authorities to access personal data transferred from the EU to the US and to process it in ways incompatible with the purposes for which it was transferred, beyond what is strictly necessary and proportionate to the protection of national security.

As a result, the Court concluded that enabling public authorities to have access on a generalised basis to the content of electronic communications must be regarded as compromising the essence of the fundamental right to respect for private life.

The Court stated that the decision disregards the existence of such an interference with fundamental rights, and that the absence of limitations and of effective legal remedies violates the fundamental right to effective judicial protection.

Upon issuance of this ruling, the Art29WP met and concluded that data transfers from the EU to the US could no longer be legitimized by the ‘Safe Harbor’ decision and, if occurring, would be unlawful.

While its practical implications remain unclear, the ruling undoubtedly means that companies relying on the ‘Safe Harbor’ framework for the transfer of personal data from the EU to the US need to rely, instead, on another basis.

In this regard, considering that not all Member States accept the consent of the data subject or an adequacy self-assessment as a legitimizing legal ground for such cross-border transfers, Model Contractual Clauses incorporated into contracts and Binding Corporate Rules (BCR) for intragroup transfers seem to be the most reliable alternatives in certain cases.

Restrictions on data transfers are obviously also foreseen in the GDPR, which, besides BCRs, Standard Contracts and adequacy decisions, includes new data transfer mechanisms such as certification schemes.

You can find the complete version of the ruling here.

The General Data Protection Regulation – Start the countdown!

Start the countdown.


After years and years of lengthy drafting and negotiating, the European Commission, the European Parliament and the EU Council, following the final negotiations between the three institutions (the so-called “trilogue negotiations”), have at last reached a political agreement on the data protection reform package, which includes the General Data Protection Regulation (“GDPR”) and the Data Protection Directive for the police and criminal justice sector; the Civil Liberties (LIBE) Committee of the European Parliament also approved the text on 17 December.

Formal adoption by the European Parliament and the EU Council is still required, though, currently foreseen for the beginning of 2016.

At this pace, and optimistically, the Regulation will finally be published sometime in mid-2016.

So let the countdown begin…

References

1. Copyright by Julian Lim under the Creative Commons Attribution 2.0 Generic

Game of drones or the not so playful side of the use of RPAS for recreational purposes

I am watching you.


If one of the gifts you found underneath the Christmas tree was a drone 2), and it happens to have a camera installed on it, you should prepare yourself to embrace your new status as a data controller and face a new set of obligations regarding privacy and safety.

Indeed, whilst drones can be a lot of fun, there are serious considerations at stake which should not be ignored. In fact, the extensive range of their potential applications 3), the proliferation of camera-equipped UAVs, and the collection and subsequent use of the captured data, namely by private individuals for personal and recreational purposes, raise concerns about the impact of these technologies on safety, security, privacy and the protection of personal data.

As a matter of fact, a drone in itself does not imply the collection or processing of any personal data until a camera is attached to it. However, drones are increasingly equipped with high-definition optical cameras and are therefore able to capture and record images of the public space. And while there are no apparent privacy concerns regarding the recording of landscapes, a drone filming from the sky over your neighbourhood might lead to a very different conclusion. Drones have a high potential for collateral or direct intrusion into privacy, considering the height at which they operate, which allows them to monitor a vast area and to capture large numbers of people or specific individuals. Although individuals may not always be directly identifiable, their identification may still be possible through the context in which the image is captured or the footage is recorded.

It must be noted that people might not even be aware that they are being filmed, or by whom, and consequently cannot take any steps to avoid being captured if such activity is not made public. People may not even realise that the device is equipped with optical imaging and has recording capabilities. Moreover, because the amateur use of a drone may not be visible, there is a high risk of drones being directed towards the covert and voyeuristic recording of neighbours’ lives, homes and back gardens. How would you feel if a drone was constantly looming near your windows or in your backyard? Indeed, there is no guarantee regarding the legitimacy of the ends pursued with the use of drones. Notwithstanding the fact that a drone may actually pose a threat to people’s personal safety, belongings and property, considering that it may fall, its increasing popularity as a hobby raises the issue of discriminatory targeting, as certain individuals, such as children, young people and women, are particularly vulnerable to an insidious use of RPAS. This is particularly relevant considering that the images or footage are usually intended to be made publicly available, typically on platforms such as YouTube.

Furthermore, the recording may interfere with the privacy of individuals as their whereabouts, home or workplace addresses, activities and relationships are registered. In this context, the use of drones as a hobby may have a chilling effect on the use of the public space, leading individuals to adjust their behaviour for fear that their activities are being monitored.

Accordingly, the use of this type of aerial technology is covered by Articles 7 and 8 of the EU Charter of Fundamental Rights, which respectively establish respect for private life and the protection of personal data. Given the abstract nature of the concept of privacy, the main difficulty will be defining when a violation is at stake.

In addition, there are obviously data protection implications at stake where the drone captures personal data. EU data protection rules generally govern the collection, processing and retention of personal data. EU Directive 95/46/EC and the proposed General Data Protection Regulation apply to the collection, processing and retention of personal data, except where personal data are collected in the course of a purely personal or household activity. Hence, the recreational use of drones is a ‘grey area’ and remains almost unregulated due to this household exemption.

Nevertheless, due to the risks at stake, both to privacy and to data protection, the extent to which the ‘household’ exemption applies in the context of a personal and private use must be questioned.

In a recent ruling, the CJEU concluded that the partial monitoring of the public space carried out by CCTV is subject to EU Directive 95/46, even if the camera capturing the images is “directed outwards from the private setting of the person processing the data”. As already analysed here, the CJEU considered that the processing of personal data involved did not fall within the ‘household exemption’ to data protection laws because the camera was capable of identifying individuals walking on a public footpath.

As RPAS operations may be quite similar to CCTV – but more intrusive, because they are mobile, cover a larger territory and collect a vaster amount of information – it is no surprise that they may and should be subject to the same legal obligations. Following this ruling, these technologies should be considered potentially privacy-invasive. Consequently, private operators of drones in public spaces should be ready to comply with data protection rules.

Of course, for those rules to apply, the footage needs to contain images of natural persons clear enough to allow identification. Moreover, in my opinion, images captured collaterally and incidentally should not, by themselves, exclude the application of the household exemption. Otherwise, selfies unwillingly or unknowingly including someone in the background could not be freely displayed on Facebook without complying with data protection rules. Rather, the footage must amount to serious and systematic surveillance of individuals and their activities.

Therefore, information about the activities being undertaken and about the data processing (such as the identity of the data controller, the purposes of processing, the type of data, the duration of processing and the rights of data subjects) shall be given to individuals where this does not involve disproportionate effort (principle of transparency). Efforts should also be made to minimize the amount of data obtained (data minimization). Furthermore, the controller might need to ensure that the personal data collected by the drone camera is anonymised, is only used for the original purpose for which it was collected (purpose limitation), is stored adequately and securely, and is not retained for longer than necessary.

In this context, individuals having their image captured and their activities recorded by the camera of a drone should be given guarantees regarding consent, proportionality and the exercise of their rights to access, correction and erasure.

That said, the rules governing the use of drones obviously vary depending on where in the EU you are located. It is therefore important for anyone intending to operate a drone to become informed about the appropriate use of these devices and the safety, privacy and data protection issues at stake, in order to avoid unexpected liability.

References

1. Copyright by Don McCullough under the Creative Commons Attribution 2.0 Generic
2. The term drone is used to describe any type of aircraft that is automated and operates without a pilot on board, commonly described as unmanned aerial vehicles (UAV). There are two types of drones: those which can autonomously follow pre-programmed flight routes and remotely piloted aircraft systems (RPAS). Only the latter are currently authorised for use in EU airspace.
3. Although drones were first used for military activities, they are increasingly used across the EU for civilian purposes. Civil use usually refers to those commercial, non-commercial and government non-military activities which are more effectively or safely performed by a machine, such as the monitoring of rail tracks, dams, dykes or power grids.

(Un)Safe Harbour

Safe harbour for whom?


As a general rule, the EU Data Protection Directive (Directive 95/46/EC) prevents businesses from transferring personal data from the EU to third countries. EU citizens’ personal data therefore cannot be processed or hosted outside the EU, except in countries providing an adequate level of data protection. This adequacy requirement is met only when the European Commission recognises the recipient country as providing an adequate level of protection. Such decisions are commonly referred to as ‘adequacy decisions’.

The USA is deemed not to meet the abovementioned EU adequacy requirement, i.e., not to provide a level of protection adequate for data transfers to be accepted. Nevertheless, data can still be transferred from companies located in the EU on the basis of the Safe Harbour mechanism. In fact, under the EU Data Protection Directive, the European Commission adopted a Decision (the “Safe Harbour decision”) recognising that the Safe Harbour Privacy Principles and the ‘Frequently Asked Questions’ provide adequate protection for the purposes of personal data transfers from the EU to the USA.

The EU-USA Safe Harbour is an agreement concluded in 2000 which enables European data controllers to transfer personal data for commercial purposes from companies located in the EU to companies in the USA that have signed up to the Principles. The framework aims to ensure that such transfers duly comply with EU data protection law. To that end, US companies intending to lawfully receive personal data from the EU are required to self-certify that their personal data policies and practices comply with the Safe Harbour. Companies which voluntarily adhere to a set of principles issued by the Federal Trade Commission (FTC) are therefore presumed to qualify for the Safe Harbour ‘adequacy’.

This framework has been greatly criticized since its implementation. Indeed, the Safe Harbour scheme has been used for the transfer of the personal data of EU citizens from the EU to the USA by companies required to hand over data to US intelligence agencies under the US intelligence collection programmes. Moreover, some EU Data Protection Authorities expressed strong reservations about the rigour of the Safe Harbour framework, namely regarding the self-certification requirement. These concerns were echoed in the opinion of the Article 29 Working Party on Cloud Computing issued in July 2012, where it was suggested that EU data exporters could not rely on a cloud provider’s self-certification regarding compliance.

As a result, it is no surprise that the framework has been reviewed twice, back in 2002 and 2004. Nevertheless, in January 2012 the European Commission endorsed the Safe Harbour framework in its draft Data Protection Regulation, under which adequacy decisions taken under the current Directive 95/46/EC would remain in effect unless amended, repealed or replaced by the Commission.

By contrast, the European Parliament’s LIBE (Civil Liberties, Justice and Home Affairs) Committee has proposed amending the proposal so that such adequacy decisions would only remain in force for five years after the Regulation comes into effect.

In the wake of the Snowden revelations regarding PRISM, the USA’s covert surveillance programme for the large-scale interception of and access to the electronic communications of EU citizens – including personal data transferred to online service providers in the USA under the Safe Harbour – the European Data Protection Authorities (DPAs) and the European Commission have increasingly expressed serious concerns regarding the safety of this agreement.

This led Viviane Reding, former Justice Commissioner, to argue that “the Safe Harbor agreement may not be so safe after all” and that it “could be a loophole for data transfers because it allows data transfers from EU to U.S. companies – although US data protection standards are lower than our European ones.” Viviane Reding further announced that the Commission would conduct an assessment of the EU-USA Safe Harbour agreement.

In July 2013 the European Parliament considered that the PRISM program constituted a “serious violation” of the Safe Harbour agreement and called on the European Commission to review the framework. Last March, following its report on mass surveillance activities, the European Parliament approved a resolution calling for the reversion or suspension of the EU-USA Safe Harbour scheme, considering that it fails to provide adequate protection for EU citizens.

Meanwhile, in November 2013, the European Commission put forward a series of 13 recommendations for the USA to put into practice which, if implemented, would make the Safe Harbour safer. Nevertheless, the most controversial features of the framework, such as the voluntary adherence, were not adequately addressed. The conclusion of the discussions on the 13 recommendations proposed by the European Commission was expected by the end of last summer. The deadline passed without any further developments.

Last June, following a complaint brought by the Austrian campaign group Europe v Facebook regarding the company’s part in the NSA’s mass electronic surveillance programme, an Irish court (Facebook’s international headquarters are in Ireland) referred a question to the Court of Justice of the EU on the compliance of the Safe Harbour with the EU Charter of Fundamental Rights.

There has been extensive debate regarding the future of the Safe Harbour, considering that some DPAs no longer recognize it as a valid data transfer mechanism. DPAs can exceptionally suspend data transfers based on the Safe Harbour, namely when it is likely that the Safe Harbour Principles are being violated. To date, no DPA has done so. Considering the serious economic implications, I think that it is very unlikely that the Safe Harbour will be suspended or reversed. In the meantime, the decision of the European Commission on the adequacy of Safe Harbour remains in force, until specifically repealed or changed.

Věra Jourová, the new Justice Commissioner, has already expressed strong doubts about the security of the Safe Harbour mechanism. However, she did not favour a suspension or cancellation of the programme. Andrus Ansip, the new Commissioner for the Digital Single Market, in turn, did not exclude that possibility.


The ‘One Stop Shop’ mechanism reloaded

Get all your data protection matters handled here!

The ‘one stop shop’ mechanism is one of the most heralded and yet most controversial features of the General Data Protection Regulation, whose draft is currently being negotiated within the Council of the European Union.

According to the most recent proposal of the Italian Presidency of the Council of the European Union, where the data protection compliance of businesses operating across several EU Member States is in question, or where individuals in different EU Member States are affected by a personal data processing operation, the mechanism would allow businesses to deal only with the Data Protection Authority (DPA) of the country where they are established.

Cases of purely national relevance, where the specific processing is carried out solely in a single Member State or only involves data subjects in that single Member State, would not be covered by the model. In such circumstances, the local DPA would investigate and decide on its own, without having to engage with other DPAs.

Such cases are, however, deemed to be the exception, as the mechanism aims at better cooperation among the DPAs of the different EU Member States concerned by a specific matter.

Therefore, in cross-border cases, the competence of the DPA of the EU Member State of the main establishment does not exclude the intervention of the other supervisory authorities concerned by the matter. In fact, while the supervisory authority of the Member State where the company is established will take the lead in the ensuing process, the other authorities will be able to follow, cooperate and intervene in all phases of the decision-making process.

In this context, if no consensus is reached among the several authorities involved, the European Data Protection Body (hereafter EDPB) will decide on the binding measures to be implemented by the controller or processor concerned in all of their establishments set up in the EU. Similarly, the EDPB will have legally binding powers in case of failure to reach an agreement over which authority should take the lead.

Businesses operating across multiple EU jurisdictions, which handle vast amounts of personal data, would benefit greatly from this ‘one stop shop’ concept, which would reduce the number of regulators investigating the same cases. Indeed, as things presently stand, a company with operations in more than one EU Member State has to deal with 28 different data protection laws and regulators, which unavoidably leads to a lack of harmonization and legal uncertainty.

The Article 29 Working Party has already manifested its support for a ‘one stop shop’ mechanism under the proposed EU General Data Protection Regulation.

However, in the past, Member States have manifested numerous reservations regarding this mechanism. Among the main concerns expressed were the following: businesses would be able to ‘forum shop’ in order to ensure that their preferred DPA leads the process; a DPA would not be able to take enforcement action in another jurisdiction; individuals’ rights to an effective remedy under EU laws would not be appropriately recognised; authorities without the lead position would not be able to influence processes related to data protection breaches involving nationals of their Member States.

As the way the ‘one stop shop’ mechanism would be implemented in practice is one of the main obstacles to the Member States reaching an agreement on the wording of a new EU General Data Protection Regulation, let’s hope that the solution proposed by the Italian Presidency of the Council of the European Union comes closer to a suitable accommodation of the various concerns expressed by Member States.

The ‘risk-based’ approach to Data Protection, too risky for SMEs?

Balance is hard, very hard.

For those businesses which collect, process and exploit personal data, the draft of Chapter IV of the forthcoming EU General Data Protection Regulation is particularly relevant as it foresees the possible future compliance obligations of data controllers and data processors.

Considering the latest position of the Council of the European Union regarding this chapter, a ‘risk-based’ approach to compliance is a core element of the accountability principle itself (see article 22 of the Council’s document).

In fact, the Article 29 Working Party (which gathers a representative of the supervisory authority designated by each EU Member State, a representative of the authority established for the EU institutions and bodies, and a representative of the European Commission) recently issued a statement supporting a ‘risk-based’ approach in the EU data protection legal framework.

But what is meant by the concept of a ‘risk-based’ approach?

It mainly refers to the consideration of any potential adverse effects associated with the processing and implies different levels of accountability obligations of data controllers, depending on the risks involved within each specific processing activity. It is therefore quite different from the ‘one size fits all‘ approach, as initially proposed by the European Commission.

In this context, the respect and protection of data subjects’ rights (for instance, the rights of access, objection, rectification and erasure, and the rights to transparency, data portability and to be forgotten) shall be granted throughout the data processing activities, regardless of the level of risk involved in those activities.

However, principles such as legitimacy, transparency, data minimization, data accuracy, purpose limitation and data integrity, as well as the compliance obligations incumbent upon controllers, shall be applied in a manner proportionate to the nature, scope, context and purposes of the processing.

This ‘risk-based’ approach is developed throughout Chapter IV, namely in the provisions related to the data protection by design principle (article 23), the obligation for documentation (article 28), the obligation of security (article 30), the obligation to carry out an impact assessment (article 33), and the use of certification and codes of conduct (articles 38 and 39).

These accountability obligations, in each phase of the processing, will vary according to the type of processing and the risks to privacy and to other rights and freedoms of individuals.

In this context, the proportionality exercise will have an effect on the requirements of privacy by design (article 23), which consists in assessing the potential risks of the data processing and implementing suitable privacy and data protection tools and measures to address those risks before initiating the processing activities.

Besides, the introduction of the ‘risk-based’ approach is also likely to be relevant in respect of controllers not established in the EU, as they will most likely not be required to designate a representative in the EU for occasional processing activities which are unlikely to result in a risk for the rights and freedoms of individuals (see article 25).

Moreover, a ‘risk-based’ approach will also be implemented regarding the security of the processing, as technical and organisational measures adequate to the likelihood and severity of the risk for the rights and freedoms of individuals shall be adopted (see article 30).

In parallel, it is foreseen that the obligation to report data breaches is restricted to breaches which are likely to result in a high risk for the rights and freedoms of individuals. In this context, if the compromised data is encrypted, for instance, the data controller won’t be required to report a verified breach (see articles 31 and 32).

The weighing assessment is also expected to be relevant regarding the data protection impact assessment (article 33) required for processing activities that are likely to result in a ‘high risk’ to the rights and freedoms of individuals, such as discrimination, identity theft, fraud or financial loss.

Another important requirement is the consultation of a Data Protection Authority prior to the processing of personal data when the impact assessment indicates that the processing would result in a high degree of risk in the absence of measures to be taken by the controller to mitigate the risk (see article 34).

Of course, “nothing is agreed until everything is agreed”, and this chapter will be subject to further revisions. There is, indeed, vast room for improvement.

For instance, it is questionable whether a ‘risk-based’ approach makes data protection standards stronger, considering the inadequacy of risk assessment methodologies where fundamental rights are concerned.

In parallel, the definition of ‘high risk’ is still too broad, covering almost all businesses operating online. Similarly, the impact assessment process presents itself as complex, burdensome and costly. At the current state of play, small businesses and start-ups are most likely to be negatively affected by the administrative and financial burden that some of the abovementioned provisions will entail. This is quite ironic, considering that precisely this concern lies at the core of the understanding according to which SMEs should be exempted from the obligation to assign a Data Protection Officer.

However, it is important for businesses to try to anticipate how the compliance requirements will be set in the future in order to be prepared for their implementation.

We will see in due time how onerous the regime will be. Whilst we do not know the exact content of the text that will eventually be adopted, it is evident now that substantive accountability obligations will be imposed upon businesses handling personal data.


© 2017 The Public Privacy
