Tag: Data Collection

Truecaller: In the crossroad between privacy and data protection

Let me see, where am I uploading others' personal information today?

As I have already made clear in a previous post, there is little that I find more annoying than being bothered, at the end of a particularly stressful workday, by unrequested telemarketing or spam phone calls. And while I understand that, on the other end of the line, there is a person acting under the direct instruction – and possibly supervision – of his or her employer, these calls almost always feel like an endurance test for one's patience.

Mobile software or applications that enable the prior identification of certain numbers, by replicating the Caller ID experience (as if the number were saved in your contact list) or by automatically blocking undesirable calls, have therefore found a ready market.

That said, you have certainly heard of, and possibly installed, apps such as Current Caller ID, WhosCall or Truecaller. Most probably, you find them quite useful for avoiding unwanted contacts.

As I have, on several occasions, unlisted my number from the Truecaller database, but keep noticing that it eventually ends up in that list all over again, I want to address today some specific issues regarding that app.

Truecaller is freely available on iOS and Android and is quite effective at what it promises, providing its subscribers with a huge database of previously identified contacts. In particular, it enables users to identify spam callers without even having to answer the call.

This efficiency is the result of the data provided by the millions of users who have downloaded the app on their smartphones.

How?

Well, it suffices that a user allows the app to access his or her contacts list, as foreseen in the end user agreement – which many may never have read. Once this consent has been obtained, the information in the contacts book is uploaded to Truecaller's servers and made available to the rest of its subscribers.

Thanks to this crowd-sourced data system, you are able to identify unknown numbers.

It therefore suffices that another user has marked a given contact as spam for you to be able to identify a caller as such immediately and spare yourself the patience-consuming contact. Indeed, if a number flagged as unwanted by others calls you, the screen of your smartphone will turn red and present the image of a shady figure wearing a fedora and sunglasses.
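To picture how this crowd-sourcing works, here is a toy sketch in Python. It is purely illustrative – the names, data structures and the five-report threshold are my own assumptions, not Truecaller's actual code or API – but it shows how a single user's uploaded address book suffices to expose a non-user's name and number, and how a handful of spam reports flags a number for everyone:

# Illustrative sketch of a crowd-sourced caller-ID directory.
# The names and threshold are assumptions, not Truecaller's real implementation.
from collections import defaultdict

SPAM_THRESHOLD = 5              # assumed number of reports before a caller is flagged
directory = {}                  # number -> name, built from uploaded address books
spam_reports = defaultdict(int)

def upload_address_book(contacts):
    """One user's consent is enough: every entry becomes visible to all subscribers."""
    directory.update(contacts)

def report_spam(number):
    spam_reports[number] += 1

def identify(number):
    """What any other subscriber sees for an incoming call."""
    return directory.get(number), spam_reports[number] >= SPAM_THRESHOLD

upload_address_book({"+351912345678": "Maria Silva"})   # a non-user's details
for _ in range(5):
    report_spam("+351210000000")
print(identify("+351912345678"))   # ('Maria Silva', False)
print(identify("+351210000000"))   # (None, True): red screen and fedora icon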

On the downside, if anybody has saved your number and name in their address book, it suffices that that one person has installed the Truecaller app on their mobile phone and accepted the abovementioned permission clause for your number and name to end up in that database.

A new interface enables users to add information from their social media channels. Therefore, besides your contact information, if users activate the use of third-party social network services, such as Facebook, Google+, LinkedIn or Twitter, Truecaller may upload, store and use the list of identifiers associated with those services linked to your contacts in order to enhance the results shared with other users.

Moreover, it has recently been updated to introduce a search function, thus enabling you to search for any phone number and find any person’s contact.

In the same vein, Facebook – only the largest social network there is – has decided to put the sheer amount of data its users provide to new uses with the app Facebook Hello. In that regard, users are required to grant it access to the information contained in their Facebook account. Indeed, Hello uses Facebook's database to provide details about a caller. Other apps, such as Contacts+, instead integrate information provided on different social networks.

While it is undeniably useful to identify the person behind an unknown number, this means that those same others will be able to identify you when you contact them, even if they do not have your number saved.

Truecaller raises several privacy and data protection concerns. In fact, as names and telephone numbers enable the identification of natural persons, there is no doubt that such information constitutes personal data.

Nevertheless, in Truecaller’s own words:

When you install and use the Truecaller Apps, Truecaller will collect, process and retain personal information from You and any devices You may use in Your interaction with our Services. This information may include the following: geo-location, Your IP address, device ID or unique identifier, device type, ID for advertising, ad data, unique device token, operating system, operator, connection information, screen resolution, usage statistics, version of the Truecaller Apps You use and other information based on Your interaction with our Services.

This is particularly problematic considering that Truecaller clearly and manifestly collects and processes the personal data of data subjects other than its users.

As for the information related to other persons, the policy states:

You may share the names, numbers and email addresses contained in Your device’s address book (“Contact Information”) with Truecaller for the purpose described below under Enhanced Search. If you provide us with personal information about someone else, you confirm that they are aware that You have provided their data and that they consent to our processing of their data according to our Privacy Policy.

This statement is, in my very modest opinion, absolutely ludicrous. Most people who have installed the service are not even aware of how it works, let alone that an obligation to notify an entire contacts list and to obtain individual consent falls upon them. In this context, it is paramount to take into consideration that, in the vast majority of cases, from the users' viewpoint, the data at stake is collected and processed merely for personal purposes. Moreover, consent is usually defined as "any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed."

This basically amounts to saying that non-users do not have any control over their personal data, as the sharing of their name and phone number will depend on how many friends actually install Truecaller.

It is evident that Truecaller has no legal ground to process any personal data of non-users of its service. These non-users are data subjects who most certainly have not unambiguously given their consent; the processing is not necessary for the performance of a contract to which the data subject is a party; and no legal obligation, vital interest or task carried out in the public interest is at stake.

In this regard, the possibility, offered to those who do not wish to have their names and phone numbers made available through the enhanced search or name search functionalities, of excluding themselves from further queries by notifying Truecaller is not even realistic. To begin with, one is required to be aware of the existence of the service, then to actively check whether one's number is in its directory, and finally to ask Truecaller to be unlisted from its database. However, considering all my failed attempts, I am not sure whether this option is only available to users, or whether it simply does not prevent a number from being added to the service's database all over again once another user who has that contact in his address book allows such access.

Last, but not least, and this should not really come as a surprise, Truecaller has already been hacked in the past by a Syrian hacking group, which resulted in unauthorized access to some (personal) data of users and non-users. This surely highlights the importance of users carefully choosing the services to which they entrust their – and others' – personal data.

All things considered, Truecaller is the obvious practical example of the statement: 'If You're Not Paying For It, You Are the Product Being Sold'.

The dangers of certain apps or how to put your whole life out there

Finding love, one data breach at a time.

One of my former flatmates was actively looking for love online. Besides having registered on several websites to that end, I remember he also had several mobile applications (apps) installed on his smartphone. I think he actually signed up for pretty much anything that could even remotely help him find love, but singled out Tinder as his main dating tool.

Another of my closest friends is a jogging addict – shout out to P. He has installed on his smartphone various apps which let him know how many steps he has taken on a particular day, the route undertaken and, via an external device, his heart rate, enabling him to monitor his progress.

What do my two friends have in common? Well, they both use mobile apps to cover very specific needs. And in this regard, they are just like almost anybody else.

Indeed, it is difficult to escape apps nowadays. Now that everyone (except for my aunt) seems to have a smartphone, apps are increasingly popular for the most diverse purposes. For my former flatmate, it was all about dating. For my friend, it is about keeping track of his running progress. But their potential does not end there. From receiving and sending messages, using maps and navigation services, receiving news updates, playing games, dating or just checking the weather… You name a necessity or convenience, and there is an app for it.

On the downside, using apps usually requires providing more or less personal information for the intended purpose. This has become so usual that most consider it a natural step, without giving it further consideration.

In fact – and this is a detail most seem to be unaware of – apps allow for the massive collection and processing of personal, and sometimes sensitive, data. The nature and the amount of personal data accessed and collected raise serious privacy and data protection concerns.

For instance, in the case of my abovementioned flatmate, who was registered on several similar apps, and considering that he did not create fake accounts or provide false information, each of them collected at least his name, age, gender, profession, location (making it possible to presume where he worked, lived and spent time), sexual orientation, what he looks like (if he added a picture to his profiles), the frequency of his accesses to the app and, eventually, the success of his online dating life.

In fact, in Tinder’s own words:

Information we collect about you

In General. We may collect information that can identify you such as your name and email address (“personal information”) and other information that does not identify you. We may collect this information through a website or a mobile application. By using the Service, you are authorizing us to gather, parse and retain data related to the provision of the Service. When you provide personal information through our Service, the information may be sent to servers located in the United States and countries around the world.
Information you provide. In order to register as a user with Tinder, you will be asked to sign in using your Facebook login. If you do so, you authorize us to access certain Facebook account information, such as your public Facebook profile (consistent with your privacy settings in Facebook), your email address, interests, likes, gender, birthday, education history, relationship interests, current city, photos, personal description, friend list, and information about and photos of your Facebook friends who might be common Facebook friends with other Tinder users. You will also be asked to allow Tinder to collect your location information from your device when you download or use the Service. In addition, we may collect and store any personal information you provide while using our Service or in some other manner. This may include identifying information, such as your name, address, email address and telephone number, and, if you transact business with us, financial information. You may also provide us photos, a personal description and information about your gender and preferences for recommendations, such as search distance, age range and gender. If you chat with other Tinder users, you provide us the content of your chats, and if you contact us with a customer service or other inquiry, you provide us with the content of that communication.

Considering that Tinder makes available a catalogue of profiles of geographically nearby members, through which one can swipe right or left according to one's personal preferences, it is even possible, with adequate analysis, to determine what type of person (according to age, body type or hair colour) a user finds most attractive.

And because Tinder actually depends on having a Facebook profile, I guess that Facebook also becomes aware of the general climate of your romantic life – mainly if you start adding and interacting with your new matches on that platform and, why not, changing your relationship status accordingly.

In the specific case of Tinder, as it mandatorily requires a certain amount of Facebook information in order to function properly, these correlations are much easier to draw.

That said, a sweep conducted by 26 privacy and data protection authorities from around the world, covering more than 1,000 diverse apps – including Apple and Android apps, free and paid apps, public sector and private sector apps, and ranging from games and health/fitness apps to news and banking apps – made it possible to outline the main concerns at stake.

One of the issues specifically pointed out concerned the information provided to users/data subjects, as it was concluded that many apps did not have a privacy policy. In those cases, users were therefore not properly informed – and thus not aware – of the collection, use or further disclosure of the personal information provided.

It is a fact that most of us do not read the terms and conditions made available. And most will sign up for pretty much any service they are willing to use, regardless of what those terms and conditions actually state.

Nevertheless, a relevant issue in this regard is the excessive amount of data collected considering the purposes for which the information is provided, or the sneaky way in which it is collected. For instance, even gaming apps, such as solitaire, which seem far more innocuous, hide unsuspected risks, as many contain code enabling access to the user's information or contacts list, and some even track the user's browsing activities.

This is particularly worrisome when sensitive data, such as health information, is at stake. This kind of data is easily collected through fitness-oriented apps, which are quite in vogue nowadays. Besides any additional personally identifiable information which you will eventually provide upon creating an account, the elements most certainly collected include: name or username, date of birth, current weight, target weight, height, gender, workout frequency, workout settings, workout duration and heart rate. Also, if you train outdoors, geo-location will most certainly make it possible to assess the whereabouts of your exercising, from the departure to the arrival points, which will most probably coincide with your home address or its vicinity.
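To see why the geo-location logs alone are sensitive, consider this deliberately naive sketch with made-up coordinates (not data from any real fitness app): simply averaging the start points of a few logged outdoor runs already points to a likely home address.

# Illustrative only: made-up coordinates, not data from any real fitness app.
# Averaging the start points of logged outdoor workouts is often enough to
# approximate where the runner lives.
run_start_points = [
    (38.72251, -9.13934),
    (38.72249, -9.13940),
    (38.72254, -9.13929),
    (38.72248, -9.13937),
]
lat = sum(p[0] for p in run_start_points) / len(run_start_points)
lon = sum(p[1] for p in run_start_points) / len(run_start_points)
print(f"Likely home location: {lat:.5f}, {lon:.5f}")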

And if you are particularly proud of your running or cycling results, and are willing to show all your friends what good shape you are actually in, chances are you can connect the app to your Facebook account and display that information on your profile, subsequently enabling Facebook to access the same logged information.

And things actually get worse when considering that, as demonstrated by recent data breaches, the information provided by users is not even adequately protected.

For instance, if I remember it well, due to a security vulnerability in Tinder – one that has apparently since been fixed – there was a time when users' location data, such as longitude and latitude coordinates, was easily accessible. This is actually quite creepy and dangerous, as it would facilitate stalking and harassment in real life, which is every bit as bad as its online counterpart.
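To give an idea of how precise such leaked coordinates are, a quick back-of-the-envelope haversine calculation (with invented values) shows that two points differing only in the fifth decimal place of latitude and longitude are roughly a metre apart:

# Illustrative haversine calculation: coordinates with five decimal places
# locate a person to within about a metre. The values are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

print(haversine_m(38.72251, -9.13934, 38.72252, -9.13935))  # roughly 1.4 metres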

Anyway, it is actually very easy to forget the amount of data we provide apps with. However, the correlations that can be made, the conclusions that can be inferred and the patterns that can be detected amount to sharing more information than we first realise, and enable a far more detailed profile of ourselves than most of us would feel comfortable with others knowing.

When the information asked from job applicants is simply too much…

We also need your credit card info, body size and a blood sample just for the application. [1]

I am currently looking for new professional opportunities and, in my quest, I have come across some very peculiar data collection practices in the context of certain recruitment processes.

From being required to provide my full name, my ID number, my social security number and my complete address as mandatory information in order to apply for a certain job or to file a spontaneous application… I have pretty much been asked for everything. At this point, I would not be surprised to be asked for my bank account, my blood type or my electoral number, which would be just as useless for such a purpose.

And when this comes from big companies which actually ought to know better and have data protection policies implemented, it is all the more astonishing!

Perhaps this may come as a surprise to some, as I am prone to conclude from my recent experiences, but when personal data is collected as part of a recruitment process, data protection rules do apply.

With regard to the balance which ought to be struck between a potential employer's need for information in order to select among applications and the applicants' right to respect for their private life, I think it is pretty straightforward that requiring the abovementioned elements is pointless and disproportionate in a recruitment process.

In fact, it amounts to collecting from job applicants information that is only necessary if a specific applicant is eventually appointed as an employee – which only happens at a later stage.

Besides it being annoying to be required to provide pointless personal information to a recruiter from whom one might never hear again, collecting irrelevant or excessive information is actually a breach of data protection rules.

With this in mind, if you collect such unnecessary information in the context of recruitment processes and have received my application, you should seriously consider calling me for an interview. :o)

 

References

1 Copyright by Kathryn Decker under the Creative Commons Attribution 2.0 Generic

The ‘Safe Harbor’ Decision ruled invalid by the CJEU

Safe harbor?!? Not anymore.

Unfortunately, I hadn’t had the time to address the ruling of the CJEU issue last October, by which the ‘Safe Harbour’ scheme, enabling transatlantic transfers of data from the EU to the US, was deemed invalid.

However, due to its importance, and because this blog is primarily intended to be about privacy and data protection, it would be shameful to finish the year without addressing the issue.

As you may be well aware, article 25(1) of Directive 95/46 establishes that the transfer of personal data from an EU Member State to a third country may occur provided that the latter ensures an adequate level of protection. According to article 25(6) of the abovementioned Directive, the EU Commission may find that a third country ensures an adequate level of protection (i.e., a level of protection of fundamental rights essentially equivalent to that guaranteed within the EU under the directive read in the light of the Charter of Fundamental Rights) by reason of its domestic law or of its international commitments.

That said, the EU Commission adopted Decision 2000/520, by which it concluded that the 'Safe Harbour Principles' issued by the US Department of Commerce ensure an adequate level of protection for personal data transferred from the EU to companies established in the US.

Accordingly, under this framework, Facebook has been transferring the data provided by its users residing in the EU from its subsidiary in Ireland to its servers located in the US, for further processing.

These transfers and, unavoidably, the Decision itself were challenged through a reference to the CJEU (judgment in Case C-362/14) following the complaint filed by Max Schrems, a Facebook user, before the Irish DPA and subsequently before the Irish High Court. The main argument was that, considering the access to electronic communications conducted by its public authorities, the US did not ensure adequate protection of the personal data thus transferred.

According to the AG’s opinion, “the access enjoyed by the United States intelligence services to the transferred data constitutes an interference with the right to respect for private life and the right to protection of personal data”.

Despite considering that a third country cannot be required to ensure a level of protection identical to that guaranteed in the EU, the CJEU found that the decision fails to comply with the requirements established in Article 25(6) of the Directive and that the Commission did not make a proper finding of adequacy but merely examined the safe harbour scheme.

The facts that the scheme's ambit is restricted to adhering US companies, thus excluding public authorities, and that national security, public interest and law enforcement requirements – to which US companies are also bound – prevail over the safe harbour principles were deemed particularly decisive in the assessment of the scheme's validity.

In practice, this amounts to enabling the US authorities to access the personal data transferred from the EU to the US and to process it in a way incompatible with the purposes for which it was transferred, beyond what is strictly necessary and proportionate to the protection of national security.

As a result, the Court concluded that enabling public authorities to have access on a generalised basis to the content of electronic communications must be regarded as compromising the essence of the fundamental right to respect for private life.

The Court stated that the decision disregards the existence of such negative interference with fundamental rights, and that the lack of any provision for limitations and effective legal protection violates the fundamental right to effective judicial protection.

Upon issuance of this ruling, the Art29WP met and concluded that data transfers from the EU to the US could no longer be legitimized by the 'Safe Harbor' decision and, if occurring, would be unlawful.

While its practical implications remain unclear, the ruling undoubtedly means that companies relying on the 'Safe Harbor' framework for the transfer of personal data from the EU to the US need to rely, instead, on another legal basis.

In this regard, considering that not all Member States accept the consent of the data subject or an adequacy self-assessment as a legitimizing legal ground for such cross-border transfers, Model Contractual Clauses incorporated into contracts and Binding Corporate Rules (BCR) for intragroup transfers seem to be the most reliable alternatives in certain cases.

Restrictions on data transfers are obviously also foreseen in the GDPR, which, besides BCRs, Standard Contracts and adequacy decisions, includes new data transfer mechanisms such as certification schemes.

You can find the complete version of the ruling here.

The ‘Dick-Pic Programme’

How unfortunate it is that people are not generally very concerned about government mass surveillance… except when pictures of their private parts are involved.

The good news is that there is no such 'dick-pic programme'. The bad news is that, well, the intelligence services do collect those kinds of pictures, and they are only a small part of the information being collected – and, depending on each individual's exhibitionist tendencies, not the most privacy-infringing part.

A spy in your living room: ‘Tu quoque mi’ TV?

How smart are you?

So, it seems that the room we have for our privacy to bloom is getting smaller and smaller. We already knew that being at home did not automatically imply seclusion. Still, nosy neighbours were, for quite a long time, the only enemies of home privacy.

However, thicker walls and darker window blinds no longer protect us from external snooping as, nowadays, the enemy seems to hide in our living room or even bedroom.

Indeed, it seems that when we bought our super duper and very expensive Smart TV, we may actually have brought into our home a very sneaky and effective – although apparently innocent – spy.

As you may (or may not) already know, TVs with Internet connectivity allow for the collection of their users' data, including voice recognition and viewing habits. A few days ago, many people would have praised those capabilities: the voice recognition feature serves our convenience, i.e., it improves the TV's response to our voice commands, and the collection of data is intended to provide a customized and more comfortable experience. Today, I seriously doubt that most of us look at our TV screens the same way.

To start with, there was the realization that usage information, such as our favourite programmes and online behaviour, as well as other information not intended or expected to be collected, is in fact collected by LG Smart TVs in order to present targeted ads. And this happens even if the user actually switches off the option of having his data collected to that end. Worse, the data collected even extended to the contents of external USB hard drives.

More recently, the Samsung Smart TV was also put in the spotlight due to its privacy policy. Someone who had attentively read the Samsung Smart TV's user manual shared the following excerpt online:

To provide you the Voice Recognition feature, some voice commands may be transmitted (along with information about your device, including device identifiers) to a third-party service that converts speech to text or to the extent necessary to provide the Voice Recognition features to you. (…)

Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.

And people seem to have abruptly woken up to the realization that this voice recognition feature is not only directed at specific commands in order to allow better interaction between the user and the device: it may also involve the capture and recording of personal and sensitive information from the conversations taking place nearby. No need to be a techie to know that this does not amount to performance improvement. This is eavesdropping. And to make it worse, the data is transferred to a third party.

In the aftermath, Samsung clarified that it does not retain voice data nor sell the audio collected. It further explained that a microphone icon is visible on the screen when voice activation is turned on and that, consequently, no unexpected recording takes place.

Of course, you can now be more careful about what you say around your TV. But as users can activate or deactivate this voice recognition feature, my guess is that most will actually prefer to use the old remote control and keep the TV as dumb as possible. I mean, the mere possibility of private conversations taking place in front of the TV screen being involuntarily recorded is motivation enough.

Also, it should be pointed out that, considering the personal data at stake (relating to an identified or identifiable person), these situations raise very relevant data protection concerns. Can it simply be accepted that the user has consented to the Terms and Conditions of the TV acquired? Were these very significant terms made clear at any point? It is quite certain that users could not have foreseen, at the time of purchase, that such deep and extensive collection would actually take place. If so, such consent cannot be considered to have been freely given. It suffices to think that the features used for the collection of data are what make the TV smart in the first place and, therefore, the main reason for buying the product.

Moreover, is this collection strictly necessary for the intended service to be provided? When the data at stake involves data from other devices or wording other than the voice commands, the answer cannot be positive. The transmission of personal data to third parties only makes all this worse, as it is not specified under what conditions data is transmitted or who that third party actually is. Adding to this, these settings mostly come by default; they are certainly not privacy-friendly and amount to stealthy monitoring. Last but not least, it remains to be seen whether proper data anonymisation/pseudonymisation techniques are effectively put in place.

Nevertheless, these situations have brought back into the spotlight the risks to privacy associated with personal devices in the Internet of Things era. As smart devices become more and more present in our households, we are quietly losing privacy or, at least, our privacy faces greater risks. In fact, it is quite difficult to live nowadays without these technologies, which undoubtedly make our lives so much more comfortable and easier. It is time for people to realize that all this convenience comes at a cost. And a high one.

Game of drones or the not so playful side of the use of RPAS for recreational purposes

I am watching you. [1]

If one of the gifts you found underneath the Christmas tree was a drone [2], and it happens to have a camera installed on it, you should prepare yourself to embrace your new status as a data controller and to face a new set of obligations regarding privacy and safety.

Indeed, whilst drones can be a lot of fun, there are serious considerations at stake which should not be ignored. In fact, the extensive range of their potential applications [3], the proliferation of UAVs equipped with cameras, the collection of data and the subsequent use of such data, namely by private individuals for personal and recreational purposes, raise concerns about the impact of these technologies on safety, security, privacy and the protection of personal data.

As a matter of fact, a drone in itself does not imply the collection and processing of any personal data until you attach a camera to it. However, drones are increasingly equipped with high-definition optical cameras and are therefore able to capture and record images of the public space. And while there are no apparent privacy concerns regarding the recording of landscapes, having a drone filming through the sky over your neighbourhood might lead to a very different conclusion. Drones have a high potential for collateral or direct intrusion into privacy, considering the height at which they operate, which allows them to monitor a vast area and to capture large numbers of people or specific individuals. Although individuals may not always be directly identifiable, their identification may still be possible through the context in which the image is captured or the footage is recorded.

It must be noted that people might not even be aware that they are being filmed, or by whom, and, as a result, cannot take any steps to avoid being captured if such activity is not made public. People cannot be expected to know whether the device is equipped with optical imaging and recording capabilities. Moreover, because the amateur usage of a drone may not be visible, there is a high risk of it being directed to the covert and voyeuristic recording of neighbours' lives, homes and back gardens. How would you feel if a drone was constantly looming near your windows or over your backyard? Indeed, there is no guarantee regarding the legitimacy of the end to be achieved with the use of drones. Notwithstanding the fact that a drone may actually pose a threat to people's personal safety, belongings and property, considering that it may fall, its increasing popularity as a hobby highlights the issue of discriminatory targeting, as certain individuals, such as children, young people and women, are particularly vulnerable to an insidious use of RPAS. This is particularly relevant considering that the images or footage are usually intended to be made publicly available, usually on platforms such as YouTube.

Furthermore, the recording may interfere with the privacy of individuals as their whereabouts, home or workplace addresses, activities and relationships are registered. In this context, the use of drones for hobby purposes may have a chilling effect on the use of the public space, leading individuals to adjust their behaviour for fear that their activities are being monitored.

With this in mind, the use of this type of aerial technology is covered by Articles 7 and 8 of the EU Charter of Fundamental Rights, which respectively establish the respect for private life and the protection of personal data. Taking into account the abstract nature of the concept of privacy, the main difficulty will be to define when a violation is at stake.

In addition, there are obviously data protection implications where the drone is capturing personal data. EU data protection rules generally govern the collection, processing and retention of personal data. Directive 95/46/EC and the proposed General Data Protection Regulation apply to the collection, processing and retention of personal data, except where personal data is collected in the course of a purely personal or household activity. Hence, the recreational use of drones is a 'grey area' and remains almost unregulated due to this household exemption.

Nevertheless, due to the risks at stake, both to privacy and to data protection, the extent to which the 'household' exemption applies in the context of a personal and private use must be questioned.

In a recent ruling, the CJEU concluded that the partial monitoring of the public space carried out by CCTV is subjected to the EU Directive 95/46, even if the camera capturing the images is “directed outwards from the private setting of the person processing the data”. As already analysed here, the CJEU considered that the processing of personal data involved did not fall within the ‘household exemption’ to data protection laws because the camera was capable of identifying individuals walking on a public footpath.

As RPAS operations may be quite similar to CCTV, but more intrusive – because they are mobile, cover a larger territory and collect a vaster amount of information – it is no surprise that they may, and should, be subject to the same legal obligations. Following this ruling, these technologies should be considered potentially privacy-invasive. Consequently, private operators of drones in public spaces should be ready to comply with data protection rules.

Of course, the footage needs to contain images of natural persons that are clear enough to lead to identification. Moreover, in my opinion, images captured collaterally and incidentally should not be enough, on their own, to set the household exemption aside. Otherwise, selfies unwillingly or unknowingly including someone in the background could not be freely displayed on Facebook without complying with data protection rules. The footage must constitute serious and systematic surveillance of individuals and their activities.

Therefore, information about the activities being undertaken and about the data processing (such as the identity of the data controller, the purposes of the processing, the type of data, the duration of the processing and the rights of data subjects) shall, where this does not involve disproportionate effort, be given to individuals (principle of transparency). Efforts should also be made to minimize the amount of data obtained (data minimization). Moreover, the controller might need to ensure that the personal data collected by the drone camera is anonymised, is only used for the original purpose for which it was collected (purpose limitation), is stored adequately and securely, and is not retained for longer than necessarily required.
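As an illustration of what anonymising footage could look like in practice, here is a minimal sketch using OpenCV's bundled Haar cascade face detector to blur faces in a single frame. It is deliberately simplistic (the file names are hypothetical, and a real deployment would need far more robust detection): a sketch rather than a complete compliance solution.

# Minimal sketch: blurring faces in one video frame with OpenCV.
# A simplistic illustration of footage anonymisation, not a robust solution.
import cv2

def blur_faces(frame):
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        region = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 30)
    return frame

frame = cv2.imread("drone_frame.jpg")       # hypothetical input frame
if frame is not None:
    cv2.imwrite("drone_frame_blurred.jpg", blur_faces(frame))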

In this context, individuals having their image captured and their activities recorded by the camera of a drone should be given guarantees regarding consent, proportionality and the exercise of their rights to access, correction and erasure.

That said, depending on where in the EU you are geographically located, there are obviously different rules regarding the legal aspects of the use of drones. It is therefore important for individuals intending to operate a drone to get informed and educated about the appropriate use of these devices and about the safety, privacy and data protection issues at stake, in order to avoid unexpected liability.

References
1 Copyright by Don McCullough under the Creative Commons Attribution 2.0 Generic
2 The term drone is used to describe any type of aircraft that is automated and operates without a pilot on board, commonly described as unmanned aerial vehicles (UAV). There are two types of drones: those which can autonomously follow pre-programmed flight routes and remotely piloted aircraft systems (RPAS). Only the latter are currently authorised for use in EU airspace.
3 Although drones were first used for military activities, they are increasingly used across the EU for civilian purposes. Civil use usually refers to those commercial, non-commercial and government non-military activities which are more effectively or safely performed by a machine, such as the monitoring of rail tracks, dams, dykes or power grids.

CCTV: household security or how to be a data controller at home

CCTV: walking the thin line between protecting yourself and becoming a data controller. [1]

Having suffered several attacks by persons unknown, in which the windows of the family home were broken on several occasions, Mr Ryneš, a Czech citizen, installed a CCTV camera under the eaves of his home. In a fixed position, the camera recorded the entrance to his home, the public footpath and the entrance to the house opposite. The system allowed only a visual recording, which was stored on a hard disk drive. On reaching full capacity, the device would record over the existing recording, erasing the old material. Although the images were not monitored in real time, this video surveillance system made it possible to identify two suspects, who were subsequently prosecuted.

However, despite the happy outcome, the operation of this camera system, installed by an individual at his home for the purposes of protecting the property, health and life of the owner and his family, raised some questions due to the continuous recording of a public space.

One of the suspects challenged the legality of Mr Ryneš's recording of the images. The Czech Data Protection Authority (hereafter DPA) considered that this operation infringed data protection rules because the data collection concerning persons moving along the street or entering the house opposite occurred without their consent; individuals were not informed of the processing of that personal data, of its extent and purpose, of by whom and by what means the personal data would be processed, or of who would have access to it; and this processing had not been reported to the Office, as was mandatory.

Mr Ryneš brought an action challenging that decision in court, which was dismissed. The case was appealed to the Czech Supreme Administrative Court which referred to the Court of Justice of the European Union (hereafter CJEU) for a preliminary ruling.

In this context, in its judgment in Case C-212/13, the CJEU addressed the application of the ‘household exception’, for the purposes of Article 3(2) of Directive 95/46/EC, which refers to the data processing carried out by a natural person in the course of a purely personal or household activity.

The CJEU considered that the image of a person recorded by a camera constitutes personal data within the meaning of the Directive 95/46 inasmuch as it makes it possible to identify the person concerned.

Moreover, the Court considered that video surveillance falls within the scope of the abovementioned Directive in so far as it constitutes automatic processing, i.e. an operation performed upon personal data, such as collection, recording or storage.

Considering that the main goal of this Directive is to guarantee a high level of protection of the fundamental rights and freedoms of natural persons, in particular their right to privacy, as enshrined in Article 7 of the EU Charter of Fundamental Rights, the CJEU recalled that derogations and limitations must be strictly necessary.

Therefore, the Court deemed that the 'household exception' must be narrowly construed and applies only where the data processing activity is carried out in a 'purely' personal or household context, even if it incidentally concerns the private life of other persons, as with correspondence or the keeping of address books.

In this context, the CJEU concluded as follows:

(…)the second indent of Article 3(2) of Directive 95/46 must be interpreted as meaning that the operation of a camera system, as a result of which a video recording of people is stored on a continuous recording device such as a hard disk drive, installed by an individual on his family home for the purposes of protecting the property, health and life of the home owners, but which also monitors a public space, does not amount to the processing of data in the course of a purely personal or household activity, for the purposes of that provision.

However, Mr Ryneš’s concerns, which motivated the installation of the camera, were not overlooked by the CJEU. Indeed, the Court outlined that the Directive itself allows, where appropriate, to consider the legitimate interests pursued by the controller, such as the protection of the property, health and life of his family and himself. This reflection is in line with the Opinion of the Article 29 Working Party in this regard as security was mentioned as an example of a legitimate interest of the data controller.

This implies that, even if the household exception is not applicable in this very particular case, a CCTV recording activity such as the one in the proceedings may still be lawful in the light of Article 7(f) of the Directive. That said, the referring court will now have to take this interpretative guidance into consideration and decide whether the recording and processing at stake were legitimate, for instance with regard to Article 10 of the instrument. It is possible that the Czech court will still consider that the facts that no information regarding the recording was provided to the public (individuals were not informed of the processing of that personal data, of its extent and purpose, of by whom and by what means the personal data would be processed, or of who would have access to it) and that the processing was not reported to the Office constitute a breach of the data protection rules.

This is particularly relevant considering that, precisely for security purposes, individuals are equipping their households with CCTV systems which capture the public space. Only time will tell how this decision will be applied to individuals in practice. Most certainly, DPAs across the EU will update their recommendations regarding the weighing of the necessity of recording and storing the data in pursuit of an interest deemed legitimate against the interests and fundamental rights and freedoms of the data subject.

At this point, it is to be expected that householders whose surveillance cameras capture the public space will need to ensure that their collection and further use of any footage containing images of identifiable individuals complies with data protection requirements. Thus, they will, for instance, have to at least inform people of this monitoring and ensure that no footage is illegally retained.

References
1 Copyright by Nïall Green under the Creative Commons Attribution-Share Alike 1.0 Generic