Of all the legal challenges that the GDPR will present for businesses in general, I would like to address in this post the issues raised by its implementation with regard to social network platforms, which are quite popular nowadays.
Article 17 of the GDPR establishes the ‘right to erasure’ or, as it has come to be referred to, the right to be forgotten. It provides data subjects with the right to require data controllers to erase the personal data the latter hold about them, and imposes on controllers the corresponding obligation to comply with such requests, without undue delay, when certain conditions are fulfilled.
Considering that infringing the ‘right to erasure’ may lead to the application of significant economic sanctions, there is a risk that social platforms will be tempted to adopt a preventive approach, complying with all deletion requests regardless of their validity and thus erasing content on unfounded grounds. This is particularly worrisome because it may directly lead to the suppression of free speech online. Moreover, online businesses are not, and should not be deemed, competent to assess the legitimacy of such claims, a point that I have already tried to make here.
While it seems that a notice and take down mechanism is envisaged, without much detail being provided as to its practical enforceability, a particular issue in this context is that of the entities upon which such an obligation falls. Indeed, the implementation of the ‘right to be forgotten’ can only be required of those who qualify as data controllers.
As data controllers are defined as the entities that determine the purposes and means of the processing of personal data, it is not clear whether providers of online social platforms can be defined as such.
Considering the well-known Google Spain case, it is at least certain that search engines are deemed to be controllers in this regard. As you may certainly remember, the CJEU ruled that individuals, provided that certain prerequisites are met, have the right to require search engines, such as Google, to remove certain results about them from the list displayed following a search made on the basis of that person’s name.
That said, it is questionable whether hosting platforms and online social networks focused on user-generated content, as is the case of Facebook, qualify as such, considering that the data processed depends on the actions of the users who upload the relevant information. On that view, the users themselves would qualify as controllers. The language of Recital 15 of the GDPR about social networking is inconclusive in this regard.
The abovementioned Recital provides as follows:
This Regulation should not apply to processing of personal data by a natural person in the course of a purely personal or household activity and thus without a connection with a professional or commercial activity. Personal and household activities could include correspondence and the holding of addresses, or social networking and on-line activity undertaken within the context of such personal and household activities. However, this Regulation should apply to controllers or processors which provide the means for processing personal data for such personal or household activities.
This is not an irrelevant issue, though. In practice, it will amount to enabling someone to require, and effectively compel, Twitter or Facebook to delete information about him or her even though it was provided by others.
And considering that any legal instrument is only as effective in practice as it is capable of being enforced, defining who is covered by the obligation and ought to comply with it is unquestionably a paramount element.
As I recall reading elsewhere (unfortunately, I cannot remember where), it has been asked whether the intermediary liability regime foreseen in the e-Commerce Directive would be an appropriate mechanism for enforcing the right to erasure/right to be forgotten.
Articles 12-14 of the e-Commerce Directive indeed exempt information society service providers from liability under specific circumstances, namely when they act as a ‘mere conduit’ of information, engage in ‘caching’ (the automatic, intermediate and temporary storage of information), or provide ‘hosting’ (i.e., store information at the request of a recipient of the service).
Article 15 establishes that online intermediaries are under no general obligation to monitor the information they transmit or store, nor to actively seek facts or circumstances indicating illegal activity on their websites.
Taking into account the general liability regime for online intermediaries foreseen in the e-Commerce Directive (Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market), a distinction will perhaps be drawn according to the level of ‘activity’ or ‘passivity’ of the platforms in managing the content provided by their users.
However, this liability regime does not fully clarify the extent of the erasure obligation. Will it be proportionate to the degree of ‘activity’ or ‘passivity’ of the service provider with regard to the content?
Moreover, it is not clear how the two regimes can be applied simultaneously. While the GDPR does not provide for any notice and take down mechanism and expressly states that it applies without prejudice to the liability rules of the e-Commerce Directive, the fact is that the GDPR imposes the ‘duty of erasure’ only on controllers. As the intermediary liability rules concern accountability for the activities of third parties, this is a tension that is not easy to overcome.
All things considered, the much awaited GDPR has not even entered into force yet, but I already cannot wait for the next chapters.