
Grab the popcorn, this is going to be fun!
The request for access to the information stored on the smartphone of one of the San Bernardino shooting suspects has intensified the debate on the implementation of backdoors to give law enforcement access to mobile devices.
The issue is not whether law enforcement authorities, armed with a proper warrant, are entitled to search a mobile phone and access its content. That much is straightforward. They are.
What is at stake is Apple’s objection to a court order requiring it to provide the ongoing federal investigation with the means to access such information. More concretely, Apple has been required to write code modifying the iPhone software so as to bypass an important security feature: the one which automatically erases the device’s data after ten failed passcode attempts. This would enable the authorities to enter wrong credentials endlessly and eventually crack the device’s passcode through brute force, without risking the deletion of content, thus being able to access and extract the information contained on the suspect’s iPhone.
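To make the mechanics concrete, here is a deliberately simplified sketch of why a short numeric passcode falls to brute force almost instantly once the retry limit and the wipe-after-ten-failures feature are out of the way. This is a toy illustration, not Apple’s actual scheme: the hash check and the target passcode below are hypothetical stand-ins.

```python
from itertools import product
import hashlib

# Toy stand-in for the device's passcode check. On a real iPhone the key is
# derived from the passcode plus a hardware-bound secret, which is why the
# attempts would have to run on the device itself.
def unlock(candidate: str, target_hash: str) -> bool:
    return hashlib.sha256(candidate.encode()).hexdigest() == target_hash

target_hash = hashlib.sha256(b"7295").hexdigest()  # the "unknown" passcode

# With no wipe-after-ten-failures and no escalating delays, a 4-digit
# passcode allows at most 10,000 attempts.
for digits in product("0123456789", repeat=4):
    pin = "".join(digits)
    if unlock(pin, target_hash):
        print(f"Passcode recovered by exhaustive search: {pin}")
        break
```

The security feature exists precisely because the passcode search space is so small; remove it, and the protection reduces to a few thousand guesses.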
The use of new technologies to conduct criminal and terrorist activities has made it difficult to ignore the advantages of accessing communications carried over those technologies for the investigation, prevention and prosecution of criminal activities. Law enforcement authorities point out that such access is particularly pertinent in the fight against terrorism, paedophilia networks and drug trafficking.
In this context, the use of encryption in communications has become the cornerstone of the debate. Investigative authorities want backdoors implemented in mobile devices in order to ensure access when necessary. Contrastingly, companies such as Apple refuse to retain access keys to such encrypted communications – and consequently to provide them upon request of law enforcement authorities.
Just recently, FBI Director James Comey told the US Senate Intelligence Committee that intelligence services are not interested in ‘backdoor’ access to secure devices per se. Instead, what is at stake is requiring companies to provide the encrypted messages sent through those devices. James Comey is a wordplay habitué: he once said he wanted ‘front doors’ instead of ‘back doors’.
In the same vein, White House Press Secretary Josh Earnest recently stated that, under the abovementioned court order, Apple is not being asked to redesign its products or to create a backdoor.
While these are, at the very least, very puzzling statements, they nevertheless clearly express the underlying aim: banning encryption products without backdoors and mandating the implementation of backdoors.
Indeed, if companies can be required to undermine their own security and privacy protection features in order to provide access to law enforcement authorities, then – regardless of how legitimate the underlying purpose, and whatever designation one might find preferable – that is the very definition of a backdoor.
It never ceases to amaze me how controversial it seems to be, among free people living in a democracy, that the implementation of backdoors is – on both legal and technological grounds, and for the sake of everyone’s privacy and security – a very bad idea.
Well, the main argument supporting the concept is that such a technological initiative will chiefly help the fight against criminal activities. That is unquestionably a legitimate purpose. And nobody opposing the implementation of backdoors actually argues otherwise.
However, it is a fact that backdoors would automatically make everyone’s communications less secure, exposing them to a greater risk of attacks by third parties and to further privacy invasions. Moreover, no real guarantees against the risk of ensuing abuse are ever provided. Those arguing in favour of access to information through backdoors fail to adequately frame the context: it is vaguely stated that such a mechanism would be used when necessary, without any strict definition. What is necessary, anyway? Would it depend on the relevance of the information at stake? Would it depend on the existence of alternative means, or on how burdensome those are?
At the very least, if Apple complies with the order, it is difficult to believe that similar requests will not immediately ensue. In fact, one may risk saying that such requests can be expected and will certainly be encouraged in the future. Ultimately, the cracking software thus created could be used and abused in future cases. And this is particularly worrisome considering the lack of a legal framework and of any judicial precedent.
One may be tempted to sacrifice privacy in the interest of public security. That is not, in itself, a wrongful viewpoint; I don’t know anyone who would disagree with it. Except that backdoors have limitations of their own when it comes to fighting terrorism, for instance. It is harder to support backdoors as a means of preventing criminal activities when confronted with their inherent inefficiency and limitations, which seem to go unacknowledged by their supporters.
While companies may be forced to implement such backdoors to provide access to encrypted communications, there is a myriad of alternatives in the marketplace for criminals seeking encrypted products with no such backdoors installed: encryption apps, file encryption, open source products, virtual private networks…
Take Isis, for instance. It has been alleged – without further demonstration – that it has its own open source encrypted communications app. Given the open source nature of that app, the implementation of backdoors would be pointless for the purpose intended to be achieved – apart from weakening the security of everybody relying on encrypted messaging apps.
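To illustrate how low the bar is, here is a minimal sketch using the widely used open-source Python ‘cryptography’ package: strong, backdoor-free encryption in a handful of lines, regardless of what commercial vendors might be compelled to build into their products. The message below is, of course, invented for the example.

```python
# Minimal sketch: strong symmetric encryption via the open-source Python
# 'cryptography' package (pip install cryptography). No vendor, and hence
# no vendor-mandated backdoor, stands between the key holder and the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # the key stays with the user alone
cipher = Fernet(key)

token = cipher.encrypt(b"meet at the usual place")
print(cipher.decrypt(token).decode())  # only the key holder can read this
```

Anyone can run this today, which is precisely why mandated backdoors in mainstream products would burden ordinary users while leaving determined criminals untouched.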
That said, one can easily understand Apple’s stance. Having built its reputation on the privacy and security provided by its devices, it is very risky from a commercial viewpoint to be asked to develop software that counters its core business. Indeed, Apple modified its software in 2014 precisely so as to be unable to unlock its smartphones and access its customers’ encrypted data.
The fact that the company is now being asked to help law enforcement authorities by building a backdoor around a security feature that protects its customers’ encrypted content appears to be just another way of achieving the same outcome – under a different designation.
This goes way further than requiring companies to comply with a lawful order and warrant to the extent they are able to. Requesting private companies to create a tool intended to weaken the security of their own operating systems goes beyond any good sense. Indeed, it amounts to requiring (forcing?) private companies to create and deliver to law enforcement authorities hacking tools which put everyone’s privacy and cybersecurity at risk.
And if this becomes an accepted requirement in democratic systems, whether by precedent or through legislative changes, well, one can only wonder with what enthusiasm such news will be welcomed by repressive regimes eager to expand their surveillance powers.
From an EU viewpoint, considering how uncertain the future of the Privacy Shield framework is, and despite the existing divergences among EU Member States in respect of encryption, this whole case certainly does not solve any trust issues regarding the security of the data transferred to the US.