A statement by the governments of 7 countries has urged the tech industry to address the “serious concerns where encryption is applied in a way that wholly precludes any legal access to content.” It essentially calls for backdoors into messaging and other technology so that encrypted data can be made accessible to governments legally. While this statement focuses on the challenges posed by end-to-end encryption, that “commitment applies across the range of encrypted services available, including device encryption, custom encrypted applications and encryption across integrated platforms.”
It calls on technology companies to work with governments to take the following steps, “focused on reasonable, technically feasible solutions:”
- Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable;
- Enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and
- Engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.
“Law enforcement,” the statement continues, “has a responsibility to protect citizens by investigating and prosecuting crime and safeguarding the vulnerable. Technology companies also have responsibilities and put in place terms of service for their users that provide them authority to act to protect the public.”
End-to-end encryption that precludes lawful access to the content of communications in any circumstances directly impacts these responsibilities, creating severe risks to public safety in two ways, according to the statement:
- By severely undermining a company’s own ability to identify and respond to violations of their terms of service. This includes responding to the most serious illegal content and activity on its platform, including child sexual exploitation and abuse, violent crime, terrorist propaganda and attack planning; and
- By precluding the ability of law enforcement agencies to access content in limited circumstances where necessary and proportionate to investigate serious crimes and protect national security, where there is lawful authority to do so.
“Concern about these risks has been brought into sharp focus by proposals to apply end-to-end encryption across major messaging services.”
The statement asserts that “in light of these threats, there is increasing consensus across governments and international institutions that action must be taken: while encryption is vital and privacy and cyber security must be protected, that should not come at the expense of wholly precluding law enforcement, and the tech industry itself, from being able to act against the most serious illegal content and activity online.”
In July 2019, the governments of the United Kingdom, United States, Australia, New Zealand and Canada issued a communique, concluding that: “tech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can gain access to data in a readable and usable format. Those companies should also embed the safety of their users in their system designs, enabling them to take action against illegal content.”
A Precedent: The UK Home Office’s Encryption Factsheet
In late 2019, the UK Home Office issued a press release in which it expressed concern that “end-to-end encryption has created significant and avoidable barriers to companies being able to identify and prevent activity by terrorists, child abusers or serious criminals who are using their products or services to cause harm.”
Facebook’s proposals to apply end-to-end encryption to its messaging platforms by default present significant challenges, the press release asserted. On 4 October 2019, the Home Secretary published an open letter to Mark Zuckerberg requesting that Facebook not implement its proposals without ensuring that there is no reduction in user safety and without including a means for law enforcement to obtain lawful access to the content of communications.
In an open letter to Mark Zuckerberg, the UK Home Office called on the company to:
- Embed the safety of the public in system designs, thereby enabling it to continue to act against illegal content effectively with no reduction to safety, and facilitating the prosecution of offenders and safeguarding of victims.
- Enable law enforcement to obtain lawful access to content in a readable and usable format.
- Engage in consultation with governments to facilitate this in a way that is substantive and genuinely influences its design decisions.
- Not implement the proposed changes until it can ensure that the systems they would apply to maintain the safety of their users are fully tested and operational.
Facebook acknowledged that its plans will remove its access to content, which it currently monitors for safety purposes. Mark Zuckerberg said that “We face an inherent trade-off because we will never find all the potential harm we do today when our security systems can see the messages themselves.”
Facebook suggested that increased use of machine learning, artificial intelligence and user reporting will help mitigate the potentially very significant impact of the proposals. The UK Home Office countered that this is not satisfactory.
“Machine learning and artificial intelligence are a key element in advancing the detection of illegal material effectively. But they don’t take away the need for access to content. More than 99% of the content Facebook takes action against – both for child sexual exploitation and terrorism – is identified by its own safety systems and access to content, rather than by reports from users,” said the UK Home Office’s statement.
“Our preferred solution is to work with Facebook to ensure that it does not implement the proposals in a manner which would diminish user safety, and without including a means for law enforcement to obtain lawful access to the content of communications.”
The press release plainly stated that “Law enforcement and other agencies must, in certain circumstances, be able to access data, with strong and independent authorisation and oversight. We do not agree with the assertion that there is a binary choice between security and privacy. We assess that it is possible to develop a lawful, exceptional access solution which would not disproportionately increase cyber-security risk or undermine individuals’ privacy.”
Commentary by Tim Mackey, Principal Security Strategist at Synopsys Software Integrity Group
Economies and societies are based on trust. In a digital world, part of that trust equation is an attestation that two parties are who they say they are. This is accomplished partly via encryption technologies. If the encryption is faulty, say as a result of a poor implementation or reuse of encryption keys, then anyone could potentially gain access to the encrypted data. And once someone has access to the encrypted data, they can potentially modify it as well. In the context of encryption, reuse of encryption keys is an example of a backdoor.
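The weakness Mackey describes can be illustrated concretely. The sketch below, using a deliberately simplified toy stream cipher (the `keystream` construction here is illustrative, not a vetted algorithm), shows why reusing a key and nonce acts like a backdoor: an eavesdropper who captures two ciphertexts encrypted with the same keystream can XOR them together, cancelling the keystream and leaking the XOR of the plaintexts without ever learning the key.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: hash key + nonce + counter repeatedly.
    # Illustrative only; real systems use ciphers like AES-CTR or ChaCha20.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # Stream-cipher encryption: XOR the plaintext with the keystream.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

p1, p2 = b"attack at dawn", b"defend at dusk"
key, nonce = b"secret-key", b"fixed-nonce"  # nonce reused across messages: the flaw

c1 = encrypt(key, nonce, p1)
c2 = encrypt(key, nonce, p2)

# The eavesdropper XORs the two ciphertexts. Because both used the same
# keystream ks: c1 XOR c2 = (p1 XOR ks) XOR (p2 XOR ks) = p1 XOR p2.
xor_of_plaintexts = bytes(a ^ b for a, b in zip(c1, c2))
assert xor_of_plaintexts == bytes(a ^ b for a, b in zip(p1, p2))
```

Once an attacker holds `p1 XOR p2`, standard crib-dragging techniques (guessing likely words in one message) recover both plaintexts; this is the classic "two-time pad" failure, and it is why any mandated mechanism that weakens key or nonce hygiene is dangerous.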
With the growth of technology and digital economies comes a rise in criminal activity. Governments are correct in their assessment that criminals are using encryption technologies to further their activities. Unfortunately, implementing a legislative remedy to this problem creates a different challenge – laws move slower than technology. This means that the legislative remedy could easily turn out to be an exploitable vulnerability that is embedded within all systems and as such very difficult to address.
Since encryption is a key element in the trust equation in digital economies, if a governmental backdoor is part of the DNA of a technology, then other questions are raised – not the least of which is the extent of monitoring any given government might perform and what criteria are used. With digital privacy laws varying globally, access to any backdoor will likely be governed by different criteria in different jurisdictions. Given the recent invalidation of the EU-US Privacy Shield, it’s clear that such implementation details need to be addressed prior to the creation of monitoring backdoors.