Nobody wants to be a third wheel. Unless you’re a British spy.
Two of the most senior officials at British eavesdropping agency GCHQ say one way that law enforcement could access encrypted messages is to simply add themselves to your conversations.
“It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call,” said Ian Levy, technical director of the U.K.’s National Cyber Security Centre, and Crispin Robinson, cryptanalysis director at GCHQ, in an op-ed for Lawfare.
“The service provider usually controls the identity system and so really decides who’s who and which devices are involved — they’re usually involved in introducing the parties to a chat or call,” they said. “You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication.”
Law enforcement and intelligence agencies have long wanted access to encrypted communications, but have faced strong opposition to breaking the encryption for fears that it would put everyone’s communications at risk, rather than the terror suspects or criminals that the police primarily want to target. In this case, two people using an end-to-end encrypted messaging app would be joined by a third, invisible person — the government — which could listen in at will.
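The mechanics can be illustrated with a deliberately simplified sketch. This is a toy: the XOR "cipher" is a stand-in for real per-recipient encryption, and the names and keys are invented. The point is only that whoever controls the membership list controls who can decrypt:

```python
def toy_encrypt(message: bytes, key: bytes) -> bytes:
    # Stand-in for real per-recipient encryption; XOR is NOT secure,
    # it just makes the fan-out visible. XOR is also its own inverse,
    # so the same function decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(message))

# In a group chat, the sender encrypts the message once per participant,
# using whatever membership list the provider's identity system supplies.
participants = {"alice": b"key-a", "bob": b"key-b"}

# The provider quietly appends an extra "end" -- the ghost participant.
participants["ghost"] = b"key-g"

message = b"meet at noon"
ciphertexts = {name: toy_encrypt(message, key) for name, key in participants.items()}

# Every copy is still end-to-end encrypted, but the ghost decrypts
# its copy like any legitimate member.
recovered = toy_encrypt(ciphertexts["ghost"], participants["ghost"])
```

Nothing about the encryption itself is weakened in this scheme; the trust that breaks is in the provider-run identity system that decides who the "ends" are.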
This solution, Levy and Robinson say, would be “no more intrusive than the virtual crocodile clips” that lawmakers have already authorized police to use to wiretap communications.
Presumably that would require compelled assistance from the tech companies that built the encrypted messaging apps in the first place, like Apple, Facebook’s WhatsApp, Signal, Wire and Wickr. That poses an ethical problem for the companies, which designed their end-to-end encrypted services precisely so that even they can’t access people’s communications, and a technical and legal one: the government would have to ask a court to compel the companies to rework their own technologies to let its spies in.
It wouldn’t be the first time the government’s pushed for compelled assistance.
Only recently, the U.S. government lost its bid to force Facebook to re-architect its Messenger app to allow the government to listen in on suspected gang members. And it’s not just the U.S. or the U.K. Russia, the west’s favorite frenemy, forced Telegram, another encrypted messaging app, to turn over its private keys in an effort to allow its intelligence agencies to snoop on possible kompromat.
Suffice it to say, the U.K.’s plan has drawn strong criticism.
The conversation on exceptional access is a non-starter until the pro- side actually can come up with schemes that work that would satisfy them… this is weak sauce: https://t.co/rYGBizvr9J
— Joseph Lorenzo Hall (@JoeBeOne) November 29, 2018
This proposal for an encryption backdoor by Ian Levy and Crispin Robinson is deeply troubling. Among other concerns, it will severely undermine trust in the services that are subject to any such order – an equity the authors claim to prioritize. https://t.co/zovrsXJFm6
— Robyn Greene (@Robyn_Greene) November 29, 2018
And NSA whistleblower Edward Snowden, an outspoken commentator and critic of global surveillance, branded the move “absolute madness.”
Absolute madness: the British government wants companies to poison their customers’ private conversations by secretly adding the government as a third party, meaning anyone on your friend list would become “your friend plus a spy.” No company-mediated identity could be trusted. https://t.co/8CwoZfBM3K
— Edward Snowden (@Snowden) November 29, 2018
“No company-mediated identity could be trusted,” said Snowden, suggesting that the move would effectively render trust in any end-to-end encrypted messaging app meaningless.
Exactly what the U.K.’s solution looks like isn’t entirely clear, but Mustafa Al-Bassam, a PhD student at University College London, said that the ability for users to verify their keys — which proves the identity of a person in a conversation — in an end-to-end messaging app “is going to be increasingly important” to prevent government manipulation.
WhatsApp and Signal, for example, notify you when a user’s key changes, indicating either that a new device is in use and needs verification, or that the conversation has been manipulated by a third party and isn’t secure.
“They’re proposing to exploit the fact that users don’t verify each other’s public keys, and inject bad keys,” said Al-Bassam.
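That verification typically boils down to comparing key fingerprints out of band. Here’s a minimal sketch of the idea, assuming a fingerprint derived by hashing the public key (as in Signal-style safety numbers); the key values are invented for illustration:

```python
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Derive a short, human-comparable fingerprint from a public key."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group the first 32 hex characters into 4-character blocks,
    # so two people can read them aloud and compare.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

# The key Alice verified with Bob in person earlier (hypothetical value).
verified_key = b"bob-genuine-public-key"

# The key the provider's identity server now claims belongs to Bob.
served_key = b"injected-law-enforcement-key"

if fingerprint(served_key) != fingerprint(verified_key):
    print("WARNING: key changed -- re-verify out of band before trusting")
```

If users actually compare fingerprints over a channel the provider doesn’t control, an injected key shows up as a mismatch; the proposal relies on the fact that, in practice, most users never do.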