The Focus on Terror Has Distorted the Debate on Encryption
Answers to the question: 'Why Does Encryption Matter?'
Surveillance has hit the headlines again. The UK Data Retention and Investigatory Powers Act, or DRIPA, which passed last year after just 24 hours’ debate, was ruled unlawful by the High Court in a landmark case. DRIPA was an emergency measure to allow law enforcement agencies access to communications data, and the ruling puts even more pressure on UK Home Secretary Theresa May’s forthcoming Investigatory Powers Bill, announced in the Queen’s Speech. Last week, David Cameron announced that WhatsApp, Snapchat, iMessage – indeed, any encrypted messaging system – could be banned under new laws. In the fight against terrorism, the security services’ ability to intercept communications by would-be violent extremists is said to be paramount. ‘In our country,’ said Cameron, speaking in January, ‘do we want to allow a means of communication between people which we cannot read? My answer is no, we must not.’
That same week, a few thousand miles away in Milan, Hacking Team – a ‘does what it says on the tin’ company which specialises in creating surveillance software for governments – suffered a massive hack of its own. Imagine the worst Monday morning you’ve ever had. Its owners woke up to find that every email, every contract, every piece of software they’d developed had been stolen – hacked – and was being spread across the internet. In the early hours of the morning, the hashtag #IsHackingTeamAwakeYet began to trend as people waited for the company to discover the catastrophic breach.
Linking these two events is the subject of encryption. ‘Encryption,’ writes Jamie Bartlett in The Dark Net, ‘is the art and science of keeping things secret from people you don’t want to know them, while revealing them to those you do.’ In reality, we all use it. Online banking, Facebook and Google all use encryption to help deliver their services and keep data safe. So how important is encryption to the average Brit? Is it doing more harm than good? Last week’s events offer some timely perspective.
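To make that definition concrete, here is a minimal sketch of symmetric encryption in Python – the library, key and message below are illustrative assumptions, not anything specific to the services mentioned above. The point is simply that whoever holds the key can read the message, and nobody else can:

    # A minimal sketch using the 'cryptography' package (pip install cryptography).
    # Key, message and variable names are illustrative only.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()            # the secret: anyone holding this can decrypt
    cipher = Fernet(key)

    token = cipher.encrypt(b'meet at the usual place at 7')
    print(token)                           # scrambled ciphertext, meaningless without the key
    print(cipher.decrypt(token).decode())  # the original message, recovered with the key

Real services layer a great deal more on top of this – key exchange, authentication, forward secrecy – but the basic bargain is the same: no key, no message.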
Hacking Team lost control of everything. Their contracts, their contact details and, most importantly, the source code of their software. They say you can judge a man by the company he keeps. You can certainly judge a company by its clients.
Alongside the US, Hacking Team sold software to countries whose human rights records weren’t up to much. Ethiopia, where political opposition members are frequently detained in black sites. Egypt, where human rights activists have been tortured while under arbitrary detention. The less said about the Sudanese human rights record the better. Bahrain, Kazakhstan, Morocco, Russia, Saudi Arabia, Azerbaijan, Turkey. The list is a long one. The assumption is that such software was being used by intelligence agencies worldwide. Encryption is frequently described in shadowy terms: in fact, in countries where human rights, freedom of expression and freedom of association are all threatened, it is a fundamental tool for keeping activists safe from governments who are out to get them. While the type of software sold by Hacking Team might be able to break into encrypted services, doing so is certainly much more difficult than getting into services that use no encryption at all.
Italy, Switzerland and South Korea all bought Hacking Team software, too. The UK is notably absent: leaked emails published on WikiLeaks show our government trialled the kit, but the purchase never happened.
From their commercial success alone, it is reasonable to think that Hacking Team’s software was the real deal. Contracts worth millions of pounds were sold to dozens of governments, armies and police forces. And since 5 July, that software has all been in the public domain, free for anyone to use.
In the same week, we have the government’s announcement that WhatsApp and Snapchat may be banned because they rely on end-to-end encryption to protect their contents. It’s a bit like the government mooting a ban on changing the locks on your front door because terrorists lock their houses too, in the same week that a lorry jack-knifes on the Westminster roundabout and spills thousands of Chubb master keys.
You are far more likely to be a target of someone unduly interested in your emails, text messages and photos than a target of terror. The Hacking Team leak underlines how easy it is becoming for anyone with criminal intent and a modicum of technical skill to access our personal data. Who knows who might now be using this software? The only way to take responsibility for keeping your private data private is to rely on encryption.
It shouldn’t be difficult to convince anyone that taking steps to protect their data is worth doing: bank details, business transactions, personal communications – there are serious consequences if this kind of information is easy to get hold of. The massive leak of intimate photos from dozens of celebrities in 2014 underlined the embarrassment a seemingly innocuous bit of curious hacking can cause. It isn’t a case of ‘having nothing to hide’: you have an awful lot to ‘hide’, not because you are up to anything, but simply because so much of your personal information is now being transported electronically. The government should also consider this: there is a real opportunity to save money using encryption, because its widespread uptake would almost certainly reduce the levels of petty cybercrime.
Basic encryption on services like Facebook and WhatsApp won’t protect suspected terrorists, either. If the security services want to get to you, they will. This level of encryption is like a front door lock: probably enough to keep you safe from most petty crime, but it won’t stop the police getting in. The resources available to the intelligence services are massive and will grow over the next few years. As the former head of GCHQ David Omand has said, ‘the intelligence machine never stops’. If the security services can’t intercept communications between suspects, they will resort to more intrusive methods like bugging, personal surveillance and skilled human intelligence that surgically targets them.
Banning encryption is not the answer. Weakening it is not the answer. Like a lock on your front door, encryption is a basic requirement for keeping your valuables safe. And like a lock on your front door, if you really are up to no good, it won’t protect you from our intelligence organisations. We need better education about life online and the risks it poses. We need to address the concerns of the increasing number of people worried about companies and criminals snooping on their data. Banning the one tool that might protect us would be a mistake.
Alex Krasodomski is a Researcher at DEMOS: http://ow.ly/PMvWy
Twitter: @akrasodomski