Your Thoughts: WhatsApp’s Privacy Protection & Terrorism

Following the recent terrorist attack in Westminster, London, calls were made to ban end-to-end encryption, such as that used by WhatsApp, the messaging platform blamed for allegedly serving as a tool for the attacker. This leaves many with the question: do those in positions of political power still think it unnecessary to understand technology?

This week’s ‘Your Thoughts’ has heard from many security specialists, researchers and legal professionals on the ins and outs of WhatsApp’s privacy protection debate.

Kyle Wilhoit, Senior Security Researcher, DomainTools:

The idea of having a perfect end-to-end encryption solution with backdoors embedded only for police sounds great, in theory. However, in practice, it’s not possible. If a backdoor is embedded into an application or service, it’s present for anyone to find and use. The only difference between police and criminals at that point is awareness of the backdoor and intent.

The ultimate victims are the end user and the organization required to comply with embedding vulnerabilities to allow for backdoors. Having embedded vulnerabilities leaves the end user vulnerable to criminals who leverage the backdoor that the organization willingly put into place. You can’t necessarily control who finds or uses this vulnerability once the application is distributed and used.

Javvad Malik, Security Advocate, AlienVault:

Today, as we stand with technology and encryption deployment, backdoors simply aren’t possible. It’s an all or nothing approach. If backdoors are built in, then they could be exploited by anyone, not just authorised bodies.

Lee Munson, Security Researcher:

Westminster gets tough on terrorists. MPs clampdown on encrypted communications. Amber Rudd foils imminent attack while chatting on WhatsApp.

Gavin Millard, EMEA Technical Director, Tenable Network Security:

As the computational power, complexity and value of these devices increases, the probability they’ll be targeted by cyber criminals to monetize security flaws will also rise. Smartphones are a particular weak spot, with cherished photos being stored and rarely backed up.

As with traditional IT equipment, it’s important connected devices are kept up to date, applying fixes the vendors release in a timely manner.

David Meltzer, Chief Technology Officer, Tripwire:

You can have true end-to-end encryption that nobody but the participants can read, or you can have a system where a central authority can decrypt any message they want. It doesn’t make any sense to suggest that you can have both. It is either one or the other. It is a reasonable policy position to believe you should have a government backdoor in messaging systems, but this always worries security experts because that same backdoor you create for the government inevitably creates the potential for misuse, abuse, and being exploited by others.
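The reason only one of the two is possible comes down to where the keys live. The toy key-agreement sketch below illustrates the idea: two endpoints derive a shared secret that a server relaying their public values never learns. The parameters here are deliberately tiny and insecure; real systems, such as the Signal protocol on which WhatsApp is built, use vetted elliptic curves and authenticated exchanges.

```python
# Toy Diffie-Hellman key agreement -- illustrative only, NOT secure parameters.
import hashlib
import secrets

P = 2 ** 127 - 1   # a small Mersenne prime; far too weak for real use
G = 3

# Each participant keeps a private exponent; only the public values A and B
# ever cross the wire (and so are all a relaying server could see).
a = secrets.randbelow(P - 2) + 2          # Alice's private key
b = secrets.randbelow(P - 2) + 2          # Bob's private key
A = pow(G, a, P)                          # Alice's public value, sent to Bob
B = pow(G, b, P)                          # Bob's public value, sent to Alice

# Both endpoints derive the same secret; the server, holding only A and B,
# cannot compute it without solving a discrete logarithm.
alice_key = hashlib.sha256(str(pow(B, a, P)).encode()).hexdigest()
bob_key = hashlib.sha256(str(pow(A, b, P)).encode()).hexdigest()
assert alice_key == bob_key
```

A "central authority can decrypt" design, by contrast, requires the keys (or a copy of them) to exist somewhere other than the two endpoints, which is exactly the backdoor the experts above warn about.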

David Emm, Principal Security Researcher, Kaspersky Lab:

The recent terrorist attack in Westminster has brought with it renewed questions about the use of end-to-end encryption by messaging services such as WhatsApp. In particular, the Home Secretary Amber Rudd has appealed to Internet companies to provide a way for government to inspect the communications of those suspected of criminal activity, for example terrorists. Others have even called for a blanket ban on end-to-end encryption altogether.

In reality however, both of these approaches are flawed. The requirement for application vendors who use encryption to provide a way for government or law enforcement agencies to ‘see through’ encryption, poses some real dangers. Creating a ‘backdoor’ to decipher encrypted traffic is akin to leaving a key to your front door under the mat outside. Your intention is for it to be used only by those you have told about it. But if someone else discovers it, you’d be in trouble. Similarly, if a government backdoor were to fall into the wrong hands, cybercriminals, foreign governments or anyone else might also be able to inspect encrypted traffic – thereby undermining not only personal privacy, but corporate or national security. It would effectively create a zero-day (i.e. unpatched) vulnerability in the application.

This places application vendors in an invidious position. In response to growing privacy concerns in recent years, more vendors have implemented encryption to secure their customers’ communications. They’re unlikely to be happy about switching to a ‘snoopable’ form of encryption – as illustrated by the stand-off between Apple and the FBI last year.

A blanket ban on encryption would be just as dangerous. Law-abiding citizens and organisations would seek to comply with such legislation – compromising their privacy. But cybercriminals would either make use of encryption capabilities developed in another country (i.e. beyond the reach of the UK government), or implement encryption for themselves.

There’s an inherent tension between privacy and security. This isn’t going to disappear, although the emphasis may shift depending on the geo-political situation and security context at any given time. Amber Rudd must surely be conscious of the fact that there’s no way to restrict the use of encryption to honest, law-abiding citizens. However, at the same time, the government has made it clear that it wants organisations in the UK to protect themselves from cybercriminals and other would-be intruders. There are steps organisations can take to do this such as running fully updated software, performing regular security audits on their website code and penetration testing their infrastructure. However, since no company can guarantee 100% that its systems will not be breached, encryption is essential to ensure that such a breach doesn’t result in the loss of sensitive information. The best way for organisations to combat cyber-attacks is by putting in place an effective cyber-security strategy before the company becomes a target.

Julian Sheppard, Director Computer Forensics EMEA, KrolLDiscovery:

Can you imagine if this were a call to build a backdoor into your end-to-end private banking transactions or your day-to-day credit card payment authorisations with Amazon? The problem lies in ensuring we always have totally secure communication methods when we need them, and in trusting that the people who design these apps provide absolute security to end users. Of course, it’s unfortunate that terrorists favour these encrypted communication technologies over plain text, especially as they know that messages in transit cannot be intercepted even by the best eyes and ears in the technology world.

Not only are messages transmitted with end-to-end encryption, but when they are stored on a phone they are held in an encrypted database; this includes any backups taken of the phone. Furthermore, the device is generally also protected from casual reading of the messages in the WhatsApp app by a PIN or passcode preventing unauthorised access to the phone. Overcoming access to the phone can be problematic without the owner’s assistance (if not also risky or impossible), as we know from the Apple v FBI dispute over the San Bernardino terrorist’s iPhone from the 2015 attack. Fear of wiping the device through too many failed PIN entries was enough to initially hold back the best investigators and seasoned forensic minds. However, the PIN entry was eventually circumvented and the data from the phone was gained. This is where our best hope presently lies for gaining access to messages sent using secure apps like WhatsApp: the constant battle between phone provider security and those forensic and hacker minds finding vulnerabilities in hardware and software designs. At the moment, overcoming the PIN/passcode barrier is the first hurdle, and clearing it will at least give us a chance to manually read the data on the phone, maybe even capture all of it. This is a constant battle where we win some days, only to have the door close the next.
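A little arithmetic shows why the PIN barrier, rather than the encryption itself, is where forensic effort concentrates: a four-digit PIN has only 10,000 possibilities, so once candidates can be tried offline, free of the device’s wipe-after-too-many-failures protection, the search is trivial. A minimal sketch, with an illustrative PBKDF2 key derivation and a hypothetical salt, iteration count and PIN:

```python
# Offline brute-force of a key derived from a 4-digit PIN.
# Salt, iteration count and PIN are illustrative; real devices bind the key
# to tamper-resistant hardware precisely to prevent this offline attack.
import hashlib

SALT = b"device-unique-salt"      # hypothetical per-device salt
ITERATIONS = 10_000               # deliberately low so the demo runs quickly

def derive_key(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

target = derive_key("0137")       # stand-in for the key we want to recover

# Only 10,000 candidates exist for a 4-digit PIN, so we just try them all.
found = next(p for p in (f"{i:04d}" for i in range(10_000))
             if derive_key(p) == target)
assert found == "0137"
```

This is why the failed-attempt wipe mattered so much in the San Bernardino case: it forces the attack to run on the device itself, at the device’s pace.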

Finally, to the point of trusting apps to be secure. In 2014, TrueCrypt ceased ongoing development with an announcement on its own website: ‘WARNING: Using TrueCrypt is not secure as it may contain unfixed security issues’. People quickly lost trust in its security and sought alternative secure methods. I strongly believe that should users lose trust in WhatsApp’s security through any rumours or actual backdoors, they will quickly move on to the next secure communications method on the market.

Paul Holland, Founder and CEO, Beyond Encryption:

The Westminster Terror attack, and subsequent revelations regarding the usage of encryption, present challenges to all wishing to protect their identity and data as part of their civil rights. With Chancellor Hammond’s announcement about the government’s £1.9 billion investment in UK’s cyber-crime defences, encryption will remain at the heart of the debate.

Encryption mechanisms have existed for thousands of years, serving the same purpose then as they do now: seeking to safeguard the user’s communications from unwanted eyes. The UK’s pride in its rich encryption heritage is clear to see from the significant funds that have been invested into Bletchley Park, home of the Enigma code breakers, in recent years. In 2018 the UK will launch its first ever College of Cyber Security on the site.

Naturally the population want their civil rights to be respected when protecting their sensitive data, whilst still being protected by the shroud of counter terrorism legislation in cases where encryption mechanisms are abused for illegal purposes.

Encryption employs a mathematical algorithm and keys to ‘scramble’ messages, thereby rendering them unreadable in the absence of the key (or multiple keys). By way of example, with typical ‘military grade’ encryption methodology: if every atom on Earth (a lot of atoms) were a computer, each capable of trying ten billion keys a second, it would still take about 2.84 billion years before you reached the key that enabled you to ‘unscramble’ the message.
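That figure is easy to sanity-check for a 256-bit key, taking a commonly cited estimate of roughly 1.3 × 10^50 atoms on Earth:

```python
# Back-of-envelope check of the exhaustive-search claim for a 256-bit key.
ATOMS_ON_EARTH = 1.33e50          # commonly cited order-of-magnitude estimate
KEYS_PER_SECOND = 1e10            # ten billion guesses per atom-computer
SECONDS_PER_YEAR = 3.156e7

keyspace = 2 ** 256               # ~1.16e77 possible keys
seconds = keyspace / (ATOMS_ON_EARTH * KEYS_PER_SECOND)
years = seconds / SECONDS_PER_YEAR
print(f"{years:.2e} years")       # billions of years, in line with the claim
```

With these inputs the search takes on the order of 10^9 years, matching the ballpark quoted above; the exact figure shifts with the atom-count estimate used.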

It is here that No 10 and the Home Secretary have called for police and intelligence agencies to have access to messages transmitted using high-profile social media and messaging services.

This presents significant challenges given that in many cases the above keys are held by the message sender/recipient and, as already described, the ‘brute force’ mathematical power required to decrypt the unreadable ‘blob’ of data is impractical.

Brian Lord, Managing Director, PGI Cyber:

The aftermath of any horrific terrorist or criminal act always reignites the debate of “what level of absolute freedom should citizens be prepared to cede in order to help the State preserve the wider security and freedoms that society requires.” The privacy versus security debate raging around WhatsApp’s message content retention, storage and disclosure policy is simply a contemporary iteration.

We live in a world where modern technology can make most things technically possible. So, it is easy to find a purely technical solution that meets these inherent principles:

(a) service providers’ requirement to maintain and grow market share and meet clients’ needs;

(b) nation states’ need to keep their citizens safe from avoidable threats, and to expect those providing services to their nation to behave with corresponding social responsibility;

(c) consumers’ right to access technological products that improve business and private life, which includes the right to privacy.

But consumers’ voracious appetite for new technology, and the diluting effect the Internet has on international borders (for good and bad), mean the actual solution cannot be a purely technical one. The challenge is a major re-calibration, socially, politically and commercially (and ultimately legally), of what 21st Century “digital normality” looks like.

Service providers, such as WhatsApp/Facebook seek a legitimate global market for products, in a world where wider access to the Internet continues to increase global connectivity. And nations will have very different “national security” criteria against which data is needed from such providers within their respective jurisdictions. Some nations’ criteria will certainly be wholly unpalatable and unacceptable to standards imposed elsewhere in the world.

Yet it is not a sustainable position for a global telecommunications provider/enabler to ignore all these implications and to adopt institutional policies and agnostic infrastructure that simply absolve it of managing these 21st Century dilemmas.

Terrorist attacks and criminal activity can never be stopped completely, but making it harder to commit such acts is the responsibility of everyone: state, industry and public alike.

Technology is not the problem. Whenever I give “Cyber” advice to clients, whether at national level to formulate national security policies, or to industry to strike the right balance between security and operational efficacy, it is always the organisation’s sociological and behavioural inhibitors that obstruct safe exploitation of available technology. There is some way to go before we have properly adjusted, from both a risk and benefit perspective, to the dynamic world that we have created.

Helen Goldthorpe, Commercial and IT Associate, Shulmans LLP:

Since the Westminster attack, the inability to access WhatsApp messages has angered the intelligence services and Home Office, which have called for backdoor access to content. However, with no messages stored on its servers and with end-to-end encryption, the company stands firm on its position that it’s not technically possible to facilitate this without undermining its security protocols. As security is at the heart of its offering, WhatsApp is in fact under obligations to keep information secure in accordance with the Data Protection Act. When GDPR comes into force in 2018, WhatsApp will be perfectly placed for compliance with the “Privacy by Design” obligations it imposes, in addition to the new draft ePrivacy Regulation which will extend current laws on the security of electronic communications to “over the top” providers such as WhatsApp.

A second reason why WhatsApp is likely to be comfortable with the fact that it cannot currently access the content of messages is that this helps it to argue that the app is merely a “conduit” for messaging, preventing it having any liability under defamation, copyright infringement, anti-terrorism and other laws. Whilst its terms and conditions forbid the use of the service for instigating or encouraging illegal conduct, it is not in its interest to either actively monitor messages, or to amend them to remove such content. Even if it could, doing so may impose liability on WhatsApp if it is deemed to have knowingly distributed or published the content. By ensuring that it does not have access to the messages, WhatsApp minimises the risk of liability.

Although a release of data to the intelligence services under a warrant would not necessarily breach data protection laws, the legal position and market demand for security have led to the creation of a system where this isn’t technically feasible. Given that WhatsApp is unlikely to voluntarily make its system less secure, legislation (which arguably already exists in the Investigatory Powers Act 2016) would need to require the company to create a backdoor. In order to be useful, a legal obligation to store the messages after delivery may also be required. As well as leading to technical and commercial difficulties and a risk of simply moving the problem elsewhere, any such obligation is likely to be the subject of legal challenge. The Investigatory Powers Act 2016 itself replaces a law which was successfully challenged on privacy grounds.

Omri Sigelman, Co-founder & Chief Strategy and Product Officer, NURO Secure Messaging:

The UK Home Secretary wants encryption companies to be legally obliged to provide Governments investigating terrorist acts with the means to decrypt messages.

Already with the Investigatory Powers Bill, the British government has gone “further than any other Western democracy” in its expansion of surveillance powers and its ability to collect bulk data without justifiable reason. In seeking more controls over encryption, the UK government is in danger of repeating its mistake of using the law as a blunt instrument for subverting technology to its will.

Any move to hand nation states the power to decrypt messages simply undermines the privacy of businesses and ordinary consumers. Terrorists would simply find other ways to communicate.

I do believe, however, there is a strong case for outlawing WhatsApp in strictly regulated industries such as banking/finance and law. Last week it emerged that traders, bankers and money managers are using WhatsApp and similar apps as an easy, almost undetectable way to evade compliance. The trend is happening in legal firms too. Meanwhile the balance of power in terms of data rights is shifting away from companies towards individuals. At the same time fines for non-compliance will get heavier.

Once the EU General Data Protection Regulation (GDPR) comes into force in 2018, fines for major organisations could reach £70bn, while smaller businesses could see collective fines reaching £52bn. Against this backdrop legal firms need to ensure they have sufficient measures in place for managing employee behaviour in mobile group chats and total privacy control over the data they share.

A first step is to introduce new policies and procedures that specifically address what types of data can and cannot be shared in standard mobile group chat and collaboration sessions. Additionally, it is worth considering a secure messaging and collaboration platform built for business rather than consumer use. Such systems ensure client information remains private and secure at all times. Firms also retain full ownership of that data along with the encryption keys so they can prove compliance should they need to.

Adopting enterprise-class alternatives to WhatsApp for business communications and collaboration is a far more effective encryption strategy than handing over the keys for authoritarian governments, foreign spies and criminals to exploit.

We would also love to hear more of Your Thoughts on this, so feel free to comment below and tell us what you think!

1 Comment
  1. Professor Izhak Assouline, Lawyer, U.S.A. says

    A terrorist attack is not the name of a product, but a deadly journey of a man or a woman or some together who set their daily goal of committing a massacre and murdering people, women, children and babies. It is the murder of whole families even of those who were not murdered for them. Life has become a terrible nightmare that will never be wiped out. These terrible acts must not be allowed to become part of the world’s agenda. You hear today a murderous attack in this country and the next day in another country, it’s not a brand, it’s a mass murder. We citizens of the world should not accept this as part of the chilling routine of murdering and killing entire families in every country.
