Tag Archive for: encryption

End-to-end encryption is good for democracies

Democracies need to accept the mainstream adoption of end-to-end encryption in popular messaging and communications applications like WhatsApp, Signal, iMessage and, in the future according to Meta, Facebook Messenger.

End-to-end encryption, where only the sender and recipient can see the contents of a message, has been around for a long time for people who wish to keep their communications private. It gained particular traction after Edward Snowden’s revelations of mass government surveillance in 2013. Its value has since been consolidated globally as a necessary technical response to the perceived surveillance excesses of technology companies and governments alike and can help protect the safety and rights of people around the world to express themselves freely.

But some liberal democratic governments are pushing back against its adoption by mainstream instant-messaging platforms, going so far as to launch an emotive advertising campaign against it recently in the UK.

Broadly, the Australian government argues that law enforcement and intelligence agencies are finding it harder to obtain usable content directly from the platforms because the platforms cannot decrypt and see the encrypted communications that transit their systems.

While Australia, the other Five Eyes countries, India and Japan have stated their support for strong encryption, in the same joint statement they also call on platforms to work with governments on technically feasible solutions that give law enforcement (with appropriate legal authority) access to content in a readable and usable format. These two things are not mutually exclusive, except when end-to-end encryption is being used. End-to-end encryption is specifically designed to prevent third parties from viewing the content of a communication, thus providing privacy to users.

There are other encryption architectures available for messaging systems, such as client-to-server encryption, but that model is weak on privacy because a third party with access to the server can read all communications. This is the architecture used in services such as WeChat, and previously in Zoom.
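To make the architectural difference concrete, here's a minimal sketch in Python using the PyNaCl library. It's an illustrative toy, not how WhatsApp or Signal actually work (they use the far richer Signal protocol, with ratcheting keys); the point is simply who can read what.

```python
# End-to-end model in miniature (pip install pynacl).
from nacl.public import PrivateKey, Box

alice = PrivateKey.generate()  # Alice's keypair never leaves her device
bob = PrivateKey.generate()    # Bob's keypair never leaves his device

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at 6pm")

# The platform's server only ever relays `ciphertext`. Without either
# private key it cannot decrypt -- that is the end-to-end guarantee.
relayed = ciphertext

# Only Bob, holding his private key, can recover the plaintext.
assert Box(bob, alice.public_key).decrypt(relayed) == b"meet at 6pm"

# In the client-to-server model, by contrast, the server holds the keys,
# decrypts each message on arrival and re-encrypts it to the recipient --
# so anyone with access to the server can read everything in transit.
```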

The seven countries have also argued that implementing end-to-end encryption would threaten the current practice in which platforms proactively scan photos and videos for known child sexual abuse material and report it to authorities.

The problem is that, whether it’s liberal democratic governments monitoring communications for child sexual abuse material or authoritarian governments monitoring for what they deem politically sensitive content, both require decrypting communications (either on the device or on the server) to analyse for ‘harmful’ or ‘illegal’ content, resulting in a third party (and unintended recipient) accessing the encrypted content.

Unfortunately, as we’ve seen with other restrictions on freedom of expression online by liberal democracies, authoritarian countries will cite Western examples when enacting their own policies to police content online. Western countries will be hard pressed to criticise those that censor political content if they themselves don’t support an environment in which strong encryption is the norm.

Fundamentally, any access by law enforcement to communications is an intrusion on privacy. Any liberal democratic government with that power needs to earn and maintain the trust of its citizens. Even if the current government is trusted, future governments may not be. And given that there are cases in Australia alone of authorities using capabilities, powers and data in unexpected ways, there is cause for concern.

Technology companies also have a role to play. They have responsibilities to protect vulnerable users of their platforms, prevent their platforms from being used to facilitate crimes and harmful activities, and help law enforcement (with lawful requests). Together with governments, they should seek solutions that do not demonise or affect the integrity of end-to-end encryption.

Law enforcement doesn’t necessarily need unencrypted content to prosecute a crime. Metadata, which provides information about the communication, can be used to generate leads. But content makes gathering evidence to even obtain a warrant significantly easier.

As Tom Uren writes in his recent ASPI report, The future of assistance to law enforcement in an end-to-end encrypted world, there are options available to protect children using social media from harm. Platforms can use metadata and behavioural analysis to sound the alert when, for example, an adult is contacting multiple children that they haven’t had prior contact with. They can educate and encourage children and parents to report suspicious content, which could provide unencrypted content for the platforms to analyse. Perhaps messages to and from children might not use end-to-end encryption by default.
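To make the metadata-and-behaviour idea concrete, here's a minimal Python sketch of the kind of heuristic Uren describes. The record format and threshold are invented for illustration; real platform systems are far more sophisticated.

```python
# Hypothetical metadata-only safety heuristic: flag adult accounts that
# initiate contact with several child accounts they've never messaged before.
from collections import defaultdict

def flag_suspicious_adults(events, threshold=3):
    """events: (sender_id, sender_is_adult, recipient_id,
    recipient_is_child, had_prior_contact) metadata records."""
    new_child_contacts = defaultdict(set)
    for sender, is_adult, recipient, is_child, prior in events:
        if is_adult and is_child and not prior:
            new_child_contacts[sender].add(recipient)
    return {s for s, kids in new_child_contacts.items() if len(kids) >= threshold}

events = [
    ("u1", True, "c1", True, False),
    ("u1", True, "c2", True, False),
    ("u1", True, "c3", True, False),
    ("u2", True, "c1", True, True),  # prior contact, so not counted
]
print(flag_suspicious_adults(events))  # {'u1'}
```

Nothing in this requires reading message content: the signal comes entirely from who is contacting whom.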

Law enforcement agencies seeking access to evidence for a crime still have a wide array of tools in their belts. Police can access metadata from social media companies about users’ accounts or from service providers in Australia (such as ISPs) that are required to store certain metadata for two years. Australian law enforcement can take over social media accounts or apply for computer access warrants to obtain evidence from devices on which data is usually unencrypted. These investigative techniques are more challenging and can’t necessarily be done at scale.

End-to-end encryption, especially in combination with anonymising software and the growing complexity of modern internet networks, which includes a move towards a decentralised Web 3.0, does make the job of online policing more difficult.

We need to find a middle ground. There is a nuanced debate to be had about how users of online services can be protected both by law enforcement and by platforms, while at the same time governments ensure that the right to privacy, enforced with strong encryption, is not dismissed easily.

Electronic surveillance law reform a step in the right direction

The legislative framework governing electronic surveillance by Australia’s policing and intelligence agencies is undergoing major reforms. There have long been calls for such an overhaul given the significant changes in the way we communicate since the first of these laws was passed in 1979. However, developing a new, concise, ‘streamlined’ and ‘future-proofed’ act that lets security agencies do their jobs without excessively compromising Australians’ right to privacy is a task complicated by the need to weigh the equally compelling security concerns of federal agencies against the privacy concerns of civil society.

In December last year, the Department of Home Affairs invited public submissions on a discussion paper about the planned legislative reforms at a granular, operational level. The paper shows a genuine commitment to regulating agency powers and actions in accordance with both sides of this argument. But while government agencies and industry are engaged at a detailed, operational level with how the reforms will work in practice, the civil-society debate remains overwhelmingly a high-level conceptual discussion about security and what a rights-based approach looks like in theory.

The public only really appreciates the overarching themes and values of security and privacy and the prospect of Orwellian state power. This is likely (at least in part) because getting across former Australian Security Intelligence Organisation chief Dennis Richardson’s 2020 review of the legal framework of the national intelligence community, the government’s response and the Department of Home Affairs paperwork on it means wading through a whopping 1,485 pages combined—and each document requires significant sectoral knowledge and time to fully understand.

The three acts set to be repealed and replaced are the Telecommunications (Interception and Access) Act 1979, parts of the Australian Security Intelligence Organisation Act 1979 and the more recent Surveillance Devices Act 2004.

Technological advancements have made communication faster, cheaper, more convenient and in some cases more private. But in doing so, they have also made policing more difficult, particularly due to the popular use of encryption and anonymisation software.

The current laws have not proven adaptable to the new operational and legal challenges resulting from these iterative advancements and the ways criminals have exploited new technology. A host of quick-fix attempts through legislative amendments has succeeded only in creating a ‘patchwork of overlapping, and at times inconsistent and incompatible parts,’ according to Home Affairs’ discussion paper.

This makes it difficult for law enforcement to frustrate and prosecute organised crime activities like drug trafficking, child sexual exploitation, and the recruitment, financing and organisation of violent extremist attacks. With an archaic legal framework supporting operations, it’s become hard to deliver security outcomes without compromising civil liberties.

As a result, Richardson’s review recommended that the relevant laws be scrapped and replaced with an entirely new, fit-for-purpose act.

This imperative is the result of three key developments since the laws were passed: in how we communicate, in how the associated industry and infrastructure work, and in how organised crime is, for want of a better word, organised.

First, telecommunications has completely transformed since the original laws were passed. Communication is no longer unprotected, point-to-point and with clear endpoints. The internet and over-the-top messaging services such as WhatsApp and Signal, which provide communications services over data and may encrypt those communications, have reduced the ability of security agencies to legally monitor criminal activity under the current framework.

End-to-end encryption of the entire communications channel and anonymisation software that hides the user’s identity have made policing more difficult. Even if law enforcement agencies can legally ‘tap the line’, they can’t see the content, and even if they can see the content or see that two devices are communicating, they can’t necessarily identify the individuals communicating.

For a long time, the telecommunications infrastructure that this data runs across hasn’t been fully government owned and controlled. The physical components and electronic layers of communication are now likely to have multiple owners. Whereas a telephone company used to have all your records, you can now use your mobile phone to connect to wi-fi and then to an application providing over-the-top messaging services.

Private industry now contributes to decision-making and regulation, bringing necessary considerations about market forces and consumer confidence. The government understands the importance of this contribution and wants it to continue, which means that industry also plays an active role in the legal, ethical and implementation aspects of this national security issue.

Telecommunications has evolved from fixed-line telephones to smartphones, and communication over the internet (and all the hidden infrastructure that makes it work) has become more complex. As part of the reform process, industry providers are rightly being engaged on the implementation issues most pressing for them, generally regarding compliance costs, feasibility and the administrative burden for companies and government agencies. But public expectations of industry responses to these reforms are complicated by the fact that Australians are now both citizens with a right to privacy and paying customers. We may choose providers based on the privacy and security they can offer.

Transnational and serious organised crime groups have exploited the privacy and ease of communications that technology can provide within and between criminal groups in Australia and internationally. Through these new ways of communicating, organised crime has been able to leverage the increasing interconnectedness of nations’ economies and societies at a global level to elevate its business models from regional to global in terms of supply chains and target markets.

The development of a new act is a step in the right direction for both these interests, because it will align both the conversation and the legislation with the current technology and threat landscape.

Tricky questions about how to balance security and liberty in practice remain, both in developing the new law and in the pending review of the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018.

We should keep, as a guiding principle, the need to hold online law enforcement to a rights-based standard that is equal to or even higher than the standard for offline work due to the heightened capacity for insecurity and abuse by other actors in the online world.

WhatsApp and the right to encrypt

Encrypted messaging has become a very difficult business. Despite their reach, it’s hard to make money from messaging apps. And their key service—encryption—is under constant attack from actors across the political spectrum.

Authoritarian governments are furious because citizens are using encrypted messaging to organise dissent. In democratic countries, law enforcement agencies hate the way it facilitates criminal behaviour of all kinds.

And democratically minded publics are incensed because right-wing extremists on the fringe and in the mainstream have used it to spread disinformation, to subvert elections and to organise political violence and radicalise others.

On top of that, elected politicians are increasingly addicted to using it for policy discussions and reporting, undermining basic democratic transparency and accountability.

Services like WhatsApp, Signal and Telegram are deeply entangled in high-stakes political struggles globally over fundamental rights such as privacy and peaceful dissent, legitimate law enforcement imperatives, rising authoritarianism, and the existential threats to democracies from disinformation—all happening in the increasingly vulnerable and dangerous realm of cyberspace.

At the same time, encrypted messaging services are enormously popular. At least half the world’s people have at least one encrypted messaging app on their phones. Of these, WhatsApp has by far the greatest reach, used by about 90% of people in most countries.

In an interview with the director of ASPI’s International Cyber Policy Centre, Fergus Hanson, WhatsApp CEO Will Cathcart outlined his approach to navigating hazardous political waters.

Since taking over the top job at WhatsApp in 2019, Cathcart has built a reputation as a strong advocate for encrypted messaging as a service essential to protecting against threats to privacy, democracy and cybersecurity.

Under his leadership, the company has mounted high-profile challenges in India and Brazil against government attempts to pry open the service to enable surveillance of messaging.

WhatsApp has targeted companies that assist state surveillance. In 2019, it initiated a suit in a US federal court against Israeli tech company NSO Group. The suit alleged that NSO had developed its Pegasus spyware to penetrate encrypted messaging services, helping Mexico, Saudi Arabia, the United Arab Emirates, Bahrain, Morocco and Kazakhstan target journalists, academics and civil society activists. The case is still working its way through the US justice system.

‘This is about the fight for a secure internet,’ Cathcart says.

In his conversation with ASPI, Cathcart talked about WhatsApp’s origins. One of the company’s founders, Jan Koum, was born in the Soviet Union. Scarred by the experience of totalitarianism, he held a deep-seated belief that human beings need to be able to talk to someone in private without anyone listening in.

‘Privacy, democratic values are in our DNA,’ Cathcart explains.

That DNA will be tested even further in coming years as WhatsApp attempts to enter more markets in places with varying levels of democracy. In response to a question about the service’s future in Hong Kong, Cathcart acknowledged that ‘we run the risk of being blocked everywhere we operate’.

But he goes on to say that the issue is bigger than tech companies. They can’t fight the problem of authoritarianism on their own and be profitable in a sector that demands global scalability to remain in business.

Rather, democratic governments need to be thinking about how to ‘commandeer tech companies in the fight to spread liberal values’. Partnering more closely with tech companies on combating disinformation and introducing regulation to bake more privacy into digital technologies are some of the approaches suggested by Cathcart.

WhatsApp made news this year by launching a lawsuit against the Indian government in India’s supreme court on constitutional grounds.

‘What the Indian government wants,’ explains Cathcart, ‘is traceability. But we’re arguing that this is inconsistent with the privacy guarantees in the Indian constitution.’

WhatsApp contends that the Indian authorities are asking it to break its end-to-end encryption. The Indian government denies this, but experts support WhatsApp’s assessment.

When asked what response he expects from the Indian government, he says that WhatsApp could be blocked just like it was in Brazil. In 2015–16, the service was blocked three times and a Facebook executive jailed, although those actions were later overturned by an appellate court decision which ruled that end-to-end encryption was important for human rights in the country.

Since then, Brazil has enacted a general data protection law and an internet bill of rights that provide a more coherent legal framework for the preservation of encrypted messaging.

WhatsApp’s case against India is happening against a backdrop of increasingly authoritarian moves by the Modi government to control information. However, the Indian government has argued that it is only following international precedent like Australia’s anti-encryption laws.

Still, Cathcart hopes that WhatsApp can remain part of India’s growth story. In fact, India is WhatsApp’s largest market, with 400 million users.

This points to a fundamental tension that Cathcart has to manage. Encrypted messaging services don’t make profits on their own. WhatsApp competitor Signal—started by WhatsApp founder Brian Acton—operates as a not-for-profit. Telegram has raised operating funds through initial coin offerings and its own cryptocurrency, and CEO Pavel Durov has said that it won’t allow ads or sell user data to raise revenue. But, basically, both of these services rely on billionaire subsidies.

Cathcart admits that WhatsApp also has yet to make a profit, but hopes to do so through a mixture of business services, advertising and financial services.

For the business sector, WhatsApp’s plans include direct business-to-customer messaging and services, like getting a boarding pass delivered via WhatsApp.

Advertising initiatives include offering businesses ways to find new customers through ads on Facebook and Instagram, which could be counted as revenue for WhatsApp.

The company will also market financial services like digital banking and money transfers in emerging markets, where low literacy and underbanking are widespread problems.

But if the Indian court battle doesn’t go WhatsApp’s way, a big chunk of the company’s future revenue will be in doubt. And if WhatsApp decides to acquiesce to the Indian government’s position, that might irrevocably damage trust in the brand’s privacy DNA, which is already shaky due to perceptions of data-sharing with Facebook.

Cathcart insists there’s a middle path here—that it’s possible both to have secure encryption for users and to assist law enforcement. He says the company is more than willing to work with law enforcement if it’s done through proper legal channels in accordance with human rights standards.

As an example, Cathcart says that WhatsApp doesn’t see individual messages, but it has a reporting mechanism for users to report suspicious activity. Content moderators can look at some metadata, group names and patterns of behaviour that might be indicative of criminal and inauthentic activity. They’ve also added a Google button to encourage users to factcheck information.

Another avenue for countering misinformation and disinformation is changing the design of the product. One change that WhatsApp made in 2018, after vigilante killings in India, was to adjust the forwarding settings so that content could only be forwarded once, limiting the speed at which harmful messages could spread. But presumably this limits pro-democracy messaging too.

Cathcart says these measures have seen large decreases in forwarding across the WhatsApp system. He also points to the company’s recent partnering with Brazil’s election systems, on things like factchecking and shutting down fake accounts. Again, working with governments is key, says Cathcart, as are government-run public awareness campaigns on disinformation.

But his broader message for law enforcement is that encryption protects the internet. Just like in physical spaces, there should be a limit to how much law enforcement can do in cyberspace to solve crimes. For example, in an age of increasingly smart homes, police shouldn’t be able to get into your living room whenever they want; a warrant should be required, like it is in the physical world. Breaking encryption might help solve some crimes, but it will make us less safe overall.

Somebody might hear us: the future of secure communications

I was pleased to become involved when ASPI was asked to run a workshop on secure communication technologies late last year. It’s been a long time since I was in the world of signals intelligence and I hadn’t kept up with technological developments. This was a good chance to catch up, and you can read a summary of my findings in my new ASPI report.

Of course, the fundamentals of securing communications haven’t changed. In a contested environment, you want to make life as hard for an adversary as you can, consistent with keeping the imposts on your own resources manageable. And defence-in-depth is preferable to relying on a possible single point of failure.

The paper breaks down that idea into three broad steps (a rough worked illustration follows the list):

  • reducing the probability of a signal being detected
  • reducing the probability of a signal being accurately or completely collected if it’s detected
  • reducing the probability of a signal being exploited if it’s detected and collected.
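Defence-in-depth works because those probabilities compound: an adversary must succeed at every step before the content is at risk. As a rough illustration with invented numbers (assumptions for exposition, not figures from the report):

```latex
P(\text{exploited}) = P(\text{detected}) \times P(\text{collected} \mid \text{detected})
                      \times P(\text{decrypted} \mid \text{collected})
                    = 0.5 \times 0.2 \times 0.01 = 0.001
```

Even where each individual safeguard is imperfect, the product of several modest probabilities quickly becomes very small, which is the quantitative case for layering protections rather than relying on a single point of defence.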

Ideally, you would be able to exchange information without an adversary even being aware of the existence of a communication. That’s clearly possible sometimes—for example, if you can keep everything in a closed system, such as a fibre-optic cable ‘air gapped’ from the outside world. But that won’t always be possible without unduly limiting your own capability; deployed military forces usually can’t be constrained by the reach of cables. Information sometimes needs to travel over pathways that are exposed to the outside world, such as by radio transmission.

But it turns out that some clever applications of physics can limit—but not entirely eliminate—the ability of an adversary to detect the signal. One example given in the paper is the experimental use of ultraviolet (UV) communications to broadcast information. Because some wavelengths of UV are strongly absorbed in the atmosphere, signals can’t propagate much more than 3–5 kilometres, though the signal strength can be quite high at shorter ranges. Deployed land forces could use UV signals to communicate with nearby friendly forces, with little risk of interception by adversaries outside a fairly well-defined perimeter. And, unlike current short-range military radio communications, the signal remains difficult to detect even if the adversary has a direct line of sight. Another bonus is that moving to UV frequencies from standard radio frequencies increases the data rate and decreases the time required to transmit a message. The downside to UV is clearly the limitation on range.

Greater range with only slightly less security is possible via the use of lasers for communications between two points. Laser beams can be very tightly collimated and carry high data rates. The tightness of the beam means that interception is unlikely, though in this case detection is possible because some light will scatter off atmospheric particles. Because of the randomness of scattering, collection of enough of the signal for accurate demodulation is difficult—the adversary knows you’re sending information but is unlikely to be able to exploit it. In places where laser communication isn’t possible between two ground-based locations, satellite relays can provide a bridge.

Space-based systems are perhaps the most likely applications for emerging communications technologies. In the US, NASA and the Defense Advanced Research Projects Agency are working on a number of techniques for linking spacecraft with high-data-rate and low-probability-of-intercept communications, for a range of civilian and military uses. X-ray laser beams have very short ranges on earth due to the rapid absorption of energy in the atmosphere, but they work fine in the near vacuum of space, and the high frequencies can provide gigabit-per-second data rates. In space, nobody can hear you beam.

Defence capability planning always has to take a worst-case view. The history of espionage shows that clever adversaries can find ways to detect and collect even carefully hidden signals. A quarter of a century ago, frequency-hopping radios and spread-spectrum techniques provided a level of protection against interception. But Moore’s law has enabled the development of broadband collection and analysis systems that render those techniques much less secure. Even the emerging technologies sampled in my paper have already generated some thinking about how they might be defeated. In one case, different arms of the US government have issued requests for tender for proof-of-concept communications technologies and for systems capable of detecting and possibly exploiting them.

Since absolute security of communications can’t be assumed, there will always be a place for encryption. If an adversary manages to defeat the compounding low probabilities of detection and collection, the encryption of the content will then present it with another step with a low probability of success. The future of encryption is worth a study in its own right, but don’t believe everything you read about quantum computers spelling the end for encryption; quantum-resistant encryption is possible.

Communications will never be entirely secure—the presence of human beings on the ends of communication chains almost guarantees that, even in the absence of clever methods to defeat new technologies. But there are enough clever ideas about new security techniques to ensure that signals intelligence people will have to keep working hard for their living.

Encryption deal done, but more work needed

Yesterday the government and opposition came to an agreement to pass the Assistance and Access Bill 2018 just before parliament rises. The bill rocketed towards the top of the political agenda when the government demanded that the opposition allow it to pass this week—the final parliamentary sitting week of the year. Labor reportedly secured assurances from the government that the committee reviewing the bill (the Parliamentary Joint Committee on Intelligence and Security) will continue to do so into next year, that the bill’s powers will be limited to serious offences and given more oversight, and that the term ‘systemic weakness’ will be defined in the final legislation.

Before the deal was reached, industry input had spurred even Aunty to generate tabloid-esque headlines like this: ‘Encryption bill could have “catastrophic” outcomes for Australian business, industry leaders warn’.

Amid this swirl of perspectives, there’s been a lot of misperception and exaggeration. The bill’s unofficial title—‘the encryption bill’—is itself a misnomer. The draft legislation explicitly rules out trying to stop end-to-end encryption. But beneath the fire and fury there is a balance we need to rediscover.

Intelligence chiefs have lined up to explain the problems caused by going dark (the inability to access suspected criminals’ communications because of the growing ubiquity of encryption on everyday devices). ASIO Director-General Duncan Lewis told the committee, ‘I anticipate ASIO would immediately use this legislation if it were available.’ And the head of the Australian Criminal Intelligence Commission, Mike Phelan, has argued that a ‘holistic package’ is the only way forward and that there’s no option of picking and choosing. These are serious people with a strong commitment to serving the national interest.

Balanced against the need to address these concerns is the need to get this bill right. It is not a simple piece of legislation. Several questions require further consideration—like the appropriate authorisations for the different types of requests that can be made, definitional clarity, implications for Australian exporters and the oversight regime. Considerable effort is needed to sharply distinguish (and communicate) the reach of this law and the protections it provides from the practices of states whose overreach we reject.

In the UK, a somewhat similar law (the Investigatory Powers Act 2016—an admittedly much broader piece of legislation) took a year to navigate its way through parliament. The Assistance and Access Bill was introduced into parliament just two and a half months ago, or a period of 16 sitting days for the House of Representatives. Hardly an inordinate period of review.

The urgency of passing these laws does need some context. Comparable countries (with the exception of the UK) don’t have similar powers, so the Australian legislation is attracting a lot of attention from multinational companies anticipating that other countries will follow Australia’s lead. There are also considerable wait times before the bill’s most far-reaching provisions can be exercised. Before a technical capability notice—which could include compelling a communications provider to develop a way for authorities to access a person’s data—can be issued, the government must first engage in a minimum 28-day consultation period (section 317W), although this can be waived in limited circumstances. The notice can then be appealed, and there’s the time required to actually build the capability. Elements of the bill like this are not quick fixes to the ‘going dark’ problem.

Australia has a strong history of bipartisanship on national security issues and is much stronger for it. It is good that this tradition has been able to hold, if only just, during this debate and a pathway has been found for ongoing review. Given the many issues that still require addressing, further scrutiny will be a good thing.

Turning our technology against us

Every day we carry our lives on digital devices tucked in our pockets. But public trust in those devices has reached an all-time low thanks to scandals ranging from election interference by Russian hackers to the weaponisation of social media by governments and extremists. Last month, the Australian government proposed legislation that could make things worse.

Imagine if a law enforcement official could secretly force Apple to hack into your phone to access your encrypted data. Or compel Google to trick you into installing spyware on your phone by sending you a fake software update. Or require Facebook to covertly rewrite Messenger or WhatsApp so authorities can access your encrypted conversations.

The Australian government’s draft Telecommunications and Other Legislation Amendment (Assistance and Access) Bill opens the door to just that, and more. The bill, which was introduced into parliament by Home Affairs Minister Peter Dutton on 20 September (just 10 days after submissions closed), would allow Australian law enforcement and security agencies to order technology companies and even individuals to do vaguely described ‘acts or things’ to facilitate access to your encrypted data and devices through newly created ‘technical assistance’ and ‘technical capability’ notices. Although officials would still need a warrant to obtain private communications and data, the bill requires no prior judicial authorisation before the attorney-general could compel your phone maker or app provider to undermine their security features.

The bill states that Australian courts will retain their powers of judicial review to ensure officials are acting lawfully. However, the proposal doesn’t provide sufficient transparency, oversight or accountability mechanisms to ensure its broad powers aren’t abused. Agencies would impose notices in secret, and the bill makes it an offence for companies to tell the targeted person about it. While secrecy may often be necessary in an investigation, the bill doesn’t allow disclosure even when it would no longer pose a threat to security or jeopardise an investigation. It is also difficult to envision how an individual could seek judicial review if they never find out that their device was deliberately compromised.

In all, the proposed law leaves too much discretion to officials to decide whether an order is justified as necessary and proportionate, and doesn’t impose sufficient safeguards to prevent abuse.

The proposal does forbid the creation of ‘systemic’ weaknesses or vulnerabilities in technology. But the broadly drawn bill doesn’t define ‘systemic’ or other key terms, leaving agencies too much room to determine their contours. The result is that many of the actions companies might be forced to take could introduce vulnerabilities that cause widespread harm to cybersecurity and human rights, despite the bill’s intent.

Agencies could, for example, require a company to use its software update system to trick users into installing government code or spyware, a move that would undermine trust in routine software update channels. If users fear that updates may be compromised, they may be more reluctant to install them. Phones and other devices would then be less secure over time because they wouldn’t have necessary software fixes, which would undermine cybersecurity for users beyond the targets of an investigation.
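What makes the update channel so sensitive is that devices trust any code bearing the vendor's valid signature. The Python sketch below, using the cryptography library, is a simplified stand-in for real vendor signing infrastructure, not any actual update system: an update the vendor is compelled to sign would pass the check exactly like a legitimate one, which is why it's confidence in the channel itself that erodes.

```python
# Miniature model of signed software updates (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()  # held by the vendor
vendor_pub = vendor_key.public_key()       # baked into every device

update = b"app v2.1"
signature = vendor_key.sign(update)        # vendor signs each release

def install(blob, sig):
    try:
        vendor_pub.verify(sig, blob)       # raises InvalidSignature on mismatch
    except InvalidSignature:
        return "rejected"
    return "installed"

print(install(update, signature))            # installed
print(install(b"tampered blob", signature))  # rejected
# A compelled update signed with the genuine vendor key is 'installed'
# just the same -- the check verifies origin, not intent.
```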

Because of the ambiguities in the bill, some of the capabilities it may compel could be interpreted by security experts (including those working for service providers) as creating security ‘backdoors’ or as preventing the use of strong, end-to-end encryption.

Australia’s proposal emulates the approach in the UK’s Investigatory Powers Act. It also follows a joint statement from the Five Eyes countries—the consortium of Australia, the US, the UK, Canada and New Zealand for joint cooperation in signals intelligence—demanding greater ‘voluntary’ cooperation from technology companies to access encrypted data or else face new laws or other ‘technological’ measures. In the US, the government is already trying to compel Facebook to circumvent security features in the Messenger app, much like it tried to do to Apple in 2016.

If adopted, the Australian bill would pose considerable threats to cybersecurity and human rights. And its effects wouldn’t be limited to Australia. Once Apple, Facebook or Google has to disclose the source code behind its products or trick you into installing spyware disguised as a software update for Australia, other governments will demand the same. And once a company rewrites code to access information held on your device, it could be forced to use that compromised code again and again, by Australian or other authorities. Such an outcome creates additional risks that the compromised code could be breached, stolen and disseminated, affecting users around the world.

On 10 September, Human Rights Watch submitted comments to the Department of Home Affairs urging the withdrawal of the draft bill and the crafting of an approach that meets the needs of law enforcement while also protecting cybersecurity and human rights. For example, any legislation creating new surveillance capabilities should require agencies to use the least intrusive measure to access private communications to ensure that any limit on privacy and security is proportionate. It should specifically affirm that it doesn’t prevent companies from employing end-to-end encryption. And it should require prior authorisation from a judicial authority that is independent of the agency seeking to compel action by a company, while also creating meaningful avenues to challenge overreaching orders.

Given the extraordinarily intrusive nature of the actions agencies could compel, any proposed law requires far more robust oversight and accountability mechanisms than the bill currently provides to check executive power and ensure people’s rights are preserved.

The technology companies we rely on to keep our data safe already face an escalating arms race to protect us from cybercriminals and other security threats. Encryption is a key part of their arsenal, and so is their ability to fix security problems through regular software updates. Ordinary users should be able to trust that their technology hasn’t been deliberately compromised by their own government. Australia, the US and the other Five Eyes governments should be promoting strong cybersecurity, not turning our own devices against us.

Encryption bill faces uphill battle

After a few false starts, the government has released its promised legislation to address the ‘going dark’ problem caused by encryption—something that affects more than 90% of the data lawfully intercepted by the Australian Federal Police.

Despite much speculation that it might attempt to destroy end-to-end encryption, the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill 2018 goes out of its way to make clear that it will do no such thing. The draft bill explicitly states that no measure can be taken that requires a designated communications provider to build ‘a systemic weakness, or a systemic vulnerability’ (section 317ZG). The key word here is ‘systemic’: instead of aiming to create systemic vulnerabilities, the bill seeks to facilitate tailored access and the creation of ‘alternative-collection capabilities’.

Reading the draft bill, you get the impression that it has benefited from a long consultation process with industry. But, as discussions that ASPI’s International Cyber Policy Centre has held with the major tech firms, the government, and privacy and encryption experts have revealed, there are a lot of varying views on this issue and the bill will still meet with resistance.

The bill is long and detailed, but here are a few of the key changes it ushers in.

Section 317C brings into the Telecommunications Act’s remit a broad array of companies and individuals under the banner of ‘designated communications providers’. This includes ‘the full range of participants in the global communications supply chain, from carriers to over-the-top messaging service providers’. The category also includes providers of an ‘electronic service’, which is broadly defined to capture ‘a range of existing and future technologies, including hardware and software’ (section 317D).

Part 15 creates three tools for requesting and compelling assistance from designated communications providers. One is voluntary (a technical assistance request) and two are compulsory (a technical assistance notice and a technical capability notice). Both compulsory requests must meet a test of being reasonable, proportionate, practicable and technically feasible.

A technical assistance notice compels a provider to cooperate if it has the capability to do so—for example, to decrypt messages if it already has that capability.

A technical capability notice, by contrast, compels a provider that doesn’t yet have the capability to assist to develop one. As the accompanying explanatory document notes: ‘The things specified in technical capability notices may require significant investment.’ This is likely to be the most controversial provision of the bill for many of the big tech firms.

The terrain that the three types of notices can cover is broad (section 317E). One provision that’s likely to be especially controversial is the potential for companies to be asked or compelled to hand over source code (section 317E(1)(b)), subject to a test of whether it’s reasonable and proportionate.

The legislation anticipates that companies, in most cases, will cooperate; however, penalties have been added as an inducement. Companies that don’t comply face fines of up to $10 million, and individuals can be fined up to $50,000 for each case of non-compliance. It also increases the penalties in the Crimes Act for those who refuse a lawful request to provide access to a device (for example, their password or fingerprint). The penalty increases from a maximum of six months’ or two years’ imprisonment to a maximum of five years’ or 10 years’ imprisonment, depending on the seriousness of the crime being investigated.

Many would regard the government’s starting premise as reasonable—that, provided a compelling public need exists (as demonstrated by a warrant), governments should be able to compel access to otherwise private information. In this new technological age, a broad range of organisations should help provide that access in the same way traditional telcos have done for a long time. The trick, of course, is in the execution.

The tech companies have been rightly concerned about any attempt to create systemic vulnerabilities or remove encryption. In the wake of the Snowden affair, when brand reputation depends on keeping an arm’s-length relationship with government, many of the tech companies will be loath to appear too close to any government and concerned about any precedents that might be set in a broader international context. Handing over source code, for example, might be one area where some companies draw a line, concerned about the implications in other, more authoritarian jurisdictions, where that information could be used to cause harm or to steal intellectual property.

Some companies might also try to game the law. As drafted, the safeguards that require requests to be reasonable, proportionate, practicable and technically feasible could encourage some companies to secure data in a way that makes it impracticable for them to assist, even if they’re compelled to do so. Companies like Apple already encrypt communications in such a way that they claim they themselves can’t decrypt. Over time, areas where opportunities for assistance exist could be gradually closed off in a similar manner.

This bill is a long way from the one outlined in early reporting last year that claimed encryption would be broken. While it contains provisions that will no doubt receive pushback over the coming weeks, it’s a more nuanced response than reports suggested.

The trouble with Telegram (part 2)

Last Friday, the Turnbull government announced that it’s planning to introduce new laws that will compel international technology companies to cooperate with intelligence agencies. But the legislation will be no panacea. Although it demonstrates a political will to address shortcomings in current laws dealing with encrypted technology, one country’s legislative fix will be insufficient to address the challenge posed by Telegram and other encrypted service providers.

The fact that Telegram is incorporated overseas and has physical assets (such as data centres and servers) in multiple jurisdictions makes it nearly impossible to enforce a single nation’s laws. We saw this with the Cambridge Analytica scandal earlier in the year, when Facebook’s Mark Zuckerberg ignored requests from the British parliament to testify. Britain had the same problem when trying to compel Rupert Murdoch to give evidence to a parliamentary committee in 2011. And, while the CLOUD Act theoretically gives US law enforcement the power to acquire data from foreign tech companies, many subclauses limit that power, including conflict with the laws of foreign jurisdictions.

Moreover, the need for a warrant—which will have to specify exactly which server is being subpoenaed—will also hinder those seeking data from Telegram. According to Mounir Mahjoubi, former president of the French National Digital Council, Telegram has ‘done everything to make it a technological nightmare to find where their server is’. And after attacks on churchgoers in Normandy in 2016 were linked to Telegram, the French interior minister alleged that investigators, armed with court orders, had been unable to even find a Telegram ‘interlocutor’ to contact.

Users have many tools with which to circumvent country-specific regulation. Virtual private networks (VPNs) spoof user locations, as do internet browsers such as Tor. They aren’t hard to use and are effective—Facebook is still used regularly in China despite being blocked by the ‘Great Firewall’.

In Russia, Telegram rerouted its app through overseas cloud-based IP addresses. That caused the Kremlin to block more than 15 million IP addresses as Telegram hopped from one Google- or Amazon-based IP address to another, affecting a host of unrelated businesses and people.

One proposed solution is to put pressure on third parties that host Telegram on their app stores or provide cloud services. Amazon eventually put a stop to Telegram’s IP address-hopping, and Apple blocked Telegram updates worldwide until recently. Indeed, going directly to Apple to remove apps from a particular country’s store is proving an increasingly effective censorship technique.

Lawmakers sometimes lack technical expertise, which can make it hard for them to understand why quick legislative fixes won’t work. Some politicians voiced support for the idea that encrypted messaging services (EMSs) should weaken their encryption to aid law enforcement after the FBI versus Apple encryption dispute, when the FBI needed to access the San Bernardino terrorist’s iPhone. EMS operators, like Apple, have rightly pointed out that forcing them to weaken their encryption makes consumers vulnerable to nefarious hacking.

A dearth of technical expertise among US legislators was obvious earlier this year when Zuckerberg testified before Congress (by choice). The lack of even basic knowledge about how Facebook works meant that lawmakers’ questioning couldn’t reveal what went wrong.

Australian legislators have the benefit of expert advice from agencies such as the Australian Signals Directorate. However, if they can’t understand how Telegram’s technology works, they’re going to struggle to explain the proposed new law (as we saw with the 2014 metadata legislation), or to understand its limits.

Legislative processes don’t operate in isolation. Users who want encryption can be innovative, so it’s a safe bet that by the time we think of one fix they’ll already be exploiting another loophole. Telegram is the preferred platform of Islamic State’s online supporters, partly because they have found creative ways to quickly upload thousands of megabytes of data to maintain a consistent flow of propaganda to their support base, well ahead of moderators. Plus, Telegram doesn’t moderate private channels and groups.

And stopping a determined user from taking advantage of tools such as VPNs is a lot harder than stopping a company. Islamic State puts out multilingual manuals on VPNs, dark-web browsing and other techniques to evade law enforcement. And it’s possible to set up an App Store account under a fake identity from a different country to acquire an app that’s been removed.

Telegram as a company may be more about tapping into the pro-privacy market than about upholding an unwavering, high-minded commitment to the principle of privacy. Certainly, there’s evidence that Telegram was initially designed with secrecy from the Russian state in mind, rather than flawless encryption. But if Telegram admits to handing over user data or seems like it’s bowing to government pressure, then another, more recalcitrant, EMS will take its place, and any legislative fix will probably be redundant.

This means that we need a solution that covers the overall problem of uncooperative multinational EMSs. And that means dealing with both the technical and legislative aspects. The European Union has put forward some good ideas, including establishing networks of encryption centres and harmonising rules about cross-border access to and storage of data.

Australia can lead in trying to expand European efforts to capture more servers and companies in the legal framework. And we can invest in similar efforts to improve our ability to unlock the sorts of encryption used by Telegram and understand the data its users generate.

It’s inefficient for multiple countries to try to simultaneously decrypt the same Telegram algorithm, and multiple legal regimes for data access make it easy for the company to pick and choose where to hide its servers.

Telegram is an international problem, and it requires an international solution.

Encryption: the perils of ‘going dark’

In June, Andrew Davies produced a pair of Strategist pieces (see here and here) on the encryption challenge to security, in the process succinctly explaining why our telecommunications intercept (TI) capability is ‘going dark’. Andrew’s second contribution paints a rather bleak picture of the future of this collection capability: ‘[T]he access to data through lawful intercept that our security agencies once enjoyed will never be possible again.’ The loss of TI effectiveness will hit the Australian law enforcement community particularly hard: it’s the fundamental building block for complex investigations. Before considering what ought to be done, it’s worth examining how an over-reliance on TI has shaped contemporary police thinking.

A number of dated assumptions underpin the Telecommunications (Interception and Access) Act 1979. While Andrew’s first instalment provided some insight into the prevailing technological conditions of the day, I’d argue that authorities at the time had also assumed the following:

  • The Australian government had a technological edge over the private sector and could adapt rapidly (at least by the standards of the day) to any foreseeable change in the operating environment. However, that’s no longer the case.
  • Many Australians trusted their government to self-regulate its use of intrusive powers.
  • The government would maintain its monopoly control over the telecommunications industry. Deregulation and privatisation have, for better or worse, dramatically changed that arrangement.
  • Law enforcement’s physical access to telephone communications was a relatively simple affair—a point made particularly clear by Andrew.

While successive governments have tinkered with the Telecommunications (Interception and Access) Act, they have at various stages failed to engage with the seismic paradigm shifts that have occurred.

Since 1979, we all got phones in our houses, and later in our pockets, while law enforcement got an effective and efficient intelligence collection capability. That capability offered police officers direct access to the most private of conversations between alleged offenders.

From 1979 until the 11 September 2001 terror attacks, Australia’s law enforcement agencies experienced significant pressures from governments to reduce expenditure. With cost-effective access to TI material a given, successive police commissioners found savings in the degradation of undercover and human intelligence capabilities. A series of high-profile inquiries into Australian police services in the 1980s and 1990s hastened that degradation: allowing police officers to engage with criminals became an unacceptable risk.

In an intelligence collection management sense, TI became so effective that many of the most basic principles of intelligence collection management were discarded in the law enforcement domain. Long-held principles, such as redundancy in collection efforts, in which more than one collection asset is tasked to ensure that information can be confirmed, were abandoned.

Until recently, no Australian law enforcement agency seemed to believe that going dark was possible. However, Andrew’s recent work reveals that we’re unlikely to be able to legislate our way out of the problem.

I struggled to find a contemporary and comparative going-dark case study until I reflected on the intelligence collection challenge of the ADF’s 1993 deployment to Somalia. Ready for contemporary operations, the ADF deployed tactical signals intelligence capabilities to Somalia to support the operations of a battalion group. With precious few transmissions to intercept, the battalion’s intelligence collection went dark. In response, it rapidly established human source capabilities in the field. Over the years since that deployment, the ADF has developed a robust human intelligence collection capability that augments technical collection capabilities and provides protection against going dark.

With that vignette in mind, perhaps an alternative approach to going dark might be found in intelligence and investigations tradecraft.

Collection management is an often overlooked, yet critical, component of any successful intelligence endeavour. For law enforcement agencies, greater emphasis needs to be placed on adequately developing intelligence professionals who can approach the problem of collecting intelligence and evidence in an imaginative manner, using a combination of intelligence disciplines. There can be no doubt that in the emerging law enforcement operating environment the collection of evidence and information will be riskier and more difficult. But there appear to be few other options.

While collection management will provide a roadmap for where to go next, the loss of TIs will necessitate greater investment in alternative collection disciplines, which will be costly. At the very least, physical surveillance and undercover and human source capabilities will become increasingly important in our future TI-dark world.

While we may not be able to decrypt communications, we shouldn’t turn our back on technical intelligence or the exploitation of the electromagnetic spectrum. Through the study of communications using tried and tested techniques such as traffic analysis, intelligence value can still be drawn from identifying communication patterns.
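As a toy example of what traffic analysis can yield from metadata alone, the Python snippet below counts who talks to whom and when; the records are invented, and real analysis runs over vastly larger intercept sets.

```python
# Toy traffic analysis: infer relationships from metadata while the
# content itself remains encrypted and unreadable.
from collections import Counter

# (hour, source, destination) records -- illustrative values only.
intercepts = [
    (1, "node_a", "node_b"),
    (2, "node_a", "node_b"),
    (2, "node_c", "node_b"),
    (3, "node_a", "node_b"),
]

pair_counts = Counter((src, dst) for _, src, dst in intercepts)
link, volume = pair_counts.most_common(1)[0]
print(f"Most active link: {link} with {volume} messages")

# A spike on one link, or a new node suddenly joining the pattern, can
# indicate a relationship or impending activity without any decryption.
```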

TIs have provided police and intelligence agencies with a real advantage for almost 40 years. For my money, the secret now is not to lament the loss of that valuable capability but to innovate to the conditions of the day. While Andrew quoted Dylan, I’m going to refer to the sage words of ’80s popstar Kim Wilde to describe the opportunity this presents: ‘And we were dancing. Dancing in the dark … Something’s gonna start’.

Apple versus the State: finding the sweet spot in the encryption debate

Last week I gave a conference presentation examining how the public and private sectors work together, and at times clash, on issues of national security. Using Rod Lyon’s excellent schematic, I examined how the private sector fits into the debate around how to achieve both security and liberty. To do this, I examined the issue through the lens of the current clash between the US government and Apple over encryption that’s playing out through the media. The case illustrates that while both the public and private sectors are focused on the same goal of security for the ‘citizen’ or ‘customer’, different interpretations of what is meant by security can create friction. This piece will be the first in a three-part series examining the case itself, some of the political messaging that’s emerging as a consequence, and finally what the consequences of the case will be for national security.

Good cyber security is essential for the growing digital economy. Everyone stands to benefit from less cybercrime, espionage or online disruption. If online services are delivered seamlessly in a secure and trusted environment, confidence in online marketplaces and products can be improved. A safe and supportive operating space is also conducive to creating new businesses and sowing the seeds of innovation, and it enables us to communicate with confidence that unwanted prying eyes won’t be able to read our communications. Yet for the government and the private sector, it can be tough to coordinate and prioritise such outcomes and turn them into mutually beneficial practice. That was made more problematic by the negative impact that Snowden’s revelations had on the dialogue between big US tech companies and the US government on matters of national security.

America’s current war of words and legal proceedings over encryption reflects the difficulty of achieving what Rod calls the sweet spot between security and liberty. Acting on their interpretation of that sweet spot, Apple and a range of other high-profile companies and software developers have increased the availability of encryption. While increased cyber security is a goal for both governments and private companies such as Apple, their motivations differ vastly.

For Apple, the increased encryption in its software was aimed at protecting its customers from both cybercrime and unwanted attention from governments. It also served as an important selling point to a customer base that increasingly demanded privacy in the wake of the 2013 Snowden revelations, and helped the company regain ground in a market that was suspicious of US firms perceived to be cooperating with the US government. Apple declared its independence in particularly dramatic fashion in 2014, when it built encryption into the iOS 8 operating system that tied the device’s encryption keys to the user’s passcode. This meant that Apple couldn’t hand the government an encryption ‘key’ because it simply didn’t have one.

The US government contends that more encryption plays into the hands of terrorists and criminals by offering them a degree of anonymity and introducing challenges for evidence collection. It sees that move as compromising broader national security. Jim Lewis makes an important case that the US government isn’t looking to create ‘backdoors’ in software, but rather that law enforcement agencies want the unencrypted plaintext when access has been authorised by law.

The issue is now coming to a head after Apple refused a court order to assist the FBI with unlocking an iPhone that belonged to one of the San Bernardino gunmen. Apple CEO Tim Cook has argued that such a move could set a ‘dangerous’ precedent and diminish the privacy that their customers have come to expect from Apple products.

Critics, of which there are many, argue that Apple’s position does a disservice to the victims of the shooting and potentially makes citizens less safe by limiting the amount of relevant data that law enforcement authorities can access in difficult national security cases like this. While many in the tech community have stood behind Apple, the general public remains fiercely divided on the issue, with recent surveys illustrating the schism in public opinion.

The thing about encryption is that it has so many valid uses, functioning as the essential enabler of an increasingly digital world. It allows people to establish their identity reliably, and keeps transactions out of criminals’ reach. Even the FBI’s Director James Comey recognises this, stating, ‘I love encryption, I love privacy… when I hear corporations saying we will take you to a world where no one can look at your stuff, part of me thinks that’s great. I don’t want anyone looking at my stuff’.

The outcome of the current Apple case will have far-reaching impacts on security and privacy, not just in America but in markets across the world—including here in Australia. It’s imperative, regardless of the outcome, that we maintain the conversation between the private and public sectors. This is a time when more dialogue and trust-building are needed, not less. Despite their differing motivations, we should remember that both the public and private sectors are working towards the same goal: a safe and secure society and online environment. However, finding that mutually agreeable sweet spot could be made difficult by the outcomes of this case.