Tuesday, 14 August 2018
Do you Trust Australia?
This morning, the Australian Department of Home Affairs released the Assistance and Access Bill 2018 for consultation.
This is the long-hinted-at effort to address the increasing prevalence of encryption without requiring “backdoors,” and is likely the starting point for similar efforts in other Five Eyes countries, if reports are to be believed. It’s even possible that once it gains traction in a few of those places, it will serve as the basis for new norms around how modern societies address issues in this area.
As such, it’s important to understand what’s being proposed and evaluate it for long-term effects (intentional and otherwise).
In response to some press enquiries and to prepare for next week’s Internet Society panel in Canberra, I spent most of the morning reading the explanatory document, mostly focusing on Schedule 1, the framework for industry assistance.
These are my notes so far.
Overview of the Proposed Legislation
In a nutshell, there are three different instruments that an “interception agency” can use to ask a “designated communications provider” of an “electronic service” for “listed acts or things”:
- a Technical Assistance Request that’s voluntary;
- a Technical Assistance Notice that’s compelled, but has to be for something that’s already an existing capability; or
- a Technical Capability Notice, where they’re compelled to develop a new capability.
The last of the three instruments has additional constraints, such as having to be approved by the Australian Attorney General, a minimum 28-day consultation period, and tighter conformance to the listed acts or things.
Interception agencies include the Australian Federal Police, a number of federal and state-level anti-corruption and crime commissions, and police forces at the state level.
Designated communications providers are defined in section 317C as a pretty long laundry list. Basically, if you design, create, maintain, support or supply hardware, software or services that are connected to the Internet and have even one user in Australia, you’re a designated communications provider – even if you’re not in Australia.
Electronic service is defined in section 317D, and is purposefully broad:
Examples of electronic services may include websites and chat fora, secure messaging applications, hosting services including cloud and web hosting, peer-to-peer sharing platforms and email distribution lists, and others.
Finally, Listed Acts or Things according to section 317E enumerates all of the ways they can ask for help, including for example:
- “Removing one or more forms of electronic protection that are or were applied by, or on behalf of, the provider” – but see below
- Providing “source code, network or service design plans”
- Providing the “configuration settings of network equipment and encryption schemes”
- Providing “demonstrations of technologies”
- “Installing, maintaining, testing or using software or equipment”
- “Assisting with the testing, modification, development or maintenance of a technology”
- “Modifying… any of the characteristics of a service”
- “Substituting… a service”
Importantly, section 317ZG explicitly states that “designated communications providers must not be required to implement or build systemic weakness or systemic vulnerability.” In other words, these instruments can’t be used to circumvent end-to-end encryption, introduce flaws, or prevent someone from addressing security bugs.
However, it’s made clear that a notice can require a provider to “facilitate access to information prior to or after an encryption method is employed, as this does not weaken the encryption itself.”
If you’re served with one of these instruments, you can’t talk about it (penalty: 5 years), but you can issue a transparency report (numbers only) every six months. Refusing to honour the latter two instruments comes with an AU$50,000 civil fine (if you’re a person); AU$10,000,000 if you’re not (317ZB). However, you’re immune from civil actions brought against you as a result of honouring them (317ZJ).
Previous discussions and proposals around encryption have been marked by vagueness, leading to confusion – and a lot of fear – about what was being proposed. So, it’s very helpful to have proposed legislation finally show up, and this is a considered, specific proposal, not something that can be dismissed out of hand.
I’m also happy to see what appears to be an honest attempt to avoid backdoors in encryption, or otherwise weakening it.
Overall, the drafters of this legislation got a number of things right. That said, a number of issues come to mind on a first reading.
The sheer number of people who could receive one of these instruments should give everyone pause. Everyone who’s ever written an app or hosted a Web site – worldwide, since one Australian user is the trigger – is a potential recipient, whether they’re a multimillion-dollar company or a hobbyist.
Cloud service providers, hardware vendors, software developers, IT integrators, and many others will also have to assess their risk and determine how they’ll handle a request.
Even folks who work on infrastructure and standards are apparently not exempt:
This category would include, for example, persons involved in designing trust infrastructure used in encrypted communications or software utilised in secure messaging applications.
– Section 317C, Item 6
Read literally (as laws often are), this in combination with the “listed acts or things” leads to some frightening what-if scenarios:
- The Commissioner of Police for Western Australia could require Daniel Stenberg to include extra code in Curl (according to 317E(1)(c)).
- Same for Web browsers.
- Any state police commissioner could require Google or Amazon to assist them to develop a specified technology (317E(1)(f)). That’s one way to get that new Web site up, I guess.
- The Australian Federal Police could compel me to give them a briefing about HTTP or QUIC (according to 317E(1)(b)). I’d be happy to do that, by the way – but not under threat of an AU$50,000 fine.
- Any state police commissioner could require Cisco to swap out a router with one that’s compromised before shipping it to a customer (317E(1)(i)).
- The AFP could require Moxie to produce the source code to Signal to find bugs (317E(1)(b)).
- Likewise, a state police commissioner could get the source for Wickr, if they had it out for Malcolm Turnbull.
- The AFP could require Microsoft, Apple and Cisco to produce all known security bugs for their products.
- The Attorney General could require every cloud provider, every CDN and every Operating System vendor to develop the ability to log on demand to a central server.
Would any of these things happen? Hopefully not in a still semi-sane world, but that gets us to the next point – what’s stopping them?
Oversight and Appeal
Australia has an excellent reputation for the checks and balances over its traditional Law Enforcement Access regimes, so it’s concerning to see the decision-making criteria for the proposed instruments so vaguely specified. For example, 317P sets them out for technical assistance notices:
Before giving a technical assistance notice, the Director-General of Security, or the chief officer of an interception agency, must be satisfied that the requirements imposed by the notice are reasonable and proportionate, and compliance with the notice is practicable and technically feasible.
Satisfaction for the purposes of this section is a *subjective state of mind* of the administrative decision-maker. It is a precondition to the exercise of the power. To meet the requisite state of satisfaction the decision-maker must consider the reasonableness and proportionality of the requirements imposed by the notice and the practicability and technical feasibility of compliance with that notice.
Emphasis mine. The other instruments have similar language, although technical capability notices also require the Minister to weigh in, and a consultation period with the affected provider(s).
Elsewhere, it’s pointed out that interception agencies are still required to obtain a warrant through normal judicial channels to actually use the data obtained, seemingly with the implication that this provides sufficient oversight.
That might be true from the standpoint of the target of an investigation, but it leaves an opportunity for the capabilities exposed by these instruments to be catalogued and left “open” for later use, over time expanding the reach of law enforcement without effective oversight.
Indeed, the description of technical assistance notices goes on:
Section 317L allows the Director-General of Security, or the chief officer of an interception agency, to give a provider a technical assistance notice requiring the provider to do things for the purpose of helping the relevant agency perform functions or powers conferred by or under a law of the Commonwealth, a State or a Territory, so far as the function or power relates to
- enforcing the criminal law and laws imposing pecuniary penalties; or
- assisting the enforcement of the criminal laws in force in a foreign country; or
- protecting the public revenue; or
- safeguarding national security.
The specified acts or things may also go to matters that facilitate, or are ancillary or incidental to, the agency’s performance of a function or exercise of power where the function or power relates to these purposes.
That seems like a remarkably broad power. And remember that while the constraints in section 317ZG preclude “systemic weakening” of security, the lack of effective oversight means that only the gaze of the Minister (or police commissioner, etc.) matters.
Furthermore, the affected provider doesn’t appear to have any effective mechanism to appeal the decision. The limitations and safeguards document assures us that:
Affected people and companies have an avenue to challenge a decision to issue a notice. Judicial review by the courts is available under the Commonwealth Constitution and the Judiciary Act 1903.
If this becomes law, the criteria above will be used to make the decision, and since satisfaction is a “subjective state of mind,” I suspect any appeal won’t go well.
The cherry on top here is that the press can’t play its usual function in informing the public about such events either, since this will all presumably be secret (317ZF).
Transparency reports by providers and the Minister (as allowed and required by the draft legislation, respectively) are a small step, but they provide very little information. Notably, the other schedules (regarding expanded warrant access) provide oversight by an Ombudsman, but one doesn’t get a mention here.
One Size Does Not Fit All
Some of the stranger scenarios above might be prevented by segmenting the different types of designated communications providers and the listed acts or things that apply to them.
For example, as drafted, a developer of any software connected to the Internet can be required to modify their product, thanks to 317E(1)(c). “Installing software” makes sense for a network provider, but not for many of the other listed parties.
This one-size-fits-all sledgehammer approach was perhaps taken because its drafters are attempting to “future-proof” the legislation; it creates extremely broad categories of empowered parties (interception agencies), covered actors (designated communications providers) and behaviours (listed acts or things), without any differentiation.
However, the Internet is a much more complex ecosystem than the draft admits. While agencies might never attempt to use these powers in such unintuitive ways, the possibility that they might introduces doubt, and that doubt affects how people behave.
Wrapping Up (for now)
Back in June, the Minister spoke about the motivation for this legislation:
Perhaps the most dangerous threat is less direct - a collapse or a loss of trust.
Trust in our institutions, in key organisations, in our values and culture, and trust in all arms of government and politicians.
Information, trade and financial flows on the extraordinary scale of the modern world demand regular interactions between people and organisations who don’t know each other, or certainly don’t know each other well.
That requires an unprecedented level of trust.
A loss of trust, whether orchestrated or otherwise, has the potential to undermine the foundations of our modern world.
American Political Scientist Francis Fukuyama, who was one of the first to write about social capital based on trust, put it well when he said:
‘Widespread distrust in a society imposes a kind of tax on all forms of economic activity, a tax that high-trust societies do not have to pay’
It’s great to see a politician acknowledge that trust is such a core issue in these discussions. What’s easy to forget is how fragile the trust we have in the Internet is; I think many technical people are vocal about issues like encryption because they realise how quickly and permanently it can all tumble down.
So the drafters haven’t required access to end-to-end encrypted communications; if that sticks, it doesn’t undermine trust in that component of the Internet, and that’s great. However, much of the trust we have is rooted in the hardware and software we obtain from others, and the services they run for us. The endpoints matter too.
If we get it wrong, a lot is at risk. People will act and react based upon their adjusted perceptions of trust:
- Development of critical software and hardware might move away from Australia, because people overseas will be uncertain about its contents.
- Overseas providers might not offer their products and services in Australia, or might block access here to contain their risk.
- Standards bodies might not want to meet in Australia when participants could be compelled to interact with security agencies.
- People who operate critical Internet infrastructure might avoid coming here.
- Terrorist and criminal targets might adapt to use software and hardware that isn’t subject to these instruments.
- Other countries might adopt similar measures and use them to control their citizens even more tightly.
And while I believe that the Minister, and the AFP, and the state police commissioners are trying to do the right thing – just as people in Google, Facebook and other parts of the Internet are – the fact remains that individual people are fallible, and sometimes there are failures in institutions as well. This is why we have checks and balances in democracy.
It may surprise some, but I actually support what the drafters of this legislation appear to be trying to do. Australia values community and safety as well as individual rights, and it’s important to get the balance right.
I’m less supportive of the current iteration of the legislation, because of the issues outlined above, but that’s the purpose of draft legislation – holding a discussion about the laws we agree to.
That’s all I see today. Other issues that didn’t occur to me might pop up too (after all, it is day one).