Here’s how the attack happened. Dominic Tarr, the maintainer of event-stream and a number of other open source packages, handed off ownership to a volunteer.
This volunteer added a malicious package as a dependency to event-stream. The malicious package was narrowly (and expertly) targeted to only execute in the context of the Bitcoin wallet Copay.
Here’s what’s shocking: any malicious person in control of a widely used package could do something like this again, easily.
A number of different solutions have been suggested: that we need to change our open source culture so that developers are paid to maintain packages, and that we need better package management, with lockfiles, dependency pinning, and deterministic builds.
These are all good points, but they miss something huge.
An imported package (even one controlled by a malicious agent) shouldn’t have been able to take your private keys in the first place.
Believe it or not, this ideal is actually achievable, and the mechanism for achieving it is POLA — the principle of least authority.
The mistake is in asking “How can we prevent attacks?” when we should be asking “How can we limit the damage that can be done when an attack succeeds?”. The former assumes infallibility; the latter recognizes that building systems is a human process.
— Alan Karp, “POLA Today Keeps the Virus at Bay”, HP Labs
The Principle of Least Authority (POLA) says that code should be granted only the authority it needs to perform its task and no more. Code has a lot of power. Code can read your files, delete your files, send your files (and all of the information within them) to someone else, record your keystrokes, use your laptop camera, steal your identity, hold your computer for ransom, steal your cryptocurrency, drain your bank account, and more. But most of the code that we write doesn’t need to do any of those things — so why do we give it the authority to do so?
Let’s take a look at how authority is handled in Node.js modules. Say I want to import a function that appends an exclamation point to the end of a string. It might look like this:
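A minimal sketch of such a module (the file name addExclamation.js is illustrative):

```javascript
// addExclamation.js — a tiny, harmless-looking utility module.

// Appends an exclamation point to the given string.
function addExclamation(str) {
  return str + '!';
}

module.exports = addExclamation;
```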
And then we can import it and use it like this:
This seems fine. And it is! But addExclamation could just as easily have read our Bitcoin wallet and sent our private keys off, if it had instead been written like this:
Well, that seems terrible. And worse, without inspecting the code, you would have no idea that this is happening.
The proposed solutions for this — such as only using packages that you personally trust, or maintaining curated lists of audited packages — miss the actual problem: this code had easy access to both the file system and the network, and it needed neither.
This kind of easy access is known as ambient authority. I didn’t need to explicitly hand over access to the file system — it was just lying around in scope for any piece of code to pick up. By contrast, in a system that follows the principle of least authority, access is denied by default, and it must be granted explicitly before it can be used.
The fact that my module didn’t need access to the file system or the network but still had access to them is known as excess authority.
POLA is ultimately about eliminating both ambient and excess authority. It’s not a motto that is meant to be inspirational; POLA can actually be achieved. But how?
In a code review, would this jump out as problematic? Maybe not. As David points out:
“it is very difficult to spot shenanigans in obfuscated code, you’ve got no chance.”
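As a tiny illustration of why (this is NOT the real event-stream payload, which was far more thoroughly hidden), even trivial obfuscation defeats a naive search of the source:

```javascript
// The string "fetch" built from character codes — a grep for "fetch"
// through this source finds nothing.
const hidden = String.fromCharCode(0x66, 0x65, 0x74, 0x63, 0x68);
console.log(hidden); // "fetch"
// In a browser, globalThis[hidden] would now be the real fetch function.
```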
But if I create a Realm in my browser and try to execute this code inside it (test it out for yourself here), it doesn’t matter whether I can spot the shenanigans or not. The code simply doesn’t have access to the network: window, self, and fetch are all undefined.
But isolation is a problem as well. Code needs to be able to interact with the outside world in order to do things! The xkcd comic entitled “Sandboxing Cycle” makes fun of the apparent tradeoff between security and interaction. In order to protect ourselves, we limit interaction, but in order to accomplish our tasks, we create new connections, and then we discover the connections make us vulnerable, so we create a new sandbox. And so on, in cycles forever.
The key insight to getting out of the Sandboxing Cycle is that not all interactions are equally dangerous. In fact, we can make many interactions very safe, quickly identify the few remaining dangerous ones, and limit those.
Let’s go back to our addExclamation example and let’s imagine that Node.js worked differently — that the module that wants to use the file system or the network has to be explicitly passed access. In other words, we’re eliminating ambient authority. If the code isn’t explicitly granted access, it doesn’t have it.
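A sketch of what that deny-by-default wiring could look like (hypothetical — today’s Node.js grants fs and network access ambiently; here authority is an explicit parameter, and the caller decides how much to hand over):

```javascript
// A pure function asks for nothing, so it can touch nothing:
function makeAddExclamation() {
  return (str) => str + '!';
}

// A module that genuinely needs the file system must be handed it
// explicitly — and only the narrow slice it needs:
function makeLogger(appendToLog) {
  return (msg) => appendToLog(msg + '\n');
}

const addExclamation = makeAddExclamation();

// Grant the logger a fake, in-memory "file" instead of the real fs:
const lines = [];
const log = makeLogger((s) => lines.push(s));

log(addExclamation('hello'));
console.log(lines); // [ 'hello!\n' ]
```

Note that the caller granted an append-only capability, not the whole file system — the logger cannot read, delete, or send anything.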
When we eliminate ambient authority, it makes it much, much easier to see where our code deserves extra scrutiny. In the previous examples, any module that we imported could include very dangerous actions, such as accessing the network, without us easily realizing it. If we were to rewrite Node.js such that you had to explicitly pass access, it would be immediately obvious where you should spend most of your time and money auditing. That string function that doesn’t have access to anything external? That’s pretty safe — the worst it could do is give you a wrong answer or hang. That simple string function that is asking for network access? Whoa nelly. Something’s wrong.
The Principle of Least Authority — by eliminating ambient and excess authority — makes it such that many attacks just aren’t worth it. That means that we can selectively, safely connect out of our sandbox without creating dangerous security holes, and therefore without getting into the Sandboxing Cycle. If we’re breached, the malicious actor still can’t do much. Safe interaction is possible.
- The Node.js community has been discussing various ways to try to implement POLA. See also the recorded Node.js Security Workgroup discussion on Least Authority.
- A module system that would have granted modules least authority and would have prevented the event-stream exploit was presented by Mark Miller of Agoric at the November 2018 TC39 meeting.
Thanks to Mark S. Miller, Dominic Tarr, Bill Tulloh, and Dean Tribble for their helpful insights.