Not to be too partisan, but as a general rule, Intelligencer is firmly in the “don’t murder people in your home” camp. But if you are going to murder someone in your home and you own an Amazon Echo, we’d strongly urge you not to do it around election season. (This is called service journalism.)
Why? Consider the case of Timothy Verrill, a New Hampshire man accused of stabbing two women in his home. On Friday, a judge ordered Amazon to turn over any recordings made by Verrill’s Echo, because the state believes there may be recordings on Amazon’s servers from the time of the crime. Amazon, though, seems ready to fight to keep the information private, telling the AP it would not release the recordings “without a valid and binding legal demand properly served on us.”
The case will likely play out in the courts for a while — Amazon previously fought against releasing recordings in a similar murder trial in Arkansas on First Amendment grounds, arguing that both the queries put to an Echo and its responses were protected as “Amazon’s First Amendment–protected speech.”
Before we go any further, a quick refresher on how an Echo ostensibly should work when it comes to recording audio in your home. An Amazon Echo is always “listening” in your home, waiting to hear its wake word, “Alexa.” Once you say “Alexa,” the Echo records the next few seconds of what you say and sends that off to Amazon’s servers for voice processing. That is supposed to be the only time an Echo actually records in your home, and the only audio Amazon holds on its servers.
The problem — and why we say you should avoid committing violent felonies when elections are coming up — is that wake-word detection on the Echo can be spotty. Words that are phonetically similar to “Alexa” can sometimes trigger false positives, causing the Echo to start recording.
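To see how this kind of false positive can happen, here’s a toy sketch — emphatically not Amazon’s actual detector, which runs on acoustic models, not text. Real wake-word spotters key on a short sound pattern, which we crudely approximate here with a regex for the “l-e-k-s” cluster that “Alexa” and “election” share:

```python
import re

# Toy wake-word spotter -- a made-up approximation, not Amazon's detector.
# Acoustic spotters key on a short sound core; we fake that with a regex
# matching the "lex"/"lec" cluster shared by "Alexa" and "election".
WAKE_CORE = re.compile(r"le[ckx]", re.IGNORECASE)

def wake_word_fires(utterance: str) -> bool:
    """Return True if any word in the utterance contains the sound core."""
    return any(WAKE_CORE.search(word) for word in utterance.split())

def handle_audio(utterance: str, recordings: list) -> None:
    # The device buffers audio locally; only after a (possibly false)
    # wake does it capture the next few seconds and ship them upstream.
    if wake_word_fires(utterance):
        recordings.append(utterance)  # stand-in for the audio clip

recordings = []
handle_audio("Alexa, turn off all lights", recordings)            # intended
handle_audio("Elections coming up soon, terrifying", recordings)  # false positive
handle_audio("What's for dinner tonight", recordings)             # ignored
print(len(recordings))  # -> 2: the false positive is stored alongside the real query
```

The point of the sketch: the detector never hears the full word, only something close enough to its trigger pattern — so “elections” lands on the server right next to your legitimate requests.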
After reading about Verrill’s case, I was curious and started going through all the recordings the two Amazon Echoes we keep in our home had made. You can read a text transcript of every query you’ve made to Alexa, as well as listen to the audio. (Curious about what you’ve asked your Echo? Go to https://www.amazon.com/alexaprivacy.)
Most of it was pretty dull stuff. We have smart lights installed and a 7-month-old daughter, so there are a lot of recordings from 2:30 a.m. of my wife or me asking Alexa in a very tired voice to “turn off all lights” after we manage to get our kid back to sleep. There’s also us telling Alexa to play WNYC, queries about what the weather would be like that day, requests to set a timer — all pretty jejune.
But as I skimmed through, I noticed one odd query, which was transcribed as: “alexa i’m coming up soon terrifying where’s and kelsey.” We don’t know anyone named Kelsey, and I didn’t know what “coming up soon terrifying” would mean, so I played the actual audio recording. It turned out to be a snippet of my wife saying something to the effect of, “Elections coming up soon, terrifying words in an email.” (The audio quality is pretty bad, so this is just my best guess.)
I started combing through more audio snippets, and quickly found that the word “election” or “elections” was phonetically similar enough to “Alexa” that there were at least four other instances of the Echo sending recordings to Amazon’s servers. Three were triggered by people on TV or podcasts saying the word “elections,” and another was triggered by me saying something like “election-night plans.”
All the rest of the recordings were labeled “Text not available — audio was not intended for Alexa,” which only means that Alexa didn’t provide an audio response. The audio recording did still exist on Amazon’s servers — and if I did happen to be murdering someone (again, something we do not recommend doing!) while one of those false positives happened on the Echo, the audio would be there for the state to obtain, if it managed to compel Amazon to turn over the recording.
We reached out to Amazon for comment and will update this post if they respond. In the meantime, unless you’re a political junkie, you’re probably safe until 2020 (though, again, don’t murder people).