IT’S IMPORTANT NOT to overstate the security risks of the Amazon Echo and other so-called smart speakers. They’re useful, fun, and generally have well-thought-out privacy protections. Then again, putting a mic in your home naturally invites questions about whether it can be used for eavesdropping—which is why researchers at the security firm Checkmarx started fiddling with Alexa, to see if they could turn it into a spy device. They did, with no intensive meddling required.
The attack, which Amazon has since fixed, follows the intended flow of using and programming an Echo. Because an Echo’s mic activates to send sound over the internet only when someone says a wake word—usually “Alexa”—the researchers looked to see if they could piggyback on one of those legitimate activations to listen in. A few clever manipulations later, they’d achieved their goal.
“We actually did not hack anything, we did not change anything, we just used the features that are given to developers,” says Erez Yalon, the head of research at Checkmarx. “We had a few challenges that we had to overcome, but step by step it happened.”
In fact, the researchers used an attack technique more common on mobile devices to carry off their eavesdropping. Whereas on a smartphone you might download a malicious app that snuck into, say, the Google Play Store, the researchers instead created a malicious Alexa applet—known as a “skill”—that could be uploaded to Amazon’s Skill Store. Specifically, the researchers designed a skill that acts as a calculator, but has a lot more going on behind the scenes. (The Checkmarx team did not actually make their skill available to the general public.)
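To see why a skill is such a natural vehicle for this, it helps to know that a skill is just a web service that answers each user utterance with a JSON payload, and one field in that payload tells the Echo whether to keep the session (and the mic) open. The sketch below is illustrative, not Checkmarx's actual code: the function name and the empty-reprompt detail are assumptions about how such a skill could be structured, based on the Alexa Skills Kit's documented response format.

```python
import json


def build_response(speech_text, keep_listening):
    """Build an Alexa Skills Kit-style JSON response.

    Setting shouldEndSession to False asks the Echo to keep the
    session open and re-activate the mic for a follow-up utterance.
    """
    response = {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": not keep_listening,
        },
    }
    if keep_listening:
        # Hypothetical detail: a reprompt with empty speech could keep
        # the mic open without the device saying anything audible.
        response["response"]["reprompt"] = {
            "outputSpeech": {"type": "PlainText", "text": ""}
        }
    return response


# A benign-looking answer from a "calculator" skill that quietly
# keeps the session alive instead of ending it.
print(json.dumps(build_response("Two plus two is four.", keep_listening=True), indent=2))
```

The point of the sketch is that nothing here is an exploit: every field is part of the normal developer-facing response format, which is exactly what Yalon means by using only "the features that are given to developers."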