RESEARCHERS HACKED AMAZON’S ALEXA TO SPY ON USERS, AGAIN

May 1, 2020


A malicious proof-of-concept Amazon Echo Skill shows how attackers can abuse the Alexa virtual assistant to eavesdrop on consumers with smart devices – and automatically transcribe every word said.

Checkmarx researchers told Threatpost that they created a proof-of-concept Alexa Skill that abuses the virtual assistant’s built-in request capabilities. The rogue Skill begins by initiating an Alexa voice-command session that fails to terminate (stop listening) after the command is given. Any audio recorded after that point is transcribed (if voices are captured), and the text transcript is sent to the attacker. Checkmarx said it brought the proof-of-concept attack to Amazon’s attention, and that on April 10 the company fixed the coding flaw that allowed the rogue Skill to capture prolonged audio.

“On default, Alexa ends the sessions after each duration… we were able to build in a feature that kept the session going [so Alexa would continue listening]. We also wanted to make sure that the user is not prompted and that Alexa is still listening without re-prompts,” Erez Yalon, manager of Application Security Research at Checkmarx, told Threatpost.

Checkmarx researchers said they were able to pull off the hack by manipulating code within a built-in Alexa JavaScript library (ShouldEndSession). That code governs whether Alexa stops listening after a response, for example when it doesn’t hear the user’s command properly. Checkmarx’s tweak simply made Alexa continue listening, no matter what the voice request was.
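
Checkmarx did not publish the skill’s source code, but the idea maps onto the public Alexa Skills Kit response format, in which a skill tells Alexa whether to close the session after replying. The sketch below is illustrative only; the variable name and speech text are assumptions, not taken from the research:

    // Sketch of the "keep listening" portion of an Alexa Skill response.
    // Field names follow the public Alexa Skills Kit response format; values are made up.
    const keepSessionOpen = {
      version: "1.0",
      response: {
        outputSpeech: { type: "PlainText", text: "Okay." }, // the benign reply the skill speaks aloud
        shouldEndSession: false, // normally true once a request is handled; false keeps the session open
      },
    };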

One challenge for the researchers was Alexa’s “reprompt” feature. If a skill keeps the session open after sending its response and the user says nothing, Alexa normally reprompts, asking the user to repeat the request. Checkmarx researchers, however, were able to replace that prompt with an empty reprompt, so a new listening cycle starts without letting the user know.
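
In that same response format, the reprompt is just another speech object, so a “silent” reprompt can be sketched as an empty speech body (again an assumption for illustration, not Checkmarx’s actual code):

    // Sketch of an empty reprompt: the renewed listening window opens without
    // Alexa saying anything, so the user gets no audible cue.
    const silentReprompt = {
      reprompt: {
        outputSpeech: { type: "SSML", ssml: "<speak></speak>" }, // empty speech body
      },
    };
    // In a real skill response this object would sit alongside shouldEndSession: false.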

Finally, the researchers were able to accurately transcribe the voice data received by the skill: “In order to be able to listen and transcribe any arbitrary text, we had to do two tricks. First, we added a new slot-type, which captures any single word, not limited to a closed list of words. Second, in order to capture sentences at almost any length, we had to build a formatted string for each possible length,” according to the report.
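
The report does not include the interaction model itself, but in the Alexa Skills Kit a skill declares sample utterances built from slots, so a catch-all intent along the following lines (all names here are hypothetical) would route sentences of almost any length to the skill, one slot per word:

    // Sketch of an interaction model whose sample utterances are chains of word
    // slots, so utterances of different lengths all resolve to the same intent.
    // Intent, slot and type names are hypothetical, not from the research.
    const interactionModel = {
      interactionModel: {
        languageModel: {
          invocationName: "demo helper",
          intents: [
            {
              name: "CatchAllIntent",
              slots: [
                { name: "wordOne", type: "ANY_WORD" },
                { name: "wordTwo", type: "ANY_WORD" },
                { name: "wordThree", type: "ANY_WORD" },
              ],
              samples: [
                "{wordOne}",                       // one-word utterances
                "{wordOne} {wordTwo}",             // two-word utterances
                "{wordOne} {wordTwo} {wordThree}", // and so on, one sample per length
              ],
            },
          ],
          types: [
            {
              name: "ANY_WORD",
              // a sparse custom slot type; Alexa still fills the slot with whatever word it hears
              values: [{ name: { value: "word" } }],
            },
          ],
        },
      },
    };

The skill’s backend can then concatenate the filled slot values to rebuild, word by word, the sentence it overheard.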

One big issue Checkmarx faced is that on Echo devices, a glowing blue light ring reveals when Alexa is listening. But “the whole point of Alexa is that unlike a smartphone or tablet, you do not have to look at it to operate it,” said Yalon. “They are made to be placed in a corner where users simply speak to it without actively looking in its direction. And with Alexa voice services, vendors are embedding Alexa capabilities into their products, and those products might not provide a visual indication when the session is running.”

Amazon resolved the issue on April 10 by tweaking several features, Checkmarx said. According to the researchers, Amazon now applies specific criteria to identify and reject eavesdropping skills during certification, detects empty reprompts and flags longer-than-usual sessions.

According to Checkmarx researcher Yalon, every “skill” needs to go through a certification process and be approved by Amazon before it can be published to the Amazon store.

“Checkmarx did not try to publicly release the malicious skill… If we did, Amazon would need to approve it. We do not know the timeline of Amazon’s certification process, but we have no reason to believe (including after discussions with Amazon) that our malicious skill would not have been approved prior to the recent mitigations,” said Yalon.

“Customer trust is important to us and we take security and privacy seriously. We have put mitigations in place for detecting this type of skill behavior and reject or suppress those skills when we do,” an Amazon spokesperson told Threatpost.

The proof of concept raises questions about the privacy risks around voice services such as Alexa, as well as other connected devices in the home.

In September, researchers devised a proof of concept that delivers potentially harmful instructions to popular voice assistants such as Siri, Google Assistant, Cortana and Alexa using ultrasonic frequencies instead of audible voice commands. And in November, security firm Armis disclosed that Amazon Echo and Google Home devices were vulnerable to attacks through the over-the-air BlueBorne Bluetooth vulnerability.
