Published: Mon, May 14, 2018
Tech | By Dwayne Harmon

Undetectable Commands for Apple's Siri and Amazon's Alexa Raise Serious Security Risks


Researchers at universities in the United States and China have learned to issue hidden control commands to voice assistants such as Apple's Siri, Amazon's Alexa and Google Assistant, without the owners of the devices knowing.

The New York Times reports that the work builds on research that began in 2016.

The researchers have now demonstrated that automatic speech recognition systems are also vulnerable to such attacks. Amazon and Google use technology intended to block commands that cannot be heard.

Unfortunately, "in the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online - simply with music playing over the radio".

This attack originally required the perpetrator to be within whispering distance of the device, but later studies have shown that such attacks can be amplified and carried out from as far away as 25 feet.
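The inaudible-command attacks reported in 2016 are generally described as amplitude-modulating a recorded voice command onto an ultrasonic carrier above the roughly 20 kHz limit of human hearing; a microphone's nonlinearity then demodulates the command back into the audible band. A minimal numpy sketch of the modulation step, with an illustrative sample rate, carrier frequency and stand-in "command" tone (none of these values come from the research itself):

```python
import numpy as np

SAMPLE_RATE = 192_000   # high rate needed to represent an ultrasonic carrier
CARRIER_HZ = 25_000     # above the ~20 kHz limit of human hearing

def modulate_ultrasonic(command: np.ndarray) -> np.ndarray:
    """Amplitude-modulate an audible command onto an ultrasonic carrier."""
    t = np.arange(len(command)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    # Standard AM: shift the baseband command up around the carrier frequency.
    return (1.0 + command) * carrier

# Stand-in "voice command": one second of a 400 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 400 * t)
inaudible = modulate_ultrasonic(command)

# The modulated signal's energy sits near 25 kHz: silent to humans, but a
# microphone's nonlinearity can demodulate it back into the voice band.
spectrum = np.abs(np.fft.rfft(inaudible))
peak_hz = np.argmax(spectrum) * SAMPLE_RATE / len(inaudible)
print(round(peak_hz))  # → 25000
```

The spectral check at the end confirms the demonstration: all dominant energy has moved to the carrier frequency, outside the range human ears can detect.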

Taking the research further, some of the same Berkeley researchers published a paper this month demonstrating that they could embed silent commands inside normal audio, such as recorded speech or even music.
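Work of this kind is usually framed as an optimization problem: find a perturbation small enough to be imperceptible to a listener, yet sufficient to push a speech recognizer toward the attacker's chosen transcription. The sketch below is purely illustrative; the "model" is a hypothetical linear scorer standing in for a real recognizer, and only the projected-gradient structure (drive the model toward a target while clipping the perturbation to an inaudibility budget) reflects the general technique:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a speech model: scores audio by projecting it
# onto a fixed direction. A real attack would use the recognizer's own
# loss and gradients with respect to the input waveform.
target_direction = rng.standard_normal(16_000)
target_direction /= np.linalg.norm(target_direction)

def loss(audio: np.ndarray) -> float:
    return (audio @ target_direction - 5.0) ** 2

def grad(audio: np.ndarray) -> np.ndarray:
    return 2.0 * (audio @ target_direction - 5.0) * target_direction

def adversarial_perturbation(audio, epsilon=0.01, steps=200, lr=0.05):
    """Find a delta (|delta| <= epsilon per sample) that lowers the loss."""
    delta = np.zeros_like(audio)
    for _ in range(steps):
        delta -= lr * grad(audio + delta)
        # Project back into the imperceptibility budget.
        delta = np.clip(delta, -epsilon, epsilon)
    return delta

original = 0.1 * rng.standard_normal(16_000)   # stand-in for 1 s of speech
delta = adversarial_perturbation(original)

print(np.max(np.abs(delta)) <= 0.01)            # perturbation stays tiny
print(loss(original + delta) < loss(original))  # yet it moves the model
```

The key design point is the `np.clip` projection: the perturbation can never exceed the per-sample budget, which is what keeps the altered audio sounding unchanged to a human while still steering the model.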

A day after Google announced the addition of six new voices to its Artificial Intelligence (AI)-powered Assistant, a media report said users in the United States could change the voices of their assistants. Some companies, like Apple, declined to comment for this story.

Assistants have been fooled before: Android devices with voice-enabled search would respond by reading from the Whopper's Wikipedia page.

The platforms do have safeguards. Both Amazon and Google make sure that it's the user's voice giving sensitive commands before acting on them, and for unlocking smart locks a spoken PIN code isn't just an option but a default requirement. Voice matching isn't a complete defense, though: it wouldn't stop an attacker from messing with your smart home gadgets or sending a message to someone. Are hackers already using this method to victimize users, and how can you protect yourself against it?

"We limit the information we disclose about specific security measures we take", a spokesperson tells CNET, "but what I can tell you is Amazon takes customer security seriously and we have full teams dedicated to ensuring the safety and security of our products".

Amazon says it has taken steps to keep its smart speakers secure, while Google says that its Assistant has certain features that make undetectable audio commands less severe, according to the Times.

That's all well and good, but as attacks like these creep closer and closer to real-world plausibility, manufacturers will likely need to do more to assuage consumer fears.
