American developers have created a "sound trojan" aimed at artificial intelligence voice systems

5:14 pm, March 22, 2023

American developers have created a "sound trojan" aimed at artificial intelligence voice systems. According to an article published by researchers at UTSA (the University of Texas at San Antonio), the new attack exploits existing vulnerabilities in voice assistants such as Apple Siri, Amazon Alexa, and Microsoft Cortana and can turn a smart home ecosystem against its owner.

The researchers used NUIT (Near-Ultrasound Inaudible Trojan, the "sound trojan" in question) to attack different types of devices. Experimental results showed that NUIT can maliciously control the voice interfaces of popular tech products. The study also served as a proof of concept that such attacks are possible in principle and that all modern AI voice assistants are vulnerable to some extent.

According to the researchers, an attacker does not need specially developed malware to launch a successful NUIT attack. A hacker can use an ordinary video uploaded to YouTube or even an online Zoom call. The key requirement is that the victim plays the audio through speakers rather than headphones: headphone playback never reaches a nearby device's microphone, which reduces the attack's effective range to zero.

How does the sound trojan work in practice? The scheme is quite simple. A cybercriminal uses any available channel to deliver an ordinary voice-assistant command to the victim's speakers. The command is sped up for near-instant execution and shifted into a frequency range close to ultrasound, so the human ear does not detect it. There can be several such commands, and they are most effective when executed sequentially. For example: "OK, Google, set the media volume to 0" (so the victim does not hear the assistant's responses), then "OK, Google, turn off the lights in the entire apartment" or "OK, Google, open the front door."
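The two transformations described above, time-compressing the command and moving it into the near-ultrasound band, can be sketched roughly as follows. This is a minimal illustration, not the researchers' actual pipeline: the 19 kHz carrier, the 48 kHz sample rate, the decimation-based "acceleration," and the function name are all assumptions made for the example.

```python
import numpy as np

FS = 48_000          # sample rate (Hz); commodity speakers reproduce up to ~20 kHz
CARRIER_HZ = 19_000  # near-ultrasound carrier, inaudible to most adults

def to_near_ultrasound(command: np.ndarray, speedup: int = 2) -> np.ndarray:
    """Shift a baseband 'voice command' signal into the near-ultrasound band.

    Hypothetical sketch: accelerates the command, then amplitude-modulates
    it onto a high-frequency carrier so its energy sits above ~16 kHz.
    """
    # 1. "Accelerate" the command by simple decimation (a crude stand-in
    #    for proper pitch-preserving time compression).
    fast = command[::speedup]
    # 2. Amplitude-modulate the accelerated command onto the carrier.
    t = np.arange(len(fast)) / FS
    return fast * np.cos(2 * np.pi * CARRIER_HZ * t)

# Toy "command": a 500 Hz tone standing in for one second of recorded speech.
t = np.arange(FS) / FS
command = np.sin(2 * np.pi * 500 * t)
payload = to_near_ultrasound(command)

# Check where the payload's spectral energy ended up.
spectrum = np.abs(np.fft.rfft(payload))
freqs = np.fft.rfftfreq(len(payload), d=1 / FS)
peak_hz = freqs[int(np.argmax(spectrum))]  # peak lands near the carrier, above hearing range
```

In this toy version the modulated signal would still need a receiver exploiting microphone nonlinearity (or the assistant's own wide-band sensitivity) to be demodulated back into an intelligible command; the sketch only shows why the payload is inaudible to a human standing next to the speaker.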

Of course, for such an attack to cause real problems, the potential victim's home must be full of smart devices. The researchers noted that Apple devices are less vulnerable to such attacks because they are trained on the owner's voice and do not respond to voice commands from other people.

To avoid such attacks, the researchers suggested that users listen to multimedia content through headphones whenever possible and keep smart devices within their line of sight so that unauthorized commands can be noticed in time. It is also wise not to connect smart locks and other electronics that affect physical security to the overall smart-home ecosystem, to avoid unexpectedly letting uninvited guests into your apartment.
