By SecureWorld News Team
Fri | Sep 8, 2017 | 9:48 AM PDT

Siri isn't your friend anymore.

ZDNet explains:

Researchers have devised a method to give potentially harmful instructions to most popular voice assistants using voice commands that can't be heard by humans.

The researchers from Zhejiang University have validated that their so-called DolphinAttack works against a variety of hardware with the Siri, Google, Cortana and Alexa assistants.

They have also developed proof-of-concept attacks to illustrate how an attacker could exploit inaudible voice commands, including silently instructing Siri to make a FaceTime call on an iPhone, telling Google to switch the phone into airplane mode, and even manipulating the navigation system in an Audi.
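The trick behind these inaudible commands, as documented in the DolphinAttack research, is to amplitude-modulate an ordinary voice command onto an ultrasonic carrier above roughly 20 kHz: human ears can't pick it up, but a phone's microphone hardware demodulates it back into the audible range the assistant listens for. The snippet below is a minimal sketch of that modulation step in Python; the 25 kHz carrier, 96 kHz sample rate, and file names are illustrative assumptions, not values taken from the paper.

import numpy as np
from scipy.io import wavfile

# Illustrative assumptions only -- not values from the DolphinAttack paper:
CARRIER_HZ = 25_000   # carrier above the ~20 kHz limit of human hearing
OUTPUT_RATE = 96_000  # sample rate high enough to represent that carrier

def modulate_command(in_path, out_path):
    """Amplitude-modulate a recorded voice command onto an ultrasonic carrier."""
    rate, voice = wavfile.read(in_path)
    voice = voice.astype(np.float64)
    if voice.ndim > 1:            # mix stereo down to mono
        voice = voice.mean(axis=1)
    peak = np.max(np.abs(voice))
    if peak > 0:                  # normalize to [-1, 1]
        voice = voice / peak

    # Resample the command to the higher output rate (linear interpolation).
    t = np.arange(int(len(voice) / rate * OUTPUT_RATE)) / OUTPUT_RATE
    baseband = np.interp(t, np.arange(len(voice)) / rate, voice)

    # Classic AM: the carrier's amplitude follows the voice signal.
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    modulated = (1.0 + baseband) * carrier

    # Scale into int16 range and write the result.
    wavfile.write(out_path, OUTPUT_RATE, (modulated / 2 * 32767).astype(np.int16))

# Hypothetical usage:
# modulate_command("voice_command.wav", "ultrasonic_command.wav")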

Tags: IoT Security