Hey Alexa, are you vulnerable to hackers?
Like any piece of technology, the answer is yes.
And new research reveals an unexpected method: what you might call the laser light attack.
How can you hack an Alexa with light commands?
The connectivity in the Internet of Things is great. That is, until hackers gain access.
If an attacker can gain control of a MEMS microphone (like those in Alexa, Google Assistant, and Siri devices), they might be able to access your whole home, depending on what you have the device controlling:
• Control smart home switches
• Open smart garage doors
• Make online purchases
• Remotely unlock and start certain vehicles
• Open smart locks by stealthily brute-forcing a user's PIN
New research from the University of Michigan shows how a light command vulnerability can be used on these devices.
Here's the gist: microphones convert sound into electrical signals.
The main discovery behind light commands is that, in addition to sound, microphones also react to concentrated light aimed directly at them, like a laser. This means attackers can trick microphones into producing electrical signals as if they were receiving genuine audio.
As if they were receiving a voice command that said, "Alexa, unlock front door," for example.
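To make that mechanism concrete, here's a minimal Python sketch of the idea. Everything here is illustrative, not code from the research: it assumes the attacker amplitude-modulates a laser's brightness with the audio of a voice command, and that the microphone responds linearly to light intensity, so the command comes back out as an electrical signal.

```python
import numpy as np

def modulate_laser(audio, dc_level=1.0, depth=0.5):
    """Encode an audio waveform as laser intensity (hypothetical model).

    Light intensity can't be negative, so the command rides on a
    constant DC brightness, with `depth` controlling modulation strength.
    Assumes `audio` is normalized to [-1, 1].
    """
    intensity = dc_level * (1.0 + depth * audio)
    return np.clip(intensity, 0.0, None)

def mic_response(intensity, sensitivity=1.0):
    """Idealized microphone output: linear in incident intensity.

    Real voice assistants AC-couple the signal, so we subtract the
    DC bias, leaving a signal proportional to the original audio.
    """
    return sensitivity * (intensity - intensity.mean())

# A toy "voice command": a 440 Hz tone, one second at 16 kHz.
fs = 16_000
t = np.arange(fs) / fs
command = np.sin(2 * np.pi * 440 * t)

# Laser in, audio out: the mic "hears" the light as if it were sound.
recovered = mic_response(modulate_laser(command))
correlation = np.corrcoef(command, recovered)[0, 1]
print(round(correlation, 3))  # → 1.0 (recovered signal matches the command)
```

In this simplified model the device's speech recognizer would see the recovered waveform as an ordinary spoken command, which is why the attack works without any audible sound.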
Here's a video of light commands in action on a Google Assistant device.
That's scary stuff, and one of the risks associated with the connectivity of the IoT.
Check out the complete research here.