Smart speakers have become a common home addition, but owning one requires some vigilance over potential security breaches. Now, researchers have shown that hackers could get these small devices to do their bidding without making a single sound.
Devices running Google Assistant, Amazon Alexa and Siri were all shown to be vulnerable to this security hole, and the researchers got it working on a Facebook Portal device, too. Even phones and tablets were susceptible.
The trick works through the micro-electro-mechanical systems, or MEMS, built into smart speaker mics. These tiny components can interpret light as sound, which means they can be manipulated by something as simple as a laser pointer.
Considering smart speakers are often hooked up to smart locks, smart alarms, and other home security devices, it's not difficult to imagine how this could be used to sneak into a property (or maybe just fire up a smart coffee maker).
"We know that light triggers some sort of movement in the microphone's diaphragm, and that mics are built to interpret such movements as sound, as they're typically resulting from sound pressure physically hitting the diaphragm," the researchers told Dan Goodin at Ars Technica.
"However, we do not completely understand the physics behind it, and we are currently working on investigating this."
The team got the hack working through windows, at distances of up to 110 metres (361 feet), and with a kit costing just a few US dollars. Smart speakers typically don't apply any extra security checks to voice commands – if you issue a command, it just works.
Before you lock your Amazon Echo in the cupboard though, bear in mind that the attack does need a line of sight to the device. These speakers usually issue audible feedback too, so you would know if someone was trying to do some online shopping or turn off your smart lights remotely.
The exploit also needs quite a sophisticated setup, with a strong, focussed laser and equipment to modulate the laser's intensity with the audio of a voice command. It's not something that your neighbours are going to be able to do easily, though you might want to keep your smart speakers away from the windows, just in case.
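The conversion itself is essentially amplitude modulation: the louder the audio at a given instant, the brighter the laser. As a rough illustration only – the function, parameter names and current values below are assumptions, not the researchers' actual tooling – a minimal Python sketch of mapping a voice-command waveform onto a laser-diode drive current might look like this:

```python
# Illustrative sketch only: simple amplitude modulation of a laser drive
# signal by an audio waveform. Values and names are assumptions.
import numpy as np

def audio_to_laser_drive(audio, bias_ma=200.0, depth_ma=150.0):
    """Map a mono audio waveform onto a laser-diode drive current (mA):
    a DC bias keeps the diode lasing, and the audio modulates the
    intensity around that bias."""
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio))
    normalised = audio / peak if peak > 0 else audio   # scale to [-1, 1]
    return bias_ma + depth_ma * normalised             # one drive value per sample

# Example: a short 1 kHz test tone sampled at 44.1 kHz
sample_rate = 44_100
t = np.arange(0, 0.01, 1 / sample_rate)
tone = 0.8 * np.sin(2 * np.pi * 1_000 * t)
drive = audio_to_laser_drive(tone)
print(drive.min(), drive.max())   # stays within bias plus/minus depth
```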
There are ways that smart speakers could reduce the risk of this type of attack, the researchers say – by only responding to commands if multiple mics in the speaker are triggered, and by implementing voice recognition technology (this is available on some speakers, but isn't always enabled by default).
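To make the first of those ideas concrete, here is a minimal sketch, assuming a device that exposes one audio channel per microphone, of how a coincidence check could reject light-injected commands; the function name and threshold are hypothetical, not taken from the paper or any vendor's firmware.

```python
# Hypothetical multi-microphone check: real speech reaches every mic at
# comparable levels, while a laser spot drives only the mic it hits.
import numpy as np

def command_looks_acoustic(mic_channels, ratio_threshold=0.25):
    """Accept a command only if the quietest channel carries a reasonable
    fraction of the energy seen on the loudest channel."""
    energies = np.array([np.mean(np.square(ch)) for ch in mic_channels])
    if energies.max() == 0:
        return False                        # silence: nothing to act on
    return energies.min() / energies.max() >= ratio_threshold

# Example: a laser-style signal hitting mic 0 only should be rejected
rng = np.random.default_rng(0)
loud = rng.normal(size=1000)
quiet = 0.01 * rng.normal(size=1000)
print(command_looks_acoustic([loud, quiet, quiet]))   # False
```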
When contacted for comment, Amazon and Google said they were keeping tabs on the research. The general consensus is that this isn't a practical attack that anyone is likely to attempt in the real world, but it is certainly something to be aware of.
And even if this isn't the sort of security attack that's likely to happen on your street, the research is valuable in figuring out what approaches hackers might take in the future, as our homes and businesses become increasingly littered with voice-activated gadgets.
"Better understanding of the physics behind the attack will benefit both new attacks and countermeasures," write the researchers in their paper.
The paper has yet to be peer-reviewed and published, but you can read more on the research here.