Scientists have found a new vulnerability in a common tech component, uncovering a security flaw that could potentially expose millions of smartphones, fitness wearables, and even cars to hacking.
By using sound waves, researchers have figured out how to trick accelerometers – the tiny sensors in gadgets that detect movement – into registering a fake motion signal, which hackers could exploit to take control of our devices.
"It's like the opera singer who hits the note to break a wine glass, only in our case, we can spell out words," computer scientist Kevin Fu from the University of Michigan told The New York Times.
"You can think of it as a musical virus."
The sensors that Fu's team investigated are called capacitive MEMS accelerometers, which register the rate of change in an object's velocity in three dimensions.
It's these sensors that can tell which way you're holding or tilting your smartphone or tablet, and count the steps you take using an activity tracker.
But they're not just used in consumer gadgets – they're also embedded in the circuits of things like medical devices, vehicles, and even satellites – and we're becoming more reliant on them all the time.
"Thousands of everyday devices already contain tiny MEMS accelerometers," Fu explains in a press release.
"Tomorrow's devices will aggressively rely on sensors to make automated decisions with kinetic consequences."
But accelerometers have an Achilles heel: sound. By precisely tuning acoustic tones to the right frequency, Fu's team was able to deceive 15 out of 20 accelerometer models from five different manufacturers, and control output from the devices in 65 percent of cases.
Accelerometers may enable some high-tech functionalities, but the principle is fundamentally simple – using a mass suspended on springs to detect changes in speed or direction. But those measurements can effectively be forged if you use the right sonic frequency to fool the tech.
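That mass-on-springs principle is also why resonance matters: a tone at the right pitch pumps the suspended mass far harder than any other frequency. A minimal sketch of the idea, using a simple damped mass-spring model with purely illustrative numbers (not parameters from the actual paper), shows how an acoustic force at the sensor's natural frequency produces a far larger proof-mass displacement than the same force off-resonance:

```python
import math

def simulate_proof_mass(drive_freq_hz, steps=200_000, dt=1e-6):
    """Integrate a damped mass-spring 'proof mass' (the moving part of a
    capacitive MEMS accelerometer) driven by a sinusoidal acoustic force.
    All physical constants are illustrative, chosen only to demonstrate
    resonance; they are not taken from the researchers' work.
    Returns the peak displacement reached during the simulation."""
    m = 1e-9    # proof-mass in kg (illustrative)
    k = 1.0     # spring constant in N/m (illustrative)
    c = 2e-7    # damping coefficient in N*s/m (illustrative)
    f0 = 1e-6   # acoustic force amplitude in N (illustrative)
    w = 2 * math.pi * drive_freq_hz
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(steps):
        t = i * dt
        # Newton's second law: force - spring - damping, divided by mass
        a = (f0 * math.sin(w * t) - k * x - c * v) / m
        v += a * dt          # semi-implicit Euler step
        x += v * dt
        peak = max(peak, abs(x))
    return peak

# Natural frequency of this mass-spring system: f_n = sqrt(k/m) / (2*pi)
f_n = math.sqrt(1.0 / 1e-9) / (2 * math.pi)   # about 5 kHz here

on_res = simulate_proof_mass(f_n)        # tone at the resonant frequency
off_res = simulate_proof_mass(f_n / 3)   # same tone, wrong frequency
print(f"on-resonance peak:  {on_res:.2e} m")
print(f"off-resonance peak: {off_res:.2e} m")
```

With these toy numbers the on-resonance displacement comes out orders of magnitude larger, which is the physical opening the attack exploits: the capacitive readout can't distinguish that acoustically driven motion from a real acceleration.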
"The fundamental physics of the hardware allowed us to trick sensors into delivering a false reality to the microprocessor," Fu explains.
Once they figured out which frequencies manipulated the sensors, they were able to trick a Fitbit into counting thousands of steps that were never taken; pilot a toy car by taking control of a smartphone app; and even use a music file to make a Samsung Galaxy S5 crudely write out a word ("Walnut") in a graph of its accelerometer readings.
The tech used to hijack these devices wasn't high-end audio gear either. In one case, the researchers used a US$5 external speaker; in another, a smartphone played a sound file on its own internal speaker and effectively hacked itself.
While all these proofs-of-concept were fairly harmless demonstrations of the technique, the researchers warn that it could easily be used for malicious and potentially very dangerous purposes.
"If a phone app used the accelerometer to start your car when you physically shake your phone, then you could intentionally spoof the accelerometer's output data to make the phone app think the phone is being shaken," one of the team, Timothy Trippel, told Gizmodo.
"The phone app would then send the car a signal to start."
The research is due to be presented at the IEEE European Symposium on Security and Privacy in Paris in April, and while the study hasn't yet been peer-reviewed, the findings are being treated seriously.
As John Markoff at The New York Times reports, the US Department of Homeland Security is expected to issue a security alert in relation to the specific sensors documented in the paper.
The manufacturers involved were separately forewarned of the vulnerability before the researchers went public with their findings this week.
Now that we know about the security flaw, hopefully researchers and technology companies will be able to work together and find a means of patching up the weak spot.
As technological devices get ever more powerful and independent, it's crucial that they can't be puppeteered by something as rudimentary as sound waves overriding their fundamental components.
"Humans have sensors, like eyes, ears, and a nose," says Trippel.
"We trust our senses and we use them to make decisions. If autonomous systems can't trust their senses, then the security and reliability of those systems will fail."