Phone tapping of a different kind
Always-on virtual assistants such as Amazon Alexa are taking side-channel attacks to the next level.
Voice assistant-enabled smart devices can infer a surprisingly large amount of information about what is being typed into a nearby smartphone just by listening, researchers from Cambridge University have discovered.
The research shows that the privacy threats posed by audio-gathering devices extend beyond the long-feared possibilities of eavesdropping on private conversations or inferring key taps on physical keyboards: even potentially sensitive data typed onto the touchscreens of nearby mobile devices can be recovered.
As a paper by Almos Zarandy, Ilia Shumailov and Ross Anderson (amusingly titled ‘Hey Alexa what did I just type? Decoding smartphone sounds with a voice assistant’) explains:
Using two different smartphones and a tablet we demonstrate that the attacker can extract PIN codes and text messages from recordings collected by a voice assistant located up to half a meter away.
This shows that remote keyboard-inference attacks are not limited to physical keyboards but extend to virtual keyboards too.
The short distances involved are, on their own, a substantial barrier to attack outside of Mission: Impossible-style scenarios.
Shumailov, a PhD candidate in computer science at the University of Cambridge, told The Daily Swig: “I don’t think right now anyone would use the attack we described.”
“If you are concerned about privacy, best not to have always-on mics. Unfortunately, there is no research about how to defeat the attacks described without greatly reducing usability.”
Welcome to the biodome
In other exchanges on Twitter, the researcher suggested that swipe gestures as well as passcodes could be vulnerable to a variant of the attack the Cambridge team developed.
However, this shouldn’t necessarily be taken as a recommendation for the security conscious to switch to Apple’s FaceID recognition technology or other forms of biometric authentication.
Shumailov told The Daily Swig: “FaceID and other biometric markers are public information – anyone has access to your face, and most countries you have been to have your fingerprints. They stop a weak attacker, but not anyone remotely sophisticated.”
The latest research from Cambridge University follows a study last year that demonstrated how a gaming app could steal your banking PIN by listening to the vibration of the screen as your finger taps it.
That hack relied on on-phone microphones, whereas the latest exploits showed that voice assistants were almost as effective.
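To illustrate the basic idea behind this class of attack – not the researchers’ actual, far more sophisticated pipeline – the first step of decoding taps from audio is simply spotting when they occur. A crude sketch, with illustrative frame sizes and thresholds chosen purely for the demo:

```python
import numpy as np

def detect_taps(audio, sample_rate, frame_ms=10, threshold_ratio=5.0):
    """Return approximate sample offsets of percussive 'tap' events.

    Splits the signal into short frames, computes per-frame energy, and
    flags frames whose energy exceeds a multiple of the median energy
    (a stand-in for a real detector's matched filtering and denoising).
    """
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    frames = audio[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames.astype(float) ** 2).sum(axis=1)
    threshold = threshold_ratio * np.median(energy) + 1e-12
    taps = []
    prev_hot = False
    for i, e in enumerate(energy):
        hot = e > threshold
        if hot and not prev_hot:          # rising edge = start of a new tap
            taps.append(i * frame_len)
        prev_hot = hot
    return taps

# Synthetic demo: one second of quiet noise with two loud 'taps'
rng = np.random.default_rng(0)
sr = 16000
sig = rng.normal(0, 0.01, sr)
sig[4000:4080] += 0.5                     # tap 1 at sample 4000
sig[12000:12080] += 0.5                   # tap 2 at sample 12000
print(detect_taps(sig, sr))               # → [4000, 12000]
```

Recovering *which* key was tapped from each detected event is the hard part, and is where the researchers’ machine-learning models come in.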
“Modern voice assistants have two to seven microphones, so they can do directional localization, just as human ears do, but with greater sensitivity,” Anderson writes in a blog post.
“A lot more work is needed to understand the privacy implications of the always-on microphones that are increasingly infesting our workspaces and our homes.”
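Anderson’s point about multi-microphone localization can be sketched with a toy time-difference-of-arrival estimate: cross-correlating two microphone channels yields the inter-channel delay, which constrains the direction of the sound source. The mic spacing and geometry below are assumptions for illustration, not any device’s actual array design:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
MIC_SPACING = 0.07       # assumed 7 cm between two microphones

def estimate_angle(left, right, sample_rate):
    """Estimate direction of arrival from two microphone channels.

    Cross-correlates the channels to find the delay (in samples) at
    which they best align, converts it to seconds, then to an angle
    via the far-field approximation delay = d * sin(theta) / c.
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # samples; + means left lags
    delay = lag / sample_rate                  # seconds
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

# Demo: a click that reaches the right mic two samples later,
# implying the source sits slightly toward the left mic's side
sr = 48000
click = np.zeros(1024)
click[100:110] = 1.0
left = click
right = np.roll(click, 2)
print(round(estimate_angle(left, right, sr), 1))
```

With more microphones the same principle gives a full bearing rather than a single angle, which is what lets an array pinpoint a tapping finger more precisely than a single mic ever could.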