In the past two years, people have worried that popular mobile apps and smart speakers/voice assistants might leak personal information, but the reality is scarier than we imagined.
Recently, researchers at the University of Cambridge found that any smart device capable of receiving voice commands (such as a smart speaker or smartphone) can, just by listening, be used to infer much of what is typed on a nearby smartphone, including passwords.
The study shows that the privacy threats posed by audio-capture devices extend beyond eavesdropping on private conversations: text entered on physical keyboards and smartphone touchscreens is equally vulnerable to their monitoring.
The researchers explain in their paper “Hey Alexa what did I just type? Decoding smartphone sounds with a voice assistant” (download link at the end) that, using two different smartphones and a tablet, they verified that an attacker can extract PIN codes and text messages from recordings collected by a voice assistant half a meter away. This shows that remote keyboard-inference attacks are not limited to physical keyboards but extend to virtual keyboards as well.
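To make the attack concrete: the rough shape of such a pipeline is to detect the short acoustic bursts produced by screen taps, extract a spectral fingerprint of each burst, and classify the fingerprints against labeled training recordings. The Python sketch below only illustrates that general idea; it is not the authors’ pipeline, and the function names (detect_taps, tap_features), thresholds, features, and classifier are hypothetical choices. The training data here is random placeholder noise purely so the sketch runs end to end.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier

def detect_taps(audio, sr, frame_ms=10, energy_factor=4.0):
    """Find candidate tap onsets via short-time energy thresholding."""
    frame = int(sr * frame_ms / 1000)
    n_frames = len(audio) // frame
    energy = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n_frames)])
    threshold = energy_factor * np.median(energy)
    return np.where(energy > threshold)[0] * frame  # onset sample indices

def tap_features(audio, sr, onset, window_ms=40):
    """Average spectrum of the ~40 ms around one tap, as a feature vector."""
    half = int(sr * window_ms / 2000)
    snippet = audio[max(0, onset - half):onset + half]
    _, _, sxx = spectrogram(snippet, fs=sr, nperseg=min(256, len(snippet)))
    return sxx.mean(axis=1)

sr = 16000
rng = np.random.default_rng(0)

# A real attack would need labeled recordings of taps on each PIN-pad key;
# random placeholders stand in here purely so the sketch is runnable.
X_train = rng.normal(size=(100, 129))      # 100 fake tap fingerprints
y_train = rng.integers(0, 10, size=100)    # fake digit labels 0-9
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Stand-in for audio captured by the assistant: noise plus three synthetic taps.
recording = rng.normal(scale=0.01, size=2 * sr)
for t in (0.5, 1.0, 1.5):
    i = int(t * sr)
    recording[i:i + 64] += rng.normal(scale=0.5, size=64)

for onset in detect_taps(recording, sr):
    feats = tap_features(recording, sr, onset)
    if len(feats) == 129:                  # skip clipped windows near the edges
        print("guessed digit:", clf.predict([feats])[0])
```

The reason such a pipeline can work at all is that taps at different screen positions produce subtly different acoustic signatures, which a classifier can learn to separate given enough labeled examples.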
Fortunately, the attack’s short effective range is a major obstacle to carrying it out in the wild.
Ilia Shumailov, a PhD candidate in computer science at the University of Cambridge, noted: “I don’t think anyone is using the attack we described at the moment. If you are concerned about privacy, it is best not to keep the microphone on all the time. Unfortunately, there is as yet no known way to defeat the attack without significantly reducing device usability.”
Welcome to the Biometric Dome
In discussions on Twitter, the researchers suggested that swipe gestures and passwords alike may be vulnerable to the kind of attack the Cambridge team developed.
However, this does not mean that privacy-conscious users should switch their authentication method to Apple’s Face ID facial recognition (which is less secure) or other forms of biometric authentication.
The researchers pointed out: “Facial information and other biometric markers often become public information, and anyone can use your face. Moreover, in most countries you have visited, you have left fingerprints. Such biometric authentication technologies can at most defend against some casual attacks but cannot withstand sophisticated attacks.”
It is worth noting that the new paper builds on a study the University of Cambridge conducted last year, which analyzed how gaming applications could steal bank PINs by using the phone’s microphone to listen to the vibrations of finger taps on the screen. The latest work shows that smart speakers/voice assistants are nearly as effective.
“Smart speakers/voice assistants have two to seven microphones, so like the human ear they can localize the direction of a sound source, only with higher sensitivity,” the researchers pointed out.
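A standard way a multi-microphone array localizes a sound source is to estimate the time difference of arrival (TDOA) between microphone pairs, commonly via generalized cross-correlation with phase transform (GCC-PHAT). The sketch below demonstrates that textbook technique on simulated two-channel data; actual smart-speaker firmware is proprietary and may well work differently, and the microphone spacing and signals here are made up for illustration.

```python
import numpy as np

def gcc_phat(sig, ref, sr, max_tau=None):
    """Estimate the delay (seconds) of `sig` relative to `ref` via GCC-PHAT."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    cross = SIG * np.conj(REF)
    cross /= np.abs(cross) + 1e-12          # PHAT weighting: keep phase only
    cc = np.fft.irfft(cross, n=n)
    max_shift = n // 2 if max_tau is None else min(int(sr * max_tau), n // 2)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / sr

# Toy demo: simulate one broadband tap arriving at mic B 0.25 ms after mic A.
sr = 16000
rng = np.random.default_rng(1)
tap = rng.normal(size=64)                   # broadband click
mic_a = np.zeros(sr)
mic_b = np.zeros(sr)
mic_a[1000:1064] += tap
mic_b[1004:1068] += tap                     # 4-sample delay = 0.25 ms
tau = gcc_phat(mic_b, mic_a, sr, max_tau=0.001)

# With a known mic spacing d, the bearing follows from arcsin(c * tau / d).
d, c = 0.10, 343.0                          # 10 cm spacing, speed of sound (m/s)
angle = np.degrees(np.arcsin(np.clip(c * tau / d, -1, 1)))
print(f"estimated delay: {tau * 1e3:.3f} ms, bearing: {angle:.1f} deg")
```

With two or more such microphone pairs, the delays can be intersected to estimate where on a nearby screen a tap landed, which is what makes multi-microphone arrays more capable eavesdroppers than a single phone microphone.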
Clearly, we have greatly underestimated the ‘hearing’ capabilities of smart speakers.
References
Hey Alexa what did I just type? Decoding smartphone sounds with a voice assistant:
https://arxiv.org/abs/2012.00687