A group of French researchers has discovered that voice command features on smartphones, both iPhone and Android, can be controlled easily and silently by hackers transmitting commands via radio from as far as 16 feet away.
[su_note note_color="#ffffcc" text_color="#000000"]Mark James, Security Specialist at IT Security Firm ESET:
How serious is this and what could be the implications?
“If we talk about ‘what ifs’ and ‘could bes’, then in theory we could do all manner of things with this attack method. We could in theory dial a premium-rate number over and over again, access websites with malware installed, or access private information on the phone, but you have to be pretty close, and the likelihood of the owner not realising is quite slim, I think.”
How likely are we to see hackers exploiting this vulnerability in the wild?
“I think we will see the opportunistic hacker take a look at it, but I don’t think we will see it as a mainstream attack method.”
What should smartphone users do? Is it as simple as turning off voice command?
“Making sure voice control is not active on the lock screen can help, and keeping an eye on your phone will alert you if anything is amiss.”
What should the smartphone manufacturers do?
“Listen and evaluate what these researchers have found and work towards getting it fixed. I am sure they will want to fix this as urgently as they can or at least make it a lot harder to succeed.”[/su_note]
[su_note note_color="#ffffcc" text_color="#000000"]Craig Young, Security Researcher at Tripwire:
“This has been a very interesting year for software defined radio hacks.
We have seen hacks ranging from turning RAM chips into radios broadcasting air-gapped data, to pita-sized radios stealing encryption keys, to opening garage doors with a child’s toy.
This time we are hearing about a hack that remotely spoofs audio transmissions through a microphone wire, a trick which surely has other applications. My speculation would be that this technique may also have implications for other systems, such as phone-based credit processing or IoT devices that use the audio port for data transfer.
As far as hacking Android’s Google Now, the voice identity feature would seem to be a slight saving grace. For example, I generally cannot invoke Google Now on my wife’s phone because it seems to ‘know’ her accent. This technology is not perfect, however, and someone going to the trouble of targeting a victim in this manner probably also has the resources to obtain voice recordings to build up the attack phrases, or to find someone with a similar voice.”[/su_note]
[su_note note_color="#ffffcc" text_color="#000000"]Tim Erlin, Director of Security and Product Management at Tripwire:
“Leaving voice commands enabled on a locked phone is like locking your car with the windows down. You’ve done the right thing, but left a big hole in your strategy.
The researchers have presented an interesting and creative attack, but it probably isn’t of serious concern to the average user.
High-value individuals, such as government or intelligence employees, could be valid targets for this kind of attack.
The recommended mitigation of custom wake commands would be not only effective but also a pretty fun feature for users.”[/su_note]