Siri as an Attack Vector

Tuesday, October 20, 2015 @ 08:10 PM gHale

Voice-activated personal assistants could be turned against their owners, researchers said.

A new type of attack makes it possible to force the Siri and Google Now voice-activated personal assistants to send messages and emails. It can also trick the assistants into visiting potentially malicious sites, making pricey phone calls, or even turning the device into an eavesdropping tool, said researchers at the French Network and Information Security Agency (ANSSI).


For the attack to succeed, the target device must have headphones with a microphone plugged in, and the attacker must be within close range of the device, though no immediate physical access to it is required.

The headphones’ cord functions as an antenna: the wire converts electromagnetic waves sent by the attackers into electrical signals, which the operating system interprets as voice commands coming in through the microphone.
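As a rough illustration of that principle, the sketch below amplitude-modulates a stand-in “voice command” onto a carrier and then recovers it with a simple envelope detector, mimicking the kind of unintended demodulation a phone’s audio front end can perform on a signal picked up by the cord. The use of AM, along with every frequency and parameter shown, is an assumption made for illustration; the article does not give the researchers’ modulation details.

import numpy as np

fs = 1_000_000                                 # simulation sample rate (Hz)
t = np.arange(0, 0.01, 1 / fs)                 # 10 ms of signal

audio = np.sin(2 * np.pi * 440 * t)            # stand-in for a spoken command
carrier = np.cos(2 * np.pi * 100_000 * t)      # stand-in RF carrier
transmitted = (1 + 0.5 * audio) * carrier      # amplitude-modulated signal

# Envelope detection: rectify, then low-pass to strip the carrier away.
rectified = np.abs(transmitted)
lowpass = np.ones(200) / 200                   # crude moving-average filter
recovered = np.convolve(rectified, lowpass, mode="same")
recovered -= recovered.mean()                  # drop the DC offset

print("correlation with the original audio:",
      round(float(np.corrcoef(audio, recovered)[0, 1]), 3))

The printed correlation comes out close to 1, showing that a trivial detector is enough to turn the modulated waveform back into the original audio.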

On the attacker’s side, the setup consists of a laptop running GNU Radio, a USRP software-defined radio, an amplifier, and an antenna.
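For readers curious what such a transmit chain might look like, here is a minimal, hypothetical GNU Radio flowgraph: a recorded command is amplitude-modulated at baseband and handed to a USRP sink, which upconverts and radiates it. The file name, sample rates, carrier frequency, and gain are all illustrative assumptions, not the researchers’ actual parameters or tooling.

from gnuradio import gr, blocks, filter, uhd

class AmCommandTx(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self, "AM voice-command transmitter (sketch)")

        # Assumption: the recorded command is a 48 kHz mono WAV file.
        src = blocks.wavfile_source("hey_siri_command.wav", True)

        # Interpolate the 48 kHz audio up to the 480 kHz rate fed to the radio.
        resamp = filter.rational_resampler_fff(interpolation=10, decimation=1)

        depth = blocks.multiply_const_ff(0.5)    # modulation depth (assumed)
        offset = blocks.add_const_ff(1.0)        # DC offset for the AM carrier
        to_iq = blocks.float_to_complex()        # real baseband to I/Q samples

        # USRP sink; the hardware upconverts the baseband to the carrier.
        sink = uhd.usrp_sink("", uhd.stream_args(cpu_format="fc32",
                                                 channels=[0]))
        sink.set_samp_rate(480e3)
        sink.set_center_freq(103e6, 0)           # carrier frequency: assumed
        sink.set_gain(60, 0)                     # drives the external amplifier

        self.connect(src, resamp, depth, offset, to_iq, sink)

if __name__ == "__main__":
    AmCommandTx().run()

The amplifier and antenna named in the researchers’ setup would sit after the USRP output to boost and radiate the signal toward the target’s headphone cord.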

The whole setup can fit into a backpack, allowing the attacker to get close enough to the target (roughly six and a half feet) in places where crowds are normal, such as airports or public transit. The researchers said a longer attack range (up to 16 feet) is possible, but the required setup would be much larger and therefore more unwieldy.

Such an attack would, of course, be noticed by the target if they happened to be looking at their phone at that exact moment. The attack also works best if the victim’s phone is unlocked, although on newer iPhones Siri can be enabled from the lock screen.

The researchers have advised Apple and Google on how to close this attack vector.