Opinion Tuesday #3: Hardware Hackers Exploit Siri Using Nothing but an EMF Generator


Hello again fellow hackers. First of all, sorry that this Opinion Tuesday came a week late, but I couldn't make it home in time last week due to the terror attacks in Brussels. I'm safe and sound, though!

Today, I'm reviewing an interesting event that took place some time ago (October 2015), but it shows just how powerful hardware hacking can be!

The Event

In October last year, security researchers from the French government agency "ANSSI" discovered that they could exploit Siri and Google Now to visit malware sites or dial a premium-rate number, all without any malware or user interaction! Only a few key criteria need to be met:

  • The target must have earphones with a microphone plugged in
  • Siri or Google Now must be enabled from the lock screen

The researchers constructed a radio transmitter that would trigger voice commands for Siri or Google Now. Further details can be found in the original article:

Source: http://thehackernews.com/2015/10/radio-wave-phone-hack.html

Phoenix750's Opinion

As most of you will know, I am deeply intrigued by hardware hacking, so this vulnerability really got me pumped up!

The way this vulnerability works is that the attacker's transmitter creates a changing electromagnetic field at a certain frequency. In this case, the field's strength varies over time to mirror an analogue voice pattern. In other words, the strength of the electromagnetic field is modulated to simulate a voice command. The picture below shows what an analogue voice signal looks like.

Image via etutorials.org
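To make the idea concrete, here's a rough sketch of amplitude modulation in Python. The numbers (a 1 kHz "voice" tone, a 100 kHz carrier) are my own illustrative choices, not the frequencies the researchers actually used:

```python
import numpy as np

# Illustrative parameters (assumed, not from the ANSSI research):
fs = 1_000_000          # sample rate in Hz
duration = 0.01         # seconds of signal
t = np.arange(int(fs * duration)) / fs

voice = np.sin(2 * np.pi * 1_000 * t)        # baseband "voice" tone
carrier = np.sin(2 * np.pi * 100_000 * t)    # radio-frequency carrier

# Standard AM: the carrier's strength follows the voice waveform.
m = 0.5                                      # modulation index
am_signal = (1 + m * voice) * carrier

# The envelope of am_signal now traces the voice waveform — that
# envelope is what the earphone wire would pick up.
```

A real attack rig would feed something like this to a radio amplifier and antenna; the point here is only that "field strength changing like a voice" is ordinary amplitude modulation.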

As you all know from my article on electromagnetism & electromagnetic induction, a changing magnetic field induces a current in any conductor within its range. So if the attacker's electromagnetic field changes like the analogue signal of a voice, that "voice" will be induced in the wire of the earphones' microphone. The result: we've tricked the smartphone into thinking the user gave it a command!
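If you want to see the induction step in numbers, here's a small sketch of Faraday's law (emf = -dΦ/dt) with made-up values; the field strength, frequency, and effective loop area are assumptions for illustration only:

```python
import numpy as np

# Hypothetical numbers, purely to illustrate Faraday's law:
B0 = 1e-6      # peak magnetic field in tesla (assumed)
f = 1_000      # field frequency in Hz, i.e. voice-band (assumed)
area = 1e-4    # effective loop area of the earphone wire in m^2 (assumed)

# One full period of a sinusoidal field B(t) = B0 * sin(2*pi*f*t)
t = np.linspace(0, 1 / f, 1_000)
flux = B0 * area * np.sin(2 * np.pi * f * t)

# Numerical derivative approximates the induced emf = -dPhi/dt
emf = -np.gradient(flux, t)

# Peak emf should match the analytic value 2*pi*f*B0*area
peak = np.max(np.abs(emf))
```

The point: the faster and stronger the field changes, the bigger the induced voltage — which is exactly why a wire that was meant to carry a microphone signal can be driven by a field it was never meant to see.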

The beauty of hardware hacking is that it mostly can't be patched. When a 0-day is discovered in Flash Player, Adobe can simply rewrite the affected code to fix it. Apple and Google can't fix this vulnerability that way because it isn't software-related, and they can't rewrite the laws of electromagnetism! In other words, this vulnerability is most likely here to stay.

Now, there is a simple measure that will protect you most of the time: disable Siri or Google Now on the lock screen, or disable them entirely. It's not convenient, but it will protect you from this attack!

So, what do you guys think about this? Let me know!



This was very well written. As for the vulnerability, I find it very interesting. I'm not as big on electricity as you are, so I'll definitely be reading up on it for a better understanding.


Thanks for the positive feedback!


This is awesome.

I am still new to the whole area of Hardware Hacking, but the more I concern myself with it, the more my fascination for it rises.

I once read about a sub-$100 setup to make car keys work over distances of more than 100 metres, which really got me into the hardware area.

This vulnerability seems simple and brilliant at the same time. You just need to know which radio signals to use for a specific command. This is admittedly not a very directed attack - who knows who is in your range? - but it works, so what :)

It is the perfect tool for cyber criminals. Go to a busy airport, sit down, turn on your transmitter, and let the money flow in.


that hack is pretty damn sexy...

Great article Phoenix! Your passion for electricity and hardware hacking has changed my perspective as a whole. Keep up the great work.


It is a very clever hack, which, as you say, cannot be fixed by software.
I would not say, though, that it cannot be patched full stop.

A simple EMC shielding braid in the headphone wires would fix this, and if the hack becomes more widely known or gains more public attention, then headphones with built-in mics will probably start shipping with this feature by default.

I am assuming that the attacker would have to increase the frequency so as to not be heard by the victim. Is it not possible to filter out these out-of-vocal-range frequencies in software, regardless of the effects on hardware?

The signals the transmitter generates aren't sounds; they're changing magnetic fields.

As for why it isn't induced in the "sound" wire: remember that every conductor carrying a current creates a magnetic field around itself. If the target is listening to music (which he most likely is if his earphones are plugged in), the magnetic field already present around the sound wire will "repel" the other magnetic fields, and thus no current will be induced.

It's a complicated effect.


wow...incredibly fascinating!!!! Very well written article.

This is unrelated, but I just saw on the news that an undisclosed "third-party hacker" assisted the FBI in breaking into the cell phone of a deceased terrorist involved in the San Bernardino terror attack. Originally, the FBI went to Apple to get them to unlock the phone because the FBI wasn't able to. Apple refused, was taken to court, and was ordered by a judge to comply. Apple appealed the decision, and it's still being worked out through the courts. But today the FBI came forward to announce that they don't need Apple's assistance anymore because they got this "third party" to do their dirty work. Apple is not happy about it because they feel this hack needs to be disclosed to them so they can patch it, but of course, the FBI is refusing. Interesting stuff. What do you think?

I might write my next Opinion Tuesday about it.

