“Light Commands”: Laser Attack Fools Virtual Assistants

Digital assistants were created to ease our lives by making many everyday tasks hands-free. But time and again, hackers and security researchers have found ways to compromise them (hands-free, of course!), as if the tech companies themselves weren’t doing enough harm to people’s privacy.

The latest addition is “Light Commands,” as its creators call it. A team of researchers managed to fool virtual assistants by pointing a specially crafted laser beam at various devices, including the Google Home, Amazon Echo, iPhone XR, Google Pixel 2, and Facebook Portal Mini. In other words, the technique can target all the popular assistants out there.

They found that “by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio.” This allowed the researchers to issue almost any command they liked.
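The core idea, amplitude-modulating an audio waveform onto a laser’s intensity, can be sketched in a few lines. This is a minimal illustration only: the function name, bias, and depth parameters are hypothetical and not taken from the researchers’ paper.

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz

def modulate_intensity(audio, bias=0.5, depth=0.5):
    """Map an audio waveform in [-1, 1] onto a non-negative laser
    intensity: a DC bias keeps the laser on, and the audio rides on
    top as small intensity variations (illustrative parameters)."""
    audio = np.clip(audio, -1.0, 1.0)
    intensity = bias + depth * audio / 2.0
    # Light intensity can't be negative, so clamp to a valid range.
    return np.clip(intensity, 0.0, 1.0)

# A 1 kHz test tone standing in for a spoken command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 1000 * t)
drive = modulate_intensity(tone)
```

A MEMS microphone struck by this beam would produce an electrical signal tracking the intensity variations, i.e., the original audio, which is exactly what the assistant then transcribes as a command.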

The issue doesn’t lie in the virtual assistants themselves; it stems from a vulnerability in the MEMS microphones that many of these devices use to pick up voice commands.

Most of us know that microphones are sensitive to sound waves. But as the researchers explain, under the right circumstances they can respond to light as well.

While accurately aiming a laser at a smart speaker sounds difficult, a determined attacker could do so from as far as 110 meters away, even through a window.

It should be noted that the speaker recognition feature, which identifies a user by their voice, was disabled during the Light Commands tests. However, according to the researchers, the assistants only match the trigger word (like “OK Google” or “Alexa”) against someone’s voice, not the entire command, so the attack can still be performed.

Also, the entire Light Commands setup costs around $600, so it’s unlikely that some random person walking down the street is pointing lasers into people’s homes.

As for protecting yourself: you won’t hear anything, but vigilant users may be able to spot the laser light reflecting off their smart devices, much like in the movies. A device waking up on its own could also be a warning sign.

The researchers have published their findings in a research paper, along with some suggested mitigations. For instance, devices could physically reduce the amount of light reaching the microphones, or use sensor fusion techniques that acquire audio from multiple microphones, since a single laser beam would only hit one of them.
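The sensor fusion idea rests on a simple observation: genuine sound reaches every microphone in an array, while a laser injects a signal into only one. A minimal sketch of such a check, with an illustrative function name and threshold not drawn from the paper, might compare channels pairwise:

```python
import numpy as np

def looks_like_laser_injection(channels, threshold=0.8):
    """Flag audio that appears on one microphone but not the others.

    channels: 2-D array with one row per microphone.
    Returns True if any pair of mics disagrees strongly, which is
    consistent with a single-point (e.g. laser) injection.
    Hypothetical sketch; threshold is illustrative.
    """
    channels = np.asarray(channels, dtype=float)
    for i in range(len(channels)):
        for j in range(i + 1, len(channels)):
            a, b = channels[i], channels[j]
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            # Normalized cross-correlation between the two channels.
            corr = float(a @ b) / denom if denom else 1.0
            if abs(corr) < threshold:
                return True  # mics disagree: possible injection
    return False
```

A real implementation would compare time-aligned, windowed signals rather than raw buffers, but the principle is the same: a command that only one microphone “hears” is suspicious.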
