You Can Give Voice Commands To Smart Devices Using Light

The laser study was conducted by researchers at the University of Electro-Communications in Tokyo and the University of Michigan, who detail their work in a new paper, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems".

Smart speakers can be hijacked this way from up to 350 feet (~107 m) away. The study also lists other voice-controlled products, including the Facebook Portal Mini, Amazon Fire TV Cube, EcoBee 4, iPhone XR, iPad (6th generation), Galaxy S9, and Google Pixel 2. The paper is available through a new website dedicated to explaining these so-called Light Commands, which use lasers to feed smart speakers bogus commands: "Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio".
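
To make the quoted mechanism concrete, here is a minimal sketch of amplitude-modulating a light source's intensity with an audio waveform. It assumes a diode laser whose optical power is roughly linear in drive current above threshold; the constants (`BIAS_CURRENT_MA`, `MOD_DEPTH`) are hypothetical placeholders, not values from the paper.

```python
import numpy as np

SAMPLE_RATE = 44_100     # audio sample rate, Hz
BIAS_CURRENT_MA = 200.0  # hypothetical DC bias for the laser diode, mA
MOD_DEPTH = 0.5          # fraction of the bias swung by the audio signal

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map a normalized audio waveform (-1..1) onto a laser drive current.

    Above threshold, a diode laser's optical power rises roughly linearly
    with drive current, so the beam's intensity carries the audio envelope.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT_MA * (1.0 + MOD_DEPTH * audio)

# Example: a 1 kHz tone standing in for a recorded voice command.
t = np.arange(0, 0.01, 1.0 / SAMPLE_RATE)
drive = audio_to_drive_current(np.sin(2 * np.pi * 1000.0 * t))
```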

Voice-activated digital assistants can be remotely hijacked by lasers from as far as 350 feet away and made to order products, start cars, and otherwise drive a smart-home owner insane, researchers have discovered. The paper also suggests defenses: for example, the voice-controlled system could ask the user a simple randomized question before executing a command.
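
A minimal sketch of such a challenge, assuming hypothetical `speak` (text-to-speech) and `listen` (speech-recognition) hooks into the assistant's audio pipeline:

```python
import random

def confirm_sensitive_command(command: str, speak, listen) -> bool:
    """Gate a sensitive command behind a randomized spoken challenge.

    An attacker injecting audio through a laser cannot hear the question,
    so a correct answer is strong evidence that a real user is present.
    """
    a, b = random.randint(2, 9), random.randint(2, 9)
    speak(f"Before I {command}, tell me: what is {a} plus {b}?")
    answer = listen(timeout_seconds=5)  # e.g. returns "12", or None on silence
    return answer is not None and answer.strip() == str(a + b)
```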

The issue stems not from the virtual assistants themselves but from a vulnerability in the MEMS microphones that many of these devices use to pick up voice commands. Even so, shooting a beam of light at a device's microphone won't give the person with the laser full and instant control over a smart home's devices. Smart speakers typically don't ship with any user-authentication features turned on by default; Apple's devices are among the few exceptions, and they required the researchers to devise a workaround for that protection.
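
As a toy illustration of why the microphone is the weak point, here is a sketch that models a MEMS microphone's output as a mix of sound pressure and incident light intensity. The coupling constant is hypothetical, not a measured value from the study.

```python
import numpy as np

LIGHT_COUPLING = 0.8  # hypothetical light-to-signal gain, not a measured value

def mems_output(acoustic: np.ndarray, light: np.ndarray) -> np.ndarray:
    """Toy model of a MEMS microphone that also responds to light.

    Both inputs are normalized waveforms. The electrical output mixes the
    two contributions, so downstream processing cannot tell whether a
    signal came from sound pressure or from a modulated light beam.
    """
    light_ac = light - light.mean()  # the audio path is AC-coupled
    return acoustic + LIGHT_COUPLING * light_ac
```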

Security researchers from the University of Michigan and the University of Electro-Communications in Tokyo have demonstrated a new technique, called "Light Commands", that can take over voice assistants, including Amazon's Alexa, Google Assistant, Facebook Portal, and Apple's Siri, with a laser beam. The researchers did the ethical thing and warned device manufacturers of the vulnerability, including Amazon, Apple, Google, and even Tesla and Ford, whose cars could be controlled remotely. This isn't the first flaw found in these supposedly smart assistants, and it surely won't be the last.

Smart speakers like Google Home (Nest), Apple HomePod, and Amazon Echo are constantly listening using local audio processing, but they only "wake up" when someone says the trigger phrase.
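
A minimal sketch of that gating structure, where all four callbacks (`capture_frame`, `detect_wake_word`, `transcribe`, `handle`) are hypothetical placeholders for a device's real audio pipeline:

```python
def assistant_loop(capture_frame, detect_wake_word, transcribe, handle):
    """Always-listening loop: audio stays local until the trigger phrase.

    The point is the gating structure, not the components: frames are
    processed continuously, but nothing is acted on (or sent anywhere)
    until the on-device wake-word detector fires.
    """
    while True:
        frame = capture_frame()      # short buffer of microphone samples
        if detect_wake_word(frame):  # local keyword matching
            utterance = capture_frame()    # record the follow-on speech
            handle(transcribe(utterance))  # only now is a command executed
```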

A Google spokesperson said the company is closely reviewing the research. Amazon did not respond to a request for comment by the time of publication. As for smartphones and tablets, many of them will only respond to their owner's voice, making this exploit significantly harder to carry out on those devices. Attempting a laser-based attack also requires specialized equipment, although much of it is readily available on Amazon and isn't particularly expensive. Assuming a smart speaker is visible from a window, attackers could use Light Commands to unlock smart locks, open garage doors, and unlock car doors. Light-based command injection may change the equation for anyone who assumed an attacker had to be within earshot.
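
The "owner's voice" check above is essentially speaker verification. A minimal sketch, assuming a hypothetical `embed` function that maps audio to a fixed-length voice embedding; the threshold is illustrative, not from any vendor's implementation:

```python
import numpy as np

MATCH_THRESHOLD = 0.75  # hypothetical cosine-similarity cutoff

def accept_command(audio, enrolled: np.ndarray, embed) -> bool:
    """Accept a command only if the voice matches the enrolled user.

    Rejecting unfamiliar voices raises the bar for injected commands,
    though a replayed recording of the owner could still pass.
    """
    e = embed(audio)
    score = float(np.dot(e, enrolled) /
                  (np.linalg.norm(e) * np.linalg.norm(enrolled)))
    return score >= MATCH_THRESHOLD
```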
