Windows 10 News and info | Forum
Author Topic: Using Light Beams to Control Google, Apple, Amazon Assistants  (Read 7 times)
javajolt
« on: November 05, 2019, 05:52:17 PM »

Academic researchers found that certain microphones respond to light as if it were sound, allowing voice commands to be sent to voice-controlled (VC) devices like Google Home, Amazon Echo, Facebook Portal, smartphones, or tablets.

Dubbed Light Commands, the attack works from afar by shining a laser beam at microphones that use micro-electro-mechanical systems (MEMS), which convert the light into an electrical signal.

By modulating the intensity of the light beam, a MEMS microphone can be tricked into producing the same electrical signals it would produce in response to audio commands. With careful aiming and focusing of the laser, attacks succeeded from as far as 110 meters.
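The modulation step described above amounts to ordinary amplitude modulation: the audio waveform is normalized and superimposed on the laser driver's DC operating point, so the light intensity tracks the sound. The sketch below is only an illustration of that idea, not the researchers' code; the `modulate` helper, the bias and depth values, and the stand-in tone are all assumptions.

```python
import numpy as np

def modulate(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser's drive level with an audio waveform.

    audio: samples in [-1.0, 1.0]
    bias:  DC operating point (keeps the laser above its lasing threshold)
    depth: modulation depth; bias +/- depth must stay inside [0, 1]
    """
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio)) or 1.0  # avoid dividing by zero on silence
    return bias + depth * (audio / peak)

# A toy 1 kHz tone sampled at 16 kHz stands in for a recorded voice command.
t = np.arange(0, 0.01, 1 / 16000)
tone = np.sin(2 * np.pi * 1000 * t)
drive = modulate(tone)  # drive levels stay within [0.1, 0.9]
```

The intensity-modulated light hitting the MEMS diaphragm then induces the same electrical signal the microphone would produce for the original audio.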

Long-range attack

In their experiments, researchers from the University of Electro-Communications in Japan and the University of Michigan tested the attack on popular VC devices.

The voice recognition systems in Google Home, Nest Cam, Amazon Echo, Fire Cube TV, iPhone, Samsung Galaxy S9, Google Pixel, and iPad were tested from various distances.



A Light Command attack sends inaudible instructions to a voice-controlled device and makes it act on them. The researchers demonstrated that it can be used to open a garage door or to unlock the front door of a house.



No large investment is needed to pull this off, either. A low-cost setup used by the researchers consisted of a normal laser pointer, a Wavelength Electronics laser driver ($339), and a Neoteck NTK059 audio amplifier ($27.99). A computer that plays the recorded audio commands is also required.

Laser beams provide precise aiming, but the researchers showed that Light Commands attacks also work with a laser flashlight (Acebeam W30). From 10 meters, they were able to inject commands into Google Home.



In this flashlight setup, the light covers the target device completely. Such imprecise aiming, though, has its downsides: limited distance and the risk of hitting microphones on other devices.

For long-range attacks, additional gear is required to focus the beam on the right spot: a telescope, a telephoto lens, and a tripod for focus and accurate aiming.

Windows are not an obstacle as long as there is a direct line of sight between the source of the light and the target device.



Despite the double-pane glass window and windy conditions, the experiment was successful; reflections were negligible, the researchers write in the paper detailing the Light Commands injection attack.

To run the experiments, four commands were recorded: asking the time, setting the volume to zero, placing an order for a laser pointer, and opening a garage door. To each, the device's predefined wake-up phrase ("OK Google," "Hey Siri," "Alexa," "Hey Portal") was prepended.
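Building such a payload is just concatenating the wake-phrase audio, a short pause, and the command audio into one waveform before it is played into the modulator. A minimal sketch, assuming 16 kHz mono audio; the `build_payload` helper and the stand-in tones (used here instead of real recordings) are hypothetical.

```python
import numpy as np

SR = 16000  # sample rate in Hz, an assumption

def build_payload(wake, command, pause_s=0.3):
    """Prepend a wake phrase to a command, separated by silence."""
    silence = np.zeros(int(SR * pause_s))
    return np.concatenate([wake, silence, command])

# Stand-in tones in place of recordings of "OK Google" and a spoken command.
wake = np.sin(2 * np.pi * 440 * np.arange(int(0.5 * SR)) / SR)
command = np.sin(2 * np.pi * 330 * np.arange(int(1.0 * SR)) / SR)
payload = build_payload(wake, command)  # 1.8 s of audio in total
```

The resulting array would then be normalized and fed to the laser driver exactly like any other recorded command.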

Real-life limitations

Although this is a novel type of attack, a successful Light Commands attack is hard to imagine outside the controlled conditions of an experiment, so there is little reason for concern at the moment.

A threat actor has to consider limitations such as line of sight to the device, as well as barriers in the way, since light has trouble passing through obstructions such as fog or tinted windows.

Furthermore, the victim may be alerted by the visibility of the light beam, unless infrared is used - but additional gear is necessary in that case, and the audible response from the target device confirming execution of the command may still give the attack away.

The target device itself may also be a problem. A smart speaker sitting at a window is an easier target than a smartphone or tablet, which are designed for mobility; their owners may leave them in positions that offer no direct line of sight to the microphone.

The Light Commands research is the work of Takeshi Sugawara (University of Electro-Communications, Japan) and Benjamin Cyr, Sara Rampazzi, Daniel Genkin, and Kevin Fu (University of Michigan). Details are provided in their paper, "Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems" (PDF). A website has also been set up with an overview of this type of attack.

source
Powered by SMF 1.1.21 | SMF © 2017, Simple Machines
