Researchers have discovered a light-based security flaw in voice assistants such as Apple's Siri, Amazon's Alexa, and Google Assistant that leaves them vulnerable to laser-injection attacks. By injecting unauthorized commands into smartphones and smart speakers, attackers can make the devices do things like unlock doors, shop online, and even start vehicles.
The attacks, codenamed "Light Commands", were discovered by security researchers from the University of Electro-Communications in Tokyo, Japan, and the University of Michigan in the United States. They explained that the new attack works by injecting audio signals into microphones using laser light, from distances of up to 110 meters. It exploits a vulnerability in MEMS microphones (micro-electro-mechanical systems) that causes them to respond to light unintentionally, just as they would to a voice.
The researchers said in their paper: "By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio." However, there are no indications so far that the attack has been exploited maliciously in the wild.
Although the attack requires a direct line of sight between the laser and the target device, it highlights the risk of remotely controlled, voice-operated systems that lack any form of authentication, such as a password. Most disturbing, according to the researchers, is that these light commands can be issued across buildings and even through closed glass windows.
MEMS microphones contain a small, compact plate called the diaphragm which, when struck by sound waves, translates them into an electrical signal that is then interpreted as actual commands. What the researchers found is a way to encode sound by adjusting the intensity of a laser beam, deceiving the microphones into producing electrical signals in the same way real audio would.
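The encoding described above is essentially amplitude modulation: the audio waveform is superimposed on the laser's steady operating intensity. The following is a minimal sketch of that idea, not the researchers' actual tooling; the function name, the `bias` and `depth` parameters, and the normalization convention are illustrative assumptions.

```python
import numpy as np

def modulate_laser_intensity(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser's intensity with an audio waveform.

    audio: samples normalized to [-1, 1]
    bias:  the laser's steady (DC) operating intensity, as a fraction of full power
    depth: modulation depth; bias +/- depth should stay within [0, 1]
    """
    audio = np.asarray(audio, dtype=float)
    intensity = bias + depth * audio
    # Physical intensity can be neither negative nor above full power.
    return np.clip(intensity, 0.0, 1.0)

# Example: modulate a 1 kHz tone sampled at 16 kHz onto the beam.
fs = 16_000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
intensity = modulate_laser_intensity(tone)
```

When this intensity signal drives a laser aimed at a MEMS microphone, the diaphragm's response tracks the modulation, so the microphone outputs roughly the same electrical signal the original tone would have produced acoustically.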
Attackers can therefore mount the attack with little more than a laser pointer, a laser driver, and software to play the sound, hijacking the voice assistant and issuing commands to Alexa, Siri, or Google Assistant without any action by the victim. To make the attack stealthier, hackers can use an infrared laser, which is typically invisible to the naked eye.
The researchers are still working to determine exactly why MEMS microphones respond to light. For now, they attribute it to "a semantics gap between the physics and specifications of MEMS".
The researchers said they tested the attack against a variety of devices that use voice assistants, including the Google Nest Cam IQ, Amazon Echo, and Facebook Portal, as well as the iPhone XR, Samsung Galaxy S9, and Google Pixel 2 phones. But they warn that any system that uses MEMS microphones may be at risk.