Alexa (Amazon), Google Home, and Siri (Apple) have proven their convenience and usefulness in our homes a lot of times. Smart assistants unlock doors, turn on lights, and even start some vehicles on the mere sound of your voice. But findings in the last couple of years and a few months ago have found major weaknesses.
Although Amazon and Google are taking action against vulnerabilities in their smart assistants, hackers are constantly trying to leverage the vulnerabilities. This way, they can listen to users’ every word and obtain sensitive information to use in future attacks.
According to David Emm of Kaspersky Lab, “Smart home devices get new functions every other day. So, users should know about the security issues they come with. If Alexa devices can transfer money to a person with the help of verbal command, criminals could intercept this information and then use it to scam money from the owner of the device. So, consumers need to be more if the smart devices they use at home. Since Alexa uses home Wi-Fi to work, owners must ensure the security of the network.”
He advises users to:
- Turn off the microphone of their devices.
- Use the settings of the device or add a password to prevent purchasing.
- Ensure device protection and safety to avoid the risk of a data leak.
- Also, change the wake word if u have a family member in your home with a name similar to Alexa.
Hackers can use both Amazon Alexa and Google Home smart assistants to listen to someone’s conversations without letting the user know. They can even trick hackers into handing them over sensitive information. And these attacks are not new. Previously, security researchers came across similar eavesdropping and phishing vectors affecting Amazon Alexa in April 2018, Alexa and Google Home devices in May 2018, and later on in Alexa devices in August 2018.
Amazon and Google have made a lot of efforts and continue to do so, but hackers find ways to exploit these smart assistants. So much so that a hacker recently got access to a child room’s security camera and using the camera assistance talked to her, which is alarming.
Now, the basic function of a smart assistant is to respond to commands by answering the user questions or completing the tasks required of them. Since most of this is done by verbally addressing the device, the internal microphone is the most important component and prone to hacking.
How do these hacks happen?
Hackers manipulate internal microphones. They trick the devices into executing tasks and responding even in the absence of verbal command. And hackers can accomplish this even from a distance.
According to ZDNet’s report, two security researchers at the Security Research Labs (SRLabs) found hackers using eavesdropping and phishing vectors to get access to the functions that can customize the demands from the user to the smart assistant, and how the assistant responds to it. The hacker places certain commands into the backend of a regular Alexa/Google Home app. The attacker can silence the assistant for long periods even though the assistant is active. The hacker then sends a phishing message that makes the user think that it had nothing to do with the smart assistant app.
The device then asks the user for its Amazon/Google password and sends a fake message feigning it to be from Amazon or Google asking for the password. After getting access to the home assistant, the hacker can listen to the user conversations and record them. The device might seem inactive, but it records everything. This works the same way as the Xnspy ambient recording app that hackers can use to gain access to someone’s phone activity. After installing it on your phone, the hacker can remotely turn on the microphone and record the surroundings. Applications like Xnspy aren’t just ambient recording apps but full-fledged spying apps that allow a hacker to access your text messages, call logs, emails, multimedia, locations, and a lot more.
The app works on both Android and iPhone, making your phones susceptible to hacking. What’s more? There is no way of knowing that spyware is working on your phone since it works silently in the background without showing itself in the applications list on the phone.
The attacks are increasing
According to recent news, the smart assistants we use to make our lives easier, hackers can attack them via laser beams. We all know that microphones work on basic technology. It converts sounds into electrical signals. A small plate present inside a microphone, called the diaphragm, moves when you speak into the microphone and creates electrical signals.
The internal software recognizes the command by translating the electrical signals to come up with the proper response. According to findings, light causes the same effect and can speak to these voice-activated devices, without making a sound. Microphones can respond to light the same way as to sound. And anything that acts on sound commands will respond to light commands too.
The smart assistant microphones are called microelectromechanical systems (MEMS) microphones. This form of attack is called light commands and controls the MEMS’ design. The attackers can trick them into producing electrical signals as if they are getting genuine audio by modulating them in the intensity of a light beam. The researchers launched inaudible commands by shining lasers on to the smart assistant devices from as far as 110 meters and even 360 feet.
In the real world, a criminal could shine a laser light onto a voice assistant he can see through a window, and command the device to unlock the doors from outside the house. He could even start vehicles remotely or make online purchases.
How are the tech giants dealing with it?
Both Amazon and Google are working together with researchers who found vulnerabilities within the smart assistants to boost security features. One way could be to cover the microphones or adding another to the other side, so the hackers find it hard to manipulate different areas of the device. So far, Apple and Facebook have not made any comments.
Although it is good to know that tech companies are paying attention to the issue, to improve the security of smart assistants, we must take safety measures on our own. Our phones also have smart assistants on them, and they are equally vulnerable. Ensuring the digital safety of our loved ones in this era of data leaks and identity thefts should be our primary concern. Also, to protect smart assistants from hacking, avoid placing your devices within proximity to a window or a clear path to the outside. Update your devices and check the security settings regularly. Also, if your device allows, use spoken PINs.