The idea of home electronics randomly chuckling is unnerving to some, but the latest findings by researchers on hackers breaking into Alexa and eavesdropping on users should be scary to anyone with the virtual assistant sitting in their home.
“Voice squatting” could soon become a significant issue for Alexa users. Unfortunately, these hacks are virtually unnoticeable.
The concept of voice squatting, uncovered by research groups from Indiana University Bloomington, the University of Virginia and the Chinese Academy of Sciences, involves hackers using skill markets to get malicious skills onto Alexa devices. Skill markets are the open marketplaces where third-party developers upload their services. The malicious skills are given names that sound incredibly similar to already popular apps and services.
Voice squatting can happen in a split-second, just by speaking a little too fast. Here’s a quick explanation of voice squatting techniques from Forbes:
“In one attack, the researchers registered an ‘attack skill’ called ‘rap game’ that sounded very similar to the legitimate skill ‘rat game.’ They showed that when someone asked for the latter from an Amazon Echo device, they got the former instead. They went a little further, doing what they dubbed ‘word squatting.’ For this, they created an evil skill dubbed ‘rat game please.’ This would, of course, exploit users’ etiquette when speaking with Alexa by strategically placing the word ‘please’ in the application’s title. Similar techniques were successfully used on Google Home.”
So how did researchers test whether Alexa was susceptible to voice squatting and word squatting? They uploaded four skills to Amazon and another to a Google Home Mini that sounded similar to legitimate services. These skills contained no malicious content or hacking tools. The researchers wanted to see whether the voice-activated assistants would open the bogus service instead of the intended app. Sure enough, Alexa and Google launched the fake service more than 50% of the time.
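To get a feel for why "rap game" gets confused with "rat game," here is a minimal sketch of a name-similarity check. This is not the researchers' method; it uses plain edit distance as a crude stand-in for the acoustic similarity a speech recognizer actually deals with, and `is_squatting_candidate` is a hypothetical helper name.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def is_squatting_candidate(new_skill: str, existing_skill: str,
                           max_distance: int = 1) -> bool:
    """Flag a new skill name that is within a few edits of an existing one."""
    return levenshtein(new_skill.lower(), existing_skill.lower()) <= max_distance

print(is_squatting_candidate("rap game", "rat game"))  # True: one letter apart
```

A single-character swap is all it takes for two names to be near-indistinguishable when spoken quickly, which is exactly the gap the attack skills exploited.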
The researchers then went all James Bond to prove that hackers could record long stretches of private conversation for hours and, in some cases, indefinitely.
“To add secret recording functionality, the academics used the ‘reprompt’ function. That feature allows a skill to keep running when it doesn’t receive a response, as long as a notice is issued to users via an audio or text file. The researchers abused this by creating a long, silent audio file as a reprompt so the user wouldn’t receive any noticeable warning the mic was still recording. Their rogue skill was able to record for 102 seconds on Alexa and 264 seconds on Google. The researchers said that if the user continued talking, even if they weren’t speaking to the home assistants, the recording could continue ‘indefinitely.’”
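The abuse described above comes down to one response payload. Here is a hedged sketch, in the shape of a standard Alexa Skills Kit JSON response, of how a silent audio clip in the reprompt field could keep the session (and the microphone) open with no audible cue; the audio URL is a placeholder, not anything from the research.

```python
import json

# Placeholder URL standing in for the long, silent clip the researchers describe.
SILENT_CLIP = "https://example.com/silence-90s.mp3"

response = {
    "version": "1.0",
    "response": {
        "outputSpeech": {"type": "PlainText", "text": "Okay."},
        "reprompt": {
            # The "notice" the platform requires is satisfied by audio the
            # user cannot actually hear, so no warning is perceived.
            "outputSpeech": {
                "type": "SSML",
                "ssml": f"<speak><audio src='{SILENT_CLIP}'/></speak>",
            }
        },
        # Keeping the session open is what keeps the device listening.
        "shouldEndSession": False,
    },
}

print(json.dumps(response, indent=2))
</imports>
</imports>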
Voice squatting is much more dangerous to users, according to the researchers, because these skills can run indefinitely without the user noticing anything is wrong.
“Unlike smartphone apps,” the researchers wrote, “skills run without any need for further user checks, such as for permissions, making them a ripe target for malicious use.”
Researchers took their voice squatting findings to Amazon and Google back in April. Both companies said they “already had protections in place that tried to detect malicious skills.” The research teams remain skeptical that those protections will stop voice squatting any time soon.