Alexa Can Hear Commands You Can’t, Which Hackers Could Exploit

If you didn’t already know…

Your smart speaker can hear sounds that humans can't, meaning attackers could trigger a command without you ever noticing. It's already happening in labs.

Here’s Craig S. Smith, writing for the New York Times:

Over the last two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online—simply with music playing over the radio.

Imagine a car loudly playing music that’s also undetectably asking Alexa and Google Home to unlock the front door. I’m sure you can dream up other scenarios.
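For the curious: the research the Times is describing (the best-known example is the "DolphinAttack" paper out of Zhejiang University) works by amplitude-modulating a voice command onto an ultrasonic carrier above roughly 20 kHz, the upper limit of human hearing. Nonlinearity in a smartphone or smart speaker's microphone hardware demodulates the envelope back into the audible band, so the device "hears" the command while you hear nothing. Here's a minimal NumPy sketch of the modulation step only; the carrier frequency and sample rate are illustrative choices, not values from the paper.

```python
import numpy as np

SAMPLE_RATE = 96_000  # Hz; must exceed twice the carrier frequency (Nyquist)
CARRIER_HZ = 25_000   # above the ~20 kHz ceiling of human hearing

def modulate_ultrasonic(command: np.ndarray,
                        sample_rate: int = SAMPLE_RATE,
                        carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Classic AM: shift the command's spectrum up around an ultrasonic carrier.

    The envelope (1 + m(t)) carries the hidden command; a nonlinear
    microphone can recover it, but human ears cannot.
    """
    t = np.arange(len(command)) / sample_rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return (1.0 + command) * carrier

# Stand-in for a spoken command: a 440 Hz tone, clearly audible on its own.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE  # one second of samples
command = 0.5 * np.sin(2 * np.pi * 440 * t)
signal = modulate_ultrasonic(command)

# Verify that essentially all the transmitted energy sits above 20 kHz.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
audible_share = spectrum[freqs < 20_000].sum() / spectrum.sum()
print(f"audible-band share of energy: {audible_share:.4f}")
```

The spectrum check is the whole point: the 440 Hz "command" ends up as sidebands around 25 kHz, so nothing in the transmitted signal falls in the audible band. Actually defending against this is harder than demonstrating it, which is why the fixes below are about authentication, not audio filtering.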

Don’t panic: there’s no evidence anyone is using these tricks outside of the lab right now, and both Amazon and Google are working on security issues like this. If you’re concerned, consider training Alexa to recognize your voice or setting up multiple accounts for Google Home—you can limit certain functionality to recognized voices only. And you should probably PIN-protect voice purchasing on your Echo regardless.

via Alexa Can Hear Commands You Can’t, Which Hackers Could Exploit
