
View Full Version : Smart speakers can be manipulated with inaudible voice commands



Irving
05-10-2018, 08:20
I know there are a lot of paranoid people on here, so here is an article to give more concrete reasons to be worried about emerging technology. The article is about smart speakers or other listening devices that will react to voice commands that cannot be heard by people.

https://mobile.nytimes.com/2018/05/10/technology/alexa-siri-hidden-command-audio-attacks.html

Excerpt:

BERKELEY, Calif. — Many people have grown accustomed to talking to their smart devices, asking them to read a text, play a song or set an alarm. But someone else might be secretly talking to them, too.

Over the past two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio.

CS1983
05-10-2018, 08:42
Moreover, it could be used to stage criminal activity so it appears that a person targeted by .gov did X themselves, with no alibi otherwise, leading to a conviction with no recourse.

Vocal critic of something and a thorn in the side of Uncle Scam? Why, you snuff/kiddie porn addict. AND you also visited known jihadist websites. AND you (insert here)? You're goin' to jail little fella!

Never mind that *you* never visited such content in your life.

Irving
05-10-2018, 08:51
I forgot to mention that while I don't have any smart speakers, I do have a smartphone, which amounts to the same thing. I don't think most of us will have much of a choice, as that technology will be integrated into future purchases whether we like it or not.

Gman
05-10-2018, 09:36
This is old news that was made public in September of last year. The microphones could pick up ultrasonic sounds that people can't hear. Most of the vendors of such speakers have added frequency filters to avoid this problem.

The original exploit required physical access to the devices, so if someone is in your home doing this, then you may have bigger problems.
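For the curious, the trick behind that original exploit (DolphinAttack) is that microphone hardware is slightly nonlinear, so voice amplitude-modulated onto an ultrasonic carrier gets demodulated back into the audible band inside the mic itself. Here's a toy numpy sketch of the principle; the frequencies, modulation depth, and nonlinearity are illustrative numbers, not the actual attack parameters:

```python
import numpy as np

fs = 192_000                       # sample rate high enough to represent ultrasound
t = np.arange(int(fs * 0.1)) / fs  # 100 ms of signal

# "Voice" baseband: a 1 kHz tone standing in for a spoken command.
baseband = np.sin(2 * np.pi * 1_000 * t)

# Amplitude-modulate it onto a 30 kHz carrier -- everything transmitted
# sits at 29-31 kHz, above what humans can hear.
carrier = np.sin(2 * np.pi * 30_000 * t)
transmitted = (1 + 0.5 * baseband) * carrier

# Model the microphone's slight nonlinearity with a small squared term.
received = transmitted + 0.1 * transmitted ** 2

# The squared term demodulates the envelope: energy reappears at 1 kHz.
spectrum = np.abs(np.fft.rfft(received))
freqs = np.fft.rfftfreq(len(received), 1 / fs)
peak_at_1k = spectrum[np.argmin(np.abs(freqs - 1_000))]
noise_floor = spectrum[np.argmin(np.abs(freqs - 5_000))]
print(peak_at_1k > 10 * noise_floor)  # baseband "command" recovered
```

Which is why the frequency filters the vendors added are an effective fix: cut the ultrasonic band before the nonlinearity can demodulate it, and the hidden command never appears.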

hollohas
05-10-2018, 09:46
The original exploit required physical access to the devices, so if someone is in your home doing this, then you may have bigger problems.

This. I would think inaudible commands would have to be transmitted from inside your home.

And if they figure out a way to transmit commands from outside my home, the moment my Amazon Echo tells me it's turning down my AC without me asking, it gets limited to music playback.

CS1983
05-10-2018, 09:48
The smart thing is not to attach one's life to the IoT.

Irving
05-10-2018, 09:53
Sounds like your information on this old news is outdated. Reading the article might be a good idea.

Irving
05-10-2018, 10:13
This. I would think inaudible commands would have to be transmitted from inside your home.

And if they figure out a way to transmit commands from outside my home, the moment my Amazon Echo tells me it's turning down my AC without me asking, it gets limited to music playback.

Great idea except whoops, the entire point of the article.


A group of students from University of California, Berkeley and Georgetown University showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.

This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list.

Great-Kazoo
05-10-2018, 10:52
Our issue is that people have become so dependent on electronic devices that they literally control their waking moments. From "Alexa, what time do the kids need to be picked up?" to "Alexa, check my credit card balance, then order X from Amazon."

The corded phone was replaced by the portable phone, then the cell phone. Writing letters home while overseas in the .mil went to email, then to this Instagram and Twitter stuff. We believe the "plan" is to get humanity so dependent on electronic devices, including VR, that no one leaves their home, thus ignoring, even worse than now, what's going on politically, both locally and worldwide.

Remember... they caught the Golden State Killer through DNA a family member submitted to one of those ancestry places. What's next?

Irving
05-10-2018, 11:03
I don't think there is an overall plan at all. It's the same thing as the tech revolution of the 1950s: marketing-driven in order to sell products.

Rucker61
05-10-2018, 11:06
[Alexa, fdisk c: return]

hollohas
05-10-2018, 14:33
I read it in full before my post and my point still stands. Did you miss this part?


While DolphinAttack has its limitations — the transmitter must be close to the receiving device — experts warned that more powerful ultrasonic systems were possible.

That warning was borne out in April, when researchers at the University of Illinois at Urbana-Champaign demonstrated ultrasound attacks from 25 feet away. While the commands couldn’t penetrate walls, they could control smart devices through open windows from outside a building.

Close to the device, not through walls or, at worst, from 25 feet away. They have to be right outside... while the windows are open.

Or... how are tampered-with music recordings or spoken text going to get into my home?

Maybe someone will hack a commercial on KOA while I'm listening on iHeartRadio on the Echo. A) I don't know if the Echo can hear itself. And B) if it can, and if it did pick up a hidden command, it would acknowledge that command and I'd hear it. If they mute it first so I can't hear the acknowledgement... well, then it won't hear the muted hidden command either.

So, as I said, my point stands. I'm not too worried about this one. What's the worst they can do, turn on my kitchen lights or have it tell me a joke? Or maybe they'll set a timer... scary stuff.

Gman
05-10-2018, 14:34
Sounds like your information on this old news is outdated. Reading the article might be a good idea.

I read it before I posted. It's still a re-tread.

Irving
05-10-2018, 14:42
How difficult would it be to set up an app, ringtone, song, one-minute video file, or some other simple media that could be purchased via whatever smart device, then embed the purchase instructions into some YouTube video that goes viral? Out of millions of views, a certain percentage of people will watch the video within hearing range of said device, and some smaller percentage won't catch the purchase. With enough views, I'd think the numbers would be in the scammer's favor, just like early bulk email scams.
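The back-of-the-envelope math on that works the same way bulk email spam did. Every rate below is made up, purely to illustrate the scale:

```python
# Toy expected-value math for the viral-video scenario (all rates hypothetical).
views = 10_000_000
near_device = 0.02      # fraction watched within hearing range of a smart speaker
command_lands = 0.10    # fraction of those where the hidden command triggers
unnoticed = 0.25        # fraction of victims who don't catch/cancel the purchase

victims = views * near_device * command_lands * unnoticed
print(int(victims))  # 5000 accidental purchases from one video
```

Even with tiny per-view odds, the volume does the work for the scammer.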

Gman
05-10-2018, 14:47
There are devices that are listening all the time. They don't have to be a smart speaker. They could be a laptop or a cell phone. They hear inaudible tones during commercials and feed the data back to the web, so they know whose eyes saw a specific ad.

If you are that worried about what people can find out about you by leveraging technology, you probably aren't posting here.

Irving
05-10-2018, 14:54
I'm personally not worried about it; well, a little, but not enough to do anything different. Seems like a lot of potential for people with bad intentions to mess with strangers, though.

mattiooo
05-10-2018, 16:07
Moreover, it could be used to stage criminal activity so it appears that a person targeted by .gov did X themselves, with no alibi otherwise, leading to a conviction with no recourse.

Vocal critic of something and a thorn in the side of Uncle Scam? Why, you snuff/kiddie porn addict. AND you also visited known jihadist websites. AND you (insert here)? You're goin' to jail little fella!

Never mind that *you* never visited such content in your life.

This already happens, and with simple hacking. You don't need elaborate tech exploits and workarounds to accomplish this, just a 13-year-old with the right skill set.

CS1983
05-10-2018, 16:18
This already happens, and with simple hacking. You don't need elaborate tech exploits and workarounds to accomplish this, just a 13-year-old with the right skill set.

Yes, but another vector for it to be accomplished more easily is a problem.

Imagine this technology being used in a broad-brush manner against, say, Congress or really any high profile target when directly hacking them is more difficult.
Why shoot a dart when you can simply mist the air with poison?

mattiooo
05-10-2018, 16:37
Yes, but another vector for it to be accomplished more easily is a problem.

Imagine this technology being used in a broad-brush manner against, say, Congress or really any high profile target when directly hacking them is more difficult.
Why shoot a dart when you can simply mist the air with poison?

When it evolves, it may well be. Right now, with the attacker having to be in the same location as the device, I think it's the more difficult approach, and I'd worry more about remote breaches. The good news is that we're constantly evolving the defenses as the technology evolves. The good guys are always catching up, but at least we keep the bad guys having to find new ways in.

Where there is a will, there is almost always a way. Especially if laws are not something you worry about.

Bailey Guns
05-10-2018, 17:02
I was listening to the radio today about someone "teaching" an AI computer to fabricate a video of someone saying something. They'd just feed all sorts of video and audio files of a person into a computer until they had enough data to produce a video of that person saying whatever the maker of the video wanted them to say. Apparently it's very difficult to determine it's not actually a real recording of the person speaking without some pretty sophisticated equipment and skills.

I can see where that could be a problem.

Irving
05-10-2018, 17:57
Radio Lab had an episode all about that.

Great-Kazoo
05-10-2018, 17:58
I was listening to the radio today about someone "teaching" an AI computer to fabricate a video of someone saying something. They'd just feed all sorts of video and audio files into a computer of a person and eventually have enough data to produce a video of that person saying whatever the maker of the video wanted them to say. Apparently it's very difficult to determine it's not actually a real recording of the person speaking without some pretty sophisticated equipment and skills.

I can see where that could be a problem.

There was a story on KDVR last night where one of those Google assistants or Alexas was demoing how someone would be unable to tell that the person they were speaking to was not a human.

mattiooo
05-10-2018, 18:33
There was a story on KDVR last night where one of those Google assistants or Alexas was demoing how someone would be unable to tell that the person they were speaking to was not a human.

https://mashable.com/2018/05/09/google-assistant-phone-call/#sjUKdz_Cikq7

Pretty impressive.