In old movies, especially spy films, we would often see someone hiding a microphone to record conversations. Later, a concealed mobile phone with an auto-answering feature was shown serving the same purpose. But with the new generation of smart voice assistants occupying advertising space and, thus, our homes, today's spy may not need to go to any trouble to plant a microphone or phone to listen in on private conversations.
As I mentioned in an earlier article on smart voice assistants, or speakers, many incidents have been reported of these devices secretly recording private conversations or even falling prey to voice-command hijacking by outside sources. Smart speakers can be thought of as a subset of home assistants: voice-activated devices in your home.
The ever-helpful voice assistants offered by Amazon (Alexa), Apple (Siri), Google (Google Assistant) and Microsoft (Cortana) have been found to listen to every voice in their vicinity. While these devices may do this to respond quickly to a voice command, the possibility of such 'always-on', Internet-connected devices recording every voice and transmitting it cannot be ruled out.
And, if you have connected your mobile phone to such a smart voice assistant, the device could place a call to someone, or share something with anyone listed as a contact in your phonebook.
In Mumbai's Chandivali area, an African grey parrot was found placing phone calls to his owner through Alexa. In fact, the parrot, named 'Sniper', also commands the smart speaker to play music or songs that he likes. “The pet bird hops close to Alexa and calls out in his rather raspy voice, ‘Alexa’, play ‘The Riddle’. And, as if on cue, digital assistant Alexa starts blaring out Sniper’s favourite song,” says a report in the Times of India.
The problem is that neither you nor these smart assistants can fully rely on ears or voice-recognition technology to distinguish a genuine voice from a rendered or modulated one. The 'Sniper' example confirms this: Alexa, in this case, had no idea whether the voice command came from a human or a bird.
Researchers from the University of California, Berkeley and Georgetown University have been studying these vulnerabilities of smart voice assistants. In a paper published in May 2018, the researchers claimed they could embed voice commands directly into recordings of songs, music or spoken text.
This means that while we hear only what is being played on the smart assistant's speaker, the hidden command would be 'heard' only by the device. So, while you are listening to some music, your Alexa might be adding or removing items from your shopping list, or even buying music albums on Amazon.
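To make the general idea concrete, here is a toy sketch in Python (the function name, gain value and signals are illustrative, not taken from the paper). It merely overlays a quiet 'command' track onto louder music; the actual research went much further, computing adversarial perturbations against the speech-recognition model itself so that the embedded command is effectively inaudible to humans.

```python
# Toy illustration only: mix a quiet second track into music with NumPy.
# The published attack does NOT work this simply; it optimises adversarial
# perturbations against the speech-recognition model. This sketch just
# shows the basic notion of one audio signal hiding beneath another.
import numpy as np

SAMPLE_RATE = 16000  # 16 kHz, a common rate for speech recognition


def mix_hidden_track(music: np.ndarray, command: np.ndarray,
                     command_gain: float = 0.05) -> np.ndarray:
    """Overlay `command` onto `music` at a small fraction of its volume."""
    # Zero-pad the shorter signal so both arrays align sample for sample.
    n = max(len(music), len(command))
    music = np.pad(music, (0, n - len(music)))
    command = np.pad(command, (0, n - len(command)))
    mixed = music + command_gain * command
    # Keep the result within the valid [-1.0, 1.0] range for audio samples.
    return np.clip(mixed, -1.0, 1.0)


# Example with synthetic stand-ins for real audio (one second each):
music = np.sin(2 * np.pi * 440 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
command = np.random.uniform(-1, 1, SAMPLE_RATE)  # stand-in for recorded speech
mixed = mix_hidden_track(music, command)
```

A human listener would mostly hear the louder music track, which is why a command buried in audio can escape notice even though the device's microphone still receives it.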
Concerns about smart speakers have multiple triggers, the latest being an Amazon Alexa user in Germany who received 1,700 audio files belonging to a person he had never met.
A woman from Oregon in the US was shocked last year when the Amazon Echo device at her Portland home recorded a private conversation and then shared it with one of her husband's employees in Seattle. Amazon later clarified that Alexa had mistakenly heard a series of commands and sent the recording as a voice message to the employee.
The threat is very real, with more and more consumers getting hooked on always-on, Internet-connected smart home devices.
According to a recent Forrester report, 'Secure the Rise of Intelligent Agent', the security of smart voice assistants like Alexa, Cortana, Google Assistant and Siri is questionable. In the report, Amy DeMartine and Jennifer Wise write, “Alexa doesn't currently authenticate or authorise individuals who access it, leaving a company's Alexa skills unprotected from anyone who can remember another user's commands."
While users can mute smart speakers and delete past recordings, let us not forget that a copy of every conversation you have had with the voice assistant may already have been stored by the manufacturer. Apple stores Siri queries, but these are not associated with the user's Apple ID or email address, and the company deletes such queries and their associated identifiers after six months.
Both Amazon and Google, however, save voice histories until the customer deletes them. Microsoft requires Cortana users to manage their own data-retention preferences in the Cloud and on their devices.
Google-owned YouTube is now offering its music streaming service free on Google Home and other Assistant-powered smart speakers. Aiming to take on the likes of Spotify, Amazon, too, has launched a new ad-supported music streaming service in the US, available through the company's line-up of Echo speakers and other Alexa devices.
For both these offerings from Google and Amazon, the conversations recorded by your smart speaker would be very helpful in building your profile, which, in turn, can be used for targeted offerings from these tech giants.
So what can you do?
Think twice about whether you really need a 'smart voice assistant' that may record every word spoken in your home. If your privacy matters more to you, do not buy such smart speakers.
If you have already bought one and are using it, keep it muted (or switched off) when not in use. Also, try to configure individual user profiles.