Fooling ‘Smart’ Voice Assistants
The media is full of advertisements for new-age, know-all, ever-helpful voice assistants: Amazon's Alexa, Apple's Siri and Google Assistant. "Alexa, what is the capital of Canada?" asks the girl in the ad. And bang comes the answer, leaving the common viewer hypnotised. How can a small speaker-like device know these answers? One may well wonder, before thinking of buying such an assistant for oneself. But then, it could also mean inviting trouble into your home. Sounds bizarre, right? 
 
The basic issue is that all these smart assistants work primarily on voice commands, and that is exactly what makes them vulnerable to manipulation. For example, Bollywood superstar Amitabh Bachchan features in almost every other advertisement on television. We hear his deep baritone marketing the product or service in an impressive manner. Well, most of the time. However, the voice may not be Mr Bachchan's at all; a voiceover artiste may have rendered a modulated imitation for the advertisement. Mimicry artist and singer Sudesh Bhosale often mentions how Mr Bachchan was impressed with his voice in the song 'Jumma Chumma De De'.
 
The problem is that you cannot fully depend on your ears to distinguish a genuine voice from a rendered or modulated one. Researchers from the University of California, Berkeley, and Georgetown University are working on these vulnerabilities of smart voice assistants. A paper published in May 2018 by researchers from the University of California claims that they could embed voice commands directly into recordings of songs, music or spoken text. This means that while we hear only what is being played on the speaker of the smart assistant device, the hidden command would be picked up by the device alone. So, while you are listening to some music, your Alexa may add or remove items from your shopping list. 
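To make the idea concrete, here is a toy sketch in Python of how such a 'hidden command' attack works in principle. The 'recogniser' below is just a random linear scorer, not a real speech-recognition model, and the perturbation step is the standard fast-gradient-sign trick; the actual research optimised perturbations against full neural speech-to-text systems. Everything in the snippet is a stand-in for illustration.

```python
import numpy as np

# Toy stand-in for a speech recogniser: a linear scorer over raw audio
# samples. (Real attacks optimise against a full neural speech-to-text
# model; this toy only illustrates the principle.)
rng = np.random.default_rng(0)
n = 16000                      # one second of 16kHz audio
w = rng.standard_normal(n)     # "recogniser" weights: score > 0 => command heard

# The audible carrier: a plain 440Hz tone standing in for a song.
song = 0.5 * np.sin(2 * np.pi * 440 * np.arange(n) / 16000)

def score(x):
    return w @ x               # toy confidence that the hidden command is present

# Fast-gradient-sign-style attack: nudge each sample in the direction
# that raises the command score, but cap the perturbation far below the
# carrier's volume so a listener still hears only the song.
eps = 0.002                            # perturbation budget (inaudibly small)
perturbation = eps * np.sign(w)        # gradient of (w @ x) w.r.t. x is w
adversarial = song + perturbation

print("carrier score:    ", score(song))
print("adversarial score:", score(adversarial))
print("max sample change:", np.abs(adversarial - song).max())
```

The toy makes the same point the researchers exploit: a change far too small for a human ear to notice can still swing the machine's decision.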
 
The threat, however, will not remain at this level. According to a report in The New York Times (NYT), some researchers from the US and China have demonstrated in their labs how they can secretly activate the artificial intelligence (AI) modules in smartphones and smart speakers to dial phone numbers or open websites. It also means that similar hacking methods could be used, without the knowledge of the owner or user of a smart home device, to unlock doors, transfer money and buy items online. The best, or worst, part is that it can be done simply by playing a song over the radio with an embedded voice command! 
 
Nicholas Carlini, a fifth-year PhD student in computer security at the University of California, Berkeley, and one of the researchers, told NYT that there was no evidence that the techniques used by them had left their labs, but that it may be only a matter of time before someone starts exploiting them. 
 
Earlier, in 2016, students from these two universities showed how they could hide voice commands in white noise played over speakers or through YouTube videos to get a smart device to turn on airplane mode on a phone or open a website. 
 
The NYT report says, “These deceptions illustrate how artificial intelligence — even as it is making great strides — can still be tricked and manipulated. Computers can be fooled into identifying an airplane as a cat just by changing a few pixels of a digital image, while researchers can make a self-driving car swerve or speed up simply by pasting small stickers on road signs and confusing the vehicle’s computer vision system. With audio attacks, the researchers are exploiting the gap between human and machine speech recognition.”
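The 'few pixels' trick the NYT quote mentions can also be sketched in toy form. The classifier below is a random linear scorer rather than a real vision model, and the labels are arbitrary; published attacks of this kind target trained neural networks. The sketch only shows why so few changes can suffice: a handful of highly weighted inputs can outvote everything else.

```python
import numpy as np

# Toy "image classifier": a linear scorer over a flattened 28x28 image.
# (Illustrative only; real few-pixel attacks target trained networks.)
rng = np.random.default_rng(1)
w = rng.standard_normal(28 * 28)

def predicted_label(img):
    return "cat" if img.flatten() @ w > 0 else "airplane"

img = rng.uniform(0, 1, (28, 28))      # a random "photo"
original = predicted_label(img)
print("before:", original)

# Greedy attack: push the most influential pixels to whichever extreme
# (0 or 1) moves the score towards the opposite label, stopping as soon
# as the predicted label flips.
push_up = original == "airplane"       # need a higher score to become "cat"
flat = img.flatten()                   # .flatten() returns a copy
changed = 0
for i in np.argsort(-np.abs(w)):       # pixels in order of influence
    flat[i] = 1.0 if (w[i] > 0) == push_up else 0.0
    changed += 1
    if predicted_label(flat.reshape(28, 28)) != original:
        break
print(f"after changing {changed} pixels:", predicted_label(flat.reshape(28, 28)))
```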
 
When asked about security vulnerabilities, Amazon and Google told the newspaper that their voice assistant systems are secure. Maybe. But, as we know, modulating one's voice, or fooling someone with another person's voice, is a game we have all played since childhood. Add to this the voice-recognition systems used by voice assistants and you will understand how easy it is to fool them too. 
 
Last year, an online advertisement from Burger King caused a stir in the US. The ad deliberately asked, "OK, Google, what is the Whopper burger?" And all nearby Android devices with voice-enabled search responded by reading from the Whopper's page on Wikipedia! The ad was pulled after viewers started editing the Wikipedia page for comic effect. Tavish Vaidya, a researcher at Georgetown University who wrote one of the first papers on audio attacks, titled it 'Cocaine Noodles' because devices interpreted that phrase as 'OK, Google'!
 
Another issue with these 24x7 smart assistants is that they pick up every voice around them, analyse it and act on it if it turns out to be a command from the owner or an authorised user. However, you will never know whether your smart assistant is picking up every sound in the room and passing it on to some remote server. 
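In outline, an always-on device runs a loop like the toy one below. All names here are hypothetical stand-ins; real assistants keep the wake-word detector on the device and are supposed to stream audio to the cloud only after the wake word fires. The concern raised above is precisely that a user cannot verify, from the outside, that the streaming step is gated as promised.

```python
import itertools

def microphone_chunks():
    # Hypothetical stand-in for a live microphone: yields audio chunks.
    samples = ["music", "chatter", "alexa", "add milk to my list", "chatter"]
    yield from itertools.cycle(samples)

def detected_wake_word(chunk):
    return chunk == "alexa"             # toy on-device wake-word detector

def stream_to_cloud(chunk):
    print("-> sent to server:", chunk)  # the step users must take on trust

listening = False
for chunk in itertools.islice(microphone_chunks(), 10):
    if listening:
        stream_to_cloud(chunk)          # whatever follows the wake word leaves the room
        listening = False
    elif detected_wake_word(chunk):
        listening = True                # device "wakes up"; the next chunk goes out
```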
 
To sum up, do not fall prey to any voiceover or so-called smart assistant device. Not yet. If you already have one, make sure it is secured and cannot easily be manipulated by other voices. Also, be careful about what you say near your smart assistant, as you will never know who might be listening.