👀 Are smart speakers putting your privacy at risk?

Voice is becoming a primary interface. In our home appliances, cars, mobile apps… voice is everywhere. We can turn off the lights, order takeout, buy our weekly groceries or listen to our favorite album, all by using one of the most natural interfaces of all: our voice! This is made possible thanks to smart speakers such as the Amazon Echo and Google Home. The convenience and fun these devices bring is boundless. However, smart speakers and privacy are a hot topic at the moment… just how safe is it to sit these unassuming devices on our bedside table or in our living room, listening to our every word?

What are smart speakers?

Voice recognition technology, like Apple’s Siri, has been around for a while. However, smart speakers such as Amazon’s Echo and Google’s Google Home are game changers. These speakers want to be your virtual assistant and transform the way you interact with your home, your other devices, even your favorite brands. Built on voice-activated artificial intelligence, smart speakers can be connected to third-party Internet of Things devices, such as your thermostat or car doors, enabling you to order and control things using your voice!

Smart speakers are equipped with a web-connected microphone that constantly listens for its trigger word. When a user activates a smart speaker to make a request, the device records or streams an audio clip of the command to a server, where the request is processed and a response is formulated. The audio clips are stored remotely, and with both Amazon’s and Google’s devices you can review and delete them online. However, it is not clear whether the data stays on the servers after being deleted from your account. Furthermore, at the moment the devices only record requests, but as they advance and we are able to do more with them, such as dictating emails to be sent, where will this data be stored?
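To make that flow concrete, here is a minimal sketch of the wake-word loop in Python. The names (capture_audio_chunk, send_to_cloud, the "alexa" trigger word) are hypothetical placeholders rather than any vendor's real firmware or API; the point is simply that audio is discarded locally until the trigger word is heard, and only the request that follows it is sent to the cloud.

```python
TRIGGER_WORD = "alexa"  # hypothetical wake word for this sketch

def capture_audio_chunk() -> str:
    """Stand-in for the microphone: returns the latest chunk of audio,
    already transcribed to text for the purposes of this sketch."""
    return input("(mic) ")

def send_to_cloud(request: str) -> str:
    """Stand-in for the encrypted upload to the vendor's servers,
    which process the request and return a response."""
    print(f"[uploading request: {request!r}]")
    return "OK, turning off the lights."

def wake_word_loop() -> None:
    while True:
        chunk = capture_audio_chunk()
        # Audio that does not contain the trigger word is dropped locally
        # and never leaves the device.
        if TRIGGER_WORD not in chunk.lower():
            continue
        # Only the command following the trigger word is sent for processing.
        request = chunk.lower().split(TRIGGER_WORD, 1)[1].strip()
        print(send_to_cloud(request))

if __name__ == "__main__":
    wake_word_loop()
```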

Your voice is only cloud-processed if you say a specific trigger word

Your privacy at risk?

So, can hackers exploit a backdoor in these devices and listen to what you’re saying? Well, nothing is impossible, but both Google and Amazon have taken precautions against wiretapping. Furthermore, the audio that is sent to their data centers is encrypted in transit, meaning that even if your network was compromised, it is unlikely that smart speakers could be used as listening devices. The biggest risk is someone getting hold of your Amazon or Google password and seeing your interactions, so make sure you use a strong password; you could even consider two-factor authentication!
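As a rough illustration of what "encrypted in transit" means, the sketch below uploads an audio clip over HTTPS using only Python's standard library, so the recording is wrapped in TLS before it leaves your network. The host name, path and file name are made up for the example; real speakers use their vendor's own endpoints and protocols.

```python
import ssl
import http.client

# Hypothetical endpoint, used only for illustration.
HOST = "voice.example.com"

def upload_clip(path: str) -> int:
    with open(path, "rb") as f:
        audio = f.read()

    # HTTPSConnection wraps the socket in TLS, so the audio bytes are
    # encrypted on the wire, even on a compromised local network.
    context = ssl.create_default_context()
    conn = http.client.HTTPSConnection(HOST, context=context)
    conn.request(
        "POST",
        "/v1/recognize",
        body=audio,
        headers={"Content-Type": "audio/wav"},
    )
    status = conn.getresponse().status
    conn.close()
    return status

if __name__ == "__main__":
    print(upload_clip("request.wav"))
```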

What can you do?

If the thought of the smart speaker being able to listen in at any moment makes you uneasy, you can mute it manually or change your account settings to make your device even more secure, such as password-protecting the purchase options available with the speaker or making the device play an audible tone whenever it is active and recording. You can also log onto your Amazon or Google account and delete your voice history (either individually or in bulk). To do this for your Google device, head over to myactivity.google.com, click the three vertical dots in the “My Activity” bar, and hit “Delete activity by” in the drop-down menu. Click the “All Products” drop-down menu, choose “Voice & Audio,” and click delete. For Amazon’s speaker, go to amazon.com/myx, click the “Your Devices” tab, select your Alexa device, and click “Manage voice recordings.” A pop-up message will appear, and all you need to do is click “Delete”. However, please note that deleting your history may affect the personalization of your experience. Check out this handy screencast for further instructions on deleting your Amazon Alexa account history.

Developers could also use privacy-by-design assistants, such as Snips, which process voice on the device itself rather than in the cloud. However, their use may be limited because these kinds of assistants have no internet connection.
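To illustrate the privacy-by-design idea, here is a toy, purely local intent matcher in Python. This is not Snips’ actual API, just a sketch showing that a spoken command can be interpreted entirely on the device, with no audio or text ever sent over the network.

```python
import re

# Toy intent table: pattern -> intent name. A real on-device assistant
# ships a trained NLU model rather than hand-written regexes.
INTENTS = {
    r"turn (on|off) the lights": "lights_toggle",
    r"set the thermostat to (\d+)": "thermostat_set",
    r"play (.+)": "play_music",
}

def parse_locally(utterance: str) -> dict:
    """Match the command against local patterns only; nothing leaves the device."""
    for pattern, intent in INTENTS.items():
        match = re.search(pattern, utterance.lower())
        if match:
            return {"intent": intent, "slots": match.groups()}
    return {"intent": None, "slots": ()}

if __name__ == "__main__":
    print(parse_locally("Turn off the lights"))
    # -> {'intent': 'lights_toggle', 'slots': ('off',)}
    print(parse_locally("Set the thermostat to 21"))
    # -> {'intent': 'thermostat_set', 'slots': ('21',)}
```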

The privacy / convenience tradeoff

At the rate the smart speaker and IoT industries are evolving, it looks like they are going to become more and more present in our daily lives, so it is essential to understand how they work and what you can do to prevent them from breaching your privacy. In conclusion, yes, theoretically smart speakers could pose a threat to privacy. However, they are not terribly intrusive: they only record when woken by a trigger word, and the likelihood of them picking up on a conversation they aren’t supposed to, and then someone intercepting it, is very slight. Google, Amazon and other companies have been logging our web activity for years; now it is starting to happen with voice snippets. In the pursuit of convenience, privacy is sometimes sacrificed, and in this particular tradeoff, convenience comes out on top for us!