‘Hey Google, are you listening to me?’

9 September 2019


Last week I called the Coolblue customer service to arrange a return. Upon dialling the customer support number, I heard a friendly voice saying:

‘This call may be recorded for quality and training purposes’.
Well, of course I am happy to help new employees out and improve customer service, so I consent by staying on the line. Sounds easy, right?

In the age of digital assistants and the Internet of Things, consent is not always as clear-cut as in the example above, and we do not always know who is listening, or when.
The AI-powered virtual assistants of Google, Amazon, Apple and Microsoft were introduced to make life easier. Want to know what the weather will be today? Just ask. Is there going to be traffic on your way to work? Ask Google. To make things even better, the devices are built with machine-learning capabilities. At the moment, the error rate of facial and speech recognition in machine-learning products is even lower than the error rate of actual human beings (Brynjolfsson and McAfee, 2017). Smart assistants can learn your daily routine, play your favourite songs and call your grandmother, but at what cost?

Recently, the Belgian broadcaster VRT was able to obtain over 1,000 audio files from a Google contractor who was hired by the corporation to review audio captured by the Google Assistant from devices including smart speakers, phones and security cameras. After listening to an audio file in which a couple was talking with their son and baby grandchild, Tim Verheyden, a journalist at VRT, was able to locate the couple from the audio file. But to his surprise, the Google Assistant had recorded not only conversations that followed the 'Hey Google' trigger phrase, but also conversations captured at random. The smart devices recorded sensitive medical information, physical violence and even sexual intercourse (WIRED, 2019).

Privacy-by-design
These developments raise the question of how companies can clear up the privacy cloud currently hanging over digital assistants. For a start, the companies producing digital assistants should engineer the products with a privacy-by-design structure. For example, privacy-minded Apple retains voice queries but decouples them from your name or user ID. The company tags them with a random string of numbers unique to each user. Then, after six months, even the connection between the utterance and the numerical identifier is eliminated (The Guardian, 2019).
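To make the idea concrete, here is a minimal Python sketch of how that kind of decoupling could work: queries are stored under a random tag rather than a user ID, and the user-to-tag link is purged after the retention window. The class and field names are illustrative assumptions, not Apple's actual implementation.

```python
import time
import uuid

RETENTION_SECONDS = 6 * 30 * 24 * 3600  # roughly six months

class PseudonymousStore:
    """Stores voice queries under random tags instead of user IDs."""

    def __init__(self):
        self._pseudonyms = {}   # user_id -> (random tag, time the link was created)
        self._utterances = []   # (tag, utterance) pairs; never contains a user ID

    def record(self, user_id, utterance, now=None):
        now = time.time() if now is None else now
        tag, _created = self._pseudonyms.get(user_id, (None, None))
        if tag is None:
            # Random identifier, not derived from the user's name or account.
            tag = uuid.uuid4().hex
            self._pseudonyms[user_id] = (tag, now)
        self._utterances.append((tag, utterance))

    def purge_expired(self, now=None):
        """Delete user->tag links older than the retention window.

        The utterances themselves survive, but can no longer be traced
        back to a specific user once the link is gone.
        """
        now = time.time() if now is None else now
        self._pseudonyms = {
            user: (tag, created)
            for user, (tag, created) in self._pseudonyms.items()
            if now - created < RETENTION_SECONDS
        }
```

The key design point is that the utterance log and the identity mapping live apart, so deleting one small table severs the link while the training data remains usable.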

The ethics around digital assistants remain a topic of discussion. How much of your privacy are you willing to give up to improve AI in digital assistants?

References
Brynjolfsson, E., and McAfee, A. (2017). The business of artificial intelligence: what it can and cannot do for your organization. Harvard Business Review.
The Guardian (2019). Smart talking: are our devices threatening our privacy? Retrieved from: https://www.theguardian.com/technology/2019/mar/26/smart-talking-are-our-devices-threatening-our-privacy
WIRED (2019). Who's Listening When You Talk to Your Google Assistant? Retrieved from: https://www.wired.com/story/whos-listening-talk-google-assistant/


3 thoughts on “‘Hey Google, are you listening to me?’”

  1. Thank you for raising the discussion around voice assistants Sjoerd! To give an answer to your question: I believe that the privacy rules related to this topic need to become stricter. Furthermore, the details surrounding the data collection of these technologies will have to become more transparent. Research from Accenture shows that 40% of voice assistant users are concerned about who is listening and how their data is used (Accenture, 2019). I think that this is a valid concern, since a lot of consumers do not know that voice assistants are always listening at the local level. This is necessary in order for these technologies to be functional, but leads to significant risks concerning privacy. Voice assistants are programmed not to transmit any information until they are triggered by phrases such as "OK Google", "Hey Siri" and "Hello Alexa" (Voice Assistants and Privacy Issues, n.d.). However, it can be questioned whether we should believe that…

    Bibliography
    Accenture (2019). Reshape to relevance. Retrieved from: https://www.accenture.com/gb-en/insights/high-tech/reshape-relevance.
    Voice Assistants and Privacy Issues (n.d.). Retrieved from: https://www.termsfeed.com/blog/voice-assistants-privacy-issues.

  2. Hi Sjoerd!

    Just like Anne said, I think that the privacy rules related to this topic need to become stricter. In addition, Google should have disclosed to their customers that audio recordings can be stored and that there is a possibility that they will be used to improve the voice assistant. This way consumers could consider whether they think the benefits of the voice assistant outweigh the potential loss of their privacy.

    In my opinion, the benefits of the voice assistant do not outweigh the loss of my privacy. The several times that I have used a voice assistant to answer a question or perform a task, it did not work out: either I was misunderstood or no answer could be given to my question. Either way, I still needed to use my fingers to get the job done.

    On a side note, have you ever thought about the fact that US states were already seeking access to voice-assistant recordings in the name of crime prevention and national security? For example, last year a judge in New Hampshire made headlines by ordering Amazon to submit Echo recordings related to a double murder to investigators. Voice-assistant providers may therefore be caught in a very complicated position between their customers and the government, which makes this situation even more complicated.

    What do you think about this?
