Voice-Enabled Devices and Data Privacy: Lessons Learned from Amazon
“Alexa, what’s the weather like in Nashville today?” Amazon’s Alexa, Apple’s Siri, the Google Assistant – the list of voice assistants and voice-enabled devices seems to just keep growing. “Hey Google, could you set an alarm for 8:00 AM tomorrow?” Their basic goal is to make our lives easier, right? Using natural language processing, voice assistants can complete all kinds of tasks: streaming music, setting alarms, taking notes, ordering products, controlling smart home devices, and integrating with other applications. These assistants and devices live in the bedrooms, kitchens, and living rooms of millions of users, and they are simultaneously helpful and vulnerable. What threats do they pose to data privacy? How do companies protect the data that users give Alexa, Siri, and the Google Assistant?
Amazon’s Data Privacy Worst-Case Scenario
Under the GDPR, any EU data subject may request that a company send them all of the data it has collected about them, so a German Amazon user did just that. Amazon sent back fairly routine findings – searches, orders, and the like – but also 1,700 voice recordings and transcriptions. The issue? The user didn’t own any Alexa-enabled devices. He listened to the recordings to see if they were connected to him in some way but concluded that it was an error on Amazon’s part. When he discovered this information leak, he contacted Amazon but never heard back.
The story broke when the user took his concerns to the German magazine c’t, which eventually identified the voices in the recordings. The magazine reported, “We were able to navigate around a complete stranger’s private life without his knowledge…The alarms, Spotify commands, and public transport inquiries included in the data revealed a lot about the victims’ personal habits, their jobs, and their taste in music. Using these files, it was fairly easy to identify the person involved and his female companion. Weather queries, first names, and even someone’s last name enabled us to quickly zero in on his circle of friends. Public data from Facebook and Twitter rounded out the picture.” This case is proof that even when users don’t think they’re giving up personal data to voice assistants, the accumulation of that data can paint a full picture of who they are, where they are, their habits, and their community. Our digital footprints reveal a great deal about us. Voice assistants must store, or have access to, personal data in order to personalize the user experience, creating a cycle that steadily expands users’ digital footprints.
As part of its due diligence, c’t decided to contact the user behind the voice recordings. The magazine reported, “We couldn’t find a phone number, so we used Twitter to ask the victim to contact us. He called back immediately and we explained how we found him. We had scored a direct hit and Neil Schmidt (not his real name) was audibly shocked when we told him about the personal data Amazon had sent to a stranger. He started going through everything he and his friends had asked Alexa and wondered what secrets they might have revealed. He also confirmed that we had correctly identified his girlfriend.”
Lessons Learned from Amazon’s Alexa Data Collection Mistake
Obviously, by purchasing voice-enabled devices and using Alexa, Siri, or the Google Assistant, a user agrees to terms and conditions that address data privacy concerns. But when the data controller or processor fails to uphold those terms, the foundation of trust is damaged.
Amazon’s reaction to this data privacy incident was disappointing. The first misstep occurred when Amazon didn’t even notice its mistake. Then, when the user notified Amazon of the incident, he reported that Amazon never responded. When Amazon did recognize the incident, there was seemingly no timely notice to a data protection authority or to the victim. Only after c’t got involved did Amazon contact the user and the victim about the mistake, with a spokesperson stating, “This unfortunate case was the result of a human error and an isolated single case.” Was Amazon planning to respond at all, or did the media attention prompt it to address the situation?
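One practical lesson is procedural: a data export pipeline can re-verify that every record actually belongs to the requester before anything is bundled and sent. The sketch below is a minimal illustration of that idea; the record structure, field names, and error type are all hypothetical assumptions for the example, not a description of Amazon’s actual systems.

```python
from dataclasses import dataclass


@dataclass
class ExportRecord:
    owner_id: str      # account the record belongs to
    record_type: str   # e.g. "voice_recording", "order", "search"
    payload: bytes     # the exported content itself


class OwnershipMismatchError(Exception):
    """Raised when a record slated for export belongs to a different user."""


def build_dsar_export(requester_id: str, records: list[ExportRecord]) -> list[ExportRecord]:
    """Assemble a data subject access request (DSAR) bundle.

    Every record is re-checked against the verified requester so that a
    mis-joined query or a human error cannot leak another user's data.
    """
    export = []
    for record in records:
        if record.owner_id != requester_id:
            # Fail closed: halt the export and flag it for review rather
            # than silently dropping or, worse, shipping the foreign record.
            raise OwnershipMismatchError(
                f"{record.record_type} record belongs to {record.owner_id}, "
                f"not requester {requester_id}"
            )
        export.append(record)
    return export
```

A check like this is cheap compared to the failure mode it prevents: 1,700 of someone else’s recordings landing in a stranger’s inbox.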
One benefit of regulations like the GDPR and CCPA is that they provide new ways to hold organizations accountable for securing data subjects’ personal information. Building customer trust is difficult; digital consumers are wary of unwanted follow-up, sales pitches, cold calls, and spam. Organizations that demonstrate a commitment to privacy regulations like the GDPR and CCPA have the potential to rebuild the trust that many digital consumers have lost, and that trust, in turn, may actually result in greater sharing of personal data.
The wariness around voice assistants and their ability to listen in will, hopefully, not fade anytime soon. Users must be aware of the relationship they create with companies like Amazon, Google, and Apple when they invite them to listen in on their lives. Likewise, data controllers and processors must protect personal data with appropriate controls and care.
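One concrete example of such a control is data minimization: voice recordings older than a fixed retention window are purged automatically instead of accumulating indefinitely. The sketch below illustrates the idea under assumed names and a hypothetical 90-day window; real retention periods depend on the applicable regulation and the purpose of processing.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; actual policies vary by regulation and purpose.
RETENTION_PERIOD = timedelta(days=90)


def purge_expired_recordings(recordings: list[dict]) -> list[dict]:
    """Keep only voice recordings within the retention window.

    Each recording is a dict with at least a 'recorded_at' datetime.
    In a real system, the expired recordings would also be deleted
    from the underlying storage, not just filtered out here.
    """
    cutoff = datetime.now(timezone.utc) - RETENTION_PERIOD
    return [r for r in recordings if r["recorded_at"] >= cutoff]


# Example: a two-item history where only the recent recording survives.
history = [
    {"id": "rec-1", "recorded_at": datetime.now(timezone.utc) - timedelta(days=200)},
    {"id": "rec-2", "recorded_at": datetime.now(timezone.utc) - timedelta(days=5)},
]
print([r["id"] for r in purge_expired_recordings(history)])  # ['rec-2']
```

The less personal data a voice assistant retains, the less there is to leak when something goes wrong.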
If any data privacy regulations apply to your organization, contact us today to avoid situations like this. We want to empower your organization to protect the data you hold and ensure the privacy of your customers.
More Data Privacy Resources
CCPA vs. GDPR: What Your Business Needs to Know
Privacy Policies Built for GDPR Compliance
Investing Where It Matters: Unbounce’s Commitment to GDPR Compliance