“Amazon and third parties (including advertising and tracking services) collect smart speaker interaction data. We find that Amazon processes voice data to infer user interests and uses it to serve targeted ads on-platform (Echo devices) as well as off-platform (web). Smart speaker interaction leads to as much as 30X higher ad bids from advertisers. Finally, we find that Amazon’s and skills’ operational practices are often not clearly disclosed in their privacy policies.”
Thus concludes a recent damning study of the privacy of the Amazon smart speaker ecosystem from the University of Washington, the University of California, and Northeastern University. So, to answer the question posed in our headline, yes, your Amazon Alexa is spying on you, and with an estimated 35% of adults (some 91 million people) in the United States alone now owning a smart speaker, this is alarming.
This is a view strongly held by the US Federal Trade Commission, which in May 2023 imposed a $25 million fine on Amazon over privacy violations involving its Alexa voice assistant (and its doorbell camera, Ring).
- What is Alexa?
- What’s supposed to happen when you use Alexa?
- What actually happens when you use Alexa
- Final thoughts and Alexa alternatives
What is Alexa?
Amazon Alexa is a virtual assistant technology you can control using voice commands. Alexa is built into many Amazon devices but is best known for its use in the Amazon Echo, Echo Dot, and Echo Studio smart speakers, where you can instruct it to play music, create to-do lists, control other smart devices around the home, provide live news, weather, and traffic reports, and much more.
Alexa is also built into third-party hardware, such as smart TVs and sound bars.
What’s supposed to happen when you use Alexa?
According to Amazon, Alexa devices are always listening but only start recording when you say the wake word (by default, “Alexa”) or press a button on the Alexa device. This behavior can be customized, but disabling voice activation undermines the purpose of having an Alexa device in the first place.
When Alexa hears its wake word, it sends the audio “snippet” containing the wake word to Amazon, where it undergoes a cloud-based wake word verification process designed to ensure the wake word was actually spoken.
If the wake word is verified, Alexa will record everything it hears for the next few seconds and send this recording to Amazon’s cloud computers, where it’s processed and (hopefully) triggers the correct response.
Alexa devices are supposed to give visual cues so you can always tell when they’re actively listening to your conversation. For example, an Amazon Echo shows a circular blue light after it hears its wake word and is actively listening.
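To make that intended flow concrete, here’s a minimal sketch of the sequence described above, written in Python. It isn’t Amazon’s code or any real SDK: capture_audio_frame(), local_wake_word_match(), cloud_verify_wake_word(), cloud_process_command(), and set_light_ring() are hypothetical stand-ins for the microphone, the on-device detector, Amazon’s cloud services, and the light ring.

```python
from collections import deque

# Hypothetical sketch of the wake-word flow described above. The callables
# passed in stand for the device microphone, the on-device wake-word detector,
# Amazon's cloud verification/processing services, and the light ring; none of
# them are real Amazon APIs.

FRAMES_PER_SECOND = 10   # assumed frame rate for this sketch
BUFFER_SECONDS = 2       # short rolling buffer of recent audio held on the device
COMMAND_SECONDS = 8      # roughly "the next few seconds" recorded after the wake word


def run_device_loop(capture_audio_frame, local_wake_word_match,
                    cloud_verify_wake_word, cloud_process_command, set_light_ring):
    # 1. The device listens constantly, but keeps only a short rolling buffer.
    rolling_buffer = deque(maxlen=BUFFER_SECONDS * FRAMES_PER_SECOND)
    while True:
        rolling_buffer.append(capture_audio_frame())

        # 2. Nothing leaves the device until the local detector hears the wake word.
        if not local_wake_word_match(rolling_buffer):
            continue

        # 3. The wake-word snippet is sent to the cloud for verification.
        snippet = list(rolling_buffer)
        if not cloud_verify_wake_word(snippet):
            continue  # false trigger: the cloud decided the wake word wasn't spoken

        # 4. Visual cue: the light ring turns blue while actively listening.
        set_light_ring("blue")

        # 5. Record the next few seconds and send them to the cloud for processing.
        command_audio = [capture_audio_frame()
                         for _ in range(COMMAND_SECONDS * FRAMES_PER_SECOND)]
        cloud_process_command(snippet + command_audio)

        set_light_ring("off")
```

Passing the device and cloud operations in as functions keeps this a pure illustration of the sequence Amazon describes, not a claim about how the firmware is actually implemented.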
Alexa collects personal information
Amazon makes no secret of the fact that it links every interaction you have with your Alexa to your Amazon account and uses this data to profile you for targeted advertising. In some ways, this is no different from tracking your web browsing history using cookies and recording your purchase and search history on the Amazon website.
However, thanks to its highly versatile nature (answering your obscure questions, playing your favorite radio shows and podcasts, curating your music tastes, managing your smart devices, and so on), Alexa provides Amazon with a much more detailed and intimate picture of your life than it could ever hope to gain from your shopping history alone.
In fact, the information gained from your interactions with Alexa devices is so valuable that (as the research paper quoted at the start of this article reports) advertisers will bid up to 30 times more for ads targeted using it than for ads based on information gathered by more traditional means.
By default, Amazon keeps your recordings forever. However, you can delete them entirely via your device’s privacy settings or limit how long Amazon keeps them before they’re deleted. (At least in theory — see the FTC case against Amazon, discussed below).
What actually happens when you use Alexa
Alexa listens to more than it should
A 2019 study by researchers from Northeastern University and Imperial College London found that smart speakers (including Alexa devices) misheard their wake words and accidentally activated up to 19 times a day.
Around half of these accidental activations resulted in recordings longer than six seconds. Echo Dot 2nd Generation devices were among the worst offenders, with activation times of 20-43 seconds.
Amazon employees listen to your voice recordings
Bloomberg reported (also in 2019) that Amazon employs a team of thousands of people worldwide to listen to Alexa voice recordings, transcribe them, and feed them back into the Alexa algorithm.
Amazon claims this is to improve Alexa’s AI and natural language recognition capabilities and that “employees do not have direct access to information that can identify the person or account”.
However, this statement is clearly disingenuous. A screenshot obtained by Bloomberg showed that while employees don’t have direct access to a user’s full name and address, the transcription is associated with an account number, the user’s first name, and their device’s serial number.
Not only are employees expected to transcribe accidental interactions with Alexa, but they often overhear background conversations, which can include private details like names and banking information. When this happens, guidelines stipulate that they click a “critical data” box and move on.
Amazon has defended the practice, saying anyone can use their account settings to opt out of having their voice recordings analyzed by humans. But even if you opt out, your recordings may still be manually reviewed by someone with access to your account details as part of Amazon’s regular review process.
Alexa makes mistakes
In 2018, Alexa recorded a private conversation without the wake word even being uttered and sent that recording to a random contact on the Alexa owner’s contact list. In the same year, Amazon also mistakenly sent 1,700 of one user’s Alexa voice recordings to another user.
Amazon shares a lot of Alexa data with third parties
Alexa skills are small, free apps that add features to your Alexa device, and by allowing some 200,000 of them from third-party developers into its skills store, Amazon has greatly expanded Alexa’s usefulness.
However, this usefulness comes at a steep price. On paper, Amazon imposes strict privacy restrictions on the data these third-party skills are allowed to access. For example, they can’t collect highly sensitive information such as your Social Security number or bank account details, and they must ask for permission before accessing certain pieces of personal information, such as your email address, phone number, or location.
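To illustrate the permission model just described, here’s a minimal, purely hypothetical sketch of that consent gate in Python. It is not the Alexa Skills Kit API; the UserAccount class and skill_read_profile() function are invented for the illustration.

```python
from dataclasses import dataclass, field

# Purely hypothetical illustration of the permission model described above.
# This is NOT the Alexa Skills Kit API; every name here is invented for the sketch.

FORBIDDEN = {"ssn", "bank_account"}                 # data skills may never collect
CONSENT_REQUIRED = {"email", "phone", "location"}   # data that needs explicit permission


@dataclass
class UserAccount:
    profile: dict
    granted_permissions: set = field(default_factory=set)


def skill_read_profile(user: UserAccount, scope: str) -> dict:
    """Return profile data only when the platform's rules allow it."""
    if scope in FORBIDDEN:
        raise PermissionError(f"Skills may never request '{scope}'")
    if scope in CONSENT_REQUIRED and scope not in user.granted_permissions:
        # In the real ecosystem the user would be prompted to grant the
        # permission in the Alexa app; here the sketch simply refuses.
        return {"status": "consent_required", "scope": scope}
    return {"status": "ok", "value": user.profile.get(scope)}


if __name__ == "__main__":
    user = UserAccount(profile={"email": "user@example.com"})
    print(skill_read_profile(user, "email"))   # consent not yet granted
    user.granted_permissions.add("email")
    print(skill_read_profile(user, "email"))   # now the skill can read it
```

The point of the sketch is simply that some data is supposed to be off-limits entirely, while other data should only be released after the user explicitly grants the skill permission.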
In practice, the 2022 research paper quoted at the top of this article confirms earlier findings that thousands of third-party skill developers flout these rules, actively collecting Alexa voice data and using it to deliver targeted advertising. They also share this data (and other Alexa interactions) directly with other third parties, with no oversight or control from either Alexa users or Amazon itself.
Many third-party skills don’t publish any privacy policy whatsoever, and even those that do often don’t adhere to it.
Amazon doesn’t always delete your data when it says it does
The crux of the recent (2023) FTC settlement with Amazon over Alexa is that Amazon failed to delete active child accounts, some voice recordings, and geolocation information after users and parents asked it to do so.
“Amazon prominently and repeatedly assured its users, including parents, that they could delete voice recordings collected from its Alexa voice assistant and geolocation information collected by the Alexa app. The company, however, failed to follow through on these promises when it kept some of this information for years and used the data it unlawfully retained to help improve its Alexa algorithm.”
According to the FTC, Amazon has thus fallen foul of the Children’s Online Privacy Protection Act (COPPA). (At the time of writing, the FTC’s settlement with Amazon must still be approved by a federal court.)
Final thoughts and Alexa alternatives
Amazon Alexa devices are undoubtedly amazing pieces of technology that can bring a level of convenience into our lives that would have been the stuff of science fiction only a few years ago.
But this convenience comes at a price: your privacy. Even if you take Amazon at its word, Alexa knows an awful lot about you. Amazon uses this information to profile you and target you with ever more personalized ads, or shares it with third parties you never agreed to share your data with. As noted earlier in this article, advertisers will bid up to thirty times more for ads informed by Alexa data than for ads based on data gathered in more traditional ways.
But as the recent FTC case shows, you can’t always trust Amazon. Amazon also has little or no control or oversight over third-party skills that actively abuse your privacy.
So what can you do if you care about privacy but can’t do without the convenience of smart speakers? Of the big players in the commercial virtual assistant space — Amazon Alexa, Google Assistant, and Apple’s Siri — Apple offers the greatest privacy. Although you can’t review past Siri recordings the way you can with Alexa, Apple doesn’t tie them to your account (it uses a random identifier instead), and it doesn’t support a third-party skills marketplace like Alexa’s.
For those who prefer open-source options and don’t mind making some compromises to improve their privacy, Mycroft is a free, open-source natural language voice assistant designed to run on Linux-based devices. DIY enthusiasts can install Mycroft on a Raspberry Pi or even flash an Amazon Echo’s firmware to replace Alexa with Mycroft. If you’re less technically minded, you can purchase the Mycroft Mark II smart display off the shelf (but please be aware that Mycroft AI, as a company, will soon be shutting down).
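If you’re curious what extending an open-source assistant like Mycroft looks like, here’s a minimal skill sketch in Python, assuming mycroft-core’s skill API (MycroftSkill, intent_handler, and Adapt’s IntentBuilder). The skill name and the “Privacy” vocabulary and “privacy.check” dialog resources are examples you would create yourself.

```python
# A minimal Mycroft skill sketch, assuming mycroft-core's Python skill API.
# The "Privacy" vocabulary file and "privacy.check" dialog file referenced
# below are resource files you would add to the skill's own folder.
from adapt.intent import IntentBuilder
from mycroft import MycroftSkill, intent_handler


class PrivacyCheckSkill(MycroftSkill):
    """Example skill: responds when the user asks about privacy."""

    @intent_handler(IntentBuilder("PrivacyCheckIntent").require("Privacy"))
    def handle_privacy_check(self, message):
        # Everything here runs on hardware you control; the audio never has
        # to reach a third-party ad platform.
        self.speak_dialog("privacy.check")


def create_skill():
    # mycroft-core loads skills through this factory function.
    return PrivacyCheckSkill()
```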