While their job is to “help improve” Alexa – which powers the company’s line of Echo speakers – the team “listens to voice recordings captured in Echo owners’ homes and offices,” which are then transcribed, annotated and fed back into the software in order to try to improve Alexa’s understanding of human speech for more successful interactions. In other words, humans are effectively helping to train Amazon’s algorithm.
The listening team comprises part-time contractors and full-time Amazon employees based all over the world, including India, Romania, Boston and Costa Rica.
Listeners work nine-hour shifts, with each reviewing as many as 1,000 audio clips per shift, according to two employees from Amazon’s Bucharest office – located in the top three floors of the Romanian capital’s Globalworth building. The location “stands out amid the crumbling infrastructure” of the Pipera district and “bears no exterior sign advertising Amazon’s presence.”
While much of the work is boring (one worker said his job was to mine accumulated voice data for specific phrases such as “Taylor Swift” – letting the system know that the searcher was looking for the artist), reviewers are also listening in on people’s most personal moments.
Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording. – Bloomberg
Occasionally Amazon listeners come across upsetting or possibly criminal recordings – two workers, for example, say they listened in on what sounded like a sexual assault.
According to the report, when things like this happen the workers will mention it in the internal chat room to “relieve stress.”
And while Amazon says that it has procedures to follow when workers hear distressing things, two of the Romania-based employees say they were told “it wasn’t Amazon’s job to interfere” when they requested guidance for such instances.
“We take the security and privacy of our customers’ personal information seriously,” said an Amazon spokesman in a statement provided to Bloomberg.
“We only annotate an extremely small sample of Alexa voice recordings in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” the statement continues. “We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”
That said, Amazon does not mention the fact that humans are listening to recordings of some of the conversations picked up by Alexa. Instead, it offers a generic disclaimer in its FAQ that says “We use your requests to Alexa to train our speech recognition and natural language understanding systems.”
What Amazon records
Last May, an Amazon Echo recorded a conversation between a husband and wife, then sent it to one of the husband’s phone contacts. Amazon claims that during the conversation someone used a word that sounded like “Alexa,” which caused the device to begin recording.
“Echo woke up due to a word in background conversation sounding like ‘Alexa,’” said Amazon in a statement. “Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’. As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
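The chain of events Amazon describes – a false wake word, background speech heard as a “send message” intent, a contact name guessed from more background speech, and a stray “right” taken as confirmation – can be sketched as a simple state machine. Everything below is illustrative pseudocode under assumed names, not Amazon’s actual implementation:

```python
# Hypothetical sketch of the interaction flow in Amazon's statement: wake word,
# intent detection, contact-slot fill, then a yes/no confirmation. Each stage
# consumes the next chunk of heard audio (here, plain strings).

def handle_audio(utterances, contacts):
    """Return the steps the assistant takes while interpreting a stream of
    (possibly background) speech, mimicking the reported misfire."""
    steps = []
    it = iter(utterances)
    for heard in it:
        if "alexa" in heard.lower():                  # wake word (possibly false trigger)
            steps.append("woke")
            request = next(it, "")
            if "send message" in request.lower():     # background speech heard as an intent
                steps.append("intent:send_message")
                name_guess = next(it, "")
                # loosely match background speech against the contact list
                match = next((c for c in contacts
                              if c.lower() in name_guess.lower()), None)
                if match:
                    steps.append(f"confirm:{match}")  # "[contact name], right?"
                    answer = next(it, "")
                    if "right" in answer.lower():     # stray "right" taken as yes
                        steps.append(f"sent:{match}")
    return steps
```

Feeding in a sequence like the one Amazon describes shows how four separate misinterpretations compound into a sent recording, which is why the company calls the string of events unlikely.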
The wife, Danielle, however said that the Echo never requested her permission to send the audio. “At first, my husband was like, ‘No, you didn’t,’” Danielle told KIRO7. “And he’s like, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did!’”
Can you disable it?
Alexa does allow people to stop sharing their voice recordings for the development of new features, while a screenshot reviewed by Bloomberg reveals that the recordings provided to Alexa’s listeners do not provide the full name or address of a user. It does, however, link the recording to an account number, the user’s first name, and the device’s serial number.
“You don’t necessarily think of another human listening to what you’re telling your smart speaker in the intimacy of your home,” said UMich professor Florian Schaub, who has researched privacy issues related to smart speakers. “I think we’ve been conditioned to the [assumption] that these machines are just doing magic machine learning. But the fact is there is still manual processing involved.”
“Whether that’s a privacy concern or not depends on how cautious Amazon and other companies are in what type of information they have manually annotated, and how they present that information to someone,” added Schaub.
This article was originally published on Zero Hedge