by Kevin Coupe

Bloomberg has a story saying that “Amazon.com Inc. employs thousands of people around the world to help improve the Alexa digital assistant powering its line of Echo speakers. The team listens to voice recordings captured in Echo owners’ homes and offices. The recordings are transcribed, annotated and then fed back into the software as part of an effort to eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands.”

In other words, Alexa doesn’t only listen when you use a designated “wake word.”

The Bloomberg story says that “the Alexa voice review process, described by seven people who have worked on the program, highlights the often-overlooked human role in training software algorithms. In marketing materials Amazon says Alexa ‘lives in the cloud and is always getting smarter.’ But like many software tools built to learn from experience, humans are doing some of the teaching.”

The team working on the process “comprises a mix of contractors and full-time Amazon employees who work in outposts from Boston to Costa Rica, India and Romania,” Bloomberg writes.

Here’s where it gets troubling:

“The work is mostly mundane … Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording.

“Sometimes they hear recordings they find upsetting, or possibly criminal. Two of the workers said they picked up what they believe was a sexual assault. When something like that happens, they may share the experience in the internal chat room as a way of relieving stress. Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn’t Amazon’s job to interfere.”

Amazon officially responded to the Bloomberg report this way:

“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesman said in an emailed statement. “We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”

Don’t know about you, but I’m not sure I feel protected.

I am staring at the Echo on my desk right now, and I must confess I am thinking about it differently.

What bothers me more than the fact that somebody may be listening to me through the half-dozen Alexa-powered devices that we have spread throughout our house is the idea that Amazon hasn’t been very transparent about this.

I’ll accept - with reservations - that if you want machines to be smarter, you need to educate them.

Though in this case, we’re talking about machines that are just getting better at voice recognition, not getting smarter or more intuitive or able to build on previous conversations. Actually, I’d like Alexa to be able to do all these things. I’d be willing to bet money that Amazon is working on it, and that it won’t be too long before Alexa - and maybe Alex - will be given some sort of physical representation to go with a more educated “brain.”

I’ll accept that I may need to participate in this process, and that there could be tradeoffs.

But I think people ought to be able to make that decision, one way or the other. At the very least, we ought to be told about the process. Transparency matters, and in this case, I don’t think Amazon lived up to my expectations.

And this puts aside all the moral and ethical discussions about whether, if Alexa is listening, it needs to be more proactive if it hears an assault or some other crime taking place. I’d like to take an ethics class focusing on this subject alone … but Amazon’s response to the question suggests that it hasn’t thought about it all that much.

The Eye-Opener? Alexa is listening. Amazon, maybe not so much.