Amazon Alexa sent a woman’s private conversation to an employee

You may want to think twice about having sensitive conversations in front of your Alexa. The smart speaker may have a mind of its own. A woman in Portland, Oregon, said she feels “invaded” after her voice-controlled Amazon Echo sent a private conversation with her husband to her husband’s employee without her knowledge.

“I’m never plugging that device in again, because I can’t trust it”

“My husband and I would joke and say, ‘I’d bet these devices are listening to what we’re saying,’” a woman who identified herself only by her first name, Danielle, told Seattle-based news station KIRO 7.

According to Danielle, she got a call one day from her husband’s employee, who warned her: “Unplug your Alexa devices right now. You’re being hacked.”

The employee, who lived 176 miles away in Seattle, had received an audio recording of a private conversation between the couple about hardwood floors.

Before the incident, Danielle had Amazon Echo devices in every room of her home to control heat, lights, and the house security system. But after what happened, she does not want to outsource these tasks to a mind she does not feel she can fully control. A talk about home improvement is one thing — but what if it had been a more embarrassing, intimate conversation?

“I felt invaded,” Danielle said. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.’ ”

Alexa is the voice-powered assistant that lives inside Amazon Echo devices. When it hears its wake word, which is “Alexa” by default, it starts listening for a request, such as, “Alexa, call my assistant.” According to Amazon, Danielle was not hacked; Alexa simply misheard her conversation as a request. Amazon said in a statement:

“Echo woke up due to a word in background conversation sounding like ‘Alexa,’ and the next conversation was heard as ‘send message.’ Then, when Alexa said out loud, ‘To whom?,’ the device interpreted the background conversation as a name in the customer’s contact list. Alexa then responded, ‘[Contact name], right?’ Alexa again interpreted the background conversation as, ‘Right.’”

Alexa had wrongly interpreted words in the couple’s talk as a command.

Amazon called this an “extremely rare occurrence” and said it was taking steps to make sure it would not happen again.

But for those of us who do not want to put our trust in a device that can broadcast our private conversations to our employees, an “extremely rare occurrence” may not be rare enough.