Amazon's Alexa recorded and shared a conversation without consent, report says

Amazon Echo line

Unfortunately, it seems that the family's smart speaker grew a mind of its own, recording a private conversation they had in their home and then sending that recording as a file attachment to a person in their contacts who lived hundreds of miles away in Seattle, Washington.

Amazon confirmed an Echo owner's privacy-sensitive allegation on Thursday, after Seattle CBS affiliate KIRO-7 reported that an Echo device in Oregon sent private audio to someone on a user's contact list without permission.

The family said the incident happened two weeks ago, when the employee called them to say she'd received an unusual voice recording of them.

"I felt invaded," a family member identified as Danielle said. "A total privacy invasion."

Danielle and her husband were talking together at home when a colleague of her husband phoned to warn them that their Alexa device had been hacked. Amazon told the family it "takes privacy very seriously." "They said, 'Our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we're sorry,'" Danielle said.

Amazon has an explanation for what happened. The company said the digital assistant thought the couple said the wake word "Alexa," which activated the assistant.

But the notion that these devices never record conversations is wrong: Amazon has even unlocked capabilities that allow Amazon Echos to record conversations and send them to Alexa-compatible devices in other rooms of the same building, essentially acting as an intercom system.

The background conversation was interpreted as a name in the customer's contact list. All Alexa devices are programmed to provide a verbal warning before any message or recording is sent to another individual.

Amazon offered to reconfigure the device so that Alexa could be used only to enable and disable the appliances in the family's smart home, but that offer was declined.

In this case, Amazon says that Alexa repeatedly misheard parts of the conversation as a chain of instructions.

The company has since asked Danielle whether she and her husband want to disable Alexa's communication features by putting the device into a "de-provisioned" mode.

There has been deep concern that AI voice assistants might be always listening, and for one Oregon family, their Amazon Echo reportedly did just that.

"Background noise from our television is making it think we said Alexa," Wedbush Securities analyst Michael Pachter said of his personal experience.

As much as people enjoy their virtual assistants, sometimes they do things that are downright creepy.
