Amazon’s Alexa is recording conversations without your consent


Many information security experts are betting on a future with a heavy presence of AI-powered voice assistants: digital helpers that answer our questions and make our lives easier. Although that future is still a long way off, Google Assistant and Amazon's Alexa already give us a glimpse of it.

Both voice-controlled assistants include features that could soon make them a central element of modern life.


The television station KIRO 7 reported that a woman claimed Amazon's AI assistant, Alexa, recorded a private conversation between her and her husband and then sent it to a contact without their knowledge. All of this happened without the assistant being woken by the word "Alexa," which Amazon insists is essential for Alexa to start any task.

Danielle, who had Alexa-enabled devices in every room of her house, was alerted to the leak by the recipient of the message, one of her husband's employees, who called to warn them because he believed their smart speaker had been "hacked."

Danielle listened to the conversation when the recording was sent back to her. "I felt invaded," she said. "A total invasion of privacy. I immediately said, I'm never plugging that device in again, because I can't trust it."

This incident makes real the main fear many of us have about smart speakers and their always-listening voice assistants: the fear that they are spying on us, or that the devices are being used by a third party to monitor and record us in vulnerable and intimate moments, an information security researcher commented.

According to the report, Amazon was called in to investigate the matter. Amazon's engineers did not explain exactly how the breach occurred, or whether it is a general problem with Alexa-based smart speakers, but they did confirm the incident.

"They said the engineers checked the logs, and they saw exactly what we said had happened, and apologized … He told us that the device had simply guessed what we were saying … He apologized and said they really appreciated that I had called them, because this is something they have to fix!"

In an official statement, Amazon confirmed the incident but downplayed the problem.

"Echo woke up because a word in the background conversation sounded like 'Alexa.' The subsequent conversation was then heard as a 'send message' request, at which point Alexa said aloud, 'To whom?' The background conversation was then interpreted as a name in the contact list, and Alexa asked aloud, '[contact name], right?' Alexa then interpreted the background conversation as 'right.' As unlikely as this chain of events is, we are evaluating options to make this case even less likely."
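Amazon's explanation describes a pipeline in which each step is a speech match against some threshold, and even the final confirmation can be satisfied by misheard background chatter. The sketch below is a toy illustration of that failure mode; the thresholds, similarity function, and function names are our own assumptions, not Amazon's actual code.

```python
# Hypothetical sketch of the confirmation chain in Amazon's statement.
# Nothing here comes from Amazon; the thresholds and the crude text-similarity
# stand-in for acoustic confidence are illustrative assumptions only.

from difflib import SequenceMatcher

WAKE_WORD = "alexa"
WAKE_THRESHOLD = 0.8   # assumed: how close a sound must be to the wake word
MATCH_THRESHOLD = 0.7  # assumed: how close speech must be to a command/contact

def similarity(heard: str, target: str) -> float:
    """Crude stand-in for a speech recognizer's confidence score."""
    return SequenceMatcher(None, heard.lower(), target.lower()).ratio()

def process_utterances(utterances: list[str], contacts: list[str]) -> None:
    """Walk the four-step chain described in Amazon's statement."""
    stream = iter(utterances)

    # Step 1: a background word sounds close enough to "Alexa" to wake the device.
    if similarity(next(stream), WAKE_WORD) < WAKE_THRESHOLD:
        return
    print("(device wakes up)")

    # Step 2: background speech is heard as a "send message" request.
    if similarity(next(stream), "send message") < MATCH_THRESHOLD:
        return
    print('Alexa: "To whom?"')

    # Step 3: a background word is matched against the contact list.
    heard = next(stream)
    best = max(contacts, key=lambda c: similarity(heard, c))
    if similarity(heard, best) < MATCH_THRESHOLD:
        return
    print(f'Alexa: "{best}, right?"')

    # Step 4: more background speech is interpreted as "right" --
    # the confirmation itself is misheard, and the message goes out.
    if similarity(next(stream), "right") >= MATCH_THRESHOLD:
        print(f"(message sent to {best}, though no one addressed the device)")

# Each string stands in for a fragment of the couple's background conversation.
process_utterances(
    ["alexa", "send a message", "john", "right"],
    contacts=["John", "Maria"],
)
```

The point of the sketch is that every step individually looks like a safeguard, yet because every check draws on the same ambient audio, one run of unlucky matches clears all four hurdles at once.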

This is not the first time something like this has happened. Earlier this year, the information security company Symantec noted in a report the always-listening nature of smart speakers, which means that everything you say risks being sent to the back-end servers of these technology giants.

According to their privacy policies, these smart speakers listen to, record, and send to their servers only the conversations held with them after they have been woken up. However, there have been cases in which, due to an "error," a device recorded more than it should have.
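That policy boils down to a simple gate: audio stays on the device until a local wake-word detector fires, and only then is it uploaded. The toy sketch below, with entirely made-up frame strings, detector, and cloud stub, shows why a single false positive at that gate is enough to send private audio upstream.

```python
# Minimal sketch of the "always listening, upload only after the wake word"
# model described above. Everything here (the frames, the keyword spotter,
# the cloud stub) is an illustrative assumption, not any vendor's real design.

class FakeCloud:
    """Stand-in for a vendor's back-end speech service."""
    def send(self, frames: list) -> None:
        print(f"uploaded to back-end: {frames}")

def detect_wake_word(frame: str) -> bool:
    """Toy keyword spotter; real devices run a small on-device model here.
    A false positive at this point is exactly the 'error' that uploads
    audio it should not."""
    return "alexa" in frame.lower()

def is_end_of_utterance(frame: str) -> bool:
    """Toy end-of-speech check (real systems detect silence)."""
    return frame == "<silence>"

def run_speaker(frames: list, cloud: FakeCloud) -> None:
    buffer: list = []
    awake = False
    for frame in frames:                       # the microphone is always open...
        if not awake:
            awake = detect_wake_word(frame)    # ...but audio stays on-device
        elif is_end_of_utterance(frame):
            cloud.send(buffer)                 # only now does audio leave the device
            buffer.clear()
            awake = False
        else:
            buffer.append(frame)

# Background chatter containing something that merely sounds like the wake word:
run_speaker(
    ["we should call", "Alexa might hear", "private remark", "<silence>"],
    FakeCloud(),
)
```

Under this model the device is behaving "as designed" even when it leaks: the design simply trusts the wake-word detector, so any misfire converts private conversation into uploaded data.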