
These Words Can Accidentally Trigger Amazon Alexa, Google Home, and Siri

Smart-speaker users have long feared that everyday words can accidentally activate listening devices such as Amazon Alexa, Google Home, and Siri. Independent research carried out by Digital Trends confirms that those fears are well founded.

Amazon’s Alexa offers four wake words: “Alexa,” “Amazon,” “Echo,” and “Computer.” The researchers found that similar-sounding words can activate not only Alexa but also Google Home and Siri.

For instance, if users set “Alexa” as the wake word, words such as “Alexis,” “Lexa,” and “Lexus” could activate the device. When “Echo” is set as the wake word, similar-sounding phrases such as “Gecko” and “Art Deco” could do the same.

Google Home and Siri require “OK” or “Hey” before the wake word, yet the devices can still be activated by a similar-sounding phrase. “OK Google” and “Hey Google” are the intended phrases for waking Google Home, but the device can also be mistakenly triggered by phrases such as “OK Bugle” or “Hey Noodle.”

Further research shows that placing a smart speaker close to a sound source such as a television can also lead to accidental activation.

In a related development, Bloomberg reported instances in which voice assistants on these devices transmitted unintended recordings and sensitive user data to Google and Amazon, raising privacy concerns. With studies by Northeastern University and Imperial College London showing that listening devices can be accidentally activated up to 19 times a day, UK law firm Mishcon de Reya LLP advised users to mute or shut off Amazon’s Alexa or Google’s voice assistant when discussing sensitive matters at home.

Mishcon recommends keeping these voice-activated devices out of rooms where carelessly spoken words might be transmitted to these companies without the speaker’s knowledge. Mishcon’s head of security, Joe Hancock, said his caution might be mistaken for paranoia, but it is better not to take risks with listening devices that could leak sensitive user information to their manufacturers.

Source: digitaltrends.com
