Amazon’s Alexa virtual assistant reportedly tried to order dollhouses for households across San Diego after it picked up dialogue from TV sets about a 6-year-old who got a $170 dollhouse (and cookies) when she asked the Echo Dot speaker in her home to play with her. CW6 anchor Jim Patton was discussing the incident and noted how the girl had said, ‘Alexa ordered me a dollhouse.’
That’s when it happened: viewers across San Diego complained that their own Echo devices ‘heard’ Patton and attempted to purchase dollhouses for them too. Since orders have to be confirmed with a ‘yes,’ Amazon’s smart speakers didn’t actually place any. But the episode does raise questions about privacy and online security.
Alexa doesn’t recognize individual voices, so anyone within earshot can use it, authorized or not. The software stays alert for the wake word ‘Alexa,’ which means it’s always listening in case you want it to stream songs, book a taxi or order something online. Because its ears are constantly open, police have in the past asked Amazon to hand over Echo data for a murder investigation.
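To picture what ‘always listening’ means in practice, here is a minimal, purely illustrative sketch of a wake-word loop. It is not Amazon’s actual code: typed lines stand in for room audio, and handle_request is a hypothetical placeholder for whatever the assistant would do next.

```python
# Illustrative only: a toy "always listening" loop that ignores everything
# until it hears the wake word, then treats the rest of the line as a command.
WAKE_WORD = "alexa"


def handle_request(command: str) -> None:
    # Hypothetical handler standing in for streaming music, booking a taxi, etc.
    print(f"Acting on: {command!r}")


def listen_forever() -> None:
    while True:
        utterance = input("(room audio) > ").strip().lower()
        if not utterance.startswith(WAKE_WORD):
            continue  # no wake word heard: the device stays idle
        command = utterance[len(WAKE_WORD):].strip(" ,")
        if command:
            handle_request(command)


if __name__ == "__main__":
    listen_forever()
```

The point of the sketch is that the loop never stops consuming audio; it only decides, after the fact, whether a snippet was meant for it, which is exactly how a TV anchor’s voice can slip through.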
A few months ago, Amazon’s virtual assistant that pairs with the Echo speaker bombarded a little boy with adult content when he asked it to play ‘Digger Digger.’ In the case of the 6-year-old girl, the Echo was obeying what it thought was a command aimed directly at it. The fact that it picked up an order from a TV set is a wee bit more unsettling.
The thing is, voice purchasing is turned on by default on Alexa devices, when it really shouldn’t be. Of course, you can switch the feature off or require a 4-digit confirmation code before orders go through. But why not have it off by default in the first place?
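For illustration only, here is a small sketch of the safer arrangement the paragraph argues for: purchases gated behind an opt-in flag and a 4-digit code. The names (PurchaseSettings, try_to_order) are made up for this example and are not Amazon’s API.

```python
# Illustrative only: a purchase flow where voice buying is opt-in and can
# additionally require a 4-digit confirmation code.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PurchaseSettings:
    voice_purchasing_enabled: bool = False   # off unless the owner opts in
    confirmation_pin: Optional[str] = None   # optional 4-digit code


def try_to_order(item: str, spoken_pin: Optional[str], settings: PurchaseSettings) -> str:
    if not settings.voice_purchasing_enabled:
        return f"Voice purchasing is disabled; not ordering {item}."
    if settings.confirmation_pin and spoken_pin != settings.confirmation_pin:
        return f"Confirmation code did not match; not ordering {item}."
    return f"Order placed for {item}."


# With these defaults, a TV anchor saying "order me a dollhouse" gets nowhere:
print(try_to_order("dollhouse", spoken_pin=None, settings=PurchaseSettings()))
```

Under defaults like these, the San Diego incident would have ended at the first check rather than at a last-second ‘yes.’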