An expert at security firm AVG found some voice-activated systems responded just as readily to synthesised voices as they did to the owner's own.

Clever fraudsters could subvert this to send bogus messages or compromise gadgets in the future, said AVG. Voice-activated systems needed to do a better job of checking who was talking, the firm said.

The problems were found by Yuval Ben-Itzhak, chief technology officer at anti-virus firm AVG, who managed to turn on and control a smart TV using a synthesised voice. The attack worked, he said, because the gadget did nothing to check who was speaking.
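The weakness can be sketched in a few lines. The code below is a hypothetical illustration, not AVG's actual demonstration: a command handler that executes whatever it transcribes obeys a synthesised voice just as readily as its owner, while one that checks the speaker's identity does not. The handler names and the voiceprint string are illustrative assumptions.

```python
# Hypothetical sketch of the flaw: a voice assistant that skips
# speaker verification will obey any voice, synthesised or not.

ENROLLED_OWNER = "owner-voiceprint"  # stand-in for a stored voiceprint


def naive_handler(command: str, voiceprint: str) -> str:
    # Vulnerable: executes the transcribed command without checking
    # whose voice produced it -- the behaviour described above.
    return f"executing: {command}"


def verified_handler(command: str, voiceprint: str) -> str:
    # Safer: refuse commands whose voiceprint does not match the
    # enrolled owner (a real system would use acoustic speaker models,
    # not string comparison).
    if voiceprint != ENROLLED_OWNER:
        return "rejected: unrecognised speaker"
    return f"executing: {command}"


# A synthesised voice matches no enrolled voiceprint, yet the naive
# handler carries out its command anyway.
print(naive_handler("turn on TV", "synthesised-voice"))     # executing: turn on TV
print(verified_handler("turn on TV", "synthesised-voice"))  # rejected: unrecognised speaker
print(verified_handler("turn on TV", ENROLLED_OWNER))       # executing: turn on TV
```

Real speaker verification compares acoustic features against an enrolled voice model rather than strings, but the structural point is the same: without that check, the source of the voice is irrelevant to the device.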

Voice-activated functions on Apple and Android smartphones were vulnerable to the same attack, he found. In one demonstration, he used the synthesised voice to send a bogus message via an Android smartphone, telling everyone in the device's contacts book that a company was going out of business. Mr Ben-Itzhak also wondered whether children could exploit the flaw to turn off safety features that stop them seeing or using inappropriate content.