Google admits staff can listen to what people say on its home AI devices

Tech giant said it was investigating the leak of over 1,000 audio clips

Google staffers were able to pick up personal records on devices like Google Home. John G Mabanglo / EPA

Google has said that its staff are able to listen to recordings of what customers say to its artificial intelligence system, Google Assistant, through phones or smart speakers such as Google Home.

The Assistant listens to and interprets human speech, allowing it to answer queries such as “What is the weather forecast for today?” or “Where can I get an Indian takeaway?”.

On Thursday the company admitted that humans can access recordings made by its AI voice-recognition software, after some of its Dutch-language recordings were leaked. Google said it was investigating the data breach.

Google admitted the breach after Belgian broadcaster VRT reviewed more than 1,000 audio clips and found that 153 had been captured accidentally, after the device was allegedly activated in error. One of the leaked recordings included the voices of a woman’s son and baby grandchild. It also contained the couple’s address and other information suggesting they are grandparents.

Assistant is only meant to send data to Google after the device detects a person interacting with it, for example by saying “Hey Google”. David Monsees, Google’s product manager, said there should be a clear indicator, such as lights turning on on the device, when it is recording someone’s voice.

Other fragments captured included phone calls and private conversations, in which topics such as a child’s growth rate, what sounded like domestic violence and someone’s love life were recorded, VRT reported.

In a blog post, Mr Monsees said: “We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”

Mr Monsees added that the tech firm applies a range of safeguards to protect user privacy. For example, he said, language experts review only 0.2 per cent of all audio snippets.

UK-based Juniper Research estimated in February that there are 3.25 billion digital voice assistants in use today, and it forecasts that 8 billion will be in use by 2023.
