In a time when privacy is a major concern for tech users, Google continues to push invasive practices. The company now admits that it listens to customers' audio recordings captured by Google Home smart speakers: it hired language experts to analyze recordings made via Android smartphones and Google Home speakers.
The reason behind this breach of privacy is to train the Google Assistant AI. The company claims the review process improves its voice-recognition technology. The artificial intelligence system listens to and interprets voice commands; it can answer queries about news and weather, and it can control the internet-connected devices in your home.
Well, at least the data is secure at Google, right? No. According to Google itself, one of its audio reviewers leaked Dutch audio data. In a statement, the company said:
"We partner with language experts around the world to improve speech technology by transcribing a small set of queries – this work is critical to developing technology that powers products like the Google Assistant. Language experts only review around 0.2% of all audio snippets, and these snippets are not associated with user accounts as part of the review process.

We just learned that one of these reviewers has violated our data security policies by leaking confidential Dutch audio data.

Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action.

We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again."