Google says leaked Assistant recordings violate its data security policies

Google Home

The search giant has confirmed that humans listen to "Okay Google" recordings, but said the leaking of those recordings is a breach of its data security policies.

A Belgian report by VRT NWS revealed earlier this week that Google contractors "systematically" listen to audio files recorded by Google Home smart speakers and the Google Assistant smartphone app.

The report explained how staff hear excerpts of recordings captured after a user activates the system with the usual "Okay Google" or "Hey Google" commands.

After obtaining copies of certain recordings, VRT NWS contacted the users involved, who verified that it was their own voices, or their children's voices, talking to the digital assistant.

Google responded to the report by publishing a blog post titled "More information on our speech protection processes."

In it, the search giant confirmed that it works with language experts around the world to "understand the nuances and accents of a specific language," and that those experts review and transcribe a small number of queries to help it better understand those languages.

The blog post says that capturing these interactions is a critical part of building speech technology and is necessary to create products like the Google Assistant.

But Google said the leaking of the recordings was a violation of its data security policies.

"We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again," the blog post announced.

Google said it applies a wide range of safeguards to protect user privacy throughout the review process.

"Language specialists review only approximately 0.2% of all audio snippets. As part of the review process, audio snippets are not associated with user accounts, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed at Google," the blog continued.

The firm says the Google Assistant only sends audio to Google after a device has been activated. It also acknowledged that devices with the Google Assistant built in can sometimes experience what the company calls a "false accept," meaning its software interprets background noise or words as the hotword.

Although Google says recordings are only made after the hotword is heard, VRT NWS said 153 of the thousand or so excerpts it listened to should never have been recorded, because the command "Okay Google" was clearly not given.

Back in February, Google announced that Google Assistant functionality would soon come to Nest Guard, the centerpiece of the Nest Secure home alarm system, a feature that required the device to have a speaker and a microphone. Users, however, had never been told that the Nest Guard contained a microphone.

Google responded by saying that its failure to make users aware of the Nest Guard's microphone was nothing more than a mistake.

Earlier this year, Amazon was found to use teams of people to review queries sent to Alexa-enabled smart speakers, a practice similar to the one Google uses to improve the accuracy of its voice assistant.

The recordings sent to those human teams do not include full names, but are linked to an account name, the device's serial number, and the user's first name.

Some team members transcribe commands and analyze whether Alexa responded correctly. Others are asked to annotate background sounds and conversations the device recorded unintentionally.

Mark Funk
Mark Funk is an experienced information security specialist who works with enterprises to mature and improve their enterprise security programs. Previously, he worked as a security news reporter.