Google promises better privacy protection for Assistant
Google faced significant backlash earlier this year when it was discovered that human workers listened to audio recordings from Google Assistant users. According to the company, the practice helped improve its speech technology. But users weren't convinced by that argument, especially after reports that the recordings could capture personal conversations as well. Now Google wants to assure the public that it has put privacy-related improvements in place for its virtual assistant.
The company emphasized that it doesn't retain audio recordings by default. Users are asked to opt in to the Voice and Audio Activity (VAA) program when first setting up Assistant, and existing Assistant users can now review their VAA setting. Google said the human review process will resume only after users confirm their preference. As the company put it in its blog post, "We won't include your audio in the human review process unless you've re-confirmed your VAA setting as on."
Google said it automatically deletes recordings captured during an unintentional activation, and that it is implementing "additional measures" to identify and exclude accidental activations from human review. The company is also exploring adjustments to Assistant's sensitivity to wake words. In addition, it's working on "greater security protections" for the small set of queries it accesses (which Google claims is around 0.2 percent of all user audio snippets), including what it describes as "an extra layer of privacy filters."
The company also promises to "vastly reduce the amount of audio data we store" and to clean out some old recordings. Google said in the post, "For those of you who have opted in to VAA, we will soon automatically delete the vast majority of audio data associated with your account that's older than a few months. This new policy will be coming to VAA later this year."