How Google makes conversations with Assistant more natural


There was a time when talking to your smartphone seemed strange, even embarrassing. But habits have changed, and digital assistants are now commonplace: according to Google, 700 million people use Assistant to get everyday tasks done.

In recent years, Google has kept improving its digital assistant, both to boost performance and functionality and to make interactions feel more natural.

Assistant was not forgotten at the 2022 edition of Google's I/O conference, where the company continued to pursue its goal of making these interactions more natural.

What's new for Assistant on smart displays

Today, Google Assistant is not only found on Android smartphones; it is also the main interface for users of Google's smart speakers and smart displays. At its conference, Google presented a new way to interact with the camera-equipped Nest Hub Max display without having to say the wake words "Hey Google" or "OK Google".

Indeed, the display can now recognize when the user is looking at it and start listening for instructions. "Our first new feature, Look and Talk, starts rolling out today in the US on Nest Hub Max. Once you sign up, you can just look at the screen and ask for what you need," explains Sissie Hsiao, the vice president in charge of Google Assistant.

So there is no need to say "Hey Google"; just look at the display. For the feature to work, however, the user must enable both voice recognition and face recognition. Google clarifies that the video recordings do not leave the user's device.

According to the company, developing this feature was a real technical feat. Sissie Hsiao explains that it takes six machine learning models, processing more than 100 signals from the camera and microphone, for the Nest Hub Max to detect user intent.
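
Google has not published how those six models are combined, but the general idea of fusing per-frame camera and microphone signals into a single "start listening or not" decision can be sketched roughly as below. The signal names, thresholds, and fusion rule are purely illustrative assumptions, not Google's implementation.

```python
# Illustrative sketch only: the signals, thresholds, and fusion rule below are
# hypothetical stand-ins for the six on-device models Google describes,
# not its actual implementation.
from dataclasses import dataclass


@dataclass
class FrameSignals:
    # A few of the "more than 100 signals" such a pipeline might extract
    # from the camera and microphone at one point in time (all assumed).
    face_match_score: float   # likelihood this is an enrolled user (0..1)
    gaze_on_screen: float     # probability the user is looking at the display
    proximity_m: float        # estimated distance to the device, in meters
    voice_match_score: float  # likelihood the voice matches an enrolled profile
    speech_detected: bool     # voice activity detected on the microphone


def should_start_listening(s: FrameSignals) -> bool:
    """Toy fusion rule: engage only when an enrolled user is close,
    looking at the screen, and starting to speak."""
    return (
        s.face_match_score > 0.9
        and s.gaze_on_screen > 0.8
        and s.proximity_m < 1.5
        and s.voice_match_score > 0.9
        and s.speech_detected
    )


if __name__ == "__main__":
    frame = FrameSignals(0.97, 0.92, 1.1, 0.95, True)
    print("Start listening:", should_start_listening(frame))
```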

Google also highlights the progress it has made on inclusivity. "Last year, we announced Real Tone, an effort to improve Google's camera and imaging products for all skin tones. Continuing with that in mind, we tested and refined Look and Talk to work on a range of skin tones so it works well for people from diverse backgrounds," says Hsiao.

Google Assistant will also be less thrown off by the way you speak

Today, you need to speak quite clearly when giving Google Assistant an instruction. Google wants to change that, so its digital assistant can understand you even when you speak more naturally.

“In everyday conversation, we all naturally say ‘uh’, correct ourselves and pause from time to time to find the right words. But people can still understand us, because people are active listeners and can respond to conversational cues in less than 200 milliseconds. We think your Google Assistant should also be able to listen and understand you,” explains the vice president of Google Assistant.

So that Google Assistant can understand us even when we hesitate, Google is building better speech and language models that take the “nuances” of human speech into account. This work will also draw on the performance of the Tensor chip that already powers its recent smartphones.
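
Google has not said how its models handle these nuances internally. As a very rough illustration of the problem, the sketch below simply strips filler words out of a transcript before intent parsing; the filler list and the post-processing approach are assumptions for illustration only, since the real system deals with disfluencies inside the speech and language models themselves.

```python
# Illustrative sketch only: a naive text-level cleanup of hesitations.
# The filler-word list is an assumption; Google's approach handles disfluencies
# within the speech and language models, not as a post-processing step.
FILLERS = {"uh", "um", "er", "hmm"}


def strip_disfluencies(transcript: str) -> str:
    """Drop standalone filler words so downstream intent parsing sees a
    cleaner command."""
    kept = [
        token
        for token in transcript.split()
        if token.strip(",.?!").lower() not in FILLERS
    ]
    return " ".join(kept)


if __name__ == "__main__":
    # "uh play the um new song by Florence" -> "play the new song by Florence"
    print(strip_disfluencies("uh play the um new song by Florence"))
```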
