Google Assistant will get better at understanding us

The Google Assistant will get better at understanding us, even when we speak the way people naturally do, with small pauses, self-corrections and filler words. To achieve this, Google is building powerful new speech and language models that understand the nuances of human speech.

The post also mentions Google’s own Tensor chip, which has been specially adapted to handle on-device machine learning at high speed. The chip powers the Pixel 6 and Pixel 6 Pro and is also expected to appear in the Pixel 6a.

Looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.

Another piece of Google Assistant news is “Look and Talk” for the Nest Hub Max. Users will be able to simply look at the device’s screen and ask a question, without first having to say “Hey Google”. The feature is optional and can use both facial and voice recognition to verify the user’s identity.

Initially, Look and Talk is only available in the US in English.
