The first feature, called ‘Look and Talk,’ gives users the ability to activate Google Assistant without saying “Hey Google.” Instead, all you need to do is look at the Nest Hub Max and start talking.
Google says its machine learning models process more than 100 signals to determine user intent, including gaze detection, head orientation, proximity, and semantics. The feature also draws on the Pixel 6’s ‘Real Tone’ technology to work accurately for users with darker skin tones. With all of this, Google says the Nest Hub Max can determine whether a user is speaking to it or to someone else nearby.
This feature is coming to Nest Hub Max devices connected to Android handsets later this week, and will arrive for iOS users in the coming weeks. It’s worth noting that you can opt in or out of the feature if you find it too invasive.
Another feature coming to the Nest Hub Max is ‘Quick Phrases.’ Quick Phrases are already available on the Pixel 6, but with the Nest Hub Max, you’ll now be able to turn off lights, stop kitchen timers, or ask about the weather without even looking at the device.
Further, Assistant now better understands when people stumble while speaking. During the I/O 2022 keynote, Sissie Hsiao, Google’s vice president of Assistant, offered an example of not knowing an artist’s name and stumbling mid-sentence. Google Assistant understood that she wasn’t clear, encouraged her to keep speaking, and acknowledged the incorrect artist name.
For all of our content from I/O 2022, follow this link.
Image credit: Google