Back at Google I/O 2021, the search giant detailed ‘Multitask Unified Model,’ or MUM. The company billed it as an advancement in artificial intelligence that was much better at understanding language.
Google primarily talked about MUM as a way to enhance answers to search queries by better understanding difficult questions. Now, Google is bringing those enhancements to another search product: Lens.
One of the primary benefits of MUM is that it can understand information across a variety of formats, like text, images and video. By integrating MUM in Lens, Google says it’s opening a new way to search by letting users blend visuals and text-based queries to get better results.
The company shared a few examples of how this could work. One example was clothes shopping — if the customer saw a pattern they liked on a skirt but wanted the same design on socks instead, they could use Lens to search for that. Google says the feature will launch on Lens “in the coming months.”
New search experiences focus on expanding topics and visual results
MUM, our advanced AI model, is coming to #GoogleLens early next year. You’ll be able to snap a photo AND ask a question, which can be helpful in those moments you need to fix a broken part and have no idea what it is 🤷🔧 #SearchOn
— Google (@Google) September 29, 2021
Next up, the company detailed a redesigned search experience coming to Google Search. Three new components are coming as part of this redesign.
First is ‘Things to know,’ which will offer expanded search suggestions based on broad topics. For example, searching for something like ‘acrylic painting’ can surface other “deeper insights” about the topic, like ‘how to make acrylic paintings with household items.’ Google pitched it as a way for users to dive deeper into search topics.
Next is ‘Refine this search’ and ‘Broaden this search.’ Working as two sides of the same coin, Refine and Broaden are another way of helping users explore a topic. For example, if users look up a very broad topic, ‘Refine’ can suggest narrower searches to help them zoom in on the specifics. ‘Broaden’ works in the opposite direction — if someone searches a narrow query, it can suggest searches that zoom out to the bigger picture.
Further, Google says it will soon offer a more visual search results page. The new results page will pull various types of results together for users — for example, it can combine text, image and video results in one place. The new search results page won’t show up for everything, but users will start to see it when searching visual queries.
Things to know will launch in the coming months. So will Refine and Broaden, though both will be limited to English at launch. Visual results will launch first for English users in the U.S.
Finally, Google plans to introduce MUM to video with a new experience that identifies related topics in a video. Google says this works even if a topic isn’t explicitly mentioned in a video. The feature could be a way to help people dig deeper into video topics.
This will start rolling out on September 29th to English users in the U.S.