Google is rolling out a new feature for its Shopping tab that allows users to see how a clothing item would look on their body type.
The feature uses a diffusion-based image-generation model to create realistic depictions of the clothing item on 40 different models ranging in size from XXS to 4XL. The feature is currently limited to women’s tops, but Google plans to expand it to more clothing brands and categories over time, as shared by Gizmodo.
When browsing for women’s tops, users will notice a new “try on models” option on the first image of eligible items. The virtual try-on feature is based on Google’s own Imagen model, which sends the image of the top and images of the different-sized models to two separate neural networks. The networks then generate a version of the clothing item that matches each model’s body shape and pose. The AI model also takes into account how clothes tend to drape, fold, stretch, or wrinkle to produce the final ‘try-on’ image.
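For the technically curious, here is a rough, hypothetical sketch of that two-network diffusion idea in Python. It is not Google’s code or API; the function names, the stand-in “networks,” and the toy denoising math are all assumptions meant only to illustrate the general shape of a conditioned diffusion loop, where noise is gradually refined into an image guided by both the garment and the model photo.

```python
# Minimal, hypothetical sketch of a two-network diffusion try-on loop.
# All names and the math are illustrative assumptions, not Google's model.
import numpy as np

rng = np.random.default_rng(0)

def garment_network(garment_image: np.ndarray) -> np.ndarray:
    """Stand-in for the network that encodes the clothing item (identity here)."""
    return garment_image

def person_network(person_image: np.ndarray) -> np.ndarray:
    """Stand-in for the network that encodes the model's body shape and pose."""
    return person_image

def denoise_step(noisy: np.ndarray, garment_feat: np.ndarray,
                 person_feat: np.ndarray, step_size: float) -> np.ndarray:
    """One toy 'denoising' step, conditioned on both encodings."""
    conditioning = 0.5 * (garment_feat + person_feat)   # pretend combined signal
    return noisy + step_size * (conditioning - noisy)   # nudge the noise toward it

def virtual_try_on(garment_image: np.ndarray, person_image: np.ndarray,
                   steps: int = 50) -> np.ndarray:
    """Iteratively refine random noise into a 'try-on' image (toy version)."""
    garment_feat = garment_network(garment_image)
    person_feat = person_network(person_image)
    image = rng.normal(size=garment_image.shape)        # start from pure noise
    for _ in range(steps):
        image = denoise_step(image, garment_feat, person_feat,
                             step_size=1.0 / steps)
    return image

# Usage with dummy 64x64 RGB arrays in place of real product and model photos.
top = rng.random((64, 64, 3))
model_photo = rng.random((64, 64, 3))
result = virtual_try_on(top, model_photo)
print(result.shape)  # (64, 64, 3)
```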
The virtual try-on feature aims to address one of the main challenges of online shopping: not knowing how a clothing item will fit or look on one’s own body. By providing a variety of models with different body types, Google hopes to make it easier for users to purchase clothing in sizes that suit them.
The feature is available to all users starting today. Some of the brands currently supported include H&M, Everlane, LOFT, and Anthropologie.
Image credit: Google