Google has announced Style Match, a feature that lets users point their smartphone camera at an outfit and receive suggestions on similar items to buy online.
The feature is part of Google Lens, the company’s camera-powered search engine, which was unveiled last year and is now fully integrated into the native camera of Google’s smartphones.
The news, announced at the Google I/O 2018 conference in California on May 8, means any user can identify a specific item or receive suggestions for similar styles, not only in fashion but in other categories including accessories and furniture.
Such functionality could give the tech giant the lead in facilitating discovery through mixed reality, particularly because, by being embedded in the phone’s native camera, it doesn’t require the user to learn a new behavior or download a dedicated app they will eventually ditch. So far, brands such as eBay and ASOS have tinkered with image recognition within their own apps, but the ability to trigger it from a smartphone’s main image-capturing tool is what could drive mass adoption.