Google has announced Style Match, a feature that lets users point their smartphone’s camera at an outfit and receive suggestions for similar items to buy online.
The feature is part of Google Lens, the company’s camera-powered search engine, which was unveiled last year and is now fully integrated into the native camera of its smartphones.
The news, announced at the Google I/O 2018 conference in California on May 8, means any user can identify a specific item or receive suggestions for similar styles, not only in fashion but also in other categories including accessories and furniture.
Such functionality could give the tech giant the lead in facilitating discovery through mixed reality, particularly because, by being embedded in the phone’s native camera, it doesn’t require users to learn a new behavior or download a dedicated app they will eventually abandon. So far, brands such as eBay and ASOS have experimented with image recognition within their own apps, but the ability to trigger a search from a smartphone’s main image-capturing tool could drive mass adoption.