StyleSnap, which is available in the Amazon app, offers recommendations after a consumer uploads a photograph or screenshot.

Amazon.com Inc. on Wednesday rolled out StyleSnap, an artificial intelligence-powered visual search tool aimed at helping consumers use images to find apparel items. The ecommerce giant, No. 1 in the Internet Retailer 2019 Top 500, introduced the feature at its re:MARS conference, which focuses on machine learning, automation, robotics and space.

StyleSnap is available within the Amazon app. Consumers can use the tool by clicking a camera icon in the upper right-hand corner of the app, selecting the “StyleSnap” option, then uploading a photograph or screenshot of an outfit they like. StyleSnap then presents recommendations for similar items on Amazon that match the look in the photo. StyleSnap’s recommendations draw on a number of factors, including brand, price range and customer reviews.

The feature aims to solve a common problem: It can be difficult for consumers to describe particular items, writes Arun Krishnan, an Amazon communications executive, in a blog post announcing StyleSnap.

“Think about the last time you were inspired to try a new look—from a cute tie-dye top you saw on Instagram, to a celebrity sporting a couture dress in the latest issue of ‘Vogue,’” he writes. “Later, when trying to come up with the best words to describe the look, you discover that you are not a poet. You struggle to find the right words to explain the shape of a neckline, or the spacing of a polka dot pattern, and when you attempt your text-based search, the results are far from the trend you were after.”


While the tool is new to Amazon, a number of other apparel retailers have offered visual search tools for several years, including Amazon-owned Zappos, Asos and Nordstrom.

StyleSnap uses computer vision and deep learning to identify apparel items in a photo, regardless of setting. Deep learning, a category of machine-learning techniques that uses artificial neural networks loosely modeled on the human brain, helps the tool classify the apparel items in the image into categories such as “fit-and-flare dresses” or “flannel shirts.”
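For readers curious what that classification step might look like in code, the sketch below uses a pretrained convolutional network in PyTorch to sort a photo into a handful of apparel categories. It is not Amazon’s implementation; the category list, the choice of backbone and the function names are illustrative assumptions.

```python
# Minimal, hypothetical sketch of image-based apparel classification.
# This is NOT Amazon's StyleSnap code; categories and model are assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

CATEGORIES = ["fit-and-flare dress", "flannel shirt", "maxi skirt", "accordion skirt"]

# Standard preprocessing for the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A pretrained convolutional backbone with its final layer swapped out for
# our small set of apparel categories (the new head would still need to be
# trained on labeled apparel photos before its predictions mean anything).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CATEGORIES))
model.eval()

def classify(image_path: str) -> str:
    """Return the most likely apparel category for a photo or screenshot."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return CATEGORIES[int(logits.argmax(dim=1))]

# Example usage: print(classify("outfit_screenshot.jpg"))
```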

Amazon’s engineers developed the technology by “training” the network to recognize outfits, feeding it large numbers of labeled images, Krishnan writes. For example, after being shown thousands of images of maxi and accordion skirts, the network eventually learned to tell the two styles apart. But that took time and training, he writes: shown only a single Scottish kilt, for example, the network could become confused and predict the wrong class until it was given enough examples to learn otherwise.
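That training process can be sketched in the same hypothetical terms: a loop that repeatedly shows the network labeled example photos and nudges its weights toward the correct answers. The dataset layout, hyperparameters and epoch count below are assumptions for illustration, not details Amazon has disclosed.

```python
# Minimal, hypothetical sketch of the "training" step the article describes.
# Directory layout and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Labeled examples organized as skirts/maxi/*.jpg and skirts/accordion/*.jpg
# (a hypothetical local dataset).
train_data = datasets.ImageFolder(
    "skirts/",
    transform=transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ]),
)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Each pass over the labeled images adjusts the network's weights so its
# predictions better match the human-assigned labels; with too few examples
# of a class (e.g. a single kilt), it remains easy to fool.
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```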

While StyleSnap aims to help consumers find fashion items via visual search, the tool also lets fashion influencers who participate in the Amazon Influencer Program earn commissions when consumers search with the influencers’ images and then make purchases.
