In today’s world of fashion and technology, there is a dizzying variety of choices, making it easier than ever for customers to fall into the “I don’t know what I want” category when shopping. As a result, it’s more important than ever to create innovative fashion shopping experiences that enable better search and discovery.
Many factors go into fashion decisions, from globally trending styles to unique individual preferences, spanning both objective factors like color and size and subjective criteria like fit and feel. “Given the large number of factors that go into a shopper’s purchase decision, fashion can be an incredibly challenging problem for the machine learning professional,” says Soo-Min Pantel, Senior Manager of Applied Science at Amazon.
One approach to surfacing relevant fashion choices has been the “people who bought ‘A’ also purchased ‘B’” method. While this method is effective, it doesn’t account for certain scenarios that matter in the world of fashion. For example, a person who purchased Nike yoga pants might also buy Nike running sneakers. However, another shopper might prefer adidas or New Balance, or an entirely different style, such as slip-ons or tennis shoes.
Soo-Min’s Discovery Innovation team uses machine learning algorithms and a broader set of criteria to surface personalized product recommendations on product pages and on the main Amazon page. Her work involves mapping each product on Amazon to a broader set of dimensions such as brand, category and style. Those data points, along with other factors, allow the system to look beyond the more obvious connections and surface relevant products that match the shopper’s preferences.
Amazon Fashion also has a growing team dedicated to driving innovative fashion shopping experiences through machine learning, giving customers confidence in how things fit, look and feel, from Prime Wardrobe’s try-before-you-buy experience to enhanced search and more. In fact, this year the Amazon Fashion Machine Learning team has developed models that detect visual attributes of products that might not be contained in the catalog data itself, such as a floral pattern or the neckline of a dress, and has published over 70 million of these attributes back to the catalog, making it easier for customers to find these products with common search terms.
“We’re focused on the intersection of technology and fashion, to not only improve and enhance a personalized experience and discoverability, but also to create fun experiences for customers shopping for Fashion on Amazon,” said Tony Bacos, Director of Technology and Innovation, Amazon Fashion. “Technology will continue to change how customers shop for what they want to wear, and what they want to be inspired by. We’re excited to bring them more intuitive ways to shop.”
Deriving product features from an image enables greater discoverability of relevant items in search. Imagine a short-sleeved black dress that’s just right for you, but the seller hasn’t marked the item as “short sleeve.” If the machine learning algorithms relied solely on the description on the product detail page, this dress might not show up in your search results on Amazon. This is where Gabriel Blanco’s Fashion Science team comes in. The team is enabling the discovery of fashion-related items through the product images in our catalog. The system developed by the Fashion Science team can scan millions of images and automatically tag them with the relevant product attributes. The team accomplishes this using convolutional neural networks running on scalable AWS infrastructure.
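The scan-and-tag pass described above can be sketched in a few lines. Everything here is hypothetical: `detect_attributes` stands in for the team’s trained convolutional network, and the attribute names and confidence threshold are invented for illustration.

```python
# Hypothetical sketch of a batch attribute-tagging pass over catalog images.
# `detect_attributes` is a stand-in for a trained convolutional network;
# attribute names and scores below are invented for illustration.

def detect_attributes(image_pixels):
    """Placeholder for a CNN: returns (attribute, confidence) pairs."""
    # A real model would compute these scores from the pixel data.
    return [("short sleeve", 0.94), ("floral", 0.12)]

def tag_image(image_pixels, threshold=0.5):
    """Keep only the attributes the model is reasonably confident about."""
    return [name for name, score in detect_attributes(image_pixels)
            if score >= threshold]

print(tag_image([[0, 255], [255, 0]]))  # -> ['short sleeve']
```

In a real pipeline the tagging function would run over millions of catalog images in parallel, and the surviving tags would be written back to the catalog as searchable attributes.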
Neural networks are loosely based on the structure of the human brain, which is made up of neurons connected to one another. While the workings of neurons still aren’t entirely understood, this much is clear: neurons sum their incoming signals, compare the sum to a threshold, and fire when that threshold is exceeded. Mimicking these mechanics allows computer algorithms to do things like detect features of a piece of clothing or a shoe without being explicitly programmed. For example, in detecting a sandal attribute, characteristics like “ankle wrap” will be activated (and will fire a row of neurons), while “hiking” would not (leaving the neurons coldly indifferent).
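The sum-compare-fire behavior described above can be sketched as a single artificial neuron. The input features, weights and threshold below are made up purely for illustration; a real network would learn the weights from training data.

```python
# A minimal artificial neuron: a weighted sum of inputs compared to a
# threshold. All numbers here are illustrative, not learned values.

def neuron_fires(inputs, weights, threshold):
    """Return True when the weighted sum of inputs exceeds the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total > threshold

# Imagine three signals extracted from a shoe image: strap, open toe, low heel.
sandal_signals = [1.0, 1.0, 0.3]   # strong "ankle wrap"-style evidence
hiking_signals = [0.1, 0.0, 0.0]   # weak evidence for the same features

weights = [0.6, 0.5, 0.2]          # how much each signal matters for "sandal"
threshold = 0.5

print(neuron_fires(sandal_signals, weights, threshold))  # True: the neuron fires
print(neuron_fires(hiking_signals, weights, threshold))  # False: it stays silent
```

A network stacks many layers of such neurons, so that later layers respond to combinations of the simple patterns detected by earlier ones.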
“An image is nothing more than a mechanized array of pixels,” says Rui Luo, Senior Scientist on the Fashion Science team. “So we can represent every feature, be it the type of neck, a pattern on a dress, or the type of sleeve, as a sequence of numbers.” Once the team is confident that the network is extracting product features accurately on the training set, they set it loose on Amazon’s entire fashion catalog.
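The idea that an image is just an array of numbers can be made concrete with a toy example. The 4x4 grid of pixel intensities below is invented for illustration; real product images are far larger and have color channels.

```python
# A tiny grayscale "image" as a 2D array of pixel intensities (0-255).
# The values are invented for illustration.
image = [
    [  0, 255, 255,   0],
    [255,   0,   0, 255],
    [255,   0,   0, 255],
    [  0, 255, 255,   0],
]

# Flattening the grid yields a single sequence of numbers -- the kind of
# numeric representation a neural network actually consumes.
feature_vector = [pixel for row in image for pixel in row]

print(len(feature_vector))  # 16 numbers stand in for the whole image
```

A learned feature like “floral pattern” or “crew neck” is, in the same spirit, just another sequence of numbers computed from arrays like this one.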
“There’s a number of people both in academia and in the industry who are working on solving some of the toughest problems in the world of fashion,” says Soo-Min. “At the end of the day, we want customers to be able to browse products that feel just right and make them feel good.”
Perhaps we all don’t have to know fashion — with a little help from our AI friends, we can be fashionable all the same.