Facebook Papers Reveal Progress on AI Shopping Assistant

In May, Facebook debuted Shops, which allows companies to set up digital stores across Facebook, WhatsApp, Messenger and Instagram, and also described its goal of developing an AI assistant to recommend products. The assistant would learn a user’s preferences by analyzing images of the clothing in their wardrobe and let them virtually try on apparel. Based on papers Facebook will present at the 2020 Conference on Computer Vision and Pattern Recognition (CVPR), the company appears to be deep into development of this assistant.

VentureBeat reports that, “one paper describes an algorithm that uncovers and quantifies fashion influences from images taken around the world … Another demonstrates an AI model that generates 3D models of people from single images … And a third proposes a system that captures clothing’s affinity with different body shapes.”

According to McKinsey, “Amazon, which recently deployed AI to handle incoming shopper inquiries, generates 35 percent of all sales from its product recommendation engine.” Amazon isn’t alone in using AI for e-commerce. Tools from ModiFace, Vue.ai, Edited, Syte and Adverity let users virtually try on lipstick, “see model images in every size, and spot trends and sales over time.”

Facebook AI researcher and University of Texas professor Kristen Grauman and co-author Ziad Al-Halah, also at the University of Texas, contend that, “unlike vendors’ purchase data, other non-visual metadata, or hype from haute couture designers, everyday photos of what people are wearing in their daily life provide an unfiltered glimpse of current clothing styles ‘on the ground’.”

They sourced “a vocabulary of visual styles from unlabeled, geolocated, and time-stamped images of people” drawn from GeoStyle, “a corpus of over 7.7 million images of people in 44 cities from Instagram and Flickr.” An AI model then “exploits the photographic relationships to anticipate future popular styles in any location.”

The researchers then “tapped attribute predictions to represent each photo with 46 attributes … learned 50 fashion styles based on these … [and then], for each style … inferred its popularity trajectories in individual cities over the course of weeks using the above-mentioned AI model.”
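To make that pipeline concrete, here is a minimal Python sketch of the same three steps: representing photos by attribute scores, clustering them into styles, and tracking each style’s per-city popularity week by week. It is not the researchers’ code; the attribute scores, city labels and week indices are placeholder data, K-means stands in for their style-discovery step, and a simple linear trend replaces their learned forecasting model.

```python
# Illustrative sketch only (not the authors' code) of the described pipeline:
# attribute vectors -> discovered styles -> per-city weekly popularity trends.
# Assumes each photo already has a 46-dimensional attribute score vector,
# plus a city label and a week index; all data below is randomly generated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

n_photos, n_attributes, n_styles = 10_000, 46, 50
attribute_scores = rng.random((n_photos, n_attributes))  # e.g. "floral", "striped", ...
city_ids = rng.integers(0, 44, size=n_photos)            # 44 cities, as in GeoStyle
week_ids = rng.integers(0, 52, size=n_photos)            # time-stamped photos

# 1) Discover "styles" as clusters of frequently co-occurring attributes.
kmeans = KMeans(n_clusters=n_styles, n_init=10, random_state=0)
style_of_photo = kmeans.fit_predict(attribute_scores)

# 2) Popularity trajectory: fraction of a city's photos in a style, per week.
def style_trajectory(style: int, city: int, n_weeks: int = 52) -> np.ndarray:
    traj = np.zeros(n_weeks)
    for week in range(n_weeks):
        in_city_week = (city_ids == city) & (week_ids == week)
        if in_city_week.sum():
            traj[week] = np.mean(style_of_photo[in_city_week] == style)
    return traj

# 3) Naive forecast of next week's popularity from a linear trend fit.
#    (The paper's forecaster is a learned model; this only stands in for the idea.)
def forecast_next_week(traj: np.ndarray) -> float:
    weeks = np.arange(len(traj))
    slope, intercept = np.polyfit(weeks, traj, deg=1)
    return float(slope * len(traj) + intercept)

traj = style_trajectory(style=0, city=3)
print("predicted popularity next week:", forecast_next_week(traj))
```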

Another Facebook paper details the Animatable Reconstruction of Clothed Humans (ARCH) system, “an AI technique for generating 3D models of clothed people, which could become the centerpiece of a future Facebook-powered fashion assistant.” ARCH is an “end-to-end framework for reconstructing ‘animation-ready’ 3D clothed humans from a single view,” and “would enable users to see how they look wearing apparel in various poses not only while standing, but when walking, sitting, and crouching in a range of environments and lighting.”
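ARCH’s full method is considerably more involved, but the single-view idea this family of systems builds on, predicting for any 3D query point whether it lies inside the person’s surface using image features sampled at that point’s 2D projection, can be sketched briefly. The network sizes, the orthographic projection and the dummy inputs below are illustrative assumptions, not the paper’s architecture.

```python
# Much-simplified PyTorch sketch of a pixel-aligned implicit occupancy function,
# the general single-view reconstruction idea underlying ARCH-like systems.
# Everything here (layer sizes, encoder, projection) is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PixelAlignedOccupancy(nn.Module):
    def __init__(self, feat_dim: int = 64):
        super().__init__()
        # Tiny convolutional encoder standing in for a real image backbone.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # MLP maps (pixel-aligned feature, point depth) -> occupancy probability.
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + 1, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, image: torch.Tensor, points: torch.Tensor) -> torch.Tensor:
        # image: (B, 3, H, W); points: (B, N, 3), x/y in [-1, 1] (orthographic view).
        feats = self.encoder(image)                              # (B, C, H', W')
        xy = points[..., :2].unsqueeze(2)                        # (B, N, 1, 2)
        sampled = F.grid_sample(feats, xy, align_corners=True)   # (B, C, N, 1)
        sampled = sampled.squeeze(-1).permute(0, 2, 1)           # (B, N, C)
        depth = points[..., 2:3]                                 # (B, N, 1)
        logits = self.mlp(torch.cat([sampled, depth], dim=-1))
        return torch.sigmoid(logits)                             # inside-surface probability

# Query random 3D points against a dummy image; a real system would evaluate a
# dense grid of points and extract a mesh (e.g. with marching cubes).
model = PixelAlignedOccupancy()
image = torch.rand(1, 3, 256, 256)
points = torch.rand(1, 1024, 3) * 2 - 1
print(model(image, points).shape)  # torch.Size([1, 1024, 1])
```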

“We envision a future in which [a] system could … incorporate your friends’ recommendations on museums, restaurants, or the best ceramics class in the city — enabling you to more easily shop for those types of experiences,” said Facebook. “Our long-term vision is to build an all-in-one AI lifestyle assistant that can accurately search and rank billions of products, while personalizing to individual tastes. That same system would make online shopping just as social as shopping with friends in real life.”
