Amazon’s Lens Live Brings Real-Time AI Shopping to Mobile

Amazon has introduced Lens Live, an AI-powered update to the Amazon Lens shopping tool. Like Google Lens and Pinterest Lens, it turns the smartphone camera into a visual search tool, but its integration with Amazon’s purpose-built Rufus shopping assistant ties it more closely to the shopping experience, pairing instant scanning and real-time product matches with insights from Rufus. Lens Live is already available to tens of millions of U.S. users on iOS in the Amazon Shopping app, with plans to roll out to all U.S. customers in the coming months.

Amazon noted in a March explainer that the number of customers using Amazon Lens for visual search has grown “by more than 50 percent in the last year.”

TechCrunch writes that Lens Live will not replace Amazon Lens, “which lets you take a picture, upload an image, or scan a barcode to discover products.” Rather, it integrates a real-time component.

After opening Amazon Lens and activating Lens Live, your camera will instantly begin scanning products and show top matching items in a swipeable carousel at the bottom of the screen, Amazon explains in a retail post.

Customers using the app now have the ability to “tap an item in the camera view to focus in on a specific product, add items directly to their cart by tapping the + icon, and save to their wish lists by tapping the heart icon, all without leaving the camera view,” Amazon adds.

Lens Live “builds atop the foundations of Amazon Lens,” writes Digital Trends, adding that the generative AI capabilities at the heart of the new tool make it better equipped to detect objects and match them “against a database of items currently on sale via Amazon.”

Lens Live can also identify a product you can’t name, shop the components of that cute outfit you saw on social media (or, if it can’t find the specific items, offer a reasoned approximation), and provide comparison pricing, sparing customers the trouble of having to “describe the product, color, or shape,” according to Amazon.

Lens Live “capitalizes on activities customers are already doing: comparison shopping while in retail stores out in the real world,” TechCrunch observes, adding that “the feature is powered by Amazon SageMaker,” which allows machine learning models to be deployed at scale, and utilizes the AWS-managed search service Amazon OpenSearch.
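As a rough illustration of how a camera-to-product pipeline built on those two services might fit together (a minimal sketch, not Amazon’s actual implementation), the Python example below assumes a hypothetical SageMaker endpoint that returns an image embedding for a camera frame and a hypothetical OpenSearch k-NN index of product vectors; the endpoint name, index name, and field names are placeholders.

```python
# Hypothetical sketch: embed a camera frame via a SageMaker endpoint,
# then find the nearest product matches with an OpenSearch k-NN query.
# Endpoint, host, index, and field names below are invented placeholders.
import json

import boto3
from opensearchpy import OpenSearch

sagemaker = boto3.client("sagemaker-runtime", region_name="us-east-1")
search = OpenSearch(
    hosts=[{"host": "search.example.com", "port": 443}],  # placeholder host
    use_ssl=True,
)


def embed_frame(jpeg_bytes: bytes) -> list[float]:
    """Send a camera frame to a (hypothetical) image-embedding endpoint."""
    response = sagemaker.invoke_endpoint(
        EndpointName="image-embedding-endpoint",  # placeholder endpoint name
        ContentType="application/x-image",
        Body=jpeg_bytes,
    )
    return json.loads(response["Body"].read())["embedding"]


def match_products(embedding: list[float], k: int = 10) -> list[dict]:
    """Run a k-NN query against a (hypothetical) product catalog index."""
    query = {
        "size": k,
        "query": {"knn": {"image_vector": {"vector": embedding, "k": k}}},
    }
    hits = search.search(index="products", body=query)["hits"]["hits"]
    return [hit["_source"] for hit in hits]


if __name__ == "__main__":
    with open("frame.jpg", "rb") as f:
        vector = embed_frame(f.read())
    for product in match_products(vector):
        print(product.get("title"), product.get("price"))
```

In a real-time feature like Lens Live, a loop of this kind would presumably run against a stream of frames, with the top results feeding the swipeable carousel described above; the sketch only shows the single-frame round trip.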
