Consumers frequently face a difficulty when trying to trace a product similar to one they have seen somewhere and wish to purchase. It could be a product already in their possession at home that they want to replace or complement with a new one that looks alike, or it may be a product they have seen in another person's possession (e.g., on the street, in social media). They may take a photo of that product themselves, or they may refer to a photo found online as an exemplar. Instead of struggling to describe the target product accurately enough in words, tools now make it possible to search for similar-looking products as candidates based on a photographic image of the product sought after. A visual search, or search by image, can considerably narrow down the set of relevant alternatives for further inspection. In turn, it can increase the probability that the search will lead to a purchase of a compatible product, whether identical or closely similar to the one the shopper has been looking for.
Finding a particular product can be a challenging task, even coming close enough to it in kind (e.g., shape and cut, colours and other elements of design, materials). It is sometimes difficult to capture verbally the object one is looking for; an image of the target product can be much more telling, with little effort by the consumer. The onus falls on the search engine to interpret the image, specifically the focal object-of-interest in it, and figure out what the consumer-shopper may be after. That capability is made possible through methods and algorithms of Artificial Intelligence (Deep Learning, Computer Vision): the applied model essentially conducts a visual analysis of the image to extract its essential features and then finds images of product objects (and scenes) with the closest matching features.
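To make the feature-matching idea above more concrete, here is a minimal sketch in Python. It assumes a pretrained ResNet-50 from torchvision as the feature extractor and a tiny in-memory catalogue of product photos; the model choice, file names, and cosine-similarity ranking are illustrative assumptions, not the actual pipeline of any of the tools discussed in this post, which are far more elaborate.

```python
# A minimal sketch of visual similarity search: embed each image with a
# pretrained CNN, then rank catalogue images by cosine similarity to the query.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet preprocessing for the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Use the CNN up to its penultimate layer as a feature (embedding) extractor.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # drop the classification head
backbone.eval()

def embed(image_path: str) -> torch.Tensor:
    """Return a unit-length feature vector for one image."""
    image = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        features = backbone(preprocess(image).unsqueeze(0))
    return F.normalize(features, dim=1).squeeze(0)

# Hypothetical catalogue of product photos (file names are placeholders).
catalogue = ["sofa_01.jpg", "sofa_02.jpg", "armchair_07.jpg", "table_03.jpg"]
catalogue_vectors = torch.stack([embed(path) for path in catalogue])

# The shopper's query photo: rank catalogue items by similarity to it.
query_vector = embed("shopper_photo.jpg")
scores = catalogue_vectors @ query_vector            # cosine similarity of unit vectors
for idx in scores.argsort(descending=True)[:3].tolist():
    print(catalogue[idx], round(scores[idx].item(), 3))
```

In practice a retailer would pre-compute the catalogue embeddings once and store them in an index built for fast nearest-neighbour lookup, but the matching principle is the same.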
The search for a product that resembles the one we are interested in often involves browsing several pages in sections of a retailer's online 'catalogue', which takes time. If we start with a search in a general-purpose search engine like Google, we may have to visit the websites (or apps) of a number of retailers (mixed or e-tailers only). Since consumers tend to use short queries, they are likely to get a broader range of results to scan, screen and inspect. A human could be required to form a long verbal list of descriptive features to capture the focal product fully enough; a visual search tool, by contrast, lets an AI algorithm produce a list of essential visual features or attributes much more effectively, and faster. The utilisation of a visual search tool promises to free shoppers from constructing long and complex verbal queries in order to pin-point more accurately the kind and style of product they are seeking; to require less browsing time in search of the same or an adequately resembling and satisfying product alternative; and to increase the chances of finding what they have been looking for ('Commerce Obsessed: How AI Can Drive Results', Adobe's webinar on its AI suite Sensei, October 2021). Adobe suggests, in addition, that a retailer can apply the same logic and techniques of searching by image in Sensei to put forward personally customised recommendations of similar-looking products (e.g., clothing) based on the images of other products a shopper is browsing.
Examples of products are commonly given in the area of fashion, including garments, handbags, shoes, and more. However, one may be looking for products of various types. For instance, a consumer may be looking for furniture (e.g., a sofa, a dining table), a lamp, or an appliance (e.g., a refrigerator) of a certain style or design. One may even be eager to find a particular model or style of car (e.g., a sports car, a new crossover model, or rather a classic vintage car) of which he or she has snapped a photograph in the street. Yet, in order to ensure that the focal object-of-interest is correctly identified in the image frame, it is advisable that a sharp view of this product-object occupies the centre area of the photo, with as little 'noise' of competing details as possible in the background. If the search tool cannot determine conclusively what the object-of-interest is, it may give diverse options (e.g., for the garment and for the person wearing it); and if the frame is 'crowded' with more objects (e.g., a streetscape, a room, a display area in a store), the search may yield images of similar scenes rather than images resembling a particular object (a sketch of the underlying object-detection step follows below).
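The sketch below hints at why a clear, centred object helps. It uses a generic pretrained detector (torchvision's Faster R-CNN, chosen here only as an example, not as the method used by any particular search tool) to propose candidate objects and then crops the most confident one for matching; in a crowded frame many boxes compete, and the chosen crop may not be the product the shopper meant. File names are placeholders.

```python
# Detect the most prominent object in the photo and crop it, so that the
# similarity search operates on the focal product rather than the whole scene.
import torch
from torchvision import models, transforms
from PIL import Image

detector = models.detection.fasterrcnn_resnet50_fpn(
    weights=models.detection.FasterRCNN_ResNet50_FPN_Weights.DEFAULT)
detector.eval()

def crop_main_object(image_path: str, min_score: float = 0.5) -> Image.Image:
    """Crop the highest-scoring detected object; fall back to the full frame."""
    image = Image.open(image_path).convert("RGB")
    tensor = transforms.ToTensor()(image)        # detector expects tensors in [0, 1]
    with torch.no_grad():
        detections = detector([tensor])[0]
    if len(detections["scores"]) == 0 or detections["scores"][0] < min_score:
        return image                              # nothing confident: keep the whole photo
    x1, y1, x2, y2 = detections["boxes"][0].tolist()  # detections are sorted by score
    return image.crop((x1, y1, x2, y2))

# The crop would then be embedded and matched as in the earlier sketch.
focal_object = crop_main_object("street_snapshot.jpg")
```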
Introducing a visual search utility in an online store (website / mobile app) can help a retailer enhance the shopping experience for its customers, such as by enabling customers to easily find compatible products, facilitating the path to purchase, or allowing photos from social media platforms to be uploaded for search (Shopify [Blog: Marketing], Kiera Abbamonte, 5 February 2019). Search by image may help shoppers-customers even while visiting a physical store, using their own phone or a search utility in an in-store information 'kiosk' (e.g., finding alternative designs close to a product item they fancied in the store, possibly including alternatives available in any channel [physical/digital] of the retailer).
Yet, visual search is still not commonly integrated natively in the websites or mobile apps of retailers' online stores, and hence this utility is not available to shoppers within the website/app. Visual search tools may be utilised in other platforms, such as general-purpose search engines or social media networks, which may link to image results on other websites. For example, Google Search includes a visual search tool in Images (the Camera icon), and Microsoft's search engine Bing also includes a visual search tool (see Images). Google offers its dedicated app Lens for visual search on mobile phones. Other visual search applications are available in Pinterest (Lens), Amazon (StyleSnap), and Snapchat (Camera Search) (Productsup.com [Blog], Hayley Pearce, 26 January 2021). Users may be given options to upload a photo from their device (e.g., laptop PC, smartphone, tablet), link to a webpage with an image, or use their camera ad hoc to take a photo.
- It is noted that some retailers may have installed a visual search tool only in their mobile apps (e.g., Neiman Marcus, Target). Since websites can usually be accessed and used by a wider audience of consumers-shoppers, such an exclusion raises questions as to whether it is the outcome of technical, financial, or marketing considerations.
Instead of conducting a visual search within the website of a retailer, shoppers may try to trace products by image using the visual search utility of external online services. Retailers, likewise, are advised to make images of their products admissible to the external search engines, so that shoppers may be directed to relevant results found in their online store website (see the suggestions made by Pearce in Productsup on how to make product photo images better prepared to be retrieved in visual search). But there are other options open to retailers. A retailer may integrate and host an existing visual search tool of another platform in its website or app rather than develop its own visual search utility; it may acquire or hire one from a technology company to install on its website/app; or, if it maintains its online store in the 'cloud', it may hire a visual search utility from the technology company as service provider (e.g., Amazon's AWS, Adobe's Experience Cloud).
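For illustration, the integration pattern of hosting a third-party visual search service typically reduces to "upload the shopper's photo, receive ranked product matches". The endpoint URL, parameter names, and credential below are invented placeholders, not any provider's actual API; a real integration would follow the chosen vendor's documentation.

```python
# A hypothetical sketch of calling a hosted visual search service from a
# retailer's own back end. All names and parameters here are placeholders.
import requests

VISUAL_SEARCH_ENDPOINT = "https://api.example-visual-search.com/v1/search"  # placeholder URL
API_KEY = "YOUR_PROVIDER_KEY"                                               # placeholder credential

def search_by_image(image_path: str, catalogue_id: str = "main-store") -> dict:
    """Send a shopper's photo to the (hypothetical) hosted service and return its response."""
    with open(image_path, "rb") as image_file:
        response = requests.post(
            VISUAL_SEARCH_ENDPOINT,
            headers={"Authorization": f"Bearer {API_KEY}"},
            params={"catalogue": catalogue_id, "limit": 10},
            files={"image": image_file},
            timeout=10,
        )
    response.raise_for_status()
    return response.json()   # assumed to contain a ranked list of product matches

# matches = search_by_image("shopper_upload.jpg")
```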
Finally, ten retail websites (e.g., Target, Walmart, Neiman Marcus, Selfridges) were visited and inspected for the availability of a visual search tool. Some of these retailers (mixed physical & online) reportedly enable searching by image. Unfortunately, this option could not be found on their websites. They all had the familiar text query box for locating products but no feature for referring to an image. Additionally, an Amazon (UK) website was visited, but there again no sign of visual search could be found. Some product pages were browsed in the hope of finding the search feature in specific category sections, but to no avail. As noted earlier, the feature may have been added only in the mobile app (not accessible to the author of this post). The online store sites do provide, nevertheless, a list of attributes with a filtering tool that shoppers can apply to select preferred attribute levels (e.g., style, colour, brand) and thus 'drill through' the product assortment to narrow down the set of available products relevant to them. Often enough this technique might be sufficient and satisfactory for a shopper who is flexible and not too 'picky'. Still, it is hard to deny the advantage that visual search could offer in directing a shopper to the closest match to his or her target product.
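For contrast with visual search, here is a small sketch of the attribute-filtering ('drill through') technique the sites do offer: products are treated as plain records, and each attribute level the shopper ticks narrows the assortment further. The product records and attribute names are invented for illustration only.

```python
# Faceted filtering: keep only products matching every selected attribute level.
products = [
    {"name": "Oak dining table",    "style": "rustic", "colour": "natural",    "brand": "A"},
    {"name": "Glass dining table",  "style": "modern", "colour": "clear",      "brand": "B"},
    {"name": "Walnut dining table", "style": "modern", "colour": "dark brown", "brand": "A"},
]

def filter_products(items, **selected):
    """Return the items whose attributes match all selected levels."""
    return [p for p in items if all(p.get(k) == v for k, v in selected.items())]

# e.g., the shopper ticks style 'modern' and brand 'A'
shortlist = filter_products(products, style="modern", brand="A")
print([p["name"] for p in shortlist])   # ['Walnut dining table']
```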
Conducting a visual search has attractive benefits, and observing the resulting 'palette' of images can be fascinating and engaging. They are not always the images one would expect, which may depend on the composition and quality of the image submitted as the query. But one can use additional tools to continue browsing and checking relevant images (e.g., classification by tag words). This direction seems to have a bright future. It also seems, however, that it will take retailers more time to adopt the technology of visual search and make its tools available to the larger part of their customers-shoppers.