Busan City Hall Book Summary
Media Briefings

åǥÁö





  • [RH] When AI Meets Store Layout Design

    In a study just published in Artificial Intelligence Review, a research team proposes a new AI-powered store layout design framework for retailers. This research takes advantage of recent advances in the AI subfields of computer vision and deep learning to monitor the physical shopping behaviors of customers.

    Any shopper who has retrieved milk from the farthest corner of a store knows that an efficient layout presents merchandise so as to attract customers' attention to items they had not intended to buy, increase browsing time, and make it easier to find related or viable alternative products grouped together.

    A well-thought-out layout has been shown to correlate positively with increased sales and customer satisfaction. It is one of the most effective in-store marketing tactics, used to influence customer decisions directly and boost profitability.

    The research proposes a comprehensive, novel framework for applying new AI techniques on top of existing closed-circuit TV camera data to interpret and better understand customers and their in-store behavior.

    It's well known that video offers insights into how shoppers travel through the store, the route they take, and the sections where they spend more time. But this research drills down further, noting that people express emotion through observable facial expressions such as raising an eyebrow, widening the eyes, or smiling.

    Understanding customers' emotions as they browse could provide marketers and managers with a valuable tool for gauging customer reactions to the products they sell.

    Emotion recognition algorithms employ computer vision techniques to locate the face and identify key landmarks on it, such as the corners of the eyebrows, the tip of the nose, and the corners of the mouth. The geometry of these landmarks can then be classified into an emotional expression.
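
    As an illustration, a minimal sketch of the first step, locating the face, might look like the following, using OpenCV's bundled Haar cascade; the landmark and emotion-classification stages are only indicated in comments, since they rely on models (for example dlib's 68-point shape predictor) that are not shown here and are assumptions rather than the paper's own implementation.

        import cv2

        # OpenCV ships a pre-trained Haar cascade for frontal faces.
        face_cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
        )

        def locate_faces(frame):
            """Return bounding boxes (x, y, w, h) of faces in a BGR frame."""
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            return face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        # A landmark model (e.g. dlib's 68-point shape predictor, not shown) would then
        # locate the eyebrow corners, nose tip, and mouth corners inside each box, and
        # an emotion classifier would map that landmark geometry to an expression.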

    Understanding customer behaviors is the ultimate goal of business intelligence. Obvious actions like picking up a product, putting it into the cart, or returning it to the shelf have attracted great interest from smart retailers.

    Other behaviors, like staring at a product or reading its packaging, are a gold mine for marketers seeking to understand customers' interest in a product.

    Along with understanding emotions through facial cues and customer characterization, layout managers could employ heatmap analytics, human trajectory tracking and customer action recognition techniques to inform their decisions.
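
    A heatmap of where customers linger can be built by dividing the floor view into cells and counting tracked positions per cell; the short sketch below is illustrative only, assuming positions have already been produced by a person tracker, and the grid size is arbitrary.

        import numpy as np

        def update_heatmap(heatmap, positions, cell_size=50):
            """Add one frame's tracked customer positions (x, y in pixels) to the grid."""
            for x, y in positions:
                heatmap[int(y) // cell_size, int(x) // cell_size] += 1
            return heatmap

        # Hypothetical 30 x 40-cell view of the shop floor; hot cells mark dwell areas.
        heatmap = np.zeros((30, 40), dtype=np.int64)
        heatmap = update_heatmap(heatmap, [(120, 340), (910, 415)])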

    This type of knowledge can be assessed directly from the video and can be helpful for understanding customer behavior at the store level while avoiding the need to know individual identities.

    Based on this analysis, the team proposed a framework for retailers called Sense-Think-Act-Learn (or STAL). How is the framework applied?

    Firstly, 'Sense' means to collect raw data, such as video footage from a store's closed-circuit TV cameras, for processing and analysis. Store managers routinely do this with their own eyes; however, new approaches allow marketers to automate this aspect of sensing and to perform it across the entire store, following a customer or a customer population.
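
    A minimal sketch of this 'Sense' step, assuming OpenCV and an accessible camera stream or recorded clip (the source string below is a placeholder), would simply sample frames for later analysis:

        import cv2

        def sample_frames(source, every_n=25):
            """Yield every n-th frame from a CCTV stream URL or recorded video file."""
            cap = cv2.VideoCapture(source)  # e.g. "rtsp://camera-01/stream" (placeholder)
            i = 0
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                if i % every_n == 0:
                    yield frame
                i += 1
            cap.release()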

    Secondly, 'Think' means to process the data collected through advanced AI, data analytics, and deep machine learning techniques, in much the same way humans use their brains to process incoming data.
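
    As one illustration of the 'Think' step, a person detector can be run over the sampled frames; the sketch below uses OpenCV's classical HOG pedestrian detector as a stand-in for the deep-learning detectors the paper actually surveys.

        import cv2

        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

        def detect_people(frame):
            """Return bounding boxes (x, y, w, h) of people detected in a frame."""
            boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
            return boxes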

    Thirdly, 'Act' means to use the knowledge and insights from the 'Think' phase to improve and optimize the supermarket layout. Notably, the intelligent video analytics layer in the 'Think' phase plays a key role in interpreting the content of images and videos.

    Implemented fully, this process constitutes a continuous "Learning" cycle, which is where the fourth phase of the STAL framework applies. Over time, STAL helps store management learn how to optimize profitability.
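
    Putting the phases together, the cycle could be sketched as a loop over analysis periods; sample_frames and detect_people are the illustrative helpers above, and revise_layout is a hypothetical stand-in for the manager's 'Act' decisions.

        def stal_cycle(source, layout):
            crowding = []
            for frame in sample_frames(source):             # Sense: raw CCTV frames
                crowding.append(len(detect_people(frame)))  # Think: video analytics
            layout = revise_layout(layout, crowding)        # Act: adjust displays, placement, staffing (hypothetical helper)
            return layout                                   # Learn: the next cycle starts from the revised layout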

    An advantage of this framework is that it allows retailers to evaluate predictions about store design, such as traffic flow and customer behavior when shoppers enter the store, or the effectiveness of displays placed in different areas of the store.

    As the researchers observe, some retailers already routinely use AI-empowered algorithms to better serve customer interests and wants, and to provide personalized recommendations.

    This is particularly true for point-of-sale systems and customer loyalty programs. STAL is simply another example of using AI to provide better data-driven store layouts and designs, and to better understand customer behavior in physical spaces.

    The researchers say data could be filtered and cleaned to improve quality and privacy, and transformed into a structured form. Since privacy is a key concern for customers, data could be de-identified or anonymized, for example by examining customers at an aggregate level.
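
    One simple form of de-identification, assuming the face detector sketched earlier, is to blur detected face regions before footage is stored or analyzed:

        import cv2

        def blur_faces(frame, face_boxes):
            """Return a copy of the frame with the given face regions Gaussian-blurred."""
            out = frame.copy()
            for (x, y, w, h) in face_boxes:
                out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (51, 51), 0)
            return out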

    Since the in-store closed-circuit TV cameras produce an intense flow of data, a cloud-based system could be a suitable approach for processing and storing video data for supermarket layout analysis.
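
    As a rough sketch of that approach, de-identified clips could be pushed to object storage for batch processing in the cloud; the example below uses boto3, assumes AWS credentials are already configured, and uses a hypothetical bucket name.

        import boto3

        s3 = boto3.client("s3")

        def archive_clip(local_path, store_id, timestamp):
            """Upload a (de-identified) CCTV clip for later cloud-side analysis."""
            key = f"{store_id}/{timestamp}.mp4"
            s3.upload_file(local_path, "store-layout-footage", key)  # hypothetical bucket
            return key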

    The researchers observe that AI implemented using the STAL framework could help managers adjust key operating variables of "the retail mix" based on at least three critical factors (a simple, illustrative data-structure sketch follows the list):

    First, design variables such as space design, point-of-purchase displays, product placement, and placement of check-outs.

    Second, employee variables such as the number, training and placement of personnel.

    And third, customer variables such as crowding, visit duration, impulse purchases, use of furniture, queue formation, and receptivity to product displays.
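
    One way to picture these variables together is as a single snapshot record that the analytics layer fills in each reporting period; the fields below are illustrative and are not drawn from the paper.

        from dataclasses import dataclass, field

        @dataclass
        class RetailMixSnapshot:
            # Design variables
            display_positions: dict = field(default_factory=dict)  # e.g. {"end-cap-3": "seasonal promo"}
            checkout_count: int = 0
            # Employee variables
            staff_on_floor: int = 0
            # Customer variables (estimated from video analytics)
            avg_visit_minutes: float = 0.0
            peak_crowding: int = 0
            typical_queue_length: int = 0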

    ARTIFICIAL INTELLIGENCE REVIEW, February 10, 2022, "When AI meets store layout design: a review," by Kien Nguyen, Minh Le, et al. © 2022 Springer Nature Switzerland AG. Part of Springer Nature. All rights reserved.

    To view or purchase this article, please visit: