AI fills in the blanks

Editor’s note: Adrian Sanger is global head of brand and shopper at DVJ Insights, U.K. 

With so many choices and options going into selecting a brand, researchers can use a little help to understand what consumers see and what they don’t.

In grocery, shoppers are bombarded with visual information every time they enter a store or shop online. Cognitive overload is a well-documented phenomenon where the brain's ability to process information is overwhelmed by excessive stimuli. The notion that a shopper can accurately recall all the details of what they saw on a busy shelf is more myth than reality.

By putting AI to work to predict visibility, we can learn new things about the pack, the shelf and the aisle. By analyzing vast amounts of data, AI can provide insight into what people see when they shop – and what they miss. This can be an enhancement that solves the recall question without the complexity and cost of a full eye-tracking study.
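
To make the idea more concrete, here is a minimal sketch of a visibility prediction using an open-source spectral-residual saliency filter. It is a far simpler stand-in for the kind of AI model discussed in this article, and the file names are hypothetical; it requires the opencv-contrib package.

    import cv2

    # Classical spectral-residual saliency as a rough stand-in for predicted
    # visibility. Not the proprietary model described in the article.
    image = cv2.imread("shelf_photo.jpg")  # hypothetical photo of a fixture
    if image is None:
        raise SystemExit("shelf_photo.jpg not found")

    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, saliency_map = saliency.computeSaliency(image)

    if ok:
        # Brighter regions are where the model predicts the eye will land first;
        # darker regions are the packs a shopper is most likely to miss.
        cv2.imwrite("shelf_saliency.png", (saliency_map * 255).astype("uint8"))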

Walking the aisle

Visibility capture using AI can provide critical insights into shopper behavior in supermarket aisles. This approach enables retailers to accurately track which products shoppers notice and interact with in real time. By gathering detailed visibility data, retailers gain a deeper understanding of how product placement, packaging and shelf layout influence consumer attention and decision-making. Traditional methods of assessing product visibility, such as post-visit surveys or static shelf audits, can fall short of capturing the dynamic interaction between customers and products on the shelves.

The findings are intuitive. High-traffic areas and eye-level shelves typically garner the most attention, while products on lower or upper shelves are often overlooked. Items with distinctive packaging or those positioned near popular products also attract more views. AI enables retailers to make informed adjustments that enhance the shopping experience and improve product visibility, ensuring that key items are seen and considered by customers. This capability is essential for optimizing store layouts and increasing the likelihood of product engagement and purchase.

Quantifying the impact of product placement

We’ve grown up knowing that some shelf positions are more visible than others. Eye-level placement, for example, is known to be prime real estate, but the delta between positions has been hard to quantify, largely due to the cost of running multicell consumer testing.

AI technology means that mass experimentation on shelf visibility is now possible and practical.

In the simplified example below, we took two coffee brands from the Swedish market, Arvid Nordquist and Lavazza. We then rearranged the shelf to position each brand across six zones – one set of layouts for Arvid Nordquist and one for Lavazza – and ran the analysis, predicting visibility 12 times and comparing the results:

  • The Lavazza pack has a more striking design and therefore enjoys greater head-to-head visibility than Arvid Nordquist regardless of zone.
  • Central positions give brands a big advantage.
  • The further from the center of the fixture, the weaker the visibility.
  • A brand’s visibility is also a function of the brands immediately to its left and right; if those neighbors are dominant, they reduce the visibility of others on the shelf.

This data-driven approach allows retailers to make informed decisions about product placement, optimizing shelf layouts to enhance visibility and drive sales.
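
For readers who want to see how such a comparison can be tabulated, here is a minimal analysis sketch in Python. It assumes the visibility model exports one row per prediction run with brand, zone and a visibility score; the file and column names are assumptions, not DVJ Insights outputs.

    import pandas as pd

    # Hypothetical export from the visibility model: one row per prediction run.
    runs = pd.read_csv("visibility_runs.csv")  # columns: brand, zone, predicted_visibility

    # Average predicted visibility per brand in each of the six zones.
    by_brand_zone = runs.pivot_table(index="zone", columns="brand",
                                     values="predicted_visibility", aggfunc="mean")
    print(by_brand_zone)

    # Head-to-head delta per zone: how much one pack gains over the other
    # when both sit in the same position (column names are assumptions).
    print(by_brand_zone["Lavazza"] - by_brand_zone["Arvid Nordquist"])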

Designing packaging for impact

The one thing a brand owner can consistently influence is the design of the pack. This becomes even more important given the retailer's ability to configure the planogram to their own requirements. Pack testing serves as a crucial step in understanding how consumers will react to new packaging designs before they hit the shelves.

AI can be used to enhance our understanding of pack design with additional metrics:

  • Heat map – a visual representation that uses colors to show the intensity of attention across different areas of a pack.
  • Fog map – a visual representation that highlights areas of confusion or difficulty in understanding.
  • Areas of interest – a method to measure the impact of different visual elements on a product’s packaging as a percentage share of attention (sketched in the example below).
  • Start attention/end attention – similar to a heat map, breaking the prediction down into the first two seconds and then seconds three and four.

AI can offer multiple ways of predicting on-pack attention.
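
As a sketch of how an areas-of-interest score can be derived, the example below assumes you already have a normalized attention map for a pack (a random placeholder here) and hypothetical bounding boxes for the logo, variant name and claim. Each area’s share of predicted attention is reported as a percentage.

    import numpy as np

    # Placeholder attention map standing in for a real model's output,
    # normalized so the whole pack sums to 1 (height x width in pixels).
    rng = np.random.default_rng(0)
    attention = rng.random((400, 300))
    attention /= attention.sum()

    # Hypothetical areas of interest: (top, bottom, left, right) in pixels.
    aois = {
        "logo":    (20, 120, 60, 240),
        "variant": (140, 200, 60, 240),
        "claim":   (220, 320, 40, 260),
    }

    # Share of total predicted attention falling inside each area of interest.
    for name, (t, b, l, r) in aois.items():
        share = attention[t:b, l:r].sum() * 100
        print(f"{name}: {share:.1f}% of predicted attention")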

Turning insights into action

The landscape of pack testing in the FMCG sector in 2024 is rapidly evolving. Using technology to predict visibility can open a new dialogue around winning at shelf. 

AI helps researchers because it predicts visibility – something shoppers find hard to articulate themselves. However, it does not stand alone, as it cannot provide all the answers needed to drive pack design decisions.

For example, researchers can also measure instant recognition. The T-scope method exposes individuals to a stimulus for no more than 80 milliseconds – too brief for conscious cognitive processing – to measure precisely how quickly and effectively visual stimuli are processed. This rapid exposure assesses recognition and instant appeal, comparing these responses with other brands and designs on the shelf.
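
A T-scope exposure can be scripted with standard experiment software. The sketch below uses PsychoPy and assumes a 60 Hz display, so the pack is shown for four frames (roughly 67 milliseconds, keeping the exposure under the 80-millisecond ceiling); the stimulus file and response keys are assumptions.

    from psychopy import visual, core, event

    win = visual.Window(fullscr=False, color="grey", units="pix")
    pack = visual.ImageStim(win, image="pack_design.png")  # hypothetical pack render

    EXPOSURE_FRAMES = 4  # ~67 ms at 60 Hz, under the 80 ms T-scope window

    # Brief exposure: draw the pack for a handful of frames, then blank the screen.
    for _ in range(EXPOSURE_FRAMES):
        pack.draw()
        win.flip()
    win.flip()  # clear back to the blank background

    # Collect the recognition response (e.g., "did you see brand X? y/n") and its latency.
    clock = core.Clock()
    response, latency = event.waitKeys(keyList=["y", "n"], timeStamped=clock)[0]
    print(f"response={response}, latency={latency:.3f}s")

    win.close()
    core.quit()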

Or they can measure speed of finding – how quickly a product is located compared to competitors. By presenting a product within a shelf context and timing how fast it is identified, you can gauge its cognitive visibility. This is particularly insightful when comparing not just brands but also sub-brands.
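
Speed of finding can be captured with a similarly simple setup: show the shelf, start a clock and stop it when the respondent clicks on the target pack. Again, this is a hedged PsychoPy sketch – the shelf image and target coordinates are hypothetical, not the author’s tooling.

    from psychopy import visual, core, event

    win = visual.Window(fullscr=False, color="grey", units="pix")
    shelf = visual.ImageStim(win, image="shelf_context.png")  # hypothetical shelf render
    mouse = event.Mouse(win=win)

    # Hypothetical bounding box of the target pack, in pixels: (x0, y0) to (x1, y1).
    (x0, y0), (x1, y1) = (-120, -40), (-40, 40)

    shelf.draw()
    win.flip()
    clock = core.Clock()

    # Wait until the respondent clicks inside the target pack's bounding box.
    found_time = None
    while found_time is None:
        if mouse.getPressed()[0]:
            x, y = mouse.getPos()
            if x0 <= x <= x1 and y0 <= y <= y1:
                found_time = clock.getTime()
        core.wait(0.005)

    print(f"speed of finding: {found_time:.3f}s")
    win.close()
    core.quit()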

It's good to know that AI is now part of today’s pack testing process, adding another valuable metric that enhances our understanding and helps optimize packaging designs.