— CASE STUDY: PET FOOD PACKAGING

Which Pet Food Package Design Drives Purchase Intent? A Neuromarketing Study.

North AI's cognitive analysis platform tested 11 package designs for European pet food brand Myasna Miska against each other and against key competitors. EEG, eye-tracking, and simulated shelf testing revealed which designs hold attention, which drive purchase intent — and which actively underperform.

Category: Neuromarketing · Package Design · Retail

16 Respondents  ·  11 Package Designs  ·  6 EEG Metrics  ·  2 Test Conditions  ·  3 Research Methods

Background

Myasna Miska is a popular European wet pet food brand undergoing a packaging rebrand. The brand commissioned a neuromarketing study to determine which of 11 proposed package design variants would be most effective — not just visually appealing, but cognitively engaging and likely to drive purchase behaviour.

The study was conducted using a combination of EEG measurement, eye-tracking, and simulated virtual shelf testing. Designs were evaluated in isolation and against real competitor products including Whiskas, Felix, Pan Kot, Friskies, and Club 4 Paws.

The central question: which design best captures attention, generates positive emotional response, and motivates purchase — as measured by brain activity and involuntary physical response, not self-reported preference?


Fig. 1 — Myasna Miska brand packaging. The study tested 11 design variants of the wet cat food pouch format alongside competitor designs on shelf.

Methodology

The study used three complementary measurement methods, each capturing a different layer of consumer response:

  • EEG (Electroencephalography). Brain activity was measured across six metrics while participants viewed each package design. Metrics covered attention level, cognitive processing, emotional tone, activation, engagement, and ease of information processing. EEG captures involuntary neural responses that self-reported surveys cannot access.

  • Eye-tracking. Gaze data was recorded to identify which elements of each package participants noticed, how quickly they were noticed, and how long attention dwelled on each element. Heatmaps were generated for each design.

  • Virtual shelf simulation. Designs were tested in a simulated retail shelf context at three blur levels (80%, 50%, and 30% blur) to simulate peripheral and in-aisle vision. This tests which designs are recognisable and stand out from competitors before a shopper consciously focuses on them.

Sample: 16 respondents. Two viewing conditions were tested: quick familiarisation (brief exposure) and extended viewing. Competitor designs included Whiskas, Felix, Pan Kot, Friskies, and Club 4 Paws.

Designs tested: 11 Myasna Miska variants (labelled 1–11), ranging from teal-background designs with food imagery, to dark backgrounds with stylised cat faces, to photorealistic cat portraits in various colour treatments.

Result 1: Logo Noticeability

The logo was noticed in all six design variants tested in the eye-tracking task, by at least 11 of 16 respondents per design. However, only 6 of 16 respondents recalled the brand name "Myasna Miska" after viewing — indicating that visual noticeability and brand name recall are separate challenges.


Fig. 2 — Logo noticeability across 6 design variants. Design 1 was noticed by all 16 respondents and had the fastest time to first fixation (0.62 s). Design 2's logo held gaze longer than Design 1's once noticed (0.48 s average dwell vs 0.37 s), suggesting stronger visual appeal of the logo element.

Logo noticeability data across all six designs:

METRIC                        D1      D2      D3      D4      D5      D6
Respondents who noticed       16      14      13      13      12      11
Time to first fixation (s)    0.62    0.84    0.90    0.68    0.85    0.84
Average dwell time (s)        0.37    0.48    0.66    0.64    0.68    0.51

Key finding: Design 1 and Design 2 performed best on logo noticeability. Design 1 was noticed fastest (0.62 s to first fixation) by all 16 respondents. Design 2, while slightly slower to attract a first fixation, held attention on the logo longer than Design 1 once noticed (0.48 s vs 0.37 s average dwell).
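The two eye-tracking metrics in the table — time to first fixation and dwell time — can be derived from raw gaze samples and an area of interest (AOI) drawn around the logo. A minimal sketch of that derivation; the data layout, AOI rectangle, and function name are illustrative assumptions, not the study's actual tooling:

```python
def aoi_metrics(samples, aoi):
    """Compute time to first fixation and total dwell time for one AOI.

    samples: list of (t_seconds, x, y) gaze points, sorted by time.
    aoi:     (x_min, y_min, x_max, y_max) rectangle around the logo.
    """
    x0, y0, x1, y1 = aoi
    first_hit = None
    dwell = 0.0
    for i, (t, x, y) in enumerate(samples):
        if x0 <= x <= x1 and y0 <= y <= y1:
            if first_hit is None:
                # time to first fixation, relative to stimulus onset
                first_hit = t - samples[0][0]
            # credit this sample's duration to total dwell time
            if i + 1 < len(samples):
                dwell += samples[i + 1][0] - t
    return first_hit, round(dwell, 3)

# Example: a synthetic 100 Hz gaze stream sweeping across the pack,
# with a hypothetical logo AOI in the upper area.
stream = [(0.01 * i, 50 + i, 40) for i in range(100)]
tff, dwell = aoi_metrics(stream, aoi=(80, 0, 120, 100))
```

Real eye-tracking pipelines additionally classify samples into fixations and saccades before attributing dwell time; this sketch credits every in-AOI sample directly.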

Result 2: Elements Missed on Quick Viewing

Eye-tracking heatmaps revealed which packaging elements respondents did not notice during short exposure. This matters because on-shelf viewing time is typically brief — shoppers make split-second decisions based on peripheral visual cues.


Fig. 3 — Gaze heatmaps showing areas of low attention during quick familiarisation. Cold (blue) areas indicate elements missed by most respondents. Weight indicators, bottom-of-pack text, and secondary design elements were consistently overlooked across multiple design variants.

Across designs, the following elements received minimal attention during quick exposure:

  • Weight and quantity indicators (positioned at bottom of pack)

  • Secondary flavour descriptors

  • Small print and regulatory information

  • Brand sub-elements on certain cat-face designs where the face dominated gaze

"Elements that are invisible during quick shelf scanning provide no value. If a design feature is not noticed, it cannot influence purchase decisions."

Result 3: EEG Results — Brand Designs

EEG data was captured across six neurological and behavioural metrics for all 11 brand package designs. The table below shows scores for each design; higher values indicate stronger performance on each metric. Designs marked with a tick (✓) were identified as those most likely to prompt purchase based on EEG signals.


Fig. 4 — EEG results heatmap for all 11 Myasna Miska design variants. Green tick = designs that EEG data identifies as most likely to prompt purchase. Red cross = designs with weakest consumer response. Yellow cells indicate highest scores per metric.

Numerical EEG scores across all 11 designs (D1–D11):

METRIC                 D1      D2      D3      D4      D5      D6      D7      D8      D9      D10     D11
Attention Level        189.6   123.4   144.3   133.8   139.2   165.6   170.4   186.2   153.3   128.8   161.7
Cognitive Processes    71.6    60.3    69.2    57.2    62.7    63.5    67.2    73.3    69.6    70.0    66.6
Emotional Tone         0.3     0.3     0.4     0.4     0.4     0.5     0.3     0.4     0.4     0.4     0.3
Activation             0.4     0.3     0.4     0.3     0.4     0.4     0.4     0.4     0.4     0.4     0.4
Engagement             1.1     1.0     1.1     0.9     1.1     1.2     1.2     1.0     1.1     1.0     1.4
Info Processing        2.6     1.9     2.3     1.8     1.8     2.1     1.9     1.8     2.5     2.1     2.2

Key findings: Designs 1, 7, 8, and 11 showed the strongest EEG profiles overall. Design 1 led on attention level (189.6) and ease of information processing (2.6), with Design 8 scoring highest on cognitive processing (73.3). Design 11 had the highest engagement score (1.4). Design 4 showed the weakest performance on multiple metrics, including the lowest cognitive processing (57.2) and engagement (0.9) scores.
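The study does not state how the six metrics were combined into the overall purchase-likelihood ticks. One common, purely illustrative approach is to min-max normalise each metric across designs before averaging, so that metrics on very different scales (attention around 120–190 vs emotional tone around 0.3–0.5) contribute equally. The equal weighting here is an assumption, not the study's method:

```python
def composite_scores(table):
    """Min-max normalise each metric across designs, then average per design.

    table: {metric_name: [score_per_design]} with equal-length lists.
    Returns one 0-1 composite score per design (higher = stronger profile).
    """
    n = len(next(iter(table.values())))
    totals = [0.0] * n
    for scores in table.values():
        lo, hi = min(scores), max(scores)
        span = hi - lo or 1.0          # guard against a constant metric
        for i, s in enumerate(scores):
            totals[i] += (s - lo) / span
    return [round(t / len(table), 3) for t in totals]

# Subset of the study's EEG table (D1, D4, D11) for illustration
eeg = {
    "attention":  [189.6, 133.8, 161.7],
    "cognitive":  [71.6, 57.2, 66.6],
    "engagement": [1.1, 0.9, 1.4],
}
scores = composite_scores(eeg)   # D1 and D11 both outscore D4
```

On this subset, Design 4 sits at the minimum of every metric, so its composite is 0 — consistent with its weak profile in the full table.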

Result 4: EEG Results — Brand vs Competitors

The same EEG metrics were applied when respondents viewed Myasna Miska designs alongside real competitor products in a simulated category shelf view. This tests not just individual design quality, but competitive cut-through — how the brand performs in the context where purchase decisions actually happen.



Fig. 5 — EEG results including competitor designs (labelled A–I). Green ticks indicate designs that performed best; red crosses indicate worst. "Pan Kot" (labelled C) showed the strongest competitor performance across attention, cognitive processing, activation, and ease of information processing.

Key findings from competitive context: 

  • Design 7 showed high scores on activation, emotional tone, and attention level — the strongest Myasna Miska performer in competitive context.

  • Design 9 scored highest on engagement, ease of information processing, and activation among the brand's own designs.

  • Design 11 had high scores on attention level, cognitive processes, and engagement.

  • Pan Kot was the strongest competitor, outperforming most Myasna Miska designs on attention level, cognitive processing, activation, and ease of information processing. This represents the primary competitive benchmark the brand needs to beat.

Result 5: Gaze Heatmaps — Where Attention Lands

Eye-tracking heatmaps show exactly where respondents' gaze concentrated on each design. Hot spots (red/orange) indicate elements that captured the most attention. Cold areas (blue/purple) indicate elements that were missed.


Fig. 6 — Gaze heatmaps across multiple design variants. Designs with food imagery (teal background, D1/D3) consistently drew attention to the food bowl element. Cat-face designs concentrated gaze on the brand name and upper-face area, with lower pack elements (weight, secondary text) consistently missed.

Across all designs, three patterns were consistent:

  • Food imagery designs (teal background): Gaze concentrated heavily on the food bowl and hand — the food itself was the primary attention anchor. The logo received secondary attention. Bottom-of-pack elements were largely missed.

  • Cat-face designs: The brand name positioned across the cat's chest consistently attracted strong gaze. Eyes of the cat drew attention but did not anchor gaze in the same way as food imagery.

  • Weight and secondary information: Consistently in cold zones across all designs. These elements were not being processed by respondents during normal viewing duration.
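Heatmaps like these are typically built by accumulating a Gaussian "bump" on a pixel grid at each fixation point, weighted by fixation duration, then colour-mapping the grid. A minimal pure-Python sketch; the grid size, spread parameter, and fixation coordinates are arbitrary choices for illustration, not the study's parameters:

```python
import math

def gaze_heatmap(fixations, width, height, sigma=3.0):
    """Accumulate duration-weighted Gaussian kernels into a heat grid.

    fixations: list of (x, y, duration_s) fixation records.
    Returns a height x width grid; hotter cells saw more/longer fixations.
    """
    grid = [[0.0] * width for _ in range(height)]
    for fx, fy, dur in fixations:
        for y in range(height):
            for x in range(width):
                d2 = (x - fx) ** 2 + (y - fy) ** 2
                grid[y][x] += dur * math.exp(-d2 / (2 * sigma ** 2))
    return grid

# Two dwells near a hypothetical "logo" area, one brief glance lower down
fixes = [(10, 5, 0.48), (10, 6, 0.37), (12, 25, 0.10)]
heat = gaze_heatmap(fixes, width=24, height=32)
hottest = max((v, (x, y)) for y, row in enumerate(heat) for x, v in enumerate(row))
```

The hottest cell lands where the two long dwells overlap — exactly the "hot spot" behaviour described above — while the brief glance leaves only a faint trace.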

Result 6: Virtual Shelf Testing

Virtual shelf testing simulates how packages appear in peripheral vision and at varying distances — conditions that reflect how shoppers actually encounter products on a retail shelf before making a conscious decision to stop and look.

Designs were tested at three blur levels: 80% blur (simulating distant peripheral view), 50% blur (approaching the shelf), and 30% blur (near focus).
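The study's blur implementation is not specified, but the underlying effect is easy to demonstrate: blurring averages neighbouring pixels, which wipes out fine detail while a wide block of one dominant colour keeps its contrast against the shelf. A 1-D box-blur sketch under those assumptions:

```python
def box_blur(row, radius):
    """1-D box blur: each pixel becomes the mean of its (2*radius+1) window."""
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

# A toy "shelf strip" of pixel intensities on a mid-grey background (0.5):
# one wide solid block (a dominant-colour pack face) and one run of fine
# alternating detail (small busy design elements).
background = [0.5] * 10
solid = [1.0] * 10
busy = [1.0, 0.0] * 5
strip = background + solid + background + busy + background

def peak_contrast(row, start, end, base=0.5):
    return max(abs(v - base) for v in row[start:end])

blurred = box_blur(strip, radius=4)       # heavy blur ~ distant peripheral view
solid_c = peak_contrast(blurred, 10, 20)  # dominant-colour block
busy_c = peak_contrast(blurred, 30, 40)   # fine alternating detail
```

After heavy blur the solid block still stands out at full contrast, while the alternating detail averages toward the background — the same mechanism that let the teal designs survive the 80% blur condition.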


Fig. 7 — Virtual shelf test results showing noticeability and selection rates at 80%, 50%, and 30% blur for four leading designs. Design 3 (teal background) was noticed by all 16 respondents at 80% blur and had the highest selection rate. Design 5 was most difficult to identify at all blur levels.

Shelf visibility results at 80% blur (most peripheral, most challenging):

  • Design 3: Noticed by 16/16. Selected by 11/16. Time to first fixation: 2 seconds (fastest, tied with Design 1). Teal/turquoise background provided strongest standout at distance.

  • Design 1: Noticed by 16/16. Selected by 8/16. Time to first fixation: 2 seconds. Teal background performed similarly to Design 3 on noticeability.

  • Design 2: Noticed by 15/16. Selected by 8/16. Dark background performed well on noticeability but lower on selection.

  • Design 5: Noticed by 13/16. Selected by 4/16. Consistently the weakest performer — cited as lacking a dominant colour, making it difficult to identify at any blur level.

"Teal/turquoise background designs (D3 and D1) consistently outperformed on the virtual shelf. The dominant colour creates the visual anchor that drives pre-attentive recognition."

Overall Package Design Ranking

Combining results across all three methods — EEG, eye-tracking, and virtual shelf — produces the following overall ranking:


Fig. 8 — Overall label ranking combining EEG, eye-tracking, and shelf performance. Designs 2 and 4 rank highest. Designs 6 and 5 rank lowest.

  • Rank 1 — Designs 2 & 4: Highest overall scores. Both have clear logo visibility and strong flavour communication. Design 2 has moderate noticeability but strong subjective scores. Design 4 had strong EEG response — respondents noted higher appeal of "food in bowl" visual.

  • Rank 2 — Designs 1 & 3: Strong on attention and eye-tracking metrics. Both are notable on the virtual shelf due to teal background. However, EEG data indicates these designs do not effectively prompt purchase intent despite holding attention. Design 3 drew disproportionate focus to the food visual, which EEG data suggests may cause aversion in some respondents.

  • Rank 3 — Designs 6 & 5: Lowest performers. Design 5 ranked worst across all three methods — lowest EEG scores, poorest virtual shelf performance, and lowest subjective ratings from respondents. Design 6 had low contrast, causing respondents to miss secondary elements even during extended viewing.
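The exact weighting behind the overall ranking is not disclosed. A common illustrative approach is average-rank (Borda-style) aggregation across the three methods; the per-method ranks below are hypothetical placeholders chosen to mirror the reported outcome, not the study's data:

```python
def aggregate_ranks(method_ranks):
    """Average each design's rank across methods (lower = better overall).

    method_ranks: {method: {design: rank}} with 1 = best within that method.
    Returns designs sorted best-first by mean rank.
    """
    designs = next(iter(method_ranks.values())).keys()
    mean_rank = {
        d: sum(r[d] for r in method_ranks.values()) / len(method_ranks)
        for d in designs
    }
    return sorted(mean_rank, key=mean_rank.get)

# Hypothetical per-method ranks for four of the designs (1 = best per method)
ranks = {
    "eeg":       {"D2": 1, "D4": 2, "D1": 3, "D5": 4},
    "eye_track": {"D2": 2, "D4": 1, "D1": 3, "D5": 4},
    "shelf":     {"D1": 1, "D2": 2, "D4": 3, "D5": 4},
}
order = aggregate_ranks(ranks)   # D2 first, D5 last
```

Rank aggregation like this explains how a design can win on one method (Design 1 on the shelf) yet rank behind designs with more consistent mid-to-high performance across all three.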

Implications for Package Design Decisions

This study demonstrates that consumer response to packaging cannot be reliably predicted from subjective preference data alone. Several findings directly contradict what respondents might say if simply asked which design they prefer:

  • Design 1 had the highest subjective ratings from respondents — but EEG data showed it does not prompt purchase intent as effectively as Designs 2 and 4.

  • Design 3 drew significant attention to the food visual — but that attention appears to suppress purchase motivation rather than enhance it.

  • Design 5 was rated "least attractive" by respondents and also performed worst on every objective measure. In this case, subjective and objective data aligned.

For brand teams making packaging decisions, the practical implications are:

  • Use dominant colour strategically. Teal/turquoise background consistently outperformed on shelf visibility. Designs without a dominant colour anchor struggle to register in peripheral vision.

  • Food imagery requires care. Showing the product creates strong visual anchors and drives noticeability — but the framing matters. EEG data suggests certain food presentations may trigger mild aversion.

  • Logo placement and size are critical. Only 6 of 16 respondents recalled the brand name after viewing. Logo noticeability alone is insufficient — brand name salience requires deliberate structural emphasis.

  • Competitor benchmarking matters. Pan Kot currently outperforms Myasna Miska on several EEG metrics in category context. Designs 7, 9, and 11 come closest to matching or exceeding this benchmark.

Conclusion

Across EEG, eye-tracking, and virtual shelf testing, Designs 2 and 4 emerged as the strongest overall performers for the Myasna Miska rebrand. They combined logo noticeability, positive EEG response, and competitive shelf performance.

Design 5 was the weakest performer on every measure. Design 1, despite having the highest self-reported preference, did not translate that subjective appeal into purchase-driving brain activity.

The study reinforces a consistent finding in neuromarketing research: what consumers say they prefer and what their brain activity indicates will drive behaviour are often different. Testing packaging with cognitive measurement — rather than surveys alone — produces more reliable guidance for design decisions that affect revenue.

KEY FINDINGS FROM THIS STUDY

→ Designs 2 & 4 ranked highest overall

→ Design 1 had the highest logo noticeability

→ Design 5 ranked worst across all methods

→ Designs 3 & 1 strong on attention, weak on purchase intent

→ Turquoise background (D3 & D1) best on virtual shelf

→ Competitors: "Pan Kot" top rival by EEG metrics

→ 6 EEG metrics measured per design

→ Design 5: lowest recognition, lowest EEG scores

Frequently Asked Questions

What is neuromarketing package testing?

Neuromarketing package testing measures involuntary brain and physiological responses to packaging designs using tools such as EEG, eye-tracking, and biometrics. Unlike surveys, these methods capture responses that consumers are unaware of and cannot consciously report. This makes them more reliable for predicting actual purchase behaviour than preference questionnaires.

What does EEG measure in a packaging study?

EEG (electroencephalography) measures electrical activity in the brain as respondents view packaging. In a packaging study, it typically captures attention level, cognitive processing load, emotional tone, activation (approach vs avoidance motivation), engagement, and ease of information processing. Designs with high activation and engagement scores are more likely to prompt purchase intent.

Which packaging design performed best in this study?

Designs 2 and 4 ranked highest when combining EEG, eye-tracking, and virtual shelf data. Both showed strong logo noticeability, positive EEG response across key metrics, and competitive shelf performance. Design 1 had the highest subjective preference ratings but did not translate these into purchase-driving EEG signals — demonstrating why cognitive testing produces different and more reliable results than surveys alone.

Why did the teal background designs perform best on the virtual shelf?

Teal and turquoise backgrounds create a dominant colour anchor that registers in peripheral vision before a shopper consciously focuses on a product. At high blur levels (simulating distant or peripheral viewing), designs with a single dominant colour are identified faster and selected more often. Designs without a clear colour anchor — such as Design 5 in this study — are significantly harder to identify on a crowded shelf.

How does neuromarketing testing compare to traditional consumer research?

Traditional methods such as surveys and focus groups capture what consumers say they think. Neuromarketing captures what their brain actually does. In this study, the design with the highest self-reported preference (Design 1) was not the design that EEG identified as most likely to drive purchase (Designs 2 and 4). This divergence between stated preference and measured cognitive response is a consistent finding in neuromarketing research and explains why packaging decisions based solely on survey data can underperform in market.

North AI  ·  north-ai.com  ·  hello@north-ai.com

North AI uses simulated cognitive testing to optimise creative and packaging performance before launch. Patent-pending technology. £1.3M government-funded. 3+ years R&D.  Brand: Myasna Miska.

Rishi Kapoor