Amazon Ads Benchmarks: Enabling Data-Driven Performance Comparison

I designed a benchmarking feature for Amazon Ads DSP that enables advertisers to evaluate their performance against similar campaigns, helping them contextualize results, set realistic goals, and identify optimization opportunities. This feature addressed a critical gap where advertisers either lacked access to comparative data or relied on external sources for performance benchmarking.

Screenshots of Amazon Ads campaign dashboard displaying benchmark metrics and performance data for a snack brand's cookie and biscuit product category.

My Role & Approach

The Challenge

Advertisers depend on benchmarks to evaluate new opportunities, contextualize performance, set realistic goals, and optimize their media investments. However, this critical data was either inaccessible or difficult to obtain within Amazon Ads, forcing advertisers to rely on external sources or manual comparisons. This created barriers to real-time, data-driven decision-making, preventing advertisers from gaining a clear understanding of their opportunity potential and ROI.

My Approach

As the solo designer on this project, I:

  • Designed the complete experience from in-context card to detailed comparison views

  • Collaborated with PM to determine the most valuable benchmarks based on advertiser needs and API availability

  • Worked with a UX writer to ensure clear, actionable copy throughout

  • Adapted designs as API constraints evolved during development

  • Validated the approach by sharing designs with five major advertisers

  • Aligned to existing patterns, leveraging established Amazon Ads DSP design language

Contextual Benchmarking

Benchmark Card Integration

I designed a benchmark card that lives directly on the Orders page of Amazon Ads DSP, providing advertisers with an at-a-glance performance comparison without leaving their workflow.

Card Design:

  • Performance metrics comparison (advertiser vs. benchmark)

  • Clear visualization showing relative performance

  • "View Details" action to access deeper insights

  • Contextual placement alongside campaign data

This approach made benchmarking a natural part of the advertiser's campaign monitoring workflow rather than requiring them to navigate to a separate reporting area.

Detailed Comparison Sidesheet

When advertisers want deeper insights, clicking "View Details" opens a comprehensive sidesheet that provides both visualization and detailed data.

Performance Visualization

  • Time-series graph comparing advertiser performance to benchmark

  • Clear distinction between advertiser data and benchmark data

  • Date range context showing comparison period

Benchmark Selection

  • Transparent display of comparison criteria

  • Shows how similar advertisers are grouped (by campaign spend, category, country)

  • Helps advertisers understand who they're being compared against
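The comparison criteria above could be modeled roughly as follows. This is a hypothetical sketch for illustration only — the type names, spend tiers, and label format are assumptions, not the actual Amazon Ads DSP API:

```typescript
// Hypothetical sketch of a benchmark cohort; none of these names
// come from the real Amazon Ads DSP API.
type SpendTier = "under-50k" | "50k-250k" | "over-250k";

interface BenchmarkCohort {
  spendTier: SpendTier; // campaigns with similar spend levels
  category: string;     // e.g. "Cookies & Biscuits"
  country: string;      // ISO country code, e.g. "US"
}

// A campaign is compared only against campaigns in the same cohort —
// this is the grouping the sidesheet surfaces transparently.
function cohortLabel(c: BenchmarkCohort): string {
  return `${c.category} · ${c.country} · spend ${c.spendTier}`;
}

console.log(
  cohortLabel({ spendTier: "50k-250k", category: "Cookies & Biscuits", country: "US" })
);
```

Surfacing this label alongside the benchmark data is what lets advertisers understand exactly who they are being compared against.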

Detailed Metrics Table

  • Comprehensive performance data across key metrics

  • Side-by-side comparison of advertiser vs. benchmark

  • Sortable columns for different analysis perspectives

  • Data export capability for further analysis

Filtering & Customization

  • Date range selection to adjust the comparison period

  • Calendar picker for precise timeframe control

  • Filter options to refine comparison criteria

Adapting to API Constraints

A key challenge during development was adapting the design as the available benchmark data shifted with API constraints: the set of benchmarks we could surface changed as technical limitations became clear.

Design Approach

  • Maintained flexible component structure to accommodate changing data availability

  • Prioritized most valuable benchmarks identified by PM and validated with advertisers

  • Designed system to gracefully handle varying data availability

  • Ensured experience remained valuable even when some planned benchmarks weren't available

This required balancing ideal user experience with technical reality, making strategic decisions about which benchmarks provided the most value given the constraints.
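One way to sketch the "gracefully handle varying data availability" idea is to render only the benchmark rows the API can actually serve. This is a minimal, hypothetical illustration — the metric names and data shape are assumptions, not the shipped implementation:

```typescript
// Hypothetical row shape for a benchmark comparison; the field and
// metric names are illustrative, not from the actual product.
interface BenchmarkRow {
  metric: string;           // e.g. "CTR", "ROAS"
  advertiser: number;
  benchmark: number | null; // null when the API cannot provide this benchmark
}

// Keep the card useful even when some planned benchmarks are missing:
// show only rows with benchmark data rather than rendering blanks.
function visibleRows(rows: BenchmarkRow[]): BenchmarkRow[] {
  return rows.filter((r) => r.benchmark !== null);
}

const rows: BenchmarkRow[] = [
  { metric: "CTR", advertiser: 0.42, benchmark: 0.38 },
  { metric: "ROAS", advertiser: 3.1, benchmark: null }, // not available from the API
];

console.log(visibleRows(rows).map((r) => r.metric)); // only metrics with benchmark data
```

The design choice here is that missing data degrades the view rather than breaking it, so the experience stays valuable as API availability evolves.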

Validation & Feedback

During the design phase, we shared the designs with five major advertisers using Amazon DSP to validate the approach.

Positive Feedback

  • Benchmark selection: Advertisers found the chosen benchmarks valuable and relevant

  • Data visualization: The combination of graph and table provided both quick insights and detailed analysis

  • Comparison approach: Advertisers found the grouping logic (spend, category, country) meaningful

This validation gave confidence that the feature addressed real advertiser needs and would drive adoption.

Impact

The goal was to give Amazon Ads DSP advertisers access to in-platform benchmark data for the first time, eliminating their reliance on external sources for performance comparison.

Results

  • Successfully shipped to beta, providing advertisers with in-platform performance comparisons for the first time

  • Validated the approach with five major advertisers during the design phase, receiving positive feedback on benchmark selection and data visualization

  • Designed flexible component structure that adapted as API constraints evolved during development

Advertiser outcomes

  • Advertisers can now contextualize performance against peers rather than relying on external benchmarks

  • Real-time access to comparative data eliminates delayed or manual reporting

  • Grouping by spend, category, and country resonated with advertisers during validation as a meaningful basis for comparison