A/B Testing Guide

Get Started

Optimize ad performance with built-in A/B testing. Split-test ad variations and automatically identify the highest-performing creative — no third-party tools required. A/B testing (also called split testing) lets you compare two or more ad creatives in the same placement to find which one performs best.

Enabling A/B Testing

A/B Testing is a module that must be enabled before use.

  1. Go to WB Ad Manager -> Pro Settings -> Modules
  2. Toggle on A/B Testing
  3. Click Save Changes

A new A/B Testing menu item appears under WB Ad Manager.

Creating a Test

Step 1: Open A/B Testing

Go to WB Ad Manager -> A/B Testing.

[Screenshot: A/B testing list showing active tests with variations, impressions, leader, and confidence level]

Step 2: Add New Test

Click Add New Test and fill in the form:

  • Test Name: A descriptive name, e.g. “Homepage Banner — July 2025”
  • Original Ad: The existing ad you want to test (the control)
  • Variant Ads: 1-3 alternative ads to compare against the original
  • Goal: What you’re optimizing for — CTR (clicks) or Impressions
  • Traffic Split: How to divide traffic between variants (default: equal)
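The plugin handles variant selection internally; conceptually, a traffic split is just a weighted random choice per impression. A minimal sketch (variant names and the 2:1:1 weighting are illustrative, not from the plugin):

```python
import random

def pick_variant(variants, weights=None):
    """Choose which ad variant to serve for one impression.

    With no weights, traffic is split equally (the default);
    otherwise each weight is that variant's share of traffic.
    """
    if weights is None:
        weights = [1] * len(variants)  # equal split
    return random.choices(variants, weights=weights, k=1)[0]

# Example: a 50/25/25 split between the control and two variants
ads = ["original", "variant-a", "variant-b"]
counts = {ad: 0 for ad in ads}
for _ in range(10_000):
    counts[pick_variant(ads, weights=[2, 1, 1])] += 1
# counts will be roughly {"original": 5000, "variant-a": 2500, "variant-b": 2500}
```

Over many impressions the observed shares converge to the configured weights, which is why equal splits give every variant a fair chance to accumulate data.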

Step 3: Start the Test

Click Create Test. The test status is set to Running immediately.

Make sure all variant ads are published and assigned to the same placement as the original ad before starting the test.

Monitoring Your Test

The A/B Testing list shows a summary for each test:

  • Test Name: Your test identifier
  • Status: Running, Paused, or Completed
  • Variations: Number of ad versions being tested
  • Impressions: Total views across all variations
  • Leader: Currently winning variation
  • Confidence: Statistical confidence level

Click a test name to see the full results with performance charts.

Reading the Results

Statistical Confidence

Results are meaningful only when the confidence level is high enough.

  • Below 90%: Not enough data — keep the test running
  • 90-95%: Suggestive but not conclusive
  • 95% or above: High confidence — you can trust this result

You typically need at least 100 impressions per variant before results start to be meaningful.
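The plugin computes confidence for you, and its exact method isn’t documented here. For a CTR goal, one standard approach is a two-proportion z-test comparing the click rates of two variants; a self-contained sketch under that assumption:

```python
from math import erf, sqrt

def confidence(clicks_a, views_a, clicks_b, views_b):
    """Confidence (as a percentage) that two CTRs genuinely differ,
    using a two-sided two-proportion z-test."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)  # pooled CTR
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    if se == 0:
        return 0.0
    z = abs(p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return (1 - p_value) * 100

# 500 views each: a 3.0% CTR vs a 6.0% CTR
print(round(confidence(15, 500, 30, 500), 1))
```

Note how a doubled CTR on only 500 views per variant already lands near the 95% threshold but is not far past it — this is why small samples so often produce “leaders” that later evaporate.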

When to Declare a Winner

End your test only when all of the following are true:
  • One variant reaches 95%+ confidence
  • You have at least 1,000 total impressions across all variants
  • The test has run for at least 7 days
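Treating the three criteria above as jointly required, the stopping decision is easy to express in code. A small sketch (function and parameter names are illustrative):

```python
from datetime import date

def ready_to_declare(confidence_pct, total_impressions, start, today=None):
    """Check the guide's three stopping criteria: 95%+ confidence,
    1,000+ total impressions, and 7+ days of runtime."""
    today = today or date.today()
    days_running = (today - start).days
    return (confidence_pct >= 95
            and total_impressions >= 1000
            and days_running >= 7)

# 96% confidence, 1,200 impressions, 10 days in: ready
print(ready_to_declare(96, 1200, date(2025, 7, 1), today=date(2025, 7, 11)))  # True
```

Requiring all three conditions guards against the “peeking” mistake described under Best Practices: high confidence alone on day two of a test is not a reliable signal.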

Declaring a Winner

  1. Go to WB Ad Manager -> A/B Testing
  2. Click View on your completed test
  3. Click Complete Test & Set Winner on the winning variant
  4. The winning ad remains active; other variants are paused

Test Actions

  • Pause: Temporarily stops the test while preserving all data
  • Resume: Restarts a paused test
  • End Test: Marks the test complete and declares a winner
  • Delete: Permanently removes the test and all associated data

What to Test

Focus on one change per test to isolate what’s making the difference:

  • Different headline text
  • Different images or visual styles
  • Different call-to-action wording
  • Different color schemes or button designs
  • Different ad sizes (test the same zone with different creative dimensions)

Best Practices

Run tests long enough. Traffic patterns vary by day of the week. Run tests for at least 7 days to account for this variation.

Don’t peek too early. Looking at results before reaching statistical significance and then making decisions based on early data is a common mistake — it leads to false conclusions.

Document your learnings. Keep a record of what you tested, what won, and by how much. This builds institutional knowledge that improves future campaigns.

Minimum test duration by traffic level:

  • Low (under 100 impressions/day): 14-30 days
  • Medium (100-1,000 impressions/day): 7-14 days
  • High (1,000+ impressions/day): 3-7 days
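These durations follow from the guide’s own thresholds: enough days to accumulate roughly 1,000 total impressions, but never shorter than the 7-day floor. A rough estimate (the function name and exact formula are illustrative):

```python
import math

def minimum_days(daily_impressions, total_target=1000, floor_days=7):
    """Rough minimum test duration: days needed to reach the guide's
    1,000-impression total, never shorter than the 7-day floor."""
    days_for_data = math.ceil(total_target / daily_impressions)
    return max(days_for_data, floor_days)

# Low traffic (~50 impressions/day) needs about 20 days;
# high traffic is bounded by the 7-day floor instead.
print(minimum_days(50), minimum_days(2000))  # prints: 20 7
```

In other words, low-traffic sites are data-limited (the impression target dominates), while high-traffic sites are time-limited (the weekly-cycle floor dominates).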

Next Steps

  • Ad Submissions — review incoming ads for testing
  • Revenue Dashboard — measure the impact of winning ads on revenue
Last updated: March 4, 2026