A/B Testing

A/B testing lets you compare different versions of an announcement bar to see which one performs better. Show version A to some visitors and version B to others, then use the analytics to pick the winner. This feature is available on the Unlimited plan.

Why A/B Test

Small changes can have a big impact on clicks. A/B testing removes guesswork by giving you real data on what works. You might test:

  • Different message wording (“Free Shipping” vs. “We Ship Free”)

  • Different call-to-action text (“Shop Now” vs. “See the Sale”)

  • Different colors or styles

  • With or without a countdown timer

How It Works

When you create an A/B test, visitors are randomly split between the original announcement and the test variant based on the traffic percentage you set. Each visitor consistently sees the same version during their session so the experience is not jarring.

Both versions track views and clicks independently so you can compare performance directly.
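To make the "same version for the whole session" behavior concrete, here is a minimal sketch of how a deterministic split like this can work, assuming the visitor's session ID is hashed into a stable bucket from 0 to 99. The function name and hashing approach are illustrative only, not the app's actual implementation.

```ts
// Hypothetical sketch of session-based assignment (not the app's real code).
// The same session ID always hashes to the same bucket, so a visitor keeps
// seeing the same version for their entire session.
function assignVersion(
  sessionId: string,
  variantPercent: number // e.g. 50 for a 50/50 split
): "variant" | "original" {
  // Simple string hash reduced to a bucket in [0, 100)
  let hash = 0;
  for (const ch of sessionId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  const bucket = hash % 100;
  return bucket < variantPercent ? "variant" : "original";
}
```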

Creating an A/B Test

  1. Open an existing announcement bar from your list.

  2. On the announcement’s detail page, click Add Variant and select A/B Test.

  3. Customize the variant’s messages, links, or styling – change whatever you want to test.

  4. Set the traffic percentage to control how visitors are split. For example, 50/50 gives each version equal traffic.

  5. Save and publish.

Setting Traffic Percentage

The traffic percentage determines how many visitors see the test variant versus the original.

  • 50% – Even split. Best for a fair comparison when you want results quickly.

  • 20-30% – Lower risk. Shows the test to a smaller group while most visitors see your proven original.

  • 10% – Conservative. Useful when you want to test something experimental without affecting most visitors.

The remaining percentage automatically goes to the original.
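As a rough illustration of how a traffic percentage translates into visitor counts, here is a small sketch. The function name is hypothetical and the figures are expected averages only, since real assignment is random.

```ts
// Hypothetical helper: expected split of visitors for a given traffic percentage.
function expectedSplit(totalVisitors: number, variantPercent: number) {
  const variant = Math.round(totalVisitors * (variantPercent / 100));
  return { variant, original: totalVisitors - variant };
}

// e.g. a 20% test over 1,000 visitors:
// expectedSplit(1000, 20) -> { variant: 200, original: 800 }
```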

Reading the Results

Once your A/B test has been running and collecting data, compare these metrics:

  • Views – How many times each version was displayed

  • Clicks – How many times visitors clicked the link on each version

  • Click Rate – Clicks divided by views; the key comparison metric

Look at the click rate to determine the winner. A higher click rate means that version is more effective at driving visitor action.
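For example, here is how the click-rate comparison works with made-up numbers (the figures below are purely illustrative):

```ts
// Illustrative comparison: click rate = clicks / views, shown as a percentage.
const results = {
  original: { views: 1200, clicks: 48 }, // 48 / 1200 = 4.0%
  variant:  { views: 1180, clicks: 71 }, // 71 / 1180 ≈ 6.0%
};

const clickRate = (m: { views: number; clicks: number }) =>
  m.views > 0 ? (m.clicks / m.views) * 100 : 0;

console.log(`Original: ${clickRate(results.original).toFixed(1)}%`);
console.log(`Variant:  ${clickRate(results.variant).toFixed(1)}%`);
// Here the variant's higher click rate makes it the winner.
```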

Applying the Winner

Once you have enough data to be confident in a winner:

  1. Note the message, style, and settings of the winning variant.

  2. Update the original announcement to match the winner (or keep it if the original won).

  3. Remove the A/B test variant.

Tips

  • Test one thing at a time. If you change both the message and the color, you will not know which change caused the difference.

  • Let tests run long enough. A few hours of data is not reliable. Aim for at least several days or a meaningful number of views before drawing conclusions.

  • Start with high-impact changes. Test message wording and CTA text before testing small style tweaks.

  • Use a 50/50 split for the fastest, most reliable results.
