Push A/B Testing

A/B testing lets you send two variants of the same push to separate portions of one audience and keep the better-performing message. Use it to test copy, visuals, personalization, or calls to action before sending a winner to the remaining audience.

Before you start

Before creating a test, make sure:

  • your push or web push channel is already configured,

  • you know which single element you want to test,

  • your audience is large enough to split into two meaningful groups,

  • and your success metric is clear before launch.

If you want to measure post-click impact, define your conversion event while preparing both variants.

The overall flow is the same for Mobile Push and Web Push. Available content fields and rich media options can differ by channel and selected push type.

Start a new A/B test campaign

Step 1: Setup

Go to Messages > Campaigns > Start A/B Testing.

In Setup:

  1. Enter a Campaign Name.

  2. Select the campaign type: Push Notification or Web Push.

  3. Select the push type you want to test.

  4. Review the Estimated Reach before moving on.

Setup step with campaign type and estimated reach

Step 2: Variant A

Create the first version of your message.

Common fields to configure:

  • Category for consent and reporting

  • Title

  • Notification Message

  • Subtext

  • Media

  • Click action

  • Conversion event, if you want conversion reporting

Keep this version as your baseline. You will compare Variant B against it.

Variant A content configuration

Step 3: Variant B

Create the second version of the message.

Change only the element you want to test. Keep the rest identical. This makes the result easier to interpret.

Good test candidates:

  • Title: “Your order is ready” vs. “Pickup is ready now”

  • CTA: “Open App” vs. “Track Order”

  • Personalization: generic copy vs. {@name}-based copy

  • Media: image A vs. image B

  • Message length: short copy vs. detailed copy

Variant B content configuration

Step 4: Audience

Select who will receive the test.

Targeting options include:

  • Send All

  • Select Users

  • Advanced

  • Distribution List

Then set how much of that audience goes to each test group:

  • Variant A %

  • Variant B %

How audience split works

  • Variant A receives the percentage assigned to A.

  • Variant B receives the percentage assigned to B.

  • If A + B = 100%, there is no control group.

  • If A + B < 100%, the remaining audience becomes the control group.

Examples:

  • 50 / 50 → no control group

  • 70 / 30 → no control group

  • 30 / 30 → remaining 40% becomes the control group

  • 40 / 20 → remaining 40% becomes the control group

When no control group exists, the test ends after the A/B send completes. When a control group exists, you can review test performance against that untreated audience.
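
If it helps to see the arithmetic, the sketch below (Python, with a hypothetical audience size) shows how the variant and control group sizes follow from the percentages you enter. It is only an illustration of the rule above, not part of the product.

```python
def split_audience(total_users: int, variant_a_pct: int, variant_b_pct: int):
    """Illustrative only: group sizes implied by the percentages entered in Step 4."""
    if variant_a_pct + variant_b_pct > 100:
        raise ValueError("Variant percentages cannot exceed 100%")
    group_a = total_users * variant_a_pct // 100
    group_b = total_users * variant_b_pct // 100
    control = total_users - group_a - group_b  # whatever is left after A and B
    return group_a, group_b, control

# Hypothetical audience of 10,000 users with a 30 / 30 split:
# Variant A and B each get 3,000 users; the remaining 4,000 form the control group.
print(split_audience(10_000, 30, 30))  # (3000, 3000, 4000)
```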

Step 5: Schedule

Choose when the test runs and how fast it is delivered.

You can configure:

  1. Start Sending Messages

    • Now

    • On a Specific Time

    • Send on Best Time for Each User Between

  2. Message Expiry

    • Never

    • Until a Specific Time

  3. Delivery Speed

    • Send Fast

    • Send in Packages

Use packaged delivery for large audiences or traffic-sensitive campaigns.

Scheduling, expiry, and delivery speed
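
To decide between Send Fast and Send in Packages, it can help to estimate how long a packaged send would take. The sketch below uses hypothetical package size and interval values purely for illustration; the actual package settings available in the panel may differ.

```python
import math

def packaged_send_duration(audience_size: int, package_size: int, minutes_between_packages: int) -> int:
    """Illustrative only: rough time from first to last package when sending in packages."""
    packages = math.ceil(audience_size / package_size)
    return (packages - 1) * minutes_between_packages  # no wait needed after the final package

# Hypothetical example: 200,000 recipients, packages of 25,000, one package every 10 minutes
# -> 8 packages, so roughly 70 minutes between the first and last package.
print(packaged_send_duration(200_000, 25_000, 10))  # 70
```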

Step 6: Review and Launch

Before sending:

  • verify targeting and percentages,

  • preview both variants,

  • review scheduling and expiry settings,

  • and confirm the test.

Once launched, Netmera sends each variant to its assigned audience.

Final review before launch

Reports and winner selection

After the test runs, compare both variants in the report.

Metrics include:

  • Target Audience

  • Sent

  • Success

  • Clicked

  • Conversion

  • Revenue

Choose the winner based on your campaign goal:

  • use Clicked for traffic-focused campaigns,

  • use Conversion for action-focused campaigns,

  • and use Revenue when business value matters most.

If a control group exists, include that comparison in your decision. After choosing the winner, you can send that winning variant to the remaining audience.
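
Because the two test groups may not be exactly the same size, compare rates rather than raw counts. The sketch below uses hypothetical report numbers to show the idea; it is not a Netmera API, just one way to normalize Sent, Clicked, Conversion, and Revenue before picking a winner.

```python
def variant_rates(sent: int, clicked: int, conversions: int, revenue: float) -> dict:
    """Illustrative only: turn raw report counts into rates that are comparable across variants."""
    return {
        "click_rate": clicked / sent,
        "conversion_rate": conversions / sent,
        "revenue_per_send": revenue / sent,
    }

# Hypothetical numbers read off an A/B test report:
variant_a = variant_rates(sent=5_000, clicked=450, conversions=90, revenue=1_800.0)
variant_b = variant_rates(sent=5_000, clicked=520, conversions=85, revenue=1_700.0)

# Here B wins on click rate while A wins on conversion rate and revenue per send;
# which one is "the winner" depends on the goal you defined before launch.
print(variant_a)
print(variant_b)
```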

Report behavior

  • If a control group is used, Control Group Performance appears in the report.

  • If no control group is used, that section is hidden.

  • Variant A and Variant B always remain visible separately.

  • Audience percentages are shown under Target Audience.

A/B test report with performance comparison

Best practices

  • Define the success metric before you launch.

  • Test one major variable at a time.

  • Keep audience splits intentional.

  • Use a control group only when you need incremental measurement.

  • Let the test collect enough data before choosing a winner.
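
One way to judge whether you have "enough data" is a quick significance check on the click or conversion rates. The two-proportion z-test below is a minimal sketch with hypothetical counts; it is not part of the Netmera report and is shown only to illustrate the idea.

```python
from math import sqrt

def two_proportion_z(clicks_a: int, sent_a: int, clicks_b: int, sent_b: int) -> float:
    """Illustrative only: z-statistic for comparing two click (or conversion) rates."""
    p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
    pooled = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (p_a - p_b) / se

# Hypothetical counts; an absolute z value above roughly 1.96 suggests the observed
# difference is unlikely to be random noise at the usual 95% confidence level.
z = two_proportion_z(clicks_a=450, sent_a=5_000, clicks_b=520, sent_b=5_000)
print(round(abs(z), 2))  # about 2.37 for these numbers
```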
