Journey A/B Tests

Use an A/B test to determine which version of a message has the best impact on your journey.

You can create a variant for any message in a journey. The variant is a duplicate of the original message that you can then edit, changing a single element of the message: content, delivery settings, or Channel Coordination (a group of strategies you can use to target users where they are most likely to receive your message) settings. After starting the test, wait until the Confidence level meets or exceeds 95%, then select the winning message. The journey is then republished with the winning message.

Audience members who receive the variant message are randomly selected on entry to the journey. Related events and conversions are recorded for both audiences, providing data you can use to evaluate journey performance based on your selected metric for the test, either conversions or engagement.
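Random selection on entry can be pictured as a deterministic hash-based split, so that the same user always lands in the same bucket for a given test. This is a hypothetical sketch for illustration only; the function, IDs, and hashing scheme below are assumptions, not the platform's actual selection logic:

```python
import hashlib

def assign_on_entry(user_id: str, test_id: str, variant_share: float = 0.5) -> str:
    """Bucket a user into 'original' or 'variant' on journey entry.

    Hypothetical sketch only; the platform's real selection logic
    is not documented here.
    """
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform value in [0, 1]
    return "variant" if bucket <= variant_share else "original"

# The same user always receives the same assignment for a given test.
print(assign_on_entry("user-123", "journey-ab-1"))
```

Because assignment is a pure function of the user and test IDs in this sketch, events and conversions recorded later can be attributed to the correct audience without storing the split separately.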

 Tip

You can run A/B tests and control groups concurrently.

Create an A/B test for a journey

 Note

  • You must start the journey before you can create an A/B test.
  • You cannot start an A/B test for a journey that has unpublished changes.

From the journey Manage screen (a preview of the messages in a journey, with options for editing its settings, content, and status) or Performance screen (a report that compares audience behavior to a journey’s goal, displaying performance metrics and a link to the message report for each message in the journey):

  1. Click Experiments in the left-side drawer.

  2. Click Create an A/B test.

  3. Enter a name and description, select the message to create a variant for, and choose a primary metric for reporting.

    • The Engagement metric is available for all journeys.
    • You can select the Journey conversion metric for any journey, but the test will not be effective unless you also set up a conversion event in the Outcomes step in the journey settings.
  4. Click Save and continue.

  5. Click Create variant on the A/B test summary screen.

  6. Make a single change in either the Content or Delivery step, or edit the Channel Coordination setting. For channel coordination, click the icon, make a new selection, then click Save & continue.

     Tip

    • Limit the variant to a single change so the test is measurable. If you make multiple changes, you will not know which change had an effect.

    • If your test’s primary metric is Conversions, you can edit any part of the message. For example, for a push notification, you could edit the title, change the timing, or change your Channel Coordination selection.

    • If your test’s primary metric is Engagement, focus on what users experience before they interact with the message. For example, for an email, you might change only the subject line.

  7. Click the Review step and review the device preview and message summary.

    Click the arrows to page through the various previews. The channel and display type dynamically update in the dropdown menu above. You can also select a preview directly from the dropdown menu. The Open Channel preview is a data table containing the selections and content that will be sent for the Open Channel message.

    If you would like to make further changes, return to Review when you finish editing.

  8. (Optional) Send a test message and verify its appearance and behavior on each channel the message is configured for.

    The message is sent to your selected recipients immediately, and it appears as a test in Messages Overview.

    1. Click Send Test.
    2. Enter at least one named user or Test Group (a reusable audience group that can be used as a recipient for test messages; messages you send to a test group appear as tests in Messages Overview) and select from the results.
    3. Click Send.
  9. Click Save & continue. The original and variant then appear on the A/B test summary screen.

  10. Click Start A/B test to make the variant available to your audience. You can also click Exit to save the test without starting it, or click Cancel to delete the test.

After starting an A/B test:

  • The Experiments drawer states that an A/B test is in progress. Click View results to go to the A/B test summary screen.
  • On the Manage and Performance screens:
    • Editing options are no longer available for the journey or its messages.
    • The message with the variant is labeled with a green icon. Click the icon to go to the A/B test summary screen.
  • On the Performance screen:
    • Statistics are the aggregate of the variant and the original message.

Start a saved A/B test

You can start a saved A/B test from the A/B test summary screen. From the journey Manage or Performance screen:

  1. Click the icon in the message preview, or click Experiments in the left-side drawer and click View detail.
  2. Click Start A/B test.

Select the winning message

After you start the A/B test, review the performance of the original message and variant and determine which message, if either, has the expected impact on engagement or conversions, depending on the test’s primary metric.

The A/B test summary screen displays the following information:

  • Primary metric: Journey conversion or Engagement.
  • Sample size: The number of users selected to receive the variant. The threshold is 10,000 users.
  • Lift: The percent increase or decrease of your primary metric for users who have received the variant. Presented after 7 days or when the sample size of 10,000 users is reached.
  • Confidence: The probability that the same results would be obtained if the test were repeated. Presented after 7 days or when the sample size of 10,000 users is reached.
  • Additional statistics are displayed for the original message and variant, based on the test’s primary metric. If your primary metric is Engagement, you can see metrics per message type by selecting from the dropdown menu.
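As a rough illustration of how Lift and Confidence relate to the underlying counts, the sketch below computes percent lift and a two-proportion z-test confidence for a conversion metric. The formula and the sample numbers are assumptions chosen for illustration; they are not the platform's documented calculation:

```python
import math
from statistics import NormalDist

def lift_and_confidence(orig_conv, orig_n, var_conv, var_n):
    """Percent lift of the variant over the original, plus a z-test confidence.

    Illustrative only: the platform's exact statistical method
    is not documented here.
    """
    p_o, p_v = orig_conv / orig_n, var_conv / var_n
    lift = (p_v - p_o) / p_o * 100  # percent change relative to the original
    # Pooled standard error for the difference of two proportions
    p = (orig_conv + var_conv) / (orig_n + var_n)
    se = math.sqrt(p * (1 - p) * (1 / orig_n + 1 / var_n))
    confidence = NormalDist().cdf(abs(p_v - p_o) / se) * 100
    return lift, confidence

lift, conf = lift_and_confidence(orig_conv=400, orig_n=10000,
                                 var_conv=480, var_n=10000)
print(f"Lift: {lift:+.1f}%  Confidence: {conf:.1f}%")  # Lift: +20.0%  Confidence: 99.7%
```

Under this model, a result at or above 95% confidence would meet the threshold for selecting a winner; smaller samples or smaller differences in conversion rate push the confidence toward 50%.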
 Note

Selecting a winning message selects the entire message as configured. You cannot choose content or settings per channel or message type.

Select a winning message after Confidence is at least 95%. The journey will be republished with the winning message, and the A/B test will end. From the A/B test summary screen:

  1. Click Select a winner.
  2. Click Select original or Select variant, and confirm your choice.
 Tip

You may want to end an A/B test early if you see a significant drop in conversions or engagement. If the drop is not significant or if it is observed early on in the test period, you may want to let the test continue, as the rate may correct itself. Another reason to end a test early is if you notice an error in your content.

To end a test early, select the original message as the winner. This effectively cancels the test.

View A/B test history

After selecting a winning message, the A/B test is added to the list of past experiments.

From the journey Manage or Performance screen:

  1. Click Experiments in the left-side drawer.
  2. Click Past Experiments. Each A/B test is listed by name along with its end date. Click the icon to go to its summary screen.