A/B Testing Help Guide

Use Experiture’s powerful A/B testing feature to optimize your marketing programs

How to Configure an A/B Test in Experiture

A/B testing is a powerful way to optimize your customer journeys by comparing different versions of your touchpoints, such as emails, SMS, or landing pages. You can test variations like subject lines, images, or calls to action to see which version drives better results. Here’s how to set up an A/B test in Experiture:

By following the steps below, you can optimize your marketing campaigns with A/B testing, consistently improving engagement and conversion rates across your customer journeys.


Step-by-Step Guide for Configuring an A/B Test

  1. Open Your Campaign:
    • Navigate to the campaign you want to A/B test. You can access this through the Campaign Dashboard.
    • Once in the campaign, go to the Journey Builder or Designer tab.
  2. Choose the Touchpoint to Test:
    • In the journey map, click on the specific touchpoint (e.g., an email or SMS) that you want to create an A/B test for.
    • This touchpoint could be an email, SMS, or any other communication channel supported by Experiture.
  3. Enable A/B Testing:
    • In the settings of the touchpoint, look for the A/B Split option.
    • Click on the A/B Split icon or button to enable A/B testing for that specific touchpoint.
    • Choose the area of the message to test. For email, this can be the “From” address, the subject line, or the email content. For all other channels, only the content can be A/B split tested.
  4. Create Variants:
    • Once you have enabled A/B testing, you will need to create multiple versions (also called “variants”) of the touchpoint.
    • Click on Create New Variant to start creating your second version.
      • For an email touchpoint, you could vary:
        • From Address: Test different sender names (e.g., an individual on your team vs. your company name) to see which achieves better engagement.
        • Subject line: Test different subject lines to improve open rates.
        • Email content: Test variations in the email copy, images, layout, or calls to action.
      • For an SMS touchpoint, you could vary:
        • Message content: Test different SMS content for higher response rates.
        • Personalization: Try varying levels of personalization (e.g., include a name in one variant, leave it out in another).
    • Continue to add as many versions (A, B, C, etc.) as needed for the test.
  5. Set the Audience Split:
    • Decide how to split your audience between the variants. You can split the audience evenly or customize the percentage each variant receives.
    • For example, you might choose to send 50% of your audience Version A and 50% Version B. Alternatively, you could do 33% each if you have three variants.
    • Experiture allows you to set these percentages in the A/B Test Configuration settings.
  6. Schedule or Send the Test:
    • Once the variants are created, schedule the test or send it immediately, just like any other touchpoint.
    • You can set it up so that the campaign automatically sends the A/B test emails or SMS messages to a randomly selected portion of your audience.
  7. Monitor Results:
    • After the test goes live, head to the Reports or Analytics section of your campaign dashboard to monitor the performance of each variant.
    • Track key metrics such as:
      • Open rates (for emails).
      • Click-through rates (for emails and SMS).
      • Conversion rates (form submissions, purchases, etc.).
    • Compare the performance of each version to determine the winner.
  8. Declare a Winner:
    • Once you’ve gathered enough data, select the winning variant. In some cases, Experiture might have an automatic setting to declare the winner based on a predefined metric (e.g., highest open rate).
    • You can then choose to send the winning variant to the remainder of your audience or use it in future campaigns.
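Experiture handles the random split and winner selection for you, but the underlying logic of steps 5–8 can be sketched as follows. This is an illustrative Python sketch only; the variant names, split percentages, and open rates are made up, not taken from Experiture.

```python
import random

# Step 5: configure the audience split (e.g., a 50/50 split between two variants).
variants = {"A": 0.5, "B": 0.5}

def assign_variant(variants):
    """Randomly pick a variant, weighted by its split percentage."""
    names = list(variants)
    weights = [variants[name] for name in names]
    return random.choices(names, weights=weights, k=1)[0]

# Step 6: each recipient in the test audience is randomly assigned a variant.
audience = [f"recipient_{i}" for i in range(1000)]
assignments = {person: assign_variant(variants) for person in audience}

# Steps 7–8: after tracking results, declare the winner by a chosen metric,
# e.g., open rate per variant (hypothetical numbers).
open_rates = {"A": 0.21, "B": 0.26}
winner = max(open_rates, key=open_rates.get)  # the winning variant is then
                                              # sent to the remaining audience
```

The key idea is that assignment is random and weighted, so each variant’s results come from a comparable slice of the audience.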

Best Practices for A/B Testing

  • Test one variable at a time: This ensures you’re accurately testing the impact of each change.
  • Sample size: Make sure your audience size is large enough to generate statistically significant results.
  • Duration: Allow enough time for the test to run so that you can collect sufficient data.
  • Automate: Use automation to follow up with users based on their interactions with the winning variant.
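To check whether a difference between two variants is statistically significant rather than random noise, a standard approach is a two-proportion z-test. This is a general statistical sketch, not an Experiture feature; the send counts and open counts below are hypothetical.

```python
import math

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    """Z-statistic comparing two open rates, using a pooled proportion."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (p_b - p_a) / se

# Hypothetical results: variant B was opened more often than variant A.
z = two_proportion_z(opens_a=200, sends_a=1000, opens_b=260, sends_b=1000)
significant = abs(z) > 1.96  # roughly a 95% confidence threshold
```

If the sample is too small, the z-statistic will fall below the threshold even for a sizable rate difference, which is why the sample-size best practice matters.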