How to Run an Effective A/B Test


A/B testing (or split testing) is a method of comparing two versions of a digital asset (such as a webpage, app screen, ad, or email subject line) to determine which one performs better. Comparing more than two versions at once is known as A/B/n testing, while testing combinations of several elements simultaneously is known as multivariate testing.

Effective A/B testing can drive significant growth across your marketing and product channels. Product and marketing teams should follow a process to ensure their A/B tests support the company’s overall business objectives.

We’ll outline the steps to run an effective A/B test here:

1. Determine what to test.

Unfortunately, product teams often shoot from the hip when determining what to test. We recommend two ways for product and marketing teams to decide what to test: a) mapping business objectives to specific elements in the product and marketing funnel, and b) analyzing the entire customer funnel to identify bottlenecks in the customer journey, then running tests at the points where customer drop-off is highest.

Mapping business objectives to A/B tests gives lots of inspiration for what tests to run. For example, if a quarterly goal of the company is to acquire more customers virally, the marketing team should try running A/B tests on social media. If another business objective involves increasing revenue through email, then the marketing team should run A/B tests on different product offerings in the email newsletter.

Analyzing the entire customer funnel also helps identify where to run A/B tests. If your onboarding conversion rate is low, A/B test different onboarding flows. Likewise, if the data shows that customers do not open your emails, try testing out different subject lines.

2. Redesign

Be bold with your A/B tests. Companies frequently find unexpected results when they test unconventional, out-of-the-box designs.

If you have lots of traffic, you can run more variations of a test; if you have few visitors, you may need to run your test for a long time to get statistically significant results.
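To get a rough sense of how long a test must run, you can estimate the visitors needed per variation with the standard two-proportion power formula. Here is a minimal sketch in Python; the baseline rate, detectable lift, significance level, and power below are illustrative assumptions, not recommendations:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variation(baseline, lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variation to detect an absolute
    `lift` over a `baseline` conversion rate (two-sided z-test)."""
    p1, p2 = baseline, baseline + lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a 2-point absolute lift over a 10% baseline takes roughly
# 4,000 visitors per variation -- small sites may need weeks to get there.
print(sample_size_per_variation(baseline=0.10, lift=0.02))
```

Note how the required sample size shrinks as the lift you want to detect grows: bold changes with big effects need far less traffic than subtle tweaks.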

[Quick note: use common sense when designing bold tests. A lewd image can lead to higher conversions but probably does not fit your brand.]


3. Implement the design changes.

For websites, we recommend Optimizely for running A/B or multivariate tests. On mobile, Optimizely and Apptimize have easy-to-use mobile SDKs (although some of our clients choose to hardcode tests directly into their app without any SDK).
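If you do hardcode a test rather than use an SDK, the usual approach is to bucket each user deterministically by hashing a stable user ID together with an experiment name. A minimal sketch, where the experiment name and 50/50 split are made-up examples:

```python
import hashlib

def assign_variation(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'.

    Hashing user_id + experiment gives each user a stable bucket for this
    experiment, independent of their buckets in other experiments."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000  # roughly uniform in [0, 1)
    return "control" if bucket < split else "treatment"

# The same user always lands in the same bucket for a given experiment.
print(assign_variation("user-42", "onboarding-flow-v2"))
```

Because the assignment depends only on its inputs, you can log the variation at exposure time and reproduce it exactly later during analysis.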

4. Analyze the results.

Make sure that your test results are statistically significant. Optimizely and Getdatadriven have handy calculators for determining the sample size needed for each variation and whether your test is statistically significant. If your test does not produce a clear winner, try another variation.
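The check those calculators typically perform is a two-proportion z-test on the conversion rates. A minimal sketch in Python, with made-up conversion counts for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 200/1000 conversions on control vs. 250/1000 on the variation:
# the p-value comes out well below 0.05, so the lift is significant.
print(two_proportion_p_value(200, 1000, 250, 1000))
```

A p-value below your chosen threshold (conventionally 0.05) means the observed lift is unlikely to be noise; resist the temptation to stop the test early the moment it dips under the line.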

5. Implement the version that performs best.


A/B testing forms the cornerstone of optimization. Your team should be constantly running A/B tests to try to improve conversions. Treating A/B testing as a process enables your team to easily align product and business objectives.