An A/B test is a controlled experiment

A/B Test Definition:

An A/B test is a controlled experiment where visitors are randomly selected to view one of two variations of the same web/app screen. When the new design (often called the challenger) is hosted on a separate URL, this is known as split testing. Each design is referred to as a variant.

Here is an example of a successful A/B split test for new visitors to the Cheekybingo.com home page. We created a simple splash page with a single prominent call-to-action and achieved a 27% uplift in registrations and a 9% uplift in first-time deposits.

[Image: example of an A/B split test on the Cheekybingo.com home page]

As traffic is split randomly between variants A and B, we can be confident that any statistically significant difference in the success metric (e.g. the conversion rate) between the two designs is likely to be due to the change in experience and not the result of other external factors (e.g. source of traffic, day of week, etc.). Most organisations subscribe to specialist software for their A/B testing to help build, run and analyse their experiments.
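To make the idea of a "statistically significant difference" concrete, here is a minimal sketch of how a two-proportion z-test compares conversion rates between two variants. This is an illustrative calculation using only the Python standard library, not the method of any particular testing tool; the function name and sample figures are hypothetical.

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing conversion rates of two variants.
    Returns the z-score and the two-sided p-value."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference in proportions
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical figures: 120/2400 conversions on A, 156/2400 on B
z, p = ab_significance(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below your chosen threshold (commonly 0.05) suggests the observed difference is unlikely to be due to chance alone; specialist testing tools perform a calculation of this kind for you.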

An A/Bn test is one with multiple variants of the same web/app page. For example, an A/Bn test with 3 variants means that roughly 33% of visitors will see each variant. The more variants in your A/B test, the smaller the proportion of visitors who will see each variant and the longer the test will need to run to obtain a statistically significant result.
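The random-but-stable splitting of traffic described above is often implemented by hashing a visitor identifier into a bucket. The sketch below illustrates the general technique under assumed names; real testing platforms do this internally and may use different hash functions or weighted allocations.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B", "C")) -> str:
    """Deterministically bucket a visitor into one of n variants.
    Hashing the visitor ID spreads traffic roughly 1/n per variant
    while keeping each visitor's assignment stable across visits."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor always lands in the same variant
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Stability matters: if a visitor saw variant B yesterday and variant A today, their experience (and your data) would be contaminated, so assignment must be a pure function of the visitor identifier.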

How should you use A/B tests?

A/B and multivariate testing should be used to validate changes to your website. Use a recognised approach to prioritise your test ideas so that you only run a test when the expected benefit justifies the resources. A/B testing should be an important element of your conversion rate optimisation process. However, it is essential that you have a framework for optimisation that includes other inputs such as customer feedback, web analytics and usability research.

A/B testing is a scientific approach to conversion optimisation. It requires a good grasp of statistical theory and behavioural science. Some experts believe that many A/B test results are not reliable because of common logical errors, such as confirmation bias.

Conclusion:

A/B tests and multivariate testing are not the same as conversion rate optimisation. They are tools among many that should be employed as part of an optimisation strategy. However, A/B testing is the only way of scientifically validating what impact a change to your site has on your success metric.

Also see multivariate test (MVT) and split path test.

Resources:

Conversion marketing – Glossary of Conversion Marketing.

Over 300 tools reviewed – Digital Marketing Toolbox.

A/B testing software – Which A/B testing tools should you choose?

Types of A/B tests – How to optimise your website’s performance using A/B testing.
