
A/B Testing

A/B Testing still dominates testing in the Digital World

A/B testing is one of the easiest and quickest ways to learn about user behavior and increase conversion rates, yet many digital marketers underutilize it. This is often because any form of testing is wrongly assumed to be highly technical, time consuming and hard to implement. That is not the case. Given how important conversion rates are, and how powerful insight into consumer behavior can be, it is surprising that so many digital marketing professionals ignore this simple form of testing.

What is A/B testing and what are conversion rates?

A/B testing does what its name suggests: it tests a control version against a different version to measure which is more successful, based on the metric being measured.

In online marketing, A/B testing splits traffic on a website (or platform) so that visitors see one of two different versions of a particular web page (or ad, button, etc.), while you monitor visitor actions to identify the version that yields the higher conversion rate. The conversion rate is the rate at which visitors perform a desired action on your site, and a conversion can be any defined action that can be tracked.
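To make the traffic split concrete, here is a minimal sketch of how a 50/50 assignment can be done by hashing a visitor ID; the function name, experiment label and split ratio are illustrative assumptions, and most A/B testing tools handle this step for you.

```python
import hashlib

def assign_version(visitor_id: str, experiment: str = "homepage-test") -> str:
    """Deterministically bucket a visitor into version A or B.

    Hashing the visitor ID together with an experiment label means the
    same visitor always sees the same version, while a large audience
    splits roughly 50/50 between the two versions.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # a number from 0 to 99
    return "A" if bucket < 50 else "B"      # 50/50 split

print(assign_version("visitor-12345"))      # same answer every time for this visitor
```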

A/B testing also yields insights about the visitors themselves, such as which visitor segments consistently respond better to specific content.

Unlike A/B testing, which usually measures a single variable across two versions, multivariate testing can test multiple variables across multiple versions (a warning on multivariate testing: you need a large volume of data to reach statistical significance at a high confidence level).

A page that traffic is directed to in order to perform a specific action or conversion is called a landing page. Sales and lead generation landing pages are the most common targets of A/B tests, but anything online can be tested as long as it can be tracked (which is virtually everything).

A/B testing, also referred to as split testing, starts with a hypothesis about the types of content changes that could impact conversion rates; for example, will a green submit button result in more submissions than a red one? The different versions of the page content, or variants, are configured for the test and traffic is then split between them. The results show the conversion rate of one variant against the other and are monitored until a statistically significant number of visitors have been included in the test.
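The monitoring step boils down to counting visitors and conversions for each variant. The sketch below shows one way to keep those running totals; the function and variable names are illustrative assumptions rather than any particular tool's API.

```python
from collections import defaultdict

# running totals per variant: visitors seen and conversions recorded
visitors = defaultdict(int)
conversions = defaultdict(int)

def record_visit(variant: str, converted: bool) -> None:
    """Count one visitor for the variant, plus one conversion if they acted."""
    visitors[variant] += 1
    if converted:
        conversions[variant] += 1

# example: a handful of visits split between the two variants
for variant, converted in [("A", False), ("A", True), ("B", True), ("B", False)]:
    record_visit(variant, converted)

for variant in ("A", "B"):
    rate = conversions[variant] / visitors[variant]
    print(f"Variant {variant}: {conversions[variant]}/{visitors[variant]} converted ({rate:.0%})")
```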

Conversion rates & what to measure

To perform an A/B test you will need to measure differences in conversion rates; the objective of the test is to increase a conversion rate. The most obvious conversion is a sale, and the rate is calculated as the number of sales divided by the total number of visits: if you average 2 sales per hundred visits, your conversion rate is 2%. Raising that conversion rate from 2% to just 2.5% delivers a 25% increase in sales from the same traffic. Viewed this way, conversion rates really are worth paying close attention to.
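The same arithmetic, worked through as a short sketch using the illustrative 2-sales-per-100-visits example above:

```python
# conversion rate = sales / visits
visits = 100
sales = 2
rate_before = sales / visits             # 0.02 -> 2%

rate_after = 0.025                       # the improved 2.5% rate
lift = (rate_after - rate_before) / rate_before

print(f"Before: {rate_before:.1%}, after: {rate_after:.1%}, lift in sales: {lift:.0%}")
# Before: 2.0%, after: 2.5%, lift in sales: 25%
```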

The practice of testing to identify opportunities to generate a higher conversion rate is called Conversion Rate Optimization.

Conversions are any measurable action and are not just restricted to ecommerce sites or sales. Conversion rates can include:

  • Leads
  • Sales
  • Newsletter sign-ups
  • Banner Ad Clicks
  • Time on site (this is great for detecting low quality pages)

What to A/B test

Once you have decided which conversion rate you want to improve, the next step is to decide what to change on the page to try to increase conversions. Options include:

  • Images – placement, different images
  • Content – amount, wording, font, size and placement of content on the page
  • Headings – size, color, wording
  • Call to action buttons – text, size, colors and placement
  • Social media buttons – placement, size and wording are all worth testing
  • Logo and tagline
  • Use of association and security trust seals such as GeoTrust

Setting up a test

  • Decide what to test
  • Set up two versions that you want to test (one is usually a control)
  • Start with big differences
  • The goal is a 95% confidence level that your result is statistically significant (the sample-size sketch below gives a feel for how many visitors that takes)
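As a rough guide to how much traffic that 95% goal implies, the sketch below uses the standard two-proportion sample-size approximation; the 80% power assumption and the example rates are illustrative choices, not figures from the article, and a dedicated calculator will give more precise numbers.

```python
import math

def visitors_needed(p_control: float, p_variant: float,
                    z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate visitors needed per version to detect the given difference.

    Uses the normal approximation with a 95% confidence level
    (z_alpha = 1.96) and 80% power (z_beta = 0.84) by default.
    """
    variance = p_control * (1 - p_control) + p_variant * (1 - p_variant)
    n = ((z_alpha + z_beta) ** 2) * variance / (p_control - p_variant) ** 2
    return math.ceil(n)

# detecting a lift from a 2% to a 2.5% conversion rate
print(visitors_needed(0.02, 0.025))   # roughly 13,800 visitors per version
```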

How do I know my results are accurate?

Statistical significance is reached when your test has matured and is reliably telling you which version performed best. This depends on the confidence level: the higher the confidence level, the higher the probability that your results reflect a real difference rather than random chance. A/B tests typically aim for a confidence level of 95% or higher.
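To show how that confidence level can be checked, here is a minimal sketch of a standard two-proportion z-test on made-up results; most testing tools run this kind of calculation for you, and the counts below are purely illustrative.

```python
import math

def ab_p_value(conv_a: int, visits_a: int, conv_b: int, visits_b: int) -> float:
    """Two-sided p-value from a two-proportion z-test.

    A p-value below 0.05 corresponds to at least a 95% confidence level
    that the difference between the versions is not just random chance.
    """
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value

# hypothetical results: 200 conversions from 10,000 visits vs 260 from 10,000
p = ab_p_value(200, 10_000, 260, 10_000)
print(f"p-value: {p:.3f} -> significant at 95%? {p < 0.05}")   # about 0.005, so yes
```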