Sometimes, an educated guess just isn't enough.
Deciding which image to use in your social ad, where to place that call-to-action button, or which adjective to use in your email subject line often needs more backing.
While seemingly minor, small factors like these can significantly impact conversion rates, leads and engagement.
You'll find there's an abundance of studies, statistics and information out there which detail the relationship between conversion rates and psychology. And while this is all extremely valuable, we can find some truth in those theories by running our own experiments.
A/B testing - also referred to as split testing - is the method of comparing two versions of the same conversion path, be that an email, a call-to-action button or a landing page, in which one variable is changed and tested against a control (the original). (Its close relative, multivariate testing, changes several variables at once.)
Subjects are randomly shown one of the two versions, with both running over the same period, and statistical analysis reveals which variant performs better.
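To illustrate how a testing tool might split an audience at random while keeping each visitor's assignment stable across visits, here is a minimal sketch using hash-based bucketing. The function name and experiment label are hypothetical; real testing platforms handle this for you.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-colour") -> str:
    """Deterministically bucket a user into 'control' or 'variant'.

    Hashing the user id (rather than rolling a die per visit) keeps
    each person in the same bucket every time they return, while the
    hash spreads users roughly 50/50 between the two versions.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 else "control"
```

Because assignment is derived from the user id, the same visitor always sees the same version, which prevents one person's repeat visits from contaminating both buckets.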
Most marketing software will be able to facilitate split testing to some capacity. The test will allow you to compare the end results of each variant to determine which is better suited to your target audience and demographic.
How to make A/B testing work for you
Identify your problematic area
This requires ongoing and in-depth analysis, through which you'll be able to identify problematic areas. These might be areas of your site with high bounce rates and low conversion rates, or an email with dropping open rates.
A simple A/B test will be able to help you identify the best way to improve these numbers.
Decide on what you want to achieve
What's your reason for running this test? Do you have a goal in mind? Perhaps you want more opens on your newsletter? Or you need more clicks on that CTA (call-to-action)?
Once you’ve decided on your goal, you can delve into historical research and previous statistics, or look to external sources for information that will steer your test.
Maybe your target audience demonstrates a preference for informal language? Or perhaps a new study suggests that red CTA buttons attract more clicks than grey ones.
Once you have decided what you want to achieve, you can select the most appropriate variation to test.
Choose ONE variant
The first rule of A/B testing is to change only one variable at a time.
Why? Because if you notice a significant surge in click-throughs, how will you know which change it can be attributed to?
Testing several changes at once will skew your results and won't provide the accurate data you require.
Your variable can be as simple as a change of wording, such as 'Request your free demo today' versus 'Sign up for your free demo'.
Alternatively, you might wish to experiment with a different colour CTA button, or send out your newsletter in the morning as opposed to the afternoon.
Here are some more ideas on what to test:
- form length
- form fields
- email subject lines
- email sending time
- CTA button colour
- CTA position
- CTA size
- ad imagery
Once you've selected your variable, remember to always test simultaneously alongside the control as opposed to testing separately. Testing separately poses a whole host of problems, as uncontrollable variables (such as the time or day of sending) can crop up and skew your results.
Always bear in mind the fragility of your SEO when testing website pages. While Google permits and even encourages A/B testing, abusing a testing tool can be detrimental to your SEO.
Cloaking, for example.
Whether intentional or not, cloaking involves showing search engines different content from what a typical visitor sees, and it can cause your site to be demoted or removed entirely from search engine results. To avoid this, use A/B testing sparingly.
In addition, remember to use 302 redirects (temporary redirects) rather than 301 redirects (permanent redirects) if you are redirecting the original URL to a variation. Search engines will then understand that the redirect is temporary rather than a permanent move, and will keep the original URL in their index.
Analyse the results
Your testing software will present you with the results of each variation which you can analyse to determine whether your variation garnered better results than the control.
In the event that the variant was not successful, you can use your learnings to adapt the test and continue with the experimentation process. If your variant was indeed successful, congratulations! You can use these learnings to improve your conversion path and to inform your next test.
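Under the hood, "better results" usually means a significance test on the two conversion rates. As a minimal sketch using only the standard library (the visitor and conversion counts below are invented for illustration), a two-proportion z-test looks like this:

```python
from math import erf, sqrt

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert at a
    significantly different rate from control A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pool the rates under the null hypothesis of "no difference"
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented figures: 200/10,000 control conversions vs 260/10,000 variant
z, p = conversion_z_test(200, 10_000, 260, 10_000)
```

A p-value below 0.05 is the conventional bar for calling a result significant; if it comes out higher, the difference you saw could easily be noise, and the test should run longer or be redesigned.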
It's a great idea to make A/B testing a regular part of your ongoing strategy. Nothing in digital marketing is static: the way we work is constantly evolving along with users' behaviour. Insight and awareness will depend on your willingness to continuously monitor trends, analytics and traffic. When you spot an anomaly, a drop or a surge, do you have the knowledge to attribute it to a particular factor? Regular A/B testing will arm you with that knowledge, enhancing your strategic approach at every stage of the buyer's journey.