An introduction to A/B testing
Remember at school when we grew runner beans in jam jars with a piece of rolled up blotting paper? We placed one in a dark cupboard, one we didn't water, one we put somewhere cold and another we placed in the sun and watered daily. The exercise was to determine how each element affected the plant's growth. After a couple of weeks the conclusion was that plants need light, warmth and water to grow! Well, A/B testing is pretty much the same – though experimenting with just two variants at a time and testing small differences.
Also known as 'split testing', A/B testing brings scientific methodology to marketing and removes the guesswork. It provides data-backed decisions and can be used across a range of communications, testing many different variables. In 2000, Google famously used A/B testing to ascertain the optimum number of search results to show on its listings pages – the rest is history!
Although it has been around for some time, A/B testing is less commonly used than other marketing tools. In the past it would have been expensive to run tests for magazine adverts or billboards, but in the digital age the costs are low and, done well, A/B testing provides an insight into visitor behaviour that can significantly increase conversion rates.
How is it used?
A/B testing is used to consider how small differences in a marketing campaign might influence customer behaviour. This might be the title of a newsletter or email, the text for a banner advert, the text on a call-to-action button or the layout of a web page. The idea is to run two variations of the campaign with a controlled group of customers to see which version is the most successful. You can repeat the tests numerous times to fine-tune your content and improve the effectiveness of your marketing communication.
How does A/B testing work?
When setting up a test you first need to think about the metrics of your business and how you define the success of your marketing campaigns. This might be the number of sales, click-throughs, sign-ups, downloads etc. You then set up your marketing campaign with two variants (version A and version B). Don't be tempted to vary more than one thing at a time or you will never know which one made the difference. To measure which is better you trial them simultaneously, in identical circumstances, and select the most successful version for use.
If you are testing a website page, use the existing version as the control, set up a second version for the test, and split your traffic equally between the two. If you don't have the technical knowledge to do this yourself there are a number of free tools on the market that will help you, for example Google Analytics Content Experiments. There are also plenty of organisations specialising in conversion rate optimisation (CRO) that will run your A/B testing for you and make recommendations for your marketing.
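If you're curious what an even traffic split looks like under the hood, a common approach (shown here as an illustrative sketch, not the method any particular tool uses) is to hash a stable visitor identifier, such as a cookie value, so that each visitor is assigned to a version at random but always sees the same one for the duration of the test:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to version A or B.

    Hashing a stable ID (e.g. a cookie value) gives a roughly
    50/50 split while making sure a returning visitor always
    sees the same version throughout the test.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Bucket a few hypothetical visitors
for vid in ["visitor-1001", "visitor-1002", "visitor-1003"]:
    print(vid, "->", assign_variant(vid))
```

The visitor IDs here are made up for illustration; in practice the free tools mentioned above handle this assignment for you.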
If you are sending out an email or newsletter you will need to put some effort in beforehand to prepare your test groups. The two groups need to be identical – or as similar as possible. Firstly, you will need an equal number of contacts and, ideally, you will want to have equal numbers of men and women. If you have the data available, consider age ranges, geographic locations and any other factors that might contribute. Look to run the test using a small percentage of your database, maybe 10%, and make sure you send them simultaneously so you minimise any variance.
You will also need to determine up front how long you are going to run the test for and how many responses you need to quantify the results. Use past data as a guide but be careful not to cut it off too soon or to leave it too long as this may mean other factors have affected the result. If you are testing low volumes you will need to determine how long a period you can realistically wait and whether the test result will be reliable.
Examples of using A/B testing
If your company is a SaaS provider your metric may well be the number of sign-ups you receive. Considering different versions of your web sign-up page will help optimise the page and increase sign-ups. For example, you may have an idea that changing the colour of your call-to-action button from blue to red would make it stand out better and increase sign-ups. In this case, you would use the existing blue design as your control, version A, and the new design with the red button as version B, and equally divide your website visitors between the two designs for a given period of time. At the end of the test you see which one works best and use that one. You may then choose to test the red design against another colour to further test or check your results.
Remember, when running such tests you'll need to make sure the sample size you are dealing with is statistically relevant. For example, if you normally get just two or three sign-ups per day, then ten click-throughs won't produce a reliable result. The larger the sample size, the greater the reliability of your test results. However, the result will also depend upon the difference in performance. If you normally expect a 5% sign-up rate from your blue button, you will need to determine what change in volume will make the variation meaningful. If you are able to test in thousands, a rise to 5.6% may mean a significant increase in business, but if you are only testing in tens the result will not be reliable. Whilst testing with low traffic will never achieve statistically significant results, it will still provide a level of insight, though you will need to repeat the test frequently to ensure you are getting the best conversion rate.
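The article doesn't prescribe a particular statistical method, but a standard way to check whether a difference like 5% versus 5.6% is more than noise is a two-proportion z-test, sketched here using only Python's standard library:

```python
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion
    rates. A small p-value (conventionally below 0.05) suggests
    the difference is unlikely to be down to chance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)   # overall conversion rate
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 5.0% vs 5.6%: 500 vs 560 sign-ups from 10,000 visitors per version
print(two_proportion_p_value(500, 10000, 560, 10000))
# The same rates in tens: 5 vs 6 sign-ups from 100 visitors per version
print(two_proportion_p_value(5, 100, 6, 100))
```

On the numbers above, 500 versus 560 sign-ups from 10,000 visitors each gives a p-value of roughly 0.06 – borderline even at that scale – while 5 versus 6 sign-ups from 100 visitors each gives a p-value above 0.7, confirming that a test in the tens tells you very little.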
Newsletters and emails
E-newsletters and mass email marketing face a great deal of competition in a crowded inbox, so making your message stand out could be the deciding factor as to whether your email is opened or not. Testing which title has the greatest click-through rate before mailing to the rest of your database could mean a significant difference in your campaign success.
Internet advertising
Google AdWords is the ultimate tool for A/B testing – it was made for it! You can create any number of advert variations and measure their success with Google Analytics. For example, if your business is CRM systems you might test the following advert titles:
- CRM made simple
- CRM for small business
- CRM Free Trial
You might also test different landing pages on your website with the same advert to see how that impacts upon your results. You can then set up campaigns between AdWords and Google Analytics to accurately record your click-throughs, sign-ups and sales by determining the page a customer needs to get to for the transaction to be qualified.
The campaign should run over a set period of time, say 7 days, with these advert variations delivered in equal numbers at the same time of day. You might also use a test to see which day of the week, or time of day works best for your target audience. With accurate results Google Analytics allows the savvy marketer to schedule an advert with the most powerful title, at exactly the right time of day on the right day of the week. The result is that the advertising spend can be targeted where it is most effective, and an improved ROI can be achieved.
A/B testing not only provides you with quantitative data that is hard to argue with, it also provides insight into customer behaviour that can be used across other areas of your marketing. If you know that a red call-to-action button is more effective than a blue one, you might use it on other web pages. And if you know that you get a better response to one newsletter title than another, use that insight to shape other text in your promotional materials.
It is likely that different versions of your web pages or campaigns will appeal to different customer segments, for example by gender, age, geographic location or industry. If so, use this intelligence to further target your marketing, matching customers to particular products or services.
A/B testing should not be considered a one-off activity. People and trends change, so run regular checks to test your results and for each new campaign or product. Keep testing on a regular basis and remember that not every test will work. Be prepared to start again if you don't get a decisive response.
Make sure you plan a time period for your test and a minimum number of responses needed to make it meaningful – ending it too soon could mean inconclusive results and dragging it on longer could lead you to select a poor version.
And finally, don't be tempted to let your instinct overrule a test result – sometimes the outcome can be surprising!
Examples of communications where A/B testing would prove useful:
- Email marketing campaigns
- Web sites
- Internet advertising (banner/PPC/AdWords)
Examples of the variables tested:
- Subject titles and subtitles
- Product descriptions
- Text (length, style)
- Call to action button (text, colour, position)
- Colour schemes
- Forms (length, questions)
- Page layouts
Helen Armour is marketing manager at Really Simple Systems.