A/B testing will help you get the most out of your newsletters. What are split tests, why are they so popular and how do they take your marketing up a notch? You will learn everything in this article – including step-by-step instructions and specific examples.
What Is A/B Testing?
A/B testing, or split testing, is a method used in marketing to find out which of two variants of a certain element performs better. You will come across it mainly in email marketing, but it is also common on websites and in banner ads.
During an email A/B test, you divide your audience into two parts. Then you send each group a slightly different newsletter. It can have a different subject line, a different CTA (call-to-action) button colour or a different sender. You then evaluate which version of the newsletter performs better. And voilà, you can take your marketing to the next level!
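The split itself is something your mailing tool handles for you, but the principle is simple. Here is a minimal sketch (the recipient addresses are made up for illustration) showing why the assignment should be random: random groups are statistically comparable, so any difference in results can be attributed to the variant.

```python
import random

def split_audience(recipients, seed=42):
    """Randomly split a recipient list into two equal groups for an A/B test.

    Random assignment (rather than, say, alphabetical) keeps the two
    groups comparable, so a difference in open or click rates can be
    attributed to the variant itself.
    """
    shuffled = recipients[:]              # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical example addresses:
audience = [f"user{i}@example.com" for i in range(10)]
group_a, group_b = split_audience(audience)
```

Group A receives variant A, group B receives variant B, and every recipient ends up in exactly one group.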
What is A/B newsletter testing good for?
Thanks to split tests, you can:
- create a newsletter tailored to your target audience
- increase the opening rate
- increase your clickthrough rate
- increase conversions
You will fine-tune your newsletters so that both you (because you increase sales) and your customers (because they get the newsletters they want) are satisfied with the end result.
Tip: Are you concerned with customer satisfaction? Read 75 facts, quotes and statistics about customer service.
What do split tests actually test?
The best newsletter is not the one you like the most, but the one your target audience likes the most and that gets the most openings, clicks, or conversions (depending on your goal). You simply have to put together an ideal newsletter in which all the elements are exactly as they should be.
Customers can be affected by many different variables, from the length of the subject line and the location of a graphic element to the use of emoticons. You can test basically anything you want. But if you want to get to the point quickly and not waste your precious time, focus on the following elements:
1. Subject line
The subject line can determine if customers even open the email at all. You should therefore pay proper attention to it. Everywhere on the internet you will find a lot of advice on how to write the best subject line in the world, but the truth is that every audience is different and there is no universal advice. An A/B test will help you discover the best subject line that is tailored to your customers.
- The length of the subject line. Usually, the shorter the better. But your audience may be an exception. Try it.
- Use of emoticons. Will smileys attract attention in the subject line and engage your customers, or will they discourage them?
- Use of numbers. For most people, numbers in a subject line act like a magnet. 5 guaranteed tips on how to… or the 10 most beautiful places you must visit. Try a variant with a number and a variant without a number.
- Questions. Do questions engage your customers more than statements? Or perhaps even commands?
- Communication style. You may already know how your customers speak and which slang terms they use. Or you may need to find out – using a split test.
2. Date and time of dispatch
The day and time when the customer’s email arrives play a more important role than it might seem.
Imagine, for example, a manager who receives an excellent business offer – on a Friday night. They skim through their emails and decide to return to them on Monday morning when they are back at work. But guess what? By Monday, the email is forgotten.
Or, on the contrary, imagine a person who receives a newsletter about new clothes at their favourite store while they are at work. They can’t browse the offer during working hours, so they put the newsletter off until later. But as with the manager, “later” never comes.
The general rule, as our two examples suggest, is: send B2B emails during working hours and B2C emails outside them. But it won’t come as a surprise that your audience may prefer something completely different, which is where A/B testing comes into play again. Use a split test to find out:
- if a weekday or weekend works better
- if it’s better to send emails in the morning or in the evening
- if it’s better to send emails during or outside working hours
3. Sender
People are more likely to open an email from someone they know – and from someone they remember giving permission to. Try different sender options to see what works best, for example:
- company name (XY)
- company name with an explanation of what it does (XY – smart vacuums)
- name and position of the owner (Robert Smith, CEO of XY)
- first name of owner (John from XY)
When you choose the sender, you set the tone of the email straight away. Above all, you influence whether customers associate the newsletter with your company, or automatically mark it as spam or send it straight to the trash.
4. Length of the newsletter
When it comes to corporate blogs, the SEO wisdom is that long-form articles (often 2,000 words or more) tend to perform better than short ones. But what about newsletters? Again, it depends on the audience. In some cases, you score with long text that goes into detail and describes the offer at length. Other times, just a few sentences will get you a better conversion rate.
Create two variants of the same newsletter, with the same topic, subject, sender and other elements, but different in their length. Then watch which one has better results.
5. CTA button
The CTA (call to action) button is a graphically differentiated field that instructs customers on what to do: buy, book, read more, etc. This little button has tremendous power. How well it will work depends on its colour, length, size and text.
Also test whether it works better for your customers when the newsletter contains one distinctive CTA button, or when it has several. And find out if it’s better for you to place the button at the beginning, end or middle of the newsletter.
6. Graphics
There is no need to speculate about whether graphics matter: customers spend around 10% more time on the web browsing graphics than reading text. A split test should therefore also cover the graphics of your newsletter. Find the best:
- font colour
- background colour
- opening image
- photo style
- other graphic elements
Caution: Only test one element at a time
Whatever element you decide to A/B test, the most important thing to remember is to test only one at a time. If you send one variant of an email with a pink CTA button at the beginning and another with a blue CTA button at the end, how will you know why the winning variant performed better? Was it the colour or the placement of the button? Therefore, everything except the element you are testing must be identical in both variants of the email.
Performing an A/B test on a newsletter step by step
A/B testing is not rocket science. Four steps are enough to successfully complete the split test:
1. Select the element to be tested
What is the goal of the A/B test? What do you want to find out and what do you want to achieve? Choose one thing that is most important to you and choose the element you will test accordingly.
Is the number of people who open the email crucial to you? Then focus on A/B testing of the subject line and gradually try variants with different lengths, styles and emoticons. Wondering how to increase your clicks? Take the CTA buttons into consideration and test the ideal number, location, colour and text.
Preparation of an A/B test in Mailchimp
2. Create an audience
A/B testing only makes sense if you have a large enough audience. Otherwise random variation will dominate and the results won’t tell you anything reliable. Ideally, you should have at least 2,000 recipients for your split test.
Another criterion is whether the given audience is actually relevant to you. If you want to test how many seamstresses open a newsletter offering new fabrics, there is no point in sending a split test to blacksmiths. Therefore, create an audience segment that is both large and relevant at the same time, taking into account the goal of the A/B test.
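Where does a figure like 2,000 come from? It can be estimated with the standard two-proportion sample-size formula from statistics. The sketch below (plain Python, with the usual z-values for ~95% confidence and ~80% power) is an illustration, not a rule from any particular mailing tool; the open rates in the example are hypothetical.

```python
import math

def sample_size_per_group(p1, p2, alpha_z=1.96, power_z=0.84):
    """Rough per-group sample size for detecting a difference between two
    rates (e.g. open rates p1 vs p2) at ~95% confidence and ~80% power.

    Standard two-proportion formula; 1.96 and 0.84 are the usual z-values.
    """
    p_bar = (p1 + p2) / 2
    numerator = (alpha_z * math.sqrt(2 * p_bar * (1 - p_bar))
                 + power_z * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Hypothetical goal: detect a lift from a 20% to a 25% open rate.
n = sample_size_per_group(0.20, 0.25)
# Roughly 1,100 recipients per group, i.e. over 2,000 in total.
```

Note that the smaller the difference you want to detect, the larger the audience you need.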
3. Perform the split test
Once you are clear about the goal of the A/B test, the element being tested and the audience, performing the test is a breeze. Just click through the mailing tools that will guide you through the entire A/B test.
For example, you can easily create split testing in Mailchimp:
Creating a split test in Mailchimp
4. Evaluate the A/B test
After a few days, open the mailing tool again and evaluate the split test. It’s not complicated – the tool will show you the results itself. Just hover your mouse over the sent newsletter and view the report. Sometimes, however, errors occur, and especially if the differences between the results are only slight, it’s worth repeating the test.
Split test results in Mailchimp
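The mailing tool evaluates the test for you, but if you are curious what “the differences are only slight” means statistically, the usual check is a two-proportion z-test. The sketch below is a minimal illustration with made-up numbers, not any mailing tool’s actual API.

```python
import math

def open_rate_z_test(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference in open rates between
    variants A and B likely real, or just noise? Returns (z, p_value)."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal distribution, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: variant A opened 260/1000 times, variant B 200/1000.
z, p = open_rate_z_test(opens_a=260, sent_a=1000, opens_b=200, sent_b=1000)
```

A p-value below 0.05 is the conventional threshold for calling the difference significant; if it is above, the honest conclusion is “too close to call” and the test is worth repeating, as mentioned above.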
Which tools can you use?
A/B testing has proven so successful in practice that it is currently offered by most mailing tools. You can find it, for example, in Mailchimp, Ecomail or SmartEmailing applications, but also in many others.
There are situations where you can test multiple elements at once
Sometimes you are pressed for time or simply don’t want to run a whole series of A/B tests. In those cases it might be time for multivariate testing, also called A/B/N testing (where N stands for the number of variants tested).
However, such tests are less accurate, require more advanced knowledge and are more difficult to evaluate. In addition, they require a much larger audience to avoid significant statistical discrepancies.
The most common mistakes in A/B testing – do you make them too?
No one is born an expert, so it is perfectly normal to make mistakes when creating A/B tests. Which mistakes are the most common and how can you avoid them?
- Small audience size. If your segment doesn’t have enough recipients, the split test evaluation will be inaccurate.
- The tested variants are too different. During A/B testing, you should only make minor changes between the variants, such as the use of a particular word or the colour of a particular element. When you decide to replace an entire section of a newsletter, you’ll never know what caused a potential change in results – the colour? Text? Element size?
- Variants sent at different times. It often happens that the results of the split test are inaccurate because the recipients receive the emails at different times. If you send variant A in the morning and variant B in the evening, you will never know whether the success of one of the variants was caused by a change in the element or the different sending time. Therefore, you should send both variants at the same time – so that the tested element is really the only thing that distinguishes the newsletters.
More accurate results in less time
Have you been working on your A/B testing late into the night, but your newsletters still aren’t getting the desired results? Don’t fret, it’s completely normal. Like any other craft, A/B testing requires a lot of practice.
That’s why you have two options: continue to do split tests yourself, devote time and effort to them, educate yourself and gradually learn how to do them, or leave A/B testing to the experts and get more accurate results in less time.
Leave A/B tests or even your entire email marketing strategy to us. What do we offer?
- Your emails will be more successful.
- A/B testing will take much less time.
- You will avoid mistakes and get accurate results.
- You will free your hands and be able to focus on other parts of your business.
It only takes a few clicks – just fill out our contact form and briefly describe what you need help with. Our marketing specialists will contact you as soon as possible and discuss the best possible solution for you.