Brenda Peterson
How to Do A/B Testing: A Comprehensive Guide for Beginners

How often do you face the problem of choice in your digital marketing? How do you decide what’s right for your audience? What choices allow you to get the most out of your strategy and boost your conversion rate? These are the questions you can’t avoid when building a business website, creating content for your online presence, and engaging in marketing activities.

But why leave your success to chance if there’s a proven way to find out which choices are most likely to bring the results you’re counting on? The solution is A/B testing. In this article, we will take a closer look at this marketing technique and answer the most important question: how do you do A/B testing?

What is A/B testing?

A/B testing, also known as split testing, is an experimental method in digital marketing that compares the performance of two variants of a single piece of content in order to identify which of them should ultimately be used to yield the best results.

Most marketing activities come down to increasing the conversion rate, which is basically the ability of your campaign to turn prospects into customers. In this respect, A/B testing is a way of proving that one variation is more effective than another in terms of maximizing the conversion rate.

Here’s a simple example. You want to send a marketing email but have a hard time deciding which of two subject lines is more likely to convince recipients to open your message. What you need to do is split part of your audience into two equal-sized groups and send each group the email with a different subject line. The version with the better open rate should then be sent to the rest of your target audience. This is a typical A/B test in action.
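The split described above can be sketched in a few lines of Python. The function name and the example addresses are illustrative assumptions, not part of any particular email tool; the key idea is that the assignment to groups is random, so the two halves are comparable:

```python
import random

def split_audience(emails, seed=42):
    """Shuffle the mailing list and split it into two equal-sized
    groups for an A/B test (hypothetical helper)."""
    pool = list(emails)
    random.Random(seed).shuffle(pool)  # seeded only for reproducibility
    mid = len(pool) // 2
    return pool[:mid], pool[mid:]      # group A, group B

# Example: 1,000 subscribers -> two groups of 500
subscribers = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_audience(subscribers)
print(len(group_a), len(group_b))  # 500 500
```

Random assignment matters: splitting alphabetically or by signup date could smuggle a hidden bias (e.g., older subscribers open less) into one of the groups.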

For different industries, systematic and consistent split testing can result in a multitude of benefits. Optimizely reports the following statistics:

  • Media: A/B testing increases the average number of pageviews by 29% and reader engagement by 14%.
  • Ecommerce: shopper engagement grows by 13% while revenues are increased by 21%.
  • SaaS: up to 29% more pageviews and 17% higher customer engagement.

That said, it is hard to ignore the importance of A/B testing for optimizing your conversion rate, as it is one of the primary methods of minimizing the randomness in how your decisions translate into conversions.

After all, doing business while relying solely on your “gut feeling” can hardly get you too far, while actual empirical data obtained as a result of an experiment will increase your chances for success.

What can be A/B tested?

The short answer: almost anything, as long as it makes sense. Anything that can potentially influence the behavior and decision-making of your prospects can be subjected to an A/B test:

  • Blog: Testing headlines, subheadlines, paragraph text and length can increase your blog traffic and reader engagement.
  • Homepage: Both design elements (images, page layout, header, footer, navigation menu, color scheme, etc.) and content (headlines, text descriptions, testimonials, awards, media mentions, etc.) can be A/B tested in order to decrease bounce rate (keep visitors longer on your homepage) and boost conversions.
  • Landing page: Consider split testing pages featuring different images, registration forms, and CTAs (call-to-action) to achieve the best conversion rate. Even the slightest details, such as CTA button design, text, color, and on-page location, can impact the effectiveness of your landing page.
  • Product page: Optimizing your product names and descriptions, product images, and sales copy through A/B testing will lead to more purchases on your online store.
  • Email marketing: Different variables, such as the subject line, headline, salutation, and body copy and its length, can influence your email’s open rate, reader engagement, and conversion rate.
  • Online advertising: The effectiveness of your ads can be enhanced by A/B testing different headlines, advertising copy, and offers, allowing you to cut down on your advertising expenditures.
  • SEO: A/B testing can help you optimize your site for search engines if you have enough time and expertise for experimenting with different meta-tags, keywords, redirects, variation URLs, etc. But be careful not to get penalized by Google for violating the Webmaster Guidelines.

What’s needed for A/B testing?

Split testing is all about choosing one of two available variants of one and the same thing. Your choice depends on the feedback you receive after testing both samples under equal conditions. This feedback is reflected in quantitative indicators such as purchases, clicks, views, likes, shares, open rate, etc.

Thus, for your A/B test to be accurate, you need to make sure you have the following:

  1. An object of testing and its variable: The object is a piece of content or a campaign you want to test; the variable is a specific part or property of this object that can have two variations. Returning to our previous example, our object is a marketing email. The variable is its subject line.
  2. Variation A: The first version of your variable. In our case, it is one of the subject line formulations that you have in mind.
  3. Variation B: The second version of your variable. The alternative subject line, respectively.
  4. Two equal audience groups: Your goal is to make sure that variations A and B are tested on roughly equal groups of people; otherwise, the results of your experiment will be skewed. In the case of our email, all you need to do is take a portion of your mailing list (this will be your test group) and split it into two halves. When testing website pages, however, you are dealing with an “infinite” audience, as you don’t know how many visitors will come. In this case, it makes sense to divide your traffic fifty-fifty: half of your visitors are shown version A, while the other half see version B. Make sure each visitor is consistently shown only one variant; most A/B testing tools can ensure that.
  5. Hypothesis: Depending on what exactly you’re testing, you need to make an assumption as to the outcome of your experiment. For example: “email A is more likely to be opened than email B, because [...]”. This allows you to clarify your goals upfront and determine which metrics you will use to confirm or refute the hypothesis.
  6. Testing metric: The metric you will rely on to identify which variation, A or B, performs better. Following our hypothesis, in our case it is the email open rate. If variant A gets more opens than variant B by the end of the testing period, it performs better and your hypothesis is confirmed. If variant B beats variant A, the hypothesis is refuted, and it makes sense to send the email with subject line B to the rest of your list.
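Whether the winner’s lead is real or just random noise can be checked with a standard two-proportion z-test. The sketch below uses only the Python standard library; the open and send counts are made-up numbers for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    """Return (z, p_value) for H0: both variants have the same open rate."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)  # pooled open rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return z, p_value

# Hypothetical results: A opened 120 of 500 sends, B opened 90 of 500
z, p = two_proportion_z_test(120, 500, 90, 500)
print(f"z = {z:.2f}, p-value = {p:.4f}")
```

A conventional threshold is p < 0.05: below it, the difference in open rates is unlikely to be a coincidence; above it, treat the test as inconclusive and keep collecting data.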

Tips for effective A/B testing

A/B testing is a relatively simple experiment that allows you to figure out what works best for your campaign. However, for your test results to be accurate, consider taking the following recommendations into account:

  • Apply A/B testing where it is most needed: Any experiment takes time to complete. You will hardly have time to A/B test everything, so focus on things that directly influence your conversion rate, e.g., CTAs, marketing emails, etc.
  • Study analytical data before conducting a test: Avoid running any tests blindly. Otherwise, you’ll just waste your time. Use tools like Google Analytics to find problematic areas within your site and conversion funnel. Look out for pages with high bounce rate, ineffective CTAs, etc. Consider using additional instruments to analyze your visitors’ behavior, for example, heat maps and on-page surveys.
  • A/B test no more than one variable at a time: Remember that your goal is to isolate the element in question from its environment to test which one of its variants would help the whole thing perform better. If you examine multiple variables at once, you will not be able to say for sure what exactly has influenced the results. For instance, if you test different versions of a CTA button, make sure you introduce one change at a time (color, size or text).
  • Test both variants simultaneously: Timing can also affect the results of your test. Hence the importance of testing one variant alongside another.
  • Give your test enough time to produce credible results: It is hard to say how long you should run your A/B test in order to obtain trustworthy results. It depends on what exactly you test and how large your audience is. Generally, the less traffic your site gets, the longer it takes to get substantial data.
  • Implement the changes based on your results: Last but not least, make sure that the time and effort invested in A/B testing are put to good use. If you get significant results, introduce them strategically in your campaign. If no significant data favors either of the variations you’ve tested, the test was inconclusive: either run another test or go with whichever variant you prefer.
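How much data is “enough” can be estimated up front with a standard sample-size formula for comparing two proportions. In the sketch below, the 5% baseline conversion rate, the hoped-for lift to 6%, and the conventional significance and power levels are all assumptions you would replace with your own numbers:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.8):
    """Visitors needed in EACH group to detect a change in conversion
    rate from p1 to p2 with a two-sided test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # approx. 1.96
    z_beta = NormalDist().inv_cdf(power)           # approx. 0.84
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# Assumed: 5% baseline conversion, hoping to detect a lift to 6%
n = sample_size_per_variant(0.05, 0.06)
print(f"{n} visitors needed per variant")
```

Detecting a one-percentage-point lift at these settings requires thousands of visitors per variant, which is exactly why low-traffic sites need to run tests longer.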

Conclusion

A/B testing is not compulsory; you can use it to minimize the guesswork in your marketing campaign, or not. However, after conducting a few successful tests, you will discover that some variants perform much better than others, and that knowing which ones can bring you more profit, make decisions easier, and help you plan marketing materials.


Brenda Peterson

Brenda is a Technical Specialist at Ning