How to use A/B Testing for Better Conversion and Engagement?

With the power of data in our hands, most companies are leaning towards a data-driven approach – and A/B testing is one of the main activities that drives that kind of decision-making. It's the most popular way to test in digital marketing, so how can you use it to drive better results in your campaigns?

There was a time when marketers weren’t able to accurately define what drove the results of their campaigns. But that changed with the introduction of data analytics.

With the right strategy and the right tools, we don’t have to rely on our instinct to make changes on our website, in our emails, customer surveys, or any other type of user-facing content or interface. We can test different versions, and pick the ones that generate greater results.

The most popular way to test in digital marketing is A/B testing. What is it, and how can you apply it to your website, emails and customer surveys?

Short Guide To A/B Testing

A/B testing is an approach used to compare two different versions of marketing content to find out which one works better. In other words, to figure out which one performs better against a specific conversion goal. Let's examine how it works across three things you might want to test.

Website A/B Testing

To test your website you need a platform to create different versions of a page, split incoming traffic, and display the different versions to users at random. But you also need to collect and analyze the data from your experiments.
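Under the hood, the traffic-splitting step the tools below handle for you boils down to something like this minimal sketch (the helper and visitor IDs are hypothetical, not tied to any specific platform): each visitor is randomly assigned a variant, and the assignment is remembered so they keep seeing the same version on return visits.

```python
import random

def assign_variant(visitor_id, assignments, split=0.5):
    """Return a sticky A/B assignment for visitor_id.

    assignments is any dict-like store (in practice a cookie or database);
    split is the probability of landing in variant "A".
    """
    if visitor_id not in assignments:
        assignments[visitor_id] = "A" if random.random() < split else "B"
    return assignments[visitor_id]

assignments = {}
first = assign_variant("visitor-42", assignments)
second = assign_variant("visitor-42", assignments)
# The same visitor always gets the same variant:
assert first == second
```

The "sticky" part matters: if a visitor saw version A yesterday and version B today, their behavior would be influenced by both variants and the data from that visitor would be useless.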

Depending on which tool you pick and how you use it, you can test almost anything on your website:

  • Headlines
  • Images and videos (placement, or versions with and without)
  • Placement of call-to-action or sign-up
  • Value propositions
  • Button location / type / color / description

Because there is so much to choose from, it’s important to focus on the things that have a real impact. For example, it’s not the best idea to test two completely different versions of a site against each other, as the results will tell you which one worked better – but they won’t tell you why.

When you test one thing at a time, and it's a change that has a large impact, you can draw valid, specific conclusions from the test. A good example could be placing a sign-up button above the fold OR in the middle of the page.

[Image: Power of Data Analytics for A/B Testing – source: https://unsplash.com/photos/8HhtNLFa07I]

Website A/B Testing Tools

There are plenty of different tools for website A/B testing, like:

  • Unbounce – platform for creating and testing landing pages, with an intuitive drag-and-drop builder, easy to use testing tools, and insightful analytics.
  • VWO – an easy tool that allows you to change elements and content on your website to create different versions, run experiments, track specific conversion goals to easily evaluate performance, and continuously optimize your site.
  • Google Analytics Experiments – a powerful A/B testing tool that makes it possible to split-test multiple versions of a web page and deliver them to users through different URLs. It can be configured in a myriad of ways, from using it to handle the whole test, along with redirecting users to separate versions of the website, to just using the Analytics engine for reporting, and relying on another tool to display different website versions.

Another important thing is to take your time and allow your experiment to build up a decent amount of data. For e-commerce companies, social media platforms, or popular SaaS products, generating a lot of traffic will be easier – but a B2B startup in a niche market with low website traffic needs to be much smarter about A/B testing.

However, testing on low-traffic sites can still work wonders – as long as you only test things that have the biggest impact on visitors, as Optimizely’s Steve Ebin confirms. He provided an example test that he ran for MuleSoft, on a page with only 75 visits per day. In less than two weeks, he was able to conclude that presenting data in a different way resulted in a 40% improvement in conversion.
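How do you know when a result like that is real and not just noise? One common approach is a two-proportion z-test on the conversion counts. The sketch below uses made-up numbers (roughly 500 visitors per variant over two weeks – the 40% MuleSoft figure above is Optimizely's, not reproduced here) purely to illustrate the calculation.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value(z):
    """Two-sided p-value from a z-score, via the normal CDF."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical data: variant A converted 50/500, variant B 75/500.
z = two_proportion_z(conv_a=50, n_a=500, conv_b=75, n_b=500)
significant = p_value(z) < 0.05  # conventional significance threshold
```

A p-value below 0.05 suggests the lift is unlikely to be pure chance – which is exactly why low-traffic sites should test big, high-impact changes: small lifts take far more visitors to detect.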

A/B Testing Your Emails

Here things get a little bit easier. What can you test in an email campaign? For example:

  • Target groups
  • Subject lines
  • Email content
  • Sending times

A/B testing in email campaigns doesn't require software as sophisticated as website A/B testing. You can run a valid test with just your Gmail account and a spreadsheet, because all you need to do is divide your contact list in half, send each half a different version of your email, and see which group responds or clicks the most. But it will be easier with tools like Woodpecker or Lemlist (for cold outreach), or MailChimp and MailerLite (for newsletters, with built-in A/B testing suites), which collect the data and provide useful automation tools.
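The "divide your contact list in half" step deserves one caveat: the halves should be random samples, not (say) the first and second half of an alphabetically sorted list. A minimal sketch (the helper name and addresses are hypothetical):

```python
import random

def split_contacts(contacts, seed=None):
    """Randomly split a contact list into two roughly equal groups."""
    shuffled = contacts[:]                 # copy, don't mutate the original
    random.Random(seed).shuffle(shuffled)  # randomize order first
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

contacts = ["ann@example.com", "bob@example.com",
            "cat@example.com", "dan@example.com"]
group_a, group_b = split_contacts(contacts)
# group_a receives subject line A, group_b receives subject line B.
```

The optional `seed` makes the split reproducible, which is handy if you need to re-run the export for your spreadsheet.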

Just like in website testing, you want to focus on the things that have the most impact, and test one thing at a time. For example, don’t send two different email versions with two different subject lines to two different target groups, because a test like this won’t give you any conclusive results.

But test one cold email campaign on two different target groups, and you’ll be able to see which one converts more. Or test two subject lines in your campaign, and you’ll see which one generates more opens.

A/B Test Surveys

Let’s move on to A/B testing in customer surveys and feedback forms.

This is a tricky one, because it can involve a bit of website and / or email testing. You can A/B test different versions of questions in your surveys, but you can also test how (and when) you ask your customers to fill out your survey.

To start with, you can use a basic tool like Google Forms to create your surveys, and perform tests like:

  • Send the survey to one half of users right after sign-up, and a week after sign-up to the second half
  • Ask one half of users to fill in a customer survey on a post-sign-up web page, and the other half in a post-sign-up email
  • Send a different survey (for instance, one limited to only the most important questions, and one with additional questions) to each half of your customer base
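For tests like these, the assignment can even be stateless: hashing the user ID (a common bucketing trick, sketched below with hypothetical cohort names) means the same user always lands in the same cohort without storing anything.

```python
import hashlib

def survey_cohort(user_id, cohorts=("right-after-signup", "one-week-later")):
    """Deterministically map a user ID to one survey-timing cohort."""
    digest = hashlib.md5(user_id.encode()).hexdigest()  # stable hash
    return cohorts[int(digest, 16) % len(cohorts)]

# Same user, same cohort, every time:
assert survey_cohort("user-17") == survey_cohort("user-17")
```

Because the mapping never changes, your "survey right after sign-up" and "survey a week later" groups stay clean even if the assignment code runs in several places.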

But you can turn it up to 11 (Spinal Tap references, anybody?), and use customer feedback software like Feedier which will enable you to use one platform to test:

  • What questions you should ask
  • Where and when your users/customers are most likely to click on your survey
  • How you should reward users for giving feedback
  • What should be the frequency of your feedback loop

Let’s move on, and look at some examples that will show us how effective A/B testing can be.

That feeling when you got the right version of your form – thanks to A/B testing

Examples of A/B testing

These examples come from AB Tasty.

Testing Additional Information On a Website

In a test carried out on Kiva.org, a non-profit organisation's website, the team wanted to increase donations from visitors seeing their page for the first time. They hypothesized that adding more information to the site would help increase donations.

The test resulted in an 11.5% increase in first-time visitor donations, thanks to adding a simple information & FAQ box at the bottom of the page. The actionable conclusion from this test is that being upfront in answering the questions website visitors might have increases the credibility of the site, and can in fact increase conversion rates.

More Clicks Thanks To Testing Subject Line

The organisers of an email blast sent out by Designhill.com wanted to promote a blog article, and they tested two different subject lines. The first one was just the blog post title, and the second one had a CTA added at the beginning: “Check out my latest post -”.

As it turned out, the version without an added CTA ultimately performed better. The short subject line generated a 2.57% increase in open rates, and a 5.84% higher click-through rate.

As you can see, these tests revolved around a single, specific variable, which is why they provided clear results and actionable conclusions to learn from. This brings us back to what we said earlier – when you’re A/B testing, focus on testing one thing that has a big impact.

Wrap Up

You don’t need to test everything – but you should test the important things.

If you rely on a gut feeling to make decisions about your website, marketing campaign, or customer survey, it can only take you so far. Why gamble, when you can make informed decisions based on data?

A/B testing is a useful approach to optimize content, interfaces, campaigns, and anything in between. But doing one-off tests every once in a while isn’t what it’s about. The best approach seems to be adopting a culture in which testing, and using data to drive informed decisions, are the norm.

Integrate testing into your business, use it to optimize your performance – and ultimately increase your revenue.
