
Why and how should I A/B test emails?

By Felix Higgs

Published Dec 15, 2021


Summary: Enhance your email marketing with A/B testing. Knak’s guide covers how to test elements like subject lines and CTAs for improved engagement.

A/B testing. We hear about it all the time in marketing, but what is it really?

In the simplest of terms, it’s the process of comparing two variations of the same thing and assessing how that change impacts performance. For example, when it comes to emails, that could mean sending two versions of the same subject line to two similar segments of your subscriber pool to see which performs better. The goal? To build a better understanding of what works – and what doesn’t – in your email campaigns.

With me so far? Great. Now that we have the definition out of the way, let’s dive into why this process is important for marketers.

Why should you A/B test?

For starters, as organizations become more data-driven, they need tools and processes that can collect the insights necessary to make informed decisions. A/B testing emails is one such process for marketing teams. Research shows that this approach delivers real-world results around engagement, click-through rates, and other conversion indicators.

With this information, you can improve your reader experiences, increase key metrics, reduce your abandonment rates, and optimize your content with very little risk or investment. This, in turn, can translate into increased revenue for your business. By optimizing your email campaigns, you’re more likely to increase conversions and the amount of money your customers spend on your brand. Thus, A/B testing is not only a marketing tool but also a strategic, revenue-generating one.

The one caveat to keep in mind is that you should make sure the results are statistically significant enough to influence a decision. In other words, you need to be certain that the shift in engagement is actually due to the variation you’ve made, and not a random human glitch because Mars is in retrograde.
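If you want a quick gut check on significance, one common approach is a two-proportion z-test. Below is a minimal sketch in Python using only the standard library; the send and click numbers are made up for illustration, and many email platforms will report something similar for you automatically.

    # Minimal sketch: two-proportion z-test for an email A/B test.
    # All numbers are illustrative; treat this as a sanity check, not a verdict.
    from math import sqrt, erf

    def two_proportion_p_value(conv_a: int, sends_a: int, conv_b: int, sends_b: int) -> float:
        """Two-sided p-value for the difference between two conversion rates."""
        rate_a = conv_a / sends_a
        rate_b = conv_b / sends_b
        # Pooled rate under the null hypothesis that both variants perform the same.
        pooled = (conv_a + conv_b) / (sends_a + sends_b)
        std_error = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
        z = (rate_a - rate_b) / std_error
        # Normal CDF via erf, doubled for a two-sided test.
        return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

    # Example: subject line A drove 240 clicks from 5,000 sends,
    # subject line B drove 285 clicks from 5,000 sends.
    p_value = two_proportion_p_value(240, 5_000, 285, 5_000)
    print(f"p-value: {p_value:.3f}")  # roughly 0.04, under the common 0.05 threshold

A p-value under 0.05 is a conventional (if imperfect) cutoff; if your result doesn’t clear it, treat the test as inconclusive rather than declaring a winner.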

What should you test?

When it comes to A/B testing an email, you can test virtually any element within it. Broadly, there are two approaches that give you actionable results: you can either test two versions of an individual element – like a subject line or CTA button – or test two completely different emails. The problem with the latter, however, is that it’s harder to get a true sense of what’s working and what isn’t.

So, if you’re going to take the individual element route (and we think you should), here’s a list of some of the things you can test.

Subject line and preview text

Just like the eyes are the windows to the soul, your subject line and preview text are the windows to your email. If you don’t get these right, the rest of your email doesn’t really matter. This makes them an important set of elements to test when creating emails for your audience.

Try testing:

  • Different key phrases
  • Action-oriented language versus descriptive language
  • Emojis versus no emojis
  • Humour versus conservative language

CTA button

The goal of most – if not all – of your emails will be for your readers to convert and follow a link that leads to a product page, piece of content, or landing page. The most effective approach for your call-to-action button will likely change continually, but testing it on various campaigns can help set you up for success.

Try testing:

  • Where in the email you place the button
  • Copy length and style
  • Button shape and colour

Personalization

Today, our marketing automation platforms allow us to use customer data to deliver customized experiences. For some people, this is a big plus. In fact, adding someone’s name to a subject line can increase open rates by more than 14%.

Try testing:

  • Adding a first name in the subject line or intro
  • For B2B brands that have this info: adding the reader’s company name in the copy
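As a simple illustration, here’s a hedged Python sketch of how a personalized subject line variant might be assembled for a test. The field names and fallback text are assumptions for the sake of the example; in practice you’d use your email platform’s own merge tags, and the exact syntax varies by platform.

    # Hypothetical sketch: building the personalized variant of a subject line.
    # Field names and fallback are illustrative, not any particular platform's API.
    def personalized_subject(subscriber: dict, fallback: str = "there") -> str:
        first_name = subscriber.get("first_name") or fallback
        return f"{first_name}, we picked something for you"

    control_subject = "We picked something for you"               # variant A
    test_subject = personalized_subject({"first_name": "Avery"})  # variant B
    # Send variant A to one group and variant B to the other,
    # keeping everything else about the email identical.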

Copy

Once your readers open an email, they can either be enamoured or turned off by your copy. It’s worth testing different approaches to make sure you’re doing it right. Whatever language or terminology you choose, make sure it’s consistent with your brand and doesn’t feel jarring to anyone who’s engaged with other brand assets.

Try testing:

  • Tone (e.g. casual vs. informative)
  • Terminology (e.g. jargony terms vs. more accessible language)
  • The length of the message

Design

While some people are very into beautifully designed emails, others may prefer simple, text-only emails that get to the point. Do you know which your audience prefers? A/B testing different elements can help you find your sweet spot.

Try testing:

  • The number of design elements you bring into the email
  • Which brand colours you feature
  • The typography you use

Out of all of the above, we recommend initially focusing on the elements most directly tied to conversion, like the call to action, and less on the aesthetic elements. That said, where you prioritize your efforts should depend on what your audience is most likely to respond to.

3 strategies to improve your A/B testing

Knowing what to A/B test is all well and good, but to really tap into the benefits of this approach, here are three best practices for you to keep in mind.

1. Go in with a hypothesis

Your A/B testing shouldn’t be random. On the contrary, any changes you make to your emails should be informed and intentional, backed by industry trends, your subscriber behaviour, and what you’ve learned from other tests.

Like you learned in your grade 6 science class, every experiment starts with a hypothesis – and the same is true for A/B testing. If you’re choosing to move the CTA button to the top of the email or use a tone that’s more conversational, what do you think that will accomplish in terms of engagement? This way, you can gather data against that hypothesis and use the results to guide future testing down the road.

2. Avoid random impacts to your results

Remember: when you’re running an A/B test, you should really only focus on one element of the email, whether that’s the preview text, the header image, or the messaging. Everything else, including the audience segment and send time, should remain as consistent as possible. Otherwise, it’s almost impossible to pinpoint what (if anything) is responsible for different levels of engagement.

Let’s break that down with an example. Say you work at a ski company with a presence in North America and Australia, and your two primary customer segments are parents of new skiers and amateur adult skiers. You’re looking to A/B test a product launch email and want to evaluate which of your two subject line choices is better. Your two test emails should go to people within the same customer segment, in the same region (NA and Australia are in very different time zones, remember), and at the same stage in the marketing funnel. This way, you can be confident that other factors – like receiving an email at 10pm versus 9am – aren’t skewing your results.
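To make that concrete, here’s a minimal Python sketch (with made-up addresses and a hypothetical helper) of splitting one narrowly defined segment into two random halves, so the subject line is the only thing that varies:

    # Minimal sketch: a fair 50/50 split for the subject-line test above.
    # Assumes the segment was already filtered upstream (same region,
    # same customer type, same funnel stage); data here is illustrative.
    import random

    def split_segment(subscribers, seed=2021):
        """Randomly split one audience segment into two equal-sized groups."""
        pool = list(subscribers)
        random.Random(seed).shuffle(pool)  # fixed seed keeps the split reproducible
        midpoint = len(pool) // 2
        return pool[:midpoint], pool[midpoint:]

    # Example: North American parents of new skiers, all at the same funnel stage.
    segment = ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]
    group_a, group_b = split_segment(segment)
    # group_a receives subject line A, group_b receives subject line B,
    # and both sends go out at the same local time.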

3. Make the most of your learnings

Whether it has a positive or negative impact on reader engagement – or no impact at all – every A/B test you run will teach you something. Whatever the results, you can always use them to inform how you plan campaigns and where you prioritize your future A/B testing efforts. This is vital for any team that wants to be iterative in their marketing initiatives and stay proactive in meeting their readers where they are.

Curious to learn more about A/B testing? Get in touch. We’re always happy to share our best practices and continue the conversation around this important topic.



Author: Felix Higgs, Director of Implementations & Support, Knak
