How Knak Does Email: Email Testing At-a-Glance

If you’re an email marketer, you may have some questions:

Should I put that heart-eyes emoji in the subject line of my email or leave it out?
What format is best for my content?
Will that GIF give me better click-thru rates?
Should my CTA say, “Download Now” or “Get the Facts”?
Should my CTA button be blue or green?
And while we’re at it, should my CTA button be at the top or bottom of my email?

Fortunately, there’s a great way to answer all of those questions. It’s called A/B testing, and if you’re not doing it, it’s time to start.

We’re going to take an in-depth look at email testing in the coming months, so consider this A/B Testing 101, and read on for some of the basics.

What is A/B testing?

A/B testing, also known as split testing, is the process of sending one variation of an email to a subset of your subscribers and another variation to another subset to see which performs better.

The “A” refers to the control group, and the “B” refers to the challenger, or test, group.

Here’s a real-life Knak example:
We launched our Knak newsletter recently. Beginning with our September edition, we wanted to test the efficacy of including an emoji in the subject line.

We set up the test in Marketo to send the newsletter with the emoji to 25% of our subscribers and the newsletter without the emoji to another 25%. The test ran, and at the end of it, Marketo sent the email with the higher-performing subject line to the remaining 50% of our subscribers.
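To make those mechanics concrete, here’s a minimal sketch of the split logic in Python. It illustrates the general approach, not Marketo’s actual implementation; the subscriber list, subject lines, and send_email function are hypothetical placeholders.

import random

# Hypothetical subscriber list; in practice this comes from your platform.
subscribers = [f"user{i}@example.com" for i in range(1000)]
random.shuffle(subscribers)  # randomize so each test group is a fair sample

quarter = len(subscribers) // 4
group_a = subscribers[:quarter]             # 25% receive the control ("A")
group_b = subscribers[quarter:2 * quarter]  # 25% receive the challenger ("B")
remainder = subscribers[2 * quarter:]       # ~50% wait for the winning version

def send_email(recipients, subject):
    # Placeholder for your marketing platform's send call.
    print(f"Sending '{subject}' to {len(recipients)} subscribers")

send_email(group_a, "Your September newsletter is here")
send_email(group_b, "Your September newsletter is here 🎉")

# After the test window closes, compare performance and send the
# winner to everyone who wasn't in the test.
open_rate_a, open_rate_b = 0.22, 0.19  # illustrative numbers, not real results
winning_subject = ("Your September newsletter is here"
                   if open_rate_a >= open_rate_b
                   else "Your September newsletter is here 🎉")
send_email(remainder, winning_subject)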

(The results? As you can see below, in September, the subject line without the emoji performed better. In October, the winner was the one with the emoji. Stay tuned for the tiebreaker …)

[Figure: A/B test results for the September and October newsletter subject lines]

Why do I need to do A/B testing?

If you aren’t doing A/B testing, you’re losing valuable insight into the marketing tactics that resonate with your audience. Over time, those insights translate into dollars.

Gain the insight ⇒ turn it into better campaigns ⇒ reap the benefits.

If you don’t have the insight, your campaigns are at the mercy of your marketing team’s best guess, and you won’t have the benefit of hard data to support those guesses.

Our recent Email Benchmark Report notes that only 46% of those we surveyed say they regularly conduct A/B testing. That means 54% of email marketers are leaving a wealth of information on the table.

Join the minority: A/B test your campaigns, and get a leg up on your competition.

OK, let’s make it happen.

Here are some A/B testing dos and don’ts to help you get the most accurate results:

  1. Don’t test more than one thing at a time.
    We know you want to see which subject line/emoji/CTA/format/font color works best, but you have to stick with one thing at a time, or you won’t know which piece is influencing which result.
  2. Do send both tests at the same time.
    The time of day that your email is sent impacts your open rates. Don’t weigh a green CTA button sent at 4 PM on Sunday against a blue CTA button sent at 8 AM on Monday.
    (Hint: email send time is a great thing to A/B test on its own.)
  3. Do set goals for your tests.
    You need to know what you hope to gain from your A/B testing. Are you trying to improve open rates? Click-thru rates (the ratio of “clickers” to email openers)? (See the sketch after this list for how those metrics are computed.)
    The metrics you’re trying to improve will dictate which elements you’re testing, so channel your inner middle-school science student and start hypothesizing. Finding out whether what you think will work is actually what does work is a great learning exercise for your team.
  4. Do test more than once.
    If you truly want accurate results, you need to test and test again. If we had run our subject-line emoji test only once, we would have concluded the emoji was ineffective. But since the next month’s results showed something different, we know it’s worth exploring further.
  5. Don’t ignore the results.
    Collect the data and analyze it! Keep track of your results and use them to improve your campaigns. Do the results line up with your hypotheses? Do they support the types of campaigns you’re already sending? Do you need to consider tweaking your design? Insight like this is incredibly valuable and can have a lasting impact on your campaign results (and therefore your bottom line).
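As promised under tip 3, here’s a quick sketch of how the two most common email test metrics are computed. The counts below are illustrative placeholders, not real campaign data.

# Illustrative counts only; substitute your own campaign numbers.
delivered = 5000  # emails that reached an inbox
opened = 1100     # unique opens
clicked = 132     # unique clicks

open_rate = opened / delivered      # share of delivered emails that were opened
click_thru_rate = clicked / opened  # ratio of "clickers" to email openers

print(f"Open rate: {open_rate:.1%}")              # 22.0%
print(f"Click-thru rate: {click_thru_rate:.1%}")  # 12.0%

Whichever metric you pick as your goal, measure both variations the same way, so the comparison stays apples to apples.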

Help!

Marketo, Eloqua, and most other marketing automation platforms have built-in A/B testing capabilities. If you need help setting it up (or figuring out how to use it!), we can help!

After all, we’re marketers too, and we’d love to welcome you to the 46% of us who are improving our open and click-thru rates with A/B testing.

Tania Blake is Knak’s Marketing Manager, where she’s known for her strong attention to detail and unique ability to balance "process" with "getting stuff done". When she’s not juggling a million and one projects, you can find her cooking up a storm, doing yoga, or hanging out with her family.
