
Introducing A/B Testing


You can now A/B test your email outreach campaign using our powerful platform. Here are the basics:

  • Easily vary your Subject/Message with different sets of content.
  • Get separate open/click/reply statistics on each variation that’s sent.
  • Send to a portion of your list, decide which variation performs best, and then send the “winner” to the rest of your list.
  • The “winner” can be chosen automatically by our algorithm or by you.

A demo

Animation of A/B test
I’m A/B testing my subject and body. I set the test so that I pick the winner manually.

A simple A/B test

To conduct an A/B test on a campaign, there are two steps to take:

  1. Create a variation using our “spin” syntax in the subject or the message. Here’s an example:
    variation using spin syntax
  2. Apply the A/B settings in the settings panel. Here’s an example:
    apply AB settings in settings panel

In the above simple example, the subject line will be varied as the email begins to send. If I send to a list of 100, only 30 emails will be sent at first: approximately 10 with the subject line “Hi”, 10 with “Hello”, and 10 with “Hey”. Four hours after the 30 emails are sent, a winner will be chosen automatically based on the open rate of all three segments. At that time, the remaining 70 emails will send with the winning subject line. Technically, this is an A/B/C test, since there are three variations.

This is what my Sent folder looks like while running the test.
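For the curious, the arithmetic in the example above can be sketched in a few lines of Python. This is purely an illustration of the split described in the example; the `ab_batch_sizes` function is hypothetical, not part of GMass:

```python
# Hypothetical sketch (not GMass's actual implementation): a 30% test
# send split evenly across three variations, with the remainder
# reserved for the winning variation.

def ab_batch_sizes(list_size: int, test_fraction: float, num_variations: int):
    """Return (emails per variation in the test batch, emails left for the winner)."""
    test_total = int(list_size * test_fraction)
    per_variation = test_total // num_variations
    winner_batch = list_size - per_variation * num_variations
    return per_variation, winner_batch

per_variation, winner_batch = ab_batch_sizes(100, 0.30, 3)
print(per_variation, winner_batch)  # 10 emails each for "Hi"/"Hello"/"Hey"; 70 for the winner
```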

A more sophisticated test

In a real-life scenario, you’ll want to do more than just vary the subject line between Hi, Hello, and Hey. If you’re sending cold email, you may want to vary your introductory line. Here’s a more practical scenario:

spin variation - introductory line

Here, three different introductory lines will be used, and if I’m a cold emailer, I’m likely to care about the “reply” rate more than anything else. Because it can take a while for people to reply, I want to wait 24 hours before picking a winner, so my A/B test settings would look like this:

time setting for AB testing
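The winner-picking step described above boils down to comparing rates across variations for whichever metric you care about. Here’s a rough Python sketch under assumed names; `pick_winner` and the stats layout are illustrative, not GMass internals:

```python
# Illustrative only: choose the variation with the best rate for a
# chosen metric ("replies" here, since cold emailers usually care
# about the reply rate more than anything else).

def pick_winner(stats: dict, metric: str = "replies") -> str:
    """stats maps a variation name to counts like {"sent": 10, "replies": 3}."""
    def rate(counts):
        return counts[metric] / counts["sent"] if counts["sent"] else 0.0
    return max(stats, key=lambda name: rate(stats[name]))

stats = {
    "Variation A": {"sent": 10, "replies": 1},
    "Variation B": {"sent": 10, "replies": 3},
    "Variation C": {"sent": 10, "replies": 2},
}
print(pick_winner(stats))  # Variation B
```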

Using A/B testing on a complete cold email sequence

You can A/B test your initial message plus each message in your email sequence. Here’s how that might look in the Settings box.

settings to AB test a follow-up series

Note that you don’t have to have the same number of variations in your follow-up template as you do in the original message. Each follow-up template can have a different number of variations in the message, but of course, this will make it more difficult to assess which version is really having the most impact.

Using the “manual” option

If I choose the “manual” option, I’ll be notified in my dashboard and via email when it’s time to pick a winner.

After the initial batch of emails is sent and the specified number of hours has passed, you’ll get an email that looks like this:

A/B notifier - time to choose a winner

You’ll also be alerted in your dashboard that it’s time to pick a winner.

You can then choose the winner from either the email notification or your dashboard, and then the remaining emails will be sent.

Personalizing inside the variations

You can use our standard personalization syntax inside the variation blocks. Using our Hi/Hey/Hello example, you can also do this:

{{spin}}Hi {FirstName}{{variation}}Hello {FirstName}{{variation}}Hey {FirstName}{{end spin}}

In fact, inside each variation you can use our full personalization syntax and our full conditional content scripting language.
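To see how a spin block with personalization might expand into the actual variations, here’s a small Python sketch. The `expand_spin` function is an assumption for illustration; it is not how GMass actually parses messages:

```python
import re

# Rough illustration (assumed parsing, not GMass's actual processor):
# expand one {{spin}}...{{end spin}} block into per-variation strings,
# then substitute the {FirstName} personalization token.

def expand_spin(template: str, recipient: dict) -> list:
    """Return one fully personalized string per variation in the template."""
    m = re.search(r"\{\{spin\}\}(.*?)\{\{end spin\}\}", template, re.S)
    if not m:
        return [template]
    variations = m.group(1).split("{{variation}}")
    results = []
    for v in variations:
        text = template[:m.start()] + v + template[m.end():]
        results.append(text.replace("{FirstName}", recipient.get("FirstName", "")))
    return results

for line in expand_spin(
    "{{spin}}Hi {FirstName}{{variation}}Hello {FirstName}{{variation}}Hey {FirstName}{{end spin}}",
    {"FirstName": "Ajay"},
):
    print(line)
# Hi Ajay
# Hello Ajay
# Hey Ajay
```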

Interested in statistics only?

You might want to conduct an A/B test without any decision being made, manually or automatically; that is, you might want to send to 100% of your list while varying your content. To do so, just set the slider in the A/B settings to 100%.

If you set the A/B settings to send to 100% of recipients, then:

  • All emails will go out at once, with an equal number of recipients receiving each variation
  • No decision will need to be made, automatically or manually
  • You can still see how each variation performs in your campaign report
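Conceptually, the 100%-slider mode just spreads the variations evenly across all recipients. A minimal sketch of that even split (the `assign_variations` helper is hypothetical, not GMass code):

```python
# Hypothetical illustration of the 100% mode: every recipient gets one
# variation, assigned round-robin so the split stays as even as possible.

def assign_variations(recipients: list, variations: list) -> dict:
    """Map each recipient to a variation, cycling through the variations."""
    return {r: variations[i % len(variations)] for i, r in enumerate(recipients)}

assignments = assign_variations(
    ["a@example.com", "b@example.com", "c@example.com", "d@example.com"],
    ["Hi", "Hello", "Hey"],
)
print(assignments)
# {'a@example.com': 'Hi', 'b@example.com': 'Hello',
#  'c@example.com': 'Hey', 'd@example.com': 'Hi'}
```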

Coming from a different A/B testing platform?

If you’re migrating from another email platform that offers A/B testing, our approach is different and might take some getting used to. Traditional email marketing platforms have you create an entirely separate subject/message as your “B” version, while in GMass, you put all the variations within a single message using our special “spin” syntax. We’ve always focused on efficient workflows, and we think our way is a faster, easier way to launch an A/B test.

In most A/B tests, some of the content is shared between the variations, and with our way, you don’t have to duplicate that common content across separate messages.

Understanding line spacing when composing variations

When you put in a variation into the Message area, you’re using our conditional logic spin feature, consisting of these tags:

{{spin}}

{{variation}}

{{end spin}}

A line that contains only a conditional logic tag does not count as a line of space when our processor interprets your campaign.

So both of the following examples will produce the exact same line spacing:

{{spin}}
Hi there,
{{variation}}
Hello!
{{end spin}}

How are you doing?

and…

{{spin}}Hi there,{{variation}}Hello!{{end spin}}

How are you doing?
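The spacing rule can be illustrated with a tiny sketch: drop any line that consists solely of a tag before assembling the message. This demonstrates only the spacing rule, not variation selection, and is not GMass’s actual processor:

```python
# Illustration of the spacing rule (assumed behavior): tag-only lines
# are removed, so the multi-line and single-line forms of a spin block
# produce identical line spacing in the delivered email.

SPIN_TAGS = {"{{spin}}", "{{variation}}", "{{end spin}}"}

def visible_lines(message: str) -> list:
    """Remove lines that consist solely of a conditional-logic tag."""
    return [ln for ln in message.splitlines() if ln.strip() not in SPIN_TAGS]

multi_line = "{{spin}}\nHi there,\n{{variation}}\nHello!\n{{end spin}}\n\nHow are you doing?"
print(visible_lines(multi_line))
# ['Hi there,', 'Hello!', '', 'How are you doing?']
```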

How to test your A/B variations

Before you conduct an A/B test campaign, you probably want to see what each variation of your complete message will look like to the recipient.

The easiest way to test your variations is to use the “Send Test Email” button at the top of the settings box. When you send a test email to yourself, each possible variation of the email will be sent. For example, if my subject line looks like:

{{spin}}Hi{{variation}}Hello{{variation}}Hey{{end spin}} there!

Then when I send a test email, three test emails will be sent, each one showing the different subject line.

You can also use the A/B feature along with the “Create Drafts” setting. By doing so, you can review the DRAFTs that are generated for each of your email addresses before sending, and modify the contents of individual DRAFTs should that suit you. However, note that the “timer” for when the testing period expires starts after the DRAFTs are created, so make sure to send them relatively soon after they’re generated.

FAQ

After the initial test is sent, can I change my mind and send more as part of the test?

Yes! Just adjust the percentage slider up, and save your campaign. More emails will then be sent as part of the test.

What happens if I use the “spin” syntax but don’t put in any A/B settings?

Then the email will still vary itself across the whole list, but a winner will never be chosen. Your entire list will be split amongst the variations.

What happens if I put in the A/B settings but don’t include any “spin” syntax in my Subject/Message?

You’ll get an error and won’t be able to launch your campaign, since you can’t conduct an A/B test without at least one variation.

Does the number of variations have to be the same throughout the Subject and Message?

Yes. For example, you cannot have 3 variations of your introductory text, and then just 2 variations of your second paragraph. That would produce inconsistent and confusing results, so we don’t allow it.

9 Comments
  1. Looks interesting, can’t wait to try it.

    Is there a typo/mistake in the first example (under “A simple A/B test”)? It seems there are 3 possible subject lines (Hi, Hello, Hey) in the screenshot, but the text below says 20 emails will be sent, 10 with “Hi” and 10 with “Hey”. Or am I missing something?

    1. No. Open and click tracking are completely unrelated to this A/B testing feature. Although you just gave me a good idea — letting users do an A/B test with tracking on vs off.

  2. I’m trying to use If/Then/Else statements inside spin/variation, but my statements seem to be ignored or generate errors.
    Is it possible to use conditions inside A/B test variations?

  3. It looks like the A/B test results have disappeared from the GMass dashboard (when I go into a campaign to look at the results, they are not broken out for A vs. B as they used to be). Any suggestions?
