
A/B testing emails: is that a good or bad thing?

Written by Colette on 5 September 2019

The A/B test is the marketer's Holy Grail; no one would say otherwise. Testing the performance of a CTA colour or a product visual is often a reflex. But is it still the right way to deliver the right message at a time when we communicate instantly?

1/ A lot of work goes into production 

Designing two HTML versions may seem harmless to experienced organisations. Yet even with well-established production processes, each A/B test takes time and resources. It is already possible, however, to use a single HTML and test the performance of each visual independently.
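The "single HTML, many zones" idea can be sketched in a few lines of Python (a hypothetical illustration; the template and zone names are invented, not taken from any particular tool):

```python
import itertools

# One template with independent, swappable zones: variants are
# combinations of zone contents, not separate HTML files.
TEMPLATE = "<html><body>{hero}{cta}</body></html>"

variants = {
    "hero": ["<img src='visual_a.jpg'>", "<img src='visual_b.jpg'>"],
    "cta":  ["<a href='#'>Buy now</a>", "<a href='#'>Discover</a>"],
}

combos = [TEMPLATE.format(hero=h, cta=c)
          for h, c in itertools.product(variants["hero"], variants["cta"])]
print(len(combos))  # → 4 combinations from a single template
```

Two zones with two options each already yield four testable combinations, with only one template to produce and maintain.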

Some of our customers spend between 4 and 7 hours on each email (choosing content, creating the message, building the workflow in the campaign management tool, final proofing, etc.). Can we still afford this luxury at a time when the customer experience is being played out everywhere?

2/ Are the statistics really that reliable? 

Testing optimisation hypotheses in order to deliver the right message is certainly worthwhile.

But the strategy has been diverted from its original objective: comparing only two elements. In practice, I still often see a multitude of elements tested at the same time.

The interpretation of the results also raises questions. With only an overall click rate, how do you determine which zone made the difference? 
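A standard two-proportion z-test on click rates makes the limit concrete (a sketch with made-up numbers, not data from the article): even a statistically significant result only says that one whole version beat the other, not which changed zone caused it.

```python
from math import sqrt

def z_test_two_proportions(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test comparing the click rates of versions A and B."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled click rate under the null hypothesis of no difference
    p = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(p * (1 - p) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# If versions A and B differ in several zones at once, a significant
# z-score (|z| > 1.96 at the 5% level) attributes the lift to the
# version as a whole, never to an individual zone.
z = z_test_two_proportions(clicks_a=120, sends_a=5000,
                           clicks_b=155, sends_b=5000)
print(round(z, 2))  # → -2.14
```

Here |z| > 1.96, so B's higher click rate is significant at the 5% level; with multiple zones changed, though, the test still cannot say which one mattered.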

3/ Individuals condemned to see only the losing version 

With the A/B test method, 20% of your database is left on a losing version. By testing in real time, we avoid this pitfall entirely: no one is set aside, whether they open the email or not.

How does it work? If individuals interact more with certain elements, a winning version emerges. All new openers then see the best-performing version.

With real-time testing, in general only 2% of openers see the lowest-performing version of the email. That part of the base is no longer penalized.
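One way real-time testing can work under the hood is a multi-armed bandit such as Thompson sampling (an assumption on my part; the article does not name an algorithm). A minimal simulation, with invented click rates, shows traffic drifting toward the winner instead of staying split 50/50:

```python
import random

def pick_version(stats):
    """Sample each version's click rate from its Beta posterior,
    then show the version with the highest sampled rate."""
    samples = {
        v: random.betavariate(s["clicks"] + 1, s["opens"] - s["clicks"] + 1)
        for v, s in stats.items()
    }
    return max(samples, key=samples.get)

def simulate(true_rates, n_openers, seed=42):
    random.seed(seed)  # deterministic run for the example
    stats = {v: {"opens": 0, "clicks": 0} for v in true_rates}
    for _ in range(n_openers):
        v = pick_version(stats)
        stats[v]["opens"] += 1
        if random.random() < true_rates[v]:  # simulated click
            stats[v]["clicks"] += 1
    return stats

# Hypothetical true click rates: version B genuinely performs better.
stats = simulate({"A": 0.02, "B": 0.05}, n_openers=5000)
share_b = stats["B"]["opens"] / 5000
print(f"share of openers shown the winner B: {share_b:.0%}")
```

Early openers are spread across both versions while the system is still learning; as evidence accumulates, almost every new opener is shown the better-performing one.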

4/ No more marketing message subjectivity! 

The A/B test gives the marketer data to make the right decisions. Well, that's the idea. In reality, it is difficult to take the pulse of a campaign that was decided months in advance.

What makes a successful campaign? Above all, one that "lives its own life": individuals choose what they like in real time, and this inevitably impacts the click rate.

It is risky to base the next campaigns on a "winning version" that does not take individuals' expectations into account.

All things considered, A/B testing your emails means:

  • a lot of production and interpretation time (and therefore money);
  • distorted performance statistics; 
  • static messages; 
  • part of the database penalized. 

With real-time testing, we:

...save time with a single HTML

...detect and deliver the high-performing version quickly, optimizing the performance of each email

...take into account the context in which individuals open

...create message combinations

...learn in real time about campaign trends

Topics: Customer Experience, individualization, ABtest