A/B testing, also called split testing, is a method that uncovers which of your two campaign versions is more effective at getting opens or clicks.

In an A/B test, you make two versions of one campaign and send them to two small groups. One group is sent Version A, and the other group gets Version B. The campaign with the most opens or clicks (you decide on the metric beforehand) is the winning version.
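To make the mechanics concrete, here is a minimal sketch in Python of that split-and-compare workflow. The subscriber list, group sizes, and open counts are all made up for illustration; in practice your email tool handles this for you.

```python
import random

# Hypothetical subscriber list; in practice this comes from your email tool.
subscribers = [f"user{i}@example.com" for i in range(10_000)]

# Hold out two small, randomly chosen test groups of equal size.
random.shuffle(subscribers)
test_size = 1_000
group_a = subscribers[:test_size]
group_b = subscribers[test_size:2 * test_size]
rest_of_list = subscribers[2 * test_size:]

# After sending Version A to group_a and Version B to group_b,
# compare them on the metric you chose beforehand (opens, in this example).
opens_a, opens_b = 212, 254  # made-up results
open_rate_a = opens_a / len(group_a)
open_rate_b = opens_b / len(group_b)

winner = "A" if open_rate_a >= open_rate_b else "B"
print(f"Version A: {open_rate_a:.1%}, Version B: {open_rate_b:.1%} "
      f"-> send Version {winner} to the remaining {len(rest_of_list)} subscribers")
```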

In this series of blog posts, we've curated what the world's best experts have to say about it…

Running a Lot of Tests on a Regular Basis

Just the other day, I sat down with an entrepreneur who claimed his company was an A/B testing expert because they run over twenty A/B tests each month.
When I started to dig a bit deeper, I found out that their website had a lot of traffic, which is why they were able to go through so many tests so fast. In addition, they didn’t have a ton of pages, so it was easy for their design team to make changes.
Upon hearing all of the evidence, I explained to the person why it’s bad to run so many tests in a short period of time.
For starters, they weren’t basing their tests on existing data, and they weren’t running them long enough. For this reason, they weren’t seeing big revenue increases.
To make matters worse, they had a lot of losing tests, which was causing them to lose a lot of money. A temporary decrease in conversion rates means you temporarily lose money.
If you can collect enough data, create tests based on the data and then learn from the tests, that’s fine. But it’s unrealistic to do all of that in a very short period of time.
Don’t focus on quantity; focus on quality. Always run tests based on data and learn throughout the process… even if that slows down your A/B testing schedule.
The biggest thing you need to take away from this blunder is how important it is to learn from your tests. Learning takes time, so don’t try to force yourself to run a lot of tests each month.

Neil Patel, Co-founder, Crazy Egg

How to Choose the Right Time Frame for Your A/B Test

Okay, so this is where we get into the reality of email sending: You have to figure out how long to run your email A/B test before sending a (winning) version on to the rest of your list. Figuring out the timing aspect is a little less statistically driven, but you should definitely use past data to help you make better decisions. Here's how you can do that.
If you don't have timing restrictions on when to send the winning email to the rest of the list, head over to your analytics.
Figure out when your email opens/clicks (or whatever your success metrics are) start to drop off. Look at your past email sends to figure this out. For example, what percentage of total clicks did you get in your first day? If you found that you get 70% of your clicks in the first 24 hours, and then 5% each day after that, it'd make sense to cap your email A/B testing timing window at 24 hours because it wouldn't be worth delaying your results just to gather a little bit of extra data. In this scenario, you would probably want to keep your timing window to 24 hours, and at the end of 24 hours, your email program should let you know if it can determine a statistically significant winner.
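As a rough illustration of that drop-off calculation, here is a short sketch in Python with hypothetical daily click counts and an assumed 70% cutoff; it finds the point where waiting longer stops adding meaningful data.

```python
# Hypothetical click counts from a past send: day 1, day 2, ...
clicks_per_day = [700, 50, 50, 50, 50, 50, 50]

total_clicks = sum(clicks_per_day)
cumulative = 0
for day, clicks in enumerate(clicks_per_day, start=1):
    cumulative += clicks
    share = cumulative / total_clicks
    print(f"Day {day}: {share:.0%} of total clicks so far")
    # Cut the test window once most of the data is already in;
    # the 70% threshold here is a judgment call, not a rule.
    if share >= 0.70:
        print(f"-> a {day * 24}-hour testing window looks reasonable")
        break
```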
Then, it's up to you what to do next. If you have a large enough sample size and found a statistically significant winner at the end of the testing time frame, many email marketing programs will automatically and immediately send the winning variation. If you have a large enough sample size and there's no statistically significant winner at the end of the testing time frame, email marketing tools might also allow you to automatically send a variation of your choice.
If you have a smaller sample size or are running a 50/50 A/B test, when to send the next email based on the initial email's results is entirely up to you.
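If your email tool doesn't report significance for you, here is a hedged sketch of the kind of two-proportion z-test that sits behind a "statistically significant winner" call. The send and click counts below are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(clicks_a, sent_a, clicks_b, sent_b):
    """Compare the click rates of two variations with a pooled two-proportion z-test."""
    p_a = clicks_a / sent_a
    p_b = clicks_b / sent_b
    p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

# Hypothetical test: 1,000 recipients per variation.
z, p = two_proportion_z_test(clicks_a=180, sent_a=1_000, clicks_b=230, sent_b=1_000)
print(f"z = {z:.2f}, p = {p:.3f}")
if p < 0.05:
    print("Statistically significant winner at the 95% confidence level")
else:
    print("No clear winner yet -- pick a variation yourself or keep testing")
```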
If you have time restrictions on when to send the winning email to the rest of the list, figure out how late you can send the winner without it being untimely or affecting other email sends.

For example, if you've sent an email out at 6 p.m. EST for a flash sale that ends at midnight EST, you wouldn't want to determine an A/B test winner at 11 p.m. Instead, you'd want to send the email closer to 8 or 9 p.m. -- that'll give the people not involved in the A/B test enough time to act on your email.
And that's pretty much it, folks. After doing these calculations and examining your data, you should be in a much better state to send email A/B tests -- ones that are fairly statistically valid and help you actually move the needle in your email marketing.

Ginny Mineo, Director of Platform, NextView

Getting the Right Email Software for A/B Testing

Surprisingly, not all email marketing software makes A/B testing easy.
If you ask me, this is nuts. It’s one of the easiest ways for email software providers to improve the performance of their users’ campaigns, increasing the likelihood that those users continue as paying customers.
Thankfully though, a few do make it easy. There’s a detailed comparison of email software providers here with a column for A/B testing, but the quick summary is that if you want A/B testing, I’d recommend GetResponse or Infusionsoft.
I won’t go into the pros and cons here, but having used a myriad of tools (Mailchimp, AWeber, Sendy, etc.), I’ve found these are the only two that really let you go to town with A/B testing.
Once you’ve got a good email marketing tool that enables A/B testing, you can begin experimenting to find what does and doesn’t work.
So, let’s start off with one of the most important components: the subject line.

Marcus Taylor, CEO, Venture Harbour

Email Testing Categories

When it comes to email testing, there are three main categories you need to cover – content, design and timing. Content consists of everything from subject lines, to headlines, to calls to action, and each has its own significance in driving response from your users.
Design testing has to do with the look and feel of your email, but has a primary focus on placement. Where your text and imagery live on your email template can affect response rates. Design testing incorporates everything from fonts, font sizes, and formatting to plain text vs. HTML.
Time testing helps you determine when you should send your emails. This refers to day of the week, time of day, frequency, and ultimately cadence. We often get asked about when the best time to send an email is—and frankly, it’s a question that we can’t accurately answer. Optimal times to increase open and click rates vary by industry and subscriber base, so the only way to really know the perfect time to send emails to your users is to test it.

Jillian Wohlfarth, Director of Content, SendGrid

The Hard Truth About A/B Testing I Had to Swallow

For years, we, as providers of an A/B testing tool, told you it was easy. We made a visual editor and pretty graphs and gave you wins on engagement or a lower bounce rate, but we did not really contribute to your bottom line.
My apologies for making it look easy and dragging you into A/B testing when, in fact, it is actually very hard to do right.
Flashback: It was July 2012, and on a sunny afternoon at the Blue Dahlia Cafe in Austin, I had lunch with Bryan and Jeffrey Eisenberg, both recognized authorities and pioneers in online marketing and conversion optimization. I was presenting our just-funded startup Convert.com, and they were very politely sharing that the road to the top of A/B testing tools was a hard one.
They were advising another company that had already made a significant impact on the A/B testing market and were very nice to sit with me for an hour to share some of their knowledge on how conversion rate optimization was impacting the online market.
During our meeting, they weren’t convinced that our then-revolutionary visual editor, which we designed based on an idea we saw at Dapper, would change the industry they had helped define. They both smiled and were friendly, but said we should focus on some other key features, and they named a few.
We thought making A/B testing easy would change the world and that the path was to make the tool easier. As I now know, I was ignoring their advice because I thought I had come up with something that would change it all. It would, but not in the way I thought.

Dennis van der Heijden, CEO, Convert.com