In my last job, I spent most of my days talking about email. Discussions about strategy and execution for Welcome Campaigns, Win-backs, Newsletters… you name it. A topic that came up frequently, no matter the type of email campaign, was A/B testing. In fact, clients’ interest in A/B testing was sometimes so high that the question “What should we be testing?” would come up before we’d even looked at the email strategy together! Everyone recognizes the value of being able to cost-effectively optimize marketing messages based on email test outcomes.

The answer to “What should we test?” always begins in the data. What metric do you want to move? Where are you underperforming? If your opens are lackluster, focus on the subject line or the timing of the deployment. But if the problem lies in the click… well, then it’s a different ballgame, with a lot more variables to test.

 

Let’s start with some easy testing options to drive opens:

Send time

There’s no universal “best time to send.” It depends largely on the nature of your audience. If you can run tests based on time zones or geographical regions, that’s ideal, but it’s not always feasible. Try testing sends on weekends vs. weekdays, and/or mornings vs. afternoons vs. evenings. Send time testing can be done over time or can be a focused effort to maximize the impact of a specific email. If you want to optimize a specific send, run a 10/10/80 test: send Test 1 to 10% of your subscriber list, send Test 2 (at the later time) to another 10%, and use the winner of the test to dictate the time of day to send to the remaining 80% of your list.
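To make the mechanics concrete, here is a minimal sketch of a 10/10/80 split in Python. All names and numbers are hypothetical, and the significance check is a standard two-proportion z-test, not a method prescribed by any particular email platform:

```python
import math
import random

def split_10_10_80(subscribers, seed=42):
    """Shuffle and split a subscriber list into two 10% test groups
    and an 80% holdout that receives the winning variant."""
    subs = list(subscribers)
    random.Random(seed).shuffle(subs)  # fixed seed so the split is reproducible
    cut = max(1, len(subs) // 10)
    return subs[:cut], subs[cut:2 * cut], subs[2 * cut:]

def open_rate_winner(opens_a, sends_a, opens_b, sends_b, z_threshold=1.96):
    """Compare two open rates with a two-proportion z-test.
    Returns 'A', 'B', or 'no clear winner' at roughly 95% confidence."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    if abs(z) < z_threshold:
        return "no clear winner"
    return "A" if z > 0 else "B"

# Hypothetical list of 1,000 subscribers.
subscribers = [f"user{i}@example.com" for i in range(1000)]
test_a, test_b, holdout = split_10_10_80(subscribers)
print(len(test_a), len(test_b), len(holdout))  # 100 100 800

# Hypothetical results: 28/100 opens at time A vs. 12/100 at time B.
print(open_rate_winner(opens_a=28, sends_a=100, opens_b=12, sends_b=100))
```

The significance check matters: with only 10% of a small list in each cell, a gap of a few opens can easily be noise, so declare a winner only when the difference clears the threshold.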

Subject line

Subject line tests can vary. I’ve seen tests of emoticons vs. no emoticons, first name vs. no name, and including the word “free” or not (note: “free” can increase the chance of your email being flagged as spam, but it can also significantly increase open rates). You get the gist. Even though a subject line is (or should be) a very brief line of text, it offers plenty of opportunity for testing. One thing to note, especially for your creative types: a commonly acknowledged “truth” about subject lines is that clarity is more important than creativity.

Sender

Are you sending as your company, or from a person? Most likely, you’re sending emails with your company name displayed as the sender. Some studies have shown that emails sent from a person, rather than a company, can positively impact open rates. Many companies have found it’s worth testing.

 

Now we get into the more complex topic: testing to improve clicks. The reason this topic gets more complicated is that you’re dealing with more content within the body of an email and therefore more variables that could impact performance.

 

Consider testing the following, to increase engagement:

Send time

It’s relevant here, too. Just because you’ve reached someone at a time when they’re willing to open an email doesn’t mean you’ve reached them at a time when they’re willing to engage with your content. Skimmers, like me, might be happy to see what’s there but unwilling to look any further. If the best send time for opens conflicts with the best send time for clicks, decide which metric carries the most significance for your organization.

CTAs

Assess how often you ask someone to take a particular desired action. Are you giving the CTA (call to action) visual emphasis? What language are you using? What language directly precedes it? These are all things that can be tested.

Use of dynamic content

If you have it, test whether your subscribers respond to dynamic content. This generally includes first name personalization but can apply to just about any kind of data you have that fits within your content. I’m a customer of Rover (the dog walking service), and they make frequent use of “Huxley” (my dog’s name). It catches my attention every time.
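One practical wrinkle when testing dynamic content is what happens when the data field is empty, since a broken merge tag (“Hi, {first_name}!”) can do more damage than no personalization at all. Here is a minimal sketch of rendering with generic fallbacks; the template text, field names, and fallback values are all hypothetical:

```python
def render_greeting(template, data, fallbacks):
    """Fill a template from subscriber data, substituting a generic
    fallback for any personalization field that is missing or empty."""
    merged = {key: data.get(key) or default for key, default in fallbacks.items()}
    return template.format(**merged)

# Hypothetical subject line in the spirit of the Rover example above.
template = "Time for {pet_name}'s next walk, {first_name}?"
fallbacks = {"first_name": "there", "pet_name": "your dog"}

print(render_greeting(template, {"first_name": "Kim", "pet_name": "Huxley"}, fallbacks))
# "Time for Huxley's next walk, Kim?"
print(render_greeting(template, {"first_name": "Kim"}, fallbacks))
# "Time for your dog's next walk, Kim?"
```

When you A/B test personalized vs. generic copy, make sure the fallback path is part of the test, since subscribers with incomplete profiles effectively see a third variant.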

Order of topics

Take a newsletter, for instance. It often covers several topics, which translates into an opportunity to test order. Highlight different leading topics, and/or experiment with the number of content areas within your layout to see which send yields the most clicks.

Additional creative components

You can test almost anything related to your email creative. The headlines. The length of copy. The layout. The prominent color. Types of photos (featured products or lifestyle). The font size. The fonts themselves. The list goes on. The point is, you can get “creative” in testing your creative.

 

The prospect of testing can be overwhelming, partially due to the sheer volume of testing options. So here’s an extra piece of advice to help you narrow your focus.

After you’ve identified a lagging performance area, develop a hypothesis. This often starts by looking at what you’ve been doing: your email designs and copy, for instance. Pull up your last email that suffered low clicks and determine why that might be. Does anything stand out as needing improvement? Then, form your hypothesis: “The CTA button seems too small. I think it would stand out more, and therefore get more clicks, if we increased its size.” This might seem like obvious advice, but many people jump straight into a popular test, or whatever Google serves up in response to a search (“How to increase email click-throughs”), instead of bringing their own thoughts and experience to the table. The more you follow this practice, the more skilled you’ll become at developing strong hypotheses that supplement data-driven decisions.

Sound like a lot? (Maybe… like TOO MUCH?) Get in touch with us. We can help analyze your email strategies and content to craft an optimization/testing plan. We know you’ve got a lot on your plate and we’re here to help. Email is too valuable to ignore, even when you don’t personally have the time for it.


Written by Kim Jones

“Willow has been in my life for a long time. I’m excited about the future—where we’re heading—and I’m excited to lead the way.”