A Big List of Things You Can A/B Test

A/B testing is one of the best ways to improve every part of your newsletter strategy. From your sent-from name to pop-ups, here’s a big list of things you can test.

These stories are presented thanks to beehiiv, an all-in-one newsletter suite built by the early Morning Brew team. It’s fully equipped with built-in growth and monetization tools, a no-code website and newsletter builder, and best-in-class analytics that actually move the needle.

Some of the top newsletters in the world are built on beehiiv, and yours can be too. It’s one of the most affordable options in the market, and you can try it for free — no credit card required. Get started with beehiiv today.

There are dozens of opportunities for A/B testing for every kind of newsletter. The challenge for most newsletter operators, I’ve found, is figuring out what to test — and how often.

To figure out where to begin, I’ve put together a list of tests you can run, with suggestions on what to prioritize and how to run the test. Most email service providers allow you to run a variety of A/B tests, but some place restrictions on what you can do. Before you go too far down the testing rabbit hole, check with your ESP to make sure you can actually test out these ideas on your email platform.

At the end of this guide, I’ll answer a few questions about the optimal size and length of an A/B test and how to analyze your test results. Plus, I’ll explain why you should beware the purple button. (More on that in a minute!)

But first, let’s dive into a list of things you can test in your newsletter.

Newsletter envelope

The newsletter envelope consists of the three items readers see before they open a newsletter: the subject line, the preheader text, and the sent-from name. Readers decide whether or not to open your newsletter based on these three spots — and yet, they’re often the last things that newsletters A/B test! That’s a missed opportunity. If you’re just starting with A/B testing, start here. 

The key metric to measure when testing the envelope is open rate. Even a small increase in opens due to testing would result in significantly more engagement with your content in the long run.
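
To see what that adds up to, here’s a quick back-of-the-envelope sketch. All the numbers are hypothetical — a 50,000-subscriber list, roughly 250 sends a year, and a one-point lift in open rate — so plug in your own figures:

```python
# Hypothetical numbers: swap in your own list size, cadence, and rates.
list_size = 50_000         # subscribers
sends_per_year = 250       # a weekday daily newsletter
baseline_open_rate = 0.40  # 40% of sends opened before the test
tested_open_rate = 0.41    # 41% after a winning envelope test

extra_opens = list_size * sends_per_year * (tested_open_rate - baseline_open_rate)
print(f"Extra opens per year: {extra_opens:,.0f}")  # 125,000 additional opens
```

A single point of open rate, in other words, can mean six figures’ worth of additional opens over a year on a mid-sized daily list.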

Subject line

The subject line is your opportunity to give readers a reason to make time for your newsletter and set the tone for whatever’s inside. But there’s no one right way to write a subject line — that’s something that varies based on the type of newsletter you send.

Any great A/B test starts by asking the right question. With subject lines, there are a lot of things you might want to ask:

  • What’s the right tone for my subject line? — Should I be serious? Sarcastic? Silly? (You’re free to choose something that doesn’t start with an “s,” too.) Test and see what gets readers to open.
  • What’s the right length for my subject line? — Every device and email client displays subject lines a little differently. Your subject line might get cut off anywhere between 25 and 100 characters, depending on the email client (Gmail, Yahoo, etc.), the device (phone, tablet, or desktop), and the app (like the Gmail or iPhone Mail app) a reader uses. In its 2024 State of Email Newsletters Report, beehiiv found that shorter subject lines consistently performed best, but I’ve also seen newsletters where longer subject lines performed far better. (My theory: Readers are curious when the subject line cuts off, and they open the email to see what they’re missing.) That’s why A/B testing matters here — you’ll figure out what’s true for you and your newsletter.
  • Should I use an emoji in my subject line? — Some newsletters lead with the same emoji to help it stand out. (Every edition of my newsletter starts with the 📬 emoji, for instance.) Others vary the emoji based on the topic of the newsletter. And many newsletters never use emojis — it just doesn’t match the tone of their newsletter. This is easy to test to see if it makes a difference.
  • Should I start every newsletter with the same subject line? — A local news newsletter, for instance, might have the same start for every subject line — “Today’s News:” — followed by the headline for that day’s lead story. You might want to test to see whether a consistent starting phrase leads to higher open rates.

Preheader text

Think of preheader text, also known as preview text, as the second part of your subject line. It’s what appears after the subject line in most email clients. Study after study has shown that newsletters that include preheader text typically see a lift in open rates.

What do you want to do with that preheader space? Two options make sense for most newsletters:

  • Tell readers a little more about whatever’s in the subject line.
  • Tease a second story or topic that readers will find inside the newsletter.

I’d test both options and see if one drives more opens than the other.

Sent-from name

There are a surprising number of ways to tell readers that you sent them an email. You could send it from your organization’s name, your newsletter’s name, your own name, or some combination thereof. For my newsletter, for instance, any of these sent-from names would work:

Dan Oshinsky
Inbox Collective
Dan at Inbox Collective
Dan | Inbox Collective
Dan, Inbox Collective
Dan // Inbox Collective
Dan Oshinsky at Inbox Collective
Dan Oshinsky | Inbox Collective
Dan Oshinsky, Inbox Collective
Dan Oshinsky // Inbox Collective

And I could flip the org name and my name. “Inbox Collective | Dan Oshinsky” would also work.

Also to consider: Where does your sent-from name get cut off? As with subject lines, every inbox has different character limits for the sent-from name. This isn’t something to A/B test — just send yourself a test email and make sure your sent-from name displays cleanly in the inbox.

Newsletter body

The body of the newsletter is whatever your reader sees once they’ve opened your email. And within the newsletter body, there’s a lot you can test.

Newsletter content or frequency

The content you include within a newsletter might need a refresh. Maybe you’ve just done an audience survey, for instance, and realized that what readers look for and what you deliver don’t match up. Or maybe you’re looking at your underlying metrics — like open rates and click rates — and realized that the newsletter isn’t as effective as it could be. If that’s the case, it might be time to run an A/B test to see if you can improve your content and better serve your audience.

I’ll give you an example: When I got to The New Yorker, we had a twice-monthly Humor newsletter. But The New Yorker produced original cartoons and humor columns five days a week, and this was some of our most beloved content — so, I wondered, why didn’t we have a newsletter that matched that cadence?

We decided to run an A/B test. Over the course of a month, we took 20% of our audience and moved them to an experimental Daily Humor newsletter featuring all of our latest humor content. We emailed them to let them know about the test and asked them to write back with their thoughts. At the end of the month, we also surveyed readers to gauge feedback. The other 80% of the audience got the usual twice-monthly newsletter.

At the end of the month, I looked at five metrics for both the A and B versions of the newsletter: open rate, click rate, total traffic to our site, unsubscribe rate, and survey feedback. All five pointed to an obvious conclusion: Readers far preferred the new Daily Humor product and were more likely to engage with it than with the old format. We launched the new Daily Humor newsletter to the entire audience the following month.

But not all content tests are that big. You might want to do something smaller. For instance, maybe you want to test how readers will respond to a new section of your newsletter, like a Q&A or a poll. Click rates or reader feedback can help you understand if that new content is worth implementing.

Intro vs. no intro

I’ve never known quite what to call a newsletter’s introductory section — some newsletters call it a “top note” or “topper” — but the bigger concern is whether your newsletter should have one at all.

Testing this is easy: You’d send one version of the issue with an intro and one without. But then comes the measurement part of the test, and that’s trickier. If you’ve got lots of links in the intro, you could test to see whether the version with an intro drives more clicks than the one without. Or you could add a poll at the end of the intro and ask readers to vote and share how they feel about it.

Newsletter length

Here’s a question I get often — and one that doesn’t have an easy answer: How long should a newsletter be?

There’s no good way to measure the time that readers spend within a newsletter. (If there were, it’d make this a lot easier to A/B test!) But you could always test the number of links within a newsletter. Is there a point at which readers simply stop clicking? This was the first A/B test I ran back at BuzzFeed. We started with five links in the newsletter and then ran tests to see what would happen if we went up to seven links, or 10, or 12.

In the end, we realized that we could easily add 15+ links in a newsletter, and readers would still click in high numbers on the links at the very bottom. Not all newsletters had that many links, but it was helpful to understand that we could add more links without harming the product.

Format of your links

There are lots of different ways to display links within a newsletter. Some newsletters use text within paragraphs. Some have standalone headlines and descriptions. Some will include images alongside the link. It’s worth testing out a few different formats to see which drives the most clicks back to your content.

You could also test out the call-to-action (CTA) text on buttons in your newsletter. Some newsletters, for instance, will default to a button that says “Read more” at the end of a story. You could test out alternative versions of that copy to see if you can find a version that drives additional clicks.

Growth

Many newsletters are focused on growing their lists efficiently, so this is one place where running A/B tests can have a huge impact. Even a small bump in conversion rate might lead to significantly more newsletter subscriptions in the long run. Luckily, many newsletter growth tools have A/B testing built in to make it easy to see what works.

Landing page

Your landing page is the front door to your newsletter. If a reader is thinking about signing up, the landing page is what takes them across that threshold from casual reader to engaged subscriber. And there’s a lot you can test on this page, from the tagline and copy to the design of the page. You can use tools like CrazyEgg or Optimizely to A/B test multiple versions of the page and see which drives the most sign-ups. Or you can do this the old-fashioned way: Manually insert a few CTAs on your site, and link half to one version of a page and half to the other. Try to drive equal amounts of traffic to each page — then see which has the better conversion rate. (Tools like OptinMonster or WPForms can show you the conversion metrics.)
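
If you go the manual route, the one thing to get right is a consistent 50/50 split. Here’s a minimal sketch of one way to do that, assuming you have some stable identifier to hash, like a cookie value — the function name and the traffic numbers below are hypothetical:

```python
import hashlib

def landing_page_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to landing page A or B.

    Hashing a stable ID means the same visitor always sees the same
    page, and traffic splits roughly 50/50 across your audience.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Hypothetical results pulled from your analytics tool:
visits = {"A": 2_140, "B": 2_075}
signups = {"A": 180, "B": 214}
for page in ("A", "B"):
    print(f"Page {page}: {signups[page] / visits[page]:.1%} conversion rate")
```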

Pop-ups and other conversion forms

What performs better: a full-screen pop-up or a smaller toaster unit? Does a pop-up convert better if it’s shown after a reader’s scrolled 50% of the way down the page or 75% of the way down the page? What’s the right call to action for a pop-up? How big a difference will a change in the CTA make? There are many things you can test with these sorts of units, and most tools that power them, like OptinMonster or Poptin, allow you to quickly create several variants to A/B test.

Paid acquisition channels

When it comes to spending money to grow your list, you want to make sure you spend it wisely. The goal is to find the highest-quality subscribers at the lowest cost, but doing so requires a lot of testing and optimization.

How much testing could you do in a year? The team at 1440 told Newsletter Circle that in 2023, they tested 1,500 different ads — and they send just one daily newsletter.

Many of the biggest ad platforms, like Meta and Google, allow you to test several different ads at the same time. Every ad has a few different components, from the copy to the image to the audience you’re targeting. Each piece can be tested until you find the right mix.

Additionally, as part of your testing, you may want to keep tabs on what happens after someone converts so you can factor that into the results of your A/B test. (Some email service providers, like beehiiv, allow you to track engagement based on referral source.) It’s great to be able to acquire a subscriber cheaply — but only if that reader becomes a loyal subscriber.

Automations

If you’re building out an automation, like a welcome series or reactivation series, it’s worth testing out different versions to see which might drive the results you want.

Welcome series

Some newsletters have just a single welcome email; some have a dozen or more emails spread out over the course of a few weeks.

There isn’t a perfect version of a welcome series, but there are a few things you can test out:

  • If you’ve got a +1 email that’s designed to drive readers to sign up for more newsletters or subscribe to your podcast, you could A/B test different versions of the copy to see which drives the action (sign-ups, downloads) you want.
  • If you’ve got an Evergreen email that pushes readers to top-performing content on your site, you can run an A/B test to see which actually drives the most clicks.
  • If you’ve got a Hard Sell email, which might drive readers to subscribe, donate, or buy something, you can test different versions to see which is most effective at getting readers to pay.

You could also test out the length of a series. How many emails are enough to drive engagement and conversions without leading to unsubscribes? Set up two different versions of the series and see where the opens decline and unsubscribes start to rise.

Or, for larger organizations, you can test whether these emails work best if they have a personality behind them. Test out sent-from names to see which lead to higher opens, or see if emails with a personal touch convert better than generic messages.

Reactivation series

With a reactivation series, the goal is simple: How many inactive readers can you get to start opening your newsletters again?

With that goal in mind, it’s worth testing out different lengths of series. What happens if you add a second, a third, or a fourth email to a series? Does that make a difference in terms of your reactivation rate?

In the long run, the difference between winning back, say, 5% of readers versus 10% of readers might represent thousands (or tens of thousands) of additional active readers on your list. This could be a really valuable use of an A/B test.

Conversion emails

I always encourage large publishers I work with to run small tests on emails designed to convert readers to paying subscribers, members, or donors. You never know which email will convert well and which will be a dud. So, instead of blindly sending an email to your entire audience, run an A/B test on a small percentage of your list. Test out some of the ideas below — the copy, the offer, the personality the email comes from — and see if one version performs better than the other. And compare the results to your historical benchmarks for similar sorts of emails. If the email underperforms those benchmarks, go back to the drawing board and run another test until you find an email that really works.

If you make testing part of the process, you’ll never send an email to your full audience until you’ve first proven that the email can convert well.

As for testing these types of emails, here are a few ideas:

The offer in the email

What converts best: a 10% discount or a 20% discount on your product? What about a short-term offer ($1 for three months) versus a long-term deal (30% off an annual subscription)? If you’re A/B testing the offer, look at both the percentage of readers who convert on the offer as well as the lifetime value of the customers who convert.
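
To make that concrete, here’s a minimal sketch with hypothetical numbers. In this example, the deeper discount converts more readers, but once you factor in lifetime value, the smaller discount still wins on revenue per reader tested:

```python
# Hypothetical offers: conversion rate and average lifetime value (LTV).
offers = {
    "10% off annual": {"conversion_rate": 0.012, "lifetime_value": 210.00},
    "20% off annual": {"conversion_rate": 0.015, "lifetime_value": 155.00},
}

for name, offer in offers.items():
    # Expected revenue per reader who received the offer.
    revenue = offer["conversion_rate"] * offer["lifetime_value"]
    print(f"{name}: ${revenue:.2f} expected revenue per reader")
```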

The message of the email

Sometimes, it’s the larger message or story behind the offer that makes a difference. For instance, for non-profits that offer a membership, you could test out an email that plays up the benefits of membership versus one that emphasizes the value of the organization. Both might have the same subject line and closing CTA, so you’d be able to see if one variant was more effective at driving new memberships.

How long should the message be? You could test two versions of an email: one with a short note and a CTA, and one that adds on some additional messaging below that with a second CTA.

The sent-from name

Here’s a great thing to test: Who should this email come from? If you’re part of a larger org, you’ve got options: The email could come from the organization itself, with no specific writer behind it, or it could be a more personal note from a member of your staff. You may find that one version significantly outperforms the other. (I’ve found that more personal notes often do better, but that’s not a universal rule. Test it out and see what works for you!)

CTAs

There’s so much you can test when it comes to a CTA:

  • How the CTA is displayed — You may find that it’s worth testing whether a line of text or a button drives more people to a sales page.
  • What the CTA copy should be — What’s the right language to use on that CTA? As Ephraim Gopin wrote in a piece about non-profit email strategy, an environmental non-profit that raises money to plant trees could write their CTA several different ways: “Plant a tree,” “Help save nature, Dan,” “Be the change,” or “Give now.” Any of these could be effective at driving donations, but you won’t know which works best until you A/B test it.
  • How many CTAs to include — For many non-profit newsrooms I work with, we’ve found that in an end-of-year campaign, it’s effective to include two different buttons stacked on top of one another. One is a CTA to give on a one-time basis, and the other is a CTA to give monthly. Would one button or two work best for you? You’d have to A/B test it and see which leads to more conversions and revenue.

HTML vs. text-only emails

Many teams work hard to design beautiful emails, but oddly, text-only emails often outperform those with a lot of design. So test it out for yourself: Take the same email copy, but send one version that’s only text and another that includes design elements. You may find that one brings in more revenue than the other.

A few other big questions about A/B testing

Now that we’ve gotten through a list of things you can test, I’m sure you’ve got other questions. I’ll try to answer some of the most common ones.

How much can you test at one time?

The short answer: It depends on the size of your list.

If you’ve got a million readers, you can A/B test dozens of different things at once. Getting to statistical significance won’t take long.

But if you’ve got an audience of, say, 20,000 readers, I’d encourage you to run one test at a time. Let’s say you’re A/B testing your sent-from name. Don’t try to do that while also running a test on subject lines — you’ll split your audience into such small segments that it’ll be hard to tell what really makes a difference. The more you focus on just one test, the more you can understand whether that test leads to positive results.
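
To put rough numbers on that, here’s a minimal sketch using a standard rule-of-thumb sample-size formula for a two-sided test at 5% significance and 80% power. With a hypothetical 40% baseline open rate and a two-point lift, a single test already needs nearly all of a 20,000-reader list:

```python
# Rule of thumb: readers needed per variant to detect a lift of `delta`
# at 5% significance and 80% power is about 16 * p * (1 - p) / delta**2.
baseline = 0.40  # hypothetical current open rate
delta = 0.02     # smallest lift worth detecting (two percentage points)

n_per_variant = 16 * baseline * (1 - baseline) / delta**2
print(f"Readers needed per variant: {n_per_variant:,.0f}")   # ~9,600
print(f"Total readers needed:       {2 * n_per_variant:,.0f}")  # ~19,200
```

Split that same list across two simultaneous tests and each one ends up underpowered.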

How big does your list need to be to run a test?

As a general rule, I wouldn’t start running any A/B testing until you have at least 5,000 subscribers on your list. If your newsletter is smaller than that, the results won’t mean much.

You can certainly run a test with a smaller list, but you’ll have to run it multiple times, maybe even for a few weeks, before you start to see trustworthy results.

There are tools built into Google Sheets and Microsoft Excel that can help you run the numbers to see if the test you’ve run is truly meaningful or if your sample size is too small to really be trusted. There are also free online tools, like this one from SurveyMonkey, that can help you calculate whether your results are statistically significant. If the results aren’t significant, you may need to run the test a few more times before you can feel confident in the results.
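
If you’d rather run the check yourself, the usual tool is a two-proportion z-test. Here’s a minimal sketch using only Python’s standard library; the send and open counts are hypothetical:

```python
from math import erf, sqrt

def two_sided_p_value(opens_a: int, sent_a: int, opens_b: int, sent_b: int) -> float:
    """Two-proportion z-test: p-value for the gap between two open rates."""
    rate_a, rate_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_b - rate_a) / std_err
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: 10,000 sends per variant, 40% vs. 42% open rates.
p = two_sided_p_value(opens_a=4_000, sent_a=10_000, opens_b=4_200, sent_b=10_000)
print(f"p-value: {p:.3f}")  # under 0.05, so this lift is probably real
```

A p-value under 0.05 is the conventional bar; anything above it means you should keep running the test.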

Which tests should you prioritize?

For my money, I’d focus on two things first: One is the newsletter envelope. If readers don’t open your emails, then it doesn’t matter how good the content is inside — they’re not seeing it. Make sure you optimize your emails with the right sent-from name, subject line, and preheader text.

The other priority is conversion emails. Anything that’s tied to revenue is worth testing — you might be leaving money on the table if you’ve got the wrong copy or design in your emails.

How often do you need to test?

It depends on the newsletter. If I ran the email marketing strategy for a large ecommerce brand, for instance, I’d always want to be running tests since a slight lift in conversion rate might lead to huge amounts of additional revenue over the course of the year. But if I ran an independent newsletter, I might only run a handful of tests per year — probably around my marketing copy, newsletter envelope, or conversion emails.

Do you need to re-run a test?

There are cases where it makes sense to run a test again. Maybe you’ve tested different subject line formats before, but it’s been a few years. You may want to revisit that, since the strategy that worked back then may not be as effective anymore. Or maybe previous A/B tests revealed a winning strategy around conversion emails, but now your audience has gotten used to the new format, and conversion rates are dropping. In that case, start testing again and see if you can find new ways to improve your strategy.

What if you don’t learn anything from a test?

This happens more than you’d think! Often, you’ll run a test — maybe even for a few weeks — and see no real difference between the two variants. That’s pretty normal. (And yes, it can be really frustrating when you put a lot of energy into a test and don’t see any sort of improvement in the data. Unfortunately, it’s just part of the testing process.) 

Not every test will lead to a result, but if you regularly run tests, in the long run, you should be able to identify improvements to help you build the best possible strategy for your newsletter.

Beware the purple button

I’ll leave you with one final thought on testing: Don’t just assume that what you’ve read elsewhere works for you.

Every few years in the email space, someone will put out a big piece of research that says something like this:

“After extensive testing across thousands of newsletters, we’ve discovered that purple call-to-action buttons drive nearly 2.3% more clicks than blue and red buttons, leading to significant improvements in long-term conversion rates.”

And for months after, every time you open someone else’s newsletter, you notice that they’ve changed their CTA buttons to purple, even in cases where purple isn’t one of that brand’s main colors.

But here’s my advice: The next time you see a case study like this, don’t simply steal those insights — use them as the starting point for a test.

Savvy newsletter operators see stories like these and think: I wonder if we should change the way we present our CTAs? Or: I wonder if we could try new email templates that would work for our audience?

And then they go out and test different tactics to see if they can create something that resonates with their unique audience.

Don’t plug purple buttons into your newsletter. Don’t copy — test.

Thanks to our sponsor
The stories you’re reading on inboxcollective.com are made possible thanks to the generous support of our winter sponsor, beehiiv, an all-in-one newsletter suite with built-in growth and monetization tools, a no-code website and newsletter builder, and best-in-class analytics that actually move the needle. Start your journey with beehiiv today, absolutely free — no credit card needed.

By Dan Oshinsky

Dan runs Inbox Collective, a consultancy that helps news organizations, non-profits, and independent operators get the most out of email. He specializes in helping others build loyal audiences via email and then converting that audience into subscribers, members, or donors.

He previously created Not a Newsletter, a monthly briefing with news, tips, and ideas about how to send better email, and worked as the Director of Newsletters at both The New Yorker and BuzzFeed.

He’s been a featured speaker at events like Litmus Live in Boston, Email Summit DK in Odense, and the Email Marketing Summit in Brisbane. He’s also been widely quoted on email strategies, including in publications like The Washington Post, Fortune, and Digiday.