What A/B Tests You Should Be Doing and How

By Yvette Hastings in Marketing
· 9 min read

Marketers often face the challenge of proving what works and what doesn’t. Enter A/B tests. Done right, seemingly insignificant changes to a landing page or email can produce significant gains in opens, click-throughs, downloads, engagement, and sales.

Thanks to A/B tests, you can be sure of how your efforts are being received and which ones are responsible for reducing bounces and cart abandonment, and boosting traffic and conversions.

Microsoft Bing’s A/B testing program increases its revenue per search by 10 to 25 percent each year. (Harvard Business Review, 2017)

That’s a space rocket I’d like to be on.

Where Do You Start?

  1. With a goal like “I want to boost conversions”.
  2. And a hypothesis like “A larger CTA button would increase conversions.”
  3. And an A/B testing platform that doesn’t require deep technical knowledge, allows an unlimited number of page variants, delivers real-time reports, and gives you full control of your tests and results. And if it’s all integrated into one system with your CMS and CRM, so much the better.


Why is a goal so important? Because your test will yield a lot of information and you need to keep sight of what you’re actually trying to do. Your variant may show a huge increase in Click-Through Rate (CTR) but no effect on conversions, and getting seduced by a sexy CTR could be counterproductive.

The hypothesis helps you define your success metric. Did a larger CTA button increase conversions? Keeping your metric for success to one simple statement helps you clearly identify a winner from the test.
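One way to make the “winner” call rigorously is a two-proportion z-test on the conversion rates of the control and the variant. Here’s a minimal sketch using only Python’s standard library; the visitor and conversion counts are hypothetical:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical test: 10,000 visitors per arm, larger CTA button on B
p_a, p_b, z, p_value = ab_significance(500, 10_000, 560, 10_000)
# B converts at 5.6% vs. A's 5.0%, but p ≈ 0.06: promising, not yet proven
```

A p-value below 0.05 is the conventional bar. In this made-up example the lift looks real but hasn’t cleared it yet, so the honest verdict is “keep collecting data,” not “ship it.”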

Keep things relevant. If you’re looking to increase video views, changing the look of your footer doesn’t make much sense. Spend your time, money (and grey hairs) wisely.

Next thing to consider is this: a small change could improve conversions by 5%. One big change could send your conversions up 130%...

Or down by equal measure.

So while powerful, big changes are risky. Start small and get savvy before getting out the big guns. Remember incremental improvements of 5% could lead to an overall improvement of 80% over the year - not something to be sniffed at.
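That year-end figure is just compounding at work. Assuming, purely for illustration, one 5% win per month:

```python
monthly_lift = 1.05          # each monthly test delivers a 5% improvement
annual = monthly_lift ** 12  # twelve wins compounding over a year
# annual ≈ 1.80, i.e. roughly an 80% overall improvement
```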

Prioritization and Patience

Every marketer’s two favorite words. But first, what can we test? Would the button be better yellow? Would the alternative heading boost time spent on a page? Does open rate improve with a different subject line? Or by adding recipient names? Or being sent at a different time of day? Do conversions improve with a minimalist page? Or would it be worth adding testimonials?

The options are endless. And it’s important to prioritize according to the impact they are likely to have. So leave that footer redesign on the video page for later. Start by testing the video title, size, placement, thumbnail, length, play icon, and so on. You’ll have hundreds of alternative ideas for each element of your page. But restraint is key here.

One. Thing. At. A. Time.

Testing multiple things at once muddies your results. If you change the background color and the heading in your test, you’ll have no idea which element had the desired impact. One may even have hindered the effect of the other! Also, don’t make any changes on your site based on someone else’s A/B test. Would you expect your cousin’s tailor-made suit to look good on you? It doesn’t work that way.

And take your time. Most A/B tests need about two weeks to deliver useful results, but test duration varies with sample size and buying cycles. And don’t stop your test as soon as the variant beats the original or you’ll only have a portion of the whole story.
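How long “about two weeks” really is depends on your traffic and on how small a lift you want to detect. The standard sample-size formula for comparing two proportions gives a rough lower bound; the z-values below encode the usual 95% confidence / 80% power convention, and the rates are hypothetical:

```python
import math

def sample_size_per_arm(base_rate, lift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per variant to detect an absolute
    lift in conversion rate at ~95% confidence with ~80% power."""
    p1, p2 = base_rate, base_rate + lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / lift ** 2)

# Detecting a 1-point lift on a 5% baseline needs ~8,000 visitors per arm
n = sample_size_per_arm(0.05, 0.01)
```

At 1,000 visitors a day, that’s over two weeks of traffic across both arms, which is exactly why impatience is the enemy here.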

Slow and steady wins the A/B race.

What to Test for Maximum Impact

1. Calls to Action

This has got to be the world’s most tested element on any page. It defines the conversion. So it’s a great place to start. Would a larger, clearer button help convert? Would “Get Yours Today” work better than “Buy Now”? Does a blue button get more clicks than green? Does font size have an impact? HubSpot tested button color and found that a red CTA button outperformed a green CTA button by 21%. That’s huge. When online community Fab added the text “Add to cart” to its CTA button, CTR increased 49% over the original.

2. Headings

Short headings, long headings, large headings, right-aligned headings, headings in caps, headings that flash (no, don’t do that - never do that), headings that contain numbers, headings that shock... you get the idea.

A catchier headline can mean more time spent on the page, which means people actually read your proposal, learn what you’re offering, and, ultimately, convert more often!

Without a doubt, your content creator had more than one idea for the heading when it was published. And it broke her heart to have to choose between them. Most likely she went with the one her manager would approve. But she’s not writing for her manager (...uh, well, you know what I mean), she’s writing for the audience! Seeing clearly which copy has the most impact (and being able to prove it) empowers her to hone her skill toward audience-specific, revenue-boosting power copy... making her manager happy too.

They should call it win/win testing.

3. Email Subject Lines

A crucial moment in marketing is when an email subscriber scans the subject of your email. Will they open it? Will they read it? Will they click through? Or unsubscribe?

Best make an impact: “Limited time offer”, “Once-in-a-lifetime opportunity”, “Now or never”...

All these tend to get a little bit boy-crying-wolf to a savvy audience. So it’s time to start testing some other ideas. And the results might surprise you. Fast-growing peer-to-peer crowdsourcing platform Designhill.com tested just writing the blog name as their email subject as opposed to prefacing it with “Check out my latest blog...” and achieved a 5.84% higher CTR and a 2.57% higher open rate.

Don’t assume. Test.

4. Shopping Cart and Checkout

According to SaleCycle, 2018 global cart abandonment rates were at 76.63% across all sectors.

Is your layout confusing? Cluttered? Uninspiring? Does your coupon code field send people on a long detour to find one? Would an emphasis on security reassure customers? Do they need returns policy links and delivery options? Reviews? Testimonials? A Customer Service number? What about the shopping cart service itself? Is abandonment affected by which cart you use? The fact that this is such a sensitive area makes it an obvious candidate for testing. But remember, every tweak will likely have an impact here.

Non-profit organization Kiva.org achieved an 11.5% increase in donations after updating their landing page and adding FAQs, social proof, and stats.

5. Images

According to HubSpot, 32% of marketers say visual images are the most important form of content for their business, and 51% of B2B marketers prioritize creating visual assets as part of their content marketing strategy.

And it’s often claimed that 65% of people are visual learners. So, cover the entire page? Or go minimalist? What if the man in the photo isn’t smiling? What if he’s asleep? What if it’s a woman instead? Do the darker colors work? Is humor the way to go? What about adding a puppy? Or a teacup piglet?

Have fun with this one. And you’ll learn a lot about how your brand is perceived and what images are best aligned with customer expectations.

6. Colors and Design

Did you know that red is meant to increase appetite and is therefore used by a lot of fast food chains? Or that customers associate orange with “cheap” products and services?

When prioritizing in this section, consider that: 1) users don’t read, they scan, 2) bold colors grab attention, and 3) colors can influence emotional states.

If you’re going for impact, prioritize tests that take these three things into account. Test layout, backgrounds, menu icons, menu structure, images, above/below fold content, buttons, font, logo placement, etc.

A single ad display change resulting from an A/B test increased Microsoft Bing’s revenue by 12 percent in 2012. (Harvard Business Review, 2017)

7. Mobile

With the majority of global web traffic coming from mobile devices, to ignore the mobile experience is business suicide. A quick look into analytics will tell you that conversion rates differ between desktop and mobile visits. People use your website differently depending on which device they’re on. So it is important to segment within an A/B test to distinguish between mobile and desktop visits.

And when better UX design as a result of user testing can increase a company’s conversion rate by up to 400 percent, there’s no excuse not to.

Reading Your A/B Results

Keep notes. Don’t forget the tests you’ve already done and what you learned from them. Build a library of results that you can then use in all future redesigns.

Keep things in perspective. A 5% difference in performance might not mean anything. You could always do an A/A test before introducing B to see what the standard deviation is between two versions of the same content. Then your “negligible” 5% can be assessed in an informed context and discarded if irrelevant. But remember what we said about an incremental 5% over time!
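An A/A test is easy to reason about in code, too: serve the same page to both groups and watch how big the “differences” get from chance alone. A minimal simulation with made-up traffic numbers:

```python
import random

random.seed(42)  # reproducible noise

def simulate_aa(rate=0.05, visitors=2_000, runs=200):
    """Run many A/A tests (identical pages) and return the share of
    runs where pure chance produced a relative 'lift' of 5% or more."""
    big_diffs = 0
    for _ in range(runs):
        a = sum(random.random() < rate for _ in range(visitors))
        b = sum(random.random() < rate for _ in range(visitors))
        if abs(a - b) / max(a, 1) >= 0.05:
            big_diffs += 1
    return big_diffs / runs

share = simulate_aa()
# With samples this small, most A/A runs show a "5% lift" by chance alone
```

Run something like this against your own baseline conversion rate and you’ll know what “negligible” actually means for your traffic.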

Finally, look past the numbers. So option B wins the test. Yay for that bigger button! But what do the results tell you about your customer? Is the preference for a minimalist background indicative of a changing customer persona? If orange is working, would “the cheaper option” be a good way to go for your brand? If testimonials work, would it be worth more actively collecting reviews?

According to AppSumo, only 1 out of 8 of their own A/B tests has driven significant change. So you’re going to have to do a lot of tests to get the results you’re looking for. If you feel your tests aren’t bringing results that help you reach your goal, could there be a deeper problem? Has the right audience found your page? Could your leads be too cold? Is it too soon in the sales cycle? Does your product/service need a major rethink or rebrand?

A/B testing during the Obama campaign increased donation conversions by 49 percent (BrightEdge) and raised an estimated additional $75 million due to marketing decisions resulting from A/B testing (Wired, 2012).

The Potential Is There

Inside one year, Google conducted 7,000 A/B tests. ’Nuff said.

A/B testing is not one of those nice-to-have things on your website. It is essential.

Don’t assume. Test.

Don’t guess. Test.

Don’t settle for “good”...

...Aim for better. And better. And better. It’s worth it!

If you’re interested in finding out how Kentico can help you execute awesome A/B tests, check out the A/B and Multivariate (MVT) Testing modules.
