
How to Make the Most of A/B and Split Testing

David Wilson
Senior Content Marketing Specialist
Published Mar 7, 2023
8 min read

No matter how well you know your audience, they’ll always surprise you. However much research you conduct, there are always places you haven’t looked. Marketing is a science built on nuance, and there’s room for improvement in almost any campaign. This is where A/B and split testing comes in.

Being able to test elements in real time is like having a heart monitor. It lets you see exactly how you’re doing on a moment-to-moment basis, giving you a heads up when it’s time to stop doing X and start doing Y. Do your subject lines perform better with or without emojis? Are your landing pages too copy-heavy or just right? What generates more engagement on your social media channels: videos or static posts?

Let the numbers be your guide. By relying on A/B testing to let your audience answer these questions, you can create immensely impactful, data-led ad creative.

What is A/B Testing?

A/B testing, often called split testing, is the method of comparing two versions of a marketing element, like a landing page or app layout, to see which performs better. (It’s sometimes confused with multivariate testing, which changes several elements at once; a true A/B test varies just one.)

Your audience is split at random. One segment sees one design, while the other sees a variation. Based on the metrics you’re using to measure success (click-through rates, bounce rates, or time spent on site, etc.) you can determine which option is better suited to your audience.
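In practice, the random split is often done by hashing a user ID, so that the same visitor always lands in the same bucket across sessions. Here's a minimal Python sketch of the idea (the function and test names are illustrative assumptions, not taken from any particular tool):

```python
import hashlib

def assign_variant(user_id: str, test_name: str = "homepage_test") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user ID (rather than calling random.choice) guarantees
    that a returning visitor always sees the same variant.
    """
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A given user is always assigned the same variant:
assert assign_variant("user-42") == assign_variant("user-42")
```

Because the assignment also depends on the test name, the same user can land in different buckets for different experiments, which keeps tests independent of one another.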

It’s important to note that while everything can be A/B tested, not everything needs to be A/B tested. A/B tests are best used when you feel there are two approaches that could work, and you’d like data to show you which is better.

Note that A/B and split testing shouldn’t be used to settle purely subjective preferences, such as background color or font choice, unless you have a clear hypothesis about how the change will affect user behavior.

Is There a Difference Between an A/B Test & a Split Test?

A/B tests and split tests are essentially the same thing, and the terms are often used interchangeably. ‘A/B’ refers to the elements being tested, while ‘split’ refers to the audience. (Some practitioners reserve ‘split testing’ for split-URL tests, where each variation lives on its own separate page.)

Once your test has reached the required sample size or predetermined time frame, the results garnered from the audience split will indicate whether element A or element B was preferred.
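The “required sample size” can be estimated before the test starts. Here's a rough Python sketch using the standard two-proportion formula, with defaults corresponding to 95% confidence and 80% power (the function name and example numbers are illustrative):

```python
import math

def sample_size_per_variant(p_base: float, mde: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate per-variant sample size for a two-proportion A/B test.

    p_base: baseline conversion rate (e.g. 0.05 for 5%)
    mde:    minimum detectable effect, absolute (e.g. 0.01 for +1 point)
    The default z values correspond to 95% confidence and 80% power.
    """
    p_var = p_base + mde
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return math.ceil(((z_alpha + z_beta) ** 2) * variance / mde ** 2)

# Detecting a lift from a 5% to a 6% conversion rate needs roughly
# 8,000 users per variant before the result can be trusted.
```

The smaller the effect you want to detect, the larger the audience you need, which is why small tweaks often take much longer to validate than bold ones.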

The Benefits of A/B & Split Testing

A/B and split tests are easy to conduct. They require little to no investment and offer real-time results. Beyond leveraging data to guide business decisions, A/B tests provide a number of advantages, including:

  • Increasing ROI by testing and optimizing your website based on existing traffic data
  • Predicting upcoming trends or patterns by testing images, content or new features
  • Reducing bounce rates by finding and resolving any UX shortcomings in your website or app
  • Measuring audience appetite for any future site modifications by using an audience segment as a sample group
Increase Your Creative Capacity With Superside

Let your in-house team reach for the stars while Superside keeps your day-to-day creative needs at an industry-leading level.

What Marketing Elements can be A/B Tested?

You can A/B test virtually anything in your marketing catalog, from your homepage’s look and feel to the color of your CTA button. While access to this kind of granular data is helpful, try not to A/B test just for the sake of testing.

The key is to ensure that:

  1. Your test is based on strong reasoning
  2. The results of the test are measurable and provide direction

A/B testing is something that should be used on an ongoing basis. Provided you know how to interpret the data, the insights can be invaluable.

Here’s a simple example. Suppose you want to run a promotion, and you’re planning to send a number of emails to your existing customers. You want this promo to be seen by as many people as possible, so a strong subject line is critical. You decide to A/B test two possible approaches:

Subject line A: → Don’t miss these MASSIVE SAVINGS!! ←

Subject line B: We’re running a promotion, and we think you’ll love it.

In this instance, the test is based on sound reasoning. You want to use the most effective subject line formula for your promotional campaign, and the two options have clear differences.

By splitting your audience and sending each segment a different subject line, you can measure open rates in real time. You can see which subject line has a better open rate after one hour (telling you which drives more urgency), and check back in over two or three days (telling you which was more successful in general). Based on these results, you'll know which of the two lines resonates with your audience, and can use that subject line approach for the balance of the emails.
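To go beyond eyeballing the open rates, a two-proportion z-test can tell you whether the gap between the subject lines is likely real or just noise. A stdlib-only Python sketch, with hypothetical numbers:

```python
import math

def open_rate_z_test(opens_a: int, sends_a: int,
                     opens_b: int, sends_b: int) -> tuple:
    """Two-proportion z-test on open rates.

    Returns (z statistic, two-sided p-value); a p-value below 0.05 is the
    conventional threshold for calling the difference significant.
    """
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 1 - math.erf(abs(z) / math.sqrt(2))  # two-sided, via normal CDF
    return z, p_value

# Hypothetical result: 5,000 sends each, A opened 1,100 times (22%),
# B opened 950 times (19%).
z, p = open_rate_z_test(1100, 5000, 950, 5000)
# p is well below 0.05 here, so subject line A's lead is unlikely to be chance.
```

A test like this is what "statistical reliability" means in practice: you declare a winner only when the observed gap would be very unlikely under pure chance.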

A/B Testing Demands Twice the Creative

You can’t put your brand to the test without creative to test with. And since creative quality is the most important variable in marketing performance, marketers simply can’t fall short on design. The digital market is too saturated, and competitors are too easy to scroll past, to put anything less than a full tank into your creative engine.

This is where even the most ambitious brands often fall short. They know their brand, and they know how to read the results of each test. But proper testing calls for different creative in each iteration: twice the amount of creative must be developed to get a proper read on performance. Without real variation in the assets, you’re essentially A/B testing the same thing.

How We Helped PointCard Scale Their Testing


Dedicated, always-on design teams are the silver bullet here. Veteran design stewards like those at Superside have a storied history of getting brands the creative they need without adding heads or exploding budgets, leaving creative marketers to focus on strategy.

PointCard, a fintech with an eye on becoming the standard payment platform for everyday purchasing, found this out firsthand: They watched their CTR more than triple on social ads.

Within a few months of the partnership, PointCard’s design support team created nearly 30 ad concepts and turned them into 441 static and motion social media ads, resulting in a 240% increase in CTR and 275% growth in click-to-install conversion rate, nearly 90% above their campaign targets.

That’s the power of bountiful, brand-forward creative. When the creative marketer’s pocket is never empty, they can finally find their brand’s sweet spot.

The A/B & Split Testing Process

Most A/B tests follow the steps below, though you can modify any of them to suit your business’s needs. The length of the test and the size of the audience segments are common variables.

Step 1: Decide What You Want to Test

An effective A/B test should produce usable results. This requires knowing where missed opportunities or customer pain points may exist. A good example of this is a product page: How big should the image be? Where's the pricing located, and what does the CTA look like? A product page built to fit its audience makes for a smoother shopping experience, and that makes for higher sales.

Step 2: Create Two Variations of Your Test Element

When designing or building your test elements, see that both have their own potential appeal. Creating a variation simply because you need something to test your original design against will give you unhelpful, uninformative data. In the product page example, this might look like a layout with a hero image versus three smaller images of equal weight.

Don’t forget to run the A/B test with your site designs optimized for mobile as well as desktop.

Step 3: Scope Out the Test Parameters and Metrics

How long do you want to run your test, and how will you measure the results? You may also decide to test using only a percentage of your total audience. For example, you could use 20% of your audience split into two groups, allowing you to serve the remaining 80% of your audience the content that performed better.
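One way to sketch that 20/80 arrangement in code: hash each user into 100 buckets and reserve the first 20 for the experiment, split evenly between A and B (the function name and bucket scheme here are illustrative assumptions):

```python
import hashlib

def assign_with_holdout(user_id: str, test_share: float = 0.20) -> str:
    """Expose only `test_share` of the audience to the experiment.

    Users hash uniformly into 100 buckets; the first test_share * 100
    buckets are split between variants A and B, and everyone else keeps
    the default experience.
    """
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 100
    test_buckets = round(test_share * 100)  # 20 buckets at the default share
    if bucket < test_buckets // 2:
        return "A"
    if bucket < test_buckets:
        return "B"
    return "default"
```

Once a winner emerges, the remaining 80% can simply be served the winning variant.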

For a product page A/B test, the measurable metrics could be bounce rates, abandoned carts and completed transactions monitored over a one or two week period.
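For tracking those metrics side by side, a small per-variant summary structure is enough. A Python sketch (the class and field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class VariantStats:
    """Funnel metrics collected for one variant of a product-page test."""
    visits: int
    bounces: int
    carts_abandoned: int
    purchases: int

    @property
    def bounce_rate(self) -> float:
        return self.bounces / self.visits

    @property
    def conversion_rate(self) -> float:
        return self.purchases / self.visits

    @property
    def cart_abandonment_rate(self) -> float:
        carts_started = self.carts_abandoned + self.purchases
        return self.carts_abandoned / carts_started if carts_started else 0.0
```

Comparing two such records at the end of the one- or two-week window makes the trade-offs between the designs explicit.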

Step 4: Run Your Test

Depending on which testing tool you use (more on those below), you'll be able to view the data and results as they happen. Along with whatever metrics you already have in place, this data will also include the number of customer engagements, the drop-off rate, and the percentage of respondents still to act.

Step 5: Analyze the Results

This is the exciting part. Compare like-for-like results for both variations and see how they stack up. Sometimes the results might be very similar, and in these cases, it's often worth repeating the test rather than trying to uncover insights where there may be none.

Rounding off the product page example, it could be possible that Design A ranked higher in completed transactions and had lower bounce rates, but was outperformed by Design B in your abandoned cart metric. Use this information to analyze your designs, and make an educated guess as to why that might be.

For truly excellent ad creative based on informative, involved split tests, let dedicated design partners fulfill your design requirements. Get the assets you need when you need them, providing any ad creative feedback and input with ease.

Interpreting Your Test Results

The point of conducting an A/B test is, in part, to determine which element or feature better serves the brand. But there's also significant insight gleaned from understanding why your customers behaved the way they did. Think about how to use that data to make informed decisions going forward. Without getting too myopic (which can lead to manifesting "insights" that don't exist), spend time on the results and make sure you can use as much of the feedback as possible.

Helpful A/B Testing Tools

There are a number of A/B testing tools available. Google offers a free marketing platform called Optimize, designed for small and medium-sized businesses. Among a host of website assistance features, Optimize enables businesses to test site and app variations.

Full-featured paid tools include VWO, a widely used A/B and split testing platform that lets businesses test almost anything and measure the impact across channels.

Optimizely (no relation to Google Optimize) is another excellent tool, able to test and optimize customer touchpoints across the entire customer journey.


Common A/B Testing Mistakes

Despite its fairly simple structure, there are a few mistakes you can make when A/B testing. To help you avoid the most common ones, we’ve listed them below.

1. Testing multiple things at once

An email A/B test is a great illustration of this mistake. If both the subject line and the email content are A/B tested concurrently, the results will be muddled, completely invalidating the test and making it impossible to ascertain what element influenced what decision.

Keep your tests simple.

2. Testing with the wrong metrics

Testing for the wrong amount of time or with the wrong audience is equally destructive. Ensure that you’re measuring your A/B elements using relevant metrics. Don’t measure a landing page by revenue generated if that page doesn’t feature a sales option, for instance.

3. Trusting a "gut feeling" over statistical data

This seems obvious, but occasionally a test will indicate something that contradicts what was, in your mind, a certainty. You might be tempted to back yourself and go against the data, but this is rarely the right call. Data is unbiased. Cherry-picking when to ignore data limits the success of any A/B test to your own perception.

4. Not giving external factors any consideration

Before running an A/B test, especially a long one, assess the effect that any external factors could have (holidays, competitor activity, etc.). Unless significant external factors are weighed in, results can easily be skewed or misinterpreted.

5. Stopping at only one A/B test

Often the insights gained from an A/B test can create the need for further testing. Full optimization is a long-term goal, and requires combining multiple tests with a well thought-out strategy.

To ensure you have testable social media content that resonates with your audience, let Superside handle your social media design and ad creative.

A/B Testing Examples

Netflix

The streaming service regularly runs a massive number of A/B tests with a sample size of about 100,000 users. Personalization is the name of the game, and this rigorous and highly structured series of tests allows Netflix to present a platform that looks completely different to each user.

Kiva

This non-profit organization wanted to increase donations from first-time users, and believed that adding more information to their landing page would do the trick. The A/B test proved that hypothesis correct, with the more information-rich landing page seeing an 11.5% increase in donations.

Version A of Kiva’s A/B test. Source: ConversionVoodoo.

Version B of Kiva’s A/B test. Source: ConversionVoodoo.

SmartWool

When this apparel brand wanted to increase its average revenue per user, it A/B tested different homepage designs. The winning version, which followed best practice by keeping product images at a uniform size, increased average revenue per user by over 17%.

Final Tips and Best Practices

Before you start using A/B and split testing to make better calls, familiarize yourself with the following best practices:

  • Ensure your audience and test are correctly set up before you begin (this is worth the time to guarantee that your data is accurate).
  • Don’t implement anything until you have statistical reliability, even if the early signs point to the obvious result.
  • End the test at the allotted time. If the results are insignificant, delaying the end of the test is unlikely to change this. Conclude the test and either retest or amend any variables.
Creative as a Service: The New Way to Get Design Done

Get a dedicated design ops team that scales with your vision. Want to learn more?

David Wilson
Senior Content Marketing Specialist

David is a Senior Content Marketing Specialist at Superside. A former journalist with bylines too numerous to enumerate, he brings his love of storytelling and semantics to the marketing world. Recognizing the sizable gaps in the creative-as-a-service (CaaS) sector, he jumped at the chance to fill the creative void for ambitious brands. In his off hours, he enjoys loud music, making vegan meals and being made fun of for making vegan meals. He’ll gladly talk to you about any of the above on LinkedIn.

