A/B testing, or split testing as it’s called in some circles, is one of those terms that’s often used but rarely understood.  At its core, it’s performance testing and landing page optimization carried out through a series of controlled, scientific-style tests.

In the next few days, count the number of times you hear the word “test” or “testing” as it relates to marketing – it’ll shock you!  I know I say it all the time, and I hear it from a lot of different people in their podcasts, webinars and blog posts…

The thing about A/B testing is, by being diligent about the pages you include in your tests, you can test your way to a winning campaign – if you have the patience and a steady traffic source.

What I mean by that is most marketing fails.  It’s a fact of business.  Product offers, startups, ad campaigns; the deck is stacked against us most of the time…  But by A/B testing your marketing and sales material, you can make iterative improvements to the messaging, target markets, images and design, all adding up to a dramatically improved campaign over time.

That’s the trick to landing page optimization…  It’s not one test – oftentimes it’s multiple!

What I want to share with you today is what I call the “Split Test Evolution.”

In this iterative A/B testing process, you’ll see how we test one thing first, find the winner, and then start a second test based on that one control.

After Test 2, Test 3, and so on, you start to truly zero in on your ideal conversion rate because your landing pages are as close to optimized as possible…

Most of the time when you get started with A/B testing, you put up a few variations of a page and figure out which one converts best…  Maybe there’s a method to your madness – maybe there isn’t.  But then what?  What happens after you find a winner?

What you DO with the results is what matters in your hunt for landing page optimization.

First, let’s establish some ground rules:

  • Split testing takes patience and practice.  You should only be testing out ONE thing per page, per test.  That might be colors, buttons, headlines or images.
  • For every additional variation you add, you need to send that much MORE traffic!  If you're testing 2 variations, you might need 200 clicks.  3 variations, 300 clicks.  8 variations, 800 or 1000 clicks.
  • Disable the losers when you think they're losing – not when the software tells you they are.  You can always re-enable variations as the apparent winners start to drop (and they always do!).
  • Expect pretty high conversions right after you start a campaign.  The true test of your variations is what they do when you start to scale them...  From 20 clicks to 200 and 500 clicks or more!
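As a rough sanity check on those ground rules, here’s a minimal two-proportion z-test you can run before disabling a “loser.”  This isn’t the method from the post – VWO and similar tools do this calculation for you behind the scenes – and the click and conversion numbers below are hypothetical, but it shows what’s under the hood when software declares a winner.

```python
import math

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from control A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate across both variations
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference between the two rates
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se  # |z| > 1.96 means ~95% confidence

# Hypothetical numbers: 49/200 conversions on control vs. 75/200 on the variation
z = z_test(49, 200, 75, 200)
print(round(z, 2))  # → 2.81, comfortably above 1.96
```

With small click counts (the 20-click stage mentioned above), z rarely clears 1.96 – which is exactly why early “winners” tend to drop once you scale the traffic.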

Now, to do split testing, you need to have split testing software.  My pick is Visual Website Optimizer because you can test any page you have online, and it’s super easy to use.

Click here to check out VWO >>

Now, let’s look at some tests…

Here’s a landing page optimization campaign that we’ve been running for quite a while.  The traffic source is Facebook Ads driven directly to an opt-in page.  From Facebook Ads Manager, this is a website clicks campaign to a cold audience.

As you can see, Variation 2 is the clear winner with 37.74% conversions after we finished the test.  We discovered that pretty early so the majority of traffic went to that page…

Here’s where the A/B testing actually started out, though…


In our very first test, we were getting 24.47% conversions from cold, paid traffic.

Here’s what that page looked like:


From there, we tested different headlines to see which one resonated with our audience…


The headlines were:

  • 100 "Plug & Play" Subject Lines
  • 100 "Most-Opened" Subject Lines
  • "Tested For You" Subject Lines
  • Subject Lines That Get Your Emails Opened

And as a refresher, here were our stats at the beginning of the A/B testing:


As you can see, Variation 1 was our winner – the one that said “100 Most-Opened Subject Lines.”

Now, a 2% bump doesn’t sound like a lot (and it’s not!), but it did give us some very valuable intel…

We knew which headline to use on our next test!

For Test 2, we took that winning headline and changed the look of the landing page itself.  Now, if you’re familiar with our Scriptly Page Builder, one of these examples is inside Scriptly for you to use…  Largely because of this A/B test!


The text is the same across all variations – the top headline, the main headline, the body copy and the book image.

What we changed were the colors of the background, the button and the button location.

  • In Control and Variation 1 - the difference is the primary background color.
  • In Variation 2 and Variation 3 - we moved the button to the other side of the page.
  • And in Variation 3, we changed the button color.

Here are the results:


As you can see, Variation 2 was the HUGE winner of the A/B test, at 37.74%!

Here’s the winning landing page:


That’s a 13-point bump in opt-in conversions, and landing page optimization that rivals most other landing pages out there, considering this traffic was cold Facebook Ads…  All from two simple tests…

That’s some pretty impressive performance testing for only a few weeks’ worth of time!  Please take notice, too – a 13-point bump in conversions means 13 MORE people out of every 100 who hit that page opt in for the lead magnet…  Meaning, my lead cost dropped and my ad budget is going further for building our list.
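To make that lead-cost math concrete, here’s a quick sketch.  The $1.00 cost-per-click is an assumed figure purely for illustration – actual ad costs aren’t shared here – but the before/after opt-in rates are the ones from the two tests above.

```python
# Lead-cost math behind the conversion bump.
# Assumed $1.00 CPC for illustration only.
cpc = 1.00
clicks = 100

before = 0.2447  # opt-in rate at the start of testing
after = 0.3774   # opt-in rate of the winning variation

leads_before = clicks * before   # ~24 leads per 100 clicks
leads_after = clicks * after     # ~38 leads per 100 clicks

cost_per_lead_before = (cpc * clicks) / leads_before
cost_per_lead_after = (cpc * clicks) / leads_after

print(round(cost_per_lead_before, 2))  # → 4.09
print(round(cost_per_lead_after, 2))   # → 2.65
```

Same ad spend, roughly 35% cheaper leads – that’s the “budget going further” effect in numbers.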

Now, the next thing I need to do after this A/B test is test subheadlines to see if there’s a noticeable bump in conversion from them.  I’m guessing that there isn’t, but I’ve been wrong in the past :0)

At the end of the day, make small changes, see what’s working, and use that data to keep improving your conversions!  If you need help with performance testing and landing page optimization, make sure to book a call with us here!