Video Transcript:

Jason Drohn:
All right. Hey, what's up? This is Jason Drohn, welcome to today's episode of Sales System Experts. How are you doing, Aaron?

Aaron Parkinson:
We are live.

Jason Drohn:
We are live. Yes, we are.

Aaron Parkinson:
I'm doing great.

Jason Drohn:
That's fantastic. Any good news from the week?

Aaron Parkinson:
So much good news. Did a webinar yesterday, had 2,300 people registered, and 700-ish of them showed up.

Jason Drohn:
Wow.

Aaron Parkinson:
Did about $50,000 in sales in the first wave. Now, the replays will go out and all that good stuff, which usually doubles the conversions, so about $100,000 in sales. It was for a really good client of ours; I was helping them do it. We kind of partnered up on it. That was super, super cool. Yeah, man, just so much going on, like you, but let me ask you: how was your week? What cool thing happened in your week?

Jason Drohn:
Oh, geez. My cool thing, let's see, I've been on the phone all week.

Aaron Parkinson:
Not that cool really, but cool that so many people want to talk to you and give you money.

Jason Drohn:
Yeah, kind of. I've been on the phone all week, so it's been a lot of calls, a lot of client calls, just a lot of calls, you know?

Aaron Parkinson:
Did that make it feel like the shortest week in history or the longest?

Jason Drohn:
Oh, it was the longest. I'm an introvert by nature, so these long call weeks are just not something that I get into. I'm going to share something that I have been getting into, if that's okay?

Aaron Parkinson:
Please do. What's the topic of our call today?

Jason Drohn:
The topic of our call today is scaling and optimization, specifically split testing.

Aaron Parkinson:
Are we going to do 42 calls about this? This could go long.

Jason Drohn:
Yeah. This is probably more of an optimization thing. It's funny, because I haven't even shown Aaron this, so we're using it as an example, just as an excuse to have a conversation.

Aaron Parkinson:
Real live example of our stuff that you're filling me in on that we're going to use for training... I love it.

Jason Drohn:
Yeah. Let's see. Okay, so here's a good one. We're testing two different variations of a... So this is a piece of software called Visual Website Optimizer. I was in love with Visual Website Optimizer, and then I wasn't in love with Visual Website Optimizer. Now, I'm back in love with it again.

Aaron Parkinson:
Sounds like a girlfriend.

Jason Drohn:
Right, I know. It was the very first split testing tool that I ever played with, and by and large, it was something I just really enjoyed. It also worked very, very well. Then they added a bunch of features to it, and it got really Microsoft-like. They did what Microsoft does to software: they move in, they fucking destroy it by adding a bunch of stupid shit, and then it doesn't work as well.

Aaron Parkinson:
Take that Microsoft.

Jason Drohn:
Since then, they've cleaned it all back up and reined in all their new features, and now it's pretty cool again.

Aaron Parkinson:
They've hired somebody from Apple.

Jason Drohn:
Right. This is one test. Basically, in this test I'm split testing our old sales funnel page against a new sales funnel page. The old page typically converts at 2%, and the only thing that's different between the two pages is a new video. It kicked me to the new variation, but all that's changed is this video. It's an updated video in which I'm 50 pounds lighter than in the previous video, which I recorded a year and a half ago.

Aaron Parkinson:
This is going to be an evaluation of human nature. Do people trust the fat guy more or the skinny guy more? We're going to find out. This is going to be interesting. I can't wait to hear the results of this. That's the only thing that's changed, right?

Jason Drohn:
It's the same script and everything.

Aaron Parkinson:
Wow. This is going to be interesting.

Jason Drohn:
The winner is the skinny guy. Or the skinnier guy.

Aaron Parkinson:
Oh my God. You humans are so shallow.

Jason Drohn:
Variation one, which is the new video, has gotten three conversions out of 58 visitors so far. That's three successful applications, compared to zero out of 51 for the control. Now, this is still early days; it's not a 200-click test or anything like that. But by and large, three to zero is ridiculous, so 6.9% conversion on the new video, which is saying something, right? Here's the preview. You can see, this is the screenshot of the old video, and then this is the new one. It looks like it's just a Vimeo screengrab or whatever, pulled in automatically.
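
[Editor's note: for anyone following along with the arithmetic, here is a minimal sketch of the conversion math Jason is reading off the dashboard. The counts come straight from the transcript; the helper function is ours. Note that 3/58 works out to roughly 5.2% raw, so the 6.9% quoted on screen is presumably VWO's modeled expected rate rather than the raw ratio.]

```python
# Conversion-rate arithmetic for the two video variations.
# Counts are taken from the transcript; the helper name is illustrative.

def conversion_rate(conversions: int, visitors: int) -> float:
    """Return conversions per visitor as a percentage."""
    return 100.0 * conversions / visitors

new_video = conversion_rate(3, 58)  # three applications out of 58 visitors
control = conversion_rate(0, 51)    # zero applications out of 51 visitors

print(f"new video: {new_video:.1f}%")  # ~5.2% raw (the dashboard shows 6.9%)
print(f"control:   {control:.1f}%")    # 0.0%
```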

Aaron Parkinson:
Is it the same thumbnail on both?

Jason Drohn:
Yeah, no, it's a different thumbnail. Both of them are click-to-play, so you have to hit play in both instances, but the rest of the content is the same. The form is the same, the content's the same, the headlines are the same. Everything besides the video is the same. Or everything besides-

Aaron Parkinson:
What would be interesting, if the thumbnails are different, is to look at the data on how many people played it.

Jason Drohn:
Oh yeah. The other thing we're tracking is engagement. Both of them require a click; it's not an automated play. Check out the thumbnails here. If we look at the previews, the thumbnail on the control is this overview shot of a laptop. Then the thumbnail on the new one, Vimeo has this weird capture thing, but it's me. It's a thumbnail of me, my face, full-motion graphic. They have to click play in both scenarios. If we look at engagement, we're at 34% when they have to click play on a thumbnail of me, versus 20% when it's the thumbnail of that overview stock photo thing. It's kind of an interesting little tidbit.
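
[Editor's note: a minimal sketch of what those engagement numbers mean as relative lift, using the 34% and 20% play rates from the transcript.]

```python
# Relative lift of the talking-head thumbnail over the stock-photo one.
face_thumb_rate = 0.34   # play rate with the full-motion face thumbnail
stock_thumb_rate = 0.20  # play rate with the laptop stock-photo thumbnail

lift = (face_thumb_rate - stock_thumb_rate) / stock_thumb_rate
print(f"relative lift: {lift:.0%}")  # 70% more plays
```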

Aaron Parkinson:
Interesting. Are we getting... Because we've got 34% versus 20%.

Jason Drohn:
Mm-hmm (affirmative), engagement.

Aaron Parkinson:
Yeah, which means that video is being played about 70% more often than the other one.

Jason Drohn:
Right, and then this has gotten... It's three to zero in terms of the number of applications submitted from the page. Let's look at the click map. Let's just check this baby out. This is control.

Aaron Parkinson:
Yeah.

Jason Drohn:
Then we see we have a click there, a couple of clicks through the page.

Aaron Parkinson:
Will this heat map software show us how many clicks were done on that video? Sometimes they'll summarize them in a popup.

Jason Drohn:
I don't know. It's currently showing 10 clicks total with an intensity... There's a click map.

Aaron Parkinson:
Maybe the click map will show us.

Jason Drohn:
It doesn't look like it's doing anything. All right, so let's go look at the other one.

Aaron Parkinson:
See, now what would be interesting is to put the same thumbnail on both of them.

Jason Drohn:
Yeah, to split test...

Aaron Parkinson:
To see if the thumbnail just got more people to click it, or if that video was the better-performing video.

Jason Drohn:
Look at this. If we look at the heat map for the control, we got a couple of clicks, just random clicks here. This is the heat map for the new video, which is, I mean, it's lit up compared to the other one.

Aaron Parkinson:
It is.

Jason Drohn:
Then we've got a couple of things here. It looks like people are just watching that video more.

Aaron Parkinson:
Yeah. That's what I think. I think that thumbnail of you there talking, or whatever you want to call it, is just driving dramatically more traffic through. I'm not sure I can make a conclusive evaluation of the shallowness of the human race yet.

Jason Drohn:
It might just be the thumbnail, not human nature.

Aaron Parkinson:
Yeah. Skinny Jason is more credible than fat Jason. I think it might just be a direct correlation to the volume of click-throughs based on the preview image.

Jason Drohn:
Yeah, might be. We should probably just split test the thumbnail. We'll test the thumbnail and see; maybe we'll give an update in a couple of weeks and see how shallow the human race is.

Aaron Parkinson:
Yeah. There are a lot of things that I won't share on here that are just so shallow. In split testing, it's standard Marketing 101; there's stuff I test where you're just like, "Oh man."

Jason Drohn:
That can't work.

Aaron Parkinson:
It's so disappointing, the human race, but it's so predictable. It's the go-to every time.

Jason Drohn:
The other one that we're going to look at is the Funnel Factor lander, which you've already seen in a couple of things. We're running a couple of different tests on it. I split tested the main headline, so just to catch everybody up: I tested this headline right here, and this is the winner by a long shot, "Convert More Clicks to Customers." It was a way, way, way long-shot winner over the other three tests that we ran. I locked that one in, and now I'm at this. I'm testing the text in this black box, so "the updated master guide reveals the step-by-step process to building blah, blah, blah."

Then, this is a slightly different variation of that. Then, we have a shorter variation, "the master guide reveals the top-converting sales funnels." Then, we have another just slightly tweaked version, so no major changes. It's a very prominent space on the landing page, but not necessarily major language changes there. You know? Here's the report. I disabled version one because that was a fricking dud, so nothing going on there.

Then, we have these other versions. Our control is currently the winner. Our control is outperforming the rest from a lead standpoint, so when we're looking at lead conversions. Now, Aaron was like, "Wow, that first sentence is grammatically weird." Yeah, dude, I get it, but it also converts better than everything else, so whatever.

Aaron Parkinson:
In my defense, I did say it could outperform because it's grammatically weird. People will read it two or three times...

Jason Drohn:
Yeah, but here's the crazy part. Here's the anomaly where I was like, "This shouldn't be right. This totally shouldn't be right." All right. Our variation two is converting at 39% and our variation three is converting at 37%. If you're only looking at lead conversion, you would immediately whack these two and then start another split test, wouldn't you? Okay, so watch this.

If we focus on a different conversion point, watch what happens. This is our add to cart, so this is how many people add to cart in each variation. Our control, the best performer on leads, has an expected conversion rate of 1%. Check out variation two though: 12.7% add to cart, which means that our... So let me just flip back real quick. I see Aaron zeroing in; he's like, "What the fuck just happened?"

Aaron Parkinson:
I'm all about it right now. I just heard 12% add to cart; that got my attention.

Jason Drohn:
The control is winning on the front side. When we're optimizing per lead, the control is converting at 45%, which is awesome. Variation two is converting at 39%, which is less awesome than the control. But if we take a step back and zero in on add-to-cart conversion, so the number of people who go through the confirmation page and buy the video course, or at least attempt to purchase it, 12% of variation two clicks through and adds to cart, and 7% of variation three clicks through to the cart, three out of 51. When we focus on add to cart, variation two is the winner of the split test, not the control.
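
[Editor's note: the point Jason is demonstrating, that the winner flips depending on which conversion event you score, can be sketched like this. The rates mirror the transcript; the visitor counts and names are illustrative, not VWO data.]

```python
# Which variation "wins" depends on the conversion event you score.
# Rates mirror the transcript; raw counts are illustrative.

variants = {
    "control":     {"visitors": 100, "leads": 45, "add_to_carts": 1},
    "variation_2": {"visitors": 100, "leads": 39, "add_to_carts": 13},
    "variation_3": {"visitors": 100, "leads": 37, "add_to_carts": 7},
}

def winner_by(event: str) -> str:
    """Return the variant with the highest rate for the given event."""
    return max(variants, key=lambda v: variants[v][event] / variants[v]["visitors"])

print(winner_by("leads"))         # control: best opt-in rate (45%)
print(winner_by("add_to_carts"))  # variation_2: best add-to-cart rate (13%)
```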

Aaron Parkinson:
Then, it all comes down for us to ...

Jason Drohn:
Revenue.

Aaron Parkinson:
Return on ad spend. Therefore, the metric that we care about most is the sales metric, not necessarily the lead metric. Now, for people watching this, the three people that will watch it, what would you look at as a base number of conversions for statistical relevance in this case?

Jason Drohn:
I'm usually at like 95%. If I can get the probability to be best at 95%, then that's a pretty good number for me, but sometimes it's flat-out wrong. At this point, I'm looking at zero out of 49, and I'm like, "Well, there's no way that thing is going to win against these two, so let's let these two duke it out."
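
[Editor's note: VWO's "probability to be best" comes from a Bayesian model whose exact details are theirs, so this Beta-Binomial Monte Carlo sketch is only an approximation of the idea, applied here to the 3-out-of-58 versus 0-out-of-51 video test from earlier. The function name and the uniform priors are our assumptions.]

```python
# Rough "probability to be best" for two variants, Beta-Binomial style.
import random

def prob_to_be_best(conv_a: int, n_a: int, conv_b: int, n_b: int,
                    draws: int = 100_000) -> float:
    """Estimate P(true rate of A > true rate of B) under Beta(1, 1) priors."""
    wins = 0
    for _ in range(draws):
        rate_a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_a > rate_b:
            wins += 1
    return wins / draws

# New video (3/58) vs. control (0/51): lands around 0.92,
# suggestive but still short of Jason's 95% bar.
print(prob_to_be_best(3, 58, 0, 51))
```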

Aaron Parkinson:
Yeah, I agree.

Jason Drohn:
This is just added to the cart. This is just the number of people who added to the cart. Let's take this one step further and see how many people-

Aaron Parkinson:
Uh-oh, are you going to screw up my mind now?

Jason Drohn:
No, actually. Well, actually, kind of. Right now, variation two and variation three are neck and neck, which is, I mean, it is what it is. Three and three out of 51, in each case, ended up clicking through to purchase, so those are successful buyers since the split test was rolled out. It just goes to show you that your front-end metric, when you're testing, is not always the best metric to be looking at. We're going to disable this variation.

Being that this is the control and the base, the tool can't disable the base variation directly, so what we have to do is change the base variation and try again. We're going to change the base variation to variation two. Once it switches the base variation, the one the split test originally measures against, we can disable this control variation, because that one sucked. Even though it had the best lead conversion, we don't care about lead conversion. We care about sales. These two are our best from a sales standpoint. Make sense?

Aaron Parkinson:
I like it.

Jason Drohn:
What do you think?

Aaron Parkinson:
One, I think that the initial results are very exciting. Two, that's why we're always split testing.

Jason Drohn:
Right.

Aaron Parkinson:
Always be split testing, tests running 24/7, because the more testing... As long as you don't... For me, you can over-test.

Jason Drohn:
Yeah, totally.

Aaron Parkinson:
If you do it in an intelligent, controlled, organized way, like in the agency, where we only test with inside of 10% of the ad budget, right? So we don't mess up the overall return on ad spend. Then you should always be split testing. You should always be trying to beat the control. Always. I do want to point something out, correct me if I'm wrong: the grammatically incorrect one was attracting stupid people who don't buy.
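
[Editor's note: Aaron's rule of thumb, capping experiments at 10% of ad spend so a losing test can't sink the overall return on ad spend, sketches out like this. The 10% figure is his; the function and numbers are illustrative.]

```python
# Aaron's rule of thumb: risk at most 10% of ad spend on experiments.

def split_ad_budget(total: float, test_share: float = 0.10) -> dict:
    """Split spend between the proven control and split tests."""
    test = total * test_share
    return {"control": total - test, "tests": test}

print(split_ad_budget(10_000))  # {'control': 9000.0, 'tests': 1000.0}
```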

Jason Drohn:
You could make that case. You could make that case.

Aaron Parkinson:
We might have been getting the highest opt-ins there, but they were dumb-dumbs.

Jason Drohn:
Yeah, you could make that case. I mean, the smart ones buy because they recognize quality, right?

Aaron Parkinson:
That's what I'm saying. The buyers were like, "I ain't buying this. This guy can't even speak English." The dumb ones were like, "I resonate with this guy. He speaks like me."

Jason Drohn:
Oh, that's funny.

Aaron Parkinson:
Thank you.

Jason Drohn:
That is some of the big statistical stuff that has kept me sane throughout this process amidst my 35 calls.

Aaron Parkinson:
I love it. I love it; call this a wrap. I have a hard stop for another meeting, and you and I have another meeting later today.

Jason Drohn:
We do.

Aaron Parkinson:
Thank you for joining us again on Sales System Experts. We'll see you next week.

Jason Drohn:
All right, see you. Bye.