3 Simple A/B Test Variables to Increase Your Site Conversion

You’ve probably heard about the benefits of split testing. There’s a long list of case studies showing that A/B testing increases conversions. However, many business owners simply don’t know where to begin.

We’re often asked the question: “What factors do I split test?”

Knowing what to test isn’t always crystal clear. If you tweak too much at once, you run the risk of not being able to identify what works and what doesn’t.

That’s why in this article, I’m giving you 3 simple A/B testing variables that can make a real difference.

Split Testing Variable #1: Buy Buttons

A/B Testing Results

With a powerful but simple-to-use split testing tool, you can easily test small variables. I use the word small hesitantly; after all, what looks insignificant to you might be a deal-breaker for your traffic.

Look at the graph above, and you’ll see the kind of results a simple but powerful split testing tool can produce.

So what do I mean by small tweak? Here’s a prime example…

The wording or color of your call to action buttons can have a tremendous impact on your conversion rate.

If your conversions are low, switch from your current blue button to action colors such as red or orange.

Another variable to test is wording: change “buy now” or “add to cart” to something more enticing, such as “get instant access.”

When language inspires action, that’s when opt-in rates increase. Here are some prime examples of wording that fosters higher opt-in rates:

Good Buy Button Example

Compare this button, with a color that pops and directive language, to this one…

Bad Buy Button

Will Orange or Yellow win every time?

Not necessarily, and that’s why it’s so important to test button colors and call to action wording.

You just never know.

A few years back, HubSpot ran a case study comparing red and green buy buttons, and the red version won with a 21% higher conversion rate.
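If you want to check whether a gap like that is more than random noise, a simple two-proportion z-test does the job. The sketch below uses made-up traffic numbers chosen to illustrate a 21% relative lift; they are not HubSpot’s actual figures:

```python
from math import sqrt

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: green button converts 200/1000 visitors,
# red button 242/1000 (a 21% relative lift).
z = z_score(200, 1000, 242, 1000)
print(round(z, 2))  # → 2.26; |z| > 1.96 means significant at the 95% level
```

In this made-up scenario the difference clears the 95% bar, so you could call the red button the winner. With smaller traffic numbers, the same relative lift often wouldn’t be significant, which is why you keep the test running until you have enough data.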

Split Testing Variable #2: Landing Page Length

I like to think of split testing the length of your sales pages as “the Goldilocks Method.” Before you roll your eyes at my analogy, know that I’m about to share something that will increase your opt-in rate.

When you’re set to unveil or re-launch a new product/service, you might start with a long, robust sales page that hits on every benefit, solution, bonus offer, and pain point.

I suggest you split test 3 different sales pages: a long “Papa Bear,” a mid-length “Mama Bear,” and a short “Baby Bear” version.

Bear Family

The Papa Bear variable: this long version is the best place to start split testing, not because long sales pages can’t work, but because it gives you the most content to trim and test against later.

Stick this dictionary-size landing page up, drive traffic to it, and see what happens. Your data might show that readers abandon ship. That isn’t always the case, but when it happens, it’s your cue to trim the content.

The Mama Bear variable: think of this landing page as a compromise between long/benefits-driven and short/impactful.

Pick through that monster sales page and cut repetitive, confusing, or unnecessary information. To make this concrete: if the Papa Bear version runs 25 document pages, the Mama Bear version would land at around 7 to 10 pages.

The Baby Bear variable: this version is small but powerful, standing at roughly 10-20% of the original length.

Once you chisel your content down from the longer forms, you’ll unearth powerful copy that explains the benefits without the fluff. Use high-performance split testing software to determine which length keeps your traffic on the page, clicking buttons, and converting.
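One practical wrinkle in a three-way test like this: each visitor should see the same variant on every visit, or your data gets muddy. A common trick, sketched here with a hypothetical `assign_variant` helper, is to hash the visitor ID and bucket on the result:

```python
import hashlib

VARIANTS = ["papa_bear", "mama_bear", "baby_bear"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of the three
    page-length variants, so repeat visits see the same page."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]

# The same visitor always lands in the same bucket:
print(assign_variant("visitor-42") == assign_variant("visitor-42"))  # True
```

Dedicated split testing tools handle this assignment for you; the sketch just shows why the traffic split stays stable and roughly even without storing any per-visitor state.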

Split Testing Variable #3: Video placement

Heatmap

Split testing provides valuable data, but sometimes you need to interpret that data carefully. When you analyze user behavior through analytics, it’s important to practice educated speculation.

One old cliché sums it up best: your eyes may deceive you.

If you use heatmapping software such as Crazy Egg, you may notice that users click on your videos. Chances are, traffic is clicking the stop button so they don’t get distracted while reading the rest of the page.

This user behavior can be a good or bad sign, so it’s important to think it over. Either the audience wants to know more, or they don’t like the video.

How do you know? You split test the video placement.

If the video is near the top of a sales page, try placing it in the middle. These small changes break up the monotony of the text, give your audience more value, and make the marketing content easier to digest.

Now that you know a few variables to test, start A/B testing to eliminate the guesswork and base your marketing strategy on what actually works.

OptinMonster allows you to run A/B tests with just a few clicks, so you can attain powerful data on user behavior and make small changes that have large impacts. Click here to learn more.

Victoria is a freelance writer for OptinMonster who loves writing about email marketing and conversions.

Comments

  1. Owain says:

    If you are interested in tools in this space (heatmapping software), take a look at ours too: Decibel Insight.

  2. Marcin R. says:

    Hi there 😉 Thanks for this article. Indeed, these 3 variables you mentioned are very important when it comes to increasing conversion rate. What I’d like to mention, though, is that it’s not always good to focus solely on conversion rate when A/B testing. That’s because many A/B tests deliver significant conversion rate gains but fail to deliver a real impact on the business.
    Cheers,
    Marcin
