
How to Work with A/B Testing Tools for Optimization Success? Top 6 Factors Explained

1. Start with Education (Don’t Skip This)


This is where you invest in the people using the tools. Because, whether free or paid, the gains from your A/B testing tool are only as good as the hands wielding it.

A/B testing software in the hands of a beginner is not the same as that same software in the hands of an expert.

Ace optimizer Simar Gujral of OptiPhoenix understands this. He conducts tool training for new hires but goes beyond “how to use an A/B testing tool” to emphasize process training and strategy.

Jonny Longden of Journey Further says 80% of your investment should be in people and 20% in technology.


So, your first step in getting maximum ROI from optimization is educating yourself and your team on the right A/B testing processes and strategies.

You can get started with CXL’s CRO and A/B testing program. In fact, as a Convert customer, you’ll get access to this CXL training program.


Training is what prepares a tester to:

  • Come up with great hypotheses to test
  • Prioritize testing high-impact areas first
  • Read the data correctly and extract valuable insights
  • Find test ideas
  • Test the right things

… and more.


2. Use the Tool’s Flexibility to Your Advantage


Another way to get the best out of your tool is to take advantage of its full range of features, but only as they apply to your unique use case, of course.

You can use:


Integrations


Integrations are the backbone of an interconnected tool stack that supercharges your A/B testing, marketing, and conversion rate optimization efforts.

You could extract data from multiple tools in your stack to inform hypotheses, send experiment data to as many platforms as possible (including Google Analytics), or run A/B tests on your email marketing campaigns.


Here’s what Silver Ringvee, CTO at Speero, says about this:


Make sure you don’t keep your data in isolation within your testing tool. I’d recommend pushing your experimentation data to as many relevant tools and destinations as possible. This way you can dig deeper into variant groups in your analytics tool, analyze the user behavior using something like Hotjar, and calculate the impact on long-term metrics like LTV or churn within your data warehouse.
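To illustrate that last point, here's a minimal Python sketch of how exported experiment assignments might be joined with warehouse revenue records to compare a long-term metric like LTV per variant. All names and numbers are hypothetical; in practice the join would happen in your data warehouse:

```python
from collections import defaultdict

# Hypothetical export of experiment assignments: user_id -> variant
assignments = {
    "u1": "control", "u2": "variant", "u3": "control", "u4": "variant",
}

# Revenue events pulled from a data warehouse: (user_id, amount)
revenue_events = [
    ("u1", 20.0), ("u2", 45.0), ("u2", 15.0), ("u3", 10.0), ("u4", 30.0),
]

def ltv_per_variant(assignments, revenue_events):
    """Join revenue events to variant groups and average revenue per user."""
    totals = defaultdict(float)
    users = defaultdict(set)
    for user_id, amount in revenue_events:
        variant = assignments.get(user_id)
        if variant is None:
            continue  # user never entered the experiment
        totals[variant] += amount
        users[variant].add(user_id)
    return {v: totals[v] / len(users[v]) for v in totals}

print(ltv_per_variant(assignments, revenue_events))
# control: (20 + 10) / 2 = 15.0 ; variant: (45 + 15 + 30) / 2 = 45.0
```

The same join logic extends naturally to churn or any other downstream metric: once variant membership lives alongside your behavioral data, any per-user metric can be split by variant.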


Self-Service Payment


This one is especially true for businesses that are just starting out with A/B testing and don't have an endless budget. The aim is to steadily improve your testing velocity.

So make sure the tool allows you to purchase more tested users as needed, without pausing your plan until you upgrade to a higher-priced tier (after tiresome conversations with sales reps).


Ideally, you could upgrade to access features you only need sporadically, and downgrade when your testing program is going through a lull so you can use the budget to upskill your team instead. Don't miss out on this opportunity.

Also, pricey tools come with a big promise that quickly devolves into hype. Don’t get sold on the bells and whistles that will let you test everything. You rarely need to do that.

Instead, focus on the tool that will let you test what your business needs. And this is where Education and Test Strategy come in.


3. Check for Tool-Introduced (Validity) Threats

Before you put all your trust in your results, make sure you fully understand how your chosen tool works and how you’ve implemented the test.


The goal here is to see if your data has been corrupted in some way, so that you’ll know how much confidence to place in the results.


Does the A/B Testing Tool Affect Your Core Web Vitals?

Make sure you’ve implemented your A/B testing tool in a way that does not decimate your search engine rankings. You don’t want the SEO vs. CRO debate. The two go hand in hand.

There’s almost no change you’d make to your site for CRO purposes that will upset your standing with Google. Any change you make will usually affect keywords, page content, and user experience. That’s just 3 of the 200+ ranking factors.

And when you increase conversions on those pages, you're also sending positive signals to Google that people love your content: bounce rates drop and visitor numbers grow.

According to Rand Fishkin from Moz, as long as you’re not making insane changes to your page, you should see CRO and SEO as teammates — not opposing factors.


Does Your A/B Testing Tool Cause Flickering?

You don’t want the blink skewing test data. Pick a tool that does not flicker.

Flicker can ruin the integrity of the data you collect from experimentation because it becomes obvious to your website visitors that something strange is going on with your webpage. When they catch a glimpse of the original version before the variant appears, it raises question marks in their minds. And it goes without saying that showing different versions of your page to the same visitor mars the user experience.

You need to deploy the tool code in a way that explicitly prevents blinking. Here's how you can do that.


Here’s what the experts have to share about some of the real-world effects of flickering in website optimization:

Even though we enjoy having personalization in our experiences, in many cases we don't want to know we're being personalized to. The reason is that we want to be in CONTROL: according to self-determination theory, we want to have autonomy and “CONTROL” of what we're doing.
Therefore, with the flicker effect, where the control shows for seconds and then changes to the variant, we can't go back to the first version even if we try to reload the page or go back, since the cookie has been saved. It causes mistrust and anxiety. “Why can't I go back to what I've seen before?” “I liked the first page I saw, how can I go back?”
In a nutshell, it causes mistrust toward the brand that has this issue, it increases bounce rate, and it leads to a loss in conversions.
Carlos Alberto Reyes Ramos, Speero

4. Look Under the Stats Hood & Consider Hiring Dedicated Talent

The tool is not the number one place to invest most of your testing budget. If that’s your strategy, you may even struggle to prove the ROI of your A/B tests.


If your organization is on the path to fully embracing an experimentation culture and making data-driven decisions all the way, it still makes perfect sense to invest in dedicated talent, even if you've successfully democratized data and gotten all hands on deck in that area. And if you can afford it, a team.


This is because experimenting doesn't yield the desired results in the long term when it's just a side task for someone on your marketing team. Even someone who devotes 50% of their time to testing will always outperform someone who gives it 1%.


You'd also want to focus on the leadership and communication skills of your talent to promote a testing culture in your organization. Coming up with great hypotheses and running sound A/B tests, split tests, or multivariate tests are skills that can be learned.


When it comes to your A/B testing tool, you want to be able to trust the results you're getting. Go with an option that's open about its statistical approach.

Whether Bayesian or Frequentist, your dedicated talent with a solid statistics background should be able to understand how those numbers are calculated. That way, you can extract much more accurate insights and get full value for the money invested in your tool.
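To make the difference concrete, here's a small Python sketch that analyzes the same (made-up) test results both ways: a Bayesian probability that the variant beats control, estimated by sampling from Beta posteriors under uniform priors, and a Frequentist two-proportion z-test. Real stats engines differ in their priors and corrections, so treat this purely as an illustration:

```python
import math
import random

# Example results (assumed numbers): conversions / visitors per arm
a_conv, a_n = 120, 2400   # control: 5.00% conversion rate
b_conv, b_n = 150, 2400   # variant: 6.25% conversion rate

# --- Bayesian view: probability that B beats A, under Beta(1, 1) priors ---
random.seed(42)
draws = 20000
wins = 0
for _ in range(draws):
    pa = random.betavariate(1 + a_conv, 1 + a_n - a_conv)
    pb = random.betavariate(1 + b_conv, 1 + b_n - b_conv)
    if pb > pa:
        wins += 1
prob_b_beats_a = wins / draws

# --- Frequentist view: two-proportion z-test (two-sided) ---
p_pool = (a_conv + b_conv) / (a_n + b_n)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / a_n + 1 / b_n))
z = (b_conv / b_n - a_conv / a_n) / se
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"P(B > A) = {prob_b_beats_a:.3f}, z = {z:.2f}, p = {p_value:.4f}")
```

Note how the same data can read as a near-certain Bayesian winner while the Frequentist p-value hovers just above the conventional 0.05 threshold; knowing why requires exactly the statistical literacy discussed above.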


Even if you're using one of the best free A/B testing tools, Google Optimize, you need this information. Unfortunately, all you can learn about GO is that it uses Bayesian statistics; it won't share its prior assumptions with you. That lack of transparency is a big issue.

Maybe it's time to consider transitioning to a more trustworthy testing tool?

On the other hand, with Frequentist stats engines, collaborators may look only at statistical significance levels and draw incorrect conclusions. Ah, the illogical sin of peeking! You're supposed to wait until the test reaches its predetermined sample size.

What you can do about this is set rules against peeking. You don’t want people running with erroneous conclusions that impact the quality of decisions.
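If you want to show collaborators why peeking matters, a quick simulation makes the point. The Python sketch below runs A/A tests (no real difference between the arms) and compares how often a naive z-test declares significance when checked at 20 interim looks versus only once at the full sample size. All numbers are illustrative:

```python
import math
import random

random.seed(7)

def z_significant(conv_a, n_a, conv_b, n_b):
    """Naive two-proportion z-test at the two-sided 5% level."""
    p = (conv_a + conv_b) / (n_a + n_b)
    if p in (0, 1):
        return False  # no variance yet; can't call significance
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return abs(conv_b / n_b - conv_a / n_a) / se > 1.96

def run_aa_test(n_per_arm, looks, base_rate=0.05):
    """Simulate one A/A test; report (significant at any peek, at the end)."""
    checkpoints = {round(n_per_arm * k / looks) for k in range(1, looks + 1)}
    conv_a = conv_b = 0
    peeked_significant = False
    for i in range(1, n_per_arm + 1):
        conv_a += random.random() < base_rate
        conv_b += random.random() < base_rate
        if i in checkpoints and z_significant(conv_a, i, conv_b, i):
            peeked_significant = True
    return peeked_significant, z_significant(conv_a, n_per_arm, conv_b, n_per_arm)

trials = 400
results = [run_aa_test(2000, looks=20) for _ in range(trials)]
peek_rate = sum(r[0] for r in results) / trials
wait_rate = sum(r[1] for r in results) / trials
print(f"false positive rate with peeking: {peek_rate:.0%}, "
      f"waiting for full sample: {wait_rate:.0%}")
```

Even though nothing differs between the arms, the peeking column comes out several times higher than the roughly 5% you get by waiting, which is exactly the kind of erroneous conclusion a no-peeking rule protects against.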


Always go for vendors with transparent stat engines.


5. Set Your Tool Up for Use & Adoption

One of the barriers to other members of your organization adopting a testing culture is that some of these tools come with a rather steep learning curve.


But you can make things a little more welcoming and easier to grasp for the average user. Here's how:


Start Right

You can easily overwhelm others if there are lots of features that seem too technical to even bother with.

If you aren't using these fancy features, go for a lightweight tool that cuts feature bloat. VWO's products are well set up for this (yup, a competitor… but this aspect of theirs is really amazing).


Use Your Tool’s Project Management Features

This is a fantastic way to include others and seamlessly work on A/B tests as a team.

Also, other people can utilize the same tool for different purposes. For example, Convert has the ability to have multiple projects in one account, with each project capable of handling unlimited sub-domains.


Nomenclature Is Important

Set things up in such a way that anybody on your team can hop on the tool and get an idea of what's been happening, for example with a consistent test-naming pattern that captures the page, the element changed, and the start date. Familiarity, in this case, breeds adoption.


6. Measure the Right Output For Your A/B Testing Program


It seems you can't get much-needed attention for an A/B testing program unless you tie it to revenue gains. Usually, executives demand exact numbers to validate the need for an A/B test.


“What percentage lift should we expect? And how much will that add to this year’s revenue?”


But that is not what experimentation is designed for. A/B tests are meant to add a measure of certainty or confidence to ideas — whether a hypothesis is true or not.

In fact, that approach can lead to problems such as:


  1. Setting expectations that cannot realistically be met,
  2. Attributing gains to A/B testing alone and ignoring other key players,
  3. Seeing a failed test as a total failure, instead of as a source of insights into what works (growing profit) and what doesn't (mitigating risk),
  4. Creating inaccurate extrapolations of test results.


That said, you don't want to use your A/B testing tool just to track link clicks. Instead, choose the right A/B testing goal, and leverage the power of your tool's Advanced Goal options to get granular about what you are tracking and why.


Here’s how to choose the right goals and metrics you should be tracking:


  1. Start with the goals that matter to your business. That way you can choose A/B testing goals that align with the goals of the business. If the goal of the business is to increase revenue by acquiring more customers, your experimentation should be geared towards generating leads.
  2. Pick your primary and secondary goals. Primary goals are those that directly link to the goals of the business, such as app downloads and demo requests. Secondary goals support primary goals: they are micro-conversions that usually lead to the achievement of a primary goal at some point in the future, such as engaging with your content or signing up for newsletters. The idea here is to recognize the factors that contribute to achieving the main business goals.
  3. Decide on what to measure. These are your key performance indicators (KPIs) — metrics that indicate where you are in relation to the main business goals. When you recognize this, it’s easier to actually measure and improve those metrics that directly impact positive growth for the business.
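Once you've picked a primary goal, it also helps to know how much traffic a test on that goal will need before you launch. Here's a rough Python sketch of the standard two-proportion sample-size formula; the baseline rate and detectable lift below are assumptions you'd replace with your own numbers:

```python
import math

def sample_size_per_arm(base_rate, mde_relative):
    """Approximate visitors needed per variant for a two-proportion test.

    base_rate:     current conversion rate of the primary goal
    mde_relative:  minimum detectable effect, relative (0.20 = +20% lift)
    Uses alpha = 0.05 (two-sided) and power = 0.80.
    """
    p1 = base_rate
    p2 = base_rate * (1 + mde_relative)
    z_alpha = 1.96   # two-sided 5% significance
    z_beta = 0.84    # 80% power
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Assumed example: 3% baseline demo-request rate, want to detect a 20% lift
print(sample_size_per_arm(0.03, 0.20))
```

Running numbers like these before a test starts keeps the expectations conversation with executives grounded: a small lift on a low-traffic primary goal may simply take longer to detect than the quarter allows.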


If you don't set up your goals correctly, you will either celebrate micro-goals that don't move the needle or constantly invest in "Big Sky Ideas" that are difficult to calibrate, design, and deploy, and that end up looking like failures. The balance lies in the middle.

