Case study – Capital One
Hippoc is a predictive A/B testing platform that uses AI and neuroscience to instantly produce A/B test winners — without needing to spend time or money rolling out a cumbersome A/B test.
Using a real-life A/B test, we ran the Hippoc platform to predict the outcome and validate it against the published results.
Real-life A/B Test – Capital One
As reported on the go-to A/B testing case study website, GuessTheTest, the original A/B test was run for the well-known American banking and lending company, Capital One. The study ran on Adobe Target for 17 days with a total of 205,831 visitors; traffic was split 50/50.
With the goal of increasing application submissions, the team tested whether adding a second “Get Pre-Qualified” call-to-action (CTA) button, in addition to the existing “Learn More” CTA, would help or hinder conversions.
The team, therefore, decided to test which button format would work best.
Winner: Version A – with the added “Get Pre-Qualified” button.
It drove an astounding 363% more clicks to the first step of the application process compared to the version with the single CTA, and also increased application submission rates by an incredible 55%.
Proving the point, after the winning version was implemented there was a consistent 17% lift in pre-qualification applications.
The results achieved 99% statistical confidence.
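The case study does not publish the raw submission counts, but the 99%-confidence claim can be illustrated with a standard two-proportion z-test. The counts below are hypothetical, scaled to the reported ~102,915 visitors per arm and the 55% lift:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical submission counts (not from the study): 1,000 submissions
# for version B and 1,550 (a 55% lift) for version A.
z, p = two_proportion_z(1550, 102_915, 1000, 102_915)
print(f"z = {z:.2f}, p = {p:.3g}")
```

At this sample size even a much smaller lift would clear the 99% threshold (|z| > 2.576), which is why such tests still need large traffic volumes and multi-day run times.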
The Hippoc predictive A/B test experimentation
To validate this result, a quick test was run on Hippoc, comparing variation A (Image 1) against variation B (Image 2).
Hippoc successfully predicted that version A, with an overall impact score of 50%, would outperform version B, which scored 45%.
How can this result be explained?
Let’s dive into the results to get a better understanding!
The first thing that stands out is that the additional CTA made the header and the “Learn More” CTA more cognitively efficient.
Indeed, the addition of the “Get Pre-Qualified” CTA in version A made the header and the “Learn More” CTA more salient in the eyes of the user, with respective attention scores of 79% and 49% in version A against 71% and 41% in version B.
Moreover, these same zones are recalled more quickly, with a recall score of 77% for header 1 in version A against 65% in version B, and 53% for the CTA in version A against 44% in version B.
Therefore, by adding the “Get Pre-Qualified” CTA, Capital One not only offered an option that resonated with a segment of high-intent users but also made the page more efficient from a cognitive point of view. Together, these effects led to 363% more clicks and a 55% increase in the form submission rate.
Conversely, we see that header 2 and CTA 3 in version B outperformed the same zones in version A.
This result is mainly explained by higher recall of these zones in version B: a recall score of 55% vs. 53% for header 2, and 65% vs. 56% for CTA 3.
We can thus see that the addition of the CTA in version A concentrates the user’s cognitive resources on the left side of the page to the detriment of the right side.
The study does not tell us whether there were more or fewer clicks on this third CTA.
We can clearly see here that attention and recall are limited resources that the brain must allocate strategically.
Adding or removing a visual element has a direct impact on how the brain processes the information on your landing page or ads; it can easily hide information or make it more obvious. But if you don’t measure this, you can’t know whether a change will have a positive or negative impact.
Assuming Capital One paid $0.50 per visitor for the roughly 200,000 visitors across the two landing pages, this experiment would have cost over $100,000 USD and taken 17 days, whereas our prediction took 60 seconds.
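As a back-of-the-envelope check (the $0.50-per-visitor figure is an assumption, not from the study):

```python
visitors = 205_831          # total reported traffic across both arms
cost_per_visitor = 0.50     # assumed paid-traffic cost in USD
total = visitors * cost_per_visitor
print(f"Estimated traffic cost: ${total:,.0f}")
```

Even under conservative traffic-cost assumptions, the estimate lands above the $100,000 mark quoted above.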
Find out more about this case study on the go-to A/B testing case study resource, GuessTheTest.