Case study – Minto Communities
Hippoc is a predictive A/B testing platform that uses AI and neuroscience to instantly produce A/B test winners — without needing to spend time or money rolling out a cumbersome A/B test.
Using a real-life A/B test as a benchmark, we ran the Hippoc platform to see whether it could predict the existing test results.
Real-life A/B Test – Minto Communities
As reported on the go-to A/B testing case study website, GuessTheTest, the original A/B test was run for a Canadian real estate development company called Minto Communities, using the A/B testing and heatmapping platform Crazy Egg. The test ran for about 3 months with a total of 16,285 mobile visitors.
The experiment was run to determine which CTA color would improve clickthrough rates (CTRs).
The test goal was to drive more prospective home buyers to register for important housing updates that would keep Minto Communities top of mind for prospects.
A total of four button color variations were tested across the desktop and mobile sites. Below are the control and the winning variation tested on mobile:
The study was set up as a multi-armed bandit test. As such, for each device type, traffic was initially split equally across the 4 button color variants; however, as a winner began to emerge, traffic was weighted and diverted to the best-performing version.
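This kind of weighted allocation can be sketched with Thompson sampling, one common multi-armed bandit strategy (the case study does not say which algorithm Crazy Egg uses, and the four click-through rates below are purely illustrative, not Minto's actual numbers):

```python
import random

def thompson_sample(successes, failures):
    """Pick an arm by sampling each arm's Beta posterior and taking the max."""
    draws = [random.betavariate(s + 1, f + 1) for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=lambda i: draws[i])

def run_bandit(true_ctrs, n_visitors, seed=0):
    """Simulate routing visitors across button variants.

    Traffic starts out roughly even and shifts toward the
    better-performing arm as click evidence accumulates.
    """
    random.seed(seed)
    k = len(true_ctrs)
    successes, failures, pulls = [0] * k, [0] * k, [0] * k
    for _ in range(n_visitors):
        arm = thompson_sample(successes, failures)
        pulls[arm] += 1
        if random.random() < true_ctrs[arm]:  # did this visitor click?
            successes[arm] += 1
        else:
            failures[arm] += 1
    return pulls, successes

# Hypothetical CTRs for four button colors; arm 1 is the strongest.
pulls, successes = run_bandit([0.030, 0.044, 0.029, 0.031], 16285)
```

Over 16,285 simulated visitors, the strongest arm ends up receiving the bulk of the traffic, which is exactly the behavior described above.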
Winner: Version A – the yellow button won on mobile, lifting CTRs a strong +47.91%, whereas the orange CTA version dragged down conversions by 1.53%.
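For reference, a relative lift like the reported +47.91% is computed as (variant CTR − control CTR) / control CTR. The CTRs below are hypothetical, since the case study reports only the lift, not the raw rates:

```python
def relative_lift(variant_ctr, control_ctr):
    """Relative CTR lift of a variant over the control, as a percent."""
    return (variant_ctr - control_ctr) / control_ctr * 100

# Illustrative numbers only: a 4.43% variant CTR vs. a 3.00% control CTR.
print(round(relative_lift(0.0443, 0.0300), 2))  # prints 47.67
```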
The Hippoc predictive A/B test experimentation
To validate this result, a quick test was run on Hippoc, comparing the control (Image 2) against the variant (Image 1).
When measuring only the CTA, Hippoc successfully predicted that version A, with an overall impact score of 65%, would outperform the control version, which had an overall impact score of 53%.
How can this result be explained?
Whether the analysis is done for all the zones or only on the CTA, we can see that the yellow button has a significantly higher impact score than the orange button.
Looking at the attention score, we can see that the yellow button attracts slightly less attention than the orange button. However, we can see that the recall score of the yellow button is much higher than the orange button. This means that the yellow button is much more quickly and efficiently processed by the consumer’s brain than the orange button.
Let’s not forget that the brain is lazy! By making the information easier to process, we encourage the desired action to occur.
Assuming it cost Minto Communities $0.50 per visitor to lead the 16,285 users to the 4 mobile variants, this experiment would have cost roughly $8,100 in paid traffic and took 3 months.
In contrast, it took Hippoc less than 1 minute to run this test and reach the same winning result.
For more A/B testing ideas and insights, check out the go-to A/B test case study resource, GuessTheTest.