Benchmark Analysis: A/B Testing Tool Speed Test Study
In our recent study, we examined several noteworthy A/B testing platforms for their impact on website speed and performance on both desktop and mobile devices.
Our aim was to understand how these tools affect a site's loading time, an essential aspect that directly shapes a user's online experience.
The speed test evaluated eight A/B testing tools:
- VWO - A/B testing platform with Heatmap
- Convert Experiences - A/B testing platform
- ABlyft - A/B testing platform
- Webtrends Optimize - A/B testing platform
- Zoho PageSense - All-in-one marketing tool with A/B testing
- Mida.so - A/B testing platform powered by AI
- CrazyEgg - Heatmap tool with A/B testing capability
- FigPii - Heatmap tool with A/B testing capability
Metrics
To measure speed performance accurately, we first had to define these parameters:
- Largest Contentful Paint (LCP) measures the time it takes to paint the largest content element visible on the screen. The faster, the better.
- Start Time To Variant (STTV) is the time from the start of the script until the variant is loaded. Again, the faster this happens, the better.
- Uncompressed size is the total script payload the tool adds to the page. Size matters because a larger, more complex tool can slow the website down more than a smaller, simpler one.
- Number of requests counts the HTTP requests the platform makes, including any additional scripts loaded on the page.
By using these metrics, we managed to create a clear, fair and useful comparison that truly represents how these tools can impact your website's speed. Let's take a look at the results.
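The two timing metrics can be captured with the User Timing API, which is available in every modern browser (and in Node.js for experimentation). The sketch below uses illustrative mark names, not any vendor's actual instrumentation, to show how an STTV-style measurement works:

```javascript
// Sketch: timing from "start of script" to "variant applied" with the
// User Timing API. Mark names are hypothetical, not from any vendor SDK.
performance.mark('ab-script-start');

// ...the A/B script would fetch its config here, then apply the variant...
function applyVariant() {
  performance.mark('ab-variant-applied');
  performance.measure('sttv', 'ab-script-start', 'ab-variant-applied');
}

applyVariant();

const [sttv] = performance.getEntriesByName('sttv');
console.log(`STTV: ${sttv.duration.toFixed(1)} ms`); // lower is better
```

In a real page, the first mark would sit at the top of the vendor's snippet and the second inside the callback that applies the variant; LCP itself is reported separately by the browser as a `largest-contentful-paint` performance entry.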
Desktop Result Analysis
Here are the results showing how an A/B testing platform affects your website's loading speed, adding anywhere from 474 ms to 2,000 ms of overhead to your site's performance.
Interestingly, Mida.so emerged as the fastest A/B testing tool regarding LCP and STTV. In terms of project size and the number of requests, Convert and ABlyft were the most efficient, requiring just one request each.
Desktop speed test replay
Here is what a fast versus a slow A/B testing setup looks like for the user experience:
Mobile Result Analysis
Upon conducting the same tests on mobile platforms, we noticed different results.
For mobile, Mida.so again emerged as the A/B testing tool with the quickest LCP, while Convert displayed the smallest STTV. In terms of project size and the number of requests, ABlyft made the fewest requests, whereas Mida.so was the smallest in size.
Mobile speed test replay
Here is what a fast versus a slow A/B testing setup looks like for the user experience:
Interpreting The Analysis
Examining both tests, it becomes evident that Mida.so leads in LCP, which is crucial because it determines how fast a user perceives the page to load on both desktop and mobile. Strong contenders like Convert and ABlyft raised the bar in other significant areas.
Taking a closer look at STTV values, which capture how long it takes from script start until the variant is applied, Mida.so and Convert performed impressively on both desktop and mobile. These platforms show users the final variant faster, which is vital for maintaining a positive user experience.
Our focus on the size of scripts and the number of requests stemmed from their impact on rendering speed. Here, Convert and ABlyft showcased optimal performance, thus promising quicker content delivery to end users.
Undeniably, choosing an A/B testing tool involves assessing multiple factors, not merely speed. Each site has different optimization requirements and may therefore perform differently with the same tool.
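Since no single metric decides the choice, one way to weigh the trade-offs is a simple composite score across the four metrics. This is only a sketch with made-up weights and sample numbers, not the study's methodology or data:

```javascript
// Sketch: a weighted score for comparing platforms across the four metrics.
// Lower raw values are better on every metric, so a lower score wins.
// The weights and the sample figures below are illustrative only.
function score({ lcpMs, sttvMs, sizeKb, requests },
               weights = { lcp: 0.4, sttv: 0.3, size: 0.2, req: 0.1 }) {
  return weights.lcp * lcpMs +
         weights.sttv * sttvMs +
         weights.size * sizeKb +
         weights.req * requests * 100; // scale request count into the same range
}

const fastTool = score({ lcpMs: 500, sttvMs: 200, sizeKb: 40, requests: 1 });
const slowTool = score({ lcpMs: 1800, sttvMs: 900, sizeKb: 120, requests: 5 });
console.log(fastTool < slowTool); // true: the faster, smaller tool scores lower
```

Adjusting the weights lets each team encode its own priorities, e.g. a content site might weight LCP more heavily, while a heavily scripted app might care most about payload size.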
Interesting Platform Behavior
During the test, we observed some very interesting behavior from some of these A/B testing platforms:
- VWO: When the test serves the control group (no change), the page loads instantly; when it serves the variant group (with the V2 text change), loading becomes noticeably slow.
- Convert.com: The gap between desktop and mobile performance is extreme. On desktop, it is quite slow compared to the other platforms; on mobile, it is one of the fastest of the bunch. We confirmed this by running multiple tests, and the behavior was consistent throughout.
Testing Methodology
To ensure fairness and reliability in the results, we established specific protocols for the test. Here is a step-by-step guide on how the tests were conducted:
1. Test Environment
To measure each platform's speed cleanly, we constructed a rudimentary website. Its simplicity was essential, as it eliminated any extraneous elements that could interfere with the speed test results. The website featured a basic “Hello World” text that was altered to “Hello World V2” during the test.
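The variant itself was a one-line text change. Modeled as a pure function (in the real test, each vendor's script applies the change to the DOM), it looks like this:

```javascript
// Sketch of the variant change the test page exercised: swapping the
// headline text. A vendor script would do this against the live DOM;
// a pure function makes the behavior easy to verify in isolation.
function applyVariant(text) {
  return text === 'Hello World' ? 'Hello World V2' : text;
}

console.log(applyVariant('Hello World')); // "Hello World V2"
```

Keeping the change this small means any measured delay is attributable to the platform's script delivery and execution, not to the complexity of the variant.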
2. Purpose of the Simple Test
Our primary goal was to investigate the raw speed of the various vendors. By using a fundamental webpage coupled with a straightforward A/B test scenario, we ensured we had a controlled environment that was free from any complex influences.
This allowed us to focus entirely on the raw loading speed of the different platforms without any disruptions or interference. The objective was clear: see how fast each platform renders the simple V2 text.
3. Location and Environment
The geographic position of the server and the testing environment play a significant role in a website's loading speed. To keep the playing field level, all tests were conducted on a US East server, which maintained consistency across all platforms.
Furthermore, the tests were executed with bandwidth capped at 100 Mbps for desktop and 12 Mbps for mobile environments. This ensured a standard environment and connection speed, eliminating any differences that could arise from varying Internet speeds.
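To see why the bandwidth caps matter, consider the raw transfer time of an A/B script at the two capped speeds. The 100 KB payload below is a hypothetical figure, and real loads also pay connection latency and parse time on top of this:

```javascript
// Sketch: time on the wire for a payload of a given size at a capped
// bandwidth. Figures are illustrative, not the study's measured data.
function transferMs(sizeKB, mbps) {
  const bits = sizeKB * 1024 * 8;      // payload in bits
  return (bits / (mbps * 1e6)) * 1000; // milliseconds at the capped speed
}

const sizeKB = 100; // hypothetical 100 KB A/B testing script
console.log(transferMs(sizeKB, 100).toFixed(1)); // desktop cap: 8.2 ms
console.log(transferMs(sizeKB, 12).toFixed(1));  // mobile cap: 68.3 ms
```

The same script costs roughly eight times longer on the mobile cap, which is one reason tool size and request count weigh more heavily in the mobile results.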
4. Testing Platform
The tests were carried out using DebugBear, a performance monitoring tool for modern web development. DebugBear provided a comprehensive and reliable measure of performance, making it an ideal tool for this speed test study.
This meticulous and methodical procedure ensured that our speed test results were reliable, unbiased, and representative of the vendors’ actual performance. By eliminating any potential complexities or differing variables, we aimed to present results that accurately reflect the raw speed of each A/B testing platform.
Wrapping Up
Given that a few milliseconds can drastically sway your conversion rates and user experience, your choice of A/B testing tool holds immense importance. Therefore, although speed is always a crucial parameter, it's also vital to consider other factors like scalability, integration capabilities, feature set, ease of use, and price when selecting an A/B testing tool.
The results of this analysis provide valuable insights into various A/B testing tools’ abilities and can aid in making an informed decision. The data emphasizes the need for rigorous website performance testing to ensure your chosen testing tools, whether for desktop or mobile, do not hamper your website’s speed.
Feel free to reach out if you want more in-depth information about our speed test analysis.
Please note: The given results may not be universally applicable due to variable factors such as the type of experiment executed or the tested page. Therefore, we recommend supplementing our study with your own investigation to find the best-fitting A/B Testing platforms for your requirements.