πŸ’Ή Reporting Overview

Every test in Shoplift has a corresponding report which presents essential information about the data collected over the duration of the test. This report data can provide a robust understanding of what experiences are working on your website so you can learn and improve your testing strategies over time.

Test data is updated hourly, so you can check in on the outcomes of your tests throughout the day.

Reporting page layout and hierarchy of information

The test report page is organized by level of detail: summary information appears first, with more granular breakdowns further down the page.

At the top of the test report page is high-level information about the performance and statistical significance of your test.

As you scroll down the page, more granular breakdowns of your original and variant are presented in tables and charts.

This makes it easy to pop in from time to time while your tests are live and get either a high-level read on how a test is performing or a deep dive into more granular views for device, visitor, and audience segments.

Test overview tiles

The test overview section summarizes your test's overall performance at a glance.

  • Total visitors: the total number of unique website visitors that encountered your tested experiences.

  • Lift: the performance improvement or reduction (displayed as a percentage) for your selected primary goal.

  • Progress: the progress towards statistical significance for your test data, based on your primary goal. When a test reaches the "Significant" state, it indicates that the performance observed has achieved sufficient statistical power (80%) to be validated at a confidence level of 95%. Please note that this is different from probability to win (below), which provides an estimate for how likely your variant is to be the higher performing experience. For more information, see Test progress.

  • Test duration: the total running time of your test, shown in days and accurate to the hour.

  • Estimated time to significance: the estimated time remaining for your test to reach a confidence level of 95%, based on the data collected.
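Lift and significance can be made concrete with a short sketch. Shoplift does not publish its exact statistics engine, so the following models the tiles above with a standard two-proportion z-test; all function names and input counts are hypothetical.

```python
# Illustrative sketch only: models "Lift" and the 95%-confidence check
# described above with a two-proportion z-test. Inputs are hypothetical.
from math import sqrt
from statistics import NormalDist

def lift(control_rate, variant_rate):
    """Relative performance change of the variant vs. the original."""
    return (variant_rate - control_rate) / control_rate

def is_significant(control_conv, control_n, variant_conv, variant_n,
                   confidence=0.95):
    """Two-sided two-proportion z-test at the given confidence level."""
    p1, p2 = control_conv / control_n, variant_conv / variant_n
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
    z = abs(p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(z))
    return p_value < (1 - confidence)

print(round(lift(0.020, 0.025), 2))              # 0.25 -> a 25% lift
print(is_significant(200, 10_000, 260, 10_000))  # True
```

A variant converting at 2.5% against an original at 2.0% would therefore show a lift of +25% on the tile.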

Test variant cards

The test variants section provides a visual preview of your tested templates, as well as high-level metrics for each variant, such as:

  • Visitors: The count of unique visitors who encountered each variant and the percentage of total test traffic they contribute (your traffic allocation).

  • Clicks: A tally of unique visitors who encountered a tested experience and clicked on an element on an associated test page.

  • Clickthrough Rate: The ratio of visitors who encountered a tested experience and clicked on an element on an associated test page to the total visitors who encountered that experience.

  • Cart Adds: The count of unique visitors who encountered a tested experience and added at least one item to their cart.

  • Add-to-Cart Rate: The ratio of visitors who encountered a tested experience and added at least one item to their cart to the total visitors who encountered that experience.

  • Orders: The number of orders transacted by visitors for each variant experience.

  • Revenue: The total revenue generated by each variant experience.

  • Conversion Rate: The percentage ratio of total orders to total traffic for a given variant experience.

  • Average Order Value: The monetary ratio of total revenue to total orders for a given variant experience.

  • Revenue per Visitor: The monetary ratio of total revenue to total traffic for a given variant experience.

  • Probability to Win: The estimated probability of each variant being the best-performing experience based on your primary goal. This probability is considered valid once your test achieves significance.
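The derived metrics above all follow from five raw counts per variant. A minimal sketch, using hypothetical numbers (Shoplift computes these server-side):

```python
# Hedged sketch: reconstructing the variant-card metrics from raw counts.
# All counts below are hypothetical example data.
def variant_metrics(visitors, clicks, cart_adds, orders, revenue):
    return {
        "clickthrough_rate": clicks / visitors,     # clicks per visitor
        "add_to_cart_rate": cart_adds / visitors,   # cart adds per visitor
        "conversion_rate": orders / visitors,       # orders per visitor
        "average_order_value": revenue / orders,    # revenue per order
        "revenue_per_visitor": revenue / visitors,  # revenue per visitor
    }

m = variant_metrics(visitors=5_000, clicks=1_250, cart_adds=400,
                    orders=150, revenue=9_000.0)
print(m["clickthrough_rate"])    # 0.25
print(m["average_order_value"])  # 60.0
```

Note that Conversion Rate, Average Order Value, and Revenue per Visitor are linked: conversion rate times average order value equals revenue per visitor.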

Performance overview charts

The "Performance Overview" section showcases multiple line charts illustrating the progression of your test's performance over time. Additionally, it features pie charts that break down your tested traffic by acquisition channel and device category.

Performance over time line chart

The performance over time graph can be used to spot outliers in your test data - for example, if you ran a flash sale that resulted in higher conversion rates on a particular day. There is a toggle that allows you to view daily data for:

  • Clickthrough rate over time

  • Add-to-cart rate over time

  • Conversion rate over time

  • Average order value over time

  • Revenue per visitor over time

  • Probability to win over time (for the selected metric view)
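Shoplift does not document how "Probability to Win" is computed, but a common approach for this kind of estimate is a Bayesian Beta-Binomial model. The sketch below is illustrative only, using Monte Carlo sampling over hypothetical conversion totals:

```python
# Illustrative sketch, not Shoplift's actual method: estimates the chance
# that the variant's true conversion rate beats the original's by sampling
# from Beta posteriors. All inputs are hypothetical.
import random

def probability_to_win(control_conv, control_n, variant_conv, variant_n,
                       draws=20_000, seed=42):
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Beta(successes + 1, failures + 1): posterior under a uniform prior
        c = rng.betavariate(control_conv + 1, control_n - control_conv + 1)
        v = rng.betavariate(variant_conv + 1, variant_n - variant_conv + 1)
        wins += v > c
    return wins / draws

# With a clear difference, the estimate approaches 1.0 for the variant.
print(probability_to_win(200, 10_000, 260, 10_000))
```

This also illustrates why probability to win differs from significance: it answers "how likely is the variant to be the better experience?", while significance asks whether the observed difference is unlikely to be noise.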

Device and channel breakdown pie charts

The pie charts for device and channel breakdown show how traffic is distributed across your overall test for various channel groups and different device categories. For additional details about channel groups, refer to Channel groups.

Device performance table

The "Device Performance" section shows a detailed table view that presents the data collected for your test variants across all performance metrics for both desktop and mobile. This table is an excellent way to comprehend how your tested experiences influenced desktop and mobile traffic.

Visitor performance table

The "Visitor Performance" section shows a detailed table view that presents the data collected for your test variants across all performance metrics for both new and returning visitor segments. This table is an excellent way to comprehend how your tested experiences influenced new and returning visitor segments, particularly for subscription-based businesses.

Audience performance table

The "Audience Performance" section presents a comprehensive table view displaying performance metrics collected for your test variants across different channel groups. This table is helpful in understanding scenarios such as an experience leading to a higher conversion rate on paid social but not on organic search.

For more information on channel groups, see Channel groups.
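The idea behind the audience table can be sketched as splitting one overall conversion rate into per-channel-group rates. The channel names and events below are hypothetical example data:

```python
# Hedged sketch: per-channel-group conversion rates, as in the audience
# performance table. Events are hypothetical (channel_group, converted?).
from collections import defaultdict

visits = [
    ("paid_social", True), ("paid_social", False), ("paid_social", False),
    ("organic_search", False), ("organic_search", False),
    ("organic_search", True), ("organic_search", False),
]

totals = defaultdict(lambda: [0, 0])  # channel -> [visitors, orders]
for channel, converted in visits:
    totals[channel][0] += 1
    totals[channel][1] += converted

for channel, (visitors, orders) in totals.items():
    print(channel, f"{orders / visitors:.1%}")
```

Here the same variant converts at 33.3% on paid social but only 25.0% on organic search, the kind of split the table is designed to surface.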
