Overview

To learn how we track various metrics, please see Metrics. To understand how we attribute visitors to tests, please see Visitor Attribution.

Reporting page overview

The test report page is organized hierarchically, moving from high-level summaries to progressively more detailed data.

At the top of the test report page is high-level information about the performance and statistical significance of your test. As you scroll down the page, more granular breakdowns for your original and variant experiences are presented in various tables and charts.

Keep reading to learn more about the various sections on the page.

Test overview section

The test overview section provides a high-level summary of your test's overall performance.

  • Total visitors: the total number of unique website visitors that encountered your tested experiences.

  • Lift: the performance improvement or reduction (displayed as a percentage) for your selected primary goal. To see the lift of other test goals, use the

  • Progress: the progress towards statistical significance for your test data, based on your primary goal. When a test reaches the "Significant" state, it indicates that the observed performance has achieved sufficient statistical power (80%) to be validated at a 95% confidence level. Please note that this is different from probability to win (below), which estimates how likely your variant is to be the higher-performing experience. For more information, see Test progress. A simple worked example of a significance check appears after this list.

  • Test duration: the total duration of your test, displayed in days, accurate to the hour.

  • Estimated time to significance: the estimated time remaining for your test to reach a confidence level of 95%, based on the data collected.
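To make the confidence and significance concepts above concrete, the following is a minimal sketch of a standard two-proportion z-test on hypothetical visitor and conversion counts. It illustrates the general statistical idea only and is not Shoplift's internal calculation.

```python
from math import sqrt, erf

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2.0 * (1.0 - normal_cdf(abs(z)))

# Hypothetical counts: original vs. variant
p = two_proportion_p_value(conv_a=180, n_a=5_000, conv_b=230, n_b=5_000)
print(f"p-value: {p:.4f}")  # below 0.05 corresponds to significance at 95% confidence
```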

Test variants section

The test variants section provides a visual preview of your tested templates, as well as high-level metrics for each variant (a worked example of how these figures are derived appears after the list), such as:

  • Visitors: The count of unique visitors who encountered each variant and the percentage of total test traffic they contribute (your traffic allocation).

  • Clicks: A tally of unique visitors who encountered a tested experience and clicked on an element on an associated test page.

  • Clickthrough Rate: The ratio of visitors who clicked on an element on an associated test page to the total visitors who encountered the tested experience.

  • Cart Adds: The count of unique visitors who encountered a tested experience and added at least one item to their cart.

  • Add-to-Cart Rate: The ratio of visitors who added at least one item to their cart to the total visitors who encountered the tested experience.

  • Orders: The number of orders transacted by visitors for each variant experience.

  • Revenue: The total revenue generated by each variant experience.

  • Conversion Rate: The percentage ratio of total orders to total traffic for a given variant experience.

  • Average Order Value: The monetary ratio of total revenue to total orders for a given variant experience.

  • Revenue per Visitor: The monetary ratio of total revenue to total traffic for a given variant experience.

  • Probability to Win: The estimated probability of each variant being the best-performing experience based on your primary goal. This probability is considered valid once your test achieves significance.
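As a quick reference, the sketch below shows how the rate and per-visitor metrics listed above are derived from raw counts. The numbers are hypothetical and the calculation simply restates the definitions; it is not Shoplift's reporting pipeline.

```python
# Hypothetical raw counts for one variant
visitors  = 5_000       # unique visitors who encountered the variant
clicks    = 1_450       # visitors who clicked an element on a test page
cart_adds = 620         # visitors who added at least one item to their cart
orders    = 230         # orders placed by those visitors
revenue   = 14_950.00   # total revenue generated by the variant

clickthrough_rate   = clicks / visitors      # 29.0%
add_to_cart_rate    = cart_adds / visitors   # 12.4%
conversion_rate     = orders / visitors      # 4.6%
average_order_value = revenue / orders       # $65.00
revenue_per_visitor = revenue / visitors     # $2.99

print(f"CTR {clickthrough_rate:.1%}, CR {conversion_rate:.1%}, "
      f"AOV ${average_order_value:.2f}, RPV ${revenue_per_visitor:.2f}")
```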

FAQ: Does Shoplift track post-purchase upsells or one-click payments?

Yes and yes! We support all major post-purchase upsell apps, like Aftersell, out of the box. We also support all major one-click checkout payment solutions, even if they navigate away from Shopify's checkout flow, like Shop Pay, Apple Pay, Google Pay, PayPal, and more.

Performance overview charts

The "Performance Overview" section showcases multiple line charts illustrating the progression of your test's performance over time. Additionally, it features pie charts that break down your tested traffic by acquisition channel and device category.

Performance over time line chart

The performance over time graph can be used to spot outliers in your test data, for example, a flash sale that resulted in higher conversion rates on a particular day (a simple sketch of this kind of check follows the list below). There is a toggle that allows you to view daily data for:

  • Clickthrough rate over time

  • Add-to-cart rate over time

  • Conversion rate over time

  • Average order value over time

  • Revenue per visitor over time

  • Probability to win over time (for the selected metric view)
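As an aside, one simple way to flag a day such as a flash sale numerically is to compare each day's conversion rate against the average across the test. The sketch below uses hypothetical daily counts and a basic two-standard-deviation rule; it only illustrates the kind of outlier the chart makes easy to spot visually.

```python
from statistics import mean, stdev

# Hypothetical daily data for one variant: (date, visitors, orders)
daily = [
    ("2024-05-01", 800, 24),
    ("2024-05-02", 760, 22),
    ("2024-05-03", 810, 25),
    ("2024-05-04", 850, 51),   # flash sale day
    ("2024-05-05", 790, 22),
    ("2024-05-06", 780, 25),
    ("2024-05-07", 800, 24),
]

rates = [orders / visitors for _, visitors, orders in daily]
avg, sd = mean(rates), stdev(rates)

for (date, _, _), rate in zip(daily, rates):
    flag = "  <-- possible outlier" if abs(rate - avg) > 2 * sd else ""
    print(f"{date}: conversion rate {rate:.2%}{flag}")
```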

Device and channel breakdown pie charts

The pie charts for device and channel breakdown show how traffic is distributed across your overall test for various channel groups and different device categories. For additional details about channel groups, refer to Channel groups.

Device performance table

The "Device Performance" section shows a detailed table view that presents the data collected for your test variants across all performance metrics for both desktop and mobile. This table is an excellent way to comprehend how your tested experiences influenced desktop and mobile traffic.

Visitor performance table

The "Visitor Performance" section shows a detailed table view that presents the data collected for your test variants across all performance metrics for both new and returning visitor segments. This table is an excellent way to comprehend how your tested experiences influenced new and returning visitor segments, particularly for subscription-based businesses.

FAQ: When will I see new and returning visitor information for my test?

Shoplift begins tracking site visitors immediately after installation. We do not have access to historical visitor information and, therefore, cannot accurately distinguish between new and returning visitors for at least 14 days. This data becomes more accurate over time.

Within a test, "new" visitors are defined as visitors seen for the first time during the test period, while "returning" visitors have been seen before the test period. Because we cannot accurately distinguish between new and returning visitors in the first 14 days, tests launched in the first 14 days will not have new/returning visitor segmentation available. Only tests launched after the 14-day data collection period will have new and returning visitor information available.

Audience performance table

The "Audience Performance" section presents a comprehensive table view displaying performance metrics collected for your test variants across different channel groups. This table is helpful in understanding scenarios such as an experience leading to a higher conversion rate on paid social but not on organic search.

For more information on channel groups, see Channel groups.

Subscription metrics view

If you offer subscriptions, the subscription metrics view shows the breakdown of conversion metrics by one-time purchases and first-time subscriptions, so you can easily understand how your test impacts subscription engagement.

To view subscription reporting for your tests, change the "View" selector in the top-right corner of any table on the test report page to "Subscription metrics".

FAQ: Does Shoplift have integrations with subscription apps?

Yes! Shoplift has out-of-the-box integrations with all major subscription apps and requires no prior setup to report on essential subscription metrics in your test reports.

For information on which subscription apps are supported, see Subscription Apps.

Have a question? Reach out to Customer Support, and we'd be happy to help you.
