Review audience holdout results


The Results page shows data for an active audience holdout experiment. These experiments help marketers understand the incremental impact of advertising by comparing exposed users with a modeled baseline.

This article explains each metric on the Results page and describes how the charts work together to show incremental impact and reliability over time. It also clarifies how holdout results are scaled, and how statistical significance is indicated.

Review metrics

  1. Overall incrementality: A composite metric that combines site engagement and primary conversions into a single measure.
  2. Incremental site engagement: Shows how advertising influences on-site behavior beyond the baseline. The Count metric represents net new site visits driven by ads. The Percentage metric shows how the engagement rate of the exposed group changed compared to the holdout group rate. Learn how incremental site engagement calculations work. 
  3. CpISE: Cost per Incremental Site Engagement represents the average cost to drive one additional site visit beyond the natural baseline. The metric is calculated by dividing total media spend by incremental site engagements, which isolates visits caused by advertising from organic behavior. Learn about CpISE calculations.
  4. Incremental conversions: Displays two metrics that show incremental conversion impact. The Count metric shows net new conversions driven by advertising. The Percentage metric shows how much the exposed group conversion rate increased compared to the holdout group rate. Learn how incremental conversion calculations work.
  5. CpIC: Cost per Incremental Conversion represents the average cost to generate one additional conversion beyond the natural baseline. The metric divides total media spend by incremental conversions and reflects true acquisition cost by excluding organic conversions. Learn about CpIC calculations.
  6. Audience split: Shows how the experiment divided unique users into two groups. The exposed group includes users targeted by the campaigns. The holdout group includes all users seen in the bid stream that were not shown ads. The allocation ratio is governed by the configured split rate. While the system dynamically adjusts over time to approximate this target, natural fluctuations in traffic and bidding conditions may cause variations in the exact distribution.
  7. Overall incrementality: Shows the total net new volume of conversions and site engagements driven by the campaign. Net new volume is the difference between the volume recorded by the exposed group and the estimated volume in the scaled holdout group. The scaled holdout volume is an estimate: multiply the full exposed audience size by the estimated baseline rate from the holdout sample.
  8. Incrementality: These metrics show how performance changed beyond the estimated holdout baseline. The Count metric shows net new actions generated by the campaign. The Percentage metric shows the relative increase in site engagement or conversion rates compared to the holdout baseline; in other words, how much more likely exposed users are to act than the baseline predicts. Positive incrementality means advertising increased engagement or conversions, proving ads created net new activity. Neutral lift (near zero) means ads had little measurable effect; exposed users behaved like the holdout. Negative incrementality means exposed users were less likely to act than the holdout, which suggests inefficiency or possible campaign issues.
  9. Holdout (Scaled): Represents estimated performance across the entire audience without advertising. illumin uses the Holdout sample rate to record natural behavior (i.e. no exposure to the ad) and then scales that result to the full audience to estimate the baseline. That baseline is used to calculate incremental impact. 
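The scaled holdout, incrementality, and cost metrics described above can be sketched in a few lines of arithmetic. This is a minimal illustration with made-up figures; the variable names and numbers are not illumin's internals:

```python
# Illustrative inputs -- none of these figures come from a real experiment.
exposed_users = 900_000        # users targeted by the campaigns
holdout_users = 100_000        # users withheld from ads (a 10% split)
exposed_engagements = 27_000   # site visits recorded for the exposed group
holdout_engagements = 2_500    # site visits recorded for the holdout sample
media_spend = 50_000.00        # total campaign spend

# Scaled holdout: project the holdout's natural rate onto the full exposed audience.
baseline_rate = holdout_engagements / holdout_users       # 0.025
scaled_holdout = exposed_users * baseline_rate            # 22,500 expected visits

# Incremental count: net new visits beyond the estimated baseline.
incremental = exposed_engagements - scaled_holdout        # 4,500

# Incrementality percentage: relative lift of the exposed rate over the baseline rate.
exposed_rate = exposed_engagements / exposed_users        # 0.03
lift_pct = (exposed_rate - baseline_rate) / baseline_rate * 100  # 20.0

# CpISE: media spend per incremental site engagement (CpIC works the same
# way with incremental conversions in the denominator).
cpise = media_spend / incremental

print(f"Scaled holdout: {scaled_holdout:,.0f}")
print(f"Incremental engagements: {incremental:,.0f}")
print(f"Lift: {lift_pct:.1f}%")
print(f"CpISE: ${cpise:.2f}")
```

With these figures, the exposed group's 3% visit rate against a 2.5% baseline yields 4,500 net new visits, a 20% lift, and a CpISE of about $11.11.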

Time-series charts

These charts show how performance changes over time. They include a green line that indicates when the data reaches statistical confidence. This marker helps distinguish between early directional trends and confirmed advertising impact.

Statistical confidence indicator

Both charts include a green vertical line that marks when the data reaches statistical confidence. This indicates that the observed difference between exposed and holdout users reflects real advertising impact rather than random variation.

Before the green line appears, results are directional and may change as more data is collected. After the line appears, results are statistically reliable. At that point, you can stop the study and move the holdout audience into the exposed group, or continue running it to improve measurement precision.
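One common way to decide whether an exposed-vs-holdout difference is real rather than random variation is a two-proportion z-test. The article does not specify illumin's significance method, so the sketch below is a generic illustration with made-up figures, not the platform's actual test:

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return (z statistic, two-sided p-value) for the difference in rates."""
    rate_a = successes_a / n_a
    rate_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative counts: 3% exposed engagement rate vs. 2.5% holdout rate.
z, p = two_proportion_z(27_000, 900_000, 2_500, 100_000)
significant = p < 0.05  # conceptually, when the green line would appear
print(f"z = {z:.2f}, p = {p:.4f}, significant: {significant}")
```

As more data accumulates, the standard error shrinks, which is why early results are directional and later results become reliable.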

Exposed vs. Holdout 

This chart compares exposed and scaled holdout performance over time. The x-axis displays campaign flight dates, and the y-axis shows the volume of conversions or site engagements. Use the two buttons to switch between Site engagement and Conversion views.

One line tracks accumulated actions from exposed users served ads. The other line shows the estimated baseline from holdout users who were not served ads, scaled to the full eligible audience. The difference between these lines represents incremental volume, or net new actions driven by the campaign.
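The two accumulating lines can be sketched with running totals. The daily figures below are purely illustrative:

```python
from itertools import accumulate

# Illustrative daily action counts across four flight dates.
daily_exposed = [900, 1_100, 1_000, 1_200]      # actions from exposed users
daily_scaled_holdout = [750, 900, 850, 1_000]   # scaled baseline estimate

# Each chart line is a cumulative sum over the flight dates.
cum_exposed = list(accumulate(daily_exposed))
cum_baseline = list(accumulate(daily_scaled_holdout))

# The vertical gap between the lines on any date is cumulative incremental volume.
incremental_by_day = [e - b for e, b in zip(cum_exposed, cum_baseline)]
print(incremental_by_day)
```

Because both lines accumulate, the gap between them can only grow while exposed users keep out-performing the baseline, which is why incremental volume tends to increase over the flight.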

Incrementality vs. Significance

This chart shows how incrementality changes over time. 

The x-axis shows campaign flight dates, and the y-axis shows incrementality. Use the two buttons to switch between site engagements and conversions.

Hover over the green line to view incrementality at that point in time, along with the margin of error.

Incrementality by campaign

This chart shows incrementality rates for site engagements and conversions for each campaign in the experiment. It highlights how much site engagement or conversion rates differ between the exposed group (users served ads) and the holdout group (users not served ads). This information helps marketers identify campaigns that deliver real incremental value.

Hover over a bar to view incrementality rates for site engagements and conversions at that point in time.

Table view

This table shows incremental volume for each campaign in the study. Use the buttons to switch the view between site engagements, conversions, and combined numbers.

Steps to export data

There are two export options.

  1. To download the entire page as an image, go to the top-right corner of the page and click the Export as button. In the drop-down, select PDF or JPG.
  2. To download the incrementality data in a sheet, go to the bottom-right corner of the page and click the Export as CSV button. 

FAQs

What can marketers do after the study reaches statistical significance?
Once the platform confirms significance, marketers can stop the experiment and move the holdout audience into the exposed group so ads begin serving to those users. The experiment can also continue to collect more data, which improves the precision of the lift measurement and narrows the margin of error.

What is the difference between directional results and statistically reliable results?
Directional results show early performance trends before enough data exists to confirm impact. Statistically reliable results confirm that observed lift comes from advertising rather than random variation.

Why does incremental volume increase over time?
Incremental volume accumulates as exposed users generate more actions during the campaign. The gap between exposed and holdout lines reflects total net new impact.

What is the difference between incremental lift and incremental volume?
Incremental volume shows the total number of net new actions, such as site visits or conversions, driven by advertising. Incremental lift shows the relative percentage increase in performance compared to the holdout baseline, which helps compare impact across campaigns of different sizes.
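The distinction can be made concrete with two hypothetical campaigns of different sizes (all figures invented for illustration):

```python
# name: (incremental volume, scaled holdout baseline) -- illustrative figures
campaigns = {
    "Large campaign": (4_000, 80_000),
    "Small campaign": (1_000, 5_000),
}

# Lift normalizes incremental volume by the baseline, which makes
# campaigns of very different sizes comparable.
lift_by_campaign = {
    name: volume / baseline * 100
    for name, (volume, baseline) in campaigns.items()
}

for name, lift in lift_by_campaign.items():
    volume = campaigns[name][0]
    print(f"{name}: +{volume:,} actions, {lift:.0f}% lift")
```

Here the large campaign drives four times the incremental volume, but the small campaign delivers a much higher lift (20% vs. 5%), so it is relatively more effective at changing behavior.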


Related articles

Launch an audience holdout experiment

How to interpret confidence and margin of error

How incremental site engagement is calculated

How incremental conversions are calculated

How to calculate CpISE

How CpIC is calculated