% OF MAX
The difference, in Score, between a weaker and the strongest Creative in an Experiment.
% OF MIN
The difference, in Score, between a stronger and the weakest Creative in an Experiment.
A Creative or Creatives running within an Optimization Group without an Experiment set up and with no auto-pause based on performance.
Determines how often a Creative is served compared to others. By default, all Creatives are set to identical weights, but the weights can be updated to give one Creative more traffic.
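As an illustrative sketch (the Creative names and weight values here are hypothetical, not part of RevJet's API), serving proportionally to weight can be modeled as a weighted random choice:

```python
import random

# Hypothetical weights: by default every Creative has an identical weight;
# here "Creative C" has been bumped to receive roughly 3x the traffic.
weights = {"Creative A": 1, "Creative B": 1, "Creative C": 3}

def pick_creative() -> str:
    """Select one Creative to serve, proportionally to its weight."""
    names = list(weights)
    return random.choices(names, weights=[weights[n] for n in names], k=1)[0]

# Over many serves, Creative C is chosen about 3/5 of the time.
counts = {name: 0 for name in weights}
for _ in range(10_000):
    counts[pick_creative()] += 1
```

With identical weights, each Creative would be served about equally often; raising one weight shifts traffic toward that Creative without excluding the others.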
The Creative selected as the Control, against which Experiment lift is measured.
The rate at which clicks become conversions. The exact calculation is conversions/clicks*100.
The likelihood that a poorly performing ad will never outperform the current winner, based on the current data set. The lower the Confidence%, the higher the chance of a false positive; but if the Confidence% is too high, the campaign will spend more impressions on losing Creatives and slow the pace of progress. We recommend setting the Confidence% to 90-95%.
The rate at which impressions become clicks. The exact calculation is clicks/impressions*100.
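A minimal sketch of the two rate formulas above; the function and variable names are illustrative, not RevJet's:

```python
def cvr(conversions: int, clicks: int) -> float:
    """Conversion rate: conversions per click, as a percentage."""
    return conversions / clicks * 100 if clicks else 0.0

def ctr(clicks: int, impressions: int) -> float:
    """Clickthrough rate: clicks per impression, as a percentage."""
    return clicks / impressions * 100 if impressions else 0.0

# Example: 1,000,000 impressions, 5,000 clicks, 150 conversions
print(ctr(5_000, 1_000_000))  # 0.5
print(cvr(150, 5_000))        # 3.0
```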
A setup in an Optimization Group for an A/B Test Scenario. Based on thresholds determined by the Optimization Event and Experiment Rules, the system automatically pauses underperforming Creatives until only one remains. When only one Creative is left, the Experiment automatically ends and, from a reporting perspective, is segmented into Phases.
An overview of the changes being made. An Experiment falls into one of two categories: Concept or Tweak. In a Concept Experiment, multiple elements are generally changed and none of the Creatives are similar. A Tweak Experiment has many Creatives with one or two minor updates each, and is meant to capture exactly which element or adjustment is driving better performance.
The specific Element within the Creative that is being updated in a Tweak Experiment. Example: Button, Headline (primary), Background, etc.
The type of adjustment made to an Element in a Tweak Experiment. Example: if a Button was adjusted, was it the color, size, copy, or position?
User or Users who have made changes to the Optimization Group, or who want to receive notifications for a particular Optimization Group.
INSIGHTS
Creative data based on a selected Audience Attribute. A User may view the data within an Experiment as a single Creative’s performance across the values of an Attribute, or as all Creatives’ performance within a single Attribute value.
The difference, in Score, between the Control Creative and the New Winner when the Control was paused.
The LP Set applied to Creatives within this Optimization Group. More on LP Sets can be found in the Media Section.
The User who is responsible for a particular Optimization Group and receives its notifications and alerts, including Experiment updates.
The division of an Experiment into segments based on when any Creative is paused. Experiment metrics are cumulative, so each Phase should contain sufficient data for all Creatives active during that portion of the Experiment.
The testing role each Creative holds as it competes to be the winner. RevJet’s naming convention is either “Challenger X” or “Control.”
The current ranking for the test, determined by each Creative’s conversions per impression. The exact Score calculation is conversions/impressions*1,000,000 (1MM).
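A sketch of the Score calculation, paired with a hypothetical % OF MAX computation relative to the strongest Creative (the exact % OF MAX formula is an assumption here, as are the names and numbers):

```python
def score(conversions: int, impressions: int) -> float:
    """Score: conversions per million impressions."""
    return conversions * 1_000_000 / impressions if impressions else 0.0

scores = {
    "Control": score(150, 1_000_000),       # 150.0
    "Challenger 1": score(120, 1_000_000),  # 120.0
}

# Assumed % OF MAX: each Creative's Score gap from the strongest, as a percent.
best = max(scores.values())
pct_of_max = {name: (best - s) / best * 100 for name, s in scores.items()}
print(pct_of_max)  # {'Control': 0.0, 'Challenger 1': 20.0}
```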
Applied Audience or Targeting data that determines what traffic an Optimization Group is eligible for.