Experimentation Brain Warehouse Based Test Analysis Framework


Document Type: Framework
Status: Active
Authority: Experimentation Brain
Parent: Experimentation Brain Architecture
Applies To: All MWMS experiments requiring accurate performance evaluation, including A/B tests, funnel tests, and conversion experiments
Version: v1.0
Last Reviewed: 2026-04-23


Purpose

The Experimentation Brain Warehouse Based Test Analysis Framework defines how MWMS evaluates experiments using raw data from data warehouse systems.

Standard analytics platforms often use:

• sampling
• estimation
• thresholding
• aggregation

These techniques trade accuracy for speed, and the resulting approximations can distort experiment results.

Warehouse-based analysis ensures:

• precise user counts
• accurate conversion counts
• correct test exposure tracking
• reliable experiment outcomes

This framework ensures MWMS evaluates experiments using the most accurate data available.


Core Principle

Experiment decisions must be based on accurate data.

If analysis uses estimated or incomplete data:

→ test results may be incorrect

If test results are incorrect:

→ optimisation decisions become unsafe

Therefore:

→ raw warehouse data must be used for final experiment evaluation


Position in MWMS System

This framework operates within:

• Experimentation Brain → test analysis and evaluation
• Data Brain → raw data access and validation
• HeadOffice → decision approval

It supports:

• Statistical Confidence Framework
• Data Trust Framework
• Data Decision Gate Framework


Warehouse Based Analysis Definition

Warehouse-based analysis uses:

• event-level data
• raw behavioural logs
• structured query-based extraction

Instead of:

• interface reports
• aggregated dashboards
• sampled data views
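As a rough illustration, the sketch below shows what query-based extraction of exposure events might look like. The table name, column names, and DB-API-style connection are assumptions; substitute the actual warehouse schema and client in use.

```python
# Minimal sketch of query-based, event-level extraction.
# Schema (events table, event_name, test_id, variant, event_ts columns) and
# pyformat parameter binding are assumptions, not a prescribed implementation.
EXPOSURE_QUERY = """
SELECT user_id, session_id, test_id, variant, event_ts AS exposed_at
FROM events
WHERE event_name = 'experiment_exposure'
  AND test_id = %(test_id)s
"""

def fetch_exposures(connection, test_id: str) -> list[tuple]:
    """Pull raw exposure rows for one test straight from the warehouse."""
    with connection.cursor() as cur:
        cur.execute(EXPOSURE_QUERY, {"test_id": test_id})
        return cur.fetchall()
```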


When to Use Warehouse Based Analysis

Warehouse-based analysis is required when:

• evaluating A/B test results
• validating experiment outcomes
• resolving discrepancies between systems
• analysing small performance differences
• making scaling decisions

Interface-level analysis may be used for:

• early monitoring
• trend observation
• directional insight


Experiment Data Requirements

To evaluate an experiment correctly, the following must be captured:


1. Test Exposure Data

• test identifier
• variant assignment
• timestamp of exposure

Purpose:

→ identify which users are part of the experiment


2. User Identification

• unique user ID
• session ID

Purpose:

→ ensure consistent tracking of users


3. Behavioural Events

• page views
• interactions
• funnel progression

Purpose:

→ understand user behaviour


4. Goal Completion Data

• conversion events
• goal events (purchase, lead, etc.)
• timestamps of conversion

Purpose:

→ measure performance
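A minimal sketch of how these four requirements could be represented as records is shown below. Field names are illustrative only, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative record shapes for the four data requirements above.

@dataclass
class Exposure:
    test_id: str          # test identifier
    variant: str          # variant assignment
    user_id: str          # unique user ID
    session_id: str       # session ID
    exposed_at: datetime  # timestamp of exposure

@dataclass
class BehaviouralEvent:
    user_id: str
    session_id: str
    event_name: str       # e.g. page view, interaction, funnel step
    occurred_at: datetime

@dataclass
class Conversion:
    user_id: str
    session_id: str
    goal: str             # e.g. purchase, lead
    converted_at: datetime
```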


🔴 Exposure Before Conversion Rule

Conversions must occur after exposure.

If a conversion occurs before the user is exposed to the test:

→ it must not be counted

This ensures:

→ accurate causal interpretation
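A minimal sketch of applying this rule, assuming one conversion per user and per-user first-exposure timestamps already extracted from the warehouse:

```python
from datetime import datetime

# Hedged sketch: count a conversion only if it happened at or after the
# user's first exposure to the test. Assumes one conversion per user;
# input shapes (user_id -> timestamp mappings) are illustrative.

def countable_conversions(first_exposure_at: dict[str, datetime],
                          conversion_at: dict[str, datetime]) -> set[str]:
    """Return user IDs whose conversion occurred after their exposure."""
    counted = set()
    for user_id, converted in conversion_at.items():
        exposed = first_exposure_at.get(user_id)
        if exposed is not None and converted >= exposed:
            counted.add(user_id)
    return counted
```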


🔴 Session Consistency Rule

Test exposure and conversion must occur within a valid session context.

If:

• exposure and conversion are not linked
• sessions are mismatched

→ attribution becomes unreliable
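One strict interpretation of this linkage check, assuming exposure and conversion rows carry user and session identifiers:

```python
# Hedged sketch: a conversion is treated as attributable only if it carries
# session context and its user has a recorded exposure that also carries
# session context. Row shapes (dicts with user_id / session_id keys) are
# assumptions.

def is_attributable(conversion: dict, exposures: list[dict]) -> bool:
    if not conversion.get("session_id"):
        return False
    return any(
        e.get("user_id") == conversion.get("user_id") and e.get("session_id")
        for e in exposures
    )
```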


🔴 Timestamp Validation Rule

All key events must include timestamps.

Timestamps are required to:

• verify sequence of events
• validate exposure before conversion
• ensure correct event ordering
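A small sketch of these checks, assuming event rows are dicts with an occurred_at field and are supplied in extraction order:

```python
from datetime import datetime

# Hedged sketch of the two checks this rule implies: every key event carries
# a timestamp, and a user's events are in non-decreasing time order.

def validate_timestamps(events: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means the events pass."""
    problems = []
    for i, event in enumerate(events):
        if not isinstance(event.get("occurred_at"), datetime):
            problems.append(f"event {i} is missing a valid timestamp")
    timestamps = [e["occurred_at"] for e in events
                  if isinstance(e.get("occurred_at"), datetime)]
    if timestamps != sorted(timestamps):
        problems.append("events are not in chronological order")
    return problems
```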


Data Extraction Process


Step 1 — Identify Test Users

Extract users who:

• were assigned to a test
• have valid test identifiers
• have variant assignment


Step 2 — Extract Behavioural Data

Extract:

• relevant events
• interaction data
• funnel steps


Step 3 — Extract Goal Data

Extract:

• conversion events
• goal completion data


Step 4 — Join Data Sets

Combine:

• test exposure data
• behavioural data
• goal data

Using:

• user ID
• session ID
• timestamps
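A rough sketch of this join using plain Python dictionaries keyed on user ID. In practice the join would normally run warehouse-side in SQL; the field names below are assumptions.

```python
# Hedged sketch: join exposure, behavioural, and goal data on user_id,
# keeping only users who were actually exposed to the test.

def join_experiment_data(exposures: list[dict],
                         events: list[dict],
                         conversions: list[dict]) -> dict[str, dict]:
    """Return one combined record per exposed user."""
    combined: dict[str, dict] = {}
    for exp in exposures:
        combined[exp["user_id"]] = {
            "variant": exp["variant"],
            "exposed_at": exp["exposed_at"],
            "events": [],
            "conversions": [],
        }
    for ev in events:
        if ev["user_id"] in combined:        # drop events from unexposed users
            combined[ev["user_id"]]["events"].append(ev)
    for conv in conversions:
        if conv["user_id"] in combined:
            combined[conv["user_id"]]["conversions"].append(conv)
    return combined
```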


Step 5 — Validate Data

Confirm:

• no duplicate users
• no duplicate conversions
• correct sequencing
• complete data coverage
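A sketch of how the duplicate and sequencing checks might look over the combined records from Step 4. Coverage checks (comparing against the expected date range and traffic volume) are omitted here because they depend on the specific test.

```python
from collections import Counter

# Hedged sketch of Step 5 checks. Duplicate users are already prevented by
# keying the join on user_id; this adds duplicate-conversion and sequencing
# checks over the assumed combined record shape.

def validate_joined_data(combined: dict[str, dict]) -> list[str]:
    problems = []
    for user_id, record in combined.items():
        goal_counts = Counter(c["goal"] for c in record["conversions"])
        if any(count > 1 for count in goal_counts.values()):
            problems.append(f"user {user_id}: duplicate conversions")
        for conv in record["conversions"]:
            if conv["converted_at"] < record["exposed_at"]:
                problems.append(f"user {user_id}: conversion precedes exposure")
    return problems
```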


Analysis Outputs

Warehouse-based analysis produces:

• users per variant
• conversions per variant
• conversion rates
• performance differences
• segmented performance


🔴 Sample Ratio Mismatch Rule

Variant distribution must be checked.

If:

• users are not distributed according to the intended allocation ratio
• unexpected skew exists

→ results may be invalid
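One common way to perform this check is a chi-square goodness-of-fit test against the intended split. The sketch below covers the two-variant case; the 0.05 significance threshold is an assumption, not a canon value.

```python
# Hedged sketch: sample ratio mismatch check for the common two-variant
# case, using a chi-square goodness-of-fit statistic against the intended
# split. 3.841 is the 0.05 critical value for 1 degree of freedom.

def srm_detected(users_a: int, users_b: int, expected_share_a: float = 0.5) -> bool:
    total = users_a + users_b
    expected_a = total * expected_share_a
    expected_b = total * (1 - expected_share_a)
    chi_square = ((users_a - expected_a) ** 2 / expected_a
                  + (users_b - expected_b) ** 2 / expected_b)
    return chi_square > 3.841  # p < 0.05 => the split looks skewed
```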


🔴 Segmentation Rule

Analysis may include segmentation by:

• device type
• traffic source
• funnel stage

Segmentation must:

• maintain statistical validity
• not distort results
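A sketch of a segment breakdown that only reports segments with enough users to be meaningful. The minimum sample size shown is illustrative; in practice it should come from the Statistical Confidence Framework.

```python
# Hedged sketch: per-segment, per-variant conversion rates, reported only
# when the segment has enough users. Record shape (segment field, variant,
# converted flag) and the threshold are assumptions.

MIN_USERS_PER_SEGMENT = 500  # illustrative threshold, not a canon value

def segment_rates(records: list[dict], segment_key: str) -> dict:
    buckets: dict = {}
    for r in records:
        key = (r[segment_key], r["variant"])
        users, conversions = buckets.get(key, (0, 0))
        buckets[key] = (users + 1, conversions + (1 if r["converted"] else 0))
    return {
        key: {"users": users, "conversion_rate": conversions / users}
        for key, (users, conversions) in buckets.items()
        if users >= MIN_USERS_PER_SEGMENT
    }
```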


🔴 Statistical Integration Rule

Warehouse-based results must be evaluated using:

• statistical confidence
• sample size requirements
• signal stability

Raw data alone is not sufficient.

Statistical validation is still required.
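Purely as an illustration of how raw warehouse counts can feed a significance check, the sketch below applies a textbook two-proportion z-test. This is not the canonical MWMS statistical procedure; thresholds and sample size requirements belong to the Statistical Confidence Framework.

```python
import math

# Hedged sketch: generic two-proportion z-test over raw per-variant counts.

def two_proportion_p_value(conv_a: int, users_a: int,
                           conv_b: int, users_b: int) -> float:
    p_a, p_b = conv_a / users_a, conv_b / users_b
    pooled = (conv_a + conv_b) / (users_a + users_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / users_a + 1 / users_b))
    if se == 0:
        return 1.0
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
```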


🔴 Data Completeness Rule

Analysis must ensure:

• all relevant events are captured
• no missing conversion data
• no missing exposure data

Incomplete data invalidates results.


🔴 Latency Awareness Rule

Recent data may be incomplete.

Examples:

• delayed exports
• data still awaiting backfill

Analysis must:

→ avoid premature conclusions
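One simple safeguard is a maturity window: exclude events newer than a cutoff so that delayed exports and backfills have time to land. The 48-hour window below is an assumption, not a canon value.

```python
from datetime import datetime, timedelta, timezone

# Hedged sketch: drop events newer than the maturity window so that
# late-arriving data cannot skew the analysis. Timestamps are assumed to
# be timezone-aware UTC datetimes under an "occurred_at" key.

MATURITY_WINDOW = timedelta(hours=48)  # illustrative, not a canon value

def mature_events(events: list[dict], now: datetime | None = None) -> list[dict]:
    """Keep only events old enough that late-arriving data is unlikely to change them."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - MATURITY_WINDOW
    return [e for e in events if e["occurred_at"] <= cutoff]
```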


Automation Integration

Warehouse-based analysis should feed into:

• automated dashboards
• reporting systems
• experiment monitoring tools

Automation reduces manual effort and improves consistency.


Relationship to Other Frameworks

Supports:

• Data Brain Raw Data Access Framework
• Data Brain Data Trust Framework
• Experimentation Brain Statistical Confidence Framework
• HeadOffice Data Decision Gate Framework


Failure Modes Prevented

• incorrect test conclusions
• false winners
• false losers
• misinterpreting small differences
• scaling based on inaccurate data


Drift Protection

The system must prevent:

• reliance on interface data for test decisions
• incorrect data joins
• missing exposure tracking
• incorrect sequencing of events


Architectural Intent

This framework ensures MWMS evaluates experiments using:

the most accurate data available

It upgrades experimentation from:

interface-based evaluation → evidence-based evaluation


Final Rule

If experiment accuracy matters:

→ warehouse data must be used


Change Log

Version: v1.0
Date: 2026-04-23
Author: Experimentation Brain

Change:
Initial creation of Warehouse Based Test Analysis Framework defining how MWMS evaluates experiments using raw data.


Change Impact Declaration

Pages Created:
Experimentation Brain Warehouse Based Test Analysis Framework

Pages Updated:
None

Pages Deprecated:
None

Registries Requiring Update:
MWMS Architecture Registry

Canon Version Update Required:
No

Change Log Entry Required:
Yes