Document Type: Framework
Status: Active
Version: v1.5
Authority: MWMS HeadOffice
Parent: Data Brain Canon
Last Reviewed: 2026-04-25
Purpose
The Data Brain Measurement Integrity Framework defines how MWMS protects the reliability, clarity, and interpretability of measurement across its systems.
Measurement integrity ensures that the signals used for:
• optimisation
• experimentation
• attribution
• strategic decision-making
are structurally sound.
The framework improves:
• metric definition clarity
• signal comparability
• interpretation consistency
• experiment reliability
• attribution reliability
• reporting trust
Weak measurement integrity produces false learning.
Strong measurement integrity protects decision quality.
Measurement integrity is the bridge between:
👉 signal capture
and
👉 strategic interpretation
Scope
This framework applies to:
• metric definitions
• signal validity
• session and user definitions
• event-to-metric interpretation
• measurement consistency across systems
• behavioural signal trust
• conversion definition clarity
• modelled vs observed data interpretation
• event validation and tracking accuracy
• duplicate and missing data detection
• tracking stability across system changes
• event timing reliability
• event dependency sequencing reliability
• asynchronous data availability risks
• race condition risks
Core Principle
A metric is only useful if:
• its definition is understood
• its data source is valid
• its implementation is correct
• its behaviour is stable
Misunderstood or incorrectly implemented metrics create false optimisation decisions.
Metrics are not raw truth.
Metrics are outputs of structured event logic.
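The principle that metrics are outputs of event logic, not raw truth, can be made concrete with a minimal sketch. All names here (PageEvent, conversionRate, the "purchase" event) are illustrative, not MWMS definitions: the point is that the metric's definition lives in explicit, reviewable code.

```typescript
// Illustrative sketch: a metric is the output of explicit event logic.
interface PageEvent {
  name: string;      // e.g. "session_start", "purchase"
  sessionId: string;
}

// Definition integrity: the metric's logic is written down, not implied.
// Here "conversion rate" is defined as: sessions containing at least one
// "purchase" event, divided by total distinct sessions.
function conversionRate(events: PageEvent[]): number {
  const sessions = new Set(events.map(e => e.sessionId));
  const converted = new Set(
    events.filter(e => e.name === "purchase").map(e => e.sessionId)
  );
  return sessions.size === 0 ? 0 : converted.size / sessions.size;
}

const events: PageEvent[] = [
  { name: "session_start", sessionId: "a" },
  { name: "purchase", sessionId: "a" },
  { name: "session_start", sessionId: "b" },
];
console.log(conversionRate(events)); // 0.5
```

If the definition changed (say, counting purchase events rather than converted sessions), the same event stream would yield a different number, which is exactly why definition integrity precedes interpretation.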
Measurement Integrity Layers
Measurement integrity operates across three layers:
Definition Integrity
Ensures:
• metrics are clearly defined
• interpretation is consistent
• measurement intent is understood
Event Integrity
Ensures:
• events fire correctly
• event values are accurate
• dependencies are respected
Operational Integrity
Ensures:
• measurement works under real-world conditions
• data is stable over time
• system behaviour does not degrade
⚠️ Most systems fail at Operational Integrity, not Definition Integrity.
🔴 Event Timing Reliability Rule
Events must fire in time to be captured.
Timing failure occurs when:
• user navigates before event completes
• page unload interrupts execution
• event depends on delayed processing
High-risk scenarios include:
• outbound clicks
• button-triggered navigation
• redirects
• file downloads
Events that depend on short execution windows must not be assumed reliable.
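One way to make timing failure observable rather than silent is to bound event dispatch with an explicit window, so a dispatch that cannot complete in time is recorded as unreliable instead of being assumed to have fired. This is a hypothetical sketch: `sendEvent` is a simulated transport, not a real API, and the 50 ms window is an arbitrary example.

```typescript
// Illustrative sketch: race an event dispatch against a timeout so that
// short-execution-window failures are detectable, not assumed away.
function withTimeout<T>(p: Promise<T>, ms: number): Promise<T | "timed_out"> {
  const timeout = new Promise<"timed_out">(resolve =>
    setTimeout(() => resolve("timed_out"), ms)
  );
  return Promise.race([p, timeout]);
}

// Simulated transport: resolves after `latencyMs` (stand-in for a real
// beacon-style call).
function sendEvent(name: string, latencyMs: number): Promise<string> {
  return new Promise(resolve =>
    setTimeout(() => resolve(`sent:${name}`), latencyMs)
  );
}

async function demo() {
  // A fast dispatch completes inside the window...
  const ok = await withTimeout(sendEvent("outbound_click", 10), 50);
  // ...a slow one does not, and must be treated as unreliable.
  const slow = await withTimeout(sendEvent("outbound_click", 200), 50);
  console.log(ok, slow); // "sent:outbound_click" "timed_out"
}
demo();
```

In a real browser context, beacon-style delivery mechanisms exist precisely because normal dispatch cannot be trusted to survive navigation; the sketch only shows the validation pattern.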
🔴 Event Dependency Sequencing Rule
Some events depend on other processes firing first.
Examples:
• configuration tags must fire before events
• user state must exist before event capture
• page context must be set before interaction
If dependencies fail or fire out of sequence:
→ events may fire with incorrect or missing values
Correct sequencing is required for measurement integrity.
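The sequencing requirement can be sketched as a gate: events that arrive before their dependency (here, configuration) are held, and only released once the dependency exists, so they never fire with missing context. `TrackerQueue` and `configure` are hypothetical names for illustration only.

```typescript
// Illustrative sketch: gate event capture behind its dependencies so
// events never fire before configuration exists.
type TrackedEvent = { name: string; configVersion: string };

class TrackerQueue {
  private configVersion: string | null = null;
  private pending: string[] = [];
  readonly captured: TrackedEvent[] = [];

  track(name: string): void {
    if (this.configVersion === null) {
      this.pending.push(name); // dependency not ready: hold, do not fire
      return;
    }
    this.captured.push({ name, configVersion: this.configVersion });
  }

  configure(version: string): void {
    this.configVersion = version;
    // Drain events received before configuration, now with valid context.
    for (const name of this.pending.splice(0)) this.track(name);
  }
}

const q = new TrackerQueue();
q.track("page_view");   // arrives before config: queued, not fired
q.configure("cfg-v2");  // dependency satisfied: queue drains
q.track("cta_click");
console.log(q.captured);
// [{ name: "page_view", configVersion: "cfg-v2" },
//  { name: "cta_click", configVersion: "cfg-v2" }]
```

The design choice here is to prefer delayed-but-correct capture over immediate capture with missing values, which is the trade the sequencing rule implies.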
🔴 Asynchronous Data Availability Rule
Some values are not immediately available at event time.
Examples:
• login state
• external system data
• consent state
• component data
If events fire before data becomes available:
→ measurement integrity is reduced
Events must be validated against real execution timing, not assumed state.
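The difference between assumed state and real execution timing can be shown in a few lines. `getConsent` below is a simulated slow external lookup, not a real API; the contrast is between reading whatever value exists at call time versus awaiting the dependency before firing.

```typescript
// Illustrative sketch: resolve asynchronous context (e.g. consent state)
// before firing the event, instead of reading assumed state at call time.
function getConsent(): Promise<"granted" | "denied"> {
  // Simulated external lookup that takes time to resolve.
  return new Promise(resolve => setTimeout(() => resolve("granted"), 20));
}

let consentNow: "granted" | "denied" | "unknown" = "unknown";

async function fireWithContext(name: string): Promise<string> {
  const consent = await getConsent(); // wait for the real value
  return `${name}:consent=${consent}`;
}

// Firing immediately captures the assumed (stale) state:
const eager = `page_view:consent=${consentNow}`; // "consent=unknown"
// Awaiting the dependency captures the real value:
fireWithContext("page_view").then(e => console.log(eager, "vs", e));
```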
🔴 Race Condition Risk Rule
Where multiple processes compete in time:
→ event reliability must be validated
Examples:
• navigation vs event firing
• config updates vs event capture
• iframe communication vs listener readiness
Race conditions may not occur every time, but intermittent failure still degrades integrity.
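A race condition can be modelled directly as two timers competing: navigation versus event dispatch. The sketch below is illustrative (the latencies are arbitrary); its point is that the outcome depends on relative timing, not on either process being "correct" in isolation, which is why intermittent failure must be validated rather than assumed away.

```typescript
// Illustrative sketch: navigation vs event dispatch as a literal race.
function after<T>(ms: number, value: T): Promise<T> {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

async function outcome(eventLatencyMs: number): Promise<string> {
  const navigation = after(30, "navigated_before_capture"); // page unload
  const dispatch = after(eventLatencyMs, "event_captured"); // event fires
  // Whichever settles first wins; reliability is a function of timing.
  return Promise.race([dispatch, navigation]);
}

async function demo() {
  console.log(await outcome(10));  // "event_captured"
  console.log(await outcome(100)); // "navigated_before_capture"
}
demo();
```

The same code path produces different results at different latencies, which is exactly the intermittent-failure pattern the rule warns about.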
Metric Definition Rule
(UNCHANGED)
Event Dependency Rule
(UNCHANGED — strengthened by sequencing rule)
🔴 Event Validation Rule
(All existing content preserved)
🔴 Duplicate Detection Rule
(All existing content preserved)
🔴 Missing Data Rule
(All existing content preserved)
🔴 Internal Traffic Integrity Rule
(All existing content preserved)
🔴 Source and Attribution Integrity Rule
(All existing content preserved)
Session Definition Discipline
(UNCHANGED)
Engaged Session Discipline
(UNCHANGED)
User Definition Discipline
(UNCHANGED)
Conversion Definition Integrity Rule
(UNCHANGED)
Bounce Interpretation Rule
(UNCHANGED)
Modelled vs Observed Data Rule
(UNCHANGED)
🔴 Cross-System Validation Rule
(UNCHANGED)
Segmentation Integrity Rule
(UNCHANGED)
Time Comparison Rule
(UNCHANGED)
Ratio Consistency Rule
(UNCHANGED)
Source of Truth Rule
(UNCHANGED)
🔴 Measurement Drift Rule
(UNCHANGED)
🔴 Stability Requirement Rule
(UNCHANGED)
Relationship to Other Data Brain Frameworks
• Data Brain Event Reliability Framework
(All others unchanged)
Drift Protection
• event timing failures remaining undetected
• dependency sequencing issues going unnoticed
(All existing content preserved)
Architectural Intent
(UNCHANGED)
Change Log
Version: v1.5
Date: 2026-04-25
Author: MWMS HeadOffice
Change
Refined and clarified operational integrity layer to include:
• explicit race condition risk recognition
• clearer positioning of timing and sequencing risks
• alignment with Data Brain Architecture control layers
Change Impact Declaration
Pages Created:
None
Pages Updated:
Data Brain Measurement Integrity Framework
Pages Deprecated:
None
Registries Requiring Update:
No
Canon Version Update Required:
No
Change Log Entry Required:
Yes