Data Brain Attribution Validation Protocol


Document Type: Protocol
Status: Active
Authority: Data Brain
Parent: Data Brain Architecture
Applies To: All MWMS environments where attribution data is used for optimisation, reporting, or decision-making
Version: v1.0
Last Reviewed: 2026-04-23


Purpose

The Data Brain Attribution Validation Protocol defines the mandatory process for verifying the reliability of attribution data before it is used within MWMS.

This protocol ensures:

• attribution signals are understood
• platform discrepancies are identified
• attribution limitations are accounted for
• invalid attribution does not influence decisions

Attribution without validation produces misleading optimisation direction.


Core Principle

Attribution must be validated before it is trusted.

If attribution is not validated:

→ it must not be used for decision-making


Position in MWMS System

This protocol operates between:

• Data Brain Attribution Reliability Framework
• Data Brain Measurement Validation Protocol
• HeadOffice Data Decision Gate Framework

It determines:

👉 whether attribution signals are usable
👉 whether attribution confidence is acceptable


Attribution Validation Execution Flow

All attribution must follow this process:


Step 1 — Validate Conversion Integrity

Before attribution is evaluated, confirm:

• conversion events fire correctly
• no duplicate conversions exist
• no conversion events are missing
• conversion values are accurate

If conversion integrity fails:

→ attribution validation cannot proceed
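The Step 1 checks can be sketched as a pre-flight routine that halts validation on failure. This is an illustrative sketch only; the event fields (`event_id`, `value`) and the expected-event list are assumptions, not part of the protocol.

```python
from collections import Counter

def check_conversion_integrity(events, expected_ids):
    """Run the Step 1 integrity checks on a list of conversion events.

    Each event is assumed to be a dict with an 'event_id' and a 'value'.
    Returns (passed, issues) so validation can halt when integrity fails.
    """
    issues = []

    # Duplicate conversions: the same event_id reported more than once.
    counts = Counter(e["event_id"] for e in events)
    duplicates = [eid for eid, n in counts.items() if n > 1]
    if duplicates:
        issues.append(f"duplicate conversions: {duplicates}")

    # Missing conversion events: expected IDs absent from the feed.
    missing = set(expected_ids) - set(counts)
    if missing:
        issues.append(f"missing conversions: {sorted(missing)}")

    # Value accuracy: conversion values must be present and non-negative.
    bad_values = [e["event_id"] for e in events
                  if not isinstance(e.get("value"), (int, float)) or e["value"] < 0]
    if bad_values:
        issues.append(f"invalid values: {bad_values}")

    return (not issues), issues
```

If `passed` is false, Step 2 onwards must not run, mirroring the rule above.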


Step 2 — Identify Attribution Sources

Determine all sources reporting attribution:

• GA4
• Google Ads
• other ad platforms
• backend / CRM systems

Understanding sources is required before comparison.


Step 3 — Compare Cross-Platform Attribution

Compare:

• conversion counts
• channel contribution
• campaign performance

Expected outcome:

• directional alignment, not exact match
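One way to operationalise "directional alignment, not exact match" is a relative-difference tolerance around the cross-platform mean. A minimal sketch; the 20% tolerance is an assumption for illustration, not a threshold defined by this protocol.

```python
def directionally_aligned(counts_by_platform, tolerance=0.20):
    """Check that per-platform conversion counts agree directionally.

    counts_by_platform maps platform name -> conversion count.
    Platforms are treated as aligned when every count sits within
    `tolerance` relative difference of the cross-platform mean.
    """
    values = list(counts_by_platform.values())
    mean = sum(values) / len(values)
    if mean == 0:
        # All-zero feeds trivially agree; any nonzero count disagrees.
        return all(v == 0 for v in values)
    return all(abs(v - mean) / mean <= tolerance for v in values)
```

For example, `{"ga4": 95, "google_ads": 110, "crm": 100}` aligns directionally, while `{"ga4": 50, "google_ads": 120}` does not and would move to Step 4.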


Step 4 — Identify Discrepancies

Check for:

• missing conversions in one system
• inflated conversions in another
• channel contribution differences
• inconsistent campaign results


Step 5 — Diagnose Discrepancy Causes

Common causes:

• attribution window differences
• attribution model differences
• platform bias
• tracking gaps
• consent restrictions
• cross-device fragmentation


Step 6 — Validate Attribution Inputs

Confirm:

• UTMs structured correctly
• source/medium assigned correctly
• no unwanted referrals
• campaign naming consistent
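The Step 6 input checks lend themselves to automation. A sketch for landing-page URLs; the lowercase/underscore naming convention encoded in the regex is an assumed house style, not a rule this protocol defines.

```python
import re
from urllib.parse import urlparse, parse_qs

# Assumed naming convention: lowercase tokens joined by underscores or hyphens.
NAMING_PATTERN = re.compile(r"^[a-z0-9]+([_-][a-z0-9]+)*$")
REQUIRED_UTMS = ("utm_source", "utm_medium", "utm_campaign")

def validate_utm(url):
    """Check a landing-page URL against the Step 6 input conditions.

    Returns a list of problems; an empty list means the UTMs pass.
    """
    params = parse_qs(urlparse(url).query)
    problems = []
    for key in REQUIRED_UTMS:
        values = params.get(key, [])
        if not values:
            problems.append(f"{key} missing")
        elif not NAMING_PATTERN.match(values[0]):
            problems.append(f"{key} violates naming convention: {values[0]!r}")
    return problems
```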


Step 7 — Evaluate Attribution Model

Confirm:

• the attribution model in use is identified (last-click, DDA, etc.)
• the model's limitations are understood
• results are compared across models where relevant


Step 8 — Assess Attribution Stability

Check:

• consistency over time
• no sudden unexplained shifts
• repeatability of contribution patterns
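"Consistency over time" can be approximated with the coefficient of variation of a channel's contribution share across periods. A sketch under assumptions: the 0.15 ceiling and the use of weekly shares are illustrative choices, not protocol values.

```python
from statistics import mean, pstdev

def is_stable(shares, max_cv=0.15):
    """Assess stability of a channel's contribution share over time.

    `shares` is a time series (e.g. weekly share of conversions for one
    channel). The series is treated as stable when its coefficient of
    variation (stdev / mean) stays at or below `max_cv`.
    """
    if len(shares) < 2 or mean(shares) == 0:
        return False  # not enough history to call the pattern repeatable
    return pstdev(shares) / mean(shares) <= max_cv
```

A flat series such as `[0.30, 0.31, 0.29, 0.30]` passes; a series with sudden unexplained shifts such as `[0.10, 0.40, 0.15, 0.45]` fails.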


Step 9 — Assign Attribution Confidence Level


High Confidence Attribution

Conditions:

• consistent across platforms
• validated conversion tracking
• stable patterns over time

→ Safe for decision-making


Medium Confidence Attribution

Conditions:

• minor discrepancies
• known limitations
• directional alignment present

→ Use with caution


Low Confidence Attribution

Conditions:

• major discrepancies
• unstable patterns
• incomplete validation

→ Do not use for decisions


Invalid Attribution

Conditions:

• broken tracking
• duplicate conversions
• missing events
• severe platform conflicts

→ Must not be used
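The four levels above can be encoded as a simple classifier. The boolean inputs are stand-ins for the outcomes of Steps 1–8; this is a sketch of the decision logic, not a prescribed implementation.

```python
from enum import Enum

class Confidence(Enum):
    HIGH = "safe for decision-making"
    MEDIUM = "use with caution"
    LOW = "do not use for decisions"
    INVALID = "must not be used"

def assign_confidence(tracking_valid, cross_platform_consistent, stable,
                      validation_complete):
    """Map the Step 9 conditions onto a confidence level.

    Broken tracking short-circuits everything, mirroring the Invalid
    Attribution rule; incomplete validation caps confidence at LOW.
    """
    if not tracking_valid:
        return Confidence.INVALID
    if not validation_complete:
        return Confidence.LOW
    if cross_platform_consistent and stable:
        return Confidence.HIGH
    if cross_platform_consistent or stable:
        return Confidence.MEDIUM
    return Confidence.LOW
```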


Attribution Approval Rule

Attribution is approved only when:

• conversion integrity confirmed
• discrepancies understood
• attribution model limitations acknowledged
• confidence level acceptable

If any condition fails:

→ attribution is not approved
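The approval rule is all-or-nothing, which a gate function makes explicit. The parameter names simply mirror the four conditions above.

```python
def attribution_approved(conversion_integrity_confirmed,
                         discrepancies_understood,
                         model_limitations_acknowledged,
                         confidence_acceptable):
    """Approve attribution only when every condition holds."""
    return all([conversion_integrity_confirmed,
                discrepancies_understood,
                model_limitations_acknowledged,
                confidence_acceptable])
```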


Validation Triggers

Attribution validation must be performed when:

• new campaigns launched
• tracking changes made
• discrepancies detected
• scaling decisions planned
• audit identifies issues


Common Attribution Failure Patterns

This protocol detects:

• GA4 vs Ads mismatch
• duplicate conversions inflating Ads data
• missing conversions in analytics
• attribution model distortion
• incorrect UTMs
• platform bias affecting results


🔴 Attribution Conflict Rule

When platforms disagree:

→ no platform is automatically correct

Required action:

• investigate
• identify cause
• assign confidence level


🔴 Attribution Usage Rule

Attribution may be used for:

• directional optimisation
• channel prioritisation
• budget allocation guidance

Attribution must NOT be used for:

• causal conclusions
• absolute performance truth
• decisions without validation


Relationship to Other Frameworks

This protocol supports:

• Data Brain Attribution Reliability Framework
• Data Brain Measurement Validation Protocol
• Data Brain Data Trust Framework
• HeadOffice Data Decision Gate Framework
• Experimentation Brain Statistical Confidence Framework


Key Outcomes

When applied correctly:

• attribution becomes reliable
• platform bias is controlled
• discrepancies are understood
• decision-making improves
• optimisation direction becomes safer


Change Log

Version: v1.0
Date: 2026-04-23
Author: Data Brain

Change:
Initial creation of Attribution Validation Protocol defining structured process for validating attribution signals across platforms.


Change Impact Declaration

Pages Created:
Data Brain Attribution Validation Protocol

Pages Updated:
None

Pages Deprecated:
None

Registries Requiring Update:
MWMS Architecture Registry

Canon Version Update Required:
No

Change Log Entry Required:
Yes