Data Brain Custom Exploration Design Framework

Document Type: Framework
Status: Draft
Authority: Data Brain
Applies To: Data Brain, Research Brain, Ads Brain, Experimentation Brain, Affiliate Brain, Conversion Brain
Parent: Data Brain
Version: v1.1
Last Reviewed: 2026-04-22


Purpose

The Data Brain Custom Exploration Design Framework defines how MWMS constructs investigative analysis views when standard dashboards do not provide sufficient insight depth.

Predefined reports often present aggregated metrics that hide behavioural detail.

Custom explorations enable:

deeper segmentation
behavioural pathway analysis
signal isolation
hypothesis investigation
anomaly detection
multi-dimensional interpretation
test insight extraction
multi-touch pathway visibility

This framework ensures MWMS can generate targeted analytical views that support decision-making clarity.

Exploration capability improves the interpretability of complex behavioural environments.


Core Principle

Standard dashboards provide summary visibility.

Explorations provide diagnostic insight.

Meaningful interpretation often requires flexible analysis structures.

MWMS must maintain the capability to create structured investigative views beyond fixed report formats.

Explorations allow signals to be examined from multiple structural perspectives.

Different analytical perspectives reveal different behavioural insights.


Definition

Custom exploration refers to the structured assembly of:

dimensions
metrics
parameters
filters

to analyse behavioural signals from multiple perspectives.

Explorations enable the system to:

combine multiple signal layers
test hypotheses
isolate behavioural patterns
identify signal relationships
evaluate decision progression
observe attribution pathway influence
compare segment-level signal variation

Explorations provide structured flexibility in analysis environments.

Explorations improve interpretive depth.


Exploration Components

Explorations are constructed from structured signal layers.


Dimensions

Dimensions represent descriptive attributes of behavioural signals.

Examples:

page location
traffic source
device category
campaign identifier
content classification
geography
channel grouping
offer identifier
user segment

Dimensions define how behaviour is categorised.

Dimensions enable segmentation.

Dimensions create analytical structure.


Metrics

Metrics represent quantitative measurement values.

Examples:

event count
active users
total users
conversion count
revenue
engagement duration
progression ratios

Metrics quantify behavioural signals.

Metrics represent observable signal magnitude.

Metrics require context for meaningful interpretation.


Parameters

Parameters describe event-level contextual attributes.

Examples:

scroll depth percentage
content category
interaction location
product identifier
funnel stage identifier

Parameters improve signal resolution.

Parameters support deeper segmentation logic.

Parameters increase diagnostic precision.
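The role of parameters can be sketched as follows. This is a minimal illustration, assuming a simple event record shape; the field names (`name`, `params`, `scroll_depth_pct`) are hypothetical, not a defined MWMS schema.

```python
# Hypothetical event records carrying a parameter payload.
events = [
    {"name": "scroll", "params": {"scroll_depth_pct": 90, "content_category": "guide"}},
    {"name": "scroll", "params": {"scroll_depth_pct": 25, "content_category": "guide"}},
    {"name": "scroll", "params": {"scroll_depth_pct": 75, "content_category": "news"}},
]

def deep_scroll_share(events, threshold=75):
    """Share of scroll events at or beyond a depth threshold.

    Segmenting on an event-level parameter (scroll depth) gives finer
    resolution than the event count alone.
    """
    scrolls = [e for e in events if e["name"] == "scroll"]
    if not scrolls:
        return None  # no scroll signal observed
    deep = sum(1 for e in scrolls if e["params"]["scroll_depth_pct"] >= threshold)
    return deep / len(scrolls)

print(deep_scroll_share(events))
```

The same event type yields different diagnostic meaning once its parameters are used to split the signal.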


Filters

Filters restrict analysis scope to specific conditions.

Examples:

specific event name
traffic source
campaign identifier
behavioural stage
device category
geography

Filters isolate signal segments for deeper analysis.

Filters reduce noise.

Filters improve interpretability.


Exploration Structure Logic

Explorations typically combine:

rows (dimension categories)
columns (dimension segmentation layers)
values (metrics)
filters (scope constraints)

Example structure:

rows
traffic source

columns
device category

values
conversion count

filter
specific event type

This structure reveals behavioural differences across segments.

Multi-layer segmentation reveals relationships hidden in aggregated reports.
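The rows / columns / values / filter structure above can be sketched as a small cross-tabulation. This is an illustrative sketch only; the event records and field names (`event`, `traffic_source`, `device`) are assumptions, not a defined MWMS schema.

```python
from collections import defaultdict

# Hypothetical event records; field names are illustrative assumptions.
events = [
    {"event": "purchase",  "traffic_source": "organic", "device": "desktop"},
    {"event": "purchase",  "traffic_source": "organic", "device": "mobile"},
    {"event": "purchase",  "traffic_source": "paid",    "device": "mobile"},
    {"event": "page_view", "traffic_source": "paid",    "device": "desktop"},
]

def explore(events, row_dim, col_dim, event_filter):
    """Cross-tabulate event counts: rows x columns, scoped by a filter."""
    table = defaultdict(lambda: defaultdict(int))
    for e in events:
        if e["event"] == event_filter:          # filter: scope constraint
            table[e[row_dim]][e[col_dim]] += 1  # value: event count
    return {row: dict(cols) for row, cols in table.items()}

print(explore(events, "traffic_source", "device", "purchase"))
# {'organic': {'desktop': 1, 'mobile': 1}, 'paid': {'mobile': 1}}
```

Swapping `row_dim` and `col_dim`, or changing `event_filter`, produces a different structural perspective on the same underlying signals.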


Exploration Use Cases

Signal Isolation

Identify behavioural differences across traffic segments.

Example:

conversion behaviour by device category

identifies performance differences between desktop and mobile traffic.

Signal isolation improves diagnostic precision.


Behavioural Pattern Analysis

Evaluate progression relationships across event tiers.

Example:

CTA clicks relative to landing page views

reveals persuasion effectiveness.

Behavioural ratios reveal friction and alignment patterns.
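A progression ratio of this kind is a simple quotient of a downstream event count over an upstream event count. The counts below are assumed for illustration only.

```python
def progression_ratio(downstream_count, upstream_count):
    """Ratio of a downstream event count to an upstream event count."""
    if upstream_count == 0:
        return None  # no upstream signal: ratio undefined
    return downstream_count / upstream_count

# Illustrative counts (assumed, not real data):
cta_clicks, landing_views = 180, 1200
print(progression_ratio(cta_clicks, landing_views))  # 0.15
```

A ratio of 0.15 here would mean roughly 15% of landing page views progress to a CTA click; movement in that ratio, not its absolute level, is usually the diagnostic signal.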


Funnel Diagnostics

Analyse drop-off between sequential behavioural steps.

Example:

checkout start vs purchase completion

identifies friction concentration.

Funnel exploration reveals structural weaknesses.
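Step-to-step drop-off can be computed directly from ordered funnel counts. The step names and counts below are assumptions for illustration.

```python
def funnel_dropoff(step_counts):
    """Step-to-step drop-off rates for an ordered funnel.

    step_counts: list of (step_name, count) tuples in funnel order.
    Returns the fraction lost between each consecutive pair of steps.
    """
    dropoffs = {}
    for (prev_name, prev), (name, curr) in zip(step_counts, step_counts[1:]):
        lost = 0.0 if prev == 0 else 1 - curr / prev
        dropoffs[f"{prev_name} -> {name}"] = round(lost, 3)
    return dropoffs

# Illustrative funnel counts (assumed):
steps = [("checkout_start", 500), ("payment_info", 320), ("purchase", 240)]
print(funnel_dropoff(steps))
# {'checkout_start -> payment_info': 0.36, 'payment_info -> purchase': 0.25}
```

The step with the largest drop-off fraction indicates where friction is most concentrated.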


Traffic Quality Comparison

Evaluate behavioural strength across acquisition sources.

Example:

conversion behaviour across organic vs paid traffic

reveals source quality differences.

Behavioural variation reveals audience alignment differences.


Attribution Pathway Analysis

Evaluate multi-touch influence patterns.

Example:

interaction sequences across channels prior to conversion

reveals assisted influence contribution.

Multi-touch interpretation improves channel contribution clarity.


Experiment Interpretation Support

Support experiment result analysis through segmented behaviour evaluation.

Example:

conversion rate differences across creative variants

provides insight into persuasion performance.

Segment-level interpretation improves test learning depth.


Multi-Dimensional Analysis Principle

Single-dimension reports often hide behavioural relationships.

Combining dimensions enables deeper interpretation.

Example:

traffic source + device category + conversion rate

reveals performance variations across behavioural environments.

Multi-dimensional segmentation strengthens insight clarity.

Multiple segmentation layers improve signal interpretability.
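Combining dimensions amounts to grouping on a composite key. A minimal sketch, assuming session records with illustrative field names (`source`, `device`, `converted`):

```python
from collections import defaultdict

# Assumed session records; fields are illustrative only.
sessions = [
    {"source": "organic", "device": "desktop", "converted": True},
    {"source": "organic", "device": "mobile",  "converted": False},
    {"source": "paid",    "device": "mobile",  "converted": True},
    {"source": "paid",    "device": "mobile",  "converted": False},
]

def conversion_rate_by(sessions, *dims):
    """Conversion rate per combination of the given dimensions."""
    totals = defaultdict(lambda: [0, 0])  # key -> [conversions, sessions]
    for s in sessions:
        key = tuple(s[d] for d in dims)   # composite dimension key
        totals[key][0] += s["converted"]
        totals[key][1] += 1
    return {key: conv / n for key, (conv, n) in totals.items()}

print(conversion_rate_by(sessions, "source", "device"))
# {('organic', 'desktop'): 1.0, ('organic', 'mobile'): 0.0, ('paid', 'mobile'): 0.5}
```

A single-dimension view (source only, or device only) would average away the variation that the composite key exposes.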


Exploration Design Heuristics

Heuristic 1

Start with the behavioural question.

Define the insight objective before constructing exploration structure.

Example questions:

Which traffic sources produce the strongest intent signals?

Where does friction occur in funnel progression?

Which device categories produce the highest conversion probability?

Which channels contribute to multi-touch pathways?

A clear question improves the clarity of the analysis.


Heuristic 2

Limit dimension complexity initially.

Begin with simple structures and add complexity gradually.

Complex structures without a clear question produce noise.

Incremental structure improves interpretability.


Heuristic 3

Ensure metric relevance aligns with question.

Example:

use conversion count when evaluating outcomes

use event count when evaluating engagement

use progression ratios when evaluating behavioural flow

Metrics must align with the behavioural stage under analysis.

Metric misuse reduces interpretability.


Heuristic 4

Use filters to isolate meaningful signal subsets.

Filtering reduces noise and improves interpretability.

Example:

filter to specific campaign

filter to specific event

filter to specific funnel stage

Filtering improves diagnostic clarity.


Heuristic 5

Compare multiple segment perspectives.

Changing dimension order can reveal hidden relationships.

Example:

traffic source first vs device category first

may surface different insights.

Perspective changes improve insight discovery.


Exploration Limitations Awareness

Exploratory analysis environments may contain structural constraints.

Examples:

limited parameter visibility
inconsistent dimension availability
incomplete historical comparability
interface structural constraints
sampling behaviour
platform modelling effects
privacy-related signal loss

Exploration results must be interpreted with awareness of data limitations.

Exploration outputs may include inferred signals.

Signal confidence may vary across dimensions.

Interpretation must remain cautious.


Exploration Interpretation Cautions

Exploration outputs should not be interpreted without context.

Segment-level variation does not automatically indicate causal relationship.

Patterns require validation through structured testing.

Exploration findings should feed:

hypothesis generation
test design refinement
signal interpretation validation

Explorations support insight discovery but do not replace controlled experimentation.

Exploratory insight ≠ causal proof.

Exploration supports learning, not confirmation.


Relationship to Other MWMS Frameworks

Supports:

Research Brain Behavioural Event Analysis Framework
Experimentation Brain Test Design Framework
Data Brain Signal Flow Framework
Data Brain Measurement Integrity Framework
Data Brain Attribution Reliability Framework
Ads Brain Performance Interpretation structures
Conversion Brain Funnel Optimization structures

Explorations enable deeper signal interpretation across system intelligence layers.

Exploration capability improves cross-brain insight alignment.


Governance Notes

Exploratory analysis should be guided by structured questions.

Unstructured exploration increases noise and reduces clarity.

Exploration discipline improves insight quality.

MWMS should prioritise hypothesis-driven exploration.

Exploration must support decision clarity rather than curiosity alone.


Change Log

Version: v1.1
Date: 2026-04-22
Author: Data Brain

Change:

Added attribution pathway analysis capability.

Clarified relationship between parameters, dimensions, metrics, and events.

Strengthened compatibility with Attribution Reliability Framework.

Improved interpretation guidance regarding exploratory vs causal insight.

Expanded support for multi-touch pathway analysis.

Improved cross-brain interpretability alignment.