Data Assessment

Your data is already there. Find out what it’s actually telling you.

A complete map of your data sources, quality issues, analytics gaps, and AI readiness — with a sequenced investment plan for closing each one.

  • 21 days from kickoff to final readout
  • 7 areas assessed: sources, pipelines, quality, analytics, governance, AI readiness, team capability
  • 3 deliverables, including a board-ready executive summary
  • Fixed price agreed at engagement start, no hourly billing
How it starts

What the kickoff conversation looks like.

A 90-minute structured call with the right people in the room. By the end, we have what we need to run the full assessment independently.

The kickoff brings together your data or analytics lead, an IT contact who understands where data lives, and a senior business stakeholder from finance or operations. Forty-eight hours before, we send a pre-work packet: a data source list, examples of reports leadership actually uses, and a prompt on recent decisions where better data would have changed the outcome.

The kickoff covers the decisions data should be informing but isn’t, who uses data today and how, what analytics tools and reports have been built and abandoned, and where trust in the numbers breaks down and with which audience.

Two questions tend to surface issues quickly: which revenue number does the board use, which does finance use, and are they the same? And where does the data team spend most of its time: how much of it is cleaning versus analyzing?

Who should be in the kickoff

  • Data, analytics, or BI lead — or whoever owns the reports
  • IT contact who knows where data is stored and how it flows
  • A senior business stakeholder who makes decisions using data
What we examine

Seven areas, every one assessed.

The Data Assessment covers the full data environment — not just the tools, but the quality, governance, and organizational capacity to use what exists.

01

Data sources

Every system that generates or stores data the organization depends on — named, described, and assessed for reliability and accessibility.

  • Full data source inventory
  • Data ownership and custodianship
  • Source reliability and update frequency
  • Shadow data and spreadsheet systems
02

Pipelines & integration

How data moves between systems, where it is transformed, where it breaks, and where manual intervention fills the gaps.

  • Data flow mapping between systems
  • ETL and integration health
  • Manual data transfer identification
  • Latency and freshness assessment
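As a rough illustration of what a latency and freshness check formalizes, here is a minimal sketch. The system names and update cadences are hypothetical examples, not a fixed schema:

```python
from datetime import datetime, timedelta

# Illustrative source inventory: expected update cadence per system.
# Names and cadences are hypothetical, not part of any real schema.
EXPECTED_CADENCE = {
    "crm_contacts": timedelta(days=1),
    "erp_orders": timedelta(hours=6),
    "finance_gl": timedelta(days=30),
}

def stale_sources(last_updated, now):
    """Return sources whose latest data is older than the expected cadence."""
    return sorted(
        name for name, cadence in EXPECTED_CADENCE.items()
        if now - last_updated[name] > cadence
    )

now = datetime(2024, 6, 1, 12, 0)
last_updated = {
    "crm_contacts": datetime(2024, 5, 31, 9, 0),  # ~27h old vs. 24h cadence
    "erp_orders": datetime(2024, 6, 1, 10, 0),    # 2h old, within cadence
    "finance_gl": datetime(2024, 4, 20),          # ~42 days vs. 30-day cadence
}
print(stale_sources(last_updated, now))  # ['crm_contacts', 'finance_gl']
```

In practice the cadence table comes out of the source inventory itself: each system's owner states how often the data should refresh, and the check flags where reality falls behind.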
03

Data quality

Accuracy, completeness, consistency, and timeliness assessed by domain. Quality issues scored by business impact, not by how messy the data looks.

  • Quality assessment by domain and field
  • Duplicate and orphan record analysis
  • Definition consistency across systems
  • Quality issue business impact scoring
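The duplicate and orphan checks above can be sketched in a few lines. The record shapes and field names below are illustrative assumptions, not a description of any particular system:

```python
# Hypothetical customer and order records; field names are illustrative.
customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
    {"id": 3, "email": "a@example.com"},  # same email as id 1: a duplicate
]
orders = [
    {"id": 101, "customer_id": 1},
    {"id": 102, "customer_id": 9},  # orphan: no customer with id 9
]

def duplicate_keys(rows, key):
    """Return values of `key` that appear in more than one row (normalized)."""
    seen, dupes = set(), set()
    for row in rows:
        value = row[key].strip().lower()
        (dupes if value in seen else seen).add(value)
    return dupes

def orphan_rows(child_rows, parent_rows, fk, pk="id"):
    """Return child rows whose foreign key points at no parent row."""
    parent_ids = {row[pk] for row in parent_rows}
    return [row for row in child_rows if row[fk] not in parent_ids]

print(duplicate_keys(customers, "email"))             # {'a@example.com'}
print(orphan_rows(orders, customers, "customer_id"))  # [{'id': 102, 'customer_id': 9}]
```

The assessment's version of this is run per domain and per field, with each finding then scored by business impact rather than raw count.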
04

Analytics landscape

Every reporting tool in use, who uses it, what it measures, and whether it is trusted. Tool proliferation and the analytics graveyard are mapped explicitly.

  • Full BI and reporting tool inventory
  • Usage and adoption by tool and report
  • Abandoned analytics work documentation
  • Trust levels by stakeholder group
05

Governance & definitions

Who owns what data, how definitions are managed, and whether finance and ops agree on the numbers when it matters.

  • Data ownership and stewardship mapping
  • Metric and definition consistency
  • Source of truth identification
  • Data access and permission structure
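A definition consistency check of the kind described above can be sketched as follows; the system names, metrics, and tolerance are hypothetical:

```python
# Illustrative example: the same metric as reported by different systems.
# System names, metrics, and the 1% tolerance are assumptions for the sketch.
reported = {
    "monthly_revenue": {
        "finance_erp": 1_204_000,
        "board_deck": 1_250_000,
        "crm": 1_204_000,
    },
    "active_customers": {"crm": 842, "support_tool": 842},
}

def inconsistent_metrics(reported, tolerance=0.01):
    """Flag metrics whose reported values differ by more than `tolerance`."""
    flagged = {}
    for metric, values in reported.items():
        low, high = min(values.values()), max(values.values())
        if high and (high - low) / high > tolerance:
            flagged[metric] = values
    return flagged

print(sorted(inconsistent_metrics(reported)))  # ['monthly_revenue']
```

The hard part is not the comparison but the follow-up: deciding which source is authoritative for each flagged metric, which is what the source-of-truth work in this area resolves.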
06

AI & ML readiness

Whether the data foundation exists to support the AI use cases leadership is asking about. Scored by use case, not by aspiration.

  • Use case identification and scoping
  • Data readiness score per use case
  • Foundation gaps blocking AI deployment
  • Realistic sequencing for AI capability
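Per-use-case readiness scoring can be as simple as a rubric averaged onto a 0 to 100 scale. The criteria below are purely illustrative; a real assessment weights them by what each use case actually requires:

```python
# Hypothetical readiness rubric: each use case scored 0-3 on a few
# data foundation criteria. Criterion names are illustrative only.
CRITERIA = ["history_depth", "quality", "labeling", "access"]

def readiness_score(scores):
    """Average the 0-3 criterion scores onto a 0-100 scale."""
    return round(sum(scores[c] for c in CRITERIA) / (3 * len(CRITERIA)) * 100)

# Example: a churn-prediction use case with weak data quality.
churn_model = {"history_depth": 3, "quality": 1, "labeling": 2, "access": 3}
print(readiness_score(churn_model))  # 75
```

Scoring by use case rather than in aggregate is what makes the output actionable: a low score points directly at the foundation gap blocking that specific deployment.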
07

Team capability

Who does what with data, what is manual, where the bottlenecks are, and how much of the data team’s time is spent cleaning rather than analyzing.

  • Data role and responsibility mapping
  • Manual vs. automated work breakdown
  • Key-person dependency assessment
  • Capacity vs. demand analysis
How it runs

Three weeks, no disruption.

We work from documentation and structured interviews. No systems accessed, no work disrupted.

Week 1

Discovery

Kickoff, stakeholder interviews, data source documentation, sample report collection, analytics landscape walkthrough.

  • Kickoff call (90 min)
  • 2–3 stakeholder interviews (30 min each)
  • Data source inventory review
  • Sample report and dashboard collection
  • Analytics tool inventory
Week 2

Analysis

Every data source, quality issue, governance gap, and AI use case assessed and prioritized by business impact.

  • Quality assessment by domain
  • Analytics maturity scoring
  • AI readiness scoring by use case
  • Governance gap identification
  • Investment sequencing
Week 3

Readout

Three documents and a 90-minute readout. The executive summary is written for the board and ready to circulate the day it arrives.

  • All three documents delivered
  • Readout call with leadership (90 min)
  • Q&A and next-step discussion
  • 30-day follow-up window included
What you get

Three deliverables, built for action.

Written for the right audience at every level, from the board to the person who will implement the roadmap.

01

Executive Summary

Two to four pages for senior leadership and the board. The headline data gaps and the top three investments that would change how decisions get made.

  • Top 3 data gaps in plain language
  • Investments that change decision quality
  • AI readiness headline: what’s possible now
  • Board-ready from delivery
02

Data Findings Report

Full picture for data, analytics, and IT leadership. Every source assessed, every quality issue documented, analytics maturity and AI readiness scored.

  • Data source inventory and assessment
  • Quality findings by domain
  • Analytics maturity score and gap analysis
  • AI/ML readiness score by use case
  • Governance gap documentation
03

Prioritized Data Roadmap

Foundation work first, then the analytics layer, then advanced use cases. Each action has a timeline, estimated cost range, and the business outcome it unlocks.

  • Foundation: quality and integration fixes
  • Analytics layer: reporting and dashboards
  • Advanced: AI/ML use case sequencing
  • Cost range and timeline per action
What typically comes up

What organizations discover.

Most organizations recognize at least three of these before the readout call is over.

Multiple reports with different numbers for the same metric.
Finance uses one revenue figure, marketing uses another, operations a third. The assessment identifies which source is right and how to establish a single agreed source of truth.

Data that exists but has never been used for decisions.
Usually because it was never cleaned or connected to reporting systems. The data has been there for years. The investment to make it useful is smaller than expected.

An analytics graveyard: tools purchased, partially built, abandoned.
Most organizations have tried to build reporting at least once and stopped. The assessment documents what exists so the next investment doesn’t repeat the same mistakes.

AI use cases the board is asking about that the data can’t support yet.
The current data foundation is usually 12 to 24 months behind what leadership wants to deploy. The assessment makes that gap explicit and sequences the foundation work required to close it.
Common questions

What organizations ask before starting.

Do you need access to our databases or data systems?
No. We work from documentation, sample reports, data dictionaries, and structured stakeholder interviews. No credentials required. No systems accessed directly.
What if we don’t have a data team?
That is precisely what this surfaces. Many organizations have data spread across systems with no dedicated ownership. The assessment tells you what a realistic first investment looks like given your current staff capacity and budget.
What if our data is a complete mess?
That is the most common reason to run a data assessment, not a reason to wait. The assessment is designed to work with incomplete, inconsistent, and fragmented data environments. The findings tell you exactly what to fix first.
Is this the same as a data audit?
No. A data audit is backward-looking and compliance-focused. The Data Assessment is forward-looking: what data foundation do you need, what exists today, and what should you build first and why.
What if leadership doesn’t trust our data?
That is the most common starting condition. The assessment provides an external, objective read on why trust has broken down — usually a combination of quality issues, definition inconsistency, and tool fragmentation — and a concrete path to rebuilding it.

Find out what your data is actually telling you — and what’s missing.

Fill in a few details. We’ll confirm scope and price before anything starts.

Other assessment types

Not sure data is the right starting point?

Technology Assessment

A complete picture of your technology stack in 21 days.

Maps every system, vendor, contract, and dollar of technology spend. The right starting point when systems or vendors are the primary concern.

See the Technology Assessment →
Combined Assessment

Technology and data problems are usually the same problem.

Runs both tracks in parallel and surfaces the dependencies between them. Four deliverables, one integrated roadmap, same 21-day timeline.

See the Combined Assessment →