Combined Assessment

Technology and data problems are usually the same problem.

The Combined Assessment runs both tracks in parallel, finds the dependencies between them, and delivers an integrated roadmap that sequences investments by cross-domain business impact.

  • 21 days, the same timeline as a standalone assessment
  • 2 parallel tracks: technology and data, run simultaneously
  • 4 deliverables, including an integrated executive summary
  • Fixed price agreed at engagement start, no hourly billing
Why combined

The problems that look separate rarely are.

Running two separate assessments misses the connections. The combined engagement finds them explicitly and sequences the roadmap to take advantage of them.

Most organizations experience technology problems and data problems as distinct issues managed by different teams. They usually aren’t. The data quality problem that looks like a governance failure is often a systems integration gap. The AI use case that can’t get funded is usually blocked by a platform limitation and a data foundation gap at the same time.

Running the technology and data tracks in parallel — with the findings from each informing the other in week two — surfaces these connections before the roadmap is written. The result is a sequencing of investments that accounts for cross-domain dependencies: which technology investment unlocks multiple data capabilities, which data fix requires a systems change first.

An example of what the cross-track analysis finds

A technology track flags a legacy AMS as end-of-life. A data track on the same organization flags poor member engagement data quality. Run separately, these become two independent recommendations.

Run together, the cross-track analysis reveals that the data quality issue traces directly to the AMS’s limited API surface. The AMS replacement doesn’t just solve a technology problem — it solves the data quality problem at the same time. The integrated roadmap reflects that, and the investment case is significantly stronger.

How it starts

Two kickoff conversations, same week.

Both tracks start simultaneously. Two structured kickoffs in week one — one technology-focused, one data-focused.

The technology kickoff covers systems, vendors, integration architecture, technical debt, and team capability. The data kickoff covers data sources, quality, analytics, governance, and AI readiness. Both happen in week one, with pre-work sent 48 hours before each.

In week two, the findings from both tracks are reviewed together for cross-domain dependencies. A system flagged in the technology track might be the root cause of a quality issue in the data track. An AI use case blocked on the data side might require a platform change on the technology side. The cross-track analysis surfaces these connections explicitly before the roadmap is written.

Week three delivers all four documents and a single 90-minute integrated readout. Leadership receives one coherent picture — not two separate reports to reconcile after the fact.

Who should be in the kickoff calls

  • Technology kickoff: IT lead, operations stakeholder, senior leadership sponsor
  • Data kickoff: data or analytics lead, IT contact, finance or ops business stakeholder
  • Some attendees overlap — that is expected and useful
How it runs

Three weeks, two tracks, one roadmap.

Both tracks run in parallel. The 21-day timeline is the same as a standalone assessment — the scope is broader, not the calendar.

Week 1

Parallel discovery

Both technology and data tracks run discovery simultaneously. Two kickoffs, stakeholder interviews across both domains, documentation collection from both sides.

  • Technology kickoff (90 min)
  • Data kickoff (90 min)
  • Stakeholder interviews per track
  • System and data source inventory
  • Contract, report, and tool collection
Week 2

Cross-track analysis

Findings from both tracks are scored independently, then reviewed together for cross-domain dependencies and compound risks. This step is unique to the combined engagement.

  • Technology findings scoring
  • Data findings scoring
  • Cross-track dependency mapping
  • Compound risk identification
  • Integrated roadmap sequencing
Week 3

Integrated readout

Four documents delivered and one 90-minute integrated readout. One coherent picture, not two separate reports to reconcile.

  • All four documents delivered
  • Integrated readout call (90 min)
  • Cross-domain findings presented together
  • 30-day follow-up window
What you get

Four deliverables. The fourth is the most valuable one.

Three of the four match what a standalone assessment produces. The fourth — the integrated roadmap — is only possible when both tracks run together.

01 — Featured deliverable

Integrated Executive Summary

Not two summaries side by side. A single integrated read on how the technology and data environments interact, where they compound each other’s risks, and the sequenced investment case that addresses both. Board-ready from delivery.

  • Technology and data risks as a unified picture
  • Compound risk: where tech gaps and data gaps interact
  • Integrated investment sequencing with cross-domain impact
  • Written for leadership and board committee review
02

Technology Findings Report

Full depth of the standalone Technology Assessment: systems, vendors, contracts, technical debt, team capability, and spend — with cross-references to data dependencies where they exist.

  • System-by-system assessment and scoring
  • Vendor and contract risk inventory
  • Technical debt by business risk
  • Spend analysis with data dependency flags
03

Data Findings Report

Full depth of the standalone Data Assessment: sources, quality, analytics, governance, AI readiness — with cross-references to technology root causes where they exist.

  • Data source inventory and quality assessment
  • Analytics maturity and AI readiness scoring
  • Governance gap documentation
  • Technology root cause flags by finding
04

Integrated Roadmap

A single sequenced roadmap accounting for cross-domain dependencies. Investments prioritized by integrated business impact, not by domain independently.

  • Cross-domain investment sequencing
  • Actions that unlock both tech and data outcomes
  • Quick wins achievable in the first 90 days
  • Cost range and business outcome per action
What the cross-track analysis finds

Dependencies that neither standalone assessment surfaces.

These findings only appear when both tracks run together and the week-two cross-analysis happens.

A data quality problem that traces to a system integration gap.
What looks like a governance failure turns out to be caused by two systems that were never integrated properly. The fix is a technology investment, not a data governance initiative.

Technology spend on systems that generate data nobody can use.
A system collects detailed behavioral data, but the analytics environment was never built to receive it. The combined assessment finds the disconnect and values the investment that would unlock it.

An AI use case blocked by both a data problem and a technology problem.
The board wants member personalization. The data track finds the engagement data is incomplete. The technology track finds the AMS can’t deliver real-time events. Both need to be fixed in sequence.

A planned investment sequence that would have made things worse.
Leadership planned to replace the AMS first, then build analytics. The combined assessment reveals that analytics requirements should inform AMS selection criteria — reversing the sequence avoids a costly rebuild 18 months later.
Common questions

What organizations ask before starting.

Is this just two assessments bundled together?
No. The cross-track analysis in week two is the step that makes the combined engagement different. Identifying the dependencies between technology and data problems and building a roadmap that accounts for them is something neither standalone assessment does. The integrated executive summary and integrated roadmap are unique outputs.
Does it take longer than 21 days?
No. Both tracks run in parallel during weeks one and two. The cross-track analysis happens in week two alongside the individual track analyses. The timeline is the same as a standalone assessment — 21 days from kickoff to readout. The scope is broader; the calendar is not.
When does combined make more sense than two separate assessments?
When you suspect the technology and data problems are related, which is most of the time. Technology architecture decisions frequently create data quality problems, and data gaps are often caused by system integration failures. Seeing both at once, and finding the dependencies between them, is almost always more efficient than running them sequentially.
Is it more expensive than a single assessment?
Yes. The combined assessment is priced above a single assessment but below the cost of two separate assessments. The fixed price is agreed at engagement start with no hourly billing.
What if we only have budget for one type this cycle?
Start with whichever problem is creating the most immediate pain. The findings from a standalone Technology or Data Assessment will typically surface enough of the other domain to help you make the case for a follow-up engagement.

See both problems at once, and solve them in the right order.

Fill in a few details. We’ll confirm scope and price before anything starts.

Standalone assessment types

Not ready for a combined engagement?

Technology Assessment

A complete picture of your technology stack in 21 days.

Maps every system, vendor, contract, and dollar of technology spend. The right fit when systems or vendors are the primary concern.

See the Technology Assessment →
Data Assessment

Your data is already there. Find out what it’s telling you.

Maps data sources, quality issues, analytics gaps, and AI readiness. The right fit when reporting or data trust is the primary pain.

See the Data Assessment →