
The $800,000 Problem Hiding in Your Analytics Team

When 40% of your analytics team’s time goes toward hunting for data instead of analyzing it, the cost adds up fast. For one healthcare system, that hidden waste reached $800,000 a year. Here’s what actually fixed it.

This is the first blog in a five-part series based on our ebook, From Chaos to Clarity: The Strategic Guide to Healthcare Data Catalogs. Each blog addresses one of the root causes organizations face when data becomes a blocker instead of an asset.

Download the full ebook here.

When a regional healthcare provider on the West Coast came to our team at Definian, leadership couldn’t answer basic questions about their own operations.

This was not because the data didn’t exist. It existed across clinical systems, HR, scheduling, and finance. The problem was that none of it connected. So every time an executive needed to know something (where clinic space was underutilized, which vendors were driving spend, or how facilities were staffed), someone had to manually pull from multiple systems, reconcile the conflicts, and build a picture that was already out of date by the time it reached the meeting room.

We were brought in to build an operational command center for them. But before we touched a single dashboard, we had to solve the real problem first.

That problem carries a cost most organizations have never calculated.

Do the math on your own team

Look at your analytics team. Think about how much of their week goes toward finding data, chasing down where a number comes from, and validating whether a source can be trusted.

In the organizations we assess, that number is typically 40 to 60 percent. Not because the analysts are bad at their jobs, but because there is no reliable way to know what data exists, where it lives, or whether it means what you think it means.

Ten analysts. Forty percent of their time spent hunting. At $100 an hour fully loaded and roughly 2,000 working hours a year, that’s $800,000 a year, not from failed projects or bad decisions, but from people looking for data that should already be accessible.
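The arithmetic behind that figure can be sketched in a few lines; the team size, hunt fraction, and rate are the article’s example numbers, and the 2,000-hour working year is an assumed round figure you would swap for your own:

```python
# Hypothetical cost-of-search calculation using the article's example figures.
analysts = 10
hunt_fraction = 0.40      # share of each week spent finding/validating data
hourly_rate = 100         # fully loaded cost per analyst-hour, USD
hours_per_year = 2000     # assumption: ~40 hours/week over 50 weeks

# Annual cost of time spent hunting for data instead of analyzing it
annual_waste = analysts * hunt_fraction * hourly_rate * hours_per_year
print(f"${annual_waste:,.0f} per year")  # $800,000 per year
```

Plug in your own headcount and the percentage from your last team survey to get a number specific to your organization.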

And that’s just the direct loss. It doesn’t include the strategic initiative that gets pushed back six months because nobody can agree on the data foundation. It doesn’t include the analytics platform that sits at 30 percent adoption because people don’t trust its numbers. It doesn’t include the value-based care contract your organization isn’t ready to pursue because you can’t calculate the total cost of care with confidence.

For that health system, the cost compounded across dozens of analysts, two merged entities, and years of disconnected system growth. By the time we ran the assessment, the waste was structural, baked into how every project started.

Why this is harder in healthcare

Most health systems run 50 or more source systems that have evolved independently over decades. Different terminology, with the same concept defined differently depending on who built the system and when.

“Patient admission” might count ED visits in one system and only inpatient bed assignments in another. That’s not an error. That’s what happens when systems make sensible local decisions without ever interacting with each other.

Then there’s the regulatory layer. This includes HIPAA, CMS quality reporting, and interoperability mandates. Every one of those requirements depends on knowing what you have and where it is. Most organizations are meeting them without that foundation in place, manually, slowly, and expensively.

It’s not the tools or the team

The problem shows up in Slack messages. Someone asks where the readmission data lives. The same dataset gets rebuilt three times because no one knows the other two exist. A new analyst asks, “Where does this number come from?” and gets three different answers.

That’s not a skills problem. That’s not a platform problem. That’s a visibility problem. And in our experience, it’s almost always the real diagnosis, even when the first instinct is to hire more analysts or buy better tools.

When 40 percent of your analytics team’s week is going toward hunting instead of analysis, the answer isn’t more headcount. It’s knowing what you have.

What fixing it actually looks like

The organizations that fix this don’t start by documenting everything. They start with the data causing the most pain. The sources feeding executive dashboards. The domains behind regulatory reports. The datasets blocking the initiatives that matter most.

In practice, that means three things:

First, a targeted data inventory. Not a sprawling cataloging project, but a focused audit of the 20 or 30 sources that drive the most decisions. For each one: what it contains, who owns it, and when it was last validated.

Second, ownership and stewardship mapping. Every domain gets a named owner who is accountable for quality and can answer questions. This alone cuts Slack noise significantly.

Third, lineage documentation for the data that matters most. Where this number comes from, what transforms it, and what decisions depend on it. This is what makes dashboards trustworthy rather than just visually polished.
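A targeted inventory entry from the first step can be as simple as a structured record. This is an illustrative sketch only; the field names and example values are hypothetical, not taken from any particular catalog tool:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical shape of one entry in a targeted data inventory.
# Each of the 20-30 high-impact sources gets one of these records.
@dataclass
class DataSourceEntry:
    name: str              # which source this is
    contents: str          # what the source contains
    owner: str             # named steward accountable for quality
    last_validated: date   # when the source was last checked
    feeds: list[str]       # dashboards and reports that depend on it

# Example record (all values illustrative)
entry = DataSourceEntry(
    name="readmissions_feed",
    contents="30-day readmission events by facility",
    owner="Clinical Analytics Lead",
    last_validated=date(2024, 1, 15),
    feeds=["executive_dashboard", "cms_quality_report"],
)
```

Even a spreadsheet with these five columns answers the two questions that consume most of an analyst’s hunting time: what does this contain, and who can vouch for it.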

That’s exactly what we did for the health system before writing a single line of dashboard code. Within 90 days, the key clinical and operational data domains were documented, owned, and understood. Slack questions slowed down. Duplicate rebuilds stopped. And when leadership finally saw the command center, they trusted what was on the screen because they had been part of building the foundation underneath it.

Without that foundation, we would have built another set of reports nobody believed.

One question before you move on

What data is your organization currently unable to find, trust, or explain?

If you don’t have a clear answer to that, that’s the answer.

The next post in this series goes deeper: once you know what you have, how do you decide what to fix first, and what does a governance structure that actually sticks look like in a complex organization?

If you want to know where you stand right now, the full ebook includes a 15-minute Data Catalog Readiness Assessment and a Data Chaos Cost Calculator you can take to your next leadership meeting. Download From Chaos to Clarity here. Or, if you’re ready to talk through your specific situation, reach out to start a diagnostic conversation: no sales pitch, just an honest assessment of whether this is the right problem to solve right now.
