
3 Ways Healthcare Organizations Can Activate Data with AI

August 12, 2025

“It would have taken me a year to put together the work you’ve done in 2 months”

SVP, Chief Clinical Officer

Over the past decade, healthcare providers have invested billions in digitization. Migrating to cloud platforms. Building data lakes. Deploying analytics dashboards. Running pilots with machine learning and natural language processing.

But for all that activity, most organizations still struggle to generate real ROI from their data investments. The problem isn’t a lack of ambition. It’s a lack of activation.

Most healthcare data remains siloed. Stale. Structurally misaligned with frontline workflows.

Meanwhile, AI initiatives are often built on brittle foundations. Isolated pilots with no path to scale. Limited infrastructure for monitoring or retraining. Little connection to operational metrics.

This article offers a different playbook. Based on real work across enterprise health systems, it outlines what separates flashy demos from systems that actually transform work.

We also introduce a model for moving from raw data to activated workflows, and show how AI can create value when it’s embedded, not just bolted on.

If you’re a CIO, CDO, or data leader trying to actually transform your organization with AI, we hope this provides a useful roadmap. But first, some context.

The Healthcare AI Gap: Why the ROI Isn’t There Yet

For the past five years, provider groups, payers, and healthtech startups have filled board decks with ambitious promises: predictive analytics, virtual agents, automated clinical decision support. But most have little to show for it. According to McKinsey, AI programs typically capture less than one-third of their expected value, and only 30% of large digital transformations succeed. Why is that?

The reasons vary, but the underlying pattern is consistent: the data isn’t ready.

Healthcare organizations have tons of data: billions of rows spread across EHRs, claims systems, intake forms, wearable feeds, and internal spreadsheets. But the data is fragmented. Inconsistently governed. Structurally misaligned with real-world decision points. In other words, it’s not actionable.

The greatest barriers to AI success aren’t algorithmic, they’re architectural. Initiatives stall because:

  • Patient data is locked behind brittle APIs or outdated EMRs. One survey of 200 health-system execs ranked legacy IT as the most significant implementation obstacle.
  • Analysts can’t access it without writing custom queries against three different systems.
  • There’s no shared layer of truth between clinical, operational, and administrative teams.

Across dozens of Manifold engagements (some of which we’ll highlight below), the biggest performance gains have come from building infrastructure. Infrastructure that allows data to move faster, more securely, and more usefully through the organization. AI is the coolest part. But it’s just the last mile.

Healthcare leaders need to stop thinking about AI as an app you install, and start treating it as an operational capability. One that only works if your data foundation is solid.

So how do you go from data collection to data activation, and why does that make AI actually work? We have three specific recommendations for making this happen:

  • Get your data infrastructure in place.
  • Embed MLOps from the outset.
  • Build for tangible frontline use cases.

The Difference Between Aggregation and Activation

Aggregation is about centralizing data. Activation is about operationalizing it.

Both matter, of course. But you can’t stop at centralization. You can have a state-of-the-art data lake, dozens of dashboards, and still fail to change a single clinician behavior, reduce a single claim error, or improve a single scheduling decision. For data to be activated, three things must be true:

  1. It must be trusted. Is the data accurate, governed, and accessible by the right people?
  2. It must be contextual. Is it structured and enriched in a way that’s relevant to the task or workflow?
  3. It must be readily available. Is it available at the moment and point of action (not just in a monthly report)?

Rule 1: Get Your Data Infrastructure in Place

The hardest part of making this happen is the plumbing. Before you can predict, generate, or automate anything, you need clean, trusted data. And the system needs to know where that data came from, what it means, and who’s allowed to see it. This is even more true in healthcare, where HIPAA compliance and clinical accuracy are non-negotiable.

Take the legacy EMR example (one of the engagements described below). The upside was mostly infrastructure improvements: Synapse Analytics for scalable query performance, Power BI for visual delivery, Purview for data governance and lineage, and PowerApps for structured PDF generation.

Not glamorous. But transformative. The client now has a durable, vendor-agnostic foundation for regulatory reporting, internal audit, and operational analytics.

Or consider the AI-enabled dashboards. The real unlock wasn’t Copilot. It was the underlying architecture: data that was ingested, cleaned, and curated into a consistent schema, with business-context layers ready for AI queries. Without that supporting infrastructure, Copilot is a neat toy. With it, you get real-time, trustworthy insights from plain-language queries.
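What “cleaned, curated, with a consistent schema” means in practice can be sketched in a few lines. This is a minimal, illustrative Python example, not the actual pipeline from the engagement; the field names and target schema are hypothetical, and a production system would layer governance and lineage tooling on top of this kind of validation step:

```python
from datetime import date

def curate(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split raw records into (clean, rejected) against a fixed target schema.

    The schema here (patient_id, encounter_date, department) is made up for
    illustration. Rejected rows are quarantined for review, never silently
    dropped, so data quality issues stay visible.
    """
    clean, rejected = [], []
    for rec in records:
        try:
            row = {
                "patient_id": str(rec["patient_id"]).strip(),
                # Enforce one canonical date format at ingestion time.
                "encounter_date": date.fromisoformat(rec["encounter_date"]),
                # Normalize free-text fields so downstream joins are reliable.
                "department": str(rec["department"]).strip().lower(),
            }
            clean.append(row)
        except (KeyError, ValueError):
            rejected.append(rec)
    return clean, rejected
```

The point of the sketch is the shape of the contract: every record that reaches analysts or an AI layer has already passed the same typed, normalized schema, and everything that failed is accounted for.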

It’s not popular. But it’s true. The fastest path to AI outcomes often begins with slower, more deliberate investment in data infrastructure.

Rule 2: Embed MLOps from the Outset

Accurate data isn’t enough, though. You also need robust MLOps, ideally in place from the beginning. Without it, models degrade over time and produce unreliable results (a nonstarter in a healthcare context).

MLOps is a set of practices that lets you deploy models into production with minimal risk, monitor their performance (and drift), and retrain and version them as needed. A robust MLOps toolkit is an end-to-end pipeline that includes:

  • Feature stores for shared data access
  • Model registries for tracking versions
  • CI/CD workflows for retraining and deployment
  • Explainability layers to ensure clinical transparency
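The monitoring piece in particular is often under-specified. One common, simple drift check is the Population Stability Index (PSI), which compares the distribution of a feature at training time against what the model is seeing in production. A minimal self-contained sketch (the bin count and the 0.2 alert threshold in the comment are conventional rules of thumb, not requirements):

```python
import math

def psi(expected: list[float], actual: list[float], bins: int = 10) -> float:
    """Population Stability Index between a baseline sample and a live sample.

    Rule of thumb: PSI > 0.2 is often read as meaningful drift worth
    investigating; near 0 means the distributions match.
    """
    lo, hi = min(expected), max(expected)

    def bin_fractions(sample: list[float]) -> list[float]:
        counts = [0] * bins
        for x in sample:
            # Bin on the baseline's range; clamp out-of-range live values
            # into the edge bins.
            i = min(max(int((x - lo) / (hi - lo + 1e-12) * bins), 0), bins - 1)
            counts[i] += 1
        # Floor at a tiny fraction to avoid log(0) for empty bins.
        return [max(c / len(sample), 1e-6) for c in counts]

    e, a = bin_fractions(expected), bin_fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

In a real deployment you would compute this per feature on a schedule and wire the result into alerting, so that retraining is triggered by evidence rather than by calendar.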

Rule 3: Build for Tangible Frontline Use Cases

AI is not just an executive tool. It’s more than a dashboard for leadership to spot trends or answer board questions. The real operational improvements happen when you embed it into the organization where the work is actually being done.

Does AI reduce keystrokes? Shorten wait times? Eliminate duplicative tasks? Make it easier for a staff member to do their job? Often the answer is no.

The best AI strategy is one that is grounded in the reality of how work gets done. It needs to meet clinicians, schedulers, intake coordinators, and administrative staff where they are.

That means it’s just as much a user experience and change management problem as it is a data science or tooling problem. You need to engage your frontline workers from the beginning, to understand exactly how they work and what frustrates them about it. Without this step, you risk building a great model no one uses.

Five Use Cases That Actually Work

Examples are often helpful. A few vignettes from Manifold engagements where activated data is now a reality:

  • A call center had agents manually summarizing interactions after patient or provider calls. This process was inconsistent, time-consuming, and error-prone. Manifold developed an AI assistant that listened into live calls and generated structured summaries, integrated into their existing call center stack. The client realized over $500k in annual savings, reduced agent burnout, and dramatically improved wrap-up speed.
  • A legacy EMR had decades of patient data that needed to be accessible for audits, disclosures, and patient requests. Manifold built a HIPAA-compliant, self-service data viewer supporting full disclosure workflows (request validation, PDF generation, audit logging, and structured export).
  • A multi-facility health system had fragmented dashboards that were static and hard to maintain. Manifold deployed a lakehouse architecture in Azure and built dynamic Power BI dashboards with AI Copilot functionality, enabling natural language queries and self-serve insights.
  • An enterprise healthcare org had non-conforming claims, which were driving up costs. Leadership had no visibility into root causes. Manifold implemented a live analytics environment with a high-performing rules engine and parallel processing pipelines, alongside Excel-based reporting tools tailored for finance users. The result was $2 million in annual savings and a 50% year-over-year reduction in staffing costs tied to claims processing.
  • Therapy teams across inpatient facilities were relying on spreadsheets to schedule complex care regimens, leading to inefficiencies and compliance risks. Manifold built a workflow-aware scheduling platform with carry-forward logic, conflict alerts, and role-based access.
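The non-conforming-claims vignette above hinges on a rules engine. The core pattern is simple enough to sketch; this is an illustrative Python skeleton with made-up rules (provider NPI present, non-negative billed amount), not the engine from the engagement, which also handled parallel processing at scale:

```python
from typing import Callable, Optional

# A rule inspects one claim and returns an error message, or None if it passes.
Rule = Callable[[dict], Optional[str]]

def missing_npi(claim: dict) -> Optional[str]:
    # Hypothetical rule: every claim must carry a provider NPI.
    return None if claim.get("provider_npi") else "missing provider NPI"

def negative_amount(claim: dict) -> Optional[str]:
    # Hypothetical rule: billed amounts cannot be negative.
    return "negative billed amount" if claim.get("amount", 0) < 0 else None

RULES: list[Rule] = [missing_npi, negative_amount]

def check_claim(claim: dict) -> list[str]:
    """Run every rule against a claim; a conforming claim returns []."""
    return [err for rule in RULES if (err := rule(claim)) is not None]
```

Keeping each rule as an independent, named function is what makes the root-cause reporting possible: the output isn’t just “non-conforming,” it’s a list of exactly which checks failed, which finance teams can aggregate.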

None of these examples required moonshot research or speculative models. What they required was activated data, and a willingness to embed AI where it actually changes work.

A Practical Starting Point

The data foundation work can feel daunting. It competes for attention and resources. The instinct is to wait.

But you don’t need everything to be perfect to get started. You can start small.

One of the ways we help clients do this is our Data to AI Workshop. It’s a focused two-week engagement designed to move organizations from aspiration to action. Real infrastructure patterns. A scoped pilot plan. The groundwork for a sustainable AI capability.

Here’s what it includes:

  • Current State Assessment. Where is your data today? What platforms, pipelines, and pain points already exist?
  • Pilot Use Case Identification. What’s a realistic, high-leverage problem to solve with AI (ideally one that can prove value fast)?
  • MLOps and Infrastructure Design. What does a scalable, secure deployment pattern look like in your environment?
  • Roadmap and Technical Blueprint. How do you go from this first pilot to a repeatable platform?

It’s built entirely around your actual systems, constraints, and goals. In past workshops, clients have left with both clarity and velocity.

Conclusion

Activating data with AI can mean the difference between building dashboards and changing decisions. Between installing a model and transforming how work is done. To get there, you need to build a strong data foundation, operationalize MLOps from the start, and design for real-world frontline use cases that create tangible value. If you’re ready to make that shift, the path forward can be practical, incremental, and iterative.

Partner with Us

In today’s data-driven landscape, the ability to harness and transform raw data into actionable insights is a powerful competitive advantage.

Making better decisions leads to measurably better outcomes. With a solid data and AI foundation, businesses can innovate, scale, and realize limitless opportunities for growth and efficiency.

We’ve built our Data & AI capabilities to help empower your organization with robust strategies, cutting-edge platforms, and self-service tools that put the power of data directly in your hands.

Self-Service Data Foundation


Empower your teams with scalable, real-time analytics and self-service data management.

Data to AI

Deliver actionable AI insights with a streamlined lifecycle from data to deployment.

AI Powered Engagement

Automate interactions and optimize processes with real-time analytics and AI-enabled experiences.

Advanced Analytics & AI

Provide predictive insights and enhanced experiences with AI, NLP, and generative models.

MLOps & DataOps

Operationalize models and data pipelines with reliable deployment, monitoring, retraining, and versioning practices.

Ready to embrace transformation?

Let’s explore how our expertise and partnerships can accelerate impact for your organization.
