The Economics of Interoperability: Why Data Sharing Is a Business Decision

A digital health vendor had built a working integration with one of the major EHR platforms. Not a prototype. Not a sandbox demo. A live, functioning connection, already deployed in two health systems, pulling real clinical data to power a care management workflow that clinicians were actually using.

A third health system wanted it. Same EHR. Same integration. Technically, it was a replication job.

Then procurement got involved.

The EHR vendor quoted a six-figure fee to enable the integration for the new health system. The implementation timeline came back at 14 months. Neither number was a mistake or an anomaly; that was simply the price of access, set by the party that controlled the pipe.

The vendor had done everything right. The technology worked. The clinical case was proven. The health system wanted it. And yet the deal sat in a queue for over a year, held up not by anything technical, but by the commercial structure sitting on top of the technology.

That pattern, the working integration that dies in procurement, is one of the clearest illustrations of what interoperability actually is in practice. It is not primarily a technical problem. It is an economic one. And until you see it that way, you will keep underestimating how hard it is, and why.

What’s Actually Going On Underneath

Here is the thing about healthcare interoperability that doesn’t get said plainly enough: the standards largely exist. HL7 FHIR is mature enough to build on. TEFCA has established a framework for nationwide health information exchange. CMS has been pushing mandate after mandate, including the CMS-0057 Final Rule, which requires payers to implement FHIR-based Prior Authorization APIs and share data across payer transitions with real deadlines attached.
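
To make the “mature enough to build on” claim concrete, here is a minimal sketch of what working with FHIR looks like at the wire level: resources are plain JSON with a declared `resourceType`, exchanged over a REST API (`GET [base]/Patient/[id]`, search via query parameters). The patient resource and base URL below are illustrative examples, not taken from any real server.

```python
import json

# A hand-written Patient resource in FHIR's JSON shape (illustrative only).
sample_patient = json.loads("""
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
""")

def display_name(patient: dict) -> str:
    """Flatten the first HumanName entry into a display string."""
    name = patient.get("name", [{}])[0]
    given = " ".join(name.get("given", []))
    return f"{given} {name.get('family', '')}".strip()

def search_url(base: str, resource: str, **params: str) -> str:
    """Compose a FHIR search URL, e.g. GET [base]/Patient?birthdate=1974-12-25."""
    query = "&".join(f"{k}={v}" for k, v in params.items())
    return f"{base}/{resource}?{query}" if query else f"{base}/{resource}"

print(display_name(sample_patient))
print(search_url("https://fhir.example.org", "Patient", birthdate="1974-12-25"))
```

The point of the sketch is the article’s argument in miniature: none of this is hard. The JSON is readable, the API is ordinary REST, and a competent team can write this layer in a day. The expense lives elsewhere.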

The infrastructure is further along than it was five years ago. The technical excuses are running out.

And yet data still doesn’t flow freely. Integration projects still stall. Vendors still hit walls that have nothing to do with technology. Why?

Because data in healthcare is not just clinical information. It is a competitive asset. And sharing a competitive asset freely, even when a regulation says you should, runs directly against the commercial interests of the organizations sitting on it.

Think about what a large health system actually controls when it controls its patient data. It controls the longitudinal record that makes its care management programs defensible. It controls the referral patterns that keep patients in network. It controls the analytics that give it negotiating leverage with payers. It controls the switching costs that make it expensive for patients, physicians, and vendors to leave. Every time that data moves freely to an outside party, some portion of that leverage moves with it.

This is not a conspiracy. It is just how competitive markets behave when the asset at the center of the market is information. The organizations that have it protect it. The organizations that want it negotiate for it. And the vendors whose business models depend on it being sticky, primarily the large EHR platforms, build commercial structures that reflect that reality, whether intentionally or not.

The procurement story from the opening is a clean example. The EHR vendor wasn’t blocking the integration out of malice. They were pricing access to their installed base the way any platform business would: at whatever the market will bear. Six figures and 14 months is not sabotage. It is leverage.

Reality check: When an interoperability project stalls, the first question most teams ask is “what’s the technical blocker?” The more useful question is “whose business model depends on this data not moving?”

The Cost of Fragmentation Nobody Fully Accounts For

While incumbent organizations protect their data assets, everyone else absorbs the cost. And that cost is substantial, even if it rarely shows up as a line item.

Clinicians work across disconnected systems, re-entering data, chasing records, making decisions with incomplete information. Care coordination breaks down at transitions: between hospital and primary care, between health system and specialist, between one payer’s network and another. Duplicate testing happens not because physicians are careless but because prior results aren’t available at the point of care.

For founders and product teams, the cost is different but just as real. Every integration into a new health system is a semi-custom project. Every EHR connection requires navigating a different commercial agreement, a different data governance policy, a different IT review process. What looks like a scalable product on paper becomes an implementation-heavy, margin-compressing services business in practice, with timelines that neither the founder nor the buyer fully anticipated upfront.

The fragmentation isn’t accidental. It is, in many cases, the outcome of a market structure where data control equals revenue protection. The cost just gets distributed across everyone who isn’t in a position to protect their own data the same way.

Why Interoperability Projects Stall Despite Technical Feasibility

Across enough of these engagements, a pattern becomes hard to ignore. A founder comes in having done the technical work correctly: FHIR-compliant APIs built, HL7 interfaces designed, security posture solid, BAA templates ready. They’ve read the documentation, talked to the developer portal team, maybe even completed a sandbox integration. They are confident the hard part is behind them.

Then they enter a real health system environment and discover the hard part hasn’t started yet.

The blockers that actually kill interoperability projects rarely cluster around technology. They tend to fall into four categories.

Commercial structure. The EHR vendor charges a connectivity fee. The health system’s IT department has a backlog and your integration is number eleven in the queue. The data governance team wants a custom BAA clause your legal team has never seen before. None of these are technical problems. They are commercial and operational ones, and they compound.

Misaligned incentives between departments. The clinical champion wants the integration. IT is worried about security surface area. The CFO is looking at a budget line that wasn’t in the original proposal. Legal is reviewing data use terms that will take eight weeks. These people are not obstructing for the sake of it. They are each protecting something legitimate. But nobody in the room has a direct incentive to make the integration happen faster, and that matters more than most founders expect.

Data as internal political currency. Inside large health systems, data access is often a proxy for departmental influence. The team that controls the data warehouse controls what questions can be asked and by whom. Sharing data with an external vendor, or even with another internal team, can feel like a loss of that control. This dynamic is almost never discussed openly, but it shapes decisions quietly and consistently.

Minimum viable compliance. When regulation forces movement, the response from large organizations is rarely genuine openness. It is the minimum required to satisfy the rule. CMS-0057 is a useful illustration. The Final Rule mandates FHIR-based Prior Authorization APIs and payer-to-payer data sharing with hard deadlines. The intent is to reduce friction and stop data from disappearing at coverage transitions. The response from large payers has largely been to build what is required, scoped as narrowly as the rule allows, without the surrounding infrastructure that would make the data genuinely useful downstream. Compliance and openness are not the same thing. They rarely produce the same outcome.

If you only remember one thing: a mandate sets the floor. It does not change the incentive to stay there.

A Better Way to Think About This

If you are a founder building a product that depends on healthcare data, the reframe is this: interoperability is not a technical milestone on your roadmap. It is a business development problem with a technical component.

That means the integration conversation needs to start much earlier than most founders start it. Before you finalize your architecture, you need to know what data you need, who controls it, what their commercial incentive is to share it or not, and what it will realistically cost in time, money, and relationship capital to get access. That is not a post-fundraise problem. It is a pre-product-market-fit question.

For health system and payer product leaders, the reframe is different but equally important. Every time your organization makes data access slow, expensive, or contractually complicated for external partners, you are also making it harder for your own teams to build on top of that data. The same walls that keep competitors out keep your own innovation in. That tradeoff is worth making consciously, not by default.

Decision rule: Before scoping any interoperability initiative, map the incentives first. Who benefits from this data moving? Who loses leverage if it does? Where does the cost of integration actually land, and who is absorbing it? The technical architecture should follow that map, not precede it.
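
The decision rule above can be sketched as a simple stakeholder map, filled in before any architecture work begins. Everything below is a hypothetical illustration: the stakeholder names, the three yes/no questions, and the blocker heuristic are assumptions for the sketch, not a validated framework.

```python
from dataclasses import dataclass

# Illustrative sketch of "map the incentives first": enumerate the parties,
# record whether each benefits when the data moves, whether movement erodes
# their leverage, and whether integration cost lands on them.

@dataclass
class Stakeholder:
    name: str
    gains_if_data_moves: bool   # do they benefit from the integration?
    loses_leverage: bool        # does data movement erode their position?
    absorbs_cost: bool          # does integration cost land on them?

def likely_blockers(stakeholders: list[Stakeholder]) -> list[str]:
    """Flag parties with an incentive to slow the project down:
    they lose leverage or absorb cost without a corresponding gain."""
    return [s.name for s in stakeholders
            if (s.loses_leverage or s.absorbs_cost) and not s.gains_if_data_moves]

# Hypothetical example map for one integration.
incentive_map = [
    Stakeholder("Clinical champion", gains_if_data_moves=True,  loses_leverage=False, absorbs_cost=False),
    Stakeholder("EHR vendor",        gains_if_data_moves=False, loses_leverage=True,  absorbs_cost=False),
    Stakeholder("Health system IT",  gains_if_data_moves=False, loses_leverage=False, absorbs_cost=True),
    Stakeholder("External vendor",   gains_if_data_moves=True,  loses_leverage=False, absorbs_cost=True),
]

print(likely_blockers(incentive_map))  # ['EHR vendor', 'Health system IT']
```

Even this toy version makes the article’s point visible: the parties flagged as likely blockers are exactly the ones no architecture diagram will ever show.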

Before You Scope an Integration: A Practical Checklist

  • Have you identified who commercially controls the data you need, not just which system it lives in?
  • Do you know whether the EHR or platform vendor charges connectivity or activation fees, and have you gotten a real quote rather than a developer portal estimate?
  • Have you mapped the internal stakeholders at the health system who will need to approve the integration, beyond your clinical or operational champion?
  • Is there a data governance or legal review step, and do you know how long it typically takes at this organization?
  • Have you scoped the integration against production data volumes and real access patterns, not sandbox conditions?
  • Do you understand what the organization is contractually permitted to share with you, including any EHR vendor clauses that restrict derived data use?
  • Have you built timeline buffers that account for procurement, IT queue position, and legal review, not just development time?
  • Is your product’s core value proposition intact if the integration takes six months longer than planned?

Closing

Healthcare interoperability has been framed as a technical problem for long enough that the framing has become a convenient excuse. The pipes exist. The standards are workable. The APIs are being built, sometimes willingly, sometimes because a regulation required it.

What hasn’t changed is the underlying economics. Data confers leverage. Leverage has value. Organizations that have built business models on top of that leverage do not dismantle it because a standard was published or a rule was finalized. They adapt to the minimum required while protecting what they can.

This doesn’t mean interoperability is impossible or that progress isn’t real. It means that the founders and product leaders who navigate it well are the ones who stopped treating it as a technical integration challenge and started treating it as a market structure problem. They mapped the incentives before they mapped the APIs. They priced the commercial friction into their roadmap before they promised a go-live date. They understood that the question was never “can this data move?”

It was always “who benefits when it does, and who doesn’t?”

Let's Bring Clarity to Your Product Journey

If you’re navigating product direction, data architecture, interoperability, or enterprise readiness, I’d be glad to talk.