Before AI can transform healthcare, someone has to fix the infrastructure underneath it. As Senior Analytics Engineer Mehulkumar Joshi has learned while embedded inside some of the largest healthcare data environments in the US, the work is harder, slower, and more invisible than the industry admits.
On January 14, 2026, Dr Gianrico Farrugia, President and CEO of Mayo Clinic (the largest integrated, not-for-profit medical group practice in the US, treating patients from more than 130 countries), published a piece with the World Economic Forum ahead of Davos. His warning landed with unusual weight. He argued that AI's transformation of healthcare is "hitting a wall", not because algorithms are unready, but because of "decades of disparate systems, incompatible formats, siloed records and legacy infrastructure." Outdated data structures are constraining AI to "narrow, task-specific tools instead of enabling solutions that can reason, learn, and act." The statement made headlines across boardrooms. But for analytics engineers quietly embedded inside hospital networks and EHR vendors, it wasn't a revelation. It was just another day at work.
A February 2026 NVIDIA global survey of healthcare and life sciences organisations found that 70% of respondents are now actively using AI, with digital healthcare leading at 78%. The technology is arriving. The infrastructure it depends on, in many cases, still isn't ready to receive it.
Mehulkumar Joshi knows that wall because he has spent 13 years taking it apart, rebuilding legacy healthcare systems into infrastructure that modern data and AI can actually use. At RXNT, a Philadelphia-based EHR vendor ranked by TIME Magazine among the world's top healthtech companies, he stepped into systems that medical practices had depended on and rebuilt every one of them, without interrupting a single practice. Before that, at eClinicalWorks, IBM Watson Health, and EXL Service, he worked across some of the largest healthcare data environments in the United States, including networks representing 300 million patient lives and Fortune 500 hospital systems. His trajectory is not a record of achievements. It is a window into what it actually takes to make healthcare data AI-ready, and why that work is harder, slower, and more invisible than the industry tends to admit.
At RXNT, the situation Joshi walked into in 2022 was typical of what deferred maintenance produces. The company's practice management analytics were running on eight legacy stored procedures, some of them written a decade or more before he arrived. The original authors had moved on, and the documentation, where it existed at all, was incomplete. Over 40 medical practices depended on these systems daily, and client-reported data inconsistencies had become routine. Joshi reverse-engineered the underlying logic piece by piece, moving it to BigQuery and dbt without disrupting the practices that were still running on top of it.
The numbers made the same case. Before Joshi joined, RXNT was spending $168,000 a year on cloud infrastructure. He rebuilt the underlying systems, optimised the queries, shifted to slot-based pricing, and added monitoring. The annual bill dropped to $72,000, a 57% reduction. No features were cut, and no practices were disrupted. The savings came entirely from understanding the system well enough to rebuild it properly.
"People who understood the problem deeply built these systems to last, and for years, they delivered. But when you trust something you don't fully understand, you pay a price. Not dramatically. Slowly and quietly, until something goes wrong and nobody knows why," Joshi says.
The same problem had surfaced years earlier, at IBM Watson Health. Joshi was supporting Fortune 500 hospital networks there, working across a dataset that represented 300 million patient lives. Data loads for major clients were taking 12 to 16 hours, which made daily updates structurally impossible. His response was the Bootstrap Method, an original framework he designed from scratch, with no existing solution in Watson Health's toolkit to draw from. It replaced full-table reloads with incremental processing, pulling only records that had changed since the last run.
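The core idea, pulling only records changed since the last successful run, can be sketched in a few lines. The snippet below is a minimal illustration of watermark-based incremental loading using Python's built-in sqlite3; the table and column names are hypothetical, and the Bootstrap Method's actual implementation is not public.

```python
# Minimal sketch of incremental loading with a stored "high-water mark":
# instead of reloading the full table, each run pulls only rows whose
# updated_at is newer than the watermark left by the previous run.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE load_state (table_name TEXT PRIMARY KEY, watermark TEXT);
    INSERT INTO claims VALUES (1, 120.0, '2026-01-10'), (2, 80.0, '2026-01-15');
    INSERT INTO load_state VALUES ('claims', '2026-01-12');
""")

def incremental_load(conn, table):
    # 1. Read the watermark recorded by the previous run.
    (watermark,) = conn.execute(
        "SELECT watermark FROM load_state WHERE table_name = ?", (table,)
    ).fetchone()
    # 2. Pull only rows changed since that mark, not the whole table.
    changed = conn.execute(
        f"SELECT id, amount, updated_at FROM {table} WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    # 3. Advance the watermark so the next run skips these rows.
    if changed:
        new_mark = max(row[2] for row in changed)
        conn.execute(
            "UPDATE load_state SET watermark = ? WHERE table_name = ?",
            (new_mark, table),
        )
    return changed

rows = incremental_load(conn, "claims")
print(rows)  # only the row updated after the stored watermark
```

The payoff is that load time scales with the volume of change rather than the size of the table, which is what turns a 12-to-16-hour full reload into a job that can run daily.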
The industry focuses on what AI models can do, but underweights what has to be true before those models work at all. Healthcare data carries clinical meaning that technical skill alone cannot decode. A CPT denial, a revenue cycle irregularity, a compliance flag: these are domain problems living inside data systems. Engineers who understand both the data and the domain produce models that actually reflect reality.
The difference shows up most clearly under pressure. When Joshi was embedded at Adventist Health, a 24-hospital system spanning California, Hawaii, Oregon, and Washington, their analytics team was locked into a cycle that made daily data updates structurally impossible. Loads ran 12 to 16 hours. By the time one finished, new data was already waiting. Joshi's Bootstrap Method broke that cycle: by pulling only records changed since the last run, load times dropped to two to four hours. For the first time, Adventist's population health team could work with data that reflected yesterday, not last week. The contract was not just completed; the method was adopted by all 15 engineers on the team and became the standard.
Speed in this context is not a performance metric. It is a symptom of depth, the difference between an engineer who spends the first two weeks getting oriented and one who arrives already knowing the domain.
"You can't separate the data from the domain," Joshi argues. "I've worked with engineers who are technically strong but don't understand what a Meaningful Use attestation actually means, or why a specific CPT code is triggering a denial. That gap shows up in the models. It shows up in the dashboards. It shows up in the business decisions that come out the other side."
Dr Farrugia's Davos observation pointed to a structural problem. Even the most digitally advanced organisations, he argued, lack the AI-ready data architecture needed to support the next generation of agentic and reasoning AI systems. The path forward, in his framing, requires a fundamental reconception of how healthcare data is structured, unified, and maintained.
That recognition will not happen through model development alone. It will happen through work like Joshi's, engineers who can open a stored procedure written in 2012, understand what it was trying to do, and rebuild it cleanly enough that the AI systems of 2026 can trust what comes out the other side. The output of that work is never a product launch. It is a system that stops generating complaints. A pipeline that holds under pressure.
That pattern repeats across the industry, in EHR vendors, hospital networks, and payer systems. It is carried by engineers whose names never appear in funding announcements. Their most significant contributions are measured in hours saved and errors that never surfaced.
"No one's writing a press release because a stored procedure finally works," Joshi says. "But that's what the dashboard runs on. And that dashboard, that's what the doctor's looking at. So at the end of the day, it all comes back to one thing: is the data underneath actually right?"
Joshi received EXL's Shining Star award in 2020, a company-wide recognition for work that a credit union client's own analysts called extraordinary. The more telling measure came later. When Joshi took two weeks of paternity leave at RXNT, projects stalled. He had documented everything: the systems, the logic, the decisions. It didn't matter. A colleague couldn't step in, because the knowledge that kept those systems running wasn't in the documentation. It was the product of more than a decade of understanding why the data looked the way it did. Some expertise does not transfer on a timeline. And in healthcare, that is precisely the kind of expertise the industry cannot afford to lose.
The wall is real. Joshi has spent his career taking it apart, one stored procedure at a time, one undocumented system at a time, one legacy migration at a time. The work does not generate headlines. It generates pipelines that hold, dashboards that reflect reality, and data layers that AI can finally read. And occasionally, it generates something the industry does adopt at scale: a method, a framework, a new way of thinking about a problem that everyone assumed was settled. The Bootstrap Method started when one engineer at IBM noticed a pattern no one had formally addressed. It ended up saving every client on a 15-person team four to five working days per week. For healthcare organisations in 2026, the question isn't whether to invest in AI; it's whether they've invested in the engineers doing the invisible work AI depends on.