
Transforming Patient Referrals: Providence Uses Databricks MLflow to Accelerate Automation Across 1,000+ Clinics


Providence serves vulnerable and disadvantaged communities through compassionate, high-quality care. As one of the largest nonprofit health systems in the United States, with 51 hospitals, over 1,000 outpatient clinics, and more than 130,000 caregivers across seven states, our ability to deliver timely, coordinated care depends on transforming not only clinical outcomes but also the workflows that support them.

One of the most pressing examples is automating the way we handle faxes. Despite advances in digital health, faxes remain a dominant form of communication in healthcare, especially for referrals between providers. Providence receives more than 40 million faxes annually, totaling over 160 million pages. A significant portion of that volume must be manually reviewed and transcribed into Epic, our electronic health record (EHR) system.

The process is slow, error-prone and contributes to multi-month backlogs that ultimately delay care for patients. We knew there had to be a better way.

Tackling messy workflows and unstructured data at scale

The core challenge wasn't just technical; it was human. In healthcare, workflows vary widely between clinics, roles and even individuals. One staff member might print and scan referrals before manually entering them into Epic, while another might work within a fully digital queue. The lack of standardization makes it difficult to define a "universal" automation pipeline or create test scenarios that reflect real-world complexity.

On top of that, the underlying data is often fragmented and inconsistently stored. From handwritten notes to typed PDFs, the variety of incoming fax documents creates a wide range of inputs to process, classify and extract information from. And when you're dealing with multiple optical character recognition (OCR) tools, prompt strategies and language models, tuning all these hyperparameters becomes exponentially harder.

This complexity made it clear that our success would hinge on building a low-friction testing ecosystem: one that lets us experiment rapidly, compare results across thousands of permutations and continuously refine our models and prompts.

Accelerating GenAI experimentation with MLflow on Databricks

To meet that challenge, we turned to the Databricks Data Intelligence Platform, and specifically MLflow, to orchestrate and scale our machine learning model experimentation pipeline. While our production infrastructure is built on microservices, the experimentation and validation phases are powered by Databricks, which is where much of the value lies.

For our eFax project, we used MLflow to:

  • Define and execute parameterized jobs that sweep across combinations of OCR models, prompt templates and other hyperparameters. By allowing users to provide dynamic inputs at runtime, parameterized jobs make tasks more flexible and reusable. We manage jobs through our CI/CD pipelines, generating YAML files to configure large tests efficiently and repeatably (a minimal sketch of one such sweep follows this list).
  • Track and log experiment results centrally for efficient comparison. This gives our team clear visibility into what's working and what needs tuning, without duplicating effort. The central logging also supports deeper evaluation of model behavior across document types and referral scenarios.
  • Leverage historical data to simulate downstream outcomes and refine our models before pushing to production. Catching issues early in the testing cycle reduces risk and accelerates deployment. This is particularly important given the variety of referral forms and the need for compliance within heavily regulated EHR environments like Epic.
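
As a concrete illustration of this pattern, here is a minimal sketch of how such a parameterized sweep can be tracked with MLflow. The experiment path, the candidate lists and the `score_extraction` helper are hypothetical placeholders rather than Providence's actual code; the point is that each OCR/prompt combination becomes its own logged run, so results can be compared side by side in the MLflow UI. In the real pipeline the combinations would come from the CI/CD-generated YAML configurations described above, not hard-coded lists.

```python
import itertools

import mlflow

# Hypothetical sweep space: OCR engines and prompt templates to compare.
OCR_MODELS = ["azure-document-intelligence", "baseline-ocr"]
PROMPT_TEMPLATES = ["referral_extraction_v1", "referral_extraction_v2"]


def score_extraction(ocr_model: str, prompt_template: str) -> float:
    # Placeholder: in practice this would run OCR + LLM extraction over a
    # labeled sample of historical faxes and score it against ground truth.
    return 0.0


mlflow.set_experiment("/Shared/efax-referral-extraction")  # assumed workspace path

for ocr_model, prompt_template in itertools.product(OCR_MODELS, PROMPT_TEMPLATES):
    with mlflow.start_run(run_name=f"{ocr_model}__{prompt_template}"):
        # Log the hyperparameters that define this combination...
        mlflow.log_params({"ocr_model": ocr_model, "prompt_template": prompt_template})
        # ...and the resulting quality metric, so runs are comparable centrally.
        mlflow.log_metric("field_extraction_accuracy",
                          score_extraction(ocr_model, prompt_template))
```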

This process was inspired by our success working with Databricks on our deep learning frameworks. We've since adapted and expanded it for our eFax work and large language model (LLM) experimentation.

While we use Azure AI Document Intelligence for OCR and OpenAI's GPT-4.0 models for extraction, the real engineering accelerant has been the ability to run controlled, repeated tests through MLflow pipelines, automating what would otherwise be manual, fragmented development. With the unifying nature of the Databricks Data Intelligence Platform, we're able to transform raw faxes, experiment with different AI strategies and validate outputs with speed and confidence in one place.
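
As a rough sketch of that per-document flow, the snippet below chains the two services named above: OCR via the Azure Document Intelligence (Form Recognizer) Python SDK, followed by structured extraction via the OpenAI Python SDK. The endpoint, key, prompt, model name and output fields are assumptions for illustration only; the actual prompts and configuration are not published here. Each such run would then be wrapped in the MLflow tracking shown earlier so OCR and prompt variants can be swept and compared.

```python
import json
import os

from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential
from openai import OpenAI

# Assumed environment configuration; not from the original post.
ocr_client = DocumentAnalysisClient(
    endpoint=os.environ["DOC_INTELLIGENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["DOC_INTELLIGENCE_KEY"]),
)
llm_client = OpenAI()  # could equally be AzureOpenAI when routed through Azure

EXTRACTION_PROMPT = (
    "Extract the referral as JSON with keys: patient_name, date_of_birth, "
    "referring_provider, reason."  # illustrative prompt, not the production one
)


def extract_referral(fax_path: str) -> dict:
    # Step 1: OCR the fax image/PDF into plain text.
    with open(fax_path, "rb") as f:
        poller = ocr_client.begin_analyze_document("prebuilt-read", document=f)
    text = poller.result().content

    # Step 2: ask the LLM to turn the OCR text into structured referral fields.
    response = llm_client.chat.completions.create(
        model="gpt-4o",  # assumed model name; the post only says "GPT-4.0 models"
        messages=[
            {"role": "system", "content": EXTRACTION_PROMPT},
            {"role": "user", "content": text},
        ],
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)
```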

All extracted referral data must be integrated into Epic, which requires seamless data formatting, validation and secure delivery. Databricks plays a critical role in pre-processing and normalizing this information before handoff to our EHR system.
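
The normalization rules themselves are not described above, so the following is only a hedged sketch of what this pre-processing step might look like: enforcing required fields and standardizing formats (dates, phone numbers, whitespace) before a record is queued for secure delivery to the EHR. All field names and formats here are hypothetical.

```python
import re
from datetime import datetime

# Hypothetical field set; the actual Epic interface requirements are not public.
REQUIRED_FIELDS = {"patient_name", "date_of_birth", "referring_provider", "reason"}
DATE_FORMATS = ("%m/%d/%Y", "%m-%d-%Y", "%Y-%m-%d")


def _to_iso_date(value: str) -> str:
    # Faxes carry dates in many shapes; normalize the common ones to ISO 8601.
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {value!r}")


def normalize_referral(raw: dict) -> dict:
    """Validate and standardize LLM-extracted referral fields before EHR handoff."""
    missing = REQUIRED_FIELDS - raw.keys()
    if missing:
        raise ValueError(f"referral missing required fields: {sorted(missing)}")

    return {
        "patient_name": " ".join(raw["patient_name"].split()).title(),
        "date_of_birth": _to_iso_date(raw["date_of_birth"]),
        "referring_provider": raw["referring_provider"].strip(),
        # Keep digits only so downstream systems see a consistent phone format.
        "callback_phone": re.sub(r"\D", "", raw.get("callback_phone", "")),
        "reason": raw["reason"].strip(),
    }
```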

We also rely on Databricks for batch ETL, metadata storage and downstream analysis. Our broader tech stack includes Azure Kubernetes Service (AKS) for containerized deployment, Azure Search to support retrieval-augmented generation (RAG) workflows and Postgres for structured storage. For future phases, we're actively exploring Mosaic AI for RAG and Model Serving to enhance the accuracy, scalability and responsiveness of our AI solutions. With Model Serving, we will be in a better position to effectively deploy and manage models in real time, ensuring more consistent workflows across all our AI efforts.
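
For the RAG piece, a retrieval call against Azure Search might look roughly like the sketch below, pulling a few reference documents (for example, clinic-specific referral guidelines) to ground an extraction or triage prompt. The index name, document fields and query are invented for illustration; the text above only states that Azure Search supports these workflows.

```python
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Assumed index and credentials; not from the original post.
search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="referral-guidelines",  # hypothetical index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)


def retrieve_context(query: str, top: int = 3) -> str:
    # Fetch the top-matching guideline snippets to include in the LLM prompt.
    results = search_client.search(search_text=query, top=top)
    return "\n\n".join(doc["content"] for doc in results)  # assumes a "content" field
```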

From months of backlog to real-time triage

Ultimately, the beneficiaries of this eFax solution are our caregivers: clinicians, medical records administrators, nurses, and other frontline staff whose time is currently consumed by repetitive document processing. By removing low-value manual bottlenecks, we aim to return that time to patient care.

In some regions, faxes have sat in queues for up to two to three months without being reviewed, delays that can severely impact patient care. With AI-powered automation, we're moving toward real-time processing of over 40 million faxes annually, eliminating bottlenecks and enabling faster referral intake. This shift has not only improved productivity and lowered operational overhead but also accelerated treatment timelines, enhanced patient outcomes, and freed up clinical staff to focus on higher-value care delivery. By modernizing a historically manual workflow, we're unlocking system-wide efficiencies that scale across our 1,000+ outpatient clinics, supporting our mission to provide timely, coordinated care at scale.

Thanks to MLflow, we're not just experimenting. We're operationalizing AI in a way that's aligned with our mission, our workflows, and the real-time needs of our caregivers and patients.
