
Patient Referral Document Summarization using Azure OpenAI


Introduction 

This article explores how the healthcare industry can utilize Generative AI, Large Language Model (LLM) evaluation metrics, and Machine Learning to streamline the patient referral process. Delays in reviewing referral documents can impact patient outcomes, making timely diagnosis and treatment crucial. Generative AI summarization enables hospitals to condense referral documents efficiently, speeding up patient admission and diagnosis while reducing physician review time. LLM evaluation metrics help establish trust in the summarization pipeline, enhancing physician confidence in using AI-generated content.

 

Imagine the following scenario:

In a busy hospital setting, Dr. Jon Doe, a specialist, receives a referral for a critically ill patient. Typically, Dr. Jon Doe would spend days reading over multiple referral documents, trying to gather pertinent information. However, with the implementation of Generative AI Summarization powered by Azure technologies, Dr. Jon Doe receives a concise summary of the patient's medical history, symptoms, and relevant tests within minutes. This accelerated process allows Dr. Jon Doe to promptly assess the situation and initiate lifesaving treatment, ultimately improving patient outcomes. With the assurance provided by LLM evaluation metrics, Dr. Jon Doe trusts the AI-generated summary, confident that it captures all essential details accurately. This scenario illustrates how AI and ML technologies revolutionize healthcare, enabling faster decision-making and better patient care.  

 

Architecture:

[Architecture diagram]

Use Case Workflow:

 

1. Data Sources  

Efficient patient referral systems in healthcare rely on access to diverse sources of patient information, encompassing handwritten notes and digitized data from various sources. These include Electronic Health Record (EHR) systems, enabling secure electronic transmission of referral documents, and Direct Secure Messaging facilitated by Health Information Service Providers (HISPs). Health Information Exchange (HIE) networks allow for the seamless sharing of patient data among different healthcare entities, while patient portals offer secure document exchange between patients and providers. Adoption of interoperability standards like HL7 or FHIR enhances data exchange between systems, complemented by the integration of telemedicine platforms for secure document sharing. Hospitals within integrated healthcare systems benefit from centralized patient information management. Analyzing these diverse data sources provides physicians with comprehensive patient health insights, guiding treatment decisions prior to admission.  

 

2. Data Ingestion and Staging  

Data ingestion is the process of moving data from different sources to a specific location. This process requires using specific connectors for each data source and target location.  

Azure Health Data Services enables rapid exchange of data through FHIR APIs, backed by a managed PaaS offering in the cloud. This service makes it easy for anyone working with health data to ingest, manage, and persist Protected Health Information (PHI).
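
As a concrete illustration of this FHIR-based exchange, here is a minimal sketch that queries referral documents from a FHIR endpoint. The service URL, the choice of the DocumentReference resource, and the LOINC filter are illustrative assumptions rather than part of the original workflow; it relies on the azure-identity and requests packages.

```python
# Minimal sketch: pull referral DocumentReference resources from an
# Azure Health Data Services FHIR endpoint (URL below is hypothetical).
import requests
from azure.identity import DefaultAzureCredential

FHIR_URL = "https://<workspace>-<fhirservice>.fhir.azurehealthcareapis.com"

# Acquire an access token scoped to the FHIR service.
credential = DefaultAzureCredential()
token = credential.get_token(f"{FHIR_URL}/.default").token

# Query referral notes (LOINC 57133-1); adjust the filter to your data.
response = requests.get(
    f"{FHIR_URL}/DocumentReference",
    params={"type": "http://loinc.org|57133-1"},
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
response.raise_for_status()
bundle = response.json()
print(f"Found {bundle.get('total', 0)} referral documents")
```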

Azure Data Factory offers a comprehensive range of connectors that can be leveraged to extract data from various sources, including databases, file systems, and cloud services. Health documents, usually in the form of PDFs, can be ingested using Azure Data Factory and placed in a secure, private container in Azure Data Lake Storage.

Logic Apps offer a visual designer to automate workflows and connect applications, data, and services across on-premises and cloud environments. By incorporating Logic Apps into the workflow, organizations can streamline data movement and processing tasks, further enhancing the efficiency and reliability of the data ingestion process.  
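
While this copy is typically orchestrated by an Azure Data Factory pipeline or a Logic App, the sketch below shows the equivalent staging step performed with the Python SDK: landing a referral PDF in a private ADLS Gen2 filesystem. The account URL, filesystem name, and file paths are assumptions.

```python
# Minimal sketch: stage a referral PDF in a private ADLS Gen2 filesystem.
# Account URL, filesystem name, and paths are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
filesystem = service.get_file_system_client("referrals")

# Upload the incoming document into a "raw" landing zone for later processing.
with open("patient_123_referral.pdf", "rb") as pdf:
    file_client = filesystem.get_file_client("raw/patient_123_referral.pdf")
    file_client.upload_data(pdf, overwrite=True)
```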

 

3. Generative AI and Machine Learning  

Azure Machine Learning Studio is a cloud-based service offering a user-friendly graphical user interface (GUI) for constructing and operationalizing machine learning workflows within the Azure environment. Tailored to streamline the entire machine learning lifecycle, it facilitates tasks ranging from data preparation to model deployment and management, providing an integrated environment for building, training, and deploying machine learning models.

 

a. Data Extraction with Azure AI Document Intelligence   

Referral documents, generally in unstructured PDF format, are securely stored in a private container within Azure Data Lake Storage (ADLS), potentially necessitating encryption at rest to safeguard sensitive information. In the workflow, an Azure ML Studio Python notebook retrieves documents from ADLS and invokes the Azure AI Document Intelligence API, whose OCR (optical character recognition) capabilities convert referral PDFs into text, facilitating easier retrieval and summarization in subsequent steps.
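
A minimal sketch of this extraction step is shown below: it downloads a referral PDF from ADLS and runs it through Document Intelligence's prebuilt read (OCR) model using the azure-ai-formrecognizer SDK. Endpoint names, keys, and file paths are placeholders.

```python
# Minimal sketch: OCR a referral PDF from ADLS with Azure AI Document Intelligence.
# Endpoints, keys, and file paths below are hypothetical placeholders.
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# 1. Retrieve the PDF from the private ADLS container.
lake = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
pdf_bytes = (
    lake.get_file_system_client("referrals")
    .get_file_client("raw/patient_123_referral.pdf")
    .download_file()
    .readall()
)

# 2. Run OCR with the prebuilt "read" model to get plain text.
doc_client = DocumentAnalysisClient(
    endpoint="https://<doc-intelligence>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<document-intelligence-key>"),
)
poller = doc_client.begin_analyze_document("prebuilt-read", document=pdf_bytes)
referral_text = poller.result().content  # text used in the summarization steps
print(referral_text[:500])
```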

b. Data Manipulation for Summarization  

Preparing data is a vital step in utilizing Generative AI foundational models for summarization. Prior to summarizing referral documents, it's essential to cleanse the text by eliminating irrelevant characters like special symbols, punctuation marks, or HTML tags that might hinder the summarization process. Furthermore, expanding abbreviations and acronyms to their full forms is crucial to maintain clarity and coherence in the summarization output. Additionally, tokenizing the text into smaller units, such as words or sub-words, helps create a structured input suitable for the OpenAI model.  
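
The sketch below illustrates these preparation steps on the OCR output: stripping stray markup and symbols, expanding abbreviations via a small map (a hypothetical example, not a clinical dictionary), and checking the token count with tiktoken before chunking.

```python
# Minimal sketch: cleanse OCR text and check its token count before summarization.
# The abbreviation map is a tiny illustrative example, not a clinical dictionary.
import re
import tiktoken

referral_text = "<p>Pt presents with hx of chest pain...</p>"  # output of the OCR step

ABBREVIATIONS = {"pt": "patient", "hx": "history", "dx": "diagnosis"}

def clean_referral_text(text: str) -> str:
    text = re.sub(r"<[^>]+>", " ", text)           # remove stray HTML tags
    text = re.sub(r"[^\w\s.,;:()/-]", " ", text)   # drop special symbols
    words = [ABBREVIATIONS.get(w.lower(), w) for w in text.split()]
    return " ".join(words)

cleaned_text = clean_referral_text(referral_text)

# The token count guides how the document is chunked for the model.
encoding = tiktoken.get_encoding("cl100k_base")
print(len(encoding.encode(cleaned_text)), "tokens")
```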

 

 c. Prompt Engineering & Summarization  

To achieve precise summarization using Large Language Models (LLMs), prompt engineering is indispensable. Jinja templates offer a convenient way to craft these prompts, and the templates can be stored in ADLS. Users have the flexibility to refine various prompts used to invoke Azure OpenAI Chat Completion models, thereby generating customized summaries tailored to different types of referral documents.
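
A minimal sketch of this prompt-templating step follows: it renders a Jinja template and calls an Azure OpenAI chat deployment. The endpoint, API version, and deployment name are assumptions, and in the workflow described here the template would be loaded from ADLS rather than defined inline.

```python
# Minimal sketch: render a Jinja prompt and call an Azure OpenAI chat deployment.
# Endpoint, key, API version, and deployment name are hypothetical placeholders.
from jinja2 import Template
from openai import AzureOpenAI

cleaned_text = "patient presents with history of chest pain ..."  # from the cleaning step

PROMPT = Template(
    "Summarize the following patient referral for the admitting physician.\n"
    "Focus on: {{ focus_areas }}.\n\nReferral:\n{{ referral_text }}"
)
prompt = PROMPT.render(
    focus_areas="medical history, symptoms, and relevant tests",
    referral_text=cleaned_text,
)

client = AzureOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com/",
    api_key="<aoai-key>",
    api_version="2024-02-01",
)
response = client.chat.completions.create(
    model="gpt-4o-referral-summary",  # hypothetical chat deployment name
    messages=[{"role": "user", "content": prompt}],
    temperature=0.2,
)
summary = response.choices[0].message.content
print(summary)
```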

LangChain is an open-source orchestration framework designed to streamline building applications with LLMs. By leveraging LangChain's summarization chain, alongside Azure OpenAI models and summary prompts, one can efficiently generate the required summaries for referral documents in the desired format.
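
As a sketch of that orchestration (module paths shift between LangChain releases, so treat the imports as illustrative), a long referral can be chunked and summarized with a map-reduce summarization chain:

```python
# Minimal sketch: map-reduce summarization of a long referral with LangChain.
# Deployment names and endpoints are placeholders; import paths vary by version.
from langchain.chains.summarize import load_summarize_chain
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_core.documents import Document
from langchain_openai import AzureChatOpenAI

cleaned_text = "patient presents with history of chest pain ..."  # from the cleaning step

llm = AzureChatOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com/",
    api_key="<aoai-key>",
    api_version="2024-02-01",
    azure_deployment="gpt-4o-referral-summary",  # hypothetical deployment
    temperature=0.2,
)

# Split the referral into chunks, summarize each, then combine the results.
splitter = RecursiveCharacterTextSplitter(chunk_size=4000, chunk_overlap=200)
docs = [Document(page_content=chunk) for chunk in splitter.split_text(cleaned_text)]

chain = load_summarize_chain(llm, chain_type="map_reduce")
summary = chain.invoke({"input_documents": docs})["output_text"]
print(summary)
```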

 

 d. LLM Evaluation using Azure AI SDK  

Developing an evaluation strategy is crucial for instilling trust among users, stakeholders, and the broader hospital community in the output generated through summarizing referral documents using Generative AI Large Language Models (LLMs).   

The Azure AI SDK offers a solution, providing both out-of-the-box AI-assisted metrics, such as Groundedness, Coherence, Fluency, and Relevance, and the capability to create custom AI-assisted metrics tailored to business and stakeholder requirements.

These standardized and customized metrics ensure that the generated summaries adhere to quality standards, evaluating coherence, relevance, and groundedness. Moreover, they enable researchers and developers to compare different models or prompt variations to pinpoint the most effective one. Evaluation also gauges the generalization ability of LLMs across unseen data formats, showcasing their resilience and practical applicability. Ultimately, insights gathered from evaluation processes can guide iterative enhancements to LLM architectures, training methodologies, or prompt fine-tuning strategies, thereby elevating summarization performance.
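
The sketch below shows what scoring a single summary might look like with the built-in AI-assisted evaluators. It assumes the current azure-ai-evaluation package and a GPT-based judge deployment; earlier Azure AI SDK versions expose similar metrics under different module names, and evaluator keyword arguments vary by release.

```python
# Minimal sketch: score one generated summary with built-in AI-assisted evaluators.
# Assumes the azure-ai-evaluation package; evaluator signatures vary by SDK version.
from azure.ai.evaluation import (
    CoherenceEvaluator,
    FluencyEvaluator,
    GroundednessEvaluator,
    RelevanceEvaluator,
)

model_config = {
    "azure_endpoint": "https://<aoai-resource>.openai.azure.com/",  # hypothetical
    "api_key": "<aoai-key>",
    "azure_deployment": "gpt-4o-judge",  # hypothetical judge deployment
}

referral_text = "patient presents with history of chest pain ..."       # source document
summary = "62-year-old patient with chest pain; troponin elevated ..."  # generated summary
query = "Summarize the patient referral for the admitting physician."

scores = {
    "groundedness": GroundednessEvaluator(model_config)(context=referral_text, response=summary),
    "relevance": RelevanceEvaluator(model_config)(query=query, response=summary),
    "coherence": CoherenceEvaluator(model_config)(query=query, response=summary),
    "fluency": FluencyEvaluator(model_config)(response=summary),
}
print(scores)
```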

  

e. LLM Evaluation Dashboard for Performance Monitoring in Azure AI Studio  

Azure AI Studio is a comprehensive platform tailored for developers, facilitating the creation of generative AI applications within an enterprise-grade environment. With direct interaction capabilities via the Azure AI SDK and Azure AI CLI, developers can seamlessly engage with project code-first. The platform is designed to be inclusive, accommodating developers of varying abilities and preferences, fostering innovation in AI and influencing future advancements. Through Azure AI Studio, users can effortlessly explore, construct, evaluate, and deploy cutting-edge AI tools and machine learning models while adhering to responsible AI practices. Additionally, the platform fosters collaborative development, providing enterprise-grade security and a shared environment for team collaboration, enabling easy access to pre-trained models, data, and computational resources.  

 

Azure AI Studio incorporates a Large Language Model (LLM) evaluation dashboard, which improves the user experience by enabling streamlined monitoring. This dashboard simplifies the tracking of crucial out-of-the-box and custom AI-assisted metrics essential for evaluation purposes. Developers and data scientists can use this dashboard to monitor these metrics continuously, enabling them to effectively evaluate and respond to the performance and reliability of referral document summaries generated by GenAI LLM models.

 

4. Analytical Workload   

The extracted summaries from health documents can be stored in resilient analytics systems such as Azure Synapse Analytics, Azure Data Lake, or Azure Cosmos DB, capitalizing on their robust database capabilities. The extensive output produced in Azure Machine Learning (AML) notebooks, covering results from the Large Language Model (LLM) evaluation workflow, is securely archived in any of the recommended storage solutions. This deliberate storage strategy not only ensures the secure preservation of information but also facilitates future retrieval, in-depth analysis, and comparative assessment of the generated outputs. This approach underscores a seamless and comprehensive data management strategy, contributing to the efficiency and reliability of operations within the Azure ecosystem.
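
As one concrete option, the sketch below persists a generated summary and its evaluation scores as a document in Azure Cosmos DB using the azure-cosmos SDK. The account, database, container, and partition key are assumptions.

```python
# Minimal sketch: persist a summary and its evaluation scores in Azure Cosmos DB.
# Account URL, database/container names, and the partition key are hypothetical.
import uuid
from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://<cosmos-account>.documents.azure.com:443/",
    credential="<cosmos-key>",
)
container = client.get_database_client("referrals").get_container_client("summaries")

container.upsert_item(
    {
        "id": str(uuid.uuid4()),
        "patientId": "patient_123",                         # partition key value
        "sourceDocument": "raw/patient_123_referral.pdf",   # ADLS path of the original PDF
        "summary": "62-year-old patient with chest pain ...",  # LLM-generated summary
        "evaluation": {"groundedness": 5, "coherence": 5, "fluency": 4, "relevance": 5},
    }
)
```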

 

5. End-user Consumption  

Physicians can view the summaries in their portal: instead of going through hundreds of documents, they read a concise summary, streamlining the patient referral process and enabling faster admission for the patient. If needed, web apps and Power BI visualizations can also be built on top of the stored summaries.

 

Components 

  • Azure Data Lake: Limitless storage with easy integration and enterprise-grade security.  
  • LangChain: Open-source framework for language model-powered applications.  
  • Power BI: Business analytics with rich connectors and visualization.  
  • Power Apps: Rapid development of custom business apps connecting to data sources.  
  • Azure AI Studio: Simplifies AI application development in the cloud.  
  • Key Vault: Key management solution for securing sensitive data.  
  • Azure Monitor: Comprehensive monitoring solution for cloud environments.  

 

 

Potential Use Cases:  

The integration of Generative Artificial Intelligence (GenAI) summarization has the potential to bring transformative benefits across various industries. Here are alternative use cases in different sectors:

 

  • Finance: Streamline financial report analysis for quicker decision-making.  
  • Retail: Summarize product reviews to understand customer sentiments rapidly.  
  • Manufacturing: Analyze quality control reports for efficient issue identification.  
  • Education: Automatically summarize educational content for quicker comprehension.  
  • Customer Service: Summarize emails to improve response times and communication.  
  • Human Resources: Streamline resume screening for efficient talent acquisition. 
  • Legal: Summarize complex legal documents for faster information extraction.  

 

Contributors 

This article is maintained by Microsoft. It was originally written by the following contributors. 

Principal authors:  

  • Manasa Ramalinga | Principal Cloud Solution Architect – US Customer Success  
  • Abed Sau | Senior Cloud Solution Architect – US Customer Success  

Microsoft OneNote now available on Apple Vision Pro


Today, we are introducing the newest member of the OneNote family on the Apple Vision Pro. We have worked closely with Apple for many years to bring these experiences to iPhone, iPad, and Mac. Now, with Apple Vision Pro, OneNote makes use of the infinite canvas of spatial computing and can appear side-by-side with other great Microsoft apps like Word, Excel, and Teams at any scale for incredible multitasking.

 

You can plan trips, practice daily habits, and create/edit your task list, all in spatial reality – the OneNote experience on the Apple Vision Pro helps you stay productive, no matter where you are. 

 

[Screenshot showing OneNote on the Apple Vision Pro]

 

Overview 

OneNote for Apple Vision Pro is a native app that supports many of the features available on OneNote for iPad. These include: 

  • Write memos, make a digital notebook, or jot down notes. 
  • Highlight can’t-miss notes with Important and To Do tags. 
  • Secure your notes with a password and control permissions when sharing with others. 
  • Sync your notes to the cloud (OneDrive, OneDrive for Business, and SharePoint), making it easy to access your notes anywhere. 
  • Share ideas and your notes with friends and colleagues. 

You can go hands-free or use a keyboard and trackpad; just pair them over Bluetooth. 

To learn more about Apple Vision Pro, watch this video from Apple.

 

How to install Microsoft OneNote on Apple Vision Pro 

To install OneNote on your Apple Vision Pro, follow these steps: 

  1. From your Apple Vision Pro, go to the App Store and search for Microsoft OneNote. 
  2. Tap Get. 
  3. Enter your Apple ID and password, if prompted, to start the download. 
  4. Once the app is installed, tap to launch it. 
  5. Sign in with your Microsoft personal, work, or school account. 

To update Microsoft OneNote, return to the App Store and follow the steps above. 

Tip: Enable automatic updates from Settings > App Store. Under App Updates, slide the Automatic Downloads toggle to On.

 

[Screenshot showing OneNote on the Apple Vision Pro]

 

Features coming soon 

Some OneNote features are not yet available on Apple Vision Pro: 

  • Insert from camera & photos (coming soon) 
  • Copilot (coming soon) 
  • Two-factor authentication (Microsoft Authenticator is not currently available for the Apple Vision Pro. We're working on it, though, so check back for updates soon.) 

 

What types of accounts are supported? 

OneNote for Apple Vision Pro only supports personal accounts and work accounts that are not managed by your organization. If your work account is managed by your organization, you will be unable to sign in. 

 

Related topics 

Use Microsoft Teams on Apple Vision Pro 

 


Azure OpenAI offering models - Explain it Like I'm 5

1 Share

It's been almost two years since our Azure OpenAI Service was released, an Azure AI service that enables customers to leverage the power of state-of-the-art Generative AI models. These models have become ubiquitous in our daily lives, from Copilots and ChatGPT providing productivity boosts for our daily tasks, all the way to AI agents that apply artificial intelligence to solve business tasks with amazing speed and accuracy. The common thread behind these experiences is the model interaction, which for many enterprise customers has become paramount to their business applications. No matter the use case, the models are the engine providing astoundingly intelligent responses to our questions.

 

In the Azure OpenAI service, which provides Azure customers access to these models, there are fundamentally 2 different levels of service offerings:

 

1. Pay-as-you-go, priced based on usage of the service

2. Provisioned Throughput Units (PTU), fixed-term commitment pricing  

 

Given the two options, you would probably gravitate toward pay-as-you-go pricing; this is a logical conclusion for customers just starting to use these models in proof-of-concept or experimental use cases. But as customer use cases become production-ready, the PTU model becomes the obvious choice.

 


 

If you think about the Azure OpenAI service as analogous to a freeway, the service helps facilitate cars (requests) travelling to the models and ultimately back to their original location. The funny thing about highways and interstates, like standard pay-as-you-go deployments, is that you cannot control who is using the highway at the same time as you, which is akin to the shared utilization we all experience during peak hours of the day. We all have a posted speed limit, like rate limits, but may never reach the speed we expect due to the factors mentioned above. Moreover, if you manage a fleet of vehicles (vehicles we can think of as service calls), all using different parts of the highway, you also cannot predict which lane you get stuck in. Some may luckily find the fast lane, but you can never prevent the circumstances ahead in the road. That is the risk we take when using the highway, but the tolls (token-based consumption) give us the right to use it whenever we want. While some high-demand times are foreseeable, such as rush hour, there are also phantom slowdowns with no rhyme or reason as to why they occur. Therefore, your estimated travel time (response latency) can vary drastically based on the different traffic scenarios that can occur on the road. 

 

 

[Image: a Boring Company Loop tunnel]

Provisioned throughput (PTUs) is more analogous to The Boring Company's Loop than anything else. Unlike public transportation with predefined stops, the Loop provides a predictable estimate of the time it will take to arrive at your destination because there are no scheduled stops: you travel directly to your destination. Provisioned throughput, like the Loop, is a function of how many tunnels (capacity), stations (client-side queuing), and vehicles (concurrency) you can handle at any one time. During peak travel times, even with the queued wait to get into the Loop at your first station (time to first token), you may arrive at your destination (end-to-end response time) faster than by taking the highway, because the speed limit is conceptually much higher with no traffic. This makes provisioned throughput much more advantageous, if and only if we implement a new methodology in our client-side retry logic compared with how we've handled it previously. For instance, if you have a tolerance for longer per-call latencies, adding only a little latency in front of the call (time to first token) and exploiting the retry-after-ms value returned with the 429 response, you can define how long you are willing to wait before you redirect traffic to pay-as-you-go or other models. This implementation also ensures you get the highest possible throughput out of PTUs.
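
As a sketch of that client-side pattern (the deployment names and endpoint below are hypothetical; retry-after-ms is the header Azure OpenAI returns with 429 responses), the logic might look like this:

```python
# Minimal sketch: prefer a PTU deployment, honor the 429's retry-after-ms if it is
# within our latency budget, otherwise spill over to a pay-as-you-go deployment.
# Deployment names and endpoint are hypothetical placeholders.
import time
from openai import AzureOpenAI, RateLimitError

client = AzureOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com/",
    api_key="<aoai-key>",
    api_version="2024-02-01",
    max_retries=0,  # handle retries ourselves so we control the fallback
)

MAX_WAIT_MS = 2000  # extra time-to-first-token we are willing to absorb

def chat(messages):
    try:
        return client.chat.completions.create(model="gpt-4o-ptu", messages=messages)
    except RateLimitError as err:
        # The PTU deployment is saturated; the 429 tells us when capacity frees up.
        retry_after_ms = int(err.response.headers.get("retry-after-ms", MAX_WAIT_MS + 1))
        if retry_after_ms <= MAX_WAIT_MS:
            time.sleep(retry_after_ms / 1000)
            return client.chat.completions.create(model="gpt-4o-ptu", messages=messages)
        # Outside our latency budget: redirect this call to pay-as-you-go.
        return client.chat.completions.create(model="gpt-4o-paygo", messages=messages)

print(chat([{"role": "user", "content": "Hello"}]).choices[0].message.content)
```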

In summary, for Azure OpenAI use cases that require predictable, consistent, and cost-efficient usage of the service, the Provisioned Throughput Unit (PTU) offering becomes the most reasonable solution, especially for business-critical production workloads. 

 


Project Gutenberg expands access to literature by creating audiobooks with AI

1 Share

Project Gutenberg is the world’s first digital library. For over 50 years, it has been steadfast in its commitment to provide free, unfettered access to digitized literature for everyone.


Run entirely by volunteers and driven purely by its mission to share public domain literature and information, Project Gutenberg stands out for how closely it hews to its altruistic vision, from creating e-books to audiobooks.


Now, they are using AI text-to-speech capabilities from Microsoft to accelerate the progress of creating an extensive audiobook library. They've already created over 5,000 audiobooks using AI, and through this process, they've demonstrated how cost-effective AI can be to help build a world where literature can easily be shared.


Link to full episode transcript.


Produced by Larj Media.





Download audio: https://audio-delivery.cohostpodcasting.com/audio/7d408271-1f41-4337-9cdd-ac45711d73e7/episodes/9d1b4347-bf09-4b41-b1cb-6aa41cd64bd8/episode.mp3

Do NOT update your Visual Studio solutions to .slnx just yet!

From: Martin Zikmund
Duration: 4:41

In the last video I introduced the new .slnx file format that is available to try out in the latest preview of Visual Studio 2022. While that's something to be very enthusiastic about, you should not rush to update your production solutions to it just yet. Let me explain why and how you can try it out in a safe way.

#visualstudio #dotnet #dev #coding

Contents:
0:00 - Intro
0:39 - The risks of updating right now
2:30 - Trying it out safely
3:52 - Summary


Windows Photos adds new Microsoft Designer integration

Hello Windows Insiders, today we are introducing the ability to send an image from the Photos app to Microsoft Designer, a cloud-based graphic design app that helps you create professional-quality social media posts, invitations, digital postcards, graphics, and new visuals with the power of generative AI. This update is available on Windows 11 to start and is currently limited to users in the US, UK, Australia, Ireland, India, and New Zealand. We are starting to roll this out to Insiders in the Canary and Dev channels.

Photos app integration with Designer

We're adding a new way to unlock the creative potential of your images through Designer. Simply click the Designer button in the title bar when viewing a single image in the Photos app, and your image will open in Designer in your default web browser.

[Screenshot: Send an image from Photos to Designer.]

Once in Designer, you can apply templates, add visuals, and use AI tools to help you personalize your photo further. For example, try using AI to generate a new custom background for your photo using these simple steps:
  • With your photo selected, click on “Remove background” in the toolbar.
  • Click Visuals in the left pane and select Generate to create a custom image background from a prompt.
  • Click on the desired generated image to insert it on the canvas.
  • From the toolbar, select Set as background to place the generated image behind your photo.
[Screenshot: Designer editing experience showing swapping the background.]

To get the latest Photos experience, update your app to version 2024.11040.16001.0 or higher. Sign in with a free personal Microsoft account to try Designer's generative AI features, currently in preview, and to share your creations. We appreciate your comments and suggestions, so please share your feedback with us!

FEEDBACK: Please file feedback in Feedback Hub (WIN + F) under Apps > Photos.

Thanks,
Mala Srivatsa, Senior Product Manager – Windows Photos