Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Steam Machine today, Steam Phones tomorrow

The Steam Controller.

It's a big deal that Valve is making a game console. But I'm beginning to think the Steam Machine may end up a footnote in gaming history. What if Valve could bring PC games not just to its own living room consoles, but also to the Arm chips that billions of people have in their phones? What if you no longer had to wait for game developers to do the hard work of porting PC games to your phone, Mac, or other Arm hardware, because games built for desktop PCs could just work?

If you wrote off the Steam Frame as yet another VR headset few will want to wear, I guarantee you're not alone. But the Steam Frame isn't just a headset; it's a Trojan ho …

Read the full story at The Verge.


Coffee and Open Source Conversation - Jeff Williams

From: Isaac Levin

Jeff Williams is the Co-Founder and CTO of Contrast Security, where he leads innovation in runtime-based application security. A pioneer in modern AppSec and co-founder of OWASP, Jeff has spent more than two decades helping organizations understand and manage software risk through instrumentation, context, and continuous learning.

You can follow Jeff on social media:
https://www.linkedin.com/in/planetlevel/
https://x.com/planetlevel

Also be sure to check out Contrast Security

PLEASE SUBSCRIBE TO THE PODCAST
Spotify: http://isaacl.dev/podcast-spotify
Apple Podcasts: http://isaacl.dev/podcast-apple
Google Podcasts: http://isaacl.dev/podcast-google
RSS: http://isaacl.dev/podcast-rss

You can check out more episodes of Coffee and Open Source on https://www.coffeeandopensource.com

Coffee and Open Source is hosted by Isaac Levin (https://twitter.com/isaacrlevin )


Highlights from Ignite 2025: How Agentic AI and Microsoft Copilot are Empowering Healthcare


Copilot in Healthcare: Microsoft 365’s AI Innovations Transforming HLS Workflows

Healthcare and Life Sciences (HLS) organizations today face intense pressures: workforce shortages, clinician burnout, rising costs, and ever-growing patient care demands. Doctors, nurses, and staff are overburdened with administrative tasks and documentation, fueling stress and attrition. The promise of AI-powered assistants, or Copilots, offers a ray of hope. At the latest Microsoft Ignite conference, Microsoft unveiled a wave of Copilot and Modern Work innovations designed to relieve these burdens, streamline workflows, and improve patient care – all while maintaining the stringent security and compliance standards healthcare demands. This report examines how Microsoft 365 Copilot and related AI announcements from Ignite are poised to transform HLS workflows, from frontline clinical documentation to back-office operations, and what it means for healthcare organizations moving forward.

 

Reinventing Healthcare Workflows with AI Copilots

Microsoft’s vision for healthcare is clear: integrate AI assistants across the continuum of care to streamline clinical and operational workflows. At Ignite, a number of key announcements underscored this vision, highlighting how Copilot and AI “agents” can tackle healthcare’s unique challenges:

  • Microsoft 365 Copilot for HLS: The AI capabilities of Microsoft 365 Copilot – which embeds generative AI into everyday apps like Outlook, Teams, Word, and Excel – are being showcased for healthcare scenarios. Care team members can use Copilot in their familiar tools to summarize information, draft communications, schedule meetings, and more using natural language. This means, for example, a hospital manager could ask Copilot, “Summarize the shift handoff notes and list any urgent action items for the team”, and get an instant, actionable summary.
  • Copilot Studio & Healthcare Agents: Perhaps the most game-changing announcement is the Healthcare agent service in Microsoft Copilot Studio, now in public preview. Copilot Studio is Microsoft’s platform for building custom AI copilots (or “agents”), and the new healthcare-specific service allows health organizations to create their own AI assistants with healthcare-tailored models and templates. These agents come with pre-built medical intelligence and comply with industry standards out-of-the-box, thanks to built-in clinical safety safeguards. In short, a provider can build a Copilot that knows healthcare workflows – from appointment scheduling bots to patient triage chatbots – without starting from scratch, and with assurance that it will handle clinical content responsibly.
  • Integrated Clinical Documentation AI: Microsoft, through its Nuance division, introduced Dragon Copilot for nurses, an ambient clinical assistant that automatically captures and generates documentation from clinician-patient encounters. This is an expansion of the AI-powered Dragon Ambient Experience that many physicians use, now tailored for nursing workflows. A nurse can converse naturally with a patient while Dragon Copilot listens in the background via a secure mobile app, then produces structured flowsheet entries, nursing notes, and concise summaries of the encounter – all without the nurse needing to type during the interaction. In addition, nurses can query the Copilot for information not typically charted (for example, “What was the fluid intake today?”) or ask clinical questions, and get answers from trusted medical sources like the FDA and MedlinePlus right at the bedside. This kind of ambient AI integration is aimed at giving clinicians more face time with patients and less screen time.
  • Security & Compliance Enhancements: Given the sensitivity of health data, Microsoft has bolstered the security, privacy, and compliance features around Copilot. New governance controls announced at Ignite 2024 and 2025 address concerns of oversharing or unauthorized data use by AI. For instance, administrators can now mark certain SharePoint sites or data repositories as off-limits to Copilot’s semantic search using Restricted Content Discovery – meaning Copilot will not surface content from those sources even if a user has access. There’s also Data Loss Prevention for Copilot in preview, which can prevent Copilot from including the contents of documents labeled as sensitive (e.g. “PHI – Protected Health Information”) in its responses. In practice, if a file containing patient data is labeled “Confidential,” Copilot will know not to draw from that file in its answers, adding an extra layer of protection against accidental disclosure. Furthermore, Microsoft Purview’s compliance suite can now detect “risky AI usage” – for example, flagging if a user’s Copilot prompt attempts to access sensitive info – allowing organizations to audit and mitigate any potential misuse. All these innovations ensure that even as AI agents become more powerful, healthcare data stays under tight control.
  • Industry Integrations: Microsoft is also integrating Copilot and AI capabilities into popular healthcare platforms and workflows. For example, leading electronic health record (EHR) providers are partnering with Microsoft so that clinicians can leverage Copilot within their EHR systems. (At Ignite, Epic Systems – a major EHR vendor – demonstrated AI assistants embedded in its software, from ambient note generation in exam rooms to an AI agent that retrieves insights from its massive clinical database, all powered by Azure OpenAI.) And beyond clinical care, Microsoft introduced pre-built “Employee Self-Service” Copilot agents for enterprise workflows that can connect into systems like Workday or ServiceNow. Imagine an HR help bot or an IT support bot that hospital staff can query right from Teams – these can now be built or deployed through Copilot Studio, further integrating AI into daily operations.

Each of these developments addresses a facet of healthcare’s needs – whether it’s reducing documentation drudgery, connecting siloed data, or enforcing privacy. Below, we dive deeper into how these Copilot capabilities benefit HLS users and provide concrete examples of their impact.

Empowering Healthcare Teams with Microsoft 365 Copilot

One of the most immediate ways AI Copilots are changing HLS is by enhancing day-to-day productivity for healthcare workers. Microsoft 365 Copilot serves as an ever-ready assistant across Outlook, Teams, Word, Excel, and more. For healthcare professionals, this means a dramatic reduction in the time spent searching, summarizing, or drafting routine content:

  • Summarizing Information & Email Drafting: Clinicians and administrators deal with a flood of information – lengthy email threads, policy documents, meeting notes – which can be time-consuming to parse. Copilot’s natural language summary capability can condense a long email chain or a multi-page report into concise bullet points in seconds. For instance, a care coordinator could ask Copilot, “Summarize the key updates from yesterday’s interdisciplinary team meeting”, and Copilot would generate a quick summary of what each department reported. It can also draft responses or documents based on prompts – like creating a first draft of an outreach email to patients about a new clinic service, which the user can then refine. This not only saves time but ensures important details aren’t overlooked.
  • Task and Schedule Management: With Copilot’s integration into Microsoft Teams Business Chat, staff can query across their calendars, emails, and chats at once. A hospital manager might use Business Chat to ask, “What are the open action items for my team this week?”, and Copilot will leverage data from Teams messages, To Do tasks, and Shift schedules to produce a tailored to-do list. In fact, Microsoft has demonstrated a Shifts plugin for Microsoft 365 Copilot that is particularly relevant to healthcare operations: it allows Copilot to check workforce scheduling data. Using the Shifts plugin, a nurse manager could simply ask, “Do we have any open shifts this weekend that still need coverage?” and Copilot will identify unfilled slots and suggest staff who could fill them. This intelligent handling of scheduling and staffing questions helps frontline managers quickly triage workforce needs, instead of manually combing through schedules.
  • Enhanced Team Communications: Keeping geographically dispersed or round-the-clock care teams on the same page is a perennial challenge. Here, Copilot can help craft and target internal communications. Copilot in Microsoft Viva Engage (the enterprise social platform) was highlighted as a tool for corporate or clinical leaders to generate engaging posts and announcements. In a healthcare context, a Chief Nursing Officer could use Copilot to draft an announcement about a new patient safety protocol – the Copilot will suggest phrasing, tone adjustments, and even relevant imagery to make the message clear and compelling. The leader can then refine the draft, and post it to reach the intended audience (for example, all nurses in the organization) with confidence that the message is well-crafted. Copilot can even analyze the engagement metrics on such posts and suggest follow-ups, making internal communication more of a data-driven dialogue than a shot in the dark.
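
To make the scheduling scenario concrete, here is a minimal, hypothetical sketch of the kind of lookup a Shifts-style plugin might perform behind such a prompt. The `Shift` type and field names are invented for illustration and are not the actual plugin API:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Shift:
    day: date
    role: str
    assigned_to: Optional[str]  # None means the shift is unfilled

def open_shifts(shifts, start, end):
    """Return unfilled shifts within the date window, the data a
    scheduling assistant would need to answer 'what needs coverage?'."""
    return [s for s in shifts if s.assigned_to is None and start <= s.day <= end]

# Example: a weekend with one covered and two uncovered shifts
schedule = [
    Shift(date(2025, 11, 22), "RN", None),
    Shift(date(2025, 11, 22), "RN", "j.smith"),
    Shift(date(2025, 11, 23), "LPN", None),
]
uncovered = open_shifts(schedule, date(2025, 11, 22), date(2025, 11, 23))
```

The real plugin reads live Shifts data through Copilot and phrases the answer in natural language; the filtering logic is the same idea.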

These capabilities are already proving valuable. Microsoft notes that frontline healthcare workers have rapidly embraced digital tools like Teams, with “strong and consistent double-digit growth in frontline Teams monthly active users in healthcare” over the past year. This sets the stage for Copilot: healthcare staff are ready for AI assistance in the same tools they’re already using daily. By reducing tedious work (like reading through threads or coordinating schedules), Microsoft 365 Copilot acts as a force-multiplier for healthcare teams. Early scenarios in hospitals show promise – from speeding up how charge nurses hand off shifts to helping quality managers prepare reports – and the technology is only improving as it learns from more usage.

Building Custom Healthcare AI Agents with Copilot Studio

While Microsoft 365 Copilot provides general productivity boosts, many healthcare providers have more specialized needs – think of a triage chatbot that guides patients, or an AI assistant that helps researchers sift through medical literature. To address this, Microsoft launched the Healthcare agent service for Copilot Studio (announced at Ignite), which empowers organizations to create their own AI copilots tailored to healthcare scenarios.

Copilot Studio is essentially a one-stop platform for building and managing AI agents (the behind-the-scenes technology that powers Copilots). With the new healthcare agent service in Copilot Studio, even organizations that don’t have a fleet of AI experts can quickly spin up an assistant that speaks the language of healthcare:

  • Pre-Built Medical Intelligence: The service provides a healthcare-specialized AI stack – including access to medical ontologies, terminologies, and even some clinical reasoning capabilities out-of-the-box. It comes with templates and reusable healthcare-specific components so that your agent can handle, for example, understanding a patient symptom query or matching it to known triage protocols, without starting from zero. This dramatically lowers the barrier to create useful healthcare bots or assistants.
  • Reusable Use-Case Templates: Microsoft has included templates for common healthcare use cases. Want to build a virtual agent to help patients schedule appointments? Or one to match eligible patients with clinical trials? These scenarios can be bootstrapped with provided templates, which can then be customized to an organization’s workflows. During Ignite, Microsoft specifically noted scenarios like appointment scheduling, clinical trial matching, and patient triage as targets for these agents. An organization could take a scheduling bot template and quickly train it on their clinic locations, provider schedules, and rules (like referral requirements), resulting in a ready-to-use “Copilot” that patients or staff can interact with to book appointments conversationally.
  • Built-In Clinical Safeguards: One of the standout features is that agents built with this service come with responsible AI safeguards specifically tuned for healthcare. Microsoft is well aware that a hallucinated answer in general business can be an annoyance, but in healthcare it can be dangerous. Thus, they’ve introduced a Clinical Safeguards API (in private preview) that developers can use to have the AI’s output double-checked. These safeguards include checks for fabrications or omissions in answers compared to the source data, medical context awareness, source provenance tracking, and even validation of clinical codes and terms. For example, if an AI agent summarizes a patient’s symptoms and suggests a diagnostic code, the system can verify that the ICD-10 code actually exists and matches the context, or if the agent answers a medical question, it can flag if the answer introduces facts not present in the trusted knowledge base. This level of scrutiny is invaluable for building trust in AI outputs. It means a healthcare Copilot can be designed to always show where it got its information (e.g. citing the medical textbook or guideline it used), and to avoid “making things up” about clinical content. The fact that these safeguards are provided by Microsoft’s platform saves health IT teams from having to engineer their own safety net for the AI.
  • Extensibility with Plugins and Data: The healthcare agent service supports integration of plugins to extend functionality. Essentially, a hospital’s AI agent can plug into other systems – whether it’s pulling real-time data from an EHR, executing an update in a scheduling system, or calling an external API. This is critical because healthcare workflows touch many systems. Microsoft indicated that whether the plugins are built by Microsoft or third parties, they can be incorporated, suggesting a growing ecosystem of healthcare connectors. Additionally, these agents can tap into the organization’s internal data through the Microsoft Graph (with proper permissions). For instance, a clinician-facing agent could retrieve information from internal SharePoint policies or a medical protocols library when answering a question. All of this happens within the organization’s secure environment.
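
The safeguard checks described above can be pictured with a small, hypothetical sketch. The real Clinical Safeguards API is in private preview and its interface is not public, so the code table and the crude word-overlap grounding check below are purely illustrative of the two ideas: validating clinical codes against an approved set, and flagging answer content not supported by the source:

```python
# Hypothetical sketch; not the Clinical Safeguards API. The code table
# and grounding heuristic are invented for illustration.
APPROVED_ICD10 = {
    "J18.9": "Pneumonia, unspecified organism",
    "I10": "Essential (primary) hypertension",
}

def validate_code(code):
    """Check that a suggested ICD-10 code exists in the approved table."""
    return code in APPROVED_ICD10

def ungrounded_sentences(answer, source):
    """Flag answer sentences with no word overlap with the source text
    (a crude stand-in for fabrication/omission detection)."""
    src_words = set(source.lower().split())
    flagged = []
    for sentence in answer.split("."):
        words = set(sentence.lower().split())
        if words and not words & src_words:
            flagged.append(sentence.strip())
    return flagged
```

A production safeguard layer would use medical context models and provenance tracking rather than word overlap, but the contract is the same: outputs are checked against trusted sources before they reach a clinician.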

What does this enable in practice? A myriad of innovative solutions. A few examples already emerging include: virtual patient assistants, which let patients conversationally ask health questions or navigate hospital services (instead of wading through portals or phone menus), and clinical support bots, which aid clinicians by fetching guidelines or doing intake Q&As with patients. Microsoft shared that major medical institutions have been early testers of these capabilities – one US health system worked with Microsoft to prototype an AI that helps patients “find the information they need, ask health-related questions, and navigate our services,” improving how patients interact with the hospital online. Another healthcare provider used the system to generate patient-friendly explanations of radiology results, allowing patients to get an easy-to-understand summary of their MRI report and even ask follow-up questions about it, with the AI tracing each answer back to the original report for transparency. These examples show how custom agents can directly improve patient experience and health literacy – all built on the Copilot Studio platform.

Crucially, all this innovation is happening with PHI (Protected Health Information) data managed safely. The healthcare Copilot agents live in the Microsoft Cloud for Healthcare environment, meaning data goes nowhere unexpected. Microsoft emphasizes that health organizations can “manage PHI data with confidence” when using these new Copilot Studio agents. The combination of powerful AI plus strict safeguards is what makes this a turning point. Healthcare providers can finally leverage generative AI at scale, not just in isolated pilot projects, but baked into their own apps and processes – and do so responsibly. As one Microsoft healthcare leader put it, these tools enable organizations to “transform their medical professional and patient experiences… in a safer way” and truly accelerate the future of healthcare innovation.

Ambient Clinical Documentation with Dragon Copilot

One of the most labor-intensive aspects of healthcare is clinical documentation. Doctors and nurses spend hours each day writing notes, filling out charts, and updating records – time not spent directly caring for patients. Nuance Dragon Ambient AI has already been a game-changer for physicians by automatically transcribing doctor-patient conversations and generating draft clinical notes (often called DAX, the Dragon Ambient Experience). Now, the Dragon Copilot experience for nurses brings similar AI assistance to nursing workflows, a boon for a segment of the workforce that has historically been under-supported by documentation tech.

Dragon Copilot for nurses, announced in late 2025, is an AI clinical assistant “designed with nurses at the helm.” It fundamentally reimagines nursing documentation by making it ambient and context-aware rather than a separate task. Here’s how it supports nursing staff:

  • Seamless Capture of Nurse-Patient Interactions: Rather than forcing nurses to remember everything and document later (or pause to type during care), Dragon Copilot turns real-time bedside conversations and observations into documentation automatically. Nurses simply speak with patients as they normally would – e.g. asking about pain levels, explaining medications – and the Copilot, running on a secure mobile device (integrated with the EHR’s mobile app), listens in the background. It transcribes and intelligently structures the relevant details into the nurse’s documentation templates. For example, if a nurse asks a patient how they’re feeling after a procedure and the patient describes their pain and mobility, Dragon Copilot will capture that and draft an entry in the flowsheet (the structured chart) under pain assessment, mobility, etc., aligning to the hospital’s existing documentation schema. Nurses can also choose to record their observations or notes verbally when not with the patient (e.g. in the hallway), and the system will interpret those too. The key is flexibility – it adapts to how each nurse prefers to work (whether narrating in room or afterwards), and it does so without requiring the nurse to invoke specific commands or fill forms; the AI figures out the context.
  • Drafting Flowsheets and Notes Automatically: Dragon Copilot produces multiple documentation outputs:
    • Flowsheet entries – These are the structured data points in the EHR (like vital signs, assessments). The Copilot generates suggested values or text for appropriate flowsheet fields from the conversation, mapping them correctly to the hospital’s specific rows/fields (even if each hospital uses a slightly different flowsheet setup). This is significant because mapping free-form speech to a precise medical record field is complex; the AI had to be trained on nursing vocabulary and workflow so it knows, for instance, that when a patient says “it’s a bit hard to breathe when I lie down,” it should populate the respiratory assessment field appropriately. Microsoft notes their model is purpose-built for nursing and can handle even ambiguous statements, achieving accuracy beyond what a general large language model could do in this niche.
    • Nurse narrative notes – If something notable happens (say a patient had an incident or a change in status), the system can draft a narrative note describing it. Nurses typically write such notes in their own words; Dragon Copilot gives them a first draft to edit, saving time on writing out long narratives.
    • Concise shift summaries – It also produces a brief summary of major findings and suggested next steps from the interaction. This helps nurses recall important details during shift hand-offs or when juggling multiple patients. For instance, it might output: “Summary: Post-op patient improving, pain 3/10 after medication, wound site dry, continue IV antibiotics, likely ready for ambulation tomorrow.” Such at-a-glance recaps keep nurses organized amid constant interruptions.
  • Real-Time Knowledge and Q&A: Dragon Copilot goes beyond documentation. Nurses can query the transcribed conversation for anything they might have missed and even ask clinical questions in the moment. For example, after talking with a patient, a nurse could ask the Copilot, “Did the patient mention any side effects from the new medication?” and it will highlight the relevant transcript section if it was mentioned. Or if a patient asks the nurse a question like, “Why do I need this CT scan?”, the nurse can quickly ask Copilot for information – the Copilot can fetch answers from trusted medical knowledge sources (e.g. FDA guidelines, MedlinePlus consumer health info) that the hospital has approved. It will deliver a brief answer the nurse can relay, like explaining the purpose of the CT in simple terms, right at the bedside. This immediate access to knowledge supports nurses in patient education and answering questions accurately without leaving the patient’s side or searching manuals.
  • Fits into Existing Workflows: A critical factor is that Dragon Copilot integrates with the hospital’s EHR systems and existing forms. It doesn’t force a new interface or new documentation style. In fact, it’s designed to work with standard nursing flowsheets and the Epic Rover mobile app (as noted by Microsoft) that many nurses already use for bedside charting. It can populate the data it captures into the correct fields automatically. Moreover, during setup it can analyze an organization’s flowsheet definitions and even suggest optimizations (for instance, identifying duplicate fields that could be streamlined). This means adopting the Copilot doesn’t require an overhaul of IT systems – it augments what’s there and can make those systems more efficient over time.
  • Nurse Oversight and Control: Recognizing that clinical documentation must be accurate, the solution is built so nurses remain in full control. Dragon Copilot does not directly write into the permanent record without approval. It presents the drafted entries, notes, and recommendations for the nurse to review, edit if needed, and accept. If something is off, the nurse can correct it before saving. The system also provides transparency: for every value it suggests (say, a blood pressure reading it heard or an assessment it inferred), the nurse can click to see the exact snippet of transcript that led to that entry. This traceability builds trust – the nurse sees why the AI thought a patient was “in moderate pain” and can confirm or adjust the phrasing. Any gaps or uncertainties in the draft are highlighted for the nurse to fill in manually, ensuring nothing critical is skipped. In short, the nurse is the ultimate editor; the AI just does the first 90% of the grunt work.
  • Secure and Scalable Deployment: Microsoft provides an admin center for Dragon Copilot where administrators can manage settings, such as which documentation templates are enabled for ambient capture, which staff have access, and monitor usage analytics. This central management is key for enterprise deployment (e.g., a health system can roll it out to a pilot unit, configure settings, and then scale up). It also ties into standard IT controls and Microsoft’s cloud security. Dragon Copilot is built on the same secure Azure infrastructure and complies with privacy requirements – all patient data audio and transcripts are handled under the hospital’s HIPAA business associate agreement with Microsoft (Azure). The tight Azure integration, as Epic’s leadership noted about these ambient solutions, means data doesn’t leave the trusted environment and clinicians can adopt AI without compromising patient privacy.
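
A toy sketch can illustrate the mapping step. Dragon Copilot uses a purpose-built model rather than keyword rules, and the field names below are invented, but the shape of the problem is the same: free-form speech in, structured draft entries out, held for nurse review:

```python
# Illustrative only: real ambient documentation uses a purpose-built
# speech model, not keyword matching. Field names are invented.
FIELD_KEYWORDS = {
    "pain_assessment": {"pain", "hurts", "ache"},
    "respiratory_assessment": {"breathe", "breathing", "wheezing"},
    "mobility": {"walk", "walking", "stand"},
}

def draft_flowsheet(transcript):
    """Map utterances to draft flowsheet entries; nothing is written to
    the record until a nurse reviews and accepts each draft."""
    drafts = {}
    for utterance in transcript:
        words = set(utterance.lower().replace(",", "").replace(".", "").split())
        for field, keywords in FIELD_KEYWORDS.items():
            if words & keywords:
                drafts.setdefault(field, []).append(utterance)
    return drafts

drafts = draft_flowsheet([
    "It's a bit hard to breathe when I lie down.",
    "The pain is about 3 out of 10 after the medication.",
])
```

The value of the real system is precisely that it goes far beyond this: handling ambiguity, hospital-specific flowsheet schemas, and provenance links back to the transcript for every suggested value.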

The impact of Dragon Copilot can be substantial. By automating documentation, it is expected to give nurses back significant time in their shifts – time they can spend on direct patient care or simply reduce overtime spent finishing notes. It can also improve documentation quality (capturing details more reliably in real-time than memory-based notes at end of shift). Ultimately, it addresses a critical driver of burnout. Nurses often cite administrative burden as a major frustration; Dragon Copilot directly targets this, aiming to make documentation “invisible” and let nurses focus on what they do best: caring for patients. Microsoft’s investment here, following success with physician ambient documentation, underscores a theme: every member of the care team should have an AI assistant to offload paperwork. As the Dragon Copilot team proclaimed, Microsoft was first to bring ambient AI to physicians and now is delivering the first solution purpose-built for nursing – all built on “decades of clinical expertise with Microsoft’s scale, security, and commitment to innovation”. This is a prime example of how modern work tools and AI are merging in healthcare to change day-to-day operations in a very human-centric way.

Ensuring Security, Compliance, and Trust in AI Solutions

In healthcare, no innovation can be adopted without absolute confidence in security and compliance. Recognizing this, Microsoft has engineered Copilot and its AI solutions to meet the high bar set by health regulations like HIPAA, as well as the internal governance policies of health organizations. From the ground up, Microsoft 365 Copilot and the new agents operate within the secure Microsoft Cloud environment that HLS organizations are already familiar with. This means all the enterprise-grade protections of Microsoft 365 apply to Copilot as well:

  • Same Compliance Boundary as M365: Microsoft 365 Copilot doesn’t use a mysterious external data store – it works with the data in your Microsoft 365 tenant (files, emails, chats) and respects all existing permissions and controls on that data. Importantly, any data that Copilot accesses or any AI-generated content stays within your Microsoft 365 compliance boundary. Microsoft confirms that Copilot is built on the foundation of enterprise-grade security, privacy, and compliance that customers expect. Features like Microsoft Purview (for data loss prevention, eDiscovery, audit logs) apply to Copilot activities just as they do to user activities. Enterprise Data Protection (EDP) ensures content remains encrypted and access-controlled. Services like Customer Lockbox (which requires explicit permission for a Microsoft engineer to ever access content, even during support) and Multi-Geo data residency options extend to Copilot-related data handling. In short, Copilot runs within your trusted cloud footprint – not in some consumer cloud – giving healthcare IT teams the assurance that there's no data exposure to unknown parties. This design is a key differentiator, especially for HIPAA considerations. Microsoft can even sign Business Associate Agreements (BAA) covering Copilot services as part of the Office 365 umbrella, meaning they contractually commit to HIPAA safeguards for any PHI processed by Copilot.
  • Data Loss Prevention (DLP) for AI: As noted earlier, Microsoft introduced Purview DLP controls specifically for Copilot. Administrators can create policies such that if a document has a certain sensitivity label (say “Health Records – Highly Confidential”), Copilot will be prevented from using that document’s content in any responses. This is an elegant solution to ensure no AI even inadvertently quotes or summarizes sensitive patient data if the organization deems it off-limits. If a clinician tries to have Copilot summarize a file containing patient identifiers and that file has a PHI label, Copilot could politely refuse because of the DLP policy. These label-based exclusions give fine-grained control – they can be scoped by user, site, or group as well. Healthcare organizations can thus enforce rules like “Copilot should never reveal info from Behavioral Health clinic files except to members of that clinic” using the same labeling framework they use for manual data handling policies.
  • Content Access Restrictions: Beyond DLP, the Restricted Content Discovery (RCD) feature of SharePoint Advanced Management allows IT to ensure Copilot won’t index or reason over specific content collections. For example, imagine a research department has a highly sensitive project. They might keep it accessible only to certain users, but there’s always a small risk an AI agent with broad reading powers could summarize something from it if one of those users asks an unrelated question. With RCD, IT can essentially tell Copilot, “Ignore Site X entirely.” The site’s content remains available to those with permission normally, but Copilot (and even global search) will act as if it’s invisible. This targeted control helps prevent oversharing in an AI-empowered environment. Microsoft announced the preview of this feature as part of ensuring organizations can adopt Copilot faster by mitigating regulatory concerns.
  • Monitoring and Risk Detection: Microsoft Purview’s Insider Risk Management now includes “risky AI usage” detections in preview. This means if someone is using Copilot or Copilot Studio in a way that might indicate improper handling of data, the system can flag it. For instance, if an employee attempted to prompt Copilot to divulge information from files they normally shouldn’t access, or if Copilot provided an answer that included sensitive data, these signals can be caught. Purview can then alert compliance officers or trigger an investigation workflow, just as it would for other insider risk activities. This level of oversight is crucial for healthcare, where monitoring accesses to patient data is a compliance requirement. It adds an accountability layer to Copilot’s usage.
  • Secure External Knowledge Integration: One open question is how to let Copilot use external medical knowledge (much of which lives on the web) without violating policy, since enabling web search would send queries out to Bing from a HIPAA-covered environment – something many organizations disallow. Microsoft’s answer is to pull external data in with Graph Connectors rather than sending queries out. For example, a hospital could use the Enterprise Websites Graph Connector to index trusted public healthcare websites – say CDC guidelines, NIH articles, or a drug database – into its own Microsoft 365 tenant. That content then becomes searchable internally by Copilot, as if it were just more organizational data, and is governed under the same compliance rules. The benefit is twofold: Copilot can draw on rich medical knowledge bases to answer clinical questions or provide medical explanations, yet no live web browsing or external data exchange happens at query time. Everything stays within the secure boundary – Copilot only references the cached copy of, say, a CDC guideline that the organization pre-approved and indexed. This approach was highlighted as a way to achieve HIPAA-compliant web content queries with Copilot: it avoids exposing any patient context to the public internet and gives full control over what external information is available to the AI. In essence, Microsoft is enabling HLS customers to “bring the world’s medical knowledge to Copilot, without letting Copilot go out into the world”. Auditability is maintained, since you can see exactly which sources Copilot used (they are your copies of, e.g., NIH articles), and compliance teams can curate the sources to ensure the AI isn’t reading dubious content.
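As a concrete sketch of this "pull external knowledge in" pattern, the Microsoft Graph connectors API lets an admin application create an external connection and push approved copies of pages into it. The payload shapes below follow that API, but the connection id, property names, and sample content are illustrative assumptions, and the schema-registration step (PATCH .../schema) is omitted for brevity:

```python
"""Sketch: indexing trusted external medical content into a tenant via
Microsoft Graph connectors, so Copilot can ground answers on cached,
pre-approved copies instead of live web queries."""
import json

GRAPH = "https://graph.microsoft.com/v1.0"

def connection_payload(conn_id: str, name: str, description: str) -> dict:
    # The external connection that will hold the cached content.
    return {"id": conn_id, "name": name, "description": description}

def item_payload(title: str, url: str, body_text: str) -> dict:
    # One indexed document: grant read access tenant-wide and store the
    # approved copy of the external page as searchable text.
    return {
        "acl": [{"type": "everyone", "value": "everyone", "accessType": "grant"}],
        "properties": {"title": title, "url": url},
        "content": {"value": body_text, "type": "text"},
    }

if __name__ == "__main__":
    conn = connection_payload(
        "cdcguidance", "CDC Guidance",
        "Approved copies of CDC guidelines for internal Copilot grounding")
    item = item_payload(
        "COVID-19 Vaccination Guidance",
        "https://www.cdc.gov/vaccines",
        "Full text of the approved guideline goes here.")
    # An admin app holding ExternalConnection.ReadWrite.OwnedBy would POST
    # `conn` to f"{GRAPH}/external/connections" and PUT each `item` to
    # f"{GRAPH}/external/connections/cdcguidance/items/<itemId>".
    print(json.dumps(conn, indent=2))
```

Because the tenant owns the indexed copies, refreshing them is a scheduled re-crawl under IT control, not an ad-hoc web request made mid-conversation.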

All these measures – and the ongoing commitment by Microsoft to add more – demonstrate that security and compliance are not afterthoughts but foundational to Copilot’s design for healthcare. The platform’s ability to enforce HIPAA safeguards, keep data residency where required, and give admins fine control means healthcare orgs can adopt AI assistance with peace of mind. As Satya Nadella (Microsoft’s CEO) and industry leaders have stressed, health AI solutions must be built in a way that “AI is not just powerful, but also safe, secure, and transparent”. Microsoft’s updates at Ignite continued to reinforce that principle.

To summarize the features and their HLS benefits, here is a quick overview:

| Innovation from Ignite | Benefit for Healthcare | Example Use Case |
| --- | --- | --- |
| Microsoft 365 Copilot in Office Apps | Automates content creation and information synthesis for busy staff. Frees up time by handling emails, summaries, and routine documentation. | A doctor asks Copilot in Outlook to draft a response to a referral request, pulling key points from an attached report. Or a clinic manager uses Copilot in Word to summarize a 20-page policy into a one-page FAQ for staff. |
| Teams Chat + Shifts Plugin | Eases operational strain by surfacing actionable insights from data (schedules, chats) in natural language. Helps manage staffing and tasks in real time. | A nursing supervisor types “Any open night shifts this week?” into Teams Copilot Chat – Copilot lists unfilled shifts and suggests available float pool nurses to call. |
| Copilot Studio – Healthcare Agent Service | Enables custom AI agents built for healthcare scenarios, with built-in medical knowledge. Accelerates digital solutions (scheduling bots, triage assistants) tailored to an organization’s needs. | Using a template, a hospital quickly creates a patient triage chatbot. Patients can describe symptoms to the bot and it advises next steps (e.g. self-care, schedule visit, go to ER) based on clinical protocols – handing off to a nurse only when necessary. |
| Clinical Safeguards API (Preview) | Ensures accuracy and safety of generative AI in clinical use. Detects hallucinations or errors in AI responses and provides traceability for medical info. | A custom oncology Copilot that summarizes clinical trial options for a patient uses safeguards to verify that any medical codes (e.g. a chemo regimen code) it mentions are valid and that every recommendation is grounded in the patient’s actual data. Any unverified content is flagged for review. |
| Nuance Dragon Copilot for Nursing | Reduces documentation burden on clinicians by ambiently capturing and generating notes. Improves quality of records and gives clinicians more face-to-face time with patients. | During a post-surgery follow-up, a nurse just converses with the patient. Dragon Copilot listens and drafts the flowsheet entries (vitals, pain assessment) and a narrative note for that visit automatically. The nurse quickly reviews and accepts the draft, instead of spending 15 minutes typing it from scratch later. |
| Advanced Data Governance for Copilot | Protects sensitive patient data and prevents AI overreach. Administrators can enforce compliance rules on what Copilot can see or share. Maintains HIPAA compliance and patient confidentiality at scale. | IT sets a DLP policy so that any document labeled “Patient Records” is excluded from Copilot’s answers, and uses Restricted Content settings to block Copilot from indexing a research project site with unpublished clinical trial data. Even if an employee has access to those, the AI will not reveal that content. |
| Integration of External Medical Knowledge | Brings world-class medical information to clinicians’ fingertips within the secure environment. Enhances decision-making and patient education without breaking compliance boundaries. | The hospital IT team uses a Graph Connector to index NIH clinical guidelines and FDA drug info into Microsoft 365. A physician asks Copilot, “What are the latest CDC guidelines for COVID boosters in pregnant patients?” and gets an up-to-date answer sourced from those internal copies of CDC content – no public web search needed. |

Conclusion: A New Era for Healthcare – AI Empowered, Human-Focused

The announcements from Microsoft Ignite showcase a healthcare future where AI copilots and agents are woven into every aspect of the industry’s workflows. What’s striking is how these tools directly target HLS pain points: from alleviating staff burnout by offloading administrative tasks, to accelerating complex data analysis for clinical decisions, to safeguarding patient data while using AI. This isn’t technology for technology’s sake – it’s aimed at measurable outcomes like reducing documentation time, speeding up information retrieval, improving care coordination, and ensuring compliance every step of the way.

For healthcare leaders, the message is both exciting and urgent. The maturity of AI solutions like Microsoft 365 Copilot means organizations can move from experimentation to implementation. In fact, some pioneering health institutions have already deployed dozens of internal AI agents through Copilot Studio to streamline processes in departments ranging from oncology to revenue cycle. These early adopters report significant time savings and more consistent outcomes, confirming that AI copilots can be “production tools” and not just pilot projects.

Microsoft is also making it easier to adopt these innovations by providing the management, governance, and partner ecosystem to support them. The introduction of Microsoft Agent 365 (the control plane for managing fleets of AI agents) and the availability of a Marketplace for AI agents mean that in the near future, healthcare organizations can not only build their own copilots but also buy ready-made healthcare agents or share their solutions securely. This opens the door to an era where common healthcare workflows (like prior authorizations, patient FAQs, documentation drafting) might be handled by proven AI agents that any hospital can deploy, with minimal setup, on their trusted Microsoft cloud environment.

Ultimately, the role of AI in HLS is evolving from a back-end assistant to a front-line collaborator. We are seeing the emergence of what Microsoft terms the “Frontier health organization” – one that is human-led and AI-powered, where every clinician or staff member has an agent helping them work smarter. It’s a future where a nurse can walk into a patient room knowing an AI is keeping track of documentation, where a doctor can consult an AI for the latest medical evidence in seconds, and where an administrator can rely on AI to monitor compliance and efficiency in the background.

Microsoft’s investments in Copilot and Modern Work for healthcare are deliberately aligning with this vision. By delivering tangible, scenario-focused AI solutions (like the nurse documentation Copilot) and combining them with the necessary guardrails, they are helping healthcare organizations confidently embrace AI. The upside is transformative: less burnout, more productivity, more informed decisions, and ultimately more time and focus for patient care.

As we stand at this inflection point, it’s clear that AI copilots are not hype – they are here and now, making a difference in healthcare. Those HLS organizations that learn to harness these tools effectively will lead the way in providing higher-quality, efficient, and patient-centered care. In the words of one industry expert, the future of work in healthcare is “human-led and agent-powered.” It’s a future where AI becomes a trusted member of the care team, and Microsoft’s latest Copilot innovations are a major step in that direction. The journey is just beginning, but the path is set: healthcare is entering a new era where embracing AI copilots will be key to overcoming our biggest challenges and delivering better outcomes for all.

 


Android 16 QPR2 is Released

Posted by Matthew McCullough, VP of Product Management, Android Developer

Faster Innovation with Android's first Minor SDK Release

Today we're releasing Android 16 QPR2, bringing a host of enhancements to user experience, developer productivity, and media capabilities. It marks a significant milestone in the evolution of the Android platform as the first release to utilize a minor SDK version.


A Milestone for Platform Evolution: The Minor SDK Release


Minor SDK releases allow us to deliver APIs and features more rapidly, outside of the major yearly platform release cadence, so that the platform and your apps can innovate faster with new functionality. Unlike major releases that may include behavior changes impacting app compatibility, the changes in QPR2 are largely additive, minimizing the need for regression testing. The few behavior changes it does include are focused on security or accessibility, such as SMS OTP protection and support for the expanded dark theme.

To support this, we have introduced new fields to the Build class as of Android 16, allowing your app to check for these new APIs using SDK_INT_FULL and VERSION_CODES_FULL.

if ((Build.VERSION.SDK_INT >= Build.VERSION_CODES.BAKLAVA) && (Build.VERSION.SDK_INT_FULL >= Build.VERSION_CODES_FULL.BAKLAVA_1)) {
    // Call new APIs from the Android 16 QPR2 release
}

Enhanced User Experience and Customization

QPR2 improves Android's personalization and accessibility, giving users more control over how their devices look and feel.

Expanded Dark Theme

To create a more consistent user experience for users who have low vision or photosensitivity, or who simply prefer a dark system-wide appearance, QPR2 introduces an expanded option under dark theme.

The old Fitbit app showing the impact of expanded dark theme; the new Fitbit app directly supports a dark theme

When the expanded dark theme setting is enabled by a user, the system uses your app's isLightTheme theme attribute to determine whether to apply inversion. If your app inherits from one of the standard DayNight themes, this is done automatically for you. If it does not, make sure to declare isLightTheme="false" in your dark theme to ensure your app is not inadvertently inverted. Standard Android Views, Composables, and WebViews will be inverted, while custom rendering engines like Flutter will not.

This is largely intended as an accessibility feature. We strongly recommend implementing a native dark theme, which gives you full control over your app's appearance: you can protect your brand's identity, ensure text is readable, and prevent the visual glitches that can occur when your UI is automatically inverted, guaranteeing a polished, reliable experience for your users.
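For an app with a custom (non-DayNight) dark theme, the opt-out described above is a single theme attribute. The theme name and parent below are hypothetical; only the isLightTheme declaration is the point of this sketch:

```xml
<!-- res/values/themes.xml – hypothetical theme for illustration -->
<style name="Theme.MyClinicApp.Dark" parent="Theme.MaterialComponents.NoActionBar">
    <!-- Declares this theme as already dark, so expanded dark theme
         will not force-invert the app's UI (attribute available on API 29+). -->
    <item name="android:isLightTheme">false</item>
</style>
```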

Custom Icon Shapes & Auto-Theming

In QPR2, users can select specific shapes for their app icons, which apply to all icons and folder previews. Additionally, if your app does not provide a dedicated themed icon, the system can now automatically generate one by applying a color filtering algorithm to your existing launcher icon.

Custom Icon Shapes

Test Icon Shape & Color in Android Studio

Automatic system icon color filtering

Interactive Chooser Sessions

The sharing experience is now more dynamic. Apps can keep the UI interactive even when the system sharesheet is open, allowing for real-time content updates within the Chooser.

Boosting Your Productivity and App Performance

We are introducing tools and updates designed to streamline your workflow and improve app performance.

Linux Development Environment with GUI Applications

The Linux development environment feature has been expanded to support running Linux GUI applications directly within the terminal environment.

Wilber, the GIMP mascot, designed by Aryeom Han, is licensed under CC BY-SA 4.0. The screenshot of the GIMP interface is used with courtesy.

Generational Garbage Collection

The Android Runtime (ART) now includes a Generational Concurrent Mark-Compact (CMC) Garbage Collector. This focuses collection on newly allocated objects, resulting in reduced CPU usage and improved battery efficiency.

Widget Engagement Metrics

You can now query user interaction events—such as clicks, scrolls, and impressions—to better understand how users engage with your widgets.

16KB Page Size Readiness

To help prepare for future architecture requirements, we have added early warning dialogs for debuggable apps that are not 16KB page-aligned.


Media, Connectivity, and Health

QPR2 brings robust updates to media standards and device connectivity.

IAMF and Audio Sharing

We have added software decoding support for Immersive Audio Model and Formats (IAMF), an open-source spatial audio format. Additionally, Personal Audio Sharing for Bluetooth LE Audio is now integrated directly into the system Output Switcher.


Health Connect Updates

Health Connect now automatically tracks steps using the device's sensors. If your app has the READ_STEPS permission, this data is available from the "android" package. Not only does this simplify the code needed for step tracking, it is also more power efficient. Health Connect can also now track weight, set index, and Rate of Perceived Exertion (RPE) in exercise segments.

Smoother Migrations

A new 3rd-party Data Transfer API enables more reliable data migration between Android and iOS devices.

Strengthening Privacy and Security

Security remains a top priority with new features designed to protect user data and device integrity.

Developer Verification

We introduced APIs to support developer verification during app installation along with new ADB commands to simulate verification outcomes. As a developer, you are free to install apps without verification by using ADB, so you can continue to test apps that are not intended or not yet ready to distribute to the wider consumer population.

SMS OTP Protection

The delivery of messages containing an SMS retriever hash will be delayed by three hours for most apps to help prevent OTP hijacking. During that window the RECEIVE_SMS broadcast is withheld and SMS provider database queries are filtered; the message becomes available to these apps once the three-hour delay elapses.
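The gating policy can be modeled conceptually as follows. This is a simplified illustration of the behavior described above, not Android platform code; the hash-detection check is a stand-in:

```python
"""Conceptual model of SMS OTP protection: messages carrying an SMS
retriever hash are withheld from non-exempt apps until three hours
after receipt. Illustration only, not Android source."""
from datetime import datetime, timedelta

HOLD = timedelta(hours=3)

def contains_retriever_hash(body: str) -> bool:
    # Stand-in check: real detection looks for an app's SMS Retriever
    # hash string embedded in the message body.
    return "OTP_HASH" in body

def visible_to_app(body: str, received_at: datetime, now: datetime,
                   app_is_exempt: bool) -> bool:
    # Exempt handlers (e.g. the default SMS app) see the message at once;
    # everyone else waits out the hold window for hash-bearing messages.
    if app_is_exempt or not contains_retriever_hash(body):
        return True
    return now - received_at >= HOLD

if __name__ == "__main__":
    t0 = datetime(2025, 12, 4, 9, 0)
    print(visible_to_app("Your code is 123456 OTP_HASH", t0,
                         t0 + timedelta(hours=1), False))  # False
    print(visible_to_app("Your code is 123456 OTP_HASH", t0,
                         t0 + timedelta(hours=4), False))  # True
```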

Secure Lock Device

A new system-level security state, Secure Lock Device, is being introduced. When enabled (e.g., remotely via "Find My Device"), the device locks immediately and requires the primary PIN, pattern, or password to unlock, heightening security. When active, notifications and quick affordances on the lock screen will be hidden, and biometric unlock may be temporarily disabled.

Get Started

If you're not in the Beta or Canary programs, your Pixel device should get the Android 16 QPR2 release shortly. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you are currently on the Android 16 QPR2 Beta and have not yet installed the Android 16 QPR3 beta, you can opt out of the program and you will then be offered the release version of Android 16 QPR2 over the air.
For the best development experience with Android 16 QPR2, we recommend that you use the latest Canary build of Android Studio Otter.
Thank you again to everyone who participated in our Android beta program. We're looking forward to seeing how your apps take advantage of the updates in Android 16 QPR2.
For complete information on Android 16 QPR2 please visit the Android 16 developer site.

Securing AI agents and tool calls

Beyond authentication: Learn how to secure your AI agent's tool calls from prompt injection by using application context and the principle of least privilege.

PostgreSQL integration

Learn how to integrate PostgreSQL with Aspire applications, using both hosting and client integrations.