Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Welcome to the Microsoft Security Community!


Protect it all with Microsoft Security 

Eliminate gaps and get the simplified, comprehensive protection, expertise, and AI-powered solutions you need to innovate and grow in a changing world. The Microsoft Security Community is your gateway to connect, learn, and collaborate with peers, experts, and product teams. Gain access to technical discussions, webinars, and help shape Microsoft’s security products. 

Get there fast 

Index
Community Calls: December 2025 | January 2026 | February 2026 

Upcoming Community Calls 

December 2025 

Dec. 2 | 9:00am | Microsoft Sentinel and Microsoft Defender XDR | Empowering the Modern SOC 

Microsoft is simplifying the SecOps experience and delivering innovation that will allow your team to scale in new ways. Join us for actionable learnings to help your team modernize your operations and enhance protection of your organization. 

Dec. 3 | 8:00am | Microsoft Defender for Identity | Identity Centric Protection in the Cloud Era 

Safeguarding identities is challenging, but Microsoft Defender for Identity offers enhanced visibility, security posture, and protection focused on identity. 

Dec. 4 | 8:00am | Microsoft Defender for Cloud | Unlocking New Capabilities in Defender for Storage 

Discover the latest Microsoft Defender for Storage updates! Explore public preview features: Cloud Storage Aggregated Events and Automated Malware Remediation for Malicious Blobs, with live demos and best practices. 

Dec. 4 | 8:00am | Security Copilot | Skilling Series: Discussion of Ignite Announcements 

Get ready for an info-packed session highlighting the latest Security Copilot breakthroughs from Ignite! Discover how powerful agents and Copilot’s seamless integration with Intune, Entra, Purview, and Defender combine to deliver unbeatable, all-around protection. 

Dec. 4 | 9:00am | Microsoft Sentinel | What’s New in the Past 6 Months 

Learn what’s new in Microsoft Sentinel! See deeper Defender integration, evolving data lake capabilities for scalable security, plus demos and real-world use cases to help you stay ahead. 

Dec. 8 | 9:00am | Microsoft Security Store | Security, Simplified: A look inside the Security Store 

Welcome to Microsoft Security Store! During this session, you’ll learn all about this centralized destination where customers can discover, deploy, and manage trusted security solutions built to extend Microsoft’s security platforms like Defender, Sentinel, Entra, Purview, and Intune. 

Dec. 9 | 8:00am | Microsoft Defender XDR | A Deep Dive into Automated Attack Disruption 

Take a deep dive into automated attack disruption in Microsoft Defender XDR and learn how it automatically contains in-progress attacks to limit their impact on your organization. 

Dec. 9 | 9:00am | Microsoft Sentinel | Part 1: Stop Waiting, Start Onboarding: Get Sentinel Defender-Ready Today 

The Microsoft Sentinel portal retires July 2026—explore the Defender unified portal! Learn to manage incidents in a unified queue, enrich investigations with UEBA and Threat Intelligence, and leverage automation and dashboards for smarter SOC operations. 

Dec. 10 | 8:00am | Azure Network Security | Deep Dive into Azure DDoS Protection 

Explore Azure DDoS Protection! Learn to secure apps and infrastructure with end-to-end architecture, detection and mitigation flow, telemetry, analytics, and seamless integration for visibility and protection. 

Dec. 10 | 9:00am | Microsoft Defender for Cloud | Expose Less, Protect More with Microsoft Security Exposure Management  

Join us for an in-depth look at how Microsoft Security Exposure Management helps organizations reduce risk by identifying and prioritizing exposures before attackers can exploit them. Learn practical strategies to minimize your attack surface, strengthen defenses, and protect what matters most. 

Dec. 11 | 8:00am | Microsoft Defender for Cloud | Modernizing Cloud Security with Next Generation Microsoft Defender for Cloud 

Discover how Microsoft Defender for Cloud simplifies multi-cloud security. Learn to streamline posture management and threat protection across Azure, AWS, and GCP, improving efficiency, reducing risk, and enabling smarter prioritization. 

Dec. 11 | 9:00am | Microsoft Sentinel data lake | Transforming data collection for AI-ready security operations with Microsoft Sentinel 

See how Microsoft Sentinel transforms multi-cloud/multiplatform data collection. Learn a unified, cloud-native approach; ingest from on-prem, Microsoft workloads, and multicloud via codeless connectors (350+; App Assure), plus the roadmap for scaling to AI-driven SecOps. 

Dec. 15 | 9:00am | Microsoft Entra | Diving into the New Microsoft Entra Agent ID

Join our first session in the Microsoft Entra Agent ID series to learn why agent identity matters, explore core concepts, and see how it fits into Microsoft’s identity ecosystem. Perfect for developers and product owners building AI agents.

Dec. 16 | 8:00am | Microsoft Defender for Office 365 | Ask the Experts: Tips and Tricks 

Engage in this interactive panel with Microsoft MVPs! Get answers to real-world Defender for Office 365 scenarios, best practices, and tips on migration, SOC optimization, Teams protection, and more. Bring your toughest questions for the live discussion. 

Dec. 16 | 9:00am | Microsoft Sentinel | Part 2: Don’t Get Left Behind: Complete Your Sentinel Move to Defender 

Prepare for the July 2026 transition! Unlock Microsoft Defender’s full potential with data onboarding, retention, governance, Content Hub, analytic rules, MTO for simplified management, and Security Copilot for AI-driven insights. 

January 2026 

Jan. 13 | 9:00am | Microsoft Sentinel | AI-Powered Entity Analysis in Sentinel’s MCP Server 

Simplify entity risk assessment with Entity Analyzer. Eliminate complex playbooks; get unified, AI-driven analysis using Sentinel’s semantic understanding. Accelerate automation and enrich SOAR workflows with native Logic Apps integration. 

Jan. 20 | 8:00am | Microsoft Defender for Cloud | What’s New in Microsoft Defender CSPM 

Cloud security posture management (CSPM) continues to evolve, and Microsoft Defender CSPM is leading the way with powerful enhancements introduced at Microsoft Ignite. This session will showcase the latest innovations designed to help security teams strengthen their posture and streamline operations. 

Jan. 22 | 8:00am | Azure Network Security | Advancing web application Protection with Azure WAF: Ruleset and Security Enhancements 

Explore the latest Azure WAF ruleset and security enhancements. Learn to fine-tune configurations, reduce false positives, gain threat visibility, and ensure consistent protection for web workloads—whether starting fresh or optimizing deployments. 

Looking for more? 

Join the Microsoft Customer Connection Program (MCCP)! As an MCCP member, you’ll gain early visibility into product roadmaps, participate in focus groups, and access private preview features before public release. You’ll have a direct channel to share feedback with engineering teams, influencing the direction of Microsoft Security products. The program also offers opportunities to collaborate and network with fellow security experts and Microsoft product teams. Join the MCCP that best fits your interests: www.aka.ms/joincommunity. 

Additional resources 

 

Read the whole story
alvinashcraft
12 seconds ago
Pennsylvania, USA

Hybrid AI Using Foundry Local, Microsoft Foundry and the Agent Framework - Part 2


Background

In Part 1, we explored how a local LLM running entirely on your GPU can call out to the cloud for advanced capabilities. The theme was: keep your data local, and pull intelligence in only when necessary. That was local-first AI calling cloud agents as needed.

This time, the cloud is in charge, and the user interacts with a Microsoft Foundry hosted agent — but whenever it needs private, sensitive, or user-specific information, it securely “calls back home” to a local agent running next to the user via MCP.

Think of it as:

  • The cloud agent = specialist doctor
  • The local agent = your health coach who you trust and who knows your medical history
  • The cloud never sees your raw medical history
  • The local agent only shares the minimum amount of information needed to support the cloud agent's reasoning

This hybrid intelligence pattern respects privacy while still benefiting from hosted frontier-level reasoning.

Disclaimer:
The diagnostic results, symptom checker, and any medical guidance provided in this article are for illustrative and informational purposes only. They are not intended to provide medical advice, diagnosis, or treatment.

Architecture Overview

The diagram illustrates a hybrid AI workflow where a Microsoft Foundry–hosted agent in Azure works together with a local MCP server running on the user’s machine. The cloud agent receives user symptoms and uses a frontier model (GPT-4.1) for reasoning, but when it needs personal context—like medical history—it securely calls back into the local MCP Health Coach via a dev-tunnel. The local MCP server queries a local GPU-accelerated LLM (Phi-4-mini via Foundry Local) along with stored health-history JSON, returning only the minimal structured background the cloud model needs. The cloud agent then combines both pieces—its own reasoning plus the local context—to produce tailored recommendations, all while sensitive data stays fully on the user’s device.

Hosting the agent in Microsoft Foundry on Azure provides a reliable, scalable orchestration layer that integrates directly with Azure identity, monitoring, and governance. It lets you keep your logic, policies, and reasoning engine in the cloud, while still delegating private or resource-intensive tasks to your local environment. This gives you the best of both worlds: enterprise-grade control and flexibility with edge-level privacy and efficiency.

Demo Setup

Create the Cloud Hosted Agent

Using Microsoft Foundry, I created an agent in the UI and picked gpt-4.1 as the model:

I provided rigorous instructions as the system prompt:

You are a medical-specialist reasoning assistant for non-emergency triage.  
You do NOT have access to the patient’s identity or private medical history.  
A privacy firewall limits what you can see.

A local “Personal Health Coach” LLM exists on the user’s device.  
It holds the patient’s full medical history privately.

You may request information from this local model ONLY by calling the MCP tool:
   get_patient_background(symptoms)

This tool returns a privacy-safe, PII-free medical summary, including:
- chronic conditions  
- allergies  
- medications  
- relevant risk factors  
- relevant recent labs  
- family history relevance  
- age group  

Rules:
1. When symptoms are provided, ALWAYS call get_patient_background BEFORE reasoning.
2. NEVER guess or invent medical history — always retrieve it from the local tool.
3. NEVER ask the user for sensitive personal details. The local model handles that.
4. After the tool runs, combine:
      (a) the patient_background output  
      (b) the user’s reported symptoms  
   to deliver high-level triage guidance.
5. Do not diagnose or prescribe medication.
6. Always end with: “This is not medical advice.”

You MUST display the section “Local Health Coach Summary:” containing the JSON returned from the tool before giving your reasoning.

Build the Local MCP Server (Local LLM + Personal Medical Memory)

The full code for the MCP server is available here, but these are the main parts:

HTTP JSON-RPC Wrapper (“MCP Gateway”)

The first part of the server exposes a minimal HTTP API that accepts MCP-style JSON-RPC messages and routes them to handler functions:

  • Listens on a local port
  • Accepts POST JSON-RPC
  • Normalizes the payload
  • Passes requests to handle_mcp_request()
  • Returns JSON-RPC responses
  • Exposes initialize and tools/list
class MCPHandler(BaseHTTPRequestHandler):
    def _set_headers(self, status=200):
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()

    def do_GET(self):
        self._set_headers()
        self.wfile.write(b"OK")

    def do_POST(self):
        content_len = int(self.headers.get("Content-Length", 0))
        raw = self.rfile.read(content_len)
        print("---- RAW BODY ----")
        print(raw)
        print("-------------------")
        try:
            req = json.loads(raw.decode("utf-8"))
        except Exception:
            self._set_headers(400)
            self.wfile.write(b'{"error":"Invalid JSON"}')
            return
        resp = handle_mcp_request(req)
        self._set_headers()
        self.wfile.write(json.dumps(resp).encode("utf-8"))
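For reference, this is the shape of the JSON-RPC 2.0 envelope the gateway receives on that POST. The small builder below is my own illustrative sketch (not part of the repo), showing the two request bodies the hosted agent sends:

```python
import json

def make_jsonrpc_request(req_id, method, params=None):
    """Build the JSON-RPC 2.0 body that gets POSTed to the MCP gateway."""
    body = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        body["params"] = params
    return json.dumps(body)

# Discovery call the hosted agent makes first:
list_body = make_jsonrpc_request(1, "tools/list")

# Then the actual tool invocation, carrying the user's symptoms:
call_body = make_jsonrpc_request(
    2,
    "tools/call",
    {"name": "get_patient_background",
     "arguments": {"symptoms": "fever and fatigue"}},
)

print(list_body)
print(call_body)
```

POST either body to the gateway with a `Content-Type: application/json` header to exercise it end to end.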

Tool Definition: get_patient_background

This section defines the tool contract exposed to Azure AI Foundry. The hosted agent sees this tool exactly as if it were a cloud function:

  • Advertises the tool via tools/list
  • Accepts arguments passed from the cloud agent
  • Delegates local reasoning to the GPU LLM
  • Returns structured JSON back to the cloud
def handle_mcp_request(req):
    method = req.get("method")
    req_id = req.get("id")

    if method == "tools/list":
        return {
            "jsonrpc": "2.0",
            "id": req_id,
            "result": {
                "tools": [
                    {
                        "name": "get_patient_background",
                        "description": "Returns anonymized personal medical context using your local LLM.",
                        "inputSchema": {
                            "type": "object",
                            "properties": {
                                "symptoms": {"type": "string"}
                            },
                            "required": ["symptoms"]
                        }
                    }
                ]
            }
        }

    if method == "tools/call":
        tool = req["params"]["name"]
        args = req["params"]["arguments"]
        if tool == "get_patient_background":
            symptoms = args.get("symptoms", "")
            summary = summarize_patient_locally(symptoms)
            return {
                "jsonrpc": "2.0",
                "id": req_id,
                "result": {
                    "content": [
                        {"type": "text", "text": json.dumps(summary)}
                    ]
                }
            }

Local GPU LLM Caller (Foundry Local Integration)

This is where personalization happens — entirely on your machine, not in the cloud:

  • Calls the local GPU LLM through Foundry Local
  • Injects private medical data (loaded from a file or memory)
  • Produces anonymized structured outputs
  • Logs debug info so you can see when local inference is running
FOUNDRY_LOCAL_BASE_URL = "http://127.0.0.1:52403"
FOUNDRY_LOCAL_CHAT_URL = f"{FOUNDRY_LOCAL_BASE_URL}/v1/chat/completions"
FOUNDRY_LOCAL_MODEL_ID = "Phi-4-mini-instruct-cuda-gpu:5"

def summarize_patient_locally(symptoms: str):
    print("[LOCAL] Calling Foundry Local GPU model...")
    payload = {
        "model": FOUNDRY_LOCAL_MODEL_ID,
        "messages": [
            {"role": "system", "content": PERSONAL_SYSTEM_PROMPT},
            {"role": "user", "content": symptoms}
        ],
        "max_tokens": 300,
        "temperature": 0.1
    }
    resp = requests.post(
        FOUNDRY_LOCAL_CHAT_URL,
        headers={"Content-Type": "application/json"},
        data=json.dumps(payload),
        timeout=60
    )
    llm_content = resp.json()["choices"][0]["message"]["content"]
    print("[LOCAL] Raw content:\n", llm_content)
    cleaned = _strip_code_fences(llm_content)
    return json.loads(cleaned)

Start a Dev Tunnel

Now we need to do some plumbing work to make sure the cloud can resolve the MCP endpoint. I used Azure Dev Tunnels for this demo.

The snippet below shows how to set that up in 4 PowerShell commands:

PS C:\Windows\system32> winget install Microsoft.DevTunnel
PS C:\Windows\system32> devtunnel create mcp-health
PS C:\Windows\system32> devtunnel port create mcp-health -p 8081 --protocol http
PS C:\Windows\system32> devtunnel host mcp-health

I now have a public endpoint: 

https://xxxxxxxxx.devtunnels.ms:8081

Wire Everything Together in Azure AI Foundry

Now let's use the UI to add a new custom MCP tool for our agent:

And point to the public endpoint created previously:

Voilà, we're done with the setup. Let's test it!

Demo: The Cloud Agent Talks to Your Local Private LLM

I am going to use a simple prompt in the agent: “Hi, I’ve been feeling feverish, fatigued, and a bit short of breath since yesterday. Should I be worried?”

Disclaimer:
The diagnostic results and any medical guidance provided in this article are for illustrative and informational purposes only. They are not intended to provide medical advice, diagnosis, or treatment.

Below is the sequence shown in real time:

Conclusion — Why This Hybrid Pattern Matters

Hybrid AI lets you place intelligence exactly where it belongs: high-value reasoning in the cloud, sensitive or contextual data on the local machine. This protects privacy while reducing cloud compute costs—routine lookups, context gathering, and personal history retrieval can all run on lightweight local models instead of expensive frontier models.

This pattern also unlocks powerful real-world applications: local financial data paired with cloud financial analysis, on-device coding knowledge combined with cloud-scale refactoring, or local corporate context augmenting cloud automation agents. In industrial and edge environments, local agents can sit directly next to the action—embedded in factory sensors, cameras, kiosks, or ambient IoT devices—providing instant, private intelligence while the cloud handles complex decision-making.

Hybrid AI turns every device into an intelligent collaborator, and every cloud agent into a specialist that can safely leverage local expertise.

References

 

Full demo repo available here.


Gaining Confidence with Az CLI and Az PowerShell: Introducing What if & Export Bicep


Ever hesitated before hitting Enter on a command, wondering what changes it might make? You’re not alone. Whether you’re deploying resources or updating configurations, the fear of unintended consequences can slow you down. That’s why we’re introducing powerful new features in Azure CLI and Azure PowerShell that preview the changes your commands may make: What if and Export Bicep.

These capabilities let you preview the impact of your commands and export them as Bicep templates, all before making any changes to your Azure environment. Think of them as your safety net: you can validate actions, confirm resource changes, and even generate reusable infrastructure-as-code templates with confidence. 

Currently, these features are in private preview, and we’re excited to share how you can get early access.

Why This Matters
  • Reduce risk: Avoid accidental resource deletions or costly misconfigurations.
  • Build confidence: Understand exactly what your command will do before execution.
  • Accelerate adoption of IaC: Convert CLI commands into Bicep templates automatically.
  • Improve productivity: Validate scripts quickly without trial-and-error deployments.

How It Works

What if preview of commands

All you have to do is add the `--what-if` parameter to Azure CLI commands, or the `-DryRun` parameter to Azure PowerShell commands, like below.

Azure CLI:

az storage account create --name "mystorageaccount" --resource-group "myResourceGroup" --location "eastus" --what-if

Azure PowerShell:

New-AzVirtualNetwork -name MyVNET -ResourceGroupName MyResourceGroup -Location eastus -AddressPrefix "10.0.0.0/16" -DryRun
Exporting commands to Bicep

To generate Bicep from a command, add the `--export-bicep` parameter together with the `--what-if` parameter. The generated Bicep code is saved under the `~/.azure/whatif` directory on your machine, and the command output states exactly where the file was saved.
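Putting the two flags together with the storage-account example from earlier (the exact console output may differ during the private preview):

```shell
# Preview the change AND emit a reusable Bicep template (private preview).
az storage account create \
  --name "mystorageaccount" \
  --resource-group "myResourceGroup" \
  --location "eastus" \
  --what-if \
  --export-bicep

# The generated .bicep file is written under ~/.azure/whatif; the command
# output prints the exact path.
ls ~/.azure/whatif
```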

Behind the scenes, AI translates your CLI command into Bicep code, creating a reusable template for future deployments. After generating the Bicep file, the CLI automatically runs a What-If analysis on the Bicep template to show you the expected changes before applying them.

Here is a video of it in action!

Here is another example where there is delete, modify and create actions happening all together.

Private Preview Access

These features are available in private preview. To sign up:

  1. Visit aka.ms/PreviewSignupPSCLI.
  2. Submit your request for access.
  3. Once approved, you’ll receive instructions to download the preview package.

Supported Commands (Private Preview)

Because these features are in preview, we have only added support for a small set of commands for the time being. Here’s a list of commands that will support these features during the private preview: 

Azure CLI

  • az vm create
  • az vm update
  • az storage account create
  • az storage container create
  • az storage share create
  • az network vnet create
  • az network vnet update
  • az storage account network-rule add
  • az vm disk attach
  • az vm disk detach
  • az vm nic remove

Azure PowerShell

  • New-AzVM
  • Update-AzVM
  • New-AzStorageAccount
  • New-AzRmStorageShare
  • New-AzRmStorageContainer
  • New-AzVirtualNetwork
  • Set-AzVirtualNetwork
  • Add-AzStorageAccountNetworkRule

Next Steps

  • Sign up for the private preview.
  • Install the packages using the upcoming script.
  • Start using --what-if, -DryRun, and --export-bicep to make safer, smarter decisions and accelerate your IaC journey.
  • Give us feedback on what you think of the feature at https://aka.ms/PreviewFeedbackWhatIf!

Thanks so much!

 

Steven Bucher

PM for Azure Client Tools

 


AI and human potential: Advancing skills, innovation, and outcomes


When empowered employees put AI skills into action, true transformation begins. Across industries and around the world, organizations are teaming up with Microsoft to help their employees learn the skills they need to put AI into practice.

Building AI skills today sets up employees and organizations to thrive in the opportunities of tomorrow. At visionary organizations, like Albertsons Companies, Casey’s, Levi Strauss, and Newell Brands, teams are moving beyond theory and weaving AI into everyday work, transforming routine tasks into opportunities for innovation and business growth.

This isn’t just about technology—it’s about people reimagining how they make an impact—and, in the process, they’re building resilience, driving growth, and creating success.

AI Skills Navigator: Guidance for your AI-powered growth

To help more people tap into this potential, we recently announced AI Skills Navigator, an agentic learning space that brings together AI-powered skilling experiences and credentials to help individuals build career skills and organizations worldwide accelerate their business. 

Forward-thinking companies Icertis, LexisNexis Risk Solutions, MTN, and Vodafone previewed AI Skills Navigator and shared their excitement about the experience. They see this learning space as a way to strengthen their learning cultures, prepare their workforces for the future, and deliver meaningful results—from better customer experiences to innovation that scales.

Empowering teams to lead with confidence

We believe skilling should always start with people, especially in a fast-changing AI landscape. We’re inspired by what teams can achieve when they feel supported, and we’re proud to work with organizations of all sizes as they help their employees grow and lead with confidence. The examples that follow highlight how enterprises across industries and regions are building future-ready teams with Microsoft skilling—and seeing real impact.

  1. The Adecco Group, a leading talent solutions and advisory firm that serves more than 100,000 companies every year, is committed to upskilling its workforce and preparing employees for the future of work. Partnering with Microsoft, The Adecco Group provides skill-building that supports 300 million candidate interactions annually and fosters a learning culture that scales AI expertise across the company.
  2. Abu Dhabi National Oil Company (ADNOC), the state-owned energy company of Abu Dhabi, United Arab Emirates, operates across the entire oil and gas value chain. ADNOC and Microsoft have committed to co-develop and deploy agents that enhance efficiency, enable autonomous operations, and reduce emissions. This collaboration includes advanced AI tools and workforce training from Microsoft, in addition to the creation of a joint innovation ecosystem to drive transformative energy solutions.
  3. Albertsons Companies, one of the largest food and drug retailers in the United States, is helping its associates be more effective with AI. According to Anuj Dhanda, Executive Vice President (EVP), Chief Technology & Transformation Officer, "At Albertsons Companies, we’re better together, whether in our stores or behind the scenes, and we embrace groundbreaking innovation that empowers each of our team members to earn customers for life. Our partnership with Microsoft is one of many transformative AI initiatives we’re implementing to unlock the potential of AI, helping our associates to be more effective, simplify their work, and build a more capable workforce to serve our 37 million customers each week."
  4. The Belgian Ministry of Defense is working with Microsoft to strengthen its capabilities through the practical application of generative AI, supporting a future-ready workforce. With AI coaching, the ministry equips its personnel with the skills to enhance digital resilience and operational effectiveness.
  5. Bupa APAC, a global leader in health insurance, health services, and aged care, keeps skilling central to its strategy evolution. The company cites AI as a critical part of its transformation but points out that technology alone isn't enough. It’s focused on its teams—building the right skills to make AI effective across the organization.
  6. Casey’s, a Fortune 500 convenience store chain and one of the largest pizza retailers in the United States, is embracing AI throughout the company. As Sanjeev Satturu, Senior Vice President (SVP) and Chief Information Officer (CIO), explains, “At Casey's, we are dedicated to fostering digital dexterity by equipping our teams with innovative AI tools. Through our strategic partnership with Microsoft, we are providing employees with the knowledge and resources to seamlessly integrate AI into their daily work, empowering them to extend their capabilities, boost productivity, and drive our evolution. By embracing AI in everyday actions, we amplify our collective potential and accelerate meaningful progress across the organization.”
  7. Commonwealth Bank of Australia (CommBank), one of Australia’s largest banks, is guided by its goal to build a brighter future for all. With Microsoft skilling offerings, the bank is enabling its teams to engage with AI, use the latest tools, and embrace new ways of working, as employees find out how AI can drive real impact. With a structured skilling approach and real-world experimentation, CommBank employees are learning to confidently apply AI to their day-to-day tasks, stay ahead, innovate, and help shape the future of banking.
  8. Danone, a world leader in specialized nutrition, is exploring new frontiers in AI transformation. Juergen Esser, Deputy Chief Executive Officer (CEO) in charge of Finance, Technology and Data, observes, “Our collaboration with Microsoft will accelerate our AI transformation, providing us with the tools, technology, and expertise to explore new frontiers in data analysis, operational efficiency, and consumer engagement. Working together is not just about technology; it’s about fostering a culture of continuous learning, innovation, and performance across our organisation."
  9. EPAM, a global leader in digital engineering and AI-powered software engineering services, is differentiating itself by focusing on Microsoft Applied Skills, moving beyond academic knowledge to real client impact. This approach demonstrates proficiency, strengthens outcomes, and provides a clear competitive advantage. With Applied Skills, EPAM fast-tracks AI readiness and project outcomes.
  10. Icertis, a leader in AI-powered contract intelligence, previewed AI Skills Navigator. Shwetambari Salgar, Learning & Organisational Development, is enthusiastic, emphasizing, “AI Skills Navigator provides an opportunity to upskill Icertis teams for the future through a unified learning platform. As a long-time Microsoft partner, we have a shared drive for innovation and continuous learning.”
  11. Koç Holding is a Fortune Global 500 company and Turkey’s largest multinational conglomerate, operating across energy, automotive, finance, consumer goods, and retail sectors. The company is taking advantage of Microsoft Applied Skills, empowering more than 6,000 employees to build future-ready skills in cloud, AI, and automation—underscoring the company’s belief that digital transformation begins with people. Koç Holding employees demonstrate their technical expertise through real-world tasks with Applied Skills, which focus on targeted, scenario-based learning and validation. These staff members are turning their new skills into impact, solving business challenges and driving agility.
  12. Lenovo, a global technology powerhouse focused on delivering smarter technology for all, is expanding its collaboration with Microsoft to upskill an additional 4,200 employees this year, following the successful training of more than 3,000 to date. What began as a small pilot has evolved into one of Lenovo’s largest learning communities, reinforcing the company’s position as a leading force in the AI revolution.
  13. Levi Strauss & Co., a global apparel company, is building the foundation to apply AI across its operations. As Karen Scholl, Vice President, Office of the Chief Digital & Technology Officer, explains, “We’re empowering our people to tap into the power of AI to unlock new possibilities, reimagine how we work, and accelerate our evolution into a best-in-class, direct-to-consumer retailer. Our partnership with Microsoft isn’t just about technology, but it’s about building the right foundation to embed AI across everything we do. From product design to store operations, we’re building and equipping every team member with the tools to drive innovation and growth, ensuring Levi’s continues to thrive for another 170 years. By leveraging Microsoft’s technological expertise, we’re redefining what’s possible for the future of retail.”
  14. LexisNexis Risk Solutions provides data analytics and technology to help organizations assess risk, ensure compliance, and prevent fraud. Sarah Fabius, Technology Optimization Program Manager, notes that the company is dedicated to equipping its workforce “with cutting-edge AI skills that fuel innovation in risk and data analytics technology. Through our partnership with Microsoft and the integration of AI Skills Navigator, we’re ensuring our teams have the tools and expertise to lead in a rapidly evolving industry.”
  15. MTN is Africa’s largest mobile network operator. Paul Norman, Group Chief Human Resources Officer, shares his perspective on AI Skills Navigator, noting, “At MTN, we believe in developing an AI-fluent organisation that empowers every employee to adopt and utilise AI responsibly and ethically. Microsoft AI Skills Navigator provides an exciting new learning innovation that can help demystify AI through its learner-centric and human-first approach to grow and develop an agile, resilient, and AI-inspired workforce of the future!"
  16. National Australia Bank (NAB), one of Australia’s largest financial institutions, is advancing its strategic commitment to AI, with more than 100 initiatives underway and thousands of employees already applying generative AI to deliver real business value. The bank emphasizes responsible AI, inclusive innovation, and strong leadership. Through a joint effort with Microsoft, 600 women have received AI training, reinforcing NAB’s vision of a diverse, future-ready workforce equipped to lead in the age of AI.
  17. Newell Brands, a leading consumer products company with a portfolio of iconic brands, is empowering its employees with Microsoft technology and skilling. As Chris Peterson, President and CEO, points out, “At Newell Brands, we see AI as a catalyst for creativity, productivity, and transformation. Through the Microsoft AI experience training series and access to Microsoft 365 Copilot, we’re empowering our employees with technology that helps them work smarter, boost productivity, and turn ideas into action faster. This is about more than adopting new technology—it’s about fueling innovation, driving efficiency, and creating lasting value across our business.”
  18. NTT DATA is a global IT services and consulting company that provides digital, cloud, and business solutions to help organizations innovate and transform. The company is building its workforce skills in generative AI in close collaboration with Microsoft and other leading partners. The company intends to train 200,000 of its employees by fiscal year 2027 and, as of October 2025, had already trained 70,000 of them. This initiative empowers staff members to apply generative AI independently, driving innovation in daily tasks and creating new value across the organization.
  19. OP Pohjola, Finland’s largest provider of financial services, is fueling transformation in the financial sector by embedding continuous learning into its culture. In a joint effort with Microsoft, the company is ensuring that every employee has the opportunity to build AI skills and can apply them responsibly every day. This commitment to world-class learning strengthens customer service, fosters innovation, and helps shape the future of banking.
  20. Ricoh, a global technology company that provides digital services, printing solutions, and workplace innovation, believes that real transformation starts with people. The company is advancing AI adoption across its teams in the Asia-Pacific region, focusing on skill-building so team members can use new tools with confidence. Ricoh and Microsoft worked together to deliver AI Learning Week, which gave employees the opportunity to explore ways to apply AI meaningfully and responsibly and to reimagine how they work, enhancing human potential.
  21. Vodafone, a global telecommunications company, looks to the exciting possibilities of AI Skills Navigator. As Steve Garley, Senior Manager Technical and Digital Skills, explains, “AI Skills Navigator’s learner-first design aligns well with an AI-first approach and Vodafone’s ambition for readiness in the AI era. We see potential in Microsoft’s [skilling experience] to transform traditional training models and deliver role-based, just-in-time upskilling that drives real business outcomes.”

 

From retail to telecom and more, these organizations prove that empowered teams can spark innovation and lasting growth. Their experiences highlight a truth worth remembering—transformation starts with people. When employees feel confident in their skills, they create success that extends well beyond daily tasks. If you’re ready to empower your workforce with the full potential of AI skilling, get your teams started with AI Skills Navigator.


Fully Optimized: Wrapping up Performance Spotlight Week

Posted by Ben Weiss, Senior Developer Relations Engineer and Sara Hamilton, Product Manager




We spent the past week diving deep into best practices and guidance that help make Android apps faster, smaller, and more stable. From the foundational powers of the R8 optimizer and Profile Guided Optimizations, to performance improvements with Jetpack Compose, to a new guide on levelling up your app's performance, we've covered the low-effort, high-impact tools you need to build a performant app.

This post serves as your index and roadmap to revisit these resources whenever you need to optimize. Here are the five key takeaways from our journey together.

Use the R8 optimizer to speed up your app

The single most impactful, low-effort change you can make is fully enabling the R8 optimizer. It doesn't just reduce app size; it performs deep, whole-program optimizations that fundamentally rewrite your code for efficiency. Revisit your Keep Rules and bring R8 back into your engineering workflow.
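Enabling R8 comes down to a small change in your release build type. A minimal sketch of the Gradle setup (Kotlin DSL; note that on Android Gradle Plugin versions before 8.0 you may also need `android.enableR8.fullMode=true` in `gradle.properties`, since full mode only became the default in AGP 8.0):

```kotlin
// app/build.gradle.kts — minimal sketch of enabling R8 for release builds
android {
    buildTypes {
        release {
            isMinifyEnabled = true      // turns on R8 shrinking and optimization
            isShrinkResources = true    // also strips unused resources
            proguardFiles(
                // the -optimize variant of the default file enables R8's code optimizations
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }
}
```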


Our newly updated and expanded documentation on the R8 optimizer is here to help.


Reddit observed a 40% faster cold startup and 30% fewer ANR errors after enabling R8 full mode.

You can read the full case study on our blog.




Engineers at Disney+ continually invest in app performance to optimize the user experience. Sometimes even seemingly small changes can make a huge impact. While inspecting their R8 configuration, the team found that the -dontoptimize flag was being used. After enabling optimizations by removing this flag, the Disney+ team saw significant improvements in their app's performance.
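If your project carries a legacy ProGuard configuration, it is worth checking for flags that silently disable R8's work. A sketch of the kind of cleanup the Disney+ story describes (file contents are illustrative):

```
# proguard-rules.pro (before)
# This single legacy flag disabled all of R8's code optimizations:
-dontoptimize

# proguard-rules.pro (after)
# Flag removed — R8 can now perform whole-program optimization.
# Keep only the rules your code actually needs.
```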




So next time someone asks you what you could do to improve app performance, just link them to this post.


Read more in our Day 1 blog: Use R8 to shrink, optimize, and fast-track your app

Guiding you to better performance


Baseline Profiles effectively remove the need for Just-in-Time compilation, improving startup speed, scrolling, animation, and overall rendering performance. Startup Profiles make app startup even more lightweight by bringing an intelligent order to your app's classes.dex files.
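Baseline Profiles are generated by an instrumented test that records your critical user journeys. A minimal sketch using `BaselineProfileRule` from Jetpack Macrobenchmark (the package name is a placeholder, and this runs on a device rather than the JVM):

```kotlin
// Sketch of a Baseline Profile generator; the package name is hypothetical.
@RunWith(AndroidJUnit4::class)
class StartupProfileGenerator {
    @get:Rule
    val rule = BaselineProfileRule()

    @Test
    fun generate() = rule.collect(packageName = "com.example.app") {
        // Record the code paths exercised during a cold start.
        pressHome()
        startActivityAndWait()
    }
}
```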


And to learn more about just how important Baseline Profiles are for app performance, read Meta's engineering blog where they shared how Baseline Profiles improved various critical performance metrics by up to 40% across their apps.


We continue to make Jetpack Compose more performant for you in Jetpack Compose 1.10. Features like pausable composition and the customizable cache window are crucial for maintaining zero scroll jank when dealing with complex list items. Take a look at the latest episode of #TheAndroidShow where we explain this in more detail.


Read more in Wednesday's blog: Deeper Performance Considerations

Measuring performance can be as easy as 1, 2, 3


You can't manage what you don't measure. Our Performance Leveling Guide breaks down your measurement journey into five steps, starting with easily available data and building up to advanced local tooling.

Starting at level 1, we’ll teach you how to use readily available data from Android Vitals, which provides you with field data on ANRs, crashes, and excessive battery usage.


We’ll also teach you how to level up. For example, we’ll demonstrate how to reach level 3 with local performance testing using Jetpack Macrobenchmark and the new UiAutomator 2.4 API to accurately measure and verify any change in your app's performance.
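A cold-startup benchmark with Jetpack Macrobenchmark looks roughly like this (a sketch assuming a hypothetical `com.example.app` target; it runs as an instrumented test on a device):

```kotlin
@RunWith(AndroidJUnit4::class)
class StartupBenchmark {
    @get:Rule
    val benchmarkRule = MacrobenchmarkRule()

    @Test
    fun coldStartup() = benchmarkRule.measureRepeated(
        packageName = "com.example.app",          // hypothetical target app
        metrics = listOf(StartupTimingMetric()),  // reports time to initial display
        iterations = 5,
        startupMode = StartupMode.COLD
    ) {
        pressHome()
        startActivityAndWait()
    }
}
```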


Read more in Thursday's blog

Debugging performance just got an upgrade


Advanced optimization shouldn't mean unreadable crash reports. New features are designed to help you confidently debug R8 and background work:

Automatic Logcat Retrace

Starting in Android Studio Narwhal, stack traces can automatically be de-obfuscated in the Logcat window. This way you can immediately see and debug any crashes in a production-ready build.

Narrow Keep Rules

On Tuesday we demystified the Keep Rules needed to fix runtime crashes, emphasizing specific, member-level rules over overly broad wildcards. And because it's an important topic, we made you a video as well.
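The difference between a broad and a narrow Keep Rule is easy to see side by side (class and member names here are hypothetical):

```
# Overly broad: keeps every class and member in the package,
# blocking most of R8's shrinking and optimization.
-keep class com.example.model.** { *; }

# Narrow, member-level alternative: keep only the constructor and
# field that are actually accessed via reflection.
-keepclassmembers class com.example.model.User {
    <init>();
    java.lang.String name;
}
```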

And with the new lint check for wide Keep Rules, the Android Studio Otter 3 Feature Drop has you covered here as well.

We also released new guidance on testing and troubleshooting your R8 configuration to help you get the configuration right with confidence.



Read more in Tuesday's blog: Configure and troubleshoot R8 Keep Rules

Background Work

We shared guidance on debugging common scenarios you may encounter when scheduling tasks with WorkManager.

Background Task Inspector gives you a visual representation and graph view of WorkManager tasks, helping you debug why scheduled work is delayed or has failed. And our refreshed Background Work documentation landing page highlights task-specific APIs that are optimized for particular use cases, helping you achieve more reliable execution.
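One common reason work looks "delayed" in the Background Task Inspector is an unmet constraint. A minimal sketch (the `SyncWorker` is hypothetical) that enqueues work gated on network connectivity:

```kotlin
// Hypothetical worker; WorkManager runs it only once its constraints are met.
class SyncWorker(context: Context, params: WorkerParameters) :
    Worker(context, params) {
    override fun doWork(): Result = Result.success()
}

val request = OneTimeWorkRequestBuilder<SyncWorker>()
    .setConstraints(
        Constraints.Builder()
            .setRequiredNetworkType(NetworkType.CONNECTED) // work waits while offline
            .build()
    )
    .build()

WorkManager.getInstance(context).enqueue(request)
```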


Read more in Wednesday's blog: Background work performance considerations

Performance optimization is an ongoing journey

If you successfully took our challenge to enable R8 full mode this week, your next step is to integrate performance into your product roadmap using the App Performance Score. This standardized framework helps you find the highest leverage action items for continuous improvement.

We capped off the week with the #AskAndroid Live Q&A session, where engineers answered your toughest questions on R8, Profile Guided Optimizations, and more. If you missed it, look for the replay!


Thank you for joining us! Now, get building and keep that momentum going.





Antigravity and Firebase MCP accelerate app development
