Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Microsoft admits Windows 10’s extended updates are causing issues, MSMQ won’t work


Microsoft has confirmed what we heard from our readers: the December 2025 cumulative update (KB5071546) for Windows 10 has a bug that breaks Microsoft Message Queuing (MSMQ). The issue mostly affects businesses, as consumer PCs typically do not have MSMQ installed or enabled, and consumer apps don't depend on it.

2025-12 Cumulative Update for Windows 10 Version 22H2 for x64-based Systems (KB5071546)

For those unaware, MSMQ (Microsoft Message Queuing) is a component that allows apps or services to send messages to a queue so another app or service can process them later.

In most cases, it is used by enterprise apps to manage background tasks. If the MSMQ component stops working, those background tasks simply stop running and can block the primary app or website.
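As a rough illustration of the pattern MSMQ provides, here is a toy producer/consumer sketch using Python's in-process standard-library queue. This is purely an analogy (MSMQ is a durable, OS-level service accessed from .NET via System.Messaging), but it shows why a broken queue stalls background work:

```python
import queue
import threading

# A producer enqueues work; a background worker drains it later.
tasks: "queue.Queue[str]" = queue.Queue()
processed = []

def worker():
    while True:
        msg = tasks.get()
        if msg is None:            # sentinel: shut the worker down
            break
        processed.append(f"handled:{msg}")
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()

for m in ("invoice-1", "invoice-2"):
    tasks.put(m)                   # the app keeps going; the worker catches up

tasks.put(None)
t.join()
```

If the queue itself fails, as MSMQ does after this patch, producers either block or error out, which is exactly the cascade affected businesses are reporting.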

If your apps rely on Message Queuing (MSMQ), you might also be unable to open apps or sites hosted in IIS (Internet Information Services), which is commonly used to host apps and sites locally. Again, as I mentioned, a casual user would not do any of this, but if you're affected, you'll see the following error.

System.Messaging.MessageQueueException: Insufficient resources to perform operation.

How Microsoft broke MSMQ with Windows 10 December 2025 update

Windows Latest understands that the December 2025 update (KB5071546), which is only for ESU (Extended Security Update)-enrolled PCs, likely changed how MSMQ works, affecting the component's security behaviour. Microsoft also updated NTFS permissions on the folder where MSMQ stores queue data, which, according to our tests, is the following:

C:\Windows\System32\MSMQ\storage

After the patch, the account that uses MSMQ now needs write access to that storage folder. In many real setups, MSMQ is accessed by IIS app pool identities, LocalService or NetworkService, or a locked-down service account that doesn’t have write permission there.

As a result, MSMQ can't create or write its message files, and it starts failing.

One user told Windows Latest that their queues refuse to connect. You'll also see queues stuck in an "inactive" state, which confirms something is off. Unfortunately, restarting the service or the server itself, or even reinstalling Message Queuing (MSMQ) from optional features in Control Panel, does not fix it.

However, if you manually navigate to the Windows Update history and remove Windows 10 KB5071546, MSMQ will start working again. We’re also seeing reports of the issue appearing on Windows Server 2019, but Windows Latest tests could not reproduce it on Windows Server 2022.

“I also noticed that the NTFS-Security-Descriptor gets changed from D:P to D:PAI. The AI-Flag (auto-inherited) seems that the DACLs get modified or changed. That could lead to Users like iis_iusrs / localservice /networkservice to be not allowed anymore on this folder,” one of the affected users wrote in a Microsoft forum post spotted by Windows Latest.
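The D:P versus D:PAI difference that user describes lives in the folder's SDDL (Security Descriptor Definition Language) string: P marks the DACL as protected, while AI marks it auto-inherited, meaning inherited entries can rewrite who has access. As a minimal, illustrative sketch (not how Windows itself parses SDDL), detecting those control flags looks like this:

```python
def dacl_flags(sddl: str) -> set[str]:
    """Return the DACL control flags (P, AI, AR) from an SDDL string.

    Flags appear between the "D:" marker and the first ACE, e.g.
    "D:PAI(A;;FA;;;SY)" carries P (protected) and AI (auto-inherited).
    """
    dacl = sddl.split("D:", 1)[1]       # portion after the DACL marker
    prefix = dacl.split("(", 1)[0]      # flag characters precede the first ACE
    flags = set()
    for flag in ("AR", "AI", "P"):      # match two-letter flags before "P"
        if flag in prefix:
            flags.add(flag)
            prefix = prefix.replace(flag, "")
    return flags
```

Under this reading, the patch adding the AI flag is consistent with inherited ACEs displacing the explicit grants that accounts like IIS_IUSRS, LocalService, and NetworkService previously relied on.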

Another system admin who installed KB5071544 on Windows Server 2019 also experienced similar issues, including MSMQ "insufficient disk space or memory" errors.

“Correct, my IIS apps that require MSMQ to function completely stop, and my monitor records it as a 500 error,” one user explained.

Here’s what the error “Insufficient resources to perform operation” looks like when you’re affected by MSMQ issues.

System.Messaging.MessageQueueException: Insufficient resources to perform operation.

Microsoft confirms MSMQ failure as a known issue in the latest update for Windows 10 and Windows Server

Microsoft has confirmed that it's investigating issues with Message Queuing (MSMQ) after installing Windows 10 KB5071546, but it has nothing more to share at this time.

“After installing this update, users might face issues with the Message Queuing (MSMQ) functionality. This issue also impacts clustered MSMQ environments under load,” Microsoft noted in an update to the support document spotted by Windows Latest.

This issue does not affect Windows 11, so it's isolated to Windows 10, which is odd because the operating system is on extended security support and barely receives any noticeable changes. It makes you wonder how Microsoft can break older features in Windows even when the OS is no longer being developed for new features.

How to fix MSMQ issues in Windows 10

If you run into problems after the update, you will need to uninstall Windows 10 KB5071546 and pause updates while Microsoft works on a fix.

The post Microsoft admits Windows 10’s extended updates are causing issues, MSMQ won’t work appeared first on Windows Latest.


493: Git's most powerful but underutilized tool


In this episode of 'Merge Conflict,' James and Frank dive into the intricacies of Git work trees, exploring how they revolutionize local machine development by allowing developers to manage multiple branches simultaneously. Frank initially struggles to grasp the concept, but James breaks down the functionality, explaining how work trees enable parallel branching and commit management. From managing complex code branches to optimizing lighting setups, this episode is packed with insights that are both educational and entertaining, making it a must-listen for developers and creatives alike.


Machine transcription available on http://mergeconflict.fm

Download audio: https://aphid.fireside.fm/d/1437767933/02d84890-e58d-43eb-ab4c-26bcc8524289/ad4d743b-6222-4f3f-bfaa-c6ffee2c8294.mp3

When Az PowerShell Gets Weird: How to Clean Up Duplicate Modules Without Breaking Anything

The companion code for this blog can be found here. One thing you may have noticed is that I have been in PowerShell a bit more frequently of late. The backstory: I had a machine that suddenly stopped responding well to PowerShell, and I spent quite a bit of time cleaning everything up. Whenever plagued with PowerShell issues, I run through an extensive list of troubleshooting steps, and I figured it might be a good time to share some of what's helped...


Introducing GPT-5.2 in Microsoft Foundry: The new standard for enterprise AI


The age of AI small talk is over. Enterprise applications demand more than clever chat. They require a reliable, reasoning partner capable of solving the most ambiguous, high-stakes problems, including planning multi-agent workflows and delivering auditable code.

Azure is the foundation for solving these challenges. Today, OpenAI's GPT-5.2 is generally available in Microsoft Foundry, introducing a new frontier model series purpose-built to meet the needs of enterprise developers and technical leaders, and setting a new standard for a new era.

GPT-5.1 vs. GPT-5.2: Key upgrades for developers to know

The GPT-5.2 series introduces deeper logical chains, richer context handling, and agentic execution that produces shippable artifacts: design docs, runnable code, unit tests, and deployment scripts can be generated with fewer iterations. The series is built on a new architecture, delivering superior performance, efficiency, and reasoning depth compared to prior generations. It's also trained on the proven GPT-5.1 dataset and further enhanced with improved safety and integrations. GPT-5.2 leaps beyond previous models with substantial performance improvements across core metrics.

Today, we’re shipping GPT-5.2 and GPT-5.2-Chat. Each is greatly improved over its predecessor, and together they excel at everyday professional work.

GPT-5.2: The most advanced reasoning model, solving harder problems more effectively and with more polish. An example is information work, where strong reasoning is now complemented by better communication skills and improved formatting in spreadsheet and slideshow creation.

GPT-5.2-Chat: A powerful yet efficient workhorse for everyday work and learning, with clear improvements in info-seeking questions, how-to’s and walk-throughs, technical writing, and translation. It’s also more effective at supporting studying and skill-building, as well as offering clearer job and career guidance.

Why GPT-5.2 sets a new standard for enterprise AI

For long-term success in complex professional tasks, teams need structured outputs, reliable tool use, and enterprise guardrails. GPT‑5.2 is optimized for these agent scenarios within Foundry’s enterprise-grade platform, offering a consistent developer experience across reasoning, chat, and coding.

  • Multi-Step Logical Chains: Decomposes complex tasks, justifies decisions, and produces explainable plans.
  • Context-Aware Planning: Ingests large inputs (project briefs, codebases, meeting notes) for holistic, actionable output.
  • Agentic Execution: Coordinates tasks end-to-end, across design, implementation, testing, and deployment, reducing iteration cycles and manual oversight.
  • Safety and Governance: Enterprise-grade controls, managed identities, and policy enforcement for secure, compliant AI adoption.

GPT-5.2’s deep reasoning capabilities, expanded context handling, and agentic patterns make it the smart choice for building AI agents that can tackle long-running, complex tasks across industries, including financial services, healthcare, manufacturing, and customer support.

  • Analytics and Decision Support: Useful for wind tunneling scenarios, explaining trade-offs, and producing defensible plans for stakeholders.
  • Application Modernization: Make rapid progress in refactoring services, generating tests, and producing migration plans with risk and rollback criteria.
  • Data and Pipelines: Audit ETL, recommend monitors/SLAs, and generate validation SQL for data integrity.
  • Customer Experiences: Build context-aware assistants and agentic workflows that integrate into existing apps.

The results? Agents that maintain reliability through complex, multi-step workflows while producing structured, auditable outputs that scale confidently in Microsoft Foundry.

GPT-5.2 deployment and pricing

Pricing (USD $ / million tokens):

Model          Deployment                 Input    Cached Input   Output
GPT-5.2        Standard Global            $1.75    $0.175         $14.00
GPT-5.2        Standard Data Zones (US)   $1.925   $0.193         $15.40
GPT-5.2-Chat   Standard Global            $1.75    $0.175         $14.00
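For budgeting purposes, per-request cost from these rates is straightforward to estimate. A minimal sketch using the Standard Global GPT-5.2 figures (rates should always be verified against current Azure pricing):

```python
# USD per million tokens, Standard Global GPT-5.2 rates from the table above.
RATES = {"input": 1.75, "cached_input": 0.175, "output": 14.00}

def request_cost(input_toks: int, cached_toks: int, output_toks: int) -> float:
    """Estimate USD cost of one request from its token counts."""
    return (
        input_toks * RATES["input"]
        + cached_toks * RATES["cached_input"]
        + output_toks * RATES["output"]
    ) / 1_000_000
```

Note that cached input is billed at a tenth of the regular input rate, so prompt caching dominates savings for agent workloads that resend large, stable contexts.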

The post Introducing GPT-5.2 in Microsoft Foundry: The new standard for enterprise AI appeared first on Microsoft Azure Blog.


Azure Storage innovations: Unlocking the future of data


Microsoft is redefining what’s possible in the public cloud and driving the next wave of AI-powered transformation for organizations. Whether you’re pushing the boundaries with AI, improving the resilience of mission-critical workloads, or modernizing legacy systems with cloud-native solutions, Azure Storage has a solution for you.

At Microsoft Ignite 2025 and KubeCon North America last month, we showcased the latest innovations in Azure Storage, powering your workloads. Here is a recap of those releases and advancements.

Innovating for the future with AI

Azure Blob Storage provides a unified storage foundation for the entire AI lifecycle, powering everything from ingestion and preparation to checkpoint management and model deployment.

To enable customers to rapidly train, fine-tune, and deploy AI models, we evolved the Azure Blob Storage architecture to scale and deliver exabytes of capacity, tens of Tbps of throughput, and millions of IOPS to GPUs. In this video, you can see a single storage account scaling to over 50 Tbps of read throughput. Azure Blob Storage is also the foundation that enables OpenAI to train and serve models at unprecedented speed and scale.

Fig 1. Storage-centric view of AI training and fine-tuning

For customers handling terabyte or petabyte scale AI training data, Azure Managed Lustre (AMLFS) is a high-performance parallel file system delivering massive throughput and parallel I/O to keep GPUs continuously fed with data. AMLFS 20 (preview) supports 25 PiB namespaces and up to 512 GBps throughput. Hierarchical Storage Management (HSM) integration enhances AMLFS scalability by enabling seamless data movement between AMLFS and your exabyte-scale datasets in Azure Blob Storage. Auto-import (preview) allows you to pull only required datasets into AMLFS, and auto-export sends trained models to long-term storage or inferencing.

Rakuten is accelerating the training of Japanese large language models on Microsoft Azure, leveraging Azure Managed Lustre, Azure Blob Storage, and Azure Kubernetes Service to maximize GPU utilization and simplify scaling.

Natalie Mao, VP, AI & Data Division, Rakuten Group

Once models are trained and fine-tuned, inferencing takes center stage delivering real-time predictions and insights. Azure Blob Storage provides best-in-class storage for Microsoft AI services, including Microsoft Foundry Agent Knowledge (preview) and AI Search retrieval agents (preview), enabling customers to bring their own storage accounts for full flexibility and control, ensuring that enterprise data remains secure and ready for retrieval-augmented generation (RAG).

Additionally, Premium Blob Storage delivers consistently low latency and up to 3x faster retrieval performance, critical for RAG agents. For customers that prefer open-source AI frameworks, Azure Storage built the LangChain Azure Blob Loader, which delivers granular security, memory-efficient loading of millions of objects, and up to 5x faster performance compared to prior community implementations.

Fig 2. Storage-centric view of AI inference with enterprise data

Azure Storage is evolving into an integrated, intelligent, AI-driven platform that simplifies management of exabyte-scale AI data. Storage Discovery and Copilot work together to help you analyze and understand how your data estate is evolving over time, using dashboards and natural-language questions. With Storage Discovery and Storage Actions, you can optimize costs, protect your data, and govern large datasets with hundreds of billions of objects used for training and fine-tuning.

Optimizing modern applications with Cloud Native

Modern cloud-native applications demand agility. Two principles consistently stand out: elasticity and flexibility. Your storage should scale seamlessly with dynamic workloads—without operational overhead. The innovations below are designed for the cloud, enabling you to auto-scale, optimize costs intelligently, and deliver the performance needed by modern applications.

Azure Elastic SAN provides cloud-native block storage at scale, with tight Kubernetes integration for fast scaling and multi‑tenancy that optimizes cost. With new auto-scaling support, Elastic SAN automatically expands resources as needed, making it easier to manage storage footprints across workloads. Early next year, we’ll extend Kubernetes integration via Azure Container Storage for Azure Kubernetes Service (AKS) to general availability (GA). These enhancements let you maintain familiar hosting environments while layering in cloud-native capabilities.

Cloud-native agility is also critical for modern applications built on object storage, with the need to optimize costs and performance for dynamic and unpredictable traffic patterns. Smart Tier (preview) on Azure Blob Storage continuously analyzes access patterns, moving data between tiers automatically.

New data starts in the hot tier. After 30 days of inactivity, it moves to cool, and after 90 days, to cold. If an object is accessed again, it’s promoted back to hot which keeps data in the most cost-effective tier automatically. You can optimize costs without sacrificing performance, simplifying data management at scale and keeping your focus on building.
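The lifecycle described above is easy to reason about as a pure policy function. A minimal, illustrative sketch of that tiering logic (the 30/90-day thresholds come from the description above; the function name and shape are assumptions for this sketch, not Azure's implementation):

```python
from datetime import datetime, timedelta

# Thresholds from the Smart Tier behaviour described above.
HOT_TO_COOL_DAYS = 30
COOL_TO_COLD_DAYS = 90

def pick_tier(last_access: datetime, now: datetime) -> str:
    """Return the tier an object should sit in, given its last access time."""
    idle = now - last_access
    if idle >= timedelta(days=COOL_TO_COLD_DAYS):
        return "cold"
    if idle >= timedelta(days=HOT_TO_COOL_DAYS):
        return "cool"
    return "hot"   # recently accessed data stays (or is promoted back) hot
```

The key property is that access resets the clock: reading a cold object makes its idle time zero, so the same policy naturally promotes it back to hot.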

Hosting mission-critical workloads

Enterprises today run mission-critical workloads that require block storage delivering predictable performance and uncompromising business continuity. Azure Ultra Disk is our highest-performance block storage offering, purpose-built for workloads like high-frequency trading, ecommerce platforms, transactional databases, and electronic health record systems that demand exceptional speed, reliability, and scalability.

“With Azure Ultra Disk, we can confidently scale our platform globally, knowing that performance and resilience will meet enterprise expectations. That consistency allows our teams to focus on AI innovation and workflow automation rather than infrastructure.”

Charles McDaniels, Director of Systems Engineering Management for Global Cloud Services, ServiceNow

We know performance, cost, and business continuity remain the top priorities for our customers and we are raising the bar in every category:

  • Performance: We have further improved the average latency for Azure Ultra Disk by 30%, with average latency well under 0.5 ms for small IOs on virtual machines (VMs) with Azure Boost. A single Azure Ultra Disk can deliver industry-leading performance of 400K IOPS and 10 GBps throughput. In addition, with Ebsv6 VMs, both Premium SSD v2 and Azure Ultra Disk can deliver industry-leading VM performance scale of 800K IOPS and 14 GBps throughput for the most demanding applications.
  • Cost: Flexible provisioning for Azure Ultra Disk reduces total cost of ownership by up to 50%, letting you scale capacity, IOPS, and MBps independently at finer granularity.
  • Business continuity: Instant Access Snapshots (preview) lets you backup and restore your workloads instantly with exceptional performance on rehydration. This differentiated experience for Azure Premium v2 and Ultra Disk helps eliminate the operational overhead of monitoring snapshot readiness or pre‑warming resources, while reducing recovery, refresh, and scale‑out times from hours to seconds.

Azure NetApp Files (ANF) is designed to deliver low latency, high performance, and data management at scale. Its large volumes capabilities have been significantly expanded providing an over 3x increase in single volume capacity scale to 7.2 PiB and a 4x increase in throughput to 50 GiBps. Cache volumes bring data and files closer to where users need rapid access in a space efficient footprint. These make ANF suitable for several high-performance computing workloads such as Electronic Design Automation (EDA), Seismic Interpretation and Visualization, Reservoir Simulations, and Risk Modeling. Microsoft is not only positioning ANF for mission critical applications but also using ANF for in-house silicon design.

Breaking barriers—migrating your storage infrastructure

Every organization’s cloud journey is unique. Whether you need to move existing environments to the cloud with minimal disruption or plan a full modernization, Azure Storage offers solutions for you. Storage Migration Solution Advisor in Copilot can provide recommendations to help streamline the decision-making process for these migrations. 

Azure Data Box and Storage Mover simplify the migration journey from on-premises and other clouds to Azure. The next-generation Azure Data Box is now generally available. Storage Mover is our fully managed data migration service that is secure, efficient, and scalable, with new capabilities: on-premises NFS shares to Azure Files NFS 4.1, on-premises SMB shares to Azure Blob Storage, and cloud-to-cloud transfers.

For users ready to migrate their NAS data estates, Azure Files now makes this easier than ever. We have introduced a new management model making it easier and more cost effective to use file shares. Additionally, Azure Files now enables you to eliminate complex on-premises Active Directory or domain controller infrastructure, with Entra-only identities for SMB shares. With cloud native identity support, you can now manage your user permissions directly in Azure, including external identities for applications like Azure Virtual Desktop (AVD).

Entra-only identities support with Azure Files transforms SLB’s Petrel workflows by removing dependencies on on-premises domain controllers, simplifying identity management and storage infrastructure for globally distributed teams working on complex exploration and reservoir characterization. This cloud-native architecture allows customers to access SMB shares in an easy and secure manner without complex VPN or hybrid infrastructure setups.

Swapnil Daga, Storage Architect for Tenant Infrastructure, SLB

ANF Migration Assistant simplifies moving ONTAP workloads from on-premises or other clouds to Azure. Behind the scenes, the Migration Assistant uses NetApp’s SnapMirror replication technology, providing efficient, full-fidelity, block-level incremental transfers. You can now migrate large datasets without impacting production workloads.

For customers running on-premises partner solutions who want to migrate to Azure using the same partner-provided technology, Azure has recently introduced Azure Native offers with Pure Storage and Dell PowerScale.

To make migrations easier, Azure Storage’s Migration Program connects you with a robust ecosystem of experts and tools. Trusted partners like Atempo, Cirata, Cirrus Data, and Komprise can accelerate migration of SAN and NAS workloads. This program offers secure, low-risk transfers of files, objects, and block storage to help enterprises unlock the full potential of Azure.

Start your next chapter with Azure Storage

The era of AI-powered transformation is here. Begin your journey by exploring Azure’s advanced storage offerings and migration tools, designed to accelerate AI adoption, cloud migration, and modernization. Take the next step today and unlock new possibilities with Azure Storage as the foundation for your AI initiatives.

For any questions, reach out at azurestoragefeedback@microsoft.com.

The post Azure Storage innovations: Unlocking the future of data appeared first on Microsoft Azure Blog.


Meta's React Compiler 1.0 Brings Automatic Memoization to Production


Introducing React Compiler 1.0, a game-changing tool that automates optimization for React apps, enhancing performance by up to 12% for faster loads and 2.5x quicker interactions. Compatible with major frameworks and battle-tested at Meta, it simplifies builds with integrated diagnostics. Experience seamless improvement without code rewrites, empowering developers to code smarter.

By Daniel Curtis