In this stream, I'll continue working on adding replacement capabilities to Rocks.
https://github.com/JasonBock/Rocks/issues/410
#dotnet #csharp #roslyn
Hello and Welcome, I’m your Code Monkey!
I'm finally back from my travels! Brazil and Gamescom Latam were awesome, but it's time to get back to work; there's so much stuff on my to-do list!
I got back Wednesday at 6am, took a quick nap, and got right to work building challenge #4 for my Game Dev Practice Lab. This is the first 2D challenge, which a lot of people have been asking about. It's already live, and later today I'll publish the FREE YouTube video covering challenges #2 and #3. So whether or not you can afford it doesn't matter, go learn by DOING!
Game Dev: Unity AI useful
Tech: Make your own Steam Controller
Gaming: Fight your unplayed Games!
Fun: Glue THIS to THAT!

Unity AI has just entered Open Beta, meaning anyone who wants to try it can; there's no need to request access. You can try it out here.
I'm currently researching Unity AI for a super detailed tutorial, and so far I am finding the tool is actually genuinely helpful! This is not mindless AI hype, and it's not even focused on AI generation; rather, it's capable of building genuinely useful things: custom editor tools, helping you analyze the profiler, taking actions in your project, seeing your project visually, doing basic level design, helping you organize your codebase, fixing bugs (with context of your project), and a bunch more.
Like I said, I'm still in the research phase, but thankfully Unity invited me to be a guest on their Unity AI livestream showcase, which was quite educational for me. They showed how they built a really nice demo and included a ton of useful hints for how to get the most out of these tools.
The demo is a vehicle combat game in an arena where the player controls a car and NPC trucks spawn and try to destroy the player. The demo was built by a handful of developers WITH AI, not replaced by it. Meaning, for this AI to be useful, it required actual developers who know how to get the most out of it.
You can watch the full livestream but here are the main useful takeaways that I got myself:
Build systems and learn. At 13:00 they showcase terrain deformation which I personally found really nice. And they describe how they used AI to help come up with that system. Personally this is something that I've wanted to research for ages and I think AI is super useful in this process, you can ask it to help you build such a system like they did, and then importantly ask it to help you understand WHY it works. The fact that this AI exists inside Unity means it has context of Unity itself and game development, so it's much better at building systems like these that rely on shaders and rendering as opposed to a generic LLM.
Use it for editor tools. At 18:00 they showcase a simple editor tool made with AI that helps position all the modular pieces to make an arena of any size. Super important takeaway: Ask AI to give you tons of sliders and settings so you can manually make it perfect, because the AI output will not be perfect.
Great for research. At 26:00 they wanted to have some robots cheering but did not know which approach would be best, so they just asked AI, and it gave them 3 possible options with their pros and cons, another great learning example. And after picking an option (in this case Vertex Animated Textures), the AI helped implement that system (with another custom tool).
Profiler Analysis. At 33:20 the game was having performance issues, so they asked AI what could be the cause and it identified a very niche issue: one ProBuilder mesh had insanely long triangles, which apparently tanks performance. This is one of those things that is hard to find and easy to fix, and "finding" things is where AI excels.
There's a lot more uses they showed in the livestream, like making a Hexagon shader to protect the robots, generating the UI and the code behind it, generating the statue and vehicles and more.
Some general best practices:
Break problems down. Don't ask "build me an entire game", instead ask "help me with this tiny specific task", then ask for another one, etc. Many tiny tasks instead of huge ones.
Be as descriptive as possible, the more detail the better
Generate characters in T-pose to make them easier to animate.
When generating sprites, ask for a solid color background (like green) to make it easier to remove later.
Ask for editor tools with tons of fine-tuning sliders
Ask it brainstorming questions instead of asking for a specific output. "I have some robots in the stands and I want a shield to protect them, give me 5 possible approaches for solving this problem"
Use screenshots and images to visually guide the AI if you want a visual task (like level design or particle effects)
Drag related prefabs and assets to give more context to the AI
Use Plan mode to come up with a plan before you attempt any changes
Again the main thing here is this is NOT "AI will make the game for you" but rather "your skills + AI will help you make better games faster". As always it's a tool meant to help you, not replace you. So give it a try here and see how it helps you in your workflow.
In terms of pricing, here is their page. On the Free Unity Personal plan you get 1000 free credits you can try out, and after that you can pay $10 per month. If you're on the Unity Pro tier you get 2000 credits per month. Then you can buy more separately if you need them. Different tasks require different credits, for example generating a sprite is 5-10 credits, and a quick query is 2-5 credits, so it feels like 1000 is actually a pretty decent amount. Example: This demo which had very heavy usage of AI (with lots of trial and error) was built with around 1800 credits.
At the end of the livestream I asked if this demo would be available for download so we could inspect all of it and they mentioned how they hadn't considered it but maybe.
I am very very curious to see how people adopt these AI tools. From my limited research so far (and AI in general) I would say learning how to use these tools is the most important thing. Knowing the best practices for how to break tasks down and knowing how to prompt correctly are crucial to getting the best results, so I look forward to seeing how those best practices come to be. Stay tuned for my dedicated video tutorial coming out in the near future.
Unity is running a surprising bundle! Surprising because I instantly spotted how it contains one of my most recommended assets of all time, one that I definitely did not expect to be bundled!
It’s worth it for that asset alone (Asset Inventory) and on top of that getting all the other assets is just an awesome bonus. The whole thing has a huge discount and it’s mostly made up of new assets so you probably don’t have any yet. (I only had one asset myself)
Get it HERE!
The Publisher of the Week this time is Daniel Ilett, who makes a bunch of useful URP shaders.
Get the FREE Toon Shaders Pro for URP which is a really nice toon shader with a ton of rendering options.
Get it HERE and use coupon DANIELILETT at checkout to get it for FREE!
Looking for Characters and Weapons in a realistic style to make your games? Check out this awesome HumbleBundle!
It’s made by Bugrimov Maksim, who is one of the best realistic publishers. This pack has a mountain of characters in all styles, alongside a ton of weapons with first person animations.
It’s 98% OFF! Get it HERE!

Valve is such a weird company, but usually weird in the good way. (although I do wish they would lower their 30% cut for indie games)
They just released their latest piece of hardware, the Steam Controller, and it went out of stock almost instantly. You can no longer buy it on the official site for $99 (though there are already scalpers on eBay selling them for $300).
While you wait for more official stock, you can actually build one yourself! By that I mean that Valve has just released the CAD files for the Steam Controller and its Puck under a slightly restrictive Creative Commons license, which means people can now freely download the official shell files, modify them, 3D print accessories, and make all sorts of weird custom creations, as long as it is non-commercial.
They also included engineering drawings and "keep out" zones so people do not accidentally block things like the antenna or other critical areas.
Now technically you can't really build your own complete Steam Controller; these files are for the external shell and not the internals, but it's still pretty fun of them to do! Usually things like console shells are proprietary, so it's nice to see a big company just put them out for anyone to build third-party accessories. There's a nice Valve-like message on the GitLab page: "Your Steam Controller is yours, and you have the right to do with it what you want."
I am now curious to see what people will build from this. A giant Steam Controller? A cursed ergonomic shell? A phone mount? A clever puck holder with flashing lights? This is one of those stories that is just fun, and I really hope the community goes crazy with creativity.

Glue (or more technically, adhesives) is a fascinating thing. How do you connect one thing to another thing? It is also a surprisingly complex topic since you need different glue types to glue different objects. Gluing a Hat to a piece of Wood? You need something different than if you were gluing Metal to Plastic.
Here is a fun website that shows you what you need to glue THIS to THAT. You just pick the this and that from the dropdown menu and it tells you what works best, neat!
Plus there's a Trivia page! Did you know that "The Aztec Indians in Central America used animal blood mixed with cement as a mortar for their buildings"?
I love super niche websites like these, so silly but so useful when you really need it. Now I know that if I ever want to glue Ceramic to some Rubber, I should be using Household Goop.

RollerCoaster Tycoon Optimizations are Insane
https://www.youtube.com/watch?v=ZANFhJ9HYsM
Another great video on just how good Chris Sawyer is.
AI Learns To Park Vs 2 Humans
https://www.youtube.com/watch?v=LkvKWIk6MoY
AIA's scenarios are always so much fun, the visual scripting tool he has built for these videos seems quite impressive!
Get Rewards by Sending the Game Dev Report to a friend!
(please don’t try to cheat the system with temp emails, it won’t work, just makes it annoying for me to validate)

Thanks for reading!
Code Monkey

This blog post was created with the help of AI tools. Yes, I used a bit of magic from language models to organize my thoughts and automate the boring parts, but the geeky fun and the code in C# are 100% mine.
Hi!
Some of these started as small pet projects.
For example: a quick helper for a demo, a tiny tool for a conference, a library to avoid repeating the same code again and again, or one of those “I’ll just build this in one evening” ideas that somehow becomes a real thing.
And now, thanks to GitHub Copilot, many of these experiments are becoming open source, free NuGet packages that I hope are useful to everyone.
Some are focused on AI. Some are focused on local models. Some help with embeddings, speech, QR codes, document processing, MCP tools, agents, and developer productivity.
In other words: a beautiful collection of useful chaos.
Versions are current as of the latest dashboard snapshot: 2026-05-10.
Dashboard repo: https://github.com/elbruno/nuget-repo-dashboard
The ElBruno.AI.Evaluation family is about bringing testing discipline to AI apps.
Packages:
ElBruno.AI.Evaluation
ElBruno.AI.Evaluation.Reporting
ElBruno.AI.Evaluation.SyntheticData
ElBruno.AI.Evaluation.Xunit
Repo:
https://github.com/elbruno/elbruno-ai-evaluation
These packages help with evaluation workflows, reporting, synthetic test data, and xUnit integration. The idea is simple: AI applications should not rely only on “it worked once in my demo.”
They need repeatable checks.
They need quality gates.
They need tests that can run again tomorrow, when the model, prompt, data, or weather in the cloud changes.
Because yes, AI apps are fun. But “trust me bro, the prompt is good” is not a testing strategy.
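To make "repeatable checks" concrete, here is a minimal sketch of a quality gate (written in Python purely for illustration; the real packages are C#/.NET, and these function names are hypothetical, not the package's API): score a model answer against required keywords, and pass or fail it against a threshold that can run identically every day.

```python
def keyword_coverage(answer: str, required: list[str]) -> float:
    """Fraction of required keywords present in the answer (case-insensitive)."""
    if not required:
        return 1.0
    hits = sum(1 for kw in required if kw.lower() in answer.lower())
    return hits / len(required)

def quality_gate(answer: str, required: list[str], threshold: float = 0.8) -> bool:
    """A repeatable check: pass only if keyword coverage meets the threshold."""
    return keyword_coverage(answer, required) >= threshold
```

The point is not the metric (real evaluation uses richer scorers); it is that the check is deterministic, versionable, and re-runnable when the model or prompt changes.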
The agents orchestration packages are focused on coordinating multiple AI agents through a lightweight workflow.
Packages:
ElBruno.AgentsOrchestration.Abstractions
ElBruno.AgentsOrchestration.Orchestration
Repo:
https://github.com/elbruno/elbruno.agentsorchestration
This is useful when you want agents to do more than just reply to a prompt. The orchestration repo describes a 6-step pipeline:
Plan → Parse → Execute → Verify → Review → Report
That is a nice mental model for agent-based workflows. It also fits very well with GitHub Copilot, SQUAD-style automation, and repo-based development experiments.
Because once you have more than one agent, you need orchestration.
Otherwise, congratulations, you invented a very expensive group chat.
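The six-step pipeline above can be sketched as a tiny runner that threads shared state through named stages. This is a hypothetical stand-in in Python, not the package's actual API, but it shows the shape of the Plan → Parse → Execute → Verify → Review → Report flow:

```python
from typing import Callable

Step = Callable[[dict], dict]

def run_pipeline(task: str, steps: list[tuple[str, Step]]) -> dict:
    """Run each named step in order, recording the trace for the final report."""
    state = {"task": task, "trace": []}
    for name, step in steps:
        state = step(state)
        state["trace"].append(name)
    return state

# Stand-in steps: each one enriches the shared state dict.
def plan(s):    s["plan"] = f"plan for: {s['task']}"; return s
def parse(s):   s["parsed"] = True; return s
def execute(s): s["result"] = "done"; return s
def verify(s):  s["verified"] = s.get("result") == "done"; return s
def review(s):  s["approved"] = s["verified"]; return s
def report(s):  s["report"] = f"{s['task']}: approved={s['approved']}"; return s

state = run_pipeline("resize images", [
    ("Plan", plan), ("Parse", parse), ("Execute", execute),
    ("Verify", verify), ("Review", review), ("Report", report),
])
```

Keeping the steps as plain functions over shared state makes each stage independently testable, which matters once real agents (and real failures) are in the loop.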
The ElBruno.AotMapper family is focused on compile-time DTO mapping.
Packages:
ElBruno.AotMapper
ElBruno.AotMapper.AspNetCore
ElBruno.AotMapper.EntityFramework
ElBruno.AotMapper.Generator
Repo:
https://github.com/elbruno/ElBruno.AotMapper
This project is especially interesting for modern .NET workloads because it avoids runtime reflection and generates mapping code at compile time.
That makes it useful for:
In short: less runtime magic, more generated code that you can actually inspect.
And sometimes that is exactly what you want.
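To see why generated mapping code is easy to inspect, here is a hand-written equivalent of what a compile-time mapper typically emits (sketched in Python for illustration; the real generator emits C#, and these type shapes are invented for the example):

```python
from dataclasses import dataclass

@dataclass
class Person:
    first_name: str
    last_name: str

@dataclass
class PersonDto:
    first_name: str
    last_name: str
    full_name: str

def map_person_to_dto(src: Person) -> PersonDto:
    # No reflection: every assignment is explicit, debuggable, and AOT-friendly.
    return PersonDto(
        first_name=src.first_name,
        last_name=src.last_name,
        full_name=f"{src.first_name} {src.last_name}",
    )
```

A reflection-based mapper would discover these properties at runtime; the generated approach pays that cost once, at compile time, and leaves readable code behind.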
The ElBruno.LocalEmbeddings family is one of the most useful groups for AI developers building RAG, semantic search, local-first AI apps, or privacy-aware demos.
Packages include:
ElBruno.LocalEmbeddings
ElBruno.LocalEmbeddings.Harrier
ElBruno.LocalEmbeddings.ImageEmbeddings
ElBruno.LocalEmbeddings.ImageEmbeddings.Downloader
ElBruno.LocalEmbeddings.KernelMemory
ElBruno.LocalEmbeddings.Npu
ElBruno.LocalEmbeddings.Npu.Intel
ElBruno.LocalEmbeddings.Npu.Qualcomm
ElBruno.LocalEmbeddings.VectorData
Repo:
https://github.com/elbruno/elbruno.localembeddings
This family gives you local embedding generation in .NET, including support for text embeddings, image embeddings, vector data integrations, Kernel Memory, and NPU-specific packages.
The NPU packages are especially cool because they connect directly with the AI PC story.
Not every embedding call needs to cross the internet.
Sometimes the best cloud call is the one you did not make.
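A minimal illustration of what you do with embeddings once you have them locally: compare vectors with cosine similarity and pick the nearest document. This is a generic Python sketch with toy 2-dimensional vectors, not the package's API (real embeddings have hundreds of dimensions):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: the standard way to compare embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest(query: list[float], docs: dict[str, list[float]]) -> str:
    """Return the name of the document whose embedding is closest to the query."""
    return max(docs, key=lambda name: cosine(query, docs[name]))
```

Semantic search, RAG retrieval, and deduplication are all variations of this one comparison, which is why a fast local embedding generator is so useful.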
The ElBruno.LocalLLMs packages are focused on running and integrating local language models from .NET.
Packages:
ElBruno.LocalLLMs
ElBruno.LocalLLMs.BitNet
ElBruno.LocalLLMs.Rag
Repo:
https://github.com/elbruno/ElBruno.LocalLLMs
These are useful for local chat, local model experimentation, and RAG-style workflows.
Cloud models are amazing. But sometimes you want the model running right there next to your code, your logs, your fan noise, and your questionable coffee.
The ElBruno.MarkItDotNet family is probably one of the biggest and most useful areas in this package collection.
Packages include:
ElBruno.MarkItDotNet
ElBruno.MarkItDotNet.AI
ElBruno.MarkItDotNet.AzureSearch
ElBruno.MarkItDotNet.Chunking
ElBruno.MarkItDotNet.Citations
ElBruno.MarkItDotNet.Cli
ElBruno.MarkItDotNet.CoreModel
ElBruno.MarkItDotNet.DocumentIntelligence
ElBruno.MarkItDotNet.Excel
ElBruno.MarkItDotNet.Metadata
ElBruno.MarkItDotNet.PowerPoint
ElBruno.MarkItDotNet.Quality
ElBruno.MarkItDotNet.Sync
ElBruno.MarkItDotNet.VectorData
ElBruno.MarkItDotNet.Whisper
Repo:
https://github.com/elbruno/ElBruno.MarkItDotNet
This family is about converting, preparing, enriching, chunking, indexing, validating, and syncing content for AI workflows.
In a typical RAG project, everyone wants the fancy chat UI.
But before that, someone has to solve the real problem:
Can we clean and prepare the documents first?
That is where this package family fits.
It helps turn files into AI-ready Markdown and supports scenarios like document conversion, chunking, metadata extraction, citations, Azure AI Search, Whisper transcription, and quality checks.
Because every RAG project eventually becomes a document-cleanup project wearing an AI hat.
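Chunking is the least glamorous step in that pipeline, so here is a deliberately simplified Python sketch of it (character-based windows with overlap; real chunkers, including this package's, respect sentence and Markdown boundaries):

```python
def chunk_text(text: str, size: int = 200, overlap: int = 20) -> list[str]:
    """Split text into overlapping windows, the usual first step
    before embedding documents for RAG. Overlap keeps a sentence
    that straddles a boundary visible in both neighboring chunks."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks
```

Getting the size and overlap right for your documents usually matters more to retrieval quality than which vector database you pick.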
The ElBruno.Text2Image packages provide a .NET-friendly way to work with image generation.
Packages:
ElBruno.Text2Image
ElBruno.Text2Image.Cli
ElBruno.Text2Image.Cpu
ElBruno.Text2Image.Cuda
ElBruno.Text2Image.DirectML
ElBruno.Text2Image.Foundry
Repo:
https://github.com/elbruno/ElBruno.Text2Image
The nice part here is the multi-backend design:
That gives you flexibility depending on the machine, the demo, the budget, and how much your GPU is already crying.
One API, multiple ways to make pixels appear like magic.
Very expensive magic sometimes, but still magic.
This group is about speech, transcription, realtime audio, text-to-speech, and voice scenarios.
Packages:
ElBruno.Whisper
ElBruno.Realtime
ElBruno.Realtime.SileroVad
ElBruno.Realtime.Whisper
ElBruno.QwenTTS
ElBruno.QwenTTS.VoiceCloning
ElBruno.VibeVoiceTTS
ElBruno.PersonaPlex
Repos:
https://github.com/elbruno/ElBruno.Whisper
https://github.com/elbruno/ElBruno.Realtime
https://github.com/elbruno/ElBruno.QwenTTS
https://github.com/elbruno/ElBruno.VibeVoiceTTS
https://github.com/elbruno/ElBruno.PersonaPlex
This is useful for voice agents, transcription tools, accessibility scenarios, podcast workflows, and realtime conversational demos.
Because sooner or later every AI demo becomes:
Can I talk to it?
And then, ten minutes later:
Why is my microphone still open?
The ElBruno.QRCodeGenerator family is a lightweight utility set for generating QR codes in different formats.
Packages:
ElBruno.QRCodeGenerator.Ascii
ElBruno.QRCodeGenerator.CLI
ElBruno.QRCodeGenerator.Image
ElBruno.QRCodeGenerator.Payloads
ElBruno.QRCodeGenerator.Pdf
ElBruno.QRCodeGenerator.Svg
ElBruno.QRCodeGenerator.Tool
Repos:
https://github.com/elbruno/ElBruno.QRCodeGenerator
https://github.com/elbruno/ElBruno.QRCodeGenerator.CLI
This one is very practical: terminal output, images, SVG, PDF, payloads, and a global tool.
Because every conference session eventually needs a QR code.
Usually five minutes before going live.
The ElBruno.ModelContextProtocol.MCPToolRouter package focuses on smarter tool selection for Model Context Protocol scenarios.
Package:
ElBruno.ModelContextProtocol.MCPToolRouter
Repo:
https://github.com/elbruno/ElBruno.ModelContextProtocol
This is useful when working with agents that have access to many tools.
Sending every possible tool to the model all the time is not always a good idea. It burns tokens, increases noise, and makes the model work harder than needed.
Tool routing helps select the most relevant tools for the task.
Agents are great.
Agents with 97 tools in context are a token bonfire.
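To illustrate the idea of tool routing (not this package's implementation, which presumably uses embeddings or richer scoring), here is a crude keyword-overlap version in Python; the tool names and descriptions are invented for the example:

```python
def route_tools(task: str, tools: dict[str, str], k: int = 2) -> list[str]:
    """Pick the k tools whose descriptions best overlap the task's words,
    so only the relevant tools get sent to the model's context."""
    task_words = set(task.lower().split())

    def score(name: str) -> int:
        return len(task_words & set(tools[name].lower().split()))

    return sorted(tools, key=score, reverse=True)[:k]
```

Even this naive version captures the win: the model sees 2 candidate tools instead of 97, which saves tokens and reduces the chance of a bad tool call.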
These packages help with retrieval, ranking, and search scenarios.
Packages:
ElBruno.BM25
ElBruno.Reranking
graphify-dotnet
Repos:
https://github.com/ElBruno/ElBruno.BM25
https://github.com/elbruno/ElBruno.Reranking
https://github.com/elbruno/graphify-dotnet
This group fits nicely into RAG and knowledge discovery workflows.
You need search.
Then you need better search.
Then you need reranking.
Then you need graphs.
Then you realize your “simple chatbot” is now a distributed knowledge system.
Classic.
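BM25 itself is a short, well-known ranking formula, so a reference sketch is easy to write down. This Python version implements the classic scoring function (with the common +1 inside the IDF log to keep scores non-negative); it is an illustration of the algorithm, not this package's code:

```python
import math

def bm25_scores(query: list[str], docs: list[list[str]],
                k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Score each tokenized document against the query with classic BM25."""
    n_docs = len(docs)
    avgdl = sum(len(d) for d in docs) / n_docs  # average document length
    scores = []
    for doc in docs:
        score = 0.0
        for term in query:
            df = sum(1 for d in docs if term in d)            # document frequency
            idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1)
            tf = doc.count(term)                               # term frequency
            norm = tf + k1 * (1 - b + b * len(doc) / avgdl)    # length normalization
            score += idf * tf * (k1 + 1) / norm
        scores.append(score)
    return scores
```

In a hybrid RAG setup this lexical score is typically combined with embedding similarity, and then a reranker sorts the merged candidates, which is exactly where the Reranking package comes in.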
These are small tools that make developer life better.
Packages:
ElBruno.AspireMonitor
ElBruno.ClockTray
ElBruno.OllamaMonitor
Repos:
https://github.com/elbruno/ElBruno.AspireMonitor
https://github.com/elbruno/ElBruno.ClockTray
https://github.com/elbruno/ElBruno.OllamaMonitor
These are the kind of tools that start with:
I just need a tiny helper.
And then become useful enough to publish.
AspireMonitor helps with Aspire monitoring.
ClockTray helps with Windows tray clock scenarios.
OllamaMonitor gives quick visibility into local Ollama runtime status.
Tiny tools. Big quality-of-life improvement.
The best kind of yak shaving.
The Hugging Face downloader packages help with downloading models and related assets from Hugging Face.
Packages:
ElBruno.HuggingFace.Downloader
ElBruno.HuggingFace.Downloader.Cli
Repo:
https://github.com/elbruno/ElBruno.HuggingFace.Downloader
This is useful across many of the local AI packages, because local AI usually starts with:
Great, now where do I get the model files?
And then:
Why is this model 4 GB?
Package:
ElBruno.OllamaSharp.Extensions
Repo:
https://github.com/elbruno/elbruno.OllamaSharp.Extensions
This package adds useful extensions around OllamaSharp, especially for scenarios where local LLM calls can take longer and timeout management becomes important.
If you have ever waited for a local model to respond and wondered whether it was thinking, frozen, or silently judging your prompt, this one makes sense.
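The core idea, independent of OllamaSharp, is putting a hard deadline around a call that might never return. Here is a generic Python sketch (the helper name is invented; the actual package wires timeout handling into OllamaSharp's own client):

```python
import concurrent.futures

def call_with_timeout(fn, timeout_s: float, *args, **kwargs):
    """Run a (possibly slow) model call with a hard deadline, returning
    None instead of hanging forever. The caller then decides what to do:
    retry, shrink the prompt, or fall back to another model."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn, *args, **kwargs)
        try:
            return future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            # Note: the background call keeps running until it finishes;
            # a production version would also cancel the underlying request.
            return None
```

In .NET the equivalent pattern is a `CancellationTokenSource` with a timeout, but the design question is the same: decide up front how long "thinking" is allowed to take.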
The MemPalace family is focused on memory infrastructure for AI apps and agents.
Packages:
MemPalace.Core
MemPalace.Ai
MemPalace.Agents
MemPalace.Backends.Sqlite
MemPalace.KnowledgeGraph
MemPalace.Mcp
MemPalace.Mining
MemPalace.Search
mempalacenet
mempalacenet-bench
Repo:
https://github.com/elbruno/ElBruno.MempalaceNet
This is about local-first AI memory, storage, search, knowledge graphs, agents, MCP integration, mining, and benchmarking.
Because agents without memory are just very confident goldfish.
Most of these packages started as small ideas, experiments, demos, or helper libraries.
But this is one of the things I love about modern development with GitHub Copilot: it makes it easier to move from:
This works on my machine
to:
This is packaged, documented, published, open source, and maybe useful for someone else.
Are all of these perfect?
Of course not.
Are they useful?
I hope so.
Are they free, open source, and ready for you to try, break, improve, fork, complain about, and maybe even use in a real project?
Yes. That is the idea.
And as always:
Today, code is cheap. The decisions are expensive.
Happy coding!
Greetings
El Bruno
More posts in my blog ElBruno.com.
More info in https://beacons.ai/elbruno
Modern .NET applications are increasingly distributed, integrating APIs, background services, and external AI systems. With the rise of AI coding tools such as GitHub Copilot and frameworks like the Microsoft Agent Framework, developers can now generate large portions of application logic.
This raises a question: When AI can generate much of the code, what becomes the core responsibility of a .NET developer?
This article answers that question through a practical case study of the shift, highlighting how architecture, contracts, and observability via .NET Aspire come to the fore. As systems become more dynamic and AI-driven, observability and orchestration become just as important as implementation.
To explore this in an easy-to-understand way, I built a simple full-stack e-commerce application (“flowershop”) using:
• Vue.js (frontend)
• ASP.NET Core Web API (backend)
• Microsoft Agent Framework (agent orchestration)
• .NET Aspire (distributed tracing and system visibility)
• GitHub Copilot (AI-assisted development)
Now, let’s explore!
1. System Overview
The application includes these features:
• Product browsing and checkout
• AI-powered chat assistant
• Automated product description generation
• End-to-end observability using .NET Aspire
This is the final UI (Vue.js + AI Assistant):
Figure 1. Vue.js frontend with product listing, admin form, and AI assistant.
The frontend communicates with ASP.NET Core APIs, which orchestrate AI agents and external services.
2. Architecture: Orchestrating AI in .NET
Sales Assistant Flow
Figure 2. Sales Assistant architecture (Vue.js → ASP.NET Core API → Agent → LLM).
In this architecture:
• The Vue.js client sends requests to the API
• The API routes requests to a Sales Agent
• The agent interacts with the LLM and backend tools
Writer Flow
Figure 3. Writer flow for generating product descriptions.
When a product image is uploaded:
Observability and Orchestration with .NET Aspire
Figure 4. .NET Aspire tracing of LLM interactions and tool calls.
Using .NET Aspire, I was able to:
This is essential because:
AI systems are non-deterministic—without observability, their behavior is difficult to understand and debug.
3. Implementation Challenges
During implementation, I encountered several challenges:
Ambiguous Specifications
Initial GitHub issues were short and informal. This led to:
• Misinterpreted requirements
• Inconsistent outputs
AI requires structured and explicit instructions.
Loss of Control
AI-generated pull requests often:
• Ignored coding conventions
• Required heavy revision
Effort shifted from writing code to reviewing and testing it manually.
Debugging Complexity
4. Evolving the Development Approach
To address these challenges, the process was refined.
Structured Issue Definition
Issues were rewritten using Markdown, clear requirements, and acceptance criteria, improving clarity and reducing ambiguity [1][2].
API Contract Design
Explicit API contracts were introduced to align frontend and backend components, ensuring clear interfaces and predictable integration [3][4].
Contracts become critical when AI generates both sides of a system.
Instruction and Agent Design
Custom instructions and configurations were used to guide coding conventions, architecture, and workflow [5][6].
Figure 5. GitHub Copilot setup in the project.
Continuous Learning
Improving outcomes required continuous learning from official documentation and evolving frameworks [7][8].
AI amplifies, rather than replaces, the need for technical knowledge.
5. Key Lessons for .NET Developers
• Design before generating code
• Be explicit across the system
• Treat AI as a co-engineer
• Invest in observability (Aspire is critical)
6. Discussion: The Shift in Developer Responsibility
AI does not remove responsibility—it redistributes it to a higher level.
Key questions remain:
• Which responsibilities should remain human-controlled?
• Who is accountable for AI-generated code?
• How might teams adapt workflows to integrate AI effectively?
• What skills are required to remain effective in this new paradigm?
7. Conclusion
AI-assisted development in .NET is not just about generating code; it is about building systems that integrate AI reliably. With tools like GitHub Copilot, the Microsoft Agent Framework, and especially .NET Aspire, developers gain new capabilities, but also new responsibilities.
Success depends on:
• Clear architecture
• Strong contracts
• Well-defined orchestration
• Deep observability
8. Source Code
The full implementation is available on GitHub: 👉 GitHub Repo
9. References
[1] GitHub, Inc. 2026. About issues.
[2] GitHub, Inc. 2026. Creating an issue.
[3] Microsoft. 2026. API design best practices.
[4] OpenAPI Initiative. 2023. OpenAPI Specification.
[5] Copilot Academy. 2025. Copilot customization workshop.
[6] GitHub, Inc. 2026. Prompt engineering for GitHub Copilot.
In May 2026, Microsoft is preparing to roll out several new features for Outlook (new) and Outlook Classic. The list of new features includes teammates’ calendars sync in the navigation pane in Outlook (new). For Outlook Classic, Microsoft is preparing to roll out new Copilot features, including AI-based insights.
Microsoft Outlook has undergone significant changes across various platforms over the years. We’ll continue to see more big changes in Outlook in the future, as the software giant keeps adding new features to improve the email experience.
On its Microsoft 365 Roadmap website, Microsoft has posted updates about the new features it plans to introduce to Outlook in May 2026. I’ve gone through the list, verified everything in the beta version, and here’s everything you need to know.
Support for automapped calendars in the new Outlook first appeared on the Microsoft 365 Roadmap website back in 2024. But the feature has faced several delays and was never rolled out to general users until this month.
If you are an Outlook user, you can now toggle from the classic Outlook to the New Outlook without worrying about automapped calendars. Starting this month, you will automatically see your automapped calendars when switching from classic Outlook to the new Outlook.
This is rolling out for desktop users.
If you are using the new Outlook, you can now see your teammates’ calendars in the left navigation pane. The teammates’ calendars, which will include peers, direct reports, and managers, will be displayed automatically when you access the left navigation pane.
In our tests, Windows Latest observed that the calendars of your teammates show up instantly in the left navigation pane, and it also works with non-Microsoft accounts, as long as the accounts are tied to the same organization.
This will start rolling out to the web client of the new Outlook this month.
In classic Outlook, you can select multiple calendars within a group in the left nav bar. However, you could not use this feature after switching to the new Outlook. Fortunately for new Outlook users, this limitation no longer exists.
Microsoft has confirmed that Outlook users can bulk select or deselect calendars within a group in the left nav bar, starting this month. While the company hasn’t specified the rollout date, this feature is likely to be rolled out by the end of this month.
In another Microsoft 365 Roadmap update, Microsoft has mentioned support for multiselect events on the calendar surface in the new Outlook. This means you’ll be able to open, copy and paste, delete, and categorize events in bulk, just like you do in the classic Outlook.

In the new Outlook, you can now press Shift and click, or Ctrl and click, to select non-consecutive dates in the calendar mini-month. This feature aims to make it easy for users to view and take action on the dates they care about.
This will be available on the web client of the new Outlook.

Microsoft has something for the classic Outlook, too. If you haven’t switched to the new Outlook yet, you’ll be able to select text in emails and ask Copilot for the relevant information in the classic Outlook.
This is already available in the new Outlook and will now be rolled out to the classic version this month.

If you are looking for an old email in your Outlook inbox, one of the best tricks is to sort your emails. Starting this month, this feature will get even more powerful, as Microsoft plans to add support for sorting by flag status, flag due date, and flag start date.
The ability to sort emails by flag status, flag due date, and flag start date will be available on both Desktop and web, per the Roadmap page.
Moreover, the new Outlook will allow users to save calendar events as .ics this month. It’ll be limited to the web client at the launch.
All the above features are set to arrive before the end of this month. However, Microsoft is notorious for making last-minute changes to its rollout plan, so there is always a possibility of some of its features getting delayed to a later month.
The post Microsoft confirms new features coming to Outlook and Outlook Classic in May 2026 appeared first on Windows Latest