Kavitha Radhakrishnan is a General Manager in Microsoft Global Skilling, where she leads the teams creating AI‑first, learner‑centered experiences that help people and teams build skills they can apply at work.
Sunday night. The week hasn’t started yet, but the questions already have.
A leader is scrolling through AI headlines, trying to keep up with the constant changes. Every day there’s a new tool, a new capability, a new prediction about how work is changing. And it is. By Monday morning, the pressure isn’t theoretical; it’s sitting on a packed calendar and a team that’s already running hot. Everyone’s saying, “We should be using AI,” but nobody’s quite sure what that means for this team, this week.
Elsewhere, an employee is watching coworkers use AI with speed and confidence. They want to keep up without feeling exposed for what they don’t know yet. The gap isn’t intelligence; it’s psychological safety and a clear starting point.
And then there’s the learning leader who’s had the “training participation” conversation a hundred times, but now the question is sharper. It isn’t “How many people finished?” but “What changed in the way they work?” The bar has moved from awareness to application.
None of these people are asking for more content. There’s plenty of that. They want a path that respects their time, fits their role, and helps them build confidence, both individually and as part of a team.
AI is moving faster than most of us can track. The problem is figuring out what to do next.
Leaders don’t want to stitch together five different tools. People don’t want another long course about AI. Teams are looking for skilling that fits into real work. That’s the gap AI Skills Navigator is built to address.
AI Skills Navigator brings role‑based, practical skilling into a single experience, so individuals and teams have a clear starting point, a sense of direction, and ways to see progress as they go. Instead of an endless catalog, it offers guided paths that respect time, align to real responsibilities, and make it easier to turn learning into action. At its core, it’s designed to help turn skilling into execution—progress that people can feel and leaders can point to.
Alex, a team manager, is trying to set the team up for success.
The team is kicking off a new project with clear goals, tight timelines, and a mix of responsibilities across roles. Everyone is expected to use AI more effectively, but “go learn AI” isn’t a plan. Sending people to a long list of links doesn’t help either.
So Alex turns to AI Skills Navigator.
Instead of gathering content from multiple places, Alex uses AI Skills Navigator to design a skilling playlist for the team. The conversational AI experience helps him identify what his team really needs. The playlist is grounded in what the team is actually working on and intentionally structured around the project goals and role-specific responsibilities. It brings together different content formats on purpose: short sessions for core concepts, practice where it matters, and optional deeper dives for people who want to explore further.
It’s not about forcing everyone through the same experience. It’s about giving the team a shared path forward, while respecting different roles, learning needs, and preferences.
Sam, an experienced marketing manager on the team, doesn’t have to figure out where to start. A link from Alex lands in their inbox, and the intent is clear: this is what matters for our work right now.
As Sam works through the playlist, they move naturally between different ways of learning. The structure makes it easy for them to focus without feeling boxed in.
For a topic that matters most to the project, Sam chooses a skilling session. A video sets the context, and the Skilling Session Coach AI agent is there along the way—ready to clarify a concept, answer a quick question, or pause to check understanding. Sometimes there’s a short quiz to help Sam confirm that they’re learning and making progress.
People are juggling meetings, messages, deadlines, and more. Attention spans are shorter, and learning often happens in brief moments between tasks. AI Skills Navigator is designed for that reality.
Sometimes Sam feels like listening instead of reading. An AI‑generated podcast turns dense material into something easier to absorb. When time is tight, an AI-generated summary helps Sam catch up in minutes, without losing the thread.
From Alex’s perspective, there’s a simple view of how the team is progressing—enough to see who’s moving forward, where people might be getting stuck, and when it’s time to adjust the plan.
Together, these moments add up. Skilling sessions provide depth when it’s needed. Podcasts and summaries offer flexibility when attention is limited. Skilling playlists keep everything connected, so learning feels purposeful rather than scattered.
By combining structured paths with flexible ways to learn, and pairing AI support with human expertise, AI Skills Navigator helps individuals and teams build confidence, apply skills, and make progress together.
AI Skills Navigator is designed to help people and teams build confidence, not by adding more noise, but by providing guidance that fits real work.
It brings together training content from sources that many people already know and trust, including Microsoft Learn, LinkedIn, and GitHub, and connects it into structured paths that make it easier to start, go deeper, and keep moving forward. Whether you’re learning on your own or designing skilling for a team, the goal is the same: turn learning into progress that you can feel.
And this isn’t static. We’ll continue evolving AI Skills Navigator based on how people learn, how teams work, and the feedback we hear from learners and leaders along the way. We’ll share updates regularly, including new content, new capabilities, and what’s coming next, so you can stay current as the experience grows.
As content growth accelerates worldwide, organizations need better ways to manage inactive data without sacrificing security, compliance, retention, or discoverability.
Today, we’re announcing a public preview of file‑level archiving in Microsoft 365 Archive. This new capability enables you to archive individual files, moving them into a lower-cost, cold-storage tier in SharePoint. This means you can archive outdated and redundant files while keeping the rest of the site active, improving both your Copilot relevancy and your search results in the process.
Read on to learn more about how it works and how to get started. You can also join our webinar on April 7, 10:00am PDT, to see file-level archive in action and engage with our product team.
Until now, Microsoft 365 Archive supported site‑level archiving, meaning admins had to choose whether to archive an entire site, often without knowing the details of the work happening there. With file‑level archiving, it’s now possible to archive individual documents within active SharePoint sites. This is ideal for older project files, completed events, or reference materials that must be retained but are rarely accessed.
Microsoft 365 Archive helps improve Copilot results by removing archived files and sites from Copilot’s active index. By reducing clutter across SharePoint and search experiences, you enable Copilot to surface higher‑value, current information while still maintaining governance and compliance. Admins can also empower users to identify and archive the content they know best, rather than making judgment calls themselves about the relevance of entire sites.
Archiving also helps reduce active SharePoint storage consumption. This can save up to 75% compared to purchasing additional SharePoint storage – freeing up your SharePoint storage for active collaboration. As a reminder, archived storage is only charged once your tenant has exceeded its storage quota, and there are no reactivation fees for SharePoint sites and content stored in Microsoft 365 Archive.
File‑level archiving helps organizations take a more intentional approach to data lifecycle management and responsible data growth – separating active collaboration content from long‑term records. Meanwhile, it retains the security settings, compliance protections, and metadata of the archived content. Importantly, archived content remains searchable by administrators using Microsoft Purview and admin search and all Purview flows remain intact. This approach keeps inactive data protected inside the Microsoft 365 trust boundary – without cluttering active workspaces or inflating primary storage usage.
Microsoft 365 Archive includes support for Microsoft Graph APIs, enabling organizations and partners to integrate site- and file‑level archiving into custom workflows and lifecycle management solutions.
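As a purely illustrative sketch of what such an integration might look like on the client side – the `/archive` action, URL shape, and payload below are hypothetical assumptions, not the documented Graph contract; consult the Microsoft 365 Archive developer guidance for the real API – a helper could assemble an archive call per file like this:

```python
# Hypothetical sketch only: the "/archive" action and URL shape below are
# illustrative assumptions, NOT the documented Microsoft Graph contract.
# See the Microsoft 365 Archive developer guidance for the actual API.

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_file_archive_request(site_id: str, drive_id: str,
                               item_id: str, token: str) -> dict:
    """Assemble the pieces of a (hypothetical) file-archive call.

    Returns the method, URL, and headers an HTTP client would send;
    actually sending it requires a valid Microsoft Graph access token.
    """
    return {
        "method": "POST",
        "url": (f"{GRAPH_BASE}/sites/{site_id}/drives/{drive_id}"
                f"/items/{item_id}/archive"),
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    }
```

The point of a wrapper like this is that lifecycle tooling can iterate over inactive files and archive them individually, instead of archiving an entire site.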
For public preview, file‑level archiving focuses on manual and API‑based experiences. Looking ahead, we know that the best way to manage archive files at scale is through policies. We’re working hard to make policy‑based automation for archiving files available soon.
We have partners, including Preservica, using these APIs to incorporate Microsoft 365 Archive into their own solutions. These integrations can offer you more ways to manage your organization’s long-term SharePoint content.
“We are delighted to be part of the public preview for file-level archiving for Microsoft 365 Archive. This enables us to build on our Preserve365® integration with Microsoft 365 Archive for site-level archiving as we work together to make Active Digital Preservation a seamless part of the Microsoft 365 ecosystem.”
– Stuart Reed, Chief Product Officer, Preservica
Kantar, a global leader in marketing data and analytics, is embracing generative AI to save time, improve quality, and deliver better results to both employees and customers. After moving content to Microsoft 365 and rolling out Microsoft 365 Copilot, Kantar faced increasing storage costs. By adopting Microsoft 365 Archive, Kantar reports it has reduced storage costs and improved data quality. The move also helped ensure Copilot could access clean, relevant information while keeping inactive data secure and cost-effective.
“Microsoft 365 Archive helps us not only address storage costs, but also provide our end users the most up-to-date, relevant content across SharePoint, Teams, Copilot and Gen AI agents.”
– Davide Ranchetti, Principal Engineering Manager, Digital Workspace
To date, Kantar reports that they have archived more than 40,000 sites – nearly 100 terabytes of data – significantly cleaning their data estate and reducing SharePoint storage costs. They’re looking forward to using file-level archiving to save even more on storage costs by archiving large, inactive video files.
Learn more about how Kantar is reducing storage costs with Microsoft 365 Archive.
File‑level archiving in Microsoft 365 Archive is now available in public preview for eligible Microsoft 365 tenants. Learn more about setting up Microsoft 365 Archive, and how to enable and manage file-level and site-level archiving.
Register today for our webinar on April 7, 10:00am PDT to learn more about Microsoft 365 Archive and ask our product team questions about file-level archiving.
If you’re a developer, check out the developer guidance for Microsoft 365 Archive to learn more about using the Microsoft Graph APIs with Microsoft 365 Archive.
When deploying large language models like GPT‑4o, capacity planning is no longer about picking a GPU SKU. Instead, Azure abstracts GPU compute behind Provisioned Throughput Units (PTUs)—a model‑centric way to reason about GPU usage, throughput, and latency.
This post explains how GPU capacity is computed for GPT‑4o‑class models, and how to translate your workload into the right number of PTUs.
From GPUs to Tokens: The Mental Shift
With GPT‑4o and newer models, Azure does not expose GPUs directly. Instead, capacity is provisioned in PTUs.
A PTU is not “one GPU.” It is a guaranteed amount of model‑processing capacity, backed by GPUs under the hood and optimized by Azure for that specific model. [learn.microsoft.com]
The Key Change with GPT‑4o
For GPT‑4o and later models, input and output tokens are metered separately.
That matters because the two token streams have very different costs: input (prompt) tokens are processed together in a single parallel prefill pass, while output tokens are generated sequentially, one per forward pass, during decoding.
Azure therefore assigns separate TPM budgets per PTU for input and output tokens.
GPT‑4o Throughput per PTU
For gpt‑4o, the effective per‑PTU capacities are:
| Metric | Value |
| --- | --- |
| Input TPM per PTU | ~2,500 |
| Output TPM per PTU | ~625 |
| Input : Output ratio | 4 : 1 |
These ratios are baked into Azure’s PTU calculators and provisioning logic.
The Core Formula
To compute required GPU capacity (PTUs), size each token stream against its per‑PTU budget:

PTUs (input) = Input TPM Ă· 2,500
PTUs (output) = Output TPM Ă· 625

Then:

PTUs required = ceil( max( PTUs (input), PTUs (output) ) ), rounded up to the model’s minimum deployment size.
Step‑by‑Step Example
Assume this workload (the figures here are illustrative):

Input TPM: 20,000
Output TPM: 6,000

Input side: 20,000 Ă· 2,500 = 8 PTUs
Output side: 6,000 Ă· 625 = 9.6 → 10 PTUs
Max of the two: 10 PTUs
Apply Azure’s minimum deployment size → 15 PTUs required.
This is why tables often show PTUs higher than a simple TPM Ă· constant calculation.
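The sizing rule above can be sketched in a few lines of Python. The per‑PTU rates come from the table earlier in this post; the minimum deployment size of 15 is an assumption inferred from this example, so check the PTU calculator for your model and region before relying on it.

```python
import math

# Effective per-PTU throughput for gpt-4o (from the table above).
INPUT_TPM_PER_PTU = 2500
OUTPUT_TPM_PER_PTU = 625

def required_ptus(input_tpm: int, output_tpm: int, min_deployment: int = 15) -> int:
    """Size a gpt-4o PTU deployment from peak token throughput.

    The deployment must satisfy both token streams, so take the max of the
    input-bound and output-bound requirements, round up to a whole PTU, then
    apply the minimum deployment size (assumed to be 15 here).
    """
    input_ptus = input_tpm / INPUT_TPM_PER_PTU
    output_ptus = output_tpm / OUTPUT_TPM_PER_PTU
    return max(math.ceil(max(input_ptus, output_ptus)), min_deployment)

# A workload of 20,000 input TPM and 6,000 output TPM is output-bound
# (9.6 PTUs vs. 8), but the minimum deployment size dominates: 15 PTUs.
print(required_ptus(20_000, 6_000))
```

Note how, in larger workloads, the output stream typically becomes the binding constraint long before the input stream does.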
Why Output Tokens Matter More
Output tokens are generated autoregressively, one token per forward pass through the model, so each one consumes far more compute than an input token, which is processed alongside the rest of the prompt in a single parallel prefill pass.
That’s why GPT‑4o uses a 4:1 input‑to‑output ratio, and why output TPM often becomes the bottleneck in chatty or agentic workloads. [modelavail...bility.com]
Practical Guidance
Azure recommends validating sizing with the PTU Calculator and real traffic benchmarks before committing long‑term reservations.
Final Takeaway
For GPT‑4o and newer models, GPU sizing is token‑driven, not hardware‑driven.
PTUs abstract GPUs, and the required capacity is simply the maximum of input‑bound and output‑bound throughput needs.
Once you understand that, GPT‑4o capacity planning becomes predictable, explainable, and much easier to operate at scale.
Take control of your data by discovering sensitive information across every file type and location with Microsoft Purview Information Protection. Classify your data, apply clear labels, and enforce protections that automatically adapt to human and AI interactions so you can reduce risk without slowing down workflows. Proactively monitor, assess, and respond to risk in real time. Use labeling and layered policies to stop accidental sharing, manage AI access, and maintain consistent protection across your organization.
Matt McSpirit, Microsoft Mechanics expert, joins Jeremy Chapman to share how to turn scattered data into actionable security that moves as fast as your team and AI.
â–ş QUICK LINKS:
00:00 - Microsoft Purview data protection
01:04 - Data Loss Prevention
03:36 - Layered approach in addition to DLP
04:13 - Unified classification
04:27 - How sensitive data is determined
06:23 - Create trainable classifiers
07:06 - Distinction between classification and labeling
08:06 - Configure policy protections
09:12 - DLP in action
10:10 - IRM in action
10:51 - See how protections show up
13:37 - Move from reactive to proactive protection
15:00 - Wrap up
â–ş Link References
For deeper guidance, go to https://aka.ms/PurviewInformationProtection
â–ş Unfamiliar with Microsoft Mechanics?
As Microsoft's official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.
• Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries
• Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog
• Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast
â–ş Keep getting this insider knowledge, join us on social:
• Follow us on Twitter: https://twitter.com/MSFTMechanics
• Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/
• Enjoy us on Instagram: https://www.instagram.com/msftmechanics/
• Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics
#DataSecurity #DataLossPrevention #Microsoft365 #AISecurity