Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

I Don’t Have Any Ideas


It is something you hear from folks when you ask why they use artificial intelligence. Interestingly, it is the same thing people would say, before this moment, when you asked them why they didn’t write. As I sit here on a Sunday afternoon, writing various fictional and non-fictional stories in my notebook while running multiple Claude agents to crawl and produce machine-readable artifacts for thousands of different companies, you can really feel the extraction that occurs, leaving us without ideas.

I wouldn’t just blame AI here. TV has been eroding our creativity since midway through the last century. The Internet just personalized it. And AI is turning up the volume. We don’t have any ideas because we have resorted to just being consumers, and we are in the business of having ideas. We’ve outsourced our need for ideas rather than seeing ourselves as the source of ideas. I’ve walked this line my whole life. I was a TV baby for sure. I was an early Internet adopter and believer. I have suffered chronically throughout my life from not being good at anything and not having any good ideas.

In 2010 I bought a lot of books, but never really read them. Then I began writing. I fell in love with writing. I learned to need writing. It would take me another decade before I would fall back in love with reading. I now see reading and writing as essential. Not just to having ideas, but to living. They are related. If I don’t have ideas, it is because I don’t read enough. If I don’t have ideas, it is because I don’t write. This isn’t just about a single book or single story. This is about doing it in general. Having muscles. Having a desire to read and write. Hitting the wall with reading and writing. Getting back to it. Practicing, failing, succeeding. Just showing up and doing it regularly.

As I write this I am tending to 3 separate Claude agents. One is doing research, another is taking that research and updating a website, and a third is looking through the work I’ve done this week. I do all of this for my business Naftiko to understand how my clients operate, what they need, and where the gaps are. This story lives in Kin Lane, where we don’t use AI, but straddling the two worlds reveals very clearly for me why we don’t have ideas. Like I said before, this isn’t just AI. This is a social media, Internet, and television thing. We don’t have ideas because we’ve opted to not have ideas. We made that choice. We continue to make the choice each day.

When I started API Evangelist I barely had any ideas. It took me years to get to the point where I had a sustained flow of ideas, let alone any good ones. I have had seasons on API Evangelist where I wrote 3-5 blog posts a day. Too many ideas (not possible). Even some of the unpolished ones were worth it because they contributed to more polished ones down the road. I have also seen the unintended side effect of the process when I produce ideas I barely notice, but others see as gems. After reading this, I want you to know that if you find yourself in a moment where you don’t have any ideas, you have put yourself there. Go read. Go write. You’ll be fine.



Read the whole story
alvinashcraft
52 minutes ago
reply
Pennsylvania, USA
Share this story
Delete

Accelerating Frontier Transformation with Microsoft partners


AI has moved quickly from experimentation to production. Customers want measurable business outcomes, along with security, governance and responsible AI built in from day one. Microsoft partners are a meaningful differentiator in delivering these objectives. They turn ideas into deployable solutions by prioritizing the highest-value use cases, building the right data and security foundations and establishing adoption and measurement capabilities so customers can run AI reliably in production.

Frontier Transformation is where AI becomes a repeatable, governed capability embedded into the flow of work, business processes and customer engagement. Customers are quickly moving from targeted pilots to operating AI at scale with a foundation built upon identity, data protection, compliance, monitoring and change management. As organizations expand from custom agents to agent-led processes, unified governance is essential so leaders can manage risk, track performance and scale with confidence.

Two essentials: Intelligence and Trust

Frontier Transformation depends on two essential elements: intelligence and trust. Customers want solutions grounded in their unique work intelligence, including their data, business context and operational realities. They also expect trust by design, with AI artifacts observable, managed and secured across the technology stack so they can deploy responsibly and scale with confidence.

A success framework for Frontier Transformation

Microsoft has developed a powerful framework for success as partners enable AI transformation for customers across all segments, industries and geographies:

  1. Enriching employee experiences: enabling businesses to empower employees with world-class tools and capabilities to activate a thriving, productive workforce
  2. Reinventing customer engagement: applying AI and agentic solutions to break through with customers, accelerate revenue growth, become more efficient at customer acquisition and deliver more personalized solutions
  3. Reshaping business processes: redesigning workflows across the business, enhanced by AI and agentic capability
  4. Bending the curve on innovation: AI acceleration is a powerful catalyst for business transformation and for addressing society’s biggest challenges — curing disease, addressing climate change and famine and other meaningful advancements

The “what” matters, and so does the “how.” Organizations that scale successfully put AI where people already work, enable innovation close to the business challenge and build observability at every layer so leaders can measure quality, govern risk and manage AI like a production system.

More than 90% of the Fortune 500 use Microsoft 365 Copilot, reflecting how quickly AI is becoming part of everyday work.(1) IDC predicts 1.3 billion agents in circulation by 2028(2) and 80% of the Fortune 500 are already using Microsoft agents, led by operationally complex industries like manufacturing, financial services and retail.(3) As customers move from piloting AI to agents embedded in their flow of work, governance and security need to scale with them.

Microsoft’s approach is straightforward: Copilot drives action in the flow of work, agents orchestrate workflows across systems and Microsoft Agent 365 provides a unified control plane designed to govern and secure agents at scale, with the same tools businesses use for employee administration, such as Microsoft admin center, Defender, Entra and Purview.

Partners are creating impact right now in three areas. First, agentic workflows that remove operational friction and orchestrate end-to-end work across operations, finance, supply chain and service. Second, Customer Zero maturity. Partners who adopt Copilot and agents internally build credibility and move faster because they have meaningful, real-world experiences that they translate into their go-to-market plans. Third, security as the foundation. There is no AI at scale without secure identity, protected data and strong governance.

Microsoft 365 E7 and Agent 365: The Frontier Suite

In March, Microsoft introduced Wave 3 of Microsoft 365 Copilot and announced Microsoft 365 E7: The Frontier Suite, with general availability of Microsoft 365 E7 and Microsoft Agent 365 on May 1, 2026.

Microsoft 365 E7 brings together Microsoft 365 E5 for secure productivity, Entra Suite for identity and access control, Microsoft 365 Copilot for AI in the flow of work and Agent 365 as the control plane to govern and scale agents. It is grounded in shared intelligence from Work IQ, the layer that brings together signals from the Microsoft 365 environment, including content, context and activity, so AI can operate with the right business grounding and policy awareness.

Microsoft Agent 365 provides a unified control plane for agents, enabling IT, security and business teams to observe, govern and secure agents across the organization. This applies to any agents an organization uses, whether they are built on Microsoft AI platforms, delivered by ecosystem partners or introduced through other technology stacks. It also applies the same security and compliance capabilities teams already rely on, including Microsoft Defender, Microsoft Entra and Microsoft Purview.

Some customer scenarios require custom agents. Microsoft Agent Factory is designed to accelerate the move from experimentation to execution. The Microsoft Agent Factory Pre-purchase Plan (P3) adds licensing flexibility across Copilot Studio, Microsoft Foundry, Fabric and GitHub, with tiered discounts intended to support broader adoption rather than isolated pilots. It also enables inclusion of tailored, role-based skilling at no additional cost to the customer, reducing adoption friction and increasing delivered value.

The opportunity for partners is end-to-end, and this is where the partner’s strategy really matters. By shifting from transaction-first to outcome-first, partners who iterate quickly, establish clear guardrails and build an operating rhythm for adoption move customers from interest to impact.

Over time, every organization will employ people who can direct and govern agents as part of daily work. Partners can make that capability real through packaged offers, change management and managed operations. Publishing those packaged offers in the Microsoft Marketplace adds a scalable route to market, improving discoverability and enabling a more repeatable buy-and-deploy motion as customers expand agent usage.

Partner success: What governed scale looks like in practice

“AI is at the forefront of everything we do. Through our ‘learn, use, create’ methodology and our AI Academy, we really support partners with learning paths.”
— Nicole Clark, Global Alliance Manager, Arrow Electronics

Partners are embracing Frontier Transformation by modernizing foundations, driving adoption, designing security into delivery and building agents that automate repeatable work and orchestrate business processes.

  • Cognizant treated legacy automation as a platform modernization effort. Using Microsoft Power Platform, Copilot agents and governance frameworks, Cognizant migrated and modernized automation and scaled it across teams, consolidating platforms, lowering costs and reducing manual work through agent-led workflows.
  • EPAM’s work with their customer Albert Heijn demonstrates what agent-first execution looks like in frontline scenarios. By delivering an employee-facing virtual assistant inside the retailer’s staff app, EPAM supported scenarios like restocking, onboarding and faster access to product and inventory information, with enterprise governance and observability in mind.
  • Insight’s Flight Academy shows what it looks like to treat adoption as a program, not an announcement. Through a structured approach, Insight enables teams to build AI fluency in daily work and reinforces usage with practical learning and internal momentum that can scale beyond early enthusiasts.
  • aCloud demonstrated a repeatable security pattern with their customer Jurong Engineering Limited (JEL) by bringing together Microsoft Purview, Microsoft Sentinel, Microsoft Defender XDR and Microsoft Security Copilot, paired with co-design workshops and cross-team alignment to strengthen compliance readiness.
  • Arrow Electronics showed how distributor-led enablement can accelerate partner execution by using ArrowSphere to streamline Cloud Solution Provider (CSP) lifecycle management and ArrowSphere Assistant to surface AI-driven insights for renewals, upsell opportunities and Copilot adoption, complemented by a security dashboard that strengthens posture visibility and supports trust-by-design conversations.

Find more stories of partners innovating and driving meaningful outcomes for customers with Microsoft technology.

This same disciplined approach is especially relevant in the small and medium business space (SMB), where Microsoft partners offer end-to-end capability through managed services offerings and solutions packaged into repeatable motions, tailored to this customer segment.

SMB momentum: Scaling work with Copilots and agents

As Microsoft 365 Copilot Business expands AI built for work to organizations with fewer than 300 users, SMBs have a practical path to adopt AI more broadly. CSP partners are well positioned to guide that journey with a motion that combines adoption, security and ongoing management.

New Omdia research illustrates that CSP is a durable growth model for partners. In a study of 267 CSP partners across 36 countries, 79% rated CSP authorization as good, very good or excellent, and 88% would recommend it to other partners.(4) Omdia also found that 60% of CSP partner revenue is now tied to value-added services, with licensing acting as the entry point to broader, services-led engagements.(6)

“We’re bringing customers resources that only a partner can deliver to them: our relationship with Microsoft, technical training and programs that push them further and faster to learn technologies like Microsoft Copilot Studio, Foundry and Fabric.”
— Chance Weaver, Global VP of AI Adoption, Pax8

SMB demand is also expanding. For CSP partners, the near-term opportunity is to standardize a motion that advances Copilot and agents from conversation to consumption. Lead with a simple, repeatable motion: outcome selection, security baseline, deployment, adoption and optimization cadence. Renewal moments are often the easiest time to introduce change when paired with a clear business case and time-bound offers.

A simple, scalable approach is to roll out in stages:

  1. Deploy Microsoft 365 Copilot Business broadly, paired with a strong foundation of identity, data protection and compliance.
  2. Target high-propensity accounts with tools such as Microsoft CloudAscent and the AI Business Solutions & Security Insights dashboard to deepen adoption and standardize responsible prompting.
  3. Extend with agents to take on repeatable tasks and support key business processes, with governance and security built in.

Microsoft provides CSP partners with a powerful set of tools to combine licensing, lifecycle management and optimization into one customer relationship. Omdia notes that partners value operational advantages such as monthly billing flexibility and managing licenses through Partner Center for real-time provisioning and 24/7 license management. Partners can review the CSP incentives guide to understand the latest CSP incentives and how they map to an SMB motion.

Microsoft supports SMB-focused partners by combining product, security and go-to-market resources that make it easy to deliver a repeatable motion. That includes tools to assess readiness, prioritize the right use cases and track adoption over time, plus role-based skilling to build sales, technical and delivery confidence across Copilot, security and agents. For partners building managed services offerings, Microsoft Marketplace also provides a scalable route to market, improving discoverability and enabling customers to buy through familiar procurement paths, while Partner Center brings licensing and lifecycle management into the same operational flow.

Program momentum and updates

The Microsoft AI Cloud Partner Program continues to be the primary way we invest in partners as they build, sell and deliver cloud and AI solutions. Our focus is simple: enable partners to build capability, accelerate demand, differentiate in the market and scale repeatable delivery.

In February 2026, Microsoft introduced a wealth of expanded benefits across Copilot, security, Azure credits and go-to-market resources. These updates are designed to strengthen how partners run their business and accelerate their ability to take solutions to market. We continue to evolve partner benefits packages as a practical growth lever, combining product, support and advisory benefits so partners can invest with confidence.

To enable AI Transformation, Microsoft is introducing program updates and offers in the coming months. These updates are intended to enable partners, including services partners, channel partners and software companies, to build and deliver agents across the Frontier product stack. 

  • Differentiation via Frontier Partner specialization: The Frontier Badge is evolving into a Frontier Partner specialization. This specialization differentiates partners, including services partners and channel partners, who demonstrate capabilities to build or deliver agents across Microsoft’s Frontier product stack. It creates a clear way for customers and Microsoft field sales teams to identify partners with validated readiness for agentic AI scenarios.
  • Updated Frontier Distributor designation: Microsoft is evolving the Frontier Distributor designation to reflect distributor capabilities that matter for scaling agentic AI across the channel. For partners, this will make it easier to identify distributors that can deliver repeatable skilling, enablement to build and manage agents and Marketplace-backed motions to transact and grow agent sales.
  • Benefits for software companies building AI apps and agents via App Accelerate: App Accelerate supports software companies building AI apps and agents on the Microsoft agent stack, with benefits designed to bring agentic solutions to market with strong foundations for trust.

Investments in skilling

This year we are investing in partner skilling that connects certification readiness to project-ready execution, with role-based experiences like Project Ready Workshops that translate skills into repeatable delivery practices. These learning experiences are delivered through our Partner Skilling Hub.

We are also introducing the Frontier Engineer Badge, a new learning path delivered through Titan Academy that prepares Solution Engineers and Solution Architects within the partners’ organization to design, build and operate production-ready agentic AI solutions across the Frontier Transformation stack, including Microsoft Copilot, Copilot Studio, Azure AI Foundry, GitHub Copilot, Microsoft Fabric and Agent 365.

The journey is hands-on by design and follows a three-part model: earn required certifications to establish a shared technical baseline, demonstrate delivery capability through Project Ready (building and integrating agents with governance, security and compliance) and build advanced readiness for operating at scale through governance, velocity and industry solution patterns. The outcome is clear: delivery-ready engineers who can move customers from prototypes to trusted, governed deployments.

Capturing the Marketplace opportunity

Marketplaces matter more in 2026 as customers consolidate procurement and expect faster time to value, especially as AI moves from pilots to production. With over 5,000 AI solutions available, Microsoft Marketplace increases discoverability for partner-built AI solutions, including agents, and supports a more repeatable buy-and-deploy motion through familiar procurement. It also makes it easy for partners to package multiparty software and services offers, so customers can purchase what they need to implement, govern and scale AI in production.

Omdia projects Microsoft Marketplace as a nearly $300 billion partner services opportunity by 2030. In the same study, partners selling through Marketplace reported go-to-market benefits, including faster sales cycles and larger deals, with 75% of study participants reporting faster closes and 69% reporting larger deals through Microsoft Marketplace.(5)

To accelerate demand generation and make it easy to activate these motions, we recently introduced Partner Marketing Center Pro, an AI-powered experience for end-to-end campaign creation. It brings campaign discovery, customization and co-branding, intelligent localization and translation, automated publishing and built-in reporting into one workflow, with an AI assistant that provides coaching throughout the process.

Partner Marketing Center Pro is a benefit of the Microsoft AI Cloud Partner Program available to partners who have purchased at least one partner benefits package, who have attained a Solutions Partner designation or who are currently enrolled in ISV Success.

Ways to engage now

Here is a practical next step partners can take to maximize their Microsoft investment:

Start by joining the Microsoft AI Cloud Partner Program.

Making Frontier Transformation real

Frontier Transformation is about building AI-powered operating capability grounded in intelligence and trust, and delivered consistently across industries, geographies and market segments. Partners make that real for customers by turning strategy into production-ready solutions, with governance, security and adoption built in from day one.

Microsoft is committed to partner success. We will continue investing in the Microsoft AI Cloud Partner Program, with the incentives, the skilling and the go-to-market capabilities that enable partners to build repeatable offers, increase discoverability and deliver trusted AI outcomes for customers at scale.

Nicole Dezen leads the Microsoft partner ecosystem and the Global Channel Partner Sales organization in Small, Medium Enterprises and Channel (SME&C). As Chief Partner Officer, she has grown the Microsoft partner ecosystem to become the largest in the industry, enabling more than 500,000 partners to deliver AI transformation to millions of customers in each segment around the world.

 

Footnotes

1 Microsoft FY25 Third Quarter Earnings Conference Call, Microsoft, April 2025.
2 IDC Info Snapshot, sponsored by Microsoft, 1.3 Billion AI Agents by 2028, #US53361825, May 2025.
3 Based on Microsoft first party telemetry measuring agents built with Microsoft Copilot Studio or Microsoft Agent Builder that were in use during the last 28 days of November 2025.
4 Omdia, Unlocking Growth Potential: Partner Perspectives on Microsoft CSP, December 2025. Results are not an endorsement of Microsoft. Any reliance on these results is at the third party’s own risk.
5 Microsoft estimate based on IDC data (SMB TAM: $777B by FY26; $1T+ by 2030), as published on the Microsoft Partner Blog, “The Microsoft Marketplace opportunity for channel ecosystem,” November 20, 2025.
6 Omdia, Partner Ecosystem Multiplier – The Microsoft Marketplace Opportunity, commissioned research sponsored by Microsoft, December 2025. Results are not an endorsement of Microsoft. Any reliance on these results is at the third party’s own risk.

Throughout this document, $ refers to USD.

 

 

 

 

The post Accelerating Frontier Transformation with Microsoft partners appeared first on The Official Microsoft Blog.


Connect to Copilot CLI with a QR code on web & mobile - this is crazy!

From: Gerald Versluis
Duration: 0:47
Views: 20

You can now connect to your GitHub Copilot CLI sessions remotely from the web or your mobile app from all around the world and just keep working on projects that run on your desktop.

💝 Join this channel to get access to perks:
https://www.youtube.com/channel/GeraldVersluis/join

🛑 Don't forget to subscribe to my channel for more cool content: https://www.youtube.com/GeraldVersluis/?sub_confirmation=1

🎥 Video edited with Camtasia (ref): https://techsmith.z6rjha.net/AJoeD

🙋‍♂️ Also find my...
Blog: https://blog.verslu.is
All the rest: https://jfversluis.dev

#githubcopilot #copilotcli #agenticai


Getting The Most for Your Money Using Automated SMS

Tips and tricks for learning how to count segments using Twilio's automated SMS in ways that will save you budget and time.

EF Core is Better with Wolverine


TL;DR: Wolverine has a pretty good development and production time story for developers using EF Core and that is constantly being improved.

Wolverine was restarted 3-4 years back specifically to combine with Marten as a complete end to end solution for Event Sourcing and CQRS with asynchronous messaging support. While that “Critter Stack” strategy has definitely paid off, vastly more .NET developers and systems are using EF Core as their primary persistence mechanism. And since I’d personally like to see Wolverine get much more usage and see JasperFx Software continue to grow, we’ve made a serious effort to improve the development time experience with EF Core and Wolverine.

To get started using EF Core with Wolverine, install this NuGet package:

dotnet add package WolverineFx.EntityFrameworkCore

I should say, that’s not strictly necessary, but all of the development time accelerators, middleware, and transactional inbox/outbox integration we’re about to utilize require that library.

Let’s just get started with a simple Wolverine bootstrapping configuration that is going to use a single EF Core DbContext (for now, Wolverine happily supports using multiple DbContext types in a single application) and SQL Server for the Wolverine message persistence we’ll need for transactional outbox support later:

var builder = Host.CreateApplicationBuilder();
var connectionString = builder.Configuration.GetConnectionString("sqlserver")!;

// Register a DbContext or multiple DbContext types as normal
builder.Services.AddDbContext<ItemsDbContext>(
    x => x.UseSqlServer(connectionString),

    // This is actually a significant performance gain
    // for Wolverine's sake
    optionsLifetime: ServiceLifetime.Singleton);

// Register Wolverine
builder.UseWolverine(opts =>
{
    // You'll need to independently tell Wolverine where and how to
    // store messages as part of the transactional inbox/outbox
    opts.PersistMessagesWithSqlServer(connectionString);

    // Adding EF Core transactional middleware, saga support,
    // and EF Core support for Wolverine storage operations
    opts.UseEntityFrameworkCoreTransactions();
});

// Rest of your bootstrapping...

With that in place, let’s look at a simple message handler that uses our ItemsDbContext:

public static class CreateItemCommandHandler
{
    public static ItemCreated Handle(
        // This would be the message
        CreateItemCommand command,

        // Any other arguments are assumed
        // to be service dependencies
        ItemsDbContext db)
    {
        // Create a new Item entity
        var item = new Item
        {
            Name = command.Name
        };

        // Add the item to the current
        // DbContext unit of work
        db.Items.Add(item);

        // This event being returned
        // by the handler will be automatically sent
        // out as a "cascading" message
        return new ItemCreated
        {
            Id = item.Id
        };
    }
}

In the handler above, you’ll notice there are no asynchronous calls at all, and that’s because we’ve turned on Wolverine’s transactional middleware for EF Core, which will handle the actual transaction management. You’ll also notice that we’re using Wolverine’s cascading messages syntax to kick out an ItemCreated domain event upon the successful completion of this handler. With the EF Core transactional middleware, that also handles any integration with Wolverine’s transactional outbox for reliable messaging. There is absolutely nothing else for you to do in that handler to enable any of that behavior, and we can shove off some of the typically ugly async/await mechanics into Wolverine itself while keeping our actual application behavior cleaner.
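A cascaded message like ItemCreated is routed like any other Wolverine message, so a separate handler can subscribe to it. Here is a minimal sketch of what a downstream handler might look like; the ItemCreatedHandler class and INotificationService dependency are hypothetical names for illustration, not from the post:

```csharp
// Hypothetical downstream handler for the cascaded ItemCreated event.
// INotificationService is an illustrative dependency, not a real API.
public static class ItemCreatedHandler
{
    public static Task Handle(
        // The cascaded event message from CreateItemCommandHandler
        ItemCreated message,

        // Any other arguments are assumed to be service dependencies
        INotificationService notifications)
    {
        // Because the event flowed through the transactional outbox,
        // this only runs after the original transaction committed
        return notifications.NotifyItemCreatedAsync(message.Id);
    }
}
```

Because the outbox only releases messages after the enclosing transaction commits, a handler like this never observes an Item that was rolled back.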

Now let’s go a little farther and utilize some Wolverine optimizations for our EF Core usage and change the service registration up above to this:

// If you're okay with this, it will register the DbContext as normal,
// but make some Wolverine-specific optimizations at the same time
builder.Services.AddDbContextWithWolverineIntegration<ItemsDbContext>(
    x => x.UseSqlServer(connectionString), "wolverine");

That version of the integration optimizes application performance by fine-tuning the service lifetimes in a way that improves Wolverine’s internal usage of the DbContext type, and adds direct mappings for Wolverine’s internal inbox and outbox storage. By using a “Wolverine optimized DbContext” like this, Wolverine is able to improve your system’s performance by allowing EF Core to batch the SQL commands for your application code and Wolverine’s transactional outbox storage in a single database round trip, and that’s important because the single most common killer of performance in enterprise applications is database chattiness!

So that’s the bare bones basics, now let’s look at some recent improvements in Wolverine for…

Development Time Usage with EF Core

We’ve invested a lot of time recently in trying to make EF Core easier to work with at development time with Wolverine. This comes from our experience with Marten, where database migrations have an “it should just work” model that quietly configures the database to match your application configuration at runtime for quick iteration at development time.

With the WolverineFx.EntityFrameworkCore library, you can get that same behavior with EF Core through this option:

builder.UseWolverine(opts =>
{
    opts.Services.AddDbContextWithWolverineIntegration<ItemsDbContext>(
        x => x.UseSqlServer(connectionString));

    // Diff the DbContext against the live DB at startup and apply missing DDL
    opts.UseEntityFrameworkCoreWolverineManagedMigrations();

    // This will make Wolverine do any necessary database migration
    // work at application startup
    opts.Services.AddResourceSetupOnStartup();
});

To be clear, with this setup you can change your EF Core mappings, then restart the application or an IHost in testing, and your application will automatically detect any database differences from the configuration and quietly apply a patch on startup. This enables a much faster iteration cycle than EF Core Migrations do, in my opinion.

The Weasel docs go deeper on the diff engine, opt-outs, and how it handles schemas.

Another feature in Marten that our community utilizes very heavily is the ability to quickly reset the state of a database in tests. I’ve also occasionally used the Respawn library for the same kind of ability when developing closer to the metal of a relational database. In a recent version of Wolverine, we’ve added similar abilities to our EF Core support, including a version of Marten’s IInitialData concept to help you reset data in tests:

public class SeedItems : IInitialData<ItemsDbContext>
{
    public async Task Populate(ItemsDbContext context, CancellationToken cancellation)
    {
        context.Items.Add(new Item { Name = "Seed" });
        await context.SaveChangesAsync(cancellation);
    }
}

builder.Services.AddInitialData<ItemsDbContext, SeedItems>();

And to see that in usage:

[Fact]
public async Task ordering_flow()
{
    await _host.ResetAllDataAsync<ItemsDbContext>();

    // arrange ... act ... assert
}

The ResetAllDataAsync<T>() method will look through a DbContext to find all the tables it maps to, and delete all the data in those tables. It takes foreign key relationships into account when ordering its operations. After the data is wiped out, each IInitialData<T> registered in your system is applied to lay down baseline data.
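In a larger test suite, this reset usually lives in a shared fixture so every test starts from the same baseline. A sketch under assumptions: the AppFixture type, ItemTests class, and BuildTestHostAsync helper are all hypothetical wiring, with only ResetAllDataAsync<T>() coming from Wolverine:

```csharp
// Hypothetical xUnit fixture; only ResetAllDataAsync<T>() is Wolverine API.
public class AppFixture : IAsyncLifetime
{
    public IHost Host { get; private set; } = null!;

    public async Task InitializeAsync()
    {
        // Build the application host once for the whole test class
        Host = await BuildTestHostAsync(); // hypothetical helper
    }

    public Task DisposeAsync() => Host.StopAsync();
}

public class ItemTests : IClassFixture<AppFixture>
{
    private readonly IHost _host;

    public ItemTests(AppFixture fixture) => _host = fixture.Host;

    [Fact]
    public async Task creating_an_item()
    {
        // Wipe the mapped tables, then re-apply IInitialData seeds
        await _host.ResetAllDataAsync<ItemsDbContext>();

        // arrange ... act ... assert
    }
}
```

Sharing one IHost across tests keeps the suite fast, while the per-test reset keeps tests isolated from each other.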

While this feature will surely have to be enhanced if many people start using it, this is already helping us make the Wolverine internal EF Core testing a lot more reliable and easier to use.

Declarative Persistence with EF Core

The next usage is special to Wolverine. Often, in simpler HTTP endpoints or command handlers, you just need to load an entity by its identity or primary key, and frequently you’ll need to apply some repetitive validation that the entity exists in the first place. For that common need, Wolverine has declarative persistence helpers like the [Entity] attribute shown below, which can automatically load an entity through EF Core by its identity, inferred from the incoming command type by naming conventions, as in this sample:

The identity can be explicitly mapped as well, of course, and the pre-generated code always reveals Wolverine’s behavior around handlers or HTTP endpoint methods.

public class ItemsDbContext : DbContext
{
    public DbSet<BacklogItem> BacklogItems { get; set; }
    public DbSet<Sprint> Sprints { get; set; }
}

public record CommitToSprint(Guid BacklogItemId, Guid SprintId);

public static class CommitToSprintHandler
{
    public static object[] Handle(
        CommitToSprint command,

        // There's a naming convention here about how
        // Wolverine "knows" the id for the BacklogItem
        // from the incoming command
        [Entity(Required = true)] BacklogItem item,
        [Entity(Required = true)] Sprint sprint)
    {
        return item.CommitTo(sprint);
    }
}

In the code above, Wolverine “knows” that ItemsDbContext persists both the BacklogItem and Sprint entities, so it generates code around your handler to load those entities through ItemsDbContext. We can also tell Wolverine to automatically stop handling, or in HTTP usage return a 400 ProblemDetails response, if either of the requested entities is missing from the database. This helps keep Wolverine handler or HTTP endpoint code simpler by eliminating asynchronous code and letting you write more and more of your business or workflow logic in pure functions that are easy to test.

In the code above, the EF Core transactional middleware is calling ItemsDbContext.SaveChangesAsync() for you, and the automatic EF Core change tracking will catch the change to the BacklogItem.

And, I think this is cool: Wolverine now has its own mechanism to batch up the two queries above through a custom EF Core futures query mechanism, so that the handler above can fetch both the BacklogItem and the Sprint entity in one database round trip.
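The general shape of a futures-style query batch can be sketched in a few lines. This `QueryBatch` class is a made-up illustration of the concept, not Wolverine's actual implementation: each query registers a pending result up front, and a single execute call runs everything in one round trip.

```python
# Illustrative-only sketch of "futures" query batching: queries are recorded
# as pending, then executed together in a single round trip.

class QueryBatch:
    def __init__(self, fetch_many):
        self._fetch_many = fetch_many  # runs all queries in one round trip
        self._pending = []

    def query(self, sql):
        # Return a "future" whose result is filled in later by execute().
        future = {"sql": sql, "result": None}
        self._pending.append(future)
        return future

    def execute(self):
        results = self._fetch_many([f["sql"] for f in self._pending])
        for future, result in zip(self._pending, results):
            future["result"] = result

# Stand-in for a database client that accepts a batch of statements:
batch = QueryBatch(lambda sqls: [f"rows for: {s}" for s in sqls])
item = batch.query("select * from backlog_items where id = @id")
sprint = batch.query("select * from sprints where id = @id")
batch.execute()  # one round trip resolves both futures
```

The payoff is the same as in the handler above: two entity loads, one trip to the database.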

But wait, there’s more!

At the risk of making this blog post way too long, here are more ways that Wolverine can make EF Core usage more successful:




General Availability: Dynamic Data Masking for Azure Cosmos DB


Protecting sensitive data is a foundational requirement for modern applications. As organizations scale their use of Azure Cosmos DB across teams, environments, and workloads, controlling who can see sensitive information becomes increasingly important.

Today, we are excited to announce the general availability of Dynamic Data Masking (DDM) for Azure Cosmos DB for NoSQL. This capability helps protect sensitive data by automatically masking it for users who do not have permission to view the original values, without requiring changes to application code or data models.

Dynamic Data Masking helps teams reduce the risk of accidental data exposure while simplifying security and compliance across production workloads.

What is Dynamic Data Masking?

Dynamic Data Masking is a server-side, policy-based security feature that masks sensitive fields in query results for non-privileged users. Azure Cosmos DB applies masking dynamically at query execution time, while the underlying data stored in the database remains unchanged.

Authorized users continue to see full, unmasked values. All other users see masked values based on the configured policy. As a result, Azure Cosmos DB protects sensitive data by default, even as access patterns and team structures evolve.

In particular, Dynamic Data Masking is well-suited for protecting PII, PHI, and other confidential data that should only be visible to a limited set of users.


Why is This Important?

In many applications, read access must be granted broadly to developers, support engineers, analysts, or partner teams. Historically, protecting sensitive fields required implementing custom masking logic in every application or service accessing the data.

That approach is difficult to maintain, easy to misconfigure, and hard to audit at scale.

Instead, Dynamic Data Masking moves this responsibility into Azure Cosmos DB itself. Azure Cosmos DB enforces masking policies centrally and consistently for all queries, across all clients and SDKs, without requiring conditional logic in applications. As a result, teams reduce operational risk and find it easier to enforce least-privilege access in production environments.

Here is what you get immediately after enabling it:

  • Safer read access in production – Non-privileged users see masked field values, while privileged users continue to see full values.
  • No application changes – Azure Cosmos DB enforces masking server-side at query time, without changing stored data.
  • Role-aware enforcement – Azure Cosmos DB evaluates masking based on the user’s role and privilege, so you can grant broad read access while controlling who can unmask.
  • Easier compliance posture – DDM helps limit exposure of personal or protected data by ensuring non-privileged reads return masked values.

How Does It Work?

You can configure Dynamic Data Masking for your Azure Cosmos DB account through the Azure portal. Specifically, the process involves:

  • First, enable Dynamic Data Masking in the Features tab under Settings.


  • Next, define roles and permissions using Azure Cosmos DB’s data plane role-based access control (RBAC).
  • Then, assign users to roles: privileged users get unmask permissions, while others receive standard roles.
  • Finally, apply a masking policy at the container level, specifying which fields to mask and which masking strategies to use.


Supported Masking Strategies

DDM supports multiple masking strategies to accommodate different data types and scenarios.

  • Default – String values are replaced with a fixed mask (XXXX), numeric values with a default value of 0, and boolean values are always set to false. Examples: Redmond → XXXX, 95 → 0, true → false.
  • Custom String – A portion of the string is masked based on a defined starting index and length using MaskSubstring(Start, Length). Example: MaskSubstring(3,5) masks Washington as WasXXXXXon.
  • Email – Only the first letter of the username and the domain ending (such as .com) remain visible; all other characters are replaced with Xs. Example: alpha@microsoft.com → aXXXX@XXXXXXXXX.com.
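As a rough local illustration of what these strategies do to a value, here is a small sketch. These helper functions are my own approximation of the documented behavior, not part of the Cosmos DB service or any SDK:

```python
# Local illustration of the three masking strategies described above.
# These functions approximate the documented output shapes; the real
# masking happens server-side in Azure Cosmos DB at query time.

def mask_default(value):
    if isinstance(value, bool):   # booleans are always set to false
        return False
    if isinstance(value, (int, float)):  # numbers become 0
        return 0
    return "XXXX"                 # strings get a fixed mask

def mask_substring(value, start, length):
    # Mask `length` characters beginning at 0-based index `start`.
    return value[:start] + "X" * length + value[start + length:]

def mask_email(value):
    # Keep the first letter of the username and the domain ending.
    user, _, domain = value.partition("@")
    name, _, tld = domain.rpartition(".")
    return user[0] + "X" * (len(user) - 1) + "@" + "X" * len(name) + "." + tld

print(mask_default("Redmond"))             # XXXX
print(mask_substring("Washington", 3, 5))  # WasXXXXXon
print(mask_email("alpha@microsoft.com"))   # aXXXX@XXXXXXXXX.com
```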

Sample Masking Policy

Below is an example policy structure that uses included and excluded paths and turns the policy on for a container. This illustrates the shape of a masking policy so you can map it to your own document paths:

"dataMaskingPolicy": 
{
  "includedPaths": [
    {
      "path": "/" // Mask all fields
    },
    {
      "path": "/profile/contact/email", 
      "strategy": "Email" //Email strategy overrides the default mask
    },
    {
      "path": "/employment/history/[]/company",
      "strategy": "MaskSubstring", // MaskSubstring overrides the default mask
      "startPosition": 2,
      "length": 4
    }
  ],
  "excludedPaths": [
    {
      "path": "/projects/[]/projectId" //Exclude projectId from masking
    },
    {
      "path": "/id"
    },
    {
      "path": "/department"
    },
    {
      "path": "/employment/history/[]/duration" 
    },
    {
      "path": "/projects/[]/details/technologies"
    }
  ]
}

Get Started

Dynamic Data Masking is now generally available for Azure Cosmos DB for NoSQL and ready for production workloads. Enable it in the Azure portal or see Dynamic Data Masking in Azure Cosmos DB to learn more.

About Azure Cosmos DB

Azure Cosmos DB is a fully managed and serverless NoSQL and vector database for modern app development, including AI applications. With its SLA-backed speed and availability as well as instant dynamic scalability, it is ideal for real-time NoSQL and MongoDB applications that require high performance and distributed computing over massive volumes of NoSQL and vector data.

To stay in the loop on Azure Cosmos DB updates, follow us on X, YouTube, and LinkedIn. Join the discussion with other developers in the #nosql channel on the Microsoft Open Source Discord.

 

The post General Availability: Dynamic Data Masking for Azure Cosmos DB appeared first on Azure Cosmos DB Blog.
