Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Is the enterprise (actually) ready for AI?

Maryam Ashoori, Head of Product for watsonx.ai at IBM, joins Ryan and Eira to talk about the complexity of enterprise AI, the role of governance, the AI skill gap among developers, how AI coding tools impact developer productivity, what chain-of-thought reasoning entails, and what observability and monitoring look like for AI.

Create Your First AI Agent with JavaScript and Azure AI Agent Service!


Introduction: The Era of AI Agents in JavaScript

During the AI Agents Hackathon, one of the most anticipated sessions was presented by Wassim Chegham, Senior AI Developer Advocate for JavaScript at Microsoft. The topic? "How to Create Your First AI Agent with JavaScript and Azure AI Agent Service", a managed service designed for modern developers looking to build AI-first applications with security, scalability, and productivity in mind.

In this article, we explore the main highlights of the session, focusing on how you can create your own AI agent using JavaScript and Azure AI Agent Service.

The video’s goal is clear: walk through the step-by-step process of creating AI agents using JavaScript and TypeScript with Azure AI Foundry, and explain all the key concepts behind this new development paradigm.

If you missed the session, don't worry! You can watch the recording.

What Are AI Agents?

Wassim starts the session with a historical overview: from traditional chatbots to the intelligent and autonomous agents we know today. He highlights:

  • LLM-based Agents (Large Language Models) that understand natural language.
  • Tool-using Agents that perform real-world tasks like API calls, searches, code execution, etc.
  • Multi-agent Systems, coordinating multiple agents to solve complex problems.

The main advantage of Azure AI Agent Service is how it simplifies all of this by offering a managed platform that handles orchestration, security, tracking, and agent execution.

Solution Architecture with Azure AI Agent Service

During the session, Wassim provided a clear view of a typical AI agent app architecture built with JavaScript. He explained that while you can use a graphical interface (Frontend) with frameworks like Angular or React, it’s not mandatory — the app can work just fine from a terminal, as demonstrated live.

In the Backend, the focus is on using Node.js, often combined with frameworks like Express.js or Fastify to expose APIs that communicate with agents. This API layer acts as a bridge between users and the agent’s logic, coordinating messages, executions, and tool invocations.
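To make this concrete, here is a minimal sketch of such a bridge, assuming Express; askAgent is a hypothetical helper standing in for the agent SDK calls shown later in the article:

import express from "express";
import { askAgent } from "./agent.js"; // hypothetical wrapper around the agent logic

const app = express();
app.use(express.json());

// Forward the user's message to the agent layer and return its reply
app.post("/api/chat", async (req, res) => {
  const reply = await askAgent(req.body.message);
  res.json({ reply });
});

app.listen(3000, () => console.log("API bridge listening on port 3000"));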

The agent itself is created and managed using the @azure/ai-projects SDK, which provides a simple API to register agents, define instructions, attach tools, and control executions. Wassim emphasized how this approach reduces complexity compared to other agent frameworks that require manual state configuration, orchestration, and context management.

Additionally, there is a layer of integrated tools that greatly expand the agent's capabilities:

  • Code Interpreter: sandboxed Python code execution
  • Function Calling: user-defined function invocation
  • Azure AI Search: vector search and RAG (Retrieval-Augmented Generation)
  • Bing Search: real-time web data grounding

All these tools are available plug-and-play through the Azure AI Agent Service infrastructure.

This architecture is powered by an Azure AI Foundry instance, which centralizes control over models, tools, connections, and data, providing a robust, secure, and scalable base for AI-first applications. Wassim made it clear: the agent is the true "brain" of the application — receiving instructions, reasoning over them, and coordinating task execution with external tools, all with just a few JavaScript commands.

Creating Your First Agent: Hands-on with JavaScript

During the hands-on demo, Wassim walks participants through every step to create a working AI agent using JavaScript and Azure AI Agent Service. He begins by highlighting that all code is publicly available in a GitHub repository, so anyone can clone, run, and adapt it.

> Repository link: Azure AI Agent Service - Demonstration

The first step is installing the required packages. The core one is the @azure/ai-projects SDK (npm package), which handles agent interactions. You'll also need @azure/identity to securely authenticate with Azure credentials using, for example, DefaultAzureCredential.
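Assuming npm, installing both is a one-liner:

npm install @azure/ai-projects @azure/identity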

Once the environment is set up, Wassim shows how to create an authenticated client using a connection string from the Azure AI Foundry portal. This string is stored in a .env file and allows secure communication with the agent service.
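As a minimal sketch of that setup (this assumes the beta @azure/ai-projects client surface, which may differ between releases; the environment variable name here is just a convention):

import { AIProjectsClient } from "@azure/ai-projects";
import { DefaultAzureCredential } from "@azure/identity";
import "dotenv/config";

// Connection string copied from the Azure AI Foundry portal into .env
const connectionString = process.env.AZURE_AI_PROJECTS_CONNECTION_STRING;

// Keyless authentication: DefaultAzureCredential picks up your Azure login
const client = AIProjectsClient.fromConnectionString(
  connectionString,
  new DefaultAzureCredential()
);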

With the client ready, the next step is to create the agent. You define its name, the language model (like GPT-4), and clear instructions about what the agent should do — whether it’s calculating, answering questions, interpreting data, or interacting with external tools.
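In code, that step might look like this (model name and instructions are illustrative):

const agent = await client.agents.createAgent("gpt-4o", {
  name: "my-first-agent",
  instructions: "You are a helpful assistant. Answer math questions step by step.",
});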

Wassim then introduces the thread concept, which acts as a conversation space between the agent and user. This is where messages are stored, executions are initiated, and interaction history is tracked. He shows how to create a thread, send a message, and launch a run, or agent execution.
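A sketch of that flow, again assuming the beta agents API:

// 1. A thread is the conversation space between user and agent
const thread = await client.agents.createThread();

// 2. Add a user message to the thread
await client.agents.createMessage(thread.id, {
  role: "user",
  content: "I need to solve the equation 3x + 11 = 14. Can you help me?",
});

// 3. Launch a run (an agent execution) and poll until it finishes
let run = await client.agents.createRun(thread.id, agent.id);
while (run.status === "queued" || run.status === "in_progress") {
  await new Promise((resolve) => setTimeout(resolve, 1000));
  run = await client.agents.getRun(thread.id, run.id);
}

// 4. Read back the messages, which now include the agent's answer
const messages = await client.agents.listMessages(thread.id);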

The session then showcases tool usage. In the first example, the agent solves a simple equation using its internal knowledge — a classic case demonstrating reasoning capabilities based on instructions. Next, Wassim activates a custom function call: the agent fetches local CPU usage, demonstrating environment interaction.
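The CPU-usage function itself can be plain Node.js. The sketch below shows only the generic function-calling shape (a local function plus the JSON schema the model sees); the exact registration call is SDK-version-specific, so treat it as an illustration:

import os from "node:os";

// The local function the agent may ask the app to invoke
export async function getCpuUsage() {
  // os.loadavg() returns the 1, 5, and 15 minute load averages
  // (on Windows it returns zeros)
  return { oneMinuteLoadAverage: os.loadavg()[0] };
}

// The schema advertised to the model so it knows when and how to call it
export const getCpuUsageSchema = {
  name: "getCpuUsage",
  description: "Returns the machine's 1-minute CPU load average.",
  parameters: { type: "object", properties: {}, required: [] },
};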

Another impressive example is using the Code Interpreter tool to run Python code remotely. Wassim uploads a CSV with car sales data, and the agent processes the data and generates charts in real time.

He also demonstrates using Bing Grounding to fetch up-to-date info from the internet (e.g., stock prices). Finally, he shows how Azure AI Search queries a vector index with healthcare plan data to answer specific questions — with precise source citations. A great RAG (Retrieval-Augmented Generation) example.

These examples prove that with just a few JavaScript commands, you can build powerful agents capable of interacting with users, data, and tools seamlessly and securely.

Understanding the Inner Workings: How an Agent Works

Wassim explains the key concepts in an agent's lifecycle:

  • Agent: configured with model and instructions.
  • Thread: represents conversation (context).
  • Run: task execution.
  • Run Steps: steps in the execution.
  • Tools: defined via schema and triggered as needed.
  • Events: emitted during execution (streaming, tool-call, response, error, etc.).

He also showcases a personal project: a visual tracing tool to track agent steps in real-time — helpful for understanding and debugging.

A Bit About the Technologies Used

For the tech-curious, Wassim highlighted the stack powering the project:

  • 📦 SDKs
    • @azure/ai-projects@2.0.0-beta.4
    • @azure/identity for secure credential auth.
  • 🔧 Integrated tools
    • Function Calling: run functions based on LLM input.
    • Code Interpreter: safely run Python remotely.
    • Azure AI Search: vector and full-text search (RAG).
    • Bing Search Grounding: real-time web info.
    • File Search (coming soon): search uploaded files.
  • ⚙️ Security & Compliance
    • Keyless authentication
    • Private Networking (VNet)
    • Content Filtering
    • Tracing/logging to help detect hallucinations

Conclusion: The Future of AI Agents with JavaScript

Wassim Chegham's session at the AI Agents Hackathon was a masterclass in how to create AI agents using JavaScript and Azure AI Agent Service. He not only introduced the core concepts, but demonstrated how quick and easy it is to develop intelligent apps with this new approach.

Again, if you missed the session, you can watch the full recording here.

And don’t forget to check out the GitHub repo with all the examples and code used in the session: Azure AI Agent Service - Demonstration.

Wassim’s closing message was clear: the future of AI agents is bright. With the right tools, any developer can build impactful and innovative solutions. So don’t wait — start building your AI agent with Azure today!



Part 2 - How to Create a VS Code Extension for API Health Checks?


Introduction

Have you ever thought about building a Visual Studio Code extension as your capstone project? That's what I did: Part 1 - Develop a VS Code Extension for Your Capstone Project.

I have created a Visual Studio Code extension, API Guardian, which identifies API endpoints in a project and checks their functionality before deployment. This solution was developed to help developers save time fixing issues caused by breaking or non-breaking changes and to ease the maintenance difficulties created by unclear or outdated documentation.

Let's build your very own extension!

Now, let’s do it step by step.

 

Step 1 – Install the NPM package for generator-code

Ensure Node.js is installed before proceeding. Verify by running node -v, which will display the installed version.

Check the node installation in your system.

Once Node.js is verified to be installed, run the following command to install the generator: "npm install -g yo generator-code"

The Yeoman generator is an easy starting tool for building the extension.

After installation is complete, run the command "yo code" in your desired folder to create the project. For this tutorial, I will create a new extension using JavaScript.

Interactive menu in Yeoman generator

You can then fill in the relevant information.

Provide the project information

Once the prompts are completed, your project is created.

Continue the development in Visual Studio Code

 

 

Step 2 – Customize Command

To customize your command, open "package.json" and modify the relevant text.

Fine-tune the configuration in package.json. Output the Hello World message in the console, and ensure the setup is correct by running it.

Keep in mind that if you change the “Command” in "package.json", you'll also need to update it in "extension.js".
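For reference, a minimal pairing looks like this (the command id api-guardian.sayHello is a placeholder):

// extension.js: the id passed to registerCommand must match the "command"
// declared under "contributes.commands" in package.json.
const vscode = require("vscode");

function activate(context) {
  const disposable = vscode.commands.registerCommand("api-guardian.sayHello", () => {
    vscode.window.showInformationMessage("Hello World from API Guardian!");
  });
  context.subscriptions.push(disposable);
}

function deactivate() {}

module.exports = { activate, deactivate };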

Continue with some logic testing by writing a simple prompt selection, then run it.

Step 3 – Adding the logic for your extension

In this section of the tutorial, we will explore and utilize the Quick Pick functionality offered by the Visual Studio Code API. This feature allows us to create interactive menus that let users quickly select from a list of options, streamlining their workflow within the extension. I’ve provided a code snippet as an example: if users select "Say Hello", a "Hello World" message will be displayed, and if they choose "Say Goodbye", a "Goodbye!" message will appear.
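Here is an equivalent sketch using the vscode.window.showQuickPick API:

// Inside an async command handler:
const choice = await vscode.window.showQuickPick(["Say Hello", "Say Goodbye"], {
  placeHolder: "Pick an action",
});

if (choice === "Say Hello") {
  vscode.window.showInformationMessage("Hello World");
} else if (choice === "Say Goodbye") {
  vscode.window.showInformationMessage("Goodbye!");
}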

Pick the defined action from the list. You should see the corresponding options you just defined.

 

Step 4 – Testing the Extension

To test your extension, go to the Run menu and select Start Debugging. This will open a new instance of VS Code, called the Extension Development Host, where you can test your extension.

In the Command Palette of the development host, go to Show and Run Commands, type the name of your custom command, such as "Say Hello," and select it.

This triggers the extension to run according to the logic in your extension.js file, allowing you to interact with and debug its functionality.

 

Summary

Congratulations! You've just created your first Visual Studio Code extension. Building your extension for your capstone project can be challenging, as you’ll encounter various obstacles while ensuring everything works as expected. Not only does this strengthen your technical skills, but it also enhances key soft skills such as problem-solving, critical thinking, collaboration, and time management. Overcoming these challenges builds resilience and adaptability, preparing you for real-world software engineering roles and professional teamwork.

 

Acknowledgement

I would like to express my sincere gratitude to Dr. Peter Yau and Mr. Francis Teo for their invaluable guidance and support throughout this project. I would also like to extend my appreciation to the Singapore Institute of Technology, University of Glasgow, and Wizvision Pte Ltd for their ongoing support and for providing me with the opportunity to work on this project.

 

API Guardian

https://marketplace.visualstudio.com/items?itemName=APIGuardian-vsc.api 

About the Authors

Main Author - Ms Joy Cheng Yee Shing, BSc (Hon) Computing Science

Academic Supervisor - Dr Peter Yau, Microsoft MVP

Group photo from the Glasgow research team: Dr Peter Yau (left), Ms Joy Cheng (middle), and A/Prof Lawrence Seow (right)

Why 'epitome' is confusing. Quirky stories behind baby animal names. Alice doors


1081. Is an epitome a summary or a shining example? We look at why this word trips people up and how its meaning has changed over time. Then, we take a linguistic safari through the world of baby animal names—and what they tell us about language, culture, and human history.

The "baby animal names" segment is by Karen Lunde, a career writer and editor. In the late '90s, as a young mom with two kids and a dog, she founded one of the internet's first writing workshop communities. These days, she facilitates expressive writing workshops, both online and off. Find her at chanterellestorystudio.com

🔗 Grammar Girl AP style webinar (use the code MACMIL for $50 off).

🔗 Share your familect recording in a WhatsApp chat.

🔗 Watch my LinkedIn Learning writing courses.

🔗 Subscribe to the newsletter.

🔗 Take our advertising survey

🔗 Get the edited transcript.

🔗 Get Grammar Girl books

🔗 Join Grammarpalooza. Get ad-free and bonus episodes at Apple Podcasts or Subtext. Learn more about the difference.

| HOST: Mignon Fogarty

| VOICEMAIL: 833-214-GIRL (833-214-4475).

| Grammar Girl is part of the Quick and Dirty Tips podcast network.

  • Audio Engineer: Dan Feierabend
  • Director of Podcast: Brannan Goetschius
  • Advertising Operations Specialist: Morgan Christianson
  • Digital Operations Specialist: Holly Hutchings
  • Marketing and Video: Nat Hoopes

| Theme music by Catherine Rannus.

| Grammar Girl Social Media: YouTube, TikTok, Facebook, Threads, Instagram, LinkedIn, Mastodon, Bluesky.

Download audio: https://dts.podtrac.com/redirect.mp3/tracking.swap.fm/track/0bDcdoop59bdTYSfajQW/media.blubrry.com/grammargirl/stitcher.simplecastaudio.com/e7b2fc84-d82d-4b4d-980c-6414facd80c3/episodes/e22de58b-9c58-426b-9b57-6bff2906cea6/audio/128/default.mp3?aid=rss_feed&awCollectionId=e7b2fc84-d82d-4b4d-980c-6414facd80c3&awEpisodeId=e22de58b-9c58-426b-9b57-6bff2906cea6&feed=XcH2p3Ah

How to Use KurrentDB for Event Sourcing in C# on Azure


In this blog post, you will learn how to deploy a test instance of KurrentDB to Azure and access it from a console application in .NET.

Why did I write this blog post?

While updating my DDD, CQRS, and Event Sourcing in .NET training course to the latest version of .NET and C#, I also decided to switch from EventStoreDB to KurrentDB for the course’s hands-on exercises.
Previously, we ran EventStoreDB locally, but during our corporate training sessions, we found that installing software onto student machines often gets restricted. To work around this, I chose to host KurrentDB in Azure, with a separate instance running for each student.

As usual, I thought to share my learning with you! This blog post walks through how I deployed KurrentDB to Azure and connected to it from C#.

What is KurrentDB?

KurrentDB is the new name for EventStoreDB, one of the first purpose-built databases for event sourcing. It was created by Greg Young, who also introduced the Command and Query Responsibility Segregation (CQRS) pattern.

KurrentDB stores every change to your data as an event. Instead of just saving the latest state, it records what happened, when, and why. This approach lets you rebuild the full history of your system, create audit logs, replay behavior, or build read models that are tailored for specific needs.

The KurrentDB database is open-source, written in C#, and available on GitHub. The source code is well written and makes for an interesting codebase to review and explore. I find that reading and exploring real-world projects is a great way to pick up new techniques and ideas.

Is KurrentDB a new database?

No, it was originally released in 2012 and is one of the few purpose-built event store databases available. It has had many years to mature and evolve. If you’re curious about how it works internally, check out this talk by Gregory Young: How an EventStore actually works.

What is Event Sourcing?

Event sourcing is a way of storing data by recording each change as an event. Instead of saving the latest state, you store the full history of everything that happened. This makes it easy to rebuild states, trace changes, and support things like auditing and custom views.

For example, in a shopping cart, you can store each action as it happens:

An event-sourced shopping-cart example

To get to the current state, you would load all events and apply them one by one to the shopping cart. A big benefit of event sourcing is that it makes it much easier to understand how and why something changed within a system, which is great for troubleshooting.
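As a minimal illustration (hypothetical cart events, not the KurrentDB client API), replaying the events to rebuild the current state looks like this:

// Events loaded from the stream, oldest first
object[] events =
{
    new ItemAdded("Laptop"),
    new ItemAdded("Mouse"),
    new ItemRemoved("Mouse")
};

var cart = new ShoppingCart();
foreach (var e in events)
    cart.Apply(e); // fold each event into the current state

Console.WriteLine(string.Join(", ", cart.Items)); // Laptop

public class ShoppingCart
{
    public List<string> Items { get; } = new();

    public void Apply(object e)
    {
        switch (e)
        {
            case ItemAdded added: Items.Add(added.Product); break;
            case ItemRemoved removed: Items.Remove(removed.Product); break;
        }
    }
}

public record ItemAdded(string Product);
public record ItemRemoved(string Product);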

Deploying KurrentDB to Azure

There are several ways to host KurrentDB in Azure, but my requirements were:

  • Each student should get their own dedicated instance.
  • It must be easy to deploy and clean up using a PowerShell script.
  • No need to deal with https (TLS) certificates.
  • It should be cost-effective.

Given these needs, I decided to use Azure Container Instances. KurrentDB is available as a container image, and you can find the details in their installation guide.

Azure Container Instances are suitable for testing KurrentDB, but should not be used for production. ACI enforces certain restrictions that limit some features KurrentDB relies on to run properly.

KurrentDB and the network

Before we deploy KurrentDB, we need to understand its network configuration. By default, KurrentDB listens to two network ports:

Port 1113

KurrentDB uses port 1113 for internal TCP traffic between cluster nodes, such as replication and gossip.

Four KurrentDB nodes communicating over port 1113.

In our case, since we’re running a single-node setup, we don’t need to worry about port 1113. This port is used only for internal node-to-node communication and should never be exposed to the public internet or client applications.

Port 2113

By default, KurrentDB uses port 2113 for client communication over HTTP or HTTPS.

How a client talks to KurrentDB over port 2113

In this example, we expose port 2113 to the public internet. To simplify setup and avoid dealing with TLS certificates, we run KurrentDB in insecure mode by setting the KURRENTDB_INSECURE environment variable to true. This is fine for testing and development, but it should never be used in production.

Deploying KurrentDB to Azure Container Instances

For my first attempt at deploying KurrentDB as an Azure Container Instance, I used the following PowerShell script:

$containerImage = "docker.kurrent.io/kurrent-latest/kurrentdb:latest"
$ResourceGroupName = "rg-kurrenteventstore"
$containerName = "kurrentdb-attempt1"
$Location = "swedencentral"

# Create resource group if it doesn't exist
az group create --name $ResourceGroupName `
                --location $Location

# Create Azure Container Instance
az container create `
	--resource-group $ResourceGroupName `
	--name $containerName `
	--image $containerImage `
	--ports 2113 `
	--environment-variables "KURRENTDB_INSECURE=true" `
	--cpu 1 `
	--memory 1 `
	--os-type Linux `
	--location $Location `
	--restart-policy Never `
	--dns-name-label "$containerName-$(Get-Random)" `
	--query "ipAddress.fqdn" 


After running the script, we had a working KurrentDB instance deployed to the public internet, accessible on port 2113.

When navigating to the URL in your browser, you’ll see KurrentDB’s built-in web interface. This gives you a quick way to check that the instance is up and running, and allows you to explore some basic information and diagnostics like below.

The KurrentDB built-in admin user-interface.

Deploying multiple instances of KurrentDB

Being able to deploy a single instance to Azure is great. However, I needed a script to deploy multiple instances.

The modified script eventually became this:

$containerImage = "docker.kurrent.io/kurrent-latest/kurrentdb:latest"
$ResourceGroupName = "rg-kurrenteventstore2"
$Location = "swedencentral"

$instanceCount = 2

# Create resource group if it doesn't exist
az group create --name $ResourceGroupName `
                --location $Location

$urls = @()

for ($i = 1; $i -le $instanceCount; $i++) 
{
    $containerName = "kurrentdb-$i"

	# Create Azure Container Instance
	$fqdn = az container create `
		--resource-group $ResourceGroupName `
		--name $containerName `
		--image $containerImage `
		--ports 2113 `
		--environment-variables "KURRENTDB_INSECURE=true" `
		--cpu 1 `
		--memory 1 `
		--os-type Linux `
		--location $Location `
		--restart-policy Never `
		--dns-name-label "$containerName-$(Get-Random)" `
		--query "ipAddress.fqdn" `
		--output tsv
		
	$urls = $urls + "${fqdn}:2113"

}

Write-Output "`nAll instance URLs:"
$urls


Kurrent Navigator

To manage your KurrentDB instance, you can use Kurrent Navigator. This is a standalone desktop app with a clean, modern interface for browsing events, inspecting streams, working with projections, and more. It runs on Windows, macOS, and Linux, and it is useful for both daily administration and troubleshooting.

KurrentDB Navigator desktop application

Sending events to KurrentDB from C#

For .NET developers, KurrentDB offers an official .NET client which is also available as a NuGet package: KurrentDB.Client. It supports both .NET Framework 4.8 and modern .NET versions. Details about how to use this package can be found in the getting started documentation.

The KurrentDB.Client NuGet package

The code below creates four events and sends them to KurrentDB:

using KurrentDB.Client;
using KurrentDB.Client.Core.Serialization;

var server = "[ServerAddress]";

var settings = KurrentDBClientSettings.Create($"esdb://{server}?tls=false");

settings.OperationOptions.ThrowOnAppendFailure = false;

await using var client = new KurrentDBClient(settings);

var events = new List<object>
{
    new NewOrderCreated
    {
      Id = Guid.NewGuid(),
      CustomerID = Guid.NewGuid(),
      SalesPersonID = Guid.NewGuid(),
      SalesPersonName = "John Doe"
    },
    new OrderCancelled
    {
      Id = Guid.NewGuid(),
      Reason = "Customer changed their mind"
    },
    new ProductAddedToOrder
    {
      Id = Guid.NewGuid(),
      ProductId = Guid.NewGuid(),
      ProductName = "Laptop",
      Quantity = 1,
      PricePerUnit = 1200.00m
    },
    new ProductRemovedFromOrder
    {
      Id = Guid.NewGuid(),
      ProductId = Guid.NewGuid()
    }
};

var streamName = "order-12345";

await AppendEventsAsync(client, streamName, events);


async Task AppendEventsAsync(KurrentDBClient client, 
                             string stream, 
                             List<object> events)
{
    // Map the events to KurrentDB message type.
    var messages = events
        .Select(e => Message.From(data: e,
                                  messageId: Uuid.NewUuid())).ToList();

    await client.AppendToStreamAsync(streamName: stream, 
                                     messages: messages);
}

// Event definitions

public class NewOrderCreated
{
    public Guid Id { get; set; }
    public Guid CustomerID { get; set; }
    public Guid SalesPersonID { get; set; }
    public string? SalesPersonName { get; set; }
}

public class OrderCancelled
{
    public Guid Id { get; set; }
    public string? Reason { get; set; }
}

public class ProductAddedToOrder
{
    public Guid Id { get; set; }
    public Guid ProductId { get; set; }
    public string? ProductName { get; set; }
    public int Quantity { get; set; }
    public decimal PricePerUnit { get; set; }
}

public class ProductRemovedFromOrder
{
    public Guid Id { get; set; }
    public Guid ProductId { get; set; }
}


After sending the events, we can open Kurrent Navigator to view the events stored in the database:

Viewing the four events in KurrentDB Navigator

Reading events from KurrentDB

Reading from KurrentDB is straightforward. You just need to create a client and specify the stream you want to read from, like this:

using KurrentDB.Client;
using System.Text;
using System.Text.Json;

Console.WriteLine("Reading events from KurrentDB...");

var server = "kurrentdb-2-1927695997.swedencentral.azurecontainer.io";

var settings = KurrentDBClientSettings.Create($"esdb://{server}?tls=false");
settings.OperationOptions.ThrowOnAppendFailure = false;

await using var client = new KurrentDBClient(settings);

var streamName = "order-12345";

var events = await client
    .ReadStreamAsync(Direction.Forwards, streamName, StreamPosition.Start)
    .ToListAsync();

foreach (var ev in events)
{
    var json = Encoding.UTF8.GetString(ev.Event.Data.Span);
    Console.WriteLine($"{ev.Event.EventNumber} - {json}");
}


As output, we get:

Reading events from KurrentDB...
0 - {"id":"0d0545fe-5242-4c06-8ad5-573cd4a89bd1",
     "customerID":"264c704d-1754-4e56-9f2a-0f8b4202a045",
     "salesPersonID":"3fa6e8f1-c114-4eea-81e0-eefa9fe2a4f4",
     "salesPersonName":"John Doe"}

1 - {"id":"b3f37164-94c6-426d-bfe2-b8a6063f37ac",
     "reason":"Customer changed their mind"}

2 - {"id":"4b391bb9-b288-4dc9-9f19-ad65dc577cad",
     "productId":"4bcb44e5-6bc6-42dd-b47c-18496a4d5380",
     "productName":"Laptop","quantity":1,"pricePerUnit":1200.00}

3 - {"id":"6f2fa70f-12ec-46e7-b715-aad3f559be32",
     "productId":"bb5147e0-5fbb-4339-88ce-1249fb002d0b"}


Summary

Writing this blog post was a great way to explore everything new from the team behind KurrentDB. I especially enjoyed trying out the Kurrent Navigator, which was completely new to me. Hosting KurrentDB in Azure using Container Instances turned out to be a smooth experience. Just keep in mind that if you’re coming from EventStoreDB, some names and behaviors have changed, which caused a few bumps along the way.

Curious about DDD, CQRS, and Event Sourcing?

If you’re interested in learning more about these concepts, I offer several training classes, including DDD, CQRS and Event Sourcing in .NET and Modern Application Architecture. You can find the full list of workshops on my training page. I also offer coaching if you’d like help working through architectural challenges.

The CQRS and Event Sourcing framework that I use in my training can be found at https://cqrs.nu. It’s a project I was part of creating. There, you’ll also find an example of how to do BDD-style testing of Event Sourced systems.

About Tore

Hey! I'm Tore 👋 I'm an independent consultant, coach, trainer, and Microsoft certified MVP in .NET. My mission is to help developer teams solve problems more quickly and effectively across tools and platforms like ASP.NET Core, Duende IdentityServer, web security, C#, Azure, .NET, and more. Sounds helpful? I'd love to be of service! You can check out my workshops for teams and my wider consulting and coaching services here


The post How to Use KurrentDB for Event Sourcing in C# on Azure appeared first on Personal Blog of Tore Nestenius | Insights on .NET, C#, and Software Development.


Soft deletes in EF Core: How to implement and query efficiently


This blog post is originally published on https://blog.elmah.io/soft-deletes-in-ef-core-how-to-implement-and-query-efficiently/

What does your application actually do with a record when a user deletes it? Can it afford to delete that record permanently? One mistake can cause irreparable damage. Today, I will shed light on the D of CRUD. Among all the CRUD operations, Delete is the most critical: removing a record with a database DELETE is simple, but it offers no rollback option, and in many cases it sets off a cascading chain of deletes that cannot be recovered. Soft delete is a common practice for such scenarios: the record is flagged as deleted, and the same flag lets you roll the change back later.


What is a soft delete?

Soft delete is a data persistence strategy that provides a secure way to manage data without permanently removing it. Instead of deleting records, the system marks them as inactive by toggling a deletion flag. Soft delete ensures that sensitive or critical data can remain available for restoration, auditing, or historical reference. A hard delete can result in cascade deletion, losing essential relations or associated data.

Traditionally, a hard delete performs the following:

DELETE FROM [dbo].[Book]
WHERE [Book].[Id] = 02;

Our rescuer, soft delete, does this instead:

UPDATE [dbo].[Book]
SET [IsDeleted] = 1
WHERE [Book].[Id] = 02;

It is like telling your database, "Hide this record, but don't actually delete it." Queries then fetch records filtered by the IsDeleted flag:

SELECT *
FROM [Book]
WHERE IsDeleted = 0;

We have seen what soft delete is. Now to the real question: how do you implement soft delete efficiently?

Way 1: Simplest - Manual Flagging

Step 1: Define the model

public class Book
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string IBN { get; set; }
    public int PageCount { get; set; }

    // Setting the flag false by default for any new record

    public bool IsDeleted { get; set; } = false;
}

Step 2: Configure ApplicationDbContext

public class ApplicationDbContext: DbContext
{
    public DbSet<Book> Books { get; set; }

    public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options) : base(options)  
    {  
    }
}

Step 3: Implement the Remove method

public async Task RemoveBookAsync(int id)
{
    var book = await context.Books.FindAsync(id);
    if (book is null) return;

    book.IsDeleted = true;
    await context.SaveChangesAsync();
}

Step 4: Filter in the get methods

public async Task<List<Book>> GetBooksAsync() =>
    await context.Books.Where(b => !b.IsDeleted).ToListAsync();

Manual flagging is the most straightforward way to handle deletion softly. However, as the application grows, managing IsDeleted for each model and remembering the filter in every query becomes troublesome.

Way 2: Global Query Filter

Defining common properties in a base model is a cleaner, more object-oriented approach. To avoid repetitive code, you can define the IsDeleted property in such a model and inherit it across other classes.

Step 1: Declare the base model

public abstract class SoftDeletableModel
{
    public bool IsDeleted { get; set; } = false;
    public DateTime? DeletedAt { get; set; }
}

We added DeletedAt to record the time of deletion.

Step 2: Create Book model as an inherited class

public class Book: SoftDeletableModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string IBN { get; set; }
    public int PageCount { get; set; }
}

Step 3: Configure ApplicationDbContext

public class ApplicationDbContext: DbContext
{
    public DbSet<Book> Books { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)  
    {  
        modelBuilder.Entity<Book>().HasQueryFilter(b => !b.IsDeleted);  
    }  
}

Here, we applied a global query filter in the DbContext itself, so soft-deleted book records are filtered out automatically while fetching.

Step 4: Implement the Remove method

public async Task RemoveBookAsync(int id)
{
    var book = await context.Books.FindAsync(id);
    if (book is null) return;

    book.IsDeleted = true;
    book.DeletedAt = DateTime.UtcNow;
    await context.SaveChangesAsync();
}

The remove method is almost the same; it now also records DeletedAt.

Step 5: Filter in the get methods

public async Task<List<Book>> GetBooksAsync() => await context.Books.ToListAsync();

No explicit filter required.

With the IsDeleted filter configured, soft-deleted book records are excluded automatically from every query. However, if you need to skip the filter for complex joins, you can use IgnoreQueryFilters() like:

await context.Books.IgnoreQueryFilters().ToListAsync();

Way 3: SaveChanges Override to Intercept Deletes

Now we move one step further and change the default SaveChanges behavior for this purpose.

Steps 1 and 2 are the same as in Way 2: declare the SoftDeletableModel base model and derive the Book class from it.

Step 3: Configure soft delete in ApplicationDbContext via SaveChanges interception

public class ApplicationDbContext: DbContext
{
    public DbSet<Book> Books { get; set; }

    public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options) : base(options)
    {
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Book>().HasQueryFilter(b => !b.IsDeleted);
        base.OnModelCreating(modelBuilder);
    }

    public override int SaveChanges()
    {
        foreach (var entry in ChangeTracker
            .Entries()
            .Where(e => e.State == EntityState.Deleted && e.Entity is SoftDeletableModel))
        {
            entry.State = EntityState.Modified;
            ((SoftDeletableModel)entry.Entity).IsDeleted = true;
            ((SoftDeletableModel)entry.Entity).DeletedAt = DateTime.UtcNow;
        }

        return base.SaveChanges();
    }

    public override async Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
    {  
        foreach (var entry in ChangeTracker
            .Entries()
            .Where(e => e.State == EntityState.Deleted && e.Entity is SoftDeletableModel))
        {
            entry.State = EntityState.Modified;
            ((SoftDeletableModel)entry.Entity).IsDeleted = true;
            ((SoftDeletableModel)entry.Entity).DeletedAt = DateTime.UtcNow;
        }

        return await base.SaveChangesAsync(cancellationToken);
    }
}

With these overrides in place, any entity marked as Deleted is intercepted and converted into an update that sets IsDeleted and DeletedAt instead of removing the row.

You can also define a separate interceptor and register it. The DbContext code becomes:

public class ApplicationDbContext: DbContext
{
    public DbSet<Book> Books { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Book>().HasQueryFilter(b => !b.IsDeleted);
        base.OnModelCreating(modelBuilder);
    }
}

And define a separate interceptor:

public sealed class SoftDeleteInterceptor: SaveChangesInterceptor
{
    public override ValueTask<InterceptionResult<int>> SavingChangesAsync(
        DbContextEventData eventData,
        InterceptionResult<int> result,
        CancellationToken cancellationToken = default)
    {
        if (eventData.Context is null)
            return base.SavingChangesAsync(eventData, result, cancellationToken);

        var entries = eventData.Context.ChangeTracker
            .Entries<SoftDeletableModel>()
            .Where(e => e.State == EntityState.Deleted);

        foreach (var entry in entries)
        {
            entry.State = EntityState.Modified;
            entry.Entity.IsDeleted = true;
            entry.Entity.DeletedAt = DateTime.UtcNow;
        }

        return base.SavingChangesAsync(eventData, result, cancellationToken);
    }
}

Register it in the services

services.AddSingleton<SoftDeleteInterceptor>();
services.AddDbContext<ApplicationDbContext>((sp, options) =>
    options.UseSqlServer(configuration.GetConnectionString("DefaultConnection"))
    .AddInterceptors(sp.GetRequiredService<SoftDeleteInterceptor>()));

Step 4: Implement the Remove method

public async Task RemoveBookAsync(int id)
{
    // Soft delete via the Remove method
    var book = await context.Books.FindAsync(id);
    if (book is null) return;

    context.Books.Remove(book); // ← Intercepted by SaveChanges
    await context.SaveChangesAsync();
}

Remove() acts as a soft delete.

Step 5: Filter in the get methods

public async Task<List<Book>> GetBooksAsync() => await context.Books.ToListAsync();

Here, you don't need to use Update logic in the RemoveBookAsync method.

Way 4: With Repository Pattern

The repository pattern utilizes a generic type to implement soft deletion in the Remove method. The first two steps are the same as in Way 2.

Step 3: Define ApplicationDbContext

public class ApplicationDbContext: DbContext
{
    public DbSet<Book> Books { get; set; }

    public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options) : base(options)
    {
    }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Book>().HasQueryFilter(b => !b.IsDeleted);
        base.OnModelCreating(modelBuilder);
    }
}

Step 4: Define a generic repository

public class Repository<T> where T : SoftDeletableModel
{
    private readonly ApplicationDbContext _context;
    public Repository(ApplicationDbContext context) => _context = context;

    public async Task RemoveAsync(int id)
    {
        var entity = await _context.Set<T>().FindAsync(id);  

        if (entity != null)
        {
            entity.IsDeleted = true;
            entity.DeletedAt = DateTime.UtcNow;
            await _context.SaveChangesAsync();
        }
    }

    public IQueryable<T> GetAll() => _context.Set<T>();
}

Step 5: Use the code

To Remove:

var repo = new Repository<Book>(context);
await repo.RemoveAsync(1);

To Get:

var books = await repo.GetAll().ToListAsync();

Conclusion

Traditional hard delete methods can be problematic in many ways. They permanently erase data from the database with no way to roll it back. Such mistakes are expensive, and they become even more troublesome when cascade deletes are involved. Additionally, many regulatory bodies have established data retention and disposal requirements, and permanently deleting data could result in penalties for non-compliance. A soft delete strategy deletes records logically by setting a flag that marks them as "deleted." It maintains foreign key integrity and keeps associated data consistent, while letting the application ignore these records during regular queries. You can still restore the records if necessary. We discussed some standard methods for implementing soft deletes in EF Core.


