Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Forget VMware and VirtualBox: This should be your new VM manager


For decades, I used VirtualBox for all my virtual machine (VM) needs. I could use it as a graphical user interface (GUI), or I could run it from the command line. It was easy to work with, was cross-platform, and rarely caused me any problems.

Until it didn’t.

Over the past few years, I cannot tell you how many times I wound up with a broken VirtualBox installation that required me to completely remove the software and reinstall it. Generally speaking, it was a hassle, but the solution worked.

Until it didn’t.

A few weeks ago, VirtualBox broke again. I ran through the usual tricks, but was unable to get it working. I removed conflicting kernel modules (which were often the problem), did a purge uninstall, reboots, upgrades … you name it, I did it. This time, however, the fixes wouldn’t work.

The problem is, I depend on VMs every day. Without the ability to spin up VMs, I wouldn’t be able to review Linux distributions, test software, and perform several other tasks.

Thus, I permanently removed VirtualBox, vowing to never use it again.

The solution came by way of KVM (Kernel-based Virtual Machine). KVM is an open source virtualization technology that allows the Linux kernel to function as a hypervisor, delivering near-native performance and reliability.

Since adopting KVM, I’ve not had a single issue with my VMs. However, I don’t use KVM alone. Instead, I pair it with Virt-Manager to make working with VMs exponentially easier.

How do you install and use KVM/Virt-Manager?

Let me show you.

What you’ll need

The only things you’ll need for this are a running instance of Linux, a user with sudo privileges, and an ISO of any Linux distribution.

Let’s get to work.

Installing Virt-Manager

First off, you don’t have to install KVM, as it is built into the Linux kernel. With that said, you do need to install the GUI frontend, Virt-Manager, and here’s how:

  • On Ubuntu/Debian-based machines: sudo apt-get install virt-manager -y
  • On Fedora-based machines: sudo dnf install virt-manager -y
  • On Arch-based machines: sudo pacman -S virt-manager
  • On openSUSE-based machines: sudo zypper install virt-manager
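
Before launching Virt-Manager, it’s worth a quick sanity check that KVM itself is usable on your hardware. The commands below are a minimal sketch for a typical setup; exact group names (libvirt vs. libvirtd) vary between distributions, so treat them as assumptions to adjust.

  # Confirm the CPU exposes hardware virtualization (a non-zero count is good)
  egrep -c '(vmx|svm)' /proc/cpuinfo

  # Confirm the KVM kernel modules are loaded and the device node exists
  lsmod | grep kvm
  ls -l /dev/kvm

  # Optionally, add your user to the libvirt group so Virt-Manager can manage
  # VMs without extra password prompts (log out and back in afterward)
  sudo usermod -aG libvirt $USER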

With Virt-Manager installed, you’re ready to create your first VM.

Creating a VM with KVM/Virt-Manager

You should find a new entry in your desktop menu named Virtual Machine Manager. Click on that to run the app. When it appears, you’ll see a single, small window (Figure 1).

Figure 1: The Virt-Manager main window in my default bubblegum pink theme.

Click the far left icon, which looks like a monitor, to create a new VM. In the resulting window (Figure 2), make sure Local install media is selected and click “Forward.”

Figure 2: Local install media should be selected by default.

In the next screen (Figure 3), click “Browse” and then, with your default file picker, locate and select the ISO you want to use.

Figure 3: If you’ve already created a VM, the ISO for your distribution will appear in the drop-down.

Chances are, Virt-Manager won’t auto-detect the OS, so type “gen” and then select “Generic Linux 2024.” Click “Forward” to continue.

You can now dedicate any amount of RAM and CPU cores you need for the OS (Figure 4).

Figure 4: I typically leave this as-is, unless the OS requires more RAM.

The next window (Figure 5) is where things start to get a bit more complicated. Don’t worry, once you understand what’s happening, you’ll be fine.

Figure 5: Make sure to use your storage wisely.

By default, Virt-Manager will store your VMs on the same drive as your OS. Because I create so many VMs, I prefer to store them on external drives to avoid running out of room. If you’re not worried about that, check “Create a disk image for the virtual machine” and allocate however much storage you want.

Choose “Select or create custom storage,” and then click “Manage.”

It’s now time to create a new storage pool and then a volume to house the VM. In the “Locate or create storage volume” window (Figure 6), click the + at the bottom left of the window. Give the new pool a name, change the Target Path to a directory on an external drive (if necessary), and click “Finish.”

Figure 6: You have to create a storage pool before you create a volume.

Once you’ve created the pool, click + to the right of Volumes. In the resulting window (Figure 7), give the new volume a name (probably the same name as the distro), allocate the space you want for the VM, and click “Finish.”

Figure 7: You’re almost done.

Make sure your new volume is selected (it’ll end in qcow2) and click “Choose Volume.”
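
If you’d rather script this step, libvirt’s virsh tool can create an equivalent pool and volume from the terminal. This is only a rough sketch; the pool name, external-drive mount point, and volume size below are assumptions you should adapt to your own setup.

  # Define, build, and start a directory-backed storage pool on an external drive
  sudo virsh pool-define-as vms dir --target /mnt/external/vms
  sudo virsh pool-build vms
  sudo virsh pool-start vms
  sudo virsh pool-autostart vms

  # Create a 40 GB qcow2 volume in the pool to hold the new VM
  sudo virsh vol-create-as vms kde-neon.qcow2 40G --format qcow2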

Back at the New VM window, click “Forward.” In the next window, give the VM a name and then check “Customize configuration before install.”

Click “Finish,” and the custom config window will open (Figure 8).

Figure 8: There’s a lot to customize here.

For my KDE Neon installation, I have to select UEFI from the Firmware drop-down. Once I do that, I click “Apply” and then click “Begin Installation.”

At this point, a new window will open, where you can begin the OS installation process.

And that’s how you install Virt-Manager and use it, along with KVM, to create reliable, near-native-performing virtual machines.
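
As an aside, if you ever want to skip the GUI entirely, the virt-install command (usually installed alongside Virt-Manager, though some distributions ship it in a separate package) can create a comparable VM from the terminal. The ISO path, names, and sizes here are placeholders, and on older releases the --osinfo flag is spelled --os-variant.

  virt-install \
    --name kde-neon \
    --memory 4096 \
    --vcpus 2 \
    --cdrom ~/isos/neon-user.iso \
    --disk size=40,format=qcow2 \
    --osinfo generic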

The post Forget VMware and VirtualBox: This should be your new VM manager appeared first on The New Stack.


Automating TDD: Using AI to Generate Edge-Case Unit Tests


The Problem: The "Happy Path" Trap in TDD

Test-driven development (Red-Green-Refactor) is the gold standard for reliable software. However, it has a flaw: The quality of your code is capped by the imagination of your test cases.

If you are building a payment processing function, you will naturally write a test for "valid payment." You might even remember "insufficient funds." But will you remember to test for:


The Power of Community: Applying Dr. King’s Lessons at New Relic

See how Dr. King’s lessons on equity shape life at New Relic. From inclusive ERGs to social impact partnerships, we're building a culture of belonging.


Consuming an authenticated MCP server with a declarative agent in Copilot Chat


In the previous post, we built an MCP server protected with Microsoft Entra authentication using Azure API Management. We tested it using Visual Studio Code with GitHub Copilot, proving that the authentication flow works correctly. However, one of the most exciting scenarios for MCP servers is integrating them with Microsoft 365 Copilot through declarative agents.

In this post, we’re going to explore how to create a declarative agent using the Microsoft 365 Agents Toolkit that consumes our Entra-authenticated MCP server. This is a relatively new feature, and there are some gotchas along the way that I’ll help you navigate. By the end of this tutorial, you’ll have a working declarative agent that can search for flights using our protected MCP server.

This is the first post in a series exploring authenticated MCP in the Microsoft 365 Copilot ecosystem. In the upcoming posts, we’ll cover how to consume authenticated MCP servers from Microsoft Foundry and Copilot Studio as well.

Let’s get started!

Prerequisites

Before we begin, make sure you have the following in place:

  • An Entra-authenticated MCP server: You can use the one we built in the previous post, or any other MCP server that supports OAuth authentication through Microsoft Entra. The server should be deployed and accessible via a public URL.
  • Microsoft 365 Agents Toolkit: You need version 6.4.3 or higher installed as an extension in Visual Studio Code. You can install it from the VS Code Marketplace.
  • Microsoft 365 Copilot license or free Copilot Chat: You’ll need access to Copilot Chat to test the declarative agent.

Creating the declarative agent project

The Microsoft 365 Agents Toolkit provides a streamlined way to create declarative agents that can connect to MCP servers. Let’s create our project:

  1. Open Visual Studio Code and select the Microsoft 365 Agents Toolkit icon from the sidebar.
  2. Click on Create a new Agent / App.
  3. Choose Declarative Agent.
  4. Select Add an action.
  5. Choose Start with an MCP server (preview).
  6. Provide the URL of your MCP server. In my case, it’s https://flightsapismcp-apim.azure-api.net/mcp.
  7. Choose a folder in which to store the project.
  8. Give it a meaningful name (for example, “Flights Agent”).

The toolkit will create a declarative agent project with the mcp.json file already configured to connect to the MCP server you provided.

Connecting to the MCP server

Now that we have our project set up, we need to establish a connection to the MCP server. Open the mcp.json file and you’ll see a CodeLens toolbar above the server entry. Click on Start to initiate the connection.

Since our MCP server is protected with Entra authentication, Visual Studio Code will prompt you to provide:

  1. The client ID of your app registration
  2. The client secret

This is the same flow we covered in the previous post when testing the MCP server. After providing the credentials, you’ll be redirected to the Microsoft Entra login page to authenticate with your account. Complete the authentication flow, and the connection should be established.

Fetching the MCP tools

Here’s where the first gotcha comes in. After connecting, you might need to restart Visual Studio Code because the Microsoft 365 Agents Toolkit options in the CodeLens sometimes don’t show up immediately after the initial connection.

Once you’ve restarted, open the mcp.json file again and look for the ATK Fetch action from MCP option in the CodeLens toolbar. Click on it to import the available tools from your MCP server.

The ATK option to fetch tools from the MCP server

Issue #1: The GET endpoint requirement

If you followed my previous blog post exactly, the toolkit will fail to recognize that the MCP server requires authentication. The option to fetch the tools will work, but you will end up with a configuration that treats the MCP server as unauthenticated.

This happens because the Microsoft 365 Agents Toolkit performs a GET request to the MCP server to discover authentication requirements. It expects the response to fail with a 401 Unauthorized status and include the WWW-Authenticate header containing the authorization endpoint URL. We support the WWW-Authenticate header in our APIM configuration, but the MCP server we built only configured this authentication challenge for POST requests, not GET. As a result, the toolkit gets a 404 error in response and treats the MCP server as unauthenticated.

To fix this, we need to add a GET endpoint in Azure API Management:

  1. Go to the Azure portal and navigate to your API Management instance.
  2. Go to APIs -> Flight APIs.
  3. Click on the three dots (…) next to the MCP POST operation and choose Clone.
  4. Click on the pencil icon on the Frontend section.
  5. Change the HTTP method from POST to GET.
  6. Set the endpoint to /mcp.
  7. Save the changes.

Now the toolkit will be able to properly discover the authentication requirements of your MCP server.
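
To sanity-check the new operation before returning to the toolkit, you can call the endpoint yourself. Using the example APIM URL from earlier, an unauthenticated GET should now fail with 401 and carry the WWW-Authenticate header instead of returning 404 (the exact header value depends on your APIM policy):

  curl -i https://flightsapismcp-apim.azure-api.net/mcp

  # Expected (abridged):
  # HTTP/1.1 401 Unauthorized
  # WWW-Authenticate: Bearer ...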

Importing the tools

After fixing the GET endpoint issue, click on ATK Fetch action from MCP. This time, the process should work correctly:

  1. Visual Studio Code will ask which action manifest you want to update. Select the ai-plugin.json file in your project.
  2. You’ll see a list of all available tools from your MCP server. For the Flights API, you should see 4 tools: search_flights, get_flight_details, get_airports_origins, and get_airports_destinations. Select all of them.
  3. Click OK to import the tools.

Since the toolkit detected that the MCP server is protected, it will ask you to choose the authentication method. Select OAuth (with static registration).

The ai-plugin.json file will be updated with the metadata of all the MCP tools. You’ll notice that the auth property looks like this:

"auth": {
 "type": "OAuthPluginVault",
 "reference_id": "${{MCP_DA_AUTH_ID_FLIGHTSAPI}}"
}

The reference_id points to an OAuth registration in the Teams Developer Portal. The good news is that you don’t need to create this manually. The toolkit will take care of it when you provision the agent.

Provisioning the agent

Now we’re ready to provision our declarative agent. In the Microsoft 365 Agents Toolkit sidebar, navigate to the Lifecycle panel and click on Provision.

The provisioning process will start, and VS Code will prompt you for additional information to create the OAuth registration:

  • Client ID: The client ID of your app registration
  • Client secret: The client secret
  • Scopes: The OAuth scopes required by your API, which should be api://your-client-id/access_as_user

Issue #2: Missing OAuth properties in m365agents.yml

At this point, you might encounter an error like this:

[Error] - code:oauthRegister.InvalidActionInputError, message: The 'oauth/register' action cannot be completed as the following parameter(s):
authorizationUrl,tokenUrl,apiSpecPath, are either missing or have an invalid value in the provided yaml file:
d:\src\flights-agent\m365agents.yml. Ensure that the required parameters are provided and have valid values and try again.

This happens because the m365agents.yml file in your project root is missing some required OAuth properties. To fix this, open the file and find the step that contains uses: oauth/register. Add the following properties:

authorizationUrl: https://login.microsoftonline.com/common/oauth2/v2.0/authorize
tokenUrl: https://login.microsoftonline.com/common/oauth2/v2.0/token
apiSpecPath: https://flightsapismcp-apim.azure-api.net/mcp

The first two URLs are the standard Microsoft Entra OAuth endpoints (they’re the same for all Entra-authenticated APIs). The apiSpecPath should be the endpoint of your MCP server.

Issue #3: Tool parameter type incompatibilities

After fixing the OAuth properties, you might encounter additional errors related to the tool definitions. This happens because the toolkit imports MCP tool parameters with some types and formats that aren’t supported by the declarative agent runtime.

Here are the fixes I had to apply:

  • In the search_flights tool, the maxResults property is set as an integer, but it should actually be treated as a string. You must change it from this:

    "maxResults": {
     "description": "Maximum number of flights to return",
     "type": "integer",
     "default": 10
    }
    

    to this:

    "maxResults": {
     "description": "Maximum number of flights to return",
     "type": "string",
     "default": "10"
    }
    
  • The departureDate parameter has an invalid property (format) and the type should be string, not an array. You must change it from:

    "departureDate": {
     "description": "Departure date in YYYY-MM-DD format (e.g., '2025-01-20')",
     "type": [
     "string",
     "null"
     ],
     "format": "date",
     "default": null
    }
    

    to:

    "departureDate": {
     "description": "Departure date in YYYY-MM-DD format (e.g., '2025-01-20')",
     "type": "string",
     "default": null
    }
    

Here’s how the corrected search_flights tool should look in its entirety:

{
  "name": "search_flights",
  "description": "Search for flights by origin, destination, and departure date. Date is optional, if the user doesn’t provide it, don't ask for it. Returns a list of available flights matching the criteria.",
  "parameters": {
    "type": "object",
    "properties": {
      "origin": {
        "description": "Departure airport or city (e.g., 'New York', 'JFK', 'Los Angeles')",
        "type": "string",
        "default": null
      },
      "destination": {
        "description": "Arrival airport or city (e.g., 'London', 'LHR', 'Tokyo')",
        "type": "string",
        "default": null
      },
      "departureDate": {
        "description": "Departure date in YYYY-MM-DD format (e.g., '2025-01-20')",
        "type": "string",
        "default": null
      },
      "maxResults": {
        "description": "Maximum number of flights to return",
        "type": "string",
        "default": "10"
      }
    },
    "required": []
  }
}

After making these corrections, run the Provision command again. This time, it should complete successfully, provided you are signed in to the Microsoft 365 Agents Toolkit with your Microsoft 365 account.

Testing the agent

Now that our declarative agent is provisioned, we can test it in Copilot Chat. Open Copilot Chat and select your agent from the list of available agents.

Try asking a question like:

Find me flights from New York to London

The agent will go through the following flow:

  1. Permission request: The agent will ask for your permission to connect to the MCP server. Approve this request.

    The declarative agent asking for permission to connect to the MCP server

  2. Authentication: You’ll see a Sign in button. Click it to authenticate with your Microsoft Entra credentials.

    The sign-in button for Entra authentication

  3. Results: After authentication, the agent will call the MCP server and return a list of available flights.

    The agent returning flight results

We did it! The declarative agent is now successfully consuming our Entra-authenticated MCP server.

What happened behind the scenes?

When we imported the MCP server using the Microsoft 365 Agents Toolkit, I called out how the following entry was added to manage the authentication:

"auth": {
 "type": "OAuthPluginVault",
 "reference_id": "${{MCP_DA_AUTH_ID_FLIGHTSAPI}}"
}

The value of the ${{MCP_DA_AUTH_ID_FLIGHTSAPI}} placeholder will be saved inside the .env.dev file.

This property references an entry in the Teams Developer Portal, which is created behind the scenes for you. However, if you want to better understand where it comes from, you can check it by logging in with your Entra ID on the portal at https://dev.teams.microsoft.com/. Move to Tools -> OAuth Client Registration. You will find the entry created by the Microsoft 365 Agents Toolkit and, if you click on it, you will see all the information required for the authentication, like the client ID or the authorization URL. Notice that, at the top, there’s a box titled OAuth client registration ID, with a value inside it. This is the ID that gets stored in the .env.dev file and assigned to the ${{MCP_DA_AUTH_ID_FLIGHTSAPI}} placeholder.
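
For illustration, the generated line in .env.dev looks something like this, with the placeholder standing in for the registration ID shown in the portal:

  MCP_DA_AUTH_ID_FLIGHTSAPI=<OAuth client registration ID from the Teams Developer Portal>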

The OAuth app registration in the Teams Developer Portal

Additional resources

If you want a more hands-on experience with MCP in declarative agents, I recommend checking out the Copilot Developer Camp. The Microsoft 365 Dev Advocacy team created an amazing lab that will guide you through the step-by-step experience of building your own MCP server, with the authentication flow managed in code rather than through Azure API Management.

Wrapping up

In this post, we’ve seen how to create a declarative agent using the Microsoft 365 Agents Toolkit that consumes an Entra-authenticated MCP server. Along the way, we navigated several gotchas:

  1. The GET endpoint requirement: The toolkit needs a GET endpoint (not just POST) to discover authentication requirements.
  2. Missing OAuth properties: The m365agents.yml file needs authorizationUrl, tokenUrl, and apiSpecPath properties added manually.
  3. Tool parameter type incompatibilities: Integer and array types must be converted to strings, and unsupported format properties must be removed.

The one that took me the longest to figure out was the GET endpoint requirement. It’s not immediately obvious from the error messages, and the MCP specification primarily focuses on POST requests. GitHub Copilot came to the rescue: by analyzing the Microsoft 365 Agents Toolkit repository, it worked out how the operation was performed and matched that against my configuration.

As I mentioned at the beginning, this is the first post in a series exploring authenticated MCP in the Copilot ecosystem. In the next posts, we’ll see how to consume the same authenticated MCP server from Azure AI Foundry and Copilot Studio. Stay tuned!

Happy coding!


0.0.400-0


Added

  • Add MCP server instructions support
  • Add theme picker with /theme command and GitHub Dark/Light themes
  • ACP server supports changing models during a session
  • Show progress indicator in terminal tab when thinking
  • Add fuzzy search to model picker

Improved

  • CLI now sends DELETE requests to remove MCP servers when shutting down
  • Markdown table headers display in bold
  • Better support for UNIX keyboard bindings (Ctrl+A/E/W/U/K, Alt+arrows) and multiline content in various text inputs

Fixed

  • Ordered lists display with numbers instead of dashes
  • Fix support for pasting large content on Windows Terminal
  • Freeform text input in list pickers works correctly
  • The Code Review tool handles large changesets by ignoring build artifacts and limiting to 100 files

The AI Acceleration Gap

From: AIDailyBrief
Duration: 16:23
Views: 856

Examination of the AI acceleration gap and widening divergence between frontier adopters and mainstream users. Exploration of career and equity risks from compounding advantages alongside cultural and political polarization around AI adoption. Recommendation to adopt modest, regular experimentation with accessible tools and to create organizational time for practical AI skill development.

Brought to you by:
KPMG – Go to www.kpmg.us/ai to learn more about how KPMG can help you drive value with our AI solutions.
Vanta - Simplify compliance - https://vanta.com/nlw

The AI Daily Brief helps you understand the most important news and discussions in AI.
Subscribe to the podcast version of The AI Daily Brief wherever you listen: https://pod.link/1680633614
Get it ad free at
Join our Discord: https://bit.ly/aibreakdown
