Every developer has that project they build just for the fun of it. You know how it goes: you start by asking “what if?” and then you have something weird and wonderful hours later.
This summer, we decided to celebrate that spirit with For the Love of Code, our first-ever competition for projects built purely for fun. More than 300 developers answered the call. Some leaned on GitHub Copilot to refactor ideas, fix bugs, and spark inspiration. Some teamed up. Others flew solo, guided only by caffeine and curiosity.
Entries spanned everything from a Breakout game powered by your GitHub graph, to a laugh-track that plays on every commit, a Yelp-style code reviewer in VS Code ★★★★☆, a Copilot you can literally see on camera, and even a comic strip made from your release notes.
We invited participants to build anything that sparks joy across six whimsical categories:
Here are the top three entries from each category.
Plane Tracker by @cpstroum is a DIY radar that uses an Adafruit Circuit Playground, Bluetooth, and the ADS-B Exchange API to fetch live flight data. It turns nearby planes into a real-time mini radar display.
Copilot cameo: GitHub Copilot helped @cpstroum with Git itself and with structuring the initial project for their first real push to GitHub. Thanks, Copilot! And welcome aboard, @cpstroum!

Cadrephoto by @ozh is a Raspberry Pi and e-ink photo frame that displays pictures emailed to it (no app, no setup, perfect for less tech-savvy people). It checks an inbox, downloads the latest photo, and updates the screen automatically.
Copilot cameo: GitHub Copilot helped @ozh with their first Python project. It worked smoothly inside JetBrains IDEs and made code completion feel almost like magic.

BuildIn by @SUNSET-Sejong-University and @lepetitprince99 is a real-life traffic light for your code that sits on your desk. Using an Arduino and the GitHub API, it lights up red, yellow, green, or blue to show your repository’s build status at a glance.
Copilot cameo: GitHub Copilot helped @SUNSET-Sejong-University debug and optimize their code. It saved time tracking down tricky hardware issues and made troubleshooting much easier.

RestoHack by @Critlist resurrects the 1984 roguelike game that inspired NetHack, rebuilt from the original source with modern tools and a preservationist’s touch. It compiles cleanly, runs faithfully, and proves that forty years later, permadeath still hits hard.
Jukebox CLI by @FedeCarollo is a colorful, animated jukebox that runs right in your terminal. Built in Rust with Ratatui, it plays MP3s, shows floating musical notes, and color-codes each track in a scrollable playlist. You can play, pause, skip, and adjust the volume without ever leaving your command line.
Copilot cameo: GitHub Copilot helped @FedeCarollo explore unfamiliar Rust libraries and find their footing.

Tuneminal by @heza-ru turns your terminal into a full-blown karaoke stage with scrolling lyrics, live audio visualization, and scoring that rewards your inner rock star. It’s open source, cross-platform, and the perfect excuse to sing while that git clone takes a while.
Netstalgia by @heza-ru (again!) is a fully functional ‘90s web fever dream built with modern tech, but visually stuck on virtual dial-up. It’s got dancing babies, popup ads, a fake BBS, and more CRT glow than your old Gateway 2000 ever survived.
In true retro internet spirit, it even ships with a fake GitHub Star Ransomware—a tongue-in-cheek “virus” that demands you star the repo to “decrypt your files.” A clever (and harmless) new twist on the eternal quest for GitHub stars. ⭐💾
Bionic Reader by @Awesome-XV rewires how you read by bolding the first few letters of each word so your brain fills in the rest. It’s like giving your eyes a speed boost, without the caffeine jitters, so you can read faster than ever.
Copilot cameo: GitHub Copilot helped @Awesome-XV write project documentation and scaffold the initial codebase.

Git Roast Show by @rawrnuck and @Anmol0201 is a full-stack web app that humorously “roasts” your GitHub profile. Built with React, Vite, and Express, it fetches live GitHub data to generate personalized, sound-enhanced, and animated comedy roasts.
Copilot cameo: GitHub Copilot helped @rawrnuck understand algorithms and handle the repetitive parts of their project.

Nightlio by @shirsakm is a privacy-first mood tracker and daily journal you can self-host in minutes. Log how you feel on a 5-point scale, add Markdown notes, tag entries like #Sleep or #Productivity, then explore calendars, streaks, and simple stats to spot patterns. It runs anywhere with Docker, stores data in a local SQLite file, and keeps things clean with JWT-protected APIs, a React/Vite front end, and optional Google OAuth. No ads. No subscriptions. Your server, your rules.
Note: Because @heza-ru placed in two categories, we’ve added a fourth winner to this category.
Copilot cameo: GitHub Copilot helped @shirsakm with refactors, color palette updates, and codebase-wide changes that would have taken much longer by hand.

Neosgenesis by @answeryt is a metacognitive AI framework that teaches machines to think about how they think. It runs a five-stage loop (think, verify, learn, optimize, decide) while juggling multiple LLMs, tools, and real-time feedback. A multi-armed bandit picks the best reasoning patterns, and when it stalls, an “aha” mode explores fresh paths.
MediVision Assistant by @omkardongre is an AI healthcare companion that helps elderly and disabled users manage their health through voice, image, and video. Users can scan medications, analyze skin conditions, log symptoms by voice, and chat with an AI doctor-like assistant.
Copilot cameo: GitHub Copilot helped @omkardongre generate React components, API templates, and AI integration code. It handled the boilerplate so they could focus on building features and improving the experience.

Quiviva by @katawiecz is an interactive AI-powered CV that turns a job hunt into a chat adventure. Ask about skills or projects, or type “Gandalf” to unlock secret nerd mode. All this goes to show that even résumés can be fun.
AI-Dventure by @FedeCarollo is an interactive text adventure built in Rust and powered by OpenAI’s models. Players explore dynamically generated worlds in fantasy, horror, sci-fi, or historical settings where every command shapes the story and no two runs are the same.
BeatBugging by @sandra-aliaga, @Joshep-c, @RyanValdivia, and @tniia turns debugging into a rhythm game that converts your system logs into musical beats. Built in Python, it lets you fix bugs to the rhythm on a 5-by-5 grid and makes debugging sound unexpectedly good.
Copilot cameo: GitHub Copilot helped the team figure out next steps when they got stuck, offering helpful hints that kept development moving.

MuMind by @FontesHabana is a web-based multiplayer version of the party game Herd Mentality, where players try to match the majority’s answers to score points. Built with React, Tailwind CSS, and Framer Motion, it offers multilingual support, lively animations, and a smooth, responsive experience.
@chornonoh-vova built GitFrag to reorganize your contributions graph using classic sorting algorithms (bubble, merge, quick, and counting sort). Each is visualized with smooth progress animations, GitHub login, and dark mode support. There’s also a wonderful writeup of how the developer approached it.
Copilot cameo: GitHub Copilot helped @chornonoh-vova structure their understanding of algorithms and add thoughtful details that made their visualization shine.

Code Sensei by @redhatsam09 turns your VS Code sessions into a zen pixel adventure where your focus fuels the fun. Type to walk, pause to hop—but stay away too long and your sensei meets a dramatic, 8-bit demise.
Reviewer Karma by @master-wayne7 keeps your pull requests peaceful by rewarding reviewers for good vibes and great feedback. Every emoji, comment, and code critique earns points on a live leaderboard that turns pull request reviews into a friendly competition.
Copilot cameo: GitHub Copilot helped @master-wayne7 write efficient Go code for the GitHub API, structure logic for assigning karma points, and handle repetitive tasks like error checking and markdown generation. It kept the project flowing smoothly from start to finish.

Remember, these are hackathon projects. They might not be feature complete; there may be bugs, spaghetti code, and the occasional rogue program escaped from the Grid. But they are clear examples of what we can accomplish when we do something just for the love of it.
All of our category winners get 12 months of GitHub Copilot Pro+.
If For the Love of Code proved anything, it’s that creativity and code thrive best together—especially with Copilot lending a hand.
Congratulations to all of our winners: @Anmol0201, @answeryt, @Awesome-XV, @chornonoh-vova, @cpstroum, @Critlist, @FedeCarollo, @FontesHabana, @heza-ru, @joshep-c, @katawiecz, @lepetitprince99, @master-wayne7, @omkardongre, @RyanValdivia, @ozh, @rawrnuck, @redhatsam09, @sandra-aliaga, @shirsakm, @SUNSET-Sejong-University, @tniia.
A massive thank-you to our judges, who included a mix of GitHub Stars, Campus Experts, and GitHub Developer Relations friends: @Ba4bes, @colbyfayock, @j0ashm, @JuanGdev, @howard-lio, @luckyjoseph, @metzinaround, @Taiwrash, and @xavidop.
And thank you Copilot for your assistance!
Now back to work everyone! Playtime is over.
💜 If you enjoyed For the Love of Code, stay tuned… Game Off 2025 begins this November!
The post From karaoke terminals to AI résumés: The winners of GitHub’s For the Love of Code challenge appeared first on The GitHub Blog.
One way to achieve this result is by integrating a few pieces of available technology. It sounds like a lot of moving parts, but we do not need many.
What this does is provide a connection point between PowerShell and the information you want to access. Since the app registration gives us a client secret and an appId, we can leverage these two pieces of information in the PowerShell script.
For reference from our documentation: How to register an application in EntraID.
1.- Jump into your EntraID tenant --> manage --> App registrations and click on “New registration” (Figure 1)
Figure 1. EntraID App registration process
2.- This is a simple app registration process, nothing complicated; it is just to obtain the AppID and ClientSecret values we need for the PowerShell script. For the purpose of the test, I called the app “DefenderEntraQueryApp” and configured it with the following settings:
Authentication settings:
Fig 2. Authentication settings of app registration.
Certificates & Secrets:
Fig 3. Certificates and App client secret.
NOTE: Remember, when you create the app registration the client secret (Value) is shown only once; after you move away from the app registration creation screen, the client secret will not be shown again. If you could not grab the ClientSecret during registration of the app, you can click the “New client secret” button from that view, create a new client secret, and delete the previous one.
API Permissions:
Fig 4. API permissions.
After these parameters are configured in EntraID, you need to grab the following parameters from EntraID and insert this in the PowerShell script:
$tenantId = "MY-TENANT-ID"
$clientId = "MY-CLIENT-ID"
$secretPath = "C:\certs\clientSecret.dat"
$deviceList = Get-Content "C:\temp\devices.txt"
NOTE: The $deviceList variable is the text file where you will input the computer names you want to interrogate. Adjust the path for the text file to your preferred path and file name but be sure to reflect that in the script logic.
Since the client secret generated when registering the application in EntraID is plain text, we cannot allow it to be passed around in the script in that form.
For this, we encrypt the client secret into a .dat file using Windows DPAPI encryption, and the script pulls it from a location on the user’s computer.
It is worth noting that the .dat file is bound to the user who created it, so if you copy this .dat file to another computer (or another user profile), the script will fail. Below is the one-time setup needed to create the encrypted .dat file the script will use.
To encrypt the client secret from your EntraID registered application, do the following:
The first line in the code sequence below loads the System.Security assembly so PowerShell can use the DPAPI ProtectedData class for the encryption operation. Run each of these lines, one by one, in a PowerShell window with elevated privileges.
NOTE: change the path in the final WriteAllBytes line of this piece of code to the path where you want the .dat file generated.
# Load the System.Security assembly so the DPAPI ProtectedData class is available
Add-Type -AssemblyName System.Security
# Paste your client secret here (only for this one-time setup)
$plainText = "your-client-secret"
# Convert the secret to bytes and encrypt it with DPAPI, scoped to the current user
$secureBytes = [System.Text.Encoding]::UTF8.GetBytes($plainText)
$encrypted = [System.Security.Cryptography.ProtectedData]::Protect($secureBytes, $null, [System.Security.Cryptography.DataProtectionScope]::CurrentUser)
# Write the encrypted bytes to the .dat file the main script will read (adjust the path as needed)
[System.IO.File]::WriteAllBytes("C:\temp\clientSecret.dat", $encrypted)
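To confirm the secret round-trips correctly on the same machine and user account, a quick decryption check (a sketch assuming the same C:\temp\clientSecret.dat path) might look like this:

# Read the encrypted bytes back and decrypt them with DPAPI under the current user
$encryptedCheck = [System.IO.File]::ReadAllBytes("C:\temp\clientSecret.dat")
$decryptedCheck = [System.Security.Cryptography.ProtectedData]::Unprotect($encryptedCheck, $null, [System.Security.Cryptography.DataProtectionScope]::CurrentUser)
[System.Text.Encoding]::UTF8.GetString($decryptedCheck)   # should print your client secret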
Now that the app registration and client secret encryption are out of the way, you can populate the text file with the list of computers you want to check.
For the example, I am assuming the path for the text file is C:\temp.
Fig 5. devices.txt file used as input to target multiple computers
Use a single entry per line, with no trailing space. As you can see from the device names in Fig 5, the script works for any supported OS as long as the machine is registered in EntraID.
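In case the figure does not render for you, a devices.txt along these lines would work (hypothetical device names):

DESKTOP-WIN11-01
SRV-FILE-2019
MACBOOK-PRO-JDOE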
After the text file is saved, open a PowerShell window with elevated privileges and connect to your EntraID tenant for authentication purposes.
Use this command to connect to EntraID: Connect-AzureAD. Validate your credentials, then switch to the path where the script is (if you are not already there), and execute the script in the PowerShell window:
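Put together, the session looks roughly like the sketch below; the folder and script file name are placeholders for wherever and whatever you saved the script as:

Connect-AzureAD                      # sign in with your EntraID credentials
cd C:\scripts                        # placeholder: folder where you saved the script
.\Check-DeviceStatus.ps1             # placeholder: your script file name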
Fig 6. Running the PS Script
Fig 7. Authenticating to EntraID:
Fig 8. Output of the script:
The script also exports the list to a .csv file; by default, the path for the .csv file is C:\temp\DeviceStatus.csv.
The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.
NOTE: When using your script editing tool of choice, be aware of any additional spaces added as a result of the copy/paste operation into your editing tool.
# === CONFIGURATION ===
# This script loops through a list of devices to check if the device is enabled or disabled in EntraID
# It uses the MS Graph API and a simple app registration in EntraID with consent granted to access
# Defender via Defender API
# Steps to define the pre-requisites for the script to run will be provided in a separate doc guide
# Author: Edgar Parra - Microsoft v1.2
$tenantId = "MY-TENANT-ID"
$clientId = "MY-CLIENT-ID"
$secretPath = "C:\certs\clientSecret.dat"
$deviceList = Get-Content "C:\temp\devices.txt"
# === LOAD ENCRYPTED CLIENT SECRET ===
Add-Type -AssemblyName System.Security

if (-not (Test-Path $secretPath)) {
    Write-Host "Encrypted client secret file not found at $secretPath."
    return
}

try {
    $encryptedSecret = [System.IO.File]::ReadAllBytes($secretPath)
    $decryptedBytes = [System.Security.Cryptography.ProtectedData]::Unprotect(
        $encryptedSecret,
        $null,
        [System.Security.Cryptography.DataProtectionScope]::CurrentUser
    )
    $clientSecret = [System.Text.Encoding]::UTF8.GetString($decryptedBytes)
} catch {
    Write-Host "Error decrypting client secret: $($_.Exception.Message)"
    return
}
# === AUTHENTICATION ===
$body = @{
    grant_type    = "client_credentials"
    scope         = "https://graph.microsoft.com/.default"
    client_id     = $clientId
    client_secret = $clientSecret
}

try {
    $tokenResponse = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body $body
    $accessToken = $tokenResponse.access_token
} catch {
    Write-Host "Error retrieving token: $($_.Exception.Message)"
    return
}

$headers = @{
    Authorization  = "Bearer $accessToken"
    "Content-Type" = "application/json"
    Accept         = "application/json"
}
# === LOOP THROUGH DEVICES ===
$results = @()

foreach ($deviceName in $deviceList) {
    $escapedDeviceName = $deviceName -replace "'", "''"
    $uri = "https://graph.microsoft.com/v1.0/devices?`$filter=displayName eq '$escapedDeviceName'"

    try {
        $response = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

        if ($response.value.Count -eq 0) {
            Write-Host "Device '$deviceName' not found."
            $results += [PSCustomObject]@{ DeviceName = $deviceName; Status = "Not Found" }
        } else {
            $device = $response.value[0]
            $status = if ($device.accountEnabled) { "Enabled" } else { "Disabled" }
            Write-Host "$($device.displayName): $status"
            $results += [PSCustomObject]@{ DeviceName = $device.displayName; Status = $status }
        }
    } catch {
        $errorMessage = $_.Exception.Response.GetResponseStream() |
            % { New-Object System.IO.StreamReader($_) } |
            % { $_.ReadToEnd() }
        Write-Host "Error querying '$deviceName': $errorMessage"
        $results += [PSCustomObject]@{ DeviceName = $deviceName; Status = "Error" }
    }
}
# === EXPORT RESULTS TO CSV ===
$results | Export-Csv -Path "C:\temp\DeviceStatus.csv" -NoTypeInformation
For further insights and guidance on data protection encryption and app registrations in EntraID, consider reviewing these related articles:
In today’s enterprise landscape, enabling AI agents to interact with backend systems securely and at scale is critical. By exposing MCP servers through Azure API Management (APIM), organizations can provide controlled access to these services. When combined with OAuth 2.0 authorization code flow, this setup ensures robust, enterprise-grade security for AI agents built in Copilot Studio—empowering intelligent automation while maintaining strict access governance.
This article explores how to configure an MCP tool—exposed as an MCP server via APIM—for secure consumption by AI agents built in Copilot Studio. Leveraging the OAuth 2.0 Authorization Code Flow, this setup ensures enterprise-grade security by enabling delegated access without exposing user credentials.
With Azure API Management now supporting MCP server capabilities in public preview, developers can expose REST APIs as MCP tools using a standardized JSON-RPC interface. This allows AI agents to invoke backend services securely and at scale, without the need to rebuild existing APIs. Copilot Studio, also in preview for MCP integration, empowers organizations to orchestrate intelligent agents that interact with these tools in real time.
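As a rough illustration of that JSON-RPC interface, listing the tools an MCP server exposes can look like the PowerShell sketch below. The endpoint URL and token are placeholders, and depending on the server an MCP initialize handshake may be required before this call; treat it as a sketch rather than a definitive client.

# Placeholder values - substitute your APIM MCP endpoint and a valid access token
$mcpUrl = "https://<your-apim>.azure-api.net/<mcp-base-path>/mcp"
$token  = "<access-token>"

# MCP uses JSON-RPC 2.0; 'tools/list' asks the server for its available tools
$body = @{ jsonrpc = "2.0"; id = 1; method = "tools/list"; params = @{} } | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $mcpUrl -Body $body -ContentType "application/json" -Headers @{
    Authorization = "Bearer $token"
    Accept        = "application/json, text/event-stream"
}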
While this guide provides a foundational approach, every environment is unique. You can enhance security further by implementing app roles, conditional access policies, and extending your integration logic with custom Python code for advanced scenarios.
⚠️ Note: Both MCP server support in APIM and MCP tool integration in Copilot Studio are currently in public preview. As these platforms evolve rapidly, expect changes and improvements over time. Always refer to the documentation at https://learn.microsoft.com/en-us/azure/api-management/export-rest-mcp-server for the latest updates.
This article is about consuming remote MCP servers. In Azure, managed identity can also be leveraged for APIM integration.
The Authorization Code Flow is designed for applications that can securely store a client secret (like server-side apps). It allows the app to obtain an access token on behalf of the user without exposing their credentials. This flow uses an intermediate authorization code to exchange for tokens, adding an extra layer of security.
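To make the flow concrete, the final leg, exchanging the authorization code for tokens against the Microsoft identity platform v2.0 token endpoint, looks roughly like the sketch below; the tenant, client, secret, code, redirect URI, and scope values are placeholders for your own environment.

# Placeholder values from your app registration
$tenantId     = "<tenant-id>"
$clientId     = "<client-app-id>"
$clientSecret = "<client-secret>"

# The authorization code returned to your redirect URI after the user signs in
$authCode    = "<authorization-code>"
$redirectUri = "https://<your-app>/callback"

$body = @{
    grant_type    = "authorization_code"
    code          = $authCode
    client_id     = $clientId
    client_secret = $clientSecret
    redirect_uri  = $redirectUri
    scope         = "api://<apim-backend-app-id>/.default"
}

# Exchange the code for an access token without ever handling the user's password
$tokenResponse = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body $body
$tokenResponse.access_token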
The diagram below shows the Authorization Code Flow in detail.
This architecture can also be implemented with the APIM backend app registration only. However, take care to configure the redirect URIs appropriately.
APIM exposes remote MCP servers, enabling AI agents, such as those built in Copilot Studio, to securely access backend services using standardized JSON-RPC interfaces. This integration offers a robust, scalable, and secure way to connect AI tools with enterprise APIs.
To learn more, visit https://learn.microsoft.com/en-us/samples/azure-samples/remote-mcp-apim-functions-python/remote-mcp-apim-functions-python/
This deployment guide provides sample MCP code written in Python for ease of use. It is available in the following GitHub repo; however, you can also use your own MCP server.
Clone the following repository and open it in VS Code:

git clone https://github.com/mafzal786/mcp-server.git

Run the following to execute it locally:

cd mcp-server
uv venv
uv sync
uv run mcpserver.py

In this deployment guide, the MCP server is deployed in an Azure Container App. It can also be deployed as an Azure App Service.
Deploy the MCP server in an Azure Container App by running the following command. It can be deployed in many other ways, such as via VS Code or a CI/CD pipeline; the Azure CLI is used here for simplicity.
az containerapp up \
  --resource-group <RESOURCE_GROUP_NAME> \
  --name streamable-mcp-server2 \
  --environment mcp \
  --location <REGION> \
  --source .

1. Sign in to the Azure portal. Visit the container app in Azure and click “Authentication” as shown below.
For more details, visit the following link: Enable authentication and authorization in Azure Container Apps with Microsoft Entra ID | Microsoft Learn
Click Add Identity Provider as shown.
2. Select Microsoft from the drop down and leave everything as is as shown below.
3. This will create a new app registration for the container app. After it is all set up, it will look like the following.
As soon as authentication is configured, the container app becomes inaccessible except through OAuth.
Note: If you already have an app registration configured for the Azure Container App, use it by selecting the "Pick an existing app registration in this directory" option.
Note: Make sure to "Grant admin consent" before proceeding to the next step.
In these steps, we will configure the app registration for the client app; in this case, Copilot Studio acts as the client app. This is also shown in the “high level architecture” diagram in the earlier section of this article.
3. Click on “API permissions” and click “Add a permission”. Click Microsoft Graph and then click “Delegated permissions”. Select email, openid, and profile as shown below.
4. Make sure to grant admin consent; it should look like the following.
5. Create a secret: click “Certificates & secrets”, then create a new client secret by clicking “New client secret”. Store the value, as it will be masked after some time; if that happens, you can always delete it and create a new secret.
6. Capture the following, as you will need it when configuring the MCP tool in Copilot Studio.
7. Configure API permissions for the APIM API, i.e., "apim-mcp-backend-api" in this case. Click the “API permissions” tab, click “Add a permission”, then click the “My APIs” tab as shown below and select "apim-mcp-backend-api".
Note: If you don't see the app registration under "My APIs", go to the app registration, click "Owners", and add your AD account as an owner.
8. Select "Delegated permissions". Then select the permission as shown below.
9. Select the application permission, then select the app roles created in the apim-mcp-backend-api registration, such as mcp.read in this case.
You MUST “Grant admin consent” as the final step. It is very important, and I can’t emphasize it enough: without it, nothing will work!
10. The end result of this client app registration should look like the figure below.
Note: Don't forget to Grant admin consent.
The Allowed Token Audiences setting defines which audience values (aud claim) in a token are considered valid for your app. When a client app requests an access token from Microsoft Entra ID (Azure AD), the token includes an aud claim that identifies the intended recipient. Your container app will only accept tokens whose aud claim matches one of the values in the Allowed Token Audiences list.
This is important because it ensures that only tokens issued for your API or app are accepted, preventing misuse of tokens intended for other resources. This adds an extra layer of security.
Note: Provisioning an API Management resource is outside the scope of this document.
If you do not already have an API Management instance, follow this QuickStart: https://learn.microsoft.com/en-us/azure/api-management/get-started-create-service-instance
The following service tiers are available for the preview: classic Basic, Standard, and Premium, as well as Basic v2, Standard v2, and Premium v2.
For the Classic Basic, Standard, or Premium tiers, you must join the AI Gateway Early Update group to enable MCP server features. Please allow up to 2 hours for the update to take effect.
Follow these steps to expose an existing MCP server in API Management:
The diagram below shows the MCP servers configured in APIM for reference.
Configure one or more API Management policies to help manage the MCP server. The policies are applied to all API operations exposed as tools in the MCP server and can be used to control access, authentication, and other aspects of the tools.
To configure policies for the MCP server:
<!--
    - Policies are applied in the order they appear.
    - Position <base/> inside a section to inherit policies from the outer scope.
    - Comments within policies are not preserved.
-->
<!-- Add policies as children to the <inbound>, <outbound>, <backend>, and <on-error> elements -->
<policies>
    <!-- Throttle, authorize, validate, cache, or transform the requests -->
    <inbound>
        <base />
        <set-variable name="accessToken" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "").Replace("Bearer ", ""))" />
        <!-- Log the captured access token to the trace logs -->
        <trace source="Access Token Debug" severity="information">
            <message>@("Access Token: " + (string)context.Variables["accessToken"])</message>
        </trace>
        <set-variable name="userId" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["oid"].FirstOrDefault())" />
        <set-variable name="userName" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["name"].FirstOrDefault())" />
        <trace source="User Name Debug" severity="information">
            <message>@("username: " + (string)context.Variables["userName"])</message>
        </trace>
        <set-variable name="scp" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["scp"].FirstOrDefault())" />
        <trace source="Scope Debug" severity="information">
            <message>@("scope: " + (string)context.Variables["scp"])</message>
        </trace>
        <set-variable name="roles" value="@(context.Request.Headers.GetValueOrDefault("Authorization", "Bearer ").Split(' ')[1].AsJwt().Claims["roles"].FirstOrDefault())" />
        <trace source="Role Debug" severity="information">
            <message>@("Roles: " + (string)context.Variables["roles"])</message>
        </trace>
        <!--
        <set-variable name="requestBody" value="@{ return context.Request.Body.As<string>(preserveContent:true); }" />
        <trace source="Request Body information" severity="information">
            <message>@("Request body: " + (string)context.Variables["requestBody"])</message>
        </trace>
        -->
        <validate-azure-ad-token tenant-id="{{tenant-id}}" header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized. Access token is missing or invalid.">
            <client-application-ids>
                <application-id>{{client-application-id}}</application-id>
            </client-application-ids>
            <audiences>
                <audience>{{audience}}</audience>
            </audiences>
            <required-claims>
                <claim name="roles" match="any">
                    <value>mcp.read</value>
                </claim>
            </required-claims>
        </validate-azure-ad-token>
    </inbound>
    <!-- Control if and how the requests are forwarded to services -->
    <backend>
        <base />
    </backend>
    <!-- Customize the responses -->
    <outbound>
        <base />
    </outbound>
    <!-- Handle exceptions and customize error responses -->
    <on-error>
        <base />
        <trace source="Role Debug" severity="error">
            <message>@("username: " + (string)context.Variables["userName"] + " has error in accessing the MCP server, could be auth or role related...")</message>
        </trace>
        <return-response>
            <set-status code="403" reason="Forbidden" />
            <set-body>
                {"error":"Missing required scope or role"}
            </set-body>
        </return-response>
    </on-error>
</policies>
Note: Update the above inbound policy with the tenant Id, client application id, and audience as per your environment. It is recommended to use APIM "Named values" instead of hard coding inside the policy. To learn more, visit Use named values in Azure API Management policies
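For instance, a named value referenced as {{tenant-id}} in the policy can be created with the Azure CLI along these lines. This is a sketch: the resource group and APIM instance names are placeholders, and you should confirm the az apim nv command is available in your CLI version.

# Placeholder resource names - creates a named value the policy can reference as {{tenant-id}}
az apim nv create \
  --resource-group <RESOURCE_GROUP_NAME> \
  --service-name <APIM_INSTANCE_NAME> \
  --named-value-id tenant-id \
  --display-name tenant-id \
  --value "<your-tenant-guid>"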
In this solution, APIM diagnostics are configured to forward log data to Log Analytics. Testing and validation will be carried out using insights from Log Analytics.
Note: Setting up diagnostics is outside the scope of this article. However, you can visit the following link for more information. https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-azure-monitor
The diagram below shows which logs are being sent to the Log Analytics workspace.
3. Click on New tool.
4. Select Model Context Protocol.
5. Provide all the relevant information for the MCP server. Make sure your server URL ends with the MCP path for your setup; in this case, it is the APIM MCP server URL, with the base path configured in APIM at the end. Provide a server name and server description.
Select OAuth 2.0 radio button.
6. Provide the following in the OAuth 2.0 section
This will provide you with a Redirect URL. You need to configure the redirect URL in the client app registration; in this case, it is copilot-agent-client.
Visit the client app registration, i.e., copilot-studio-client. Click the Authentication tab and provide the Web redirect URIs as shown below.
Note: Redirect URIs MUST be configured in the app registration. Otherwise, authorization will not complete and sign-on will fail.
Also configure apim-mcp-backend-api app registration with the same redirect URI as shown below.
Now visit https://make.powerapps.com and open the newly created connector, as shown below.
Select the Security tab and update the Resource URL with the application ID URI of apim-mcp-backend-api that was configured earlier in the app registration under “Expose an API”. Add .default to the scope. Provide the secret of the client app registration, as it will not let you update the connector otherwise; this is an extra security measure for updating connectors in Power Apps.
Click Update connector.
CORS configuration is a MUST, since our Azure Container App is a remote MCP server with a completely different domain/origin.
When embedding or integrating Power Apps with external web applications or APIs, Cross-Origin Resource Sharing (CORS) becomes a critical consideration. CORS is a browser security feature that restricts web pages from making requests to a different domain than the one that served the page, unless explicitly allowed.
If CORS is not set up, you will encounter the following error in Copilot Studio after pressing F12 (browser developer tools):
CORS policy — blocking the container app
Azure Container Apps provides a very efficient way of configuring CORS in the Azure portal.
4. Click on the “Allowed Methods” tab and provide the following.
5. Provide the wildcard “*” in the “Allowed Headers” tab. Although this is not recommended for a production system, it is done here for the sake of simplicity; restrict the headers for added security.
6. Click “Apply”. This will configure CORS for remote application.
We are in the final stages of configuring the connector. It is time to test whether everything is configured correctly and works.
2. A new connection will launch the authorization flow, and a browser dialog will pop up to request the authorization code.
3. Click “Create”.
4. Complete the login process. This will create a successful connection.
5. Click “Test operation”. If the response is 406, it means everything is configured correctly, as shown below.
Roles have been defined under the required claims in the APIM inbound policy and also configured in the apim-mcp-backend-api app registration. As a result, any request from Copilot Studio will be denied if this role is not properly assigned. This role is included in the JWT access token, which we will validate in the following sections.
To assign the role, perform the following steps.
5. Select User or Group who should have access to the role.
6. Click "Assign". It will look like the following.
Note: Role assignment for users or groups is an important step. If it is not configured, MCP server tests will fail in Copilot studio.
Make sure it is “Enabled.” If you have other tools attached to the same agent, disable them for now for testing.
Make sure the connection we created while testing the custom connector in the earlier step is available. You can also initiate a fresh connection by clicking the drop-down under “Connection”, as shown below.
Refreshing the tools will show all the tools available in this MCP server.
Provide a sample prompt such as “Give me the stock price of tesla”. This will trigger the MCP server and call the respective method to retrieve the stock price of Tesla.
Now try a weather-related question to see more.
The agent now invokes the weather forecast tool in the MCP server.
We previously configured APIM diagnostic settings to forward log data to Log Analytics. In this section, we’ll review that data, as the inbound policy in APIM sends valuable information to Log Analytics.
Run a Kusto query to retrieve data from the last 30 minutes. As shown, the logs capture the APIM API endpoint URL and the backend URL, which corresponds to the Azure Container App endpoint.
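The original query isn't reproduced here, but a minimal equivalent against the resource-specific APIM table might look like the sketch below. This assumes diagnostics were configured in resource-specific mode (the table name and column names can differ slightly depending on your schema; adjust if you used AzureDiagnostics instead).

ApiManagementGatewayLogs
| where TimeGenerated > ago(30m)
| project TimeGenerated, Url, BackendUrl, ResponseCode, TraceRecords
| order by TimeGenerated desc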
Scrolling further, we find the TraceRecords section. This contains the information captured by APIM inbound policies and sent to Log Analytics. The figure below illustrates the TraceRecords data. In the inbound policy, we configured it to extract details from the access token—such as the token itself, username, scope, and roles—and forward them to Log Analytics.
Now let's copy the access token to the clipboard, launch https://jwt.io (the JSON Web Token debugger), and paste the access token in the ENCODED VALUE box as shown below. Note the following information.
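If you prefer not to paste a token into a website, the payload can also be inspected locally; a minimal PowerShell sketch (pure base64url decoding of the payload segment, with no signature validation) looks like this:

# Decode the payload (second segment) of a JWT - for inspection only, this does NOT validate the signature
$jwt = "<paste-access-token-here>"
$payload = $jwt.Split('.')[1].Replace('-', '+').Replace('_', '/')
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
[System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) |
    ConvertFrom-Json | Select-Object aud, scp, roles, name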
Note: As you can see, roles are included in the access token. If the role is not assigned in the enterprise application for "apim-mcp-backend-api", all requests will be denied by the APIM inbound policy configured earlier.
Now, let's try the Copilot Studio agent by logging in with another account that is not assigned the "mcp.read" role.
Let's review the diagram below.
Let's review Log Analytics. As you can see, the request failed with a 403 error due to the inbound APIM policy, and there is no backend URL. The error is also reported under TraceRecords, as we configured in the APIM policy.
Now copy the access token from Log Analytics and paste it into jwt.io. As you can see in the diagram below, there is no "roles" claim in the access token, resulting in access being denied by the APIM inbound policy before the request ever reaches the APIM backend, i.e., the Azure Container App.
Let's assign the "mcp.read" role to the demo account and test whether it can access the tool.
The end result looks like the following.
Now, log in again as the demo account.
Make sure a new access token is generated; access tokens are refreshed after one hour.
As you can see in the image below, this time the request is successful after assigning the "mcp.read" app role.
Now let's review the log analytics entries.
Let's review the access token in JWT.io. As you can see, roles are included in the access token.
Exposing the MCP server through Azure API Management (APIM) and integrating it with Copilot Studio agents provides a secure and scalable way to extend enterprise capabilities. By implementing OAuth 2.0, you ensure robust authentication and authorization, protecting sensitive data and maintaining compliance with industry standards.
Beyond security, APIM adds significant operational value. With APIM policies, you can monitor traffic, enforce rate limits, and apply fine-grained controls to manage access and performance effectively. This combination of security and governance empowers organizations to innovate confidently while maintaining control and visibility over API usage.
In today’s enterprise landscape, leveraging APIM with OAuth 2.0 for MCP integration is not just best practice—it’s a strategic move toward building resilient, secure, and well-governed solutions.
Hello Folks!
Managing file servers across on-premises datacenters and cloud environments can be challenging for IT professionals. Azure File Sync (AFS) has been a game-changer by centralizing file shares in Azure while keeping your on-premises Windows servers in play. With AFS, a lightweight agent on a Windows file server keeps its files synced to an Azure file share, effectively turning the server into a cache for the cloud copy. This enables classic file server performance and compatibility, cloud tiering of cold data to save local storage costs, and capabilities like multi-site file access, backups, and disaster recovery using Azure’s infrastructure. Now, with the introduction of Azure Arc integration for Azure File Sync, it gets even better. Azure Arc, which allows you to project on-prem and multi-cloud servers into Azure for unified management, now offers an Azure File Sync agent extension that dramatically simplifies deployment and management of AFS on your hybrid servers.
In this post, I’ll explain how this new integration works and how you can leverage it to streamline hybrid file server management, enable cloud tiering, and improve performance and cost efficiency.
You can see the E2E 10-Minute Drill - Azure File sync with ARC, better together episode on YouTube below.
Azure File Sync has already enabled a hybrid cloud file system for many organizations. You install the AFS agent on a Windows Server (2016 or later) and register it with an Azure Storage Sync Service. From that point, the server’s designated folders continuously sync to an Azure file share. AFS’s hallmark feature is cloud tiering: older, infrequently used files can be transparently offloaded to Azure storage, while your active files stay on the local server cache. Users and applications continue to see all files in their usual paths; if someone opens a file that’s tiered, Azure File Sync pulls it down on-demand. This means IT pros can drastically reduce expensive on-premises storage usage without limiting users’ access to files. You also get multi-site synchronization (multiple servers in different locations can sync to the same Azure share), which is great for branch offices sharing data, and cloud backup/DR by virtue of having the data in Azure. In short, Azure File Sync transforms your traditional file server into a cloud-connected cache that combines the performance of local storage with the scalability and durability of Azure.
Azure Arc comes into play to solve the management side of hybrid IT. Arc lets you project non-Azure machines (whether on-prem or even in other Clouds) into Azure and manage them alongside Azure VMs. An Arc-enabled server appears in the Azure portal and can have Extensions installed, which are components or agents that Azure can remotely deploy to the machine.
Prior to now, installing or updating the Azure File Sync agent on a bunch of file servers meant handling each machine individually (via Remote Desktop, scripting, or System Center). This is where the Azure File Sync Agent Extension for Windows changes the game.
Using the new Arc extension, deploying Azure File Sync is as easy as a few clicks. In the Azure Portal, if your Windows server is Arc-connected (i.e. the Azure Arc agent is installed and the server is registered in Azure), you can navigate to that server resource and simply Add the “Azure File Sync Agent for Windows” extension. The extension will automatically download and install the latest Azure File Sync agent (MSI) on the server. In other words, Azure Arc acts like a central deployment tool: you no longer need to manually log on or run separate install scripts on each server to set up or update AFS. If you have 10, 50, or 100 Arc-connected file servers, you can push Azure File Sync to all of them in a standardized way from Azure – a huge time saver for large environments. The extension also supports configuration options (like proxy settings or automatic update preferences) that you can set during deployment, ensuring the agent is installed with the right settings for your environment
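If you prefer the command line over the portal, Arc extensions can also be pushed with the Azure CLI. The sketch below is illustrative only: the resource names are placeholders, and the publisher and type strings for the Azure File Sync extension are assumptions you should verify against the official documentation before using them.

# Placeholder names throughout; verify --publisher and --type against the Azure File Sync extension docs
az connectedmachine extension create \
  --machine-name <ARC_SERVER_NAME> \
  --resource-group <RESOURCE_GROUP_NAME> \
  --name AzureFileSyncAgent \
  --publisher Microsoft.StorageSync \
  --type AzureFileSyncAgent \
  --location <REGION>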
Note: The Azure File Sync Arc extension is currently Windows-only. Azure Arc supports Linux servers too, but the AFS agent (and thus this extension) works only on Windows Server 2016 or newer. So, you’ll need a Windows file server to take advantage of this feature (which is usually the case, since AFS relies on NTFS/Windows currently).
Once the extension installs the agent, the remaining steps to fully enable sync are the same as a traditional Azure File Sync deployment: you register the server with your Storage Sync Service (if not done automatically) and then create a sync group linking a local folder (server endpoint) to an Azure file share (cloud endpoint). This can be done through the Azure portal, PowerShell, or CLI. The key point is that Azure Arc now handles the heavy lifting of agent deployment, and in the future, we may see even tighter integration where more of the configuration can be done centrally. For now, IT pros get a much simpler installation process – and once configured, all the hybrid benefits of Azure File Sync are in effect for your Arc-managed servers.
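For reference, those remaining steps can also be scripted with the Az.StorageSync PowerShell module. This is a minimal sketch, assuming the storage sync service, storage account, and Azure file share already exist; all names and paths below are placeholders, and the cmdlet parameters are worth double-checking against the Az.StorageSync documentation for your module version.

# Placeholder names throughout
$rg  = "<RESOURCE_GROUP_NAME>"
$sss = "<STORAGE_SYNC_SERVICE_NAME>"

# Register this server with the Storage Sync Service (run on the file server after the agent is installed)
$server = Register-AzStorageSyncServer -ResourceGroupName $rg -StorageSyncServiceName $sss

# Create a sync group and add the Azure file share as the cloud endpoint
New-AzStorageSyncGroup -ResourceGroupName $rg -StorageSyncServiceName $sss -Name "FileShareSync"
New-AzStorageSyncCloudEndpoint -ResourceGroupName $rg -StorageSyncServiceName $sss -SyncGroupName "FileShareSync" `
    -Name "cloud-endpoint" -StorageAccountResourceId "<STORAGE_ACCOUNT_RESOURCE_ID>" -AzureFileShareName "<FILE_SHARE_NAME>"

# Add the local folder as a server endpoint with cloud tiering enabled
New-AzStorageSyncServerEndpoint -ResourceGroupName $rg -StorageSyncServiceName $sss -SyncGroupName "FileShareSync" `
    -Name "server-endpoint" -ServerResourceId $server.ResourceId -ServerLocalPath "D:\Shares\Data" `
    -CloudTiering -VolumeFreeSpacePercent 20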
For more info and step-by-step guidance, check out these resources:
You, as an IT Pro, can provide your organization with the benefits of cloud storage – scalability, reliability, pay-as-you-go economics – while retaining the performance and control of on-premises file servers. All of this can be achieved with minimal overhead, thanks to the new Arc-delivered agent deployment and the powerful features of Azure File Sync.
Check it out if you have not done so before. I highly recommend exploring this integration to modernize your file services.
Cheers!
Pierre Roman