Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.
141488 stories · 32 followers

You’ve got 99 problems but data shouldn’t be one

1 Share
Ryan is joined by Tobiko Data co-founders Toby Mao and Iaroslav Zeigerman to talk about the crucial role of rigorous data practices and tooling, the innovations of Tobiko Data’s SQLMesh and SQLGlot, and their insights into the future of data engineering with the rise of AI.
Read the whole story
alvinashcraft
5 minutes ago
reply
Pennsylvania, USA
Share this story
Delete

New Default Model for Visual Studio Copilot, So How Do You Choose?


Along with a new default model, a new Consumptions panel in the IDE helps developers monitor their usage of the various models, paired with UI that makes it easy to switch among them.


Microsoft Teases HQ Dev Conference Workflows 'Never Shown Publicly Before'


Mads Kristensen, Microsoft's principal product manager for Visual Studio, will demonstrate "workflows we've never shown publicly before" at the Aug. 4-8 VSLive! developer conference being held at Microsoft headquarters in Redmond, Wash.


GitHub Copilot for Azure Ships: Now Powered by Agent Mode


GitHub Copilot for Azure just shipped with an important addition since its debut as a private preview at Ignite 2024: agent mode, which turns it into an autonomous operator for DevOps workflows.


Deploy Machine Learning Models the Smart Way with Azure Blob & Web App


💡 Why This Approach?

Traditional deployments often include models inside the app, leading to:

  • Large container sizes
  • Long build times
  • Slow cold starts
  • Painful updates when models change

With Azure Blob Storage, you can offload the model and only fetch it at runtime — reducing size, improving flexibility, and enabling easier updates.

What You'll Need
  • An ML model (model.pkl, model.pt, etc.)
  • An Azure Blob Storage account
  • A Python web app (FastAPI, Flask, or Streamlit)
  • Azure Web App (App Service for Python)
  • Azure Python SDK: azure-storage-blob

Step 1: Save and Upload Your Model to Blob Storage

First, save your trained model locally:

# PyTorch example
import torch

torch.save(model.state_dict(), "model.pt")

Then, upload it to Azure Blob Storage:

from azure.storage.blob import BlobServiceClient

conn_str = "your_connection_string"
blob_service = BlobServiceClient.from_connection_string(conn_str)
container = blob_service.get_container_client("models")

with open("model.pt", "rb") as f:
    container.upload_blob(name="model.pt", data=f, overwrite=True)

Step 2: Build a Lightweight Inference App

Create a simple FastAPI app that loads the model from Blob Storage on startup:

from fastapi import FastAPI
from azure.storage.blob import BlobClient
import io
import torch

app = FastAPI()

@app.on_event("startup")
def load_model():
    print("Loading model from Azure Blob Storage...")
    blob = BlobClient.from_connection_string(
        "your_connection_string",
        container_name="models",
        blob_name="model.pt",
    )
    stream = io.BytesIO(blob.download_blob().readall())
    global model
    # Step 1 saved only the state_dict, so instantiate your model class
    # (MyModel here stands in for whatever architecture you trained)
    model = MyModel()
    model.load_state_dict(torch.load(stream, map_location="cpu"))
    model.eval()

@app.get("/")
def read_root():
    return {"message": "Model loaded and ready!"}

@app.post("/predict")
def predict(data: dict):
    # Example input, dummy output
    return {"result": "prediction goes here"}

Step 3: Push Your App to GitHub and Deploy to Azure (the Model Loads at Runtime!)

Now that your ML model is safely uploaded to Azure Blob Storage (Step 1), it’s time to push your inference app (without the model) to GitHub and deploy it via Azure Web App.

The trick? Your app will dynamically fetch the model from Blob Storage at runtime — keeping your repo light and deployment fast!
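The runtime-fetch idea can be sketched independently of Azure: a small helper (the name `ensure_model` and the injected `download_fn` are illustrative, not part of the article's code) downloads the model bytes only when no local copy exists, so a restart on the same instance can reuse the cached file instead of pulling from Blob Storage again.

```python
from pathlib import Path


def ensure_model(local_path: str, download_fn) -> Path:
    """Fetch the model at runtime, but only when it isn't already cached.

    download_fn is any callable returning the raw model bytes, e.g. a
    wrapper around BlobClient.download_blob().readall().
    """
    path = Path(local_path)
    if not path.exists():
        # First start on this instance: pull the weights from remote storage
        path.write_bytes(download_fn())
    # Subsequent starts reuse the cached copy
    return path
```

Whether you cache to disk like this or load straight into memory as the article does, the repo and the deployed package stay free of model files either way.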

3.1 Push Your App Code (Without the Model) to GitHub

Your project structure should look like this:

azure-ml-deploy/
│
├── main.py           # Your FastAPI/Flask app
├── requirements.txt  # Python dependencies
├── README.md         # Optional documentation

🚫 Do NOT include model.pt or any large model files in your GitHub repo!
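One way to enforce this is a .gitignore entry covering the model extensions mentioned earlier (a sketch; adjust the patterns to your own file names):

```gitignore
# Keep trained weights out of the repo
*.pt
*.pkl
```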

3.2 main.py: Load the Model from Azure Blob Storage at Runtime

Here's your main.py — which automatically pulls the model during startup:

from fastapi import FastAPI
from azure.storage.blob import BlobClient
import io
import os
import torch

app = FastAPI()

@app.on_event("startup")
def load_model():
    print("Loading model from Azure Blob Storage...")
    blob = BlobClient.from_connection_string(
        conn_str=os.environ["AZURE_STORAGE_CONN_STRING"],  # You’ll set this in Azure Portal (step 3.4)
        container_name="models",
        blob_name="model.pt",
    )
    stream = io.BytesIO(blob.download_blob().readall())
    global model
    # Step 1 saved only the state_dict, so instantiate your model class
    # (MyModel here stands in for whatever architecture you trained)
    model = MyModel()
    model.load_state_dict(torch.load(stream, map_location="cpu"))
    model.eval()

@app.get("/")
def home():
    return {"status": "Model loaded from Azure Blob!"}

@app.post("/predict")
def predict(data: dict):
    # Replace this with your own prediction logic
    return {"prediction": "sample output"}

3.3 requirements.txt

fastapi
uvicorn
torch
azure-storage-blob

3.4 Deploy to Azure Web App Using GitHub Repo

  1. Go to Azure Portal
  2. Create a new Web App
    • Runtime Stack: Python 3.10
    • OS: Linux
  3. Under Deployment > GitHub, connect your GitHub repo
  4. In Configuration > Application Settings, add:
    • AZURE_STORAGE_CONN_STRING = <your-blob-connection-string>

This way, your app doesn’t store any secrets in code.
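To keep that setting out of code in a portable way, the app can read it through os.environ and fail with a clear message when it's missing. This is a minimal sketch (the helper name `get_conn_string` is illustrative); `AZURE_STORAGE_CONN_STRING` matches the app setting added above:

```python
import os


def get_conn_string() -> str:
    """Read the Blob Storage connection string from an App Service setting."""
    conn = os.environ.get("AZURE_STORAGE_CONN_STRING")
    if not conn:
        raise RuntimeError(
            "AZURE_STORAGE_CONN_STRING is not set; add it under "
            "Configuration > Application Settings in the Azure Portal."
        )
    return conn
```

Azure App Service exposes Application Settings to the app as environment variables, so the same code works locally (export the variable in your shell) and in production.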

Benefits of This Setup

  • Clean separation of model and code
  • Smaller, faster deployable packages
  • Easy model updates (just replace the blob!)
  • No need for GPUs or complex infrastructure
  • Ideal for web APIs, dashboards, and even chatbots.

Conclusion

In this blog, you learned how to separate your ML model storage from deployment, making your applications faster, cleaner, and more scalable using Microsoft Azure technologies.

By pushing a lightweight API to GitHub and having your application download the model from Azure Blob Storage at runtime, you:

  • Avoid bloated GitHub repos
  • Accelerate deployments via Azure Web App
  • Keep credentials and models secure with Azure App Settings
  • Enable dynamic updates to your model without redeploying your app

This architecture is perfect for real-world, production-grade ML systems, whether you're building prototypes or enterprise-grade APIs.

💡 Final Thought

Decouple. Deploy. Deliver.
With the power of Azure Blob Storage + Azure App Service, you can scale smarter — not heavier.

Happy Building! ✨

If you found this blog helpful or you're working on something similar, I’d love to connect and exchange ideas. Join the Azure AI Foundry communities or reach out to me on LinkedIn: Mohamed Faraazman Bin Farooq S.


Learning Azure with Jonah Andersson: A Developer's Guide to Cloud Computing and Development Fundamentals


RJJ Software's Software Development Service

This episode of The Modern .NET Show is supported, in part, by RJJ Software's Software Development Services. Whether your company is looking to elevate its UK operations or reshape its US strategy, we can provide tailored solutions that exceed expectations.

Show Notes

"So the cloud adoption framework actually has a lot of steps for organizations or IT teams to start assessing their existing environments first and planning the stage before they modernise and migrate to Azure. And then the well-architected framework allows the team, whoever is involved, developers, engineers, or architects, working in that migration project to think how they're going to think about architecting for the cloud in a way that it meets all the pillars in terms of resiliency, performance, architecture, and everything. Security, for example, that they need to think about."— Jonah Andersson

Welcome friends to The Modern .NET Show; the premier .NET podcast, focusing entirely on the knowledge, tools, and frameworks that all .NET developers should have in their toolbox. We are the go-to podcast for .NET developers worldwide, and I am your host: Jamie “GaProgMan” Taylor.

In this episode, which is the final one of season 7, Jonah Andersson joins us to talk all things Azure, the many pathways involved in migrating and modernising .NET applications, and publishing to the cloud.

"So one tool that I actually highly recommend when it comes to .NET, there is a plug-in for Visual Studio, actually, for .NET, and even, I think, with Java. There's a tool called AppCAT plugin, and it's like a modernization tool that is part of the Azure Migrate that allows .NET developers who are ever working in a migration project with .NET, that they can add a plugin in Visual Studio and they can assess their existing source code, .NET source code, based on the well-architected framework, if it's ready or not, or there are gaps in the code."— Jonah Andersson

Along the way, we talk about Jonah's podcast "Extend Women in Tech Podcast" (which I would highly recommend), and her book "Learning Microsoft Azure: Cloud Computing and Development Fundamentals" and why she chose to write it.

Anyway, without further ado, let's sit back, open up a terminal, type in `dotnet new podcast` and we'll dive into the core of Modern .NET.

Supporting the Show

If you find this episode useful in any way, please consider supporting the show by either leaving a review (check our review page for ways to do that), sharing the episode with a friend or colleague, buying the host a coffee, or considering becoming a Patron of the show.

Full Show Notes

The full show notes, including links to some of the things we discussed and a full transcription of this episode, can be found at: https://dotnetcore.show/season-7/learning-azure-with-jonah-andersson-a-developers-guide-to-cloud-computing-and-development-fundamentals/


Remember to rate and review the show on Apple Podcasts, Podchaser, or wherever you find your podcasts, this will help the show's audience grow. Or you can just share the show with a friend.

And don't forget to reach out via our Contact page. We're very interested in your opinion of the show, so please get in touch.

You can support the show by making a monthly donation on the show's Patreon page at: https://www.patreon.com/TheDotNetCorePodcast.

Music created by Mono Memory Music, licensed to RJJ Software for use in The Modern .NET Show





Download audio: https://traffic.libsyn.com/clean/secure/thedotnetcorepodcast/722-Jonah-Andersson.mp3?dest-id=767916