Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

How to Build a Personal Context MCP

1 Share
From: AIDailyBrief
Duration: 20:20
Views: 1,391

Why context is the core bottleneck for agentic AI adoption in enterprises, with data readiness, access, and portability as decisive factors. Presentation of a Personal Context Portfolio: modular markdown files (identity, roles, projects, tools, communication style, domain knowledge, decision log) as a machine-readable, portable context package. Demonstration of practical tooling and deployment patterns, including Context Hub, CLI-based context sharing, MCP server setup, and common troubleshooting lessons.

The AI Daily Brief helps you understand the most important news and discussions in AI.
Subscribe to the podcast version of The AI Daily Brief wherever you listen: https://pod.link/1680633614
Get it ad free at http://patreon.com/aidailybrief
Learn more about the show https://aidailybrief.ai/

Read the whole story
alvinashcraft
14 minutes ago
reply
Pennsylvania, USA
Share this story
Delete

BONUS #NoEstimates, Throughput, and the Superstition of Project Management With Felipe Engineer-Manriquez


BONUS: Why Your Plan Is Lying to You — #NoEstimates, Throughput, and the Superstition of Project Management

This episode is a cross-post from The EBFC Show, Felipe Engineer-Manriquez's podcast exploring Lean and Agile in construction. In this conversation, Felipe interviews Vasco about the #NoEstimates movement, throughput-based planning, and why traditional project management is still stuck in the middle ages of managing creative work.

The Human Side of Scrum That the Scrum Guide Doesn't Cover

"When you go into a daily meeting and you start looking at the people in that room, maybe they are the exact same people that were there yesterday, but the team is totally different. Somebody might have had a bad night's sleep, somebody might have had an argument with their spouse. These are human beings. These are not machines that you can just distribute work to."

 

Vasco's path to agile coaching started with a realization that most practitioners eventually reach: the problems in software development aren't technological. They're about people — getting agreements, sharing information at the right time, making the collective brain of a team actually function. The Scrum Guide gives you organizing principles — how many meetings, who's in them — but it says almost nothing about the real-time feedback cycle between humans that makes or breaks a team. That's why the Scrum Master role exists: to be the lubricant for human interactions, to break down complex ideas into items the collective mind can process. It's the piece that makes Scrum work, and it's the piece that's hardest to teach.

From Project Manager to #NoEstimates — The Bet That Changed Everything

"The PM wanted 15 items per sprint, and the team said 'yeah, we can do 15.' I said, this is not gonna happen. The team had been delivering between five and eight items per sprint. I said, I'm gonna be positive — I'm gonna say seven. And no surprise, by the end of the sprint, they delivered seven."

 

Vasco started as a project manager — and not the easy certification kind. He went through IPMA, which means six months of training, a four-hour written exam, and an expert interview, just for the entry level. Planning and estimating was the job. Then he ran his first Scrum project, specifically to prove it couldn't work. By the second month, he couldn't understand how anything else could work. The team delivered something to show every single sprint — something that never happened with traditional project management. The turning point came when he made a bet with a product manager: the PM needed 15 items per sprint, the team committed to 15, but historical throughput was 5-8 items. Reality delivered seven. That moment crystallized the #NoEstimates insight: we can't fight reality, but we can choose which seven items to deliver.

Reality Is a Bitch — Why Linear Predictive Planning Fails

"Never believe the plan. Or as in Scarface — never get high on your own supply. It's so unbelievable how project managers still today believe their freaking plans."

 

At Nokia, Vasco managed a program of 500 people across 100 teams on four continents. No way to get everyone in a room. So he tracked system-level throughput — features delivered to integration per week. Six months into a twelve-month project, the data said they'd be at least six months late. He told the program manager: cut scope now. The program manager did what every PMI-trained program manager does — sent an email asking all 100 teams if they'd deliver on time. Every single team said yes. Nobody wants to be first to admit they're late. Twelve months in, they discovered they were six months late. The project got canceled. 500 people, millions of euros, all because somebody believed the plan. Linear predictive planning is useful for exploring what might be possible if nothing goes wrong. It is not reality. The only tool that reflects reality is throughput — the number of items completed per unit of time.

Earned Value Management — George Orwell at His Best

"It's not earned, it's spent. It's not value, it's cost. It's not management, it's just observation. Monty Python could not have come up with a better name."

 

Felipe shares a story that mirrors the absurdity: an industrial project with a dedicated 35-person earned value management department. Before the meeting even started, the department head announced, "Let's all acknowledge that earned value management is more an art than a science." Their charts were made up, the contractor's charts were made up, and the goal of the meeting was to agree that the project would finish on time — regardless of what any data said. This is where traditional project management ends up when it disconnects from throughput: a $30 million scope addition with zero additional time, defended by charts that a mediocre attorney can invalidate in the first week of litigation. Felipe knows — he spent a year being cross-examined by forensic schedulers whose full-time job is proving that construction schedules are fiction.

One Small Experiment to Test #NoEstimates

"Never convince anyone. Convince yourself. Once you're convinced, whatever other people say, it doesn't really matter because you're not gonna take them seriously anyway."

 

Here's how to validate throughput-based planning with your own data: take the last 10 sprints (or periods). Calculate the average throughput and control limits from the first five. Then check whether the next five sprints fall within that range. They will. If you're in software and using Jira, you already have this data. You don't need anyone's permission. You don't need to change anything. Just look at what your team actually delivers versus what they planned to deliver. The gap between those two numbers is the gap between superstition and reality.
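The experiment can be sketched in a few lines of Python. The throughput numbers below are made up for illustration, and ±3σ is one common control-chart choice for the limits, not the only one:

```python
from statistics import mean, stdev

# Hypothetical items completed per sprint over the last 10 sprints.
throughput = [6, 8, 5, 7, 7, 6, 8, 5, 7, 6]

# Baseline: first five sprints; holdout: the next five.
baseline, holdout = throughput[:5], throughput[5:]
avg = mean(baseline)
spread = stdev(baseline)
low, high = avg - 3 * spread, avg + 3 * spread  # classic control-chart limits

within = [t for t in holdout if low <= t <= high]
print(f"average {avg:.1f}, control band [{low:.1f}, {high:.1f}]")
print(f"{len(within)} of {len(holdout)} later sprints fall inside the band")
```

With real Jira data, the only change is replacing the `throughput` list with your team's actual completed-item counts per sprint.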

About Felipe Engineer-Manriquez

Felipe Engineer-Manriquez is a best-selling author, international keynote speaker, Project Delivery Services Director at The Boldt Company, host of The EBFC Show podcast, and a proven construction change-maker implementing Lean and Agile practices on projects from millions to billions of dollars worldwide. He is a Registered Scrum Trainer™ (RST), Registered Scrum Master™ (RSM), and recipient of the Lean Construction Institute Chairman's Award. His book Construction Scrum is the first practical guide for applying Scrum in construction.

 

You can link with Felipe Engineer-Manriquez on LinkedIn.





Download audio: https://traffic.libsyn.com/secure/scrummastertoolbox/20260404_Felipe_Engineer_BONUS.mp3?dest-id=246429

Ep. 20 Project Hail Mary + Artemis II's Historic Flight to the Moon

From: itskatehill
Duration: 37:05
Views: 8

In this episode, I explore the strange, beautiful phenomenon of perspective—through science, space, and the quiet awe of being alive on Earth.

I talk about the cultural moment surrounding Project Hail Mary and the Artemis II mission—the first human return to the Moon in over 50 years—and how both point to something deeper than innovation: a reorientation of how we see ourselves in the universe.

At the center of this episode is the observer effect—not just as a scientific concept, but as a lived experience. The idea that the act of observing changes what is observed. That our attention, our awareness, our witnessing…matters.

I reflect on the posts I’ve shared this week on Threads, how they reached thousands of people, and the responses that came back. The shared sense of wonder. The quiet recognition that something about being human, here, now, is far more miraculous than we tend to let ourselves feel.

We move through the words of astronauts—those who have seen Earth from space—and the overwhelming shift in perspective that follows. Including the iconic reflection from Carl Sagan on the “pale blue dot”—a reminder of how small we are, and how precious.

I talk about how our understanding of our place in the cosmos has evolved—from ancient handprints pressed into cave walls, to images of Earth suspended in darkness. From wondering where we are…to finally seeing it.

This episode is about awe. About beauty. About remembering that this planet—this life—is not mundane, but extraordinary.

We explore:

✔ The cultural resonance of Project Hail Mary and why it’s striking a chord right now

✔ Artemis II and the significance of returning to the Moon after decades

✔ The observer effect—scientifically and philosophically

✔ The “overview effect” astronauts experience when seeing Earth from space

✔ Carl Sagan’s pale blue dot, and what it asks us to remember

✔ How perspective shapes meaning—and how we participate in that shaping

✔ The quiet, radical act of letting yourself feel wonder again

This episode is for anyone who has felt, even briefly, the pull of something larger. For those moments when the sky looks different. When the world feels alive. When you remember that you are here—not by accident, but as part of something vast and unfolding.

If you’ve been craving a sense of perspective…of beauty…of meaning that doesn’t need to be forced, this one’s for you.

Thank you for being here. And thank you for listening. 🕯️

The best way to support the podcast is to become a patron of The Folklore Library Substack.

And if you have topics or questions you’d like me to cover, email me at insertwisdom@gmail.com.

The Art of After Workbook: How to Turn Grief into Art (https://itskatehill.gumroad.com/l/theartofafter)

Under the Same Sky by Kate Hill (https://www.amazon.com/dp/B0DJY2DWRD/ref=tmm_pap_swatch_0?_encoding=UTF8&qid=&sr=) 📖

Find me here 👇🏼

Email: insertwisdom@gmail.com

Become a patron of the Folklore Library Substack ✍🏼 (https://insertwisdom.substack.com)

Threads (https://www.threads.net/@itskatehill) ✨

Ambiance Channel ✨ (https://www.youtube.com/@etherandink)

Tiktok (https://www.tiktok.com/@itskatehill?lang=en) ✨

Instagram (https://www.instagram.com/itskatehill/) ✨

Goodreads (https://www.goodreads.com/author/show/52471695.Kate_Hill) ✨

Get full access to The Folklore Library at insertwisdom.substack.com/subscribe (https://insertwisdom.substack.com/subscribe?utm_medium=podcast&utm_campaign=CTA_4)


Microsoft Agent Framework – Part 0


I’ve been looking at a number of different ways to build Agents. I’ve settled on two and will be documenting what I learn as I go:

  • Building from first principles based on my course on Agentics at Johns Hopkins
  • Building using Microsoft’s new Agent Framework

The advantage of the first is that you understand the underlying mechanisms in more depth; the advantage of the second is that a lot of the plumbing is done for you and you become more productive more quickly.

I will do the .NET work in C#, and probably do all the other work in Python. See my blog post on why Python.

I will, to a degree, be documenting what I learn as I learn it, without infringing on copyright, of course.

Project 1 – Jupyter

The work for Johns Hopkins is done in a Jupyter notebook. These are very convenient files that contain runnable cells. You put your code snippet in a cell and run it, either receiving a result or extensive error information. To get started, open Visual Studio Code and click New File. In the drop down on top, choose Jupyter Notebook:

Choose your Python environment, and put code into the first “cell.” Then click the run button to the left of the cell, and the output is displayed:

You can add markdown, and you can even tell it to generate code based on your description of what you want. It is also possible to have these cells generate a proper Python program.
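A first cell can be as simple as the following, a minimal sketch just to confirm the selected Python environment actually runs:

```python
# A first notebook cell: run it to confirm your Python environment works.
import sys

message = f"Hello from Python {sys.version_info.major}.{sys.version_info.minor}"
print(message)
```

If the version printed is not the one you expect, re-check the environment selected in the notebook's kernel picker.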

OpenAI Endpoint

Whether you are working in the Jupyter notebook or in Agent Framework, you will want an OpenAI endpoint. To get one, go to https://platform.openai.com, sign in, go to API Keys, and create a new key. Your endpoint is the base URL used for API calls.

Or… if you are working in the Microsoft ecosystem, search for Azure OpenAI in the Azure Portal and create a resource. Then go to Resource Management -> Keys and Endpoint and copy the Endpoint and Key 1 (your API key). Deploy a model under Resource Management -> Model Deployments. Your endpoint will look like this:

https://<your-resource-name>.openai.azure.com/
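That endpoint pattern can be sketched in Python. This is a minimal illustration, not part of any SDK; the helper function and the resource name "my-resource" are hypothetical:

```python
def azure_openai_endpoint(resource_name: str) -> str:
    """Build the base endpoint URL for an Azure OpenAI resource,
    matching the pattern shown above."""
    return f"https://{resource_name}.openai.azure.com/"

# Hypothetical resource name, for illustration only:
endpoint = azure_openai_endpoint("my-resource")
print(endpoint)  # https://my-resource.openai.azure.com/
```

In recent versions of the official `openai` Python package, this value is what you would pass as `azure_endpoint` when constructing an `AzureOpenAI` client, along with your API key and an API version.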

Copilot can set all this up for you.

Next up: setting up your environment.


How to Become AI-Native: A Practical Guide

Learn how to become AI-native with a step-by-step guide, mindset shifts, and practical strategies to stay ahead in the AI-driven world.

🚀 Convert Anything to Markdown in .NET — Meet ElBruno.MarkItDotNet


⚠ This blog post was created with the help of AI tools. Yes, I used a bit of magic from language models to organize my thoughts and automate the boring parts, but the geeky fun and the 🤖 in C# are 100% mine.

Hi!

You know that feeling when you’re building an AI pipeline or a RAG workflow and you realize: “Wait… I need to turn all these PDFs, Word docs, HTML pages, and random files into something my LLM can actually eat”? 😅

Yeah, me too. That’s exactly why I built:

👉 ElBruno.MarkItDotNet

A .NET library that converts files to clean Markdown. Think of it as the .NET version of Python’s markitdown, but with dependency injection, streaming support, and a plugin architecture. Because we’re C# developers and we like our things that way. 😎


⚡ Getting Started

Install the NuGet package:

dotnet add package ElBruno.MarkItDotNet

And then… this is all you need:

using ElBruno.MarkItDotNet;

var converter = new MarkdownConverter();
var markdown = converter.ConvertToMarkdown("document.pdf");
Console.WriteLine(markdown);

That’s it. PDF → Markdown. Done. ✅


📂 What Can It Convert?

Here’s where it gets fun. The core package supports 12 file formats out of the box:

  • 📄 Plain text (.txt, .log, .md)
  • 📋 JSON — pretty-printed and fenced
  • 🌐 HTML / HTM — strips tags, keeps content
  • 🔗 URLs — fetches and converts web pages
  • 📝 Word DOCX — headings, tables, links, images, footnotes
  • 📕 PDF — word-level extraction with heading detection
  • 📊 CSV / TSV — clean Markdown tables
  • 📦 XML — structured fenced blocks
  • ⚙ YAML / YML — fenced code blocks
  • 📰 RTF — rich text to Markdown
  • 📚 EPUB — ebooks to Markdown
  • 🖼 Images — .jpg, .png, .gif, .bmp, .webp, .svg

And with the satellite packages, you get even more:

  • ElBruno.MarkItDotNet.Excel — .xlsx spreadsheets → Markdown tables
  • ElBruno.MarkItDotNet.PowerPoint — .pptx slides → Markdown with notes
  • ElBruno.MarkItDotNet.AI — AI-powered OCR, image captioning, audio transcription
  • ElBruno.MarkItDotNet.Whisper — Local audio transcription with Whisper (no API key!)

🧠 Stream It — Because Large Files Are Real

One of the most requested features was a streaming API. When you’re processing a 500-page PDF, you don’t want to wait for the entire thing to load in memory. So:

using var stream = File.OpenRead("huge-document.pdf");

await foreach (var chunk in converter.ConvertStreamingAsync(stream, ".pdf"))
{
    Console.Write(chunk); // chunks arrive as they're processed
}

This uses IAsyncEnumerable<string> — so it plays nicely with your async pipelines, web APIs, and real-time UIs.

To be honest, I had never faced this scenario before, but it really makes sense.


💉 Dependency Injection? Of Course

If you’re building a real app (not just a console demo), you’ll want the DI registration:

// Program.cs or Startup
services.AddMarkItDotNet();          // core converters
services.AddMarkItDotNetExcel();     // Excel support
services.AddMarkItDotNetPowerPoint(); // PowerPoint support
services.AddMarkItDotNetWhisper();   // local audio transcription

Then inject IMarkdownService wherever you need it:

public class MyDocProcessor
{
    private readonly IMarkdownService _markdownService;

    public MyDocProcessor(IMarkdownService markdownService)
    {
        _markdownService = markdownService;
    }

    public async Task<string> ProcessAsync(Stream fileStream, string extension)
    {
        var result = await _markdownService.ConvertAsync(fileStream, extension);
        return result.Markdown;
    }
}

🤖 AI-Powered Conversions

This is where things get really interesting. And thanks to Copilot CLI for suggesting this 👇

The ElBruno.MarkItDotNet.AI package uses Microsoft.Extensions.AI and an IChatClient to power:

  • 🖼 Image OCR & captioning — describe what’s in an image
  • 📕 Scanned PDF enhancement — detects low-text pages and uses AI to extract content
  • 🎙 Audio transcription — turn audio files into Markdown

services.AddMarkItDotNetAI(options =>
{
    options.ImagePrompt = "Describe this image in detail";
    options.AudioPrompt = "Transcribe this audio";
});

Works with OpenAI, Azure OpenAI, or any IChatClient implementation. Your choice.

And if you want local audio transcription with zero cloud dependency? There’s ElBruno.MarkItDotNet.Whisper for that.


🔗 URL to Markdown

One more thing, which my friend Hector suggested: converting web pages.

var service = new MarkdownService(registry);
var result = await service.ConvertUrlAsync("https://example.com");
Console.WriteLine(result.Markdown);

Super handy for web scraping, research pipelines, or just saving articles as Markdown.


🔌 Build Your Own Converters

Don’t see your format? No problem. Implement IMarkdownConverter and plug it in:

public class MyCustomConverter : IMarkdownConverter
{
    public string[] SupportedExtensions => [".custom"];

    public Task<ConversionResult> ConvertAsync(Stream stream, string extension)
    {
        // Your conversion logic here. As a sketch (assuming ConversionResult
        // exposes a settable Markdown property, as used elsewhere in this post):
        using var reader = new StreamReader(stream);
        var markdown = reader.ReadToEnd();
        return Task.FromResult(new ConversionResult { Markdown = markdown });
    }
}

Or bundle multiple converters into a plugin with IConverterPlugin. The architecture is designed to be extended.


🎮 18 Sample Apps

Yes, 18 samples. I went a bit overboard 😅 (not me, Copilot, you know what I mean):

  • BasicConversion — text, JSON, HTML
  • PdfConversion — PDF + streaming
  • DocxConversion — Word documents
  • ExcelConversion — spreadsheets
  • PowerPointConversion — slides
  • AiImageDescription — AI image analysis
  • WhisperTranscription — local audio
  • MarkItDotNet.WebApi — minimal API with uploads + SSE
  • BatchProcessor — folder batch conversion
  • RagPipeline — RAG ingestion pipeline
  • …and more!

💡 Final Thoughts

This project started because I needed a clean, extensible way to convert files to Markdown in .NET — especially for AI workflows. Python had markitdown, but .NET didn’t have a good equivalent. So I built a few pet projects that sat in my personal toolbox for a while.

Then someone asked a question, and I put a squad together to package everything up.

Currently supports 15+ file formats, has streaming APIs, plays nice with dependency injection, and can even use AI for OCR and transcription. Plus, it’s open source and ready for your PRs. 🚀

👉 NuGet: ElBruno.MarkItDotNet 

👉 Repo: https://github.com/elbruno/ElBruno.MarkItDotNet

If you try it, let me know what you build! 🙌

Happy coding!

Greetings

El Bruno

More posts in my blog ElBruno.com.

More info in https://beacons.ai/elbruno



