Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

These fifth graders vibe coded a real-world Braille tool — and wowed their Microsoft teacher

1 Share
Fifth graders who worked on the Braille 3D Generator at Global Idea School in Redmond, Wash., from left: Valentin, Grayson, Ella, Hunter and Julian. (Photo courtesy of Juan Lavista Ferres)

As the head of Microsoft’s AI for Good Lab, Juan Lavista Ferres and his researchers can spend months building real-world AI solutions. The fifth graders he teaches built an accessibility tool in their classroom.

The students in a computer science class at the Global Idea School, an independent, non-profit elementary school in Redmond, Wash., learned vibe coding through GitHub Spark and built a Braille 3D Generator, a tool that turns text into printable, tactile 3D Braille models in seconds. 

Juan Lavista Ferres, Microsoft corporate VP and director of the AI for Good Lab. (LinkedIn Photo)

“We live in an amazing time,” said Lavista Ferres, a 17-year Microsoft vet who has taught at the school, co-founded by his wife, for seven years. “The fact that a 10-year-old can do it in a class without any training? That thing is an actual working solution.”

Six students worked on the Braille 3D Generator. They were inspired by the idea of creating signage to help blind or low-vision people navigate their school and find classrooms.

The group is the youngest to enter the “AI for a Better World” competition, a national initiative in collaboration with MIT that invites students in grades 6-12 to explore how artificial intelligence can improve their communities and the broader world.

The students interviewed Anne Taylor, principal program manager for Microsoft Accessibility and an expert in Braille embossers that convert digital text into raised Braille text on paper. Taylor was able to provide feedback and help the students fine-tune their solution so that it was useful for someone who is blind.

The students also visited Microsoft’s Inclusive Tech Lab where they saw how people interact with specialized computer keyboards, game controllers and more.

“I think it would be very good to help people with disabilities,” said Grayson, 10, one of the students in the class. “We’re trying to help the people who can’t see with this Braille project to make it more affordable, so they can tell areas easier — because it would be cheaper for areas to have Braille instead of having to go through a really expensive process.”

An example shows the word “classroom” translated to Braille in the Braille 3D Generator, with a model at right that’s ready to be exported and 3D printed. (Image via Braille 3D Generator)

For the students, the process felt unlike anything they had done before in class, where they had previously used block-based coding tools like Code.org.

“Instead of having to type the code, we could just say English to the AI and it would make this whole app,” Grayson said.

Vibe coding is a style of software development in which the programmer describes what they want in plain English and lets AI generate the underlying code. GitHub Spark, a tool from Microsoft-owned GitHub, takes that approach and lets users build and deploy web applications through natural language prompts alone — no coding experience required.

What surprised even Lavista Ferres was the leap from browser-based app to physical object. GitHub Spark typically generates React code for web applications, and he didn’t realize it could produce 3D models until the students tried it. Some of their early attempts didn’t work, but they kept experimenting.

“When I saw the output, I was like, ‘wow,'” he said. “I’ve been vibe coding for some time now. I wasn’t aware that we could do this.”
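For a sense of what a tool like this has to do under the hood, here is a minimal Python sketch of the text-to-Braille-dots step. It is purely illustrative: the students built theirs with GitHub Spark, and the spacing constants below are placeholders rather than real signage measurements.

```python
# Illustrative sketch of the text -> Braille-dots step behind a tool like the
# Braille 3D Generator (NOT the students' actual GitHub Spark code).
# Standard 6-dot cells: dots 1-2-3 down the left column, 4-5-6 down the right.

BRAILLE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5}, "k": {1, 3}, "l": {1, 2, 3}, "m": {1, 3, 4},
    "n": {1, 3, 4, 5}, "o": {1, 3, 5}, "p": {1, 2, 3, 4},
    "q": {1, 2, 3, 4, 5}, "r": {1, 2, 3, 5}, "s": {2, 3, 4},
    "t": {2, 3, 4, 5}, "u": {1, 3, 6}, "v": {1, 2, 3, 6},
    "w": {2, 4, 5, 6}, "x": {1, 3, 4, 6}, "y": {1, 3, 4, 5, 6},
    "z": {1, 3, 5, 6},
}

# Placeholder cell geometry in millimetres; real signage follows ADA specs.
DOT_SPACING, CELL_SPACING = 2.5, 6.0

def dot_positions(text):
    """Return (x, y) centres for every raised dot in `text`, ready for a
    3D-modelling step to extrude into bumps on a base plate."""
    positions = []
    for i, ch in enumerate(text.lower()):
        for dot in BRAILLE.get(ch, set()):
            col, row = divmod(dot - 1, 3)   # dots 1-3 left column, 4-6 right
            x = i * CELL_SPACING + col * DOT_SPACING
            y = -row * DOT_SPACING          # row 0 at the top of the cell
            positions.append((x, y))
    return positions

print(len(dot_positions("class")))  # 12 raised dots: c=2, l=3, a=1, s=3, s=3
```

A 3D-modelling step would then place a small dome at each returned position to produce the printable model.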

Lavista Ferres started at Microsoft as a data scientist in 2009 and became a lab director 10 years later. The AI for Good Lab operates as part of Microsoft Philanthropies, separate from the company’s product groups. Last year the lab launched an AI for Good Open Call to support projects in public health, education, sustainability, and humanitarian action.

Lavista Ferres said the kids in his class could be future Microsoft colleagues, because they’re creating real-world applied solutions that work.

“This is a new world,” he said. “I show it to all my team and say, ‘Guys, if these kids can do this, you guys can be much more productive. We need to start using this technology more and more.'”

Read the whole story
alvinashcraft
45 minutes ago
reply
Pennsylvania, USA
Share this story
Delete

Opinion: Make Democracy capitalist again

Washington state’s Legislative Building, which houses the Legislature. (GeekWire Photo / Brent Roraback)

Longtime Seattle investor and entrepreneur Chris DeVore is managing partner of Founders’ Co-op.

I have a confession to make. I’m a Democrat. And a capitalist. Both, at the same time.

This didn’t use to be a position that needed defending. But over the course of my adult life these two ideas have moved farther and farther apart. The bond is now at the breaking point, and if it snaps, the party I grew up in will abdicate its once-legitimate claim to the best of the American idea.

The belief in free markets is actually shared by the vast majority of Americans, and while it may anger the populist fringe, embracing capitalism would be a rallying cry to centrists from both parties who despair for our future and are hungry for a message that makes sense.

Today, the party that has labored to defend and perfect the American experiment — with opportunity, justice and equal treatment under the law for all — has either lost its mind, or its memory, for the motive force that makes those ideals possible.

Take away the promise of a better life (immigration), the means to achieve it (capitalism), and the certainty that the fruits of your labor won’t be arbitrarily confiscated (rule of law), and the engine that has made America the richest, most powerful and most admired country in the world grinds to a halt, and the whole grand experiment comes to an end.

One can acknowledge all the historical errors that mar the American project — the displacement and murder of indigenous people, slavery and Jim Crow, the creeping capture of government by corporations, rich people, old people, the list goes on — and not lose sight of the three essential ingredients that make our strange and complicated country possible: capitalism, the rule of law, and a welcome embrace of all who wish to make America their home.

But if you listen to Democrats at both the state and national level today, capitalism is the enemy. Billionaires and their current avatars, AI and data centers, have become the bogeymen that electeds and party leaders invoke to stir outrage in the base. 

What’s offered as an alternative isn’t economically coherent (“tax the rich,” when the top 10% of earners already pay ~75% of all federal income taxes; “ban data centers,” industrial-scale NIMBYism that simply pushes development elsewhere), but the message behind the slogans is clear: American prosperity is not something to be conserved, much less promoted; it is a natural resource that we somehow lucked into and can harvest at will, an overflowing fountain of wealth that will never run dry.

How did we get here? How has capitalism, the incontrovertible powerplant of democracy, become anathema to the Democratic party? 

Today’s apparent loss of faith is actually rooted in capitalism’s undefeated record of success, coupled with the fitful but now accelerating failure of our democratic machinery.

It’s strange that the centrality of capitalism to our national project requires explanation, but that’s actually the best evidence of its truth: we have been so rich for so long, so embarrassed with our abundance of material and experiential choices, that we have come to take it for granted. We blithely assume that the neighborhood business owners and global corporations that make abundance possible, depositing bi-weekly paychecks in the bank accounts of their millions of workers and filling store shelves with the bewildering array of goods and services we enjoy every day, have simply always been there, will always be there, like the air we breathe. 

This is a tragic mistake.

I have made a career, or more truly, I have found a calling, in supporting entrepreneurs from their moment of inception. Every business that exists, from the most humble corner cafe up to and including General Motors and Amazon, only does so because a small number of unreasonable people overcame extraordinary obstacles over many years to create something from nothing. 

Every paying job, every charitable gift, every nickel of tax revenue that finances the safe and convenient world we all enjoy, springs from that improbable act of creation. The machinery of capitalism works so well, allowing one person’s vision to be transformed into millions of jobs and billions of dollars of tax revenue, that we have simply forgotten how extraordinary it is, how dramatic a break it represents from thousands of years of autocracy, feudalism, injustice and inequality.

The engine of capitalism is so efficient that it also conceals the deepest truth of all organic systems: companies, just like people, are born, live a short time, and then decline and die. This is hidden by the irrepressible generative energy of well-regulated self-interest: new companies arise to fill the gaps and address the shortcomings of current incumbents, fueling an endlessly diverse and creative process of regeneration. Every company that falters is replaced by two more, eager to serve the customers no longer satisfied by the prior wave’s lackluster efforts. 

To paint a picture of this cycle of renewal: of the top 100 most valuable companies in America today, 15 were founded in just the past 10 years, 30 didn’t exist 25 years ago, 45 didn’t exist 50 years ago, and fewer than a third (30 of 100) have been around for 100 years or more. Great companies can seem like they’ve always been here, but in fact they are dying and being born every day. New companies have to come from somewhere, and that somewhere is the solar energy of the capitalist biosphere: entrepreneurship.

If capitalism, and its essential generative act of entrepreneurship, are so great, how could we possibly have turned against them? 

The answer is both democracy’s greatest failure, and its most obvious path to redemption.

For at least the past century, Democrats and Republicans have divided themselves by their views on the role of the state. Democrats see government as an essential partner in the national project: providing critical infrastructure like roads and airports, securing the national defense, providing basic education and health services, and ensuring that the rule of law is applied fairly and equally, both to the companies that help our economy thrive as well as to its individual citizens. Republicans share many of these same views, but where Democrats push for more, Republicans have generally wanted less: lower taxes, fewer regulations, and a generally less-generous redistribution of national income to those lower on the economic ladder.

But to obtain the levers of power needed to advance their respective goals, both parties have relied on the obvious carrot of legislative giveaways to secure blocs of electoral support: farmers, labor unions, business owners, real estate developers, the list is as endless and varied as the economy itself. The result is a regulatory and tax system so stuffed with incentives, tax breaks and special protections that any citizen, even and especially those favored by one set of legislated advantages, can point to those in another group and cry “unfair!”, “undemocratic!”, “corrupt!”

It is this general stench of favoritism and corruption, slowly accreted over 250 years of electoral back-scratching on both sides of the aisle, that has brought us to our present crisis. Each party is so captured by its crazy quilt of protected electoral blocs and aggrieved parties, and so credibly able to point to the injustices perpetrated by the other side, that it becomes plausible to question the entire free-market edifice. 

Great wealth now has the taint of theft, with no fine distinctions between entrepreneurial success and a systematic looting of the Treasury.

Things tend to continue as they began. So the most likely, and most depressing, scenario is that we are witnessing the final throes of the American idea. Two centuries of bipartisan regulatory capture have so encrusted our legislative and fiscal infrastructure that equal treatment under the law is now a bitter punchline, not the proud aspiration that once bound us together as a nation. Each party is now so fully captive to its donor base, its electoral security purchased with gifts of regulatory legerdemain and dollars siphoned from public coffers, that there is precious little oxygen left for the promises on which the nation was built.

But to use this bipartisan failure of democracy to make a villain of capitalism, to paint as enemies of the state the few founders who have reaped extraordinary gains from their entrepreneurial ventures, when the vast majority are lucky to keep their employees paid and the lights of their modest establishments lit, is to eat out the very heart of the American project. 

This is already playing out in miniature at the state level. Traditionally Democratic states like Washington, Oregon and California are pursuing confiscatory tax policies that villainize entrepreneurial wealth. The net result is not the hoped-for increase in state tax revenue, but a highly visible and accelerating flight of entrepreneurial wealth and energy to more capitalist-friendly domiciles like Florida, Texas and Wyoming.

This is not to argue that the unexampled boon of living in a society where one can both earn and keep great wealth does not come with serious civic obligations. By all means use regulation to ensure fair and safe business operations and prevent abuse. Levy the taxes necessary to nurture our remarkable civic infrastructure, allowing entrepreneurs to build new companies from scratch without fear of expropriation, whether by criminals or the state itself. Unquestionably demand that corporations be positive civic actors, as if they were citizens themselves, with all the rights and obligations that entails.

But as a lifelong Democrat, and a passionate believer in the fundamental goodness of the American idea, I have one simple request for the party I still believe is most likely to carry our national experiment forward: recognize capitalist entrepreneurship as the motive force that has made our extraordinary success possible, and restore capitalism as one of the central pillars of our national promise. 

By continuing to take our unprecedented prosperity for granted, you misunderstand both its source and its chances of survival. Worse yet, by demonizing the engine of our shared prosperity, you are sowing the seeds of our collective destruction. 

Stop now, before it’s too late.

How exposed is your code? Find out in minutes—for free

Most security leaders share the same suspicion: there are vulnerabilities in our codebase that we don’t know about.

The uncomfortable truth is that most code never gets a thorough security review. Vulnerabilities accumulate quietly in active repositories, across languages and teams, often undetected until something goes wrong. And if you’re relying on manual reviews or narrowly scoped tools, the gaps may be wider than you think.

Today, we’re introducing the Code Security Risk Assessment: a free, one-click scan that reveals vulnerabilities hiding in your organization’s code. No license required. No configuration. No commitment. Just clarity.

The Code Security Risk Assessment is available to GitHub organization admins and security managers. If that’s not you, this post is still worth reading and sharing: it explains what the assessment reveals and why it’s worth running.

Run your free assessment >

What you’ll learn

The Code Security Risk Assessment scans up to 20 of your most active repositories using CodeQL, GitHub’s industry-leading static analysis engine, and delivers a dashboard summarizing what it finds:

  • Total vulnerabilities found across your scanned repositories, broken down by severity: critical, high, medium, and low
  • Vulnerabilities by language, so you can see which parts of your codebase carry the most risk
  • Rules detected, showing the specific classes of security issues found, how many repositories they affect, and their severity
  • Most vulnerable repositories, helping you identify where to focus remediation first
  • Copilot Autofix eligibility — how many of your vulnerabilities could be automatically fixed with Copilot Autofix, GitHub’s AI-powered remediation tool

The assessment is available to organization admins and security managers on GitHub Enterprise Cloud and GitHub Team plans. It’s completely free — you won’t be charged for any licenses, and the GitHub Actions minutes used for scanning don’t count against your quota.

Completing the security picture

If you’ve already run a Secret Risk Assessment, you know the value of visibility. Since launching last year, the Secret Risk Assessment has helped thousands of organizations understand their exposure to leaked credentials. In 2025 alone, customers using Secret Protection scanned nearly 2 billion pushes and blocked 19 million secret exposures.

The Code Security Risk Assessment brings that same philosophy to vulnerabilities in your source code. Both assessments now run together from a single entry point, with a tabbed interface that lets you switch between your secret exposure and your code vulnerability findings. Together, they give you a unified view of your organization’s security posture—secrets and code—in minutes.

Even if you’re not responsible for running security scans yourself, the results of these assessments can help your team align on where risk exists and what to fix first.

And when you’re ready to act on what you find, each assessment has a corresponding GitHub product designed to help. Secret Protection stops credentials from leaking. Code Security finds and fixes vulnerabilities. The assessments show you why you need them.

From found to fixed

Knowing where your vulnerabilities are is the first step. Fixing them is what actually reduces risk.

That’s where GitHub Code Security and Copilot Autofix change the equation. Across GitHub in 2025:

  • 460,258 security alerts were fixed using Copilot Autofix
  • 50% of vulnerability alerts were resolved directly in pull requests — where developers are already working
  • Mean time to remediation was nearly twice as fast with Copilot Autofix (0.66 hours) compared to manual fixes (1.29 hours)

Your Code Security Risk Assessment results will show you how many of your detected vulnerabilities are eligible for Copilot Autofix — giving you a concrete picture of how quickly you could start reducing risk. When you’re ready, you can enable Code Security directly from the results page with a single click.

Find what you’ve been missing

Whether you have no security scanning in place, you’re evaluating your current tools, or you want a broader view of risk across your organization — the Code Security Risk Assessment meets you where you are.

It’s free. It takes minutes. And what you learn might change how you think about your security posture.

Run your free Code Security Risk Assessment, or read the docs to learn more.

The post How exposed is your code? Find out in minutes—for free appeared first on The GitHub Blog.

Three ways to connect an AI agent to your business data

There are several ways to connect an AI agent to your data. Each approach solves a different problem and comes with its own trade-offs; what you’re really choosing between is speed and reliability. In this post, we’ll explore how to balance the two to get the best results for your team and use case.

Today, many teams are looking to use AI agents to interact with their data through natural, conversational interfaces. Instead of writing queries or building dashboards, users can ask questions like “How did my week-2 conversion change compared to last week?” and get immediate, contextual answers.

Giving direct access: The faster but more fragile approach

The simplest setup is to connect an agent directly to your database. You give it access to your schema, provide it with some documentation, and let it write SQL on the fly. You can ask it a question in plain English, and it will generate a query or use an MCP server to give you an answer.

With this setup, you can be up and running in a few hours.

But the answers you receive are only as good as the agent’s interpretation of the data. It won’t really know your metrics and their specific definitions, so it will have to make an educated guess. Sometimes it will guess well, but often it won’t, and the numbers may look reasonable either way. You can mitigate this by providing additional context via text files or tools like a Confluence MCP server, but without a single source of truth and predefined guardrails, there’s no guarantee that the agent will generate the right queries.

This is why these setups are typically limited to data teams, since someone will always need to double-check the results.
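To make that failure mode concrete, here is a small self-contained sketch with a hypothetical table and numbers: two queries that are both plausible readings of “revenue” return different answers from the same data.

```python
import sqlite3

# Hypothetical orders table. Two equally plausible readings of "revenue"
# give different numbers, which is exactly the silent failure mode of
# letting an agent guess metric definitions from the schema alone.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (amount REAL, refunded INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(100.0, 0), (250.0, 0), (80.0, 1)])

# One agent's guess: revenue is the sum of all order amounts.
gross = con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
# Another, equally reasonable guess: exclude refunded orders.
net = con.execute(
    "SELECT SUM(amount) FROM orders WHERE refunded = 0").fetchone()[0]

print(gross)  # 430.0
print(net)    # 350.0
```

Both numbers look reasonable on a dashboard; only the metric definition tells you which one is right.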

Building a formal semantic layer: The reliable, but slow to build approach

A more structured approach is to define a semantic layer. You model your metrics in tools like dbt or Cube. The agent no longer queries raw tables; it now works with predefined metrics. With query engines, SQL generation becomes more reliable because it follows predefined logic and metric definitions instead of making assumptions.

This alone improves many aspects of performance, since once the logic is encoded, answers become consistent, and the agent is no longer second-guessing what “revenue”, “churn”, or “subscriber growth” mean.

But in this case, the trade-off is time.

Building a proper semantic layer takes months. Every metric must be defined, reviewed by your data team, and maintained. As your business grows and your data evolves, new logic and metrics will inevitably be needed. A large part of your work will now be focused on maintaining and keeping your semantic layer up to date.

This approach yields more reliable answers because queries are based on predefined metrics. However, it requires ongoing effort to build and maintain the layer. Over time, data teams spend less time answering questions directly and more time maintaining these definitions.

The agent can’t do this work on its own, as it relies on humans updating the semantic layer to ensure it can answer questions consistently.
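In miniature, the guardrail a semantic layer provides can be sketched like this (the metric names and SQL below are illustrative, not actual dbt or Cube syntax): the agent selects a vetted metric by name instead of writing raw SQL.

```python
import sqlite3

# A toy "semantic layer": the agent may only pick a vetted metric by name.
# Metric names and SQL are illustrative placeholders.
METRICS = {
    "net_revenue": "SELECT SUM(amount) FROM orders WHERE refunded = 0",
    "order_count": "SELECT COUNT(*) FROM orders",
}

def answer(con, metric_name):
    """Run a predefined metric; unknown metrics fail loudly for review."""
    if metric_name not in METRICS:
        raise KeyError(f"unknown metric: {metric_name}; propose it for review")
    return con.execute(METRICS[metric_name]).fetchone()[0]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (amount REAL, refunded INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(100.0, 0), (250.0, 0), (80.0, 1)])

print(answer(con, "net_revenue"))  # 350.0, the same answer every time
```

An unknown metric name fails loudly instead of producing a silently wrong number, which is the trade in a nutshell: consistency in exchange for upfront modelling work.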

Building an automated semantic layer: Avoiding trade-offs

A third approach is to build an automated semantic layer that can learn and maintain itself as usage grows.

Instead of defining every metric upfront, the system builds the layer from your existing dbt projects and data sources. Then, every time someone asks a question, new metrics are created on the fly in that semantic layer, generating PRs in Git that data teams can review and approve.

This way, the layer is generated from your existing data, helping you avoid the usual cold-start problem and the need to define all your metrics before anyone can use the system.

As questions come in, the agent proposes new metrics, which are in turn reviewed by your data team. This keeps the system aligned with real usage, while ensuring that definitions stay consistent and trusted. Business users can interact with the data earlier, and the metrics evolve based on actual needs.

Databao is built around this principle, bringing together automated semantic layer generation and human-in-the-loop validation so teams can scale usage without sacrificing consistency or trust.

What actually matters

At a glance, all three approaches address a similar need: enabling business users to ask their data questions in natural language.

But the hard part is not generating answers – it’s making sure those answers can be trusted. Silent errors, when numbers look right but are based on the wrong definition, are the hardest to catch and can often prove the most damaging. That’s why the structure behind the agent matters more than the interface itself.

If we sum it all up: 

  • Direct access is fast, but it requires constant verification.
  • A formal semantic layer is reliable but slow to build.
  • An automated layer tries to strike a balance between both, with built-in review processes that ensure the agent has context, without placing all the burden of providing it on humans.

There isn’t a perfect option, and the right choice depends on where you are in your data journey.

If you’re exploring, speed might matter more. But if you’re scaling usage across a company, consistency becomes critical.

About Databao

If you’d like to try enabling self-service analytics through an automated semantic layer, you can integrate Databao into your workflow and join us in building a proof of concept together. We’ll work with you to understand your use case, define a context-building process, and give the agent access to a select group of business users. Together, we’ll evaluate the quality of the responses and your overall satisfaction with the results.

TALK TO THE TEAM

Under the Hood of USPS: Automation, Innovation and AI

What tech goes into powering a modern delivery network? Go beyond the iconic mail truck and into the world of technology-driven logistics and robotics at the postal service. Join the Mailin’ It! team as USPS’s VP of Applied Engineering Lina Malone pulls back the curtain on the organization’s massive tech innovations – from colossal, automated sorting hubs to a digital backbone that can predict failures before they happen and pinpoint delivery window times.  USPS is building the speed and intelligence needed for mail and package delivery in the 21st century.


Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

Download audio: https://afp-920619-injected.calisto.simplecastaudio.com/f32cca5f-79ec-4392-8613-6b30c923629b/episodes/3eadaa7b-2193-434b-b9ee-97b36481614b/audio/128/default.mp3?aid=rss_feed&awCollectionId=f32cca5f-79ec-4392-8613-6b30c923629b&awEpisodeId=3eadaa7b-2193-434b-b9ee-97b36481614b&feed=bArttHdR
Fast Focus: Caching Options in .NET | Visual Studio Live Las Vegas 2026

From: VisualStudio
Duration: 16:47
Views: 19

Want to make your .NET applications faster without overcomplicating your architecture? In this @VisualStudioLive session from Visual Studio Live! Las Vegas 2026, Jason Bock walks through practical caching strategies in .NET—from simple in-memory caching to distributed and hybrid approaches.

Through real code examples, you’ll learn how to improve performance, reduce repeated data access, and balance speed with memory usage using built-in .NET caching APIs.

🔑 What You’ll Learn
• Why caching is essential for improving application performance and user experience
• How to use in-memory caching with IMemoryCache in .NET
• Managing cache expiration, eviction policies, and memory pressure
• When to use distributed caching in modern, scalable applications
• How Hybrid Cache (new in .NET 9) simplifies local + distributed caching
• Understanding cache stampede protection and how it improves efficiency
• Real-world examples of caching static and frequently accessed data
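The session covers .NET APIs such as IMemoryCache and HybridCache, but the stampede-protection idea in the list above is language-agnostic. Here is a minimal Python sketch of it, with a toy lookup function standing in for a real data source:

```python
import threading
import time

# Sketch of cache stampede protection, the idea HybridCache implements:
# when many callers miss the same key at once, only one recomputes the
# value while the rest wait for the result.
_cache, _locks, _guard = {}, {}, threading.Lock()
calls = 0

def expensive_lookup(key):
    """Toy stand-in for a slow database or service call."""
    global calls
    calls += 1               # count trips to the "database"
    time.sleep(0.05)
    return key.upper()

def get(key):
    if key in _cache:        # fast path: already cached
        return _cache[key]
    with _guard:             # one lock object per key
        lock = _locks.setdefault(key, threading.Lock())
    with lock:               # first caller computes; the rest wait
        if key not in _cache:
            _cache[key] = expensive_lookup(key)
        return _cache[key]

# Eight concurrent misses on the same key trigger a single recomputation.
threads = [threading.Thread(target=get, args=("users",)) for _ in range(8)]
for t in threads: t.start()
for t in threads: t.join()
print(calls)  # 1
```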

⏱️ Chapters
00:00 Introduction + why caching matters
02:22 Performance tradeoffs: speed vs memory
04:00 Demo: retrieving data without caching
06:00 Using local (in-memory) caching
09:00 Cache expiration and eviction strategies
12:47 Distributed caching + Hybrid Cache
15:00 Stampede protection and optimization

👤 Speaker
Jason Bock (@jasonbock)
Staff Software Engineer, Rocket Mortgage

🔗 Links
• Download Visual Studio 2026: http://visualstudio.com/download
• Explore more VS Live! Las Vegas sessions: https://aka.ms/VSLiveLV26
• Join upcoming VS Live! events: https://aka.ms/VSLiveEvents

#dotnet #visualstudio #visualstudio2026 #VSLive
