Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

WinUI 3 Performance: A Leap Forward


Hello WinUI Community!

Our mission is to make WinUI 3 the best native UI platform for Windows experiences and apps, and performance is at the heart of that effort. Moving from WinUI 2 to WinUI 3 should always be a clear win for performance, and apps should get great results without heavy lifting.

Why Now

Pavan recently shared a blog post that mentioned "more fluid and responsive app interactions: Reducing interaction latency by moving core Windows experiences to the WinUI3 framework" as part of our quality commitment. Making this a reality means delivering performance improvements at multiple levels, including within WinUI itself. This also reinforces our strategic commitment to WinUI as the native framework going forward. We know that performance is just one piece of the puzzle and that there are many other areas that deserve our continued attention. Rest assured, we remain focused on those as well.

Where We're Focused

We've been zeroing in on launch time, using File Explorer and Notepad as our primary benchmarks, with an emphasis on improvements that broadly benefit most apps.

The Results So Far

Here's what we're seeing for the WinUI portion of File Explorer launch:

Metric                      Improvement
Allocations                 41% fewer
Transient allocations       63% fewer
Function calls              45% fewer
Time spent in WinUI code    25% reduction

When Can You Expect These Changes?

These improvements will move out of the development branch soon, and you will see them showing up in the winui3/main branch. We'll also be bringing these changes into WinAppSDK 2.x where possible, though some changes may be too risky or complex to deliver as servicing updates.

A Note on Breaking Changes

Some optimizations involve small or large breaking changes and will require apps to opt in. For example, we're optimizing default control styles, which should work fine for most apps but could cause issues for apps that:

  • Expect to find a specific container element in a control template
  • Rely on a property being set via an animation rather than a Setter

Each app can determine which of these changes to opt in to. Over time, perhaps as early as 3.0 or potentially in 4.0+, many of these will switch to opt-out, enabling the best possible performance by default.


Stay tuned for more updates, and thank you for being part of the WinUI community!


PowerShell Newbie – Variables


This is Week 2 of PowerShell Strikes Back – a four-week May series for SQL Server DBAs who have dabbled in PowerShell but never stopped to nail down the fundamentals. If you missed Week 1 on single vs double quotes, start there first – it’ll make this post land better.


Last week, we learned that quotes are not created equal. This week, we’re going deeper into the building blocks that make PowerShell scripts actually useful: variables.

If quotes are your lightsaber, variables are the Force itself. They carry information from one part of your script to another. They make the difference between a script that works on one server and a script that works on all of them. Get comfortable with variables, and you’ll look back at your pre-PowerShell DBA life the way Luke looked back at Tatooine – relieved to be moving on.


The Basics – Declaring and Assigning Variables

In PowerShell, every variable starts with a dollar sign ($). You don’t need to declare a variable before using it – you just assign it a value and PowerShell figures out the rest.

# Assigning basic variables
$ServerName   = 'SQL-PROD-01'
$DatabaseName = 'WideWorldImporters'
$MaxConnections = 100
$IsProduction   = $true

Write-Host "Server: $ServerName"
Write-Host "Database: $DatabaseName"
Write-Host "Max Connections: $MaxConnections"
Write-Host "Production: $IsProduction"

Notice there are no data type declarations here. PowerShell infers the type from what you assign: strings get quotes, integers don't, and booleans use the built-in $true and $false values. For most DBA scripting, this works perfectly fine.
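If you want to confirm what PowerShell inferred, you can call .GetType() on a variable. A quick sketch, reusing the variable names from the example above:

```powershell
$ServerName     = 'SQL-PROD-01'
$MaxConnections = 100
$IsProduction   = $true

$ServerName.GetType().Name      # String
$MaxConnections.GetType().Name  # Int32
$IsProduction.GetType().Name    # Boolean
```

This is handy when a script misbehaves and you suspect a value isn't the type you assumed it was.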


Why Loose Typing Can Strike Back

PowerShell’s automatic type inference is convenient until it isn’t. Consider this:

$Port = '1433'
$NewPort = $Port + 1
Write-Host $NewPort
# Output: 14331
# Not 1434. PowerShell treated $Port as a string and concatenated instead of adding.

The Empire strikes back through your own assumptions. If you need a variable to behave as a specific type, declare it explicitly by putting the type in brackets before the variable name:

[int]$Port    = '1433'    # Now it's an integer
[string]$SPID = 57        # Now it's a string
[bool]$IsAG   = $true     # Explicitly boolean
[datetime]$BackupTime = '2026-05-04 02:00:00'

$NewPort = $Port + 1
Write-Host $NewPort
# Output: 1434 — the way the Force intended

As a DBA, you’ll care about this most when dealing with port numbers, SPIDs, database IDs, and any value that comes back from a query as a string but needs to be treated as a number.
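As a quick illustration of why that matters, here's a hypothetical SPID that comes back as a string – comparing it without a cast silently does a string comparison instead of a numeric one:

```powershell
# A SPID read from a query often arrives as a string
$SPID = '9'

$SPID -gt '100'       # True  – string comparison: '9' sorts after '1'
[int]$SPID -gt 100    # False – numeric comparison, the behavior you wanted
```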


Variables That DBAs Actually Use

Let’s get practical. Here are the kinds of variables you’ll build real scripts around.

Server and Instance Lists

# Single server
$TargetServer = 'SQL-PROD-01'

# Array of servers — we'll loop over these in Week 3
$SqlServers = @(
    'SQL-PROD-01',
    'SQL-PROD-02',
    'SQL-DR-01'
)

Credentials

# Prompt for credentials securely — never hardcode passwords
$Credential = Get-Credential

# Or build a credential object from a stored secure string
$SecurePass  = ConvertTo-SecureString 'YourPassword' -AsPlainText -Force
$Credential  = New-Object System.Management.Automation.PSCredential('sa', $SecurePass)

Date and Time Variables

# Useful for backup paths, log file names, report timestamps
$Today        = Get-Date
$DateStamp    = Get-Date -Format 'yyyyMMdd'
$TimeStamp    = Get-Date -Format 'yyyyMMdd_HHmmss'
$BackupWindow = (Get-Date).AddHours(-24)  # 24 hours ago

Write-Host "Running backup check for window: $BackupWindow to $Today"

File Paths

$BackupRoot  = 'D:\Backups'
$ServerName  = 'SQL-PROD-01'
$DateStamp   = Get-Date -Format 'yyyyMMdd'
$BackupPath  = "$BackupRoot\$ServerName\$DateStamp"
$LogFile     = "$BackupRoot\Logs\BackupCheck_$DateStamp.log"

Write-Host "Backup path: $BackupPath"
Write-Host "Log file: $LogFile"

 


Storing Query Results in Variables

This is where variables go from useful to genuinely powerful for DBA work. You can store the results of a SQL query or a dbatools command directly in a variable and work with the data in PowerShell.

# Store dbatools query results in a variable
$DatabaseList = Get-DbaDatabase -SqlInstance 'SQL-PROD-01' |
    Where-Object { $_.IsSystemObject -eq $false }

# Now $DatabaseList holds all user databases — inspect it
$DatabaseList.Count                    # How many databases
$DatabaseList[0].Name                  # Name of the first one
$DatabaseList | Select-Object Name, Size, RecoveryModel  # Specific properties

# Store raw T-SQL results using Invoke-DbaQuery
$Results = Invoke-DbaQuery -SqlInstance 'SQL-PROD-01' -Query @"
    SELECT 
        name,
        state_desc,
        recovery_model_desc
    FROM sys.databases
    WHERE database_id > 4
"@

# Work with the results
foreach ($Row in $Results) {
    Write-Host "$($Row.name) is $($Row.state_desc) — Recovery: $($Row.recovery_model_desc)"
}
# We'll cover foreach properly next week — consider this a teaser


Variable Scope — Know Your Territory

Variables in PowerShell have a scope; they exist in a specific context and may not be visible outside it. For most DBA scripts running top to bottom in a single file, this won’t trip you up. But once you start writing functions or calling scripts from other scripts, scope matters.

# Script scope — visible throughout the script
$script:ServerName = 'SQL-PROD-01'

# Global scope — visible everywhere, including child scripts
$global:LogPath = 'D:\Logs'

# Local scope — the default, visible only in the current block
$ServerName = 'SQL-PROD-01'  # Local by default

The Jedi rule of thumb: default local scope is fine for most scripts. Only reach for $script: or $global: when you have a specific reason. Unnecessary global variables are the midi-chlorians of PowerShell: everyone argues about them, and they usually cause more problems than they solve.
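To see scope in action, here's a minimal sketch (the function name is hypothetical) showing that a plain assignment inside a function vanishes when the function returns, while a $script:-scoped one survives:

```powershell
function Set-Target {
    $ServerName = 'SQL-DEV-01'     # local to the function – gone when it returns
    $script:LogPath = 'D:\Logs'    # written to script scope – survives
}

$ServerName = 'SQL-PROD-01'
Set-Target
Write-Host $ServerName   # SQL-PROD-01 – the function never touched this copy
Write-Host $LogPath      # D:\Logs – visible here because it was script-scoped
```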


A Few Variables You Get for Free

PowerShell has a set of automatic variables that are always available and genuinely useful for DBA scripting:

Variable         What It Contains                    DBA Use Case
$_ or $PSItem    Current pipeline object             Used inside loops and Where-Object filters
$true / $false   Boolean values                      Flag variables, conditional logic
$null            Empty/no value                      Checking if a result came back empty
$Error           An array of recent errors           Checking what went wrong after a failure
$MyInvocation    Info about the current script       Getting the script name for log entries
$PSScriptRoot    The directory the script lives in   Building relative paths from the script location
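A few of these in DBA-flavored use, sketched with hypothetical variable and file names:

```powershell
# $null check – put $null on the left so PowerShell doesn't
# enumerate a collection that happens to be on the right
if ($null -eq $Results) {
    Write-Host 'Query returned nothing'
}

# $Error[0] is the most recent error in the session
if ($Error.Count -gt 0) {
    Write-Host "Last error: $($Error[0].Exception.Message)"
}

# $PSScriptRoot builds paths relative to the script, not the current directory
$LogFile = Join-Path $PSScriptRoot 'BackupCheck.log'
```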

Your Assignment, Rebel Pilot

  1. Open a PowerShell window and create typed variables for a server name (string), a port number (int), and a backup date (datetime).
  2. Build a backup file path string using double quotes that combines all three into a meaningful path.
  3. If you have dbatools available, run Get-DbaDatabase against a local or dev instance and store the results in a variable. Then check $Results.Count to see how many databases came back.

Variables are the backbone of every script you’ll write. Get comfortable assigning, typing, and reading them. Next week, we put them to work inside loops.

Next week in PowerShell Strikes Back: Return of the Loop – foreach, ForEach-Object, and how to stop running the same command twelve times by hand. See you on May 18th.

The post PowerShell Newbie – Variables appeared first on GarryBargsley.com.

The post PowerShell Newbie – Variables appeared first on SQLServerCentral.


Your AI Use Is Breaking My Brain



Excellent, angry piece by Jason Koebler on how AI writing online is becoming impossible to avoid, how filtering it is mentally exhausting, and how it's even starting to distort regular human writing styles.

I particularly liked his use of the term "Zombie Internet" to define a different, more insidious alternative to the "Dead Internet" (which is just bots talking to each other):

I called it the Zombie Internet because the truth is that large parts of the internet are not just bots talking to bots or bots talking to people. It’s people talking to bots, people talking to people, people creating “AI agents” and then instructing them to interact with people. It’s people using AI talking to people who are not using AI, and it’s people using AI talking to other people who are using AI. It’s influencer hustlebros who are teaching each other how to make AI influencers and have spun up automated YouTube channels and blogs and social media accounts that are spamming the internet for the sole purpose of making money. It is whatever the fuck “Moltbook” is and whatever the fuck X and LinkedIn have become. It’s AI summaries of real books being sold as the book itself and inspirational Reddit posts and comment threads in which people give heartfelt advice to some account that’s actually being run by a marketing firm. [...]

Via @jasonkoebler.bsky.social

Tags: definitions, ai, generative-ai, llms, slop, jason-koebler, ai-ethics


AWS Weekly Roundup: Amazon Bedrock AgentCore payments, Agent Toolkit for AWS, and more (May 11, 2026)

1 Share

My most exciting news of last week: Amazon Bedrock AgentCore previewed the first managed payment capabilities enabling AI agents to autonomously access and pay for APIs, MCP servers, web content, and other agents. Built in partnership with Coinbase and Stripe, it removes the undifferentiated heavy lifting of building customized systems for billing, credential management, and compliance.

You can connect a Coinbase CDP wallet or Stripe Privy wallet as a payment connection, set session-level spending limits, and your agent transacts autonomously during execution. What excites me most is what AgentCore payments can unlock—like a research agent that can pay for real-time market data on the fly, or a coding agent calling paid APIs mid-task.

To learn more, visit the blog post, dive deeper using the documentation, and get started with the AgentCore CLI.

Last week’s launches
Here are last week’s launches that caught my attention:

  • Agent Toolkit for AWS – A production-ready suite of tools and guidance, available at no additional charge, that helps AI coding agents build on AWS with fewer errors, lower token costs, and enterprise-grade security controls. The Agent Toolkit for AWS is the successor to the MCP servers, plugins, and skills available on AWS Labs. To get started, visit the quick start guide or browse the available skills and plugins on GitHub.
  • AWS MCP Server GA – You can use a managed remote Model Context Protocol (MCP) server that gives AI agents and coding assistants secure, authenticated access to all AWS services through a small, fixed set of tools. It is part of the Agent Toolkit for AWS. To learn more, visit Seb Stormacq’s blog post.
  • Amazon WorkSpaces for AI agents (Preview) – You can use AI agents to securely access and operate desktop applications through managed WorkSpaces environments. This capability allows organizations to automate everyday workflows at scale while maintaining full enterprise-grade governance and compliance. To learn more, visit Micah Walter’s blog post.
  • Amazon EC2 M8idn/M8idb and R8idn/R8idb instances – These instances are powered by custom sixth-generation Intel Xeon Scalable processors available only on AWS and the latest sixth-generation AWS Nitro cards. These instances deliver up to 43% better compute performance per vCPU compared to previous-generation instances. M8idn/R8idn instances offer up to 600 Gbps network bandwidth, and M8idb/R8idb instances deliver up to 300 Gbps EBS bandwidth.

For a full list of AWS announcements, be sure to keep an eye on the What’s New with AWS page.

Additional updates
Here are some additional news items that you might find interesting:

  • Valkey turns two – Valkey stands as proof that open, community-driven technology innovates faster, scales further, and delivers more value than any single-vendor model. Valkey has surpassed 100 million Docker pulls (up 17x year over year) and attracted more than 225 contributors who have submitted over 1,500 pull requests, roughly double the development pace of Redis over the same period. You can also use the latest Valkey 9.0 in Amazon ElastiCache.
  • Query billion-scale vectors with SQL – You can learn how to query Amazon S3 Vectors from Amazon Aurora PostgreSQL-Compatible Edition using standard SQL, and how to combine vector similarity results with relational filters in a single query, for example, finding the most semantically similar products and then filtering by price, stock status, or tenant in one SQL statement.
  • Building an end-to-end agentic SRE using AWS DevOps Agent – Learn how to configure DevOps Agent Spaces that define an investigation scope, integrating seamlessly with Amazon CloudWatch, Splunk, GitHub, and Slack. You can also learn how to trigger automated investigations via webhooks, generate mitigation plans, and hand off agent-ready specs to coding agents like Kiro for implementation.

For a full list of AWS blog posts, be sure to keep an eye on the AWS Blogs page.

Learn more about AWS, browse and join upcoming AWS-led in-person and virtual events, startup events, and developer-focused events as well as AWS Summits and AWS Community Days. Join the AWS Builder Center to connect with builders, share solutions, and access content that supports your development.

That’s all for this week. Check back next Monday for another Weekly Roundup!

Channy


Introducing the Claude Platform on AWS


Windows 11 is getting a macOS-like speed boost


Microsoft is currently testing a new speed boost feature in Windows 11 that is designed to improve app launch times and make things like the Start menu feel more responsive. The feature, which is reportedly called "Low Latency Profile," will ramp up CPU frequency in short bursts to improve the speed of menus, flyouts, apps, and more – much like how macOS handles similar tasks.

Windows 11 testers have been trying out the new unannounced feature over the past week and have noticed significant speed improvements when launching File Explorer or the Start menu, as well as apps like Outlook, the Microsoft Store, and Paint.

Read the full story at The Verge.
