
What are managed identities in SQL Server 2025? A complete guide


Learn how managed identities in SQL Server 2025 enhance security by eliminating passwords and enabling seamless Microsoft Entra authentication for Azure resources.

Those who have worked with SQL Server will understand the need to avoid storing passwords for accessing resources. Windows-based identities are fine for on-premises SQL Server systems, including those on cloud-based virtual machines (VMs), but are of no use when you need to access cloud-based resources like those in Azure.

Some Azure-based resources (including storage accounts) offer other access methods, such as shared access signatures (SAS), but these aren’t much of a step up from passwords.

What’s really needed is for SQL Server to have its own Microsoft Entra-based identity, one that can be used directly with Azure-based resources – and that’s exactly where managed identities come in.

What are managed identities?

A managed identity is a Microsoft Entra identity that Azure creates and manages for a resource. This allows the resource to obtain tokens for other Azure services without the need for storing passwords, client secrets, or certificates in code or configuration. Two types of managed identities are available.

First, a system-assigned managed identity, which is specific to a resource. For example, I could add a system-assigned managed identity to an Azure Data Factory (ADF). The ADF could then use the identity when making connections to, or accessing, other resources. When the ADF is deleted, the system-assigned managed identity is also removed.

Then there’s a user-assigned managed identity. This is created independently of any specific resource and can be applied to a resource, in addition to any system-assigned managed identity. User-assigned managed identities are very powerful because a single identity can be assigned to multiple resources.

For example, I might have a single identity for a group of data factories. This simplifies the configuration of access and, importantly, if I need to recreate the data factory, I can use the same identity. Even when all the data factories are deleted, the user-assigned managed identity can survive. Another common use for these is to assign the same identity to a group of web servers.
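As a rough operational sketch (assuming the Azure CLI; the resource group, identity, and VM names here are all hypothetical), a single user-assigned identity can be created once and attached to several web-server VMs:

```shell
# Create one user-assigned managed identity (names are hypothetical)
az identity create --resource-group rg-web --name id-shared-web

# Assign that same identity to each web-server VM in the group
az vm identity assign --resource-group rg-web --name vm-web-01 --identities id-shared-web
az vm identity assign --resource-group rg-web --name vm-web-02 --identities id-shared-web
```

Deleting any one of those VMs leaves the identity, and any role assignments granted to it, intact.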

The official Microsoft documentation for managed identities can be found here.

Why managed identities are not just service principals

Under the covers, managed identities are a special kind of service principal, but there are important differences. The major one is that service principal credentials expire, whereas Azure manages the lifecycle and credential rotation of managed identities for you.

I’m always concerned when I see solutions built using service principals without plans for what happens when their credentials expire. These are like time-bombs at many sites today. Managed identities avoid these issues: behind the scenes, each credential is valid for 90 days, and Azure automatically rolls it over after 45 days.

The biggest contrast is operational. With a service principal, someone must create an app registration, issue and protect a secret or certificate, and periodically rotate that credential. That might also require updating dependent systems.


What’s needed to use a managed identity?

To be able to use managed identities, you need two things:

  • The ability to assign a managed identity to a resource or service
  • The ability to use the managed identity when making a connection from that resource or service

You might think that anything that can have a managed identity could also use it for connections, but that’s not the case.

For example, you can assign either or both a system-assigned managed identity and a user-assigned managed identity to an Azure Analysis Services (AAS) server. However, AAS does not currently provide any way to use that identity when connecting to an Azure-based resource or service.

We currently have AAS servers making connections to Azure SQL Database using logins and passwords, but I wish we didn’t have to do this. It wouldn’t be required at all if we could just specify the managed identity in the connection details.

How do managed identities apply to SQL Server 2025?

It’s increasingly common for SQL Server to need to connect to Azure-based resources. This not only happens with SQL Server in Azure VMs, but also with on-premises SQL Server installations. However, SQL Server is not a native Azure resource or service.

From SQL Server 2025 onwards, it’s possible to both configure and use a system-assigned managed identity when SQL Server is enrolled in Azure Arc. To enable this, the SQL Server instance must be connected to Arc and run on Windows Server. It must also use the latest Azure Extension for SQL Server.

Currently, the documentation notes that SQL Server must have access to Azure public cloud for Microsoft Entra authentication, and that failover cluster instances are not supported for this managed-identity setup. We also can’t (yet) use user-assigned managed identities with SQL Server.

So, we only have part of what we need right now – but it’s a great start.

How to configure and use the managed identity in SQL Server 2025

Once the SQL Server instance is Arc-enabled, you can assign the server’s primary managed identity. You do this from the Azure portal while configuring the properties of the Arc-enabled SQL Server resource, under the Microsoft Entra ID and Purview settings. This primary managed identity is what SQL Server 2025 uses when making outbound connections to Azure-based resources.

You also need to grant the identity permission on the target resource in Azure. This is done through role-based access control (RBAC), and the required role will usually be different for each service type.
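For example (a sketch using the Azure CLI; the assignee object ID, subscription ID, and resource names below are hypothetical), granting the server’s identity read access to blobs in a storage account might look like:

```shell
# Grant the managed identity's principal an RBAC role on the target resource
# (the object ID, subscription ID, and names here are hypothetical)
az role assignment create \
  --assignee "11111111-2222-3333-4444-555555555555" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/rg-data/providers/Microsoft.Storage/storageAccounts/mystorageacct"
```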

As part of making an outbound connection, you need to use a database-scoped credential like this:

EXECUTE sp_configure 'allow server scoped db credentials', 1;
RECONFIGURE;
GO
CREATE DATABASE SCOPED CREDENTIAL [storage_credential]
WITH IDENTITY = 'Managed Identity';
GO

Some credentials will also require you to specify a secret, for example when making a connection to an Azure OpenAI endpoint. Here, you not only specify IDENTITY = 'Managed Identity', but also a JSON secret identifying the token audience.

Note that it’s common to name the credential after the scope of the resource you’re accessing:

CREATE DATABASE SCOPED CREDENTIAL
[https://my-azure-openai-endpoint.cognitiveservices.azure.com/]
WITH IDENTITY = 'Managed Identity',
     SECRET = '{"resourceid":"https://cognitiveservices.azure.com"}';
GO

This credential can then be used when executing calls like CREATE EXTERNAL MODEL. Once this is done, calls to the external model can use that credential to authenticate without the need to supply an API key.
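As a hedged sketch of how that fits together (the endpoint and deployment names here are hypothetical, and the exact CREATE EXTERNAL MODEL options may vary by build), an external model that authenticates via the credential above might look like:

```sql
-- Hypothetical sketch: the CREDENTIAL name matches the endpoint-named
-- database scoped credential created earlier
CREATE EXTERNAL MODEL [MyEmbeddings]
WITH (
    LOCATION = 'https://my-azure-openai-endpoint.cognitiveservices.azure.com/openai/deployments/my-embedding-deployment/embeddings',
    API_FORMAT = 'Azure OpenAI',
    MODEL_TYPE = EMBEDDINGS,
    MODEL = 'my-embedding-deployment',
    CREDENTIAL = [https://my-azure-openai-endpoint.cognitiveservices.azure.com/]
);
GO
```

With this in place, queries can invoke the model without any API key appearing in code or configuration.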

In summary: managed identities in SQL Server 2025

We still have limitations with how we can configure and use managed identities with SQL Server, but what’s been provided so far is a great start. It lets traditional on-premises (and cloud VM-based) SQL Server instances participate in modern Microsoft Entra-based authentication patterns, and in turn reduce potential issues like credential leakage.


FAQs: Managed identities in SQL Server 2025

1. What is a managed identity in Azure?

A managed identity is a Microsoft Entra ID identity automatically created and managed by Azure. It allows services to securely access other Azure resources without storing passwords, secrets, or certificates.

2. What are the types of managed identities?

  • System-assigned managed identity – tied to a single resource and deleted with it
  • User-assigned managed identity – reusable across multiple resources and managed independently

3. Why use managed identities instead of service principals?

Managed identities eliminate the need to manage credentials manually. Unlike service principals, Azure automatically rotates and manages credentials, reducing security risks and maintenance overhead.

4. Can SQL Server 2025 use managed identities?

Yes. SQL Server 2025 supports system-assigned managed identities when connected via Azure Arc, enabling secure access to Azure resources.

5. What are the requirements to use managed identities in SQL Server 2025?

  • SQL Server must be Arc-enabled
  • Must run on Windows Server
  • Requires the latest Azure Extension for SQL Server
  • Needs access to Azure public cloud for authentication

6. Does SQL Server 2025 support user-assigned managed identities?

No, currently only system-assigned managed identities are supported.

7. How do managed identities improve security in SQL Server?

They remove the need to store credentials like passwords or API keys, reducing the risk of credential leaks and simplifying secure access to Azure services.

8. How are managed identities used in SQL Server connections?

They are configured via database-scoped credentials using IDENTITY = 'Managed Identity', allowing SQL Server to authenticate securely when accessing external Azure resources.

The post What are managed identities in SQL Server 2025? A complete guide appeared first on Simple Talk.


How to Build a Loan Document Workflow Using an Embedded React PDF Viewer


TL;DR: Simplify loan processing by building a React-based workflow with an embedded PDF viewer to handle form filling, document review, approvals, e-signatures, audit trails, and final delivery in one place. Using a React PDF viewer from Syncfusion, you can replace fragmented processes with a unified, secure, and compliance-ready solution.

Loan processing breaks where documents begin

Loan processing is document-intensive by nature, and that’s exactly where most workflows break down.

Loan officers juggle email threads, outdated file copies, and fragmented review tools; customers wait days for approvals; and audit trails end up incomplete. When form filling, reviewing, annotating, signing, and approving all happen in separate systems, every transition is a potential point of failure.

This blog shows how to solve that problem by building a complete, role-based loan document review workflow around the embedded Syncfusion® React PDF Viewer, all inside a single React application. You’ll get a step-by-step walkthrough, key code snippets, and a working GitHub sample to reproduce the full flow.


Why digital loan workflows matter more than ever

Loan processing is the end-to-end sequence that turns an application into a sanctioned loan: intake, verification, review, approval, and delivery. When each stage depends on manual handoffs or disconnected tools, the entire pipeline slows down.

Here’s what that looks like in practice:

  • Slow turnaround: Manual routing and email threads add days or weeks.
  • Broken context: Reviewers work from copies or fragmented annotations, causing rework.
  • Compliance exposure: Incomplete audit trails and versioning gaps increase risk.
  • Operational cost: Multiple-point tools and manual steps inflate headcount and processing cost.
  • Poor customer experience: Long waits and repeated document requests kill conversion.

In 2026, a unified, auditable digital workflow isn’t just a modernization effort. It’s a competitive advantage that turns loan pipelines into predictable revenue engines.

What a unified digital workflow delivers

Replacing fragmented processes with structured, automated digital document flows in a single interface changes the picture entirely:

  • Speed: Digital workflows cut decision time. Faster approvals mean more closed loans.
  • Accuracy: Inline validation and single-source PDFs reduce errors and resubmission.
  • Visibility: Real-time dashboarding and status badges eliminate chase work for staff.
  • Compliance: Embedded signatures, stamps, and audit logs create defensible records.
  • Lower cost, higher conversion: Fewer touchpoints, less rework, and better customer experience directly boost margins.

Architecture overview: One app, one document, full traceability

The loan processing application uses a clean, modern architecture that pairs an intuitive frontend review experience with a secure, auditable backend. The result is a seamless end-to-end workflow – form filling, annotation, review, approval, and final PDF generation – all inside a unified digital system.

Technology stack

Frontend (React + Syncfusion PDF Viewer)

A React application embeds the Syncfusion PDF Viewer, enabling users to:

  • Fill form fields directly in the browser.
  • Add annotations (highlights, comments, notes).
  • Apply E-signatures.
  • Download final PDFs.

All interactions happen inside a single interface. No external viewers, no file switching, and no context loss.

Backend services

A lightweight backend service handles the business logic and compliance needs:

  • Persist annotation JSON for each review cycle.
  • Manage workflow status across the loan requester → loan officer → manager.
  • Record audit events (timestamps, actions, version history) for compliance.
  • Generate flattened PDFs with embedded stamps, signatures, and metadata.
  • Store documents with version tracking for future verification or audits.

Together, the stack ensures a secure, traceable, and fully digital review flow.
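The backend responsibilities above can be sketched in TypeScript. This is an illustrative model, not code from the demo: the stage names, transition table, and audit-event shape are assumptions made for the example.

```typescript
// Sketch of role-based workflow state plus an audit trail (illustrative only).
type Stage = "Requester" | "LoanOfficer" | "SiteOfficer" | "Manager" | "Closed";

interface AuditEvent {
  loanId: string;
  from: Stage;
  to: Stage;
  actor: string;
  timestamp: string;
}

// Allowed transitions in the review pipeline: forward through the roles,
// or back to the requester/officer when more information is needed.
const transitions: Record<Stage, Stage[]> = {
  Requester: ["LoanOfficer"],
  LoanOfficer: ["SiteOfficer", "Manager", "Requester"],
  SiteOfficer: ["LoanOfficer"],
  Manager: ["Closed", "Requester"],
  Closed: [],
};

const auditLog: AuditEvent[] = [];

// Attempt a status change; record an audit event only if it is legal.
function advance(loanId: string, from: Stage, to: Stage, actor: string): boolean {
  if (!transitions[from].includes(to)) return false;
  auditLog.push({ loanId, from, to, actor, timestamp: new Date().toISOString() });
  return true;
}
```

Because every legal transition appends an audit event, the log doubles as the version/compliance history described above.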

End-to-end role-based architecture workflow

The loan processing application is structured around four roles, each interacting with the PDF viewer differently:

  1. Loan requester: Fills out loan applications, uploads supporting documents, and signs forms.
  2. Loan officer: Reviews applications, adds annotations, and provides preliminary approval.
  3. Site officer: Verifies the details submitted by the loan requester and completes the verification fields in the loan application.
  4. Manager: Final review, approval authority, and sanction letter generation.

This architecture delivers a fully connected, auditable, and efficient loan processing ecosystem powered by an embedded PDF experience.

Building the end-to-end loan processing workflow with Syncfusion React PDF Viewer

Let’s break down the steps to build the loan document review processing workflow using the Syncfusion React PDF Viewer.

Step 1: Initialize the React PDF Viewer

Setting up the Syncfusion React PDF Viewer is quick and developer-friendly. All you need to do is:

  • Install the Syncfusion React packages.
  • Import the required style CSS reference.
  • Then, add the PDF Viewer component to your React app.

From there, you can load PDFs, enable form filling, annotations, and signatures, and connect the viewer to your backend APIs.

Below is a code snippet to initialize the React PDF Viewer component.

import * as ReactDOM from 'react-dom/client';
import * as React from 'react';
import './index.css';
import { PdfViewerComponent, Toolbar, Magnification, Navigation, LinkAnnotation, BookmarkView,
         ThumbnailView, Print, TextSelection, Annotation, TextSearch, FormFields, FormDesigner, Inject} from '@syncfusion/ej2-react-pdfviewer';

function App() {
    return (<div>
    <div className='control-section'>
    {/* Render the PDF Viewer */}
      <PdfViewerComponent
        id="container"
        documentPath="https://cdn.syncfusion.com/content/pdf/pdf-succinctly.pdf"
        resourceUrl="https://cdn.syncfusion.com/ej2/31.2.2/dist/ej2-pdfviewer-lib"
        style={{ 'height': '640px' }}>

         <Inject services={[ Toolbar, Magnification, Navigation, Annotation, LinkAnnotation, BookmarkView,
                             ThumbnailView, Print, TextSelection, TextSearch, FormFields, FormDesigner ]}/>

      </PdfViewerComponent>
    </div>
  </div>);
}

// Mount the app (this is why react-dom/client is imported above)
const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(<App />);

For more details, explore the complete setup guide for the Syncfusion React PDF Viewer!

Step 2: Building a unified user login

The application uses a single login page for all four roles:

  • Loan Requester
  • Loan Officer
  • Site Officer
  • Manager

Users select their role at login, and after authentication, they’re directed to a role-specific dashboard showing only the tasks, documents, and permissions relevant to their stage in the workflow.

Unified user login page in the loan application

Note: This demo uses mock user-role details. In a production environment, you can integrate it with your organization’s authentication system to enforce actual access control.


Step 3: Filling out the loan application

After logging in, the loan requester lands on their dashboard, where they can view existing applications, check statuses, and create a new loan request. The dashboard shows the following key details:

  • Loan ID
  • Application Name
  • Status
  • Action
  • Comments

When the customer clicks the Create button, the loan form opens directly inside the Syncfusion React PDF Viewer. The loan form includes sections for applicant details, employment information, and loan specifics. Attachment fields allow applicants to upload supporting documents without leaving the interface.

Customers need to complete all required fields, add their e-signature, and submit.

The built-in validation enforces clean data entry. Fields like Date of Birth and Phone Number accept only valid formats, preventing errors before submission.

Refer to the following code example to implement numeric field validation.

// Inside the PDF Viewer's form-field change handler: 'args' is the event
// arguments object and 'targetName' is the lower-cased name of the edited field.
if (targetName) {
    // If field name references numeric fields (date, phone, amount, tenure, etc.), enforce digits-only
    const numericKeywords = [
        'date',
        'dob',
        'dateofbirth',
        'phone',
        'mobile',
        'contact',
        'amount',
        'tenure',
        'number',
        'no',
        'age'
    ];

    let isNumericField = numericKeywords.some(k => targetName.includes(k));

    // If the field is a signature field, skip numeric sanitization even if the name contains keywords
    const fieldTypeRaw =
        (args.field && (args.field.type || args.field.fieldType || '')) || '';

    const fieldType = (fieldTypeRaw + '').toString().toLowerCase();
    const isSignatureField = fieldType.includes('signature');

    if (isSignatureField && isNumericField) {
        console.debug(
            'Skipping numeric sanitization for signature field',
            { targetName, fieldType }
        );
        isNumericField = false;
    }
}

Refer to the following image.

Filling the loan application form using React PDF Viewer

Step 4: Site officer reviews

Now, the loan officer checks the loan details filled by the loan requester and forwards them to the site officer for verification.

The site officer can access all pending verification loan forms from their dashboard. Their primary responsibility is to confirm that the information provided by the loan requester is accurate, genuine, and complete. The site officer also fills site‑specific fields and returns the application to the loan officer with all relevant verification details added as comments.

This step introduces an essential layer of field‑level authentication, strengthening the reliability of the loan approval process before the application moves forward for final approval.

See the following image for more information.

Site officer reviewing the loan application

Step 5: Loan officer reviews

Loan officers access their dedicated dashboard listing all pending applications. Selecting an application opens it inside the Syncfusion React PDF Viewer.

Loan officers cannot edit fields submitted by the loan requester. Instead, they use annotations, highlights, and comments to review the document and provide structured feedback.

After site officer verification, a loan officer can:

  • Request additional information from the loan requester.
  • Reject the application with comments.
  • Approve and forward to the manager for final review.

Implement the following code example to prevent editing field values submitted by the customer.

/**
 * onDocumentLoad
 */
const onDocumentLoad = () => {
    evaluateFields();

    // Apply read-only mode based on loan status and role
    const canEdit =
        loanStatus === LoanStatus.INFO_REQUIRED &&
        (role === "Manager" || role === "Loan Officer");

    if (
        (!canEdit || loanStatus !== LoanStatus.INFO_REQUIRED) &&
        loanStatus !== ""
    ) {
        readOnly();
    }
};

The following GIF image provides visual clarity about the loan reviewing process.

Loan officer reviewing the loan application

Step 6: Manager reviews and approvals

Managers hold final approval authority. Their dashboard displays all applications waiting for approval, and each document opens directly inside the Syncfusion React PDF Viewer for review. Managers cannot modify customer-submitted fields but can review all annotations added by other officers and verify supporting documents.

From here, they can:

  • Approve the application and trigger the sanction‑letter creation.
  • Reject the application with comments.

Once approved, the manager sends the sanction letter to the customer for signature. After the customer signs, the manager completes the final signing and closes the loan process.

The following code example explains how sanction letter fields are populated and locked during the approval stage.

/**
 * UpdateForm
 * Populate sanction letter fields with values collected during approval
 * and set them to read-only.
 */
function UpdateForm() {
    const viewer = viewerRef.current;
    if (!viewer) return;

    let forms = viewer.retrieveFormFields();

    for (let i = 0; i < forms.length; i++) {
        const field = forms[i];

        // Populate sanction fields based on field name
        if (field.name === "ApplicantName") {
            field.value = sanctionValues.name;
        } else if (field.name === "Amount") {
            field.value = sanctionValues.amount;
        } else if (field.name === "Tenure") {
            field.value = sanctionValues.tenure;
        } else if (field.name === "Date") {
            field.value = new Date().toLocaleDateString("en-GB");
        }

        // If the field has a value, update and lock as read-only
        if (field.value !== "") {
            viewer.updateFormFieldsValue(field);
            viewer.formDesignerModule.updateFormField(field, {
                isReadOnly: true
            });
        }
    }
}

Manager reviewing and approving the loan application

Step 7: Downloading the loan sanction letter

After the manager approves the loan, the loan requester’s dashboard updates the application status to “APPROVED.” They can now open the sanction letter directly inside the React PDF Viewer and use the built-in Download option to save a signed copy locally.

Downloading the loan sanction letter


Comparing manual vs. digital loan processing workflow

The move from manual loan processing to an embedded digital workflow significantly reduces delays, errors, and compliance risks. The table below shows how each stage improves when using the Syncfusion React PDF Viewer instead of traditional, fragmented tools.

| Stage | Traditional approach | Embedded PDF Viewer workflow |
| --- | --- | --- |
| Application submission | Email attachments or physical forms require manual data entry. | Inline form filling with real-time validation and instant submission. |
| Document review | Printed markups, handwritten notes, or separate annotation tools require scanning. | Built-in annotation directly on the PDF with digital commenting. |
| Approval and signing | In-person meetings or wet signatures require physical presence. | E-signatures completed within the viewer, entirely remotely, in seconds. |
| Status tracking | Manual email updates with no centralized visibility. | Real-time dashboard status visible to all authorized parties. |
| Sanction letter delivery | Postal or email attachment with security risks. | Instant in-app download with full encryption and audit trail. |
| Integration complexity | 4-5 separate tools. | Single embedded component with unified functionality. |

GitHub reference

Ready to try it yourself? Clone the Loan processing demo on GitHub, run it in minutes, and experience a complete end‑to‑end loan review and approval process firsthand.

Frequently Asked Questions

Can I customize the workflow for my application?

Absolutely. The workflow, UI, permissions, backend logic, and viewer tools can all be configured to match your organization’s loan lifecycle and approval structure.

Can we automate notifications for each workflow step?

Yes. Email, SMS, Teams, or in-app notifications can be triggered by backend workflow events such as form submission, loan officer information requests, or manager approvals.

Can we add custom validation rules to the form fields?

Yes. Beyond the default PDF field validation, you can add custom form field validation rules (e.g., PAN format, IFSC validation, salary range logic).

Does the demo support multiple loan types with different workflows?

Yes. You can configure different PDF template forms, add custom approval pipelines, and include additional review steps as required by your banking solution.

Can multiple customers and loan officers work on apps at the same time?

Yes. Because the app runs in the browser and is role-based, multiple customers and loan officers can work on the loan document review demo and submit loan forms concurrently.

Can we integrate the system with OCR or document classification tools?

Yes. You can integrate Syncfusion OCR engines to auto-extract data from customer‑uploaded documents like pay slips, IDs, or bank statements before reviewing.

Does the viewer support dark mode?

Yes. Syncfusion supports built‑in theme variations, including Material, Tailwind, Fabric, Bootstrap, and custom themes with dark mode variants to match your brand style.

How does the system prevent tampering after a document is signed?

Once the document is signed, signatures and loan details are set to read-only form fields to prevent unauthorized edits from others.


Build faster, compliant loan workflows directly inside your React app

Thanks for reading! The Syncfusion React PDF Viewer brings together every capability needed to build a seamless, end-to-end digital loan processing workflow directly inside the browser. Its comprehensive built-in features eliminate the need for external tools, reduce operational friction, and ensure a secure, compliant loan lifecycle from app to sanction letter.

Ready to modernize your loan processing? Start exploring the full power of the React PDF Viewer:

  • Form filling & annotations: Interactive form fields with validation plus rich markup tools for reviewing documents.
  • E‑Signature support: Secure electronic signing without the need for wet signatures.
  • Role‑based access: Customizable toolbars and permissions tailored for each user role.
  • API & customization: Deep customization with 200+ APIs and easy backend integration for workflow state, audits, and versioning.
  • Performance & compliance: Fast, secure rendering (client or server) with GDPR, HIPAA, and SOC 2 compliant infrastructure.
  • Multi‑platform support: Works across React, Angular, Vue, Blazor, ASP.NET Core, and more.

If you’re a Syncfusion user, you can download the setup from the license and downloads page. Otherwise, you can download a free 30-day trial.

You can also contact us through our support forum, support portal, or feedback portal for queries. We are always happy to assist you!


Mistral Adds Remote Agents and Work Mode to Le Chat


Mistral has released Mistral Medium 3.5, a 128-billion parameter model designed to handle instruction following, reasoning, and coding within a single system, and introduced new cloud-based agent capabilities in its Vibe and Le Chat products.

By Daniel Dominguez

Test SQL Server Backup Performance For Free




Full Transcript

Erik Darling here, Darling Data, continuing my epic journey through monitoring tool mogulhood. I want to walk through a stored procedure that I wrote to help, I mean, I wrote it for one person to help them test different backup configuration performance, but I’m sharing it with you because that’s how I roll, right? Lots of stuff gets out in the world for free. It’s over at my GitHub repo at code.erikdarling.com. That’s also where you can find other free stuff that I’ve done. This one’s in the Darling data repo where you find the majority of my other SQL Server related scripts, but there’s other stuff up there too, like performance monitors, like this one right here. Look at this handsome devil telling you all sorts of stuff about SQL Server. And then there’s an alternate version of the monitoring tool that also tells you all sorts of useful stuff about SQL Server. You can go read about all the differences between them and what they do. And then there’s Performance Studio. Where you can get good advice about query plans. Just open one up in there and you have me looking at your query plan for you and giving you feedback on it. So there’s all sorts of free stuff that I do that I just want to point out. So this is a stored procedure though. And just kind of walking through some of the code. The whole idea here is that there’s all sorts of stuff that you can do to try to make backups faster.
So for example, you can mess with how many files you stripe it out to. You can choose to compress a backup or not. You can mess with more sort of rocket science-y settings like buffer count and max transfer size. And there’s stuff that you might want to know about your backups while they’re running, like how are they making progress. So you can choose that here. And you can also choose how many times you want to run through the test because just running a test once, you know, you can get some good information. But you kind of want to run it maybe more than once to figure out, like, was that an outlier? Was that like the good average of what happened? So there’s things that we can do programmatically that are a lot easier to do. But it requires testing, right? We’ve got to know what we’re in for here. So the stored procedure, what it essentially does, there’s a help section here in case you forget what any of those things mean. But there’s also an example execution here if you’re into that sort of thing.
But what this thing essentially does, after some initial sort of smart stuff, is take all of the parameters. You pass the parameters in as comma-separated lists, and there’s a table where we hold the logging results for each run-through. It takes all of the different things that you pass in those CSVs (compression settings, number of files you’re striping to, max transfer sizes, buffer counts, things like that) and parses those comma-separated lists out into temp tables. And then what it does is it cross joins all of those temp tables together, right?
That creates a combination of everything: it essentially explodes all those CSVs out into a whole bunch of different test combinations, so you can test combinations of all that stuff in one run-through, over many, many iterations. Then, of course, there’s the usual logging and messages-tab awareness of where you are in the loop, because that’s important stuff. If you’re running things, you want to know where they are, so you’re not sitting there wondering, hey, when’s this thing going to be over?
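The parse-and-explode step can be sketched like this. This is a simplified illustration of the idea, not the procedure’s actual code, and it assumes STRING_SPLIT is available (compatibility level 130 or higher):

```sql
/* Simplified sketch: parse each CSV parameter into a temp table,
   then cross join the lists into every combination. */
SELECT CONVERT(integer, value) AS buffer_count
INTO #buffer_counts
FROM STRING_SPLIT(N'50,100,200', N',');

SELECT CONVERT(integer, value) AS max_transfer_size
INTO #max_transfer_sizes
FROM STRING_SPLIT(N'1048576,4194304', N',');

SELECT CONVERT(bit, value) AS use_compression
INTO #compression
FROM STRING_SPLIT(N'0,1', N',');

/* 3 buffer counts x 2 transfer sizes x 2 compression settings
   = 12 test combinations */
SELECT
    bc.buffer_count,
    mts.max_transfer_size,
    c.use_compression
FROM #buffer_counts AS bc
CROSS JOIN #max_transfer_sizes AS mts
CROSS JOIN #compression AS c;
```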
So that’s what a lot of this code does. Then there’s the cursor that does the important work, and more printing things out, which is fun. And then down here is where we build up the backup command that we’re going to run. We grab a whole bunch of information about what happened, and we log it for each run-through.
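For one combination, the generated command looks roughly like this. This is an illustrative shape, not the procedure’s exact dynamic SQL:

```sql
/* Illustrative only: roughly the shape of the command the procedure
   builds for each test combination. */
DECLARE @database_name sysname = N'StackOverflow2010';
DECLARE @sql nvarchar(max) = N'
BACKUP DATABASE ' + QUOTENAME(@database_name) + N'
TO DISK = N''NUL'',
   DISK = N''NUL''              /* one DISK clause per stripe file */
WITH
    COMPRESSION,
    BUFFERCOUNT = 100,
    MAXTRANSFERSIZE = 4194304,
    STATS = 10;                 /* progress messages while it runs */';

EXECUTE sys.sp_executesql @sql;
```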
That allows us at the end to not only give you a general sense of which configurations worked best, but also to do some reporting. The first report result you’ll get back is all configurations ranked by which one had the best throughput, meaning which one completed the backups the fastest.
Then there’s another report that tells you the best config per compression setting. If you only test compressed, or only test uncompressed, this one’s a little less useful, but if you’re testing both, you’ll get the best configuration for each compression setting.
The third one is which parameters had the most impact: across all of the backups we took, which settings had the biggest effect on every backup they were involved in. That might be a good thing to know about your environment. Then we have another report that tells you the best throughput per megabyte of RAM, because different types of backups require different amounts of memory.
For uncompressed backups, the amount of buffer memory required depends on max transfer size and buffer count. Compressed backups use roughly 3x that, because there are three compression streams you have to deal with. Compression is wonderful for what ends up on disk.
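As a rough worked example of that memory math, treat this as an estimate, not an exact accounting of what SQL Server allocates:

```sql
/* Rough estimate: backup buffer memory is about
   BUFFERCOUNT * MAXTRANSFERSIZE, and compressed backups use
   roughly three times that. */
DECLARE @buffer_count bigint = 100;
DECLARE @max_transfer_size bigint = 4194304; /* 4 MB */

SELECT
    (@buffer_count * @max_transfer_size) / 1048576
        AS uncompressed_buffer_mb,           /* about 400 MB */
    (@buffer_count * @max_transfer_size * 3) / 1048576
        AS compressed_buffer_mb;             /* about 1,200 MB */
```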
The memory it uses while the backup is being taken, though, is about 3x more. And then there’s a final, fifth report that only runs if you did multiple iterations, where I report on how consistent each backup setting was. The lower the standard deviation, the more predictable it is.
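That consistency report boils down to something like this. The table and column names here are illustrative, not the procedure’s real ones:

```sql
/* Illustrative sketch of the stability report: group the logged runs
   by configuration and rank by standard deviation of elapsed time. */
SELECT
    configuration_description,
    AVG(elapsed_ms)   AS avg_elapsed_ms,
    STDEV(elapsed_ms) AS stdev_elapsed_ms   /* lower = more repeatable */
FROM #backup_test_results
GROUP BY configuration_description
ORDER BY STDEV(elapsed_ms);                 /* most to least stable */
```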
That’s ordered from most to least stable, which helps you figure out whether there were outliers: if one set of backup options had wildly different metrics after each run-through, that’s what this would show. Anyway, there’s an example run of this over here, where I tested it by backing up to NUL. If you back up to a real path, it will clean up the backups after each iteration.
So you’re not going to have to worry about disk space blowing up. It uses the built-in xp_delete_file extended stored procedure, the same one the cleanup tasks in maintenance plans use, which lets you delete backups from a particular path.
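xp_delete_file is undocumented, so treat this as a sketch of the common maintenance-plan-style call rather than official guidance. The parameter meanings here are the generally understood ones:

```sql
/* Roughly how maintenance-plan-style cleanup calls it.
   xp_delete_file is undocumented, so this is a sketch, not gospel. */
EXECUTE master.dbo.xp_delete_file
    0,                          /* file type: 0 = backup files */
    N'D:\BackupTest\',          /* folder to clean up */
    N'bak',                     /* extension */
    N'2026-01-01T00:00:00';     /* delete files older than this */
```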
If you are going to use a real backup path for testing, don’t run these backups to your default backup location; that’s insane. Use a dedicated test path. But it will delete the backups that it made.
It’s not going to delete all your backups. It keeps track of the backup files it created, their names and their paths, and it deletes just those. It’s not deleting every backup, so don’t worry. I don’t want to get in trouble for you saying, “I went to test my backups and Erik deleted all my backups.” I wouldn’t do that; that’s cruel. Anyway, I just ran a quick round through these, and the results look about like so. It’s maybe not terribly interesting backing up to NUL.
Oh, look at that, we expanded all our results at once. What a good time. So this is what it’ll show you. I used the Stack Overflow 2010 database for this, which is maybe not the greatest example because it’s only about 8 or 10 gigs.
It’s pretty small, but I just needed something I could run through quickly to show you what the results look like. This just shows what I got locally, backing up to NUL with a pretty small database. With bigger databases, you might find that different things make more sense.
You might find that different configurations work better, maybe even depending on how your database is set up internally with files and filegroups. You might find that different backup strategies work a little better for you. But for me, this is what the numbers look like.
Again, take these with a grain of salt. I’m not saying my results here are how you should take your backups. What I’m saying is you should use this procedure to test and see which settings work best for you, which is the whole point of why I wrote it. So, yeah.
Again, this is all available at code.erikdarling.com. You can find it in my GitHub repo, in the Darling Data one. It’s called Test Backup Performance.
It’s got its own folder with a little readme in there, so you can read all about it and enjoy yourselves. Maybe you’ll even use it and run it, or, I don’t know, want to do something with it. This is not part of the bunch of scripts I usually run to do server analysis; note that there’s no sp_ prefix in front of the name, because this one is a little outside what I would normally be doing with someone. But I had to do it. I thought you might like it.
I thought you might find it useful. So, there we have it. Anyway, again, that’s at code.erikdarling.com. Check it out. If you run into any issues with it, open a GitHub issue, because that’s where issues get solved. Don’t email me saying, “I hit an error running your procedure. Help me.” I’m going to tell you to open a GitHub issue. All this stuff is on GitHub for a reason, so open an issue where it makes sense.
Anyway, thank you for watching. I hope you enjoyed yourselves, I hope you learned something, and I hope you will test your backup performance. I’ll see you in tomorrow’s video, where we’re going to get back to our roots a little bit and talk about SQL Server stuff that is not an open source tool, or vectors, or backups. Backups.
Can you imagine me having to do backups in this day and age? Anyway. All right. Thank you for watching.

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. Blog readers get 25% off the Everything Bundle — over 100 hours of performance tuning content. Need hands-on help? I offer consulting engagements from targeted investigations to ongoing retainers. Want a quick sanity check before committing to a full engagement? Schedule a call — no commitment required.

The post Test SQL Server Backup Performance For Free appeared first on Darling Data.

Read the whole story
alvinashcraft
33 seconds ago
reply
Pennsylvania, USA
Share this story
Delete

Philly’s Newest Museums, Galleries & Attractions for 2026


Every time America celebrates a major milestone, Philly gets cool new stuff for the occasion. The best part: We get to keep it all.

For the Bicentennial, thousands of people flocked to Philadelphia to watch the Liberty Bell move from its former home in Independence Hall to its own pavilion on Independence Mall. The African American Museum in Philadelphia opened its doors, the first museum of its kind in the nation. And the arrival of Robert Indiana’s LOVE statue forever informally renamed JFK Plaza to LOVE Park.  

And like every milestone national anniversary before, the Semiquincentennial brings the opening of new institutions, including the First Bank of the United States museum, the Philly Pride Visitor Center in Midtown Village, the Pennsylvania Hospital Museum, two new permanent galleries at the National Constitution Center and much more.

Check out this guide to all of Philly’s 2026 birthday presents that residents and visitors alike now get to enjoy.


It's the Smell


It’s now a joy to simply read any blog post that’s not AI-generated.
