Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Checkout a Git repository using a tag in VSCode


If you're working with Git repositories in Visual Studio Code, you might occasionally need to check out a specific tag—perhaps to review a previous release, test an older version, or understand how the codebase looked at a particular milestone. While VSCode's built-in Git integration is powerful, checking out tags wasn't immediately obvious to me.

Let me walk you through the process.

What are Git tags?

Before diving in, a quick refresher: Git tags are references that point to specific commits in your repository's history. They're commonly used to mark release points (like v1.0.0, v2.1.3, etc.). Unlike branches, tags are meant to be immutable snapshots of your code at a particular moment in time.

Checkout a Git tag in VS Code

Method 1: Using the Command Palette

The quickest way to check out a tag in VSCode is through the Command Palette:

  • Open the Command Palette by pressing Ctrl+Shift+P (Windows/Linux) or Cmd+Shift+P (Mac)
  • Type "Git: Checkout to..." and select it from the dropdown

  • Select the tag from the list that appears. Tags are typically shown with a tag icon or labeled clearly

 

VSCode will check out the tag, putting your repository in a "detached HEAD" state (more on that in a moment).

Method 2: Using the Source Control Panel

You can also check out tags through VSCode's Source Control interface:

  • Click on the Source Control icon in the Activity Bar (or press Ctrl+Shift+G)

 

  • Click on the branch name in the bottom-left corner of the VSCode window

 

  • Select your tag from the list of refs that appears

 

Could I do the same thing in Visual Studio?

Short answer: no. I couldn't find a way to do this directly from the IDE.

Of course you can fall back to the command line:

git clone <repo-url>
cd <repo-folder>
git checkout <tag-name>

An example:

git clone https://github.com/user/project.git
cd project
git checkout v1.2.0
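If you don't know the exact tag name, you can list what's available first. A small sketch using standard git options (`--sort=-v:refname` sorts version-style tag names highest-first; the `v1.*` pattern is just an example):

```shell
# List all tags, highest version number first
git tag --sort=-v:refname

# Or filter to a particular release series
git tag --list "v1.*"
```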

Understanding "Detached HEAD" state

When you check out a tag, Git will inform you that you're in a "detached HEAD" state. This sounds scarier than it is! It simply means you're not on a branch—you're viewing the repository at a specific point in time.

What this means for you:

  • You can browse the code, run builds, and test the tagged version
  • If you make changes and commit them, those commits won't belong to any branch
  • To save your work, you should create a new branch: git checkout -b new-branch-name
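The whole flow also works from VSCode's integrated terminal. A minimal sketch (the tag and branch names here are placeholders):

```shell
# Check out a tag; this puts the repository in detached HEAD state
git checkout v1.2.0

# If you end up making changes, move your work onto a real branch
git checkout -b fix-from-v1.2.0
```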
Read the whole story
alvinashcraft
58 seconds ago
reply
Pennsylvania, USA
Share this story
Delete

SQL Server 2025 Enterprise Developer Edition Download, Install and Configure


Check out this step by step guide to install SQL Server 2025 Enterprise Developer Edition using either Basic or Custom options.

The post SQL Server 2025 Enterprise Developer Edition Download, Install and Configure appeared first on MSSQLTips.com.


Getting Started with SQL Database Project Power Tools


SQL Database Project Power Tools is a free Visual Studio extension that makes working with SQL database projects easier and more productive. This guide will help you get started with the key features.

What is SQL Database Project Power Tools?

SQL Database Project Power Tools enhances your Visual Studio experience when working with SQL Server database projects. It provides a collection of useful tools for importing databases, comparing schemas, analyzing code, creating diagrams, and more.

Installation

You can install the extension in two ways:

  1. From Visual Studio: Open Visual Studio, go to Extensions > Manage Extensions, search for "SQL Database Project Power Tools", and click Install.

  2. From the Visual Studio Marketplace: Download and install from the Visual Studio Marketplace.

After installation, restart Visual Studio to activate the extension.

Creating a New SQL Database Project

SQL Database Project Power Tools adds project templates to make it easy to create new database projects.

New Project Templates

  1. In Visual Studio, select File > New > Project
  2. Search for "SQL" in the project templates
  3. Choose the SQL Server Database Project template
  4. Name your project and choose a location
  5. Click Create

You can also add new items to your project using the enhanced item templates:

New Item Templates

Importing a Database

One of the most useful features is the ability to import an existing database schema into your project. This saves you time by automatically generating all the necessary SQL scripts.

Import Database

To import a database:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Import database
  3. Enter your database connection details
  4. Choose the file layout for the imported objects
  5. Click Import

The tool will create all the necessary files in your project, organized by object type.

Comparing Schemas

The schema compare feature helps you keep your database project in sync with your live databases. You can compare in both directions:

  • Compare your project with a database to see what needs to be deployed
  • Compare a database with your project to update your project files

To use schema compare:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Schema compare
  3. Choose your comparison source database and target (project or database)
  4. Review the differences in the generated script
  5. Apply the changes as needed

This is especially useful when working in teams or managing multiple environments.

Analyzing Your Code

Static code analysis helps you find potential issues in your database code before deployment. The analyze feature checks your SQL scripts against best practices and common pitfalls.

To analyze your project:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Analyze
  3. Review the analysis report
  4. Address any issues found and improve your code quality

The analysis includes checks for design issues, naming conventions, performance concerns, and more. Consider adding this step to your regular development workflow.

Creating Entity/Relationship Diagrams

Visualizing your database structure is easy with the E/R diagram feature. This creates a Mermaid diagram showing the relationships between your tables.

E/R Diagram

To create a diagram:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Create Mermaid E/R diagram
  3. Choose which tables to include
  4. The tool generates a Mermaid markdown diagram
  5. View the diagram in Visual Studio or use it for documentation

These diagrams are perfect for documentation and help team members understand the database structure.
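For reference, the tool's output is plain Mermaid markdown. A hand-written sketch of what a two-table diagram might look like (the table and column names below are invented for illustration, not actual tool output):

```mermaid
erDiagram
    CUSTOMER ||--o{ ORDER : places
    CUSTOMER {
        int CustomerId PK
        nvarchar Name
    }
    ORDER {
        int OrderId PK
        int CustomerId FK
    }
```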

Viewing .dacpac Files

The extension adds a Solution Explorer node for the output of your project (a .dacpac file), making it easy to explore its contents.

Solution Explorer node

To view a .dacpac file:

  1. Build your project
  2. Expand the project in Solution Explorer
  3. Browse through the XML files and pre- and post-deployment scripts contained in the package

This is helpful when troubleshooting pre- and post-deployment script issues.

Scripting Table Data

When you need to include seed data in your database project, the Script Table Data feature generates INSERT statements for you.

To script table data:

  1. Select SQL Project Power Tools > Script Table Data
  2. Choose your data source and pick the table to script
  3. The tool generates INSERT statements for the table data
  4. The tool adds the generated script to your project in the Post-Deployment folder

This is based on the popular generate-sql-merge script.

Accessing the Tools

All SQL Database Project Power Tools features are accessible from the context menu in Solution Explorer:

Power Tools Menu

Simply right-click on your SQL database project and look for the SQL Project Power Tools menu option.

Power Pack Extension

For even more features, consider installing the SQL Project Power Pack, which includes:

  • T-SQL Analyzer: Real-time code analysis with over 140 rules
  • SQL Formatter: Automatic code formatting with .editorconfig support

Tips for Success

  • Start with Import: If you have an existing database, use the import feature to get started quickly
  • Regular Schema Compares: Keep your project and database in sync by comparing regularly
  • Use Analysis: Run the analyzer before deploying to catch issues early, or integrate static code analysis into your CI/CD pipeline
  • Document with Diagrams: Create E/R diagrams to help your team understand the database structure
  • Version Control: Keep your database project in source control to track changes over time

Next Steps

Now that you're familiar with the basics:

  1. Create or import a database project
  2. Explore the various features
  3. Set up your development workflow
  4. Share your feedback to help improve the tool

Happy database development!


Daily Reading List – January 28, 2026 (#709)


Some cool AI updates from Google this week, including new treats in Chrome, an updated Gemini CLI, and Agentic Vision in Gemini 3 Flash.

[blog] The new era of browsing: Putting Gemini to work in Chrome. Each of these capabilities is genuinely useful. I like how Chrome embeds this into the experience, even for enterprise scenarios.

[blog] The AI Evolution of Graph Search at Netflix: From Structured Queries to Natural Language. Detailed post from Netflix about the text-to-query capability in their platform.

[paper] Rethinking the Value of Multi-Agent Workflow: A Strong Single Agent Baseline. If you’ve got multiple agents that contribute to the same outcome and use the same underlying model, couldn’t you just solve the problem with a single agent? That’s what this paper explores.

[blog] Tailor Gemini CLI to your workflow with hooks. This is terrific. Intercept key stages of the agentic loop to insert logic that improves your security posture or performance.

[article] QCon chat: Is agentic AI killing continuous integration? Not killing, but definitely forcing everyone to rethink some key aspects of it.

[article] Google’s more affordable AI Plus plan rolls out to all markets, including the US. This is a great deal for people who want expansive, affordable access to great Google technologies.

[blog] The Mighty Metaphor. Are you providing your listening or reading audience with thoughtful metaphors that aid exploration?

[article] How should product managers decide which tasks to delegate to AI? I’m having a conversation tomorrow with a customer on this very topic, so this was well-timed!

[blog] Introducing Agentic Vision in Gemini 3 Flash. Treating AI vision as “active investigation” is super interesting.

[article] Welcome to the last 18 months of labor-intensive services. The clock is ticking for service providers. Companies are betting on software as services.

Want to get this update sent to you every day? Subscribe to my RSS feed or subscribe via email below:




Day 29: Measuring What Matters: Is AI Actually Helping?


I felt faster.

Prompts flying. Code generating. Features shipping. The vibe was strong.

Then I looked at my actual output. Features shipped per week. Bugs in production. Time from issue to deployment. The numbers told a different story than my feelings.

I wasn’t faster. I was busier. More code was being written. But the important metrics were about the same.

That’s when I learned: feeling productive and being productive are different things. If you want to know whether AI is actually helping, you need to measure.

What Not to Measure

Some metrics are useless or misleading:

Lines of code: More code isn’t better. AI generates verbose code. You might ship more lines and less value.

Prompts per day: Using AI more doesn’t mean accomplishing more. You could be prompting in circles.

Features started: Starting is easy. Finishing is what matters.

Time in AI tools: Time spent doesn’t equal value produced.

These metrics make you feel productive without telling you if you’re productive.

What to Measure

Focus on outcomes, not activities:

Cycle Time

Time from starting a task to deploying it.

Cycle time = Deploy timestamp - Start timestamp

If AI is helping, cycle time should decrease. Track this per feature or per issue.
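If your start and deploy timestamps come from a log or issue tracker export, the arithmetic is easy to script. A sketch using GNU date (the dates below are made-up examples):

```shell
# Hypothetical start/deploy timestamps for one feature
start=$(date -d "2026-01-15" +%s)
deploy=$(date -d "2026-01-17" +%s)

# Cycle time in whole days
echo "Cycle time: $(( (deploy - start) / 86400 )) days"
```

For these example dates it prints "Cycle time: 2 days".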

Throughput

Features or issues completed per week.

Throughput = Completed items / Time period

If AI is helping, throughput should increase while quality stays constant.

Quality Metrics

Bugs in AI-generated code vs. manually written code.

Track:

  • Bugs reported per feature
  • Time to find bugs (in testing vs. production)
  • Severity of bugs
  • Rework needed after initial implementation

If AI code has more bugs, you’re not actually saving time.

Time Distribution

Where does your time go?

Categories:

  • Planning and design
  • Writing prompts
  • Reviewing AI output
  • Fixing AI mistakes
  • Manual implementation
  • Testing
  • Debugging
  • Deployment

If you spend 2 hours prompting and reviewing to save 1 hour of coding, that’s a net loss.

Setting Up Tracking

You don’t need complex tooling. Start simple:

Option 1: GitHub Labels

Label issues with how they were built:

  • ai-assisted
  • manual
  • ai-heavy

Compare metrics between labels.

Option 2: Time Tracking

Track time per task with notes on AI usage. At the end of each week, review:

  • What took longest?
  • Where did AI help?
  • Where did AI hurt?

Option 3: Simple Spreadsheet

| Feature | Start | Deploy | AI? | Bugs | Rework? |
|---------|-------|--------|-----|------|---------|
| Wishlist | 1/15 | 1/17  | Yes | 1    | No      |
| Search   | 1/18 | 1/25  | Yes | 3    | Yes     |
| Profile  | 1/26 | 1/27  | No  | 0    | No      |

Patterns emerge quickly.

Honest Assessment Questions

Ask yourself weekly:

  1. What did I ship this week? Not start. Ship.

  2. What took longer than expected? Was AI a factor?

  3. What bugs did I introduce? How many were in AI code?

  4. What did I waste time on? Prompting in circles? Fixing AI mistakes?

  5. What would I do differently? With hindsight, would AI have been the right choice?

The AI Overhead Trap

AI has overhead:

  • Writing prompts takes time
  • Reviewing output takes time
  • Fixing mistakes takes time
  • Context switching takes time

For simple tasks, this overhead can exceed the benefit.

AI benefit = Time saved - (Prompt time + Review time + Fix time)

If the benefit is negative, AI slowed you down.
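Plugging hypothetical numbers into that formula makes the trap concrete (every figure below is invented, in minutes):

```shell
# Invented numbers for one small task, in minutes
time_saved=60
prompt_time=25
review_time=30
fix_time=20

benefit=$(( time_saved - (prompt_time + review_time + fix_time) ))
echo "Net AI benefit: ${benefit} minutes"  # negative means AI slowed you down
```

Here the overhead totals 75 minutes against 60 minutes saved, so the net benefit is -15 minutes.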

Where AI Actually Helps

In my tracking, AI helps most with:

Boilerplate generation: Tests, CRUD endpoints, similar components. High repetition, low complexity.

Code review: Finding issues I’d miss. Consistent multi-pass review.

Exploration: “How would I approach this?” Planning before coding.

Edge cases: Thinking of scenarios I wouldn’t consider.

Documentation: Explaining code, writing docs, creating runbooks.

Where AI Hurts

AI hurts most with:

Novel problems: Unique architecture, unusual requirements. AI has no patterns to draw from.

Subtle bugs: AI confidently generates code with subtle issues. Review time exceeds benefit.

Over-engineering: AI adds complexity when simplicity would work. Then I maintain the complexity.

Context-heavy work: When you need to understand 20 files to make a small change. AI’s understanding is shallow.

The Comparison Test

Try this experiment:

  1. Pick two similar features
  2. Build one with heavy AI assistance
  3. Build one with minimal AI
  4. Compare: time, quality, bugs, rework

What you find might surprise you. Sometimes the manual approach is faster for your context.

Tracking Template

Weekly review template:

# Week of [date]

## Shipped
- [feature 1] - AI heavy/light/none - [time] - [bugs]
- [feature 2] - ...

## Time Distribution
- Planning: X hours
- Prompting: X hours
- Reviewing AI: X hours
- Fixing AI: X hours
- Manual coding: X hours
- Testing: X hours
- Other: X hours

## What Worked
- [what AI helped with]

## What Didn't Work
- [where AI hurt]

## Next Week
- [what to do differently]

The Honest Truth

AI doesn’t make everyone faster on everything.

It makes some people faster on some things. The only way to know if it’s helping you is to measure.

Track your outcomes. Be honest about what you find. Adjust your usage based on evidence, not vibes.

Tomorrow

Fast is good. Sustainable is better. Tomorrow I’ll cover managing technical debt when you’re shipping fast with AI. How to stay fast without drowning in accumulated mess.


Try This Today

  1. Pick a feature you built with AI recently
  2. Estimate the time breakdown: prompting, reviewing, fixing, manual work
  3. Would it have been faster without AI?

Be honest. The answer might be yes. That’s useful information. It tells you where to use AI and where not to.

The goal isn’t to use AI. The goal is to ship good software. AI is one tool. Measure whether it’s actually helping.


Kernel Community Drafts a Plan For Replacing Linus Torvalds

The Linux kernel community has formalized a continuity plan for the day Linus Torvalds eventually steps aside, defining how the process would work to replace him as the top-level maintainer. ZDNet's Steven Vaughan-Nichols reports: The new "plan for a plan," drafted by longtime kernel contributor Dan Williams, was discussed at the latest Linux Kernel Maintainer Summit in Tokyo, where he introduced it as "an uplifting subject tied to our eventual march toward death." Torvalds added, in our conversation, that "part of the reason it came up this time around was that my previous contract with Linux Foundation ended Q3 last year, and people on the Linux Foundation Technical Advisory Board had been aware of that. Of course, they were also aware that we'd renewed the contract, but it meant that it had been discussed." The plan stops short of naming a single heir. Instead, it creates an explicit process for selecting one or more maintainers to take over the top-level Linux repository in a worst-case or orderly-transition scenario, including convening a conclave to weigh options and maximize long-term project health. One maintainer in Tokyo jokingly suggested that the group, like the conclave that selects a new pope, be locked in a room and that a puff of white smoke be sent out when a decision was reached. The document frames this as a way to protect against the classic "bus factor" problem. That is, what happens to a project if its leader is hit by a bus? Torvalds' central role today means the project currently assumes a bus-factor of one, where a single person's exit could, in theory, destabilize merges and final releases. In practice, as Torvalds and other top maintainers have discussed, the job of top penguin would almost certainly currently go to Greg Kroah-Hartman, the stable-branch Linux kernel maintainer. Responding to the suggestion that the backup replacement would be Greg KH, Torvalds said: "But the thing is, Greg hasn't always been Greg. Before Greg, there was Andrew Morton and Alan Cox. After Greg, there will be Shannon and Steve. The real issue is you have to have a person or a group of people that the development community can trust, and part of trust is fundamentally about having been around for long enough that people know how you work, but long enough does not mean to be 30 years."

Read more of this story at Slashdot.
