Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Verified Argo CD deployments


Since the Argo CD integration in Octopus was released as Early Access in 2025, we’ve been incrementally adding new features to make it even more useful. This blog post is a deep dive into the new step verification feature, which lets you wait for the updated Argo CD applications to become healthy before the step in Octopus completes.

Step verification

Previously, Octopus would consider a step complete once changes were pushed to Git.

There are some new options in the step editor that let you customize this behavior. The options are listed under Step verification:

  • Direct commit: Progress to the next step once changes are pushed to Git (this is the existing behavior)
  • Pull request merged: Progress to the next step once pull requests are merged, or fail the step if pull requests are closed or abandoned. This option results in a no-op if changes are committed directly without a pull request (see Git commit method)
  • Argo CD application is healthy: Progress to the next step once all the Argo CD applications have synced the new changes and the applications are in a healthy state. Choosing this setting means your Octopus dashboard will accurately reflect the version and status of your applications deployed to the cluster

Step verification options

The task is paused while Octopus waits for pull requests to be merged or for Argo CD applications to be healthy. This means the task does not count towards your instance task cap.

Trigger sync

Turning on this option will trigger Argo CD to explicitly sync applications with the changes committed to Git by this same step.

If the application has auto-sync turned off, then triggering sync ensures Argo CD will look at the latest changes in Git when verifying application health.

If the application has auto-sync turned on, then triggering sync speeds up the deployment because Octopus does not have to wait for the next Argo CD refresh loop.

Trigger sync options

When is the application synced and healthy?

When verifying that the application is healthy after a change, we first need to check whether it references the changes we just made. Unfortunately, we can’t rely on Argo CD’s sync status alone, since Argo CD doesn’t know what Octopus’s intended changes are.

Let’s go through a few scenarios:

Scenario 1: All synced

Same commit

  1. Octopus commits 97A2
  2. Argo CD refreshes to 97A2 and syncs the changes to the cluster

Sync status:

  • Argo CD: In sync
  • Octopus: In sync

This is the simplest scenario where all parties are looking at the same commit, so everyone is In sync.

Scenario 2: Out of sync

Argo out of sync

  1. Octopus commits 97A2
  2. Argo CD refreshes to 97A2 but has yet to sync the changes to the cluster

Sync status:

  • Argo CD: Out of sync
  • Octopus: Out of sync

Even though Octopus and Argo CD are looking at the same commit, the changes have not yet been applied to the cluster, so Octopus still shows Out of sync.

Scenario 3: Octopus is ahead of Argo CD

Octopus is ahead of Argo

  1. Octopus commits 8DEF
  2. Argo CD has yet to refresh, so it still considers 97A2 to be the latest

Sync status:

  • Argo CD: In sync
  • Octopus: Git drift

In this scenario, Octopus has made a change that Argo CD doesn’t see yet. Here we introduce a concept called Git drift - this means even though everything looks up to date from Argo CD’s perspective, the changes made by Octopus aren’t in the cluster.

Scenario 4: External change overwrites Octopus-generated changes

Octopus is ahead of Argo

  1. Octopus commits 97A2
  2. Another process commits 1123 with contents overwriting Octopus-generated changes
  3. Argo CD refreshes to 1123 and syncs the changes to the cluster

Sync status:

  • Argo CD: In sync
  • Octopus: Git drift

This scenario also results in Git drift because a later commit overwrites Octopus’s changes - an example would be the user updating image tags that Octopus updated.

Scenario 5: External change is unrelated to Octopus-generated changes

Octopus is ahead of Argo

  1. Octopus commits 97A2
  2. Another process commits 1124 with contents unrelated to Octopus-generated changes
  3. Argo CD refreshes to 1124 and syncs the changes to the cluster

Sync status:

  • Argo CD: In sync
  • Octopus: In sync

This scenario is similar to the previous one, but here the later commit only contains unrelated changes - an example would be the user updating the replica count after Octopus updates the image tags. Since the Octopus-generated changes are still in the cluster, Octopus displays In sync.
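The five scenarios reduce to a small decision rule. Here is a minimal sketch of that logic (hypothetical function and parameter names, not Octopus's actual code), assuming we know which commit Argo CD last refreshed to, whether it reports the application as synced, and whether the Octopus-generated changes survive in that commit:

```python
def octopus_status(argo_commit, octopus_commit, argo_in_sync, changes_intact):
    # Hypothetical distillation of scenarios 1-5 above, for illustration only.
    if argo_commit == octopus_commit:
        # Scenarios 1 & 2: same commit, so status follows Argo CD's sync state.
        return "In sync" if argo_in_sync else "Out of sync"
    if not changes_intact:
        # Scenario 3 (Argo CD hasn't refreshed yet) and scenario 4
        # (a later commit overwrote the changes): Git drift.
        return "Git drift"
    # Scenario 5: a later, unrelated commit left Octopus's changes intact.
    return "In sync" if argo_in_sync else "Out of sync"

print(octopus_status("97A2", "8DEF", True, False))  # scenario 3 → Git drift
```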

How does Octopus know what changes are intended?

Since Octopus pushed the changes to the Git repository, it can keep track of the intended changes.

The two Argo CD steps have different functionality, so the way they record the intended changes is different.

Update Argo CD Application Image Tags

This step updates the image tags in the manifests. To track changes, Octopus records JSON patches for the files it updates.

When detecting whether these changes have been overwritten later on:

  1. Octopus checks out the Git repository files for the commit that Argo CD is looking at
  2. Octopus re-applies the JSON patches to the files it previously updated
  3. If the files have any changes, then it means Octopus’s changes have been overwritten
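The steps above can be sketched in a few lines. This is an illustration only, with a minimal "replace"-only JSON Patch (RFC 6902) applier and made-up manifest data, not Octopus's implementation:

```python
import copy

def apply_json_patch(doc, patch):
    # Minimal "replace"-only JSON Patch applier, for illustration.
    doc = copy.deepcopy(doc)
    for op in patch:
        assert op["op"] == "replace"
        parts = op["path"].strip("/").split("/")
        target = doc
        for key in parts[:-1]:
            target = target[int(key)] if isinstance(target, list) else target[key]
        last = parts[-1]
        if isinstance(target, list):
            target[int(last)] = op["value"]
        else:
            target[last] = op["value"]
    return doc

# The recorded patch: the image-tag update Octopus committed.
patch = [{"op": "replace",
          "path": "/spec/template/spec/containers/0/image",
          "value": "myapp:2.0.1"}]

# The manifest as checked out at the commit Argo CD is looking at.
manifest = {"spec": {"template": {"spec": {
    "containers": [{"name": "web", "image": "myapp:2.0.1"}]}}}}

# Re-apply the patch; if nothing changes, Octopus's update is still in place.
still_intact = apply_json_patch(manifest, patch) == manifest
print("In sync" if still_intact else "Git drift")  # → In sync
```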

Note that JSON patches have limitations, so if the manifest has been significantly restructured, you might see an unexpected Git drift status. Simply redeploy to remove this false positive.

Update Argo CD Application Manifests

This step generates the manifests that go into the application’s repository. To track changes, Octopus records the file hashes it generates.

When detecting whether these changes have been overwritten later on:

  1. Octopus checks out the Git repository files for the commit that Argo CD is looking at
  2. Octopus checks if the file contents have changed by comparing the hashes of the files it generated
  3. If the files have any changes, then it means Octopus’s changes have been overwritten
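A minimal sketch of this hash comparison (illustrative file names and contents; the post doesn't say which hash algorithm Octopus uses, so SHA-256 is assumed here):

```python
import hashlib

def file_hash(content: bytes) -> str:
    # SHA-256 is an assumption; the actual algorithm isn't specified.
    return hashlib.sha256(content).hexdigest()

# Hashes recorded when Octopus generated and committed the manifests.
recorded = {"deployment.yaml": file_hash(b"image: myapp:2.0.1\n")}

# File contents at the commit Argo CD is looking at.
checked_out = {"deployment.yaml": b"image: myapp:2.0.1\n"}

overwritten = any(
    file_hash(checked_out[name]) != digest for name, digest in recorded.items()
)
print("Git drift" if overwritten else "In sync")  # → In sync
```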

Does Octopus inspect the Git tree?

While planning this feature, we initially went down the route of inspecting the Git tree to figure out whether Argo CD was including the latest changes deployed by Octopus. We soon realised that we would need to inspect file contents anyway because a later commit could easily overwrite Octopus’s changes.

Other than some small optimizations to skip file comparisons when the commit SHA matches between Octopus and Argo CD, Octopus only looks at the file contents in Git; it doesn’t care whether the commit is in the Git history.

How to try it out

The step verification functionality is currently available to all customers starting with 2026.1. Pull request merged verification is available from 2026.2 (rolling out in Octopus Cloud).

There’s nothing extra required to enable the feature. Simply open the Argo CD Instances section under Infrastructure, and connect one of your Argo CD instances to Octopus. From there, you can start modelling deployments that combine GitOps and Continuous Delivery out of the box.

Conclusion

Argo CD is a powerful GitOps tool for Kubernetes, but it wasn’t built to manage the full software delivery lifecycle. Octopus complements Argo CD by adding environment promotions, orchestration across diverse workloads, fine-grained RBAC, and centralized visibility across clusters.

With Argo CD integration, Octopus lets teams combine the strengths of GitOps and Continuous Delivery without building custom automations. You get the reliability of Git-driven deployments and the safety, governance, and flexibility of a full CD platform—all in one place.

Learn more on our use-case page.

Happy deployments!


Podcast: [Video Podcast] Agentic Systems Without Chaos: Early Operating Models for Autonomous Agents


In this episode, Shweta Vohra and Joseph Stein explore what changes when software systems start planning, acting, and making decisions on their own. The conversation distinguishes truly agentic use cases from traditional automation and looks at how architects and engineers should think about boundaries, orchestration, and system design in this new environment.

By Joseph Stein

Vim + Markdown = Writer's Heaven


I use Jekyll, Markdown and Vim to write content for my blog. Rather than
wrestling with a full-fledged CMS or writing raw HTML, I can use a
human-readable markup language to write my posts. Vim is my editor of choice;
while I like the friendliness of GUI markdown tools, I miss my vim shortcuts,
autocomplete and plugins when I use them. A minimalistic text editor gives me
an environment that is distraction-free, version-controlled and easy to
publish. This article will go into how to set up vim to effectively edit
Markdown with these features:

  • Spelling
  • English Auto-Completion
  • Auto-Formatting
  • Grammar Checking

Some notes before we begin:

  • I use vim-plug to manage my plugins, and this guide assumes you do too.
  • There are two .vimrc files used here: ~/.vimrc and ~/.vim/after/ftplugin/markdown.vim, which is a file that runs only after ~/.vimrc is loaded and a markdown file is detected.

The Basics

Vanilla vim itself comes with a lot of markdown support, such as frontmatter
highlighting and spelling.

~/.vim/after/ftplugin/markdown.vim

" Disable line numbers
setlocal nonumber

I never found much value in line numbers when writing, so I turned them off
specifically for Markdown.

Spelling

~/.vim/after/ftplugin/markdown.vim

" Turn spellcheck on
setlocal spell
nnoremap zs 1z=
" Disable check for sentence capitalization
setlocal spellcapcheck=

Vim has a very effective spellchecking system that you can enable with
set spell, and a few keystrokes that make correcting your spelling easy. z=
brings up a list of possible corrections, while 1z= picks the most likely one.
I remapped it to zs to make it easy to correct spelling. If it's a word vim
doesn't recognize, you can use zg to add it to the dictionary.

spellcapcheck is a feature that detects if you forgot to capitalize the
beginning of a sentence. Unfortunately, it is just a regular expression, so if
you write something like "vs." it will decide that you forgot to capitalize the
next word and highlight it. You can disable it by emptying the regex, as I did
with setlocal spellcapcheck= .

vim-markdown

Vim-markdown (preservim/vim-markdown) is a plugin that provides several nice
features, such as:

  • Folding
  • Highlighting of fenced code blocks
  • Highlighting of front matter

and some useful commands like:

  • :Toc — Create a table of contents in the quickfix list
  • :InsertToc — insert a table of contents into the buffer
  • :SetexToAtx, :HeaderDecrease, :HeaderIncrease

~/.vimrc

Plug 'preservim/vim-markdown'
" Enable folding.
let g:vim_markdown_folding_disabled = 0

" Fold heading in with the contents.
let g:vim_markdown_folding_style_pythonic = 1

" Don't use the shipped key bindings.
let g:vim_markdown_no_default_key_mappings = 1

" Filetype names and aliases for fenced code blocks.
let g:vim_markdown_fenced_languages = ['php', 'py=python', 'js=javascript', 'bash=sh', 'viml=vim']

" Highlight front matter (useful for Jekyll/Hugo posts).
let g:vim_markdown_toml_frontmatter = 1
let g:vim_markdown_frontmatter = 1
let g:vim_markdown_json_frontmatter = 1

An explanation of the settings:

  • vim_markdown_folding_disabled is set to 0 to enable folding, which lets you collapse sections of your document under their headings. To understand folding behavior, see :help folding.
  • vim_markdown_folding_style_pythonic changes the fold behavior so that the heading line itself stays visible when folded, rather than being hidden with the rest of the section.
  • vim_markdown_no_default_key_mappings disables the plugin's built-in key mappings, letting you define your own without conflicts.
  • vim_markdown_fenced_languages defines a list of language names and aliases for syntax highlighting inside fenced code blocks, so that e.g. a block tagged bash gets highlighted as sh.
  • vim_markdown_frontmatter, vim_markdown_toml_frontmatter, and vim_markdown_json_frontmatter enable syntax highlighting for YAML, TOML, and JSON front matter respectively, which is useful if you write Jekyll or Hugo posts.

Completion

Why type each word by hand when you can tab-complete it? I use the plugin
girishji/vimcomplete and its companion, girishji/ngram-complete.vim to
provide auto-completion for English.

~/.vimrc

Plug 'girishji/vimcomplete'
let g:vimcomplete_tab_enable = 1
Plug 'girishji/ngram-complete.vim'
let vimcompleteoptions = {
      \ 'buffer': {
      \     'enable': v:true,
      \     'priority': 2
      \  },
      \  'ngram': {
      \     'enable': v:true,
      \     'priority': 1,
      \     'filetypes': ['markdown'],
      \     'bigram': v:true,
      \  },
      \}
autocmd VimEnter * call g:VimCompleteOptionsSet(vimcompleteoptions)

priority determines which completions show up first in the completion menu,
with larger numbers meaning higher priority.

ngram-complete.vim allows for completion based on the frequency of words,
making it much more useful than standard dictionary completion, which picks
words in alphabetical order. The bigram option allows completion based on
frequency of words given the previous word rather than just the frequency of the
current word you are completing.

Auto-formatting

~/.vimrc

Plug 'dense-analysis/ale'

let g:ale_fixers = {
    \ 'markdown': ['prettier']
    \}

The ale plugin (Asynchronous Lint Engine) allows auto-formatting and linting
in vim, running external tools asynchronously so they don't block your editing.
With the configuration above, you can run :ALEFix to format the current file,
or add the following to have it format on save:

~/.vimrc

let g:ale_fix_on_save = 1

Prettier

I use prettier to auto-format my markdown
files so that they are easy to read.

~/.prettierrc.yaml

overrides:
  - files:
      - "*.md"
      - "*.markdown"
    options:
      proseWrap: "always"

proseWrap automatically wraps lines into 80 character columns. Be careful when
enabling it if you haven't started your post with it, as it can create large
diffs.

Linting

The ale plugin enables automatic linting of your posts on save. While I don't
use the lint features personally, I will guide you on how to set them up in case
you find them useful.

Markdown-lint

Markdown-lint highlights common issues with Markdown files. You can install it
with

npm install -g markdownlint-cli

Set it up with a .markdownlint.yaml file.

.markdownlint.yaml

# Enable all rules by default
# https://github.com/markdownlint/markdownlint/blob/main/docs/RULES.md
default: true

# Allow inline HTML which is useful in Github-flavour markdown for:
# - crafting headings with friendly hash fragments.
# - adding collapsible sections with <details> and <summary>.
MD033: false

# Ignore line length rules (as Prettier handles this).
MD013: false

~/.vimrc

let g:ale_linters = {
    \ 'markdown': ['markdownlint']
    \}

Vale

Vale is a command-line tool that brings code-like linting to prose. It is not a
grammar checker. You can find it here.

I did find it took a bit of effort to install and get working. Here's a guide
for what I did:

  1. Install Vale. You can find instructions here
  2. Create a ~/.vale.ini file
# Where the styles are kept.
StylesPath = .vale
Packages = write-good, proselint

MinAlertLevel = suggestion

# Where to look for local vocabulary files.
Vocab = Local

# Define which styles to use for Markdown.
[*.{md,markdown,txt}]
BasedOnStyles = Vale, write-good, proselint

[*]
BasedOnStyles = Vale

# Disable any rules that are more annoying than useful
write-good.E-Prime  = NO
  3. Create the folder ~/.vale

  4. Run vale sync

  5. Create the folders ~/.vale/config and ~/.vale/config/vocabularies/Local

  6. Create the files ~/.vale/config/vocabularies/Local/accept.txt and
    ~/.vale/config/vocabularies/Local/reject.txt

Once you've completed these instructions, you can change your ~/.vimrc as
follows:

~/.vimrc

let g:ale_linters = {
    \ 'markdown': ['vale', 'markdownlint']
    \}

Harper

However, I found the above linters didn't highlight anything useful, and were
more an annoyance than anything else. I next turned to harper, which is not
supported by ALE, so I had to build in support for it.

I currently have a pull request open to add this to the main ALE repository,
and will update this post when it is merged. For now, you can use my fork if
you want to try it.

You can find instructions on how to install Harper
here

~/.vimrc

" Not the standard ALE repository!
Plug 'ahalbert/ale'
let g:ale_linters = {
    \ 'markdown': ['harper']
    \}
let g:ale_markdown_harper_config = {
\   'harper-ls': {
\       'diagnosticSeverity': 'warning',
\       'dialect': 'American',
\       'linters': {
\           'SpellCheck': v:false,
\           'SentenceCapitalization': v:true,
\           'RepeatedWords': v:true,
\           'LongSentences': v:true,
\           'AnA': v:true,
\           'Spaces': v:true,
\           'SpelledNumbers': v:false,
\           'WrongQuotes': v:false,
\       },
\   },
\}

Grammar Checking

While Harper was more sophisticated than the linters above, none of the above
tools really worked for me. I wanted a more sophisticated grammar checker for my
writing. I thought an LLM was ideally suited to this task, so I built my own
plugin ahalbert/vim-gramaculate to check my grammar.

Plug 'ahalbert/vim-gramaculate'

Gramaculate in action

You can then use :Gramaculate to check your grammar. By default, this uses a
local model with Ollama, but you can
read the docs
to configure it with any model you want, local or remote.

Writer's Heaven

I've found writing with vim a breeze since I got this all set up, and I hope
this guide helps you do the same. Between spellcheck, auto-completion,
formatting and grammar checking, vim becomes a surprisingly capable writing
environment that stays out of your way and lets you focus on the words. If you
have any suggestions for other plugins or workflows, feel free to reach out.


🚀GitHub Copilot CLI: AI Assistance from the Command Line for Infrastructure Deployments


 

  1. What is GitHub Copilot CLI?

GitHub Copilot CLI is an AI-powered assistant that runs directly inside your terminal.

Instead of manually writing commands, scripts, or debugging issues, you can simply describe what you want — and Copilot executes it.

Here’s the shift:

Traditional CLI      | Copilot CLI
---------------------|---------------------
You write commands   | You describe intent
You debug manually   | AI suggests fixes
You search docs      | AI brings context

For Cloud and DevOps engineers, this becomes extremely powerful because most of our work already happens in:

  • CLI (Azure CLI, Bash, PowerShell)
  • Infrastructure as Code (Terraform)
  • Pipelines (CI/CD)

Copilot CLI sits right in the middle of all this.

  2. Why This Matters for Cloud & DevOps

Here’s the thing — DevOps work is repetitive and context-heavy.

You constantly:

  • Write Azure CLI commands
  • Debug IaC (Infrastructure as Code) issues
  • Fix pipeline failures
  • Check logs and configs

Copilot CLI reduces this friction.

Example

Instead of:

az group create --name my-rg --location eastus

You can say:

Create a resource group in Azure named my-rg in East US

And it generates + executes the command.

Now scale this to:

  • Multi-region deployments
  • App Gateway configs
  • Terraform modules

That’s where it becomes a real productivity multiplier.

  3. Setting Up GitHub Copilot CLI

Prerequisites

  • GitHub Copilot Subscription:
    Active subscription required, including Individual, Business, or Enterprise plans with proper licensing
  • Supported Operating Systems:
    Supports Windows, macOS, and Linux for consistent development across major platforms
  • Software Version Requirements:
    Node.js version 22+, npm version 10+, and PowerShell 6+ on Windows are required for the npm-based installation

Installation

  • Cross-Platform Installation:
    Installing via npm provides a global Copilot CLI setup working consistently on Windows, macOS, and Linux
  • macOS and Linux Installation:
    Homebrew allows easy Copilot CLI installation on macOS and Linux with a single command
  • Windows Installation:
    WinGet enables native Windows package management for seamless Copilot CLI installation
  • System PATH Integration:
    All installation methods add Copilot CLI to system PATH for immediate terminal access

npm install -g @github/copilot

Login

copilot auth login

Start CLI

copilot

When you start, it will ask permission to trust the directory — this is important because it:

  • Reads files
  • Modifies code
  • Executes commands

 

  4. How to Use Copilot CLI (Real DevOps Examples)

Let’s move beyond basics.

🔹 Azure CLI Usage

Create an Azure App Service with Linux runtime and Node.js

Update an Application Gateway backend pool using az cli

Delete all resources in this resource group safely

🔹 Terraform Usage

Create a Terraform module for Azure VNet with 3 subnets

Fix issues in @main.tf

Explain this Terraform code and suggest improvements

🔹 Pipeline Debugging

Analyze this Azure DevOps pipeline YAML and fix errors

Why is this deployment failing?

🔹 File Context Usage

Explain @variables.tf

Optimize azure-pipelines.yml

🔹 Built-in Commands

  • /review → Code review
  • /context → Current context
  • /usage → Token usage
  • /compact → Optimize memory

 

  5. Advantages of Using GitHub Copilot CLI

  • Less context switching: no need to jump between docs, terminal, and browser
  • Faster troubleshooting: it understands errors, logs, and config files
  • Infrastructure automation: generate Terraform/Bicep or any IaC code, write Azure CLI scripts, and automate deployments
  • Acts like an agent: not just suggestions; it can execute commands, modify files, and run workflows
  6. MCP (Model Context Protocol) – The Real Power

This is where things get interesting.

MCP allows Copilot to connect with external systems.

👉 Think: APIs, documentation servers, automation tools

🔷 MCP Architecture (Diagram)

 

  7. Adding MCP Servers

You can extend Copilot using MCP servers.

Add MCP Server (Interactive)

/mcp add

Provide:

  • Name
  • Type (local or HTTP)
  • Command or URL

Example: Local MCP Server

npx @playwright/mcp@latest

Example: Remote MCP Server

https://mcp.context7.com/mcp

Config File Method

📁 ~/.copilot/mcp-config.json

{
  "mcpServers": {
    "docs": {
      "type": "http",
      "url": "https://mcp.context7.com/mcp",
      "tools": ["*"]
    }
  }
}

Manage MCP Servers

/mcp show
/mcp edit docs
/mcp enable docs
/mcp disable docs

Real DevOps Use Case

You can connect:

  • Azure documentation APIs
  • Internal tools
  • Monitoring systems

Then ask:

Fetch latest Azure App Gateway documentation

  8. CLI Workflow (How It Actually Works)

User Prompt
   ↓
Copilot CLI understands intent
   ↓
Reads files / context
   ↓
(Optional) Uses MCP tools
   ↓
Generates + executes commands
   ↓
Returns result

  9. Adding Skills (Underrated Feature)

Skills = reusable workflows.

Think of them like:

  • Predefined automation
  • Standardized instructions
  • Tool integrations

Example Skills

  • Run security scans
  • Validate pipelines
  • Analyze logs

How Skills Work

They are defined using:

  • Instruction files
  • Agent configurations
  • Repo-level context

Copilot automatically picks them when relevant.

 

  10. Customizing Copilot CLI

Config File

📁 ~/.copilot/config.json

What You Can Control

  • Permissions
  • Execution behavior
  • Tool access
  • Logging

Recommended: Custom Instructions

We can create custom instructions that Copilot will follow when acting on our prompts.

📁 .github/copilot-instructions.md

Example:

- Always use Terraform for infrastructure
- Prefer Azure CLI for automation
- Follow naming convention: env-app-region
- Use managed identity where possible

This ensures:

  • Consistency
  • Best practices
  • Governance

  11. Real-World Use Case Scenario

    You can find the sample instruction files and skills in my https://github.com/ibrahimbaig12345/GHCP_CLI_DEMO repo.
  • A POC for a storage account with private endpoints:
    We simply enter a prompt to create the storage account; Copilot takes all the details from its custom instructions file and generates a .sh file to execute the relevant commands for the resource creation.
  • Using Skills to do a routine security scan of the Azure subscription:
    You can give a simple prompt like "Can you help me with a quick security scan of my current azure subscription"

Copilot doesn’t just answer — it helps implement.

  12. Reference

For Azure and DevOps engineers, this is a big shift.

 


Searches


In Searches: Selfhood in the Digital Age, journalist Vauhini Vara explores how the technologies we use to understand the world—search engines, social platforms, and now AI systems—are also reshaping how we understand ourselves. Drawing from her own experience using chatbots to write about her sister’s death, Vara reflects on what happens when our most human questions, memories, and emotions are filtered through systems designed to analyze and monetize them. Humanities scholar Luca Messarra speaks with Vara about the promises and limits of machine understanding.

Grab your copy of Searches: Selfhood in the Digital Age: https://www.vauhinivara.com/searches

This conversation was recorded on 2/26/2026. Watch the full video recording at: https://archive.org/details/searches-book-talk

Check out all of the Future Knowledge episodes at https://archive.org/details/future-knowledge





Download audio: https://media.transistor.fm/5f42feda/04090356.mp3

.NET 11 Preview 2 Updates MAUI with Performance Improvements and Platform Refinements

1 Share

.NET 11 Preview 2 introduces a set of targeted updates to .NET Multi-platform App UI (MAUI), focusing on the Map control, binding performance, and API consistency. The changes are incremental but concrete, addressing specific usability and performance issues in XAML, data binding, and control behaviour.

By Edin Kapić