
Introducing the 'VSIX Manifest Designer' Visual Studio Extension


I'll be honest - I've been building Visual Studio extensions for years now, and one thing that has always bugged me is editing the source.extension.vsixmanifest file. Sure, Visual Studio has a built-in designer, but let's just say it hasn't aged gracefully. It feels clunky, looks outdated, and doesn't play nice with modern SDK-style VSIX projects (like the ones you can create with VsixSdk).

So, I did what any reasonable developer would do - I built my own.

Introducing "VSIX Manifest Designer", an extension for Visual Studio 2022 and 2026 that provides a clean, modern, WPF-based designer for your VSIX manifest files. It seamlessly integrates with Visual Studio's Light, Dark, and Blue themes, and has first-class support for SDK-style VSIX projects.

What Does It Do?

When you open a source.extension.vsixmanifest file, instead of getting the ancient built-in designer (or worse, raw XML), you get a clean sidebar-based interface with six sections to manage every aspect of your extension's manifest.

Metadata

This is where you configure the core identity of your extension - things like the ID, version, publisher, display name, and description. You can also set up your icon, preview image, tags, license, getting started guide, release notes, and more.

Metadata View

Installation Targets

Here you define which versions of Visual Studio your extension supports. You can specify target IDs (like Microsoft.VisualStudio.Community), version ranges, and even target specific architectures (AMD64, ARM64).

Installation Targets View
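
For reference, this is the shape of what the designer is editing under the hood - a hand-written sketch of the manifest's Installation element (the target ID, version range, and architectures are just examples):

<Installation>
  <InstallationTarget Id="Microsoft.VisualStudio.Community" Version="[17.0,18.0)">
    <ProductArchitecture>amd64</ProductArchitecture>
  </InstallationTarget>
  <InstallationTarget Id="Microsoft.VisualStudio.Community" Version="[17.0,18.0)">
    <ProductArchitecture>arm64</ProductArchitecture>
  </InstallationTarget>
</Installation>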

Dependencies

If your extension depends on other extensions, this is where you configure those relationships. You can add dependencies manually, reference installed extensions, or even point to project references for SDK-style projects.

Dependencies View

Prerequisites

Prerequisites define the Visual Studio components that must be installed for your extension to work. Think of things like the .NET desktop development workload or specific SDK components.

Prerequisites View

Assets

This is probably the section you'll use the most. Assets are the actual "stuff" your extension includes - things like your VsPackage, MEF components, analyzers, CodeLens providers, project templates, item templates, and more. The designer provides smart configuration based on the asset type you select, and includes a project picker for SDK-style projects.

Assets View
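
Behind the scenes, those selections become Asset elements in the manifest. A hand-written sketch of two common ones (the d:Source and Path values are the standard project-sourced placeholders, not necessarily what the designer emits):

<Assets>
  <Asset Type="Microsoft.VisualStudio.VsPackage" d:Source="Project" d:ProjectName="%CurrentProject%" Path="|%CurrentProject%;PkgdefProjectOutputGroup|" />
  <Asset Type="Microsoft.VisualStudio.MefComponent" d:Source="Project" d:ProjectName="%CurrentProject%" Path="|%CurrentProject%|" />
</Assets>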

Content

If you're building project or item templates, this section lets you configure the template declarations that tie your template assets together.

Content View

VsixSdk Integration

One thing I'm particularly proud of is the deep integration with VsixSdk. If you're using SDK-style projects for your extension development (and you should be!), the designer automatically detects this and provides smart project enumeration, automatic project reference handling for assets and dependencies, and proper template asset path validation.

Theme Support

I put a lot of effort into making sure this designer looks and feels like a native part of Visual Studio. All the UI controls use dynamic theme brushes via VsBrushes and VsColors, so whether you're a Light mode person, a Dark mode person, or one of those Blue theme people (no judgment), it'll look right at home.

A Few Caveats

This is the first release, and while I've been using it myself for a while now, I'm sure there are edge cases I haven't hit yet. If you run into any issues or have suggestions for improvements, please let me know!

Wrapping Up

If you're building Visual Studio extensions and you're tired of wrestling with the built-in manifest designer (or raw XML), give this a shot. It's designed to make your life easier, and it's built by someone who actually uses it every day.

Feel free to check it out, and let me know if you have any suggestions for it - I realize it could seem like it's "done", but you never know what ideas folks might have!

And, of course, it's open source, so feel free to peruse the source code, create issues, and have discussions on ways we can make this tool even better. PRs accepted, too, if you're into that sort of thing 😉.

Thanks for reading, friends!




Pulumi Agent Skills: Best practices and more for AI coding assistants


AI coding assistants have transformed how developers write software, including infrastructure code. Tools like Claude Code, Cursor, and GitHub Copilot can generate code, explain complex systems, and automate tedious tasks. But when it comes to infrastructure, these tools often produce code that works but misses the mark on patterns that matter: proper secret handling, correct resource dependencies, idiomatic component structure, and the dozens of other details that separate working infrastructure from production-ready infrastructure.

We built Neo for teams that want deep Pulumi expertise combined with organizational context and deployment governance. But developers have their preferred tools, and we want people to succeed with Pulumi wherever they work. Some teams live in Claude Code. Others use Cursor, Copilot, Codex, Gemini CLI, or other platforms. That is why we are releasing Pulumi Agent Skills, a collection of packaged expertise that teaches any AI coding assistant how to work with Pulumi the way an experienced practitioner would.

What are agent skills?

Skills are structured knowledge packages that follow the open Agent Skills specification. They work across multiple AI coding platforms including Claude Code, GitHub Copilot, Cursor, VS Code, Codex, and Gemini CLI. When you install Pulumi skills, your AI assistant gains access to detailed workflows, code patterns, and decision trees for common infrastructure tasks.

Available Pulumi skills

We are launching a set of skills organized into two plugin groups: migration and authoring. You can install all skills at once or choose specific plugin groups based on your needs.

Migration skills

Convert and import infrastructure from other tools to Pulumi. This plugin includes four skills covering complete migration workflows, not just syntax translation.

  • Terraform to Pulumi walks through the full migration workflow. It handles state translation, provider version alignment, and the iterative process of achieving a clean pulumi preview with no unexpected changes (see the CLI sketch after this list).

  • CloudFormation to Pulumi covers the complete AWS CloudFormation migration workflow, from template conversion and stack import to handling CloudFormation-specific constructs.

  • CDK to Pulumi covers the complete AWS CDK migration workflow end to end, from conversion and import to handling CDK-specific constructs like Lambda-backed custom resources and cross-stack references.

  • Azure to Pulumi covers the complete Azure Resource Manager and Bicep migration workflow, handling template conversion and resource import with guidance on achieving zero-diff validation.
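
For context, the Terraform path typically drives a loop like this sketch (these are real pulumi CLI flags, but the paths and language choice are just examples, and state import is a separate step the skill walks you through):

pulumi convert --from terraform --language typescript --out ./pulumi-app
cd pulumi-app && pulumi stack init dev
pulumi preview   # iterate until the preview is clean, with no unexpected changes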

Authoring skills

This plugin includes four skills focused on code quality, reusability, and configuration.

  • Pulumi best practices encodes the patterns that prevent common mistakes. It covers output handling, component structure, secrets management, safe refactoring with aliases, and deployment workflows. The skill flags anti-patterns that can cause issues with preview, dependencies, and production deployments (see the output-handling sketch after this list).

  • Pulumi Component provides a complete guide for authoring ComponentResource classes. The skill covers designing component interfaces, multi-language support, and distribution. It teaches assistants how to build reusable infrastructure abstractions that work across TypeScript, Python, Go, C#, Java, and YAML.

  • Pulumi Automation API covers programmatic orchestration of Pulumi operations. The skill explains when to use Automation API versus the CLI, the tradeoffs between local source and inline programs, and patterns for multi-stack deployments.

  • Pulumi ESC covers centralized secrets and configuration management. The skill guides assistants through setting up dynamic OIDC credentials, composing environments, and integrating secrets into Pulumi programs and other applications.
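
To make the output-handling point concrete, here is a minimal sketch (my example, not taken from the skill itself) of the kind of anti-pattern the best-practices skill flags:

import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

const bucket = new aws.s3.Bucket("site");

// Anti-pattern: an Output<string> is not a string, so this would log a placeholder, not the id.
// console.log(`bucket: ${bucket.id}`);

// Correct: resolve outputs with apply/interpolate so the value is used once it's known.
export const bucketUrl = pulumi.interpolate`s3://${bucket.id}`;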

How to install

Claude Code plugin marketplace

For Claude Code users, the plugin system provides the simplest installation experience:

claude plugin marketplace add pulumi/agent-skills
claude plugin install pulumi-migration # Install migration skills
claude plugin install pulumi-authoring # Install authoring skills

You can install both plugin groups or choose only the ones you need.

Universal installation

For Cursor, GitHub Copilot, VS Code, Codex, Gemini and other platforms, use the universal Agent Skills CLI:

npx skills add pulumi/agent-skills

This works across all platforms that support the Agent Skills specification.

Using skills

Once installed, skills activate automatically based on context. When you ask your assistant to help migrate a Terraform project, it draws on the Terraform skill’s workflow. When you are debugging why resources are being recreated unexpectedly, the best practices skill helps the assistant check for missing aliases.

In Codex and Claude Code, you can invoke skills directly via slash commands.

/pulumi-terraform-to-pulumi

Or describe what you need in natural language:

“Help me migrate this CDK application to Pulumi”

“Review this Pulumi code for best practices issues”

“Create a reusable component for a web service with load balancer”

The assistant will follow the skill’s procedures, ask clarifying questions when needed, and produce output that reflects Pulumi best practices rather than generic code generation.

Get started

We expect this collection to grow. If you have Pulumi expertise worth packaging, whether provider-specific patterns, debugging workflows, or operational practices, we welcome contributions. See the contributing guide for details.

The skills are available now at github.com/pulumi/agent-skills. Install them in your preferred AI coding environment and let us know what you build.


Azure Cosmos DB TV Recap – From Burger to Bots – Agentic Apps with Cosmos DB and LangChain.js | Ep. 111


In Episode 111 of Azure Cosmos DB TV, host Mark Brown is joined by Yohan Lasorsa to explore how developers can build agent-powered applications using a fully serverless architecture. This episode focuses on a practical, end-to-end example that demonstrates how transactional application data and AI-driven experiences can coexist on a single platform without introducing additional infrastructure or operational overhead.

The session walks through a sample application inspired by a familiar business scenario: a simple ordering system. What begins as a traditional REST-based business API evolves into an agent-enabled experience, where an AI agent can reason over real application data, maintain conversational memory, and take meaningful actions on behalf of users.

A Practical Agentic Architecture

At the core of the sample is Azure Cosmos DB, which serves as both the system of record for transactional data and the backbone for AI-powered interactions. Orders and business entities are stored in Azure Cosmos DB, enabling low-latency reads and writes while providing a consistent data model that can be reused across application layers.

The episode introduces an MCP (Model Context Protocol) bridge that exposes business data to AI agents in a structured and secure way. Rather than hard-coding database access into the agent itself, the MCP layer acts as an abstraction that allows the agent to discover available data and operations dynamically. This makes it easier to evolve the application over time while keeping the agent logic clean and portable.

LangChain.js is used to implement the agent, providing orchestration, reasoning, and chat memory. By combining LangChain.js with the MCP bridge, the agent can answer questions, retrieve order details, and reason about business context using the same data that powers the transactional API.
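
The episode's actual code lives in the linked sample, but the wiring looks roughly like this sketch - the package names, transport config, and endpoint URL here are my assumptions, not lifted from the demo:

import { AzureChatOpenAI } from "@langchain/openai";
import { MultiServerMCPClient } from "@langchain/mcp-adapters";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

// Connect to the MCP bridge that exposes the business API (URL is hypothetical)
const mcp = new MultiServerMCPClient({
  mcpServers: {
    orders: { transport: "http", url: "http://localhost:3000/mcp" },
  },
});

// The agent discovers the bridge's tools dynamically instead of hard-coding database access
const agent = createReactAgent({
  llm: new AzureChatOpenAI({ model: "gpt-4o" }),
  tools: await mcp.getTools(),
});

const result = await agent.invoke({
  messages: [{ role: "user", content: "What's the status of order 42?" }],
});
console.log(result.messages.at(-1)?.content);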

Walking Through the Sample

Screenshot from Azure Cosmos DB TV showing a live coding demo in Visual Studio Code. The editor displays a TypeScript project for an agent API using MCP and LangChain.js, with code creating an agent, loading tools, and streaming chat responses. Video thumbnails of Mark Brown and Yohan Lasorsa appear on the left side of the screen.

During the live demo, Mark and Yohan walk through the application architecture and then dive into the codebase. They show how the REST API is structured, how Azure Cosmos DB is accessed from the service layer, and how the MCP server exposes those capabilities to the agent. The discussion highlights how little additional code is required to move from a traditional API to an agent-enabled experience.

A key takeaway from the demo is how chat memory is handled. Rather than introducing a separate system for conversational state, the sample uses existing platform components to persist context in a scalable and reliable way. This reinforces the idea that agentic applications do not need an entirely new stack—they can be built by extending patterns developers already know.

Extending the Use Case

The episode closes with a discussion of possible extensions to the sample. These include adding richer queries, expanding agent capabilities, and taking advantage of additional Azure Cosmos DB features as the application grows. The same foundation can support more advanced scenarios, such as multi-agent workflows, deeper analytics, or integration with other AI services.

By keeping everything on a serverless footprint, the architecture remains easy to deploy, operate, and scale. Developers can move quickly from prototype to production without rethinking their data layer or introducing unnecessary complexity.

Watch the Episode

If you’re interested in building agentic applications that combine reliable transactional data with AI-driven interactions, this episode provides a clear and approachable blueprint.

Watch the full episode here: https://youtu.be/7faP1YPOFCA

Explore the related resources and sample code: https://aka.ms/lcjs/agent-mcp

Leave a Review

Tell us about your Azure Cosmos DB experience! Leave a review on PeerSpot and get a $50 gift. Get started here.

About Azure Cosmos DB

Azure Cosmos DB is a fully managed and serverless NoSQL and vector database for modern app development, including AI applications. With its SLA-backed speed and availability as well as instant dynamic scalability, it is ideal for real-time NoSQL and MongoDB applications that require high performance and distributed computing over massive volumes of NoSQL and vector data.

To stay in the loop on Azure Cosmos DB updates, follow us on X, YouTube, and LinkedIn.

The post Azure Cosmos DB TV Recap – From Burger to Bots – Agentic Apps with Cosmos DB and LangChain.js | Ep. 111 appeared first on Azure Cosmos DB Blog.


Getting Started with SQL Database Project Power Tools


SQL Database Project Power Tools is a free Visual Studio extension that makes working with SQL database projects easier and more productive. This guide will help you get started with the key features.

What is SQL Database Project Power Tools?

SQL Database Project Power Tools enhances your Visual Studio experience when working with SQL Server database projects. It provides a collection of useful tools for importing databases, comparing schemas, analyzing code, creating diagrams, and more.

Installation

You can install the extension in two ways:

  1. From Visual Studio: Open Visual Studio, go to Extensions > Manage Extensions, search for "SQL Database Project Power Tools", and click Install.

  2. From the Visual Studio Marketplace: Download and install from the Visual Studio Marketplace.

After installation, restart Visual Studio to activate the extension.

Creating a New SQL Database Project

SQL Database Project Power Tools adds project templates to make it easy to create new database projects.

New Project Templates

  1. In Visual Studio, select File > New > Project
  2. Search for "SQL" in the project templates
  3. Choose the SQL Server Database Project template
  4. Name your project and choose a location
  5. Click Create

You can also add new items to your project using the enhanced item templates:

New Item Templates

Importing a Database

One of the most useful features is the ability to import an existing database schema into your project. This saves you time by automatically generating all the necessary SQL scripts.

Import Database

To import a database:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Import database
  3. Enter your database connection details
  4. Choose the file layout for the imported objects
  5. Click Import

The tool will create all the necessary files in your project, organized by object type.

Comparing Schemas

The schema compare feature helps you keep your database project in sync with your live databases. You can compare in both directions:

  • Compare your project with a database to see what needs to be deployed
  • Compare a database with your project to update your project files

To use schema compare:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Schema compare
  3. Choose your comparison source database and target (project or database)
  4. Review the differences in the generated script
  5. Apply the changes as needed

This is especially useful when working in teams or managing multiple environments.

Analyzing Your Code

Static code analysis helps you find potential issues in your database code before deployment. The analyze feature checks your SQL scripts against best practices and common pitfalls.

To analyze your project:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Analyze
  3. Review the analysis report
  4. Address any issues found and improve your code quality

The analysis includes checks for design issues, naming conventions, performance concerns, and more. Consider adding this step to your regular development workflow.

Creating Entity/Relationship Diagrams

Visualizing your database structure is easy with the E/R diagram feature. This creates a Mermaid diagram showing the relationships between your tables.

E/R Diagram

To create a diagram:

  1. Right-click on your SQL database project in Solution Explorer
  2. Select SQL Project Power Tools > Create Mermaid E/R diagram
  3. Choose which tables to include
  4. The tool generates a Mermaid markdown diagram
  5. View the diagram in Visual Studio or use it for documentation

These diagrams are perfect for documentation and help team members understand the database structure.
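
The output is plain Mermaid markdown, so it renders on GitHub, in docs sites, or in any Mermaid-aware viewer. An illustrative snippet (hand-written, not actual tool output):

erDiagram
    CUSTOMER ||--o{ ORDER : places
    ORDER ||--|{ ORDER_LINE : contains
    PRODUCT ||--o{ ORDER_LINE : "appears in"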

Viewing .dacpac Files

The extension adds a Solution Explorer node for the output of your project (a .dacpac file), making it easy to explore its contents.

Solution Explorer node

To view a .dacpac file:

  1. Build your project
  2. Expand the project in Solution Explorer
  3. Browse through the XML files and the pre- and post-deployment scripts contained in the package

This is helpful when troubleshooting pre- and post-deployment script issues.

Scripting Table Data

When you need to include seed data in your database project, the Script Table Data feature generates INSERT statements for you.

To script table data:

  1. Select SQL Project Power Tools > Script Table Data
  2. Choose your data source and pick the table to script
  3. The tool generates INSERT statements for the table data
  4. The tool adds the generated script to your project in the Post-Deployment folder

This is based on the popular generate-sql-merge script.
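
Because it is based on generate-sql-merge, the generated script is idempotent and safe to rerun on every deployment. Its rough shape (the table and values are my example, not actual tool output):

MERGE INTO [dbo].[Status] AS [Target]
USING (VALUES
    (1, N'Active'),
    (2, N'Inactive')
) AS [Source] ([Id], [Name])
ON [Target].[Id] = [Source].[Id]
WHEN MATCHED THEN
    UPDATE SET [Name] = [Source].[Name]
WHEN NOT MATCHED BY TARGET THEN
    INSERT ([Id], [Name]) VALUES ([Source].[Id], [Source].[Name]);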

Accessing the Tools

All SQL Database Project Power Tools features are accessible from the context menu in Solution Explorer:

Power Tools Menu

Simply right-click on your SQL database project and look for the SQL Project Power Tools menu option.

Power Pack Extension

For even more features, consider installing the SQL Project Power Pack, which includes:

  • T-SQL Analyzer: Real-time code analysis with over 140 rules
  • SQL Formatter: Automatic code formatting with .editorconfig support

Tips for Success

  • Start with Import: If you have an existing database, use the import feature to get started quickly
  • Regular Schema Compares: Keep your project and database in sync by comparing regularly
  • Use Analysis: Run the analyzer before deploying to catch issues early, or integrate static code analysis into your CI/CD pipeline
  • Document with Diagrams: Create E/R diagrams to help your team understand the database structure
  • Version Control: Keep your database project in source control to track changes over time

Getting Help

If you need help or want to learn more:

Next Steps

Now that you're familiar with the basics:

  1. Create or import a database project
  2. Explore the various features
  3. Set up your development workflow
  4. Share your feedback to help improve the tool

Happy database development!


In Defense Of Merge: Non-Deterministic Updates




Video Summary

In this video, I delve into the often maligned `MERGE` statement in SQL Server, defending it against its critics and highlighting its unique benefits. You might be surprised to learn that `MERGE` can actually help catch non-deterministic updates—something a standard `UPDATE` query won’t do. By demonstrating how `MERGE` issues a warning when you attempt a non-deterministic update, I aim to show why this statement deserves more respect in your database toolkit. The video also covers practical solutions for making these updates deterministic using window functions like `ROW_NUMBER()`, ensuring that the data integrity and consistency you expect from SQL Server are maintained. Whether you’re looking to improve query reliability or just want to understand a lesser-used feature better, there’s something here for everyone who deals with complex data operations in SQL Server.

Full Transcript

It may be very little surprise to you. I am Erik Darling with Darling Data. But what may surprise you, may rock you to your very core, throughout your soul, is today I am going to defend the merge statement in SQL Server. I know, I know, no one will ever defend it. Right? People will hate on it and other people will say, it’s not that bad. No one ever defends it. I know, no one ever sticks up for merge. I’m going to stick up for merge today. Because, uh, I think there’s a really interesting thing that merge will do, if you write a non-deterministic update, that a normal update will not do. Maybe it should, because shouldn’t everything in a database be deterministic? Doesn’t, does ACID mean nothing to you people? What’s wrong with you? Down in the video description, which is my favorite part of the video, there’s all sorts of helpful links. Like, uh, ones where you can hire me for consulting, buy my training, become a supporting member of the channel for as little as $4 a month. You too can, uh, I don’t even know what, what that would do. But it’s not about what $4 a month does, like, like on its own. It’s what $4 a month does from all of you in the aggregate. Right? That’s, that’s where things get special and interesting. So, gang up on me. Uh, you can also ask me office hours questions, because I like answering questions.

Especially good questions. If you have a good question, that’s even better. I’ll give you a high five on that one. And of course, as usual, please do, uh, like, subscribe, uh, tell, tell your mom about my YouTube channel. She’ll probably dig it. Uh, I’m a, I’m a, I’m a likable fella. Uh, you know, sometimes, uh, um, you should drink when I do these. Anyway, uh, I’ll be at a couple of events coming up in March, which is, you know, a few months. Two months off at this point. I guess like four months off at this point. But hey, um, that gives you, buy a ticket now. So, you know, it’s like verge of selling out probably at this point. So might as well get in there before, like all the butts are in the seats and none of those butts are yours. Right? So, uh, Data Tune in Nashville, March 6th and 7th. Data Saturday, Chicago, March 13th and 14th. I’ll be doing pre-cons at those: advanced T-SQL. You know, the stuff that will blow your mind. Um, it won’t melt your brain. I hate when people say melt your brain. I think that that’s what the kids call cringe, but you are, you can’t, you can learn a lot from a dummy. So you should, should probably do that. Anyway, we are feeling festive today, ain’t we?

Okay. So let’s, let’s go defend merge. So, uh, I’m going to make a table here, a temp table. Temp table is not the problem. The temp table is just fine. Uh, but the problem becomes when we try to update this temp table, right? So we’re just going to put one row in there because all we need to prove this theory is one single row. Right? Uh, and we’re going to use Jon Skeet because Jon Skeet’s just real easy to pick on in Stack Overflow data. Jon Skeet causes, I mean, for his, I’m going to say he causes a lot of query problems, right? Everything Jon Skeet gets involved in, everything blows up. So the thing here, oh, you know what? I’m in the wrong database. We gotta move that over. There we go. Now it makes more sense. Life is, life is grand now. Right? So let’s, uh, let’s insert that one row again. Oh, professional. And, uh, so the, the point here is, uh, if we turn on query plans and we look at the plan for this query, uh, we will see that this query produces a parallel execution plan. Right? And by parallel execution plan, I mean, it, it looks like this, right? Where there’s, there’s parallelism. So great. Right? Uh, everything went parallel and everyone was happy. So that’s, that’s, that’s one of the issues that comes up here. And the, what that issue contributes to is that if we try to update this repeatedly, right?

If we run this update multiple times, SQL Server will update our table with multiple, with different rows on almost every execution. Right? So if we run this pair of queries together, and wait a second for it to run, it does some stuff. Uh, we get post ID. Oh, let’s do our special zoom here. We get post ID 156426. And if we run this again, the amount of time this takes is better work. Uh, there we go. 157291.

And if we run it a third time, I bet that you’re going to be surprised. We get 148352. This is a non-deterministic update. And by non-deterministic, I mean, uh, we don’t know what row is going to end up as the update values, right? Now, how I’m here to defend merge and why I’m here to defend merge is because merge, unlike the UPDATE FROM syntax, merge will warn you about these things.

Merge will say, Hey pal, uh, something’s amuck here, right? So we’re going to merge into our temp table, and we are going to use our select query as the source. And we’re going to say when matched, uh, that didn’t go well. We’re going to say when matched and exists, uh, a difference between the sets of columns in here, then we will update the table with those columns. And if we run this, let me wait a second or another second. There we go. Look, it happened. Oh, oh, merge.

Thank you. Thank you for caring about ACID, merge, but you could have been a little bit, maybe pithier with this error that we got here. The merge statement attempted to update or delete the same row more than once. This happens when a target row matches more than one source row. A merge statement cannot update/delete the same row of the target table multiple times. Refine the on clause to ensure a target row matches at most one source row, or use the group by clause to group the source rows. Oh, merge. Thank you. I mean, you could have just rewritten the whole query for us by the time you spit that book out. But here we are, here we are. And I mean, sure we can, you know, maybe it’s a sign that we could aggregate some stuff or we should have done things differently, but at least it’s nice that merge tells us like, Hey dummy, something’s gonna look weird if you keep doing this, like you’re screwing up. Right? So the way that we can fix this, and we can use merge here for this, is we can use row number, the row number function, to make sure that we limit this to one row. Right? And, and something that is particularly important here is that we do not rely on what is essentially a non-unique column, right? Because we could have, we could have multiple last activity dates, right? This is not guaranteed to be unique, so we need a unique tiebreaker involved in our window function. So now if we run this merge, and I’m going to, I guess, pause here to point out what’s new is this thing here where we are saying we only want where row number equals one, right? We can only put one possible thing in here. Merge executes successfully, right? We, we still get a parallel plan. All this mergey stuff still happens, but we no longer get that error. And if we run this, we will get back a completely different row than any of the rows we saw before. So when we made this deterministic, not only did we get the same value updated in this temp table, but we actually got what is probably the correct value for our temp table, which is fantastic. And sure, there are other ways you could fix this query. Every time I point out how to fix something, someone says, you could also, you could also, I know there are other ways you could rewrite this to fix it. You could even rewrite the update query with the row number to fix it.

I, I, I, I understand that. The whole point here is that merge will warn you when you write a non-deterministic update or delete, apparently. According to the error message, deletes also have to be deterministic. So if you write a non-deterministic update or delete, merge will warn you. Normal update and delete syntax will not. Granted, it’s probably less of a big deal with deletes because you’re like, well, it’s already gone. Who cares if you try to do it again, right? But that’s what I, that’s my defense of merge today. You can, you can hate it all you want, but at least it does that for you. Okay. Well, thank you for watching. I hope you enjoyed yourselves. I hope you learned something and I will see you tomorrow for something else. All right. Goodbye.
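
If you'd rather read than watch, here's a condensed sketch of the demo against the Stack Overflow sample database (the column names come from that schema; the exact queries in the video may differ):

CREATE TABLE #t (UserId int NOT NULL, PostId int NOT NULL);
INSERT #t VALUES (22656, 0); /* one row; 22656 is Jon Skeet's Stack Overflow user id */

/* Non-deterministic: many posts share an OwnerUserId, so a different PostId can win each run */
UPDATE t
   SET t.PostId = p.Id
FROM #t AS t
JOIN dbo.Posts AS p
  ON p.OwnerUserId = t.UserId;

/* The equivalent MERGE refuses to run, raising error 8672 */
MERGE #t AS t
USING (SELECT p.Id, p.OwnerUserId FROM dbo.Posts AS p) AS s
   ON s.OwnerUserId = t.UserId
 WHEN MATCHED AND t.PostId <> s.Id
 THEN UPDATE SET t.PostId = s.Id;

/* Deterministic fix: ROW_NUMBER with a unique tiebreaker (Id) limits the source to one row per target */
MERGE #t AS t
USING
(
    SELECT p.Id, p.OwnerUserId,
           n = ROW_NUMBER() OVER
               (PARTITION BY p.OwnerUserId
                    ORDER BY p.LastActivityDate DESC, p.Id DESC)
    FROM dbo.Posts AS p
) AS s
   ON s.OwnerUserId = t.UserId
  AND s.n = 1
 WHEN MATCHED AND t.PostId <> s.Id
 THEN UPDATE SET t.PostId = s.Id;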

Going Further


If this is the kind of SQL Server stuff you love learning about, you’ll love my training. I’m offering a 25% discount to my blog readers if you click from here. I’m also available for consulting if you just don’t have time for that, and need to solve database performance problems quickly. You can also get a quick, low cost health check with no phone time required.

The post In Defense Of Merge: Non-Deterministic Updates appeared first on Darling Data.


Game Dev in 2025: Excerpts From the State of Game Development Report


As we approach the midpoint of the decade, game developers face an evolving landscape shaped by shifting job security, technology choices, platform strategies, and practical AI adoption. Our State of Game Development 2025 report reveals critical insights that provide a clearer picture of where the industry is heading.

Layoffs and job security concerns

The game development sector has experienced significant turbulence. More than half of industry professionals reported experiencing layoffs within their organizations. This figure is notably higher than in previous years. Job security has sharply declined, placing game developers among the least confident in the tech industry. Studios may need to revisit their organizational strategies, hiring practices, and retention policies to foster stability.

Engine preferences: Unity still leads, but Godot is rising

Unity remains dominant among indie and mid-sized studios due to its versatility and ease of use. Unreal Engine remains a strong competitor, especially for graphics-intensive projects. However, the most notable trend is the rapid growth of Godot, an open-source engine that has become increasingly popular among indie developers and hobbyists for its flexibility, openness, and community-driven nature.

Platform priorities: Mobile and desktop dominate

Indie developers maintain their focus primarily on mobile and desktop platforms, with Android being the most targeted, closely followed by Windows and iOS. This aligns with the industry’s shift toward platforms that offer wider user bases and easier market entry points compared to traditional consoles.

IDEs and developer tools: JetBrains Rider emerges as a favorite

In 2025, JetBrains Rider emerged as the top IDE for indie developers, surpassing both Visual Studio and VS Code. Rider’s robust feature set and seamless workflow integration have made it the preferred tool for daily coding.

Practical AI adoption: From experimental to essential

AI technology has moved beyond experimentation to become a standard part of game development workflows. Nearly half of developers regularly use AI for feature implementation, and many leverage it for streamlined code reviews. ChatGPT, GitHub Copilot, and JetBrains AI Assistant have become staples in developers’ toolkits, while Junie, JetBrains’ AI coding agent released in April 2025, is quickly gaining popularity. AAA studios show a strong interest in incorporating AI coding agents into their pipelines.


Check out the full report to dive deeper into these game-changing trends and leverage our data to benchmark your studio’s strategies and decisions against current industry standards.
