
Goodbye Create React App, Hello TanStack Create React App


Create React App was deprecated on Feb. 14 after nearly a decade of use for building React apps. It will continue to work in maintenance mode.

“Today, we’re deprecating Create React App for new apps, and encouraging existing apps to migrate to a framework, or to migrate to a build tool like Vite, Parcel, or Rsbuild,” wrote Meta’s Matt Carroll and Rickey Hanlon.

The Create React App (CRA) library was released in 2016 when there was no clear way to build a new React app, the two wrote. It combined several tools into a single recommended configuration to simplify app development, allowing developers to quickly spin up a React project. It included a basic file structure for the website and a development server to run the website locally for easy development.

“This allowed apps a simple way to upgrade to new tooling features, and allowed the React team to deploy non-trivial tooling changes (Fast Refresh support, React Hooks lint rules) to the broadest possible audience,” they wrote. “This model became so popular that there’s an entire category of tools working this way today.”

So… why end a popular tool?

The blog post outlined CRA’s problems, including that it’s difficult to build high-performance production apps with it. It also noted that Create React App does not offer specific options for routing, data fetching or code splitting.

“In principle, we could solve these problems by essentially evolving it into a framework,” they wrote.

But that raises what may be the biggest challenge for CRA: It has no active maintainers.

So, the team is recommending that developers create new React apps with a framework.

“All the frameworks we recommend support client-side rendering (CSR) and single-page apps (SPA), and can be deployed to a CDN or static hosting service without a server,” they added.

They offer links to migration guides for Next.js and React Router, as well as from Expo webpack to Expo Router.

“If your app has unusual constraints, or you prefer to solve these problems by building your own framework, or you just want to learn how React works from scratch, you can roll your own custom setup with React using Vite, Parcel or Rsbuild,” they added.

The recently released 2024 State of React survey ranked CRA as the third most-used tool, behind the Fetch API and useState. Eighty-nine percent of the 6,240 developers who responded about CRA had used the tool, but nearly 30% of that group reported negative sentiment about it. Only 15% expressed positive sentiment, and 44% of frontend developers expressed no sentiment at all.

New Create React App for TanStack Router

In related news, TanStack recently added an open source Create React App for TanStack Router, designed to be a drop-in replacement for Create React App. It allows developers to build single-page applications on top of TanStack Router.

To help accelerate the migration away from create-react-app, the team has created the create-tsrouter-app CLI, a plug-and-play replacement for CRA.
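Getting started should follow the familiar create-* pattern. The exact invocation below is an assumption based on standard npx conventions, so check the project README for current options:

npx create-tsrouter-app@latest my-app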

“What you’ll get is a Vite application that uses TanStack Router,” the project notes state. “create-tsrouter-app is everything you loved about CRA but implemented with modern tools and best practices, on top of the popular TanStack set of libraries.”

That includes TanStack Query, an asynchronous state management library for TS/JS, React, Solid, Vue, Svelte and Angular, and TanStack Router for React and Solid applications.

It’s available under the MIT license.

Anaconda Offers New Open Source AI Tool

Anaconda introduced a new open source AI data tool on Wednesday that lets data science and development teams explore, transform and visualize data using natural language.

It’s called Lumen AI, and it’s an agent-based framework for “chatting with data” and retrieval-augmented generation (RAG). The goal is to make advanced data workflows more intuitive and scalable, according to a post announcing the news.

“AI-driven, agent-based systems are rapidly changing how businesses operate, but many organizations still struggle with technical barriers, fragmented tools, and slow, manual processes,” Kodie Dower, senior marketing communications manager, wrote. “Lumen eliminates those roadblocks by giving users an AI-powered environment to quickly generate SQL queries, analyze datasets, and build interactive dashboards — all without writing code.”

Dower added that Lumen can:

  • Create visualizations such as charts, tables and dashboards without coding;
  • Generate SQL queries and transform data across local files, databases, and cloud data lakes;
  • Support collaboration with serialized and shared workflows;
  • Inspect, validate and edit AI-generated outputs to ensure data accuracy and clarity;
  • Support custom tools and AI agents.

“The declarative nature of Lumen’s data model makes it possible for LLMs to easily generate entire data transformation pipelines, visualizations and many other types of output,” the repository explained. “Once generated the data pipelines and visual output can be easily serialized, making it possible to share them, to continue the analysis in a notebook and/or build entire dashboards.”

Vercel Adds Support for React Router v7 Apps

React Router version 7 is a bit different from its previous iterations in that it’s also a framework, following its merger with Remix. This week, Vercel announced it will support React Router v7 applications when used as a framework, including support for server-rendered React Router applications using Vercel’s Fluid compute.

The post Goodbye Create React App, Hello TanStack Create React App appeared first on The New Stack.


292: VS Code Friend or Foe… Azure Data Studio Murdered


Welcome to episode 292 of The Cloud Pod – where the forecast is always cloudy! This week Justin and Jonathan are a dynamic duo, bringing you all the latest news – and sound effects – because it’s earnings time! Plus we’ve got news from VS Code, Azure Data Studio, CodeBuild and more.

Titles we almost went with this week:

  • The Cloud Pod Renames Cloud Earnings to ‘The Gulf of Capex’
  • Sorry Elon, OpenAI Doesn’t Want Your Pocket Change
  • MacOS gets into the Fastlane for Oil Changes

A big thanks to this week’s sponsor:

We’re sponsorless! Want to get your brand, company, or service in front of a very enthusiastic group of cloud news seekers? You’ve come to the right place! Send us an email or hit us up on our slack channel for more info. 

General News

It’s earnings time! 

01:29 Alphabet is planning to spend big on AI again this year, sending shares down

  • Alphabet earnings were a bit of a letdown, with cloud revenue missing and the announcement of $75 billion in CapEx spending this year (DeepSeek who?)
  • Consolidated revenue rose 12% in the period to $96.5 billion.
  • CapEx investments of $75 billion shocked analysts, who expected $57.9 billion.
  • EPS was $2.15 vs. $2.13 expected.
  • Revenue of $96.5 billion vs. $96.62 billion expected.
  • Ad revenue rose to $72.46 billion vs. $71.3 billion expected; YouTube advertising revenue was $10.47 billion vs. $10.22 billion.
  • Google Cloud was $12.0 billion vs. an expectation of $12.19 billion.

02:09 Jonathan – “I’m guessing ad revenue is gonna be down again, Q1, Q2 because I think a lot of ad revenue is driven by the election season. So that’s not looking too good for them.”

03:13 Microsoft GAAP EPS of $3.23 beats by $0.13, revenue of $69.6B beats by $790M

  • Microsoft followed up with similarly weak growth in its Azure cloud computing unit.
  • EPS was $3.23, beating expectations by $0.13.
  • Revenue of $69.6B, beating by $790M.
  • Intelligent Cloud revenue was $25.5 billion, an increase of 19%.
  • Microsoft indicated they plan to spend $80 billion in CapEx for AI and data center growth.

04:02 Justin- “Also international expansion still, I think a big area too, particularly for Azure and Google and even Amazon. Like they’re all announcing more and more regions, more expansion of data centers, lots of laws that are going to pass for data sovereignty that they have to deal with. there’s, there’s spend everywhere.”

04:23 Amazon earnings recap: Stock falls as guidance falls short, CFO indicates capex of more than $100 billion in 2025 

  • Amazon followed its peers by indicating it will invest $100B in CapEx for Amazon’s AI efforts on AWS.
  • CEO Andy Jassy said that AWS could grow faster if they were not hindered by data center capacity… which is really interesting. We’re assuming GPU capacity.
  • Amazon reported sales of $187.79B, beating estimates of $187.32 billion; EPS was $1.86 compared to $1.50 expected.
  • AWS was a little light compared to estimates at $28.79B vs. expectations of $28.82 billion, but what’s $30 million between friends?
  • Amazon guided lighter than analysts expected at $151B to $155.5 billion, vs. expectations of $158.64 billion. The stock was also penalized in after-hours markets.

06:04 Justin- “I would assume inference, you know, becomes the bigger area of investment long-term, but short-term, you know, you need to train. they, I think a lot of their stuff, they’ve like training them and those things were really focused primarily at training first. So inference seems to be where everyone’s spending most of their money these days.”

AI Is Going Great – Or How ML Makes All Its Money  

06:39 OpenAI CEO Appears to Reject Elon Musk’s $97 Billion Takeover Bid 

  • Elon recently made an unsolicited bid to buy OpenAI for $97.4 billion.
  • On Monday, Sam Altman rejected the offer.
  • Altman told his staff that Musk’s effort was “embarrassing” and not in the best interest of the OpenAI mission to develop artificial general intelligence to benefit humanity.
  • Altman also declared that this is Musk’s attempt to slow down a competitor.
  • This does cause some complications, as OpenAI continues to plan its shift away from its non-profit roots.
  • If the plan is for the non-profit to sell the for-profit business, this bid makes the internal sale of the assets more expensive.

07:42 Jonathan – “It’s interesting that he made a bid. I mean, I don’t think he would have actually followed through on it personally. Now he’s got XAI and Grok 3 coming out soon and those other things. I agree with Sam Altman that it was probably just a distraction to mess with things. But he has drawn a line in the sand at $97.4 billion though.”

08:41 Introducing the intelligence age 

  • Super Bowl ads were all the rage over the weekend, during the drubbing of the KC Chiefs by the Philadelphia Eagles, 40-22.
  • Justin was really hoping for cloud commercials to talk about, but they didn’t materialize (and we DO NOT count Google’s android ads). 
  • *But* OpenAI debuted their first ever ad. View it here
  • We’re interested in what you think! Let us know on social or via our Slack channel what you thought of the ad.

10:03 Jonathan – “I actually liked the look of it. The first time I saw it, I was like, this is a bit strange. But I liked the halftone look. reminds me of newspaper print and news unfolding over the years. It was kind of neat. I’m glad I didn’t spend the extra 8 million on another 30 seconds, though, and showing the doom that’s going to come out and the poverty. Yeah, like the desolate wasteland of Earth after nobody’s got a job anymore.”

12:32 OpenAI’s secret weapon against Nvidia dependence takes shape 

  • OpenAI is in the final stages of designing its long-rumored AI processor, with the aim of decreasing the company’s dependence on Nvidia hardware, per Reuters.
  • OpenAI plans to leverage TSMC (Taiwan Semiconductor Manufacturing Co.) for fabrication within the next few months, but the chip has not yet formally been announced.
  • The first chip will use TSMC’s 3-nanometer process.
  • The chips will incorporate high-bandwidth memory and networking features similar to those found in NVIDIA processors. 
  • Initially the first chips will focus on running models (inference) rather than training them, with limited deployment across OpenAI.  
  •  The goal is for mass production to start in 2026. The hardware will likely end up in Stargate and/or Microsoft data centers.

13:57 Justin – “I’m actually shocked just this long for them to announce that they were doing their own chip and to, you know, they actually haven’t announced it technically, but you know, rumors come out that they’re doing one. There’s been some scuttlebutt about it, but this is a pretty firm, you know, research paper by the, or news article by the Reuters. So yeah, very interesting.”

AWS  

15:04 AWS CodeBuild for macOS adds support for Fastlane 

  • Fastlane for AWS CodeBuild has now come to the macOS environments.
  • Fastlane is an open source tool suite designed to automate various aspects of mobile app development.  
  • It provides mobile app developers with a centralized set of tools to manage tasks such as code signing, screenshot generation, beta distribution and app store submissions. 
  • Fully integrated with popular CI and CD platforms, it supports iOS and Android development workflows.
  • Previously you could install Fastlane on your CodeBuild for macOS installs, but it was undifferentiated heavy lifting; now you get it installed by default.

16:16 Introducing JSONL support with Step Functions Distributed Map

  • AWS Step Functions is expanding its capabilities of Distributed Map by adding support for JSONL (JSON Lines)
  • JSONL, a highly efficient text-based format, stores structured data as individual JSON objects separated by newlines, making it particularly suitable for large datasets. 
  • This allows you to process large collections of items stored in JSONL format directly through Distributed Map, and optionally export the output of the Distributed Map as a JSONL file. (A minimal example of the format follows this list.)
  • The enhancement also introduces support for additional delimited file formats, including semicolon and tab-delimited files, providing greater flexibility in data source options. 
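As a minimal illustration of the format, each line in a JSONL file is a complete, self-contained JSON object (the values here are made up):

{"id": 1, "item": "widget-a", "qty": 12}
{"id": 2, "item": "widget-b", "qty": 7}
{"id": 3, "item": "widget-c", "qty": 42}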

16:51 Jonathan – “That’s really cool, actually, because thinking about streaming data, like log data, everyone’s moved to JSON logs, except now we just emit a text event with valid JSON, but it goes into the same file. So JSON lines are very much, I think, designed for log handling, log scanning, looking for patterns there. So this is really nice. It means we don’t have to have a separate Lambda function that reads in a 50 gigabyte file and breaks it into pieces first.”

GCP

19:08 BigQuery datasets now available on Google Cloud Marketplace

  • Google is announcing datasets on the Google Cloud Marketplace through BigQuery Analytics Hub, opening up new avenues for organizations to power innovative analytics use cases and procure data for enterprise business needs.
  • Google Cloud Marketplace offers centralized procurement for a wide array of enterprise apps, foundational AI models, LLMs, and now commercial and free datasets from third-party data providers and Google.
  • Combined with BigQuery Analytics Hub, you can enable cross-organizational zero-copy sharing at scale, with governance, security and encryption all built in natively.

19:57 Jonathan – “I think they’re slowly putting them back again by court order. yeah, I guess Google has the advantage here though, because they don’t have to copy the data. They make it, they keep one copy and everyone has access to it. Whereas Amazon, I don’t think quite got there yet, did they?”

20:51 Announcing public beta of Gen AI Toolbox for Databases

  • Google is launching the public beta of Gen AI Toolbox for Databases in partnership with LangChain, the leading orchestration framework for developers building applications with large language models.
  • Gen AI Toolbox is an open-source server that empowers application developers to connect production-grade, agent-based generative AI applications to databases, streamlining the creation, deployment and management of sophisticated gen AI tools capable of querying databases with secure access, robust observability, scalability and comprehensive manageability.
  • It can currently connect to self-managed PostgreSQL and MySQL, as well as managed offerings like AlloyDB, Spanner, and Cloud SQL for PostgreSQL, MySQL and SQL Server.

22:32 Rightsize your Memorystore for Redis Clusters with open-source Autoscaler

  • Last year Google gave us Memorystore for Redis Clusters with the ability to manually trigger scaling out and down.
  • Now, to meet the elastic nature of modern Memorystore workloads, they are excited to announce the open-source Memorystore Cluster Autoscaler, available on GitHub, which builds on the open source Spanner autoscaler from 2020.
  • The autoscaler consists of two components, the Poller and the Scaler, which monitor the health and performance of the Memorystore cluster instances via Cloud Monitoring.
  • Justin specifically appreciates this, but it’s a hack, and should be something they build into the service long term. But we remember AWS had this moment too at one point where they would give you automation solutions and then deliver full automation in the service a year or two later. 

23:05 Justin  – “I’d really like you to just build this into the product. Like why is this an open source thing that I have to run on my own server or infrastructure. But yeah, in fairness to Google, Amazon used to do this too. They would build like these custom solutions that they put on their GitHub thing. And then eventually a lot of people downloaded those things. Those eventually became future products within a couple of years.”

24:08 Gemini 2.0 is now available to everyone       

  • Google has made 2.0 Flash available to all users of the Gemini app on desktop and mobile, helping everyone discover new ways to create, interact and collaborate with Gemini.
  • Google is also making the updated Gemini 2.0 Flash generally available via the Gemini API in Google AI Studio and Vertex AI. Developers can now build production applications with 2.0 Flash.

24:32 Jonathan – “It’s quite a stretch to say build production applications. I mean, I guess you can build applications, maybe if you’re lucky. I played with Gemini 2, and I played with their deep research. Gemini’s 1.5 deep research offering a few days ago. I think it’s got a way to go. I don’t think it’s quite there with OpenAI’s version of the same thing just yet.”

Azure

27:24 Azure Data Studio Retirement 

  • Microsoft announced the upcoming retirement of Azure Data Studio (ADS) on February 6th, 2025, as it focuses on delivering a modern, streamlined SQL development experience.
  • ADS will remain supported until February 28th, 2026, giving developers ample time to transition.
  • This decision aligns with their commitment to simplifying SQL development by consolidating efforts on VS Code with the MSSQL extension, a powerful and versatile tool designed for modern developers.
  • But why… Well:
    • They want to focus on innovation, and VS Code provides a robust platform.
    • Streamlined tools eliminate duplication, reduce engineering and maintenance overhead, and accelerate feature delivery, ensuring developers have access to the latest innovations.
  • Transitioning to VS Code gets you a modern development environment and a comprehensive set of MSSQL extension features:
    • Execute queries faster with filtering, sorting and export options for JSON, Excel and CSV.
    • Manage schemas visually with Table Designer, Object Explorer and support for keys, indexes and constraints. 
    • Connect to SQL Server, Azure SQL (all offerings), and SQL database in Fabric using an improved Connection Dialog
    • Streamline development with scripting, object modifications, and a unified SQL experience
    • Optimize performance with an enhanced Query Results Pane and execution plans. 
    • Integrate with DevOps and CI/CD pipelines using SQL Database projects. 

29:30 Justin – “Visual Studio is an anchor. It’s so big. It’s so complicated. And if you’re trying to get people to do modern.net development with C sharp, you don’t need all that bloat. Like that, they’re still supporting WCF frameworks which are 20 years old at this point. You don’t need that in modern .NET web development. So it makes sense to me that they’re divorcing themselves from Visual Studio.”

Off Topic 

35:55 Gulf of America name change in the U.S. — what you’ll see in Maps 

  • If anyone knows of a plugin that will put it back for Chrome… we’re all ears.
  • Google has updated the Gulf of Mexico to Gulf of America for those in the US. If you are in Mexico you’ll still see the Gulf of Mexico, and if you’re in the rest of the world you’ll see the Gulf of Mexico (Gulf of America). 
  • This is fine. 

04:38 NotebookLM Plus is now available in the Google One AI Premium subscription

  • NotebookLM is a research and thinking companion designed to help you make the most of your information.  You can upload material, summarize it, ask questions and transform it into something engaging, like a podcast-style audio discussion.  NotebookLM can help you ace a career certification, generate ideas or synthesize data for a project.
  • Google has added NotebookLM Plus to the Google One AI Premium plan, a version with higher usage limits and premium features for even more customized research.

Closing

And that is the week in the cloud! Visit our website, the home of The Cloud Pod, where you can join our newsletter and Slack team, send feedback, or ask questions at theCloudPod.net, or tweet at us with the hashtag #theCloudPod





Download audio: https://episodes.castos.com/5e2d2c4b117f29-10227663/1979875/c1e-8m9mb959o6i4gp59-rkzj457nfkk6-qnycsw.mp3

BONUS Creating Psychological Safety at Work With Mehmet Baha


BONUS: How to create psychological safety at work with Mehmet Baha

 

In this BONUS episode, we talk about the essential elements of creating psychological safety at work with Mehmet Baha, one of Facebook's first European employees and a renowned psychological safety expert. Drawing from his extensive experience and new book, Baha shares practical tools, inspiring examples, and thought-provoking insights that can help transform workplace culture.

This episode builds on Mehmet’s guest blog post on the concepts we discuss. You can find Baha’s guest blog post on psychological safety here

The Power of the 16-32-64 Framework

The conversation begins with Baha introducing his innovative "16-32-64" framework, a comprehensive approach that combines head (cognition), heart (emotion), and hands (practice). This framework provides 16 inspiring examples, 32 practical tools, and 64 questions designed to foster psychological safety in the workplace. Baha explains how this interactive approach encourages readers to reflect and take action, emphasizing the importance of collaboration with professionals worldwide in developing these thought-provoking questions.

 

"When I was writing the book, I thought about how I could make this interactive. How can I make readers take action? So, I realized that questions are a powerful tool to help readers to reflect and take these ideas into action."

Debunking Psychological Safety Myths

Baha addresses three major misconceptions about psychological safety that often hinder organizational progress. He emphasizes that psychological safety isn't just about being nice or avoiding difficult conversations, nor is it solely about encouraging conflict. Through real-world examples, including a compelling story about a mining company leader who transformed workplace safety through psychological safety principles, Baha demonstrates how this approach can drive tangible business results.

 

"According to research, when employees feel that their opinions count, we see happiness, reduction in turnover and more productivity."

Practical Tools for Immediate Implementation

The discussion explores two powerful tools from Baha's collection of 32 practical approaches. The first tool, "Movers, Movables, Immovables," (from Jason Little’s Lean Change Management) helps leaders navigate resistance to change by identifying different employee groups and focusing on achievable wins. The second tool, the "Green Card" technique, demonstrates how one director successfully transformed a silent team into an engaged workforce by explicitly encouraging dissent.

 

"Don't spend so much of our time with resistance. Find the movers, and find the small-wins."

 

The Journey of Self-Reflection

Baha emphasizes the critical role of self-reflection in developing psychological safety. He outlines a progression from awareness to action, stressing that leadership transformation begins with self-awareness and requires consistent application of knowledge.

 

"Everything in life starts from within. Without leading ourselves, we can't lead others."

Recommended Reading For Further Study

  • Baha’s Guest Blog Post: https://scrum-master-toolbox.org/2025/02/uncategorized/how-facebook-scaled-psychological-safety-mehmet-baha-shares-their-journey/

  • Baha’s Book: "Creating Psychological Safety at Work: The Essential Guide to Boosting Team Performance" by Mehmet Baha (Available on Amazon)

  • Free Resources: solutionfolder.com/free-resources - access a video series on psychological safety



About Mehmet Baha

Mehmet Baha is a psychological safety expert and one of Facebook's first European employees, with over 24 years of experience working with top organizations. He delivers global talks and learning sessions, equipping leaders with practical tools and strategies to foster trust, innovation, and collaboration, creating safer, more inclusive workplaces worldwide.

 

You can link with Mehmet Baha on LinkedIn.





Download audio: https://traffic.libsyn.com/secure/scrummastertoolbox/20250222_Mehmet_Baha_BONUS.mp3?dest-id=246429

Revisiting Docker Hub Policies: Prioritizing Developer Experience


At Docker, we are committed to ensuring that Docker Hub remains the best place for developers, engineering teams, and operations teams to build, share, and collaborate. As part of this, we previously announced plans to introduce image pull consumption fees and storage-based billing. After further evaluating how developers use Docker Hub and what will best support the ecosystem, we have refined our approach—one that prioritizes developer experience and enables developers to scale with confidence while reinforcing Docker Hub as the foundation of the cloud-native ecosystem.

What’s Changing?

We’re making important updates to our previously announced pull limits and storage policies to ensure Docker Hub remains a valuable resource for developers:

  • No More Pull Count Limits or Consumption Charges – We’re cancelling pull consumption charges entirely. Our focus is on making Docker Hub the best place for developers to build, share, and collaborate—ensuring teams can scale with confidence.
  • Unlimited Pull rates for Paid Users (As Announced Earlier) – Starting April 1, 2025, all paid Docker subscribers will have unlimited image pulls (with fair use limits) to ensure a seamless experience.
  • Updated Pull Rate Limits for Free & Unauthenticated Users – To ensure a reliable and seamless experience for all users, we are updating authenticated and free pull limits:
    • Unauthenticated users: Limited to 10 pulls per hour (as announced previously)
    • Free authenticated users: Increased to 100 pulls per hour (up from 40 pulls / hour)
    • System accounts & automation: As previously shared, automated systems and service accounts can easily authenticate using Personal Access Tokens (PATs) or Organizational Access Tokens (OATs), ensuring access to higher pull limits and a more reliable experience for automated authenticated pulls. (A sample non-interactive login follows this list.)
  • Storage Charges Delayed Indefinitely – Previously, we announced plans to introduce storage-based billing, but we have decided to indefinitely delay any storage charges. Instead, we are focusing on delivering new tools that will allow users to actively manage their storage usage. Once these tools are available, we will assess storage policies in the best interest of our users. If and when storage charges are introduced, we will provide a six-month notice, ensuring teams have ample time to adjust.
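As a concrete example of authenticated automation, a CI job can log in non-interactively with a PAT before pulling images; the environment variable names here are illustrative:

echo "$DOCKER_PAT" | docker login --username "$DOCKER_USER" --password-stdin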

Why This Matters

  • The Best Place to Build and Share – Docker Hub remains the world’s leading container registry, trusted by over 20 million developers and organizations. We’re committed to keeping it the best place to distribute and consume software.
  • Growing the Ecosystem – We’re making these changes to support more developers, teams, and businesses as they scale, reinforcing Docker Hub as the foundation of the cloud-native world.
  • Investing in the Future – Our focus is on delivering more capabilities that help developers move faster, from better storage management to strengthening security to better protect the software supply chain.
  • Committed to Developers – Every decision we make is about strengthening the platform and enabling developers to build, share, and innovate without unnecessary barriers.

We appreciate your feedback, and we’re excited to keep evolving Docker Hub to meet the needs of developers and teams worldwide. Stay tuned for more updates, and as always—happy building! 


Retrieving Images from the Clipboard Reliably in WPF Revisited


I've written previously about image formatting issues with the clipboard data in WPF and for the most part what I discussed in that post has been working just fine.

Recently however I started noticing some issues with some of my clipboard pasting code in Markdown Monster producing what appear to be corrupted images when displayed in an ImageSource control. The following is an image captured with TechSmith's SnagIt and then opened in Markdown Monster's Image dialog:

Bonked Image from Clipboard
Figure 1 - Something's not quite right here: Shades of a badly tuned broadcast TV in the 70's 😄

The image is sort of there, but it's obviously missing color, it's offset by a bunch with part of it missing, mis-sized and... well, it looks like an old, mis-tuned broadcast service TV image. It's an interesting effect, but not what that image should look like.

What's happening in this code is that the image is copied to the clipboard from SnagIt and then picked up by the form when it loads from the clipboard and loaded into an Image control via an ImageSource bitmap.

That Nasty WPF ImageSource Control

It should be simple: Clipboard.GetImage() should just return you an ImageSource and that should be the end of it. Unfortunately, more times than not, that doesn't 'just' work. Anything a little off in the image and System.Windows.Clipboard.GetImage() fails. Either it outright crashes or worse - it returns a result, but the resulting ImageSource doesn't actually render in the image control, which means your code has no way to really know that the image failed.

Turns out Windows.Forms.Clipboard.GetImage() works much more reliably, but it can't do transparency in .png images, which may be important.

So, in order to handle all the use cases, I have been using a ClipboardHelper class with wrapper functions that look at the clipboard and explicitly extract the bitmap data first, and then create a new ImageSource from that bitmap data. That seems to be much more reliable and it supports opacity. The first article goes into a lot of detail about how this works and also talks a bit about all the crazy clipboard formats you can run into and have to deal with, if you're interested.

For this post, I'll show the complete, updated code for getting an ImageSource or Bitmap from the clipboard at the end of the post.

Image Formatting: Bitness

I've been using this helper class for a long time and it's been working for me for some time, except for the problem shown in Figure 1 that only occurs occasionally, but is coming up more and more recently. After a bit of sleuthing in the code I was able to track this down to the conversion code that's converting the clipboard bitmap image data into an ImageSource to display in the WPF Image control. The bitmap itself appears to be fine as I checked the actual byte data and wrote it out to file. The file image looked perfectly fine.

So the issue is Bitmap to ImageSource conversion - which has always been the messy part of this entire process.

There are a number of ways to do this conversion:

  • Dumping the image into a byte stream and loading the image source from that
  • Dumping to file and loading the image source from that
  • Using low-level locking of the actual image data and loading the image source from that

I'd been using the latter, because the in-place memory usage and unsafe code combination are wicked fast compared to any other approach - like 3-4 times faster - plus it doesn't require copying the bitmap buffer of a potentially large image into a second block of memory.

The original code was retrieved from Stack Overflow and I used it as is because, to be honest, I only had a vague idea what that code was actually doing. At the time I noticed some of the hardcoded values and thought they might be a problem, but in a bit of testing I didn't see any issues with images from a variety of sources.

Here's the original SO code:

public static BitmapSource BitmapToBitmapSource(Bitmap bmp)
{
    var bitmapData = bmp.LockBits(
           new Rectangle(0, 0, bmp.Width, bmp.Height),
           ImageLockMode.ReadOnly, bmp.PixelFormat);

    var bitmapSource = BitmapSource.Create(
        bitmapData.Width, bitmapData.Height,
        bmp.HorizontalResolution, bmp.VerticalResolution,
        PixelFormats.Bgr24, null,
        bitmapData.Scan0, bitmapData.Stride * bitmapData.Height, bitmapData.Stride);

    bmp.UnlockBits(bitmapData);

    return bitmapSource;
}

You'll notice in there that PixelFormats.Bgr24 is hardcoded. This will work most of the time, but if the image is stored in 32 bit format you get - variable results. Oddly it works some of the time even with 32 bit images, but some images consistently failed with the image behavior shown in Figure 1. The reason I've been seeing more of these errors lately is likely that more software now uses the higher 32 bit depth.

To get around the hardcoded pixel format, we can add a conversion routine that translates pixel formats between the System.Drawing and System.Windows values. The code shows both directions, but only the first, from System.Drawing to System.Windows, is actually used:

public static System.Windows.Media.PixelFormat ConvertPixelFormat(System.Drawing.Imaging.PixelFormat systemDrawingFormat)
{
    switch (systemDrawingFormat)
    {
        case PixelFormat.Format32bppArgb:
            return PixelFormats.Bgra32;
        case PixelFormat.Format32bppRgb:
            return PixelFormats.Bgr32;
        case PixelFormat.Format24bppRgb:
            return PixelFormats.Bgr24;
        case PixelFormat.Format16bppRgb565:
            return PixelFormats.Bgr565;
        case PixelFormat.Format16bppArgb1555:
            return PixelFormats.Bgr555;
        case PixelFormat.Format8bppIndexed:
            return PixelFormats.Gray8;
        case PixelFormat.Format1bppIndexed:
            return PixelFormats.BlackWhite;
        case PixelFormat.Format16bppGrayScale:
            return PixelFormats.Gray16;
        default:
            return PixelFormats.Bgr24;
    }
}

public static System.Drawing.Imaging.PixelFormat ConvertPixelFormat(System.Windows.Media.PixelFormat wpfFormat)
{
    if (wpfFormat == PixelFormats.Bgra32)
        return PixelFormat.Format32bppArgb;
    if (wpfFormat == PixelFormats.Bgr32)
        return PixelFormat.Format32bppRgb;
    if (wpfFormat == PixelFormats.Bgr24)
        return PixelFormat.Format24bppRgb;
    if (wpfFormat == PixelFormats.Bgr565)
        return PixelFormat.Format16bppRgb565;
    if (wpfFormat == PixelFormats.Bgr555)
        return PixelFormat.Format16bppArgb1555;
    if (wpfFormat == PixelFormats.Gray8)
        return PixelFormat.Format8bppIndexed;
    if (wpfFormat == PixelFormats.Gray16)
        return PixelFormat.Format16bppGrayScale;
    if (wpfFormat == PixelFormats.BlackWhite)
        return PixelFormat.Format1bppIndexed;

    return PixelFormat.Format24bppRgb;
}

And with that we can now fix the image Bitmap to ImageSource conversion:

public static BitmapSource BitmapToBitmapSource(Bitmap bmp)
{
    var bitmapData = bmp.LockBits(
           new Rectangle(0, 0, bmp.Width, bmp.Height),
           ImageLockMode.ReadOnly, bmp.PixelFormat);

    var pf = ConvertPixelFormat(bmp.PixelFormat);

    var bitmapSource = BitmapSource.Create(
        bitmapData.Width, bitmapData.Height,
        bmp.HorizontalResolution, bmp.VerticalResolution,
        pf, null,
        bitmapData.Scan0, bitmapData.Stride * bitmapData.Height, bitmapData.Stride);

    bmp.UnlockBits(bitmapData);

    return bitmapSource;
}

The image display now works correctly for anything I'm throwing at it. Here's the image rendering correctly in the Image dialog in Markdown Monster:

Un Bonked Image
Figure 2 - Properly captured image after fixing the PixelFormat conversion.

Note that this method requires unsafe code (for the LockBits call) so this may or may not be usable everywhere. If you need a version that works with safe code only you can use the following which uses an intermediary Memory stream:

public static BitmapSource BitmapToBitmapSourceSafe(Bitmap bmp)
{
    using var ms = new MemoryStream();
    
    bmp.Save(ms, ImageFormat.Png);
    ms.Position = 0;
    
    var bitmap = new BitmapImage();
    bitmap.BeginInit();
    bitmap.CacheOption = BitmapCacheOption.OnLoad; // Load image immediately
    bitmap.StreamSource = ms;
    bitmap.EndInit();
    bitmap.Freeze(); // Make the BitmapImage cross-thread accessible

    return bitmap;
}

Performance

This code is reliable but it's considerably slower (2-4 times slower depending on how many images you load) and also loads another copy of the image data for each image.

Note that there are a number of other ways to do this: Dumping to disk, traversing the Bitmap buffer with .NET safe code (which apparently is also fairly slow).

I'm using this method in Markdown Monster in a number of places, and for single images displayed, the slower processing and memory usage is not a problem. However, I also use it for an image list that displays a history of AI-generated images, which contains potentially hundreds of images that are loaded asynchronously, and the performance difference there is enormous:

Lots Of Images In An Image List
Figure 3 - In a large list of images load performance matters - the faster algorithm using unsafe code ends up being 3-4 times faster.

The image slider on the bottom contains a large number of images that are loaded asynchronously in the background, but even though they are loading in the background the initial hit of the UI activating is as much as 4 times slower with the MemoryStream code than using the LockBits code.
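To put rough numbers on that difference for your own images, a quick Stopwatch comparison does the job. This is a minimal sketch rather than code from Markdown Monster - the file path and iteration count are placeholders, and it assumes both conversion methods live on the WindowUtilities class shown in this post:

using System;
using System.Diagnostics;
using System.Drawing;

var bmp = new Bitmap(@"C:\temp\test.png");  // placeholder test image

// LockBits-based conversion (fast, uses unsafe code)
var sw = Stopwatch.StartNew();
for (int i = 0; i < 100; i++)
    WindowUtilities.BitmapToBitmapSource(bmp);
sw.Stop();
Console.WriteLine($"LockBits:     {sw.ElapsedMilliseconds}ms");

// MemoryStream-based conversion (safe code, slower)
sw.Restart();
for (int i = 0; i < 100; i++)
    WindowUtilities.BitmapToBitmapSourceSafe(bmp);
sw.Stop();
Console.WriteLine($"MemoryStream: {sw.ElapsedMilliseconds}ms");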

Putting it all Together: Getting an Image or ImageSource off the Clipboard

I went into great detail about clipboard image retrieval in the previous post and some of the issues you need to deal with. I'm not going to rehash all of it here, but if you're interested there's tons of detail of why this can be such a pain in the ass.

The short version is: The Windows.Forms Clipboard works great, but it can't do transparency. The native WPF clipboard is super flakey with some image types.

The wrappers I show here make it very easy to retrieve clipboard data safely, quickly and as reliably as possible. At this point I've thrown a huge number of different image types at the updated code and I've not had any failures, other than a few out of memory errors with very large images.

There are two functions:

  • GetImage()
  • GetImageSource()

as well as the previously shown methods:

  • BitmapToBitmapSource()
  • ConvertPixelFormat()

The whole thing - retrieving an ImageSource by way of a Bitmap image first - looks like this:

// ClipboardHelper.GetImageSource()
public static ImageSource GetImageSource()
{
    if (!Clipboard.ContainsImage())
        return null;

    // now try to get a Bitmap and then convert to BitmapSource
    using (var bmp = GetImage())
    {
        if (bmp == null)
            return null;

        return WindowUtilities.BitmapToBitmapSource(bmp);
    }
}

// ClipboardHelper.GetImage()
public static Bitmap GetImage()
{
    try
    {
        var dataObject = Clipboard.GetDataObject();

        var formats = dataObject.GetFormats(true);
        if (formats == null || formats.Length == 0)
            return null;

        var first = formats[0];

        #if DEBUG   // show all formats of the image pasted
        foreach (var f in formats)
            Debug.WriteLine(" - " + f.ToString());
        #endif
        
        Bitmap bitmap = null;

        // Use this first as this gives you transparency!
        if (formats.Contains("PNG"))
        {
            using MemoryStream ms = (MemoryStream)dataObject.GetData("PNG");
            ms.Position = 0;
            return new Bitmap(ms);
        }
        if (formats.Contains("System.Drawing.Bitmap"))
        {
            return (Bitmap)dataObject.GetData("System.Drawing.Bitmap");                    
        }
        if (formats.Contains(DataFormats.Bitmap))
        {
            return (Bitmap)dataObject.GetData(DataFormats.Bitmap);                    
        }

        // just use GetImage() - 
        // retry multiple times to work around Windows timing
        BitmapSource src = null;
        for (int i = 0; i < 5; i++)
        {
            try
            {
                // This is notoriously unreliable so retry multiple times if it fails
                src = Clipboard.GetImage();
                break;  // success
            }
            catch
            {
                Thread.Sleep(10);  // retry
            }
        }

        if (src == null)
        {
            try
            {
                Debug.WriteLine("Clipboard Fall through - use WinForms");
                return System.Windows.Forms.Clipboard.GetImage() as Bitmap;
            }
            catch
            {
                return null;
            }
        }
            
        return WindowUtilities.BitmapSourceToBitmap(src);
    }
    catch
    {
        return null;
    }
}

The code looks for a few known formats that can be directly converted from the raw clipboard data and those are immediately returned. Note that some formats are already in Bitmap format while others like PNG are a binary stream that has to be loaded and assigned to a Bitmap.

The first check is for PNG because we want to try and capture the transparency of the raw PNG image. Note that for PNG we read the stream and create a bitmap from it.

Next, both the Bitmap and System.Drawing.Bitmap formats are actually stored in the raw .NET Bitmap format and are directly cast to that type. That would seem to be the easiest path, but these images unfortunately don't support opacity, which is why PNG is processed first.

You're likely to see Bitmap formats from .NET applications or applications that have some .NET support, but non-.NET applications likely won't have these clipboard formats set and likely provide raw image data.

In those cases, the code falls back to the default WPF Clipboard behavior and calls Clipboard.GetImage(). If that call fails, it delays briefly and tries again a few times. There's a well-known timing bug in the Windows Clipboard API that can cause the clipboard to not be ready to retrieve data right as you request it, and retrying often can get the image. In fact, the Windows.Forms.Clipboard functionality does that automatically and internally, but for WPF it needs to be explicitly coded.

If the WPF GetImage() fails after several tries, the code then falls back to the Windows Forms GetImage() operation, which is very likely to pick up anything else regardless of format.

Prior to the Windows Forms fallback I was running into frequent clipboard pasting failures with images, and the fallback catches those cases that WPF can't handle or is too finicky about. While it may miss opacity, at least it will return something (and opacity is rarely an issue).

Note that there might still be scenarios where the WPF GetImage() is called and succeeds, but doesn't produce an ImageSource that renders. This is unfortunate, because there appears to be no good way to detect this failure: an ImageSource is returned and it has data, but it just doesn't render.

Overall I find that these days most images are covered by PNG and one of the Bitmap formats, with only a few odd calls ending up in the WPF GetImage() logic.
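One helper the listing above calls but doesn't show is WindowUtilities.BitmapSourceToBitmap(). The actual implementation may differ, but a minimal sketch of that conversion - encoding the BitmapSource to PNG in memory and reloading it as a Bitmap (using the System.IO and System.Windows.Media.Imaging namespaces) - could look like this:

public static Bitmap BitmapSourceToBitmap(BitmapSource source)
{
    if (source == null)
        return null;

    using var ms = new MemoryStream();

    // encode the WPF BitmapSource into the stream as PNG (preserves transparency)
    var encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(source));
    encoder.Save(ms);
    ms.Position = 0;

    // clone into a standalone Bitmap so the stream can be disposed safely
    using var streamBitmap = new Bitmap(ms);
    return new Bitmap(streamBitmap);
}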

With this code in place, I now have properly working images from any source, including transparency:

Working Image From Clipboard
Figure 4 - Example of a captured 32bit image with transparency.

Yay!
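For completeness, using the helpers from application code boils down to a couple of lines. This is a hypothetical paste handler - the ImagePreview control name is made up for the example:

// e.g. in a Paste command handler of a WPF window
var src = ClipboardHelper.GetImageSource();
if (src != null)
    ImagePreview.Source = src;  // ImagePreview is an <Image> element in XAML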

Summary

WPF image clipboard operations have been notoriously difficult to work with, but with the circuitous workaround described in this post - loading a bitmap first and then creating an ImageSource from it - it's possible to load the vast majority of images reliably. For those that still fail, the fallback to Windows.Forms.Clipboard usually captures the rest.

If you don't need to capture transparent images, you can bypass all this madness and simply use Windows.Forms.Clipboard.GetImage() with a try/catch block around it. This works almost universally and is by far the easiest. For the rest, the code described here can be easily packaged into an application and shipped inline as part of a ClipboardHelper class.
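If that simpler route is enough for your app, the whole thing collapses to a sketch like the following, with the conversion to an ImageSource still going through the helper shown earlier:

Bitmap bmp = null;
try
{
    // WinForms clipboard: very reliable, but no transparency
    bmp = System.Windows.Forms.Clipboard.GetImage() as Bitmap;
}
catch
{
    // clipboard busy or no usable image data - bmp stays null
}

if (bmp != null)
{
    var src = WindowUtilities.BitmapToBitmapSource(bmp);
    // assign src to an Image control, save it, etc.
}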

© Rick Strahl, West Wind Technologies, 2005-2025
Posted in WPF  Windows  

Elon Musk’s first month of destroying America will cost us decades

It would be impressive if it were not so depressing. | Image: Kristen Radtke / The Verge; The National Museum of American Diplomacy

Let’s pause and look at what the Elon Musk administration has done so far.

There’s been a lot of panic about the immediate but somewhat abstract constitutional crisis as Elon Musk’s misleadingly-named Department of Government Efficiency (DOGE) rips the government apart. And as much fun as we all are having watching Congress render itself irrelevant and wondering whether the courts even matter, there’s a concrete nightmare looming. Mass unemployment, the defunding of crucial social programs, and just plain incompetence mean that America, as we know it, is already in for hard times.

The degree to which we have failed not merely ourselves but also our children and grandchildren is breathtaking

The scale of destruction in the past four weeks starts at the Soviet devotion to Lysenkoist biological theories, and at maximum, is the American version of Mao’s Cultural Revolution: a disastrous triumph of ideological purity over basic reality. I am not sure it has occurred to the majority of people that we are about to make a Great Leap Forward and destroy our prosperous, relatively peaceful society.

Musk has, in the short term, set us up for a shock to the economy from both une…

Read the full story at The Verge.
