Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Fast .NET CLI ISO Downloader with Integrity Validation


This post is a follow-up to my previous fast .NET CLI downloader, adding ISO integrity validation. Just point it at the ISO you want to download, and it will look in the usual spot for the SHA-256 checksum file.

I often download versions of Linux distributions to try out. These are very large and usually come with a checksum file that you have to manually check after downloading. The checksum verification is important because a corrupted ISO can cause all sorts of weird issues when you try to use it.
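The check itself is small: hash the file and compare it against the published value, case-insensitively. A minimal sketch in Python (the function names are mine, for illustration):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Hash a file in fixed-size chunks so a multi-GB ISO never sits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path: str, expected_hex: str) -> bool:
    """Compare against the published hash; hex casing varies between distros."""
    return sha256_of(path).lower() == expected_hex.strip().lower()
```

This is exactly what `sha256sum -c SHA256SUMS` does for you, minus the manual step.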

It happened to me once, back when I used to burn ISOs to DVDs. The install kept failing, and I kept blaming the DVD until I finally realized the ISO file was corrupted; I hadn’t run the checksum verification.

Using it is as simple as:

./downloader.cs https://releases.ubuntu.com/resolute/ubuntu-26.04-desktop-amd64.iso

But the command supports some additional options as well:

Usage: ./downloader.cs <url> [output-file] [chunks]
  url         - URL to download
  output-file - Output filename (default: derived from URL)
  chunks      - Number of parallel streams (default: 8)
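Under the hood, the file is split into one byte range per parallel stream, with the last range absorbing the remainder of the integer division. That range math, sketched in Python for illustration (the helper name is mine):

```python
def build_chunks(total_size: int, count: int) -> list[tuple[int, int]]:
    """Split total_size bytes into `count` inclusive (start, end) ranges.

    Every chunk gets total_size // count bytes except the last,
    which also takes the leftover bytes.
    """
    chunk_size = total_size // count
    ranges = []
    for i in range(count):
        start = i * chunk_size
        end = total_size - 1 if i == count - 1 else start + chunk_size - 1
        ranges.append((start, end))
    return ranges

# Eight streams over a 100-byte file: contiguous ranges covering every byte.
print(build_chunks(100, 8))
```

Each (start, end) pair maps directly onto an HTTP `Range: bytes=start-end` request header.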

Downloading Ubuntu 26.04
#!/usr/bin/dotnet run

using System.Diagnostics;
using System.Net.Http.Headers;
using System.Security.Cryptography;

const int DefaultChunks = 8;
const int MaxRetries = 5;
const int RetryDelayMs = 1000;
const int ProgressUpdateMs = 250;

if (args.Length < 1 || args[0] is "-h" or "--help")
{
    PrintUsage();
    return args.Length < 1 ? 1 : 0;
}

var url = args[0];
var outputFile = args.Length > 1 ? args[1] : Path.GetFileName(new Uri(url).LocalPath);
var chunks = args.Length > 2 ? int.Parse(args[2]) : DefaultChunks;

if (string.IsNullOrWhiteSpace(outputFile) || outputFile == "/")
    outputFile = "download";

Console.WriteLine($"URL: {url}");
Console.WriteLine($"Output: {outputFile}");
Console.WriteLine($"Streams: {chunks}");
Console.WriteLine();

var isIso = string.Equals(Path.GetExtension(outputFile), ".iso", StringComparison.OrdinalIgnoreCase);

using var client = new HttpClient { Timeout = TimeSpan.FromMinutes(30) };
client.DefaultRequestHeaders.UserAgent.ParseAdd("DownloaderCLI/1.0");

string? checksumFilePath = null;
if (isIso)
{
    checksumFilePath = await TryDownloadChecksumFileForIso(client, url, outputFile);
    Console.WriteLine();
}

// Probe the server for content-length and range support
using var headReq = new HttpRequestMessage(HttpMethod.Head, url);
using var headResp = await client.SendAsync(headReq);
headResp.EnsureSuccessStatusCode();

var totalSize = headResp.Content.Headers.ContentLength ?? -1;
var acceptRanges = headResp.Headers.Contains("Accept-Ranges")
    && headResp.Headers.GetValues("Accept-Ranges").Any(v => v.Contains("bytes", StringComparison.OrdinalIgnoreCase));

if (totalSize <= 0 || !acceptRanges)
{
    Console.WriteLine(totalSize <= 0
        ? "Server did not report content length - falling back to single-stream download."
        : "Server does not support range requests - falling back to single-stream download.");
    Console.WriteLine();
    await SingleStreamDownload(client, url, outputFile);
    if (isIso)
    {
        var valid = await ValidateIsoAsync(outputFile, checksumFilePath);
        return valid ? 0 : 2;
    }

    return 0;
}

Console.WriteLine($"Size: {FormatBytes(totalSize)}");
Console.WriteLine("Ranges: supported");
Console.WriteLine();

var sw = Stopwatch.StartNew();
var chunkInfos = BuildChunks(totalSize, chunks);
var progress = new long[chunkInfos.Count];

// Progress reporter
using var cts = new CancellationTokenSource();
var progressTask = Task.Run(async () =>
{
    while (!cts.Token.IsCancellationRequested)
    {
        PrintProgress(progress, chunkInfos, totalSize, sw.Elapsed);
        try { await Task.Delay(ProgressUpdateMs, cts.Token); } catch (TaskCanceledException) { break; }
    }
});

// Download all chunks in parallel
var tempFiles = new string[chunkInfos.Count];
var downloadTasks = new Task[chunkInfos.Count];

for (int i = 0; i < chunkInfos.Count; i++)
{
    var idx = i;
    var (start, end) = chunkInfos[idx];
    tempFiles[idx] = $"{outputFile}.part{idx}";
    downloadTasks[idx] = DownloadChunk(client, url, start, end, tempFiles[idx], progress, idx);
}

await Task.WhenAll(downloadTasks);

cts.Cancel();
await progressTask;
PrintProgress(progress, chunkInfos, totalSize, sw.Elapsed);
Console.WriteLine();
Console.WriteLine();

// Reassemble
Console.Write("Reassembling... ");
await Reassemble(tempFiles, outputFile);
Console.WriteLine("done.");

// Cleanup temp files
foreach (var f in tempFiles)
    if (File.Exists(f)) File.Delete(f);

sw.Stop();
var info = new FileInfo(outputFile);
Console.WriteLine($"Completed in {sw.Elapsed.TotalSeconds:F1}s - {FormatBytes(info.Length)} @ {FormatBytes((long)(info.Length / sw.Elapsed.TotalSeconds))}/s");

if (isIso)
{
    var valid = await ValidateIsoAsync(outputFile, checksumFilePath);
    return valid ? 0 : 2;
}

return 0;

// ---- helper methods ----

static List<(long Start, long End)> BuildChunks(long totalSize, int count)
{
    var chunkSize = totalSize / count;
    var result = new List<(long, long)>(count);
    for (int i = 0; i < count; i++)
    {
        var start = i * chunkSize;
        var end = (i == count - 1) ? totalSize - 1 : start + chunkSize - 1;
        result.Add((start, end));
    }
    return result;
}

static async Task DownloadChunk(HttpClient client, string url, long start, long end,
    string tempFile, long[] progress, int index)
{
    for (int attempt = 1; attempt <= MaxRetries; attempt++)
    {
        try
        {
            // Resume from where we left off if retrying
            long existingBytes = 0;
            if (File.Exists(tempFile))
            {
                existingBytes = new FileInfo(tempFile).Length;
                if (existingBytes >= end - start + 1)
                {
                    progress[index] = end - start + 1;
                    return; // already complete
                }
            }

            using var req = new HttpRequestMessage(HttpMethod.Get, url);
            req.Headers.Range = new RangeHeaderValue(start + existingBytes, end);

            using var resp = await client.SendAsync(req, HttpCompletionOption.ResponseHeadersRead);
            resp.EnsureSuccessStatusCode();

            await using var stream = await resp.Content.ReadAsStreamAsync();
            await using var fs = new FileStream(tempFile, existingBytes > 0 ? FileMode.Append : FileMode.Create,
                FileAccess.Write, FileShare.None, 81920, useAsync: true);

            var buffer = new byte[81920];
            long downloaded = existingBytes;
            int bytesRead;

            while ((bytesRead = await stream.ReadAsync(buffer)) > 0)
            {
                await fs.WriteAsync(buffer.AsMemory(0, bytesRead));
                downloaded += bytesRead;
                Interlocked.Exchange(ref progress[index], downloaded);
            }

            return; // success
        }
        catch (Exception ex) when (attempt < MaxRetries)
        {
            Console.Error.WriteLine($"\n [chunk {index}] attempt {attempt} failed: {ex.Message} - retrying...");
            await Task.Delay(RetryDelayMs * attempt);
        }
    }
}

static async Task SingleStreamDownload(HttpClient client, string url, string outputFile)
{
    var sw = Stopwatch.StartNew();
    using var resp = await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead);
    resp.EnsureSuccessStatusCode();

    var total = resp.Content.Headers.ContentLength ?? -1;
    await using var stream = await resp.Content.ReadAsStreamAsync();
    await using var fs = new FileStream(outputFile, FileMode.Create, FileAccess.Write, FileShare.None, 81920, true);

    var buffer = new byte[81920];
    long downloaded = 0;
    int bytesRead;
    var lastUpdate = DateTimeOffset.UtcNow;

    while ((bytesRead = await stream.ReadAsync(buffer)) > 0)
    {
        await fs.WriteAsync(buffer.AsMemory(0, bytesRead));
        downloaded += bytesRead;

        if ((DateTimeOffset.UtcNow - lastUpdate).TotalMilliseconds >= ProgressUpdateMs)
        {
            lastUpdate = DateTimeOffset.UtcNow;
            var pct = total > 0 ? (double)downloaded / total * 100 : 0;
            var speed = downloaded / sw.Elapsed.TotalSeconds;
            Console.Write($"\r [{pct,5:F1}%] {FormatBytes(downloaded)}{(total > 0 ? $" / {FormatBytes(total)}" : "")} {FormatBytes((long)speed)}/s ");
        }
    }

    sw.Stop();
    Console.WriteLine($"\r [100.0%] {FormatBytes(downloaded)} {FormatBytes((long)(downloaded / sw.Elapsed.TotalSeconds))}/s - done. ");
}

static async Task Reassemble(string[] parts, string outputFile)
{
    await using var outFs = new FileStream(outputFile, FileMode.Create, FileAccess.Write, FileShare.None, 81920, true);
    foreach (var part in parts)
    {
        await using var inFs = new FileStream(part, FileMode.Open, FileAccess.Read, FileShare.Read, 81920, true);
        await inFs.CopyToAsync(outFs);
    }
}

static async Task<string?> TryDownloadChecksumFileForIso(HttpClient client, string isoUrl, string outputFile)
{
    var outputDir = Path.GetDirectoryName(outputFile);
    if (string.IsNullOrWhiteSpace(outputDir))
        outputDir = ".";

    Directory.CreateDirectory(outputDir);

    var isoUri = new Uri(isoUrl);
    var baseUri = new Uri(isoUri, ".");
    string[] candidateNames = ["SHA256SUMS", "SHA256SUMS.txt", "sha256sum.txt", "SHA256SUM", "sha256sums"];

    Console.WriteLine("ISO detected: looking for checksum file in source directory...");
    foreach (var candidate in candidateNames)
    {
        try
        {
            var checksumUri = new Uri(baseUri, candidate);
            using var resp = await client.GetAsync(checksumUri);
            if (!resp.IsSuccessStatusCode)
                continue;

            var content = await resp.Content.ReadAsStringAsync();
            if (string.IsNullOrWhiteSpace(content))
                continue;

            var localPath = Path.Combine(outputDir, candidate);
            await File.WriteAllTextAsync(localPath, content);
            Console.WriteLine($"Checksum file downloaded: {candidate}");
            return localPath;
        }
        catch
        {
            // Try next known checksum filename.
        }
    }

    Console.WriteLine("No checksum file found; ISO integrity check will use structure validation.");
    return null;
}

static async Task<bool> ValidateIsoAsync(string isoPath, string? checksumFilePath)
{
    Console.WriteLine();
    Console.WriteLine("Validating ISO image...");

    if (!File.Exists(isoPath))
    {
        Console.Error.WriteLine("Validation failed: ISO file not found.");
        return false;
    }

    var actualSha256 = await ComputeSha256Async(isoPath);

    if (!string.IsNullOrWhiteSpace(checksumFilePath) && File.Exists(checksumFilePath))
    {
        var expectedSha256 = await GetExpectedSha256ForFileAsync(checksumFilePath, Path.GetFileName(isoPath));
        if (!string.IsNullOrWhiteSpace(expectedSha256))
        {
            var valid = string.Equals(actualSha256, expectedSha256, StringComparison.OrdinalIgnoreCase);
            Console.WriteLine($"Expected SHA256: {expectedSha256}");
            Console.WriteLine($"Actual SHA256: {actualSha256}");
            Console.WriteLine(valid ? "ISO checksum validation passed." : "ISO checksum validation failed.");
            return valid;
        }

        Console.WriteLine("Checksum file was downloaded but no matching hash entry was found for this ISO.");
    }

    var structureValid = await HasIso9660SignatureAsync(isoPath);
    Console.WriteLine($"Computed SHA256: {actualSha256}");
    Console.WriteLine(structureValid
        ? "ISO structure validation passed (ISO9660 signature found)."
        : "ISO structure validation failed (ISO9660 signature not found).");
    return structureValid;
}

static async Task<string> ComputeSha256Async(string path)
{
    await using var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read, 81920, useAsync: true);
    var hash = await SHA256.HashDataAsync(fs);
    return Convert.ToHexString(hash).ToLowerInvariant();
}

static async Task<string?> GetExpectedSha256ForFileAsync(string checksumFilePath, string fileName)
{
    var lines = await File.ReadAllLinesAsync(checksumFilePath);
    foreach (var rawLine in lines)
    {
        var line = rawLine.Trim();
        if (string.IsNullOrWhiteSpace(line) || line.StartsWith('#'))
            continue;

        var parts = line.Split(' ', StringSplitOptions.RemoveEmptyEntries);
        if (parts.Length < 2)
            continue;

        var hash = parts[0];
        if (hash.Length != 64 || !hash.All(Uri.IsHexDigit))
            continue;

        var listedName = parts[^1].TrimStart('*');
        listedName = Path.GetFileName(listedName);
        if (string.Equals(listedName, fileName, StringComparison.OrdinalIgnoreCase))
            return hash.ToLowerInvariant();
    }

    return null;
}

static async Task<bool> HasIso9660SignatureAsync(string path)
{
    const int signatureOffset = 0x8001;
    const int signatureLength = 5;

    await using var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read, 81920, useAsync: true);
    if (fs.Length < signatureOffset + signatureLength)
        return false;

    fs.Seek(signatureOffset, SeekOrigin.Begin);
    var buffer = new byte[signatureLength];
    var bytesRead = await fs.ReadAsync(buffer);
    if (bytesRead < signatureLength)
        return false;

    return buffer[0] == (byte)'C'
        && buffer[1] == (byte)'D'
        && buffer[2] == (byte)'0'
        && buffer[3] == (byte)'0'
        && buffer[4] == (byte)'1';
}

static void PrintProgress(long[] progress, List<(long Start, long End)> chunks, long totalSize, TimeSpan elapsed)
{
    long totalDownloaded = progress.Sum();
    var pct = (double)totalDownloaded / totalSize * 100;
    var speed = elapsed.TotalSeconds > 0 ? totalDownloaded / elapsed.TotalSeconds : 0;
    var eta = speed > 0 ? TimeSpan.FromSeconds((totalSize - totalDownloaded) / speed) : TimeSpan.Zero;

    // Per-chunk mini bars
    var bars = new List<string>();
    for (int i = 0; i < chunks.Count; i++)
    {
        var chunkSize = chunks[i].End - chunks[i].Start + 1;
        var chunkPct = (double)progress[i] / chunkSize;
        const int miniBarWidth = 8;
        var filled = chunkPct <= 0
            ? 0
            : Math.Clamp((int)Math.Ceiling(chunkPct * miniBarWidth), 1, miniBarWidth);
        bars.Add(new string('█', filled) + new string('░', miniBarWidth - filled));
    }

    Console.Write($"\r [{pct,5:F1}%] {FormatBytes(totalDownloaded)} / {FormatBytes(totalSize)} " +
        $"{FormatBytes((long)speed)}/s ETA {eta:mm\\:ss} " +
        $"[{string.Join('|', bars)}] ");
}

static string FormatBytes(long bytes)
{
    string[] units = ["B", "KB", "MB", "GB", "TB"];
    double val = bytes;
    int unit = 0;
    while (val >= 1024 && unit < units.Length - 1) { val /= 1024; unit++; }
    return $"{val:F1} {units[unit]}";
}

static void PrintUsage()
{
    Console.Error.WriteLine("Usage: ./downloader.cs <url> [output-file] [chunks]");
    Console.Error.WriteLine("  url         - URL to download");
    Console.Error.WriteLine("  output-file - Output filename (default: derived from URL)");
    Console.Error.WriteLine($"  chunks      - Number of parallel streams (default: {DefaultChunks})");
    Console.Error.WriteLine();
    Console.Error.WriteLine("Examples:");
    Console.Error.WriteLine("  ./downloader.cs 'https://example.com/file.iso'");
    Console.Error.WriteLine("  ./downloader.cs 'https://example.com/file.iso?x=1&y=2'");
    Console.Error.WriteLine();
    Console.Error.WriteLine("PowerShell note:");
    Console.Error.WriteLine("  URLs containing '&' must be quoted, escaped as '`&', or passed after '--%'.");
    Console.Error.WriteLine("  Example:");
    Console.Error.WriteLine("  ./downloader.cs --% https://example.com/file.iso?x=1&y=2");
}
Read the whole story
alvinashcraft
just a second ago
reply
Pennsylvania, USA
Share this story
Delete

What Microsoft’s 10-Q Says About OpenAI


Buried on page nine of Microsoft’s 10-Q for the quarter ended March 31, 2026 is a paragraph worthy of attention. Why? What does it reveal? A lot.

For starters, Microsoft now holds approximately 27 percent of OpenAI on an as-converted basis, accounted for under the equity method. The total funding commitment is $13 billion, of which $11.8 billion has been funded as of March 31, 2026. The October 2025 OpenAI recapitalization produced a dilution gain. Microsoft recorded $5.9 billion of net gains from OpenAI investments over the nine months, primarily from that dilution gain. The prior nine-month period reflected $2.7 billion of net losses on the same investment.

In plain English, even though Microsoft owns less of OpenAI, that smaller stake is worth more, and it produced a gain. Why? Because the implied valuation of OpenAI rose faster than Microsoft’s ownership percentage fell. Microsoft booked the markup. Money for nothing, and chips for free.
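The mechanic is easiest to see with invented numbers (purely illustrative; these are not the actual Microsoft or OpenAI figures):

```python
# Hypothetical round numbers, only to illustrate the dilution-gain mechanic.
stake_before = 0.32          # ownership share before the recapitalization
valuation_before = 160e9     # implied company valuation before, USD

stake_after = 0.27           # smaller stake after new investors come in...
valuation_after = 300e9      # ...at a much higher implied valuation

carrying_before = stake_before * valuation_before   # $51.2B
carrying_after = stake_after * valuation_after      # $81.0B
dilution_gain = carrying_after - carrying_before    # $29.8B: smaller slice, bigger pie

print(f"Dilution gain: ${dilution_gain / 1e9:.1f}B")
```

As long as the implied valuation grows faster than the ownership percentage shrinks, the holder books a gain without selling anything.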


As I pored over the 10-Q and the footnotes, it became obvious that investing in foundational AI labs is a crazy profitable business. A single set of transactions can bring three separate benefits. At least on Microsoft’s financial statements.

Cash leaves Microsoft as an investment. It returns as cloud revenue. And then some.

First, OpenAI burns Microsoft’s cash on Azure compute. The Azure consumption shows up as revenue inside the AI business line. As a result, the AI business line is now at a $37 billion run rate. It helps justify the $190 billion 2026 capex commitment.

Thanks to the magic money of AI private valuations, Microsoft’s equity stake in OpenAI gets marked up to reflect the latest funding valuation. The markup is “other income.” And as stated above, the dilution gains flow to “other income” as well.

Somehow a series of interrelated transactions does the trick. This is so gangsta. Nothing is improper, even though you know something isn’t right. Still, the accountants have done their job.


Microsoft’s “AI business” annual run rate is the headline number Satya Nadella repeated on the earnings call. It is up 123 percent year over year. Microsoft does not give you the breakdown. Here is the back of the envelope math.

It has been reported that Microsoft has roughly 20 million paid Copilot enterprise seats. The standard M365 Copilot price is $30 per user per month. That is approximately $7 billion in annualized Copilot revenue. Add roughly $1.5 to $2 billion for GitHub Copilot and adjacent tooling. Total commercial Copilot revenue is somewhere between $8 and $10 billion. Being generous, I would say Copilot is at most one-quarter of the $37 billion AI business.
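The arithmetic behind that estimate, spelled out (the seat count and price are the reported figures cited above, not official disclosures):

```python
# Back-of-the-envelope Copilot revenue, using the figures cited in the post.
seats = 20_000_000              # reported paid M365 Copilot enterprise seats
price_per_seat_month = 30       # standard M365 Copilot list price, USD

m365_copilot_annual = seats * price_per_seat_month * 12     # $7.2B annualized
github_and_tooling_low, github_and_tooling_high = 1.5e9, 2e9

total_low = m365_copilot_annual + github_and_tooling_low    # ~$8.7B
total_high = m365_copilot_annual + github_and_tooling_high  # ~$9.2B

print(f"M365 Copilot: ${m365_copilot_annual / 1e9:.1f}B/yr")
print(f"All Copilot: ${total_low / 1e9:.1f}B - ${total_high / 1e9:.1f}B/yr")
print(f"Share of the $37B AI run rate: {total_high / 37e9:.0%} at most")
```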

The remaining $27 to $30 billion is Azure consumption. The composition, working from public disclosures and reasonable inference (no pun intended): OpenAI’s Azure spend is the largest single line. OpenAI runs on Azure. All the money it got from Microsoft has been burned on Microsoft compute. The rest is third-party enterprise customers using Azure OpenAI Service, plus other AI lab and AI startup compute, much of which Microsoft has at least partially funded through M12, the OpenAI Startup Fund, or various co-investment vehicles.

The customer concentration in the Azure AI revenue line is not disclosed. It does not have to be. But the structure of the disclosure tells you the answer. If 80 percent of the $37 billion came from a broad base of independent enterprises, Microsoft would say so on the earnings call. The silence is the answer.

I know it is not the same, but I am feeling nostalgic for vendor financing. Those crazy days of Nortel and Lucent. Lucent perfected the practice during the late 1990s telecom boom, extending credit to competitive local exchange carriers so they could buy Lucent equipment. The financing showed up as an asset on Lucent’s balance sheet. The equipment sales showed up as revenue. Lucent’s reported earnings looked excellent.

The structure worked until the customers ran out of money. By 2001, Lucent had taken billions in writedowns on its customer financing book. The CLECs went bankrupt in waves. Lucent’s stock fell from $84 to under $1.

I know it is not the same.

The current AI version is different. The instrument is convertible preferred stock and dilution gains, not vendor finance receivables. The asset on the hyperscaler balance sheet is an equity investment, not a loan. The customer is an AI lab, not a CLEC. The product being financed is GPU time, not switching equipment.

Still, as an old hand, I am feeling the nostalgia of the old mechanism. The funder, the customer, and the source of the markup are all part of the same closed system.

The same shape exists at Alphabet with Anthropic on Google Cloud. The same shape exists at Amazon with Anthropic on Trainium. Three of the four hyperscalers booked enormous non-cash gains this quarter from their stakes in AI labs. Alphabet booked $36.8 billion of equity gains. Amazon booked $16.8 billion in pre-tax gains on Anthropic. Combined, roughly $50 billion plus of non-cash income flowed through Q1 2026 income statements from AI lab marks and dilution gains.

What the OpenAI restructuring really did

The October 2025 OpenAI recapitalization, which produced Microsoft’s dilution gain, was framed publicly as a governance reform and a step toward a more conventional corporate structure. As I wrote at the time, the fix was in. OpenAI formed a public benefit corporation. Microsoft’s licensing arrangement shifted from exclusive to non-exclusive. The new agreement extended the partnership.

OpenAI will continue to pay Microsoft a 20 percent revenue share through 2030, but only up to a fixed cap, after which the obligation extinguishes. The IP license Microsoft holds on OpenAI’s models, previously exclusive and tied to the elastic concept of “AGI achievement,” is now non-exclusive with a hard 2032 expiration. Under the revised agreement, OpenAI can sell API access to its models through any cloud provider.

PitchBook reads the restructuring as a precondition for OpenAI’s IPO push. The Wall Street Journal reported in January that the company is laying groundwork for a Q4 2026 listing.

When I read the news, I was left asking the same question. Why did Microsoft do this? And what do they get out of it?

The restructuring, mechanically, gave Microsoft a clean accounting event. The structure of the recap let Microsoft book the gain, reduce its proportional ownership to a still-substantial 27 percent, and free OpenAI to raise more capital from other investors at higher valuations.

Microsoft is no longer the controlling investor. It is a large minority equity holder of a public benefit corporation, with a non-exclusive licensing arrangement and a $13 billion total funding commitment that is nearly fully funded. And come the IPO, it can sell as little or as much of its OpenAI equity as it wants, without any sense of moral obligation.

Microsoft has been quietly converting its OpenAI exposure from an operating dependency into a simple financial position. The accounting now flatters Microsoft as long as OpenAI’s valuation rises.

OpenAI needs Azure for a while, which is great for Microsoft as it builds up its AI business. All the while, OpenAI as an entity becomes a headache for Amazon or whomever else wants to do business with them.

Microsoft’s move cannot be viewed in isolation. On April 27, the Wall Street Journal reported that OpenAI missed multiple monthly revenue targets earlier this year, losing ground to Anthropic in coding and enterprise. ChatGPT fell short of its internal target to reach one billion weekly active users by the end of 2025, with growth flattening around 900 million. CFO Sarah Friar reportedly told colleagues she is worried OpenAI may not be able to fund future computing contracts if revenue does not accelerate.

Altman and Friar issued a joint statement calling the report “ridiculous” and saying they are “totally aligned on buying as much compute as we can.” The denial said they agree on wanting compute. The Wall Street Journal report wondered whether OpenAI can afford it, or whether it can IPO this year.

Altman’s statement and Friar’s comment mean nothing. SoftBank fell almost 10 percent in Tokyo. Oracle dropped more than 5 percent. CoreWeave fell 7 percent. AMD and Broadcom each took roughly 4 percent. The whole AI infrastructure stock ecosystem had a massive convulsion.

What if the company at the center of the structure cannot pay its bills?

PitchBook’s Harrison Rolfes calculated that OpenAI’s infrastructure obligations now exceed $1.15 trillion across Oracle, Microsoft, and Amazon. Current annualized revenue is roughly $25 billion. The ratio is roughly forty-six to one. “If revenue growth doesn’t reaccelerate,” Rolfes said, “those contracts become the most expensive fixed-cost bet in technology history.”

In a sense, Friar is not wrong when she tells the board it is going to be hard to go public with those numbers. A CFO comment to board members does not leak to the Wall Street Journal unless someone wants it to leak. To slow down the IPO, or to shank it entirely.

The leak is the story.

The Q4 2026 IPO timeline matters because everything in the financial structure assumes the valuation machine keeps churning at max speed. A successful OpenAI IPO means not only new money but also actual liquidity for Microsoft and other early investors.

PitchBook’s most recent analyst note suggests the realistic IPO window has shifted from Q4 2026 to mid-to-late 2027, citing the same revenue miss and the $1.15 trillion in infrastructure obligations that will need to convert into free cash flow before public market investors get comfortable. If that delay holds, every hyperscaler holding equity gains based on private OpenAI marks is sitting on paper that has to keep being remarked upward to keep working.

This is the announcement economy at the financial-engineering level. Promises about future revenue support current accounting. Current accounting supports the next round. The next round supports the marks. The marks support the parent company income statement. And then the cycle repeats, faster.

There is no doubt in my mind that if the IPO is delayed, there will be a new funding round. If it prices above the implied valuation from the October recap, the cycle continues. If it prices flat or down, the dilution-gain mechanic reverses.

Over the next few quarters I will be watching what Microsoft has to say about its AI run rate, and whether it provides more details. Given the sheer scale of the money, I am surprised sell-side analysts aren’t pushing for further disclosure. Or maybe they did and I missed it.

I would also be keeping an eye on Azure gross margins. If OpenAI’s compute consumption is priced at preferential rates, as has been widely reported, the gross margin on the largest single piece of the AI business is structurally lower than the rest of Azure. As OpenAI scales further, the blended Azure margin will move with it.

The platform shift Satya Nadella described is real. Workloads are moving from end-user-driven to agent-driven. Token consumption may well grow at machine scale rather than human scale. The capex commitment may be the right call.

But the financial structure underneath the AI revenue line is a delicate balance.




Interesting links - April 2026

1 Share

A bit of a streamlined edition, this month. Lots of interesting links still, but less commentary. You can put that down to me procrastinating on getting my previous blog about Materialized Tables in Apache Flink finished, and leaving myself little time to work on this one :) Not including the detailed narration actually knocks a bunch of time off the preparation—I’d be interested in your feedback as to how much the absence of narration impacts (if at all) your enjoyment of reading it. Let me know in the comments below!

Something that I’m slowly changing is how I categorise links to do with AI. A few months back anything "AI" got its own section. It wasn’t much more than a novelty really; certainly not something worth distracting the regular link sections with. But now AI is just part-and-parcel of many people’s workflows, a regular component in their toolbox. So where an article is about credibly using AI as part of an existing topic (such as data engineering), I’ll file it in that section. (And if this news makes you cross because you abhor anything AI, well, I’ve got news for you).


Daily Reading List – April 30, 2026 (#774)


Back on vacation and had a day-date with my wife. A bunch of fresh crazy kicks in at work next week, so I’m thrilled to get a breather before we run at full speed again.

[blog] Long-running Agents. Another killer post from Addy. What changes when you move from single-turn stateless agents to agents that need memories and coordination over time? This post has the patterns and solution options.

[blog] How ADK Agents Remember: Sessions, Events, and Scoped-State. Speaking of memories and agent state, here’s some details on how to do it in practice.

[blog] AI evals are becoming the new compute bottleneck. Evaluation costs are scaling non-linearly and we’re going to need to come up with new approaches. Or so says this Hugging Face post.

[blog] I attempted to build a team of agents to help do my job on Google Cloud Agent Platform with the agents-cli. This is what I learnt. Great experience report. Not everything worked as anticipated, and Esther had some smart recommendations at the end.

[article] AI productivity gains: More modest than expected. So far. But as the agentic operating model takes hold, team shapes change, and better platforms stretch from build-to-prod, you’ll see these gains skyrocket.

[blog] 50+ fully managed MCP servers now available for Google Cloud services. Terrific. Google Cloud speaks MCP, which means agents can easily interact with all the key services to get work done.

[blog] Zig Anti-AI. Few open source projects have as direct a stance against AI as this one. I respect the principles.

[blog] Popular Go Web Frameworks: A Practical Guide for Developers. You can do most everything with our base libraries. That’s on purpose. But there are still great 3P web frameworks you can add to the mix.

[blog] Firestore levels up: Bringing the power of search and JOINs to NoSQL. This has really become quite the powerful database.

[blog] You can now easily generate files in Gemini. Amazing. Now you don’t even need to leave the app to do Office stuff.

[article] GitHub shifts Copilot to usage-based billing, signaling a new cost model for enterprise AI tools. Free lunch is over. Consumption pricing is taking hold over a straight-up per-seat pricing approach. More here.

[article] Google Cloud surpasses $20B, but says growth was capacity-constrained. Massive demand, and we still can’t satisfy it all. Yet.

[blog] Speeding Up AI: Bringing Google Colossus to PyTorch via GCSFS and Rapid Bucket. When you’re paying a ton for capacity, you want to use it to the max and be done. This high-performing storage reduces your wait time.

Want to get this update sent to you every day? Subscribe to my RSS feed or subscribe via email below:




Codex CLI 0.128.0 adds /goal



The latest version of OpenAI's Codex CLI coding agent adds their own version of the Ralph loop: you can now set a /goal and Codex will keep on looping until it evaluates that the goal has been completed... or the configured token budget has been exhausted.

It looks like the feature is mainly implemented through the goals/continuation.md and goals/budget_limit.md prompts, which are automatically injected at the end of a turn.
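Conceptually the loop is a few lines of control flow. A hypothetical sketch (this is not Codex's actual implementation, which drives the loop by injecting those prompts into the conversation):

```python
def run_goal_loop(goal, run_turn, evaluate_goal, token_budget):
    """Keep running agent turns until the goal is judged complete
    or the token budget is exhausted. Illustrative only; the callable
    names are mine, not Codex CLI's API."""
    spent = 0
    while spent < token_budget:
        result = run_turn(goal)            # one full agent turn
        spent += result["tokens_used"]
        if evaluate_goal(goal, result):    # the model judges completion
            return "goal_completed", spent
    return "budget_exhausted", spent
```

The interesting design question is the stop condition: the agent itself evaluates whether the goal is done, with the budget as a hard backstop against infinite looping.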

Via @fcoury

Tags: ai, openai, prompt-engineering, generative-ai, llms, coding-agents, system-prompts, codex-cli, agentic-engineering


Continually improving our agent harness
