Sr. Content Developer at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Say hello to MacBook Neo

Apple today unveiled MacBook Neo, an all-new laptop that delivers the magic of the Mac at a breakthrough price.

Read the whole story
alvinashcraft
just a second ago
reply
Pennsylvania, USA
Share this story
Delete

The “Data Center Rebellion” Is Here

This post first appeared on Ben Lorica’s Gradient Flow Substack newsletter and is being republished here with the author’s permission.

Even the most ardent cheerleaders for artificial intelligence now quietly concede we are navigating a massive AI bubble. The numbers are stark: Hyperscalers are deploying roughly $400 billion annually into data centers and specialized chips while AI-related revenue hovers around $20 billion—a 20-to-1 capital-to-revenue ratio that stands out even in infrastructure cycles historically characterized by front-loaded spending. To justify this deployment on conventional investment metrics, the industry would need a step change in monetization over a short window.

While venture capitalists and tech executives debate the “mismatch” between compute and monetization, a more tangible crisis is unfolding far from Silicon Valley. A growing grassroots opposition to AI data centers remains largely below the radar here in San Francisco. I travel to Sioux Falls, South Dakota, a few times a year to visit my in-laws. It’s not a region known for being antibusiness. Yet even there, a “data center rebellion” has been brewing. Even though the recent attempt to overturn a rezoning ordinance did not succeed, the level of community pushback in the heart of the Midwest signals that these projects no longer enjoy a guaranteed green light.

This resistance is not merely reflexive NIMBYism. It represents a sophisticated multifront challenge to the physical infrastructure AI requires. For leadership teams planning for the future, this means “compute availability” is no longer just a procurement question. It is now tied to local politics, grid stability, water management, and city approval processes. To understand the growing opposition to AI data centers, I’ve been examining the specific drivers behind it and why the assumption of limitless infrastructure growth is colliding with hard constraints.

The grid capacity crunch and the ratepayer revolt

AI data centers function as grid-scale industrial loads. Individual projects now request 100+ megawatts, and some proposals reach into the gigawatt range. One proposed Michigan facility, for example, would consume 1.4 gigawatts, nearly exhausting the region’s remaining 1.5 gigawatts of headroom and roughly matching the electricity needs of about a million homes. This happens because AI hardware is incredibly dense and uses a massive amount of electricity. It also runs constantly. Since AI work doesn’t have “off” hours, power companies can’t rely on the usual quiet periods they use to balance the rest of the grid.
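The arithmetic behind that household comparison is worth making explicit. Here is a back-of-envelope sketch in Python; the average household draw of roughly 1.4 kW is my assumption, not a figure from the article:

```python
# Back-of-envelope check of the Michigan example above.
# Assumption (not from the article): an average US home draws
# roughly 1.4 kW of continuous power (~12,000 kWh per year).
facility_load_w = 1.4e9       # proposed facility: 1.4 gigawatts
regional_headroom_w = 1.5e9   # remaining regional headroom: 1.5 gigawatts
avg_home_draw_w = 1.4e3       # assumed average household draw

equivalent_homes = facility_load_w / avg_home_draw_w
headroom_used = facility_load_w / regional_headroom_w

print(f"Equivalent homes: {equivalent_homes:,.0f}")        # ~1,000,000
print(f"Share of headroom consumed: {headroom_used:.0%}")  # ~93%
```

Under that assumption, a single 1.4 GW facility really does match about a million homes and consumes over nine-tenths of the region's remaining headroom.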

The politics come down to who pays the bill. Residents in many areas have seen their home utility rates jump by 25% or 30% after big data centers moved in, even though they were promised rates wouldn’t change. People are afraid they will end up paying for the power company’s new equipment. This happens when a utility builds massive substations just for one company, but the cost ends up being shared by everyone. When you add in state and local tax breaks, it gets even worse. Communities deal with all the downsides of the project, while the financial benefits are eaten away by tax breaks and credits.

The result is a rare bipartisan alignment around a simple demand: Hyperscalers should pay their full cost of service. Notably, Microsoft has moved in that direction publicly, committing to cover grid-upgrade costs and pursue rate structures intended to insulate residential customers—an implicit admission that the old incentive playbook has become a political liability (and, in some places, an electoral one).

AI scale-up to deployable compute

Water wars and the constant hum

High-density AI compute generates immense heat, requiring cooling systems that can consume millions of gallons of water daily. In desert municipalities like Chandler and Tucson, Arizona, this creates direct competition with agricultural irrigation and residential drinking supplies. Proposed facilities may withdraw hundreds of millions of gallons annually from stressed aquifers or municipal systems, raising fears that industrial users will deplete wells serving farms and homes. Data center developers frequently respond with technical solutions like dry cooling and closed-loop designs. However, communities have learned the trade-off: Dry cooling shifts the burden to electricity, and closed-loop systems still lose water to the atmosphere and require constant refills. The practical outcome is that cooling architecture is now a first-order constraint. In Tucson, a project known locally as “Project Blue” faced enough pushback over water rights that the developer had to revisit the cooling approach midstream.

Beyond resource consumption, these facilities create a significant noise problem. Industrial-scale cooling fans and backup diesel generators create a “constant hum” that represents a daily intrusion into previously quiet neighborhoods. In Florida, residents near a proposed facility, in an area home to 2,500 families and an elementary school, cite sleep disruption and health risks as primary objections, elevating the issue from nuisance to harm. The noise also hits farms hard. In Wisconsin, residents reported that the low-frequency hum makes livestock, particularly horses, nervous and skittish. This disrupts farm life in a way that standard commercial development just doesn’t. This is why municipalities are tightening requirements: acoustic modeling, enforceable decibel limits at property lines, substantial setbacks (sometimes on the order of 200 feet), and berms are no longer “nice-to-have” concessions but baseline conditions for approval.

The $3 trillion question

The jobs myth meets the balance sheet

Communities are questioning whether the small number of jobs created is worth the local impact. Developers highlight billion-dollar capital investments and construction employment spikes, but residents focus on steady-state reality: AI data centers employ far fewer permanent workers per square foot than manufacturing facilities of comparable scale. Chandler, Arizona, officials noted that existing facilities employ fewer than 100 people despite massive physical footprints. Wisconsin residents contrast promised “innovation campuses” with operational facilities requiring only dozens to low hundreds of permanent staff—mostly specialized technicians—making the “job creation” pitch ring hollow. When a data center replaces farmland or light manufacturing, communities weigh not just direct employment but opportunity cost: lost agricultural jobs, foregone retail development, and mixed-use projects that might generate broader economic activity.

Opposition scales faster than infrastructure: One local win becomes a national template for blocking the next project.

The secretive way these deals are made is often what fuels the most anger. A recurring pattern is what some call the “sleeping giant” dynamic: Residents learn late that officials and developers have been negotiating for months, often under NDAs, sometimes through shell entities and codenames. In Wisconsin, Microsoft’s “Project Nova” became a symbol of this approach; in Minnesota’s Hermantown, a year of undisclosed discussions triggered similar backlash. In Florida, opponents were furious when a major project was tucked into a consent agenda. Since these agendas are meant for routine business, it felt like a deliberate attempt to bypass public debate. Trust vanishes when people believe advisors have a conflict of interest, like a consultant who seems to be helping both the municipality and the developer. After that happens, technical claims are treated as nothing more than a sales pitch. You won’t get people back on board until you provide neutral analysis and commitments that can actually be enforced.

Data center in the community

From zoning fight to national constraint

What started as isolated neighborhood friction has professionalized into a coordinated national movement. Opposition groups now share legal playbooks and technical templates across state lines, allowing residents in “frontier” states like South Dakota or Michigan to mobilize with the sophistication of seasoned activists. The financial stakes are real: Between April and June 2025 alone, approximately $98 billion in proposed projects were blocked or delayed, according to Data Center Watch. This is no longer just a zoning headache. It’s a political landmine. In Arizona and Georgia, bipartisan coalitions have already ousted officials over data center approvals, signaling to local boards that greenlighting a hyperscale facility without deep community buy-in can be a career-ending move.

The US has the chips, but China has centralized command over power and infrastructure.

The opposition is also finding an unlikely ally in the energy markets. While the industry narrative is one of “limitless demand,” the actual market prices for long-term power and natural gas aren’t spiking but are actually staying remarkably flat. There is a massive disconnect between the hype and the math. Utilities are currently racing to build nearly double the capacity that even the most optimistic analysts project for 2030. This suggests we may be overbuilding “ghost infrastructure.” We are asking local communities to sacrifice their land and grid stability for a gold rush that the markets themselves don’t fully believe in.

This “data center rebellion” creates a strategic bottleneck that no amount of venture capital can easily bypass. While the US maintains a clear lead in high-end chips, we are hitting a wall on how we manage the mundane essentials like electricity and water. In the geopolitical race, the US has the chips, but China has the centralized command over infrastructure. Our democratic model requires transparency and public buy-in to function. If US companies keep relying on secret deals to push through expensive, overbuilt infrastructure, they risk a total collapse of community trust.




WebGL for Designers: Creating Interactive, Shader-Driven Graphics Directly in the Browser

A look at how Unicorn Studio brings the power of WebGL shaders to designers through a layer-based workflow, making it easier to create expressive, interactive graphics directly in the browser.




Introducing OpenClaw on Amazon Lightsail to run your autonomous private AI agents


Today, we’re announcing the general availability of OpenClaw on Amazon Lightsail. You can launch an OpenClaw instance, pair your browser, enable AI capabilities, and optionally connect messaging channels. Your Lightsail OpenClaw instance is pre-configured with Amazon Bedrock as the default AI model provider. Once you complete setup, you can start chatting with your AI assistant immediately, with no additional configuration required.

OpenClaw is an open-source, self-hosted, autonomous private AI agent that acts as a personal digital assistant by running directly on your computer. You can run AI agents on OpenClaw through your browser and connect them to messaging apps like WhatsApp, Discord, or Telegram to perform tasks such as managing emails, browsing the web, and organizing files, rather than just answering questions.

AWS customers have asked if they can run OpenClaw on AWS, and some have blogged about running OpenClaw on Amazon EC2 instances. Having installed OpenClaw directly on my home device, I learned that this is not easy and that there are many security considerations.

So, let me show you how to launch a pre-configured OpenClaw instance on Amazon Lightsail and run it securely.

OpenClaw on Amazon Lightsail in action
To get started, go to the Amazon Lightsail console and choose Create instance in the Instances section. After choosing your preferred AWS Region and Availability Zone and the Linux/Unix platform for your instance, choose OpenClaw under Select a blueprint.

Choose your instance plan (the 4 GB memory plan is recommended for optimal performance) and enter a name for your instance. Finally, choose Create instance. Your instance will be in a Running state in a few minutes.
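For teams that prefer automation over the console, the same launch can be sketched with the AWS SDK for Python (boto3). The blueprint and bundle IDs below are placeholders I've assumed, not confirmed values; list the real ones with get_blueprints() and get_bundles() before using them:

```python
# Sketch: launching an OpenClaw instance programmatically via boto3,
# mirroring the console steps described above.
def build_openclaw_launch_params(name: str, az: str = "us-east-1a") -> dict:
    """Build kwargs for lightsail.create_instances().

    The blueprintId and bundleId below are hypothetical placeholders;
    query get_blueprints()/get_bundles() for the actual IDs.
    """
    return {
        "instanceNames": [name],
        "availabilityZone": az,
        "blueprintId": "openclaw",   # assumed blueprint ID
        "bundleId": "medium_3_0",    # assumed ID for the 4 GB plan
    }

params = build_openclaw_launch_params("my-openclaw")
# To actually launch (requires AWS credentials):
# import boto3
# boto3.client("lightsail").create_instances(**params)
```

Separating parameter construction from the API call keeps the sketch testable and makes it easy to review what will be provisioned before anything is created.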

Before you can use the OpenClaw dashboard, you should pair your browser with OpenClaw. This creates a secure connection between your browser session and OpenClaw. To pair your browser with OpenClaw, choose Connect using SSH in the Getting started tab.

When the browser-based SSH terminal opens, you can see the dashboard URL and security credentials displayed in the welcome message. Copy them and open the dashboard in a new browser tab, then paste the copied access token into the Gateway Token field.

When prompted, press y to continue and a to approve device pairing in the SSH terminal. When pairing is complete, you can see the OK status in the OpenClaw dashboard, and your browser is now connected to your OpenClaw instance.

Your OpenClaw instance on Lightsail is configured to use Amazon Bedrock to power its AI assistant. To enable Bedrock API access, copy the script in the Getting started tab and run it in the AWS CloudShell terminal.

Once the script is complete, go to Chat in the OpenClaw dashboard to start using your AI assistant!

You can set up OpenClaw to work with messaging apps like Telegram and WhatsApp for interacting with your AI assistant directly from your phone or messaging client. To learn more, visit Get started with OpenClaw on Lightsail in the Amazon Lightsail User Guide.

Things to know
Here are key considerations to know about this feature:

  • Permission — You can customize the AWS IAM permissions granted to your OpenClaw instance. The setup script creates an IAM role with a policy that grants access to Amazon Bedrock. You can customize this policy at any time, but be careful when modifying permissions, because changes may prevent OpenClaw from generating AI responses. To learn more, visit AWS IAM policies in the AWS documentation.
  • Cost — You pay an on-demand hourly rate for the instance plan you selected, only for what you use. Every message sent to and received from the OpenClaw assistant is processed through Amazon Bedrock using a token-based pricing model. If you select a third-party model distributed through AWS Marketplace, such as Anthropic Claude or Cohere, there may be additional software fees on top of the per-token cost.
  • Security — Running a personal AI agent on OpenClaw is powerful, but it can create security risks if you are careless. I recommend keeping your OpenClaw gateway private and never exposing it to the open internet. The gateway auth token is effectively your password, so rotate it often and store it in your environment file rather than hardcoding it in a config file. To learn more about security tips, visit Security on OpenClaw gateway.
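To make the permissions point concrete, here is a minimal sketch of the kind of IAM policy document the setup script might attach, assuming only Bedrock model invocation is needed. The policy Lightsail actually generates may differ, so inspect the created role before relying on this:

```python
import json

# Hypothetical minimal policy: allows Bedrock model invocation only.
# In production, scope "Resource" down to specific model ARNs.
bedrock_invoke_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBedrockInvocation",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
            ],
            "Resource": "*",
        }
    ],
}

print(json.dumps(bedrock_invoke_policy, indent=2))
```

Keeping the policy this narrow follows least privilege: the instance can call Bedrock models but cannot touch other AWS services if the gateway is ever compromised.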

Now available
OpenClaw on Amazon Lightsail is now available in all AWS commercial Regions where Amazon Lightsail is available. For Regional availability and the future roadmap, visit the AWS Capabilities by Region page.

Give it a try in the Lightsail console and send feedback to AWS re:Post for Amazon Lightsail or through your usual AWS support contacts.

Channy


OpenAI’s Codex is now on Windows


OpenAI’s Codex agentic coding app is now available on Windows.

To say Codex has been a hit for OpenAI would be an understatement. The Codex App for Mac, which launched in early February, was downloaded more than 1 million times in the first week alone, and weekly active users now stand at 1.6 million.

There will likely be quite a bit of demand for the Windows version, too: OpenAI says more than 500,000 developers were on the waitlist.

As OpenAI stresses, the Windows version wasn’t just built to be compatible with Microsoft’s operating system but “for real Windows developer environments,” as an OpenAI spokesperson put it in an email to The New Stack.

The app was built to offer native sandboxing and workflows, so that developers on Windows can use the tools they are already familiar with. By default, the app uses its own native Windows sandbox, but there is an option to use the Windows Subsystem for Linux and its tools as well.

If you opt to go Windows-native, Codex for Windows uses OS-level controls such as restricted tokens and filesystem access control, ensuring that the agents can run in environments like PowerShell, Microsoft’s default shell for Windows.

OpenAI’s Codex app for Windows (credit: OpenAI).

For the most part, the Windows version looks and feels almost exactly like the Mac version. The same skills, automations, and support for worktrees are available on Windows. There are also some Windows-specific skills, including a WinUI skill for developers who write Windows apps.


One thing that sets Codex apart is that it focuses more on managing the agent than on the code itself. You can always see the diffs as needed and switch to your favorite IDE, but the default view focuses on your interactions with the agent. OpenAI describes this as “a new form factor designed as a command center for agents,” and that feels about right.

OpenAI's Codex app for Windows. Skills menu.

The default model for Codex is OpenAI’s recently launched coding-specific GPT-5.3-Codex, with the ability to switch to GPT-5.2-Codex, GPT-5.1-Codex-Max, GPT-5.2, and, for tasks that need to run quickly, GPT-5.1-Codex-Mini. Users can also set the reasoning level for each model.

Codex for Windows is now available for all ChatGPT Free, Go, Plus, Pro, Business, Enterprise, and Edu users.

The post OpenAI’s Codex is now on Windows appeared first on The New Stack.


Container files

Learn how to inject files and directories into containers at development time and publish time using the container file APIs in Aspire.