For semifinalists, this stage is about depth. Sharpening your architecture. Pressure-testing your assumptions. Strengthening your path to scale. The mentors joining this year’s Builder Series have walked that road themselves and are stepping in to help teams think bigger, build smarter, and move forward with clarity.
Together, this year’s Builder Series mentors represent decades of experience across startups, global enterprises, AI innovation, cybersecurity, cloud architecture, fintech, SaaS, healthcare, and digital transformation. Their collective insight spans building companies from the ground up, leading technology at global scale, and guiding teams through pivotal growth moments. As Imagine Cup 2026 semifinalists move deeper into the Builder Series, they do so supported by leaders who have navigated complexity, scaled responsibly, and turned vision into execution. We are grateful for the time, expertise, and intentional guidance these mentors bring to the next generation of founders shaping what comes next.
If you’ve been building web applications over the last few years, you’ve probably heard of WebAssembly (Wasm). And if not, you’ll leave this article with a solid understanding of this technology.
Wasm is a binary instruction format that works alongside JavaScript. It speeds up web applications that process large amounts of data (image processing, heavy math computations, and the like). Think of it this way: if a site’s core work would perform better written in a language built for that kind of task (Rust, C, C++, and so on), then it probably should be written in one of those languages. With Wasm, it can be. This is no criticism of JavaScript. JavaScript wasn’t created to perform heavy computations. JavaScript was built to give the browser interactive functionality, and it’s still the leader at that.
Wasm became a W3C standard in December 2019, and industry adoption followed quickly. It’s currently present on approximately 43,000 sites, including major products like Figma, Unity, and Fastly.
Like many development tools, Wasm was built to solve a problem. Web apps were, and still are, becoming increasingly complex. Back in 2013 there was no alternative to JavaScript, yet JavaScript was showing its limits.
In response to the heavy, sustained computation that games, video editors, and scientific tools demand, a group of Mozilla engineers (Luke Wagner, Alon Zakai, and Dave Herman) released asm.js. The solution, which Wagner called “super hacky,” was a strict subset of JavaScript that embedded enough static type information to allow a browser’s JS engine to dramatically improve performance.
Google was also in the process of creating their own primitive version of Wasm called Native Client (NaCl). NaCl sandboxed and ran native code directly in Chrome. Both approaches worked, but neither was ideal. asm.js was still JavaScript under the hood, which meant parsing overhead. NaCl was Chrome-only and never gained broad adoption.
Eventually the two teams started working together. Their common goal was a truly binary format with support from all browsers, designed specifically for performance and portability. The result was the first version of the Wasm we know today. WebAssembly was publicly announced in 2015, with the first demonstration running Unity’s Angry Bots in Firefox, Chrome, and Edge. But this was just the beginning.
The minimum viable product (MVP) was finished in 2017. By the end of 2017, all major browsers supported Wasm. In 2019 it became an official W3C recommendation. Today, browser support for Wasm is effectively universal, covering roughly 99% of browsers in use.
What started as a performance fix for a specific browser problem has become something much larger. Over the last few years, major additions have extended the MVP into the Wasm we know and love today, including threads, SIMD, reference types, bulk memory operations, and garbage collection support.
Wasm compiles your code down to a compact binary file that the browser can decode and run almost immediately. The Wasm binary can be decoded over 20x faster than JavaScript can be parsed. In more technical terms, Wasm is a binary instruction format for a stack-based virtual machine, designed as a portable compilation target for programming languages. For anyone familiar with LIFO, you already understand the stack-based execution model: values get pushed on, operated on, and popped off. Relying on the stack-based model was by design; that simplicity is part of what makes Wasm so fast.
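To make the stack model concrete, here is a complete Wasm module hand-assembled byte by byte following the binary format in the spec. The code section is literally the stack machine at work: push both parameters, then i32.add pops them and pushes the sum.

```javascript
// A complete Wasm module, hand-assembled byte by byte per the spec.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, // magic number "\0asm"
  0x01, 0x00, 0x00, 0x00, // binary format version 1
  // Type section: one function type, (i32, i32) -> i32
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,
  // Function section: one function, using type 0
  0x03, 0x02, 0x01, 0x00,
  // Export section: expose function 0 under the name "add"
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  // Code section: local.get 0, local.get 1 push the params onto the
  // stack; i32.add pops both and pushes the sum; 0x0b ends the body
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,
]);

// The synchronous API is fine for a module this tiny; real applications
// should use the async instantiate()/instantiateStreaming() covered later.
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

Forty-odd bytes is a complete, valid module, which is a good illustration of why decoding Wasm is so much cheaper than parsing equivalent JavaScript source.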
Wasm doesn’t compete with JavaScript; the two complement one another. They excel in different areas. JavaScript is the leading tool for handling tasks like the Document Object Model (DOM), user events, and interactivity. Wasm can’t touch the DOM directly and will probably always need JavaScript for that. Think of Wasm as a precision tool for handling large computations in the browser.
The Wasm module is like a doorway between your Wasm code and the rest of your application. The .wasm file your code compiles down to is the basic unit of Wasm. Your functions, memory, tables, and global variables all live inside the module. The module then exposes functions outward for JavaScript and can import JavaScript functions to use internally.
Wasm manages memory differently than JavaScript. Rather than the automatic garbage-collected heap associated with JavaScript, Wasm relies on linear memory. It gives you a large, flat block of raw bytes that your code reads from and writes to directly. The upside is precise, predictable performance. The downside is that you’re responsible for memory management. It’s important to understand that JavaScript doesn’t clean this up, making memory leaks a real concern.
Wasm and JavaScript communicate with one another fairly easily. JavaScript can call WebAssembly exports like normal functions, and WebAssembly can call JavaScript functions that are passed in at instantiation time. Things get slightly more complicated when it comes to passing data, though. Anything beyond basic numbers must pass through shared linear memory, which adds complexity.
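Here is a sketch of that shared-memory handoff, using a hand-assembled module that exports a one-page memory and a one-byte load function. The names mem and peek are illustrative, not a standard API.

```javascript
// Hand-assembled module: exports a 1-page linear memory as "mem" and a
// function "peek(addr)" that returns the byte stored at that address.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00, // header
  0x01, 0x06, 0x01, 0x60, 0x01, 0x7f, 0x01, 0x7f, // type: (i32) -> i32
  0x03, 0x02, 0x01, 0x00,                         // one function, type 0
  0x05, 0x03, 0x01, 0x00, 0x01,                   // one memory, min 1 page
  // export section: "mem" (memory 0) and "peek" (function 0)
  0x07, 0x0e, 0x02, 0x03, 0x6d, 0x65, 0x6d, 0x02, 0x00,
  0x04, 0x70, 0x65, 0x65, 0x6b, 0x00, 0x00,
  // code: local.get 0, i32.load8_u, end
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x2d, 0x00, 0x00, 0x0b,
]);

const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes));

// JavaScript's view into Wasm linear memory is a typed array over the
// exported ArrayBuffer. Write a byte from the JS side...
const view = new Uint8Array(exports.mem.buffer);
view[10] = 42;

// ...and read the same byte back from inside Wasm.
console.log(exports.peek(10)); // 42
```

Strings, arrays, and structs all cross the boundary this way: one side writes raw bytes into linear memory, the other side reads them out at an agreed-upon offset.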
Wasm offers a lot of flexibility when it comes to choosing a language, giving developers the opportunity to select the language that’s best for the task at hand.
C/C++ give you maximum control and raw performance. They excel at game engines, physics simulations, 3D graphics, image processing, and scientific computing. Think tasks where you need to squeeze every bit of performance out of the hardware. If you want to bring your existing C/C++ codebase to the web, writing your Wasm modules in C/C++ is the natural choice.
There are some tradeoffs, though. You’ll need to manage memory manually, which adds complexity and risk, especially for developers less familiar with the languages. The standard C/C++ toolchain, Emscripten, also has a steep learning curve. It requires you to understand not just C/C++, but also how to configure the compiler, manage system libraries, and handle the bridge between your native code and the browser environment. For developers coming from a web background, that’s a lot to take on at once.
Rust brings many of the performance benefits of C/C++ to the Wasm environment, with one significant advantage: memory safety is built into the language itself. You get the speed without the risk of the memory management errors that C/C++ are notorious for. It’s consistently the most frequently used and most desired language for Wasm development.
wasm-pack, Rust’s Wasm toolchain, is genuinely well-designed and makes the compile-to-browser workflow smooth. The steep learning curve here is Rust itself, not the tooling. But if you’re willing to invest the time, the combination of performance, safety, and tooling makes it the strongest choice for serious Wasm projects.
AssemblyScript is the most beginner-friendly path into Wasm, especially if you’re already writing JavaScript or TypeScript. Its syntax is close enough to TypeScript that you can get started without learning an entirely new language. It’s well-suited for utility functions, parsers, encoders, and performance-sensitive logic you want to offload from JavaScript without straying too far from familiar territory.
The tradeoff here is performance. Benchmarks put AssemblyScript at roughly half the speed of Rust. For many use cases that won’t matter, but if you’re really pushing hard, you may want to consider Rust or C/C++. AssemblyScript’s toolchain, asc, is straightforward to set up and fits naturally into existing JavaScript workflows. It’s the easiest toolchain of the bunch to get comfortable with.
Go is a solid choice if you already work in it. It performs well for server-side Wasm use cases like lightweight services, plugin systems, and CLI tools compiled to run in a Wasm runtime. The standard Go compiler produces large binaries because it bundles the full Go runtime, making TinyGo the preferred toolchain for Wasm. TinyGo outputs leaner files and integrates cleanly if you already know Go, though it doesn’t support the complete Go standard library, which can be a limitation depending on what your code depends on.
Only 20% of Wasm developers have used Go or TinyGo, and 67% don’t plan to (more interesting stats in this TNS article). It’s not where the Wasm ecosystem is investing its energy, but if Go is already your language it’s a perfectly workable path.
Python is appealing because so many developers already know it, and its Wasm story is genuinely promising for specific use cases. Running NumPy, pandas, or scikit-learn directly in the browser without a server backend is something Pyodide makes possible, opening up interesting doors for interactive data tools and educational environments. Pyodide works by shipping an entire Python interpreter compiled to Wasm, which means large file sizes and slower startup times. It’s not ready for performance-critical production use yet.
Kotlin and Dart are also developing Wasm support via their own dedicated toolchains and are worth watching, but both are still maturing.
Let’s build a simple module that adds two numbers together. It’s a simple example with no real-world purpose, but it clearly illustrates the process of writing Wasm code and calling it from JavaScript.
Step 1: Write the module using TypeScript
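The embedded code didn’t carry over here, so below is a minimal sketch of what the module likely looks like. Note that AssemblyScript provides i32 as a built-in type; the type alias exists only so this snippet also type-checks as plain TypeScript outside the asc compiler.

```typescript
// assembly/index.ts (a sketch of the module this tutorial compiles)
// In real AssemblyScript, i32 is built in; this alias is only a shim
// so the snippet compiles as ordinary TypeScript too.
type i32 = number;

export function add(a: i32, b: i32): i32 {
  return a + b;
}
```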
We’re defining the 32-bit integer with i32 because Wasm has a minimal type system.
Step 2: Compile to Wasm
The following terminal command will create your .wasm binary and the build folder.
npx asc assembly/index.ts --outFile build/module.wasm
Step 3: Load the .wasm binary and call from JavaScript
The code below fetches the .wasm binary and calls its exported function like a regular JavaScript function.
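Since the embedded code didn’t survive here, below is a self-contained sketch. In the browser you would pass fetch("build/module.wasm") straight to WebAssembly.instantiateStreaming(); to keep this snippet runnable anywhere (Node 18+ included), we inline the compiled bytes of an equivalent add module and wrap them in a Response with the required application/wasm MIME type.

```javascript
// Inlined bytes of a tiny module exporting add(a, b), standing in for
// the build/module.wasm file the asc command produces.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,
  0x03, 0x02, 0x01, 0x00,
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,
]);

async function load() {
  // Browser version: WebAssembly.instantiateStreaming(fetch("build/module.wasm"))
  // Here we fake the network response so the example is self-contained.
  const response = new Response(bytes, {
    headers: { "Content-Type": "application/wasm" },
  });
  const { instance } = await WebAssembly.instantiateStreaming(response);
  return instance.exports.add(2, 3);
}

load().then((sum) => console.log(sum)); // logs 5
```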
There are two ways to load a Wasm module in the browser: WebAssembly.instantiate() and WebAssembly.instantiateStreaming(). This is typically done in an index.js file or any other main JavaScript entry point.
instantiateStreaming() fetches, compiles, and instantiates a module in one step, directly from the raw bytecode, without requiring conversion to an ArrayBuffer. In practice, streaming initialization has been shown to be up to 1.8 times faster.

instantiate() requires the entire .wasm file to be downloaded and converted to an ArrayBuffer before compilation can begin. It works, but it’s the slower of the two options.

Unless you have a specific reason to use instantiate(), instantiateStreaming() is the better choice.
If your module needs to import JavaScript functions or shared memory, you pass those in via the importObject at instantiation time. The importObject is the bridge that lets Wasm and JavaScript share state and call each other.
Here’s the basic syntax:
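As the embedded example may not render here, below is a self-contained sketch of the importObject pattern. The hand-assembled module imports a JavaScript function under the illustrative names env.add and calls it from its exported run function.

```javascript
// Module that IMPORTS "env"."add" (type (i32, i32) -> i32) and exports
// "run", which simply forwards its two arguments to the imported function.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // header
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type (i32,i32)->i32
  // import section: module "env", field "add", function of type 0
  0x02, 0x0b, 0x01, 0x03, 0x65, 0x6e, 0x76,
  0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  0x03, 0x02, 0x01, 0x00,                               // one local function
  0x07, 0x07, 0x01, 0x03, 0x72, 0x75, 0x6e, 0x00, 0x01, // export "run" = func 1
  // code: local.get 0, local.get 1, call 0 (the import), end
  0x0a, 0x0a, 0x01, 0x08, 0x00, 0x20, 0x00, 0x20, 0x01, 0x10, 0x00, 0x0b,
]);

// The importObject's keys must match the module/field names the binary
// declares ("env" and "add" here).
const importObject = {
  env: {
    add: (a, b) => a + b, // Wasm calls back into this JS function
  },
};

const { exports } = new WebAssembly.Instance(
  new WebAssembly.Module(bytes),
  importObject
);
console.log(exports.run(4, 5)); // 9
```

If the importObject is missing an entry the module declared, instantiation throws a LinkError, which is a useful early signal that the two sides disagree about the contract.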
Wasm is fast, but it’s not always faster than JavaScript. It wins on heavy computations like image processing, physics simulations, and processing large CSVs. For smaller inputs, though, JavaScript often comes out ahead: small images and small CSV files are frequently quicker to process in plain JavaScript.
Startup cost is also a performance consideration. A large .wasm binary still has to be downloaded, compiled, and instantiated before it does anything. Streaming compilation helps, but for cold loads on slow connections it’s still a factor. Strategies like caching compiled modules in IndexedDB and lazy-loading Wasm only when needed can help significantly.
For more on this topic, we have a tutorial that will guide you through building a project and testing performance yourself. Here’s the image processing tutorial.
TLDR: Treat Wasm modules with the same scrutiny you’d apply to any third party code. Audit dependencies, scan source code before compiling, and don’t assume the sandbox makes all code safe.
Wasm relies heavily on the sandbox. Your Wasm code can’t reach out and touch the file system, network, or OS directly. Each WebAssembly module executes within a sandboxed environment separated from the host runtime, meaning applications execute independently and can’t escape the sandbox without going through appropriate APIs.
But the sandbox doesn’t guarantee security because Wasm still relies on JavaScript to perform many operations. This means it can inherit JavaScript-based vulnerabilities. Wasm’s binary format also limits visibility, which makes it attractive for obfuscation. Bad actors can hide malicious code in the Wasm binary to keep it hidden from security scanners. Writing in C/C++ adds another security concern. Memory vulnerabilities in your source code can carry through into the compiled Wasm binary.
The same properties that make Wasm so powerful in the browser also make it valuable on servers, edge networks, and embedded devices.
Outside the browser, Wasm can’t talk to the operating system on its own: it can’t read files, open network connections, or access environment variables. WASI (the WebAssembly System Interface) changes that. Think of WASI as the bridge between your Wasm module and the outside world. It’s a group of standards-track API specifications that provide a secure, standard interface for applications compiled to Wasm, and modules targeting it can run anywhere a WASI-capable runtime exists.
Here’s what WASI gives you: a standard, capability-based way for Wasm code to do the things the browser sandbox forbids, including reading and writing files, opening network sockets, accessing environment variables and clocks, and generating random numbers.
You need a runtime to execute .wasm modules outside the browser. The three main options are Wasmtime (the Bytecode Alliance’s reference runtime), Wasmer, and WasmEdge.
In JavaScript, the browser handles everything. This is not the case for Wasm. Wasm gives your module a block of memory that can grow but never shrink. How you manage it depends on your language. Some languages handle it automatically and others rely on you to handle it. Choosing a language with good memory safety, like Rust or AssemblyScript, helps avoid most of the common pitfalls.
Wasm runs single-threaded by default. Multithreading works via Web Workers sharing memory through SharedArrayBuffer. Shared memory means race conditions, so you also need atomic operations to coordinate safely. It also requires specific HTTP headers (COOP and COEP) on your server.
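A tiny JavaScript sketch of the primitives involved: in a real app, each increment would come from a different Web Worker holding its own view over the same SharedArrayBuffer.

```javascript
// SharedArrayBuffer + Atomics: the same building blocks Wasm threads use.
const sab = new SharedArrayBuffer(4);     // 4 bytes = one Int32 slot
const counter = new Int32Array(sab);

// Atomics.add is an indivisible read-modify-write, so there are no torn
// updates even when several workers increment the same slot concurrently.
Atomics.add(counter, 0, 1);
Atomics.add(counter, 0, 1);

console.log(Atomics.load(counter, 0)); // 2
```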
SIMD speeds things up without needing more cores. It runs one operation on multiple values at once using CPU-level vector instructions (think processing four pixels in a single instruction).
Prefer instantiateStreaming() over instantiate(). The difference is simple: instead of waiting for the full download to finish before compiling, it compiles the module while it’s still downloading. For large modules, you can also cache the compiled result in the browser so returning users don’t have to wait at all. And if your app uses multiple Wasm modules, only load them when they’re actually needed.
Wasm’s binary format isn’t human-readable, but source maps fix that by mapping it back to your original source code. This allows you to debug normally in Chrome or Firefox DevTools. For performance, the DevTools Performance tab shows Wasm execution alongside JavaScript. console.time() works fine for quick measurements. Keep an eye on binary size too. A slow download can erase any performance gains.
This is the most common use we see for Wasm: applications built for the browser with features that JavaScript isn’t efficient at handling. Heavy image or video processing, mathematical computations, and data analytics can all be offloaded to Wasm. Since Wasm can’t manipulate the DOM, JavaScript is still needed in these applications.
The typical pattern for hybrid applications is simple: write the computational code in a language that compiles to Wasm and compile it to a .wasm file. You can then call your Wasm function from JavaScript, pass in the data, and get back the result. Keep cross-boundary communication minimal, as passing data back and forth has overhead.
There are now frameworks that let you write your entire web app in Wasm with little to no JavaScript. This allows you to work in one language end-to-end. But there are tradeoffs. Binary sizes are larger, load times are initially slower, and the tooling is much less mature than JavaScript frameworks.
Full Wasm apps make the most sense when your team is already deep in a systems language and performance is a top priority.
Wasm helps bring full 3D games to the browser without any plugins or downloads. Before Wasm, browser games relied on plugins like Adobe Flash or Unity’s browser plugin, both of which were slow, insecure, and eventually killed off by modern browsers. Now a game that used to require a download can run in the browser tab. Load times can be heavy and mobile support has limits, but for playable demos and web-based games it’s one of the most compelling things Wasm can do.
Anything that involves processing large amounts of data in the browser is helped by Wasm. This includes image filters, video transcoding, audio processing, and parsing large files. Before Wasm, this kind of heavy processing either had to happen on a server or ran painfully slowly in JavaScript.
TensorFlow.js has a Wasm backend that runs machine learning models directly in the browser, no server required. That means faster responses and better privacy since user data never leaves the device. Combined with SIMD and multithreading, it delivered up to a 10x speedup over plain JavaScript. Wasm arrived right as ML was becoming mainstream, making it possible to run models in the browser and on edge devices from early on.
Many blockchain platforms use Wasm to run smart contracts (small programs that execute automatically on the blockchain). Wasm is a good fit here because it’s fast, sandboxed, and produces the same result every time regardless of what machine it runs on. The same properties make it popular for cryptography libraries like encryption and hashing, where speed and security both matter.
There are a ton of resources available to help you get started. Here are some links to foster learning and build experience working with Wasm (all free).
Now you have a pretty solid understanding of Wasm’s benefits and shortcomings. It’s not a replacement for JavaScript, and it’s not magic; it’s one of the most powerful tools available on the web today for the right use case. Its ecosystem is still maturing, but the trajectory is clear: Wasm is going to keep showing up in more places, and knowing how it works puts you ahead of the curve.
The post WebAssembly is everywhere. Here’s how it works appeared first on The New Stack.
Every so often, I may have to create a new DirectX 12 program. The DirectX headers and libraries are part of the Windows headers and are generally present with the installation of the desktop development components of Visual Studio. There are some common or popular DirectX 12 related libraries that are not part of that. D3Dx12.h is one such header. I sometimes forget that this isn’t part of the Windows headers. I’m making this post for myself (should I not immediately recall where to find it). This header and other such helpers can be found on GitHub in the Microsoft repository DirectX-Headers, found here.
Once you clone the repository, you’ll need to update your project to look in the folder. In Visual Studio, right-click on your project and select “Properties.” Under “C/C++” select “General” and then “Additional Include Directories.” For all project configurations you will want to add the path to the DirectX-Headers\include\directx folder.
