WebAssembly from the Ground Up
@wasmgroundup.com
220 followers 190 following 51 posts
A book about WebAssembly by @marianoguerra.org and @dubroy.com — learn Wasm by building a simple compiler in JavaScript. https://wasmgroundup.com/
Pinned
wasmgroundup.com
Excited to announce the official launch of our online book, WebAssembly from the Ground Up! 🎉

It's the book we wish we'd had 3 years ago.

No messing with tools and frameworks. It's a hands-on guide to the core of Wasm: the instruction set and module format.

Link below. 👇
Learn Wasm by building a simple compiler in JavaScript. No compiler expertise necessary. All the code is in the book; we'll take you through it step by step. Get your hands dirty and see for yourself what WebAssembly is all about.
wasmgroundup.com
A look at how people are shipping Wasm as part of JavaScript libraries.
dubroy.com
TIL: WebAssembly library initialization patterns
→https://github.com/pdubroy/til/blob/main/js/2025-10-05-WebAssembly-library-initialization-patterns.md
I was curious how people are shipping WebAssembly as part of JavaScript libraries. There are two main questions:

1. How to bundle or fetch the .wasm payload, which may be relatively large.
2. How to deal with module instantiation, which is typically async.
In this post, I'm only concerned with #2 — how to deal with module instantiation.

Async factory function as default export
Emscripten's modularized output option produces a module whose default export is an async factory function:

import MyLib from './my-lib.js';
const myLib = await MyLib();
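A rough sketch of what such a factory does internally (this is not Emscripten's actual generated code, and the file name is made up): it instantiates the module and resolves with an object exposing the exports.

// Sketch only: the default export is an async factory that instantiates
// the Wasm module and resolves with the library's API.
export default async function MyLib() {
  const { instance } = await WebAssembly.instantiateStreaming(fetch('./my-lib.wasm'));
  return { ...instance.exports };
}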
Required async init
Another pattern I've seen is to export an async initialization function, which you're required to call before using the library. E.g., in esbuild-wasm:

import * as esbuild from 'esbuild-wasm'

await esbuild.initialize({
  wasmURL: './node_modules/esbuild-wasm/esbuild.wasm',
})
And in the Automerge unbundled example:

import * as AutomergeRepo from "https://esm.sh/@automerge/react@2.2.0/slim?bundle-deps";

await AutomergeRepo.initializeWasm(
  fetch("https://esm.sh/@automerge/automerge/dist/automerge.wasm")
);
wasm-bindgen without a bundler:

import init, { add } from './my_lib.js';
await init();
Hidden async init
PGlite has a nice variation on the "required async init" pattern. You can synchronously initialize a PGlite instance:

import { PGlite } from '@electric-sql/pglite'
const pg = new PGlite();
Internally, the constructor instantiates a Wasm module and stores the promise in the object's waitReady field. Most of the methods on the PGlite instance are async, and they await the promise.
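A minimal sketch of that shape (the waitReady name comes from PGlite, but the class, file name, and add export below are made up for illustration):

class HiddenInitLib {
  constructor() {
    // The constructor kicks off instantiation and keeps the promise around.
    this.waitReady = WebAssembly.instantiateStreaming(fetch('./my-lib.wasm'))
      .then(({ instance }) => { this.instance = instance; });
  }

  async add(a, b) {
    // Every public method awaits the promise before touching the instance.
    await this.waitReady;
    return this.instance.exports.add(a, b);
  }
}

const lib = new HiddenInitLib(); // construction is synchronous, like `new PGlite()`
lib.add(2, 3).then(console.log);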
Top-level await
In principle, a library could use top-level await to do the initialization step internally, rather than requiring it in the application code. I'm not aware of any libraries that use this approach directly though.

It does have transitive effects on any code that imports the module:

Here’s what happens when you use top-level await in a module:

The execution of the current module is deferred until the awaited promise is resolved.
The execution of the parent module is deferred until the child module that called await, and all its siblings, export bindings.
The sibling modules, and siblings of parent modules, are able to continue executing in the same synchronous order — assuming there are no cycles or other awaited promises in the graph.
The module that called await resumes its execution after the awaited promise resolves.
The parent module and subsequent trees continue to execute in a synchronous order as long as there are no other awaited promises.
In other words, for the application code, it's as if it awaited the init function at the top of the module.
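For illustration, a hypothetical library module taking this approach might look something like this (the file names and the add export are made up):

// my-lib.js
const { instance } = await WebAssembly.instantiateStreaming(fetch('./my-lib.wasm'));

export function add(a, b) {
  return instance.exports.add(a, b);
}

Application code would just import { add } from './my-lib.js' and call add directly, with the deferred execution described above happening behind the scenes.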

ES Module integration
Both Node and Deno now support importing Wasm modules directly:

import { add } from "./add.wasm";

console.log(add(1, 2));
My understanding is that it's equivalent to the following code:

const { instance } = await WebAssembly.instantiateStreaming(fetch("./add.wasm"));
const { add } = instance.exports;
AFAIK, it has the same implications for module execution as an explicit top-level await does.

Synchronous instantiation
A final option is to synchronously instantiate the module — either using Node's readFileSync to load from a file, or by embedding the module as base64:

const bytes = Uint8Array.from(atob('AGFzbQEAAAABBwFgAn9/AX8DAgEABwcBA2FkZAAACgkBBwAgACABags='), c => c.charCodeAt(0));
const inst = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(inst.exports.add(2, 3)); // Prints 5
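The readFileSync variant would look something like this (a sketch, assuming add.wasm sits next to the script):

import { readFileSync } from 'node:fs';

// Synchronously read and instantiate the module; no await needed.
const bytes = readFileSync(new URL('./add.wasm', import.meta.url));
const inst = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(inst.exports.add(2, 3)); // Prints 5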
wasmgroundup.com
An interesting article about how Figma moved from WebGL to WebGPU —

Figma rendering: Powered by WebGPU
www.figma.com/blog/figma-r...

Also talks a bit about their overall app architecture, and how they use #wasm.
But how do we actually use WebGPU?

Our renderer is written in C++. We compile this C++ code to WebAssembly (Wasm) using Emscripten, in order to use it in the main Figma application. But we also compile our C++ renderer code into a native x64/arm64 application used for server-side rendering, as well as testing and debugging. So we needed a way to write code using the WebGPU C/C++ API and have it work in both cases, with minimal per-platform branching.

For Wasm, we decided to use Emscripten’s built-in WebGPU bindings support. This means that C++ WebGPU calls ultimately end up using the WebGPU browser API in JavaScript, even though our code is written in C++. We also had to write some of our own custom C++/JS bindings in cases where these built-in bindings weren’t performant enough.
Reposted by WebAssembly from the Ground Up
titzerbl.bsky.social
@dubroy.com alerted me to the Safari release notes (developer.apple.com/documentatio...), which say that version 26 ships an in-place interpreter for WebAssembly that is in part based on the Wizard design, but adapted for Safari's use cases.

This is cool!

Wasm brings all the VMs to tiers!
developer.apple.com
wasmgroundup.com
Nice use of Wasm — bundling lightweight WebAssembly decoders with the data.

AnyBlox: A Framework for Self-Decoding Datasets
gienieczko.com/anyblox-paper

/ht Jamie Brandon (www.scattered-thoughts.net/log/0054/)
First page of an academic paper entitled "AnyBlox: A Framework for Self-Decoding Datasets", by Mateusz Gienieczko and four other authors (including Thomas Neumann), all from TU Munich.

The abstract reads: "Research advancements in storage formats continuously produce more efficient encodings and better compression rates. Despite this, new formats are not adopted due to high implementation cost and existing formats cannot evolve because they need to maintain compatibility across systems. Can this problem be solved by introducing a new abstraction? We answer affirmatively with AnyBlox, a framework for reading arbitrary datasets using lightweight WebAssembly decoders bundled with the data. By decoupling decoders from both systems and file format specifications, AnyBlox allows transparent format evolution, instance-optimized encodings, and enables mainstream adoption of research advancements. It integrates seamlessly with modern systems like DuckDB, Spark, and Umbra, while delivering solid performance and security guarantees."

On the right, a colourful diagram is captioned: "The N ×M problem: In the past (left) every system controlled its data and internal format. Today (right) data is outside of systems’ control. No system supports all formats and each system has specialized glue code for each format."
wasmgroundup.com
Designing syntax is hard!

Some good examples here of what makes language design so tricky.
dubroy.com
From the author of Pandoc —

"I'll go through the six features of Markdown that I think have created the most difficulties"

Beyond Markdown: johnmacfarlane.net/beyond-markd...
John MacFarlane - Beyond Markdown
John MacFarlane
johnmacfarlane.net
wasmgroundup.com
Many of our friends in North America are back to school already, but we've still got two more weeks of holidays over here.

Which means: it's not too late to snag a copy of the book in our summer sale!
wasmgroundup.com
The summer holidays just started here in southern Germany.

So, it's time for a ☀️ SUMMER SALE ☀️ —

Use the code SUMMER25 for a 25% discount on the book until Sept 15: wasmgroundup.com
WebAssembly from the Ground Up
A book about WebAssembly — learn Wasm by building a simple compiler in JavaScript.
wasmgroundup.com
wasmgroundup.com
Had a fun time recording a podcast episode today…looking forward to sharing it with y'all soon!
Reposted by WebAssembly from the Ground Up
kow.fm
Kow @kow.fm · Aug 26
Lately I’ve been learning more about WebAssembly fundamentals by reading this fantastic book: wasmgroundup.com .

It’s really well written and focuses on how WebAssembly is written by teaching you to write a compiler for it. Which is honestly pretty genius.
WebAssembly from the Ground Up
A book about WebAssembly — learn Wasm by building a simple compiler in JavaScript.
wasmgroundup.com
wasmgroundup.com
The summer holidays just started here in southern Germany.

So, it's time for a ☀️ SUMMER SALE ☀️ —

Use the code SUMMER25 for a 25% discount on the book until Sept 15: wasmgroundup.com
WebAssembly from the Ground Up
A book about WebAssembly — learn Wasm by building a simple compiler in JavaScript.
wasmgroundup.com
wasmgroundup.com
Wasm modules are enabled by default in the latest version of Node (v24.5.0)! 🎉
dubroy.com
TIL: Wasm modules in Node
github.com/pdubroy/til...
Node 24.5.0, which was released last week, now supports Wasm modules out of the box! (Previously, it was available under the --experimental-wasm-modules flag.)

Example

Let's look at an example. Here's the .wat for a simple module with a single export: an add function that adds two integers:

(module
  (func $add (param $a i32) (param $b i32) (result i32)
    local.get $a
    local.get $b
    i32.add)
  (export "add" (func $add)))

After compiling this to .wasm (e.g. with wat2wasm add.wat), you can now import it like so:

import { add } from './add.wasm';

console.log(add(2, 3)); // prints 5

Imports

What about imports? Node will handle those transparently, by trying to import a module with that name. For example:

(module
  (import "./log.js" "log2" (func $log2 (param i32) (param i32)))
  (func $add (param $a i32) (param $b i32) (result i32)
    local.get $a
    local.get $b
    call $log2
    local.get $a
    local.get $b
    i32.add)
  (export "add" (func $add)))

Imports in WebAssembly always have a module name ("./log.js") and an item name ("log2"). Conveniently, these can be mapped directly to ES modules — so we can import this module just like before, and it will just work.

ps - If messing around with JavaScript and Wasm sounds like fun, you should check out my book WebAssembly from the Ground Up.
Reposted by WebAssembly from the Ground Up
dubroy.com
Pretty pleased with the ergonomics of the Wasm (macro-)assembler in @ohmjs.org.

It's built on the low-level assembler lib we created for @wasmgroundup.com, but has some nice higher-level features, including labeled breaks. I'm particularly proud of the idea to put the block label at the end. 😊
A code screenshot showing a TypeScript/JavaScript function called wrapTerminalLike that generates WebAssembly (WASM) code. The code contains:

• A function definition that takes a thunk parameter
• Assembly code generation using an asm object with methods like block(), localSet(), break(), etc.
• A break statement that references a label called '_success' (highlighted with a pink arrow and annotation "This...")
• A label definition for '_success' at the bottom (highlighted with a red arrow and annotation "...goes here")
• Additional assembly operations like newTerminalNodeWithSavedPos(), updateLocalFailurePos(), and setRet()

The pink annotations with arrows illustrate the control flow relationship between the break statement and its corresponding label, showing how the break jumps to the '_success' label.
Reposted by WebAssembly from the Ground Up
ohmjs.org
OhmJS @ohmjs.org · Jul 14
We're still looking for more sponsors! If you or your company can help fund this effort: github.com/sponsors/pd...

Since 2017, my work on Ohm has been unpaid. Your sponsorship helps the project be sustainable, ensuring that I can maintain and improve Ohm for many years to come!
Sponsor @pdubroy on GitHub Sponsors
I'm the lead developer of Ohm (ohmjs.org), a user-friendly parsing toolkit for JavaScript and TypeScript. I also created the Ohm Editor, and the interactive visualization for understanding and ...
github.com
Reposted by WebAssembly from the Ground Up
wingolog.org
acm queue issue out on webassembly, with articles by @littledan.dev, conrad watt, ben titzer, & yours truly: queue.acm.org/issuedetail....
WebAssembly - ACM Queue
queue.acm.org
wasmgroundup.com
A good question we were asked recently —

"Does V8 optimize WebAssembly similar to JS?"

The latest post on the V8 blog, "Speculative Optimizations for WebAssembly using Deopts and Inlining", provides some answers!

v8.dev/blog/wasm-s...
Fast execution of JavaScript relies heavily on speculative optimizations. That is, JIT-compilers make assumptions when generating machine code based on feedback that was collected during earlier executions. For example, given the expression a + b, the compiler can generate machine code for an integer addition if past feedback indicates that a and b are integers (and not strings, floating point numbers, or other objects). Without making such assumptions, the compiler would have to emit generic code that handles the full behavior of the + operator in JavaScript, which is complex and thus much slower. If the program later behaves differently and thus violates assumptions made when generating the optimized code, V8 performs a deoptimization (or deopt, for short). That means throwing away the optimized code and continuing execution in unoptimized code (and collecting more feedback to possibly tier-up again later).

In contrast to JavaScript, fast execution of WebAssembly hasn’t required speculative optimizations and deopts. One reason is that WebAssembly programs can already be optimized quite well because more information is statically available as e.g., functions, instructions, and variables are all statically typed. Another reason is that WebAssembly binaries are often compiled from C, C++, or Rust. These source languages are also more amenable to static analysis than JavaScript, and thus toolchains such as Emscripten (based on LLVM) or Binaryen can already optimize the program ahead-of-time. This results in fairly well-optimized binaries, at least when targeting WebAssembly 1.0, which launched in 2017.
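As a toy illustration of the speculation-and-deopt idea described above (this example is ours, not from the V8 post):

function add(a, b) {
  return a + b;
}

// After many calls with integer arguments, the optimizing JIT can emit
// machine code specialized for integer addition.
for (let i = 0; i < 100_000; i++) add(i, 1);

// A later call that violates that assumption (strings instead of integers)
// triggers a deopt: the specialized code is thrown away and execution
// continues in unoptimized code, collecting fresh feedback.
add("foo", "bar");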
Reposted by WebAssembly from the Ground Up
dubroy.com
Here's what I've been working on the past couple months! It's been a lot of fun, and the prototype shows some promising perf improvements (~10x).

It's still about 8–10 weeks of effort to productionize this. If you or your company can help fund that work, please get in touch!
ohmjs.org
OhmJS @ohmjs.org · Jun 23
Ever wanted to use Ohm from another language? Go, Python, Rust?

See the brand new, experimental support for compiling Ohm grammars to Wasm: github.com/ohmjs/ohm/d...

It wasn't the main goal, but it also appears to be a perf win — parsing is about 10x faster on real-world grammars (e.g. ES5).
A couple months ago, I started prototyping a new feature: the ability to compile an Ohm grammar to WebAssembly, so that it can be used from languages other than JS. You can find more background (use cases and implementation details) in #503.

The MVP is complete, and this feature is now available for early testing (with some limitations). We're very interested to get feedback and hear what use cases people have for this.

Although it wasn't the main motivation for this work, early benchmarks are showing ~10x improvement in parse times. Take this with a grain of salt — since the implementation isn't yet complete, it's hard to know what the final performance will look like. But it looks like this will be a significant performance win for real-world grammars!
Reposted by WebAssembly from the Ground Up
wasmgroundup.com
What's the opposite of a cautionary tale?

Well, yeah…this is one of those. Learn #wasm and you might just have a helluva lot of fun.
dubroy.com
Ok ok this may not make a lot of sense but I am *extremely* happy to have gotten this code working!! So I feel compelled to share.

(And yes, I wrote my own macro-assembler for #wasm in JavaScript. That's how much fun I'm having, and yes I mean that sincerely.)

1/x
wasmgroundup.com
A helpful technique for debugging generated #wasm code!
dubroy.com
Another take on #wasm debugging, previous one was over-complicated 😅

I wanted a way to embed comments in generated WebAssembly code, to orient myself when stepping thru disasm'd code in the dev tools.

Turns out named globals is a handy trick for that!

1/x
wasmgroundup.com
Wondering if the book is *practical*?

I'm currently using the helper library extracted from the book for a new @ohmjs.org feature which will allow you to compile Ohm grammars to Wasm.

This will make it possible to use Ohm grammars from Go, Python, etc.

Details here 👉 github.com/ohmjs/ohm/is...
dubroy.com
btw what I'm using here is the "assembler library" that we construct bit by bit in @wasmgroundup.com. It's also available as a standalone NPM package: www.npmjs.com/package/@was...

There's not yet much documentation (outside of the book), but we'll work on that soon!
wasmgroundup.com
Wireworld is a kind of cellular automaton, similar to Conway's Game of Life.

Wasm4-Wireworld is an implementation of Wireworld for the WASM-4 "fantasy console", using Hoot, a Scheme-to-Wasm compiler: spritely.institute/news/hoot-wi...
wasmgroundup.com
After using @ohmjs.org to teach WebAssembly, @dubroy.com will be working on first-class #wasm support in Ohm.
wasmgroundup.com
Sorry, just saw this! Definitely interested to hear more — sent you an email.
wasmgroundup.com
Great article by @saelo.bsky.social about the V8 sandbox: v8.dev/blog/sandbox

"the overall construction is therefore not unlike the sandboxing model used by WebAssembly"
After almost three years since the initial design document and hundreds of CLs in the meantime, the V8 Sandbox — a lightweight, in-process sandbox for V8 — has now progressed to the point where it is no longer considered an experimental security feature. Starting today, the V8 Sandbox is included in Chrome's Vulnerability Reward Program (VRP). While there are still a number of issues to resolve before it becomes a strong security boundary, the VRP inclusion is an important step in that direction. Chrome 123 could therefore be considered to be a sort of "beta" release for the sandbox. This blog post uses this opportunity to discuss the motivation behind the sandbox, show how it prevents memory corruption in V8 from spreading within the host process, and ultimately explain why it is a necessary step towards memory safety.

Motivation

Memory safety remains a relevant problem: all Chrome exploits caught in the wild in the last three years (2021 – 2023) started out with a memory corruption vulnerability in a Chrome renderer process that was exploited for remote code execution (RCE). Of these, 60% were vulnerabilities in V8. However, there is a catch: V8 vulnerabilities are rarely "classic" memory corruption bugs (use-after-frees, out-of-bounds accesses, etc.) but instead subtle logic issues which can in turn be exploited to corrupt memory. As such, existing memory safety solutions are, for the most part, not applicable to V8. In particular, neither switching to a memory safe language, such as Rust, nor using current or future hardware memory safety features, such as memory tagging, can help with the security challenges faced by V8 today.