

Why This Matters

Rewriting the openui-lang parser from Rust to TypeScript made it roughly three times faster, mainly by reducing overhead and streamlining the pipeline stages. For browser-based parsing, minimizing communication and data-transfer overhead can matter more than raw processing speed, which suggests rethinking optimization strategies in WebAssembly and frontend parsing contexts rather than simply reaching for a faster language.


We rewrote our Rust WASM Parser in TypeScript - and it got 3x Faster

Thesys Engineering Team · Fri Mar 13 2026

We built the openui-lang parser in Rust and compiled it to WASM. The logic was sound: Rust is fast, WASM gives you near-native speed in the browser, and our parser is a reasonably complex multi-stage pipeline. Why wouldn't you want that in Rust?

Turns out we were optimising the wrong thing.

The openui-lang parser converts a custom DSL emitted by an LLM into a React component tree. It runs on every streaming chunk — so latency matters a lot. The pipeline has six stages:

autocloser → lexer → splitter → parser → resolver → mapper → ParseResult

Autocloser: makes partial (mid-stream) text syntactically valid by appending minimal closing brackets/quotes

Lexer: single-pass character scanner, emits typed tokens

Splitter: cuts the token stream into id = expression statements

Parser: recursive-descent expression parser, builds an AST
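The autocloser is the stage the excerpt describes most concretely, so here is a minimal TypeScript sketch of that idea: scan the partial text once, track unclosed brackets and quotes on a stack, and append the missing closers innermost-first. The `autoclose` name, the bracket set, and the lack of escape-sequence handling are illustrative assumptions, not the real openui-lang implementation.

```typescript
// Sketch of an autocloser: make a mid-stream chunk syntactically valid
// by appending the minimal set of closing brackets/quotes.
// Assumptions: double-quoted strings only, no escape sequences.
function autoclose(partial: string): string {
  const closers: string[] = []; // stack of pending closers, oldest first
  let inString = false;
  for (const ch of partial) {
    if (inString) {
      if (ch === '"') {
        inString = false;
        closers.pop(); // string was closed in the input itself
      }
      continue; // brackets inside strings are literal text
    }
    if (ch === '"') {
      inString = true;
      closers.push('"');
    } else if (ch === '{') {
      closers.push('}');
    } else if (ch === '[') {
      closers.push(']');
    } else if (ch === '(') {
      closers.push(')');
    } else if (ch === '}' || ch === ']' || ch === ')') {
      closers.pop(); // already balanced in the input
    }
  }
  // Close innermost scopes first: reverse the stack order.
  return partial + closers.reverse().join('');
}
```

For example, `autoclose('card = { title: "Hel')` yields `'card = { title: "Hel"}'`, which the downstream lexer and parser can then treat as a complete statement.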

... continue reading