Bun 1.2 Is Fast. Maybe Too Fast For Production.
I've been running Bun in production for about seven months. I've also rolled two of those deployments back to Node. Neither thing contradicts the other. This is the field report I wish I'd read before I started.
The short version: Bun 1.2 is legitimately, measurably fast. It's also opinionated in ways that don't quite match how production services tend to be written, and those edges aren't visible until you hit them. Whether that's fine or a dealbreaker depends on the service.
The Fast Part Is Real
A few numbers from my own benchmarks on the same m7i.large. Not a clean lab, just a small Node service I rewrote.
| Metric | Node 22 LTS | Bun 1.2 | Delta |
|---|---|---|---|
| Cold start (s) | 1.42 | 0.31 | 4.6× faster |
| HTTP req/sec (hello-world) | 42,100 | 143,600 | 3.4× faster |
| JSON parse (10MB) | 117 ms | 34 ms | 3.4× faster |
| Package install (our repo) | npm: 51 s (pnpm: 18 s) | 4.2 s | 12× faster (vs npm) |
| CPU-bound loop (100M ops) | 780 ms | 710 ms | 1.1× faster |
| Vitest → bun test | 14.2 s | 2.9 s | 4.9× faster |
Every number there is reproducible. Bun's speed is not marketing; in the areas it optimizes, it is substantially faster. The startup time alone changes the economics of serverless enough that it's worth caring about.
Where It's Actually Better
Serverless cold-start-sensitive workloads. If you're running Lambda, Cloudflare Workers, or any similar platform where startup latency shows up in user-facing P99s, Bun is a strict win. The same Express/Hono handler has a 3–5× smaller cold-start tail.
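To make the shape concrete, here's a minimal sketch of the kind of handler this applies to, written against the WHATWG fetch-handler signature that Bun.serve (and most edge platforms) accept. The route and payload are illustrative, not from my production service.

```javascript
// A fetch-style handler: takes a Request, returns a Response.
// Keeping it a plain function makes it portable across Bun, Workers,
// and Node adapters, and trivially unit-testable.
async function handler(req) {
  const url = new URL(req.url);
  if (url.pathname === "/healthz") {
    return new Response(JSON.stringify({ ok: true }), {
      headers: { "content-type": "application/json" },
    });
  }
  return new Response("hello", { status: 200 });
}

// Under Bun this would be wired up roughly as:
//   Bun.serve({ port: 3000, fetch: handler });
```

Because the handler never touches a server object, the same function runs unchanged under Node's fetch-based adapters, which keeps a rollback path open.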
CI pipelines. bun install alone shaves minutes off a typical monorepo's CI run. bun test on top of that is nice. If you measure your CI cycle time in money, Bun pays back quickly.
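For reference, the CI change is small. A sketch of a GitHub Actions job, assuming the community oven-sh/setup-bun action; job names and version pins here are illustrative:

```yaml
# Illustrative CI job: install dependencies and run tests with Bun.
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: oven-sh/setup-bun@v2   # community action; pin the version you audit
      - run: bun install --frozen-lockfile
      - run: bun test
```

The --frozen-lockfile flag makes CI fail rather than silently re-resolve, which matters given the resolution differences discussed below.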
Greenfield edge services. A new HTTP service written in Bun from day one gets fewer hops between source and running code. The builtin bundler, test runner, and TypeScript support mean a minimal service might not need any other tools. If you like the reduced surface area, it's a real ergonomic win.
Shebang scripts. #!/usr/bin/env bun on a TypeScript file that Just Runs is small and beautiful. This killed the "should I set up ts-node" question for scripts forever.
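A sketch of what such a script looks like; the flag parser and names are my own illustration, not a Bun API. With Bun the same file works as .ts with type annotations added, no build step.

```javascript
#!/usr/bin/env bun
// A throwaway ops script that runs directly via the shebang.
// The parsing logic is split into a function so it can be tested.
function parseKeyValueArgs(argv) {
  const out = {};
  for (const arg of argv) {
    // Accept only --key=value pairs; ignore everything else.
    const m = /^--([^=]+)=(.*)$/.exec(arg);
    if (m) out[m[1]] = m[2];
  }
  return out;
}

const flags = parseKeyValueArgs(process.argv.slice(2));
if (flags.verbose) console.error("flags:", flags);
```

chmod +x and it runs: no package.json, no tsconfig, no compile step.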
Where It Gets Awkward
These aren't bugs. They're design decisions, and they bite if you're used to Node.
Native modules and Node-API compatibility
Bun ships Node-API (N-API) support and most native modules work. Some don't. The ones that don't tend to be the boring, ancient, "still maintained but barely" modules your backend actually depends on. My first rollback was because a legacy Oracle client crashed in Bun. The second was because a monitoring agent's native module was subtly miscompiled. Both were fixed eventually; both cost me a weekend.
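One mitigation that would have softened both of my rollbacks: make the native dependency swappable at load time. This is a hedged sketch of my own, not anything from Bun's docs; the module names in the usage comment are hypothetical.

```javascript
// Try a list of interchangeable drivers in order and return the first
// one that loads, so a native module that crashes at require-time under
// one runtime degrades to a pure-JS fallback instead of taking the
// service down.
function loadFirstAvailable(candidates, loader = require) {
  const errors = [];
  for (const name of candidates) {
    try {
      return { name, mod: loader(name) };
    } catch (err) {
      errors.push(`${name}: ${err.message}`);
    }
  }
  throw new Error(`no candidate loaded:\n${errors.join("\n")}`);
}

// Usage (module names are hypothetical):
//   const { name, mod } = loadFirstAvailable(["fast-native-client", "pure-js-client"]);
//   console.error(`db driver: ${name}`);
```

Logging which driver actually loaded is the important part: a silent fallback hides the exact performance surprise you'd want to catch in staging.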
AsyncLocalStorage and request-scoped context
Bun's AsyncLocalStorage works but has had corner cases that don't match Node's exactly — around worker threads, around certain timer callbacks. If you use request-scoped context for tracing or multi-tenancy, test thoroughly. This is better in 1.2 than 1.0 but not equivalent yet.
Debugger and observability tooling
Node inspector integration is mature; Bun's is newer. Most tools (Chrome DevTools, VSCode debugger, Datadog APM, Sentry) work with Bun now, but the fidelity of traces is slightly lower and a few profiling features lag. If you rely on CPU flame graphs, check your APM vendor's current Bun support before switching.
npm ecosystem fit
Bun's package manager resolves and installs differently than npm/pnpm. Mostly this is invisible. Occasionally — typically with complex peer-dependency trees — it produces a slightly different resolved graph than the same package.json under npm. In practice this surfaces as a "works on my machine, differently on yours" surprise a few times a year.
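The cheap defense is to diff the resolved graphs mechanically rather than waiting for the surprise. A hedged sketch: it assumes you've already flattened each lockfile into a plain {packageName: version} map (how you extract that map from each lockfile format is left out here).

```javascript
// Compare two flattened {package: version} maps and report every package
// that resolved to a different version (or exists in only one graph).
function diffResolved(npmResolved, bunResolved) {
  const diffs = [];
  const names = new Set([...Object.keys(npmResolved), ...Object.keys(bunResolved)]);
  for (const name of names) {
    if (npmResolved[name] !== bunResolved[name]) {
      diffs.push({ name, npm: npmResolved[name], bun: bunResolved[name] });
    }
  }
  return diffs;
}
```

Wired into CI, a non-empty diff turns the once-a-year mystery into a reviewable list of packages before the divergent tree ever ships.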
Breaking changes between minor versions
Bun has been stable in 1.x but moves fast. Some 1.1 → 1.2 upgrades required real work. This isn't unusual for a young runtime; it is unusual if you were expecting Node LTS's "nothing breaks ever" cadence.
What I Actually Do
By service type, in 2026:
- New serverless handlers: Bun by default. Cold-start wins pay for any migration cost, and there usually isn't one.
- New internal HTTP services: Bun unless there's a compelling reason to use a specific Node ecosystem tool.
- Existing production services on Node: staying on Node. The rewrite cost isn't justified by the performance delta for most services, and the rollback risk is real.
- CI pipelines: Bun (or pnpm — they're both massively faster than npm). bun install and bun test even for Node runtimes, as long as the install produces the same resolved tree.
- Scripts and tooling: Bun, no exceptions. The startup time alone makes it the right choice.
The Bigger Story
Bun, Deno, and Node are converging in useful ways. Bun pushed the standard library forward with Web APIs and bundled tooling; Deno pushed security primitives forward; Node responded by integrating fetch, test, type-stripping, and a better permission model. The 2026 JS-runtime landscape is genuinely better across the board, and that's a Bun victory even where you don't deploy Bun.
The "maybe too fast" part of the title is not just a hook. Bun is fast enough to make the thing that doesn't work stick out more. When your cold start is a second, a native module misbehaving is noise. When your cold start is 300ms, that same misbehavior is the visible bottleneck. The edges are sharper because the middle is fast. That's not a flaw, it's a consequence of the tradeoff, and it's the thing you should go in with your eyes open about.