Bun v1.1.23 is here! This release fixes 27 bugs (addressing 128 👍). Support for TextEncoderStream and TextDecoderStream, 50% faster console.log(string), support for Float16Array, better out-of-memory handling, Node.js<>Bun IPC on Windows, truncated large arrays in console.log(), and many more bug fixes and Node.js compatibility improvements.
We're hiring systems engineers in San Francisco to build the future of JavaScript!
Previous releases
v1.1.22: Fixes 72 bugs (addressing 63 👍). 30% faster fetch() decompression, new --fetch-preconnect flag, improved Remix support, Bun gets 4 MB smaller on Linux, bundle packages excluding dependencies, and many bundler fixes and Node.js compatibility improvements.
v1.1.0: Bundows. Windows support is here!
To install Bun
# macOS/Linux (curl)
curl -fsSL https://bun.sh/install | bash
# npm
npm install -g bun
# Windows (PowerShell)
powershell -c "irm bun.sh/install.ps1|iex"
# Scoop
scoop install bun
# Homebrew
brew tap oven-sh/bun
brew install bun
# Docker
docker pull oven/bun
docker run --rm --init --ulimit memlock=-1:-1 oven/bun
To upgrade Bun
bun upgrade
New features
TextEncoderStream & TextDecoderStream
We've implemented the TextEncoderStream and TextDecoderStream Web APIs. These APIs are the streaming equivalents of TextEncoder and TextDecoder.
You can use TextDecoderStream to decode a stream of bytes into a stream of UTF-8 strings.
const response = await fetch("https://example.com");
const body = response.body.pipeThrough(new TextDecoderStream());

for await (const chunk of body) {
  console.log(chunk); // typeof chunk === "string"
}
You can use TextEncoderStream to encode a stream of UTF-8 strings into a stream of bytes.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, world!");
    controller.close();
  },
});

const body = stream.pipeThrough(new TextEncoderStream());
for await (const chunk of body) {
  console.log(chunk); // chunk instanceof Uint8Array
}
TextEncoderStream in Bun is relatively fast.
"TextEncoderStream in Bun v1.1.23 encodes the same input 3x - 30x faster than in Node v22.5.1" — Jarred Sumner (@jarredsumner), August 10, 2024
Why TextEncoderStream?
The TextEncoderStream API is used in popular packages like Next.js' App Router for middleware.
Since most natively-implemented streams in Bun support both text and binary data, you rarely need to use TextEncoderStream in Bun. In fact, using TextEncoderStream in Bun is slower than not using it because it adds significant overhead to the stream. So, don't use this API unless a library you depend on is already using it.
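Since Bun's native streams accept text chunks directly, the earlier example can usually be written without the extra encoding pass. A minimal sketch, assuming Response accepts a stream of string chunks as described above:

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, world!"); // a string chunk, no manual encoding needed
    controller.close();
  },
});

// No TextEncoderStream in the pipeline
const response = new Response(stream);
console.log(await response.text()); // "Hello, world!"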
stream option in TextDecoder
This release also adds support for the stream option in TextDecoder. This tells the decoder that chunks are part of a larger stream, and that it should not throw an error if a chunk ends in an incomplete UTF-8 sequence.
const decoder = new TextDecoder("utf-8");
const first = decoder.decode(new Uint8Array([226, 153]), { stream: true });
const second = decoder.decode(new Uint8Array([165]), { stream: true });
console.log(first); // ""
console.log(second); // "♥"
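The stream option is most useful when decoding a body chunk by chunk. A minimal sketch of that pattern (the URL is a placeholder):

const decoder = new TextDecoder("utf-8");
const response = await fetch("https://example.com");

let text = "";
for await (const chunk of response.body) {
  // With { stream: true }, an incomplete trailing UTF-8 sequence is buffered
  // until the next chunk instead of throwing
  text += decoder.decode(chunk, { stream: true });
}
text += decoder.decode(); // flush any remaining buffered bytes
console.log(text);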
50% faster console.log(string)
In this release of Bun, we made console.log(string) 50% faster.
"In the next version of Bun, console.log(string) gets 50% faster, thanks to @justjs14" — Bun (@bunjavascript), August 13, 2024
Previously, Bun had a fast path for printing a single string with console.log(). In theory, this meant only one syscall would be needed to print the string. In practice, it wasn't, because console.log() also needs to print a newline character and reset ANSI escape codes. We made it faster by removing this not-so-fast path and buffering the write like in other cases.
Thanks to @billywhizz and @nektro for making this faster!
Truncate large arrays in console.log()
When you pass a large array to console.log() in Bun, instead of printing the entire array element by element, Bun now stops after printing 100 elements and prints an ellipsis with a count (... 1,234 more items) to indicate that there are more elements.
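For example (the exact formatting of the output is approximate):

const items = Array.from({ length: 1_234 }, (_, i) => i);
console.log(items);
// Prints the first 100 elements, followed by:
// ... 1,134 more items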
"In the next version of Bun, console.log truncates arrays after 100 elements" — meghan 🌻 (@nektro), August 10, 2024
Thanks to @nektro for working on this feature!
Float16Array
This release of Bun adds support for the new Float16Array API, a stage 3 TC39 proposal that was implemented in WebKit.
const float16 = new Float16Array(3);
const float32 = new Float32Array(3);

for (let i = 0; i < 3; i++) {
  float16[i] = i + 0.123;
  float32[i] = i + 0.123;
}

console.log(float16); // Float16Array(3) [ 0, 1.123046875, 2.123046875 ]
console.log(float32); // Float32Array(3) [ 0, 1.1230000257492065, 2.122999906539917 ]
Thanks to the WebKit team for implementing this feature!
Better handling of out-of-memory errors
We improved how out-of-memory errors are handled in Response, Request, Blob, and node:fs. Previously, Bun would crash or potentially truncate data if an operation exceeded an engine limit. Now, Bun checks whether an operation will definitely exceed a limit and properly throws an error if it does.
import { expect } from "bun:test";

const buf = Buffer.alloc(4294967295, "abc");

try {
  const blob = new Blob([buf, buf]);
  await blob.text();
} catch (e) {
  expect(e.message).toBe("Cannot create a string longer than 2^32-1 characters");
  expect(e.code).toBe("ERR_STRING_TOO_LONG");
}
// Before: `catch` block would not be called
// After: `catch` block is called
The improved out-of-memory error handling affects the following APIs:
- the text(), json(), bytes(), formData(), and arrayBuffer() methods on Blob, Request, and Response
- fs.writeFile() & fs.readFile()
Node.js compatibility improvements
Fixed: fs.F_OK, fs.R_OK, fs.W_OK, and similar constants
We fixed a bug where node:fs constants, such as F_OK, were not defined in Bun. These were deprecated in favor of fs.constants in Node.js 20, but are still defined for compatibility reasons.
import fs from "node:fs";
console.log(fs.F_OK); // old way
console.log(fs.constants.F_OK); // new way
Fixed: fs.readFile memory & size limits
Strings and typed arrays in Bun are limited to 2^32 characters in length by the engine (JavaScriptCore). Node.js/V8 has a lower limit for strings and a higher limit for typed arrays.
When reading files larger than this limit with fs.readFile, Bun previously behaved incorrectly. In certain cases, Bun would crash due to incorrect JavaScript exception handling; in other cases, Bun could return a truncated string or typed array.
Now, Bun throws an error as soon as it knows the file cannot be read into JavaScript. This saves memory and CPU time because it avoids reading the complete file into memory before throwing the error.
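A minimal sketch of the new behavior; the path is a placeholder, and the error code is an assumption based on the Blob example above:

import fs from "node:fs/promises";

try {
  // Throws as soon as Bun knows the file cannot fit in a JavaScript string
  await fs.readFile("/path/to/a-very-large-file.txt", "utf-8");
} catch (e) {
  console.log(e.code); // assumed: "ERR_STRING_TOO_LONG"
}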
Fixed: Backpressure in 'ws' module
We fixed a bug where calling WebSocket.send() under high load would cause a message to be sent multiple times. This was due to incorrect handling of backpressure when a message was rejected by the socket.
import { WebSocket } from "ws";

const ws = new WebSocket("ws://example.com");

ws.on("open", () => {
  for (let i = 0; i < 1_000; i++) {
    ws.send(`Hello, world! ${i}`);
  }
});
// Before: | After:
// ... | ...
// Hello, world! 999 | Hello, world! 997
// Hello, world! 999 | Hello, world! 998
// Hello, world! 999 | Hello, world! 999
// ... | ...
This only affected WebSocket clients created by importing the ws package, which Bun replaces with its own implementation of WebSocket.
Thanks to @cirospaciari for fixing this bug!
Cross-runtime IPC on Windows with Node.js
We fixed a bug where inter-process communication (IPC) in node:child_process would not work on Windows when sending messages between Bun and Node.js processes. In certain cases, this could cause some build tools to hang on Windows.
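As a rough sketch of the pattern this fixes, a Bun parent process can fork a Node.js child on Windows and exchange messages over the IPC channel (file names are hypothetical):

// parent.ts, run with Bun
import { spawn } from "node:child_process";

const child = spawn("node", ["child.js"], {
  stdio: ["inherit", "inherit", "inherit", "ipc"], // the extra "ipc" fd enables send()/on("message")
});
child.on("message", (message) => console.log("from node:", message));
child.send("hello from bun");

// child.js, run with Node.js
process.on("message", (message) => {
  process.send(`echo: ${message}`);
});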
Thanks to @cirospaciari for fixing this bug!
JIT crashes in node:vm
We fixed several crashes that could occur when JIT-compiled code is evaluated in a node:vm context. This would occur when the globalThis object in the node:vm context was not the same as the real globalThis object.
import { Script } from "node:vm";

const script = new Script(`
  for (let i = 0; i < 1_000_000; i++) {
    performance.now();
  }
`);

script.runInContext(globalThis); // ok
script.runInContext({ performance }); // would crash
This affected only certain code paths, after about a million invocations:
- performance.now()
- TextEncoder.encode() & TextDecoder.decode()
- crypto.randomUUID() & crypto.getRandomValues()
- crypto.timingSafeEqual()
Thanks to @dylan-conway for fixing this bug!
Bugfixes
Fixed: Buffering gigabytes of data over Bun.serve()
We fixed a bug where receiving and then streaming a large response from Bun.serve() would cause the response to be truncated: if the data was larger than 2^32-1 bytes, the flush would be truncated.
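A minimal sketch of the kind of handler this affected (the upstream URL is a placeholder):

Bun.serve({
  port: 3000,
  async fetch() {
    const upstream = await fetch("https://example.com/very-large-file");
    // Streamed bodies larger than 2^32-1 bytes are no longer truncated
    return new Response(upstream.body);
  },
});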
Thanks to @cirospaciari for fixing this bug!
Fixed: moving tagged template literals when bundling
We fixed a bundler bug where tagged template literals would be incorrectly moved during bundling. This caused the template literal to be evaluated in the wrong scope, making it throw an error.
// entry file
globalThis.foo = () => console.log("foo");
const bar = await import("./bar.ts");
console.log("bar");

// ./bar.ts
export const bar = foo`bar`;
// Before: TypeError: undefined is not a function (near '...foo`)
// After: bar\nfoo
This was a regression introduced in Bun v1.1.18, and has now been fixed.
Thanks to @paperdave for fixing this bug!
Fixed: AbortSignal with fetch() when using custom TLS certificates
We fixed a bug where a fetch() request using a custom TLS certificate could not be aborted by a timeout. Bun supports tls options when making a fetch() request, which is often needed in non-browser environments where you need to use a custom TLS certificate.
const response = await fetch("https://example.com", {
  signal: AbortSignal.timeout(1000),
  tls: {
    ca: "...",
  },
});
Thanks to @cirospaciari for fixing this bug!
Fixed: memory leak when new Response threw an error
We fixed a memory leak where setting a custom statusText on a new Response would not be cleaned up. We've also added more tests to ensure that leaks in Request and Response are caught.
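The affected pattern looked something like this; an illustrative loop, not the original reproduction:

// Each Response with a custom statusText previously leaked memory
for (let i = 0; i < 100_000; i++) {
  new Response("ok", { status: 200, statusText: "Custom Status" });
}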
Fixed: crash when importing an empty .toml file
We fixed a bug where importing an empty .toml file would cause Bun to crash. This also affected certain .json files, like package.json and tsconfig.json.
import config from "./config.toml";
console.log(config);
// Before: <crash>
// After: {}
Fixed: Regression with TLS sockets in v1.1.22
We fixed a regression introduced in Bun v1.1.22 that could cause a crash when a TLS socket failed to connect due to a DNS issue. Thanks to @cirospaciari for fixing this bug!