Node.js Buffer vs JavaScript ArrayBuffer: Which Should You Use?

If you work with binary data in JavaScript, you have two main options: Node.js Buffers or browser-based ArrayBuffers. The short answer: use Node.js Buffer for server-side file operations, networking, and cryptography. Use ArrayBuffer for cross-platform browser work, WebGL, and universal JavaScript libraries. They look similar but work very differently under the hood.
Node.js Buffer and JavaScript ArrayBuffer are both fixed-length binary data containers. Neither is better – the right choice depends on where your code runs. Buffer lives in the Node.js runtime and allocates memory outside the V8 heap. ArrayBuffer is a JavaScript standard that works in browsers and server environments, and it requires a view like Uint8Array to do anything useful. Confusing them leads to broken builds, cross-platform bugs, and performance issues you will not see until production.
TL;DR
- Node.js Buffer is Node-only. ArrayBuffer is a browser standard that also works in Node.
- Buffer allocates memory outside the V8 heap. ArrayBuffer uses standard JS heap management.
- Use Buffer for files, streams, crypto, and server-side I/O. Use ArrayBuffer for WebGL, binary protocols, and cross-platform libraries.
- Buffer has built-in string encoding/decoding. ArrayBuffer requires TypedArray or DataView views.
- In 2026, prefer Buffer.alloc() over new Buffer() for security and performance.
Buffer vs ArrayBuffer: Full Comparison Table
| Dimension | Node.js Buffer | JavaScript ArrayBuffer |
|---|---|---|
| Environment | Node.js only (server) | All JavaScript environments (browser + server) |
| Memory location | Outside V8 heap (C++ backing) | Inside V8 heap (standard JS memory) |
| Direct manipulation | Yes – Buffer[index] works directly | No – must use TypedArray or DataView view |
| String encoding | Built-in: Buffer.from(str, 'utf8') | Requires TextEncoder / TextDecoder API |
| Garbage collection | Buffer object is GC'd by V8; backing memory is external (small buffers share an internal pool) | Standard V8 garbage collection |
| Creation syntax (2026) | Buffer.alloc(size) or Buffer.from(array) | new ArrayBuffer(byteLength) |
| Deprecation status | Stable, actively maintained | Stable, browser standard since ES2015 |
| Typical use case | File I/O, network streams, crypto, server logs | WebGL tensors, binary network protocols, FileReader |
What Is Node.js Buffer?
Buffer is a Node.js core API class for handling binary data. It predates ArrayBuffer and was built specifically for the server-side realities of file systems, TCP streams, and cryptographic operations. When you call fs.readFile() or make HTTP requests with a binary response, you are working with Buffers without necessarily knowing it.
Node.js Buffer is a subclass of Uint8Array. This means it inherits all TypedArray behaviors – index access, slice, copy – but adds Node-specific methods. The key difference from a regular Uint8Array is that Buffer memory is allocated outside the V8 JavaScript heap, which makes it faster for large data and resistant to GC pauses during heavy server load.
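You can verify this relationship directly; a quick sketch:

```javascript
// Buffer really is a Uint8Array under the hood
const buf = Buffer.from([72, 105, 33]);
console.log(buf instanceof Uint8Array); // true

// TypedArray behaviors it inherits: index access and views
console.log(buf[0]);             // 72
const view = buf.subarray(0, 2); // a view over the same memory

// Node-specific additions on top, like encoding-aware toString
console.log(buf.toString('utf8'));  // Hi!
console.log(view.toString('utf8')); // Hi
```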
Buffer creation: 2026 best practice
// CURRENT (2026) - secure, zero-filled allocation
const buf = Buffer.alloc(1024);
// ALSO VALID - create from string with encoding
const fromString = Buffer.from('Hello, world', 'utf8');
// ALSO VALID - create from existing array or Uint8Array
const fromArray = Buffer.from(new Uint8Array([72, 105]));
// ALSO VALID - unsafe (uninitialized) for performance-sensitive cases
// Only use when you immediately fill every byte
const fastBuf = Buffer.allocUnsafe(1024);
// DEPRECATED - do not use in 2026
// new Buffer(size) has emitted a runtime deprecation warning (DEP0005)
// since Node.js 10 - use Buffer.alloc() or Buffer.from() instead
// const oldWay = new Buffer(1024);
Buffer comes with built-in string encoding support. You can encode and decode strings without importing anything extra.
// Encoding strings → Buffer
const utf8Buffer = Buffer.from('नमस्ते', 'utf8');
const hexBuffer = Buffer.from('a1b2c3', 'hex');
const base64Buffer = Buffer.from('SGVsbG8=', 'base64');
// Decoding Buffer → string
console.log(utf8Buffer.toString('utf8')); // नमस्ते
console.log(hexBuffer.toString('hex')); // a1b2c3
console.log(base64Buffer.toString('base64')); // SGVsbG8=
What Is JavaScript ArrayBuffer?
ArrayBuffer is a JavaScript standard introduced in ES2015. It represents a fixed-length binary data buffer that you cannot read or write directly. Think of it as a locked safe – you need a view (TypedArray or DataView) to access what is inside. This design was intentional: it prevents accidental memory access and makes security boundaries explicit.
ArrayBuffer works in every modern JavaScript environment. Browsers use it for FileReader, WebSocket binary frames, WebGL textures, and the Fetch API's arrayBuffer() method. Node.js has long supported ArrayBuffer as well, though the Buffer API is usually more convenient server-side.
ArrayBuffer creation and views
// Create a raw ArrayBuffer (cannot read/write directly)
const buffer = new ArrayBuffer(16);
// Create views to actually manipulate the data
const uint8View = new Uint8Array(buffer);
const uint16View = new Uint16Array(buffer);
const dataView = new DataView(buffer);
// Uint8Array is the most common view - like Buffer but cross-platform
uint8View[0] = 42;
console.log(uint8View[0]); // 42
// DataView gives you type-specific read/write with byte order control
dataView.setUint16(0, 255, true); // little-endian
dataView.setUint16(2, 256, false); // big-endian
console.log(dataView.getUint16(0, true)); // 255
console.log(dataView.getUint16(2, false)); // 256
ArrayBuffer and string encoding in 2026
// TextEncoder / TextDecoder are the browser-equivalent of Buffer string methods
// Works in all modern browsers and Node.js 16+
const encoder = new TextEncoder();
const decoder = new TextDecoder();
// Encode string → ArrayBuffer
const encoded = encoder.encode('नमस्ते');
console.log(encoded.buffer); // ArrayBuffer with UTF-8 bytes
// Decode ArrayBuffer → string
const decoded = decoder.decode(encoded.buffer);
console.log(decoded); // नमस्ते
Memory Model: The Core Difference
This is where the two diverge most significantly. Node.js allocates Buffer memory outside the V8 JavaScript heap: small Buffers are carved out of a shared internal pool (8KB by default, exposed as Buffer.poolSize), while larger allocations get their own external backing store. When you call Buffer.alloc(1024 * 1024) for a 1MB buffer, that memory is reported as external memory rather than JS heap, and it does not count against your --max-old-space-size limit.
ArrayBuffer, by contrast, is managed like any other JavaScript object. When the last view over it goes out of scope, the ArrayBuffer becomes eligible for standard garbage collection. This is simpler to reason about, but allocating and discarding many large ArrayBuffers puts more pressure on the garbage collector than Buffer's pooled allocations do.
For server applications processing large files or streaming data, Buffer’s off-heap model is a genuine advantage. You can read a 500MB file into a Buffer without worrying about Node.js hitting memory limits that would apply to a regular JavaScript array of the same size.
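You can observe this split with process.memoryUsage() (a rough sketch; exact numbers vary by Node.js version and platform):

```javascript
// Rough illustration: Buffer allocations show up in `external`,
// not in `heapUsed`. Exact figures vary by platform and version.
const before = process.memoryUsage();

const big = Buffer.alloc(50 * 1024 * 1024); // 50MB, zero-filled

const after = process.memoryUsage();

// The JS heap barely moves...
console.log('heapUsed delta (MB):',
  ((after.heapUsed - before.heapUsed) / 1024 / 1024).toFixed(2));

// ...while external memory grows by roughly the buffer size
console.log('external delta (MB):',
  ((after.external - before.external) / 1024 / 1024).toFixed(2));
```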
When to Use Which: Decision Framework
Choose Buffer when you are building server-side Node.js applications and dealing with file I/O, network protocols, cryptographic operations, or any scenario where you want memory management that does not interfere with V8 GC. Buffer’s built-in string encoding alone saves significant boilerplate.
Choose ArrayBuffer when you are writing cross-environment code – browser extensions, universal JavaScript packages, WebGL applications, or anything that uses the standard browser File API. ArrayBuffer also pairs naturally with Web APIs like fetch’s arrayBuffer() response method and WebSocket binary frames.
If you are writing a universal JavaScript library that needs to work in both environments, you can use a polyfill approach. Buffer is available in Node.js natively. In browsers, you can polyfill it, or structure your code to accept ArrayBuffer views and use TextEncoder/TextDecoder for string work. Modern bundlers like Vite and Rollup handle this automatically in most cases.
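One environment-neutral pattern is to accept plain Uint8Array in your public API and lean on the standard TextDecoder. A sketch, with a hypothetical bytesToText helper:

```javascript
// Hypothetical helper for a universal library: accept any Uint8Array
// (Buffer included, since Buffer subclasses Uint8Array) and use the
// standard TextDecoder, which exists in browsers and modern Node.js.
function bytesToText(bytes, encoding = 'utf-8') {
  if (!(bytes instanceof Uint8Array)) {
    throw new TypeError('Expected a Uint8Array (or Node.js Buffer)');
  }
  return new TextDecoder(encoding).decode(bytes);
}

// Works with a plain Uint8Array (browser or Node)
const text = bytesToText(new TextEncoder().encode('hello')); // 'hello'

// Works with a Node.js Buffer without any special-casing;
// guard the global lookup so the same file also runs in browsers
if (typeof globalThis.Buffer === 'function') {
  console.log(bytesToText(globalThis.Buffer.from('hello'))); // hello
}
```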
Practical Examples in 2026
Reading a file: Buffer (Node.js)
import { readFile } from 'node:fs/promises';
// Read file as Buffer - entire file in memory
const fileBuffer = await readFile('./data.bin');
console.log(fileBuffer.length);
// Slice a buffer - creates a new Buffer referencing the same memory region
const slice = fileBuffer.subarray(0, 256);
// Convert to hex string for logging or hashing
console.log(fileBuffer.toString('hex'));
// Stream large files - Buffer works with stream consumers
import { createReadStream, createWriteStream } from 'node:fs';
import { createGzip } from 'node:zlib';
import { pipeline } from 'node:stream/promises';
await pipeline(
createReadStream('./large-file.bin'),
createGzip(),
createWriteStream('./large-file.gz')
);
Reading binary data: ArrayBuffer (browser)
// Fetch returns ArrayBuffer directly
const response = await fetch('/data.bin');
const arrayBuffer = await response.arrayBuffer();
// Create a DataView to read structured binary data (e.g., a protocol buffer)
const view = new DataView(arrayBuffer);
// Read header: 4 bytes at offset 0 (uint32, little-endian)
const magic = view.getUint32(0, true);
console.log(`Magic: ${magic.toString(16)}`);
// Read next field: 2 bytes at offset 4 (uint16)
const version = view.getUint16(4, true);
console.log(`Version: ${version}`);
Encryption: Buffer (Node.js native crypto)
import { createCipheriv, randomBytes } from 'node:crypto';
const key = randomBytes(32); // 256-bit AES key
const iv = randomBytes(16); // Initialization vector
const plaintext = Buffer.from('Sensitive data here');
const cipher = createCipheriv('aes-256-gcm', key, iv);
const encrypted = Buffer.concat([cipher.update(plaintext), cipher.final()]);
const tag = cipher.getAuthTag();
// Output: encrypted buffer, IV, and auth tag - all Buffers
console.log({ encrypted: encrypted.toString('hex'), iv: iv.toString('hex'), tag: tag.toString('hex') });
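For completeness, here is the matching decryption step. This is a self-contained sketch that recreates the encryption above; with AES-GCM, the auth tag must be set before final() or verification fails:

```javascript
import { createCipheriv, createDecipheriv, randomBytes } from 'node:crypto';

// Recreate the encryption from above so this sketch stands alone
const key = randomBytes(32);
const iv = randomBytes(16);
const plaintext = Buffer.from('Sensitive data here');
const cipher = createCipheriv('aes-256-gcm', key, iv);
const encrypted = Buffer.concat([cipher.update(plaintext), cipher.final()]);
const tag = cipher.getAuthTag();

// Decryption: same key and IV; set the auth tag before final()
// so GCM can verify the ciphertext was not tampered with
const decipher = createDecipheriv('aes-256-gcm', key, iv);
decipher.setAuthTag(tag);
const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
console.log(decrypted.toString('utf8')); // Sensitive data here
```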
FAQ
What is the difference between Buffer and ArrayBuffer?
Node.js Buffer is a Node.js-specific class for binary data, with memory allocated outside the V8 heap. ArrayBuffer is a browser JavaScript standard for raw binary data that requires a view (TypedArray or DataView) to read or write. Buffer has built-in string encoding; ArrayBuffer uses the standard TextEncoder/TextDecoder API.
Which is faster: Buffer or ArrayBuffer?
For server-side workloads, Buffer is generally faster because its memory is managed outside V8’s garbage collector. For browser-based binary data work, ArrayBuffer is the standard choice and integrates with WebGL and Fetch API. Performance depends on the use case and environment.
Can I use Buffer in a browser?
Buffer is a Node.js global and not available in browsers without a polyfill. Popular polyfills include buffer on npm. For browser work, use ArrayBuffer with TypedArray views, or use a bundler that automatically handles Node.js core module polyfills.
Is new Buffer() still valid in Node.js?
No. new Buffer() is deprecated (DEP0005) and emits a runtime warning in modern Node.js versions. Use Buffer.alloc(size) for a zero-filled buffer, Buffer.allocUnsafe(size) for performance-sensitive cases where you immediately fill every byte, or Buffer.from() to create from a string or array.
How do I convert a Buffer to an ArrayBuffer?
Buffer is a subclass of Uint8Array, so every Buffer has an underlying ArrayBuffer. But buffer.buffer is often not what you want: small Buffers are carved out of a shared internal pool, so the underlying ArrayBuffer can be much larger than the Buffer and start at a non-zero offset. To get exactly the Buffer's bytes, copy them out with buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength).
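A concrete sketch of the pooled-buffer pitfall and a copy-based conversion:

```javascript
// Small Buffers come from Node's shared pool, so buf.buffer is often
// much larger than the Buffer itself and starts at a non-zero offset.
const buf = Buffer.from('hello');
console.log(buf.byteLength);        // 5
console.log(buf.buffer.byteLength); // typically 8192 (the pool), not 5

// Copy out exactly this Buffer's bytes into a standalone ArrayBuffer
const exact = buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
console.log(exact.byteLength); // 5
console.log(new TextDecoder().decode(new Uint8Array(exact))); // hello
```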
What is the best way to read binary files in Node.js 2026?
Use fs/promises.readFile() for files under ~100MB, which returns a Buffer by default. For larger files, use streams with createReadStream() and pipeline() to avoid loading the entire file into memory. This applies to both the CommonJS and ES module systems.
Conclusion
Node.js Buffer and JavaScript ArrayBuffer solve related but distinct problems. Buffer is your go-to for any server-side binary data work – file operations, crypto, streams, network protocols. ArrayBuffer is the web standard for browser binary data and cross-platform libraries. Learn both, know which environment you are targeting, and match the tool to the job.
For more Node.js fundamentals, see our guide to Node.js environment variables (useful when tuning memory limits) and our walkthrough of resolving module errors – both come up often when setting up native crypto or file handling code.
- Understanding Node.js Environment Variables
- How to Fix Cannot Find Module in Node.js
- Fixing Node Sass Binding Errors
- Run Node.js Files in VS Code
- PostgreSQL with Node.js on Windows
- Node Is Not Recognized Error
- Fixing Nodemon Crashes




