JavaScript Streams for Beginners
December 20, 2025 • 7 min read
Imagine you’re drinking from a water fountain versus waiting for someone to fill an entire bucket and hand it to you. Streams are like drinking from the fountain—you get water immediately, bit by bit, instead of waiting for the whole thing.
In JavaScript, Streams let you process data piece by piece (in “chunks”) rather than loading everything into memory at once. This is crucial when working with large files, network requests, or real-time data.
Why use Streams?
- Memory efficient: Process gigabytes of data without running out of memory
- Time efficient: Start processing data before it’s fully loaded
- Composable: Chain operations together like building blocks
Think of it this way: would you rather wait 10 seconds for a 1GB file to fully load, or start working with the first chunk after 10 milliseconds?
The four types of Streams
JavaScript has four fundamental stream types:
- Readable - A source you read data FROM (like a file or HTTP response)
- Writable - A destination you write data TO (like a file or HTTP request body)
- Transform - A stream that modifies data passing through it (like compression)
- Duplex - Both readable and writable (like a network socket)
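To make these concrete, here’s a minimal sketch using Node’s built-in stream module (the sample data and console logging are made up for illustration):
const { Readable, Writable, Transform } = require('stream');
// Readable: a source. Readable.from() wraps any iterable in a stream.
const source = Readable.from(['hello ', 'streams']);
// Writable: a destination. This one just logs whatever it receives.
const sink = new Writable({
  write(chunk, encoding, callback) {
    console.log('got:', chunk.toString());
    callback();
  }
});
// Transform: modifies data in flight. (A Duplex, like a net.Socket,
// is both readable and writable; Transform is a specialized Duplex.)
const shout = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});
source.pipe(shout).pipe(sink); // prints "got: HELLO " then "got: STREAMS"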
Readable Streams: Your data source
A Readable Stream is like a faucet—data flows out when you turn it on.
// Node.js example: reading a file as a stream
const fs = require('fs');
const readStream = fs.createReadStream('large-file.txt', {
encoding: 'utf8',
highWaterMark: 64 * 1024 // 64KB chunks
});
readStream.on('data', (chunk) => {
console.log('Received chunk:', chunk.length, 'bytes');
});
readStream.on('end', () => {
console.log('Finished reading file');
});
readStream.on('error', (err) => {
console.error('Error:', err.message);
});
The highWaterMark option controls chunk size. Smaller chunks use less memory but may be slower; larger chunks are faster but use more memory.
Writable Streams: Your data destination
A Writable Stream is like a drain—data flows in.
const fs = require('fs');
const writeStream = fs.createWriteStream('output.txt');
writeStream.write('Hello, ');
writeStream.write('Streams!');
writeStream.end(); // Always call end() when done
writeStream.on('finish', () => {
console.log('Finished writing to file');
});
Important: Always call .end() on writable streams to signal you’re done writing.
Piping: Connecting streams together
The real power of streams comes from piping—connecting a readable stream directly to a writable stream.
const fs = require('fs');
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');
// Pipe automatically handles backpressure
readStream.pipe(writeStream);
writeStream.on('finish', () => {
console.log('File copied successfully');
});
This is equivalent to copying a file, but it works on any size file without loading it all into memory.
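One caveat: .pipe() does not forward errors between streams, so a failed read can leave the write stream hanging. On Node 15+ the pipeline helper from stream/promises handles errors and cleanup for you. A minimal sketch, reusing the same hypothetical file names:
const fs = require('fs');
const { pipeline } = require('stream/promises');
async function copyFile() {
  // pipeline connects the streams, forwards errors,
  // and destroys every stream in the chain if one fails
  await pipeline(
    fs.createReadStream('input.txt'),
    fs.createWriteStream('output.txt')
  );
  console.log('File copied successfully');
}
copyFile().catch((err) => console.error('Copy failed:', err.message));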
Transform Streams: Processing data in flight
Transform streams modify data as it passes through. Here’s how to create one that converts text to uppercase:
const { Transform } = require('stream');
const upperCaseTransform = new Transform({
transform(chunk, encoding, callback) {
// Convert chunk to uppercase and pass it along
this.push(chunk.toString().toUpperCase());
callback();
}
});
// Chain it: read -> transform -> write
process.stdin
.pipe(upperCaseTransform)
.pipe(process.stdout);
Run this and type something—it’ll echo back in uppercase.
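You don’t always have to write transforms yourself; Node ships several. For example, the zlib module exposes compression streams that slot straight into a pipe chain (the file names here are just placeholders):
const fs = require('fs');
const zlib = require('zlib');
// Compress a file by streaming it through a built-in Transform
fs.createReadStream('large-file.txt')
  .pipe(zlib.createGzip()) // gzip-compresses each chunk in flight
  .pipe(fs.createWriteStream('large-file.txt.gz'));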
Web Streams API (Browser)
Modern browsers have their own Streams API. Here’s how to stream a fetch response:
async function streamFetch(url) {
const response = await fetch(url);
const reader = response.body.getReader();
const decoder = new TextDecoder();
while (true) {
const { done, value } = await reader.read();
if (done) {
console.log('Stream complete');
break;
}
// Process each chunk as it arrives
const text = decoder.decode(value, { stream: true });
console.log('Chunk:', text);
}
}
streamFetch('https://api.example.com/large-data');
This is perfect for showing loading progress or processing data incrementally.
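For example, if the server sends a Content-Length header, you can turn those chunks into a progress indicator. A rough sketch (the percentage formatting is illustrative, and the header may be absent):
async function fetchWithProgress(url) {
  const response = await fetch(url);
  const total = Number(response.headers.get('Content-Length')); // 0 or NaN if the header is missing
  const reader = response.body.getReader();
  let received = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.length;
    if (total) {
      console.log(`Downloaded ${((received / total) * 100).toFixed(1)}%`);
    }
  }
  console.log(`Done: ${received} bytes`);
}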
Practical example: Line-by-line file processing
Let’s read a large log file line by line without loading it all into memory:
const fs = require('fs');
const readline = require('readline');
async function processLogFile(filepath) {
const fileStream = fs.createReadStream(filepath);
const rl = readline.createInterface({
input: fileStream,
crlfDelay: Infinity // Handle Windows line endings
});
let lineCount = 0;
let errorCount = 0;
for await (const line of rl) {
lineCount++;
if (line.includes('ERROR')) {
errorCount++;
console.log(`Line ${lineCount}: ${line}`);
}
}
console.log(`Total lines: ${lineCount}, Errors: ${errorCount}`);
}
processLogFile('server.log');
The for await...of syntax makes it clean and readable.
Understanding backpressure
Backpressure happens when a writable stream can’t keep up with the readable stream. Imagine pouring water into a funnel faster than it can drain—it overflows.
Streams handle this automatically when you use .pipe(). If you’re writing manually, check the return value of .write():
const fs = require('fs');
const writeStream = fs.createWriteStream('output.txt');
function writeData(data) {
const canContinue = writeStream.write(data);
if (!canContinue) {
// Buffer is full, wait for drain event
console.log('Pausing writes...');
writeStream.once('drain', () => {
console.log('Resuming writes...');
// Continue writing
});
}
}
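To see the full pattern in one place, here’s a minimal sketch that writes many chunks while respecting backpressure, using events.once to await the drain event (the chunk data is made up):
const fs = require('fs');
const { once } = require('events');
async function writeMany(filepath, chunks) {
  const writeStream = fs.createWriteStream(filepath);
  for (const chunk of chunks) {
    // write() returns false when the internal buffer is full
    if (!writeStream.write(chunk)) {
      await once(writeStream, 'drain'); // wait for the buffer to empty
    }
  }
  writeStream.end();
  await once(writeStream, 'finish');
}
writeMany('output.txt', Array.from({ length: 1000 }, (_, i) => `line ${i}\n`)).catch(console.error);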
Stream modes: Flowing vs Paused
Readable streams have two modes:
Flowing mode: Data flows automatically, and you handle it with the data event.
readStream.on('data', (chunk) => {
// Data flows automatically
});
Paused mode: You manually pull data with .read().
readStream.on('readable', () => {
let chunk;
while ((chunk = readStream.read()) !== null) {
console.log('Read chunk:', chunk.length);
}
});
Flowing mode is simpler; paused mode gives you more control.
When to use Streams
Use streams when:
- Processing large files (logs, CSVs, videos)
- Handling HTTP request/response bodies
- Real-time data processing
- Memory is constrained
- You need to show progress to users
Don’t bother with streams for:
- Small files (under a few MB)
- Simple JSON API responses
- Data you need all at once anyway
Common pitfalls
- Forgetting to call .end(): Writable streams won’t emit finish without it
- Not handling errors: Always add error handlers to prevent crashes
- Ignoring backpressure: Can cause memory issues if you write faster than the destination can drain
- Mixing async/await with callbacks: Pick one pattern and stick with it
Quick reference
// Reading
const readable = fs.createReadStream('file.txt');
readable.on('data', chunk => { /* process */ });
readable.on('end', () => { /* done */ });
readable.on('error', err => { /* handle */ });
// Writing
const writable = fs.createWriteStream('file.txt');
writable.write('data');
writable.end();
writable.on('finish', () => { /* done */ });
writable.on('error', err => { /* handle */ });
// Piping
readable.pipe(transform).pipe(writable);
// Async iteration (Node.js)
for await (const chunk of readable) {
// process chunk
}
Takeaways
- Streams process data in chunks, saving memory and time
- Use .pipe() to connect streams and handle backpressure automatically
- Always handle errors and call .end() on writable streams
- Web browsers have their own Streams API for fetch responses
- For most large file operations, streams are the right choice
FAQ
Q: Are Node.js streams the same as Web Streams?
A: They’re similar in concept but have different APIs. Node.js streams use EventEmitter patterns, while Web Streams use async iterators and ReadableStream/WritableStream classes.
Q: Can I use streams with async/await?
A: Yes. In Node.js, readable streams are async iterables. Use for await...of to consume them cleanly.
Q: How do I know what chunk size to use?
A: The default (64KB for files) works well for most cases. Smaller chunks reduce memory but may slow things down; larger chunks are faster but use more memory.
Q: What if I need to process the entire file before outputting?
A: Then streams might not be the right tool. Collect chunks into an array and join them, or use fs.readFile() for smaller files.
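If you do need everything in memory, a small helper that drains a readable stream into one Buffer looks like this (a sketch, assuming binary chunks, i.e. no encoding was set on the stream):
async function readAll(readable) {
  const chunks = [];
  for await (const chunk of readable) {
    chunks.push(chunk); // each chunk is a Buffer when no encoding is set
  }
  return Buffer.concat(chunks);
}
// Usage: const data = await readAll(fs.createReadStream('file.txt'));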