Ultimate Guide to Understanding Streams in Node.js – Part 1

Streams in Node.js are often considered one of the most difficult concepts to understand and work with. To top it off, the Node.js docs can make them even more confusing.

Well, I am here to explain to you what exactly streams are, what they do, and a lot more, bit by bit, in a more lucid way.

What is the Streams Module?

Streams, just like arrays or strings, are collections of data. They are objects that let developers read data from a source and write it to a destination in a continuous manner.

They also make networking communications and the end-to-end exchange of any kind of information efficient.

Think of a user watching videos or movies on the web. The original video or movie sits at a source location on the server. The user requests the video or movie, which is then streamed from the source location to the destination, that is, the user’s webpage.

However, streams differ from arrays and strings in an important way. Files are read piece by piece, so their contents are never held in memory all at once. This makes streams in Node.js powerful enough to read and write large files coming from an external source without exhausting memory.

Streams originated in the UNIX operating system, where they were first introduced. Users could connect programs together with the ‘|’ (pipe) operator, streaming the output of one into the input of the next.
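For instance, in a UNIX shell the ‘|’ operator streams one program’s output straight into the next program’s input, with no intermediate file:

```shell
# Stream three lines through grep, which filters the data as it
# flows through, and count the matching lines with -c
printf 'ok\nerror: disk full\nok\n' | grep -c error
# prints 1
```

Node.js streams bring the same composable, flow-through model to JavaScript.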

Why Should You Use Streams in Node.js?

Streams in Node.js have many advantages. To list a few:

Time-saving – Streams take less time to begin processing data, because data is processed as soon as it is received, without waiting for the entire payload to load.

Memory efficiency – Streams let you process data without loading bulk amounts of it into memory.

Types of Streams

There are fundamentally four main types of streams in Node.js:

  • Readable – Streams from which you can read data. Data pushed into a readable stream is buffered until a consumer begins reading it.
  • Writable – Streams to which you can write data but from which you cannot read.
  • Duplex – Streams that are both Readable and Writable, so you can both read and write data with them.
  • Transform – Streams similar to the Duplex type, except that they can modify or transform the data as it is written and read.

Node.js APIs with Streams Support

Streams have multiple functionalities and advantages. For this reason, many of Node.js’ built-in core modules come with native stream-handling capabilities. Listed below are a few:

  • process.stdin – A readable stream connected to stdin.
  • process.stdout – A writable stream connected to stdout.
  • process.stderr – A writable stream connected to stderr.
  • fs.createReadStream() – Creates a readable stream to a file.
  • fs.createWriteStream() – Creates a writable stream to a file.
  • net.connect() – Starts a stream-based connection.
  • http.request() – Returns an instance of the http.ClientRequest class, which is a writable stream.
  • zlib.createGzip() – Helps in compressing data using gzip (a compression algorithm) into a stream.
  • zlib.createGunzip() – Helps in decompressing a gzip stream.
  • zlib.createDeflate() – Helps in compressing data using deflate (a compression algorithm) into a stream.
  • zlib.createInflate() – Helps in decompressing a deflate stream.

Illustrating Streams with Examples in Node.js

Normally, streams are used for reading and writing files in Node.js. I will cover more uses of streams in Node.js in the second part of this post.

Streaming Small Files

The example below reads the ‘sample.txt’ file using the readFile() API of the built-in fs module, buffering its entire contents, and serves them on a webpage when an HTTP connection is established.

const http = require('http')
const fs = require('fs')

const server = http.createServer(function (req, res) {
  fs.readFile(__dirname + '/sample.txt', (err, data) => {
    if (err) return res.end('Error reading file')
    res.end(data)
  })
})

server.listen(3000)

The readFile() method on the fs module here, will read the contents of the ‘sample.txt’ file.

The res.end(data) call in the readFile() callback returns the contents of the file to the HTTP client.

Streaming Large Files

With the former approach, the server must wait for the entire file to be read into memory before it can respond, which is slow and memory-hungry for big files. When you want to stream larger files in Node.js, use fs.createReadStream() instead:

const http = require('http')
const fs = require('fs')

const server = http.createServer((req, res) => {
  const stream = fs.createReadStream(__dirname + '/sample.txt')
  stream.pipe(res)
})

server.listen(3000)

In this example, we start streaming the file’s contents to the HTTP client as soon as the first chunk is read from disk. Unlike the previous example, we do not wait for the entire data payload to load before responding!

The pipe() function takes the source stream and pipes its data to the destination. I will elucidate a little more on this function in part 2.


In this first part of the beginner tutorial, I have broken down Streams in Node.js in a much easier and more understandable way.

Stay tuned for the second part of this tutorial, which will cover the pipe() function and creating and working with streams in Node.js.

Read Part 2: Ultimate Guide to Working with Streams in Nodejs


Aneesha S