Ultimate Guide to Working with Streams in Node.js – Part 2

Let’s continue our discussion of streams in Node.js. In this second part of the tutorial, we will learn how to work with streams in our Node.js projects.

You will get a lot more out of this post if you have read the first part. If you haven’t already, click here to read Part 1 – Understanding Streams in Node.js. And if you have, you’re good to go!

The first part covered what exactly streams are, their advantages, the different types of streams, and some Node.js APIs that support them. Toward the end, I also walked through a few examples that made use of the pipe() function. And then the post came to an abrupt halt.

So, let’s pick up where we left off.

EventEmitter Instances in Node.js Streams

We have already learned about the different types of streams. All of these stream types are instances of the EventEmitter class, which comes from the events module. This means that a stream emits a number of events at different points in its lifetime.

Listed below are the most commonly used events (a small example follows the list):

  • data − Triggered when there is data available to read.
  • end − Triggered when there is no more data left to read.
  • error − Triggered when an error occurs while receiving or writing data.
  • finish − Triggered when all the data has been flushed to the underlying system.
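
For example, here is a small sketch (assuming a sample.txt file sits next to the script, and copy.txt is just a placeholder output name) that listens for these events while copying a file:

const fs = require('fs')

const readStream = fs.createReadStream(__dirname + '/sample.txt')
const writeStream = fs.createWriteStream(__dirname + '/copy.txt')

readStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data`)
})

readStream.on('end', () => {
  console.log('No more data to read')
})

readStream.on('error', (err) => {
  console.error('Something went wrong:', err)
})

writeStream.on('finish', () => {
  console.log('All data has been flushed to the underlying system')
})

readStream.pipe(writeStream)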

The pipe() Function

In the first part of the tutorial, we looked at an example that made use of the pipe() function. If you’re too lazy to go back, I’ve pasted the same code below: 😉

const http = require('http')
const fs = require('fs')

const server = http.createServer((req, res) => {
  const stream = fs.createReadStream(__dirname + '/sample.txt')
  stream.pipe(res)
})
server.listen(8000)

Explanation: this example streams the file’s contents to the HTTP client as soon as chunks of data become available, rather than reading the entire file into memory before responding.

The key line in the snippet above is stream.pipe(res). Here, the pipe() method is called on the file stream (the source), and the destination stream is passed in as its argument.

All the pipe() function does is take data from the source stream and write it to the destination stream, managing the flow of data between the two for us.

In this case, the source is our file stream and the destination is the HTTP response, so the file’s contents are piped straight into the response.

The pipe() method returns the destination stream. This is convenient because, as long as the destination is itself readable (a Duplex or Transform stream, for instance), we can chain pipe() calls like this:

src.pipe(dest1).pipe(dest2)

This is just the same as doing:

src.pipe(dest1)
dest1.pipe(dest2)
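
Chaining really shines when the stream in the middle is a Transform stream, which is both writable and readable. As a quick sketch (the file names are just placeholders), here is how you could gzip a file with the built-in zlib module by chaining pipe() calls:

const fs = require('fs')
const zlib = require('zlib')

// Read sample.txt, gzip it on the fly, and write out the compressed copy
fs.createReadStream(__dirname + '/sample.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream(__dirname + '/sample.txt.gz'))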

Now that we know what the pipe() function does, let us move on to working with streams in Node.js.

Creating Readable Streams in Node.js

To begin working with streams in Node.js, we first need to import the stream module.

const Stream = require('stream')

The stream module is what we use when creating new stream instances of our own. Note that we normally don’t need to require it just to consume streams handed to us by other APIs, such as fs or http.

Let us now get started with creating our first readable stream.

Start by creating a stream object:

const readableStream = new Stream.Readable()

Every Readable stream has to implement the _read() method. The simplest way to provide it is to pass a read option to the constructor; a no-op is enough here, because we will push data into the stream manually:

const readableStream = new Stream.Readable({
  read() {}
})

With that, our stream is initialized and we can push data into it!

readableStream.push('Yeehaw! Here’s my first silly text')
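
Once we have pushed everything we want to send, pushing null tells the stream that no more data will follow:

readableStream.push(null) // null marks the end of the stream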

Creating Writable Streams in Node.js

Just as we created our first Readable stream, we will now create our first Writable stream by instantiating Stream.Writable and then implementing its _write method.

Of course, start by creating a stream object. Make sure you have imported the stream module:

const writableStream = new Stream.Writable()

Let us now implement the _write method:

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

We can now pipe a readable stream, such as standard input, into it:

process.stdin.pipe(writableStream)
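
As a side note, instead of assigning _write after construction, the same thing is more commonly done by passing a write option to the constructor, just like we passed read earlier. A minimal sketch of that style (echoStream is just an illustrative name, and it behaves exactly like the stream we just built):

const Stream = require('stream')

const echoStream = new Stream.Writable({
  write(chunk, encoding, next) {
    console.log(chunk.toString())
    next() // signal that we are ready for the next chunk
  }
})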

Getting Data from a Readable Stream

One way to get data out of a Readable stream is to pipe it into a Writable stream:

const Stream = require('stream')
const readableStream = new Stream.Readable({
  read() {}
})
const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)

readableStream.push('Yeehaw! Here’s another silly text')
readableStream.push(' Yeehaw! Here’s yet another silly text')

We can also consume a readable stream directly by listening for its 'readable' event:

readableStream.on('readable', () => {
  console.log(readableStream.read())
})
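
Another option is the 'data' event, which switches the stream into flowing mode and hands us each chunk as soon as it is available:

readableStream.on('data', (chunk) => {
  console.log(chunk.toString())
})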

Sending Data to a Writable Stream

Simply use the write() function to send your data to a writable stream.

writableStream.write('Yeehaw! Here’s my silly text, yet again!')
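
One thing worth knowing: write() returns false once the stream’s internal buffer is full. When that happens, we should wait for the 'drain' event before writing more. A rough sketch of that back-pressure pattern:

const ok = writableStream.write('Yeehaw! A really big chunk of silly text')

if (!ok) {
  // The internal buffer is full, so wait for it to drain before writing again
  writableStream.once('drain', () => {
    writableStream.write('Yeehaw! More silly text, now that the buffer has room')
  })
}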

Telling a Writable Stream to Stop Writing

To tell a writable stream that we are done sending data, we call its end() method. Here is the previous example again, this time ending the writable stream once all the data has been pushed:

const readableStream = new Stream.Readable({
  read() {}
})
const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)

readableStream.push('Yeehaw! Here’s another silly text')
readableStream.push(' Yeehaw! Here’s yet another silly text')

writableStream.end()
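
end() also accepts an optional final chunk, and once everything has been flushed the writable stream emits the 'finish' event. For example, we could have finished the stream like this instead:

writableStream.on('finish', () => {
  console.log('All data has been flushed to the underlying system')
})

writableStream.end('Yeehaw! One last silly text')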

This is how you can work with streams in Node.js.

Conclusion

In this second part of our tutorial on streams, we learned how to create readable and writable streams in Node.js, how to pipe one into another, and how to push, write, and read data through them. If any of it felt unclear, going back to the first part of this post should help fill in the background.
