
Node.js Streams


Learn what streams are for, why they are so important, and how to use them.

What are streams

Streams are one of the fundamental concepts that power Node.js applications.

They are a way to handle reading/writing files, network communications, or any kind of end-to-end information exchange in an efficient way.

Streams are not a concept unique to Node.js. They were introduced in the Unix operating system decades ago, and programs can interact with each other by passing streams through the pipe operator (|).

For example, in the traditional approach, when you tell a program to read a file, the file is read into memory from start to finish, and then you process it.

Using streams you read it piece by piece, processing its content without keeping it all in memory.

The Node.js stream module provides the foundation upon which all streaming APIs are built.

Why streams

Streams basically provide two major advantages over other data handling methods:

Memory efficiency: you don't need to load large amounts of data in memory before you are able to process it.

Time efficiency: it takes much less time to start processing data, since you can begin as soon as the first chunk is available instead of waiting for the whole payload.

An example of a stream

A typical example is reading a file from disk.

Using the Node.js fs module you can read a file and serve it over HTTP when a new connection is established to your HTTP server:

const http = require('http')
const fs = require('fs')

const server = http.createServer(function (req, res) {
  fs.readFile(__dirname + '/data.txt', (err, data) => {
    res.end(data)
  })
})
server.listen(3000)

readFile() reads the full contents of the file, and invokes the callback function when it’s done.

res.end(data) in the callback will return the file contents to the HTTP client.

If the file is big, the operation will take quite a bit of time. Here is the same thing written using streams:

const http = require('http')
const fs = require('fs')

const server = http.createServer((req, res) => {
  const stream = fs.createReadStream(__dirname + '/data.txt')
  stream.pipe(res)
})
server.listen(3000)

Instead of waiting until the file is fully read, we start streaming it to the HTTP client as soon as we have a chunk of data ready to be sent.

pipe()

The above example uses the line stream.pipe(res): the pipe() method is called on the file stream.

What does this code do? It takes the source, and pipes it into a destination.

You call it on the source stream, so in this case, the file stream is piped to the HTTP response.

The return value of the pipe() method is the destination stream, which is a very convenient thing that lets us chain multiple pipe() calls, like this:

src.pipe(dest1).pipe(dest2)

This construct is the same as doing

src.pipe(dest1)
dest1.pipe(dest2)
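Chaining is handy whenever a stream transforms data on its way from source to destination. As one possible illustration, here is a minimal sketch that copies a file while gzip-compressing it with the zlib core module (the file names are just placeholders):

const fs = require('fs')
const zlib = require('zlib')

// read data.txt, gzip it on the fly, and write the result to data.txt.gz
fs.createReadStream('data.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('data.txt.gz'))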

Streams-powered Node APIs

Due to their advantages, many Node.js core modules provide native stream handling capabilities, most notably:

process.stdin, process.stdout and process.stderr
fs.createReadStream() and fs.createWriteStream()
net sockets
HTTP requests and responses
zlib compression and decompression streams (createGzip(), createGunzip(), and friends)
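Because process.stdin and process.stdout are themselves streams, echoing standard input back to standard output is a one-liner:

// echo everything typed on stdin back to stdout
process.stdin.pipe(process.stdout)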

Different types of streams

There are four classes of streams:

Readable: a stream you can read from but not write to (for example, fs.createReadStream())
Writable: a stream you can write to but not read from (for example, fs.createWriteStream())
Duplex: a stream you can both read from and write to (for example, a TCP socket)
Transform: a Duplex stream that modifies the data as it is written and read (for example, zlib.createGzip())
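Readable and Writable streams are covered below. As a quick taste of a Transform stream, here is a minimal sketch of one that uppercases whatever flows through it (the uppercasing logic is just an example):

const Stream = require('stream')

// a Transform stream receives data, modifies it, and passes it along
const uppercase = new Stream.Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase())
  }
})

process.stdin.pipe(uppercase).pipe(process.stdout)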

How to create a readable stream

We get the Readable stream from the stream module and initialize it:

const Stream = require('stream')

// pass a no-op read() implementation so the stream can later be consumed
const readableStream = new Stream.Readable({
  read() {}
})

Now that the stream is initialized, we can send data to it:

readableStream.push('hi!')
readableStream.push('ho!')
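When there is no more data to send, push null to signal that the stream has ended:

readableStream.push(null) // signals the end of the stream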

How to create a writable stream

To create a writable stream we instantiate the base Writable object and implement its _write() method.

First create a stream object:

const Stream = require('stream')
const writableStream = new Stream.Writable()

then implement _write:

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}
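Assigning _write() after construction works, but you can also pass the write logic to the constructor options, which keeps the stream definition in one place. A minimal sketch of the same stream built that way:

const Stream = require('stream')

// same behavior, with the write logic supplied via the constructor options
const writableStream = new Stream.Writable({
  write(chunk, encoding, next) {
    console.log(chunk.toString())
    next()
  }
})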

You can now pipe a readable stream in:

process.stdin.pipe(writableStream)

How to get data from a readable stream

How do we read data from a readable stream? Using a writable stream:

const Stream = require('stream')

const readableStream = new Stream.Readable({
  read() {} // no-op: data will be pushed manually
})
const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)

readableStream.push('hi!')
readableStream.push('ho!')

You can also consume a readable stream directly, using the readable event:

readableStream.on('readable', () => {
  console.log(readableStream.read())
})
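Another way to consume the stream directly is to listen for 'data' events, which hand you each chunk as soon as it becomes available:

readableStream.on('data', chunk => {
  console.log(chunk.toString())
})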

How to send data to a writable stream

Using the stream write() method:

writableStream.write('hey!\n')
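write() also returns false when the stream's internal buffer is full. When that happens you can stop writing and wait for the 'drain' event before continuing; a minimal sketch:

const ok = writableStream.write('hey!\n')
if (!ok) {
  // the internal buffer is full: wait for 'drain' before writing more
  writableStream.once('drain', () => {
    writableStream.write('more data\n')
  })
}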

How to signal a writable stream that you ended writing

Use the end() method:

const Stream = require('stream')

const readableStream = new Stream.Readable({
  read() {} // no-op: data will be pushed manually
})
const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)

readableStream.push('hi!')
readableStream.push('ho!')

writableStream.end()
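Calling end() tells the stream that nothing else will be written; once all buffered data has been flushed, the writable stream emits a 'finish' event you can listen for:

writableStream.on('finish', () => {
  console.log('all writes are done')
})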
