3 Common Node.js Design Patterns That Are Misused

Today we are excited to start our new 3-part series on the right way to code common Node.js design patterns, from Node.js contributor and AppNeta engineer Stephen Belanger. With 6 years of experience, Stephen has helped the Node.js community thrive and has also contributed to robust application performance monitoring through his work on the AppNeta instrumentation for Node.js.

From event emitters and streams (covered below) to constructors (covered in part 2) and promises (covered in part 3), patterns are at the heart of Node.js development. While traditional design patterns exist in the JavaScript world, many have been modernized and updated to take advantage of the asynchronous nature of Node.js. There are many ways to use the most common design patterns, but below are some common uses and common mistakes that even seasoned developers make when they are new to Node.js. To illustrate these patterns, examples are included throughout, along with links to the relevant documentation on nodejs.org.

Common patterns

With so many new developers joining the Node.js community, I’ve seen a lot of misunderstanding around some of the common design patterns that arise from similar patterns in other languages. Node.js can be tricky, but it can also be fast if done right. Below I outline the first 3 patterns in a series of posts covering how they should, and should not, be used in Node.js.

Callbacks

Callbacks are probably the most central pattern in Node.js development. Most of the Node.js core APIs are callback-based, so it’s important to understand how they work.

A callback is a function passed to another function with the intention of being called later. It looks something like this:

keyValueStore.get('my-data', function (err, data) {
  // handle err, then use data
})

This is a special variety of callback that Node.js users often refer to as an errback. An errback always takes an error as its first parameter, and the following parameters are used to pass whatever data the interface is expected to return.
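To make the convention concrete, here is a minimal sketch of a function of your own that takes an errback (readConfig is a hypothetical helper, shown only to illustrate the shape):

var fs = require('fs')

// Hypothetical helper following the errback convention:
// the error always comes first, the data after.
function readConfig (path, callback) {
  fs.readFile(path, 'utf8', function (err, contents) {
    if (err) return callback(err)

    var config
    try {
      config = JSON.parse(contents)
    } catch (parseError) {
      return callback(parseError)
    }

    callback(null, config)
  })
}

readConfig('./config.json', function (err, config) {
  if (err) return console.error('could not load config', err)
  console.log('loaded config', config)
})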

Callback-accepting functions in Node.js core almost always expect errbacks, with the notable exception of fs.exists, which looks like this:

fs.exists('/etc/passwd', function (exists) {
  console.log(exists ? "it's there" : 'no passwd!')
})

An important thing to understand about callbacks is that in Node.js they are usually, but not always, asynchronous.

The call to `fs.readFile` is asynchronous:

fs.readFile('something', function (err, data) {
  console.log('this is reached second')
})

console.log('this is reached first')

But calling `list.forEach` is synchronous:

var list = [1]

list.forEach(function (v) {
  console.log('this is reached first')
})

console.log('this is reached second')

Confusing, right? This trips up most new Node.js users. A good rule of thumb is that any code that interacts with data outside of process memory will be asynchronous, because disks and networks are slow and we don’t want to wait for them.
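A practical consequence of this mix is that an API of your own should not be asynchronous only some of the time, or callers cannot rely on the order in which things run. A minimal sketch of keeping a callback consistently asynchronous, assuming a simple in-memory cache (readFileCached is hypothetical):

var fs = require('fs')
var cache = {}

function readFileCached (path, callback) {
  if (cache[path]) {
    // Defer the callback so it is always asynchronous,
    // even when the data is already in memory.
    return setImmediate(function () {
      callback(null, cache[path])
    })
  }

  fs.readFile(path, function (err, data) {
    if (err) return callback(err)
    cache[path] = data
    callback(null, data)
  })
}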

Event emitters

An event emitter is a special construct designed to allow an interface to designate many callbacks for many different behaviors that may occur once, many times, or even never. In Node.js, event emitters work by exposing a common API on an object, providing several functions to register and trigger these callbacks. This interface is often attached to an object via inheritance.

Consider this example:

var EventEmitter = require('events').EventEmitter

var emitter = new EventEmitter()

emitter.on('triggered', function () {
  console.log('The event was triggered')
})

emitter.emit('triggered')
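Because the interface is usually attached via inheritance, your own constructors can emit events too. A minimal sketch using util.inherits (Job is a hypothetical class):

var EventEmitter = require('events').EventEmitter
var util = require('util')

function Job () {
  EventEmitter.call(this)
}
util.inherits(Job, EventEmitter)

Job.prototype.run = function () {
  // consumers can listen for 'done' without knowing how the work happens
  this.emit('done')
}

var job = new Job()
job.on('done', function () {
  console.log('the job finished')
})
job.run()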

Event emitters are themselves synchronous, but the things they get attached to sometimes are not, which is another source of confusion regarding asynchrony in Node.js. The point between calling `emitter.emit(...)` and the callback given to `emitter.on(...)` being triggered is synchronous, but the emitter object can be passed around, and the emit function could be called at some point in the future.
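To make that concrete, the emit itself is synchronous, but nothing stops the emitter from being handed around and used later. A small sketch, with a timer standing in for any asynchronous source:

var EventEmitter = require('events').EventEmitter
var emitter = new EventEmitter()

emitter.on('tick', function () {
  console.log('this runs in the same tick as the emit call')
})

// the emit call itself can happen at any point in the future
setTimeout(function () {
  emitter.emit('tick')
}, 1000)

console.log('this is reached first')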

Streams

A stream is a special variety of event emitter designed specifically to consume a sequence of data events without having to buffer the entire sequence. This is particularly useful in cases where a sequence is infinite.

A common use of streams is to read files. Loading a large file into memory all at once doesn’t scale well, so streams let you operate on chunks of the file. To read a file as a stream, you would do something like this:

var fs = require('fs')
var file = fs.createReadStream('something')

file.on('error', function (err) {
  console.error('an error occurred', err)
})

file.on('data', function (data) {
  console.log('I got some data: ' + data)
})

file.on('end', function () {
  console.log('no more data')
})

The “data” event is actually part of the “flowing” or “push” mode introduced in the streams 1 API. It pushes data through the pipe as fast as possible. Often what we really need in order to scale well is a pull stream, which can be achieved using the “readable” event and the `read(...)` function.

file.on('readable', function () {
  var chunk = file.read()
  if (chunk !== null) {
    console.log('I got some data: ' + chunk)
  }
})

Note that you need to check whether the chunk is null, as streams signal their end with null.
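Note also that more than one chunk may be buffered, so a common idiom drains `file` in a loop until `read()` returns null:

file.on('readable', function () {
  var chunk
  // read() returns null once the internal buffer is empty
  while ((chunk = file.read()) !== null) {
    console.log('I got some data: ' + chunk)
  }
})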

Streams include additional functions beyond what event emitters provide, and there are several varieties of streams: Readable, Writable, Duplex, and Transform, covering various forms of data access.
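A Transform stream, for example, is readable and writable at once and rewrites data as it flows through. A minimal sketch of an uppercasing transform, using the streams2-style constructor:

var Transform = require('stream').Transform

var upperCaser = new Transform()
upperCaser._transform = function (chunk, encoding, callback) {
  this.push(chunk.toString().toUpperCase())
  callback()
}

// pipe stdin through the transform and out to stdout
process.stdin.pipe(upperCaser).pipe(process.stdout)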

If you want to read from a file stream, but also want to write the output to a writable stream, all this event machinery can be a bit of overkill. There is a handy function on streams called `pipe`, which automatically propagates the appropriate events from the source stream to the target stream. It is also chainable, which is ideal for using Transform streams to interpret data protocols.
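For instance, compressing a file while copying it is one chained expression, with pipe handling backpressure throughout (the file names are illustrative):

var fs = require('fs')
var zlib = require('zlib')

fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'))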

A theoretical use of a stream to parse a JSON array coming over the network might look like this:

socket.pipe(jsonParseStream()).pipe(eachObject(function (item) {
  console.log('got an item', item)
}))

The `jsonParseStream` would emit each element of the parsed array, as it arrives, as an object-mode stream. The `eachObject` stream would then be able to receive each of these events in object mode and do something with them, in this case logging them.
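Neither `jsonParseStream` nor `eachObject` exists in core; both are hypothetical. But a stream like `eachObject` could be sketched as an object-mode Writable:

var Writable = require('stream').Writable

// hypothetical eachObject: wraps a function in an object-mode
// writable stream, calling it once per object written
function eachObject (fn) {
  var stream = new Writable({ objectMode: true })
  stream._write = function (item, encoding, callback) {
    fn(item)
    callback()
  }
  return stream
}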

It is important to understand that error events are not propagated along pipe chains. So, to be safe, you would need to write the pipe example above more like this:

socket
  .on('error', function (err) {
    console.error('a socket error occurred', err)
  })
  .pipe(jsonParseStream())
  .on('error', function (err) {
    console.error('a json parsing error occurred', err)
  })
  .pipe(eachObject(function (item) {
    console.log('got an item', item)
  }))
  .on('error', function (err) {
    console.error('an iteration error occurred', err)
  })

What’s next?

See you next week, when I will continue this series with 3 more common patterns and how not to abuse them!

