Node JS Best Practices — 2023

1. Don’t use JSON.parse and JSON.stringify

What? But I have used them for many years and they are very important. Yes, I have used them for many years too, and they have indeed served me well, but the problem starts when your input grows: parsing takes a long time and can potentially block the event loop. The time complexity of these functions is O(n), so as n grows, the time they take grows with it. If your application accepts and processes JSON objects from users, you should be cautious about the size of the objects or strings it takes.

Part of the reason these functions block is that they process the whole input at once, so the solution is to use a stream and work with the given object or string in chunks (a short sketch follows the list). There are npm modules that offer asynchronous JSON APIs. See for example:

  • JSONStream, which has stream APIs.

  • Big-Friendly JSON, which has stream APIs as well as asynchronous versions of the standard JSON APIs.
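
Here is a minimal sketch with JSONStream; the file name big.json and the rows property are assumptions for illustration:

const fs = require('fs');
const JSONStream = require('JSONStream');

// parse a large file shaped like {"rows": [...]} one element at a time,
// so the event loop never has to parse the whole document in one go
fs.createReadStream('big.json')
  .pipe(JSONStream.parse('rows.*'))
  .on('data', (row) => {
    console.log(row); // each chunk is a single parsed element of "rows"
  })
  .on('end', () => console.log('done'));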

2. Add a logger

An application produces different kinds of messages: errors, warnings, information, user-interaction data, and so on. As the application grows, managing data and errors can become overwhelming, and you need to know where a bug came from, who sent the request, and so on. Logger libraries are helpful not only for storing messages and errors but also for logging successful requests, failed requests, and how long they took. This will help you do analytics and improve your application. A library called morgan lets you log the HTTP requests that have been sent; in the code below I simply add it as middleware and it writes to the console. The morgan documentation covers other storage mechanisms.

const express = require('express')
const morgan = require('morgan')

const app = express()

// log every request to the console in Apache "combined" format
app.use(morgan('combined'))

app.get('/', function (req, res) {
  res.send('hello, world!')
})

app.listen(3000)

For error and message logs I recommend looking into Winston. It is an easy-to-use library that supports multiple transports such as files, databases, and the console. You can also set the level, such as error, warn, info, debug, etc.

What I normally do is create a separate logger file and export it.

const { createLogger, format, transports } = require('winston');
const config = require('./config'); // assumes a config module that exposes env
const { combine, timestamp, printf, colorize, uncolorize, errors } = format;

// print "timestamp level: message", preferring the stack trace for errors
const winstonFormat = printf(
  ({ level, message, timestamp, stack }) =>
    `${timestamp} ${level}: ${stack || message}`
);

const logger = createLogger({
  level: config.env === 'development' ? 'debug' : 'info',
  format: combine(
    timestamp(),
    errors({ stack: true }), // populate the stack property used above
    // colorize must run before printf, or the colors never reach the output
    config.env === 'development' ? colorize() : uncolorize(),
    winstonFormat
  ),
  transports: [new transports.Console()],
});

module.exports = logger;
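
Then anywhere in the application you can require it and log at the level you need:

const logger = require('./logger');

logger.info('server started');
logger.error(new Error('something failed')); // prints the stack trace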

3. Adopt one API for one task

It is sometimes difficult to separate and structure APIs, but this approach creates decoupling, maintainability, and separation of responsibility in your API, and it also makes your application faster than cramming a bunch of operations into one API.

For example, when building a form that accepts text and videos, separate the API in two: one endpoint to store the text, and another to upload the video. This results in a better response time.
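
A minimal sketch with Express; the route names and the multer upload middleware are assumptions for illustration:

const express = require('express');
const multer = require('multer'); // multipart upload handling

const app = express();
app.use(express.json());
const upload = multer({ dest: 'uploads/' });

// endpoint 1: store the text fields, which returns quickly
app.post('/posts', (req, res) => {
  // save req.body.title, req.body.description, ...
  res.status(201).send({ id: 1 });
});

// endpoint 2: upload the video separately so it does not delay the text
app.post('/posts/:id/video', upload.single('video'), (req, res) => {
  res.status(201).send({ file: req.file.filename });
});

app.listen(3000);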

4. Separate code into npm packages

When you are working on multiple projects and constantly need to reuse similar code, it is time to separate that code into an npm package. This will save you a lot of the time you spend maintaining the code, and it also makes things much easier if you are working in a team.
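
A minimal sketch of what such a package's package.json might look like; the name and fields here are hypothetical:

{
  "name": "@myorg/date-utils",
  "version": "1.0.0",
  "description": "Date helpers shared across our projects",
  "main": "index.js"
}

Then every project can simply npm install it instead of copying the code around.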

5. Make heavy calculations asynchronous

While Node js is best at handling I/O operations, it is not well suited for time-intensive mathematical calculations. If you were to sum n integers for a very large n, your application could get stuck and stop serving requests. That is because Node js serves every request on a single thread and doesn’t create a thread for each request.

This kind of task is executed on the event loop, and all incoming requests and outgoing responses pass through the event loop. This means that if the event loop spends too long at any point, all current and new clients will not get a turn. So, let’s get into the solution!

Okay, so you can wrap your blocking function with setImmediate. What do I mean by that, and why? A setImmediate callback runs in a later iteration of the event loop, which means Node js prioritizes what can run immediately, such as initialization, assignments, and registering asynchronous code, before calling the setImmediate callback.

// other code
setImmediate(() => {
  processData(data); // deferred until the work already queued has run
});
// other code

The event loop doesn’t execute processData immediately; instead it registers the setImmediate callback and lets the other code run first. I have written an article on how you can use setImmediate to write non-blocking code, and please don’t tell me you haven’t read it yet.

If you need to do something more complex, the first approach is not a good option, because it still uses only the event loop and you won’t benefit from multiple cores. The other solution is to use the Node JS worker_threads module to create a separate thread that handles those tasks.
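
A minimal sketch of that approach, reusing the sum-of-n example from above in a single file:

const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // main thread: spawn a worker and keep the event loop free for requests
  const worker = new Worker(__filename, { workerData: 1e9 });
  worker.on('message', (sum) => console.log('sum:', sum));
  worker.on('error', (err) => console.error(err));
} else {
  // worker thread: the heavy loop runs here without blocking the main thread
  let sum = 0;
  for (let i = 0; i < workerData; i++) sum += i;
  parentPort.postMessage(sum);
}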

6. Don’t store large data in a variable

Variables are stored in RAM (Random Access Memory), which makes them fast to store and access. If you encounter a situation where you need to store data that requires a lot of space, create a place for it in a database instead. Large data can overwhelm the RAM, which affects the server and hence your application as well.

7. Avoid

Best practices can sometimes mean avoiding what is not best. As funny as that might sound, you need to take these three points seriously.

  • Avoid synchronous functions like readFileSync, because they block the main thread. Instead, use the callback or promise version (see the sketch after this list).

  • Avoid storing large sessions or data on your request or response body, since it can slow down the response. What if you need to send big data? Use a stream.

  • Avoid require-ing large data sets such as big JSON files, since Node js require is synchronous and can block the main thread. Instead, store your data in a database and fetch only what you need. If you have to keep it in a file, use a stream to fetch parts of the data asynchronously.
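
A minimal sketch of the non-blocking alternatives; the file names are assumptions:

const fs = require('fs');
const fsp = require('fs/promises');

// promise version: the main thread stays free while the file is read
async function readConfig() {
  const text = await fsp.readFile('config.json', 'utf8');
  return JSON.parse(text); // fine for small files; stream large ones instead
}

// stream version: process a large file chunk by chunk
fs.createReadStream('big-data.json', 'utf8')
  .on('data', (chunk) => {
    // handle one chunk at a time instead of loading everything into memory
  })
  .on('end', () => console.log('finished'));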

_______________________________________________________________

I am currently working on an advanced Node js course covering:
- Improving performance
- Security
- Node js behind the scenes
- Scaling
- Architecture
- Deploying node js apps

Follow me on LinkedIn to get the full course, which I will release soon.