Node.js Express Common Configurations

Your basic express app would look like the following:

var express = require('express');
var app = express();

The following are some common configurations that you can do for your express app:

Increasing request payload limit

If you expect your requests to carry large payloads (say, image or video content), you can increase the size limit for the JSON payload.

app.use(express.json({limit: '5mb'}));

The default limit is 100kb. express.json() uses the bytes package, so sizes can be expressed as 'b', 'kb', 'mb', 'gb' and so on.
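To see what these size strings mean in raw bytes, here is a rough sketch of the conversion the bytes package performs (parseSize is a hypothetical helper for illustration, not the package's actual code; note that bytes treats 'kb' as 1024 bytes):

```javascript
// Simplified sketch of how a size string like '5mb' maps to a byte count.
// The real parsing is done by the bytes package, used by express.json().
const UNITS = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3 };

function parseSize(input) {
  const match = /^(\d+(?:\.\d+)?)\s*(b|kb|mb|gb)$/i.exec(input.trim());
  if (!match) throw new Error('Unparseable size: ' + input);
  return Math.floor(parseFloat(match[1]) * UNITS[match[2].toLowerCase()]);
}

console.log(parseSize('100kb')); // 102400 — the default express.json() limit
console.log(parseSize('5mb'));   // 5242880
```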

IP Rate Limiting

In order to implement IP rate limiting in an Express app, we will use the express-rate-limit and request-ip packages. The request-ip package looks for specific headers in the request in the order of decreasing priority and falls back to some defaults if none of the headers exist.

const rateLimit = require('express-rate-limit');
const requestIp = require('request-ip');

app.use(requestIp.mw()); // populates req.clientIp on every request

const limiter = rateLimit({
  windowMs: 600 * 1000, // 10 minutes
  max: 600, // limit each IP to 600 requests per windowMs
  keyGenerator: (req, res) => {
    return req.clientIp; // IP address from request-ip, as opposed to req.ip
  }
});

app.use(limiter);
In the above example, the rate is limited to 600 requests in a 10 minute interval per IP address. You can adjust the windowMs and max parameters to adjust the rate.
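Under the hood, express-rate-limit keeps a per-key hit counter inside a time window. The idea can be sketched in plain JavaScript (createLimiter is a hypothetical illustration, not the package's implementation):

```javascript
// Sketch of fixed-window rate limiting: allow at most `max` hits per key
// within each `windowMs` window. express-rate-limit implements a more
// robust version of this idea (pluggable stores, rate-limit headers, etc.).
function createLimiter(windowMs, max) {
  const hits = new Map(); // key -> { start, count }
  return function allow(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.start >= windowMs) {
      hits.set(key, { start: now, count: 1 }); // a new window for this key
      return true;
    }
    entry.count += 1;
    return entry.count <= max;
  };
}

const allow = createLimiter(10 * 60 * 1000, 600); // 600 requests / 10 minutes
console.log(allow('203.0.113.7')); // true — first request in the window
```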


Utilizing all CPU cores

If you are on a multi-core machine, you may want to spawn an individual process on each core. The cluster module of Node.js allows you to do that.

var cluster = require('cluster');
var count = require('os').cpus().length;
var workers = {}; // pid -> worker
// logger refers to your logging library; console works just as well

function spawn() {
  var worker = cluster.fork();
  logger.debug("Worker " + worker.process.pid + " started!");
  workers[worker.process.pid] = worker;
  return worker;
}

if (cluster.isMaster) {
  logger.debug("Number of cores is " + count.toString());
  for (var i = 0; i < count; i++) {
    spawn();
  }
  cluster.on('exit', function (worker) { // 'exit' replaced the old 'death' event
    logger.debug('worker ' + worker.process.pid + ' died. spawning a new process...');
    delete workers[worker.process.pid];
    spawn();
  });
} else {
  // app is the Express app created earlier
  var server = app.listen(process.env.PORT || 8081, function () {
    var host = server.address().address;
    var port = server.address().port;
    logger.debug("Example app listening at http://%s:%s", host, port);
  });
}

In the above example, as many processes are spawned as there are cores, and a dead worker is replaced with a freshly spawned one.

Routing requests to different files

Your main app may be running in handler.js. Assume that you have a number of endpoints starting with /foo implemented in foo.js. How do you route requests from handler.js to foo.js? Using an Express Router.

Your foo.js will look like:

const express = require('express');
let router = express.Router();

router.get('/', async function (req, res) {
  // Your function
});

router.get('/bar', async function (req, res) {
  // Your function
});

module.exports = router;

Your handler.js will look like:

const express = require('express');
var app = express();

const foo = require('./foo'); // Add the path to foo.js

app.use('/foo', foo); // Now all requests with endpoints beginning with /foo will be routed to foo.js

// Rest of the code
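When Express hands a request to a mounted router, the router sees the path relative to its mount point, which is why router.get('/bar') in foo.js matches GET /foo/bar. The prefix-stripping idea can be sketched as follows (mountPath is a hypothetical helper for illustration, not Express's actual code):

```javascript
// Sketch of how a mount point maps full URLs to router-relative paths,
// mirroring what Express does internally for app.use('/foo', router).
function mountPath(mount, url) {
  if (url === mount) return '/';                                    // GET /foo     -> '/'
  if (url.startsWith(mount + '/')) return url.slice(mount.length);  // GET /foo/bar -> '/bar'
  return null; // this router does not handle the request
}

console.log(mountPath('/foo', '/foo/bar')); // '/bar'
```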

Handling code crashes

Sometimes, unhandled promise rejections or uncaught exceptions may crash your production app. In that case, it is important to be notified when it happens. This can be achieved via the process module.

const process = require('node:process');

process.on('uncaughtException', function (exception) {
  console.log("Uncaught exception: " + exception); // print what went wrong
  // send email
  process.exit(1); // Best practice: let the process crash
});

process.on('unhandledRejection', (reason, p) => {
  console.log("Unhandled Rejection at: Promise ", p, " reason: ", reason); // print what went wrong
  // send email
  process.exit(1); // Best practice: let the process crash
});

I hope you liked this article. For more articles on IoT in general, check out
