
I am trying to determine if this is a fair benchmark. The goal is to see how many concurrent connections Node.js can handle at various payload sizes. The code is below.

var express = require('express');
var fs = require('fs');
var app = express();


var data;

var filename = process.argv[2] || './start.json';
console.log("Using: " + filename);
data = fs.readFileSync(filename);

var blockSize = 250000;
app.get('/start', function (req, res) {
  // Break up data in blocks.  Works to super high concurrents.
  // for(var i = 0; i < data.length; i+=blockSize)
  //   res.write(data.slice(i, i+blockSize));  

   // Can only handle about 600 concurrent requests if datasize > 500KB
   res.send(data);
});



app.listen(3000, function () {
  console.log('Listening on 3000.');
});

As stated in the comments, if the payload is larger than about 500 KB and there are around 500 concurrent connections, the load-testing client gets "connection reset by peer". If you slice the data up and write it in chunks, it survives to much higher concurrency before that starts. Both stock Node and Express exhibit this behavior.

  • The problem is that the majority of the data is in RAM, so for large payloads it boils down to how long the memcpy() takes. This is exactly the kind of workload Node cannot handle well: Node is optimised for I/O, not RAM processing. You'll get much better concurrency by opening the file as a read stream and piping it to the client. That shifts almost all the load to the OS instead of Node, and if you are on Linux or Solaris you'll get a huge boost from the optimised filesystem drivers. Commented Aug 21, 2016 at 15:48
  • On the other hand, for small payloads you'll often get better performance keeping the data in RAM. So it depends really. Commented Aug 21, 2016 at 15:49
  • The CPU pegs out at 100% when the data size is larger, which for Node is obviously really bad. Commented Aug 21, 2016 at 15:54

2 Answers

data = fs.readFileSync(filename);

Sync methods are Node.js killers: readFileSync blocks the event loop for ALL requests, making performance really bad.

Try this:

var express = require('express');
var fs = require('fs');
var app = express();
var filename = process.argv[2] || './start.json';

app.get('/start', function (req, res) {
  console.log("Using: " + filename);

  // Read asynchronously on every request so the event loop is never blocked.
  fs.readFile(filename, function (err, data) {
    if (err) throw err;
    res.send(data);
  });
});

app.listen(3000, function () {
  console.log('Listening on 3000.');
});

2 Comments

If you look, that happens exactly once, at the start of the script. The point is not to benchmark loading the file, only the transfer, so the data is in memory before the server even listens for connections.
Please read the provided code before posting.

As an alternative, you could create a read stream and pipe it. Here is an example based on your code:

var express = require('express');
var fs = require('fs');
var app = express();

var filename = process.argv[2] || './data.json';
console.log("Using: " + filename);

app.get('/start', function (req, res) {
  // Create a fresh stream per request: a stream can only be consumed once,
  // so a single module-level stream would serve only the first request.
  fs.createReadStream(filename).pipe(res);
});

app.listen(3000, function () {
  console.log('Listening on 3000.');
});

3 Comments

This was another idea I'd had. Based on how it performs, I'm willing to bet send() does something similar behind the scenes. With the readStream created inside /start, it performs exactly like sending the raw data.
It might be the case. I tested both solutions with very similar results.
Also, even funnier: with a large file (30 MB) it even beats Go at the same test. Not expected.
