
With Node I am trying to collect user data from an LDAP server and then write that data to a JSON file. I am using the following code to do this:

fs.writeFile('data.json', JSON.stringify(data, null, 4));

The problem is the JSON.stringify method is causing the following error:

FATAL ERROR: JS Allocation failed - process out of memory

I know the problem is with JSON.stringify because if I use console.log rather than fs.writeFile I get the same error.

I am trying to write a lot of data (over 500 entries in the LDAP database). Does anyone know how I can get this to work? Here is the code in full:

var ldap = require('ldapjs');
var util = require('util');
var fs = require('fs');
var client = ldap.createClient({
  url: '************'
});

client.bind('CN=**********,OU=Users,OU=LBi UK,OU=UK,DC=********,DC=local', '*********', function(err) {
  if (err) {
    console.log(err.name);
  }
});


// taken from http://ldapjs.org/client.html
client.search('OU=Users,OU=******,OU=UK,DC=******,DC=local', {
  scope: 'sub',
  filter: 'objectClass=organizationalPerson',
  attributes: ['givenName', 'dn', 'sn', 'title', 'department', 'thumbnailPhoto', 'manager']
  // filter by organizational person
}, function(err, res) {
  if (err) {
    console.log(err.name);
  }

  var limit = 1;
  var data = {"directory": []};

  res.on('searchEntry', function(entry) {

    var obj = {};
    entry.attributes.forEach(function (attribute) {
      var value;
      if (attribute.type === 'thumbnailPhoto') {
        value = attribute.buffers[0];

      } else {
        value = attribute.vals[0];
      }
      obj[attribute.type] = value;
    });
    data.directory.push(obj);
  });
  res.on('error', function(err) {
    console.log('error: ' + err.message);
  });
  res.on('end', function(result) {
    fs.writeFile('data.json', JSON.stringify(data, null, 4));
  });

});
    Did you consider that... you are out of memory? :) I.e. the data is too big? Commented Jun 25, 2012 at 13:13
  • @freakish I'm not sure if that's the case because I know somebody else was able to write the data to a file. However, when I explained to him I was getting this error he said he hadn't come across this problem. Is there a way I can increase the amount of memory available? Commented Jun 25, 2012 at 13:23
  • Yeah, buy more RAM. :) Or kill other apps. Did the other person do this on the same machine? If not, then you can't compare it. I think that this is as simple as that: you are out of memory. To handle this issue (without actually getting more RAM), you need to split the data into smaller pieces and handle one piece at a time. Commented Jun 25, 2012 at 13:25
  • @freakish I just tried using console.log(util.inspect(data)); instead of fs.writeFile and I was able to display the data in the console no problem. Is there an alternative to JSON.stringify I can use to write the file? Commented Jun 25, 2012 at 13:48
  • Stephen, you don't seem to understand. You have enough memory to hold the object and inspect it, but stringification actually creates a new string in memory. Since your data is big enough, this string does not fit in memory. Changing libraries won't help, because in the end you would still be holding the big string in memory. And you shouldn't. As I told you: you have to split the data into smaller parts, stringify those parts and append them to a file piece by piece. Of course I might be wrong, but I can't see another logical explanation. Commented Jun 25, 2012 at 13:52

4 Answers


As @freakish mentioned, the problem was that my data was too big.

The reason the data was so big was the large number of images being returned as raw objects. In the end all I needed to do was encode each image as base64 using Buffers, and then the size of the data became much more manageable.




Stephen's data wasn't "too big". This is a bug in Node, tracked here, and not fully fixed. I'm still seeing it 2 years later, with an object of constant complexity sometimes producing the OOM error, but most of the time being successfully written to disk.

One workaround is to look into using a streaming JSON writer library like json-write (from D3 author Mike Bostock).



Something is happening recursively.

Make sure your data object does not contain any circular references, such as a reference to this or to anything else that cannot be serialized.
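Note, though, that a genuinely circular structure makes JSON.stringify throw a TypeError rather than run out of memory. A quick check:

```javascript
// A self-referencing object cannot be serialized to JSON.
var obj = { name: 'node' };
obj.self = obj; // circular reference

try {
  JSON.stringify(obj);
} catch (e) {
  console.log(e.name); // → "TypeError"
}
```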

3 Comments

  • JSON.stringify should cause TypeError: Converting circular structure to JSON if the structure has circular references.
  • @Esailija: It must literally be a memory issue then.
  • I don't think that's the case. Circular structures throw a TypeError when you try to stringify them.

If someone passing by here actually does need to stringify a big chunk of data and write it to a file (as I did), here are three small functions that do so:

const fs = require('fs');

let buff = '';

// Buffer small writes in memory and flush to disk in ~100 KB chunks,
// so we don't hit the filesystem for every few characters
function append(file, str) {
  buff += str;
  if (buff.length > 100_000) {
    fs.appendFileSync(file, buff);
    buff = '';
  }
}

// Write out whatever is left in the buffer
function flush(file) {
  if (buff.length > 0) {
    fs.appendFileSync(file, buff);
    buff = '';
  }
}

function writeJsonToFile(file, obj, isRecursion = false) {
  if (!isRecursion) {
    // Empty the file
    fs.writeFileSync(file, '');
  }

  if (Array.isArray(obj)) {
    append(file, '[');
    for (let i = 0; i < obj.length; i++) {
      writeJsonToFile(file, obj[i], true);
      if (i < obj.length - 1) {
        append(file, ',');
      }
    }
    append(file, ']');
  } else if (obj instanceof Date) {
    append(file, `"${obj.toISOString()}"`);
  } else if (obj == null) {
    append(file, 'null');
  } else if (typeof obj === 'object') {
    append(file, '{');
    const keys = Object.keys(obj);
    for (let i = 0; i < keys.length; i++) {
      append(file, `"${keys[i]}":`);
      writeJsonToFile(file, obj[keys[i]], true);
      if (i < keys.length - 1) {
        append(file, ',');
      }
    }
    append(file, '}');
  } else if (typeof obj === 'string') {
    // Escape backslashes before quotes, or the backslashes we add
    // would themselves get escaped again
    append(file, `"${obj.replace(/\\/g, '\\\\').replace(/"/g, '\\"')}"`);
  } else {
    append(file, obj.toString());
  }

  if (!isRecursion) flush(file);
}

To use it:

const myBigObject = ['WhateverYouWant'];
writeJsonToFile('/tmp/my-file.json', myBigObject);

This works by appending to the file little by little instead of trying to keep the whole stringified JSON in memory.

Hope I helped someone!

