
As the title implies, I'm trying to stringify a huge JavaScript object with JSON.stringify in my Node.js app. The objects are, again, huge (tens of megabytes), and they don't contain any functions. I need to write the serialized objects to a file. What I'm getting now is this:

RangeError: Invalid string length
  at Object.stringify (native)
  at stringifyResult (/my/file.js:123:45) -> line where I use JSON.stringify

Any idea how to solve that issue?

  • That said, if what you're doing is preparing the data structure for output, you could write your own JSON serializer that incrementally writes to an output stream instead of creating a single massive string. It wouldn't be super-easy, but it wouldn't be super-hard either (see the sketch after these comments). Commented Mar 20, 2015 at 21:17
  • I think there are streaming or buffered JSON de/serializers out there. Commented Mar 20, 2015 at 21:31
  • Looking for a similar answer, except for JavaScript on the client side (no Node). Meanwhile, here's an answer to your problem, @boris: stackoverflow.com/questions/24153996/… Commented Nov 30, 2015 at 9:26
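
Following up on the first comment, here is a minimal sketch of what such an incremental serializer could look like. This is illustrative only: writeJson and yourHugeObject are made-up names, backpressure handling is omitted, and it assumes the data contains no functions, undefined values, or cycles (as the question states):

// walk the structure and write JSON fragments to a stream instead of
// building one giant string in memory
const fs = require('fs');

function writeJson(out, value) {
  if (Array.isArray(value)) {
    out.write('[');
    value.forEach((el, i) => {
      if (i > 0) out.write(',');
      writeJson(out, el);
    });
    out.write(']');
  } else if (value !== null && typeof value === 'object') {
    out.write('{');
    Object.keys(value).forEach((key, i) => {
      if (i > 0) out.write(',');
      out.write(JSON.stringify(key) + ':');
      writeJson(out, value[key]);
    });
    out.write('}');
  } else {
    // primitives are small, so the native stringifier can handle the
    // quoting and escaping for each one individually
    out.write(JSON.stringify(value));
  }
}

const out = fs.createWriteStream('result.json');
writeJson(out, yourHugeObject);
out.end();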

4 Answers


I too have seen this unhelpful/misleading Node.js error message, so I filed an issue on the Node.js GitHub repo:

RangeError: Invalid string length: it should say something like "Out of memory" instead.
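
For what it's worth, the error is easy to reproduce without JSON.stringify at all; this small sketch just doubles a string until it exceeds V8's maximum string length:

// keeps doubling a string until V8 refuses to allocate the result
let s = "a";
try {
  while (true) s += s; // length doubles on every iteration
} catch (e) {
  console.log(e.message); // "Invalid string length"
}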




As mentioned by @sandeepanu, there's a great little solution by @madhunimmo if you're trying to stringify a huge array: just stringify one element at a time:

let out = "[" + yourArray.map(el => JSON.stringify(el)).join(",") + "]";

If you're trying to stringify an object with a very large number of keys/properties, you can use Object.entries() to turn it into an array of key/value pairs first:

let out = "[" + Object.entries(yourObject).map(el => JSON.stringify(el)).join(",") + "]";

If that still doesn't work, then you'll probably want to use a streaming approach, although you could also slice your array into portions and store them as multiple jsonl (one JSON object per line) files:

// untested code: split the array into numFiles chunks, one jsonl string each
let numFiles = 4;
let chunkSize = Math.ceil(arr.length / numFiles);
for (let i = 0; i < numFiles; i++) {
  // join with "\n" so each object sits on its own line, per the jsonl format
  let out = arr.slice(i * chunkSize, (i + 1) * chunkSize)
               .map(el => JSON.stringify(el))
               .join("\n");
  // add your code to store/save `out` here
}

One streaming approach (new, and at the time of writing only supported in Chrome, but likely to come to other browsers, and even Deno and Node.js in some form or another) is to use the File System Access API. The code would look something like this:

// untested code: prompt for a directory, then stream one JSON line per element
const dirHandle = await window.showDirectoryPicker();
const fileHandle = await dirHandle.getFileHandle('yourData.jsonl', { create: true });
const writable = await fileHandle.createWritable();
for (const el of yourArray) {
  await writable.write(JSON.stringify(el) + "\n");
}
await writable.close();
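
In Node.js, where the question is set, the same idea works today with a plain write stream; a minimal sketch (ignoring backpressure for brevity, with yourArray standing in for your data):

// Node.js equivalent: stream one JSON line per element to disk
const fs = require('fs');

const writable = fs.createWriteStream('yourData.jsonl');
for (const el of yourArray) {
  writable.write(JSON.stringify(el) + "\n");
}
writable.end();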

1 Comment

This works only if your problem is that the object is wide (lots of entries). If the stringify problem is a single child value being too large, this won't suffice to fix it.

I find JSONStream to be a reliable alternative to the native JSON.stringify that works well with large objects. For example:

var fileSystem = require( "fs" );
var JSONStream = require( "JSONStream" );
var records = [
    { id: 1, name: "Terminator" },
    { id: 2, name: "Predator" },
    { id: 3, name: "True Lies" },
    { id: 4, name: "Running Man" },
    { id: 5, name: "Twins" }
    // .... hundreds of thousands of records ....
];

// stream the records through the stringifier straight into the file
var transformStream = JSONStream.stringify();
var outputStream = fileSystem.createWriteStream( __dirname + "/data.json" );
transformStream.pipe( outputStream );
records.forEach( ( record ) => transformStream.write( record ) );
transformStream.end();

outputStream.on(
    "finish",
    function handleFinish() {
        console.log("Done");
    }
);

Took the sample code from here.
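
Regarding the single-line question in the comments below: JSONStream.stringify(open, sep, close) accepts custom framing strings, and the defaults contain newlines. As far as I recall, passing plain brackets and a comma keeps the whole array on one line:

// one-line JSON array: no newlines in the open/separator/close strings
var transformStream = JSONStream.stringify( "[", ",", "]" );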

2 Comments

This approach helped me, thanks. Just one question: could this answer be tweaked to produce single-line JSON? Because at this point there's a line break after every JSON object.
This worked somewhat for me for a 431 MB file. If writing an object, use Object.entries(records).forEach(transformStream.write); however, each key/value pair got wrapped in an array ...

Here's a simple helper file that can do the job:

const fs = require('fs');
const json = require('big-json');

// pojo is streamed out in JSON chunks and written to the specified file name
function makeFile(filename, pojo){

    const stringifyStream = json.createStringifyStream({
        body: pojo
    });

    // pipe into a single write stream rather than calling fs.appendFile per
    // chunk, which opens a new file descriptor each time (and can fail with
    // EMFILE) and doesn't guarantee the chunks land in order
    const outputStream = fs.createWriteStream(filename);
    stringifyStream.pipe(outputStream);

    // resolves once everything has been flushed to disk
    return new Promise((resolve, reject) => {
        outputStream.on('finish', resolve);
        outputStream.on('error', reject);
    });
}

module.exports = {
    makeFile
}
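
A usage sketch, assuming the helper above is saved as makeFile.js (the file and variable names here are hypothetical):

const { makeFile } = require('./makeFile');

makeFile('data.json', someHugeObject)
    .then(() => console.log('done writing'));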

2 Comments

Does it have any end event? How can you tell when the entire JSON has finished being stringified?
I get Error: EMFILE: too many open files when I try to write a 200 MB object from memory to a file.
