
So I have an XML file that is larger than 70 MB. I would like to parse this data in Node.js to eventually do data visualizations on it. To start, I thought it would be best to use JSON instead of XML, because Node.js is better built to work with JSON. So I planned to use the xml2json node module to parse the XML into JSON, but I can't seem to read the XML file into a variable because it's so large. I attempted to do this with the following code.

var fs = require('fs');


fs.readFile(__dirname + '/xml/ipg140114.xml', 'utf8', function(err, data) {
    if (err) {
        return console.log(err);
    }
    // data would hold the entire file as a single string here
});

I receive a stack trace error. What's a better way to get this file converted into JSON so I can parse it with Node? I am pretty new to Node, so let me know if my approach is wrong. Thanks in advance!


1 Answer


xml2json requires you to load the entire file into memory. You could allocate more memory to the Node process, but I would recommend parsing the XML directly from the file as a stream instead.
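For reference, the V8 heap limit can be raised with the --max-old-space-size flag (the value is in megabytes; app.js here is just a stand-in for your entry script), but this only postpones the problem for bigger files:

node --max-old-space-size=4096 app.js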

There are other libraries on npm, such as xml-stream, that will allow you to parse the XML directly from the file without loading it all into memory. A rough sketch of what that looks like follows below.
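Something along these lines should work (untested sketch; 'item' is a placeholder for whatever element repeats in your XML, and the path is copied from the question):

var fs = require('fs');
var XmlStream = require('xml-stream');

// Stream the file from disk instead of buffering all 70 MB at once
var stream = fs.createReadStream(__dirname + '/xml/ipg140114.xml');
var xml = new XmlStream(stream);

// Fires once per closing tag of the repeating element
xml.on('endElement: item', function(item) {
    // item arrives as a plain JavaScript object, ready for JSON.stringify
    console.log(JSON.stringify(item));
});

Each parsed element is handed to the callback and then discarded, so memory use stays flat regardless of file size.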

My personal issue with xml-stream is that it relies on node-gyp for compilation, which can be a hassle if you're a Windows user. I added a very basic parser called no-gyp-xml-stream to npm; it depends only on sax, but it's a bit rudimentary and may not suit your needs.
I am however willing to improve it if anyone needs anything: https://www.npmjs.com/package/no-gyp-xml-stream
