Parsing a large XML file in Node.js
So I have an XML file over 70 MB in size. I would like to parse this data in Node.js to eventually render it. For starters, I thought it would be better to work with JSON instead of XML, because Node.js handles JSON more naturally. So I was planning on using the xml2json node module to convert the XML to JSON, but I cannot read the whole XML file into a variable because it is so big. I tried to do it with the following code.
var fs = require('fs');

fs.readFile(__dirname + '/xml/ipg140114.xml', 'utf8', function (err, data) {
    if (err) {
        return console.log(err);
    }
    // data is the entire file as one string
});
I am getting a stack trace error. What's the best way to get this file converted to JSON so I can parse it with Node? I am new to Node, so let me know if my approach is wrong. Thanks in advance!
xml2json requires loading the entire file into memory. You could allocate more memory, but I would recommend parsing the XML directly from the file instead.
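If you do want to try the allocate-more-memory route first, Node's V8 heap limit can be raised with the --max-old-space-size flag (value in megabytes). The script name app.js here is just a placeholder for your own entry point:

```shell
# Raise V8's old-space heap limit to 4 GB before reading the whole file.
# This is a stopgap; streaming the XML is the more robust fix.
node --max-old-space-size=4096 app.js
```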
There are other libraries on npm, such as xml-stream, that will let you parse XML directly from a file without loading it all into memory.
My personal problem with xml-stream is that it is based on GYP, which can be a problem if you are a Windows user. I published a basic parser called no-gyp-xml-stream on npm that depends only on sax. It is a little rudimentary and may not suit your needs. However, I am willing to improve it in case anyone needs something more: https://www.npmjs.com/package/no-gyp-xml-stream