885

I am doing some experimentation with Node.js and would like to read a JSON object, either from a text file or a .js file (which is better?), into memory so that I can access that object quickly from code. I realize that there are things like Mongo, Alfred, etc. out there, but that is not what I need right now.

How do I read a JSON object out of a text or js file and into server memory using JavaScript/Node?


14 Answers

1592

Sync:

var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('file', 'utf8'));

Async:

var fs = require('fs');
var obj;
fs.readFile('file', 'utf8', function (err, data) {
  if (err) throw err;
  obj = JSON.parse(data);
});
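
A note on the asynchronous version: obj is only assigned once the callback runs, so any code that needs the parsed object should live inside the callback (or be invoked from it). A minimal sketch, assuming a hypothetical config.json in the working directory:

var fs = require('fs');

fs.readFile('config.json', 'utf8', function (err, data) {
  if (err) throw err;
  var obj = JSON.parse(data);
  // Use the object here; outside the callback it would still be undefined
  console.log(obj);
});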

14 Comments

I think JSON.parse is synchronous (it comes directly from V8), which means that even with the async way, people have to be careful with large JSON files, since parsing would tie up Node.
For the sake of completeness: there exists an npm package called jsonfile.
I can't believe it was so difficult to find this simple thing. Every answer I got from Google was doing an HTTP request, using jQuery, or doing it in the browser.
Two points: (1) the synchronous answer could just be let imported = require("file.json"); (2) in my experience JSON.parse handles this better: I used this code to load a 70 MB JSON file into memory as an object, and it takes milliseconds this way, but if I use require(), it chugs.
For people finding this answer in 2019 and on: Node.js has had native JSON support through require for many, many versions, so this answer is no longer needed if you just want to load a JSON file. Just use let data = require('./yourjsonfile.json') and off you go (with the footnote that if the performance of require is impacting your code, you have problems well beyond "wanting to load a .json file").
510

The easiest way I have found to do this is to just use require and the path to your JSON file.

For example, suppose you have the following JSON file.

test.json

{
  "firstName": "Joe",
  "lastName": "Smith"
}

You can then easily load this in your Node.js application using require:

var config = require('./test.json');
console.log(config.firstName + ' ' + config.lastName);

10 Comments

Just so folks know, and if I remember correctly, require in Node runs synchronously. Dive in deep here.
Another issue/benefit with this method is that the required data is cached unless you specifically delete the cached instance (see the sketch after these comments).
"require" is meant for loading modules or config files that you use throughout the lifespan of your application; it does not seem right to use it just to load data files.
I'd say this is potentially a security threat. If the JSON file you're loading contains JS code, would requiring it run that code? If so, then you really need to control where your JSON files are coming from, or an attacker could run malicious code on your machine.
This is a sound solution for small DevOps scripts or batch operations. You have to balance human time with performance. As far as something you can commit to memory and use quickly for these appropriate cases, this is tops. Not every task involves Big Data™ and hostile execution environments.
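
Regarding the caching comment above, a minimal sketch (using the test.json example from this answer) of how the cached entry can be removed so that a subsequent require re-reads the file:

// Remove the cached copy so the next require() re-reads test.json from disk
delete require.cache[require.resolve('./test.json')];
var config = require('./test.json');
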
82

Answer for 2022, using ES6 module syntax and async/await

In modern JavaScript, this can be done as a one-liner, without the need to install additional packages:

import { readFile } from 'fs/promises';

let data = JSON.parse(await readFile("filename.json", "utf8"));

Add a try/catch block to handle exceptions as needed; for example:
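
One possible shape, assuming the file name from above and an ES module (so top-level await is available); loadJson is just an illustrative helper name:

import { readFile } from 'fs/promises';

async function loadJson(path) {
  try {
    return JSON.parse(await readFile(path, 'utf8'));
  } catch (err) {
    // The file could not be read, or its contents were not valid JSON
    console.error('Failed to load', path, err);
    throw err; // or return a fallback value
  }
}

const data = await loadJson('filename.json');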

5 Comments

Where would you put the try/catch?
I was looking for this, thank you! It works great when I know that the file's content is JSON data but the extension is custom. The usual require('./jsonfile.xyz') cannot be used in this situation.
Why not use readFileSync and remove the await? JSON.parse(readFileSync("filename.json", "utf8"));
Because file I/O can be (is) slow and if used in the synchronous manner, the server cannot do anything else in the meantime. See also stackoverflow.com/questions/16827373/…
This should be inside an async function for await to work.
62

Asynchronous is there for a reason! Throws stone at @mihai

Otherwise, here is the code he used with the asynchronous version:

// Declare variables
var fs = require('fs'),
    obj

// Read the file and send to the callback
fs.readFile('path/to/file', handleFile)

// Write the callback function
function handleFile(err, data) {
    if (err) throw err
    obj = JSON.parse(data)
    // You can now play with your data
}

3 Comments

agreed :), added async as well
Great :-) I don't like inline callbacks though; they can lead to callback nightmares that I'd rather avoid.
It's there for a reason... unless you want it synchronously.
58

At least in Node v8.9.1, you can just do

var json_data = require('/path/to/local/file.json');

and access all the elements of the JSON object.

4 Comments

This approach loads the file only once. If you change file.json and require it again (without restarting the program), the data will still be from the first load. I do not have a source to back this up, but I saw this behavior in an app I am building.
Your answer is woefully incomplete. What that gets you is an object, and it doesn't even bother to implement toString().
@DavidA.Gray The question wants to be able to access the objects as objects, not as strings. Aside from the singleton/caching issue Lukas mentioned, this answer is fine.
Using require will also execute arbitrary code in the file. This method is insecure and I would recommend against it.
20

In Node 8 you can use the built-in util.promisify() to asynchronously read a file like this:

const {promisify} = require('util')
const fs = require('fs')
const readFileAsync = promisify(fs.readFile)

readFileAsync(`${__dirname}/my.json`, {encoding: 'utf8'})
  .then(contents => {
    const obj = JSON.parse(contents)
    console.log(obj)
  })
  .catch(error => {
    throw error
  })
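
As a comment below points out, newer Node versions also expose a promise-based fs API directly, so promisify is not strictly needed. A minimal sketch of the same read using fs.promises from CommonJS (available since roughly Node 10; adjust to your Node version):

const fs = require('fs').promises;

fs.readFile(`${__dirname}/my.json`, 'utf8')
  .then(contents => {
    const obj = JSON.parse(contents);
    console.log(obj);
  })
  .catch(error => console.error(error));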

4 Comments

.readFile is already async, if you're looking for the sync version, its name is .readFileSync.
If you want to use promises, there's also fs/promises as of Node 10. Note: the API is experimental: nodejs.org/api/fs.html#fs_fs_promises_api
@Aternus .readFile is asynchronous, but not async. Meaning, the function is not defined with async keyword, nor does it return a Promise, so you can't do await fs.readFile('whatever.json');
@Kip how about a CodeSandBox?
18

Answer for 2023, using Import Attributes

import jsObject from "./test.json" with {type: "json"};
console.log(jsObject)

Dynamic import

const { default: jsObject } = await import("./test.json", { with: { type: "json" } });
console.log(jsObject);

Further information: tc39 / proposal-import-attributes

2 Comments

@Mike your edit here made this answer wrong; it now claims to show a solution using import attributes (the with {type: "json"} syntax supported from Node 18.20 onwards) while actually exhibiting code that uses import assertions (the assert {type: "json"} syntax added, initially behind a CLI flag, in Node 17.1). I'll edit to fix it; we might as well show the with syntax, since as of Node 22 the old assert syntax no longer works at all.
@MarkAmery, thanks for updating the answer. As far as I remember, the spec had a thorny path to stage 4, which is still to be reached. The draft has changed several times, and at least in the V8 docs assert was/is used, but as of now with is the correct keyword for this feature.
9

Using the fs-extra package is quite simple:

Sync:

const fs = require('fs-extra')

const packageObj = fs.readJsonSync('./package.json')
console.log(packageObj.version) 

Async (inside an async function, since top-level await is not available with require/CommonJS):

const fs = require('fs-extra')

const packageObj = await fs.readJson('./package.json')
console.log(packageObj.version) 

Comments

8

Using node-fs-extra (async/await):

const fs = require('fs-extra');

const readJsonFile = async () => {
    const myJsonObject = await fs.readJson('./my_json_file.json');
    console.log(myJsonObject);
}

readJsonFile() // prints your JSON object
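
Because readJsonFile is async, a failed read or parse surfaces as a rejected promise; a minimal sketch of calling it with error handling instead of the bare call above (the log message is illustrative):

readJsonFile().catch(err => console.error('Could not read JSON file:', err));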

Comments

2

https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfile_file_options_callback

var fs = require('fs');

// Without an encoding, data is a Buffer
fs.readFile('/etc/passwd', (err, data) => {
  if (err) throw err;
  console.log(data);
});

// With an encoding option, data is a string; callback stands for your own (err, data) function
fs.readFile('/etc/passwd', 'utf8', callback);

https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfilesync_file_options

You can find full usage details in the Node.js File System docs. Hope this helps!

Comments

2
const fs = require('fs');

function parseIt() {
    return new Promise(function (resolve, reject) {
        const dirPath = 'K:\\merge-xml-junit\\xml-results\\master.json';
        // Read the file and settle the promise from the callback
        fs.readFile(dirPath, 'utf8', function (err, data) {
            if (err) {
                reject(err);
                return;
            }
            resolve(data);
        });
    });
}

async function test() {
    const jsonData = await parseIt();
    const parsedJSON = JSON.parse(jsonData);
    const testSuite = parsedJSON['testsuites']['testsuite'];
    console.log(testSuite);
}

test();

Comments

1

With ECMAScript modules, the answers above only partly work. For ECMAScript modules we would end up with something like the following.

(Asynchronous, because I run it on my server to obtain data I need before running some functions.)

import { readFile } from 'fs/promises';

function import_json(path){
  return new Promise(async (resolve,reject)=>{
    try{
      const jsonData = await readFile(path, 'utf8');
      console.log(jsonData)
      resolve(JSON.parse(jsonData))
    }catch(e){
      reject("Error from import_json(): " + e)
    }
  })
}

async function main(){
  const my_awesome_data = await import_json("./example_data.json")
}

main()

Comments

0

So many answers, and no one ever made a benchmark to compare sync vs. async vs. require. I described the difference in use cases for reading JSON into memory via require, readFileSync, and readFile here.
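
A rough sketch of how such a comparison could be timed (purely illustrative; ./data.json is a placeholder file, the numbers depend heavily on file size, and require additionally caches its result):

const fs = require('fs');

console.time('require');
const viaRequire = require('./data.json');            // parsed once, then cached
console.timeEnd('require');

console.time('readFileSync');
const viaSync = JSON.parse(fs.readFileSync('./data.json', 'utf8'));
console.timeEnd('readFileSync');

console.time('readFile');
fs.readFile('./data.json', 'utf8', (err, data) => {
  if (err) throw err;
  const viaAsync = JSON.parse(data);
  console.timeEnd('readFile');
});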

Comments

-1

If you are looking for a complete solution for asynchronously loading a JSON file from a relative path, with error handling:

// Require the path module to build a relative path
const path = require('path');
// Require the File System module
const fs = require('fs');

// GET request for the /listUsers page (router is an Express router defined elsewhere)
router.get('/listUsers', function (req, res) {
    console.log("Got a GET request for the list of users");

    // Build a path relative to this file
    const reqPath = path.join(__dirname, '../mock/users.json');

    // Read the JSON file from the relative path
    fs.readFile(reqPath, 'utf8', function (err, data) {
        if (!err) {
            // Success: parse the data if you need it as an object...
            console.log("Success: " + data);
            const jsonObj = JSON.parse(data);
            // ...and send the raw JSON back as the response
            res.end(data);
        } else {
            // Error
            res.end("Error: " + err);
        }
    });
});


Comments
