
I Need To Build an App That Reads JSON Files

Introduction to JSON

JavaScript Object Notation, referred to as JSON for short, is one of the most popular formats for data storage and data interchange over the internet. The simplicity of the JSON syntax makes it very easy for humans and machines to read and write.

Despite its name, the use of the JSON data format is not limited to JavaScript. Most programming languages implement data structures that you can easily convert to JSON and vice versa.

JavaScript, and therefore the Node.js runtime environment, is no exception. More often than not, this JSON data needs to be read from or written to a file for persistence. The Node runtime environment has the built-in fs module specifically for working with files.

This article is a comprehensive guide on how to use the built-in fs module to read and write data in JSON format. We shall also look at some third-party npm packages that simplify working with data in the JSON format.

Serializing and deserializing JSON

Serialization is the process of converting an object or data structure to a format that is easy to store or transfer over the internet. You can recover the serialized data by applying the reverse process.

Deserialization refers to transforming the serialized data structure back to its original format.

You will almost always need to serialize a JavaScript object to a JSON string in Node. You can do so with the JSON.stringify method before writing it to a storage device or transmitting it over the internet:

const config = { ip: '1234.22.11', port: 3000 };
console.log(JSON.stringify(config));

On the other hand, after reading a JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:

const config = JSON.stringify({ ip: '1234.22.11', port: 3000 });
console.log(JSON.parse(config));

JSON.stringify and JSON.parse are globally available methods in Node. You don't need to install or require them before using them.

Introduction to the fs module

Because the fs module is built in, you don't need to install it. It provides functions that you can use to read and write data in JSON format, and much more.

Each function exposed by the fs module has a synchronous, callback, and promise-based form. The synchronous and callback variants of a method are accessible from the synchronous and callback API. The promise-based variant of a function is accessible from the promise-based API.

Synchronous API

The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. More often than not, blocking the event loop is not something you want to do.

The names of all synchronous functions end with the Sync suffix. For example, writeFileSync and readFileSync are both synchronous functions.

You can access the synchronous API by requiring fs:

const fs = require('fs');

// Blocks the event loop
fs.readFileSync(path, options);

Callback API

Unlike the synchronous methods that block the event loop, the corresponding methods of the callback API are asynchronous. You'll pass a callback function to the method as the last argument.

The callback function is invoked with an Error object as the first argument if an error occurs. The rest of the arguments to the callback function depend on the fs method.

You can also access the methods of the callback API by requiring fs, just like the synchronous API:

const fs = require('fs');

fs.readFile(path, options, callback);

Promise-based API

The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async/await.

You can access the promise-based API by requiring fs/promises:

const fs = require('fs/promises');

fs.readFile(path)
  .then((data) => {
    // Do something with the data
  })
  .catch((error) => {
    // Do something if there is an error
  });
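Since the promise-based API also works with async/await, below is a minimal sketch of the same read using async/await; the readConfig wrapper function and the './config.json' path are just illustrative:

const fs = require('fs/promises');

// A minimal async/await sketch; readConfig and './config.json' are illustrative
async function readConfig() {
  try {
    const data = await fs.readFile('./config.json', 'utf8');
    console.log(JSON.parse(data));
  } catch (error) {
    console.log(error);
  }
}

readConfig();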

We used the CommonJS syntax for accessing the modules in the code snippets above. We shall be using the CommonJS syntax throughout this article because Node treats JavaScript code as a CommonJS module by default. You can also use ES6 modules if you want.
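If you prefer ES6 modules, the equivalent imports would look roughly like the sketch below, assuming your project sets "type": "module" in package.json or uses the .mjs extension:

// ES module equivalents of the require calls above
import fs from 'fs';                        // synchronous and callback API
import * as fsPromises from 'fs/promises';  // promise-based API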

According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.

How to read JSON files in Node.js

The Node runtime environment has the built-in require function and the fs module that you can use for loading or reading JSON files. Because require is globally available, you don't need to require it.

However, you will need to require the fs module before using it. I will discuss how to read JSON files using the built-in fs module and the require function in the following subsections.

How to load a JSON file using the global require function

You can use the global require function to synchronously load JSON files in Node. After loading a file using require, it is cached. Therefore, loading the file again using require will load the cached version. In a server environment, the file will be loaded again only on the next server restart.

It is therefore advisable to use require for loading static JSON files such as configuration files that do not change often. Do not use require if the JSON file you load keeps changing, because it will cache the loaded file and use the cached version if you require the same file again. Your latest changes will not be reflected.
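As a quick sketch of this caching behavior, requiring the same file twice returns the exact same object in memory, so any changes made to the file on disk in between will not appear:

// Both calls return the same cached object,
// even if the file on disk changes between them
const firstLoad = require('./config.json');
const secondLoad = require('./config.json');

console.log(firstLoad === secondLoad); // true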

Assuming you have a config.json file with the following content:

{
    "port": "3000",
    "ip": "127.00.12.3"
}

You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:

const config = require('./config.json');
console.log(config);

How to read a JSON file using the fs.readFile method

You can use the readFile method to read JSON files. It asynchronously reads the contents of the entire file into memory, so it is not the most optimal method for reading large JSON files.

The readFile method takes three arguments. The code snippet below shows its function signature:

fs.readFile(path, options, callback);        

The first argument, path, is the file name or the file descriptor. The second is an optional object argument, and the third is a callback function. You can also pass a string as the second argument instead of an object. If you pass a string, it has to be the encoding.

The callback function takes two arguments. The first argument is the error object if an error occurs, and the second is the serialized JSON data.
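As a short sketch of the options-object form described above, you can pass the encoding (and other options such as the file system flag) in an object instead of a bare encoding string; the main example that follows uses the simpler encoding-string form:

const fs = require('fs');

// Passing an options object instead of an encoding string
fs.readFile('./config.json', { encoding: 'utf8', flag: 'r' }, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log(JSON.parse(data));
});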

The code snippet below will read the JSON data in the config.json file and log it to the terminal:

const fs = require('fs');

fs.readFile('./config.json', 'utf8', (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log(JSON.parse(data));
});

Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.

How to read a JSON file using the fs.readFileSync method

readFileSync is another built-in method for reading files in Node, similar to readFile. The difference between the two is that readFile reads the file asynchronously while readFileSync reads the file synchronously. Therefore, readFileSync blocks the event loop and the execution of the remaining code until all the data has been read.

To grasp the difference between synchronous and asynchronous code, you can read the article "Understanding asynchronous JavaScript" here.

Below is the function signature of fs.readFileSync:

fs.readFileSync(path, options);        

path is the path to the JSON file you want to read, and you can pass an options object as the second argument. The second argument is optional.

In the code snippet below, we are reading JSON data from the config.json file using readFileSync:

const { readFileSync } = require('fs');

const data = readFileSync('./config.json');
console.log(JSON.parse(data));

How to write to JSON files in Node.js

Just like reading JSON files, the fs module provides built-in methods for writing to JSON files.

You can use the writeFile and writeFileSync methods of the fs module. The difference between the two is that writeFile is asynchronous while writeFileSync is synchronous. Before writing to a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method.

How to write to JSON files using the fs.writeFile method

JSON.stringify will format your JSON data on a single line if you do not pass the optional formatting argument specifying how to indent your JSON data.
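For illustration, the short sketch below compares the compact single-line output with the output produced by passing null and an indentation of two spaces:

const config = { ip: '192.0.2.1', port: 3000 };

// Compact, single-line output
console.log(JSON.stringify(config));
// {"ip":"192.0.2.1","port":3000}

// Pretty-printed output with two-space indentation
console.log(JSON.stringify(config, null, 2));
// {
//   "ip": "192.0.2.1",
//   "port": 3000
// }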

If the path you pass to the writeFile method is for an existing JSON file, the method will overwrite the data in the specified file. It will create a new file if the file does not exist:

const { writeFile } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

writeFile(path, JSON.stringify(config, null, 2), (error) => {
  if (error) {
    console.log('An error has occurred ', error);
    return;
  }
  console.log('Data written successfully to disk');
});

How to write to JSON files using the fs.writeFileSync method

Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the execution of the event loop and the rest of the code until the operation succeeds or an error occurs. It will create a new file if the path you pass doesn't exist, and overwrite it if it does.

In the code snippet below, we are writing to the config.json file. We are wrapping the code in a try-catch block so that we can catch any errors:

const { writeFileSync } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

try {
  writeFileSync(path, JSON.stringify(config, null, 2), 'utf8');
  console.log('Data successfully saved to disk');
} catch (error) {
  console.log('An error has occurred ', error);
}

How to append a JSON file

Node doesn't have a built-in function for appending to or updating fields of an existing JSON file out of the box. You can, however, read the JSON file using the readFile method of the fs module, update it, and overwrite the JSON file with the updated JSON.

Below is a code snippet illustrating how to go about it:

const { writeFile, readFile } = require('fs');
const path = './config.json';

readFile(path, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  writeFile(path, JSON.stringify(parsedData, null, 2), (err) => {
    if (err) {
      console.log('Failed to write updated data to file');
      return;
    }
    console.log('Updated file successfully');
  });
});
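If you prefer the promise-based API, a minimal sketch of the same read-update-write flow using fs/promises and async/await could look like this (the updateConfig wrapper is just illustrative):

const { readFile, writeFile } = require('fs/promises');
const path = './config.json';

// Read the file, update a field, and write it back
async function updateConfig() {
  try {
    const data = await readFile(path, 'utf8');
    const parsedData = JSON.parse(data);
    parsedData.createdAt = new Date().toISOString();
    await writeFile(path, JSON.stringify(parsedData, null, 2));
    console.log('Updated file successfully');
  } catch (error) {
    console.log(error);
  }
}

updateConfig();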

How to read and write to JSON files using third-party npm packages

In this section, we shall look at the most popular third-party Node packages for reading and writing data in JSON format.

How to use the jsonfile npm package for reading and writing JSON files

jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the command below:

npm i jsonfile        

It is similar to the readFile and writeFile methods of the fs module, though jsonfile has some advantages over the built-in methods.

Some of the features of this package are as follows:

  • It serializes and deserializes JSON out of the box
  • It has a built-in utility for appending data to a JSON file
  • Supports promise chaining

You can see the jsonfile package in action in the code snippet below:

const jsonfile = require('jsonfile');
const path = './config.json';

jsonfile.readFile(path, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(data);
});

You can also use promise chaining instead of passing a callback function like in the above example:

const jsonfile = require('jsonfile');
const path = './config.json';

jsonfile
  .readFile(path)
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
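To round out the section, below is a minimal sketch of writing with jsonfile's writeFile method, reusing the config object from the earlier examples; the spaces option for indentation is based on the package's documented options:

const jsonfile = require('jsonfile');
const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// jsonfile serializes the object for you; spaces controls indentation
jsonfile
  .writeFile(path, config, { spaces: 2 })
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((err) => {
    console.log(err);
  });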

How to use the fs-extra npm package for reading and writing JSON files

fs-extra is another popular Node package you can use to work with files. Though you can use this package for managing JSON files, it has methods whose functions extend beyond just reading and writing JSON files.

As its name suggests, fs-extra has all the functionality provided by the fs module and more. According to the documentation, you can use the fs-extra package instead of the fs module.

You need to first install fs-extra from npm before using it:

npm install fs-extra

The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await:

const fsExtra = require('fs-extra');
const path = './config.json';

// Using a callback
fsExtra.readJson(path, (error, config) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log(config);
});

// Using promise chaining
fsExtra
  .readJson(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function readJsonData() {
  try {
    const config = await fsExtra.readJson(path);
    console.log(config);
  } catch (error) {
    console.log(error);
  }
}
readJsonData();

The code below illustrates how you can write JSON data using the writeJson method:

const { writeJson } = require('fs-extra');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Using a callback
writeJson(path, config, (error) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log('Data written to file successfully');
});

// Using promise chaining
writeJson(path, config)
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function writeJsonData() {
  try {
    await writeJson(path, config);
    console.log('Data written to file successfully');
  } catch (error) {
    console.log(error);
  }
}
writeJsonData();

Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don't need to stringify your JavaScript object before writing it to a JSON file.

Similarly, you don't need to parse the JSON string to a JavaScript object after reading a JSON file. The module does it for you out of the box.
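As a brief sketch of the synchronous variants mentioned above, fs-extra also exposes readJsonSync and writeJsonSync; the spaces option shown here for indentation is taken from the package's documentation:

const { readJsonSync, writeJsonSync } = require('fs-extra');
const path = './config.json';

// Synchronous write; the object is serialized for you
writeJsonSync(path, { ip: '192.0.2.1', port: 3000 }, { spaces: 2 });

// Synchronous read; the JSON is parsed for you
const config = readJsonSync(path);
console.log(config);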

How to use the bfj npm package for reading and writing JSON files

bfj is another npm package you can use for handling data in JSON format. According to the documentation, it was created for managing large JSON datasets.

bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets. – bfj documentation

You can read JSON data using the read method. The read method is asynchronous, and it returns a promise.

Assuming you have a config.json file, you can use the following code to read it:

const bfj = require('bfj');
const path = './config.json';

bfj
  .read(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

Similarly, you can use the write method to write data to a JSON file:

const bfj = require('bfj');
const path = './config.json';

const config = { ip: '192.0.2.1', port: 3000 };

bfj
  .write(path, config)
  .then(() => {
    console.log('Data has been successfully written to disk');
  })
  .catch((error) => {
    console.log(error);
  });

bfj has lots of functions that you can read about in the documentation. It was created purposely for handling large JSON data. It is also slow, so you should use it only if you are handling relatively large JSON datasets.

Conclusion

As explained in the sections above, JSON is one of the most popular formats for data exchange over the internet.

The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods that you can use to read and write JSON files using the callback API, promise-based API, or synchronous API.

Because the methods of the callback API are more performant than those of the promise-based API, as highlighted in the documentation, you are better off using the callback API.

In addition to the built-in fs module, several popular third-party packages such as jsonfile, fs-extra, and bfj exist. They have additional utility functions that make working with JSON files a breeze. On the flip side, you should evaluate the limitations of adding third-party packages to your application.



Source: https://blog.logrocket.com/reading-writing-json-files-nodejs-complete-tutorial/