Read/Write JSON Files with Node.js (2024)

Goal

Say you have a customer.json file saved to disk that holds a record for a customer in your store.

As part of your store app, you want to access the customer’s address, and then update the order count after an order is placed.

In this tutorial, we are going to look at how to read and write to our customer.json file.

{ "name": "Mega Corp.", "order_count": 83, "address": "Infinity Loop Drive"}

Prerequisites

  • None.


Interact with files with fs

Accessing files in Node is done with the native module fs, which gives you functions to watch, read, and write files along with many other tools to work with the filesystem. Because it’s a native module, we can require it in our code without installing it. Just call const fs = require('fs').

The fs module gives us the option of synchronous or asynchronous versions of many of its functions. The synchronous versions block execution of other code until they are done accessing the filesystem, reading, or writing data. An async function will run without blocking other code. Learn more about sync/async behavior.

This synchronous behavior can be useful in some places, like at startup when reading a config file before any other code is run, but becomes a big issue when used in a webserver where all incoming requests would be blocked while a synchronous file read is running. For this reason, you generally want to use the async versions of fs functions in your code. We will focus on async operations, but will also show the synchronous equivalent.
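To see the difference, here is a minimal sketch (data.txt is just a hypothetical example file) showing that the synchronous read blocks until it finishes, while the asynchronous read lets the following line run first:

const fs = require("fs");

// Synchronous: nothing else runs until the file has been read.
const syncData = fs.readFileSync("./data.txt", "utf8");
console.log("sync read finished");

// Asynchronous: the callback runs later, after other code has continued.
fs.readFile("./data.txt", "utf8", (err, asyncData) => {
  if (err) throw err;
  console.log("async read finished");
});

console.log("this line logs before the async callback runs");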

To read and write files asynchronously with fs we will use fs.readFile and fs.writeFile.

We also will use the global JSON helper to convert objects to JSON strings, and JSON strings to objects.
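As a quick refresher, JSON.stringify and JSON.parse are opposites of each other: one turns an object into a JSON string, the other turns a JSON string back into a new object.

const customer = { name: "Mega Corp.", order_count: 83 };

const jsonString = JSON.stringify(customer); // object -> JSON string
const copy = JSON.parse(jsonString);         // JSON string -> new object

console.log(jsonString); // => '{"name":"Mega Corp.","order_count":83}'
console.log(copy.name);  // => "Mega Corp."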

Read a JSON file

The simplest way to read a JSON file is to require it. Calling require() with the path to a JSON file will synchronously read and parse the data into a JavaScript object.

const config = require("./config.json");

But reading JSON files with require has its downsides. The file will only be read once; requiring it again returns the cached data from the first time require was run. This is fine for loading static data on startup (like config data). But for reading a file that changes on disk, like our customer.json might, we need to manually read the file using the asynchronous fs.readFile.
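For example, requiring the same file twice returns the exact same cached object, even if the file has changed on disk in between:

const first = require("./config.json");
const second = require("./config.json"); // served from require's cache, not re-read

console.log(first === second); // => true, both point to the same object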

Read a file with fs.readFile

To access the customer’s address, we need to:

  • Read the JSON data from the file
  • Parse the JSON string into a JavaScript object

To load the data from the customer.json file, we will use fs.readFile, passing it the path to our file, an optional encoding type, and a callback to receive the file data.

If the file is successfully read, the contents will be passed to the callback.

const fs = require("fs");fs.readFile("./customer.json", "utf8", (err, jsonString) => { if (err) { console.log("File read failed:", err); return; } console.log("File data:", jsonString);});
  • ./customer.json is the relative path to the file.
  • utf8 is an optional parameter for the encoding of the file we are reading; it can be left out. If not specified, the function will return a Buffer instead of a string.
  • (err, jsonString) => {} is the callback function that runs after the file has been read.

Now we have the contents of the file as a JSON string, but we need to turn the string into an object.

Before we can use the data from the callback in our code, we must turn it into an object; otherwise, we would just have a string of data with properties we can’t access. JSON.parse takes JSON data as input and returns a new JavaScript object.

JSON.parse can throw exceptions and crash our program if it is passed an invalid JSON string. To prevent crashes, we wrap JSON.parse in a try/catch statement to gracefully handle any errors.

This example shows reading and parsing a JSON file:

const fs = require("fs");fs.readFile("./customer.json", "utf8", (err, jsonString) => { if (err) { console.log("Error reading file from disk:", err); return; } try { const customer = JSON.parse(jsonString); console.log("Customer address is:", customer.address); // => "Customer address is: Infinity Loop Drive" } catch (err) { console.log("Error parsing JSON string:", err); }});

Using the jsonString from reading customer.json, we create an object, and can access the address property. If JSON.parse throws an error, we handle it in the catch block.

Now we have an object representation of the data in our customer.json file!

We can also read the file synchronously using fs.readFileSync. Instead of taking a callback, readFileSync returns the file content after reading the file.

const fs = require("fs");

let customer;
try {
  // Note that jsonString will be a <Buffer> since we did not specify an
  // encoding type for the file. But it'll still work because JSON.parse()
  // will use <Buffer>.toString().
  const jsonString = fs.readFileSync("./customer.json");
  customer = JSON.parse(jsonString);
} catch (err) {
  console.log(err);
  return;
}

console.log(customer.address);
// => "Infinity Loop Drive"

We can use this knowledge to create a reusable helper function to read and parse a JSON file. Here we create a function called jsonReader that will read and parse a JSON file for us. It takes the path to the file and a callback to receive the parsed object and any errors. It catches any errors thrown by JSON.parse for us.

const fs = require("fs");function jsonReader(filePath, cb) { fs.readFile(filePath, (err, fileData) => { if (err) { return cb && cb(err); } try { const object = JSON.parse(fileData); return cb && cb(null, object); } catch (err) { return cb && cb(err); } });}jsonReader("./customer.json", (err, customer) => { if (err) { console.log(err); return; } console.log(customer.address); // => "Infinity Loop Drive"});

A note about file encoding

Both fs.readFileSync and fs.readFile take an optional encoding argument. If you specify a character encoding you'll get a string in return. If you do not specify a character encoding both functions will return a Buffer.

This is because Node does not, and cannot, assume what kind of content a file contains, even if you know. To handle this, Node reads the file byte by byte and returns it as an un-opinionated Buffer, which you can process however you like.

If you do know the content of the file, providing that detail to Node in the form of an encoding argument generally makes the code both more performant and easier to understand.
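A quick sketch of the difference:

const fs = require("fs");

const asBuffer = fs.readFileSync("./customer.json");        // no encoding -> Buffer
const asString = fs.readFileSync("./customer.json", "utf8"); // encoding -> string

console.log(Buffer.isBuffer(asBuffer)); // => true
console.log(typeof asString);           // => "string"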

Write to a file with fs.writeFile

Writing JSON to the filesystem is similar to reading it. We will use fs.writeFile to asynchronously write data to a newCustomer.json file.

First, to write data to a JSON file, we must create a JSON string of the data with JSON.stringify. This returns a JSON string representation of a JavaScript object, which can be written to a file. Similar to parsing data into an object when reading a file, we must turn our data into a string to be able to write it to a file.

Create a customer object with our data below, and turn it into a string.

const customer = {
  name: "Newbie Co.",
  order_count: 0,
  address: "Po Box City"
};
const jsonString = JSON.stringify(customer);
console.log(jsonString);
// => '{"name":"Newbie Co.","order_count":0,"address":"Po Box City"}'

Note: If you try to write an object to a file without stringifying it first, the file won't contain your data. Depending on your Node version, the write will either fail with a TypeError or the file will simply contain:

[object Object]

Once the data is stringified, we can use fs.writeFile to create a new customer file. We pass fs.writeFile the file path, our customer data to write, and a callback that will be executed after the file is written. If the newCustomer.json file doesn’t already exist, it will be created; if it does exist, it will be overwritten!

Here is an example of writing a JSON file with fs.writeFile:

const fs = require("fs");

const customer = {
  name: "Newbie Co.",
  order_count: 0,
  address: "Po Box City",
};
const jsonString = JSON.stringify(customer);

fs.writeFile("./newCustomer.json", jsonString, (err) => {
  if (err) {
    console.log("Error writing file", err);
  } else {
    console.log("Successfully wrote file");
  }
});

And that’s it! Once the callback runs, the file has been written to disk. Note: we are only passed an error object; the file data we wrote isn’t passed to the callback.

We can also write a file synchronously in the same way using fs.writeFileSync:

const jsonString = JSON.stringify(customer);
fs.writeFileSync("./newCustomer.json", jsonString);

After your file is finished writing, it will look something like this:

{ "name": "Newbie Co.", "address": "Po Box City", "order_count": 0 }

Stringifying by default puts your data all on a single line. Optionally, you can make the output file human-readable by passing the number of spaces to indent by to JSON.stringify:

const jsonString = JSON.stringify(customer, null, 2);

Above, we told stringify to indent the data with 2 spaces. Now your output file should look like this:

{ "name": "Newbie Co.", "address": "Po Box City", "order_count": 0}

Update a JSON file

Now that we are able to read and write our customer files, we can use them as a simple kind of database. If we want to update the data in the JSON file, we can read the contents, change the data, and then write the new data back to the file:

jsonReader("./customer.json", (err, customer) => { if (err) { console.log("Error reading file:", err); return; } // increase customer order count by 1 customer.order_count += 1; fs.writeFile("./customer.json", JSON.stringify(customer), err => { if (err) console.log("Error writing file:", err); });});

Definitely not the most efficient database you could choose, but working with JSON files like this is a simple way to persist data in your project.
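For a one-off script where blocking is acceptable, the same read-modify-write cycle could also be done with the synchronous functions; a minimal sketch:

const fs = require("fs");

// Read, update, and rewrite customer.json in one blocking pass.
const customer = JSON.parse(fs.readFileSync("./customer.json", "utf8"));
customer.order_count += 1;
fs.writeFileSync("./customer.json", JSON.stringify(customer, null, 2)); // pretty-printed with 2-space indent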

Recap

JSON is one of the most common types of data you’ll work with in Node, and being able to read and write JSON files is very useful. You’ve learned how to use fs.readFile and fs.writeFile to asynchronously work with the filesystem, as well as how to parse data to and from JSON format, and catch errors from JSON.parse.

You can use require to synchronously read and parse a JSON file in a single line at startup. And now you can use a simple JSON file as a data store.

If you want to learn more, you can read up on what JSON actually is, and find out more about synchronous vs asynchronous code.

Further your understanding

  • Learn about using Node.js Streams to read really large files.
  • Walk through this example with a co-worker. Are all the concepts clear to you? Do you need to review anything?
