
Write / add data to a JSON file using Node.js

javascript
file-handling
error-handling
asynchronous-operations
by Nikita Barsukov · Mar 14, 2025
TLDR

To add data to a JSON file with Node.js using fs, read the file, update the parsed object, and write it back:

const fs = require('fs');

// Read current data and don't vanish like Houdini if it fails
fs.readFile('data.json', 'utf8', (err, data) => {
  if (err) throw err;
  // Parse data, cause we're not in the Stone Age
  let jsonData = JSON.parse(data);
  // Add a new key to your jsonData, cause why not?
  jsonData.newKey = 'newValue';
  // Save the princess, I mean, store it back to the file
  fs.writeFile('data.json', JSON.stringify(jsonData, null, 2), (err) => {
    if (err) throw err;
  });
});
  • Fetch your data with fs.readFile()
  • Modify the parsed object
  • Secure it back with fs.writeFile()
  • JSON.parse() and JSON.stringify() are your wingmen in JSON handling.

Checking for file existence and error handling

Because you won't jump off a cliff without checking the landing, let's verify the JSON file's existence and handle any nasty surprises that could cause data loss:

const fs = require('fs');
const filePath = 'data.json';

fs.stat(filePath, (err, stats) => {
  if (err && err.code === 'ENOENT') {
    // Same principle as Schrödinger's cat. Does our file exist or not?
    fs.writeFile(filePath, '{}', (err) => {
      if (err) throw err;
      console.log('Created new file. No Schrödinger mess here:', filePath);
    });
  } else if (err) {
    console.error('Hold up, Houston, we have a problem:', err);
  } else if (stats.isFile()) {
    // Our file exists, and it's as real as unicorns.
    fs.readFile(filePath, 'utf8', (err, data) => {
      if (err) throw err;
      let jsonData = JSON.parse(data);
      // Time to perform a magic trick on jsonData
      // ... Your magic trick code ...
      fs.writeFile(filePath, JSON.stringify(jsonData, null, 2), (err) => {
        if (err) throw err;
        console.log('Data written to file like the last cookie in the jar:', filePath);
      });
    });
  }
});
  • Use fs.stat() to check file presence
  • Handle errors with console.error to provide contextual diagnostics

Tackling large datasets

Handling large amounts of data through a single JSON file is like eating soup with a fork: every update means reading, parsing, and re-serializing the whole thing. When dealing with Moby Dick-size datasets, consider the database lifebuoy:

  • Databases like MongoDB or SQLite can store your growing data and still offer powerful querying (see the sketch after this list).
  • Although a shift to a database takes some up-front effort, the querying and concurrency benefits make it worth the extra mile.
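For a taste of what that shift looks like, here's a minimal sketch using the sqlite3 npm package (an assumption: it's installed via npm install sqlite3; the entries table and its key/value columns are made up for illustration):

const sqlite3 = require('sqlite3').verbose();
const db = new sqlite3.Database('data.db'); // hypothetical database file

db.serialize(() => {
  // A made-up key/value table standing in for your JSON structure
  db.run('CREATE TABLE IF NOT EXISTS entries (key TEXT PRIMARY KEY, value TEXT)');
  // Adding data is a single statement, no read-modify-rewrite of the whole file
  db.run('INSERT OR REPLACE INTO entries (key, value) VALUES (?, ?)', ['newKey', 'newValue']);
  // And querying doesn't require parsing everything into memory
  db.each('SELECT key, value FROM entries', (err, row) => {
    if (err) throw err;
    console.log(row.key, '=>', row.value);
  });
});

db.close();

The win over a flat JSON file: each insert touches only the rows involved instead of re-serializing the entire dataset.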

Synchronous vs. asynchronous operations

Selecting between synchronous and asynchronous operations is like choosing between a cheeseburger and a salad; both are good, depending on your appetite and health goals:

  • Synchronous file operations, using fs.readFileSync and fs.writeFileSync, hold up the line until they finish. They're simpler, but they block the event loop, so a busy server can't serve anything else while they run.
  • Asynchronous file operations, using fs.readFile and fs.writeFile, let the rest of your program keep moving. Dressing these with promises or async/await can turn your script into a haute couture masterpiece (see the sketch after this list).
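As a rough sketch, here's the TLDR example redressed with the promise-based fs API (available as fs/promises since Node 14); the addKey helper name is just for illustration:

const fsp = require('fs/promises');

// Hypothetical helper: read, modify, and rewrite a JSON file without blocking
async function addKey(filePath, key, value) {
  const raw = await fsp.readFile(filePath, 'utf8'); // awaits, but the event loop keeps running
  const jsonData = JSON.parse(raw);
  jsonData[key] = value;
  await fsp.writeFile(filePath, JSON.stringify(jsonData, null, 2));
}

addKey('data.json', 'newKey', 'newValue').catch(console.error);

Same behavior as the callback version, but error handling collapses into a single .catch (or a try/catch inside the function).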

Readable JSON for debugging

Want your JSON to be prettier than your neighbor's yard? Formatting can make your data easier to read and debug:

let prettierJson = JSON.stringify(jsonData, null, 4);

fs.writeFile('data.json', prettierJson, (err) => {
  if (err) throw err;
  console.log('JSON data is now as lookable as a sunset!');
});
  • Make your JSON string handsome by passing null and 4 to JSON.stringify(): the third argument sets the indentation width, and null skips the replacer function. That's one small step for data, one giant leap for readability.

Managing data updates

Use timestamped filenames or versioning mechanisms to keep track of data modifications. Because trust me, you don't want to search for a needle in a haystack.

  • Naming files with the current date and time makes each one unique, like a fingerprint.
  • Make sure to back up your JSON files before each data addition (a minimal sketch follows this list). Because "Oops" isn't a great data recovery plan.
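Here's one minimal sketch of that backup habit, built around a hypothetical backupJsonFile helper using path.parse and fs.copyFileSync:

const fs = require('fs');
const path = require('path');

// Hypothetical helper: copy the file to a timestamped sibling before you touch it
function backupJsonFile(filePath) {
  // ISO timestamp with ':' and '.' swapped out so it's filename-safe
  const stamp = new Date().toISOString().replace(/[:.]/g, '-');
  const { dir, name, ext } = path.parse(filePath);
  const backupPath = path.join(dir, `${name}-${stamp}${ext}`);
  fs.copyFileSync(filePath, backupPath);
  return backupPath;
}

// e.g. data.json -> data-2025-03-14T09-30-00-000Z.json (illustrative name)
backupJsonFile('data.json');

Call it right before each fs.writeFile, and every "Oops" has a restore point.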