srb633

Reputation: 813

Read all JSON files contained in a dynamically updated folder

I've got multiple JSON files contained within a directory that will be dynamically updated by users. The users can add categories, which creates new JSON files in that directory, and they can also remove categories, which deletes JSON files from it. I'm looking for a way to read all the JSON files contained in that directory and push them into a single object array. I imagine doing this asynchronously would be desirable too.

I'm very new to using fs. I've managed to read a single JSON file by path using

const fs = require('fs');

let data = fs.readFileSync('./sw_lbi/categories/category1.json');
let categories = JSON.parse(data);
console.log(categories);

But of course this is still synchronous and only reads a single, known file, much like using require() would.

As I'll have no idea what JSON files the directory will contain (the users also name them), I'll need a way to read all the JSON files simply by pointing at the directory that contains them.

I'm imagining something like this (which is obviously foolish):

const fs = require('fs');

let data = fs.readFileSync('./sw_lbi/categories');
let categories = JSON.parse(data);
console.log(categories);

What would be the best approach to achieve this?

Thanks in advance.

Upvotes: 5

Views: 17369

Answers (1)

Damian Plewa

Reputation: 1193

First of all, you need to scan the directory for files; next, filter them down to just the JSON files; and at the end, read and parse each file and do whatever you need with the result:

const fs = require('fs');
const path = require('path');

const dir = './sw_lbi/categories';

// Keep only the files with a .json extension.
const jsonsInDir = fs.readdirSync(dir).filter(file => path.extname(file) === '.json');

// Parse each file and collect the results into a single array.
const categories = jsonsInDir.map(file => {
  const fileData = fs.readFileSync(path.join(dir, file));
  return JSON.parse(fileData.toString());
});

console.log(categories);

Upvotes: 18
