Ivy Knm

Reputation: 33

Is it possible to merge multiple JSON files with their parent?

I have 100 JSON files that I need to merge into 1 JSON file. Basically, I want to combine the arrays from all 100 files and append them into a single file.

Each file has the same structure, as follows:

[
    {
        "id": "1",
        "title": "Student",
        "children": [
            {
                "ID": "111",
                "Name": "John",
                "Pattern": "DA0"
            },
            {
                "ID": "222",
                "Name": "Tom",
                "Pattern": "DA0"
            }
        ]
    }
]

I have the following code to achieve this, but it throws an error. Please have a look:

import glob
import json 

read_files = glob.glob("*.json")
output_list = []

with open(read_files, 'w', encoding='utf-8') as jsonf:
    for f in read_files:
        with open(f, "rb") as infile:
            output_list.append(json.load(infile))

all_items = []
for json_file in output_list:
    all_items += json_file['items']

textfile_merged = open('merged.json', 'w')
json.dump({ "items": all_items }, textfile_merged)
textfile_merged.close()

The error message:

Traceback (most recent call last):
  File "combine.py", line 10, in <module>
    with open(read_files, 'w', encoding='utf-8') as jsonf:
TypeError: expected str, bytes or os.PathLike object, not list

Upvotes: 0

Views: 574

Answers (1)

j_b

Reputation: 2020

`glob.glob("*.json")` returns a list of path names, per the Python documentation. So your line `with open(read_files, 'w', encoding='utf-8') as jsonf:` passes a list where `open` expects a single path, which is exactly the `TypeError` you're seeing. That outer `with open(...)` isn't needed at all for reading the input files.

Try something like:

import glob
import json

read_files = glob.glob("*.json")
output_list = []

for f in read_files:
    with open(f, "rb") as infile:
        output_list.append(json.load(infile))

# rest of your code 
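One more thing to watch out for in the rest of your code: since each of your files is a top-level JSON array (per your sample), `json.load` gives you a `list`, so `json_file['items']` will raise a `TypeError` as well. You can just concatenate the lists directly. A minimal end-to-end sketch (the helper name `merge_json_files` is my own, and I'm assuming every input file is a top-level array like your example):

```python
import glob
import json

def merge_json_files(paths):
    """Concatenate the top-level JSON arrays from several files into one list."""
    merged = []
    for path in paths:
        with open(path, "r", encoding="utf-8") as infile:
            merged += json.load(infile)  # each file parses to a list
    return merged

if __name__ == "__main__":
    # Exclude the output file so a re-run doesn't merge merged.json into itself.
    files = [p for p in glob.glob("*.json") if p != "merged.json"]
    with open("merged.json", "w", encoding="utf-8") as outfile:
        json.dump(merge_json_files(files), outfile, indent=4)
```

This writes a single `merged.json` whose top level is one array containing all the objects from the input files, which matches the "put `[]` around 100 files" goal in the question.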

Upvotes: 1

Related Questions