Reputation: 651
I have 500 folders, each containing a varying number (between 20 and 40) of JSON files. I'm able to extract the contents of each folder manually and individually in Python by following the answers given here and here:
import os, json

path_to_json = 'C:/Users/SomeFolder'
json_files = [pos_json for pos_json in os.listdir(path_to_json) if pos_json.endswith('.json')]

# list all the files in the folder
print(json_files)

for js in json_files:
    with open(os.path.join(path_to_json, js)) as json_file:
        # can only print to screen all the json files - need help saving to a tab-delimited file
        print(json.load(json_file))
However, doing this manually 500 times would be quite laborious and tiresome. A faster, automated approach for extracting the contents of each folder's JSON files into tab-delimited files would be most welcome. Thanks
Upvotes: 0
Views: 259
Reputation: 81
import os, json

# start from the directory this script is located in
current_directory = os.path.dirname(os.path.realpath(__file__))

# os.walk yields (dirpath, dirnames, filenames); x[0] is each folder's path
all_directories = [x[0] for x in os.walk(current_directory)]

for directory in all_directories:
    json_files = [pos_json for pos_json in os.listdir(directory) if pos_json.endswith('.json')]
    # list all the JSON files in this folder
    print(json_files)
    for js in json_files:
        with open(os.path.join(directory, js)) as json_file:
            # print the parsed contents of each JSON file
            print(json.load(json_file))
This will find all folders under the directory where you execute this script, then iterate over them and do the job for each one. You need to copy the script into the directory where you store your folders.
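Since the question asks for tab-delimited output rather than printing, here is a minimal sketch of how the loop could write one tab-delimited file per folder using the standard csv module. It assumes each .json file holds a single flat JSON object (nested values would need flattening first), and the output name contents.tsv is just a placeholder:

import os, json, csv

current_directory = os.path.dirname(os.path.realpath(__file__))

for directory in [x[0] for x in os.walk(current_directory)]:
    json_files = [f for f in os.listdir(directory) if f.endswith('.json')]
    if not json_files:
        continue  # skip folders with no JSON files

    # write one tab-delimited file per folder (contents.tsv is a placeholder name)
    with open(os.path.join(directory, 'contents.tsv'), 'w', newline='') as out_file:
        writer = csv.writer(out_file, delimiter='\t')
        header = None
        for js in json_files:
            with open(os.path.join(directory, js)) as json_file:
                record = json.load(json_file)  # assumes a flat JSON object per file
            if header is None:
                header = list(record.keys())  # take column names from the first file
                writer.writerow(header)
            writer.writerow(record.get(key, '') for key in header)

Column names are taken from the first file in each folder; files whose keys differ will have the missing values written as empty strings.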
Upvotes: 1