4m1nh4j1

Reputation: 4356

Export dicts to a json file

I have a list of dicts j, and I would like to export some of the dicts to a JSON file called myFile.json:

for item in j:
    if item['start'] == "True":
        # If myFile.json exists, append item to myFile.json
        # Otherwise create myFile.json that starts with "[" and append item to it
# Append "]" to myFile.json

I can do it using try, but I would like to know if there is a more Pythonic way to do it.

My code isn't even worth posting, but here it is:

try:
    with io.open('myFile.json', 'a', encoding='utf-8') as f:
        f.write(unicode(json.dumps(item, ensure_ascii=False)))
        f.write(u",")
except IOError:
    with io.open('myFile.json', 'w', encoding='utf-8') as f:
        f.write(u"[")
    with io.open('myFile.json', 'a', encoding='utf-8') as f:
        f.write(unicode(json.dumps(item, ensure_ascii=False)))
        f.write(u",")

        # ..etc

Edit: The output file should be a JSON array:

[ {"key1":"value1","key2":"value2"},{"key1":"value3","key2":"value4"}]

Upvotes: 2

Views: 498

Answers (1)

Martijn Pieters

Reputation: 1121924

Your approach has a severe flaw: if you write the opening [ first, you also have to add a , comma after each JSON value you write and append the closing ], which means either removing that last ] every time you append to the file, or manually adding the closing ] when reading, before decoding.
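
For comparison, here is a minimal sketch of the bookkeeping that keeping a valid JSON array on disk would require (the helper name append_to_array_file is my own, not something from your code):

import io
import json
import os

def append_to_array_file(path, item):
    # Keep the file a valid JSON array at all times: the closing ] has to be
    # stripped before every append and written back afterwards.
    dumped = unicode(json.dumps(item, ensure_ascii=False))
    if not os.path.exists(path):
        with io.open(path, 'w', encoding='utf-8') as f:
            f.write(u'[' + dumped + u']')
        return
    with open(path, 'rb+') as f:
        f.seek(-1, os.SEEK_END)  # position just before the closing ]
        f.truncate()             # drop it
    with io.open(path, 'a', encoding='utf-8') as f:
        f.write(u',' + dumped + u']')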

You'd be better off not trying to build a big JSON list, and instead using newlines as delimiters. That way you can freely append, and by reading your file line by line you can easily load the data again.

This has the added advantage of vastly simplifying your code:

with io.open('myFile.json', 'a', encoding='utf-8') as f:
    f.write(unicode(json.dumps(item, ensure_ascii=False)))
    f.write(u'\n')
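
Plugged into your loop, that would look something like this (a sketch; opening the file once outside the loop would avoid reopening it for every item):

for item in j:
    if item['start'] == "True":
        with io.open('myFile.json', 'a', encoding='utf-8') as f:
            f.write(unicode(json.dumps(item, ensure_ascii=False)))
            f.write(u'\n')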

This removes the need to test for the file existing first. Reading is as simple as:

with open('myFile.json') as f:
    for line in f:
        data = json.loads(line)
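
And if you later need the single JSON array from your edit, you can build it from the line-delimited file in one pass (a sketch; myFile_array.json is just a placeholder name):

with open('myFile.json') as f:
    items = [json.loads(line) for line in f]

with io.open('myFile_array.json', 'w', encoding='utf-8') as out:
    out.write(unicode(json.dumps(items, ensure_ascii=False)))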

Upvotes: 4
