Reputation: 19375
I am currently writing JSON files to disk using
print('writing to disk .... ')
f = open('mypath/myfile', 'wb')
f.write(getjsondata.read())
f.close()
This works perfectly, except that the JSON files are very large and I would like to compress them. How can I do that automatically?
Thanks!
Upvotes: 1
Views: 2483
Reputation: 22457
Python has a standard zlib module, which can compress and decompress data for you. You can use it directly on your data and write (and read) a custom format (see the zlib sketch at the end of this answer), or use the gzip module, which wraps zlib to read and write gzip-compatible files while automatically compressing or decompressing the data so that it looks like an ordinary file object. It thus neatly replaces the default open call for interacting with files, and all you need is this:
import gzip

print('writing to disk .... ')
# gzip.open returns a file-like object that compresses the data transparently as it is written
with gzip.open('mypath/myfile', 'wb') as f:
    f.write(getjsondata.read())
(with a change to the open line, because I highly recommend using the with statement to handle file objects.)
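Reading the file back works the same way: gzip.open decompresses transparently, so you can hand the file object straight to json.load. A minimal sketch, assuming the file was written as UTF-8 encoded JSON:

import gzip
import json

# Open in text mode ('rt') so json.load receives str data;
# gzip.open decompresses on the fly while reading.
with gzip.open('mypath/myfile', 'rt', encoding='utf-8') as f:
    data = json.load(f)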
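And if you preferred the zlib route mentioned above (a custom format rather than gzip-compatible files), a minimal sketch would look like this, assuming getjsondata.read() returns bytes as in the question:

import zlib

# Compress the raw JSON bytes in one go; the result is a bare zlib
# stream (not a .gz file), so it must be read back with zlib as well.
with open('mypath/myfile', 'wb') as f:
    f.write(zlib.compress(getjsondata.read()))

# Decompress when reading back.
with open('mypath/myfile', 'rb') as f:
    raw = zlib.decompress(f.read())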
Upvotes: 2