tscizzle

Reputation: 12241

storing files in s3 using Python boto

I have a list of dicts (ex. [{'a': 'b', 'c': 'd'}, {'e': 'f', 'g': 'h'}]) and I want to store this in s3 using Python boto package.

One way is to iterate through the list, writing to a file object f, such that each line of f is a json object. Then I could use key.set_contents_from_file(f). Is this the correct/best way to do it?

It seems like writing to a local file is an unnecessary middle step, but I'm not sure.
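Roughly what I have in mind (the bucket and file names are just placeholders):

    import json
    import boto

    my_dicts = [{'a': 'b', 'c': 'd'}, {'e': 'f', 'g': 'h'}]

    # Write each dict as one JSON object per line in a local file.
    with open('my_dicts.json', 'w') as f:
        for d in my_dicts:
            f.write(json.dumps(d) + '\n')

    # Then upload that file to S3.
    conn = boto.connect_s3()  # credentials come from the environment / boto config
    bucket = conn.get_bucket('my-bucket')  # placeholder bucket name
    key = bucket.new_key('my_dicts.json')
    with open('my_dicts.json', 'rb') as f:
        key.set_contents_from_file(f)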

Upvotes: 1

Views: 3559

Answers (1)

mauzel

Reputation: 1606

To skip the write-to-file step, you could use key.set_contents_from_string() to PUT your dicts (serialized to a JSON string) directly.

See: http://boto.readthedocs.org/en/latest/ref/s3.html#module-boto.s3.key

(CTRL+F for set_contents_from_string)
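For example, something along these lines (the bucket and key names are placeholders; the whole list is serialized as one JSON string):

    import json
    import boto

    my_dicts = [{'a': 'b', 'c': 'd'}, {'e': 'f', 'g': 'h'}]

    conn = boto.connect_s3()  # credentials from the environment / boto config
    bucket = conn.get_bucket('my-bucket')  # placeholder bucket name
    key = bucket.new_key('my_dicts.json')  # placeholder key name

    # Serialize the list to a string and PUT it in one call, no local file needed.
    key.set_contents_from_string(json.dumps(my_dicts))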

Another roundabout way (to avoid writing to disk) would be to create an in-memory "file" (using StringIO or similar) and pass it to set_contents_from_stream.
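A rough sketch of that in-memory approach (note: this sketch hands the buffer to set_contents_from_file, since an in-memory buffer is seekable; set_contents_from_stream is intended for non-seekable streams on providers that support chunked transfer):

    import json
    from io import BytesIO

    import boto

    my_dicts = [{'a': 'b', 'c': 'd'}, {'e': 'f', 'g': 'h'}]

    # Build the "file" in memory instead of on disk, one JSON object per line.
    buf = BytesIO()
    for d in my_dicts:
        buf.write((json.dumps(d) + '\n').encode('utf-8'))
    buf.seek(0)

    conn = boto.connect_s3()  # credentials from the environment / boto config
    bucket = conn.get_bucket('my-bucket')  # placeholder bucket name
    key = bucket.new_key('my_dicts.json')  # placeholder key name
    key.set_contents_from_file(buf)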

Upvotes: 2
