Reputation: 785
I am currently running a project with a workflow in the following form: an input goes through a slow process (95% of programme runtime) to form an output list (data). Fast operations are then performed on the list to create a final output.
Is there a sensible way I can store data externally to my Python script, so that I can run the slow process once and then trial the final stages (reading data in rather than reassembling it)?
EDIT: I considered exporting to 'simple' formats (MS Excel / MySQL), but this proved unhelpful as the strings were both too long and contained special characters.
Upvotes: 0
Views: 415
Reputation: 785
So it appears pickle was what I was searching for (thanks to corn3lius and Two-Bit Alchemist for their comments).
A simple example (adapted from https://wiki.python.org/moin/UsingPickle):
import pickle

# Save a dictionary into a pickle file.
favorite_color = {"lion": "yellow", "kitty": "red"}
with open("save.p", "wb") as f:
    pickle.dump(favorite_color, f)

# Load the dictionary back from the pickle file.
with open("save.p", "rb") as f:
    favorite_color = pickle.load(f)
# favorite_color is now {"lion": "yellow", "kitty": "red"}
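Applied to the workflow in the question, a minimal caching sketch might look like this. The cache path data.p and the slow_process stub are hypothetical stand-ins for the real project code:
import os
import pickle

CACHE_FILE = "data.p"  # hypothetical cache path

def slow_process():
    # Stand-in for the expensive step that builds the list.
    return ["example", "strings", "with special characters: é, ☃"]

def get_data():
    # Reuse the pickled result if the slow process has already run;
    # delete data.p to force a full recompute.
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "rb") as f:
            return pickle.load(f)
    data = slow_process()
    with open(CACHE_FILE, "wb") as f:
        pickle.dump(data, f)
    return data

data = get_data()
# The fast final stages can now be trialled repeatedly on data
# without re-running the slow step.
Note that pickle stores Python objects directly, so arbitrarily long strings and special characters round-trip without trouble, which avoids the problems mentioned in the EDIT above.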
Upvotes: 1