Reputation: 6570
The challenge here is in evaluating multiple large files.
What code will instruct Python to "load" a limited number of files into memory, process them, garbage-collect, and then load the next set?
import os
# `audio` is the Echo Nest remix module, imported elsewhere in the project

def main(directory):
    """
    Create AudioAnalysis objects from directory and call object_analysis().
    """
    ff = os.listdir(directory)
    for f in ff:
        # can we limit the number we load at one time?
        audiofile = audio.LocalAudioFile(os.path.join(directory, f))  # hungry!
Tried adding audiofile = 0 to the loop, but the memory allocation is the same.
As I understand it, lazy evaluation "is an evaluation strategy which delays the evaluation of an expression until its value is needed", but in this case I need to delay evaluation until there's memory available.
I'm expecting that a decorator, descriptor, and/or use of Python's property() function may be involved, or possibly buffering or queueing the input.
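One way to approximate the "load a limited number, process, then free" behavior described above is a generator that yields paths in small batches, so only one batch's worth of analysis objects is alive at a time. This is a sketch, not the Echo Nest API: process() is a hypothetical stand-in for the LocalAudioFile/object_analysis work, and batch_size is an assumed tuning knob.

```python
import gc
import os
from itertools import islice


def batched_paths(directory, batch_size):
    """Lazily yield lists of at most batch_size file paths."""
    paths = (os.path.join(directory, f) for f in os.listdir(directory))
    while True:
        batch = list(islice(paths, batch_size))
        if not batch:
            return
        yield batch


def process(path):
    # hypothetical placeholder for:
    #   audiofile = audio.LocalAudioFile(path); audiofile.object_analysis()
    return len(path)


def main(directory, batch_size=4):
    results = []
    for batch in batched_paths(directory, batch_size):
        # only batch_size paths are processed before we drop references
        results.extend(process(p) for p in batch)
        gc.collect()  # hint the collector between batches
    return results
```

Note that gc.collect() only helps if nothing else still references the analysis objects; if LocalAudioFile holds memory through C extensions, the per-process approach in the answer below the fold is more reliable.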
Upvotes: 2
Views: 111
Reputation: 15170
Here's one solution: have Python spawn a process, run the function on one file, then exit. The parent proc will collect results from each of the files.
This is in no way graceful, but if LocalAudioFile refuses to be dislodged from memory, it allows some flexibility in getting results.
This code runs a function on each Python file in the current directory, returning a message to the parent process, which prints it out.
import glob, multiprocessing, os

def proc(path):
    """
    Create AudioAnalysis objects from one file and call object_analysis().
    """
    # audiofile = audio.LocalAudioFile(path) # hungry!
    return 'woot: {}'.format(path)

if __name__ == '__main__':          # required for Windows
    pool = multiprocessing.Pool()   # one process per CPU
    for output in pool.map(proc, [
            os.path.abspath(name) for name in glob.glob('q*.py')
            ]):
        print 'output:', output
output: woot: /home/johnm/src/johntellsall/karma/qpopen.py
output: woot: /home/johnm/src/johntellsall/karma/quotes.py
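By default a Pool worker is reused for many tasks, so memory held by one LocalAudioFile can still accumulate inside a worker. Pool's maxtasksperchild parameter forces each worker to exit and be replaced after a fixed number of tasks, which guarantees the OS reclaims everything between files. A Python 3 variant of the same idea (the analysis call is still commented out as in the answer above):

```python
import glob
import multiprocessing
import os


def proc(path):
    # audiofile = audio.LocalAudioFile(path)  # the memory-hungry call
    return 'woot: {}'.format(path)


if __name__ == '__main__':
    paths = [os.path.abspath(name) for name in glob.glob('q*.py')]
    # maxtasksperchild=1 recycles each worker after a single task,
    # so memory from one file cannot carry over to the next
    with multiprocessing.Pool(maxtasksperchild=1) as pool:
        for output in pool.map(proc, paths):
            print('output:', output)
```

The trade-off is process-startup overhead per file; raise maxtasksperchild if that cost matters more than peak memory.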
Upvotes: 1