Reputation: 23
I'm very new to Python, but I have a nagging problem. I have received a program which reads an infile (text), changes some values, and writes an outfile (also text). As the outfile grows bigger, the writes get slower and slower, becoming unbearably slow after some 2 MB. Why could this be? I have tried altering the code to use buffers of different sizes, and I have changed it to cache the data into larger chunks (a string) before writing. I also tried join instead of += to build the string to be written. NONE of these made any difference to performance - except writing bigger chunks, which actually made the code SLOWER.(!!!)
Here is the method that writes the outfile. I moved the write portion from a separate method to inline:
for ifile in _file_stripper(f_in):
    parse_infile(ifile)
    date = variable_data['arkiveringsdatum']
    variable_data['arkiveringsdatum'] = datetime(int(date[0:4]), int(date[4:6]), int(date[6:8]), tzinfo=TZ()).isoformat('T')
    _personnr = variable_data['personnr'].replace('-', '').split(' ')[0]
    tmplist = ['<utskriftsstatus><brevid_kalla>', variable_data['brevid_kalla'],
               '</brevid_kalla><mapp>Se Allt</mapp><tidpunkt>', variable_data['arkiveringsdatum'],
               '</tidpunkt><dokumentpaket>', variable_data['dokumenttyp'],
               '</dokumentpaket><status>Utskriven</status><rensningsdatum>999999</rensningsdatum><kundid_gdb>',
               variable_data['kundid_gdb'], '</kundid_gdb><personnr>', _personnr,
               '</personnr></utskriftsstatus>']
    f_out.write(''.join(tmplist))
The method _file_stripper splits a big file into records. Infiles are 5-21 MB.
Please advise where to look for the error. To quantify the slowdown: the write speed falls below 4 KB written/second after around 1 MB has been written, and it keeps falling as the outfile grows bigger.
EDIT: On request, here is parse_infile and _file_stripper:
def parse_infile(f_in):
    index = ""  # variable holding which on-demand variable we are reading in
    found_data = 0  # 1 if we have found what we are looking for, otherwise 0
    for row in f_in:
        if 'personnr' in row:
            found_data = 1
            index = "personnr"
        elif 'kundid_gdb' in row:
            found_data = 1
            index = "kundid_gdb"
        elif 'brevid_kalla' in row:
            found_data = 1
            index = "brevid_kalla"
        elif 'arkiveringsdatum' in row:
            found_data = 1
            index = "arkiveringsdatum"
        elif 'GROUP_FILENAME' in row:
            variable_data['dokumenttyp'] = row.split(':')[-1].split('.')[2].capitalize()
        elif found_data == 1:
            variable_data[index] = row.split(':')[1].strip()
            index = ""  # reset index in case values are missing in the file
            found_data = 0
        else:
            pass
def _file_stripper(tot_file):
    try:
        myfile = []
        for rows in tot_file:
            if not 'GROUP_FILENAME' in rows:
                myfile.append(rows)
            else:
                myfile.append(rows)
                yield myfile
    except Exception:
        pass
variable_data = { "brevid_kalla": "", "arkiveringsdatum": "",
"kundid_gdb": "", "personnr": "",
"dokumenttyp": "" }
Upvotes: 1
Views: 1507
Reputation: 1124040
Your _file_stripper function adds to the myfile list endlessly, without ever resetting the list:
def _file_stripper(tot_file):
    try:
        myfile = []
        for rows in tot_file:
            if not 'GROUP_FILENAME' in rows:
                myfile.append(rows)
            else:
                myfile.append(rows)
                yield myfile
    except Exception:
        pass
Note that myfile is set outside the loop, and each row is appended to myfile, which is then yielded as-is. Your process memory footprint will thus grow and grow, eventually forcing the OS to start swapping memory out and slowing your process to a crawl.
I think you meant to reset myfile each time a record has been yielded, i.e. when GROUP_FILENAME appears in rows:
def _file_stripper(tot_file):
    try:
        myfile = []
        for rows in tot_file:
            if not 'GROUP_FILENAME' in rows:
                myfile.append(rows)
            else:
                myfile.append(rows)
                yield myfile
                myfile = []
    except Exception:
        pass
Upvotes: 0
Reputation: 110526
Most likely what is going on is that your variable_data, or more likely some fields in it, keeps growing with each parsed file.
Your parse_infile function is probably not resetting it, and appends values from new files to the values already there, making it grow larger for each file read - that would result in O(n²) resource usage, as you describe.
The best practice here is not to rely on global variables - make your parse_infile function create a fresh dictionary on each iteration and return it to the caller. In your main function, assign the function's return value to your dictionary:
def parse_infile(file_):
    variable_data = {}
    (...)
    return variable_data

(...)

for ifile in _file_stripper(f_in):
    variable_data = parse_infile(ifile)
    (...)
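Filled in, the pattern might look like this - a hypothetical, cut-down version of parse_infile covering just one field, to show that each call returns an independent dictionary with no state carried over between records:

```python
def parse_infile(record_rows):
    variable_data = {}  # fresh dict on every call: nothing leaks between records
    found_data = False
    index = ''
    for row in record_rows:
        if 'personnr' in row:
            found_data = True
            index = 'personnr'
        elif found_data:
            variable_data[index] = row.split(':')[1].strip()
            found_data = False
    return variable_data

# each call returns an independent dict
first = parse_infile(['personnr', 'value: 123-45'])
second = parse_infile(['personnr', 'value: 678-90'])
print(first)   # {'personnr': '123-45'}
print(second)  # {'personnr': '678-90'}
```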
Upvotes: 1