Reputation: 47296
I created 10000 files in /tmp/files. My default limit is 1024 open files ("ulimit -n" reports 1024), so I expected the following script to fail with an error like "Too many open files", but it didn't fail.
Any thoughts on where I am going wrong?
import os
listfiles=os.listdir('/tmp/files')
count=0
f=''
for file in listfiles:
    fn=f+str(count)
    fn=open(file,'w')
    fn.write('hello')
    print 'file=',file
    count=count+1
print count
Upvotes: 1
Views: 1594
Reputation: 49265
Your loop lets each file object go out of scope, so each one is closed quickly, and you never have more than one alive at a time (technically you might, because of GC delay). Just append each file object to a global list and you'll succeed in crashing your script! :)
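The close-on-rebinding behaviour is observable directly. A minimal Python 3 sketch (CPython-specific; the temp directory is just illustrative): a weak reference does not keep the file alive, so after rebinding it reports whether the old file object was reaped:

```python
import os
import tempfile
import weakref

d = tempfile.mkdtemp()
f = open(os.path.join(d, 'a'), 'w')
ref = weakref.ref(f)                 # weak ref does not keep the file alive
f = open(os.path.join(d, 'b'), 'w')  # rebinding drops the last strong reference
print(ref() is None)                 # True under CPython: old file reaped (and closed)
f.close()
```

Because CPython reaps the old object as soon as its reference count hits zero, only one descriptor is ever held open at a time.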
Upvotes: 2
Reputation: 799310
Rebinding fn causes the reference count of the old object to drop to 0, causing it to be reaped. Append the files to a list instead.
Upvotes: 2
Reputation: 304433
for file in listfiles:
    fn = f+str(count)    # what is this supposed to do?
    fn = open(file,'w')  # old file handle gets garbage collected and closed
    fn.write('hello')
    print 'file=',file
    count = count + 1
Every time you rebind fn, the previous file gets closed. It's possible that in Jython, for example, the file doesn't get closed immediately, but it will very likely still be garbage collected and closed before you exceed the 1024 file limit.
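To illustrate that the prompt close in CPython comes from reference counting rather than the cyclic garbage collector (which is what a tracing-GC runtime like Jython relies on), here is a Python 3 sketch: the file is reclaimed immediately even with the gc module disabled:

```python
import gc
import os
import tempfile
import weakref

gc.disable()  # turns off the cycle collector only; refcounting still runs
d = tempfile.mkdtemp()
f = open(os.path.join(d, 'x'), 'w')
ref = weakref.ref(f)
f = None                # drop the only strong reference
print(ref() is None)    # True on CPython even with gc disabled
gc.enable()
```

On a runtime without reference counting, the close would instead wait for the next collection, which is why the timing there is less predictable.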
Try storing the file objects in a list like this:
import os
listfiles = os.listdir('/tmp/files')
count = 0
f = ''
fn = []
for file in listfiles:
    # os.listdir returns bare names, so join the directory back on
    fn.append(open(os.path.join('/tmp/files', file), 'w'))
    fn[-1].write('hello')
    print 'file=',file
    count = count + 1
print count
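To actually see the "Too many open files" error without creating thousands of files, you can lower the soft descriptor limit first. A Python 3 sketch (Unix-only, using the stdlib resource module; the limit of 64 and the loop bound of 100 are arbitrary):

```python
import os
import resource
import tempfile

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))  # shrink the soft limit

d = tempfile.mkdtemp()
handles, err = [], None
try:
    for i in range(100):
        # keeping every file object in a list prevents them being reaped
        handles.append(open(os.path.join(d, str(i)), 'w'))
except OSError as e:
    err = e  # EMFILE: "Too many open files"
finally:
    for h in handles:
        h.close()
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))  # restore

print(err)
```

Because the list holds a strong reference to every file object, none of them is garbage collected, and open() fails as soon as the process runs out of descriptors.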
Upvotes: 2