Reputation: 96581
In my program I have a loop that scans a bunch of files and reads their content. The problem occurred after iterating over about 1500 files, and can't seem to be reproduced (or understood, at least by me).
The problem:
java.io.FileNotFoundException: /path/to/file//myFile (Too many open files)
The exception points to this method:
private static String readFileAsRawString(File f) throws IOException {
    FileInputStream stream = new FileInputStream(f); // <------------ stack trace points here
    try {
        FileChannel fc = stream.getChannel();
        MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, fc.size());
        return Charset.defaultCharset().decode(bb).toString();
    } finally {
        stream.close();
    }
}
I ran this method over 20,000 files in QA and it seemed to have no problems.
Do you see anything wrong with the code I pasted above that would cause this issue?
Upvotes: 5
Views: 2621
Reputation: 311052
Don't use a MappedByteBuffer for this trivial task. There is no well-defined time at which the mappings are released. Just open the file, read it, close it.
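As a minimal sketch of that approach, keeping the default-charset decoding from the original method (Files.readAllBytes opens, reads, and closes the file in one call):

import java.io.File;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;

private static String readFileAsRawString(File f) throws IOException {
    // Opens, reads, and closes the file; no mapping, so the file
    // handle is released as soon as the call returns.
    byte[] bytes = Files.readAllBytes(f.toPath());
    return new String(bytes, Charset.defaultCharset());
}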
Upvotes: 2
Reputation: 269897
The mapping is suspect. A MappedByteBuffer can outlive its FileChannel, and is valid until it is garbage collected. You might not have enough garbage to trigger the GC, but perhaps on a particular platform file handles are retained by unreferenced buffers.
Unless explicit garbage collection is disabled (-XX:+DisableExplicitGC), you should be able to test for this by catching the exception, calling System.gc(), and trying again. If it works on the second try, that's your problem. However, calling System.gc() as a permanent fix is a bad idea. The solution that will perform best overall will take some profiling on the target platform.
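A minimal sketch of that diagnostic (the retry-once-after-GC is only for testing the hypothesis, not a production fix; the method name is illustrative):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;

private static FileInputStream openWithGcRetry(File f) throws IOException {
    try {
        return new FileInputStream(f);
    } catch (FileNotFoundException e) {
        // "Too many open files" surfaces as a FileNotFoundException here.
        System.gc();              // ask the VM to collect unreferenced buffers
        System.runFinalization(); // give their cleanup a chance to run
        return new FileInputStream(f); // succeeding now implicates the mappings
    }
}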
Upvotes: 3
Reputation: 7
I think you open too many files too fast; try adding a wait() to test this. Then add a static counter that keeps track of open files, and if many files are already open, add a wait mechanism... (a sketch of this idea follows below).
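A minimal sketch of that throttling idea, using a Semaphore instead of a hand-rolled counter and wait() (the limit of 100 is an arbitrary illustrative value):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.Charset;
import java.util.concurrent.Semaphore;

private static final Semaphore OPEN_FILES = new Semaphore(100); // cap on simultaneous opens

private static String readThrottled(File f) throws IOException, InterruptedException {
    OPEN_FILES.acquire(); // block if too many files are already open
    try (FileInputStream in = new FileInputStream(f)) {
        return new String(in.readAllBytes(), Charset.defaultCharset());
    } finally {
        OPEN_FILES.release(); // the file is closed again
    }
}

Note, though, that this only helps if the handles really are released on close; if unreferenced MappedByteBuffers are holding them, throttling just delays the failure.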
Upvotes: -1