user3101015

Reputation: 1

Crash while writing to multiple files

I'm starting to work with C++ again after a long break spent coding in Java. I'm trying to convert a data processing job I have over to C++, and I've run into an issue while trying to open and write to 100+ files at once (splitting a 10GB text file into per-date files). Again, I've only been back on C++ for about two days, so I'm sure my code is riddled with other issues, but I've created the simplest snippet that reproduces the problem.

What would cause this?

#include <cstdio>   // FILE, fopen, fwrite, fclose, printf, getchar
#include <map>
#include <sstream>

int main() {
  std::map<int, FILE*> files;
  int files_to_open = 200;
  int files_to_write = 200;

  // Open a set of files.
  for (int i = 0; i < files_to_open; i++) {
    std::ostringstream file_path;
    file_path << "E:\\tmp\\file_" << i << ".txt";
    files[i] = fopen(file_path.str().c_str(), "w");
  }

  // Write data to files.
  for (int i = 0; i < files_to_write; i++) {
    printf("%d\n", i);
    fwrite("Some Data", sizeof(char), 9, files[i]);
  }

  // Close files.
  for (auto& file : files) {
    fclose(file.second);
  }

  // End it all.
  printf("Press Any Key to Continue\n");
  getchar();
  return 0;
}

Upvotes: 0

Views: 221

Answers (1)

Paladine

Reputation: 523

I'm going to assume that fopen starts returning NULL once you've opened more than 125 files. Your OS limits the number of file handles a single process can have open at once, and you're probably hitting that limit.

125 makes perfect sense: handles 0 (stdin), 1 (stdout), and 2 (stderr) are already in use, so 125 more brings the total to 128, which is a nice round limit.
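
One quick way to test that assumption on your machine is to open files in a loop until fopen fails and see how many succeed. Here is a minimal sketch; it borrows the E:\tmp directory from the question, which must already exist:

#include <cstdio>
#include <sstream>

int main() {
  int count = 0;
  while (true) {
    std::ostringstream path;
    path << "E:\\tmp\\probe_" << count << ".txt";
    // Intentionally never closed: each iteration should consume
    // one more stream until the per-process limit is reached.
    if (fopen(path.str().c_str(), "w") == NULL)
      break;
    count++;
  }
  printf("fopen succeeded %d times before returning NULL\n", count);
  return 0;
}

If the count plus the three standard streams lands on a round number like 128, the handle-limit explanation fits.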

Either way, you should check the return value of fopen before blindly writing to a FILE*.
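
As a minimal sketch of what that check might look like in your open loop (reusing the path scheme from the question; reporting the error and breaking out is just one reasonable way to handle the failure):

#include <cstdio>
#include <map>
#include <sstream>

int main() {
  std::map<int, FILE*> files;
  int files_to_open = 200;

  for (int i = 0; i < files_to_open; i++) {
    std::ostringstream file_path;
    file_path << "E:\\tmp\\file_" << i << ".txt";
    FILE* file = fopen(file_path.str().c_str(), "w");
    if (file == NULL) {
      // fopen failed: report which file and why, and stop opening
      // more, rather than storing a null pointer to crash on later.
      perror(file_path.str().c_str());
      break;
    }
    files[i] = file;
  }

  // ... write to and close only the files that opened successfully ...

  for (auto& file : files) {
    fclose(file.second);
  }
  return 0;
}

With that check in place, the write loop only ever sees valid FILE* values, so a hit on the handle limit produces an error message instead of a crash.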

Upvotes: 1
