Reputation: 1985
I'm opening lots of files with fopen() in VC++, but after a while it fails.
Is there a limit to the number of files you can open simultaneously?
Upvotes: 46
Views: 89527
Reputation: 3717
Came across the same problem, but using Embarcadero C++Builder from RAD Studio 10.2. The C runtime of that compiler doesn't seem to provide _getmaxstdio or _setmaxstdio, only some macros, and their default limit is much lower than what is cited here for other runtimes:
stdio.h:
/* Number of files that can be open simultaneously */
#if defined(__STDC__)
#define FOPEN_MAX (_NFILE_)
#else
#define FOPEN_MAX (_NFILE_)
#define SYS_OPEN (_NFILE_)
#endif
_nfile.h:
#if defined(_WIN64)
#define _NFILE_ 512
#else
#define _NFILE_ 50
#endif
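If you want to check what limit your particular runtime was built with, the standard FOPEN_MAX macro (which these headers ultimately define) can simply be printed. A minimal sketch:

#include <cstdio>

int main() {
    // FOPEN_MAX is the standard C macro behind these headers; printing it
    // shows the stream limit this runtime was compiled with.
    std::printf("FOPEN_MAX = %d\n", FOPEN_MAX);
    return 0;
}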
Upvotes: 2
Reputation: 1499
If you use the standard C/C++ POSIX libraries with Windows, the answer is "yes", there is a limit.
Interestingly, though, the limit is imposed by the kind of C/C++ runtime libraries that you are using.
I came across the following bug-tracker thread (http://bugs.mysql.com/bug.php?id=24509) from MySQL. They were dealing with the same problem with the number of open files.
However, Paul DuBois explained that the problem could effectively be eliminated in Windows by using ...
Win32 API calls (CreateFile(), WriteFile(), and so forth) and the default maximum number of open files has been increased to 16384. The maximum can be increased further by using the --max-open-files=N option at server startup.
Naturally, you could have a theoretically large number of open files by using a technique similar to database connection pooling, but that would have a severe effect on performance.
Indeed, opening a large number of files could be bad design. However, some situations require it. For example, if you are building a database server that will be used by thousands of users or applications, the server will necessarily have to open a large number of files (or suffer a performance hit by using file-descriptor pooling techniques).
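To illustrate the Win32 route mentioned above, here is a minimal sketch of opening and writing a file with CreateFileA() and WriteFile() instead of fopen(); the file name is just an example. Handles opened this way count against the (much larger) per-process handle limit rather than the CRT stream limit.

#include <windows.h>
#include <cstdio>

int main() {
    // Open a file directly through the Win32 API, bypassing the CRT.
    HANDLE h = CreateFileA("example.txt", GENERIC_WRITE, 0, nullptr,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (h == INVALID_HANDLE_VALUE) {
        std::fprintf(stderr, "CreateFileA failed: %lu\n", GetLastError());
        return 1;
    }
    const char msg[] = "hello\n";
    DWORD written = 0;
    WriteFile(h, msg, (DWORD)(sizeof msg - 1), &written, nullptr);
    CloseHandle(h);
    return 0;
}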
Upvotes: 13
Reputation: 3484
The C run-time libraries impose a limit of 512 on the number of files that can be open at any one time. Attempting to open more than the maximum number of file descriptors or file streams causes program failure. Use _setmaxstdio to change this number. More information about this can be read here.
Also, you may have to check whether your version of Windows supports the upper limit you are trying to set with _setmaxstdio. For more information on _setmaxstdio, check here.
Information on the subject corresponding to VS 2015 can be found here.
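As a rough sketch of how that call is used (the new limit of 2048 here is an arbitrary example; _setmaxstdio returns -1 on failure):

#include <cstdio>

int main() {
    int old_limit = _getmaxstdio();
    // Ask the CRT for a higher stream limit; 2048 is just an example value.
    if (_setmaxstdio(2048) == -1) {
        std::fprintf(stderr, "could not raise limit above %d\n", old_limit);
        return 1;
    }
    std::printf("stream limit raised from %d to %d\n",
                old_limit, _getmaxstdio());
    return 0;
}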
Upvotes: 68
Reputation: 515
Yes, there are limits, depending on the access level you use when opening the files. You can use _getmaxstdio to find the limit and _setmaxstdio to change it.
Upvotes: 7
Reputation: 3809
In case anyone else is unclear as to what the limit applies to, I believe that this is a per-process limit and not system-wide.
I just wrote a small test program to open files until it fails. It gets to 2045 files before failing (2045 + STDIN + STDOUT + STDERR = 2048), then I left that copy running and started another.
The second copy showed the same behaviour, meaning I had at least 4096 files open at once.
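For reference, a minimal version of that kind of test might look like this (the scratch file names are made up, and the FILE pointers are deliberately leaked so the streams stay open):

#include <cstdio>

int main() {
    // Open numbered scratch files until fopen() fails, counting how many
    // streams the CRT lets this process hold open at once.
    int count = 0;
    for (;;) {
        char name[32];
        std::snprintf(name, sizeof name, "scratch%04d.tmp", count);
        if (std::fopen(name, "w") == nullptr)  // deliberately never fclose()d
            break;
        ++count;
    }
    std::printf("opened %d files before fopen() failed\n", count);
    return 0;
}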
Upvotes: 16
Reputation: 16142
I don't know where Paulo got that number from. In Windows NT-based operating systems, the number of file handles opened per process is basically limited by physical memory; it's certainly in the hundreds of thousands.
Upvotes: 4
Reputation: 11567
Yes, there is a limit.
The limit depends on the OS and the memory available.
In the old DOS days, the limit was 255 simultaneously open files.
In Windows XP, the limit is higher (I believe it's 2,048, as stated by MSDN).
Upvotes: 0