Karl

Reputation: 5723

Is there a performance drop when we open a file in a directory that has huge numbers of files?

Suppose we want to open a file in a directory that contains a huge number of files. When the program is asked to open a particular file there, how fast can it locate that file? Will there be a performance drop when looking up the requested file in this case?

PS. This should also depend on the file system's implementation, yes?

Upvotes: 0

Views: 106

Answers (1)

Joachim Sauer

Reputation: 308001

Yes, it depends a lot on the file system implementation.

Some file systems have specific optimizations for large directories. One example I can think of is ext3, which uses HTree indexing for large directories.

Generally speaking, there will be some delay in finding the file. Once the file is located and opened, however, reading it should be no slower than reading any other file.
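You can get a rough feel for the lookup cost with a small microbenchmark. This is only a sketch (function name and file counts are my own choices, not from the answer), and real results depend heavily on the file system and on kernel caches, so the two timings may or may not differ noticeably:

```python
import os
import tempfile
import time

def time_open(directory: str, n_files: int) -> float:
    """Create n_files empty files in `directory`, then time how long
    it takes to open one of them. Rough sketch only -- the numbers
    depend on the file system, directory indexing, and caching."""
    for i in range(n_files):
        with open(os.path.join(directory, f"f{i}.txt"), "w"):
            pass
    target = os.path.join(directory, f"f{n_files - 1}.txt")
    start = time.perf_counter()
    with open(target):
        pass
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as small, tempfile.TemporaryDirectory() as big:
    t_small = time_open(small, 10)       # small directory
    t_big = time_open(big, 5000)         # much larger directory
    print(f"10 files: {t_small:.6f}s, 5000 files: {t_big:.6f}s")
```

On a file system with indexed directories (such as ext3/ext4 with HTree) the difference is often negligible; on older, linear-scan directory formats the large directory can be measurably slower.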

Some programs that need to handle a large number of files (for caching, for example) spread them across a directory tree, to reduce the number of entries in any single directory.
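That fan-out scheme can be sketched like this: hash the key and use the first characters of the digest as subdirectory names, so each directory stays small (the function names and the two-level, two-character layout here are illustrative assumptions, similar in spirit to what Git does for its object store):

```python
import hashlib
import os

def cache_path(root: str, key: str) -> str:
    """Map a cache key to a path in a two-level directory tree.

    Using the first hex digits of a hash as subdirectory names
    caps the number of entries per directory (256 * 256 buckets
    here), so lookups stay fast even with millions of files.
    """
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    # e.g. root/2c/f2/2cf24dba...
    return os.path.join(root, digest[:2], digest[2:4], digest)

def store(root: str, key: str, data: bytes) -> str:
    """Write `data` under the fan-out path for `key` and return it."""
    path = cache_path(root, key)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path
```

The mapping is deterministic, so looking an item up later is just recomputing `cache_path` and opening the result.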

Upvotes: 3
