Raman

Reputation: 717

Kafka Streams - RocksDB - max open files

If we define max open files as 300 and the number of .sst files exceeds that limit, I assume the files in the cache will be evicted. But if the data in one of those evicted files is accessed later, will RocksDB reload it, or is that file lost forever?

https://github.com/facebook/rocksdb/wiki/RocksDB-Tuning-Guide

Upvotes: 2

Views: 2402

Answers (1)

Matthias J. Sax

Reputation: 62310

From the link you posted:

max_open_files -- RocksDB keeps all file descriptors in a table cache. If number of file descriptors exceeds max_open_files, some files are evicted from table cache and their file descriptors closed. This means that every read must go through the table cache to lookup the file needed. Set max_open_files to -1 to always keep all files open, which avoids expensive table cache calls.

This only means that if the number of open files is exceeded, some files will be closed. If you then access a closed file, it will be re-opened (and possibly another file will be closed first to stay under the limit).

Hence, the config is not about creating/deleting files, but just about how many files to keep open in parallel.
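For completeness, here is a minimal sketch of how such a limit could be applied in Kafka Streams through its RocksDBConfigSetter hook. The class name BoundedOpenFilesConfigSetter is illustrative; RocksDBConfigSetter, StreamsConfig.ROCKSDB_CONFIG_SETTER_CLASS_CONFIG, and Options#setMaxOpenFiles are part of the public Kafka Streams / RocksDB Java APIs, and the close callback assumes Kafka Streams 2.3+:

```java
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.state.RocksDBConfigSetter;
import org.rocksdb.Options;

// Caps the number of file descriptors RocksDB keeps open per state store.
public class BoundedOpenFilesConfigSetter implements RocksDBConfigSetter {

    @Override
    public void setConfig(final String storeName,
                          final Options options,
                          final Map<String, Object> configs) {
        // Files beyond this limit are closed (not deleted) and are
        // transparently re-opened via the table cache when read again.
        options.setMaxOpenFiles(300);
    }

    @Override
    public void close(final String storeName, final Options options) {
        // Nothing allocated in setConfig, so nothing to release here.
    }
}
```

The setter is then registered in the Streams configuration:

```java
final Properties props = new Properties();
props.put(StreamsConfig.ROCKSDB_CONFIG_SETTER_CLASS_CONFIG,
          BoundedOpenFilesConfigSetter.class);
```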

Upvotes: 1
