Reputation: 1358
I want to have a large number (e.g. a million) of log files on a system, but the OS limits the number of open files, and it is not efficient to create a million files in a single folder.
Is there a ready-made solution, framework, or database that creates log files and appends data to them efficiently?
I can imagine various techniques to optimize the management of a large number of log files, but there might be something that does this out of the box.
For example, I would like each log file to be re-created daily or when it reaches 50MB. Old log files must be archived, e.g. uploaded to Amazon S3.
I can imagine a log database that writes all logs to a single file and then, in a later processing step, appends the records to the millions of individual files.
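The single-file-then-fan-out idea above can be sketched roughly like this (a minimal illustration, not a real product; the file names `journal.log` and `logs/` and the tab-separated record format are assumptions):

```python
import os

JOURNAL = "journal.log"   # single append-only journal (assumed name)
OUT_DIR = "logs"          # directory of per-stream files (assumed name)

def append(stream_id: str, message: str) -> None:
    """Fast path: only one file is ever open, appends are sequential."""
    with open(JOURNAL, "a", encoding="utf-8") as f:
        f.write(f"{stream_id}\t{message}\n")

def fan_out() -> None:
    """Batch pass: replay the journal into per-stream files.

    Records are grouped by stream first, so each target file is opened
    once per pass and the open-file count stays small."""
    groups: dict[str, list[str]] = {}
    with open(JOURNAL, encoding="utf-8") as f:
        for line in f:
            stream_id, _, message = line.rstrip("\n").partition("\t")
            groups.setdefault(stream_id, []).append(message)
    os.makedirs(OUT_DIR, exist_ok=True)
    for stream_id, messages in groups.items():
        path = os.path.join(OUT_DIR, f"{stream_id}.log")
        with open(path, "a", encoding="utf-8") as f:
            f.write("\n".join(messages) + "\n")
    os.remove(JOURNAL)  # journal consumed; real code would rotate it instead
```

The point of the design is that the hot write path never touches more than one file descriptor, while the slow fan-out pass bounds how many files it opens at once.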
Maybe there is a special file system that is good for such a task. I can't find anything, but I am sure a solution exists.
PS: I want to run logging on a single server. I say 1 million because it is more than the default limit on open files. 1 million files of 1MB each is 1TB, which fits on a regular hard drive.
I am looking for an existing solution before I write my own. I am sure there are logging servers for this; I just do not know how to search for them.
Upvotes: 0
Views: 382
Reputation: 56
I would start by considering Cassandra or Hadoop as a store for the log data, and then, if you eventually want the data in the form of files, write a procedure that selects from one of these databases and writes the records out to formatted files.
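The export step could look something like this. SQLite stands in for Cassandra/Hadoop here purely so the example is self-contained and runnable; the table and column names (`logs`, `stream`, `ts`, `message`) are assumptions, not anything from the answer:

```python
import os
import sqlite3

def export_logs(db_path: str, out_dir: str) -> None:
    """Select all log rows, ordered by stream, and write one formatted
    file per stream. Ordering lets us hold only one output file open."""
    os.makedirs(out_dir, exist_ok=True)
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT stream, ts, message FROM logs ORDER BY stream, ts"
    )
    current, f = None, None
    for stream, ts, message in rows:
        if stream != current:               # new stream: switch target file
            if f:
                f.close()
            f = open(os.path.join(out_dir, f"{stream}.log"),
                     "w", encoding="utf-8")
            current = stream
        f.write(f"{ts} {message}\n")
    if f:
        f.close()
    conn.close()
```

With Cassandra you would partition the table by stream and page through each partition instead of a global `ORDER BY`, but the write-out loop is the same shape.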
Upvotes: 1