AnilJ

Reputation: 2121

Writing a 2.5 MB character buffer using the Python logger

In a multithreaded application, I have a situation where I need to write a set of log messages (with an upper limit of 50K messages) together, as a block, rather than writing them one by one.

I do not want to use thread locking here (to make my block logging a critical section) because of the obvious side effects, e.g. serialization and synchronization overhead, which slow down performance.

To avoid that synchronization, I am thinking of using a per-thread buffer, so that each thread uses its own buffer and clears it once its logs have been written to the file.

I am thinking of first writing these logs into a buffer and then writing the buffer to the log file using the Python logger's log message API:

logger.info("Log message")
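This is roughly the approach I have in mind; a minimal sketch, assuming a threading.local() buffer per thread (the names buffer_log, flush_buffer and the 50K cap are just illustrative, not an existing API):

    import logging
    import threading

    logger = logging.getLogger(__name__)

    # Each thread accumulates messages in its own threading.local()
    # buffer, so no lock is needed while appending.
    _local = threading.local()

    MAX_MESSAGES = 50_000  # upper limit from the question

    def buffer_log(message):
        """Append a message to the calling thread's private buffer."""
        buf = getattr(_local, "buffer", None)
        if buf is None:
            buf = _local.buffer = []
        buf.append(message)
        if len(buf) >= MAX_MESSAGES:
            flush_buffer()

    def flush_buffer():
        """Write the whole buffer as one block with a single logger call."""
        buf = getattr(_local, "buffer", None)
        if buf:
            logger.info("\n".join(buf))  # one ~2.5 MB write instead of 50K small ones
            _local.buffer = []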

If each log message is about 50 bytes, the total size of this buffer comes out to about 2.5 MB (50K x 50 bytes).

I want to understand how Python's logger will handle a buffer of this size. Will it affect performance, and if so, in what way?

Let me know if there are other Python APIs available to support this kind of logging.

Upvotes: 0

Views: 67

Answers (1)

Danielle M.

Reputation: 3662

No, Python will have no problem writing a 2.5 MB buffer to a file.

Issues like disk I/O and whether the write blocks your main thread may affect performance, but in general Python can easily dump a 2.5 MB buffer to disk.
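As for other APIs: the standard library's logging.handlers.MemoryHandler already buffers records in memory and flushes them to a target handler in a batch, which may save you from managing buffers by hand. A minimal sketch (the capacity of 50000 and the file name are just example values); note that handlers still take an internal lock per emit, so this batches the I/O rather than removing synchronization entirely:

    import logging
    import logging.handlers

    # Buffer up to 50,000 records in memory, then flush them to the
    # file handler in one batch (also flushed on ERROR or above).
    file_handler = logging.FileHandler("app.log")
    memory_handler = logging.handlers.MemoryHandler(
        capacity=50000,
        flushLevel=logging.ERROR,
        target=file_handler,
    )

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.INFO)
    logger.addHandler(memory_handler)

    logger.info("Log message")   # buffered in memory, not yet on disk
    memory_handler.flush()       # writes all buffered records to app.log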

Upvotes: 1
