Reputation: 18523
For a web application written in Python/Django, does logging volume impact web performance?
For Go, I can start a goroutine to write logs asynchronously, but AFAIK, Python logging is synchronous, so log writing is part of request processing. How much time does it typically take compared to other activities such as accessing the DB or rendering templates? Should I be concerned?
I am not using any special logging handlers, just writing to plain log files.
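Roughly, my current settings look like the sketch below (the path, logger names and format are just placeholders):

```python
# settings.py -- roughly my current setup: one plain file handler, no queues
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "verbose": {
            "format": "%(asctime)s %(levelname)s %(name)s %(message)s",
        },
    },
    "handlers": {
        "file": {
            "class": "logging.FileHandler",
            "filename": "/var/log/myapp/app.log",  # placeholder path
            "formatter": "verbose",
        },
    },
    "root": {
        "handlers": ["file"],
        "level": "INFO",
    },
}
```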
I understand I can adjust logging levels to reduce the amount of log output, but I still want to keep as much logging as possible if it doesn't seriously impact performance.
Upvotes: 3
Views: 1905
Reputation: 99365
If you are concerned about I/O latency (a valid concern), you could log from the web application to a queue using a QueueHandler / QueueListener pair, as described in the docs here. The QueueListener can operate on a separate thread and prevent your request handler threads from getting bogged down in logging I/O. If you are using Python 2.x, QueueHandler / QueueListener are available in the logutils package on PyPI.
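As a rough sketch of the wiring (not Django-specific; the file name and format are just illustrative), something along these lines works on Python 3:

```python
import logging
import logging.handlers
import queue

# The queue decouples the request threads from the logging I/O.
log_queue = queue.Queue(-1)  # unbounded queue

# Handlers in the request path only enqueue records, which is cheap.
queue_handler = logging.handlers.QueueHandler(log_queue)

# The listener performs the actual (slow) file I/O on its own thread.
file_handler = logging.FileHandler("app.log")
file_handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s %(message)s")
)
listener = logging.handlers.QueueListener(log_queue, file_handler)

root = logging.getLogger()
root.setLevel(logging.INFO)
root.addHandler(queue_handler)

listener.start()
try:
    root.info("request handled")  # returns quickly; I/O happens on the listener thread
finally:
    listener.stop()  # flushes any pending records on shutdown
```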
Upvotes: 2
Reputation: 1511
You should log all the data that could be useful for future analysis, but you should be concerned about logging overhead if you are logging a lot of data synchronously. There are several options for asynchronous logging.
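If you are unsure whether the overhead matters in your case, a quick, rough measurement on your own hardware can help; something like this (file name and message are just placeholders):

```python
import logging
import time

# Rough, illustrative measurement of what a synchronous file log call costs.
logging.basicConfig(filename="bench.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
logger = logging.getLogger(__name__)

n = 10_000
start = time.perf_counter()
for i in range(n):
    logger.info("request %s processed in %s ms", i, 12)
elapsed = time.perf_counter() - start
print(f"{elapsed / n * 1e6:.1f} microseconds per log call")
```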
Upvotes: 1