Reputation: 2356
When using Monolog with StreamHandler, it is a normal situation that several instances of a PHP script are writing to the same log file in parallel. (E.g. in my Symfony application, when multiple users open the "Login page" at the same time, multiple instances of my app script (app.php) run, and thus multiple instances of Monolog's StreamHandler write into the same app/logs/prod.log.)
How come that, despite the concurrent writes to the file, no log line ever ends up broken in the middle and messed up? How come the situation below never happens:
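(A hypothetical interleaving, made up for illustration; the timestamps and messages are invented:)

[2017-03-01 12:00:01] request.INFO: Matched rou[2017-03-01 12:00:01] request.INFO: Matched route "login".
te "home".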
I tried looking at the source code of StreamHandler:
https://github.com/Seldaek/monolog/blob/master/src/Monolog/Handler/StreamHandler.php#L93
But I do not see any real concurrency control there. There is a flock() call, but it seems to be disabled (useLocking = false) in the default (Symfony) configuration, and the logs seem to be fine without it:
if ($this->useLocking) {
    // ignoring errors here, there's not much we can do about them
    flock($this->stream, LOCK_EX);
}

fwrite($this->stream, (string) $record['formatted']);

if ($this->useLocking) {
    flock($this->stream, LOCK_UN);
}
But why are the logs magically fine?
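For reference, the locking could be switched on through the last constructor argument of StreamHandler; a minimal sketch, assuming the Monolog 1.x signature StreamHandler($stream, $level, $bubble, $filePermission, $useLocking):

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

$logger = new Logger('app');
// fifth argument ($useLocking = true) enables the flock() calls shown above
$logger->pushHandler(new StreamHandler('app/logs/prod.log', Logger::DEBUG, true, null, true));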
Upvotes: 4
Views: 1490
Reputation: 1697
When I use logging, I expect the underlying components to work correctly; it is not like a plain data container where you have to ensure safe concurrent access yourself.
I've read that the mainstream operating systems use thread-safe I/O calls, which are also not interruptible, alongside non-thread-safe ones. You can read about it here, although in my opinion the accepted answer is not correct at all: Is fwrite atomic?
In other words, I have never had any broken logs. It's like an INSERT in a database: either the row is inserted or it is not.
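If you want to convince yourself, here is a minimal test sketch (not part of Monolog; it assumes PHP CLI with the pcntl extension and a local filesystem). Each child process appends whole lines with a single fwrite() per line to a file opened in append mode ('a', i.e. O_APPEND), which is essentially what StreamHandler does, and the check at the end looks for spliced lines:

<?php
// hypothetical demonstration script
$file = '/tmp/concurrent.log';
@unlink($file);

for ($i = 0; $i < 4; $i++) {
    if (pcntl_fork() === 0) {          // child process
        $fh = fopen($file, 'a');       // 'a' => O_APPEND
        for ($j = 0; $j < 1000; $j++) {
            // one fwrite() per complete line, like StreamHandler
            fwrite($fh, sprintf("pid=%d line=%d\n", getmypid(), $j));
        }
        fclose($fh);
        exit(0);
    }
}
while (pcntl_wait($status) > 0);       // wait for all children

// every line should be intact, never mixing the output of two pids
foreach (file($file) as $line) {
    if (!preg_match('/^pid=\d+ line=\d+$/', rtrim($line))) {
        echo "broken line: $line";
    }
}
echo "done\n";

On a local filesystem every line comes out intact: with O_APPEND the kernel moves the file offset to the end and writes in one atomic step, so concurrent appends never overwrite each other, and single small writes are in practice written out whole (NFS and very large records are the usual exceptions).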
Upvotes: 0