San

Reputation: 347

Logs are written asynchronously to log file

I have come across a strange scenario: when I redirect the stdout of a Perl script into a log file, all the output is written at the end of execution, once the script completes, instead of during execution of the script.

While the script is running, `tail -f "filename"` shows nothing; the log appears only after the script has finished, not during its execution.

The command I run is:

/root/Application/download_mornings.pl >> "/var/log/file_manage/file_manage-$(date +\%Y-\%m-\%d).txt"

But when I run the script without redirecting to a log file, I can see the output on the terminal as the script progresses.

Let me know if you need any other details.

Thanks in advance for any light you can shed on what's going on.

Santosh

Upvotes: 1

Views: 226

Answers (1)

devnull

Reputation: 123508

Perl buffers the output by default. You can say:

$| = 1;

(at the beginning of the script) to disable buffering. Quoting perldoc perlvar:

$|

If set to nonzero, forces a flush right away and after every write or print on the currently selected output channel. Default is 0 (regardless of whether the channel is really buffered by the system or not; $| tells you only whether you've asked Perl explicitly to flush after each write). STDOUT will typically be line buffered if output is to the terminal and block buffered otherwise. Setting this variable is useful primarily when you are outputting to a pipe or socket, such as when you are running a Perl program under rsh and want to see the output as it's happening. This has no effect on input buffering. See getc for that. See select on how to select the output channel. See also IO::Handle.


You might also want to refer to Suffering from Buffering?.

Upvotes: 3
