Potta Pitot

Reputation: 175

Multiline into single line and output to a file

I have a log file that receives multiline events. An event starts with ________________, contains some number of lines describing the event, and ends after two newline characters. Each time an event is written to the log file, I want to merge that multiline event into a single line and append it to another file, which holds one event per line.

How can I do this in Linux?

Based on the suggestions given, I came up with the code below.

tail -f audit.log | perl -pe 'chomp; s/^(_____)/\n$1/' | tr '\r' '\t' |
tr -d '________________________________________________________________________________' > audit_test1.log &    

However, nothing generated from audit.log is written to the output file until I kill the background job. Only when I kill the job does the output appear in audit_test1.log.

How can I fix this?

Upvotes: 2

Views: 580

Answers (2)

tripleee

Reputation: 189628

The problem is output buffering. You have to make Perl run unbuffered.

http://mywiki.wooledge.org/BashFAQ/009 is one of many FAQs on this topic.

You need to set $| = 1 to make STDOUT unbuffered in Perl. However, you would also have to make tr unbuffered if you keep the two tr stages in the pipeline. Since Perl can do everything tr can do, maybe do all the processing in Perl instead.

tail -f audit.log | perl -pe 'BEGIN { $| = 1 } chomp; s/^(_____)/\n$1/;
    tr/\r/\t/; tr/_//d' > audit_test1.log &

The buffering problem should disappear if Perl is the last one in the pipeline, but I left it in just in case.

By the way, tr operates on individual characters, so passing it a string of 80 underscores is equivalent to passing a single underscore: it deletes every underscore in the input, not just runs of 80, which probably isn't what you expected. If the intention is to remove runs of exactly that many characters but preserve shorter ones, you're better off doing s/_{80}// in Perl.

Upvotes: 0

SzG

Reputation: 12629

This Perl one-liner should do the job, more or less. The -pe options wrap the script given on the command line in a while (<>) { YOUR_SCRIPT; print } loop, which reads the input file (or stdin) line by line; the script then removes (chomps) the trailing newline and inserts a newline before any line starting with _____.

Fixing a few minor issues, like removing the leading extra blank line and adding spaces where the lines were joined, is left as an exercise for the OP. :-)

perl -pe 'chomp; s/^(_____)/\n$1/' file > another_file

To convert a logfile in the background (&) in real time (tail -f):

tail -f file | perl -pe 'chomp; s/^(_____)/\n$1/' > another_file &

Upvotes: 2
