BlindWhiteLabMouse

Reputation: 21

Redirect each new line of stderr to consecutive files

I want to redirect each new line of a process's stderr to a sequence of text files. How do I do that in bash?

I have tried:

myProcess 2>&1 >/dev/null | grep 'Parsed' > parsed.txt | tail -f parsed.txt > line.`date +%s`.txt 

Upvotes: 1

Views: 444

Answers (5)

BlindWhiteLabMouse

Reputation: 21

The problem is that date in the OS X Terminal does not return milliseconds (no sub-second resolution), so too many lines ended up in each timestamp file.

The solution I found is like this:

I installed daemontools from http://cr.yp.to/daemontools/tai64n.html to get the tai64n and tai64nlocal utilities.

With the command:

echo | tai64n | tai64nlocal

it returns:

2013-07-03 10:34:41.756602500 
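
For reference, tai64n prepends a TAI64N timestamp to every line it reads on standard input, and tai64nlocal converts that label into the readable local time shown above, with sub-second precision. Piping a stream through both therefore stamps each line individually, e.g. (using the same myProcess placeholder as above):

myProcess 2>&1 | tai64n | tai64nlocal
# every line now starts with something like:
# 2013-07-03 10:34:41.756602500 <original line>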

So my final script is:

#!/bin/bash

# Keep only the 'Parsed' lines from stderr and write each one
# to its own timestamp-named file.
function splitter {
   grep 'Parsed' |
   while IFS= read -r L; do
      # tai64n/tai64nlocal produce a sub-second timestamp
      t=$(echo | tai64n | tai64nlocal)
      echo "$L" >> "parse.$t.txt"
   done
}

# stderr goes to splitter, stdout is discarded
myProcess 2> >( splitter ) > /dev/null

This creates a new file for each new line on stderr.
Thank you everybody for the help.
This was my first question and it turned out great.
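
One detail worth noting: the timestamp contains a space, so the filenames do too, which is why "$t" is quoted inside splitter. Since the names sort chronologically, the per-line files can be replayed in order with a quoted glob (a sketch, assuming the files sit in the current directory):

for f in parse.*.txt; do
   cat "$f"
done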

Upvotes: 0

anishsane

Reputation: 20970

Another variant:

myprocess 2> >(sed 's/.*/echo "&">>$(date +%s)/e;d')

This needs GNU sed (for the e flag), and your bash must support process substitution.
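
For readers unfamiliar with GNU sed's e flag, here is the same one-liner with the commands broken out as comments (my reading of it):

# s/.*/echo "&">>$(date +%s)/   rewrite the whole line into a shell command:
#                               echo "<the line>" >> <current epoch second>
# e                             execute the pattern space as a shell command
#                               and replace it with that command's output
# d                             delete the pattern space so nothing is printed
myprocess 2> >(sed 's/.*/echo "&">>$(date +%s)/e;d')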

Upvotes: 0

Ansgar Wiechers

Reputation: 200193

... | grep 'Parsed' > parsed.txt

You redirect all grep output to parsed.txt, so there's nothing left to be fed into the next pipe. Also, tail reads from either a pipeline or a file, not both at the same time.

If you want the same output to go to multiple files you need tee:

myProcess 2>&1 >/dev/null | grep 'Parsed' | tee f1.txt | tee f2.txt > f3.txt
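
As a side note, tee also accepts several filenames at once, so the same effect can be had with a single tee:

myProcess 2>&1 >/dev/null | grep 'Parsed' | tee f1.txt f2.txt > f3.txt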

Upvotes: 0

Neuron

Reputation: 361

#!/bin/bash

# Keep only the 'Parsed' lines and append each one to a file
# named after the current epoch second.
function splitter {
   grep 'Parsed' |
   while IFS= read -r L; do
      echo "$L" >> "$(date +%s)"
   done
}

# stderr goes to splitter, stdout is discarded
myProcess 2> >( splitter ) > /dev/null

Two comments:

  • Your trick with 2>&1 >/dev/null would also work; this is just another way of doing the same thing.
  • In splitter I have to use '>>' rather than just '>', because chances are that two lines will be printed within the same second, and with '>' you would end up with only the last of them in the file (see the small illustration below).
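
To illustrate the second point with a made-up example (the filename is just an epoch second):

echo "line 1" > 1372845281    # '>' truncates the file on every write...
echo "line 2" > 1372845281    # ...so only "line 2" is left afterwards
# with '>>' both lines would have been appended and kept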

Upvotes: 0

chepner

Reputation: 530862

You would need to read each line of the output and explicitly redirect it to a new file.

myProcess 2>&1 >/dev/null | grep 'Parsed' | tee parsed.txt | while IFS= read -r line; do
    echo "$line" >> line.$(date +%s).txt
done

Note as well the use of tee to write each line from grep both to the file parsed.txt and to the while loop that redirects each line to a per-second file.
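
If the goal is literally one file per line of output (consecutive numbering) rather than one file per second, a counter can stand in for the timestamp. A minimal sketch along the same lines (the line.N.txt naming is just an example):

n=0
myProcess 2>&1 >/dev/null | grep 'Parsed' | tee parsed.txt | while IFS= read -r line; do
    n=$((n + 1))
    printf '%s\n' "$line" > "line.$n.txt"
done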

Upvotes: 1
