Reputation: 1132
I'm using this nifty one-liner to add a timestamp to each line of my logfile:
tail -f log | grep -a --line-buffered "." | awk '{ print strftime("%s\t"), $0; fflush() }'
Unfortunately it only gives full seconds while I need millisecond resolution between the datapoints. Is there an equally elegant solution to get a ms timestamp? I don't care about the time since the epoch, only about the difference between the lines.
Thanks!
Upvotes: 1
Views: 1545
Reputation: 41456
You can keep using your awk, but you need to change it a bit:
tail -f log | grep -a --line-buffered "." | awk '{ print d, $0; fflush() }' d=$(date -Ins)
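One caveat: the shell expands $(date -Ins) once, before the pipeline starts, so every line gets the same timestamp. A sketch of a per-line variant (assuming GNU date for nanosecond output, and using printf as a stand-in for tail -f log) runs date from inside awk for each line:

```shell
# printf here stands in for `tail -f log`
printf 'one\ntwo\n' | awk '{
    cmd = "date -Ins"   # run GNU date for every input line
    cmd | getline d     # read its one line of output into d
    close(cmd)          # close so the next line re-runs the command
    print d, $0
    fflush()
}'
```

Spawning a date process per line is slow for busy logs, but it does give each line its own timestamp.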
Upvotes: 2
Reputation: 106
Replace awk with sed, then use $(date -Ins) to add an ISO 8601 timestamp with nanosecond precision.
tail -f infile | grep -a --line-buffered "." | sed 's/^/'"$(date -Ins)\t"'/'
2014-01-18T17:24:08,110459605+1100 one
2014-01-18T17:24:08,110459605+1100 two
2014-01-18T17:24:08,110459605+1100 three
or use $(date --rfc-3339=ns) for an alternate format:
tail -f infile | grep -a --line-buffered "." | sed 's/^/'"$(date --rfc-3339=ns)\t"'/'
2014-01-18 17:24:51.985434198+11:00 one
2014-01-18 17:24:51.985434198+11:00 two
2014-01-18 17:24:51.985434198+11:00 three
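As the identical timestamps in both outputs show, the command substitution runs once before sed starts, so every line carries the same value. A sketch of a per-line alternative (using printf as a stand-in for tail -f infile) reads lines in a shell loop and calls date for each one:

```shell
# printf here stands in for `tail -f infile`
printf 'one\ntwo\nthree\n' | while IFS= read -r line; do
    printf '%s\t%s\n' "$(date -Ins)" "$line"   # date runs once per line
done
```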
Upvotes: 1
Reputation: 36262
You can replace the grep and the awk with perl and its core Time::HiRes
module, like:
tail -f infile | perl -MTime::HiRes=time -ne 'printf "%.3f\t%s", time(), $_'
It yields something like:
1390014680.197 one
1390014680.197 two
1390014680.197 three
1390014680.197 four
1390014680.197 five
1390014680.197 six
1390014689.414 seven
1390014693.542 eight
Upvotes: 3