mtmacdonald

Reputation: 15070

Replace column in CSV file with one-minute interval timestamps

I have a CSV file with a timestamp and some data values:

 1455840000,76.357,899.500,326.717,8.000
 1455840060,76.490,899.650,326.150,8.000
 ...etc

But I want to replace the timestamp column with new timestamp values. The last row should get the current time, and each earlier row should be one minute before the row that follows it.
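
For example, if the current time happened to be 1455840120 (an arbitrary value, just for illustration), the two rows above should become:

 1455840060,76.357,899.500,326.717,8.000
 1455840120,76.490,899.650,326.150,8.000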

How can I do this with a shell script, e.g. bash/awk?

Upvotes: 0

Views: 219

Answers (4)

James Brown

Reputation: 37404

This GNU awk script gets the current timestamp (epoch time) in the BEGIN block, remembers the record count of the file on the first pass, and updates the timestamps on the second pass:

$ awk -F, 'BEGIN{ts=strftime("%s")} NR==FNR{nr=NR; next}{$1=ts-(nr-FNR)*60} 1' file file
1455840000 76.357 899.500 326.717 8.000
1455840060 76.490 899.650 326.150 8.000

For compatibility with all awks (including GNU awk), replace the BEGIN{} block above with

BEGIN{"date +'%s'"|getline ts}

Upvotes: 0

Ed Morton

Reputation: 203349

This might be what you want:

$ cat file
1455840000,76.357,899.500,326.717,8.000
1455840060,76.490,899.650,326.150,8.000
1455840000,76.357,899.500,326.717,8.000
1455840060,76.490,899.650,326.150,8.000
1455840000,76.357,899.500,326.717,8.000
1455840060,76.490,899.650,326.150,8.000

With GNU awk:

$ awk 'BEGIN{FS=OFS=","; now=systime()} NR>FNR{$1 = now - (NR-2*FNR)*60; print}' file file
1475504973,76.357,899.500,326.717,8.000
1475505033,76.490,899.650,326.150,8.000
1475505093,76.357,899.500,326.717,8.000
1475505153,76.490,899.650,326.150,8.000
1475505213,76.357,899.500,326.717,8.000
1475505273,76.490,899.650,326.150,8.000

With other awks:

$ awk -v now=$(date +'%s') 'BEGIN{FS=OFS=","} NR>FNR{$1 = now - (NR-2*FNR)*60; print}' file file
1475504973,76.357,899.500,326.717,8.000
1475505033,76.490,899.650,326.150,8.000
1475505093,76.357,899.500,326.717,8.000
1475505153,76.490,899.650,326.150,8.000
1475505213,76.357,899.500,326.717,8.000
1475505273,76.490,899.650,326.150,8.000
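
The NR-2*FNR expression works because the file is read twice: on the second pass NR equals the total line count plus FNR, so NR-2*FNR is just the number of rows below the current one, and the last row gets now exactly. Here is a commented, multi-line sketch of the same GNU awk logic:

$ awk '
    BEGIN     { FS = OFS = ","; now = systime() }  # systime(): current epoch seconds (GNU awk)
    NR == FNR { next }                             # first pass: only advances NR
    {                                              # second pass: NR-2*FNR == (total rows) - FNR
      $1 = now - (NR - 2*FNR) * 60                 # last row gets "now", each earlier row 60s less
      print
    }
' file file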

Upvotes: 1

Aaron

Reputation: 24802

Here is how I would do it:

tac inputFile | awk -v ts="$(date +%s)" -v OFS=, -F, '{ $4 = strftime("%c", ts - NR * 60) ; print $0 }' | tac

tac reverses the input file so that we can compute the dates from the only known value, the current date. We reverse it back once every line has been processed.

awk's -v flag lets us pass in a variable, so we let bash compute the current timestamp and hand it to awk as a Unix timestamp (the number of seconds since 01/01/1970).

awk's -F flag specifies the column separator.

Then, on each line, the last column is replaced by the given timestamp minus 60 seconds per line previously read, which we display in a human-readable format with strftime.

Example:

$ cat inputFile
a,b,c,d
a1,b1,c1,d1
a2,b2,c2,d2

$ tac inputFile | awk -v ts="$(date +%s)" -v OFS=, -F, '{ $4 = strftime("%c", ts - NR * 60) ; print $0 }' | tac
a,b,c,lun.  3 oct. 2016 15:32:29
a1,b1,c1,lun.  3 oct. 2016 15:33:29
a2,b2,c2,lun.  3 oct. 2016 15:34:29
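
Adapted to the question's five-column file and keeping an epoch value in the first column, a sketch could be (file.csv is a placeholder name; (NR - 1) is used so the last row of the original file ends up with exactly the current time):

tac file.csv | awk -v ts="$(date +%s)" -F, -v OFS=, '{ $1 = ts - (NR - 1) * 60 ; print }' | tac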

Upvotes: 0

heemayl

Reputation: 42007

You can reverse the file content with tac, do the operation, and then reverse it back to the original order:

tac file.txt | \
    awk 'BEGIN{FS=OFS=","} NR==1{"date +%s"|getline cur; $1=cur; print; next}; \
              {$1=cur-(60*(NR-1)); print}' | tac
  • For the (reversed) first line, we get the current epoch timestamp and save it in the variable cur

  • For each subsequent line, we subtract 60 * (line number - 1) seconds from cur to get the required timestamp

Note that the time calculation might not be perfectly precise, as you can imagine.

Example:

% cat ts.txt 
1455840000,76.357,899.500,326.717,8.000
1455840060,76.490,899.650,326.150,8.000
1455840000,76.357,899.500,326.717,8.000
1455840060,76.490,899.650,326.150,8.000

% tac ts.txt | awk 'BEGIN{FS=OFS=","} NR==1{"date +%s"|getline cur; $1=cur; print; next}; {$1=cur-(60*(NR-1)); print}' | tac
1475497096,76.357,899.500,326.717,8.000
1475497156,76.490,899.650,326.150,8.000
1475497216,76.357,899.500,326.717,8.000
1475497276,76.490,899.650,326.150,8.000
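
If you would rather not call tac twice, the same idea can be written as a single POSIX awk program that buffers the rows and rewrites the timestamps in the END block (just a sketch of this alternative):

awk -v now="$(date +%s)" 'BEGIN{FS=OFS=","}
    {line[NR]=$0}                               # buffer every row
    END{for (i=1; i<=NR; i++) {                 # last row gets "now", earlier rows 60s less each
          $0=line[i]; $1=now-(NR-i)*60; print}}' ts.txt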

Upvotes: 1
