Reputation: 6350
I have a file whose size is approximately 1 GB, and it contains data in the format below:
A|CD|44123|0|0
B|CD|44124|0|0
C|CD|44125|0|0
D|CD|44126|0|0
E|CD|44127|0|0
F|CD|44128|0|0
J|CD|44129|0|0
I|CD|44130|0|0
In this file I have to replace the third column with a value that I get after a conversion. To do that I open the file, read it line by line, and replace the value. This process takes around 5 hours. Below is the code I am using:
cat $FILE_NAME |
while read REC
do
    DATE=`echo "$REC" | cut -d\| -f3`                  # extract the third column
    DATE_NEW=`$UTIL $DATE | head -1 | cut -d" " -f12`  # convert it with the external utility
    RECORD="$DATE_NEW,"
    echo "$RECORD" >> $New_File
done
Is there a way to make this better and faster?
The desired output will look like the following, where the DATE_NEW value is placed in each record's third column. DATE_NEW is the converted value that I get from this:
DATE_NEW=`$UTIL $DATE | head -1 | cut -d" " -f12`
A|CD|10/20/2020|0|0
B|CD|10/25/2020|0|0
C|CD|10/25/2020|0|0
D|CD|10/25/2020|0|0
E|CD|11/15/2020|0|0
F|CD|11/14/2020|0|0
J|CD|11/16/2020|0|0
I|CD|11/17/2020|0|0
After the comment from @Sundeep pointing to Why is using a shell loop to process text considered bad practice?, I rewrote the logic in Perl, and the processing time dropped from 5-7 hours to 99 seconds.
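The actual Perl code isn't shown here, but a minimal sketch of that kind of rewrite, assuming the same conversion utility and field positions as in the shell loop above, could look like this (the per-date cache is an extra assumption; most of the speedup comes from no longer spawning several processes per input line):
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: the utility path is passed as the first argument and is
# assumed to behave like $UTIL above (converted date = 12th space-separated
# field of the first output line).
my $util = shift @ARGV or die "usage: $0 /path/to/util < infile > outfile\n";
my %cache;    # run the conversion once per distinct date, not once per line

while (my $line = <STDIN>) {
    chomp $line;
    my @f = split /\|/, $line, -1;          # -1 keeps trailing empty fields
    my $old = $f[2];
    unless (exists $cache{$old}) {
        my ($first) = qx{$util $old};       # first line of the utility's output
        my @words = split ' ', ($first // '');
        $cache{$old} = $words[11] // '';    # field 12, as in cut -d" " -f12
    }
    $f[2] = $cache{$old};
    print join('|', @f), "\n";
}
Saved as, say, convert.pl (a hypothetical name), it could be run as: perl convert.pl "$UTIL" < old_file > new_file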
Upvotes: 0
Views: 69
Reputation: 195229
Give this a try:
awk -v cmd="Cmd2GetNEWDATE" 'BEGIN{FS=OFS="|"}{c=cmd" "$3;c|getline v;close(c)}$3=v' file
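Here Cmd2GetNEWDATE is a placeholder for a command that prints the converted date when given the old one, for example a small wrapper around the $UTIL | head -1 | cut -d" " -f12 pipeline from the question. For every record awk builds that command with the third field as its argument, reads the first line of its output into v, closes the command, and assigns v to $3; the non-empty assignment also acts as a true condition, so the modified record is printed with | kept as the output separator.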
Upvotes: 1