Felipe

Reputation: 49

Database backup - Split a big text file by table name

I have a database backup; it's a big text file that follows this pattern:

TABLE_USER£John£32£Testing
TABLE_CAR£Ford£Ford T
TABLE_ADDRESS£123£Something£another thing£ABC
TABLE_USER£Paul£40£hello
TABLE_ADDRESS£59£Street ABC£Brazil£test

The fields are delimited by '£', and the first field of each record holds the table name. I want to split the file into one file per table, based on that first field (removing the table name):

TABLE_USER.TXT

John£32£Testing
Paul£40£hello

TABLE_ADDRESS.TXT

123£Something£another thing£ABC
59£Street ABC£Brazil£test

I would like to do the split with a shell script on Linux. Can someone help me? Afterwards I will import each file into a Postgres database using the 'copy' command.
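For the later import step, a hedged sketch (the target table name table_user and file names are assumptions): note that Postgres COPY requires a single-byte delimiter, so in a UTF-8 database '£' cannot be used directly; converting it to a tab first sidesteps that restriction.

```shell
# Sample split file as described in the question (hypothetical content)
printf '%s\n' 'John£32£Testing' 'Paul£40£hello' > TABLE_USER.TXT

# Convert the multi-byte '£' delimiter to a tab, which COPY accepts
sed $'s/£/\t/g' TABLE_USER.TXT > table_user.tsv

# Then, inside psql (not run here; table_user is a hypothetical table):
#   \copy table_user FROM 'table_user.tsv'
cat table_user.tsv
```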

Upvotes: 0

Views: 68

Answers (2)

SLePort

Reputation: 15461

Try this:

while IFS='£' read -r table data; do
  echo "$data" >> "$table".TXT
done < file
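A quick self-contained run of this loop against the sample data from the question: with IFS='£' and two variables, read puts the table name in $table and the rest of the line, delimiters included, in $data.

```shell
# Recreate the sample backup from the question
cat > file <<'EOF'
TABLE_USER£John£32£Testing
TABLE_CAR£Ford£Ford T
TABLE_ADDRESS£123£Something£another thing£ABC
TABLE_USER£Paul£40£hello
TABLE_ADDRESS£59£Street ABC£Brazil£test
EOF

# Split at the first '£': table name in $table, remainder in $data
while IFS='£' read -r table data; do
  echo "$data" >> "$table".TXT
done < file

cat TABLE_USER.TXT
```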

Upvotes: 1

Chet

Reputation: 1225

awk -F'£' '{print substr($0, length($1 FS) + 1) > ($1 ".txt")}' file
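The same one-liner run against the sample data: substr($0, length($1 FS) + 1) drops the table-name prefix plus one delimiter, and each record is appended to a file named after the first field (note the lower-case .txt extension here, unlike the .TXT in the question).

```shell
# Recreate the sample backup from the question
cat > file <<'EOF'
TABLE_USER£John£32£Testing
TABLE_CAR£Ford£Ford T
TABLE_ADDRESS£123£Something£another thing£ABC
TABLE_USER£Paul£40£hello
TABLE_ADDRESS£59£Street ABC£Brazil£test
EOF

# length($1 FS) is the table name plus one delimiter; substr strips it
awk -F'£' '{print substr($0, length($1 FS) + 1) > ($1 ".txt")}' file

cat TABLE_ADDRESS.txt
```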

Upvotes: 1
