Piotr Osipa

Reputation: 65

How to split a big SQL dump file into small chunks and keep each record in its original file despite later deletions of other records

Here's what I want to do (MySQL example):

  1. dumping only the structure - structure.sql
  2. dumping all table data - data.sql
  3. splitting data.sql and putting each table's data into a separate file - table1.sql, table2.sql, table3.sql ... tablen.sql (a rough sketch of steps 1-3 follows this list)
  4. splitting each table into smaller files (1k lines per file)
  5. committing all files to my local git repository
  6. copying the whole directory out to a remote secure server
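For reference, steps 1-3 might look roughly like this; the database name mydb and credentials coming from ~/.my.cnf are assumptions, and --skip-extended-insert keeps one INSERT per line so later diffs stay line-based:

# 1. structure only
mysqldump --no-data mydb > structure.sql

# 2. all data, one INSERT statement per row
mysqldump --no-create-info --skip-extended-insert mydb > data.sql

# 3. one data file per table
for t in $(mysql -N -e 'SHOW TABLES' mydb); do
    mysqldump --no-create-info --skip-extended-insert mydb "$t" > "$t.sql"
done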

I have a problem with step #4.

For instance, I split table1.sql into 3 files: table1_a.sql, table1_b.sql and table1_c.sql. If a new dump contains new records, that is fine - they just get added to table1_b.sql.

But if records that were in table1_a.sql get deleted, all the following records shift, and git will treat table1_b.sql and table1_c.sql as changed - and that's not OK.

Basically, it defeats the whole idea of keeping an SQL backup in SCM.

My question: how do I split a big SQL dump file into small chunks and keep each record in its original file despite later deletions of other records?

Upvotes: 2

Views: 4263

Answers (2)

Cubiczx

Reputation: 1145

To split an SQL dump into files of 5000 lines each, execute in your terminal:

$ split -l 5000 hit_2017-09-28_20-07-25.sql dbpart-
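The suffixes split generates (dbpart-aa, dbpart-ab, ...) sort in order, so the original dump can be rebuilt with something like the following (restored.sql is just an example name):

$ cat dbpart-* > restored.sql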

Upvotes: 2

Erwin Smout

Reputation: 18408

Don't split them at all. Or split them by ranges of PK values. Or split them right down to one DB row per file (and name each file after the table name plus the content of the primary key).

(That's apart from the even more obvious XY-problem answer, which was my instinctive reaction.)
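If the per-table dumps are written with one INSERT per row (mysqldump's --skip-extended-insert), the PK-range idea can be scripted. A rough awk sketch - the table name, the 1000-id bucket size, the output file naming, and the assumption that the primary key is the first, numeric column of each row are all assumptions, not anything from the question:

awk '/^INSERT INTO/ {
    pk = $0
    sub(/^[^(]*\(/, "", pk)      # drop everything up to the first "("
    sub(/[^0-9].*$/, "", pk)     # keep only the leading numeric primary key
    f = sprintf("table1_%06d.sql", int(pk / 1000))
    print > f                    # each row lands in a file chosen by its PK range
}' table1.sql

Because each row's destination depends only on its own primary key, deleting other rows no longer shifts records between files, so git only sees changes in the chunks that actually changed.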

Upvotes: 2
