Reputation: 4007
I have a huge MySQL dump I need to import. I managed to split the 3 GB file into per-table INSERTs, but one of those tables' inserts is 600 MB, and I want to split it into 100 MB files. So my question is: is there a script or an easy way to split a 600 MB INSERT statement into multiple 100 MB INSERTs without having to open the file (as opening it kills my PC)?
I tried SQLDumpSplitter, but it does not help.
Here is the reason I cannot just run the 600 MB file:
MySQL import response 'Killed'
Please help
Upvotes: 1
Views: 2751
Reputation: 116068
On Linux, the easiest way to split files is `split -l N` - it splits a file into pieces of N lines each.
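A minimal sketch of that approach (the file names here are examples, not yours): note that `split` cuts on line boundaries, so this only produces valid SQL if each INSERT statement ends with a newline - dumps made with `mysqldump --skip-extended-insert` keep one row per line and split cleanly.

```shell
# Split dump.sql into pieces of 10000 lines each, named chunk_aa, chunk_ab, ...
# Adjust -l until the pieces come out near the 100 MB you want.
split -l 10000 dump.sql chunk_
```

If your `split` is from GNU coreutils, `split -C 100m dump.sql chunk_` is an alternative worth checking: it targets a maximum size per piece while still breaking only at line boundaries.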
On Windows, I've had pretty good luck with HxD - it works well with huge files.
Upvotes: 2
Reputation: 29051
You can easily open a 1 GB file in TextPad. Use this software to open the file and split your queries as you want.
Link for downloading the software: TextPad
Upvotes: 1