What is the command to split a large file into files of 10,000 lines each?

I've got a file named MovieID_NameID_Roles.txt that is 1,767,605 KB (roughly 1.7 GB).

I need to loop through it to parse and then populate a Database Table.
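(For context, once the file is split, the rough plan is a loop along these lines; import_roles.py is just a placeholder name for whatever parser/loader I end up writing, and MySlice is the output prefix I use further down:)

for piece in MySlice*; do
    python import_roles.py "$piece"   # placeholder: parse one 10,000-line slice and insert its rows
done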

Wanting to work with several small files rather than that one humongous one, I found this answer to a question on how to split large text files.

I went by the accepted answer, which says:

into files with 10000 lines each: split myLargeFile.txt -l 10000

...but down at the bottom of its second screenshot was what appeared to me to be a "fancier" version of this command, with some niceties thrown in:

split MovieID_NameID_Roles.txt MySlice -1 10000 -a 5 -d

So I downloaded and installed Git Bash and ran this in it:

split MovieID_NameID_Roles.txt MySlice -1 10000 -a 5 -d

But rather than splitting my very large file into pieces of 10,000 lines each, as I was expecting (or at least hoping), it generated files named 1000000000 through 1000099999, each only 1 KB in size, and then the splitting stopped with the error message "output file suffixes exhausted".


So what is the command I should use to split my file into smaller files of 10,000 lines each?

Upvotes: 0

Views: 4309

Answers (1)

user14239181

Looks like you are using "-1" (the digit one) instead of "-l" (the letter L), which causes split to generate files of a single line each. With only one line per piece, the -a 5 suffix range (100,000 files) is used up long before the end of a file that size, which is why you hit "output file suffixes exhausted".

The command should be:

$ split MovieID_NameID_Roles.txt MySlice -l 10000 -a 5 -d
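With -d and -a 5, split names the pieces MySlice00000, MySlice00001, and so on. A quick sanity check afterwards (the exact piece count depends on how many lines your file actually has):

$ ls MySlice* | wc -l     # number of pieces created
$ wc -l MySlice00000      # should report 10000 (true for every piece except possibly the last)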

Upvotes: 3
