Reputation: 353
I split a larger 42-million-line file into chunks of 100,000 lines each and put them in /dev/shm/split/. Now I need to split those chunks into even smaller files of 1,000 lines each. Here is what I have tried:
#!/bin/sh
for f in /dev/shm/split/file.txt.* ;
do
find /dev/shm/split/ -type f -name $f -exec split -l 1000 {} /dev/shm/split/file1.txt. \;
done ;
echo "Split complete."
#!/bin/sh
for f in /dev/shm/split/* ;
do
split -l 1000 {} /dev/shm/split/file1.txt. ;
done ;
echo "Split complete."
#!/bin/sh
while read file in * ;
do
split -l 1000 $file file1.txt.
done ;
echo "Split complete."
Attempt 1 ran only one file through the split command. Attempt 2 did not produce anything. Attempt 3 seemed to start processing (it moved the cursor to the next line), but when I checked using:
wc -l /dev/shm/split/file1.*
...there were still no results after about 3 minutes.
Please help me. Thank you so much!
Upvotes: 2
Views: 46
Reputation: 8412
You can achieve this with:
find /dev/shm/split/ -type f -exec split -l 1000 {} {} \;
No loops required
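A small self-contained sketch of this approach (the /tmp/demo paths are samples standing in for /dev/shm/split, not the asker's real directory). It writes the pieces to a separate output directory, which also avoids any chance of find re-matching files that split just created:

```shell
# Set up a sample input: one 2500-line chunk, standing in for the
# asker's 100,000-line chunks in /dev/shm/split/.
mkdir -p /tmp/demo/in /tmp/demo/out
seq 1 2500 > /tmp/demo/in/file.txt.aa

# For every file found, run split, using the input's own name as the
# output prefix (the same idea as the {} {} one-liner above).
find /tmp/demo/in -type f \
    -exec sh -c 'split -l 1000 "$1" "/tmp/demo/out/$(basename "$1")."' _ {} \;
```

split appends two-letter suffixes (aa, ab, ac, ...) to the given prefix, so a 2500-line input named file.txt.aa yields file.txt.aa.aa (1000 lines), file.txt.aa.ab (1000 lines), and file.txt.aa.ac (500 lines).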
Upvotes: 0
Reputation: 1933
Your #3 might work with a few changes:
ls file.txt.* | while read file
do
echo "Splitting file $file"
split -l 1000 "$file" "$file."
done
echo "Split complete."
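The same loop can be written with a shell glob instead of piping ls, which is safer for filenames containing whitespace. A runnable sketch (the /tmp/demo2 sample directory stands in for /dev/shm/split):

```shell
# Sample input: one 1500-line chunk named like the asker's files.
mkdir -p /tmp/demo2
seq 1 1500 > /tmp/demo2/file.txt.aa
cd /tmp/demo2

# The ?? glob matches only the original two-letter split suffixes,
# and it expands once, before the loop body runs.
for file in file.txt.??; do
    echo "Splitting file $file"
    split -l 1000 "$file" "$file."
done
echo "Split complete."
```

Quoting "$file" matters: an unquoted variable would be word-split by the shell if a filename contained spaces.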
Upvotes: 2