ACEnglish

Reputation: 115

mac unix script problem

I'm trying to write a script that breaks up a VERY large file into smaller pieces that are then sent to a script that runs in the background. The motivation is that if the processing script runs in the background, the pieces can be handled in parallel.

Here is my code. ./seq works just like the normal seq command (which the Mac doesn't have), and $1 is the huge file to be split.

echo "Splitting and Running Script"

for i in $(./seq 0 14000000 500000)
do
   awk ' { if (NR>='$i' && NR<'$(($i+500000))') { print $0 > "xPart'$i'" }  }' $1 
   python FastQ2Seq.py xPart$i &
done

wait

echo "Concatenating"

for k in *.out.seq
do
cat $k >> original.seq
done

for j in *.out.qul
do
cat $j >> original.qul
done

echo "Cleaning"
rm xPart*

My problem is that only xPart0 is made, and it has only 499995 lines in it before the program hangs. I put some debugging echoes in the script, and I know the awk statement is where the script stops. I just can't figure out what's going wrong.

Upvotes: 0

Views: 172

Answers (3)

R Samuel Klatchko

Reputation: 76531

If your seq truly works like the standard seq, you're calling it wrong. The proper command line for seq is:

seq FIRST INCREMENT LAST

So you would need to change your seq commandline to:

seq 0 500000 14000000
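A quick way to see the difference, using a smaller LAST value so the output stays short:

```shell
# With the arguments in FIRST INCREMENT LAST order, seq counts up in steps:
seq 0 500000 1500000
# prints: 0 500000 1000000 1500000 (one value per line)

# With the original ordering (seq 0 14000000 500000), the increment
# (14000000) overshoots the last value (500000) on the first step, so seq
# emits only "0" -- which is exactly why only xPart0 is ever created.
```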

Upvotes: 0

ghostdog74

Reputation: 342303

echo "Splitting and Running Script"
# splits into smaller files of 500000 lines each, if I understand your problem correctly
awk 'NR%500000==1{++c}{print $0 > "xPart"c".txt"}' file
# or use: split -l 500000
for file in xPart*
do
    python FastQ2Seq.py "$file" &
done
wait   # let the background FastQ2Seq.py jobs finish before concatenating
echo "Concatenating"
cat *.out.seq >> original.seq
cat *.out.qul >> original.qul
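A scaled-down check of the awk chunking idiom above, with small numbers and throwaway file names so it is easy to verify by hand:

```shell
# Ten input lines, three per chunk; the real script uses 500000-line chunks.
printf 'line %s\n' 1 2 3 4 5 6 7 8 9 10 > input.txt
awk 'NR%3==1{++c}{print $0 > "xPart"c".txt"}' input.txt
wc -l xPart1.txt   # 3 lines
wc -l xPart4.txt   # 1 line -- the last, partial chunk
```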

Upvotes: 0

Steven Schlansker

Reputation: 38526

Check out the split command --

  split -- split a file into pieces

  Output  fixed-size  pieces of INPUT to PREFIXaa, PREFIXab, ...; default
  size is 1000 lines, and default PREFIX is `x'.  With no INPUT, or  when
  INPUT is -, read standard input.

Should be much faster, reliable, and cleaner than running awk in a loop!
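A minimal sketch of that approach, scaled down so it runs quickly; the real invocation would be `split -l 500000 "$1" xPart`, followed by the FastQ2Seq.py loop:

```shell
# Stand-in for the huge input file.
printf 'line %s\n' 1 2 3 4 5 6 7 8 9 10 > big.txt
# Cut it into fixed-size pieces named xPartaa, xPartab, ...
split -l 3 big.txt xPart
for f in xPart??
do
    # Here the real script would run: python FastQ2Seq.py "$f" &
    wc -l "$f"
done
wait
```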

Upvotes: 1
