Joe Foley

Reputation: 168

Need to move a limited number of files on macOS

I have a requirement to move files from my local MacBook to a remote server using sftp.

I need to take about 6000 files and move them to the remote server for processing. Due to limitations on the processing system, I can only process a maximum of 100 files at a time. The processing system will bomb if there are more than 100 files in the folder it monitors.

I'd like to set up a script that will run via crontab every X minutes and move 100 files from the folder on the Mac that contains the 6,500 files to a 'staging' folder. Then a second script would pick up and upload the contents of the 'staging' folder to the sftp folder.

I've got the crontab working fine; however, I can't figure out how to limit the number of files I move to 100.

Here's what I've done thus far. Maybe I'm doing this completely wrong, so any suggestions would be appreciated!

#!/bin/bash
cd /Users/Me/Downloads/test
# Get files from unprocessed where the 6k files are located.

The 'ls' command below returns a '-bash: /bin/ls: Argument list too long' error:

ls unprocessed/*.pdf | head -n 99 > flist
while read f
do
  mv "$f" .
done < flist

This script would then upload the files to the sftp server:

./exp.sh

And this would move the files to a separate folder once the upload completes:

for f in *PAY*.pdf
do
  mv "$f" processed/
done

Any help would be appreciated!

Upvotes: 0

Views: 61

Answers (1)

John Zwinck

Reputation: 249552

Your ls command is failing because the shell expands your file pattern to so many matching files that their names in aggregate exceed the limit of a single command line. But do not despair, because you can simply replace your use of ls with find:

find unprocessed -name '*.pdf' | head -n 99 > flist
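
For completeness, here is a minimal sketch of the whole batch-move step built around that find call. It assumes the layout from your script (an unprocessed/ folder under /Users/Me/Downloads/test, with the current directory acting as the staging folder) and PDF names without embedded newlines:

#!/bin/bash
# Sketch only: move up to 100 unprocessed PDFs into the staging area.
cd /Users/Me/Downloads/test || exit 1

# find prints the matching paths itself, so nothing hits the
# command-line length limit that broke the ls version.
find unprocessed -name '*.pdf' | head -n 100 > flist

# Move each listed file into the current (staging) directory.
while IFS= read -r f
do
  mv "$f" .
done < flist

Run from crontab, each pass moves at most 100 files, and your exp.sh can then upload whatever is sitting in the staging folder.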

Upvotes: 1
