Reputation: 105
I am trying to use this command:
perl -i -p -e "s/\r//" ./{folderName}/*.txt
The problem is I have a lot of .txt files within this folder and I guess the command cannot handle it... Is there something else I can try or do to fix this?
The error I get is: "Argument list too long"
I've used this command successfully with smaller folder sizes (less .txt files) and it works fine.
Upvotes: 2
Views: 343
Reputation: 144
There is a limit on the combined size of the arguments (and environment) passed to a new process. You can find it by running the command below:
getconf ARG_MAX
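To get a rough sense of how many bytes your expanded glob would occupy on the command line (a sketch using the shell's printf builtin, which is not itself subject to the limit; the real accounting also counts the environment):

printf '%s ' ./{folderName}/*.txt | wc -c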
If the expanded file list exceeds that limit, you can use another method such as a for loop or a pipe. For example:
for txt in ./{folderName}/*.txt ; do perl -i -p -e "s/\r//" "$txt"; done
or using a pipe:
echo ./{folderName}/*.txt | xargs perl -i -p -e "s/\r//"
Explanation for echo, since @ikegami asked why echo itself won't fail with "Argument list too long":
echo is a shell builtin, so the shell doesn't pass its arguments to a new process, and the limit only applies when a new process is started. xargs then splits the list into batches that fit.
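Note that xargs splits its input on whitespace and interprets quotes by default, so the echo form breaks on file names containing spaces or quotes. A sketch of a NUL-delimited variant, assuming your shell's printf builtin and an xargs that supports -0 (GNU and BSD do):

printf '%s\0' ./{folderName}/*.txt | xargs -0 perl -i -p -e "s/\r//"

printf is a builtin like echo, so it also avoids the argument-list limit, and the NUL delimiters let xargs reconstruct each file name exactly.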
Upvotes: 4
Reputation: 385506
GNU:
find {folder_name} -maxdepth 1 -name '*.txt' -exec perl -i -pe's/\r//' {} +
More portable:
find {folder_name} -maxdepth 1 -name '*.txt' -print | xargs perl -i -pe's/\r//'
echo {folder_name}/*.txt | xargs perl -i -pe's/\r//'
find {folder_name} -maxdepth 1 -name '*.txt' -exec perl -i -pe's/\r//' {} \;
Notes

- None of the solutions support a {folder_name} that starts with -.
- The first two portable solutions don't support file names containing certain special characters (such as whitespace, quotes, or backslashes, which xargs interprets by default). But it's unlikely that you'll find those characters in file names in the first place.
- The GNU solution and the first two portable solutions pass as many file names to perl as possible (minimizing the number of times perl is launched), while the third portable solution launches a perl process for every file.
- echo is a shell builtin, so it's not subject to the system's limit on the size of the argument list passed to external utilities.
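If GNU or BSD find and xargs are available, a NUL-delimited pipeline (a sketch; -print0 and -0 are extensions, not POSIX) handles file names with any special characters while still batching arguments:

find {folder_name} -maxdepth 1 -name '*.txt' -print0 | xargs -0 perl -i -pe's/\r//'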
Upvotes: 3