Reputation: 43
How can I get the first line of EVERY file in a directory and save them all in a new file?
#!/bin/bash
rm FIRSTLINE
for file in "$(find $1 -type f)";
do
head -1 $file >> FIRSTLINE
done
cat FIRSTLINE
This is my bash script, but when I do this and I open the file FIRSTLINE, then I see this:
==> 'path of the file' <==
'first line' of the file
and this for all the files in my argument.
Does anybody have a solution?
Upvotes: 1
Views: 3801
Reputation: 1143
For gzip files, for instance (using a glob instead of parsing ls, and quoting the variable):
for file in *.gz; do gzcat "$file" | head -n 1; done > toto.txt
Upvotes: 0
Reputation: 1485
$ for file in $(find $1 -type f); do echo '';
echo "$file";
head -n 4 "$file";
done
Upvotes: 0
Reputation: 171263
The problem is that you've quoted the output of find, so it gets treated as a single string and the for loop runs only once, with a single argument containing all the files. That means you run head -1 file1 file2 file3 file4 ... and, when given multiple files, head prints the ==> file1 <== headers.
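That multi-file behaviour is easy to reproduce on its own (f1 and f2 here are throwaway names for the demo):

```shell
# work in a scratch directory so we don't clobber anything
cd "$(mktemp -d)"
printf 'alpha\n' > f1
printf 'beta\n'  > f2

# head with a single file prints only the line; with several files
# it adds a "==> name <==" header before each file's output
head -1 f1
head -1 f1 f2
```

The second command is exactly what the quoted command substitution causes: one head invocation with every filename as an argument.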
To fix it, remove the double quotes around the find command substitution, so the for loop runs once for each file, as intended. The semicolon after the command substitution is also unnecessary.
#!/bin/bash
rm FIRSTLINE
for file in $(find $1 -type f)
do
head -1 $file >> FIRSTLINE
done
cat FIRSTLINE
This has some style issues, though: do you really need to write to a file and then cat that file to stdout? You could just print the output to stdout directly:
#!/bin/bash
for file in $(find $1 -type f)
do
head -1 $file
done
Personally I'd write it like this:
find $1 -type f | xargs -L1 head -1
or if you need the output in the file and printed to stdout:
find $1 -type f | xargs -L1 head -1 | tee FIRSTLINE
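One caveat with all the word-splitting variants above: filenames containing spaces break. A sketch of a more robust version, assuming GNU find and xargs (-print0/-0 pass names NUL-delimited, and -n1 runs head once per file so no headers appear):

```shell
#!/bin/bash
# default to the current directory if no argument is given (an assumption
# for this sketch; the original script requires $1)
dir=${1:-.}

# NUL-delimited pipeline: safe for filenames with spaces or newlines
find "$dir" -type f -print0 | xargs -0 -n1 head -1 | tee FIRSTLINE
```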
Upvotes: 3
Reputation: 51603
find . -type f -exec head -1 \{\} \; > YOURFILE
might work for you.
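The \; terminator matters here: it makes find run head once per file. With the + terminator, find would batch many filenames into a single head invocation, which brings back the ==> file <== headers the question is trying to avoid:

```shell
# \; runs head once per file, so the output is just the first lines,
# with no "==> file <==" headers between them
find . -type f -exec head -1 {} \; > YOURFILE
```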
Upvotes: 5