Alex

Reputation: 1420

Recursively get all the files and dirs they are located in

I'm trying to write a script that will fetch all directories and the files contained in those directories, and log the data to a .CSV file.

So, if I were to have a structure like: mainDir.dir -> [sub1.dir -> file01.png, sub2.dir -> file02.png], I would get a CSV such as:

    sub1.dir/;file01.png;
    sub2.dir/;file02.png;

This is the script I currently have

for dir in */ .*/ ;
do
    for entry in "$dir"*          # iterate the files inside $dir, not $dir itself
    do
        file="${entry#"$dir"}"    # strip the directory prefix to get the bare filename
        echo "$dir;$file;" >> file.csv
    done

done

Upvotes: 0

Views: 107

Answers (2)

Tyler Marshall

Reputation: 488

find is useful for processing many files recursively.

Command

find . -type f -execdir sh -c "pwd | tr -d '\n' >> ~/my_file.csv; echo -n ';' >> ~/my_file.csv; echo {} | sed -e 's/^\.\///' >> ~/my_file.csv" \;

Note: make sure you do not give a relative path to the output CSV file. execdir changes the working directory (and that is what makes pwd work).

Breakdown

find . -type f find all files recursively starting here

-execdir sh -c "pwd | tr -d '\n' >> ~/my_file.csv; echo -n ';' >> ~/my_file.csv; For each file, run pwd in the file's own directory, strip the trailing newline, and append the directory name to the output. Then append a semicolon, again without a newline.

echo {} | sed -e 's/^\.\///' >> ~/my_file.csv" \; Append filename to output. This time, leave newline, but by default find will place the ./ in front of the filename. The sed here removes it.
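If your find is GNU find, a shorter variant is possible with the (non-POSIX) -printf action, which can emit the directory part and the basename directly and avoids the subshell per file. A sketch, using a hypothetical demo/ tree for illustration:

    # Build a small test tree (names are made up for the example)
    mkdir -p demo/sub1 demo/sub2
    touch demo/sub1/file01.png demo/sub2/file02.png

    # GNU find only: %h = directory part of the path, %f = basename
    find demo -type f -printf '%h;%f\n' > my_file.csv

    cat my_file.csv

Unlike the -execdir version, this prints paths relative to the starting point rather than absolute directories; add -mindepth 1 or adjust the starting path if you need different output.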

Upvotes: 1

msg

Reputation: 8171

If you don't need to go more than one level deep, this seems to work:

for i in **/*; do echo "$i" | tr / \; ; done >> file.csv
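Note that in bash, ** only matches recursively when the globstar shell option is enabled (bash 4+); otherwise **/* behaves like */* and stops one level deep. A sketch with globstar on, using a hypothetical top/ tree for illustration:

    # Build a small test tree (names are made up for the example)
    mkdir -p top/sub1 top/sub2
    touch top/sub1/a.png top/sub2/b.png

    shopt -s globstar                # make ** descend into subdirectories (bash 4+)
    for i in top/**/*; do
        [ -f "$i" ] || continue      # skip the directory entries themselves
        echo "$i" | tr / \;          # turn path separators into CSV semicolons
    done > file.csv

    cat file.csv

Each file then becomes one row with every path component as a field, e.g. top;sub1;a.png.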

Upvotes: 0
