Bash script for downloading multiple files from a list

Hope you are all doing well. I was wondering if you could write a bash script for me.

The caveat is that I want to download videos, so it would need to use youtube-dl. The list looks like this:

TOPIC1
HTTP:....
HTTP:...

TOPIC2
HTTP:...
HTTP:...

TOPIC3
HTTP:...
HTTP:...

Etc, etc.

I have all of these saved in a text file, and I would like a bash script that creates a directory named after each TOPIC, downloads each of the URLs listed under that TOPIC into it, and then moves on to the next TOPIC and does the same.

Does that make sense? I have very, very minimal experience with bash.

I could easily download just one using the youtube-dl command, but having to do that one at a time would be very cumbersome.
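Right now I run something like this once per video (the URL here is just a placeholder):

youtube-dl "https://..."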

Cheers

Upvotes: 0

Views: 4086

Answers (2)

jared_mamrot

Reputation: 26515

To create an executable bash script that solves your problem, paste this block of code into a file called 'download.sh' in the same directory as your 'links.txt' file:

#!/bin/bash

# Every line in links.txt that is not a URL is treated as a topic name
# (-i makes the match case-insensitive, so HTTP lines are skipped too)
topics=($(grep -vi "http" links.txt))

for ((i = 0; i < ${#topics[@]}; i++))
do
    mkdir -p "${topics[$i]}"
    cd "${topics[$i]}" || exit 1
    if ((i + 1 < ${#topics[@]})); then
        # Print everything from this topic's heading to the next one,
        # then trim the two heading lines so only URLs remain
        sed -n "/^${topics[$i]}/,/^${topics[$i+1]}/p" ../links.txt |
            sed '1d;$d' > to_be_downloaded.txt
    else
        # Last topic: there is no next heading, so read to the end of the file
        sed -n "/^${topics[$i]}/,\$p" ../links.txt |
            sed '1d' > to_be_downloaded.txt
    fi
    youtube-dl -a to_be_downloaded.txt
    rm to_be_downloaded.txt
    cd ..
done

Save the script, then change the permissions of the script to 'executable' by pasting this into the terminal:

chmod +x download.sh

And then execute the script from the terminal:

./download.sh
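When it finishes you should end up with one directory per topic, each holding the videos for that topic, e.g. for the sample list above something like:

download.sh
links.txt
TOPIC1/
TOPIC2/
TOPIC3/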

Unlike the shorter version in my other answer, the else branch here handles the last topic in your links.txt file by reading through to the end of the list, so no urls should need to be downloaded manually. Let me know if you have any issues.

Upvotes: 1

jared_mamrot

Reputation: 26515

This solution relies on splitting your text file into per-topic chunks and then passing each chunk of URLs to youtube-dl using the '-a' option:

-a, --batch-file FILE File containing URLs to download ('-' for stdin), one URL per line. Lines starting with '#', ';' or ']' are considered as comments and ignored.
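To see how the chunking works, here is what the two sed commands produce for the first topic in the sample layout from the question (assuming the list is saved as 'file', as in the script below):

$ sed -n '/^TOPIC1/,/^TOPIC2/p' file
TOPIC1
HTTP:....
HTTP:...

TOPIC2
$ sed -n '/^TOPIC1/,/^TOPIC2/p' file | sed '1d;$d'
HTTP:....
HTTP:...

The first sed prints the block from one heading to the next (inclusive), and the second deletes the first and last lines of that block, i.e. the two headings. Any leftover blank line is harmless, because youtube-dl skips empty lines in a batch file.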

topics=($(grep -vi "http" file))
for ((i = 0; i < ${#topics[@]}; i++))
do
    mkdir -p "${topics[$i]}"
    cd "${topics[$i]}" || exit 1
    # Print the lines from this topic's heading to the next one,
    # then trim the two heading lines so only URLs remain
    sed -n "/^${topics[$i]}/,/^${topics[$i+1]}/p" ../file |
        sed '1d;$d' > to_be_downloaded.txt
    youtube-dl -a to_be_downloaded.txt
    rm to_be_downloaded.txt
    cd ..
done

This will fail on the last topic in "file" (your list), because there is no following heading to close the sed range: download those last urls manually, or use the version in my other answer, which handles the final topic. It should work as requested for all the other topics.

Upvotes: 1
