Reputation: 341
I am new to R and would like to seek some advice.
I am trying to download multiple URL links (PDF format, not HTML) and save them as PDF files using R.
The links I have are stored as character strings (taken from the HTML code of the website).
I tried using the download.file() function, but it takes a single URL (written in the R script) and therefore downloads only one file per call. Since I have many URL links, I would appreciate help with downloading all of them.
Thank you.
Upvotes: 12
Views: 16101
Reputation: 901
I believe what you are trying to do is download a list of URLs. You could try something like this approach: store the links in a vector using c(), e.g.:

urls <- c("http://link1", "http://link2", "http://link3")

then loop over the vector:

for (url in urls) {
  download.file(url, destfile = basename(url))
}
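If some of the links might be dead or temporarily unreachable, one failed download will stop the whole loop. A variant wrapped in tryCatch (the urls below are placeholders, not real links) skips failures and keeps going:

```r
urls <- c("http://link1", "http://link2", "http://link3")  # placeholder links

for (url in urls) {
  tryCatch(
    # mode = "wb" writes the file in binary mode, which PDFs need on Windows
    download.file(url, destfile = basename(url), mode = "wb"),
    error = function(e) message("Skipping ", url, ": ", conditionMessage(e))
  )
}
```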
If you're using Linux/Mac and downloading over https, you may need to specify the method and extra arguments for download.file (here -k tells curl to skip certificate verification):

download.file(url, destfile = basename(url), method = "curl", extra = "-k")
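As a side note, on R >= 3.2.0 download.file with method = "libcurl" accepts character vectors for both url and destfile, so the loop can be replaced by a single call (the links below are placeholders):

```r
urls  <- c("https://link1", "https://link2")  # placeholder links
dests <- basename(urls)

# With method = "libcurl", url and destfile may be vectors of the same length,
# and the files are downloaded in one call
download.file(urls, destfile = dests, method = "libcurl")
```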
If you want, you can test my proof of concept here: https://gist.github.com/erickthered/7664ec514b0e820a64c8
Hope it helps!
Upvotes: 11
Reputation: 51
url <- c('https://cran.r-project.org/doc/manuals/r-release/R-data.pdf',
         'https://cran.r-project.org/doc/manuals/r-release/R-exts.pdf',
         'http://kenbenoit.net/pdfs/text_analysis_in_R.pdf')

names <- c('manual1.pdf',
           'manual2.pdf',
           'manual3.pdf')

for (i in seq_along(url)) {
  # mode = 'wb' downloads in binary mode, which PDFs require on Windows
  download.file(url[i], destfile = names[i], mode = 'wb')
}
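The same url/names pairing can also be expressed without an explicit index, using mapply to walk the two vectors in parallel (equivalent behavior, just a more functional style):

```r
# url and names are the vectors defined above
mapply(function(u, n) download.file(u, destfile = n, mode = "wb"),
       url, names)
```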
Upvotes: 5