Chun

Reputation: 1

Can R download files for different months and years from a website?

I want to download some NetCDF files from this website: http://apdrc.soest.hawaii.edu/las/v6/constrain?var=12976

For my case, I need monthly data on sea surface temperature (SST) from 2009 to 2017 over 23–30°N and 119–126°E.

I know R can scrape information from websites, but I am not sure whether it is feasible to automatically download the monthly data from 2009 to 2017 for a specific region. Thanks for any comments.

Upvotes: 0

Views: 70

Answers (1)

ASH

Reputation: 20342

I looked at the link and I can't figure out how it works, but check out the code sample below. It bulk-downloads files from Google Drive, and the same pattern (collect the file URLs, then download each one) should give you what you need.

# https://www.r-bloggers.com/download-all-documents-from-google-drive-with-r/
# you'll need RGoogleDocs (with its RCurl dependency)
install.packages("RGoogleDocs", repos = "http://www.omegahat.org/R", type="source")
library(RGoogleDocs)

# authenticate with your Google account
gpasswd = "mysecretpassword"
auth = getGoogleAuth("[email protected]", gpasswd)
con = getGoogleDocsConnection(auth)

# use the CA certificate bundle shipped with RCurl for the HTTPS request
CAINFO = paste(system.file(package="RCurl"), "/CurlSSL/ca-bundle.crt", sep = "")
docs <- getDocs(con, cainfo = CAINFO)

# get file references
hrefs <- lapply(docs, function(x) return(x@access["href"]))

# extract each document's key and type from its href
keys <- sub(".*/full/.*%3A(.*)", "\\1", hrefs)
types <- sub(".*/full/(.*)%3A.*", "\\1", hrefs)

# make urls (for url-scheme see: http://techathlon.com/download-shared-files-google-drive/)
# put format parameter for other output formats!
pdf_urls <- paste0("https://docs.google.com/uc?export=download&id=", keys)
doc_urls <- paste0("https://docs.google.com/document/d/", keys, "/export?format=", "txt")

# download documents by opening each URL with the default handler (shell.exec is Windows-only)
gdoc_ids <- grep("document", types)
lapply(gdoc_ids, function(x) shell.exec(doc_urls[x]))

pdf_ids <- grep("pdf", types, ignore.case = TRUE)
lapply(pdf_ids, function(x) shell.exec(pdf_urls[x]))
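
The same idea applies directly to your NetCDF case if the LAS server exposes each month as a direct URL. Below is a minimal sketch in base R, assuming a hypothetical URL template; you would need to inspect the actual request the LAS download page issues (e.g. in your browser's network tools) and substitute the real pattern and parameters:

# Sketch: loop over years and months and save one NetCDF file per month.
# NOTE: base_url and the query parameters are PLACEHOLDERS; inspect the
# real download request from the LAS page and substitute its URL pattern.
base_url <- "http://apdrc.soest.hawaii.edu/path/to/data"   # hypothetical

out_dir <- "sst_netcdf"
dir.create(out_dir, showWarnings = FALSE)

for (year in 2009:2017) {
  for (month in 1:12) {
    mm <- sprintf("%02d", month)   # zero-pad months so file names sort
    # hypothetical parameters for the time step and the 23-30N / 119-126E box
    url  <- sprintf("%s?var=12976&year=%d&month=%s&lat=23,30&lon=119,126",
                    base_url, year, mm)
    dest <- file.path(out_dir, sprintf("sst_%d_%s.nc", year, mm))
    # mode = "wb" is required for binary files such as NetCDF
    tryCatch(download.file(url, destfile = dest, mode = "wb"),
             error = function(e) message("failed: ", url))
  }
}

Once the files are on disk, a package such as ncdf4 can open them and extract the SST variable for your region.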

Upvotes: 0
