Espanta

Reputation: 1140

How to download the entire CRAN repository?

For offline Linux machines without Internet access, installing R packages with lots of dependencies is a nightmare. I found a couple of posts on SE discussing how to create a local folder, copy the desired package zip files into it, and install them with 'install.packages'.

However, finding, downloading, and uploading lots of packages to an offline server is a time-consuming effort. So I am wondering how I can download the package files for all of CRAN, put them in an HTTP web server directory on my local offline machine, and have it act like a real repository. The size will probably be very big, around 200 GB, but for a corporate environment I think it should make sense.
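
For context, this is roughly how I picture the offline machines using such a repository (the internal URL below is only a placeholder for illustration):

# "cran-mirror.internal" is a placeholder for the internal web server
options(repos = c(CRAN = "http://cran-mirror.internal/cran"))
install.packages("data.table", dependencies = TRUE)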

I found a guide here discussing how to become an official CRAN mirror, but I am not going to be an official public mirror.

Please advise. Thanks in advance.

Upvotes: 2

Views: 9889

Answers (1)

shayaa

Reputation: 2797

You can use the function available.packages to get the names of all packages currently available on CRAN.

pkgnames <- available.packages()[, 1]   # first column contains the package names
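
A quick check of what that returns:

length(pkgnames)   # number of packages currently listed on CRAN
head(pkgnames)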

If you like web scraping, you can practice as follows.

library(rvest)

# Read the CRAN "packages by name" index page and extract its table
pkgs <- read_html("https://cran.r-project.org/web/packages/available_packages_by_name.html")
tab <- html_nodes(pkgs, "table") %>% html_table(fill = TRUE)

# The first column holds the package names; drop empty rows
pkgnames <- tab[[1]][1]$X1
pkgnames <- pkgnames[nchar(pkgnames) > 0]

DON'T RUN THESE UNLESS YOU WANT TO INSTALL (OR DOWNLOAD) A LOT OF PACKAGES!!

#sapply(pkgnames, install.packages)

You can run this line to show that it works.

sapply(pkgnames[1:2], install.packages)

You can replace install.packages with download.packages, adding the destdir argument, to save the package files to your corporate directory instead of installing them.
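
For example, something along these lines (the destination path is only an illustration) downloads a couple of packages and builds the PACKAGES index that install.packages expects to find in a repository:

dest <- "/srv/cran-local/src/contrib"   # illustrative path on your corporate server
dir.create(dest, recursive = TRUE, showWarnings = FALSE)
download.packages(pkgnames[1:2], destdir = dest, type = "source")
tools::write_PACKAGES(dest, type = "source")

Serving /srv/cran-local over HTTP then lets the offline machines install from it by passing that URL as the repos argument.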

Upvotes: 2
