Reputation: 23
As the title suggests, I'm trying to load all of the shapefiles from the Census Bureau's TIGER2019 block-group directory (https://www2.census.gov/geo/tiger/TIGER2019/BG/) and merge them into one large shapefile covering the entire US, while overcoming issues with duplicate polygon IDs.
I adapted code from a previously asked question, but I can't get it to work: it stops once it reaches state 06 (California) with the error below.
Error in download.file(x, destfile = path, mode = "wb") : cannot open URL 'ftp://ftp2.census.gov/geo/tiger/TIGER2019/BG/tl_2019_06_bg.zip'
In addition: warning messages: 1: In download.file(x, destfile = path, mode = "wb") : downloaded length 29680232 != reported length 50020624
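From the length mismatch in the warning, my best guess is that the transfer is getting cut off, possibly by R's default 60-second download timeout. A minimal, untested sketch of raising it before the download loop (the 600-second value is an arbitrary choice, not a tested number):
# raise R's download timeout from its 60-second default before running the loop
options(timeout = 600)  # 600 seconds is an arbitrary guess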
Any suggestions would be much appreciated.
library(RCurl)
library(rgdal)
# get the directory listing
u <- 'ftp://ftp2.census.gov/geo/tiger/TIGER2019/BG/'
f <- paste0(u, strsplit(getURL(u, ftp.use.epsv = FALSE, ftplistonly = TRUE),
                        '\\s+')[[1]])
# download and extract to tempdir/shps
invisible(sapply(f, function(x) {
  path <- file.path(tempdir(), basename(x))
  download.file(x, destfile = path, mode = 'wb')
  unzip(path, exdir = file.path(tempdir(), 'shps'))
}))
# read in all shps, and prepend shapefile name to IDs
shps <- lapply(sub('\\.zip', '', basename(f)), function(x) {
  shp <- readOGR(file.path(tempdir(), 'shps'), x)
  shp <- spChFIDs(shp, paste0(x, '_', sapply(slot(shp, "polygons"), slot, "ID")))
  shp
})
# rbind to a single object
shp <- do.call(rbind, as.list(shps))
# write out to wd/USA.shp
writeOGR(shp, '.', 'USA', 'ESRI Shapefile')
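For completeness, here is a rough sketch of how the download step might be retried when it fails or produces a truncated file, instead of stopping the whole loop. The helper name, the retry count, and the idea of treating warnings as failed attempts are my own assumptions and are untested:
# hypothetical helper (untested): retry a download, treating any error or
# warning (e.g. "downloaded length != reported length") as a failed attempt
download_with_retry <- function(x, path, tries = 3) {
  for (i in seq_len(tries)) {
    ok <- tryCatch({
      download.file(x, destfile = path, mode = 'wb')
      TRUE
    }, warning = function(w) FALSE, error = function(e) FALSE)
    if (ok) return(invisible(TRUE))
  }
  stop('failed to download ', x, ' after ', tries, ' attempts')
}

# it would slot into the loop above in place of the plain download.file() call:
invisible(sapply(f, function(x) {
  path <- file.path(tempdir(), basename(x))
  download_with_retry(x, path)
  unzip(path, exdir = file.path(tempdir(), 'shps'))
}))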
Upvotes: 1
Views: 167