Reputation: 137
I can access an ftp site with Chrome but not with Internet Explorer, probably because of a company restriction. Maybe for that reason, I cannot download the ftp data with RCurl in R. Is there any way to download the ftp data from R the way Chrome does? Thanks
url<-c("myUrl")
x<-getURL(url,userpwd="user:password", connecttimeout=60)
writeLines(x, "Append.txt")
Upvotes: 0
Views: 819
Reputation: 291
The package RCurl does not use a web browser to access ftp sites. It uses libcurl, as it says in the documentation. The problem you encounter should be solved within the constraints of libcurl.
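If you want to see what libcurl is actually doing, you can ask it to print the ftp conversation; here is a minimal sketch (the URL and credentials are placeholders, substitute your own):

library(RCurl)

# Placeholder ftp directory and credentials
url <- "ftp://ftp.example.com/some/path/"

# verbose = TRUE makes libcurl print the full ftp dialogue to the console,
# which usually shows exactly where the connection or login fails
x <- getURL(url, userpwd = "user:password",
            verbose = TRUE, connecttimeout = 60)

The verbose output is often enough to tell whether the server refuses the login, the path is wrong, or a firewall blocks the data connection.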
Also, if one web browser on your computer can access a website, and another can't, it need not be a problem with the web browser per se. The most common problem is the way files or paths are referenced, such as whether or not one includes a trailing / with a pathname (never with a filename, of course). Perhaps this is the case for you?
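To illustrate with a made-up server name: a trailing slash tells libcurl to treat the URL as a directory and return a listing, while the same style of URL without it is treated as a file, so the two requests behave quite differently:

library(RCurl)

# Directory: trailing slash, so getURL returns the directory listing
listing <- getURL("ftp://ftp.example.com/data/", userpwd = "user:password")

# File: no trailing slash, so getURL returns the file's contents
contents <- getURL("ftp://ftp.example.com/data/Append.txt", userpwd = "user:password")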
Otherwise there may be a problem with your ftp settings: libcurl is pretty smart about guessing things right, but it is possible to twiddle with all sorts of settings in case the defaults do not work, for example (from the manual):
# Retrieve the list of files in the ftp directory
filenames = getURL(url, ftp.use.epsv = FALSE, ftplistonly = TRUE)
# Deal with newlines as \n or \r\n. (BDR)
# Or alternatively, instruct libcurl to change \n's to \r\n's for us with crlf = TRUE
# filenames = getURL(url, ftp.use.epsv = FALSE, ftplistonly = TRUE, crlf = TRUE)
# Build the full URL for each file and reuse one curl handle for the downloads
filenames = paste(url, strsplit(filenames, "\r*\n")[[1]], sep = "")
con = getCurlHandle(ftp.use.epsv = FALSE)
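Applied to your situation, a sketch along these lines might work; the server name, directory, and credentials are assumptions you would have to replace, and it assumes myUrl is a directory ending in a trailing slash:

library(RCurl)

# Assumed ftp directory URL (note the trailing slash) and placeholder credentials
url <- "ftp://ftp.example.com/data/"

# List the files in the directory; disable extended passive mode in case a
# firewall blocks it (older libcurl versions call this option ftplistonly)
filenames <- getURL(url, userpwd = "user:password",
                    ftp.use.epsv = FALSE, dirlistonly = TRUE)
filenames <- paste(url, strsplit(filenames, "\r*\n")[[1]], sep = "")

# Download the first file and write it to disk
x <- getURL(filenames[1], userpwd = "user:password", ftp.use.epsv = FALSE)
writeLines(x, "Append.txt")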
If this doesn't help, it might help us help you if you give us more complete information. What is this myUrl in url<-c("myUrl"), for example? Is it a filename? A pathname?
Upvotes: 1